AI Code Epoch 15-2:

Gesture recognition on a microcontroller

A lot of people couldn’t finish in one session, so we’ll have another session on microcontrollers.
(Note: you don’t need to have attended Epoch 15; we’ll provide everything you need to catch up quickly.)

Challenge description:

Machine learning models are becoming increasingly popular on small microcontrollers. Paired with suitable sensors, microcontrollers are a good fit for small models that need to respond quickly on battery-powered, interactive devices.

This week we are going to train a model that distinguishes hand gestures, using data from an on-board accelerometer. Such models are useful for gesture-driven user interfaces, gaming controllers, and VR/AR devices.
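
As a rough illustration of the kind of model that fits on a microcontroller, the sketch below trains a very small Keras 1D-CNN on fixed-length windows of 3-axis accelerometer samples and converts it to TensorFlow Lite. The window length, number of gesture classes, and the randomly generated placeholder data are assumptions for illustration only; the real recordings and preprocessing will be provided with the challenge.

    import numpy as np
    import tensorflow as tf

    # Assumed data layout: windows of 128 accelerometer samples, 3 axes each,
    # labelled with one of 4 gesture classes. Random placeholders stand in
    # for the real recordings provided with the challenge.
    WINDOW, AXES, CLASSES = 128, 3, 4
    x_train = np.random.randn(1000, WINDOW, AXES).astype("float32")
    y_train = np.random.randint(0, CLASSES, size=1000)

    # A deliberately small 1D-CNN so the converted model fits in a few kB of flash.
    model = tf.keras.Sequential([
        tf.keras.layers.Conv1D(8, 5, activation="relu", input_shape=(WINDOW, AXES)),
        tf.keras.layers.MaxPooling1D(4),
        tf.keras.layers.Conv1D(16, 5, activation="relu"),
        tf.keras.layers.GlobalAveragePooling1D(),
        tf.keras.layers.Dense(CLASSES, activation="softmax"),
    ])
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    model.fit(x_train, y_train, epochs=5, batch_size=32)

    # Convert to TensorFlow Lite so the model can be deployed with TFLite Micro.
    tflite_model = tf.lite.TFLiteConverter.from_keras_model(model).convert()
    open("gesture_model.tflite", "wb").write(tflite_model)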

Helpful Terms

AI Code Epoch 15:

Gesture recognition on a microcontroller

Machine learning models are becoming increasingly popular on small microcontrollers. Paired with suitable sensors, microcontrollers are a good fit for small models that need to respond quickly on battery-powered, interactive devices.

This week we are going to train a model that distinguishes hand gestures, using data from an on-board accelerometer. Such models are useful for gesture-driven user interfaces, gaming controllers, and VR/AR devices.

Helpful Terms

AI Code Epoch 14:

Amazon Reviews Classification using NLP

This week’s topic is text classification using text mining techniques, also known as natural language processing. We will explain some text-mining preprocessing and classification techniques and apply them to a sample of Amazon reviews. The challenge is to classify each review as either positive or negative. Let’s see who gets the highest accuracy!
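
As a point of reference, a common baseline for this kind of task (not necessarily the approach we will cover in the session) is TF-IDF features with a logistic regression classifier. The hard-coded example reviews below are placeholders for the real dataset.

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline

    # Tiny placeholder reviews; the challenge supplies a real sample of Amazon reviews.
    texts = ["great product, works perfectly", "terrible, broke after a day",
             "love it, highly recommend", "waste of money, very disappointed"] * 50
    labels = [1, 0, 1, 0] * 50  # 1 = positive, 0 = negative

    x_train, x_test, y_train, y_test = train_test_split(
        texts, labels, test_size=0.2, random_state=0)

    # TF-IDF turns each review into a sparse bag-of-words vector;
    # logistic regression is a simple, strong baseline classifier on top of it.
    clf = make_pipeline(TfidfVectorizer(stop_words="english", ngram_range=(1, 2)),
                        LogisticRegression(max_iter=1000))
    clf.fit(x_train, y_train)
    print("test accuracy:", clf.score(x_test, y_test))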

Helpful Terms

AI Code Epoch 13:

Reinforcement Learning Experiments with OpenAI Gym

This week’s topic is reinforcement learning (RL) using OpenAI’s Gym package. You will be given a few basic RL algorithms to experiment with in the many different environments Gym provides.

Experimenting together with the community will help you understand these algorithms, and understanding how they work is the first step towards improving them!
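
To give a feel for what such an experiment looks like, here is a minimal tabular Q-learning sketch on Gym’s FrozenLake environment. It assumes the classic Gym API (reset() returns an observation and step() returns four values); newer gym/gymnasium releases changed this, and the environment id may be FrozenLake-v0 in older versions.

    import gym
    import numpy as np

    # Tabular Q-learning on FrozenLake (classic gym API: step() returns 4 values).
    env = gym.make("FrozenLake-v1")
    Q = np.zeros((env.observation_space.n, env.action_space.n))
    alpha, gamma, epsilon = 0.1, 0.99, 0.1

    for episode in range(5000):
        state = env.reset()
        done = False
        while not done:
            # Epsilon-greedy: mostly exploit the best known action, sometimes explore.
            if np.random.rand() < epsilon:
                action = env.action_space.sample()
            else:
                action = np.argmax(Q[state])
            next_state, reward, done, _ = env.step(action)
            # Q-learning update: move Q(s, a) toward the bootstrapped target.
            Q[state, action] += alpha * (reward + gamma * np.max(Q[next_state]) - Q[state, action])
            state = next_state

    print("Greedy policy:", np.argmax(Q, axis=1))

Deep Q-learning and double Q-learning (see Helpful Terms) replace this table with a neural network and a second set of value estimates, respectively.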

Helpful Terms
keras, Q-learning, deep Q-learning, double Q-learning

AI Code Epoch 12:

We will revisit the previous challenges once again, trying to solve the ones we didn’t finish and to improve our answers to the ones we did!

Helpful Terms

AI Code Epoch 11:

This week we won’t have a new challenge. We’ll focus on getting up to speed and learning more by working on the tutorials and finishing or improving previous challenges.

Helpful Terms

AI Code Epoch 10:

Convolutional Autoencoder

In this challenge we explore convolutional autoencoders and build an encoder and decoder for the MNIST dataset. There are two options in this challenge:

  • Build both the encoder and a corresponding decoder yourself (the harder, more creative option)
  • Build only the decoder on top of a given encoder (recommended if you are a beginner)

Data preprocessing and image display are already included in the challenge. At the end, you can view the digits generated by the trained model!
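
For orientation, here is a minimal Keras sketch of a convolutional autoencoder on MNIST using the layers listed under Helpful Terms below; the layer sizes are illustrative, not necessarily the ones used in the challenge notebook.

    import tensorflow as tf

    # Load MNIST and scale to [0, 1]; shape (N, 28, 28, 1).
    (x_train, _), (x_test, _) = tf.keras.datasets.mnist.load_data()
    x_train = x_train[..., None].astype("float32") / 255.0
    x_test = x_test[..., None].astype("float32") / 255.0

    # Encoder: convolutions + MaxPooling shrink 28x28 down to a 7x7 feature map.
    # Decoder: convolutions + UpSampling grow it back to 28x28.
    autoencoder = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(28, 28, 1)),
        tf.keras.layers.Conv2D(16, 3, activation="relu", padding="same"),
        tf.keras.layers.MaxPooling2D(2),
        tf.keras.layers.Conv2D(8, 3, activation="relu", padding="same"),
        tf.keras.layers.MaxPooling2D(2),
        tf.keras.layers.Conv2D(8, 3, activation="relu", padding="same"),
        tf.keras.layers.UpSampling2D(2),
        tf.keras.layers.Conv2D(16, 3, activation="relu", padding="same"),
        tf.keras.layers.UpSampling2D(2),
        tf.keras.layers.Conv2D(1, 3, activation="sigmoid", padding="same"),
    ])
    autoencoder.compile(optimizer="adam", loss="binary_crossentropy")
    autoencoder.fit(x_train, x_train, epochs=3, batch_size=128,
                    validation_data=(x_test, x_test))

The first half (down to the 7×7 feature map) is the encoder; option 2 of the challenge gives you such an encoder and asks you to write only the second, upsampling half.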

Helpful Terms
CNN, autoencoders, MNIST, UpSampling, MaxPooling, Dense, Flatten, Reshape, BatchNormalization

AI Code Epoch 9:

Transfer Learning

Today, neural networks are rarely trained from scratch. Instead, previously trained models are used as a starting point. This makes it possible to develop high-accuracy models even with small datasets. This technique is called “transfer learning”, and this week you will use it to train a neural network.
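
As a small sketch of the idea, the snippet below takes a MobileNetV2 pre-trained on ImageNet, freezes it, and adds a new trainable classification head. The choice of base network, input size, and number of classes are assumptions for illustration; the challenge may use a different backbone and dataset, and train_ds/val_ds in the commented line stand for whatever image datasets you prepare.

    import tensorflow as tf

    NUM_CLASSES = 5  # assumed number of classes in your dataset

    # Start from a network pre-trained on ImageNet and drop its classification head.
    base = tf.keras.applications.MobileNetV2(input_shape=(160, 160, 3),
                                             include_top=False, weights="imagenet")
    base.trainable = False  # freeze the pre-trained features

    # Add a small trainable head for the new task.
    model = tf.keras.Sequential([
        base,
        tf.keras.layers.GlobalAveragePooling2D(),
        tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
    ])
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    model.summary()
    # model.fit(train_ds, validation_data=val_ds, epochs=5)  # train_ds/val_ds: your image datasets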

Helpful Terms
Transfer learning, Curriculum learning

AI Code Epoch 8:

Kaggle

This week you will be introduced to Kaggle by making an example submission to a competition. Then you will try to climb the leaderboard and compete with each other.

  • Register on Kaggle and join the Titanic competition
  • Run the Jupyter notebook Example First Submission.ipynb
  • This will create a .csv submission file; submit it to the competition
  • Improve the solution and climb the leaderboard (a minimal baseline sketch follows below)
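
For orientation, here is a minimal sketch of what a baseline submission script might look like. The feature selection and the random forest model are illustrative choices and not necessarily what Example First Submission.ipynb does; it assumes you have downloaded train.csv and test.csv from the competition page.

    import pandas as pd
    from sklearn.ensemble import RandomForestClassifier

    # Assumes train.csv and test.csv have been downloaded from the Titanic
    # competition page into the working directory.
    train = pd.read_csv("train.csv")
    test = pd.read_csv("test.csv")

    # A handful of simple features; the example notebook may use a different set.
    features = ["Pclass", "Sex", "SibSp", "Parch"]
    X_all = pd.get_dummies(pd.concat([train[features], test[features]]))
    X_train, X_test = X_all.iloc[:len(train)], X_all.iloc[len(train):]

    model = RandomForestClassifier(n_estimators=100, random_state=0)
    model.fit(X_train, train["Survived"])

    # Kaggle expects a CSV with PassengerId and the predicted Survived column.
    submission = pd.DataFrame({"PassengerId": test["PassengerId"],
                               "Survived": model.predict(X_test)})
    submission.to_csv("submission.csv", index=False)
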
Helpful Terms

AI Code Epoch 7:

Monte Carlo methods

Monte Carlo (MC) methods are an indispensable part of any data scientist’s toolbox. They have wide-ranging applications, from finance to physics and from law to AI. DeepMind’s AlphaGo used MC-based algorithms (paper) to beat Lee Sedol at Go. This week you will be presented with several challenges to solve using MC methods.
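
As a warm-up illustration of the core idea (approximating a quantity by averaging over random samples), here is the classic Monte Carlo estimate of π; it is not one of this week’s challenge problems.

    import random

    # Estimate pi by sampling random points in the unit square and counting
    # how many land inside the quarter circle of radius 1.
    def estimate_pi(n_samples=1_000_000):
        inside = 0
        for _ in range(n_samples):
            x, y = random.random(), random.random()
            if x * x + y * y <= 1.0:
                inside += 1
        return 4 * inside / n_samples

    print(estimate_pi())  # approaches 3.14159... as n_samples grows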

Helpful Terms

AI Code Epoch 6:

Signature Verification Challenge

This week we will build an ML model for signature verification, i.e. telling whether a signature is genuine or forged. The challenge is based on the paper (see GitHub); you will try to implement the architecture it describes.
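
One common approach to verification problems like this, not necessarily the exact architecture from the paper, is a Siamese network: the same encoder embeds both signatures, and a small head compares the embeddings. The image size and layer sizes below are assumptions for illustration.

    import tensorflow as tf

    IMG_SHAPE = (155, 220, 1)  # assumed size of the preprocessed signature images

    # Shared CNN that maps a signature image to an embedding vector.
    def make_encoder():
        return tf.keras.Sequential([
            tf.keras.layers.Conv2D(32, 3, activation="relu", input_shape=IMG_SHAPE),
            tf.keras.layers.MaxPooling2D(2),
            tf.keras.layers.Conv2D(64, 3, activation="relu"),
            tf.keras.layers.GlobalAveragePooling2D(),
            tf.keras.layers.Dense(128, activation="relu"),
        ])

    encoder = make_encoder()
    sig_a = tf.keras.Input(shape=IMG_SHAPE)
    sig_b = tf.keras.Input(shape=IMG_SHAPE)

    # The same encoder processes both signatures; a small head compares
    # their embeddings and predicts whether the pair matches.
    diff = tf.keras.layers.Lambda(lambda t: tf.abs(t[0] - t[1]))(
        [encoder(sig_a), encoder(sig_b)])
    out = tf.keras.layers.Dense(1, activation="sigmoid")(diff)  # 1 = genuine, 0 = forged

    model = tf.keras.Model(inputs=[sig_a, sig_b], outputs=out)
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    model.summary()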

Helpful Terms

AI Code Epoch 5:

Implementing the random forest algorithm

After two applied challenges, we step back a little for a more fundamental one: implementing one of the most popular machine learning algorithms, the random forest. This will deepen your understanding of how machine learning works.
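
As a sketch of the ensemble mechanics (bootstrap sampling plus majority voting), the snippet below builds a forest on top of scikit-learn decision trees; a full from-scratch implementation would also build the trees themselves.

    import numpy as np
    from sklearn.datasets import load_iris
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier

    class SimpleRandomForest:
        """Bootstrap-aggregated decision trees with random feature subsets."""

        def __init__(self, n_trees=50, max_features="sqrt"):
            self.n_trees = n_trees
            self.max_features = max_features
            self.trees = []

        def fit(self, X, y):
            n = len(X)
            for _ in range(self.n_trees):
                # Bootstrap sample: draw n rows with replacement.
                idx = np.random.randint(0, n, size=n)
                tree = DecisionTreeClassifier(max_features=self.max_features)
                tree.fit(X[idx], y[idx])
                self.trees.append(tree)
            return self

        def predict(self, X):
            # Majority vote over all trees.
            votes = np.stack([t.predict(X) for t in self.trees])
            return np.apply_along_axis(lambda v: np.bincount(v).argmax(), 0, votes)

    X, y = load_iris(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    forest = SimpleRandomForest().fit(X_train, y_train)
    print("accuracy:", (forest.predict(X_test) == y_test).mean())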

Helpful Terms

AI Code Epoch 4:

Machine learning on a Microcontroller

Machine learning is becoming increasingly popular on edge devices, particularly in IoT. In this challenge we are going to build a machine learning model that runs on small, battery-powered devices and detects trigger words such as “OK, Google” or “Alexa”, just like home assistants do.
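
As a rough sketch of the pipeline, the snippet below trains a tiny Keras model on placeholder audio features and converts it to TensorFlow Lite with post-training quantization, which is what makes deployment on a microcontroller feasible. The feature shape, class names, and random data are assumptions for illustration; the real data and deployment steps are covered in the walkthrough.

    import numpy as np
    import tensorflow as tf

    # Assumed front-end: each 1-second clip is reduced to a 49x40 matrix of
    # log-mel/MFCC features; random placeholders stand in for real recordings.
    FRAMES, MELS, CLASSES = 49, 40, 3   # e.g. "trigger word", "other speech", "silence"
    x = np.random.randn(500, FRAMES, MELS, 1).astype("float32")
    y = np.random.randint(0, CLASSES, size=500)

    model = tf.keras.Sequential([
        tf.keras.layers.Conv2D(8, 3, activation="relu", input_shape=(FRAMES, MELS, 1)),
        tf.keras.layers.MaxPooling2D(2),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(CLASSES, activation="softmax"),
    ])
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    model.fit(x, y, epochs=3, batch_size=32)

    # Post-training quantization shrinks the weights to 8-bit so the model is
    # easier to fit in MCU flash (deployable with TensorFlow Lite Micro).
    converter = tf.lite.TFLiteConverter.from_keras_model(model)
    converter.optimizations = [tf.lite.Optimize.DEFAULT]
    open("keyword_model.tflite", "wb").write(converter.convert())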

For this challenge we have a guest lecturer: Mehmet Ali Anil from Grus BV. He will guide you through the challenge.

Since this is such a different challenge we understand there might be questions afterwards. You can contact Mehmet through .

Helpful Terms

AI Code Epoch 3:

Hacking a Random Number Generator (RNG)

RNGs are used in a wide range of fields, one of which is cryptography. In this challenge, you will build a machine learning model that predicts the next “random” number by looking at the previous numbers generated by the RNG.
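
To make the setup concrete, here is a toy sketch that “cracks” a deliberately weak linear congruential generator by learning the mapping from each output to the next. The RNG used in the challenge may be considerably harder to predict.

    import numpy as np
    from sklearn.tree import DecisionTreeClassifier

    # A deliberately weak linear congruential generator (LCG) as the target RNG.
    # Its next value depends only on the current one, so a model that sees enough
    # (previous, next) pairs can learn the mapping exactly.
    M, A, C = 256, 137, 187

    def lcg(seed, n):
        values, x = [], seed
        for _ in range(n):
            x = (A * x + C) % M
            values.append(x)
        return values

    seq = lcg(seed=42, n=5000)
    X = np.array(seq[:-1]).reshape(-1, 1)  # previous number
    y = np.array(seq[1:])                  # next number

    model = DecisionTreeClassifier().fit(X[:4000], y[:4000])
    print("held-out accuracy:", model.score(X[4000:], y[4000:]))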

Helpful Terms

AI Code Epoch 2:

MNIST

This week we will continue with a challenge similar to last week’s: MNIST. This challenge is considered the “Hello, World!” of neural networks. You will tackle it using several machine learning techniques and neural network architectures.

MNIST is a large dataset of images of handwritten digits. The challenge is to create an algorithm that recognises these digits with high accuracy. In today’s AI Code we challenge you to solve it in the following 4 ways (a sketch of option 2 follows the list):

  1. Do not use machine learning.
  2. Use a machine learning algorithm but not a neural network.
  3. Use a fully-connected neural network.
  4. Use a convolutional neural network.
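
As a starting point for option 2, here is a minimal scikit-learn baseline (logistic regression on raw pixels); the other three options are left to you.

    from sklearn.datasets import fetch_openml
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    # Fetch MNIST (70,000 28x28 images, already flattened to 784 features).
    X, y = fetch_openml("mnist_784", version=1, return_X_y=True, as_frame=False)
    X = X / 255.0  # scale pixel values to [0, 1]

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=10000, random_state=0)

    # Multinomial logistic regression: a machine learning model, but not a
    # neural network. Training may take a minute or two.
    clf = LogisticRegression(max_iter=200)
    clf.fit(X_train, y_train)
    print("test accuracy:", clf.score(X_test, y_test))  # roughly 0.92
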
Helpful Terms

AI Code Epoch 1:

Deep Learning Microscopy

This week we will continue with an applied challenge. Imagine obtaining high-resolution images from a cheap microscope! This is possible thanks to deep learning. Your task this week is to build a neural network that tackles this problem. The challenge is based on this paper.

The goal: build a neural network that takes 14×14 MNIST images and outputs them at a higher resolution (28×28). A minimal sketch follows the data description below.

  • x_train_lowres contains 60000 low resolution (14×14) images
  • x_train contains 60000 high resolution (28×28) counterparts.
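
A minimal sketch of one possible network is shown below. Since the challenge data files are not included here, the sketch creates its own 14×14 inputs by average-pooling MNIST; the architecture (a small upsampling CNN) is just one of many reasonable choices and not the one from the paper.

    import tensorflow as tf

    # Load MNIST and create 14x14 low-resolution inputs by 2x2 average pooling
    # (the challenge notebook provides x_train_lowres directly; this just makes
    # the sketch self-contained).
    (x_train, _), _ = tf.keras.datasets.mnist.load_data()
    x_train = x_train[..., None].astype("float32") / 255.0            # (60000, 28, 28, 1)
    x_train_lowres = tf.nn.avg_pool2d(x_train, ksize=2, strides=2,
                                      padding="VALID")                # (60000, 14, 14, 1)

    # A small upsampling CNN: 14x14 in, 28x28 out.
    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(14, 14, 1)),
        tf.keras.layers.Conv2D(32, 3, activation="relu", padding="same"),
        tf.keras.layers.UpSampling2D(2),
        tf.keras.layers.Conv2D(32, 3, activation="relu", padding="same"),
        tf.keras.layers.Conv2D(1, 3, activation="sigmoid", padding="same"),
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy")
    model.fit(x_train_lowres, x_train, epochs=3, batch_size=128)
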
Helpful Terms

AI Code Epoch 0:

Linear Regression

This week we will implement linear regression. Understanding linear regression is a first step towards understanding many fundamental AI concepts. In particular, we will implement gradient descent to perform the regression; most learning algorithms used today rely on some form of gradient descent.
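
As a preview of what the implementation boils down to, here is a minimal NumPy sketch that fits a line to synthetic data with plain gradient descent; the challenge data and exact formulation may differ.

    import numpy as np

    # Synthetic data: y = 3x + 2 plus noise (placeholder for the challenge data).
    rng = np.random.default_rng(0)
    x = rng.uniform(-1, 1, size=200)
    y = 3 * x + 2 + rng.normal(scale=0.1, size=200)

    # Fit y = w*x + b by gradient descent on the mean squared error.
    w, b, lr = 0.0, 0.0, 0.1
    for step in range(1000):
        y_pred = w * x + b
        error = y_pred - y
        # Gradients of MSE = mean(error^2) with respect to w and b.
        grad_w = 2 * np.mean(error * x)
        grad_b = 2 * np.mean(error)
        w -= lr * grad_w
        b -= lr * grad_b

    print(f"learned w={w:.3f}, b={b:.3f}")  # should approach w=3, b=2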

Helpful Terms

Learning material

🍒 The challenges can be difficult. To get up to speed for the AI Code sessions or to get some quick learning in, we’ve prepared some tutorials!

🍉 We’ve prepared an AI Canon: a collection of AI tutorials, guides, literature and videos for people who want to learn about AI in more depth.