Machine Learning Asteroid Game

Anything is an interface

I explored basic machine learning to create an interactive asteroid-shooting game that uses both motion tracking and sound recognition to interface with the game.

Time: FALL 2021 // Creative Technology

Role: Creative Coder, Game Designer

Tools: ML5, P5.JS

Final Project:   HERE


Create an interactive system (a game, a story, a performance, etc.) that relies on the recognition and tracking of real-world objects by a machine learning system (ML5js). Explore the possibilities of the technology: What can it and can't it do? How subtle can the variations be? What is easy and what is hard to do with the existing tech? Starting with your big idea, outline a backup plan: What is plan B to scale it down? What is plan C? What will work if nothing else works?


First, I created a basic shooter game based on the object-oriented principles we explored in previous weeks.
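The object-oriented core of a shooter like this can be sketched as a pair of classes with circle-based hit testing. This is a minimal illustration, not the project's actual code: the class names, velocities, and radii are all hypothetical, and in the p5.js sketch each class would also have a show() method that draws it each frame.

```javascript
// Hypothetical Asteroid and Bullet classes for a p5.js-style shooter.
class Asteroid {
  constructor(x, y, r) {
    this.x = x; this.y = y; this.r = r; // position and radius
    this.vx = -1;                       // drift left each frame
  }
  update() { this.x += this.vx; }
  // a hit occurs when the two circles overlap
  hits(bullet) {
    const dx = this.x - bullet.x;
    const dy = this.y - bullet.y;
    return Math.hypot(dx, dy) < this.r + bullet.r;
  }
}

class Bullet {
  constructor(x, y) {
    this.x = x; this.y = y;
    this.r = 3;                         // small collision radius
    this.vy = -5;                       // travel upward each frame
  }
  update() { this.y += this.vy; }
}
```

In the sketch's draw() loop, every asteroid would be checked against every live bullet, and both would be removed on a hit.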

Then, I explored ML5's PoseNet library.

I mapped an object to my nose...

... and integrated it into my game environment
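The tracking step can be sketched as a small helper that reads the nose keypoint out of PoseNet's results. The result shape ({ pose: { nose: { x, y } } }) follows the ml5.js documentation; the helper's name and the smoothing step are illustrative additions, not the project's actual code.

```javascript
// Hypothetical nose-tracking helper for ml5 PoseNet results.
let noseX = 0;
let noseY = 0;

function trackNose(results, smoothing = 0.2) {
  // keep the last position if no pose was detected this frame
  if (!results || results.length === 0) return { x: noseX, y: noseY };
  const nose = results[0].pose.nose;
  // ease toward the new reading to reduce frame-to-frame jitter
  noseX += (nose.x - noseX) * smoothing;
  noseY += (nose.y - noseY) * smoothing;
  return { x: noseX, y: noseY };
}

// In the browser sketch, wiring it up would look something like:
//   const poseNet = ml5.poseNet(video, () => console.log('model ready'));
//   poseNet.on('pose', results => { ship.x = trackNose(results).x; });
```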

Then, I started training a sound model to recognize the word "piew"...

... and also integrated that into my system.
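The firing trigger can be sketched as a confidence check over the classifier's output. Per the ml5.js documentation, a sound classifier's callback receives results as an array of { label, confidence } objects sorted by confidence; the helper name, the "piew" label string, and the threshold value are assumptions for illustration.

```javascript
// Hypothetical trigger logic for the trained sound model:
// fire only when "piew" is the top prediction with high confidence.
function shouldFire(results, threshold = 0.8) {
  if (!results || results.length === 0) return false;
  const top = results[0]; // results are sorted by confidence
  return top.label === 'piew' && top.confidence >= threshold;
}

// In the browser sketch, wiring it up would look something like:
//   const classifier = ml5.soundClassifier('model/model.json', () => {});
//   classifier.classify((err, results) => {
//     if (shouldFire(results)) ship.shoot();
//   });
```

Thresholding like this trades responsiveness for fewer false triggers from background noise, which matters in a model trained on only one voice.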


This project opened many doors in my mind for interactive opportunities. Realizing how easy it was to get basic machine learning working on my computer made me see that really anything can be an interface. I did, however, run into the limitations of these machine learning systems. For instance, training my sound model only on my own voice in my bedroom limited how well the program worked in other environments and for other people. As a next step, I would like to train the model with more people and in more environments to make it more universally accessible.

Instructor: Maxim Safioulline