Air Guitar with TinyML - Gesture Recognition

Hello Everyone,

I would like to share one of the really cool projects developed in the Fall 2020 offering of the 249r course at Harvard. I am planning to share a project each week covering some very interesting ideas, so stay tuned.
I will start this marathon of TinyML projects with the Air Guitar application, which uses gesture recognition. I can’t wait to see what the community thinks about this application and to hear ideas on how it can be explored further.

1. Project: Air Guitar: Using TinyML to Identify Complex Gestures
2. Authors: Robert Jomar Malate, Hossam Mabed, Kathryn Wantlin
3. Description: This project aims to explore and expand on how tiny machine learning (TinyML) can be used in the field of gesture recognition. We used the premise of playing an air guitar (pretending to play a guitar without actually holding or playing one) as an application of this. We designed and developed the hardware, dataset, and software framework needed to mimic the motions of playing a guitar. Based on our model-training evaluation and physical experimentation, we were able to successfully mimic the basic functions of a guitar.

Please find more details here:
Air Guitar Video Presentation

Air Guitar


Hey @adrianarotaru!
Great initiative! The applications of this project are basically limitless, and I imagine it could be adapted to almost any musical instrument.

Hey Atindro,

So far, I can think of applying this project to string instruments specifically. I think it would be harder to integrate into complex woodwind or brass instruments, but you could perhaps have the model infer the rhythm/frequency of the sounds from the hand motion. One interesting aspect is latency: in the case of the Air Guitar, gestures are detected quite well when there is a large detection window, i.e., when the gestures are well separated and occur sparsely in time.
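To make the detection-window trade-off concrete, here is a minimal, purely illustrative sketch: it slices a stream of IMU motion magnitudes into fixed-size windows and flags windows containing a spike. All function names, the threshold, and the toy data are my own assumptions, not part of the project's actual pipeline. The idea it shows: both small and large windows catch gestures that are well separated in time, but a larger window means the classifier waits longer before it can report anything.

```python
# Hypothetical sketch of the detection-window / latency trade-off.
# Names, thresholds, and data below are illustrative assumptions only,
# not taken from the Air Guitar project's actual implementation.

def split_into_windows(samples, window_size, stride):
    """Slice a 1-D stream of IMU magnitudes into fixed-size windows."""
    return [samples[start:start + window_size]
            for start in range(0, len(samples) - window_size + 1, stride)]

def detect_gesture(window, threshold=1.5):
    """Toy detector: flag a window whose peak magnitude exceeds the threshold."""
    return max(window) > threshold

def detection_latency_s(window_size, sample_rate_hz):
    """Rough latency: a gesture can't be reported until its window is full."""
    return window_size / sample_rate_hz

# A stream with two well-separated "strums" (spikes) amid quiet motion.
stream = [0.1] * 50 + [2.0] * 5 + [0.1] * 50 + [2.2] * 5 + [0.1] * 40

small_hits = [detect_gesture(w) for w in split_into_windows(stream, 10, 10)]
large_hits = [detect_gesture(w) for w in split_into_windows(stream, 50, 50)]

# Because the gestures are sparse, both window sizes detect both strums,
# but the large window responds more slowly (0.5 s vs 0.1 s at 100 Hz).
print(sum(small_hits), sum(large_hits))
print(detection_latency_s(10, 100), detection_latency_s(50, 100))
```

If the two strums were played close together, the 50-sample window would merge them into a single detection, which is essentially the failure mode described above when gestures are not spread out in time.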
