I would like to share one of the really cool projects developed in the Fall 2020 offering of course 249r at Harvard. I plan to share a project weekly, each built around a very interesting idea, so stay tuned.
I will start this marathon of TinyML projects with the Air Guitar application, which uses gesture recognition. I can’t wait to see what the community thinks of this application and to hear ideas on how it could be explored further.
1. Project: Air Guitar: Using TinyML to Identify Complex Gestures
2. Authors: Robert Jomar Malate, Hossam Mabed, Kathryn Wantlin
3. Description: This project explores and expands on how tiny machine learning (TinyML) can be used in the field of gesture recognition. We used the premise of playing an air guitar (pretending to play a guitar without actually holding or playing one) as the application. We designed and developed the hardware, dataset, and software framework needed to mimic the motions of playing a guitar. Through model training, evaluation, and physical experimentation, we were able to successfully mimic the basic functions of a guitar.
Please find more details here:
Air Guitar Video Presentation
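To give a flavor of what gesture recognition on accelerometer data involves, here is a minimal sketch of a windowed-feature classifier. This is my own illustrative stand-in, not the authors' model: the feature set (per-axis mean and standard deviation), the nearest-centroid classifier, and the synthetic "strum" versus "idle" signals are all assumptions for demonstration. A deployed TinyML system would typically train a small neural network offline and run it on-device with something like TensorFlow Lite Micro.

```python
import numpy as np

def extract_features(window):
    # window: (n_samples, 3) accelerometer readings (x, y, z).
    # Per-axis mean and standard deviation form a simple 6-D feature vector.
    return np.concatenate([window.mean(axis=0), window.std(axis=0)])

class NearestCentroidGestureClassifier:
    """Toy classifier: label a window by its closest per-class feature centroid."""

    def fit(self, windows, labels):
        feats = np.array([extract_features(w) for w in windows])
        self.labels_ = sorted(set(labels))
        self.centroids_ = np.array(
            [feats[[l == c for l in labels]].mean(axis=0) for c in self.labels_]
        )
        return self

    def predict(self, window):
        dists = np.linalg.norm(self.centroids_ - extract_features(window), axis=1)
        return self.labels_[int(np.argmin(dists))]

# Synthetic training data (assumed): a "strum" has large vertical (y) swings,
# while "idle" hovers near gravity with little motion.
rng = np.random.default_rng(0)

def make_strum():
    t = np.linspace(0, 1, 50)
    return np.stack([rng.normal(0, 0.1, 50),
                     2.0 * np.sin(2 * np.pi * 3 * t) + rng.normal(0, 0.1, 50),
                     rng.normal(1.0, 0.1, 50)], axis=1)

def make_idle():
    return np.stack([rng.normal(0, 0.05, 50),
                     rng.normal(0, 0.05, 50),
                     rng.normal(1.0, 0.05, 50)], axis=1)

windows = [make_strum() for _ in range(10)] + [make_idle() for _ in range(10)]
labels = ["strum"] * 10 + ["idle"] * 10
clf = NearestCentroidGestureClassifier().fit(windows, labels)
print(clf.predict(make_strum()))  # → strum
print(clf.predict(make_idle()))   # → idle
```

In practice the interesting work, as in this project, lies in collecting a good gesture dataset and fitting a model small enough for a microcontroller; the windowing-then-classify structure above stays the same.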