In 2020 I took my first foray into experimenting with tinyML models, as TensorFlow released models compatible with the Arduino Nano 33 BLE Sense. Running classification models on embedded devices means no data needs to leave the device. Future applications could harness ML capabilities without private data ever being exchanged.
I built a fun punch classifier inspired by the wearable watches that count steps but share private data back to the mothership. My intention was to prototype a functioning exercise simulator that could be used offline.
View the video with audio for sounds of delight :)
To train the model I needed to know how the hand moved, so I collected six parameters in a single punch sample: the X, Y, Z accelerations from the onboard accelerometer and the angular velocities from the gyroscope.
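As a rough illustration, one punch sample can be treated as a fixed-length window of six-value IMU readings, normalized to a uniform scale before training. This is a minimal Python sketch under assumed values: the window length (119 readings) and the sensor ranges (±4 g, ±2000 °/s) follow the common Arduino gesture-training examples and are not necessarily what my final sketch used.

```python
# A punch sample as a fixed window of IMU readings, each reading holding
# 3 accelerometer + 3 gyroscope values. Constants below are assumptions.

SAMPLES_PER_GESTURE = 119   # readings captured per punch (hypothetical)
ACC_RANGE = 4.0             # accelerometer range in g (assumed +/- 4 g)
GYRO_RANGE = 2000.0         # gyroscope range in deg/s (assumed +/- 2000)

def normalize_reading(ax, ay, az, gx, gy, gz):
    """Map raw sensor values into [0, 1] so the model sees a uniform scale."""
    return [
        (ax + ACC_RANGE) / (2 * ACC_RANGE),
        (ay + ACC_RANGE) / (2 * ACC_RANGE),
        (az + ACC_RANGE) / (2 * ACC_RANGE),
        (gx + GYRO_RANGE) / (2 * GYRO_RANGE),
        (gy + GYRO_RANGE) / (2 * GYRO_RANGE),
        (gz + GYRO_RANGE) / (2 * GYRO_RANGE),
    ]

def build_sample(readings):
    """Flatten a window of readings into one feature vector for training."""
    assert len(readings) == SAMPLES_PER_GESTURE
    flat = []
    for r in readings:
        flat.extend(normalize_reading(*r))
    return flat  # length = SAMPLES_PER_GESTURE * 6
```

Normalizing to [0, 1] keeps the accelerometer and gyroscope channels on comparable scales, which small dense networks train on more reliably.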
The first three images above are graphical representations of what a jab, a hook and an uppercut look like from the sensor's perspective. The last image shows all of them plotted on the same graph.
After gathering instances of the data, I trained the model using a sample Google Colab notebook by Don Coleman. This process took a couple of tries, since I needed to test it with several people to remove any potential bias that might arise.
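Before a notebook like that can train a classifier, the gesture labels have to be encoded and the recorded samples split into training, validation and test sets. Here is a minimal Python sketch of that preparation step; the label names and the 60/20/20 split are illustrative assumptions, not the notebook's exact code.

```python
import random

GESTURES = ["jab", "hook", "uppercut"]   # hypothetical label set

def one_hot(label):
    """Encode a gesture name as a one-hot vector for the classifier."""
    vec = [0.0] * len(GESTURES)
    vec[GESTURES.index(label)] = 1.0
    return vec

def split_dataset(samples, train_fraction=0.6, val_fraction=0.2, seed=42):
    """Shuffle (features, label) pairs and split them into
    train / validation / test sets."""
    rng = random.Random(seed)
    shuffled = samples[:]
    rng.shuffle(shuffled)
    n_train = int(len(shuffled) * train_fraction)
    n_val = int(len(shuffled) * val_fraction)
    return (shuffled[:n_train],
            shuffled[n_train:n_train + n_val],
            shuffled[n_train + n_val:])
```

Holding out a test set that the model never sees during training is also what makes it possible to check whether the classifier generalizes across different people's punches.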
The gesture classification happens on the Arduino, a physical component with no built-in audio or visual feedback tooling. Attaching a small LCD screen to the Arduino was an option, but it presented some issues.
So I decided to gamify it with a screen-based dashboard:
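One simple way to wire this up is to have the Arduino print each detected punch over the serial port and let the dashboard react to it. This is a hypothetical sketch of the dashboard side in Python, assuming lines in a `label,confidence` format (e.g. `jab,0.93`) and a 0.8 confidence cutoff; neither detail is confirmed by the project itself.

```python
# Dashboard-side handling of serial lines from the Arduino.
# The line format and threshold below are assumptions for illustration.

CONFIDENCE_THRESHOLD = 0.8   # assumed cutoff before showing feedback

FEEDBACK = {                 # hypothetical mapping of gestures to messages
    "jab": "Successful jab!",
    "hook": "Successful hook!",
    "uppercut": "Successful uppercut!",
}

def handle_serial_line(line):
    """Parse one serial line and return the feedback to display, or None."""
    try:
        label, conf = line.strip().split(",")
        if float(conf) >= CONFIDENCE_THRESHOLD:
            return FEEDBACK.get(label)
    except ValueError:
        pass  # malformed line; ignore it
    return None
```

Keeping the feedback logic on the computer side means the Arduino sketch stays small, while the dashboard is free to add sounds, scores and animations.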
Successful jab feedback
Successful uppercut feedback
I had a chance to showcase my project at the ITP winter show 2019, which let me put the project in front of 100+ potential users and get feedback on the interactions.
You can see me presenting it on The Coding Train YouTube channel.