Proposal suggested by James Larus of the PEPP-PT (https://www.pepp-pt.org/): "Here’s a real problem for which we need some good solutions. As part of our privacy-preserving proximity tracking work (https://github.com/DP-3T), we are going to use Bluetooth to measure the distance between people’s phones. One big concern is that people put their phones in different places (front pocket, back pocket, shirt pocket, bag, etc.). Can we use the accelerometer and pressure sensor (altitude) to infer where a phone is located on a person’s body? It doesn’t need to be perfect, but the better the inference, the more precise we can make the distance estimation."
Our solution uses IMU and light-sensor data from the smartphone and a supervised classification technique to infer in which pocket or body location the smartphone is currently carried.
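As a minimal sketch of this supervised approach (the window size, feature set, and class labels here are illustrative assumptions, not the exact pipeline), simple statistics over an accelerometer window can feed a k-nearest-neighbours classifier, matching the project's kNN tag:

```python
# Sketch: classify phone body location from IMU windows with kNN.
# Window length, features, and labels are assumed for illustration.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def window_features(accel):
    """accel: (n_samples, 3) accelerometer window -> feature vector."""
    return np.concatenate([accel.mean(axis=0),          # mean per axis
                           accel.std(axis=0),           # std per axis
                           [np.linalg.norm(accel, axis=1).mean()]])

rng = np.random.default_rng(0)
# Toy stand-ins for labelled 50-sample windows of two placements.
X = np.array([window_features(rng.normal(loc=l, size=(50, 3)))
              for l in (0.0, 1.0) for _ in range(20)])
y = ["front_pocket"] * 20 + ["bag"] * 20

clf = KNeighborsClassifier(n_neighbors=3).fit(X, y)
print(clf.predict([window_features(rng.normal(loc=1.0, size=(50, 3)))]))
```

In practice the real feature vectors would combine all IMU axes plus the light sensor, and labels would come from the annotated placements recorded during data collection.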
How we built it
During this 72-hour LauzHack we did:
- Study of smartphone usage habits: listing possible "placements" and "actions" to select the most relevant ones (and suggesting anthropological/cultural factors for PEPP-PT to consider)
- First data-collection test with the existing "hfalan - Sensor Log" app to choose the relevant sensors
- Data analysis
- Creation of our own "Indie-Pocket" app: user interface, coding, optimization, and getting it to run on Android and iOS
- Data collection with the "Indie-Pocket" app from 12 different users
- Machine learning: processing the data with PCA in the frequency domain for all sensor data
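The frequency-domain PCA step above can be sketched as follows (window length, sampling, and component count are assumptions for illustration): take the magnitude FFT of each sensor window, then reduce the spectra with PCA.

```python
# Sketch: frequency-domain PCA on sensor windows (assumed parameters).
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
windows = rng.normal(size=(40, 128))   # 40 toy windows of one sensor axis

# Magnitude spectrum of each window (frequency-domain representation).
spectra = np.abs(np.fft.rfft(windows, axis=1))   # shape (40, 65)

# Project the spectra onto their first few principal components.
pca = PCA(n_components=5)
components = pca.fit_transform(spectra)          # shape (40, 5)
print(components.shape)
```

Working in the frequency domain makes the features insensitive to where in the window a movement occurs, which suits periodic signals such as walking.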
Challenges we ran into
Ensuring homogeneity of the data before applying machine learning.
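One common way to tackle this (an assumption about the fix, not the team's exact pipeline) is to resample each sensor stream, which phones record at uneven rates, onto a common uniform time base and then standardise it:

```python
# Sketch: homogenise an irregularly sampled sensor stream by resampling
# onto a uniform grid (assumed 50 Hz) and z-scoring the result.
import numpy as np

def homogenise(t, values, rate_hz=50.0):
    """Resample (t, values) onto a uniform grid and standardise."""
    t_uniform = np.arange(t[0], t[-1], 1.0 / rate_hz)
    resampled = np.interp(t_uniform, t, values)   # linear interpolation
    return (resampled - resampled.mean()) / resampled.std()

# Irregularly sampled toy stream.
t = np.sort(np.random.default_rng(2).uniform(0, 10, 300))
u = homogenise(t, np.sin(t))
```

After this step every stream shares the same rate and scale, so windows from different users and sensors can be compared directly.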
Accomplishments that we're proud of
The platform can now be used worldwide and will continuously contribute to improving the model's accuracy.
- Larger study and data collection until Monday, April 13
- Machine learning
- Android/iOS dev
- Matlab, Python, R
- IMU, wearable sensors, biomechanics
Try It out
knn, machine-learning, supervised-learning