Many devices can measure acceleration and rotation: every smartphone, the Tinkerforge IMU, or even sensor tags for Car-2-X communication, provided they are equipped with such sensors. The interesting question is how to extract information from the raw sensor data. Take Car-2-X communication as an example: today's advanced driver assistance systems (ADAS) can only 'see' a potential collision within their line of sight, as shown in this video:
If, for example, a child crosses the street from behind a bus, the car's camera-based collision avoidance system cannot 'see' the approaching child, because it is hidden behind the bus.
City of the Future: Smart and Connected
What if a sensor tag could broadcast that somebody or something is walking behind the bus toward the street where the car is approaching? What if a device could detect 'biking' or 'running' in the direction of the car? Life-saving!
Modern Car-2-X communication infrastructure can share movement information between traffic lights, cars, bikes and people in a city. If the system can determine the position (via GPS or sensor networks) and the direction of movement, it can calculate a possible collision.
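To make this concrete, here is a minimal sketch of such a collision check, assuming each participant broadcasts a 2D position and a constant velocity (the original post does not specify the algorithm; the function names and the 2 m safety radius are illustrative assumptions):

```python
import math

def time_to_closest_approach(p1, v1, p2, v2):
    """Time at which two objects moving with constant velocity are closest.

    p1, p2: (x, y) positions in metres; v1, v2: (vx, vy) velocities in m/s.
    """
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    dvx, dvy = v2[0] - v1[0], v2[1] - v1[1]
    dv2 = dvx * dvx + dvy * dvy
    if dv2 == 0.0:                      # identical velocities: gap never changes
        return 0.0
    # minimise |relative position + t * relative velocity|, clamp to the future
    return max(0.0, -(dx * dvx + dy * dvy) / dv2)

def collision_possible(p1, v1, p2, v2, radius=2.0):
    """True if the two tracks come within `radius` metres of each other."""
    t = time_to_closest_approach(p1, v1, p2, v2)
    cx = (p2[0] + v2[0] * t) - (p1[0] + v1[0] * t)
    cy = (p2[1] + v2[1] * t) - (p1[1] + v1[1] * t)
    return math.hypot(cx, cy) <= radius

# A car driving east meets a pedestrian stepping onto the street 50 m ahead:
print(collision_possible((0, 0), (10, 0), (50, 5), (0, -1)))   # → True
# The same pedestrian walking away from the street is harmless:
print(collision_possible((0, 0), (10, 0), (50, 5), (0, 1)))    # → False
```

This is exactly why the kind of movement matters: the check only works if the system knows a plausible velocity vector for the pedestrian, which depends on whether they are standing, walking or running.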
Activity Classification using Machine Learning
The hard part is predicting the kind of movement (standing, walking, running, biking), because this is essential input for the collision prevention algorithms in the car. We have developed a real-time classification algorithm that can predict whether the device's carrier is walking, running, sitting or riding a bike.
We used sensor data from an iPhone to collect recordings of these four activities and developed a classifier based on rotation rates and accelerometer data. The activity classification reaches an accuracy of about 97% in predicting the correct activity in real time.
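The post does not publish the classifier itself, but the typical pipeline for this task can be sketched as follows: cut the accelerometer and gyroscope streams into short windows, compute orientation-independent statistics per window, and feed those features to a classifier. Everything below (window size, feature choice, the nearest-centroid model) is an illustrative assumption, not the authors' implementation:

```python
import numpy as np

def extract_features(accel, gyro, window=50):
    """One feature vector per non-overlapping window of raw sensor samples.

    accel, gyro: arrays of shape (n_samples, 3) — x/y/z axes.
    """
    feats = []
    for start in range(0, len(accel) - window + 1, window):
        a = accel[start:start + window]
        g = gyro[start:start + window]
        # signal magnitude removes dependence on how the phone is oriented
        a_mag = np.linalg.norm(a, axis=1)
        g_mag = np.linalg.norm(g, axis=1)
        feats.append([a_mag.mean(), a_mag.std(), g_mag.mean(), g_mag.std()])
    return np.array(feats)

class NearestCentroid:
    """Minimal classifier: assign each window to the closest class centroid."""
    def fit(self, X, y):
        self.labels_ = sorted(set(y))
        self.centroids_ = np.array(
            [X[np.array(y) == label].mean(axis=0) for label in self.labels_])
        return self
    def predict(self, X):
        dists = np.linalg.norm(
            X[:, None, :] - self.centroids_[None, :, :], axis=2)
        return [self.labels_[i] for i in dists.argmin(axis=1)]

# Synthetic stand-in data: sitting is nearly still, running shakes the phone.
rng = np.random.default_rng(0)
sit = extract_features(rng.normal(0, 0.05, (200, 3)),
                       rng.normal(0, 0.05, (200, 3)))
run = extract_features(rng.normal(0, 2.0, (200, 3)),
                       rng.normal(0, 2.0, (200, 3)))
clf = NearestCentroid().fit(np.vstack([sit, run]),
                            ["sitting"] * len(sit) + ["running"] * len(run))
new_window = extract_features(rng.normal(0, 0.05, (50, 3)),
                              rng.normal(0, 0.05, (50, 3)))
print(clf.predict(new_window))   # → ['sitting']
```

In practice one would use richer features (frequency-domain energy, axis correlations) and a stronger model, but the windowed-features structure is what makes prediction possible in real time: each new window can be classified as soon as it is complete.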
Watch it here: