Friday, August 17, 2012

Sidetrack: Sensor Fusion

At Finwe Ltd. we have experts whose core competence is related to context awareness and adaptation, but also professionals who specialize in sensors, physics modelling and 3D graphics. While this blog has so far mostly discussed the former topic, we'll now take a peek into the lab next door and see what's brewing in their kettles.

Everyone's learned at school that humans have five primary senses. But did you know that the smartphone in your pocket probably has many more? Let's count...

Similar to human senses, the sensors in a smartphone make it more aware of its environment. For example, MEMS sensors such as the accelerometer (1), magnetometer (2) and gyroscope (3) are used for movement and orientation tracking; most notably, they are responsible for rotating the screen between portrait and landscape orientations. Light-based sensors, such as the front (4) and back cameras (5), the proximity sensor (6) and the ambient light sensor (7), are standard components in virtually all smartphones; they allow taking nice pictures, turning off the display during phone calls, and adjusting screen brightness automatically. Some models can even sense features of the ambient environment with a temperature sensor (8) and an air pressure sensor (9). The microphone (10) provides audio input, the touch panel (11) pointing input, and even short-range radios such as NFC (12), Bluetooth (13) and WiFi (14) can be used as sensors.

That's already fourteen sensors. Yet there's more to it: you can measure multiple things with a single sensor component, or combine several physical sensors to work in concert and produce different kinds of data. As a result, you get a bunch of virtual sensor channels as a bonus. With all that information pouring in, why aren't smartphones truly smart? Because it's what you do with the information that counts. A monkey can see the same things you can, but it's still a monkey, right?
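As a small illustration of a virtual sensor channel, here is a hedged sketch (the function name and axis conventions are our own, not from any particular platform API) of how a single 3-axis accelerometer can be turned into a tilt sensor: when the device is stationary, the only acceleration it measures is gravity, so the direction of the measured vector reveals pitch and roll.

```python
import math

def tilt_from_accel(ax, ay, az):
    """Derive pitch and roll (in degrees) from a 3-axis accelerometer
    reading, assuming the device is stationary so that the measured
    vector is pure gravity. Axis convention (an assumption here):
    x = right, y = toward the top of the screen, z = out of the screen."""
    pitch = math.degrees(math.atan2(-ax, math.sqrt(ay * ay + az * az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll

# Device lying flat on a table: gravity along +z only, so no tilt.
pitch, roll = tilt_from_accel(0, 0, 9.81)
```

The moment the user shakes or moves the phone, the stationarity assumption breaks and this "virtual tilt sensor" reports garbage; that limitation is exactly what sensor fusion addresses below.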

One of the most compelling topics in sensing is sensor fusion, i.e. combining input from multiple sensors to create something better than any of them alone. This requires fairly involved math, where a tiny error can throw the whole estimate off. But when it works correctly, it can do pretty amazing things. Sensor components are not ideal performers and typically measure only a single physical phenomenon, such as linear acceleration or magnetic field. If movement tracking is based on data from a single sensor type, it is fairly easy to come up with movements that break the illusion. However, when multiple sensors work together, they can complement each other and truly make a big difference to a product.
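To make the "complement each other" idea concrete, here is a minimal sketch of one classic fusion technique, a complementary filter (our illustrative example, not necessarily the algorithm used in the demo described below). The gyroscope gives a smooth, fast angle estimate but drifts over time; the accelerometer gives an absolute but noisy angle. Blending the two gives an estimate that is both responsive and drift-free:

```python
def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """One filter step: integrate the gyro rate (smooth and responsive,
    but accumulates drift) and nudge the result toward the
    accelerometer's absolute angle (noisy, but drift-free).
    alpha controls how much we trust the gyro over the accelerometer."""
    return alpha * (angle + gyro_rate * dt) + (1.0 - alpha) * accel_angle

# The estimate has drifted 5 degrees off; with the gyro at rest, the
# accelerometer term slowly pulls it back toward the true angle of 0.
angle = 5.0
for _ in range(200):
    angle = complementary_filter(angle, gyro_rate=0.0, accel_angle=0.0, dt=0.01)
```

Production-grade fusion typically uses a Kalman filter instead, but the principle is the same: each sensor covers for the other's weakness.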

Our specialists are preparing a free technology demo: a compass application that combines 3D accelerometer, 3D magnetometer and 3D gyroscope data into a so-called 9-DOF setup. Sensor fusion algorithms mix the sensor inputs together in a clever way, and as a result the compass is stable, accurate, and responds quickly to movement in a way not possible with a single sensor. Fast 3D graphics and some physics modelling give the finishing touch. The app is already developed and polished to state-of-the-art quality; we will release it shortly after the testing phase is finished. There's also a YouTube video in the works, as motion-tracking apps really need to be seen with your own eyes before they can be appreciated. Nevertheless, here's a screenshot as a teaser.
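The core of any compass, fused or not, is turning a magnetometer reading into a heading. As a hedged sketch (the axis convention is an assumption, and this is deliberately the naive single-sensor version, not the demo's 9-DOF algorithm), here is the heading computation for a device held perfectly level:

```python
import math

def heading_deg(mx, my):
    """Compass heading in degrees clockwise from magnetic north,
    from a horizontal magnetometer reading. Assumed axis convention:
    x = right, y = toward the top of the screen (the 'forward' axis).
    Only valid when the device is level; a 9-DOF fusion system would
    first tilt-compensate mx and my using the accelerometer and gyro."""
    return math.degrees(math.atan2(-mx, my)) % 360.0

# Device top pointing at magnetic north: field along +y, heading 0.
north = heading_deg(0.0, 1.0)
# Device top pointing east: north is now to the device's left (-x).
east = heading_deg(-1.0, 0.0)
```

The "held perfectly level" restriction is exactly why the naive compass breaks the moment you tip the phone, and why fusing in the accelerometer and gyroscope makes such a visible difference in the demo.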
