How can the power of AI be leveraged to take full advantage of diverse sensor-generated data?
Artificial intelligence (AI) is currently revolutionizing many aspects of our society. For example, by combining advances in data mining and deep learning, it is now feasible to use AI to analyze large volumes of data from various sources, identify patterns, provide actionable insights and make intelligent predictions.
One example of this innovative development is applying AI to sensor-generated data, and specifically to data gathered from smartphones and other consumer devices. Motion sensor data, together with other information such as GPS location, provides massive and diverse datasets. Therefore, the question is: “How can the power of AI be leveraged to take full advantage of these synergies?”
Motion data analysis
An illustrative real-world application would be to analyze usage data to determine what a smartphone user is doing at any given moment: sitting, walking, running or sleeping.
In this case, the benefits for a smart product are self-evident.
The heightened interest of smartphone manufacturers in these AI-enabled functions has highlighted the importance of recognizing simple activities such as steps, and this is likely to lead to more in-depth analysis of, for example, sports activities. In popular sports such as football, product designers will not focus solely on the athletes themselves, but will also provide benefits to coaches, fans, and even large corporations such as broadcasters and sportswear designers, who stand to profit from a deep level of insight that can accurately quantify, improve and predict sports performance.
Data acquisition and preprocessing
Having identified the business opportunity, the next logical step is to consider how these massive datasets can be effectively acquired.
In the activity tracking example, the required raw data is collected by axial motion sensors, e.g. accelerometers and gyroscopes, built into smartphones, wearables and other portable devices. The motion data is acquired along three axes (x, y, z) in an entirely unobtrusive way: movements are tracked and evaluated continuously, without requiring any effort from the user.
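To make this more concrete, the sketch below shows how continuous tri-axial accelerometer readings might be buffered into fixed-length windows for later processing. The sampling rate, window length, overlap and function names are illustrative assumptions rather than details of any particular device or product.

import numpy as np

# Illustrative values only: 50 Hz sampling and 2.56 s windows (128 samples)
# are common choices for smartphone motion sensors, not fixed requirements.
SAMPLE_RATE_HZ = 50
WINDOW_SIZE = int(SAMPLE_RATE_HZ * 2.56)   # 128 samples per window

buffer = []

def on_sensor_sample(ax: float, ay: float, az: float) -> None:
    # Called for every (x, y, z) accelerometer reading delivered by the device.
    buffer.append((ax, ay, az))
    if len(buffer) == WINDOW_SIZE:
        window = np.array(buffer)          # shape: (WINDOW_SIZE, 3)
        process_window(window)             # hand off to feature extraction
        del buffer[:WINDOW_SIZE // 2]      # keep 50% overlap between windows

def process_window(window: np.ndarray) -> None:
    # Placeholder for the preprocessing and classification steps described below.
    print("window ready:", window.shape)

Overlapping windows are a common design choice here, since they ensure that short movements falling on a window boundary are still captured in full by at least one window.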
Training the model
For supervised learning approaches to AI, labeled data is required to train a ‘model’, which the classification engine then uses to classify actual user behavior. For example, we can gather motion data from test users whom we know to be running or walking, and feed this labeled information to the model to help it learn.
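As a minimal sketch of this training step, the snippet below fits an off-the-shelf classifier with scikit-learn. The feature and label files are hypothetical placeholders; in practice they would hold one feature vector per recorded window together with the activity label assigned during the labeling sessions.

import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Hypothetical inputs: one row of precomputed features per window, plus a label
# such as "walking" or "running" collected from the test users.
X = np.load("features.npy")
y = np.load("labels.npy")

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

model = RandomForestClassifier(n_estimators=100, random_state=42)
model.fit(X_train, y_train)

print("held-out accuracy:", accuracy_score(y_test, model.predict(X_test)))

A random forest is used here purely for illustration; any supervised classifier, including a neural network, could be trained on the same labeled feature vectors.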
Since this is basically a one-time process, the user ‘labeling’ task can be performed with very simple apps and camera systems. Our experience indicates that the human error rate in labeling tends to decline as the number of samples collected increases. It therefore makes more sense to take a larger number of sample sets from a limited number of users than to take smaller sample sets from a larger number of users.
Getting the raw sensor data alone is not enough. We have observed that to achieve highly accurate classification, certain features need to be carefully defined, i.e. the system needs to be told which characteristics are important for distinguishing individual activity sequences from one another. The machine learning process is iterative, and during the preprocessing stage it is not yet evident which features will be most relevant. We therefore have to make educated guesses, based on domain knowledge, about the kind of information that may have an impact on classification accuracy.
For activity recognition purposes, an indicative feature could be a ‘filtered signal’, for example body acceleration (obtained by filtering the raw acceleration data from the sensor), or a ‘derived signal’ such as Fast Fourier Transform (FFT) coefficients or standard deviation values.
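As a minimal sketch of such feature engineering, the function below computes a few illustrative features from one window of tri-axial acceleration data: a high-pass-filtered body-acceleration signal, per-axis means and standard deviations, and the dominant FFT frequency. The filter order, cutoff frequency and choice of features are assumptions for illustration, not a prescribed feature set.

import numpy as np
from scipy.signal import butter, filtfilt

def extract_features(window: np.ndarray, fs: float = 50.0) -> np.ndarray:
    # 'Filtered signal': separate body acceleration from gravity with a
    # high-pass Butterworth filter (order and cutoff chosen for illustration).
    b, a = butter(3, 0.3 / (fs / 2), btype="high")
    body_acc = filtfilt(b, a, window, axis=0)

    # 'Derived signals': per-axis statistics plus the dominant frequency of
    # the body-acceleration magnitude, taken from its FFT spectrum.
    magnitude = np.linalg.norm(body_acc, axis=1)
    spectrum = np.abs(np.fft.rfft(magnitude))
    freqs = np.fft.rfftfreq(len(magnitude), d=1.0 / fs)
    dominant_freq = freqs[np.argmax(spectrum[1:]) + 1]   # skip the DC bin

    return np.concatenate([
        body_acc.mean(axis=0),          # mean body acceleration per axis
        body_acc.std(axis=0),           # standard deviation per axis
        [magnitude.mean(), magnitude.std(), dominant_freq],
    ])

Each window processed this way yields one fixed-length feature vector, which is exactly the kind of row the training step sketched above expects.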
For example, a dataset available from the UC Irvine Machine Learning Repository (UCI) defines 561 features, based on a group of 30 volunteers who performed six basic activities: standing, sitting, lying down, walking, walking downstairs and walking upstairs.
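For readers who want to experiment with that dataset, a minimal loading sketch could look as follows. The paths reflect the layout of the published archive but may need adjusting locally, and the numeric activity IDs are mapped to names via the label file shipped with the data.

import numpy as np

base = "UCI HAR Dataset"   # folder created by unpacking the downloaded archive

X_train = np.loadtxt(f"{base}/train/X_train.txt")   # one 561-feature row per window
y_train = np.loadtxt(f"{base}/train/y_train.txt")   # activity IDs 1..6

# Map numeric activity IDs to names, e.g. "1 WALKING".
with open(f"{base}/activity_labels.txt") as f:
    id_to_activity = dict(line.split() for line in f)

print(X_train.shape)
print(sorted(id_to_activity.values()))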