Neuro-Visual Insights in Mobile User and Shopper Engagement - by SMI and Emotiv
BERLIN, July 8, 2013 /PRNewswire/ --
- An integrated toolkit makes it possible to analyze the subconscious and affective responses of users and consumers in real-world environments - with mobile eye tracking data from SMI Eye Tracking Glasses and brain response data from the Emotiv EEG Neuroheadset
Get full insights into users' engagement in real-world environments and shoppers' decision making - with the new mobile neuro-visual toolset combining eye tracking data from SensoMotoric Instruments (SMI) and EEG data from Emotiv. Eye movements recorded with the mobile SMI Eye Tracking Glasses and brain responses recorded with the wireless Emotiv EEG Neuroheadset are captured and synchronized in SMI's iView ETG software. The toolset will be presented at the UXPA trade show, the leading venue for usability professionals, taking place this week in Washington, D.C., USA.
The unobtrusive set of mobile technologies lets users interact naturally with content on their mobile or other devices, and lets consumers move freely within a store, while their subconscious and affective responses are efficiently captured. Usability experts can thus assess the visual and neural engagement of users as they interact with content on their mobile devices. Market researchers can analyze the stages and motivations of shoppers' decision making. Applied neuroscientists can also conduct mobile neuro-rehabilitation studies with clinical subject groups.
An integrated SMI user interface provides real-time visualization of both eye tracking and EEG data, as well as in-depth analysis of the data streams synchronized to a common timestamp. The SMI BeGaze analysis software shows individual patterns of attention along with the corresponding emotional states. Moreover, SMI BeGaze aggregates and averages eye tracking and EEG data over several trials, making it possible to identify trends in the behavior of user and consumer groups. Findings can be visualized by creating heatmaps of eye movement data together with presentations of the dynamic emotion measures calculated by the Emotiv Affectiv™ Suite. The software also allows eye tracking measures to be exported together with synchronized raw EEG streams.
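The core of such an integration is aligning the two data streams on a shared clock. As an illustration only (the SMI and Emotiv software is proprietary; the function name, data layout, and sample rates below are assumptions, not the vendors' API), a nearest-neighbor synchronization of a gaze stream and an EEG stream to a common timestamp could be sketched as:

```python
from bisect import bisect_left

def sync_streams(gaze, eeg):
    """Pair each gaze sample with the EEG sample nearest in time.

    gaze: list of (timestamp_ms, x, y) tuples, sorted by timestamp
    eeg:  list of (timestamp_ms, value) tuples, sorted by timestamp
    Returns a list of (timestamp_ms, x, y, eeg_value) records.
    """
    eeg_times = [t for t, _ in eeg]
    synced = []
    for t, x, y in gaze:
        i = bisect_left(eeg_times, t)
        # pick the closer of the two neighboring EEG samples
        if i == 0:
            j = 0
        elif i == len(eeg_times):
            j = len(eeg_times) - 1
        else:
            j = i if eeg_times[i] - t < t - eeg_times[i - 1] else i - 1
        synced.append((t, x, y, eeg[j][1]))
    return synced

# Synthetic example: ~30 Hz gaze samples matched against ~125 Hz EEG samples
gaze = [(0, 0.5, 0.5), (33, 0.6, 0.4), (66, 0.7, 0.3)]
eeg = [(t, t) for t in range(0, 80, 8)]  # EEG "values" are just placeholders
print(sync_streams(gaze, eeg))
```

Real recordings would additionally need a shared clock reference or trigger events to correct for drift between the two devices, which is the part the integrated software handles for the user.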
Olivier Oullier, Ph.D., Professor, Behaviour, Brain & Cognition Institute, Aix-Marseille University & CNRS: "It is great news that the integration of SMI Eye Tracking and Emotiv EEG data has been implemented for the mobile SMI Eye Tracking Glasses as well. This enables us to analyze behavior in public or industrial environments, e.g., assessing how people cope with critical tasks in safety situations. By merging information on visual attention and affective responses, we can learn a lot more about performance under stress, but also about the decision-making of various kinds of consumers, such as people suffering from obesity, in retail stores and other environments."
Tan Le, CEO of Emotiv Lifesciences & Co-Founder of Emotiv: "Emotiv is excited to partner and collaborate with SMI. We have always believed in the synergies between eye tracking and EEG. By bringing together our unique technologies in a fully integrated solution, we can offer researchers worldwide a new dimension to their research endeavors."
Christian Villwock, Business Director, SensoMotoric Instruments: "We have successfully introduced the integration of our monitor-based SMI RED-m device with the Emotiv headset to the market. Now, we are excited to add another level of mobility to this perfect match of eye tracking and EEG technologies by implementing this solution for our mobile SMI Eye Tracking Glasses."
SOURCE SensoMotoric Instruments GmbH