Biosensor Integration and Biosignal Processing for User Experience and Research

Biosignals, such as signals obtained from psychophysiological measures and eye-trackers, provide objective data for user research. Capturing these data during product usage can help researchers better understand the physical, mental, and emotional state of a user in real time and provide valuable insight into user experience. This quantitative approach allows the detection of subtle or fleeting emotional reactions that elude self-report, either because participants are unwilling or unable to articulate them or because participants are unaware of them.

Incorporating bioelectric/psychophysiological sensors (i.e., devices that convert physiological changes into electrical signals) and biosignal processing into user experience studies can benefit designers and engineers by improving product appeal, enhancing product usability, and acting as a "window to the mind." This is particularly important for social media content, advertising, gaming, and movie/video streaming, to name a few examples where it is key to provide users with appealing and stimulating content. Integrating relevant biosensors into such research can help target critical product features for improvement.

Technology and Techniques

Exponent, at its advanced Phoenix User Research Center (PURC), has developed a strong experimental framework integrating psychophysiology, eye-tracking, and video-capture technologies with automated and semi-automated data analysis algorithms to extract relevant metrics rapidly and efficiently and to correlate them with user experience in various conditions and across diverse population demographics.

We have used our framework to develop numerous approaches to pinpoint and improve critical product parameters. Our work has demonstrated that monitoring biosignal activity can be valuable for designers and engineers of a variety of products with which users interact and for which a real-time assessment of user experience and usability is desired.

Examples of Our Framework in Action

Exponent has established protocols for seamlessly executing user research studies that implement biosensor data collection and for developing custom algorithms for feature detection/extraction, enabling larger population sample sizes (several hundred participants) and scalable analyses. Below are examples of our framework in action; the framework is not limited to these examples and can be applied to a multitude of other applications.
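To illustrate what automated feature detection/extraction from a biosignal can look like, the sketch below detects R-peaks in an ECG trace and derives mean heart rate. This is a minimal, illustrative stand-in (simple amplitude thresholding with a refractory period, run on a synthetic trace), not Exponent's actual algorithm; production pipelines typically use more robust detectors such as Pan-Tompkins.

```python
import numpy as np

def detect_r_peaks(ecg, fs, threshold_ratio=0.6, refractory_s=0.25):
    """Detect R-peaks via amplitude thresholding plus a refractory period.
    Illustrative only; robust detectors (e.g., Pan-Tompkins) are preferred in practice."""
    threshold = threshold_ratio * np.max(ecg)
    refractory = int(refractory_s * fs)
    peaks, last = [], -refractory
    for i in range(1, len(ecg) - 1):
        # local maximum above threshold, outside the refractory window
        if ecg[i] >= threshold and ecg[i] >= ecg[i - 1] and ecg[i] > ecg[i + 1]:
            if i - last >= refractory:
                peaks.append(i)
                last = i
    return np.array(peaks)

def heart_rate_bpm(peaks, fs):
    """Mean heart rate (beats per minute) from R-peak sample indices."""
    rr_s = np.diff(peaks) / fs  # R-R intervals in seconds
    return 60.0 / np.mean(rr_s)

# Synthetic ECG-like trace: one spike per second (60 bpm) on low-amplitude noise
fs = 250
t = np.arange(0, 10, 1 / fs)
ecg = 0.05 * np.random.default_rng(0).standard_normal(t.size)
ecg[::fs] += 1.0  # inject an "R-peak" every second

peaks = detect_r_peaks(ecg, fs)
print(round(heart_rate_bpm(peaks, fs)))  # → 60
```

In a real study, the extracted R-R intervals would feed downstream metrics such as heart rate variability rather than just mean heart rate.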

1. Media Content Evaluation Using Biosignals and Eye Movements

Fig 1. Biosignal Integration

Exponent has evaluated participants' emotional and physical responses to different media content. To do so, participants were connected to a suite of psychophysiological recording devices. Specifically, cardiac activity such as heart rate and heart rate variability (electrocardiogram [ECG]), galvanic skin response (electrodermal activity [EDA]), and muscle activation (electromyography [EMG] and facial electromyography [fEMG]) were monitored and analyzed. Participants also wore a head-mounted eye-tracker while performing different activities with the content. Data from the psychophysiological recording devices and the eye-tracker were followed up with post-exposure self-report measures indexing the participants' subjective experiences. Several measures were then obtained automatically or semi-automatically through software custom-built by Exponent. By synchronizing the psychophysiological data feeds with the eye-tracker data, we created a quasi-event-related analysis, allowing researchers to observe exactly what participants were viewing and how they were responding to it. Through this methodology, key metrics were defined and provided to improve products.
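The core of a quasi-event-related analysis is extracting signal epochs time-locked to events of interest (here, hypothetical gaze fixations) and averaging them. The sketch below shows one simple way this can be done, assuming the biosignal and eye-tracker timestamps have already been synchronized onto a shared clock; the signal, event times, and window lengths are illustrative, not Exponent's actual parameters.

```python
import numpy as np

def event_locked_epochs(signal, sig_times, event_times, pre_s=1.0, post_s=3.0):
    """Extract baseline-corrected signal epochs around each event time.
    Assumes sig_times and event_times share a synchronized clock."""
    fs = 1.0 / np.median(np.diff(sig_times))  # estimate sampling rate
    pre, post = int(pre_s * fs), int(post_s * fs)
    epochs = []
    for et in event_times:
        i = int(np.searchsorted(sig_times, et))  # sample nearest the event onset
        if i - pre >= 0 and i + post <= len(signal):
            seg = signal[i - pre:i + post]
            epochs.append(seg - seg[:pre].mean())  # baseline-correct on pre-event window
    return np.array(epochs)

# Synthetic example: 4 Hz EDA trace with a response ~1 s after each fixation event
fs = 4.0
sig_times = np.arange(0, 60, 1 / fs)
eda = np.zeros_like(sig_times)
events = [10.0, 25.0, 40.0]
for et in events:
    onset = int(et * fs)
    eda[onset + 4:onset + 8] += 0.5  # skin-conductance rise after the event

epochs = event_locked_epochs(eda, sig_times, events)
avg = epochs.mean(axis=0)  # average event-locked response across events
print(avg.max())  # → 0.5
```

Averaging across many events in this way suppresses unrelated signal fluctuations, so the stimulus-linked response stands out even in noisy physiological data.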

2. Using Biosignals to Monitor Driver and Passenger State

In addition to using biosignals to quantify user attitudes regarding the use of products, we can also use biosignals to monitor the physical and mental state of drivers and passengers in vehicles. In previously published work, we described a large-scale, on-road eye- and head-tracking study conducted in an instrumented test vehicle to understand and assess human behavior in a naturalistic driving environment. We are continually augmenting our driver studies with additional biosensors to study a wide range of driver and passenger attributes. For example, our methodologies can be applied to address questions related to the use of autonomous vehicles. Specifically, they can be used to monitor passengers for the potential onset of sickness. These data can then inform designers' and engineers' development of mitigation strategies to improve passenger comfort. In addition to monitoring passenger comfort, our methodologies can also be used to monitor the mental workload of drivers and passengers, which will be critical as secondary tasks become commonplace among autonomous vehicle occupants.

Fig 2. Biosensor Integration