exploring the intersection of virtual reality and eeg data

Design and Augmented Intelligence Lab: Promoting Human-centered Architectural Design through Biometric Data and Virtual Response Testing 

my contributions:
  • Manage and train 10 other undergraduate and graduate students on the data collection process for VR and EEG experiments.

  • Write Unreal Engine code to optimize the user experience of the VR environments for both the researcher and the participant.

  • Clean and analyze biometric data measured by an electroencephalogram (EEG) system for Virtual Reality experiments evaluating spatial landmark recognition and classroom learning.

  • Assist in coding MATLAB functions to extract and analyze neurological data from brain waves.

  • Develop a strict protocol for, and conduct, further EEG and VR experiments evaluating hospital and retail design.

hospital and retail experiment brief:

Recent developments in architectural design encourage creativity and complexity, straying from traditional paradigms. As a result, it can be difficult to understand how people will behave in response to innovative designs that challenge established methods; users' reactions to an environment can be unpredictable, and once the building is constructed it may be too late to accommodate them. To address this issue, we will evaluate the human factors of stress, visual-spatial memory, and fatigue, along with participants' electrical brain signals, in virtual reality environments of hospital and retail settings.

Position: Data Collection Manager

creative work | User experience research:

*All images are graphical representations of the environments to preserve confidentiality.*

Introducing the Virtual Reality environment:

Participants must complete widgets by answering questions in the environments. We introduce them to the gameplay by telling them to think of the experiment as if they were playing in story mode: completing one task leads to the next. We support this by including realistic events that would occur in these spaces, just like a storyboard. By having participants complete tasks rather than explore freely, our VR environments give us a more accurate reflection of user experience.

Elevator theory:

Completing each widget is vital for moving on to the next task. Widgets pop up when they sense the player is nearby. We noticed that all pilot testing participants missed the elevator widget and had to be directed back because they would head straight for the open elevator. By moving the widget in front of the open elevator, we ensure that participants complete their task and move on to the next. In reality, users would not stop to complete tasks like these; however, interrupting participants and directing them to go back is more disruptive than letting them answer the questions naturally on their own.
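As a rough illustration of how a proximity-triggered question widget like this can be set up, here is a minimal Unreal Engine C++ sketch. It is only an assumption of how such a trigger might look; the class and property names (AQuestionTrigger, TriggerVolume, QuestionWidget) are illustrative choices of mine, not the lab's actual implementation.

// QuestionTrigger.h (illustrative sketch only, not the lab's actual code)
// A box volume detects when the participant walks nearby and pops up the
// question widget placed at that location. Requires the UMG module.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "GameFramework/Pawn.h"
#include "Components/BoxComponent.h"
#include "Components/WidgetComponent.h"
#include "QuestionTrigger.generated.h"

UCLASS()
class AQuestionTrigger : public AActor
{
    GENERATED_BODY()

public:
    AQuestionTrigger()
    {
        // Invisible volume placed directly in the participant's path,
        // e.g. in front of the open elevator.
        TriggerVolume = CreateDefaultSubobject<UBoxComponent>(TEXT("TriggerVolume"));
        RootComponent = TriggerVolume;

        // World-space widget that holds the question; hidden until triggered.
        QuestionWidget = CreateDefaultSubobject<UWidgetComponent>(TEXT("QuestionWidget"));
        QuestionWidget->SetupAttachment(RootComponent);
        QuestionWidget->SetVisibility(false);
    }

protected:
    virtual void BeginPlay() override
    {
        Super::BeginPlay();
        TriggerVolume->OnComponentBeginOverlap.AddDynamic(this, &AQuestionTrigger::OnPlayerNearby);
    }

    UFUNCTION()
    void OnPlayerNearby(UPrimitiveComponent* OverlappedComp, AActor* OtherActor,
                        UPrimitiveComponent* OtherComp, int32 OtherBodyIndex,
                        bool bFromSweep, const FHitResult& SweepResult)
    {
        // Only react to the participant's pawn, then show the question.
        if (Cast<APawn>(OtherActor))
        {
            QuestionWidget->SetVisibility(true);
        }
    }

    UPROPERTY(VisibleAnywhere)
    UBoxComponent* TriggerVolume;

    UPROPERTY(VisibleAnywhere)
    UWidgetComponent* QuestionWidget;
};

In a setup like this, keeping the trigger volume and the widget on one actor means that repositioning the question, for example directly in front of the open elevator, only requires moving a single actor in the level.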


Environment in pilot testing

Environment in final data collection

We want to evaluate the environmental effects of the hospital and retail spaces, not participants' psychological or behavioral reactions to the researchers and their instructions!

Environment in pilot testing

Environment in final data collection

Verbal responses from participants improved in the final data collection condition compared to the pilot testing condition thanks to the improved ergonomics. There were no more complaints about being unable to see the products, and participants spent much less time trying to see the widget when it was directly in front of them rather than off to the side.

Widget placement:

In the retail experiment, participants look at products in the aisles while sitting in a chair. Movement around this environment is limited to small sections of the floor in each aisle. Many pilot participants noted that they could not see the products clearly and had to lean in because they could not get closer; this pulled on the EEG wires and introduced noise into the brain signal data. To view the widget at the end, they had to turn their heads 90 degrees to see it and answer. At that angle, they were much closer to the numbers 1-5 on the scale, which could bias their answers. Our final data collection file enlarges the teleportation area and places the widget in immediate view, so the participant can see clearly and does not need to turn their head.


How the user interface affects the testing environment:

For the accuracy of the experiment and the clarity of the user interface, answering the questions on each widget should feel intuitive and distinct. The user interfaces (UI) of the question widgets are shown below. Note that these pictures are animations; to see the interaction, press the play button on the bottom right.

User Interface during pilot testing

User Interface for final data collection

Pilot participants often noted that they couldn't tell when they had answered a question, because the continue button stayed the same and the only indication of change was the question and scale. They would often click continue multiple times and skip questions. The slider also defaulted to a position between 5 and 6, but there is no 5.5 value on the scale, so researchers were unsure how to code that as an answer.

I re-coded the functionality of each widget in Unreal Engine to make the UI clearer and more accurate. For every question, the participant must click away from the default 5.5 position in order to continue, and they can only click directly on each number, so every answer is unambiguous. Once they click an answer or finish changing it, the continue button appears and they can go on to the next task.

Other necessary changes to the user interface applied the same strategy of making buttons disappear and reappear as questions are answered. During pilot testing, we tried changing the button's color when hovered over. However, due to the nature of VR and the occasional glitch, the color change would often flash and was barely perceptible in the environment.

Making the buttons disappear instead made it clear to participants that they had successfully answered a question.
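The sketch below shows, in simplified Unreal Engine C++, the kind of widget logic described above: no pre-selected answer, a continue button that only appears after a discrete rating is clicked, and a widget that disappears once the task is done. The names (UQuestionWidget, ContinueButton, OnRatingSelected) are my own illustrative choices rather than the lab's actual code.

// QuestionWidget.h (illustrative sketch only; names are assumptions, not the lab's code)
#pragma once

#include "CoreMinimal.h"
#include "Blueprint/UserWidget.h"
#include "Components/Button.h"
#include "QuestionWidget.generated.h"

UCLASS()
class UQuestionWidget : public UUserWidget
{
    GENERATED_BODY()

public:
    // Bound to the OnClicked event of each numbered rating button in the widget Blueprint.
    UFUNCTION(BlueprintCallable)
    void OnRatingSelected(int32 Rating)
    {
        SelectedRating = Rating;
        // Revealing Continue only after a clear, discrete choice tells the
        // participant that their answer was registered.
        ContinueButton->SetVisibility(ESlateVisibility::Visible);
    }

    UFUNCTION(BlueprintCallable)
    void OnContinueClicked()
    {
        // Hiding the whole widget signals that the task is complete; the recorded
        // rating would be handed off to the experiment log at this point.
        SetVisibility(ESlateVisibility::Collapsed);
    }

protected:
    virtual void NativeConstruct() override
    {
        Super::NativeConstruct();
        // Continue starts hidden, so clicking it repeatedly can no longer skip questions.
        ContinueButton->SetVisibility(ESlateVisibility::Hidden);
    }

    // Button named "ContinueButton" in the widget Blueprint (BindWidget).
    UPROPERTY(meta = (BindWidget))
    UButton* ContinueButton;

    int32 SelectedRating = 0;
};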

creative work | problem solving with code:

The Design and Augmented Intelligence Lab gave me the opportunity to apply what I have learned in my computer science courses to analyzing data with MATLAB and improving user experience with Unreal Engine.

Working on graphics and updating this page now!