Samuel Cheng
UX Designer & Engineer
Summary

This project explored the use of electrical muscle stimulation (EMS, or e-stim) as a technique for building human-computer interfaces. We developed kEMStree, a virtual reality-based education tool for students starting to learn basic chemistry. It uses EMS to guide users as they build molecules and lets them "feel" the interactions. We hope that this force feedback improves the learning experience for students and gives them a deeper understanding of the subject matter.

Details:

  • August - October 2016
  • Team of 4
  • Accepted into the UIST 2016 Student Innovation Competition
  • Literature review, iterative design, Unity3D (game logic and atom behavior), C#
Introduction

Conventional UX processes generally start with understanding the user by conducting various forms of user research and understanding the problem or pain points through methods like task analysis, affinity mapping, and contextual inquiry. In this project, however, we did what is generally frowned upon: we started with the technology and asked "how can this be used?" Our purpose was not to circumvent the conventional wisdom of the UX cycle but to engage in a more exploratory project with novel user interfaces in which the computer itself can control human body movements.

EMS is a technique that elicits muscle contraction by placing a pair of electrode pads on various areas of the body. The electrode pads are connected to a channel on an FDA-approved medical device that acts as a signal generator and sends an electrical current through one pad and onto the skin, activating motor nerves and causing muscle tissue to contract as if the brain had sent the impulse itself. Dials on the device control the intensity of the current sent through the pads.


EMS is widely used in medical-related fields such as sports therapy and physical therapy, for muscle training, weight loss, or therapeutic effects. Though some research has explored EMS in the design of computer interfaces, it is not nearly as mainstream as technologies like augmented reality or wearable computing. In this project, we wanted to explore ways to incorporate EMS into a human-computer interface.
Ideation

In order to discover how to best incorporate EMS as a user interface, we discussed questions such as:

  • Where on the body could we place electrodes?
  • What muscles would it actuate?
  • In what scenarios would actuating user movement be necessary or important?
  • Why would a user want force feedback and how would they benefit from it?
In general, we wanted to avoid placing electrodes on opposite sides of the body (e.g. one on each arm) because we did not want an electric current running through the heart. We also wanted to avoid places on the body that are more intimate or difficult to access, such as the chest, stomach, thigh, or back. Limbs are generally considered safe placement sites.

We experimented with different muscle groups to see what motions we could trigger. We also briefly considered electrode placement on the neck but thought it best to avoid it for now.

We also discussed the scenarios in which a user might benefit from movement and several topics including exercise, muscle memory, and repetition came up. Through our literature review, we found that people generally associate movement with better memory and retention of learned material. We wanted to capitalize on this finding and see how we could incorporate movement into various forms of learning.

As we continued brainstorming, we talked about having users "feel" various concepts in learning physics. We discussed how to incorporate a haptic component to learning things like Newton's Laws, mechanics equations, and magnetism. From that point, we converged on an idea of feeling interactions which ultimately led us to an idea involving learning chemistry.

We wanted to build a system for students learning chemistry (i.e. middle school or high school students) that would allow them to augment their learning with the use of force feedback.
Hardware Specifications

In order to get started with EMS, we obtained an EMS toolkit consisting of an Arduino Nano-based hardware control module (capable of modulating the amplitude of EMS signals) with a Bluetooth Low Energy (BLE) chip. The toolkit integrates off-the-shelf EMS devices as signal generators and provides a simple communication protocol for connecting to ordinary mobile devices.

This toolkit comes with an API allowing connection from different environments. Our Unity3D application uses this API to connect to the EMS hardware module and send signals to the electrode pads at various points throughout the game.
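To give a sense of how the game talks to the hardware, here is a minimal sketch of the kind of wrapper we used. The class and method names (IEmsDevice, EmsController, SetIntensity, Stop) are placeholders, since the real names depend on the toolkit's API; the point is simply that the game pulses a given channel for a short duration.

```csharp
using System.Collections;
using UnityEngine;

// Hypothetical wrapper around the EMS toolkit's API; treat the names below as
// placeholders for whatever the toolkit actually exposes.
public interface IEmsDevice
{
    void SetIntensity(int channel, float intensity); // 0..1, scaled by the module
    void Stop(int channel);
}

public class EmsController : MonoBehaviour
{
    public IEmsDevice device; // assigned once the BLE connection is established

    // Fire a short pulse on one pair of electrode pads.
    public IEnumerator Pulse(int channel, float intensity, float seconds)
    {
        device.SetIntensity(channel, intensity);
        yield return new WaitForSeconds(seconds);
        device.Stop(channel);
    }
}
```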

The Unity3D virtual reality application was deployed to an Android smartphone and viewed through a Google Cardboard headset. In addition, a Leap Motion hardware device was required for 3D hand tracking; it was connected to the Android phone via a USB-to-USB-C adaptor, and the phone in turn connected to the EMS module over Bluetooth. The phone, loaded with the Unity3D application, was then placed into the Google Cardboard headset.

To achieve this level of interoperability, the GoogleVR SDK for Unity was required to adapt our Unity3D application to VR, and the Leap Motion Unity SDK was required to incorporate hand tracking into our VR game.
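As a rough illustration of the hand-tracking side, the sketch below reads tracked hands from the Leap Motion Unity SDK and detects a pinch gesture of the kind we used for selecting atoms. Property names such as CurrentFrame and PinchStrength follow the SDK version we worked with and may differ in other releases; the pinch threshold is an assumed value.

```csharp
using Leap;
using Leap.Unity;
using UnityEngine;

// Sketch: detect a pinch with the Leap Motion Unity SDK and treat it as a
// "grab" in the game. The actual atom-selection logic is game-specific.
public class PinchSelector : MonoBehaviour
{
    public LeapProvider provider;                    // scene component from the SDK
    [Range(0f, 1f)] public float pinchThreshold = 0.8f;

    void Update()
    {
        Frame frame = provider.CurrentFrame;
        foreach (Hand hand in frame.Hands)
        {
            if (hand.PinchStrength > pinchThreshold)
            {
                // Grab whichever atom this hand is currently touching.
                Debug.Log((hand.IsRight ? "Right" : "Left") + " hand pinched");
            }
        }
    }
}
```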
Unity3D Application

The Unity game was built in C# using basic built-in GameObjects along with various third-party assets such as a LowPoly asset pack, textures, and music. As described above, packages were installed to incorporate Leap Motion, Google Cardboard VR, and the EMS hardware module into the game.

Upon donning the headset, users are presented with a 360-degree view of the game world. Using their gaze, users can click the "Start" button. They can then select from various substances, such as ocean water or rocks, to learn the basic building blocks of each.
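The gaze-based "click" works by dwelling on a target. Our shipped version relied on the GoogleVR reticle and event system, but the simplified sketch below shows the underlying idea: cast a ray from the headset camera and fire the button once the user has looked at it long enough. The dwell time and event wiring here are illustrative.

```csharp
using UnityEngine;
using UnityEngine.Events;

// Simplified gaze selection: a ray from the camera must stay on this object
// for dwellSeconds before the button's action fires.
public class GazeButton : MonoBehaviour
{
    public float dwellSeconds = 2f;
    public UnityEvent onSelected;   // e.g. load the molecule builder scene

    private float gazeTimer;

    void Update()
    {
        Transform cam = Camera.main.transform;
        RaycastHit hit;
        bool looking = Physics.Raycast(cam.position, cam.forward, out hit) &&
                       hit.collider.gameObject == gameObject;

        gazeTimer = looking ? gazeTimer + Time.deltaTime : 0f;

        if (gazeTimer >= dwellSeconds)
        {
            onSelected.Invoke();
            gazeTimer = 0f;
        }
    }
}
```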

If "ocean" is selected, the user is presented with the molecule builder area which asks him or her to build molecules of water and salt. Five different colored bins representing different atoms are available to be selected. The user can position their hands over the Leap Motion device and select different atoms to bond together. If two atoms are selected that can bond to make up one of the molecules, the EMS hardware will send a current into the electrod pads that will contract the muscles in such a way that the user's wrist will be pulled together indicating an attractive force. If the two atoms are incorrect, the user's wrist will be pulled back indicing a repulsive force.

After successfully building both molecules, the screen shows some educational trivia about the ocean, or about whatever substance is currently being built.
Feedback and Conference Demonstration

We were able to demonstrate our prototype application to various faculty members and students affiliated with Georgia Tech. The response was mixed: many people expressed strong interest in our exploratory system, and the novelty of the interface drew more people to try it out.

However, because most, if not all, of our testers had no previous personal experience with EMS, the initial reactions to the feeling of involuntary muscle contractions were mostly negative. Many people disliked the "tingling" feeling, claiming that it almost "hurt." A few users shouted or laughed nervously when trying it out for the first time. Some faculty were so put off by the system that they questioned the ethics of a human-computer interface in which the human does not maintain autonomy.

This raises a discussion topic that goes beyond user experience into the morality and ethics of how humans and machines should interact with one another. Is it a requirement that a human retain bodily autonomy when using a machine? What about using EMS for blind navigation, where it could trigger muscle contractions that guide the legs while walking? Regardless of one's stance on this issue, the very fact that we were able to prompt such thoughts and discussions is a good step forward for this project.

Our project was among the 17 selected for participation and demonstration at the User Interface Software & Technology (UIST) 2016 conference in Tokyo, Japan. It was our honor to represent Georgia Tech and to network with fellow HCI researchers and practitioners in advancing the field of user interfaces.


Conclusion

In this project, we presented our work on the development of a novel human-computer interface involving electrical muscle stimulation. We explored how EMS could work in tandem with a virtual reality application, using haptic feedback to help reinforce learning among students studying chemistry. Though we were unable to perform evaluations or usability tests with our target user group, we hope that what we learned from this experience can translate into further work in this domain.