
Move2Improve

Move2Improve Case Study

Bringing gamification to the physical therapy experience

 
 
 
 

About the Hackathon

The Reality Virtually Hackathon, hosted by the Massachusetts Institute of Technology, is one of the largest XR hackathons in the world. Out of over 400 participants and 105 teams, we took the top prize for Best VR. The event had no set theme beyond a focus on XR technologies from companies at the leading edge of the industry.

None of us knew each other before meeting during the pitching and team formation, but we all shared a common goal: to create something that would directly impact the lives of our users. We worked for 40 hours over 2.5 days. All technologies used had to be open source or free-to-use.

My Role: Lead UX/UI Designer

Team Structure: Agile development with 1 UX/UI Designer (me), 3 Unity Developers, and a 3D Artist/Animator

Tools/Technologies Used: Pencil & Paper, Sticky Notes, Sketch (Prototyping), Slack, Maya, Unity, Google Poly, HTC Vive Headset & Controllers

 
 

 
 

The Need

Improving patient engagement with physical therapy. Patients often dread the work of physical therapy and, as a result, may fail to practice their exercises at home and/or may skip appointments. We wanted to create an application that would make the patient’s experience more enjoyable and incentivize them to continue with their therapy sessions.

 

 
 

Solution/Product Overview


We created a game in which the patient uses an HTC Vive controller to trace the outline of a 3D form. The arm motions required to properly trace the form are the same as those used in a physical therapy exercise for treating a shoulder injury (gradually increasing range of motion, agility, etc.). When a patient has successfully completed the outline, they have completed one rep of the exercise.
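
To give a sense of how a repetition can be detected in Unity, here is a minimal C# sketch (illustrative only, not our hackathon code; the component and field names are assumptions): the controller tip must pass each waypoint along the single-line form in order, and reaching the last one counts as one rep.

```csharp
// Hypothetical sketch: mark a rep complete when the controller tip passes
// each waypoint of the outline in order. Names and thresholds are assumptions.
using UnityEngine;
using UnityEngine.Events;

public class OutlineTracer : MonoBehaviour
{
    public Transform controllerTip;   // tip of the Vive controller / paintbrush
    public Transform[] waypoints;     // ordered points along the single-line form
    public float hitRadius = 0.05f;   // how close (meters) counts as "on the line"
    public UnityEvent onRepCompleted; // e.g. trigger the 2D-to-3D animation

    private int nextIndex = 0;

    void Update()
    {
        if (nextIndex >= waypoints.Length) return;

        // Advance only when the next waypoint is reached, so the trace stays in order.
        if (Vector3.Distance(controllerTip.position, waypoints[nextIndex].position) < hitRadius)
        {
            nextIndex++;
            if (nextIndex == waypoints.Length)
            {
                onRepCompleted.Invoke(); // one repetition of the exercise is done
                nextIndex = 0;           // reset for the next rep
            }
        }
    }
}
```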

Once the exercise is completed a specific number of times, the outline fills in to reveal a 3D animated animal with which the patient can interact. At the end of the session, the animals that were “created” through the exercises are added to the patient’s “zoo”. Between physical therapy sessions, the patient can log in to play with these virtual animals. This play mode provides additional opportunities for physical therapy exercises and gives the patient positive reminders about the physical therapy experience.

The motion data from the Vive controller is saved with the patient’s profile so that the physical therapist can easily track progress at a granular level.
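
As a rough illustration of what that capture could look like in Unity, here is a hedged C# sketch (the CSV format, file location, and field names are assumptions, not our actual implementation) that writes the controller’s position and rotation each frame to a per-patient log the therapist could review later:

```csharp
// Illustrative sketch only: record controller motion each frame so a therapist
// could later review range of motion. File format and fields are assumptions.
using System.IO;
using UnityEngine;

public class MotionLogger : MonoBehaviour
{
    public Transform controller;          // tracked Vive controller
    public string patientId = "patient-001";

    private StreamWriter writer;

    void Start()
    {
        string path = Path.Combine(Application.persistentDataPath, patientId + "_session.csv");
        writer = new StreamWriter(path, append: true);
        writer.WriteLine("time,x,y,z,qx,qy,qz,qw");
    }

    void Update()
    {
        Vector3 p = controller.position;
        Quaternion q = controller.rotation;
        writer.WriteLine($"{Time.time},{p.x},{p.y},{p.z},{q.x},{q.y},{q.z},{q.w}");
    }

    void OnDestroy()
    {
        writer?.Close();
    }
}
```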

We initially targeted the application for the HTC Vive VR headset running in a Windows environment, but we developed with Unity so that the application can eventually be deployed to a variety of VR headsets.

 

 
 

Inspiration

The concept of creating a game came from two of our developers, who are separately researching ways to aid physical therapy patients. One of those developers, Leila, had experienced a traumatic injury and thus had first-hand experience as a physical therapy patient. The other, Brian, is a scientist researching how VR can aid motor function improvement. With under 40 hours of planned development time, I took advantage of having two subject matter experts on the team to keep the research phase quick.

 

 
 

Ideation


With the goal of a physical therapy game in place, I had to flesh out the game concept, environment, and user interface. We considered a few initial themes and chose the drawing concept because of the near-infinite possibilities for creating exercises from 3D forms. I went with single-line forms so that the exercise would involve one fluid movement, rather than multiple separate lines, which might have felt choppy or even risked injury to the patient.

We spent most of the first day designing the game framework. The original idea was to create a collaborative multiplayer game, but after researching the coding required to network multiple headsets, we decided that a single-player proof-of-concept better fit the time constraints of the hackathon. In working through the proposed schedule with our artist/animator, we decided that the prototype should involve tracing a 2D sketch within 3D space that would ultimately transform into an abstract animated 3D form. This still provided a good sense of the full product we planned, without adding much more overhead to our already packed development schedule.

 

 
 

UI Design Considerations

Making accessibility a forethought, not an afterthought

When creating a medical product, accessibility is a key consideration. While thinking about the needs of someone who is in pain, I wanted to create multiple ways to exit the simulation if the patient found themselves in need of help. In addition to the standard exit feature in the hand menu, there are two other ways to exit. 


First, if the patient finds themselves in pain, they can immediately drop the controller. When the controller drops below knee level, the simulation automatically ends and immediately calls the therapist into the room for help. I chose to set the default detection point at knee level to ensure that we properly interpret the patient’s intention; the knee line is generally too low to trigger accidentally with any swinging motion from a standing posture. This line is also adjustable to meet individual patient needs.
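
A minimal Unity C# sketch of that dropped-controller check might look like the following (illustrative only; the threshold, grace period, and exit/alert calls are placeholders rather than our hackathon code):

```csharp
// Hypothetical sketch of the "drop the controller to exit" safety check,
// assuming a configurable knee-height threshold per patient.
using UnityEngine;

public class DropToExit : MonoBehaviour
{
    public Transform controller;     // tracked controller position
    public float kneeHeight = 0.5f;  // meters above the floor; adjustable per patient
    public float graceSeconds = 0.5f; // assumed buffer to avoid false triggers from a brief dip

    private float belowTimer = 0f;
    private bool triggered = false;

    void Update()
    {
        if (triggered) return;

        if (controller.position.y < kneeHeight)
        {
            belowTimer += Time.deltaTime;
            if (belowTimer >= graceSeconds)
            {
                triggered = true;
                Debug.Log("Controller dropped below knee level: ending session, alerting therapist.");
                Application.Quit(); // placeholder for ending the simulation and paging the therapist
            }
        }
        else
        {
            belowTimer = 0f; // controller came back up; reset
        }
    }
}
```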

Second, if a user is not able to let go of the controller (for example, if the controller needs to be velcroed to their hand or arm because of a physical issue with their hand), they can simply look up directly into the virtual sky and focus on a red cube. Once the gaze has been locked on the cube for 3 seconds, the program automatically exits and calls the therapist into the room for help.
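
Here is a hedged C# sketch of that gaze-dwell exit, assuming the headset’s forward ray and a simple timer (the dwell length matches the three seconds described above, but the exit and alert calls are placeholders):

```csharp
// Illustrative sketch: if the headset's forward ray stays on the exit cube for
// the full dwell time, end the session. Exit/alert handling is a placeholder.
using UnityEngine;

public class GazeExitCube : MonoBehaviour
{
    public Transform headset;   // main camera / HMD transform
    public Collider exitCube;   // the red cube placed in the virtual sky
    public float dwellSeconds = 3f;

    private float gazeTimer = 0f;

    void Update()
    {
        Ray gaze = new Ray(headset.position, headset.forward);

        // Count continuous time the gaze ray stays on the cube; reset if it leaves.
        if (exitCube.Raycast(gaze, out RaycastHit _, 100f))
        {
            gazeTimer += Time.deltaTime;
            if (gazeTimer >= dwellSeconds)
            {
                Debug.Log("Gaze held on exit cube: ending session, alerting therapist.");
                Application.Quit(); // placeholder for the real exit + alert flow
            }
        }
        else
        {
            gazeTimer = 0f;
        }
    }
}
```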

Full color vs. gray-scale UI comparison

All UI elements were designed so that people with color blindness can still distinguish them when the application is presented in gray-scale.

 
 

Response to user testing and subject matter expert feedback

During user testing, I found that users did not know where to start tracing, and some did not naturally follow the lines in order. To address this, the first three dots now illuminate and travel ahead of the paintbrush, guiding the user through the proper form of the exercise and keeping them on track. Drawing on the firsthand knowledge of our developer-scientists, we also integrated a message that tells the patient to slow down if they are moving too fast.
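
To illustrate how the slow-down reminder could work, here is a small Unity C# sketch (the speed threshold and message text are assumptions for illustration): it estimates the controller’s speed each frame and shows the prompt only while the patient is moving faster than a safe pace.

```csharp
// Illustrative only: show a "slow down" prompt when the controller moves
// faster than an assumed safe speed. Threshold and text are placeholders.
using UnityEngine;
using UnityEngine.UI;

public class SpeedWarning : MonoBehaviour
{
    public Transform controller;
    public Text warningText;      // in-headset UI label
    public float maxSpeed = 1.0f; // meters per second considered safe (assumed value)

    private Vector3 lastPosition;

    void Start()
    {
        lastPosition = controller.position;
    }

    void Update()
    {
        // Estimate instantaneous speed from the change in position since last frame.
        float speed = (controller.position - lastPosition).magnitude / Time.deltaTime;
        lastPosition = controller.position;

        warningText.enabled = speed > maxSpeed;
        if (speed > maxSpeed)
            warningText.text = "Slow down and focus on a smooth, controlled motion.";
    }
}
```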

Another issue was communicating to the user that the exercise continues beyond one repetition. Now, every time the user completes a repetition of the exercise, the sketch animates from its 2D outline into a 3D form. This 3D form then moves into the background until the user completes the allotted number of reps for the exercise. The 3D forms in the background give the patient a fun way of tracking their repetitions and encourage them to keep going. When they have completed their exercise, all of the animals they have created interact with them to stimulate feelings of wonder and excitement instead of the usual negative feelings associated with physical therapy.

 

 

DEMO

 
 
The judging team was extremely impressed with Move2Improve. Their gamified approach to physical therapy has tremendous potential to improve the lives of patients. The team brought their vision to life with thoughtfully and brilliantly executed UX and UI decisions - their design made it clear that the team truly understands their users.
— Jacquelin Assar, Judge & past Reality Virtually Hackathon Winner
 

 
 

REFLECTION


This was my first hackathon experience, and it was more than I ever could’ve hoped for. We ended up taking home the top prize for Best VR. It was amazing to go in not knowing anyone and come out with a winning team and an idea that can change the world.