NMY SPEED RACER
APR 2018
Advanced VR Through Motion Simulation
Combining Virtual Reality with motion simulation to create a stunning hyperreality experience.
Intent: Internship
Role: VR, UI/UX, Game Design, Development
Tech: Unity, Cinema 4D, Autodesk Maya, Substance Painter, Octane Render, Adobe CC, C#
Duration: 25 weeks
NMY continuously strives to push immersion in virtual 3D environments to the maximum. I interned at NMY for six months, and for my bachelor thesis I created, in cooperation with the NMY team, a hyperreality application for virtual reality training and simulation that makes use of a motion simulator: the NMY Speed Racer.
Problem
Virtual Reality flight simulations lack immersion. How might we drive immersion to the maximum by creating an application where you experience the action physically as well as visually, considering that a simple HMD (head-mounted display) app would not be enough?
Opportunity
By fusing together sensor technologies, hardware, and software, we hoped to create a new mixed reality app that fulfills our high expectations of a life-like flight experience. The motion simulator platform is set up on four pneumatically driven legs. On top of that is a pilot cockpit that we have equipped with a virtual reality setup.
Solution
With the NMY Speed Racer VR game, our dream came true: the perfect illusion of translational acceleration is achieved through communication between the motion platform and the virtual reality app. The movements of the motion sim are triggered by values calculated in VR. Accurate calculation is essential because it reduces motion sickness, which means the pilot can be fully immersed in the 3D world without experiencing nausea.
Promotional video of the NMY Speed Racer
Introducing the NMY Speed Racer
The NMY Motion Sim Racer combines virtual reality with adrenaline-charged racing action. To intensify the virtual experience, the user is seated in a special racing seat that simulates centrifugal forces while driving. The combination of state-of-the-art media technology and motion simulation hardware creates an immersive experience that normal VR simulations cannot compete with.
The whole user experience of the NMY Speed Racer simulation
Aim of the Application
The aim of the application is to offer a previously unparalleled experience and arouse interest, especially among automotive and aviation customers. By creating a “wow” effect and showing the technical possibilities of NMY, it attracts attention at trade fairs and on the web.
Key Features
The NMY Speed Racer is a time-trial racing game in which the player has to beat either his or her own personal best time or the overall best time. The current best time is represented as a ghost that shows the name of the user who set it. The name is retrieved through the introductory tablet app, which is connected to the desktop application wirelessly. Each player has five laps before the simulation ends.
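To illustrate how such a ghost can work, below is a minimal Unity C# sketch that records the player's pose each physics step and replays the stored best lap as a translucent copy; the class and field names are illustrative assumptions, not the production code.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Hypothetical sketch: records the player's pose each frame and
// replays the stored best lap as a translucent "ghost" ship.
public class GhostRecorder : MonoBehaviour
{
    public Transform playerShip;   // the ship being driven
    public Transform ghostShip;    // translucent copy used for playback

    private readonly List<Vector3> positions = new List<Vector3>();
    private readonly List<Quaternion> rotations = new List<Quaternion>();
    private List<Vector3> bestPositions;
    private List<Quaternion> bestRotations;
    private int playbackIndex;

    void FixedUpdate()
    {
        // Record the current lap.
        positions.Add(playerShip.position);
        rotations.Add(playerShip.rotation);

        // Replay the stored best lap, if one exists.
        if (bestPositions != null && playbackIndex < bestPositions.Count)
        {
            ghostShip.SetPositionAndRotation(bestPositions[playbackIndex],
                                             bestRotations[playbackIndex]);
            playbackIndex++;
        }
    }

    // Called at the finish line; keeps the lap only if it beat the best time.
    public void OnLapFinished(bool isNewBestTime)
    {
        if (isNewBestTime)
        {
            bestPositions = new List<Vector3>(positions);
            bestRotations = new List<Quaternion>(rotations);
        }
        positions.Clear();
        rotations.Clear();
        playbackIndex = 0;
    }
}
```

The ghost's name label would simply be fed from the username received via the tablet app.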
The Process
In the following, I will explain the process of how we arrived at the final product, the NMY Speed Racer. Specifically, I will cover our research, the tests we conducted to learn how to reduce simulator sickness, the reasons for the final look of the environment, the UX challenges of working in VR, and examples of the many prototypes that had to be created before the final product was finished.

Remember, the process is based on the agile model and is therefore highly iterative and non-linear.
The Process: Strategy
Understanding
To kick off the project, we first set ourselves obligatory goals that the final simulation must achieve and thought about factors that might pose critical risks to success and how we might minimize them.
Framing our Vision
The application should be developed using Unity3D and should be playable on the motion simulator via an HTC Vive VR headset in a safe and pleasant manner, keeping simulator sickness to a minimum. The final simulation should contain references to NMY or its customers (CI, logo, colors, etc.) and is ideally kept short at approximately 2 to 3 minutes. Because it is a VR application, it must run at over 90 FPS with under 20 ms of latency, and to keep the environment safe, users must be able to terminate the simulation at any time and as quickly as possible.
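As an illustration of the termination requirement, below is a minimal sketch of how an immediate abort could be wired up in Unity; the input binding, field names, and the hand-off to the platform are assumptions rather than the actual implementation.

```csharp
using UnityEngine;

// Hypothetical sketch of the "terminate at any time" requirement:
// a single abort input immediately stops the vehicle and fades the view.
public class EmergencyStop : MonoBehaviour
{
    public Rigidbody ship;
    public CanvasGroup fadeOverlay;   // full-screen black overlay in the HMD

    void Update()
    {
        if (Input.GetKeyDown(KeyCode.Escape))   // assumed abort input
        {
            // Stop the virtual vehicle instantly.
            ship.velocity = Vector3.zero;
            ship.angularVelocity = Vector3.zero;

            // Fade to black so the user is not exposed to a hard visual cut.
            fadeOverlay.alpha = 1f;

            // In the real setup, the motion platform would also be commanded
            // back to its level, neutral position at this point.
        }
    }
}
```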

Critical Risks
In addition, we asked ourselves questions that might pose critical problems and came up with initial answers through research and experience with the target hardware.

How can we find out more about effectively reducing simulator sickness?
Through primary and secondary research, and testing with comparable factors.

How can we connect the motion simulator with virtual reality technology?
By leveraging motion compensation using HTC Vive's tracker accessory.

What is our main priority to reach our objectives?
If the application induces too much sickness, it is going to be unplayable. Therefore, design and development should depend on test results from the very beginning.
What is Simulator Sickness?
Simulator sickness is a form of motion sickness whose perception is strongly subjective and therefore difficult to explore. There are different theories about the occurrence of simulator sickness. Some theories state that it is caused by the inaccuracy of simulating the motion environment. Our brain associates certain forces with the visually perceived movements; if they do not correspond, simulator sickness is induced. The reason for this is that the sense of equilibrium tries to react to non-existent or non-conforming movement. The brain notices the discrepancy between what is seen and what is felt and concludes that something is wrong.

Symptoms and signs may last from a few minutes to several hours. Typical symptoms and signs of simulator sickness include the following: dizziness, headaches and disorientation, drowsiness and balance disorders, sweating and hot flashes, nausea, eye fatigue, pallor, salivation, and vomiting.
How can we reduce Simulator Sickness?
Through initial tests, we sought to find out how to keep simulator sickness to a minimum. This was a crucial step before the ideation and concept phase, as these factors may influence the visual appearance of the simulation.

Movement Limitation
Limiting certain movements of the motion simulator proved to reduce the overall stress on the operator and to limit causal factors of simulator sickness. We weighed these limits against the overall purpose of the simulation, e.g. whether limiting certain movements might harm the overall entertainment or realism of the simulation. Limiting movement includes restricting the maximum angle of the degrees of freedom and restricting sudden, fast movements.
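To make the idea concrete, below is a minimal sketch of how such limits could be applied to the commanded platform angles; the limit values and names are assumptions, not the tuned production values.

```csharp
using UnityEngine;

// Hypothetical sketch: clamps the commanded platform angles and
// rate-limits how fast they may change before they are sent to the rig.
public static class MotionLimiter
{
    const float MaxPitchDeg = 12f;       // assumed maximum pitch angle
    const float MaxRollDeg  = 12f;       // assumed maximum roll angle
    const float MaxDegPerSecond = 20f;   // assumed slew-rate limit

    public static Vector2 Limit(Vector2 targetAngles, Vector2 currentAngles, float deltaTime)
    {
        // 1. Clamp the target to the allowed range of the degrees of freedom.
        float pitch = Mathf.Clamp(targetAngles.x, -MaxPitchDeg, MaxPitchDeg);
        float roll  = Mathf.Clamp(targetAngles.y, -MaxRollDeg,  MaxRollDeg);

        // 2. Limit how quickly the platform may move towards the target,
        //    suppressing sudden, fast movements.
        float maxStep = MaxDegPerSecond * deltaTime;
        pitch = Mathf.MoveTowards(currentAngles.x, pitch, maxStep);
        roll  = Mathf.MoveTowards(currentAngles.y, roll,  maxStep);

        return new Vector2(pitch, roll);
    }
}
```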

Familiarity of the Environment
For this test, test subjects compared a familiar environment, in this case the office environment, with an unfamiliar environment, in this case a futuristic dystopian environment. The results showed that test subjects were notably more susceptible to simulator sickness in the familiar environment than in the unfamiliar one. For this reason, we decided to focus on unfamiliar environments.

Points of Reference
Points of reference in a virtual environment, especially where locomotion is a primary factor, proved to be a very effective method of reducing simulator sickness. Tests were conducted with a cockpit, a static HUD, and different architectures of the virtual vehicle that are better perceived peripherally. All options turned out to reduce simulator sickness. When test subjects were told to keep their eyes focused on the points of reference, they were, surprisingly, not susceptible to simulator sickness. Although this would not be natural behavior in the virtual environment, it suggested that peripherally perceived points of reference help reduce simulator sickness; subsequent tests confirmed this. For this reason, the architecture of the virtual vehicle was adapted so that the vehicle body extends into the operator's peripheral vision, which is shown in detail later.

Points of Orientation
Besides points of reference, which move with the orientation of the operator and are always in sight, we also tested the impact of points of orientation on simulator sickness. Points of orientation are objects that help the operator roughly gauge the velocity, orientation, and position of the virtual vehicle in the 3D environment. Tests were conducted with and without points of orientation alongside a predefined track. Although the impact was not as significant as with points of reference, the results showed that points of orientation slightly reduce susceptibility to simulator sickness. It is important to note that there is an individually varying upper limit to the number of points of orientation, which has to be tested. Too many points of orientation may have a negative impact; there should still be enough open space that, for example, the horizon remains visible. In general, a few points of orientation are better than too many that distract the operator's vision. The final simulation therefore uses a few skyscrapers as points of orientation.

Habituation
Throughout the course of testing, it was noticeable that using the same person as a test subject several times reduced the objectivity of the test, as the subject adapted to the simulation. Decreased susceptibility to simulator sickness correlated with the number of times a person had served as a test subject. Therefore, new test subjects had to be recruited for subsequent testing sessions to avoid habituation. Observations showed that habituation varies individually: some test subjects did not adapt, or adapted only slowly, while others experienced meaningfully less simulator sickness after their second session as a test subject.
Why do we need Motion Compensation?
As the name suggests, the motion compensation software cancels out simulator movements so that the point of view in virtual reality is not affected by them and therefore stays in the same position when the simulator translates or changes orientation. The image below shows the difference between enabling and disabling motion compensation. It requires an HTC Vive Tracker accessory, as shown on the left, which is attached to the motion simulator and which the motion compensation software relies on.
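The underlying idea can be sketched at the Unity level as follows; the real project relied on dedicated motion compensation software working with the tracker, so the names and the deliberately simplified math here are assumptions only.

```csharp
using UnityEngine;

// Hypothetical Unity-level sketch of the motion compensation idea:
// the Vive Tracker rigidly attached to the platform measures how the
// simulator moves, and the camera rig is counter-moved so the virtual
// viewpoint is unaffected by platform motion.
public class MotionCompensation : MonoBehaviour
{
    public Transform trackerOnPlatform; // pose of the Vive Tracker
    public Transform cameraRig;         // parent of the HMD camera

    private Vector3 referencePosition;
    private Quaternion referenceRotation;

    void Start()
    {
        // Calibrate: remember the tracker pose while the platform is level.
        referencePosition = trackerOnPlatform.position;
        referenceRotation = trackerOnPlatform.rotation;
    }

    void LateUpdate()
    {
        // How far has the platform moved away from its reference pose?
        Quaternion rotationOffset = trackerOnPlatform.rotation * Quaternion.Inverse(referenceRotation);
        Vector3 positionOffset = trackerOnPlatform.position - referencePosition;

        // Apply the inverse offset to the rig so the HMD only "sees" head
        // movement, not platform movement.
        // (Simplified: ignores the lever arm between tracker and rotation pivot.)
        cameraRig.SetPositionAndRotation(-positionOffset, Quaternion.Inverse(rotationOffset));
    }
}
```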
The motion of the motion sim must not be interpreted by the head-mounted display because it does not correspond to the virtual motion of the cockpit. Due to its limited degrees of freedom, the simulator cannot translate forward/backward or left/right. The forces of these degrees of freedom are simulated artificially through gravity: when the user accelerates, the seat tilts backwards so that gravity pushes the user into the seat. The opposite happens when the user brakes: the seat tilts forwards so that gravity pulls the user away from the seat.
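Below is a minimal sketch of how such a tilt mapping could look in code; the gain values, names, and the platform interface are assumptions, not the tuned production values.

```csharp
using UnityEngine;

// Hypothetical sketch: maps the virtual longitudinal acceleration of the
// ship to a pitch angle of the seat, so gravity stands in for the
// forward/backward forces the platform cannot produce directly.
public class TiltCueing : MonoBehaviour
{
    public Rigidbody ship;
    const float DegreesPerMs2 = 1.5f;   // assumed gain: degrees of tilt per m/s²
    const float MaxTiltDeg = 12f;       // assumed platform limit

    private Vector3 previousVelocity;

    void FixedUpdate()
    {
        // Longitudinal acceleration of the virtual vehicle.
        Vector3 acceleration = (ship.velocity - previousVelocity) / Time.fixedDeltaTime;
        previousVelocity = ship.velocity;
        float forwardAccel = Vector3.Dot(acceleration, ship.transform.forward);

        // Accelerating -> tilt the seat backwards; braking -> tilt it forwards.
        float seatPitch = Mathf.Clamp(forwardAccel * DegreesPerMs2, -MaxTiltDeg, MaxTiltDeg);

        SendToPlatform(seatPitch);
    }

    // Placeholder for the (hypothetical) interface to the platform hardware.
    void SendToPlatform(float pitchDegrees) { /* hardware-specific */ }
}
```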
User Controls of the Simulator
The image below shows the user controls of the motion simulator. To keep the controls intuitive, the final simulation should be controlled in the same way as driving a car: steering the yoke (4.) steers the virtual vehicle, the left foot pedal brakes, and the right foot pedal accelerates (5.).
The Process: Strategy
Ideating and Concepting
We started off our ideation phase with a competitive analysis where we first evaluated existing examples of VR racing simulations and recorded pros and cons of the simulation and physics controls in a document.
The Process: Design
Environment
Based on our research findings, we decided to create an unfamiliar environment. Moodboard and final visual style are shown below.
Moodboard
The mood board is shown below. The setting is meant to resemble a futuristic city above the clouds, with some of the skyscrapers modeled after Frankfurt am Main, Germany, the location of NMY's headquarters.
Visual Style
Below, you can see the final virtual environment of the VR motion simulation, including skyscrapers from Frankfurt am Main, Germany, the location of NMY's headquarters, which we remodeled for the simulation. It takes all of our research findings into account.
The Process: Design
Spaceship
To match the physical and virtual environments perfectly, we first had to model the motion simulator. Afterwards, we could design a more visually appealing spaceship with similar proportions.
Modeling the Motion Simulator
Below, you can see a detailed model of the motion simulator for which this simulation was built.
Modeling the Spaceship
The spaceship was modeled from scratch to meet the users' needs. As our research showed, points of reference help reduce simulator sickness, so the architecture of the virtual vehicle was adapted so that the vehicle body extends into the operator's peripheral vision. Most of the body also sits in front of the user, as anything behind the user's point of view is far less relevant in virtual reality.
The Process: Design
UI / UX
A VR simulation creates new challenges as far as user experience goes. Below are two examples and how we overcame them.
Virtual Tooltips to increase Usability
Through virtual tooltips, we sought to improve the usability of the simulation. As the video below shows, the virtual tooltip is directly attached to the steering wheel, which is an exact 3D model of the physical steering wheel of the motion simulator. Hence, the user can easily find the physical button and follow directions. Through a relocated rotation anchor point, the tooltip is always easily legible, even when the steering wheel is moved.
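The principle behind the always-legible tooltip can be sketched as follows; the field names are illustrative assumptions.

```csharp
using UnityEngine;

// Hypothetical sketch: the tooltip follows its anchor on the steering
// wheel but keeps an upright, camera-facing orientation, so the label
// stays legible no matter how far the wheel is turned.
public class WheelTooltip : MonoBehaviour
{
    public Transform buttonAnchor;  // point on the wheel the tooltip belongs to
    public Transform hmdCamera;     // the user's head position

    void LateUpdate()
    {
        // Follow the button as the wheel rotates...
        transform.position = buttonAnchor.position;

        // ...but re-orient towards the viewer instead of inheriting the
        // wheel's rotation, keeping the text readable.
        Vector3 toCamera = transform.position - hmdCamera.position;
        transform.rotation = Quaternion.LookRotation(toCamera, Vector3.up);
    }
}
```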
Introductory app for personalized User Input
How do we record high scores under a personalized username in VR and ensure that the user stays in control of it? As text input in VR is rather inconvenient, we created an introductory tablet app where the player can enter his or her name. After the simulation, the user can also check the high scores on the app.
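Below is a minimal sketch of how the desktop application could receive the player name from the tablet over the local network; the transport (UDP) and the port are assumptions, as the actual connection method is not detailed here.

```csharp
using System.Net;
using System.Net.Sockets;
using System.Text;
using UnityEngine;

// Hypothetical sketch: the desktop simulation listens for the player
// name sent by the tablet app over the local network.
public class PlayerNameReceiver : MonoBehaviour
{
    public string PlayerName { get; private set; } = "Guest";

    private UdpClient udp;

    void Start()
    {
        udp = new UdpClient(9050);              // assumed port
        udp.BeginReceive(OnReceive, null);
    }

    private void OnReceive(System.IAsyncResult result)
    {
        IPEndPoint sender = new IPEndPoint(IPAddress.Any, 0);
        byte[] data = udp.EndReceive(result, ref sender);
        PlayerName = Encoding.UTF8.GetString(data);   // plain-text name payload
        udp.BeginReceive(OnReceive, null);            // keep listening
    }

    void OnDestroy()
    {
        udp?.Close();
    }
}
```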
HUD Screen Motion Design
For the HUD of the spaceship, we created several motion designs that were rendered in Otoy Octane and exported as video to be used as movie textures.
The Process: Prototyping
Evaluating the User Experience
Minimizing simulator sickness was at the core of this project, and a lot of prototyping had to be done to keep it to a minimum.
Roadmap
To properly research how to minimize simulator sickness, we created a prototyping roadmap with several prototypes for movement, controls, gameplay, and more. Keep in mind that not all of the prototypes proved beneficial in testing, which is why some of the categories are not part of the final simulation.
Prototyping Track
The course of the track has a significant influence on the overall user experience. Hence, we had to thoroughly test it through several prototypes. Some examples are shown below.
Prototyping Physics
Our final spaceship behaves like a combination of car and plane physics. Several tests had to be made to develop the final physics, which ensure comfortable driving behavior.
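To give an idea of what such a car/plane hybrid can look like in code, below is a minimal sketch; all gain values, axis names, and the overall tuning are assumptions, not the final physics.

```csharp
using UnityEngine;

// Hypothetical sketch of the hybrid handling idea: the ship is driven
// with car-like inputs while a hover force and damped sideways slip
// give it a plane-like, floaty feel.
[RequireComponent(typeof(Rigidbody))]
public class HybridShipPhysics : MonoBehaviour
{
    const float ThrustForce = 60f;     // forward acceleration per unit throttle
    const float SteerTorque = 2.5f;    // yaw response to the yoke
    const float LateralGrip = 4f;      // how strongly sideways slip is damped (car-like)
    const float HoverForce  = 9.81f;   // counteracts gravity (plane/hover-like)

    private Rigidbody body;

    void Start()
    {
        body = GetComponent<Rigidbody>();
    }

    void FixedUpdate()
    {
        float throttle = Input.GetAxis("Vertical");     // pedals (assumed axis)
        float steering = Input.GetAxis("Horizontal");   // yoke (assumed axis)

        // Car-like: accelerate along the ship's forward axis and yaw with steering.
        body.AddForce(transform.forward * throttle * ThrustForce, ForceMode.Acceleration);
        body.AddTorque(Vector3.up * steering * SteerTorque, ForceMode.Acceleration);

        // Car-like grip: damp the sideways component of the velocity.
        Vector3 sideways = Vector3.Project(body.velocity, transform.right);
        body.AddForce(-sideways * LateralGrip, ForceMode.Acceleration);

        // Plane-like: keep the ship floating above the track.
        body.AddForce(Vector3.up * HoverForce, ForceMode.Acceleration);
    }
}
```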
Prototyping Environment
In terms of environment, we tested several different looks before arriving at our final look.
The Process: Testing
Evaluating the User Experience
To make sure that the final simulation can be used to showcase NMY's complex technical capabilities, and that it is comfortable with regard to simulator sickness, it was evaluated by numerous volunteers through user testing.
The aim of this evaluation was to find out whether test users respond positively to the realism and immersion (including comfort) of the VR motion simulation. Test users drove three laps on the predefined track and afterwards filled out a survey, whose results are shown below.
User Demographics
Information about the user demographics of test subjects is shown below.
Immersion and Comfortability
How realistic or immersive did test users find the simulation on a scale of 1 to 10? And how comfortable were they during the simulation? Did they feel any specific symptoms that made it a little uncomfortable?
Assessment of the Results
When test users were asked to rate the immersion of the simulation on a scale of 1 (not immersive) to 10 (most immersive), every rating was at least a 7. The majority of users rated immersion 9 out of 10, followed by the maximum rating of 10, which was given by 5 of 17 test users. We should consider that users with no prior virtual reality experience were more likely to give a higher immersion rating, while test users who had experienced VR motion simulations before tended to rate immersion considerably lower.

When asked whether the way of motion simulation contributed to immersion, 58.8% responded that they "pretty much agree" and an additional 29.4% that they "most definitely agree", whereas only 11.8% "agreed mediocre". The remaining answer options, "do not agree" and "partly agree", were not chosen. Test users answered the same question with regard to the virtual environment, where all participants at least "pretty much agreed" (41.2%) or "most definitely agreed" (58.8%).

With regard to the inducement of simulator sickness, the majority of participants perceived no general discomfort (35.3%) or only minor general discomfort (23.5%). In contrast, 11.8% of test users perceived rather severe general discomfort and 17.6% above-average general discomfort, while another 11.8% fell into the middle of the scale. One should consider that most of the higher discomfort ratings came from test users who describe themselves as more susceptible than average to any form of motion sickness. Additionally, 10 of 17 test users responded that they did not feel any symptoms during or after the simulation. The symptoms test users perceived most were nausea and dizziness.
Reflection
This was a highly collaborative project and as I am a big fan of collaboration, I want to share 5 learnings from working in groups:

1. Crucial benefits
In a group, you collaborate to accomplish a shared goal, you constantly interact through communication, and the outcome is an achievement that the same people could not have realized by working independently. Everyone cooperates to combine efforts for the greatest success. There's no competition. Instead, we grow with each other. One person's success does not mean that others are less successful or cannot thrive at all. In fact, there are infinite opportunities for positive interactive growth and progress for individuals.

2. 1 + 1 != 2
1 + 1 is not equal to 2. It’s equal to infinity. Stephen Covey described this phenomenon of creative cooperation as “synergy”. Synergy means that “the whole is greater than the sum of its parts.” It’s in the group’s relationship and the cooperation that creative powers are maximized. So even if only two people come together to collaborate they may achieve something much more valuable than what both of them may have achieved individually.

3. Empathy is key
Empathy is the key to effective communication. Only if you understand someone emotionally and intellectually can you cooperatively solve problems. People see things through different lenses. They have different perspectives, and there is not always a right or wrong answer to a problem. We should be aware of this and listen with the intention to understand holistically rather than to reply, keeping in mind that we may unintentionally filter what we hear through our own personal perception. We can control this, at least to some extent, through self-awareness.

4. Working with more experienced people
When collaborating, group members reflect a lot of yourself back to you. They can help you in the process of becoming an independent, fulfilled individual with the ability to create profoundly satisfying and prolific relationships with others. My internship at NMY was my first hyperreality project (combining VR and motion simulation), yet I was treated like someone who "can" from the beginning. As Johann Wolfgang von Goethe once said, "Treat a man as he is and he will remain as he is. Treat a man as he can and should be and he will become as he can and should be." Even when I was stuck or something did not go according to plan, I never felt as if I would not be able to accomplish it. There was always a will. There was always belief. And the group made a big contribution to maintaining this philosophy.

5. Handling disagreements
When collaborating and dealing with disagreements, you should always seek mutually beneficial or satisfactory solutions. How can this be achieved? By observing the issue from the other person's perspective and expressing their needs and worries as well as or better than they can themselves, by identifying the fundamental problems and concerns, by defining what outcomes would constitute a thoroughly agreeable solution, and finally, by recognizing new options to accomplish those results.

How do my learnings apply to future projects?

1. I acknowledge collaboration as the most effective way to handle problem solving.

2. I aim to listen empathically, to fully, deeply understand group members.

3. I am aware there is not always right or wrong.

4. I can see unseen potential in others and help them become more independent, fulfilled individuals with the ability to create profoundly satisfying and prolific relationships with others.

5. I am prepared to transform disagreements into Win/Win situations with mutually satisfactory intent.

Team
Dominik Hofacker
Experience Designer, Developer, Creative Director, Motion Designer, 3D Artist
Alexander Oster
3D Artist
Wolfram Kresse
Hardware Developer
Jasmin Färger
Project Manager