For a company like HTX Labs, business outcomes are learning outcomes. What use is a VR training platform if it’s not clear to the customer that the training is working? Across the industry, the conversation keeps returning to the same questions: how do we show customers that VR training is effective and efficient? How do we show them that it is cost-effective? Initial case studies have shown promising results: VR training saves time and money, and VR-trained users produce fewer errors and feel more confident applying what they have learned [1, 2]. Leaders in the field agree that the potential of VR training is massive; however, many questions remain about how to maximize it. Now, in the era of COVID-19, more people than ever before are training and working remotely, making these questions increasingly relevant and timely.
Part of my role on the HTX team has been to help solve these problems analytically. When a user steps into one of our virtual training environments, they become quantifiable in a way that is not feasible with real-world training. In our immersive environments, we know precisely where every object is and when the user interacts with it. We can let users practice the real physical motions associated with the task they’re training for, and we can automate the training assessment, removing the need for a supervisor to directly observe the session. This is just scratching the surface, though: there are many more ways the data we’re already collecting can be used to improve learning outcomes. For example, if student training performances are stored in a Learning Management System like the ones we integrate with, those past performances can be fed into new training experiences to adapt the difficulty offered to each student.
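As a rough illustration of how stored performances could drive adaptive difficulty, here is a minimal Python sketch. Everything in it (the function name, the 0-to-1 score scale, and the promotion/demotion thresholds) is a hypothetical assumption for illustration, not part of any HTX Labs system.

```python
from statistics import mean

def next_difficulty(past_scores, current="easy",
                    levels=("easy", "medium", "hard"),
                    promote_at=0.85, demote_at=0.60):
    """Pick the difficulty of a student's next session from prior scores.

    past_scores: scores in [0, 1] from earlier sessions (e.g. pulled from
    an LMS record). Promotes when the recent average is high, demotes when
    it is low, and otherwise keeps the current level.
    """
    if not past_scores:
        return current  # no history yet: keep the current level
    recent = mean(past_scores[-3:])  # smooth over the last few sessions
    idx = levels.index(current)
    if recent >= promote_at and idx < len(levels) - 1:
        return levels[idx + 1]
    if recent <= demote_at and idx > 0:
        return levels[idx - 1]
    return current
```

For instance, a student averaging above 0.85 on recent "easy" sessions would be moved to "medium"; a real system would of course tune these thresholds per industry and per training scenario.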
With the incorporation of eye tracking into VR headsets, the amount of information we can infer about our users increases ten-fold. By tracking where people are looking in an immersive environment, we can begin to automate processes that identify when users are struggling and help us understand why. For example, if a student clearing hazardous debris from a runway misses a piece of debris, their instructor (or an automated algorithm) can see where the student was looking when they made the error and better coach them on how to improve. If the user saw the debris and intended to pick it up but struggled to do so through the VR interface, eye tracking or the motion tracking data from their controllers can differentiate that interaction error from the procedural error of never seeing the debris at all. Another useful data point for these kinds of algorithms is pupil size. Because pupil size is tied to emotional state, tracking it lets us infer when a user is struggling with a task, or losing focus and needing to be re-engaged with the content. The VR training industry has many options for using its data to improve training outcomes; our main challenge now is researching which options will provide the best results.
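To make the interaction-versus-procedural distinction concrete, here is a hedged Python sketch of how per-frame gaze samples might separate the two error types. The function, the fixation radius, and the frame threshold are illustrative assumptions only; a production classifier would use proper fixation detection and calibrated gaze data.

```python
import math

def classify_miss(debris_pos, gaze_points,
                  fixation_radius=0.3, min_fixation_frames=10):
    """Classify why a piece of debris was missed, from gaze samples.

    debris_pos: (x, y, z) world position of the missed object.
    gaze_points: list of (x, y, z) gaze-intersection points, one per frame.

    If the user's gaze dwelt near the object for long enough, they likely
    saw it and struggled with the interaction; otherwise they likely never
    noticed it (a procedural error).
    """
    frames_near = sum(
        1 for p in gaze_points
        if math.dist(p, debris_pos) <= fixation_radius
    )
    if frames_near >= min_fixation_frames:
        return "interaction_error"
    return "procedural_error"
```

The design choice here is deliberately simple: dwell time near the object stands in for "the user saw it," which is exactly the signal that lets an instructor (or an automated tutor) give different feedback for the two error types.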
At HTX Labs, we’re focused on humanizing VR training for our customers, so we want the analytics features we build to make our training simulations easier to use and our learning outcomes easier to understand. We work actively with our customers, and learn from the experiences of our VR users, to achieve these goals. HTX Labs has worked with customers across many industries, each with unique needs and challenges. We are currently working with the US Air Force to deliver immersive content for pilot training, and with maintenance teams to provide effective training for aircraft mechanics. We have also worked with petrochemical companies to create safety simulations that train users to identify potential hazards and handle hazardous materials, and the healthcare industry is increasingly looking to VR to train employees for high-consequence procedures like surgeries and patient assessment. Our customers’ training needs vary widely, so our approach to tracking and improving their training outcomes must be adaptable enough to meet them.
User research allows us to understand and adapt to our customers' needs along the entire training process. In the real world, performance assessments for pilots learning emergency procedures are not the same as those for a maintenance team, so our methods for assessing training effectiveness should treat these types of training differently. As HTX Labs scales, we are building industry-specific metrics and processes that can be used to effectively track and improve learning outcomes. By creating automation tools that confirm return on investment for our customers, and by integrating customer insights from different industries, we are expanding our ability to address the needs of new customers, whatever their training context. By researching how our customers assess training outcomes, we also position ourselves to help answer one of their most important questions: how does VR training compare to real-world training in terms of real-world outcomes like on-the-job performance?
Lastly, the learning outcomes of our training simulations also depend on the user experience (UX). VR is a relatively new technology, and the standards for designing immersive experiences are not yet set. The industry continues to test new designs, and these designs lead to different levels of usability and presence. One of the benefits of immersive training is that student learning improves when a user is placed in a scenario that makes them feel the way they would in its real-world counterpart [3]. For example, if a pilot training in one of our simulations is put through an emergency procedure that would be high-stress and high-risk in the real world, we want their sense of presence during the procedure to create a realistic sense of urgency. We also want the experience to feel natural: users shouldn’t have to struggle to interact with objects or move around their environment. With our UX research, we focus on reducing usability issues and enhancing the sense of presence for our users, leading to better learning outcomes for all our customers.
With VR training already shown to improve on traditional training methods, the analytics and research topics discussed here only stand to improve learning outcomes further. With so many opportunities to provide more value for our customers, our ongoing discussions revolve around which features to build first. Hopefully, this article has helped clarify some of our options and how we are beginning to prioritize them through user research. If you have any questions or would like to discuss any of these topics, please get in touch.
- Unity Technologies. (2019). The Incredible Impact of AR and VR. (link)
- PricewaterhouseCoopers. (2020). The VR Advantage: How virtual reality is redefining soft skills. (link)
- Patel, K. (2006). The Effects of Fully Immersive Virtual Reality on the Learning of Physical Tasks. Stanford University. (link)
Liz Halfen, Ph.D. candidate in neuroscience. In my thesis research, I studied spatial attention in touch and vision and used motion-tracking technology to link eye gaze to tactile perception. Through my work at HTX Labs, I’ve learned about the impact of VR training and where researchers and analysts fit into the equation.