A prototype of the Adaptive Virtual Environments Therapy (AVET) system was developed to enable an innovative Virtual Reality (VR)-based therapy approach for children with attention deficits on the autism spectrum. Many systems have successfully used VR in Autism Spectrum Disorder (ASD) therapies. Most of them use VR as an alternative way to conduct therapy by simulating traditional therapies or real-life experiences. AVET instead employs VR-exclusive “impossible experiences” (e.g., a chair that deforms upon the user’s gaze, a transparent human) that are not available in the real world. AVET identifies and influences the user’s cognition, and delivers a customized Prolonged Exposure (PE)-style VR therapy for children with attention deficits on the autism spectrum.
The overall logic of AVET is to first expose the user to a Virtual Environment (VE) in which they must finish a task (e.g., listen to a virtual teacher’s lecture) amid potential distractions. In the second step, the system detects the user’s attention from eye-tracking data and physiological signals, then removes all detected distractions (e.g., a red mug, the virtual teacher’s necklace) from the VE. In the last step, the system gradually puts the distractions back into the VE, exposing the user to one additional distraction object at a time. During this step, the system provides positive/negative reinforcement to strengthen the desired attention behavior. Figure 1 – a) shows the initial VE, a classroom with a virtual teacher standing in front of the user. The cyan and red balls represent the current gaze points of the user’s left and right eyes; they are shown in this figure for demonstration purposes only and are hidden from actual users. b) shows the user gazing at the teacher. c) shows the user distracted by and gazing at the globe. d) shows another distraction – the table. e) shows that all detected distractions have been removed from the scene. f) shows one of the previously detected distractions, the table, reintroduced to the scene.
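The three-step loop above can be sketched in pseudocode-style Python. This is a minimal illustration under assumptions: the object names, the dwell-time threshold, and the `Scene` methods (`remove`, `add`, `user_stays_on`, `reinforce`) are all hypothetical stand-ins, and the physiological-signal input used by the real system is omitted here.

```python
TARGET = "teacher"       # the task-relevant object the user should attend to
DWELL_THRESHOLD = 2.0    # seconds of off-target gaze that flags a distraction (assumed value)
SAMPLE_DT = 0.1          # assumed interval between eye-tracker samples, in seconds

def detect_distractions(gaze_samples, sample_dt=SAMPLE_DT):
    """Step 2 (gaze part only): accumulate per-object gaze dwell time and
    flag every non-target object whose total dwell exceeds the threshold."""
    dwell = {}
    for obj in gaze_samples:  # each sample names the object the gaze ray hit
        dwell[obj] = dwell.get(obj, 0.0) + sample_dt
    return [obj for obj, t in dwell.items()
            if obj != TARGET and t >= DWELL_THRESHOLD]

def run_session(scene, gaze_samples):
    """Steps 1-3: expose, remove detected distractions, then reintroduce
    them one at a time with positive/negative reinforcement."""
    distractions = detect_distractions(gaze_samples)
    for obj in distractions:          # step 2: strip detected distractions
        scene.remove(obj)
    for obj in distractions:          # step 3: reintroduce one at a time
        scene.add(obj)
        # reinforce positively if attention stays on the target despite
        # the reintroduced distraction, negatively otherwise
        scene.reinforce(positive=scene.user_stays_on(TARGET))
    return distractions
```

For example, a gaze log in which the globe collects 2.5 s of dwell and a mug only 0.5 s would flag the globe but not the mug. In the actual system the equivalent of `detect_distractions` would fuse gaze rays cast in the 3D scene with physiological signals rather than rely on dwell time alone.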
We conducted a preliminary evaluation of the current AVET prototype with experts. Based on the interview feedback, we anticipate that AVET has great potential to deliver innovative and effective ASD attention training therapies.
Abstract: A deficit in joint attention is an early predictor of Autism Spectrum Disorder (ASD) in children, and training joint attention has been a significant topic in ASD intervention. We propose a novel training approach using a Customizable Virtual Human (CVH) in a Virtual Reality (VR) game. We developed a CVH in an educational game – Imagination Drums – and conducted a user study with adolescents with high-functioning ASD. The results showed that the CVH made participants gaze less at areas irrelevant to the game’s storyline (i.e., the background) but, surprisingly, also provided evidence that participants reacted more slowly to the CVH’s joint attention bids compared with non-customizable virtual humans.
Virtual Reality (VR) training games have many potential benefits for autism spectrum disorder (ASD) therapy, such as increasing motivation and improving the ability to perform activities of daily living. Persons with ASD often have deficits in hand-eye coordination, which makes many activities of daily living difficult. A VR game that trains hand-eye coordination could help users with ASD improve their quality of life. Moreover, incorporating users’ interests into the game could be a good way to build a motivating game for users with ASD.
I developed a Customizable Virtual Human (CVH) that enables users with ASD to easily customize the appearance of a virtual human and then interact with it in a 3D task. Specifically, we investigated the effects of CVHs in a VR hand-eye coordination training game – Imagination Soccer – and conducted a user study with adolescents with high-functioning ASD.
Chao Mei joined the Dept. of Software Engineering and Game Development at Kennesaw State University in Aug. 2016 as an Assistant Professor. He received his Ph.D. in Computer Science from The University of Texas at San Antonio in Aug. 2016. His Ph.D. advisor was Dr. John Quarles, director of the San Antonio Virtual Environment (SAVE) lab. His main research areas are Human-Computer Interaction – Virtual/Augmented Reality, Special Education, Educational Gaming, and Software Engineering. He is also an M.B.A. candidate at Georgia Institute of Technology.
Chao is a software engineer and computer scientist with strong programming skills and a professional academic background. Funded by the National Science Foundation (NSF), the National MS Society, Kennesaw State University, and other organizations, he has worked on various projects, individually and in teams, across the whole software life cycle. His project experience includes Virtual Reality (VR) and Augmented Reality (AR) systems, games, software development tools, and web applications. His research areas include Human-Computer Interaction (HCI) – Virtual/Augmented Reality, virtual humans, medical simulation training, special education, and accessible computing. Some of his projects have been acknowledged by a federal congressman, by national and local non-profit organizations, and by users and their families. He has strong communication skills and experience organizing user studies and fundraising.
The ‘Virtual Reality MS Walk’ is designed to raise funds and awareness for MS and MS-related research. Specifically, it is a 3D online multi-user virtual environment modeled after the real walk held at the AT&T Center every year. The Virtual Reality MS Walk will occur concurrently with the real MS Walk, enabling people all over the world to actively participate in the walk, regardless of any mobility impairment they may have.
Our team is finalizing the release version, which will be available on the Google Play Store and the Apple App Store the same day as the real walk event, Mar. 7, 2015.
On March 1st, 2014, our team demonstrated the Virtual Walk MS project to Federal Congressman Lloyd Doggett. For more information about this project, please visit its website: http://18.104.22.168/index.html
As a main developer on our team, I built this project for the National MS Society as a virtual online extension of their annual fundraising event. In the traditional offline event, people get together and walk 1 or 2 miles to raise funds for the society to support MS research and the daily lives of people with MS. Although many people are enthusiastic about fighting MS together, they are not all able to be present in one place at the same time. We simulated the whole event as an online virtual reality game, so that participation could be extended to people all over the world. For more information and to download this project, please visit: http://22.214.171.124/index.html