A Systematic Review of a Virtual Reality System from the Perspective of User Experience

Yong Min Kim (a), Ilsun Rhiu (b), and Myung Hwan Yun (a)
(a) Department of Industrial Engineering and Institute for Industrial System Innovation, Seoul National University, Seoul, South Korea; (b) Division of Big Data and Management Engineering, Hoseo University, Asan, South Korea
International Journal of Human-Computer Interaction, 2020, Vol. 36, No. 10, 893-910. https://doi.org/10.1080/10447318.2019.1699746

ABSTRACT
Virtual reality (VR) is receiving enough attention, in both industry and academia, for this period to be considered its revival age. Because VR systems support various types of interaction with users and new types of interaction are constantly being developed, studies investigating the user experience (UX) of VR systems are continuously needed. However, there is still a lack of research on a taxonomy that captures the main characteristics of a VR system at a glance while reflecting the factors that influence UX. Therefore, we collected and reviewed research related to the UX evaluation of VR systems in order to identify the current research status and to suggest future research directions. To achieve this, a systematic review of UX studies on VR was conducted, and taxonomies of the VR system that include the influencing factors of UX were proposed. A total of 393 unique articles were collected, and 65 articles were selected for review via the Preferred Reporting Items for Systematic Reviews and Meta-Analyses methodology. The selected articles were analyzed according to the predefined taxonomies. As a result, the current status of research can be identified based on the proposed taxonomies, and issues related to VR devices and technology and to research methods can be suggested as future research directions.

1. Introduction
Although the concept of virtual reality (VR) has been described using various definitions and discussed for a long time, there are some common properties among the definitions. The common properties of VR are a computer-generated digital environment, interaction, and immersion (Jayaram, Connacher, & Lyons, 1997; Jerald, 2015; Pantelidis, 1993; Pratt, Zyda, & Kelleher, 1995). In other words, the meaning of VR is not limited to an artificial space synthesized in a computer environment. VR is a computer environment in which a user can interact with system components while obtaining a sense of immersion. VR systems are being implemented in a conventional personal computer (PC) environment; a head-mounted display (HMD) platform, which has been commercialized recently; or a cave automatic virtual environment (CAVE), which is a wall-sized platform surrounding a user. Today, VR is receiving enough attention for this period to be considered its revival age.
In fact, VR is not a field that has suddenly emerged. The first HMD was developed in the 1960s and introduced as an ultimate display (Sutherland, 1965). Since then, technologies related to the implementation of VR systems have continued to evolve. In particular, the revival of VR accelerated as low-priced immersive HMD-based VRs became commercially available to the public (Wang & Lindeman, 2015). Representative examples of personal HMD VR include the Oculus Rift, HTC Vive, and Sony PlayStation VR, which provide a high degree of immersion through a wide field of view and high resolution. In addition, advanced tracking technology with low latency and high accuracy is being implemented and developed. Compared to these HMDs, more affordable HMDs, such as the Samsung Gear or Google Cardboard VR, are also available. VR systems have applications not only in the field of entertainment but also in various other fields such as medicine, rehabilitation, education, engineering, and the military. Howard (2017) reviewed previous studies using VR-based rehabilitation programs and found these programs to be more effective than traditional ones. More specifically, VR-based rehabilitation has been found to be more effective for motor control in people with stroke (Henderson, Korner-Bitensky, & Levin, 2007; Saposnik, Levin, & Group, 2011) and cerebral palsy (Reid, 2002). Experimental evidence of learning effectiveness in education fields has also been demonstrated. For example, the use of VR simulators has improved the orthopedic technological skills of surgeons (Aïm, Lonjon, Hannouche, & Nizard, 2016). The VR system can also help children recognize pedestrian safety and improve their crossing behavior (McComas, MacKay, & Pivik, 2002). In addition, Lau and Lee (2015) showed that a VR-based learning platform provides students with a positive learning experience. In the military field as well, VR technology has been successfully utilized (Lele, 2013). For example, the VR system has been effectively adopted for simulated training (Bhagat, Liou, & Chang, 2016; Koźlak, Kurzeja, & Nawrat, 2013) or for treating anxiety disorders and posttraumatic stress disorder (Botella, Serrano, Baños, & Garcia-Palacios, 2015; Pallavicini, Argenton, Toniazzi, Aceti, & Mantovani, 2016). As such, the effectiveness of VR applications has been recognized in various industries, and thus this area has high prospects. Studies on systems that actively interact with users (e.g., PC and smartphone) should be conducted from the perspectives of human-computer interaction (HCI) and user experience (UX). From these perspectives, the VR system also requires UX research in the following three respects. First, the VR system is constructed through a combination of various components, and the interactions performed in the system vary. These can be specified as contextual components (e.g., users, devices, and interactions), which can influence UX (Forlizzi & Battarbee, 2004; Hassenzahl & Tractinsky, 2006). In particular, the VR system has many device combinations and interaction methods that can be adopted.
For example, an HMD, a large screen, wall-sized projectors, or conventional monitors can be used for the visual display, and speakers or headphones can be selected for auditory feedback. In the case of a tracking system, devices for full- or local-body tracking, such as of the head or hand, can be adopted. Moreover, the same task can be performed with various interaction techniques. For example, Boletsis (2017) reported that the locomotion techniques implemented so far can be classified into eleven categories (e.g., real-walking, walking-in-place, controller, teleportation, redirected walking, arm swinging, and human joystick).
Furthermore, new VR equipment and interaction techniques are constantly being developed and introduced. Therefore, VR systems specified with various usage contexts can provide different UX, so continuous research on UX in VR systems is needed to understand the effect of a newly adopted usage context and to provide a better experience. Second, there is a need to strengthen the UX components, such as presence, immersion, and engagement, that VR seeks to provide. The sense of presence is one of the representative UX components in VR (Schuemie, Van Der Straaten, Krijn, & Van Der Mast, 2001; Takatalo, Nyman, & Laaksonen, 2008), and it can be described as the subjective perception of being in a mediated environment (Slater & Wilbur, 1997; Stanney & Salvendy, 1998). According to Bulu (2012), the presence and immersive tendencies of learners have a positive correlation with satisfaction in the virtual world. In addition, in a VR system that provides a greater sense of presence and immersion, task performance also increases. For example, tasks are completed more successfully under a stereoscopic display condition, which provides a better sense of immersion than a non-stereoscopic display (Loup-Escande, Jamet, Ragot, Erhel, & Michinov, 2017). In the medical field, surgeons can perform tasks in a surgical simulation system more accurately with haptic feedback than without it (Girod, Schvartzman, Gaudilliere, Salisbury, & Silva, 2016). Moreover, the target UX components can differ for each research topic. Therefore, UX study in VR is necessary to effectively achieve the purpose of VR (e.g., presence, immersion, pleasure, learning effect, and training effect). Third, it is necessary to reduce the side effects caused by the VR experience, which can have a negative impact on the overall UX. Cobb, Nichols, Ramsey, and Wilson (1999) assessed the effects related to health and safety for various virtual environments (VEs) and demonstrated that the following symptoms can be defined as virtual reality-induced symptoms and effects (VRISEs): simulator sickness, postural instability, psychomotor control, perceptual judgment, concentration, stress, and ergonomics effects. Nichols and Patel (2002) reviewed the empirical evidence of health and safety issues in VR and found VR-induced sickness to be a major problem. The authors suggested an experimental procedure model to keep VRISE at a minimized level, emphasizing that some users will inevitably be affected by the VR experience. In addition, Stanney, Mollaghasemi, Reeves, Breaux, and Graeber (2003) suggested a hierarchical model of usability criteria for VEs in which side effects were declared one of the major elements of VE usability that needs to be minimized. As such, symptoms caused by the VR experience have long been reported. However, these problems remain in immersive VRs today. Jerald (2018), who presented five essential guidelines for human-centered VR design, emphasized that VR developers should understand any type of adverse effect of the VR experience and recognize these issues.
Furthermore, there may be potential problems when new VR technologies or platforms are developed. Therefore, UX study in a VR system is required for users to experience a VR system safely and pleasantly. Prior to reviewing UX evaluation in VR systems, we first suggest a framework for evaluating UX in VR systems, which forms the basis of the review results. A framework-based review can provide a more structured overview of UX evaluation in today's VR systems and is expected to provide researchers with insights into building VR systems and performing UX assessments. Previously, VR systems were generally classified or described in terms of the following immersion-level criteria: non-immersive, semi-immersive, and fully immersive (Henderson et al., 2007; Kozhevnikov & Gurlitt, 2013; Kyriakou, Pan, & Chrysanthou, 2017; Ma & Zheng, 2011; Moro et al., 2014). However, such a classification of VR systems has several limitations. This classification criterion is limited to the visual display characteristic, whereas VR is a system composed of various devices. Although the characteristics of the visual display have a dominant influence on the subjective sense of immersion, the immersive feeling can be improved or reduced by factors other than the visual display. In particular, there is no objective criterion for the semi-immersive level. For this reason, the same system can be classified differently. In addition, the immersion level may differ even for the same classification condition. For example, typical desktop VRs are classified as non-immersive VRs. However, there is no clear basis for whether desktop VRs with added tracking technology should be classified as non-immersive. Although Muhanna (2015) proposed a hierarchical structure for VR systems, the taxonomy did not deviate completely from the existing immersion-based criteria and focused on CAVE. Detailed classifications of VR sub-elements have also been performed. Anthes, García-Hernández, Wiedemann, and Kranzlmüller (2016) proposed a structural overview of current input and output devices. Bowman and Hodges (1999) systematized the interaction methods and techniques in detail.
However, in this case, there is a limitation in understanding the characteristics of each element of a VR system to which various contexts are applied. Therefore, it is necessary to reorganize the systematic classification according to the main characteristics of VR systems. Accordingly, this study aims to organize the various studies focusing on UX as it relates to VR systems, based on the following research purposes:
● To provide a structural methodology for categorizing the current VR studies.
● To classify and summarize studies related to UX in VR.
● To clarify the current research limitations for future research directions.
The remainder of this paper is structured as follows. Section 2 illustrates the classification framework of VR systems from a UX perspective and describes an elaborated taxonomy of VR systems. Section 3 presents the methodology used to extract the articles. Section 4 presents the analysis results of the collected papers. Section 5 contains discussion and suggestions for future research. Finally, Section 6 concludes the paper with a brief summary and remarks.
2. UX framework in VR system
Before evaluating UX, it is necessary to first clarify the factors influencing UX (Schulze & Krömker, 2010). According to Forlizzi and Battarbee (2004), UX in interactive systems can be viewed from product-centric, user-centric, and interaction-centric perspectives. Thüring and Mahlke (2007) proposed a components of user experience (CUE) model of the human-technology interaction system. In this model, the interaction characteristics are influenced by system properties, user characteristics, task, and context, and affect UX components such as instrumental quality (e.g., controllability, effectiveness, and learnability), non-instrumental quality (e.g., visual esthetics and haptic quality), and emotional reactions (e.g., subjective feelings, motor expressions, and physiological reactions). Thus, in VR systems as well, the attributes of users, devices, and interaction can be specified as the major factors influencing UX. In addition, the UX evaluation method should be adopted appropriately in accordance with the research goal and the UX components that need to be observed (Hartson & Pyla, 2012). Even if the UX components are the same, the resulting data type and interpretation may differ if the evaluation method is different. As a result, "users", "devices", "user activity", and "evaluation" are included in the UX evaluation framework in VR systems as influencing factors of UX (Figure 1). User characteristics include the demographic information and health status of the people who experience VR systems. Devices are the hardware that constitutes a VR system and are divided into input and output devices. User activity includes interaction elements that can be used to identify specific usage contexts such as task type, interaction partner, and posture. Evaluation factors include attributes of the evaluation methods and characteristics of the acquired data. As aforementioned, previous studies have suggested detailed classifications for the sub-parts of a VR system (Anthes et al., 2016; Boletsis, 2017; Bowman & Hodges, 1999); however, the establishment of a VR system from an overall perspective has hardly been studied. Therefore, this study formulates the details of the framework by comprehensively referring to a well-organized handbook on VR systems and previous research related to 3D interaction and the attributes of UX evaluation (Bowman, Kruijff, LaViola, & Poupyrev, 2001; Burdea & Coiffet, 2003; Jerald, 2015; Rajeshkumar, Omar, & Mahmud, 2013; Vermeeren et al., 2010). In addition, classification criteria are added and some elements are modified. Each factor is detailed as follows.
Figure 1. A suggested UX framework of VR system.
2.1. Users
In user characteristics, demographic information, knowledge, personality, and cognitive or physical impairment are selected as the major categories. The demographic information contains age, sex, occupation, education level, or race. In addition, there is a domain knowledge gap between the end-user and the expert, which might result in different perspectives on UX issues. According to Kober and Neuper (2013), individual differences such as personality can produce different presence experiences. VR systems are also actively applied to physical rehabilitation or trauma treatment in the medical field and may require changes in interaction patterns or assessment methods for the VR system depending on the patient's physical or perceptual limitations.
2.2. Device
In device characteristics, input and output devices are the main categories of the VR system hardware (Anthes et al., 2016; Li, Yi, Chi, Wang, & Chan, 2018; Zhang, 2017). The input device delivers the physical signal provided by the user in digital form to the VR engine, while the output device provides the user with a specific modality (e.g., visual, auditory, and haptic) in response to the collected information. It is important to select an adequate combination of devices, considering interaction fidelity and the characteristics of the selected devices, to provide a positive VR experience, such as an enhanced sense of immersion or presence and high performance. Figure 2 shows the classification scheme for input and output devices according to the detailed criteria.
Figure 2. Device taxonomy in VR (top: input devices; bottom: output devices).
2.2.1. Input device
The input device is divided into non-hand and hand input devices depending on whether hands are required. For a non-hand input device, whether tracking technology is applied is added as a detailed criterion. If a tracking system is available, body tracking, head tracking, eye tracking, microphone, and treadmill are included. In the absence of tracking, non-tracked and non-hand devices are added, which can include pedal-type inputs. In the case of whole-body tracking, the entire body of the user is tracked and the user's movement pattern is recognized. Head tracking is generally available when the user wears an HMD such as the Oculus Rift, HTC Vive, or Sony PlayStation VR. Eyeglasses with a tracking sensor can also track head movements. Under this condition, a VR environment corresponding to the head movement of the user is presented. Eye tracking tracks the user's eye movement, such as the gaze point. A microphone can be used for system commands by tracking the user's voice information. When a treadmill is used as an input device, the user can actually walk on the treadmill and the gait motion is transmitted to the VR engine, which changes the virtual world according to the user's motion. This includes not only traditional treadmills but also omnidirectional treadmills such as the Omni treadmill and the VirtuSphere. When tracking is applied to a hand input device, it includes the input types of a tracked hand-held controller, hand-worn, and bare-hand. Tracked hand-held controllers include controllers such as the Oculus Touch, Wii Remote Controller, and Sony Move, which track the hand movement and have operation buttons on their surface. In the case of the hand-worn type, users wear the device on their hand directly, and the data glove is one of the representative examples. In the bare-hand type, which is generally known as a natural user interaction method, users do not need to wear any device, and either their hand movement is tracked or hand gestures can be recognized. The cases in which the hand input device does not have tracking enabled are classified as the world-grounded type. A world-grounded device includes a keyboard and a mouse, which are generally used in desktop VR systems. If it is not a world-grounded type, it is classified as a non-tracked hand-held controller, and gamepads are typically included in this criterion.
2.2.2. Output device
Output devices can first be sorted by sensory cue, and each type includes detailed display types depending on the sub-criteria. Among the sensory cues, the visual cue is essential and can dominantly influence users' perception of a VR experience. Types of visual display can be broadly divided into two categories depending on whether they are fixed in the world. The world-fixed type is installed in the real world and its position does not change with the user's movement. A conventional monitor and a screen- or projector-based display belong to this category, which can be installed with one or more displays. Multiple displays can provide a wide field of view. If it is not a world-fixed type, it is classified as an HMD type, which can be divided into non-see-through HMDs and video see-through HMDs. There are also see-through HMDs, but this case is excluded from the VR display category because it corresponds to smart glasses used for augmented reality.
The non-see-through HMD is again classified into a smartphone-based HMD (e.g., Samsung Gear) and an assembled HMD (e.g., Oculus Rift and HTC Vive). CAVE is also included in the world-fixed display type. The first CAVE was introduced in 1992 (Cruz-Neira, Sandin, DeFanti, Kenyon, & Hart, 1992) and had a cubic structure of 10 × 10 × 10 ft. A common feature of CAVE is that the displays are wall-sized and surround the users to provide a more immersive experience (Kageyama & Tomiyama, 2016; Manjrekar et al., 2014; Muhanna, 2015). Auditory feedback can be provided through an earphone or a headphone. Speakers can also be adopted as a world-fixed type. Both types of auditory displays can provide 3D sound to enhance user immersion. The manner in which haptic feedback is provided can be broadly divided into passive and active types. According to Jerald (2015), a passive haptic indicates feedback obtained from structures built in the real world in a VR environment, while an active haptic is feedback received from a haptic device. The active haptic is again classified as tactile feedback or proprioceptive force feedback. Tactile feedback is transmitted to the skin through vibration, and proprioceptive force provides force feedback to the user. In addition, independent of the feedback properties, active haptics can be installed in the real world or worn by users. In the handbook of Jerald (2015), a motion platform is classified as a sensory cue and provides feedback, such as motion, to the user. For example, a vehicle mimicking a roller coaster corresponds to a motion platform when it moves in response to a rail slope presented in a VE. Since the motion platform can contribute to the immersive experience of the user as well as cause motion sickness (Jerald, 2015; Riecke, Schulte-Pelkum, Caniard, & Bulthoff, 2005), it can be seen as a key factor of UX evaluation in VR systems. The passive motion platform defines the case where the user is affected by the system, while the active motion platform defines the case where the user operates the motion platform directly.
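The device taxonomy described above is strictly hierarchical, so it can be written down directly as a nested data structure and used to code each reviewed study against the categories of Figure 2. The following Python sketch is not part of the original paper; it is a hypothetical illustration in which the category labels follow Figure 2 and the example study coding is invented.

# Illustrative sketch (not from the paper): the device taxonomy of Section 2.2
# as a nested dictionary; a reviewed study is coded by selecting leaf labels.
DEVICE_TAXONOMY = {
    "input": {
        "non-hand": {
            "tracking": ["body tracking", "head tracking", "eye tracking",
                         "microphone", "treadmill"],
            "no tracking": ["non-tracked & non-hand devices"],  # e.g., pedals
        },
        "hand": {
            "tracking": ["tracked hand-held controller", "hand-worn", "bare-hand"],
            "no tracking": ["non-tracked hand-held controller",  # e.g., gamepad
                            "world-grounded devices"],           # e.g., keyboard, mouse
        },
    },
    "output": {
        "visual": {
            "world-fixed": ["conventional monitor", "screen/projector", "CAVE"],
            "HMD": ["smartphone-based HMD", "assembled HMD", "video see-through HMD"],
        },
        "auditory": ["earphone/headphone", "world-fixed speakers"],
        "haptic": ["passive", "active: tactile", "active: proprioceptive force"],
        "motion platform": ["passive", "active"],
    },
}

# Hypothetical coding of a single reviewed study against the taxonomy.
example_study = {
    "input": ["head tracking", "tracked hand-held controller"],
    "output": ["assembled HMD", "earphone/headphone"],
}

def leaves(node):
    # Recursively yield every leaf label of the taxonomy.
    if isinstance(node, dict):
        for child in node.values():
            yield from leaves(child)
    else:
        yield from node

# Sanity check: every coded label must appear as a leaf in the taxonomy.
all_labels = set(leaves(DEVICE_TAXONOMY))
assert all(label in all_labels
           for labels in example_study.values() for label in labels)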
2.3. User activity
In user activity characteristics, three main categories are selected: task, environment, and application.
2.3.1. Task
Task type, posture, and interaction partner are selected as the detailed classification criteria related to task (Figure 3).
Figure 3. User activity-related component taxonomy in VR.
2.3.1.1. Task type. In a VR system, the main task types that involve interaction are navigation, selection & manipulation, and system control. These tasks are also representative of 3D interaction (Bowman et al., 2001; Reyes-Lecuona & Diaz-Estrella, 2006). Navigation is a core task that is evaluated in the VR system (Santos et al., 2009) and can be implemented in situations where avatars, cars, or airplanes are moving. In selection & manipulation, selection is the task of picking a specific object. Manipulation is the task of transforming or moving an object, and the manipulation task follows the selection task. Thus, the selection and manipulation task types are categorized in one category. System control is the task of selecting a menu bar or activating a specific system in a VR environment. The three types of tasks described above are the main tasks of 3D interaction, but we add a watching task type to them. In most cases, except for special cases, users passively accept the visual, auditory, or haptic information provided by the VR system in the watching task. When using an HMD, however, users can actively control the view by controlling the virtual camera with head movements.
2.3.1.2. Interaction partner. The interaction partner can be categorized into three categories depending on whether users experience the VE while sharing information in the constructed VR system. A single user is a situation where there is no other human partner. In this case, a user only interacts with VR devices. A situation in which multiple users experience the VE simultaneously in the same VR system can be classified as co-located multi-user. Multiple users who are in different locations may simultaneously connect to one VR and interact with each other, and this case can be classified as remote multi-user.
2.3.1.3. Posture. When experiencing VR, the user's posture can be limited or completely free. In most cases, users experience VR in a sitting or standing posture. The users may feel a higher sense of immersion in a situation where their posture is not limited; however, the postures can be limited to increase interaction fidelity or the degree of real system imitation. For example, when experiencing a car simulation in a VR system, the user's posture may be limited to a sitting posture that matches the actual driving situation. Therefore, posture selection can be an important consideration for UX when experiencing VR.
2.3.2. Environment
Environment characteristics are classified depending on whether or not a riding platform is provided. The riding platform is generally provided to increase interaction modality and can include car seats, treadmills, or a cockpit. The riding platform is further classified as a non-movable or movable platform. The movable platform corresponds to the motion platform. In addition, CAVE is added to the VR environment category since it can be seen as a visual-spatial platform (Figure 4).
Figure 4. Environment taxonomy in VR.
2.3.3. Application
VR applications are classified based on the industrial field and its purpose. There is currently no official classification of VR applications, but VR is typically used in the following areas:
Education & training, Entertainment, Healthcare & Therapy, Product development, Architectural & Urban design. Note that these are representative VR applications and not all applications of the VR system.
2.4. Evaluation
When conducting UX evaluation, it is important to adopt an appropriate UX evaluation method and understand its attributes (Hartson & Pyla, 2012). Measurement, measure, evaluator, location, system development phase, and period of experience are selected as the main attributes of UX evaluation. The measurement can be classified as subjective or objective. In a VR system, presence, flow, or engagement belongs to subjective measurement, while error rate or task completion time belongs to objective measurement. The collected data can be classified as a quantitative or qualitative measure. Focus group interviews, think-aloud, and in-depth interviews are representative methods for collecting qualitative data, while questionnaires and physiological signals can be used to collect quantitative data. Evaluator refers to the subject who evaluates UX and can include general users and experts. In addition, UX evaluation can be conducted in the laboratory, in the field, or online, and can be divided into fully functional systems, functional prototypes, conceptual design ideas in a very early phase, and nonfunctional prototypes depending on the phase of system development. A fully functional system means that all devices and content utilized in the VR system are commercially available. When a device or content developed by the researchers is included in the VR system, it is defined as a functional prototype. The period of experience can be classified as before usage, during usage, after usage, or over time (Roto, Law, Vermeeren, & Hoonhout, 2011).
3. Method
In this study, a systematic review was conducted according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement (Liberati et al., 2009). The papers were searched through several web databases on May 27, 2019. The publication date was limited from 2009 to 2018. The selection criteria and review procedures are detailed as follows.
3.1. Information source
A total of six web databases were selected: Scopus, Web of Science, ScienceDirect, IEEE Xplore, EBSCO, and ProQuest. These search engines cover a wide spectrum of perspectives, including engineering and medical perspectives (Powers, Bieliaieva, Wu, & Nam, 2015).
3.2. Inclusion and prescreening criteria
Only journal articles written in English were selected for the review; short reports, news, proceedings papers, books, and dissertations were excluded. "Virtual reality", "virtual environment", "VR", and "VE" were selected as keywords for virtual reality, while "user experience", "UX", and "human experience" were selected for user experience. Therefore, we searched a total of 12 keyword combinations in each search engine.
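Because the keyword sets are small, the 12 search strings follow mechanically from the Cartesian product of the VR and UX terms. The short Python sketch below is not from the paper; the AND-joined query form is an assumption made for illustration, since the exact query syntax used in each database is not reported.

# Illustrative sketch (not from the paper): enumerating the 12 keyword
# combinations of Section 3.2 (4 VR terms x 3 UX terms).
from itertools import product

VR_TERMS = ["virtual reality", "virtual environment", "VR", "VE"]
UX_TERMS = ["user experience", "UX", "human experience"]

# The AND-joined form is an assumption; each database has its own query syntax.
queries = ['"{}" AND "{}"'.format(vr, ux) for vr, ux in product(VR_TERMS, UX_TERMS)]

assert len(queries) == 12
for query in queries:
    print(query)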
3.3. Eligibility criteria
After the screening process, papers were selected by reading the entire text. As an essential selection condition, we selected studies that collected UX-related indicators by evaluating UX on a VR platform. Thus, the selected studies had to (1) utilize a VR platform and (2) evaluate UX in VR. Apart from these two criteria, there were no restrictions (e.g., type of VR system, characteristics of participants, study design, and evaluation methods) on article selection.
3.4. Quality assessment
In addition to the eligibility criteria screening, a quality assessment was performed to select the final papers for review. To assess the collected studies, the QualSyst standard was used (Kmet, Cook, & Lee, 2004). This tool consists of 14 criteria evaluating the appropriateness of the study design, research question, participant selection, sample size, outcomes, and conclusion (see Appendix). Each criterion was graded according to its fulfillment level (2 = yes, 1 = partial, 0 = no). Criteria that do not apply to a particular study design were marked as 'n/a' and were excluded from the summary score calculation. The final score was obtained by dividing the sum of the points by the maximum possible points. For example, if there is one 'n/a', the maximum possible score is 26 points (13 criteria × 2 points = 26 points). Two reviewers (YMK and IR) evaluated each study independently, and disagreements were resolved by consensus or by the third reviewer (MHY). Papers with a score of less than 55% were rated as weak quality (Van Cutsem et al., 2017) and were excluded from this study (an illustrative scoring sketch is given at the end of this section).
3.5. Study selection
The review procedure and the number of selected papers according to each procedure are shown in Figure 5. As a result of the keyword search, a total of 635 papers were collected. The number of initially collected papers for each search engine was as follows:
Scopus (302), Web of Science (148), ScienceDirect (58), IEEE Xplore (30), EBSCO (32), and ProQuest (65). After removing duplicate papers, 393 papers were left. As a result of the initial screening process, 208 papers remained. After that, we reviewed the full text of these papers by carefully considering the eligibility criteria and performed the quality assessment for the papers that met the eligibility criteria. As a result, 65 papers were selected in accordance with the purpose of this study.
Figure 5. Flow diagram of study selection.
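To make the quality-assessment arithmetic of Section 3.4 concrete, the following Python sketch computes a QualSyst summary score for a single paper. It is not from the paper: the grades are invented, and interpreting the exclusion cut-off as 55% of the maximum possible points is an assumption based on the stated scoring rule and the cited threshold (Van Cutsem et al., 2017).

# Illustrative sketch (not from the paper): QualSyst summary score as described
# in Section 3.4. Each of the 14 criteria is graded 2 (yes), 1 (partial),
# 0 (no), or "n/a"; "n/a" criteria are dropped from the maximum possible points.
def qualsyst_score(grades):
    applicable = [g for g in grades if g != "n/a"]
    max_points = 2 * len(applicable)      # e.g., one "n/a" -> 13 x 2 = 26 points
    return 100.0 * sum(applicable) / max_points

# Hypothetical consensus grades for one paper (one criterion not applicable).
grades = [2, 2, 1, 2, "n/a", 2, 1, 2, 2, 1, 2, 2, 2, 1]
score = qualsyst_score(grades)
include = score >= 55.0                   # papers scoring below 55% were excluded
print("summary score = {:.1f}%, include = {}".format(score, include))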
4. Results
4.1. User characteristics
As shown in Figure 6, the average age of the experimental groups was concentrated in the 20s. The 30s, 40s, and 70s groups were each studied in only one case, and there were two cases in each of the 10s, 50s, 60s, and 80s. If the age information was provided as a range, or if it was difficult to know the average value because no age information was provided at all, the case was classified as no information. Figure 7 shows the results of classifying the selected papers based on the subjects' health status and age group. In most cases, studies were conducted on physically and mentally healthy subjects. Only three studies were conducted on physically impaired people. There was only one case in which the experiment was conducted on both healthy and impaired people.
In addition, most studies were concentrated on non-elderly subjects; the elderly were studied in only four papers. Figure 8 shows the sex ratio of the subjects in the experiments. Specifically, it represents the ratio of male subjects to the total number of subjects; in other words, exactly 50% means that the numbers of male and female subjects are exactly the same.
There were 13 cases in the range of 45-54%, a range in which the numbers of male and female subjects are approximately the same. There were five cases where exactly the same number of men and women participated in the experiment. Although there may not be a significant difference in UX by gender, several studies have shown that gender can be one of the significant factors in the VR experience, such as for cybersickness (Baños et al., 2004) and presence (Narciso, Bessa, Melo, Coelho, & Vasconcelos-Raposo, 2017). However, the results showed that the male ratio was most frequently in the range of 75-100%, and the sum of the cases in which the male ratio was higher than 55% was larger than the sum of the cases in which the male ratio was less than 44%.
Figure 6. Classification of studies by age. The total number of cases might be greater than the number of papers selected because there were cases where multiple experiments were performed in one paper.
Figure 7. Classification of studies by type of participant.
Figure 8. Classification by sex ratio in experiment. There were 17 cases with no information. The total number of cases might be greater than the number of papers selected because there were cases where multiple experiments were performed in one paper.
4.2. Device – input device characteristics
The types of input devices used in the VR systems are classified in Table 1. In 27 cases, both hand and non-hand input devices were used simultaneously. There were also 21 cases where only a hand input device was used, whereas the frequency of using only a non-hand input device was relatively low. The results of a detailed classification of the types of input devices are presented in Table 2. In the non-hand input type, the head tracking method was widely used in comparison with other methods. Besides tracking the head, other body parts were also tracked to recognize specific body gestures or to track the trajectory of the body parts. For example, a participant can perform navigation tasks in a VE through specific actions defined by researchers, such as walking in place (Monteiro, Carvalho, Melo, Branco, & Bessa, 2018), placing the right foot in front, or rotating the shoulder (Brade et al., 2017). In some cases, an eye tracker was used to apply eye movement as an input channel. Vinnikov, Allison, and Fernandes (2017) developed a gaze-contingent display that allows the user to adjust the volume of a specific region on which they are concentrating, using a real-time gaze tracking system. In addition, Lin, Breugelmans, Iversen, and Schmidt (2017) utilized an eye-tracking system as a non-intrusive interaction method for patients with arthritis in the hand, replacing conventional computer devices such as the keyboard and mouse. Electroencephalogram (EEG) signals combined with a brain-computer interface (BCI) application were also used in VR systems. Vourvopoulos and Liarokapis (2014) found that a commercial BCI can be used effectively for robot navigation in a VE. Tidoni et al. (2017) applied BCI and robotics to VR and found that the participants exhibited good BCI performance within the immersive scenarios. There was one case where text input was enabled by recognizing the voice of the user via a microphone (Pick, Weyers, Hentschel, & Kuhlen, 2016) and one where a pressure sensor and actuators were attached to a shoe insole to provide haptic feedback according to the walking style (Turchet, Burelli, & Serafin, 2013). In addition, there was one case where an omnidirectional treadmill, the VirtuSphere, was used for a virtual navigation task (Monteiro et al., 2018). In the absence of tracking, pedals were used as a non-tracked device in a car simulation scenario (Georgiou & Demiris, 2017). For hand input devices, the number of cases in which tracking was not used was greater than the number in which it was. Thus, traditional input devices have so far been adopted more often for UX research in VR systems. In the case of hand input with tracking technology applied, tracked hand-held controllers were used in nine cases. While the tracked hand-held controllers were commercial products such as the Wii remote controller (Jonsdottir et al., 2018), a trackable pen (Anton, Kurillo, & Bajcsy, 2018; Rieuf & Bouchard, 2017; Son, Shin, Choi, Kim, & Kim, 2018), or a gripper (Morán et al., 2015), researchers have also built trackable controllers by attaching tracking sensors to specific products (Wang & Lindeman, 2015). In addition, the bare-hand type was used in 13 cases, and the Leap Motion device was used in most of these to recognize hand movements or hand gestures. The hand-worn type was used in three cases and included gloves (Lin et al., 2017; Xiong, Wang, Huang, & Xu, 2016) and bracelets (Camporesi & Kallmann, 2016).
In the no-tracking condition for hand input devices, the frequencies of using non-tracked hand-held controllers (e.g., gamepad) and world-grounded devices (e.g., mouse and keyboard) were similar. Furthermore, de Jesus Oliveira, Nedel, and Maciel (2018) implemented a touch screen for an articulatory interface by attaching a smartphone to the back of the HMD; this input device belongs to the non-tracked and non-hand-held devices.
Table 1. Classification by input device used in experiment.
Non-hand input device only: 16
Hand input device only: 21
Both: 27
No information: 3
N/A: 2
Note: The total number of cases might be greater than the number of papers selected because there were cases where multiple experiments were performed in one paper.
Table 2. Classification by hand and non-hand input devices.
Non-hand input device, tracking: head tracking (31); eye tracking (5); body tracking, i.e., body gesture or upper/lower body movement (8); microphone (1); physiological signal, EEG (2); treadmill (1); pressure sensor (1)
Non-hand input device, no tracking: non-tracked & non-hand devices (1)
Hand input device, tracking: tracked hand-held controller (9); hand-worn (3); bare hand (13)
Hand input device, no tracking: non-tracked hand-held controller (14); world-grounded devices (15); non-tracked & non-hand-held devices (1)
No information: 4
Note: If no information about the input device corresponding to the two main categories is given, it is classified as no information. The total number of cases might be greater than the number of papers selected because there were cases where multiple experiments were performed in one paper.
4.3. Device – output device characteristics
The classification by sensory cue used in the VR systems is presented in Figure 9. Visual feedback was provided in all studies, while auditory feedback was provided relatively less often. In addition, haptic feedback was provided in 14 studies, commonly through an input device that utilizes the hands. For example, the Phantom Omni (Culbertson & Kuchenbecker, 2017; Erfanian, Hu, & Zeng, 2017; Schvartzman, Silva, Salisbury, Gaudilliere, & Girod, 2014), the Novint Falcon (Ahn, Fox, Dale, & Avant, 2015; Jin, 2013), a steering wheel (Georgiou & Demiris, 2017), or a head bend (de Jesus Oliveira et al., 2018), which provide vibration or force feedback, were used. In particular, Wang and Lindeman (2015) provided tactile feedback by blowing wind through fans to enhance the sense of motion in the VR system. There was only one case that examined the effect of olfactory feedback on presence (Baus & Bouchard, 2017). These authors identified that an unpleasant odor had a statistically significant effect on the sense of presence and argued that exposure to unpleasant odors may increase presence because there is no obvious visual clue to connect the odor to the visual scene in a VE. In addition, a motion platform was used in three cases. Bian et al. (2016) installed a dynamic seat that provides motion feedback corresponding to the situation presented in the VR. Monteiro et al. (2018) used the VirtuSphere, which allows users to move omnidirectionally in the virtual world. Pedroli et al. (2018) provided a bike-type motion platform integrated in the CAVE system, and the users performed the given task while cycling. Visual feedback is essential in VR systems, and a detailed classification by visual feedback device is presented in Table 3. In the non-world-fixed type, which is generally classified as the HMD type, assembled HMDs, including the Oculus Rift and HTC Vive, were most frequently used. On the other hand, there was one case that used an HMD with a smartphone.
Notably, Wang and Lindeman (2015) developed a coordinated hybrid VR system to provide more seamless 3D interaction in a VR system. In this system, an assembled HMD and a wearable tablet display were worn on the user's head and non-dominant forearm, respectively. The UX evaluation results from various angles, including subjective assessment, task performance, interviews, and video observation, showed that this system can have a positive impact on UX.
In the case of the world-fixed type, a conventional monitor-based display was most frequently adopted for providing visual feedback. Screen and projector types were adopted in 5 and 8 cases, respectively. On the other hand, the CAVE system was used in relatively few cases. A simulator for virtual robotic surgery was used in one case (Tergas et al., 2013). In addition, Harish and Narayanan (2013) developed a multi-planar display by applying a novel rendering scheme. These authors built a spherical display using multiple polygonal facets and demonstrated that task performance was better on the multi-planar display than on a flat panel display.
Figure 9. Classification by sensory cue.
Table 3. Classification by types of output devices.
Non-world-fixed, non-see-through HMD: smartphone HMD (1); assembled HMD (19)
Non-world-fixed, video see-through HMD: (0)
Non-world-fixed, others: hybrid VR (1)
World-fixed, non-surrounding display: conventional monitor based (23); screen based (5); projector based (8)
World-fixed, surrounding display: CAVE (8)
Others: da Vinci skill simulator, multi-planar display (2)
No information: 7
Note: If no information about the visual display corresponding to the two main categories is given, it is classified as no information. The total number of cases might be greater than the number of papers selected because there were cases where multiple experiments were performed in one paper.
4.4. User activity – interaction partner, posture, and task type
Table 4 shows the classification by user activity, including interaction partner, posture, and task type. In a VR system, a single-user system, in which the user interacts only with the input and output devices, is commonly used. A total of 58 cases involved a single user. In three cases, multiple users utilized one system in the same location. This is the case where each user is given an input device and the task is performed in the same VE. Users can perform a collaborative assembly task in a VE (Erfanian et al., 2017) or explore an education activity together (Alves Fernandes et al., 2016; Naya & Ibáñez, 2015). In five cases, multiple users experienced one VE from different locations, which was classified as remote multi-user. In such systems, users could perform collaborative tasks by continuously sharing information (Anton et al., 2018; de Jesus Oliveira et al., 2018; Oprean, Simpson, & Klippel, 2018), and it was possible to have a virtual meeting (Sutcliffe & Alrayes, 2012) or virtual learning (Vosinakis, Anastassakis, & Koutsabasis, 2018). The sitting posture was adopted more often than the standing posture, and special vehicles were rarely provided. When experiencing VR in a standing posture, users could walk freely in a limited space or their body gestures were tracked. Task type was divided into active or passive tasks, and in most cases, an active task type was adopted. This indicates that UX evaluation has focused more on active tasks such as navigation, selection, manipulation, and system control than on passive tasks.
Table 4. Classification by interaction attributes – interaction partner, posture, and task type.
Interaction partner: single user (58); co-located multi-user (3); remote multi-user (5)
Posture: sitting (28); standing (22); sitting/standing (8); no information (8)
Task type: passive, watching only (10); active, i.e., navigation, selection, manipulation, or system control (59)
Note: The total number of cases might be greater than the number of papers selected because there were cases where multiple experiments were performed in one paper.
4.5. Environment As shown in Figure 10, most studies did not establish a specific platform other than the input and output devices.
There were three cases where the movable platform was used.
These correspond to the motion platforms described in Section 4.3. For the non-movable platform, a fixed seat was provided for vehicular navigation content such as a driving simulation (Georgiou & Demiris, 2017). In addition, few CAVE systems were used. The movable platform and the CAVE system would be expensive to build and would require technical experts. In particular, sufficient installation space must be ensured for the CAVE system.
Figure 10. Classification by environmental attributes.
4.6. Application
Unexpectedly, the VR systems in most of the abovementioned studies were not built for a specific application, and these cases were classified as basic application (Table 5), which corresponds to the case where UX evaluation is performed on various factors implemented in a VR system. In other words, the findings of studies classified as basic application can be applied to any VR application. For example, there were cases where UX was evaluated according to the feedback modality, interaction type, and display type. Among the cases applying the VR system directly to a specific field or utilizing it for a specific field, most were in the education and training field. The numbers of cases for the other VR applications are as follows:
healthcare & therapy (5), communication (5), entertainment (5), product development (4), architectural & urban design (2), transportation (1), digital marketing (1), art (1), forestry (1), and exhibition & tour (1).
Table 5. Classification by VR application.
VR applications: Education & training (10); Healthcare & Therapy (5); Communication (5); Entertainment (5); Product development (4); Architectural & Urban design (2); Transportation (1); Digital marketing (1); Art (1); Forestry (1); Exhibition & tour (1); Basic (27)
4.7. Evaluation
The results of classification by each attribute of the UX evaluation method are presented in Table 6. In the measurement characteristic, most cases used both subjective and objective measurements, whereas only two cases used objective measurement alone. In other words, in most studies, subjective measurements were evaluated. This result shows that most studies evaluated subjective feelings such as presence and immersion, which are representative UX components that the VR system aims to provide to users, rather than evaluating performance measurements of function implementation. In the measure characteristic, a quantitative method was adopted more frequently than a qualitative method to evaluate UX in a VR system. As a quantitative method, a questionnaire was frequently used, and performance was evaluated using the completion time or error rate. Adapted or entire versions of well-structured questionnaires were mainly used, and the questionnaires varied. For example, to evaluate certain UX components (e.g., presence, immersion, engagement, usability, and simulator sickness), various questionnaires were used, such as the Presence Questionnaire (Witmer & Singer, 1998), the Independent Television Commission's Sense of Presence Inventory (Lessiter, Freeman, Keogh, & Davidoff, 2001), the System Usability Scale (Brooke, 1996), and the Simulator Sickness Questionnaire (Kennedy, Lane, Berbaum, & Lilienthal, 1993). In addition, most of the studies focused on general users. The laboratory was the most widely used experimental location, and there was one case each in the field and the online environment. In addition, for the system development phase, all VR systems were in the second half of the phase.
More specifically, most of the VR systems were functional prototypes. In these cases, self-developed content or tracking systems were implemented in the VR systems. On the other hand, there were few experiments in which both the content and the devices were commercially available products. In other words, the number of experiments confirming the effect of specific factors in a well-structured experiment was relatively large, and there were few studies that focused on a VR system for a specific application. Meanwhile, as for the period of experience, there was no case where the UX was evaluated before or during the VR experience; in most cases, UX was evaluated after the VR experience. This also means that UX was generally evaluated after completing certain tasks. Only three cases were included in the cumulative UX criterion. These include cases where the degree of rehabilitation progress was assessed after experiencing the VR system several times (Jonsdottir et al., 2018; Schwenk et al., 2014) and a case where the evaluation was performed after experiencing the system for a relatively long period (Newe, Becker, & Schenk, 2014).
Table 6. Classification by attributes of UX evaluation method in VR.
Measurement: subjective (64); objective (32); subjective only (30); objective only (2); both (34)
Measure: quantitative (65); qualitative (26); quantitative only (40); qualitative only (0); both (26)
Evaluator: users (57); expert (9)
Location: laboratory (64); field (1); online (1)
System development phase: fully functional system (10); functional prototypes (55); conceptual design ideas in very early phases (0); nonfunctional prototypes (0)
Period of experience: before usage, anticipated UX (0); during usage, momentary UX (0); after usage, episodic UX (62); over time, cumulative UX (3)
Note: The total number of cases might be greater than the number of papers selected because there were cases where multiple experiments were performed in one paper.
5. Discussion and Recommendation
The proposed taxonomies were developed before analyzing the collected papers and were used as the basis for reporting the review results. The results were mostly reported in correspondence with the proposed taxonomies. However, there was a case where a VR system used in a previous study could not be assigned to the proposed taxonomy, and this part was adjusted. For example, in the hand input category, the non-tracked and non-hand-held devices were added based on the analysis of the collected papers. In the future, input devices may be mounted on other VR system components, or a new interaction paradigm may emerge. Thus, although the taxonomies suggested in this study are sufficient to cover recent studies on UX in VR systems, they will need to be extended and refined in the future in line with the development of VR systems. In this paper, there are two main points of discussion about UX studies in a VR system. The first is related to the implementation of equipment and technology, including input devices, output devices, feedback forms, platforms, and applications, and the other is related to research methods, including user characteristics, interactions, and evaluation methods.
5.1. Issues related to VR devices and technology
● In the non-hand input category, assembled HMDs were mainly used, and efforts were made to apply new ways of interaction in addition to head tracking. However, UX research on input methods other than head tracking is still lacking. Non-hand input interaction methods using eye movement, EEG signals, or voice commands need to be actively studied in the future, because people may experience VR under conditions where they cannot use their hands freely because of mental or physical impairment. Furthermore, if various interaction methods can be applied in the same VR system, users can experience VR by selecting the interaction type according to their physical ability and preference.
● In the hand input category, there is a high proportion of non-tracked device types. This means that the gamepad, keyboard, and mouse conventionally used in a PC environment are still being used in VR systems. However, these devices can limit the natural hand movement of the user. On the other hand, bare-hand or hand-worn interaction can realize a natural user interface, which promotes the degree of immersion when performing tasks. Nonetheless, bare-hand interaction still has the main issue that the system cannot provide haptic feedback to users (Koutsabasis & Vosinakis, 2018). Hand-worn devices, in turn, are encumbered input devices, which means that users need to wear physical hardware (Jerald, 2015). This can cause problems with physical comfort, installation complexity, or tracking reliability. Therefore, continuous effort is required to solve these problems to enable more seamless and natural interaction in VR systems.
● Furthermore, tracked hand-held controllers have been actively used in VR systems. These controllers can inevitably place physical loads on users' body parts; however, studies on the risks that may arise from ergonomic aspects are lacking. Therefore, it is desirable that risk factors such as wrist load due to the weight of the controller and long-term use, excessive force due to design features, or excessive bending during work be studied in combination with the overall UX in a VR system.
● In feedback modality, visual stimulation was provided in all studies, while other stimuli were provided relatively less often. In particular, there was only one case in which olfactory stimulation was used. Since multi-sensory feedback helps improve immersion in a VR system (Leonardis, Frisoli, Barsotti, Carrozzino, & Bergamasco, 2014; Mikropoulos & Natsis, 2011), further UX studies on combinations of multiple feedback forms are needed. In addition, although new motion platforms for VR systems have been introduced and developed, the UX on them has rarely been experimentally investigated. While the motion platform can enhance immersion, it can also cause motion sickness. Thus, overall UX research on the motion platform is needed before it becomes popular in theme parks or VR experience spaces.
● Compared to assembled HMDs, world-fixed displays such as conventional monitor, screen, or projector-based displays have been adopted more often as visual displays. However, about 84% of the total cases using HMDs occurred after 2016, when the Oculus Rift was launched to the public and HMDs began to gain popularity. As an increasing number of advanced HMDs are constantly being released to the public today, we expect more UX studies on HMDs in the future.
● The CAVE system is one of the highly immersive VR systems; however, it has rarely been researched. This is because CAVE systems are not only costly to build but also require a special setup and a large space. Thus, there are practical difficulties for small groups of researchers in building a CAVE system and conducting UX research, so there is a need to expand investment in the CAVE system. The CAVE system has been effectively used in various industries such as the military, education, medicine, and scientific visualization (Muhanna, 2015), and thus the findings of UX research on the CAVE system might be worthily utilized in these fields. It is noteworthy that half of the studies that adopted CAVE were published in 2018. Thus, it is expected that more valuable results of UX evaluation for the CAVE system will be actively presented in the future.
● In addition, studies conducted on UX in a VR system have focused more on factors related to the implementation of VR functions than on specific applications. In other words, research has been conducted to discover and establish empirical evidence on the effects of implementing VR system components and functions. This evidence will be valuable for building a safe and enjoyable VR system. Moreover, since VR systems are expected to be used in various industrial domains, efforts should be made to maximize the effects of VRs applied to specific applications.
5.2. Issues related to the research method
● UX evaluation studies in VR systems are characterized as being concentrated on young adults and healthy people.
● Compared with assembled HMDs, world-fixed displays such as conventional monitors, screens, or projector-based displays have been adopted more often as visual displays. However, about 84% of the total cases using HMDs occurred after 2016, when the Oculus Rift was launched to the public and HMDs began to gain popularity. As an increasing number of advanced HMDs are released to the public, we expect more UX studies on HMDs in the future.

● The CAVE system is one of the most immersive VR systems; however, it has rarely been researched. This is because CAVE systems are not only costly to build but also require a special setup and a large space. There would thus be practical difficulties for small groups of researchers to build a CAVE system and conduct UX research, so investment in CAVE systems needs to be expanded. The CAVE system has been effectively used in various fields such as the military, education, medicine, and scientific visualization (Muhanna, 2015), and the findings of UX research on CAVE systems could therefore be valuably applied in these fields. It is noteworthy that half of the studies that adopted the CAVE were from 2018, so more valuable UX evaluation results for the CAVE system can be expected in the future.

● In addition, studies on UX in VR systems have focused more on factors related to the implementation of VR functions than on specific applications. In other words, research has been conducted to discover and establish empirical evidence on the effects of implementing VR system components and functions. This evidence will be valuable for building safe and enjoyable VR systems. Moreover, since VR systems are expected to be used in various industrial domains, efforts should also be made to maximize the effects of VR applied to specific applications.

5.2. Issues related to the research method

● UX evaluation studies of VR systems are concentrated on younger and healthy adults. However, there is a need to expand the age range of subjects to include the elderly and children, since responses to the same stimulus can differ across psychological and physical profiles. In particular, since elderly people are mentally and physically more vulnerable than younger adults due to aging, the potential problems caused by their declined functional capabilities should be clarified and considered by researchers. Furthermore, as VR systems are being applied in rehabilitation and therapy, it is necessary to focus on discovering UX issues related to the VR experience of people with disabilities.

● In the interaction partner category, the single-user system was used more frequently than the collaborative system.
However, research on collaborative VR systems should be expanded, and the related technologies should also be advanced. In daily life, people communicate and collaborate with each other. Thus, collaborative VR systems are highly likely to be implemented in the future for various fields such as experience centers, product design, virtual meetings, and multiuser entertainment. More importantly, a collaborative VR system can require different interaction patterns and provide a totally different UX compared with a single-user system.

● In the case of task type, active interactions were performed more frequently than passive tasks. Today, users can experience content photographed with a 360° camera, or videos taken from a bird's-eye view by a drone, by wearing an HMD; hence, UX research can be expanded in this area. Additionally, contrary to our expectation, there were many cases in which the exact tasks performed were not clearly defined. To provide an appropriate reference and direction for researchers who study UX in VR in the future, it is necessary to describe the performed task in detail.

● Regarding evaluation characteristics, qualitative measures are relatively rarely adopted for evaluating UX.
However, it would be valuable to evaluate UX through in-depth interviews or observation, which yield rich evidence on subjective assessments and unexpected contextual issues. In addition, UX was mainly evaluated after the VR experience using questionnaires.
However, if negative effects could be predicted by analyzing a user's physiological signals or behavior patterns while they experience VR, a safer VR experience could be promoted. Besides, when the user's immersion is evaluated through a questionnaire, it is difficult to know at what point the feeling was aroused, and the rating may differ from what was actually felt during the VR experience. Therefore, evaluation methodologies that can assess users' subjective states without interfering with their sense of immersion during the VR experience should be continuously developed (see the sketch below).
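As a rough illustration of this direction, the sketch below (Python) compares simple in-session signals against a resting baseline and raises a flag when heuristics suggest emerging discomfort; the chosen signals (heart rate, head-rotation speed) and thresholds are illustrative assumptions, not validated measures taken from the reviewed studies.

```python
from dataclasses import dataclass
from statistics import mean
from typing import List

# Hypothetical online monitor: compares physiological/behavioral samples
# collected during a VR session against a resting baseline and raises a
# flag when simple heuristics suggest growing discomfort. Signal names and
# thresholds below are illustrative assumptions, not validated values.

@dataclass
class Sample:
    heart_rate_bpm: float       # e.g., from a wrist or chest sensor
    head_speed_deg_s: float     # head rotation speed from HMD tracking


class DiscomfortMonitor:
    def __init__(self, baseline: List[Sample],
                 hr_increase_ratio: float = 1.2,
                 head_speed_drop_ratio: float = 0.5):
        self.baseline_hr = mean(s.heart_rate_bpm for s in baseline)
        self.baseline_head = mean(s.head_speed_deg_s for s in baseline)
        self.hr_increase_ratio = hr_increase_ratio
        self.head_speed_drop_ratio = head_speed_drop_ratio

    def check(self, recent: List[Sample]) -> bool:
        """Return True if the recent window looks like emerging discomfort."""
        hr = mean(s.heart_rate_bpm for s in recent)
        head = mean(s.head_speed_deg_s for s in recent)
        # Heuristics: an elevated heart rate, or users "freezing" their head
        # movements, are both treated here as warning signs.
        hr_elevated = hr > self.baseline_hr * self.hr_increase_ratio
        head_frozen = head < self.baseline_head * self.head_speed_drop_ratio
        return hr_elevated or head_frozen

# Example use: during the session, call monitor.check(last_window_of_samples)
# every few seconds and, if it returns True, offer a break or reduce
# locomotion speed instead of waiting for a post-hoc questionnaire.
```

In practice such a monitor would only complement, not replace, post-session questionnaires, but it indicates how momentary UX data could be collected without interrupting immersion.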
6. Conclusion

This paper proposed systematic taxonomies for classifying types of VR systems and conducted a systematic review of previous studies from the perspectives of HCI and UX. Today, with the commercialization of HMDs and the introduction of advanced technologies related to VR systems, the prospects of the VR industry are considered promising. In line with this trend, UX studies on VR systems have increased significantly since 2017. However, compared with the development of VR technology, UX in VR needs to be studied further, mainly from two aspects: issues related to VR devices and technology, and issues related to the research method. A myriad of usage contexts can be defined by combining the components that constitute a VR system, and UX evaluation results can differ according to the context. Therefore, there may be unconfirmed negative effects in VR contexts that have not yet been introduced. In addition, as new content is developed and released, VR is likely to extend beyond the entertainment market to industries such as education, e-commerce, and healthcare. Furthermore, because information and communications technologies and machine learning are being extensively studied in academia, research utilizing the sensor data of VR systems can be expected to grow. Hence, comprehensive UX studies of VR systems in diverse use environments should be conducted, and the proposed taxonomies and findings of this study are expected to contribute to future research in this field.
Funding This work was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education [NRF-2017R1D1A3B03034321].
ORCID
Yong Min Kim http://orcid.org/0000-0003-4796-490X
Ilsun Rhiu http://orcid.org/0000-0001-8229-7220
Myung Hwan Yun http://orcid.org/0000-0001-8554-3132

References
Ahn, S. J., Fox, J., Dale, K. R., & Avant, J. A. (2015). Framing virtual experiences: Effects on environmental efficacy and behavior over time. Communication Research, 42, 839–863. doi: 10.1177/0093650214534973
Aïm, F., Lonjon, G., Hannouche, D., & Nizard, R. (2016). Effectiveness of virtual reality training in orthopaedic surgery. Arthroscopy: The Journal of Arthroscopic & Related Surgery, 32(1), 224–232. doi: 10.1016/j.arthro.2015.07.023
Alves Fernandes, L. M., Cruz Matos, G., Azevedo, D., Rodrigues Nunes, R., Paredes, H., Morgado, L., … Cardoso, B. (2016). Exploring educational immersive videogames: An empirical study with a 3D multimodal interaction prototype. Behaviour & Information Technology, 35, 907–918. doi: 10.1080/0144929X.2016.1232754
Anthes, C., García-Hernández, R. J., Wiedemann, M., & Kranzlmüller, D. (2016). State of the art of virtual reality technology. Paper presented at the Aerospace Conference, 2016 IEEE, Big Sky, MT.
Anton, D., Kurillo, G., & Bajcsy, R. ( 2018 ). User experience and inter- action performance in 2D/3D telecollaboration. Future Generation Computer Systems ,82,77 –88. doi: 10.1016/j.future.2017.12.055 Baños, R. M., Botella, C., Alcañiz, M., Liaño, V., Guerrero, B., & Rey, B.(2004 ). Immersion and emotion: Their impact on the sense of presence. CyberPsychology & Behavior ,7(6), 734 –741. doi: 10.1089/ cpb.2004.7.734 Baus, O., & Bouchard, S. ( 2017 ). Exposure to an unpleasant odour increases the sense of presence in virtual reality. Virtual Reality ,21, 59–74. doi: 10.1007/s10055-016-0299-3 Bhagat, K. K., Liou, W.-K., & Chang, C.-Y. ( 2016 ). A cost-effective inter- active 3D virtual reality system applied to military live firing training. Virtual Reality ,20(2), 127 –140. doi: 10.1007/s10055-016-0284-x Bian, Y., Yang, C., Gao, F., Li, H., Zhou, S., Li, H., … Meng, X. ( 2016 ). A framework for physiological indicators of flow in VR games:
Construction and preliminary evaluation. Personal and Ubiquitous Computing ,20,821 –832. doi: 10.1007/s00779-016-0953-5 Boletsis, C. ( 2017 ). The new era of virtual reality locomotion: A systematic literature review of techniques and a proposed typology. Multimodal Technologies and Interaction , 1(4), 24. doi: 10.3390/mti1040024 Botella, C., Serrano, B., Baños, R. M., & Garcia-Palacios, A. ( 2015 ). Virtual reality exposure-based therapy for the treatment of post-traumatic stress disorder: A review of its efficacy, the adequacy of the treatment protocol, and its acceptability. Neuropsychiatric Disease and Treatment ,11, 2533. doi: 10.2147/NDT Bowman, D. A., & Hodges, L. F. ( 1999 ). Formalizing the design, evalua- tion, and application of interaction techniques for immersive virtual environments. Journal of Visual Languages & Computing ,10(1), 37–53. doi: 10.1006/jvlc.1998.0111 Bowman, D. A., Kruijff, E., LaViola, J. J., Jr, & Poupyrev, I. ( 2001 ). An introduction to 3-D user interface design. Presence: Teleoperators & Virtual Environments ,10(1), 96 –108. doi: 10.1162/105474601750182342 Brade,J.,Lorenz,M.,Busch,M.,Hammer,N.,Tscheligi,M.,&Klimant, P. ( 2017 ). Being there again –Presence in real and virtual environments and its relation to usability and user experi- ence using a mobile navigation task. International Journal of Human Computer Studies , 101 ,76 –87. doi: 10.1016/j. ijhcs.2017.01.004 Brooke, J. ( 1996 ). SUS-A quick and dirty usability scale. Usability Evaluation in Industry ,189 (194), 4 –7. Bulu, S. T. ( 2012 ). Place presence, social presence, co-presence, and satisfaction in virtual worlds. Computers & Education ,58(1), 154 –161. doi: 10.1016/j.compedu.2011.08.024 Burdea, G. C., & Coiffet, P. ( 2003 ).Virtual reality technology . Hoboken, NJ: John Wiley & Sons.Camporesi, C., & Kallmann, M. ( 2016 ).The effects of avatars, stereo vision and display size on reaching and motion reproduction. IEEE Transactions on Visualization and Computer Graphics ,22, 1592 –1604. doi: 10.1109/TVCG.2015.2440231 Cobb,S.V.,Nichols,S.,Ramsey,A.,&Wilson,J.R.( 1999 ). Virtual reality-induced symptoms and effects (VRISE). Presence: Teleoperators & Virtual Environments ,8(2), 169 –186. doi: 10.1162/ 105474699566152Cruz-Neira, C., Sandin, D. J., DeFanti, T. A., Kenyon, R. V., & Hart, J. C.
(1992). The CAVE: Audio visual experience automatic virtual environment. Communications of the ACM, 35(6), 64–73. doi: 10.1145/129888.129892
Culbertson, H., & Kuchenbecker, K. J. (2017). Importance of matching physical friction, hardness, and texture in creating realistic haptic virtual surfaces. IEEE Transactions on Haptics, 10, 63–74. doi: 10.1109/TOH.2016.2598751
de Jesus Oliveira, V. A., Nedel, L., & Maciel, A. (2018). Assessment of an articulatory interface for tactile intercommunication in immersive virtual environments. Computers & Graphics, 76, 18–28. doi: 10.1016/j.cag.2018.07.007
Pick, S., Weyers, B., Hentschel, B., & Kuhlen, T. W. (2016). Design and evaluation of data annotation workflows for CAVE-like virtual environments. IEEE Computer Society, 22, 1452–1461.
Erfanian, A., Hu, Y., & Zeng, T. (2017). Framework of multiuser satisfaction for assessing interaction models within collaborative virtual environments. IEEE Transactions on Human-Machine Systems, 47, 1052–1065. doi: 10.1109/THMS.2017.2700431
Forlizzi, J., & Battarbee, K. (2004). Understanding experience in interactive systems. Paper presented at the proceedings of the 5th conference on designing interactive systems: Processes, practices, methods, and techniques, Cambridge, MA.
Georgiou, T., & Demiris, Y. (2017). Adaptive user modelling in car racing games using behavioural and physiological data. User Modeling and User-adapted Interaction, 27, 267–311. doi: 10.1007/s11257-017-9192-3
Girod, S., Schvartzman, S. C., Gaudilliere, D., Salisbury, K., & Silva, R. (2016). Haptic feedback improves surgeons' user experience and fracture reduction in facial trauma simulation. Journal of Rehabilitation Research & Development, 53(5), 561–570. doi: 10.1682/JRRD.2015.03.0043
Harish, P., & Narayanan, P. J. (2013). Designing perspectively correct multiplanar displays. IEEE Transactions on Visualization and Computer Graphics, 19, 407–419. doi: 10.1109/TVCG.2012.135
Hartson, R., & Pyla, P. S. (2012). The UX book: Process and guidelines for ensuring a quality user experience. San Francisco, CA: Elsevier.
Hassenzahl, M., & Tractinsky, N. (2006). User experience - a research agenda. Behaviour & Information Technology, 25(2), 91–97. doi: 10.1080/01449290500330331
Henderson, A., Korner-Bitensky, N., & Levin, M. (2007). Virtual reality in stroke rehabilitation: A systematic review of its effectiveness for upper limb motor recovery. Topics in Stroke Rehabilitation, 14(2), 52–61. doi: 10.1310/tsr1402-52
Howard, M. C. (2017). A meta-analysis and systematic literature review of virtual reality rehabilitation programs. Computers in Human Behavior, 70, 317–327. doi: 10.1016/j.chb.2017.01.013
Jayaram, S., Connacher, H. I., & Lyons, K. W. (1997). Virtual assembly using virtual reality techniques. Computer-aided Design, 29(8), 575–584. doi: 10.1016/S0010-4485(96)00094-2
Jerald, J. (2015). The VR book: Human-centered design for virtual reality. New York, NY: Morgan & Claypool.
Jerald, J. (2018). Human-centered VR design: Five essentials every engineer needs to know. IEEE Computer Graphics and Applications, 38(2), 15–21. doi: 10.1109/MCG.2018.021951628
Jin, S.-A.-A. (2013). The moderating role of sensation seeking tendency in robotic haptic interfaces. Behaviour and Information Technology, 32, 862–873. doi: 10.1080/0144929X.2012.687769
Jonsdottir, J., Bertoni, R., Lawo, M., Montesano, A., Bowman, T., & Gabrielli, S. (2018). 
Serious games for arm rehabilitation of persons with multiple sclerosis. A randomized controlled pilot study. Multiple Sclerosis and Related Disorders ,19,25 –29. doi: 10.1016/j.msard.2017.10.010 Kageyama, A., & Tomiyama, A. ( 2016 ). Visualization framework for CAVE virtual reality systems. International Journal of Modeling, Simulation, and Scientific Computing ,7(04), 1643001. Kennedy, R. S., Lane, N. E., Berbaum, K. S., & Lilienthal, M. G. ( 1993 ). Simulator sickness questionnaire: An enhanced method for quantify-ing simulator sickness. The International Journal of Aviation Psychology ,3(3), 203 –220. doi: 10.1207/s15327108ijap0303_3 Kmet, L. M., Cook, L. S., & Lee, R. C. ( 2004 ). Standard quality assessment criteria for evaluating primary research papers from a variety of fields.
Edmonton, AB: Alberta Heritage Foundation for Medical Research.Kober, S. E., & Neuper, C. ( 2013 ). Personality and presence in virtual reality: Does their relationship depend on the used presence measure?
International Journal of Human-computer Interaction ,29(1), 13 –25. doi: 10.1080/10447318.2012.668131 Koutsabasis, P., & Vosinakis, S. ( 2018 ). Kinesthetic interactions in museums: Conveying cultural heritage by making use of ancient tools and (re-) constructing artworks. Virtual Reality ,22(2), 103 –118. doi: 10.1007/s10055-017-0325-0 Kozhevnikov, M., & Gurlitt, J. ( 2013 ).Immersive and non-immersive virtual reality system to learn relative motion concepts . Paper presented at the 3rd Interdisciplinary Engineering Design Education Conference(IEDEC), Santa Clara, CA.Ko źlak, M., Kurzeja, A., & Nawrat, A. ( 2013 ). Virtual reality technology for military and industry training programs. In A. Nawrat & Z.Ku ś(Eds.), Vision based systems for UAV applications (pp. 327 –334). Heidelberg, Germany: Springer International Publishing.Kyriakou, M., Pan, X., & Chrysanthou, Y. ( 2017 ). Interaction with virtual crowd in immersive and semi-immersive virtual reality systems.
Computer Animation and Virtual Worlds ,28, e1729. doi: 10.1002/ cav.1729 Lau, K. W., & Lee, P. Y. ( 2015 ). The use of virtual reality for creating unusual environmental stimulation to motivate students to explore creative ideas. Interactive Learning Environments ,23(1), 3 –18. doi: 10.1080/10494820.2012.745426 Lele, A. ( 2013 ). Virtual reality and its military utility. Journal of Ambient Intelligence and Humanized Computing ,4(1), 17 –26. doi: 10.1007/ s12652-011-0052-4Leonardis, D., Frisoli, A., Barsotti, M., Carrozzino, M., & Bergamasco, M.
(2014 ). Multisensory feedback can enhance embodiment within an enriched virtual walking scenario. Presence: Teleoperators and Virtual Environments ,23(3), 253 –266. Lessiter, J., Freeman, J., Keogh, E., & Davidoff, J. ( 2001 ). A cross-media presence questionnaire: The ITC-sense of presence inventory.
Presence: Teleoperators & Virtual Environments, 10(3), 282–297. doi: 10.12968/bjon.2001.10.5.12353
Li, X., Yi, W., Chi, H.-L., Wang, X., & Chan, A. P. (2018). A critical review of virtual and augmented reality (VR/AR) applications in construction safety. Automation in Construction, 86, 150–162. doi: 10.1016/j.autcon.2017.11.003
Liberati, A., Altman, D. G., Tetzlaff, J., Mulrow, C., Gøtzsche, P. C., Ioannidis, J. P., … Moher, D. (2009). The PRISMA statement for reporting systematic reviews and meta-analyses of studies that evaluate health care interventions: Explanation and elaboration. PLoS Medicine, 6(7), e1000100. doi: 10.1371/journal.pmed.1000100
Lin, Y., Breugelmans, J., Iversen, M., & Schmidt, D. (2017). An Adaptive Interface Design (AID) for enhanced computer accessibility and rehabilitation. International Journal of Human-Computer Studies, 98, 14–23. doi: 10.1016/j.ijhcs.2016.09.012
Loup-Escande, E., Jamet, E., Ragot, M., Erhel, S., & Michinov, N. (2017). Effects of stereoscopic display on learning and user experience in an educational virtual environment. International Journal of Human–Computer Interaction, 33(2), 115–122. doi: 10.1080/10447318.2016.1220105
Ma, M., & Zheng, H. (2011). Virtual reality and serious games in healthcare. In S. Brahnam & L. C. Jain (Eds.), Advanced computational intelligence paradigms in healthcare 6. Virtual reality in psychotherapy, rehabilitation, and assessment (pp. 169–192). Heidelberg, Germany: Springer-Verlag Berlin Heidelberg.
Manjrekar, S., Sandilya, S., Bhosale, D., Kanchi, S., Pitkar, A., & Gondhalekar, M. (2014). CAVE: An emerging immersive technology – A review. Paper presented at the Computer Modelling and Simulation (UKSim), 2014 UKSim-AMSS 16th International Conference on, Washington, DC.
McComas, J., MacKay, M., & Pivik, J. (2002). Effectiveness of virtual reality for teaching pedestrian safety. CyberPsychology & Behavior, 5(3), 185–190. doi: 10.1089/109493102760147150
Mikropoulos, T. A., & Natsis, A. (2011). Educational virtual environments: A ten-year review of empirical research (1999–2009). Computers & Education, 56(3), 769–780. doi: 10.1016/j.compedu.2010.10.020
Monteiro, P., Carvalho, D., Melo, M., Branco, F., & Bessa, M. (2018). Application of the steering law to virtual reality walking navigation interfaces. Computers & Graphics, 77, 80–87. doi: 10.1016/j.cag.2018.10.003
Morán, A., Ramírez-Fernández, C., Meza-Kubo, V., Orihuela-Espina, F., García-Canseco, E., Grimaldo, A. I., & Sucar, E. (2015). On the effect of previous technological experience on the usability of a virtual rehabilitation tool for the physical activation and cognitive stimulation of elders. Journal of Medical Systems, 39, 104. doi: 10.1007/s10916-015-0297-0
Moro, S. B., Bisconti, S., Muthalib, M., Spezialetti, M., Cutini, S., Ferrari, M., … Quaresima, V. (2014). A semi-immersive virtual reality incremental swing balance task activates prefrontal cortex: 
A functional near-infrared spectroscopy study. Neuroimage ,85, 451 –460. doi: 10.1016/j.neuroimage.2013.05.031 Muhanna, M. A. ( 2015 ). Virtual reality and the CAVE: Taxonomy, interaction challenges and research directions. Journal of King Saud University-Computer and Information Sciences ,27(3), 344 –361. doi: 10.1016/j.jksuci.2014.03.023 Narciso, D., Bessa, M., Melo, M., Coelho, A., & Vasconcelos-Raposo, J.(2017 ). Immersive 360 ∘ video user experience: Impact of different variables in the sense of presence and cybersickness. Universal Access in the Information Society ,1–11. doi: 10.1007/s10209-017-0581-5 Naya, V. B., & Ibáñez, L. A. H. ( 2015 ). Evaluating user experience in joint activities between schools and museums in virtual worlds. Universal Access in the Information Society ,14(3), 389 –398. doi: 10.1007/s10209-014-0367-y Newe,A.,Becker,L.,&Schenk,A.( 2014 ). Application and evaluation of interactive 3D PDF for presenting and sharing planning results for liver surgery in clinical routine. PloS One ,9.doi: 10.1371/journal.pone.0115697 Nichols, S., & Patel, H. ( 2002 ). Health and safety implications of virtual reality: A review of empirical evidence. Applied Ergonomics ,33(3), 251 –271. doi: 10.1016/S0003-6870(02)00020-0 Oprean, D., Simpson, M., & Klippel, A. ( 2018 ). Collaborating remotely: An evaluation of immersive capabilities on spatial experiences and team membership. International Journal of Digital Earth ,11(4), 420 –436. doi: 10.1080/17538947.2017.1381191 Pallavicini, F., Argenton, L., Toniazzi, N., Aceti, L., & Mantovani, F.(2016 ). Virtual reality applications for stress management training in the military. Aerospace Medicine and Human Performance ,87(12), 1021 –1030. doi: 10.3357/AMHP.4596.2016 Pantelidis, V. S. ( 1993 ). Virtual reality in the classroom. Educational Technology ,33(4), 23 –27. Pedroli, E., Greci, L., Colombo, D., Serino, S., Cipresso, P., Arlati, S., … Goulene, K. ( 2018 ). Characteristics, usability, and users experience of a system combining cognitive and physical therapy in a virtual envir- onment: Positive bike. Sensors ,18(7), 2343. doi: 10.3390/s18072343 Powers, J. C., Bieliaieva, K., Wu, S., & Nam, C. S. ( 2015 ). The human factors and ergonomics of P300-based brain-computer interfaces.Brain Sciences ,5(3), 318 –356. doi: 10.3390/brainsci5030318 Pratt, D. R., Zyda, M., & Kelleher, K. ( 1995 ). Virtual reality: In the mind of the beholder. Computer ,28,17 –19. Rajeshkumar, S., Omar, R., & Mahmud, M. ( 2013 ).Taxonomies of user experience (UX) evaluation methods . Paper presented at the Research and Innovation in Information Systems (ICRIIS), 2013 International Conference on, Kuala Lumpur, Malaysia.Reid,D.T.( 2002 ). Benefits of a virtual play rehabilitation environment for children with cerebral palsy on percept ions of self-efficacy: A pilot study. Pediatric Rehabilitation ,5(3), 141 –148. doi: 10.1080/1363849021000039344 Reyes-Lecuona, A., & Diaz-Estrella, A. ( 2006 ).New interaction paradigms in virtual environments . Paper presented at the electrotechnical con- ference. MELECON 2006. IEEE Mediterranean, Malaga, Spain.Riecke, B. E., Schulte-Pelkum, J., Caniard, F., & Bulthoff, H. H. ( 2005 ). Towards lean and elegant self-motion simulation in virtual reality .Paper presented at the Virtual Reality, 2005. Proceedings. VR 2005. IEEE, Arles, Camargue-Provence, France.Rieuf, V., & Bouchard, C. ( 2017 ). Emotional activity in early immersive design: Sketches and moodboards in virtual reality. Design Studies ,48, 43–75. 
doi: 10.1016/j.destud.2016.11.001 Roto, V., Law, E., Vermeeren, A., & Hoonhout, J. ( 2011 ). Bringing clarity to the concept of user experience. Retrieved from www.allaboutux. org/files/UX-WhitePaper.pdfSantos, B. S., Dias, P., Pimentel, A., Baggerman, J.-W., Ferreira, C., Silva, S., & Madeira, J. ( 2009 ). Head-mounted display versus desktop for 3D navigation in virtual reality: A user study. Multimedia Tools and Applications ,41(1), 161. doi: 10.1007/s11042-008-0223-2 Saposnik, G., & Levin, M., & Group, S. O. R. C. W. ( 2011 ). Virtual reality in stroke rehabilitation: A meta-analysis and implications for clinicians. Stroke, Strokeaha , 42, 1380 –1386. doi: 10.1161/ STROKEAHA.110.605451Schuemie, M. J., Van Der Straaten, P., Krijn, M., & Van Der Mast, C. A.(2001 ). Research on presence in virtual reality: A survey. CyberPsychology &Behavior ,4(2), 183 –201. doi: 10.1089/109493101300117884 Schulze, K., & Krömker, H. ( 2010 ).A framework to measure user experi- ence of interactive online products . Paper presented at the proceedings of the 7th international conference on methods and techniques in behavioral research, Eindhoven, The Netherlands.
Schvartzman, S. C., Silva, R., Salisbury, K., Gaudilliere, D., & Girod, S. (2014). Computer-aided trauma simulation system with haptic feedback is easy and fast for oral-maxillofacial surgeons to learn and use. Journal of Oral and Maxillofacial Surgery, 72, 1984–1993. doi: 10.1016/j.joms.2014.05.007
Schwenk, M., Grewal, G. S., Honarvar, B., Schwenk, S., Mohler, J., Khalsa, D. S., & Najafi, B. (2014). Interactive balance training integrating sensor-based visual feedback of movement performance: 
A pilot study in older adults. Journal of Neuroengineering and Rehabilitation, 11, 164. doi: 10.1186/1743-0003-11-164
Slater, M., & Wilbur, S. (1997). A framework for immersive virtual environments (FIVE): Speculations on the role of presence in virtual environments. Presence: Teleoperators & Virtual Environments, 6(6), 603–616. doi: 10.1162/pres.1997.6.6.603
Son, H., Shin, S., Choi, S., Kim, S.-Y., & Kim, J. R. (2018). Interacting automultiscopic 3D with haptic paint brush in immersive room. IEEE Access, 6, 76464–76474. doi: 10.1109/Access.6287639
Stanney, K., & Salvendy, G. (1998). Aftereffects and sense of presence in virtual environments: Formulation of a research and development agenda. International Journal of Human-Computer Interaction, 10(2), 135–187. doi: 10.1207/s15327590ijhc1002_3
Stanney, K. M., Mollaghasemi, M., Reeves, L., Breaux, R., & Graeber, D. A. (2003). Usability engineering of virtual environments (VEs): Identifying multiple criteria that drive effective VE system design. International Journal of Human-Computer Studies, 58(4), 447–481. doi: 10.1016/S1071-5819(03)00015-6
Sutcliffe, A., & Alrayes, A. (2012). Investigating user experience in Second Life for collaborative learning. International Journal of Human Computer Studies, 70, 508–525. doi: 10.1016/j.ijhcs.2012.01.005
Sutherland, I. E. (1965). The ultimate display. In Multimedia: From Wagner to virtual reality (pp. 506–508). London: Macmillan and Co.
Takatalo, J., Nyman, G., & Laaksonen, L. (2008). Components of human experience in virtual environments. Computers in Human Behavior, 24(1), 1–15. doi: 10.1016/j.chb.2006.11.003
Tergas, A. I., Sheth, S. B., Green, I. C., Giuntoli, R. L., II, Winder, A. D., & Fader, A. N. (2013). A pilot study of surgical training using a virtual robotic surgery simulator. JSLS-Journal of the Society of Laparoendoscopic Surgeons, 17, 219–226. doi: 10.4293/108680813X13654754535872
Thüring, M., & Mahlke, S. (2007). Usability, aesthetics and emotions in human–technology interaction. International Journal of Psychology, 42(4), 253–264. doi: 10.1080/00207590701396674
Tidoni, E., Abu-Alqumsan, M., Leonardis, D., Kapeller, C., Fusco, G., Guger, C., … Aglioti, S. M. (2017). Local and remote cooperation with virtual and robotic agents: A P300 BCI study in healthy and people living with spinal cord injury. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 25, 1622–1632. doi: 10.1109/TNSRE.2016.2626391
Turchet, L., Burelli, P., & Serafin, S. (2013). Haptic feedback for enhancing realism of walking simulations. IEEE Transactions on Haptics, 6, 35–45. doi: 10.1109/TOH.2012.51
Van Cutsem, J., Marcora, S., De Pauw, K., Bailey, S., Meeusen, R., & Roelands, B. J. S. M. (2017). The effects of mental fatigue on physical performance: A systematic review. Sports Medicine (Auckland, N.Z.), 47(8), 1569–1588. doi: 10.1007/s40279-016-0672-0
Vermeeren, A. P., Law, E. L.-C., Roto, V., Obrist, M., Hoonhout, J., & Väänänen-Vainio-Mattila, K. (2010). User experience evaluation methods: Current state and development needs. Paper presented at the proceedings of the 6th Nordic conference on human-computer interaction: Extending boundaries, Reykjavik, Iceland.
Vinnikov, M., Allison, R. S., & Fernandes, S. (2017). Gaze-contingent auditory displays for improved spatial attention in virtual reality. ACM Transactions on Computer-Human Interaction, 24, 1–38. doi: 10.1145/3067822
Vosinakis, S., Anastassakis, G., & Koutsabasis, P. (2018). 
Teaching and learning logic programming in virtual worlds using interactive microworld representations. British Journal of Educational Technology, 49(1), 30–44. doi: 10.1111/bjet.2018.49.issue-1
Vourvopoulos, A., & Liarokapis, F. (2014). Evaluation of commercial brain-computer interfaces in real and virtual world environment: A pilot study. Computers and Electrical Engineering, 40, 714–729. Elsevier Ltd.
Wang, J., & Lindeman, R. (2015). Coordinated hybrid virtual environments: Seamless interaction contexts for effective virtual reality. Computers & Graphics, 48, 71–83. doi: 10.1016/j.cag.2015.02.007
Witmer, B. G., & Singer, M. J. (1998). Measuring presence in virtual environments: A presence questionnaire. Presence, 7(3), 225–240. doi: 10.1162/105474698565686
Xiong, W., Wang, Q.-H., Huang, Z.-D., & Xu, Z.-J. (2016). A framework for interactive assembly task simulation in virtual environment. International Journal of Advanced Manufacturing Technology, 85, 955–969. doi: 10.1007/s00170-015-7976-3
Zhang, H. (2017). Head-mounted display-based intuitive virtual reality training system for the mining industry. International Journal of Mining Science and Technology, 27(4), 717–722. doi: 10.1016/j.ijmst.2017.05.005

About the Authors

Yong Min Kim is a PhD candidate in the Department of Industrial Engineering at Seoul National University, South Korea. He received a BS degree in Biosystems Engineering from Seoul National University in 2014. His research focuses on human-computer interaction, user-centered design, and user experience, especially virtual reality systems and augmented reality.
Ilsun Rhiu is currently an assistant professor in the Division of Big Data and Management Engineering at Hoseo University, South Korea. He received a PhD degree in Industrial Engineering from Seoul National University, South Korea, in 2015. His research interests include human-computer interaction, user-centered design, and user research methods.
Myung Hwan Yun is a professor in the Department of Industrial Engineering at Seoul National University. He received his PhD degree in Industrial and Manufacturing Engineering at Penn State University, USA, in 1994. His research interests include human factors, user-centered design, affective engineering, and intelligent human–machine interfaces.

Appendix

C1: Question/objective sufficiently described?
C2: Study design evident and appropriate?
C3: Method of subject/comparison group selection or source of information/input variables described and appropriate?
C4: Subject (and comparison group, if applicable) characteristics sufficiently described?
C5: If interventional and random allocation was possible, was it described?
C6: If interventional and blinding of investigators was possible, was it reported?
C7: If interventional and blinding of subjects was possible, was it reported?
C8: Outcome and (if applicable) exposure measure(s) well defined and robust to measurement/misclassification bias? Means of assessment reported?
C9: Sample size appropriate?
C10: Analytic methods described/justified and appropriate?
C11: Some estimate of variance is reported for the main results?
C12: Controlled for confounding?
C13: Results reported in sufficient detail?
C14: Conclusions supported by the results?