Annual Meeting Program Chair:
Kay M. Stanney
Department of Industrial Engineering and Management Systems
University of Central Florida
Orlando, FL 32816
Phone: 407-823-5582
firstname.lastname@example.org
Special Issue of HUMAN FACTORS "Virtual Environments: Empirical Studies and Models"
Dr. Robert C. Williges
302 Whittemore Hall
Virginia Tech
Blacksburg, VA 24061-0118
Telephone: 540-231-6270
Fax: 540-231-3322
email@example.com
Dr. Woodrow Barfield
302 Whittemore Hall
Virginia Tech
Blacksburg, VA 24061-0118
Telephone: 540-231-2547
Fax: 540-231-3322
firstname.lastname@example.org
The special issue will focus on papers reporting experiments on presence and on the human performance effects of variables used to create virtual and augmented reality environments. Specifically, papers showing how performance varies as a function of different types of virtual environment display technology (spatialized audio, force and tactile feedback devices, and visual display technology) are sought, as are papers describing performance differences across a desktop display, a large-screen projection system, a head-mounted display, or a fully immersive 3D environment. Example application areas include, but are not limited to, manufacturing, medicine, computer-supported cooperative work, augmented reality, virtual teleconferencing, and simulator training. Papers on models proposed to conceptualize human performance and presence in virtual environments are also sought.
Five copies of complete manuscripts must be prepared according to the Information For Contributors on the inside back cover of HUMAN FACTORS and mailed to:
Editor, HUMAN FACTORS
ATTN: Virtual Environments Special Issue
Human Factors and Ergonomics Society
P.O. Box 1369
Santa Monica, CA 90406-1369 USA
(For overnight delivery, use the street address.)
1124 Montana Avenue, Suite B
Santa Monica, CA 90403 USA
Manuscripts may be submitted for publication consideration at any time until JUNE 30, 1997.
Back to Contents
Within the framework of a German research project, the use of a head-mounted display (HMD) as an innovative interface on the shop floor is being investigated. The display is voice controlled and designed to support operators of next-generation manufacturing systems in critical supervisory tasks. From a technical point of view, the connection of the display to a modern numerical control has been realized, and a speech recognition system has been implemented to control both the HMD user interface and the machine tool. A usability study of the HMD was carried out with 13 skilled operators to capture the user's perspective, focusing on the highly critical task of "approach". The results indicate that the head-mounted display enables the operator to perform the control task without the conventional eye and head movements that induce visual and motor strain. Accordingly, the majority of the participants approved of such an innovative display technology as a component of future human-machine interfaces.
Back to Contents
The Office of Naval Research, through the Naval Command, Control and Ocean Surveillance Center (NRaD), is currently investigating the development of a Command and Control Multi-user Virtual Environment (C2MUVE). This four-year project is integrating Multi-User Domain technology, available World Wide Web resources, and commercial off-the-shelf products to create a distributed collaborative virtual environment for use during humanitarian assistance operations. During these operations, C2MUVE will be employed by military, civilian, and foreign government entities, as well as volunteer and commercial organizations, to acquire timely information and perform tasks such as decision making and planning in order to address critical humanitarian needs. Two important features of C2MUVE are scalability (the system can accommodate anywhere from a few to hundreds of users) and its 3D VRML interface, which eases interaction for users at all levels. The C2MUVE development team, consisting of the U.S. Navy and contractor support from Science Applications International Corporation, is in the second year of continuing development. The C2MUVE environment uses an office metaphor, with virtual rooms and everyday objects, in which multi-person interactions, conferences, and collaboration take place. The C2MUVE toolbar has iconic buttons that invoke Java applet windows for navigation within the virtual environment, paging, communication, and setting preferences. Also accessible from the toolbar are buttons for dynamic modification and extension of the C2MUVE environment; these tools allow people without technical training to customize the virtual environment quickly and easily.
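To make the VRML-based office metaphor concrete, the toy sketch below generates a minimal VRML 2.0 fragment for one box-shaped "room". This is purely illustrative: it is not C2MUVE code, and the function names and room dimensions are invented for the example; only the VRML 2.0 node syntax (Shape, Appearance, Material, Box) is standard.

```python
# Toy sketch: emit a minimal VRML 2.0 "room" node of the kind a 3D
# office metaphor could be assembled from. Not C2MUVE's actual code;
# names and dimensions here are hypothetical.

def make_room(name, width, height, depth, rgb=(0.8, 0.8, 0.7)):
    """Return a VRML 2.0 fragment describing one box-shaped room."""
    r, g, b = rgb
    return (
        f"DEF {name} Shape {{\n"
        f"  appearance Appearance {{\n"
        f"    material Material {{ diffuseColor {r} {g} {b} }}\n"
        f"  }}\n"
        f"  geometry Box {{ size {width} {height} {depth} }}\n"
        f"}}\n"
    )

def make_world(rooms):
    """Wrap room fragments in the mandatory VRML 2.0 header line."""
    return "#VRML V2.0 utf8\n\n" + "\n".join(rooms)

print(make_world([make_room("ConferenceRoom", 8, 3, 6)]))
```

A VRML browser of the era would render such a file as a navigable 3D scene; a shared environment like C2MUVE would additionally synchronize the scene state among users.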
Back to Contents
Accompanying the military's transition from cold-war to post-cold-war doctrine have been changes in the duties of soldiers on assignment in foreign countries. Specifically, in addition to traditional skills such as moving, communicating, and employing weapons, today's soldier must also master skills related to keeping, maintaining, or even enforcing peace. Military missions such as those in Bosnia, Haiti, and Somalia have underscored the importance of decision making, crowd control, and interpersonal communication skills. Unfortunately, the methods currently used to train such skills tend to be inflexible, suffer from low fidelity, and offer little ability to assess performance. During the past year, my colleagues and I performed research at the University of Houston's Virtual Environment Technology Laboratory examining the feasibility of using VR to train military peacekeeping skills. One investigation involved training teams of participants to navigate an unfamiliar space and then demonstrate their route knowledge by constructing a sketch after training. We found no discernible difference between teams and individuals, but participants showed superior route knowledge for a single-story building than for a two-story building. A second project examined whether participants could reliably interpret emotional displays while immersed in a virtual world, or whether limitations of visual resolution and computing power might preclude this task. Comparing emotional displays presented in an immersive VE, on a monitor, and in photographs, we found that participants were equally proficient at labeling and describing emotions across media. Ongoing projects include an investigation of the effect of training medium (VE, filmed walkthrough, or blueprints) on a teamed building-search task, and the creation, demonstration, and evaluation of an immersive system to train decision making and procedural skills to teams of peacekeepers.
Back to Contents
This research investigates effective methods of improving a pilot's ability to perceive the surrounding environment, including the provision of imagery detailing the topographical layout of the terrain together with surrounding aircraft. The primary focus is on the incorporation of multiple depth cues, including stereopsis and motion parallax. This abstract concentrates on an experiment investigating chromostereopsis, an effect first reported by Bruecke (1868). This is an illusory depth effect that causes colors of different hues to appear at different depths; in this setting it could lead a pilot to misperceive the environment, with potentially disastrous consequences. The effect would also have an impact on other types of displays, including those commonly used in virtual environments. Previous studies of chromostereopsis, for example McClain et al. (1990), Dengler & Nitschke (1993), and Thompson et al. (1993), have relied on the presentation of simple imagery, such as a red and a blue square on a black background. It was therefore necessary to determine whether the effect occurs in a more complex, textured scene. The display presented two aircraft, one red and one blue, above a square of terrain. The subjects' (n = 8) task was to compare the distances of the two aircraft and decide which was closer to the front of the terrain. The relative depths of the aircraft were varied across trials (n = 16). The scene was presented with and without stereopsis to determine whether this extra cue would help to reduce any effects of chromostereopsis. Contrary to previous studies, which have revealed effects of chromostereopsis, the results of this study indicate no effect, most likely because of the increased complexity of the scene. The results suggest that although chromostereopsis may influence depth perception in simple cases, its effect is minimized when more complex imagery is presented.
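For forced-choice depth judgments of this kind, one common way to test whether responses differ from chance is a two-sided binomial (sign) test on how often, say, the red aircraft is judged closer. The sketch below is an assumption for illustration only: the study above does not state its analysis method, and the trial counts in the example are hypothetical.

```python
# Sketch: two-sided binomial test on forced-choice responses, as one
# plausible way to score "which aircraft is closer" judgments. The
# analysis choice and the counts are illustrative assumptions, not
# taken from the study described above.
from math import comb

def binom_two_sided(k, n, p=0.5):
    """Two-sided binomial p-value: total probability of all outcomes
    no more likely than observing k successes in n trials under p."""
    pmf = [comb(n, i) * p**i * (1 - p)**(n - i) for i in range(n + 1)]
    threshold = pmf[k]
    return min(1.0, sum(x for x in pmf if x <= threshold + 1e-12))

# Hypothetical subject: red judged closer on 9 of 16 trials, which is
# consistent with chance, i.e. no chromostereopsis effect.
print(binom_two_sided(9, 16))
```

A large p-value here would support the null result reported above; a small one would indicate a systematic hue-dependent depth bias.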
Other activities within the Virtual Environments Laboratory include work on haptic feedback, physically based modeling, knee arthroscopy trainers, and distributed virtual environments.
Further details can be obtained from Paul Dunnett, Department of Computer Science, University of Hull, Hull HU6 7RX, England.
Stereo WWW site
Back to Contents
Virtual Reality Environments for Psycho-neuro-physiological Assessment and Rehabilitation is a European Community funded project (Telematics for Health - HC 1053 - Project WWW site) whose aims are:
- to develop a PC-based virtual reality system (PC-VRS) for the medical market that can be sold at a price accessible to its likely end users (hospitals, universities, and research centers) and that offers the modularity, connectability, and interoperability that existing systems lack;
- to develop three hardware/software modules for applying the PC-VRS to psycho-neuro-physiological assessment and rehabilitation. The chosen development areas are eating disorders (bulimia, anorexia, and obesity), movement disorders (Parkinson's disease and torsion dystonia), and stroke disorders (unilateral neglect and hemiparesis).
Back to Contents
The Human Factors and Ergonomics Society
The Ergonomics Society
The Computer-Human Interaction Special Interest Group of the Association for Computing Machinery: SIGCHI
Professor Ronald Mourant of Northeastern University has information on the activities of the Virtual Environments Laboratory of the Mechanical, Industrial and Manufacturing Engineering Department at: NU Virtual Environments Laboratory
Back to Contents