CyberWalk – a holistic approach spanning science, technology and applications to enable unconstrained omni-directional walking in virtual worlds.
It is active exploration that enables us humans to construct a rich and coherent percept of our environment. By far the most natural way to interact with large-scale environments is via locomotion. The same should also be true for environments created in Virtual Reality (VR). However, today it is still mostly impossible to walk freely through virtual environments (VEs) in order to actively explore them. The primary reason for this is the scientific and technological underdevelopment in this sector: state-of-the-art walking simulators are usually unidirectional, they provide no convincing sense of interaction with the simulated environment, their visualization quality is poor, and the scientific foundation for understanding the human-machine interaction is largely missing.
Here we want to change this unsatisfactory situation. Our goal is therefore to develop a novel multimodal omni-directional walking environment. To this end, CyberWalk follows a holistic approach covering science, technology and applications, so as to best realize the visionary goal of enabling quasi-natural, unconstrained omni-directional walking in virtual environments. We plan to develop a unique high-fidelity multimodal platform – named CyberCarpet – on which people can walk freely in any direction. Throughout the project we will put the CyberWalk developments on a solid, human-centred footing. At the end of CyberWalk, the ability to experience a physical walkthrough of an extended virtual environment will showcase the achievements of the project. For this purpose we will use an archaeological site. This application, however, is only a case in point for the wide range of applications that the CyberCarpet will offer, including entertainment, rehabilitation, training, sports, and architecture, to name just a few.
It is apparent that at the moment the scientific and technological deficit in the area of walking simulation holds back important developments and many promising applications. In the CyberWalk project we focus on breaking this cycle by pushing research in the field ahead, based on the necessary blend of cognitive/behavioural understanding and high-fidelity technological development, to end up with a fully immersive showcase. To get there, CyberWalk will develop an entirely new concept of an omni-directional treadmill, the CyberCarpet. As currently envisaged, the walking platform of the carpet will contain a dense 2D matrix of small balls, mounted so that they can rotate freely in any direction. The plan is to actuate the balls omni-directionally from below while the user walks on top of them. One favoured design is to mount a large conveyor belt on a turntable below this walking platform. The desired rotation velocities of the balls can then be transmitted by friction contacts between belt and balls, while friction forces between shoes and balls manipulate the motion of the user (see B.6/WP2 for details). The exact design of the platform, however, can only be determined after an initial validation of the technical parameters and a thorough exploration of the psychophysics of such manipulated walking. To provide the user with an integrated, fully immersive virtual world, the CyberCarpet will be combined with visual and auditory input, supplied via a Head Mounted Display (HMD).
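The belt-on-turntable concept admits a simple kinematic sketch. Under the idealized assumptions that the belt surface moves rigidly at a given speed along its own (rotating) axis and transmits its velocity to the balls without slip, the velocity imparted at platform coordinates (x, y) is the sum of a translational and a rotational term. Function and parameter names below are illustrative, not part of the actual design:

```python
import math

def belt_surface_velocity(x, y, belt_speed, belt_angle, turntable_rate):
    """Planar velocity (vx, vy) in m/s imparted to a ball at platform
    coordinates (x, y), measured from the turntable centre.

    belt_speed:     linear belt speed along its own axis (m/s)
    belt_angle:     current turntable orientation, i.e. belt axis (rad)
    turntable_rate: angular velocity of the turntable (rad/s)
    """
    # Translational part: the belt moves along its rotated axis.
    # Rotational part: omega x r for a planar rotation about the centre.
    vx = belt_speed * math.cos(belt_angle) - turntable_rate * y
    vy = belt_speed * math.sin(belt_angle) + turntable_rate * x
    return vx, vy
```

At the centre only the translational term acts; away from the centre the rotation adds a tangential component, which is one source of centrifugal effects on the walker.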
The related problems can only be approached systematically if a solid scientific foundation is established for the underlying perceptual principles. The CyberCarpet development will therefore be complemented from the very beginning by perceptual studies. These will not only provide guidelines for the specific design of an omni-directional treadmill, but will also enhance our understanding of locomotion in Virtual Reality in general, which is essential for such a project to succeed. Basic psychophysical research is therefore central to the CyberWalk project. It will also serve as a benchmark for the evaluation of the technical design concept and for its comparison with existing approaches, and it ensures that the selected solutions are user friendly. Very little is known about the perceptual basis underlying walking in large-scale virtual environments [Slater95]. This strand of psychophysics is therefore challenging and ranges from practical issues, such as how to conceal from users that they are walking on a grid of small balls, to basic scientific questions addressing natural human navigation in virtual worlds. For example, the control system for the linear/angular motion of the platform should incorporate within its design constraints the psychophysical data on the maximum velocity and acceleration (applied force) that a walker may experience without noticing the underlying platform motion, and should then apply proper smoothing of commands to provide the most natural feeling of free walking. Thus, the psychophysical research conducted within CyberWalk will start at the onset of the project, initially using simpler setups (uni-directional and circular treadmills, see the work plan for details) before switching to the CyberCarpet design, so as to best cover psycho-perceptual issues related to walking, concerns about the psychophysiology of action, and questions related to multimodal integration.
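To make the command-smoothing idea concrete, one plausible realization (an illustrative sketch, not the project's actual controller) is a slew-rate limiter that caps the change of the commanded platform velocity at the yet-to-be-measured perceptual acceleration threshold, here called a_max:

```python
def smooth_command(v_desired, v_prev, a_max, dt):
    """Slew-rate limiter for one velocity channel: track the desired
    platform velocity, but never change the command faster than a_max
    (m/s^2) -- the acceleration below which, according to the planned
    psychophysical studies, the walker should not notice the platform
    moving underneath."""
    step = a_max * dt                      # largest admissible change
    dv = v_desired - v_prev
    return v_prev + max(-step, min(step, dv))
```

Called once per control cycle, this turns an abrupt step in the desired velocity into a gentle ramp.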
Other VR setups simulating locomotion
Several VR setups that simulate locomotion have already been developed, and it has been shown that the sense of immersion is greatly affected by the selected approach [Slater98, Usoh99]. Proposed methods range from 3D wands via "walking in place" metaphors to treadmill systems. In general, treadmill-like devices are designed to travel in the opposite direction to the user’s movement, adapting the tread’s speed to keep him or her in essentially the same position. Current state-of-the-art one degree-of-freedom (DOF) treadmills, which are commonly used for fitness, rehabilitation or VR, can be equipped with a tilt device and/or a tether to simulate uphill walking (e.g., the “Treadport” by Sarcos, USA [Hollerbach00]), but they have the disadvantage of confining the user’s movement to one direction only. At present there have only been a few attempts to enable omni-directional walking on 2D treadmills. Two degrees-of-freedom can be achieved by using two orthogonal translatory actuators (“Omni-directional treadmill” by Virtual Space Devices, USA [Andrew99], and the “Torus treadmill” by H. Iwata, Japan [Iwata99]). A similar setup is also described in [Darken97]. Another way of achieving 2D omni-directional actuation is by combining translation in one direction with rotation in the other (Noma and Miyasato, Japan [Noma99]). Three degrees-of-freedom make it possible to control the orientation and the x/y position of the user on the simulated floor (Wang et al., Japan [Wang03]). Another solution described in [Dis03] uses a “universal wheel”, in which the treads are not tyres but are composed of several freely spinning rollers which actuate the surface. Furthermore, in [Roston97] a design of programmable foot platforms providing 3D motion has been proposed. Similar systems are currently in prototype development at Cybernet Systems Corp. (www.cybernet.com) and Sarcos Corp. (www.sarcos.com).
Apart from this, a different approach suggested for omni-directional walking is the use of fully immersive spherical projection systems. Such designs are described, for instance, by VR-Systems Inc. (www.vr-systems.ndtilda.co.uk) and Time's Up Inc. (www.timesup.org).
Although such treadmills have the advantage of being holonomic [cf. Campion96], these systems have not passed the early prototype stage. Common problems with these designs are their high complexity and the noise they generate. Most importantly, however, current treadmill designs attempt to immediately counteract each step of the user, which generates relatively large forces due to abrupt accelerations. This leads to unnatural walking behaviour, which is why all attempts made to date remain unsatisfactory. In summary, no convincing omni-directional system is available as yet.
One key idea of CyberWalk is to generate only gentle accelerations of the floor under the user’s feet and to pull the user only toward the centre of the platform. That is, the user may take a few steps before the platform is fully accelerated (in rotation and translation), and the platform may take a while to stop after the user stands still. A planar design of the CyberCarpet is key here. The advantage of the proposed design concept is that the forces can be kept relatively small and therefore should not interfere with natural – quick or slow – walking behaviour. Apart from delivering a better walking experience, this feature will also lead to higher user acceptance and throughput (e.g. of a visitor queue at a museum display), as users require less time to adapt to the system. Compared to state-of-the-art technology, we envision that the CyberCarpet will be easy to use and will enable highly immersive virtual locomotion by allowing: i) unrestricted omni-directional walking, ii) quick or slow movements, and iii) stepping over and crossing legs.
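The "gentle pull toward the centre" can be sketched as a saturated proportional law on the user's offset. The gain and speed cap below are hypothetical placeholders to be fixed by the psychophysical studies, not project specifications:

```python
import math

def recentre_velocity(x, y, gain=0.3, v_max=1.5):
    """Desired surface velocity (m/s) that slowly carries the user from
    offset (x, y) back toward the platform centre: proportional to the
    offset, with a speed cap so the platform never reacts abruptly."""
    vx, vy = -gain * x, -gain * y            # point from user to origin
    speed = math.hypot(vx, vy)
    if speed > v_max:                        # saturate far from centre
        vx, vy = vx * v_max / speed, vy * v_max / speed
    return vx, vy
```

Because the gain is deliberately small, the user indeed takes a few steps before the platform is fully accelerated, and the commanded velocity decays gently once the user stands still.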
In order to achieve the goal of producing only gentle accelerations on the user, the platform ideally should be very large. From a technical and financial viewpoint, however, there will be distinct limits to the size of the platform, which still have to be explored. One possibility to counteract a restricted platform size is to use “perceptual tricks” to virtually increase the effective size of the platform. Initial work has been done in this direction (e.g., [Slater93, Slater95, Slater98, Razzaque02, Popp04]). It will be part of the CyberWalk project to explore further whether, for example, by introducing different gain factors for the visual display and the movement of the motion platform, we can virtually generate a larger workspace without this being noticed by the user. Gain factors applied while the user turns may serve a similar purpose: they may force the user to follow a circular path while believing they are walking straight. Such perceptual tricks should virtually increase the workspace. Other “perceptual tricks” will be considered as well, concerning multimodal channel integration, sensory recalibration mechanisms, and human sensorimotor adaptation.
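As a minimal sketch of the gain-factor idea: real motion increments measured on the platform are scaled before being applied to the virtual viewpoint. The gain values below are purely illustrative; which gains go unnoticed is precisely what the psychophysical studies must determine.

```python
def apply_gains(dx, dy, dyaw, g_trans=1.2, g_rot=1.1):
    """Map a real motion increment (dx, dy in metres, dyaw in radians)
    to the increment applied to the virtual viewpoint.  Gains > 1 let
    the user cover more virtual ground (or turn further) than he or she
    does physically, effectively enlarging the virtual workspace."""
    return g_trans * dx, g_trans * dy, g_rot * dyaw
```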
As part of the challenging task of developing such a complex human-centred walking platform, many scientific questions have to be addressed. Several concern the control of the platform: only precise control algorithms will achieve the intended gentle forces on the user. Several issues may affect the walking behaviour and have to be considered in the overall mechatronic design of the CyberCarpet, such as differential acceleration of the individual feet, centrifugal forces due to platform rotations, chatter vibrations due to friction, or undesired singularities that may occur at particular platform locations.
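To illustrate one of these constraints: the centrifugal load on a walker standing off-centre grows with the square of the turntable rate, so comparing it against the perceptual detection threshold directly bounds the admissible rotation speeds. This is a back-of-the-envelope check, not a design formula from the project:

```python
def centrifugal_acceleration(radius, turntable_rate):
    """Centripetal acceleration a = omega^2 * r (m/s^2) experienced by a
    walker standing `radius` metres from the rotation axis while the
    turntable spins at `turntable_rate` rad/s."""
    return turntable_rate ** 2 * radius
```

For instance, a turntable rate of 0.5 rad/s at 2 m off-centre already produces 0.5 m/s², which must stay below whatever threshold the psychophysical studies establish.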
Input to all the control algorithms will be the position of the user on the platform. It is foreseen that this position will be extracted by a marker-less visual tracking approach. Although this approach poses an extra challenge, we want the tracking to be marker-less for two reasons: it increases the chances of immersion in the virtual environment (markers always interfere with the user’s task, e.g. by imposing dedicated clothing), and it increases both user friendliness and client throughput (no additional set-up procedure is needed, and therefore less queuing at public displays or medical training stations based on CyberWalk technology). The task will be to extract the position (updated at high rates with small latencies) and possibly the posture (at slower rates) of the user on the platform from regular camera images. Posture information may be useful in two respects: first, to allow for gesture-based interaction with the application scenario, and second, to enable movement predictions of the user. Such predictions would be most useful for counteracting a movement even before it is actually executed, thereby further enhancing the control of the platform.
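As a minimal illustration of the prediction idea (not the project's actual tracker, which will operate on camera images), an alpha-beta filter can smooth noisy position measurements along one axis and extrapolate a short horizon ahead, giving the platform controller a head start; all parameter values are illustrative:

```python
class AlphaBetaTracker:
    """Alpha-beta filter for one coordinate of the user's position:
    smooths noisy measurements and predicts a short time ahead so the
    platform controller can react before a step is completed."""

    def __init__(self, alpha=0.85, beta=0.05):
        self.alpha, self.beta = alpha, beta
        self.pos, self.vel = 0.0, 0.0

    def update(self, measured_pos, dt):
        predicted = self.pos + self.vel * dt       # predict forward
        residual = measured_pos - predicted        # measurement residual
        self.pos = predicted + self.alpha * residual
        self.vel += self.beta * residual / dt      # correct velocity
        return self.pos

    def predict(self, horizon):
        """Extrapolated position `horizon` seconds ahead."""
        return self.pos + self.vel * horizon
```

On a noiseless constant-velocity signal the filter tracks exactly; with camera noise, alpha and beta trade smoothing against responsiveness.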
As mentioned, the project will culminate in a showcase that combines the scientific and technological advancements of CyberWalk. In particular, we plan to let the user pay a virtual visit to the ancient site of Pompeii. However, this scenario is meant to be exemplary and should be easily adaptable to a range of other application areas. We plan to develop software that automatically creates complex environments based on prescribed design rules. First applications of virtual walking are already emerging. These include training of escape routes for emergency situations (e.g. in burning or destroyed houses) [Jung99], military training, architectural walk-throughs, and rehabilitation. However, none of these systems has been successful, owing to unsatisfactory and limited VR hardware. It seems almost certain that real locomotion in virtual worlds increases the sense of presence. By properly integrating realistic proprioceptive and visual cues, the sense of immersion is expected to exceed everything achievable with hand-held locomotion control devices (such as wands, 3D mice or walking-in-place sensorized mats). All VR projects including a locomotion component – and this is the vast majority – could benefit greatly from the science and technology developed within CyberWalk.
The synthesis of psychophysical fundamentals, advanced technology, and user experience from actual application scenarios that underlies the CyberCarpet prototype walking device will result in a set of design guidelines for immersive VR walking systems in general.
Quantifiable objectives will result from the psychophysical work, which will formulate specifications for the acceleration limits beyond which users begin to notice that they are not walking on stable ground. The parameters achieved by the platform will then have to be compared against these specifications. The fact that the latter are not yet known highlights the innovative nature of the project and can hardly be held against it.
Andrew J. Mitchell, “Omni-directional treadmill”, International Patent Application PCT/GB97/00785, March 20, 1996.
Bülthoff, H.H. and van Veen, H.A.H.C. (2001): Vision and action in virtual environments: modern psychophysics in spatial cognition research. In: Vision and Attention, (Eds.) J. Jenkins, L. Harris. (Springer, New York).
Campion G., Bastin G. and d'Andrea-Novel B. (1996): Structural Properties and Classification of Kinematic and Dynamic Models of Wheeled Mobile Robot, IEEE Trans. on Robotics and Automation, vol. 12, pp. 47-62.
Darken, R.P., Cockayne, W.R., & Carmein, D. (1997). The Omni-Directional Treadmill: A Locomotion Device for Virtual Worlds. Proceedings of UIST '97, pp. 213-221.
Deutscher J., Blake A. and Reid I. (2000): Articulated Body Motion Capture by Annealed Particle Filtering, CVPR, pp. 126-133.
Dispositif de déplacement omnidirectionnel [Omni-directional locomotion device], provisional patent no. 2003/0198, filed 26 March 2003.
Ernst, M. O., Banks, M. S. and Bülthoff, H. H. (2000): Touch can change visual slant perception. Nature Neuroscience 3 (1), 69-73.
Ernst, M. O. and Banks, M. S. (2002): Humans integrate visual and haptic information in a statistically optimal fashion. Nature 415, 429-433.
Hillis, J.M., Ernst, M. O., Banks, M. S. and Landy, M. S. (2002): Combining sensory information: Mandatory fusion within, but not between, senses. Science 298, 1627-1630.
Hollerbach, J.M., Xu, Y., Christensen, R., and Jacobsen, S.C. (2000): Design specifications for the second generation Sarcos Treadport locomotion interface, Haptics Symposium, Proc. ASME Dynamic Systems and Control Division, DSC-vol. 69-2, Orlando, pp. 1293-1298.
Isard M. and Blake A. (1998): CONDENSATION -- Conditional Density Propagation for Visual Tracking. International Journal on Computer Vision, vol. 1, no. 29, pp. 5-28.
Iwata H. (1999): Walking About Virtual Environments on an Infinite Floor, IEEE Virtual Reality, 13 - 17 March.
Jung, T. & Haulsen, I.: Immersive Escape-Route Scenario with Locomotion Devices, Proceedings of Workshop on Spatial Cognition in Real and Virtual Environments, Tübingen, Germany, April 1999.
Karnath H-O, Ferber S, Himmelbach M (2001): Spatial awareness is a function of the temporal not the posterior parietal lobe. Nature 411, pp. 950-953.
Lewald J, Karnath H-O (2002): The effect of whole-body tilt on sound lateralization. European Journal of Neuroscience 16, pp. 761-766.
Mittelstaedt, M.-L. & Mittelstaedt, H. (1997) The effect of Centrifugal Forces on the Perception of Rotation about the Vertical Axis. Naturwissenschaften 84, 366-369.
Niemeier M, Karnath H-O (2003): Stimulus-driven and voluntary saccades are coded in different coordinate systems. Current Biology 13, pp. 585-589.
Noma H. and Miyasato T. (1999): A New Approach for Cancelling Turning Motion in the Locomotion Interface, ATLAS. Proc. of ASME-DSC-Vol. 67, pp. 405-406.
Popp, M. M., Gouy, E., & Holtmannspötter, J. (2004): Real Walking in Virtual Environments: A new Experimental Device. EuroHaptics Conference Proceedings 2004 (Ed. Buss. M., Technische Universität München).
German Patent Application 10 2004016429.0 by the Max-Planck Society
Razzaque, S., Swapp, D. Slater, M. Whitton, M. C. & Steed, A. (2002). Redirected Walking in Place. Eight Eurographics Workshop on Virtual Environments (Eds, S. Müller, W. Stüzlinger)
Rehg M. and Kanade T. (1995): Model-Based Tracking of Self-Occluding Articulated Objects, CCV, pp. 612-617.
Roston, G.P., and Peurach, T. (1997). A whole body kinesthetic display device for virtual reality applications. Proc. IEEE Intl. Conf. Robotics and Automation, 3006-3011.
Schraudolph N.N. (1999): Local Gain Adaptation in Stochastic Gradient Descent, Proc. Int. Conf. Artificial Neural Networks, pp. 569-574.
Schraudolph N.N. (2002): Fast Curvature Matrix-Vector Products for Second-Order Gradient Descent, Neural Computation, vol. 14, no. 7, pp. 1723-1738.
Sidenbladh H., Black M. J. and Fleet D. J.(2000): Stochastic Tracking of 3D Human figures using 2D Image Motion. ECCV, pp. 702-718.
Slater, M., Steed, A., & Usoh, M. (1993). “The Virtual Treadmill: A Naturalistic Metaphor for Navigation in Immersive Virtual Environments,” First Eurographics Workshop on Virtual Reality, ed. M. Goebel, 71-86.
Slater, M., Usoh, M., & Steed, A. (1995) “Taking Steps: The Influence of a Walking Technique on Presence in Virtual Reality,” ACM Trans. on CHI, Special Issue on Virtual Reality Software and Technology, 2, 3: 201-219, September.
Slater, M., Steed, A., McCarthy, J, Maringelli, F. (1998). The Influence of Body Movement on Subjective Presence in Virtual Environment, Human Factors, 40(3), 469-477.
Sminchisescu C. and Triggs B. (2001): Covariance Scaled Sampling for Monocular 3D Body Tracking, CVPR, pp. 447-454.
Usoh, M., Arthur, K., Whitton, M.C., Bastos, R., Steed, A., Slater, M. & Brooks, Jr., F.P. (1999). Walking > Walking-in-Place > Flying, in Virtual Environments. Proceedings of SIGGRAPH 99.
Wang Z., Bauernfeind K. and Sugar T. (2003): Omni-Directional Treadmill System, VR2003, Haptics Symposium.
Wu Y., Lin J. Y. and Huang T. S. (2001): Capturing Natural Hand Articulation, ICCV, pp. 426-432.