360-Video Multiuser Analysis Tool

A 360-video multiuser analysis tool is currently being developed at the lab as a form of teleconferencing: multiple people in multiple places can watch the same video together in virtual reality. The application currently runs on the Oculus Rift and uses Oculus Touch controllers for annotations, as seen in blue in the photo above. The development team, consisting of Mitchell Berger, Isaac Andersen, and Rayan Tofique, is working toward pause-and-play functionality as well as other networking and UI components. They see one potential use of the application as a teaching tool in museums and other interactive learning environments. The featured photo is a screenshot of the interface, including purple annotations.

The Effects of Rest Frames on Simulator Sickness Reduction

Master’s student Zekun Cao is currently working on a project that addresses simulator sickness, a very common problem in virtual reality. VR sickness occurs, like most forms of motion sickness, when visual, vestibular (inner ear), and muscular cues do not match up the way they normally would in the real world. Take a flight simulator, for example. While flying an actual plane, a person would both feel and see the plane turning, rolling, and pitching, and their muscles would react to the movement accordingly. In virtual reality, the only sensory input received is visual, and this separation between what is felt and what is seen is what makes users sick. This project researches the effects of rest frames on simulator sickness reduction. In Zekun’s model, the rest frame takes the form of a mask that appears in the user’s field of view. This fixed reference point was hypothesized to help users cope with simulator sickness, and a user study supported the hypothesis. Participants were asked to complete six laps of a circle in a simulated environment and to submit a sickness rating from 1 to 10 at six points around the circle. With the rest frame, subjects completed more laps and submitted lower sickness scores on average.
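As an illustration of the idea (a minimal sketch, not Zekun’s actual implementation), a rest frame can be kept stable in the field of view by anchoring it to the tracked head pose rather than to the moving virtual world: the mask’s world position is the head pose composed with a fixed head-local offset. Assuming, for simplicity, a yaw-only head rotation:

```python
import math

def head_anchored_pose(head_pos, head_yaw, local_offset):
    """Place an object (e.g., a rest-frame mask) at a fixed offset in the
    user's view by composing the head pose with a head-local offset.

    head_pos:     (x, y, z) head position in world space
    head_yaw:     rotation about the vertical axis, in radians
    local_offset: (x, y, z) offset in head space (z = forward)
    Returns the object's world-space position.
    """
    ox, oy, oz = local_offset
    c, s = math.cos(head_yaw), math.sin(head_yaw)
    # Rotate the local offset by the head yaw, then translate by the head position.
    wx = head_pos[0] + c * ox + s * oz
    wy = head_pos[1] + oy
    wz = head_pos[2] - s * ox + c * oz
    return (wx, wy, wz)

# The mask floats 0.5 m in front of the eyes wherever the user looks:
print(head_anchored_pose((0, 1.7, 0), 0.0, (0, 0, 0.5)))          # straight ahead
print(head_anchored_pose((0, 1.7, 0), math.pi / 2, (0, 0, 0.5)))  # after a 90° turn
```

Because the mask is re-placed from the head pose every frame, it behaves like the cockpit of a car or plane: a visually stationary reference even while the simulated world rushes past.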

Digital Cities and Cyberarchaeology

Digital Cities and Cyberarchaeology is a Bass Connections project currently housed at the DiVE. The project is a collaboration among individuals across multiple fields of study, including Classical Studies; Art, Art History and Visual Studies; remote sensing; and 3D technologies, with the goal of recreating an archaeological dig site in the Etruscan city of Vulci, Italy, in virtual reality. The recreation is built from spatial and environmental data gathered by an archaeological team onsite; it will be compatible with Oculus Rift and HTC Vive headsets and will allow for simultaneous use in hybrid systems. Researchers have conducted user testing to vet the project, focusing on ease of use and availability of data for the project’s target demographic: researchers who will use the product to visualize their work. Since the work is product-based, the results are only as good as users say they are, and according to the research team, feedback so far has been positive.

Specimen Box

Virtual reality is becoming better and better at imitating the real world as graphics and other technologies improve. Yet one of the barriers that continues to keep us from being fully immersed is the need for handheld wands, or other tracked devices, to aid interaction. The Specimen Box project explores new interaction techniques that may be more natural in world-fixed displays, such as the CAVE-type system here at the Duke Immersive Virtual Environment.

The Specimen Box is a clear physical box with a tracking component which, when in the environment, appears to house virtual content. Users can hold the box and manipulate the content inside. The user feels the weight of a physical object in their hands, resulting in less cognitive dissonance between the user’s mental understanding of what interacting with an object should feel like and what it actually feels like in virtual reality. Researchers hypothesized that this new interaction technique would increase the speed at which individuals can manipulate objects when asked.

A recent user study asked subjects to read the colors written on each side of a virtual box projected inside the Specimen Box. In the easiest trial, the font color matched the word that was written (i.e., the word “green” written in green). In the harder trials, the font color did not match the word. The study found over repeated trials that participants’ response times were faster when manipulating a physical box than when manipulating a virtual box.
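The color-word task described here is a classic Stroop test. As an illustration (not the study’s actual code), a balanced block of congruent and incongruent trials could be generated along these lines:

```python
import random

COLORS = ["red", "green", "blue", "yellow"]

def make_trial(congruent, rng):
    """Build one Stroop trial: a color word plus the font color it is drawn in.
    Congruent trials use a matching font color; incongruent trials pick a
    different one."""
    word = rng.choice(COLORS)
    if congruent:
        return word, word
    font = rng.choice([c for c in COLORS if c != word])
    return word, font

def make_block(n_trials, p_congruent=0.5, seed=42):
    """Generate a shuffled block with a fixed proportion of congruent trials."""
    rng = random.Random(seed)
    flags = [i < round(n_trials * p_congruent) for i in range(n_trials)]
    rng.shuffle(flags)
    return [make_trial(f, rng) for f in flags]

for word, font in make_block(12):
    kind = "congruent" if word == font else "incongruent"
    print(f"word={word:<6} font={font:<6} {kind}")
```

In the actual study, each trial would be displayed on a face of the box and the response time recorded; incongruent trials are expected to be slower because reading the word interferes with naming the color.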

Escape Room

Senior Meng’en Huang developed a virtual reality application, Escape Room, as an independent study project. The goal of her independent study was to explore the equipment and applications of a virtual environment. As a Computer Science and Visual Media Studies double major, Huang’s interest in virtual reality stems from its ability to merge her interests.

Developed in Unity and enabled to interact with the DiVE through MiddleVR, Escape Room combines trendy entertainment and cutting-edge technology. It presents the user with a variety of puzzles that need to be solved before the room can be escaped, just like a real escape room.

In a real life escape room, a group of people are locked in a room and given many tasks to complete before being allowed to escape. The participants have access to hints if they are having trouble. Escape rooms combine both the social and the physical, enabling people to be entertained without a screen.

Escape rooms have been present in Asia, Europe, and on the West Coast for years. Now they are an emerging popular form of entertainment in many large United States cities. For more information on escape rooms, read Jessica Contrera’s article in the Washington Post called “It’s no puzzle why ‘escape room’ adventures are so popular.”

The application that Huang created is a virtual version of these escape rooms. It is an old-time wooden house which is inhabited by a soul who wants to be loved and understood. The soul communicates through a mask in the wall. The player may only leave the room when the soul’s requests are met.

The house has an eerie feel to it. It is decorated with floating candles, no door, a mirror, and unintelligible clocks and paintings. When the player attempts to examine something closely, he or she is pushed away. Through exploring and experimentation, the player incidentally determines how to move things around in the room.

There is a risk involved in playing this game: If a player pushes something out of the room, the thing will never be able to reenter the room. Huang says that if this happens the player “may lose the chance to satisfy the lonely soul and will have to accompany him forever.”

The DiVE wishes Meng’en the best in her post-graduation adventures!

Wayfinding by Audio Cues in Virtual Environments

Ayana Burkins, an undergraduate Computer Science major, hopes to evaluate the use of 3D spatial sound as a wayfinding aid in virtual environments. Users navigate mall-like mazes with various target locations, each of which may contain a localized audio cue, and perform several wayfinding tasks. The hypothesis is that users will perform the tasks faster and more accurately in an environment with spatial sound than in one without.
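As a rough sketch of how a localized audio cue can be rendered (an illustration, not Burkins’ implementation), the cue’s loudness can fall off with distance and its left/right balance can follow the bearing of the source relative to the listener’s facing direction:

```python
import math

def audio_cue(listener_pos, listener_yaw, source_pos, ref_dist=1.0):
    """Approximate a localized audio cue with inverse-distance attenuation
    plus left/right panning from the source's bearing.

    listener_pos, source_pos: (x, y, z) world positions
    listener_yaw: facing direction about the vertical axis, in radians
                  (0 = facing +z, positive x is to the listener's right)
    Returns (left_gain, right_gain).
    """
    dx = source_pos[0] - listener_pos[0]
    dz = source_pos[2] - listener_pos[2]
    dist = math.hypot(dx, dz)
    # Inverse-distance rolloff, clamped so nearby sources do not blow up.
    gain = ref_dist / max(dist, ref_dist)
    # Bearing of the source relative to where the listener is facing.
    bearing = math.atan2(dx, dz) - listener_yaw
    pan = math.sin(bearing)  # -1 = hard left, +1 = hard right
    left = gain * (1 - pan) / 2
    right = gain * (1 + pan) / 2
    return left, right

# A cue two meters straight ahead: quiet and centered.
print(audio_cue((0, 0, 0), 0.0, (0, 0, 2)))  # → (0.25, 0.25)
```

The interaural level difference this produces is what lets a user turn toward a target they cannot yet see, which is exactly the wayfinding behavior the study measures.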

Walk Again Project

The first ceremonial kick of the World Cup game (Brazil 2014) may be made by a paralyzed teenager, who, flanked by the two contending soccer teams, will saunter onto the pitch clad in a robotic body suit.

Led by Miguel Nicolelis, the Walk Again Project is a nonprofit, international collaboration among the Duke University Center for Neuroengineering, the Technical University of Munich, the Swiss Federal Institute of Technology in Lausanne, the Edmond and Lily Safra International Institute of Neuroscience of Natal in Brazil, the University of California, Davis, the University of Kentucky, and Regis Kopper of the Duke Immersive Virtual Environment.

The project started with research from the Nicolelis lab using hair-thin, flexible sensors, known as microwires, that have been implanted into the brains of rats and monkeys. These flexible electrical prongs can detect minute electrical signals, or action potentials, generated by hundreds of individual neurons distributed throughout the animals’ frontal and parietal cortices, the regions that define a vast brain circuit responsible for the generation of voluntary movements.

Now, with further advancements, the candidate teenage kicker will be trained in virtual reality to control the technology that will eventually allow them to kick the ball at the World Cup. They will do this by wearing a non-invasive headpiece that detects brain waves.

More information about the Walk Again Project can be found in this Washington Post article.