Ethics in Investigational and Interventional Uses of Immersive Virtual Reality (e3iVR) is a conference planned for April 26-27, 2017, at the Discovery Building in Madison, Wisconsin. The objective of the conference is to produce a set of guidelines for the ethical conduct of research experiments and clinical treatments, covering informed consent, integrity of the sense of person, meaningful explanation of the side effects and consequences of a VR experience, and fair notification of the extensive data tracking now possible in these systems. The conference opens with an optional tour of the LEL CAVE and three public lectures, open to a general audience, introducing the ideas of VR, cognition and behavior, and ethics. An invited group of participants will then spend an evening and the following day exploring ethical issues and proposing a parsimonious set of guidelines. For more information on e3iVR, visit the conference website.
Virtualized Homes – Tools for better discharge planning
Contemporary technologies make it possible to create full-scale 3D models of the interiors of a patient’s home. Displaying these models and making them accessible through the electronic health record (EHR) could accelerate patient-centered care research by supporting systematic assessment and by enabling researchers and clinicians to detect safety barriers, recommend reconstruction, and tailor care instructions to the specific space in which care will occur. The models could be linked with other EHR data, enabling exploration of the relationships between health states and the conditions of the homes. The purpose of this R03 proposal is to develop Home3D, a technical implementation that would allow annotated versions of these models to be used during clinical care and stored in the EHR. With proper protection, these models could be made available for research use in a manner similar to other EHR data. R03HS024623-01
More and more health care occurs in the home, and a growing number of computer tools are becoming available to help people manage personal health concerns and health information. This project will investigate how features of the home interior, such as the adequacy and privacy of space available for health activities, and common household objects like calendars and file cabinets, help or hinder people as they manage health information.
The Advanced Visualization Space is planned as a 400-square-foot immersive virtual reality tracking space located in Signe Scott Cooper Hall in the School of Nursing at the University of Wisconsin–Madison. A motion capture tracking system will allow two to three operators to interact within the same virtual space simultaneously. The long-term goal for this facility is the exploration of new types of nursing training and fundamental virtual reality research.
The Roots project aims to bring the analytical methods of biology into three dimensions. The project will use a 3D growth chamber for imaging living systems, engineered at the scale of plant roots with the ability to scale down to the analysis of neural cells. An interdisciplinary group at the LEL is collaborating to integrate this 3D growth chamber with a 3D imaging system to capture and visualize the growth of plant roots.
ALICE (Augmented Live Interactively Controlled Environment) is an interdisciplinary research project which melds existing technologies to pioneer a new live performance methodology. We aim to accomplish this by enabling the performers (actor, dancer, musician, etc.) to interact with their stage environment in a dynamic and unique way. By integrating video projection, entertainment automation, motion capture, and virtual reality technologies together, we aim to enable new possibilities in live performance and enhance the audience’s experience.
3D scans and quadcopter videos of the estate of architect Frank Lloyd Wright were taken in Spring Green, Wisconsin, and then transformed into a 3D point cloud for visualization in the CAVE. The LEL is partnering with the Frank Lloyd Wright Foundation to share the results of the project in several statewide outreach events.
Model This! Contest
The Model This! Contest called for the participation of students in grades 6–12 to develop an idea for an immersive, 3D virtual environment for our CAVE. The contest aspired to engage students in creative planning and generate interest in science and technology. The inaugural winners of the contest are a team of three students: Eva Gemrich, Jack Harris, and Dylan Helmenstine from Wisconsin Heights High School. Their concept proposes an aesthetically pleasing and educational virtual rendition of the D.C. Smith Conservatory. The LEL is currently working with the team to create this virtual environment for the 2015 Wisconsin Science Festival.
Crime Scene Visualization
Crime Scene Visualization is the result of an emerging collaboration with the Dane County Sheriff’s office. This project uses 3D scanning technology to capture crime scenes that can later be visualized, allowing the scenes to be virtually revisited at any point in time in explicit detail. The point cloud data sets also provide a significantly more efficient way to collect key measurements during processing of the crime scene. The LEL and the Dane County Sheriff’s Office are currently working together on a grant proposal to expand the use of 3D scanning by law enforcement agencies. Learn more about this project by reading this article.
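At its simplest, taking a key measurement from point cloud data reduces to the Euclidean distance between two selected points. Below is a minimal Python sketch; the coordinates are hypothetical stand-ins for points picked out of a scanned scene, not data from the project.

```python
import math

def distance(p1, p2):
    """Euclidean distance between two 3D points, in the scan's units (e.g. meters)."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p1, p2)))

# Two hypothetical points selected in a scanned scene (meters).
mark_a = (1.0, 0.5, 0.0)
mark_b = (4.0, 0.5, 4.0)
print(distance(mark_a, mark_b))  # 5.0
```

Because the scan preserves the full geometry, any such measurement can be taken long after the physical scene has been released.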
UW–Madison runs a high-throughput cluster system using the HTCondor scheduling system, which is operated and maintained by the Center for High Throughput Computing (CHTC). This project focuses on visualizing the performance and load distribution of the cluster by displaying collected log files. The log files are timestamped and gathered over long time periods; the visualization plays them back one frame at a time, with the ability to pause or reverse. Load on the cluster and on individual nodes can therefore not only be seen, but also traced in time. Questions such as “Which part of the cluster is running idle at 2 a.m.?” or “Where is the highest load concentrated?” can be answered quickly and intuitively, with the time-varying datasets showing trends and patterns in cluster usage. The project is a collaboration between the LEL and the CHTC, with Markus Broecker from the LEL, and Becky Gietzel and Nate Yehle from the CHTC.
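The playback mechanism described above can be sketched as grouping timestamped log records into frames and stepping an index forward or backward. The class below is an illustrative simplification, not the project’s actual code; the record format (timestamp, node, load) is an assumption.

```python
from collections import defaultdict

class LogPlayback:
    """Step through timestamped load records one frame at a time.

    `records` is a hypothetical list of (timestamp, node, load) tuples
    parsed from scheduler log files; one frame per distinct timestamp.
    """
    def __init__(self, records):
        frames = defaultdict(dict)
        for ts, node, load in records:
            frames[ts][node] = load
        self.frames = dict(frames)
        self.timestamps = sorted(self.frames)
        self.index = 0
        self.paused = False

    def current(self):
        """Return the (timestamp, {node: load}) frame at the playhead."""
        ts = self.timestamps[self.index]
        return ts, self.frames[ts]

    def step(self, direction=1):
        """Advance (+1) or reverse (-1) one frame, clamped to the data range."""
        if not self.paused:
            self.index = max(0, min(len(self.timestamps) - 1, self.index + direction))
        return self.current()
```

Rendering each frame in sequence reproduces the cluster’s load history; reversing the step direction rewinds it.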
Visualizing Neutrino Events
The LEL has teamed up with the Wisconsin IceCube Particle Astrophysics Center to create a 3D visualization of neutrino events as observed by the IceCube detector. IceCube is an operating detector located in the ice sheet at the South Pole. A cubic kilometer of ice has been instrumented with more than 5,000 light sensors that record interactions from neutrinos passing through the Earth. This summer, a team of programmers will add new capabilities to the neutrino event visualization software, giving visitors to the LEL’s CAVE the unique opportunity to see neutrino events as if they were standing inside the IceCube detector, more than a mile deep in the South Pole ice.
Smart Asthma Management
Asthma is an acute and chronic lung disease that affects more than 23 million Americans, about 7.9 percent of the population. The cost of asthma is significant both for individuals and for society as a whole. Recent developments in sensor and mobile computing technology enable patients to create self-reported data and record their symptoms, medication usage, and vital physiological signals.
The SAM project takes advantage of this unprecedented patient-generated data to estimate patient conditions and drive clinical intervention decisions. The project is establishing a series of statistical modeling, monitoring, prognosis, and clinical intervention decision-making methodologies for SAM systems.
SculptUp is an immersive modeling system that makes creating complex computer-generated images easier and faster via an interaction paradigm that resembles real-world sculpting and painting. Users interact with the immersive virtual environment through a handheld wand and voice recognition. There are two unique states of manipulation: sculpt mode and world mode. In sculpt mode, the user creates, modifies and paints volumetric material using a minimalist control scheme. In world mode, the user positions and resizes his or her detailed volumetric models to design a large-scale scene. Ultimately, SculptUp’s interface substantially expedites the creative process and catalyzes rapid scene prototyping.
Learning Kanban in Healthcare
The healthcare industry increasingly applies engineering methods to improve the healthcare system, creating a demand for engineers who understand how to design and implement efficient solutions in healthcare settings. Because it is difficult and expensive for engineering students to visit a healthcare location to study the system firsthand, the LEL has created an innovative approach to engage students: using the CAVE to simulate realistic healthcare environments that can be studied, interacted with, and designed for. This method allows students to learn about the healthcare system without disrupting patient care or impacting workflows. In a collaboration with the Department of Industrial and Systems Engineering at UW–Madison, engineering students came to the CAVE to learn how to apply a common industrial engineering method, Kanban, to manage inventory levels in an operating room. Students collaborated in pairs using a novel technology called Dual View, which gives each user an independently tracked perspective within the CAVE.
Led by Theme Leader Patricia Flatley Brennan, this project focuses on developing a systematic study of imagination and its potential for encouraging individuals to engage in their own health outcomes. Goals of the project include applying emerging computer technologies to better understand the human experience of imagination, discern alignment between various approaches to manipulating and measuring imagination and evaluate interventions.
Voice Activated CAVE
Traditionally, 3D models used within a virtual reality application must reside locally on the system in order to be properly loaded and displayed. These models are often manually created using traditional 3D design applications such as Autodesk products or Trimble SketchUp. Within the past five to ten years, web-based repositories of 3D models, such as the Trimble 3D Warehouse, have become available for sharing 3D models with other users around the world. PC speech recognition capabilities have also become much more commonplace and accessible to developers, particularly with the advent of the Microsoft Kinect. We’ve developed a method to search an online model repository, the Trimble 3D Warehouse, from within our fully immersive CAVE by using speech. Search results are displayed within the CAVE, and the user can select a model to download and watch it drop into the CAVE. The newly acquired model can then be instantly manipulated and placed within the user’s current scene at a desired location.
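The voice-driven workflow can be illustrated as a small command dispatcher that routes recognized utterances to repository actions. Everything below is a hypothetical stand-in: the repository class, the two-word command grammar, and the catalog do not reproduce the real Trimble 3D Warehouse API or the speech recognizer used in the project.

```python
class ModelRepository:
    """Stand-in for an online 3D model repository (hypothetical, in-memory)."""
    def __init__(self, catalog):
        self.catalog = catalog  # model name -> model payload

    def search(self, query):
        """Case-insensitive substring search over model names."""
        return [name for name in self.catalog if query.lower() in name.lower()]

    def download(self, name):
        """Fetch a model's payload by exact name."""
        return self.catalog[name]

def handle_utterance(text, repo, scene):
    """Dispatch a recognized phrase like 'search chair' or 'select Office Chair'."""
    command, _, rest = text.partition(" ")
    if command.lower() == "search":
        return repo.search(rest)
    if command.lower() == "select":
        model = repo.download(rest)
        scene.append(model)  # 'drop' the model into the current scene
        return model
    return None
```

In the real system, the recognized text would come from the speech engine and the downloaded model would be loaded into the immersive scene rather than appended to a list.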
Users perceive virtual environments differently than they do physical spaces, but the causes of these discrepancies are unknown. Led by researcher Kevin Ponto, this project designed a novel method for generating 3D content that is calibrated to the perception of each individual. Empirical human subject testing showed that this calibrated experience can mitigate many of the artifacts attributed to immersive virtual reality.
Led by researchers Rob Radwin and Kevin Ponto, this project creates a system enabling users to interact with virtual objects via natural body movements. The project combines visual information, kinesthetic and biofeedback methods, which allow users to grasp, move and drop objects via muscle exertions that match the real world.
Effective Presentation Virtual Experiences
Led by Kevin Ponto, this project presents methods for replaying the experiences of users in immersive virtual environments by developing and implementing novel metrics for projecting participant movements and viewpoints, as well as presenting performance analyses. Continued work enabled online real-time presentation of experiences for external viewers.
Just-in-Time Information for Consumers
If a house could look inside its refrigerator and report back on what it found, would that help the people living in that house eat healthier meals? Discovery Fellow Catherine Arnott-Smith is interested in how the home may be the ultimate site for behavioral interventions in the general public. Library and information science (LIS) brings a valuable perspective to these interventions. Unlike medicine and nursing, in LIS, the information-seeker is the focus of the process. This focus places control with the information-seeker rather than with the healthcare professional. This is an ideal perspective from which to investigate preclinical behaviors around health-related information and to isolate specific components of diabetes and nutrition information which “contribute most to behavior change and long-term clinical benefit.” Virtual reality (VR) is a technology that allows us to study individual agency in action and information in context, as it is used in real-time by healthy consumers. Researchers with the project want to understand this dimension of information behavior through exploratory research. VR CAVEs may provide sufficient contextual cues to best test real-world, real-time approaches.
Trauma Bay Scenario
This project simulates an ER trauma bay based on a similar training space in the UW Hospital Simulation Center and enables users to diagnose and treat a simulated tension pneumothorax. The thrust of the project is to further develop and utilize this scenario to conduct research on individual and team performance in stressful trauma settings. We also hope to evaluate the efficacy of medical procedural training in a virtual environment, while offering an opportunity to develop multiple clinical scenarios in varied settings for medical teamwork and procedural training. Ideally, this scenario will help other researchers create improved clinical spaces and instrumentation where practitioners can deliver medical care.
<1>:“der”//pattern for a virtual environment
<1>:“der”//pattern for a virtual environment is a 15-minute 3D art exhibition by Lisa A. Frank consisting of six explorable artworks initially developed for the LEL’s six-sided immersive CAVE. This exhibition presents the outcome of a collaboration between two different disciplines: computer science and the visual arts. With the assistance of Nathan Mitchell, a computer science graduate student, Lisa Frank reinterpreted her nationally exhibited artwork using a combination of programming strategies that endow the imaginary settings with a deep sense of spatiality. As the viewer shifts his or her gaze and position in virtual reality, the viewer senses that he or she is moving ‘through’ the pattern, is ‘inside’ of the pattern, and has actually ‘become’ the pattern. Plant life grows large and fades away while the immersant interacts with 3D models drawn out directly from the 2D plane. First shown in December 2011, the exhibit was enormously popular, with more than 800 people experiencing it during the course of 10 evenings.
What inspires us to move our bodies during art and expression? Does it come from within, or from our experience with the outside world? Each of us finds that voice; sometimes it comes from outside ourselves, in what we might hear or see, in scenery, environment, or other surroundings. With a grant from the Virginia Horne Henry Fund, a new collaboration between the LEL and Discovery Fellow Li Chiao-Ping of the Dance Department will explore these external inspirations through virtual reality (VR). The project allows dancers and the Dance Department to expand their art across campus into the domains of engineering and technology. We seek to expand the use of technology and virtual reality from a purely technical and computational modality into tools that can merge with dance, exploring the possibilities of using the CAVE with the human body and examining how expressive motion can be captured, interpreted, and made to interact with virtual reality space.
WordCAKE (Collated Aggregate of Key Elements) is a 3D model that incorporates multiple metrics of word frequency and is designed to show how words change over time, sequence, or document groupings. There is value in being able to distill correspondence, literary works, scripts, copy, notes, and other groupings of texts down to their key concepts or vocabulary. WordCAKE offers a more informative and complex representation of word frequencies than word clouds or tag clouds because it incorporates time/sequence, data context, and statistics. WordCAKE is a layered representation of key words in multiple texts, where each layer or disc represents one text. It uses alphabetical arrangement, placement, color, highlights, transparency, and interactive features to allow users to efficiently probe and explore word frequencies and how they change in any collection of .txt, PDF, or CSV files. The immersive 3D environment of the CAVE offers a text analysis experience that engages and interacts with Big Data through novel visualization techniques.
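The data behind WordCAKE’s stacked discs can be sketched as one alphabetically ordered word-frequency table per text, with the list order of the layers preserving the texts’ time or sequence. This is an illustrative simplification in Python, not the project’s implementation; the tokenization rule is an assumption.

```python
import re
from collections import Counter

def word_layers(documents):
    """One frequency 'layer' per text, keyed alphabetically.

    `documents` is an ordered list of plain-text strings; each returned
    dict corresponds to one disc in a stacked, WordCAKE-style view.
    """
    layers = []
    for text in documents:
        # Hypothetical tokenizer: lowercase runs of letters/apostrophes.
        words = re.findall(r"[a-z']+", text.lower())
        layers.append(dict(sorted(Counter(words).items())))
    return layers
```

Stacking these tables along a third axis, with a shared alphabetical layout per layer, lets a reader trace how a single word’s frequency rises and falls across the sequence of texts.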