Mixed Reality (MR) and Augmented Reality (AR) create exciting opportunities to engage users in immersive experiences, resulting in natural human-computer interaction.
Many MR interactions are generated around a First-person Point of View (POV). In these cases, the user's view is directed at the environment, which is digitally augmented either through a head-mounted display or a handheld computing device. One drawback of such conventional AR/MR platforms is that the experience is user-specific. Moreover, these platforms require the user to wear and/or hold an expensive device, which can be cumbersome and alter interaction techniques.
We create a solution for multi-user interactions in AR/MR, where a group can share the same augmented environment with any computer generated (CG) asset and interact in a shared story sequence through a third-person POV. Our approach is to instrument the environment leaving the user unburdened of any equipment, creating a seamless walk-up-and-play experience. We demonstrate this technology in a series of vignettes featuring humanoid animals.
Participants can not only see and hear these characters, they can also feel them on the bench through haptic feedback. Many of the characters also interact with users directly, either through speech or touch. In one vignette an elephant hands a participant a glowing orb. This demonstrates HCI in its simplest form: a person walks up to a computer, and the computer hands the person an object.
We create a 3D reconstruction of a scene using a combination of the depth and color sensors on an off-the-shelf Microsoft Kinect. To do this, we draw polygons using each point in the point cloud as a vertex, creating the appearance of a solid mesh. The mesh is then aligned to the RGB camera feed of the scene from the same Kinect. This alignment gives the mesh color, and completes a 3D reconstructed video feed.
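The grid structure of a depth image makes this meshing step straightforward: each pixel back-projects to a 3D vertex, and each 2x2 neighborhood of pixels yields two triangles. The sketch below illustrates the idea with numpy; the intrinsics are placeholder values, not the Kinect's calibrated parameters.

```python
import numpy as np

def depth_to_points(depth, fx=365.0, fy=365.0, cx=256.0, cy=212.0):
    """Back-project a depth image (meters) to 3D points with a pinhole model.
    fx, fy, cx, cy are illustrative intrinsics, not real calibration data."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.stack([x, y, z], axis=-1)          # shape (h, w, 3)

def grid_triangles(h, w):
    """Connect each 2x2 pixel neighborhood into two triangles, turning the
    point cloud into a solid-looking mesh."""
    idx = np.arange(h * w).reshape(h, w)
    a = idx[:-1, :-1].ravel(); b = idx[:-1, 1:].ravel()
    c = idx[1:, :-1].ravel(); d = idx[1:, 1:].ravel()
    return np.concatenate([np.stack([a, b, c], 1), np.stack([b, d, c], 1)])

depth = np.full((4, 4), 2.0)     # toy 4x4 depth image, 2 m everywhere
pts = depth_to_points(depth)     # (4, 4, 3) vertex grid
tris = grid_triangles(4, 4)      # (18, 3) triangle index list
```

In the real pipeline these vertices would then be textured by sampling the aligned RGB camera feed.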
There are several problems that arise with the 3D reconstructed feed. First, the monocular feed creates “depth shadows” in areas where there is no direct line-of-sight to the depth sensor. Second, the depth camera is laterally offset from the RGB camera (since they cannot physically occupy the same space), so the two cameras have slightly different viewing angles, creating further depth shadowing. The resulting data feed is sparse and cannot represent the whole scene (see Figure 3). To solve this, we align the 3D depth feed with the 2D RGB feed from the Kinect. By compositing the depth feed over a 2D backdrop, the system effectively masks these depth shadows, creating a seamless composite that can then be populated with 3D CG assets.
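The masking step reduces to a per-pixel fallback: wherever the depth sensor reports no return, show the 2D backdrop instead of the sparse reconstruction. A minimal sketch, assuming (as the Kinect does) that no-return pixels carry a depth of zero:

```python
import numpy as np

def composite_over_backdrop(recon_rgb, depth, backdrop_rgb):
    """Pixels with no depth return (depth == 0) are 'depth shadows':
    fall back to the 2D RGB backdrop there, so the final composite
    looks seamless even though the depth data is sparse."""
    valid = depth > 0                                  # usable-depth mask
    return np.where(valid[..., None], recon_rgb, backdrop_rgb)

# Toy 2x2 example: the pixel with zero depth shows the backdrop color.
recon    = np.full((2, 2, 3), 200, dtype=np.uint8)    # reconstructed feed
backdrop = np.full((2, 2, 3), 50, dtype=np.uint8)     # 2D backdrop
depth    = np.array([[1.2, 0.0],
                     [0.9, 1.1]])                     # meters; 0 = shadow
out = composite_over_backdrop(recon, depth, backdrop)
```

In practice the CG assets are then rendered between the reconstructed middle ground and the backdrop, completing the stage.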
This mixed reality platform centers around the simple setting of a bench. The bench works in a novel way to constrain a few problems, such as identifying where a user is and subsequently inferring the direction of the user’s gaze (i.e., toward the screen). It creates a stage with a foreground and background, with the bench occupants in the middle ground. The bench also acts as a controller; the mixed reality experience won’t trigger until at least one person is detected sitting on the bench. Further, different seating formations on the bench trigger different experiences.
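The bench-as-controller idea can be sketched as a simple mapping from bench occupancy to content. The function and experience names below are hypothetical; the real system relies on Kinect body tracking rather than a hand-built seat map.

```python
def select_experience(seats):
    """Hypothetical sketch: map a left-to-right tuple of occupied bench
    positions to a piece of content. Nothing triggers on an empty bench."""
    occupied = sum(seats)
    if occupied == 0:
        return "showreel"          # idle state: no experience triggers
    if occupied == 1:
        return "solo_vignette"
    if seats[0] and seats[-1] and not any(seats[1:-1]):
        return "ends_vignette"     # two people at opposite ends
    return "group_vignette"

state = select_experience((True, False, False))   # one person seated
```

The point of the sketch is the gating behavior: occupancy both starts the experience and parameterizes which one plays.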
Magic Bench is a custom software and hardware platform from Disney Research, necessitating a solution to bridge both aspects. Between the two sits a series of patches created in Cycling ’74 Max that convert signals sent from the game engine (via OSC) about the positions and states of objects in the scene into the haptic sensations felt on the bench. Haptic actuators are driven dynamically based on the location of animated content. The driving waveform for each actuator is designed according to the desired feel: in the current setup we can tweak base frequency, frequency of modulation, overall amplitude, amplitude envelope, and three-dimensional position. These parameters can be tuned manually and/or adjusted in real time.
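The actual signal path runs through Max patches, but the parameter set described above can be illustrated in a few lines: an amplitude-modulated carrier for the actuator waveform, and a spatial gain curve so the sensation follows the animated content along the bench. This is a sketch of the idea, not the production patch.

```python
import numpy as np

def haptic_waveform(base_hz, mod_hz, amp, duration=0.5, sr=8000):
    """Amplitude-modulated sine, a common recipe for driving haptic
    actuators. Parameters mirror those named in the text: base frequency,
    modulation frequency, and overall amplitude."""
    t = np.arange(int(duration * sr)) / sr
    carrier = np.sin(2 * np.pi * base_hz * t)
    envelope = 0.5 * (1 + np.sin(2 * np.pi * mod_hz * t))   # 0..1 modulator
    return amp * envelope * carrier

def pan_to_actuators(x, n_actuators=4):
    """Map a normalized content position x in [0, 1] (e.g. where a character
    is along the bench) to per-actuator gains, so nearby actuators fire
    hardest and the sensation tracks the animation."""
    centers = np.linspace(0, 1, n_actuators)
    return np.clip(1 - np.abs(centers - x) * (n_actuators - 1), 0, 1)

wave = haptic_waveform(base_hz=200, mod_hz=5, amp=0.8)   # 0.5 s buzz
gains = pan_to_actuators(0.0)                            # leftmost actuator
```

In the live system these parameters arrive over OSC and are applied per frame, which is what makes real-time tuning possible.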
2 INSTALLATION OPTIONS
This piece can run as a traditional VR Village installation or as an autonomous piece in an unsuspecting area at SIGGRAPH: imagine sitting on a bench to rest your feet or check your email; in front of you is a screen showing a SIGGRAPH showreel. Once the system detects you, the content switches to a video feed of you, creating a mirror effect. From there, an unexpected AR experience unfolds.