You are floating through a dark, peaceful world, drifting over a clearing where a stream feeds the roots of a tree. Your breath carries you through this dream-like scene: inhaling, you rise; exhaling, you sink... down through the transparent ground, into the root system of the tree, where white lights move like blood through arteries, up into the tree. All around, you hear haunting sounds, some like music, some like crickets.

The music changes tone, and now you see this world in a bubble before you. As you try to approach, it moves away. You lean in to reach it, but it recedes faster, growing smaller in a black abyss, until it disappears, leaving only the void filled with gently moving white lights.

Osmose, Tree Pond. The tree, reflected here in the pond, is a central metaphor in Osmose.

You've just been booted from Osmose, an immersive 3D environment designed by artist Char Davies and produced by Softimage in Montreal. The work represents the most recent fruition of a more than 15-year journey for Davies, who began her career as a painter and moved into computer graphics as a medium for exploring three-dimensional, enveloping space. Davies joined Softimage in 1987 and, as director of visual research, heads a small team dedicated to “pushing the boundaries of VR.” Through works like Osmose, Davies hopes to bring an alternative aesthetic to virtual reality, one that allows for more ambiguity, soft edges, and translucency rather than the hard-edged, angular visuals found in many applications.

The project stems from Davies' painting and work she did at Softimage in the early 1990s. Georges Mauro made the 3D models and animations in Osmose's 12 worlds with Softimage software, adapting them for real-time viewing with custom software written by John Harrison. (See the sidebar, “Osmose Technology and Exhibitions,” below.) In public installations, the user dons a helmet-mounted display while an audience watches the show on a large screen, sometimes through 3D polarized glasses.

Breathe deeply

Undoubtedly the most unusual piece of hardware in the system is an input device that looks like a leather bra. It's a navigation tool that reacts to the expansion and contraction of the user's breathing. Davies' scuba diving experiences in deep water informed this approach. “While diving, you don't feel you're in water,” Davies said. “You learn to navigate your breath and balance, so that as you're moving over a bed of coral, you take in a breath and can clear it by inches. There's a finesse and delicacy to controlling breathing to move through space.” Davies said users quickly learn to use their breath and tilt their body to drift through the ground of one world into the sky of another. Many VR applications let users fly, but their traffic patterns are still based on real-world horizontal movement. Osmose's verticality is just one way Davies wants the work to challenge users to perceive things differently.
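As a rough illustration of the kind of mapping Davies describes — inhale to rise, exhale to sink, lean to drift — the sketch below shows one plausible way breath and tilt readings could drive motion. It is not Osmose's actual code; the sensor structures, constants, and function names are all assumptions.

    // Hypothetical sketch of breath-and-balance navigation (not the actual
    // Osmose code). Assumes a chest-expansion sensor normalized to [0, 1]
    // and a body-tilt reading in degrees from a tracker.

    #include <cmath>

    struct BreathVest { float expansion; };        // 0 = fully exhaled, 1 = fully inhaled
    struct BodyTilt   { float pitchDeg, rollDeg; };
    struct Position   { float x, y, z; };

    // Assumed tuning constants.
    const float kNeutralBreath = 0.5f;   // expansion level at which the user hovers
    const float kVerticalGain  = 2.0f;   // metres per second at full inhale or exhale
    const float kLeanGain      = 1.5f;   // metres per second at full forward lean

    void updateImmersant(Position& pos, const BreathVest& vest,
                         const BodyTilt& tilt, float dtSeconds)
    {
        // Inhale above the neutral point -> rise; exhale below it -> sink.
        float vertical = (vest.expansion - kNeutralBreath) * 2.0f * kVerticalGain;

        // Leaning forward or sideways drifts the body horizontally.
        float forward = std::sin(tilt.pitchDeg * 3.14159f / 180.0f) * kLeanGain;
        float lateral = std::sin(tilt.rollDeg  * 3.14159f / 180.0f) * kLeanGain;

        pos.y += vertical * dtSeconds;
        pos.z += forward  * dtSeconds;
        pos.x += lateral  * dtSeconds;
    }

In practice the gains and the neutral breathing level would have to be tuned for each user, much as Davies describes learning buoyancy control while diving.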

The breathing input also leads users to rely not just on their sight and hands but to integrate the body into the virtual reality experience. This is reminiscent of Myron Krueger's early work in VR more than 25 years ago, when he began experimenting with ways to use the whole body to communicate with the computer through large screens and tracking devices. (More research in this arena might have spared us carpal tunnel syndrome!) “Breathing in Osmose tends to center people and make them calm,” Davies said, “almost like Tai Chi.” This peaceful atmosphere certainly differs from many commercial virtual reality applications that seek to quickly raise the pulse (in part, to give the user the sense that the brief experience was worth $10).

Metaphor and suggestion

Building a metaphorical, virtual world and immersing a person within it may seem like an extreme expression of the artist as God. But Davies bristles at the suggestion, affirming that it's just one person sharing an expression with others. She similarly resists any attempts to categorize the experience as spiritual, since that opens a Pandora's box of preconceived notions and stereotypes, and one of her goals is to cause people to look at things with a new, altered perception.

These goals—particularly Davies' efforts to invoke metaphors from nature—bring to mind the work of Brenda Laurel and her Placeholder installation. John Harrison, Osmose's programmer, came to Softimage from the Banff Centre for the Arts, where he worked on Placeholder and 20 or so other projects. In some of those installations, he worked with Dorota Blaszczak and Rick Bidlack, who composed and programmed the music for Osmose. (The soundtrack is actually built from samples of male and female voices, but within Osmose it registers both as music and as environmental sounds such as crickets.) “Banff was an excellent experience,” Harrison wrote, “and helped me learn about working with artists to create VR. I particularly enjoy collaborating with artists because we each bring complementary skills—the artist brings a vision of a world they want to build, and I bring the technical skills to realize that vision. By working together, we create something that neither of us could make alone.”

Judging from user reaction, they have created something powerful. Davies said that very often users (or “immersants,” as she calls them) cannot speak for a few minutes after the experience. In public exhibitions, users are limited to 10 or 15 minutes, but in private usage people emerge after 45 minutes thinking that only 10 minutes have passed. “This is a beautiful, significant step in VR,” one user wrote. “It is rare that interface, aesthetics, and vision can come together as they have here— you have given the participants a wonderful experience.”

Not everyone is so moved. Davies heard harsh criticism from two 11-year-old boys who were accustomed to a different VR experience. “It's too slow,” they told her. “There's nothing to do”—and, perhaps most importantly—“there's nothing to kill!”

Osmose Technology and Exhibitions  

Softimage's Georges Mauro, who created Osmose's graphics, demonstrates its user interface. Photo: Jacques Dufresne

Softimage's Georges Mauro used Softimage 3D to create Osmose's models and animations. Two main types of elements make up the environment: static objects (like the tree or the ground) and animated splines for particles to follow (like the stream that winds through the clearing).

John Harrison wrote a program to make the static models more efficient in a real-time environment, using a prerelease version of Softimage's Saaphire Development Kit (now commercially available). He also wrote a program, using the SDK, to update the spline animations each frame and calculate the particles' positions along the splines. He modified some of the Softimage channel drivers to get information from the breathing vest and other input devices.
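A rough sketch of the kind of per-frame update described here — particles advancing along precomputed spline paths, like the stream in the clearing — might look like the following. This is not Harrison's code; the data layout, the linear interpolation between spline samples, and all names are assumptions, and the production version read its spline data from Softimage via the Saaphire Development Kit.

    // Hypothetical particle-on-spline update, evaluated once per frame.

    #include <vector>

    struct Vec3 { float x, y, z; };

    struct SplinePath {
        std::vector<Vec3> samples;   // spline evaluated at evenly spaced parameters
    };

    struct Particle {
        float t;       // position along the path, 0..1
        float speed;   // parameter units per second
    };

    // Linear interpolation between precomputed spline samples.
    Vec3 evaluate(const SplinePath& path, float t)
    {
        float scaled = t * (path.samples.size() - 1);
        int   i      = static_cast<int>(scaled);
        if (i >= static_cast<int>(path.samples.size()) - 1)
            return path.samples.back();
        float f = scaled - i;
        const Vec3& a = path.samples[i];
        const Vec3& b = path.samples[i + 1];
        return { a.x + (b.x - a.x) * f,
                 a.y + (b.y - a.y) * f,
                 a.z + (b.z - a.z) * f };
    }

    // Advance every particle along its path, wrapping back to the start
    // so the flow (like the stream) is continuous.
    void updateParticles(std::vector<Particle>& particles,
                         const SplinePath& path,
                         std::vector<Vec3>& positionsOut,
                         float dtSeconds)
    {
        positionsOut.resize(particles.size());
        for (size_t i = 0; i < particles.size(); ++i) {
            particles[i].t += particles[i].speed * dtSeconds;
            if (particles[i].t > 1.0f)
                particles[i].t -= 1.0f;
            positionsOut[i] = evaluate(path, particles[i].t);
        }
    }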

Early development was done on a Silicon Graphics Indigo2 Extreme, moving to an SGI Onyx to add textures. Harrison also used SGI's Performer software to get real-time performance. Osmose's main program reads in data from the static models, spline animations, and channel drivers to determine what to draw, from what viewpoint, the degree of transparency, the position of the lights, special effects, and sound information. Another parallel process determines the set of triangles to render, and a third process renders them.
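The three parallel processes described here follow the app/cull/draw split that SGI Performer supports natively. As a generic illustration of that structure — not Osmose's actual code — the sketch below pipelines the three stages with standard C++ threads and simple single-slot hand-offs; the frame and triangle types are placeholders.

    // Generic app / cull / draw pipeline sketch. Performer provided this
    // split (and the shared memory behind it) on SGI hardware; here only
    // the division of labour among the three stages is illustrated.

    #include <condition_variable>
    #include <mutex>
    #include <optional>
    #include <thread>
    #include <vector>

    struct FrameState { float viewpoint[3]; float transparency; int frameNumber; };
    struct Triangle   { float verts[9]; };

    // Minimal single-slot hand-off between pipeline stages.
    template <typename T>
    class Slot {
    public:
        void put(T value) {
            std::unique_lock<std::mutex> lock(m_);
            cv_.wait(lock, [&] { return !item_.has_value(); });
            item_ = std::move(value);
            cv_.notify_all();
        }
        T take() {
            std::unique_lock<std::mutex> lock(m_);
            cv_.wait(lock, [&] { return item_.has_value(); });
            T value = std::move(*item_);
            item_.reset();
            cv_.notify_all();
            return value;
        }
    private:
        std::mutex m_;
        std::condition_variable cv_;
        std::optional<T> item_;
    };

    int main() {
        Slot<FrameState> appToCull;
        Slot<std::vector<Triangle>> cullToDraw;
        const int kFrames = 100;

        // APP: read trackers and breath sensor, update animation, set the viewpoint.
        std::thread app([&] {
            for (int f = 0; f < kFrames; ++f) {
                FrameState state{{0, 0, 0}, 0.5f, f};   // placeholder update
                appToCull.put(state);
            }
        });

        // CULL: decide which triangles are visible from this frame's viewpoint.
        std::thread cull([&] {
            for (int f = 0; f < kFrames; ++f) {
                FrameState state = appToCull.take();
                (void)state;
                std::vector<Triangle> visible;          // placeholder culling
                cullToDraw.put(visible);
            }
        });

        // DRAW: render the culled triangle set (graphics calls omitted).
        std::thread draw([&] {
            for (int f = 0; f < kFrames; ++f) {
                std::vector<Triangle> tris = cullToDraw.take();
                (void)tris;                             // render here
            }
        });

        app.join(); cull.join(); draw.join();
        return 0;
    }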

"The two main challenges in VR are latency and frame rate," Harrison wrote. "Latency is critical, because if there is a noticeable delay between the user moving their head and the updating of the   image, the illusion of immersion will be broken.... Frame rate is always an issue—often an artist will ask for more and more graphics complexity, but eventually the frame rate will suffer." Harrison said that after working for more than four years at the Banff Centre for the Arts, he told Char Davies he was looking for a challenge.  

"So, during my first week of work here, Char showed me the computer graphics stills she had made earlier as part of an awardwinning series of images called the Interior Body Series. She explained that these had taken up to 40 hours to render, worked from only one viewpoint, and were extremely complex not because they were based on complicated models but because they used extensive transparency and ephemeral lighting effects. Then she said she wanted to use a similar effect, except in virtual reality, rendered in real time (1/30th of a second per image). I didn't think it would be possible, but we did indeed reach that goal with Osmose."  

Osmose was exhibited in 1995 in Montreal and New York City. Two exhibitions in the United Kingdom are scheduled, one in Newcastle-upon-Tyne this winter and another at the Barbican Art Centre in London next summer.

