Bringing the real into VR and AR into the real

Image: Nicky Callinan

Melbourne Knowledge Week explores ideas and actions about our future. With its strong focus on emerging technologies in mind, the VEDT attended a guest lecture on VR and AR by Frank Vetere, Professor of Human-Computer Interaction at the University of Melbourne. Inside the heritage Meat Market in North Melbourne, the journey of AR and VR was traced from early concepts to impressive modern applications. From the film ‘Metropolis’ in 1927 to Ivan Sutherland’s ground-breaking head-mounted AR display in 1967, AR and VR concepts seem to resurface in roughly 40-year cycles. The most striking development in these technologies today is the use of emotion: adding an emotional element helps the user feel more invested in, and connected to, the activity.

Josh, Oli and Erica at Melbourne Knowledge Week

The first example of an emotional application was an AR dog. Via a live projection from the presenter’s AR headset, a virtual dog appeared on the stage. The dog, wagging its tail and tilting its head, responded to cues and movements from the user. When AR balls were introduced (in a real basket), the dog began to jump, yipping and yapping with excitement. The user then threw the AR balls onto the stage for the dog to collect. Both the dog and the balls responded to their surroundings, bouncing off surfaces and moving around solid objects. The dog returned the balls and waited excitedly for more play. Beyond the simple charm of watching someone play with a cute dog, there was an added level of interaction and connection with the user.

AR Dog
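The talk focused on the experience rather than the implementation, but the balls’ behaviour hints at how such demos typically work: once the headset has detected real surfaces, virtual objects are given simple physics against them. Below is a minimal sketch of that idea, with the detected plane simplified to a flat floor; all names and constants are illustrative assumptions, not details from the demo.

```python
# A minimal sketch of a virtual ball bouncing off a real surface.
# Assumes the AR platform has already detected a horizontal plane
# (here simplified to the floor at height 0); names are illustrative.

import numpy as np

GRAVITY = np.array([0.0, -9.81, 0.0])  # m/s^2
RESTITUTION = 0.6                       # fraction of speed kept per bounce

class Ball:
    def __init__(self, position, velocity, radius=0.05):
        self.position = np.asarray(position, dtype=float)
        self.velocity = np.asarray(velocity, dtype=float)
        self.radius = radius

    def step(self, dt, floor_height=0.0):
        """Advance the ball one frame and bounce it off the detected plane."""
        self.velocity += GRAVITY * dt
        self.position += self.velocity * dt
        # Collision with the detected plane: clamp position, reflect velocity.
        if self.position[1] - self.radius < floor_height:
            self.position[1] = floor_height + self.radius
            self.velocity[1] = -self.velocity[1] * RESTITUTION

# A thrown ball: forward and slightly upward, as in the demo.
ball = Ball(position=[0.0, 1.2, 0.0], velocity=[2.0, 1.0, 0.0])
for frame in range(120):  # two seconds at 60 fps
    ball.step(dt=1 / 60)
```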

The second example of emotion in these technologies used VR: a social VR program designed with older adults to facilitate interaction and reminiscence. Older members of the community created avatars to represent themselves, and held motion controllers so they could move their arms and hands in the virtual world, allowing them to pass and receive objects.

The Highway of Life slideshow image
A VR program designed to link older adults through technology

With one group in Melbourne and the other in Bendigo, the users met in a virtual primary school classroom, a place of high emotion for all. Each member brought with them a scan of a real photograph that they could share within the virtual world. The example shown was an elderly man’s avatar passing a photo of himself as a young schoolboy to another avatar, bringing the real into the virtual world.
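How the hand-over itself works was not described, but a common approach in social VR is a proximity grab: an object within reach of a controller attaches to it, and a second controller can take it over. The sketch below illustrates that idea; the class names, grab radius and two-user setup are our assumptions, not details of the program shown.

```python
# A minimal sketch of grab-and-pass logic for VR controllers.
# One common approach (proximity grab), with illustrative names throughout.

from dataclasses import dataclass

GRAB_RADIUS = 0.15  # metres within which a controller can grab

@dataclass
class SharedObject:
    name: str
    position: tuple = (0.0, 0.0, 0.0)
    holder: "Controller | None" = None

@dataclass
class Controller:
    user: str
    position: tuple = (0.0, 0.0, 0.0)
    held: SharedObject | None = None

    def try_grab(self, obj: SharedObject) -> bool:
        """Grab the object if it is in reach; taking it from another
        hand is how a photo is 'passed' between avatars."""
        dist = sum((a - b) ** 2 for a, b in zip(self.position, obj.position)) ** 0.5
        if dist <= GRAB_RADIUS:
            if obj.holder:                 # receiving a passed object
                obj.holder.held = None
            obj.holder, self.held = self, obj
            return True
        return False

    def update(self):
        """Held objects follow the controller each frame."""
        if self.held:
            self.held.position = self.position

# One user offers a scanned photo; the other reaches out and takes it.
photo = SharedObject("school_photo")
melbourne_hand = Controller("Melbourne user")
bendigo_hand = Controller("Bendigo user")
melbourne_hand.try_grab(photo)
bendigo_hand.position = (0.1, 0.0, 0.0)   # reaches within grab radius
bendigo_hand.try_grab(photo)
assert photo.holder is bendigo_hand
```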

The final segment of the presentation involved projecting the virtual onto the real. A 2D image was brought to life by projecting muscles and bones, in correct proportion, onto a white-clothed user. With the muscles and bones visible, as if the user’s skin had been removed, the user could move their limbs and the projection reacted to the movement. This showed the layers of muscle, the skeletal structure and the organs on a real, living person. The instructor could ‘draw’ on the user, or on an iPad, to highlight specific structures. Two members of our team experienced seeing projected muscles on their own bodies, and were surprised to learn that an Xbox Kinect sensor was tracking their movements.

Josh Davies in muscular AR

Previously, instructors for medical-related courses such as physiotherapy would paint directly onto a volunteer to demonstrate where organs, muscles and bone structure sit in a real context. Projecting realistic images directly onto a person, tracking their movement in real time, proved a far more practical, realistic and engaging way to convey this information to students.
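The alignment between the Kinect and the projector was not explained in the talk, but the standard approach is a one-off calibration that yields a projection matrix mapping tracked 3D joints to projector pixels. The sketch below illustrates that mapping; the matrix values and joint coordinates are purely illustrative assumptions.

```python
# A minimal sketch of projector alignment for the anatomy demo.
# Assumption: a one-off calibration has produced a 3x4 matrix P that
# maps Kinect camera-space points to projector pixels; all values
# below are illustrative only.

import numpy as np

# Calibrated projector matrix (intrinsics x extrinsics), typically found
# by projecting known patterns and matching them to Kinect depth points.
P = np.array([
    [1400.0,    0.0,  960.0, 0.0],
    [   0.0, 1400.0,  540.0, 0.0],
    [   0.0,    0.0,    1.0, 0.0],
])

def to_projector_pixel(joint_xyz):
    """Project a Kinect camera-space joint (metres) to projector pixels."""
    x, y, z = joint_xyz
    u, v, w = P @ np.array([x, y, z, 1.0])
    return u / w, v / w  # perspective divide

# Each frame: take the tracked joints and place the anatomy overlays at
# the corresponding projector pixels so the muscles track the moving body.
elbow = (0.2, -0.1, 2.0)  # a joint 2 m in front of the sensor
print(to_projector_pixel(elbow))
```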

By using technology to engage emotional contexts, connect users, create realistic activities and mix the virtual with the real, a great learning experience can be achieved. Our team can’t wait to see what happens in the next 40-year cycle.

 

Written by Nicky Callinan, Erica Managh, Josh Davies and Oliver Lorraine-Wedd.