What’s in store for 2020?

I love the beginning of a new year. Everything seems fresh. A feeling of relaxation and fun hangs in the air as everyone settles back into work after their summer break. Walking around campus, there is calm and quiet, mixed with signs of preparation for the year ahead. While our team was getting ready to face a new year, I asked for their predictions on the trends we are likely to see in 2020. Here is what they came up with.

Personalised learning

Personalisation of the learning journey is still on the agenda. At VEDT, we are looking at different streams of interactions and assessments. In 2020, we envisage that students will have more opportunities to choose their own learning pathways through content, and hopefully also through the assessment experience. 3P Learning predicts that customised, student-centred and adaptive learning experiences will take centre stage, with students able to select the lessons and media that suit their interests and needs. The Digital Marketing Institute points out that new tools are deepening the capacity for personalised learning, with artificial intelligence and machine learning allowing content to respond and evolve with the learner.

Learning analytics

Of […]

Read More…

Bringing the real into VR and AR into the real

Melbourne Knowledge Week explores ideas and actions about our future. With its strong focus on emerging technologies, the VEDT attended a guest lecture on VR and AR by Frank Vetere, Professor of Human-Computer Interaction at the University of Melbourne. Inside the heritage meat market in North Melbourne, the journey of AR and VR was traced from its origins to amazing modern applications. From the movie ‘Metropolis’ in 1927 to Ivan Sutherland’s ground-breaking AR machine in 1967, AR and VR concepts have resurfaced in regular 40-year cycles. The most striking development in these technologies today is the use of emotion; adding this emotional element allows the user to feel more invested in, and connected to, the activity.

The first example of an emotional application was an AR dog. Via a projection from a live AR headset, an AR dog appeared on the stage. The dog, wagging its tail and tilting its head, responded to cues and movements from the user. When AR balls were introduced (in a real basket), the dog began to jump, yipping and yapping with excitement. The user then threw the AR balls onto the stage for the dog to collect. Both dog and AR balls responded to their surroundings by […]

Read More…