HCI researchers to present at OzCHI 2020

30 November 2020


The 32nd Australian Conference on Human-Computer Interaction (OzCHI 2020) will take place over three days, from 2 to 4 December 2020. OzCHI conferences bring together local, regional and international researchers to discuss engaging, thought-provoking and innovative new work in Human-Computer Interaction (HCI) and Interaction Design.

This year, researchers from The University of Melbourne have worked as members of the Organising Committee for OzCHI 2020.

A number of HCI researchers from our group have been invited to present their papers at this year’s virtual conference. We wish our authors the very best for their upcoming presentations.

OzCHI 2020: Paper presentations

(in order of presentation, from the full OzCHI 2020 program)

Wednesday 2 December 2020

‘Feed the Tree: Representation of Australia-based Academic Women at HCI Conferences’

Authors: Dana McKay and George Buchanan
Session: Paper Session: Motivating
Time: 11:00 - 11:45 AM

‘Page-Turning Techniques for Reading Interfaces in Virtual Environments’

Authors: Tilman Dingler, Siran Li, Niels van Berkel, and Vassilis Kostakos
Session: Paper Session: VR Interacting
Time: 1:15 - 1:45 PM

‘Online Dating Meets Artificial Intelligence: How the Perception of Algorithmically Generated Profile Text Impacts Attractiveness and Trust’

Authors: Yihan Wu and Ryan Kelly
Session: Paper Session: Emotion
Time: 2:00 - 2:45 PM

Thursday 3 December 2020

‘Exploring the Effects of User Control on Social Engagement in Virtual Reality’

Authors: Weijia Wang, Steven Baker, and Andrew Irlitti
Session: Paper Session: VR Control
Time: 2:00 - 2:45 PM

‘Melbourne 2100: Dystopian Virtual Reality to Provoke Engagement with Climate Change’

Authors: Kate Ferris, Gonzalo Garcia Martinez, Greg Wadley, and Kathryn Williams
Session: Paper Session: Futures
Time: 4:00 - 4:45 PM

Friday 4 December 2020

‘Lessons Learnt from Designing a Smart Clothing Telehealth System for Hospital Use’

Authors: Deepti Aggarwal, Thuong Hoang, Bernd Ploderer, Frank Vetere, Rohit Ashok Khot, and Mark Bradford
Session: Paper Session: Health
Time: 10:00 - 10:45 AM

‘Empowering Caregivers of People Living with Dementia to Use Music Therapeutically at Home’

Authors: Romina Carrasco, Felicity Baker, Anna A. Bukowska, Imogen Clark, Libby Flynn, Kate McMahon, Helen Odell-Miller, Karette A Stensaeth, Jeanette Tamplin, Tanara Vieira Sousa, Jenny Waycott, and Thomas Wosch
Session: Paper Session: Aged Care
Time: 11:00 - 11:45 AM

‘Challenges of Deploying VR in Aged Care: A Two-Phase Exploration Study’

Authors: Wei Zhao, Steven Baker, and Jenny Waycott
Session: Paper Session: Aged Care
Time: 11:00 - 11:45 AM

‘Privacy by Design in Aged Care Monitoring Devices? Well, Not Quite Yet!’

Authors: Sami Alkhatib, Jenny Waycott, George Buchanan, Marthie Grobler, and Shuo Wang
Session: Paper Session: Aged Care
Time: 11:00 - 11:45 AM

‘Older Adults and their Acquisition of Digital Skills: A Review of Current Research Evidence’

Authors: Priyankar Bhattacharjee, Steven Baker, and Jenny Waycott
Session: Paper Session: Ageing Well
Time: 11:00 - 11:45 AM

‘How Older Adults Respond to the Use of Virtual Reality for Enrichment: A Systematic Review’

Authors: Kong Saoane Thach, Reeva Lederman, and Jenny Waycott
Session: Paper Session: Ageing Well
Time: 11:00 - 11:45 AM

Congratulations to Kong, Reeva and Jenny for receiving an Honourable Mention award for this paper.

‘Worth 1000 Words? The Influence of Image Versus Text Content on the Book Selection Process’

Authors: Huiwen Zhang, Dana McKay, and George Buchanan
Session: Paper Session: Human Factors
Time: 3:00 - 3:45 PM

The following paper was selected as a Late Breaking Work and will be published in the conference proceedings.

‘Using Videogames to Regulate Emotions’

Authors: Zhanna Sarsenbayeva, Benjamin Tag, Shu Yan, Vassilis Kostakos, and Jorge Goncalves

Registration for OzCHI is open until 12 PM (AEST) on 1 December, and this year it is free for full-time students. See the conference website for further details.