Our researchers take home two awards at UbiComp 2019

24 September 2019

The award winners

The Interaction Design Lab (IDL) team had a strong presence at UbiComp 2019, the annual A*-rated conference on Pervasive and Ubiquitous Computing, which took place this year from 9–13 September in London, UK. Our team was honoured to take home two prestigious awards from the event. PhD candidate Zhanna Sarsenbayeva received the Gaetano Borriello Outstanding Student Award, while a group of IDL researchers took home a Distinguished Paper Award for an article in the Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies (IMWUT):

  • Assisted Medication Management in Elderly Care Using Miniaturised Near-Infrared Spectroscopy
    S. Klakegg, J. Goncalves, A. Visuri, C. Luo, A. Popov, N. van Berkel, Z. Sarsenbayeva, V. Kostakos, S. Hosio, S. Savage, A. Bykov, I. Meglinski, D. Ferreira. 2018.

Our team also presented a number of papers this year, which were featured across the technical program tracks of the conference:

Full papers

  • Revisitation in Urban Space vs. Online: A Comparison across POIs, Websites, and Smartphone Apps.
    Hancheng Cao, Zhilong Chen, Fengli Xu, Yong Li, Vassilis Kostakos
  • Measuring the Effects of Stress on Mobile Interaction
Zhanna Sarsenbayeva, Niels van Berkel, Danula Hettiachchi, Weiwei Jiang, Tilman Dingler, Eduardo Velloso, Vassilis Kostakos, Jorge Goncalves
  • Classifying Attention Types with Thermal Imaging and Eye Tracking
Yomna Abdelrahman, Anam Ahmad Khan, Joshua Newn, Eduardo Velloso, Sherine Ashraf Safwat, James Bailey, Andreas Bulling, Frank Vetere, Albrecht Schmidt
  • Combining Low- and Mid-Level Gaze Features for Desktop Activity Recognition
    Namrata Srivastava, Joshua Newn, Eduardo Velloso

Doctoral Colloquium

  • Using Contactless Sensors to Estimate Learning Difficulty in Digital Learning Environments
    Namrata Srivastava
  • Towards indoor localisation analytics for modelling flows of movement
    Gabriele Marini
  • Gaze assisted voice note taking system
    Anam Ahmad Khan

Demos

  • Combining Implicit Gaze and AI for Real-Time Intention Projection
    Joshua Newn, Ronal Singh, Eduardo Velloso, Frank Vetere
  • LiftSmart: A monitoring and warning wearable for weight trainers
    Yousef Kowsar, Eduardo Velloso, Lars Kulik, Christopher Leckie

Workshop Papers

  • Towards context-free Semantic Localisation
G. Marini, J. Goncalves, E. Velloso, R. Jurdak, V. Kostakos
  • Improving Wearable Sensor Data Quality Using Context Markers
C. Wang, Z. Sarsenbayeva, C. Luo, J. Goncalves, V. Kostakos
  • Capturing Contextual Morality: Applying Game Theory on Smartphones
    N. van Berkel, S. Hosio, B. Tag, J. Goncalves
  • AI-mediated gaze-based intention recognition for smart eyewear: opportunities & challenges
    Joshua Newn, Benjamin Tag, Ronal Singh, Eduardo Velloso, Frank Vetere
  • Ubiquitous smart eyewear interactions using implicit sensing and unobtrusive information output
    Qiushi Zhou, Joshua Newn, Benjamin Tag, Hao-Ping Lee, Chaofan Wang, Eduardo Velloso