VAM-HRI 2024 Program

Workshop Info:

  • March 11 | 02:00 PM - 06:00 PM MST
  • Intro + Meet someone new!
    02:00 PM MST | 10 min
  • Lightning Talk Group 1
    02:10 PM MST | 40 min
  • Paper #20: Work in Progress: Teaching a Pipetting Task to a Robot Using Natural Gestures with Haptic Feedback in Augmented Reality
    Max Alexej Pötter, Fabian Wagner, Jan Lagast
  • Paper #9: Towards Understanding the Role of Humans in Collaborative Tasks
    Ayesha Jena, Elin A. Topp
  • Paper #8: Quest2ROS: An App to Facilitate Teleoperating Robots
    Michael C. Welle, Nils Ingelhag, Martina Lippi, Maciej Wozniak, Andrea Gasparri, Danica Kragic
  • Paper #3: Helping People Predict the Outcome of Robotic Pouring Behaviors with Augmented Reality
    Andre Cleaver, Reuben M Aronson, Jivko Sinapov
  • Paper #13: Stereoscopic Virtual Reality Teleoperation for Human Robot Collaborative Dataset Collection
    Yi-Shiuan Tung, Matthew B Luebbers, Alessandro Roncone, Bradley Hayes
  • Paper #4: Prototyping Mid-Air Display for Anywhere Robot Communication With Projected Spatial AR
    Uthman Tijani, Zhao Han
  • Short Break
    02:50 PM MST | 5 min
  • Lightning Talk Group 2
    02:55 PM MST | 35 min
  • Paper #10: Towards Mixed Reality Applications to Support Active and Lively Ageing
    Marta Gabbi, Valeria Villani, Lorenzo Sabattini
  • Paper #16: A VR Prototype for One-Dimensional Movement Visualizations for Robotic Arms
    Bram van Deurzen, Dries Cardinaels, Gustavo Rovelo Ruiz, Kris Luyten
  • Paper #11: Participant Training for Virtual Reality User Studies: Lessons Learned and Open Questions
    Jordan Allspaw, Gregory LeMasurier, Elizabeth Phillips, Holly Yanco
  • Paper #15: Proposing a Hybrid Authoring Interface for AR-Supported Human-Robot Collaboration
    Rasmus Lunding, Sebastian Hubenschmid, Tiare Feuchtner
  • Paper #1: Evaluating an Augmented Reality Interface for Drone Search Tasks
    David Kortenkamp, Debra Schreckenghost, Gavin Love, Patrick Hagan
  • Coffee Break (Poster Preparation)
    03:30 PM MST | 30 min
  • Keynote + Q&A: Zhao Han, Assistant Professor at the University of South Florida
    04:00 PM MST | 50 min

    Zhao Han is an Assistant Professor of Computer Science and Engineering at the University of South Florida, where he leads the Reality, Autonomy, and Robot Experience (RARE) Lab. His research lies broadly in human-robot interaction (HRI), augmented reality (AR), robotics, and AI. He focuses on designing, developing, and evaluating novel robotic systems and interactions so that embodied robots become more capable and understandable while interacting, collaborating, and teaming with humans.
  • Lightning Talk Group 3
    04:50 PM MST | 25 min
  • Paper #5: An Interactive Protocol to Measure a Driver’s Situational Awareness
    Abhijat Biswas, Pranay Gupta, David Held, Henny Admoni
  • Paper #14: Augmented Reality Demonstrations for Scalable Robot Imitation Learning
    Yue Yang, Bryce Ikeda, Gedas Bertasius, Daniel Szafir
  • Paper #12: Improving Robot Predictability via Trajectory Optimization Using a Virtual Reality Testbed
    Clare Lohrmann, Bradley Hayes, Alessandro Roncone
  • Paper #2: To Understand Indicators of Robots' Vision Capabilities
    Hong Wang, Tam Do, Zhao Han
  • Poster Session Group 1
    05:15 PM MST | 20 min
  • Paper #1: Evaluating an Augmented Reality Interface for Drone Search Tasks
  • Paper #4: Prototyping Mid-Air Display for Anywhere Robot Communication With Projected Spatial AR
  • Paper #10: Towards Mixed Reality Applications to Support Active and Lively Ageing
  • Paper #12: Improving Robot Predictability via Trajectory Optimization Using a Virtual Reality Testbed
  • Paper #13: Stereoscopic Virtual Reality Teleoperation for Human Robot Collaborative Dataset Collection
  • Paper #15: Proposing a Hybrid Authoring Interface for AR-Supported Human-Robot Collaboration
  • Poster Session Group 2
    05:35 PM MST | 20 min
  • Paper #2: To Understand Indicators of Robots' Vision Capabilities
  • Paper #5: An Interactive Protocol to Measure a Driver’s Situational Awareness
  • Paper #11: Participant Training for Virtual Reality User Studies: Lessons Learned and Open Questions
  • Paper #14: Augmented Reality Demonstrations for Scalable Robot Imitation Learning
  • Paper #16: A VR Prototype for One-Dimensional Movement Visualizations for Robotic Arms
  • Closing
    05:55 PM MST | 5 min
  • YouTube Recording