Artificial Intelligence and Human-Computer Interaction
- 10:00 GMT - 11:00 GMT
- Kiyoshi Kiyokawa
- Session 5
Smart Reality Modulation for Inclusive Society
VR and AR technologies can be used to overcome physical constraints, maximize individual abilities, and promote social participation and contribution. For example, advanced head-mounted displays make it possible to redesign our inherent visual abilities. In this talk, I will introduce technologies for the flexible modulation of reality and their future prospects, along with various research examples I have been involved in. In particular, I will present examples of improving quality of life (QoL) in various scenes of daily life through intelligent real-time video conversion using technologies such as deep learning.
- 11:00 GMT - 12:00 GMT
- Eric Vezzoli
- Session 5
Haptics in XR: How to create compelling experiences with today's technology
Hand interaction with virtual content in XR is mainly mediated by controllers equipped with haptic actuators. Haptic feedback can enhance the perception and perceived quality of XR content in several applications, among them gaming, training, and marketing.
One of the challenges of haptics in XR today is creating compelling haptic experiences that support and enhance the XR content for the use case. This talk will focus on haptic design guidelines for XR and a practical approach to implementing and realizing them with the Interhaptics haptic composer.
- 12:00 GMT - 13:00 GMT
- Roderick Mc Call
- Session 5
Future Challenges: The Augmented World
This talk will look at how mixed reality, in particular augmented reality, has the potential to radically change our future and, with it, present some key technical, ethical and social challenges. It will trace how the technology has moved from the lab into specific niche sectors such as training, security and games. As the technology improves, is there an opportunity to move beyond one-off experiences to a more augmented world? What will this new Wild West be like? Now is the time to ask these questions.
- 13:00 GMT - 14:00 GMT
- Mark Melnykowycz, PhD
- Session 5
Enabling the Internet of Humans
The development of immersive technologies over the past decades has focused heavily on the visual and audio replacement or augmentation of user experiences.
With advances in wearable sensor and motor technologies, the ability to include tactile feedback has steadily increased, alongside spatial positioning of the user in an environment. Additionally, smell augmentation has been implemented to a limited extent. Neurofeedback is one of the last human data elements to be characterized and integrated into XR experience design.
Neurofeedback largely encompasses the characterization of brain activity and the associated feedback through one of the other sensory channels (visual, audio, tactile). This has widely been accomplished with EEG electrodes on the scalp, which allow brain frequency patterns to be acquired and characterized. Although scalp EEG technology and devices have been in use for decades, they have largely been confined to controlled laboratory settings due to limitations associated with motion artifacts and the need for electrolytic gel to improve the electrode-skin interface. Recent advances in EEG and materials research have enabled EEG signals to be reliably acquired from inside the ear canal. Thanks to the stable interface between ear electrodes and the skin of the ear canal, it is possible to design EEG devices that can be worn every day and integrated into more user experiences.
Advanced use cases include adaptive game and experience development. Moreover, by connecting brain metrics to Internet of Things (IoT) networks and devices, the foundations of the Internet of Humans (IoH) can be built, providing a new framework for human-computer interaction (HCI), including in XR environments.
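The "brain frequency patterns" the abstract refers to are typically characterized as power in standard EEG bands (delta, alpha, etc.). As a minimal sketch of that idea, not the speaker's actual pipeline, the snippet below estimates band power from a synthetic one-second trace via the FFT; the sampling rate, band edges, and signal are illustrative assumptions.

```python
import numpy as np

def band_power(signal, fs, band):
    """Average power of `signal` within the frequency `band` (Hz), via the FFT."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)  # crude periodogram
    mask = (freqs >= band[0]) & (freqs < band[1])
    return psd[mask].mean()

# Synthetic 1-second "EEG" trace: a 10 Hz (alpha-band) oscillation plus noise.
fs = 256  # sampling rate in Hz (an assumed, typical value)
t = np.arange(fs) / fs
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(fs)

alpha = band_power(eeg, fs, (8, 13))   # alpha band, 8-13 Hz
delta = band_power(eeg, fs, (0.5, 4))  # delta band, 0.5-4 Hz
print(alpha > delta)  # the synthetic alpha oscillation dominates
```

In practice, a neurofeedback loop would stream such band-power estimates from the (in-ear) electrodes and map them to visual, audio, or tactile feedback in the XR experience.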
- 14:00 GMT - 15:00 GMT
- Daniela Romano
- Session 5
Human-AI trust, and Implications for Explainable AI (XAI)
In this talk, the concept of trust between humans and AI is discussed, considering the different forms an AI can take, as well as differences in human personality and behaviour. In particular, the talk considers the implications that human-AI trust has for explaining an AI's reasoning: when an explanation is needed, and when it might not be necessary.
- 15:00 GMT - 16:00 GMT
- Veronica Costa Orvalho
- Session 5
The art and science behind Digital Humans
AI and HCI
- 16:00 GMT - 17:00 GMT
- Pr. Diane Gromala
- Session 5
Human-computer interactions