The Internet of Things (IoT) enables the creation of sensing and computing machines that enhance the continuous adaptation and support intelligent systems provide to humans. Nevertheless, these systems still depend on human intervention, for example in maintenance and (re)configuration tasks. To this end, the development of an Adaptive Instructional System (AIS) in the context of IoT allows for the creation of new, improved learning and training environments in which new approaches to improving human training and perception efficiency can be tested. Examples include the use of virtual and augmented reality, the inclusion of nature-inspired metaphors based on biophilic design and calm computing principles, and the design of technology that aims to change users’ behaviour through persuasion and social influence. In this work, we propose a nature-inspired visual representation concept, BioIoT, to communicate sensor information. Our results, based on an experiment with twelve participants over two weeks, show that this new representation contributes to users’ well-being and performance while remaining as easy to understand as traditional data representations. We present a use case that demonstrates the benefits of the BioIoT concept in an AR setting, applied to household and workplace scenarios. Furthermore, leveraging our previous experience in the development of adaptive and supportive systems based on eye-tracking, we discuss the application of this sensing technology to supporting users in machine intervention: the user’s attention, i.e., eye gaze, on different machine parts is used to infer the user’s needs and adapt the system accordingly. In this way, a new level of continuous support can be provided to users depending on their skill level and individual needs, in the form of contextualized instructions and action recommendations based on user attention.
Title of host publication: International Conference on Human-Computer Interaction
Publication status: Published - 2020
Keywords: Augmented Reality