We propose a radically new type of intelligent user interface (IUI) that brings together elements from robotics, augmented reality (AR), and ubiquitous computing. We intend to build the user's flying organizer (UFO), a semi-autonomous micro aerial vehicle (MAV) equipped with a pico-projector and cameras that enables projected interactive pixels anywhere in the user's environment. Whilst MAVs have been explored in the context of autonomous flight, this project will for the first time explore how these novel computing devices can be made fully interactive, working with real users in real-world environments to enable new application scenarios, including:
1) UFOs can act as personal assistants for users and carry out mixed-initiative tasks. Here the user specifies a high-level task, such as "find me the book with title A", and the UFO plans and executes the detailed steps autonomously.
2) The spontaneous visualization of documents on physical surfaces around the user, turning the environment into an ad-hoc smart space augmented with graphics. Whilst such spaces have been proposed previously, their user interfaces have been physically constrained: they rely on steerable projectors fixed in the environment or require users to hold mobile projectors. Our system allows UFOs to navigate freely in the environment, projecting content where it is needed, without instrumenting the user.
3) Revealing hidden information associated with a real-world location, such as cables, pipes or architectural features. The UFOs create a new form of unencumbered IUI/AR experience. Traditionally, AR requires users to wear head-mounted displays, hold smartphones in their hands or rely on projectors mounted in the environment. This hardware encumbrance limits the applicability of AR; UFOs remove the limitation.
4) Co-located and remote collaboration via UFOs. The UFO can record and transmit information to a remote participant, while projecting the remote participant's reactions back into the local environment. Since the UFO can move independently of local users, the remote participant can use it to assume alternative viewpoints and project information onto arbitrary surfaces in the environment.
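The mixed-initiative loop of scenario 1 can be sketched as a simple decompose-and-execute cycle. The task names, the decomposition table, and the interruption hook below are illustrative assumptions, not the proposed system's actual planner:

```python
def plan(goal):
    """Decompose a high-level user goal into autonomous subtasks.

    The goal library is a hypothetical stand-in for a real task planner.
    """
    library = {
        "find_book": [
            "take_off",
            "fly_to(bookshelf)",
            "scan_spines(camera)",
            "match_title(ocr_results)",
            "hover_at(match)",
            "project_highlight(match)",
        ],
    }
    if goal not in library:
        raise ValueError(f"no decomposition known for goal: {goal}")
    return library[goal]


def execute(subtasks, do_step):
    """Run each subtask in order; mixed initiative means the user
    (via do_step returning False) can abort or take over at any step."""
    for step in subtasks:
        if not do_step(step):
            return False
    return True


steps = plan("find_book")
print(steps[0])  # -> take_off
```

The point of the sketch is the division of labour: the user contributes only the goal, while planning and execution of the detailed steps stay with the UFO, with the user retaining the ability to intervene.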
As an IUI, the UFO will require the development of novel multi-modal interfaces that allow users to manipulate virtual content projected onto the real world. Of particular interest are rich, direct, effortless and natural interaction techniques with spatially registered 2D and 3D graphics. This is especially challenging because both the user and the UFO are mobile. Thus, new interaction modalities need to be developed, applied and studied.
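Spatially registering projected 2D graphics onto a planar surface from a moving projector is commonly handled with a projector-to-surface homography, re-estimated as the UFO moves. A minimal sketch of the standard direct linear transform (DLT) from four point correspondences follows; the corner coordinates are made-up example values, and this is one conventional technique, not the proposal's stated method:

```python
import numpy as np


def homography(src, dst):
    """Estimate the 3x3 homography H with dst ~ H @ src via DLT.

    src, dst: (4, 2) arrays of corresponding points, e.g. the projector
    image corners and where they are observed on the target surface.
    """
    A = []
    for (x, y), (u, v) in zip(src, dst):
        # Each correspondence contributes two rows of the DLT system A h = 0.
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The solution is the right singular vector for the smallest singular value.
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    return Vt[-1].reshape(3, 3)


def apply_h(H, pt):
    """Map a 2D point through H with homogeneous normalization."""
    x, y, w = H @ np.array([pt[0], pt[1], 1.0])
    return np.array([x / w, y / w])


# Projector image corners and their desired landing spots on the wall
# (illustrative numbers only):
proj = np.array([[0, 0], [640, 0], [640, 480], [0, 480]], dtype=float)
wall = np.array([[10, 20], [610, 40], [600, 470], [20, 450]], dtype=float)
H = homography(proj, wall)
```

Pre-warping content by the inverse of H before projection makes it appear undistorted on the surface; because the UFO is airborne, the correspondences would have to be tracked continuously by its onboard cameras.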