The £1M HAPPIE project
Haptic authoring pipeline for the production of immersive experiences
The £1M HAPPIE project is being conducted by a consortium comprising UWL, Generic Robotics, Numerion Software, the Science Museum Group, Sliced Bread Animation and the Open University.
Between us, we are developing a haptics pipeline: a tool that enables transparent communication from software application to user experience, in which the graphics match the properties of the real world and the end user can feel actual friction, mass, momentum and texture, through physics modelling of objects in the virtual world combined with custom-built hardware.
Our contribution is to apply such haptics to music production within a virtual environment.
In 2019, Innovate UK (part of UK Research and Innovation) awarded £1 million to the “HAPPIE” project (Haptic Authoring Pipeline for the Production of Immersive Experiences).
The 18-month UWL HAPPIE project is now well underway.
The research was based at the dedicated ‘HAPPIE Lab’ in UWL’s London College of Music but, due to recent events, is currently being conducted remotely, with the team collaborating online.
About the HAPPIE project
The sense of touch is our physical connection to the world around us. Haptic technology recreates that sense of touch for the digital world. In the creative industries, haptics has primarily been applied as vibration feedback in computer games, and elsewhere in niche areas such as surgical training and flight simulation, but its scope goes far beyond simple vibration. As the necessity for touch in immersive experiences becomes clear, interest in haptics is flourishing.
According to Professor Paterson, “Music production follows a long-established paradigm, that of the recording studio, be it physical or desktop. In both cases, the interface is highly complex with a very steep learning curve. Huge numbers of parameters need to be managed in parallel and the operator constantly needs to correlate these with aural effect. Many novel interfaces have been developed in efforts to manage such problems. Bespoke hardware associated with specific manufacturers and touchscreen apps offer new modes of engaging with limited feature sets. There have been some attempts in academia to form 3-D representations of sound and its control, but these have been very awkward to use in the absence of actual touch.”
Paterson goes on to suggest that “Virtual and other extended realities will soon grow beyond the games sector and become a major disruptive force, impacting productivity, education, medicine and daily life. Virtual reality is being overhauled by the promise of extended realities where computer-generated artefacts can be superimposed upon and integrated into the user’s physical space. Systems such as HoloLens 2 and Magic Leap are leading the way, yet the visual illusions that they offer are undermined by the lack of that all-pervasive feature of the real world – touch.”
To address these issues, the HAPPIE project was conceived to define the future of interacting with sound in mixed reality. In practice, this means that any sound might be represented by a physical animated model superimposed upon the user’s field of view in mixed reality. The user will be able to reach out and actually touch the shape and, through physics-based modelling, feel a natural weight and resistance, and sculpt the animation with their hands by stretching, twisting and so on.
Not only will the shape change appearance with lifelike behaviour, but it will be intelligently mapped to sets of audio parameters using ‘semantic’ relationships. The user’s actions will thus instantly and intuitively modify the sound, doing away with the need to iteratively adjust myriad controls, whilst still allowing them all to be visible and adjustable as required via mixed-reality solutions. Such manipulation might be applied to a sound in isolation or whilst it plays alongside many others, in live music performance, or across an entire mix in stereo or 3-D audio.
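To make the idea of semantic mapping concrete, here is a minimal Python sketch of how a single gesture parameter might drive several correlated audio parameters at once. The parameter names, ranges and mapping curves are illustrative assumptions, not the project’s actual design.

```python
# Hypothetical sketch: map one "stretch" gesture value (0.0-1.0) onto a
# set of audio parameters at once, so a single touch adjusts many controls.
# Names, ranges and curves are illustrative, not HAPPIE's actual mapping.

def map_stretch_to_audio(stretch: float) -> dict:
    """Translate a normalised stretch gesture into correlated audio parameters."""
    stretch = max(0.0, min(1.0, stretch))  # clamp to the expected range
    return {
        # 'Brighter' as the shape stretches: open the low-pass filter (Hz).
        "filter_cutoff_hz": 200.0 + stretch * (12000.0 - 200.0),
        # 'Larger' as the shape stretches: lengthen the reverb tail (s).
        "reverb_decay_s": 0.5 + stretch * 3.5,
        # Keep perceived loudness steady by easing gain back slightly (dB).
        "gain_db": -stretch * 3.0,
    }

print(map_stretch_to_audio(0.5))
```

The point of bundling parameters this way is that one physically intuitive gesture replaces several separate control moves, while each underlying parameter remains individually inspectable and adjustable.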
Research Assistant Andy Visser is currently working on the connection between haptic devices (within the Unreal game-engine environment) and outside-world use cases such as digital audio workstations (DAWs). He has designed a bi-directional protocol, using Open Sound Control (OSC), that communicates between a haptic robotic device (affectionately christened HAL) and a DAW, and is currently designing user experiments to test various forms of haptic audio control within virtual reality.
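In outline, such a bi-directional OSC bridge might look like the following minimal Python sketch, built on the python-osc library. The ports, OSC addresses (/haptic/position, /daw/track/1/fader) and the position-to-fader mapping are illustrative assumptions, not the project’s actual protocol.

```python
# Hypothetical sketch of a bi-directional OSC bridge between a haptic
# device and a DAW, using the python-osc library. Ports, addresses and
# the mapping are illustrative assumptions, not HAPPIE's protocol.
from pythonosc.udp_client import SimpleUDPClient
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

daw = SimpleUDPClient("127.0.0.1", 9000)  # outgoing messages to the DAW

def on_haptic_position(address, x, y, z):
    # Map the device's vertical position onto a fader level (0.0-1.0),
    # so raising the haptic arm raises the track volume.
    level = max(0.0, min(1.0, (z + 1.0) / 2.0))
    daw.send_message("/daw/track/1/fader", level)

dispatcher = Dispatcher()
dispatcher.map("/haptic/position", on_haptic_position)

# Listen for incoming messages from the haptic device; the DAW could
# likewise send parameter feedback back to the device on another port.
server = BlockingOSCUDPServer(("127.0.0.1", 9001), dispatcher)
server.serve_forever()
```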
Visser states that “Research into the use and deployment of haptics in audio within a virtual or mixed-reality environment is a relatively new area. Although various groups have done a lot of work around haptics and also, separately, on virtual audio-production, the two areas have not really been successfully integrated yet. To be involved in this project is both exciting and rewarding – the work is truly cutting edge and at the very intersection of haptic robotics, virtual reality and music production.”
Research team
Principal Investigator Professor Justin Paterson leads the UWL HAPPIE research group, supported by Research Assistant Andy Visser, with contributions from some of Paterson’s PhD students.
Find out more
- Research Centres and Groups – Find out about our multi-disciplinary areas of expertise, PhD research, and teaching.
- Research impact – Learn how our PhD research has helped communities locally, nationally and internationally.
- The Graduate School – If you are interested in studying for a PhD or Professional Doctorate, the Graduate School is here to support your research.