Conclusion


Is it useful?

User testing showed that once the direction and intensity of user-made sounds are given a tangible representation, that representation can be used naturally as an interaction trigger. It can therefore serve as a valid interaction channel and complement conventional events such as touch, point, and grab.
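As an illustration of this interaction channel, the two quantities mentioned above can be sketched as simple signal statistics: intensity as the RMS level of the microphone input, and direction as the balance between two channels. This is a minimal, hypothetical sketch, not the project's actual implementation; the function name, the threshold value, and the two-channel setup are all assumptions.

```python
import math

def sound_trigger(left_samples, right_samples, threshold=0.1):
    """Estimate intensity (RMS) and direction (left/right balance) of a
    user-made sound, firing a trigger when intensity exceeds a threshold.
    All names and the threshold value are illustrative assumptions."""
    def rms(samples):
        return math.sqrt(sum(s * s for s in samples) / len(samples))

    left, right = rms(left_samples), rms(right_samples)
    intensity = (left + right) / 2.0
    # Direction in [-1, 1]: -1 means fully left, +1 means fully right.
    direction = 0.0 if left + right == 0 else (right - left) / (right + left)
    return {"triggered": intensity > threshold,
            "intensity": intensity,
            "direction": direction}

# A louder signal on the right channel yields a positive direction estimate.
quiet = [0.01, -0.01] * 50
loud = [0.5, -0.5] * 50
print(sound_trigger(quiet, loud))
```

In a real pipeline these values would be computed per audio frame and fed to the interaction system in place of a touch or point event.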

It also appears that aligning sensory feedback with the user's intention helps users perform complex interactions, such as locomotion, without experiencing cybersickness. However, this observation is far from a firm claim and requires further research.


Is it usable?


A significant number of participants were able to detect and recognize both simple objects and complex geometries, and their ability to do so improved with each immersion.

Although specific advanced interactions were tested with only a small group of volunteers, our observations confirm that the whole experience can be reduced to two channels, as in our case, while still allowing users to move objects or themselves.

However, this approach ultimately fails in noisy environments. Because we could not separate user-made sounds from background noise, advanced interactions such as sound-activated locomotion were triggered randomly and became a source of motion sickness. Also, because users could not internally link the sounds they produced with the related interaction, they needed much more time to gain comfortable control over their actions.
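The failure mode can be made concrete: a fixed amplitude threshold fires on any sufficiently loud background sound. A common mitigation is to track a running noise-floor estimate and trigger only on frames that clearly exceed it. The sketch below is an assumption for illustration, not the project's method, and the tuning parameters (`factor`, `alpha`) are invented; note that even this still cannot separate user-made sounds from background noise of comparable loudness, which is exactly the limitation observed.

```python
import math

def adaptive_trigger(frames, factor=3.0, alpha=0.05):
    """Trigger on frames whose RMS clearly exceeds a running noise-floor
    estimate. `factor` and `alpha` are hypothetical tuning parameters."""
    def rms(frame):
        return math.sqrt(sum(s * s for s in frame) / len(frame))

    noise_floor = rms(frames[0])
    events = []
    for i, frame in enumerate(frames):
        level = rms(frame)
        if level > factor * noise_floor:
            events.append(i)  # candidate user-made sound
        else:
            # Update the noise-floor estimate only on non-trigger frames.
            noise_floor = (1 - alpha) * noise_floor + alpha * level
    return events

# Steady background hum with one clearly louder burst at frame 3.
hum = [[0.05, -0.05] * 20] * 3 + [[0.8, -0.8] * 20] + [[0.05, -0.05] * 20] * 2
print(adaptive_trigger(hum))  # only the burst exceeds the adapted floor
```

When the background itself fluctuates as loudly as the user's voice, the floor estimate rises and the trigger becomes unreliable, matching the random activations we observed.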



Is it desirable?

The emotional aspect of this reduced VR experience turned out to be the biggest surprise. We had no intention of stylizing this experiment in any way, yet the most frequent responses recorded during the first immersions concerned how beautiful it was, without users being more specific.

Even when particles were given colors that changed as they aged, users still described the initial, unsaturated experience as the most desirable.

We hypothesize that sensory alignment contributes to satisfaction with this VR embodiment, and current cognitive science theories seem to support this. Yet this is far from a firm claim, and it requires further research.




Further research and development

Alongside the development of the B_BODY project, several side experiments attempted to put the proposed framework to practical use; here are two of them:
Magicles 01, audio-reactive particles: a simple particle-painting VR application in which the created strokes react to sound and seek its source. This particle behavior encourages users to use their full body: as they move and make sounds, the particles attempt to follow them, so motion and sound become part of the drawing.
https://www.youtube.com/watch?v=6Sk57EHoFiI&t=11s

Cyber Haiku: created as a Tilt Brush VR painting test, its characters and environments, inspired by the Japanese art of ink drawing, receive voice-triggered interactions. We are currently evaluating a potential partnership to develop this concept into a narrative VR experience.
https://www.youtube.com/watch?v=R5xM3kl4VGo&t=77s

On top of the above-mentioned practical applications of this project, additional questions were raised. As they cross the border between design and cognitive science, we feel that more knowledge must still be gained, and this project will therefore continue.





Continue to Acknowledgement