What is the …float… app?

A key component of the …float… project was the development, by a team at Memorial University, of a free iOS app that gave audience members the opportunity to transform their iPhones into singing bowls (also known as Tibetan bowls or prayer bowls) and participate in the performance’s finale. Each person was invited to download the app in advance of the performance and play it on cue in the closing section of Andrew Staniland’s work, “On the Surface of Water”. The result was a stunning, surround-sound cloud of three resonant “singing bowl” pitches enveloping the choral singing.

The app will remain free in perpetuity and can be used by choirs around the world for future performances of Andrew Staniland’s piece, or of other works that its availability might inspire. In developing this app, Choral Canada brought accessible new technology into the creative process, helping break down barriers between performer and listener.

What makes the …float… app unique?

As development moved forward, a few challenges along the way not only enhanced our learning experience but also helped set this app apart from many other apps available online.

Choreography is a crucial factor, as the visualization responds to two audio sources: 1) live, real-time microphone data, and 2) delay-tolerant, pre-saved audio clips. Analysis of both sources covers many parameters, such as frequency components and their amplitudes. However, sampling the microphone in real time, collecting only a very small slice of data quickly enough to avoid latency and then analyzing its frequencies, proved more challenging than analyzing the stored audio clips.
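As an illustration, here is a minimal sketch of how such a low-latency microphone analysis might look on iOS, assuming AVAudioEngine with a small tap buffer and an Accelerate-framework FFT. The class name, window size, and callback are illustrative choices for this sketch, not details taken from the …float… source code.

```swift
import AVFoundation
import Accelerate

// A minimal sketch of the real-time microphone path: a small analysis window
// (1024 samples ≈ 23 ms at 44.1 kHz) keeps latency low while still giving a
// usable frequency resolution.
final class MicAnalyzer {
    private let engine = AVAudioEngine()
    private let fftSize = 1024

    /// Installs a low-latency tap on the microphone and reports a magnitude spectrum per buffer.
    func start(onSpectrum: @escaping ([Float]) -> Void) throws {
        let n = fftSize
        let log2n = vDSP_Length(log2(Float(n)))
        guard let setup = vDSP_create_fftsetup(log2n, FFTRadix(kFFTRadix2)) else { return }

        // Hann window, computed once, to reduce spectral leakage on each short buffer.
        var window = [Float](repeating: 0, count: n)
        vDSP_hann_window(&window, vDSP_Length(n), Int32(vDSP_HANN_NORM))

        let input = engine.inputNode
        let format = input.outputFormat(forBus: 0)
        input.installTap(onBus: 0, bufferSize: AVAudioFrameCount(n), format: format) { buffer, _ in
            guard let samples = buffer.floatChannelData?[0], Int(buffer.frameLength) >= n else { return }

            // Window the incoming samples.
            var windowed = [Float](repeating: 0, count: n)
            vDSP_vmul(samples, 1, window, 1, &windowed, 1, vDSP_Length(n))

            // Pack the real signal into split-complex form and run an in-place real FFT.
            var realp = [Float](repeating: 0, count: n / 2)
            var imagp = [Float](repeating: 0, count: n / 2)
            realp.withUnsafeMutableBufferPointer { rp in
                imagp.withUnsafeMutableBufferPointer { ip in
                    var split = DSPSplitComplex(realp: rp.baseAddress!, imagp: ip.baseAddress!)
                    windowed.withUnsafeBufferPointer { wp in
                        wp.baseAddress!.withMemoryRebound(to: DSPComplex.self, capacity: n / 2) {
                            vDSP_ctoz($0, 2, &split, 1, vDSP_Length(n / 2))
                        }
                    }
                    vDSP_fft_zrip(setup, &split, 1, log2n, FFTDirection(FFT_FORWARD))

                    // Magnitude per frequency bin; bin width = sampleRate / fftSize.
                    var magnitudes = [Float](repeating: 0, count: n / 2)
                    vDSP_zvabs(&split, 1, &magnitudes, 1, vDSP_Length(n / 2))
                    onSpectrum(magnitudes)
                }
            }
        }
        try engine.start()
    }
}
```

The magnitudes delivered by this tap are exactly the kind of frequency/amplitude data the visualization can consume, with the stored audio clips analyzed offline through the same FFT path without the latency constraint.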

The added playback feature allows users to play a previously recorded audio clip to participate in the event. However, multitrack, parallel playback proved challenging: each track needed to be separately defined, initialized, tuned, and activated, yet all tracks needed to merge smoothly and become one. Fade-in/fade-out algorithms were created and added to assist with parallel playback.
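A minimal sketch of such parallel playback with fades might look like the following, assuming one AVAudioPlayer per pre-recorded bowl track. The file names, fade durations, and class name are placeholders for this sketch, not the app’s actual assets or code.

```swift
import AVFoundation

// A minimal sketch: several tracks started against a common clock, faded in
// together so they merge into one sound, and faded out before stopping.
final class BowlPlayback {
    private var players: [AVAudioPlayer] = []

    /// Loads each track, starts them in sync, and fades them in so they blend smoothly.
    func start(trackNames: [String], fadeIn: TimeInterval = 2.0) throws {
        players = try trackNames.map { name in
            guard let url = Bundle.main.url(forResource: name, withExtension: "caf") else {
                throw CocoaError(.fileNoSuchFile)
            }
            let player = try AVAudioPlayer(contentsOf: url)
            player.numberOfLoops = -1          // loop until explicitly stopped
            player.volume = 0                  // start silent, then fade in
            player.prepareToPlay()
            return player
        }

        // Schedule all tracks against a shared start time so they begin together.
        let startTime = players.first.map { $0.deviceCurrentTime + 0.1 } ?? 0
        for player in players {
            player.play(atTime: startTime)
            player.setVolume(1.0, fadeDuration: fadeIn)
        }
    }

    /// Fades every track out, then stops playback.
    func stop(fadeOut: TimeInterval = 2.0) {
        for player in players {
            player.setVolume(0.0, fadeDuration: fadeOut)
        }
        DispatchQueue.main.asyncAfter(deadline: .now() + fadeOut) { [weak self] in
            self?.players.forEach { $0.stop() }
        }
    }
}
```

Scheduling every player against one shared device time, rather than calling play() in a loop, is what keeps the separately initialized tracks from drifting apart at the start.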

To ensure the audio visualization looks as natural as possible, we implemented physical laws of motion and made use of continuous temporal data to simulate smooth transitions between visualization components. There were a good number of choices for the audio visualization; we considered and explored nebulae, stars, waves, and ripples, in both 2D and 3D. After long deliberation and tryouts, we decided on the nebula effect, as it was the most unique and the most consistent with the context of the …float… event. To precisely tune the nebula effect to match the input data, many of its parameters, such as mean particle size and variance, velocity, acceleration, depth of field, and edge effect, were adjusted accordingly.
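To give a sense of that tuning, the sketch below maps audio features onto a particle emitter, assuming a SpriteKit SKEmitterNode drives the nebula. The mapping constants and the exponential smoothing factor are illustrative values, not the ones tuned for the app.

```swift
import SpriteKit

// A minimal sketch: audio amplitude and dominant frequency drive the emitter's
// size, speed, acceleration, and spread, with exponential smoothing so the
// visualization transitions continuously between frames.
final class NebulaTuner {
    private let emitter: SKEmitterNode
    private var smoothedAmplitude: CGFloat = 0

    init(emitter: SKEmitterNode) { self.emitter = emitter }

    /// `amplitude` in 0...1 and `dominantFrequency` in Hz, e.g. derived from the FFT output.
    func update(amplitude: CGFloat, dominantFrequency: CGFloat) {
        // Exponential smoothing keeps the transition between frames continuous.
        smoothedAmplitude += 0.15 * (amplitude - smoothedAmplitude)
        let a = smoothedAmplitude

        // Mean particle size and its variance grow with loudness.
        emitter.particleScale = 0.2 + 0.8 * a
        emitter.particleScaleRange = 0.1 + 0.3 * a

        // Velocity and acceleration follow amplitude so louder passages feel more energetic.
        emitter.particleSpeed = 20 + 120 * a
        emitter.particleSpeedRange = 10 + 40 * a
        emitter.yAcceleration = 5 + 30 * a

        // Spread particles over a wider region for higher pitches and soften the
        // edges of the cloud by widening the emission angle.
        let pitchFactor = min(max(dominantFrequency / 2000, 0), 1)
        emitter.particlePositionRange = CGVector(dx: 200 + 300 * pitchFactor,
                                                 dy: 200 + 300 * pitchFactor)
        emitter.emissionAngleRange = .pi * (0.5 + 1.5 * pitchFactor)
    }
}
```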

To allow users to interact with the app using a whip motion, which can be slower and wider than a usual shake gesture, we needed to bypass the built-in shake-motion detector, access raw Core Motion data (accelerometer and gyroscope), and build detection algorithms around it. This module involves low-pass filtering and motion integration in 3D space. These algorithms distinguish the different movements of the device and trigger the appropriate actions. The result is that users are able to invoke playback of the Tibetan bowl sounds without having to be conscious of whether they made a shake or a whip motion; it happens regardless of whether the gesture is slow or quick, wide or compact.
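A minimal sketch of that idea, assuming CMMotionManager device-motion updates, a per-axis low-pass filter, and a decaying integral of the motion magnitude, is shown below. The smoothing factor, decay, and trigger threshold are illustrative numbers, not the values tuned for the …float… app.

```swift
import Foundation
import CoreMotion

// A minimal sketch: raw acceleration is low-pass filtered, its magnitude is
// accumulated over a short decaying window, and the gesture fires once the
// accumulated motion crosses a threshold, so both a quick shake and a slow,
// wide whip can trigger it.
final class WhipDetector {
    private let motion = CMMotionManager()
    private var filtered = CMAcceleration(x: 0, y: 0, z: 0)
    private var integratedMagnitude = 0.0
    private let alpha = 0.2              // low-pass smoothing factor
    private let decay = 0.9              // forgets old motion so the integral stays windowed
    private let triggerThreshold = 0.2   // reachable by a fast shake or a slow, wide whip

    func start(onGesture: @escaping () -> Void) {
        guard motion.isDeviceMotionAvailable else { return }
        motion.deviceMotionUpdateInterval = 1.0 / 60.0
        motion.startDeviceMotionUpdates(to: .main) { [weak self] data, _ in
            guard let self = self, let a = data?.userAcceleration else { return }

            // Low-pass filter each axis to suppress jitter.
            self.filtered.x += self.alpha * (a.x - self.filtered.x)
            self.filtered.y += self.alpha * (a.y - self.filtered.y)
            self.filtered.z += self.alpha * (a.z - self.filtered.z)

            // Accumulate motion in 3D space; slow-wide and fast-compact gestures both build it up.
            let magnitude = sqrt(self.filtered.x * self.filtered.x +
                                 self.filtered.y * self.filtered.y +
                                 self.filtered.z * self.filtered.z)
            self.integratedMagnitude = self.integratedMagnitude * self.decay
                                     + magnitude * self.motion.deviceMotionUpdateInterval

            if self.integratedMagnitude > self.triggerThreshold {
                self.integratedMagnitude = 0   // reset so one gesture fires once
                onGesture()                    // e.g. start the bowl playback
            }
        }
    }

    func stop() { motion.stopDeviceMotionUpdates() }
}
```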

Additional features such as a meditation mode, background mode, offline support, and manual play/stop actions were added to bring more fun, usefulness, and control to users. Geo-fenced audio selection, which would automatically select specific audio playbacks or variants based on the user’s location, was also explored during development. Overall, we were very pleased with the result and with how the app enhanced the audience’s experience: subtle but not intrusive.
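For the geo-fenced idea, a minimal sketch using CoreLocation region monitoring might look like the following; the coordinates, radius, and track names are placeholders, since the feature was only explored during development.

```swift
import CoreLocation

// A minimal sketch: a circular geofence around the venue switches which
// pre-recorded variant the app would play when the user is inside it.
final class VenueAudioSelector: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()
    var selectedTrack = "bowl_default"       // fallback used outside any geofence

    func startMonitoring() {
        manager.delegate = self
        manager.requestAlwaysAuthorization() // region monitoring needs "Always" authorization
        let venue = CLCircularRegion(center: CLLocationCoordinate2D(latitude: 0.0, longitude: 0.0),
                                     radius: 200,             // metres; placeholder venue
                                     identifier: "performance-venue")
        venue.notifyOnEntry = true
        venue.notifyOnExit = true
        manager.startMonitoring(for: venue)
    }

    func locationManager(_ manager: CLLocationManager, didEnterRegion region: CLRegion) {
        selectedTrack = "bowl_venue_variant" // use the venue-specific clip inside the fence
    }

    func locationManager(_ manager: CLLocationManager, didExitRegion region: CLRegion) {
        selectedTrack = "bowl_default"
    }
}
```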

Report provided by Dr. Yuanzhu Chen and Ali Alfosool, co-creators of the …float… app

Download for iOS