The physical interface for 'Channels' -- a boat and water -- lets you move through a virtual 3D space by paddling in real water. You change direction the same way you would in a real boat, by paddling more on one side than the other, or by paddling in reverse. You slow down the same way, as well.
To detect the speed of the water in the buckets, we used two flex sensor strips, each with half of a plastic spoon attached to its end. This allowed us to measure the movement of the water. While this worked for our purposes, it probably wouldn't work as a flow meter, which is another option we looked at.
We tested many different designs for this sensor, with various sizes and weights of paddles attached to the flex sensor, and this design proved to be the most accurate and reliable. We also covered the soldered connection point at the top of the flex sensor with a rigid outer covering to prevent the sensor from bending or breaking at its most vulnerable point. To firmly secure the sensor in place and suspend it in the water, we cut out a square brick of insulating foam and attached the flex sensor to its side; we then glued the foam brick to a piece of wood attached to the rim of each metal bucket.
It should be noted that flex sensors are extremely fragile; we broke four during this project. So be mindful before using this particular sensor for any long-term installation -- especially one involving water -- and put careful thought into how to secure the base of the sensor to prevent breakage.
The two flex sensors are connected to the Arduino microcontroller as analog inputs. The user's speed is the sum of the two sensor readings, and the direction of movement is their difference: right sensor – left sensor = direction.
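The sum-and-difference math above can be sketched in a few lines. This is a minimal illustration, not our actual Arduino sketch: the 10-bit reading range and the resting baseline value are hypothetical placeholders that would need calibration against real sensors.

```cpp
// Hypothetical resting reading for an unflexed sensor (10-bit ADC, 0-1023).
const int REST = 512;

// Speed: how far both paddles are flexed past their rest position, combined.
int speedFrom(int leftRaw, int rightRaw) {
    return (leftRaw - REST) + (rightRaw - REST);
}

// Direction: right minus left. Positive steers one way, negative the other,
// and equal paddling on both sides yields zero (straight ahead).
int directionFrom(int leftRaw, int rightRaw) {
    return rightRaw - leftRaw;
}
```

Paddling harder on the left (a higher left reading) produces a negative direction value, turning the boat toward the right, just as it would on real water.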
We created a 3D scene in Processing using OpenGL (Open Graphics Library), which allowed us to draw a three-dimensional scene from 2D primitive graphics. The objects in our virtual world were made from PNG images to allow for transparency. Read more about our work making the scene in previous blog posts.
We used serial communication to send values from the Arduino microcontroller to our virtual scene in Processing, and we mapped the incoming sensor values to ranges of numbers in our sketch representing “speed” and “direction.” As the sensor values arrive from the Arduino, Processing converts them with the map() function into a speed value along the “z” axis and a direction value along the “x” axis.
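The mapping step works by linear interpolation between two ranges. Below is a sketch of what Processing's map() does internally, re-implemented in C++; the raw input range and the 0–10 speed output range are illustrative assumptions, not the values from our sketch.

```cpp
// Linearly remap 'value' from the range [inLo, inHi] to [outLo, outHi],
// mirroring the behavior of Processing's map() function.
float mapRange(float value, float inLo, float inHi, float outLo, float outHi) {
    return outLo + (value - inLo) * (outHi - outLo) / (inHi - inLo);
}

// Example: convert a combined sensor reading (0-1024, hypothetical)
// into a z-axis speed of 0-10 scene units per frame:
//   float speed = mapRange(sum, 0, 1024, 0, 10);
```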
It should be noted that, instead of moving the boat through the scene, we move all the objects in the scene around it. This gives the illusion of movement -- the boat itself never actually moves. While this was a workable solution at first, we should have moved the boat instead; that would have allowed it to face different directions. As it stands, the boat can only pan left and right -- it cannot rotate to face a new heading.
We created our own audio piece combining sound effects of water rippling, wind and birds. The Minim library plays and loops the audio in Processing.
The Processing code can be found here. (download zip)
We tested our physical construction and user interaction models over the course of a few weeks, and we received valuable feedback along the way. People reacted negatively to the cold temperature of the water, so we made sure to put room-temperature water in the metal buckets.
People also really liked the surprise image of an old bridge in our virtual scene, and they requested that we put more one-of-a-kind objects in the scene, so we added fireflies, ducks, fish and logs.
We requested user feedback on the physical form factor because we wanted to ensure that people felt comfortable when seated in the "boat." We recorded people's different sizes and heights, considering the fact that we want children to be able to participate as well, and we adjusted the height of the seat and water buckets based on average measurements.
We also asked several people to paddle through the scene and navigate around various objects and under the bridge, in order to test whether our sensor mappings were accurate. Based on these tests, we concluded that we should add semi-rigid plastic stays around the flex sensors to create more resistance against the water and enable more accurate control. As a result, people were able to steer through the scene even more precisely.
Read more about our user scenarios and testing here.
Each group member documented this project on their respective blogs, which can be found here: