To add an extra layer of interactivity, our system features a web app that lets the user select which sensing type to use and how the system should respond. They can choose audio or vision as the source of sensory data, or a demo mode that showcases the structure's movement when no one is interacting with it. The Flask web application switches between modes by starting and stopping the corresponding Python processes as different modes are selected.
Check out our code for the webapp here.
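The mode switching can be sketched roughly as below. The script names and the `ModeManager` class are illustrative, not our actual file layout; in the Flask app, a route handler would simply call `manager.switch(name)`.

```python
import subprocess
import sys

# Hypothetical mapping from mode name to the script that implements it.
MODE_SCRIPTS = {
    "vision": [sys.executable, "motion_detection.py"],
    "audio": [sys.executable, "sound_processing.py"],
    "demo": [sys.executable, "demo.py"],
}


class ModeManager:
    """Runs at most one sensing mode at a time as a child process."""

    def __init__(self, scripts=MODE_SCRIPTS):
        self.scripts = scripts
        self.current = None   # name of the running mode, if any
        self.process = None   # its subprocess.Popen handle

    def switch(self, mode):
        """Stop the current mode's process (if any) and start the new one."""
        if mode not in self.scripts:
            raise ValueError(f"unknown mode: {mode}")
        if self.process is not None:
            self.process.terminate()
            self.process.wait()
        self.process = subprocess.Popen(self.scripts[mode])
        self.current = mode
```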
Using OpenCV and a Raspberry Pi camera, our system's video motion detection mode tracks motion within the frame. First, we take the difference of the three most recent images to determine where motion has occurred. Then we divide the input image into six panes so that we can detect which pane or panes contain motion. The six panes correspond to the six groups of origami nodes on the sculpture, letting us control the motion of the structures depending on which panels show motion. The system transmits this information to the Arduino as serial messages of 1s (indicating motion) and 0s (indicating no motion), which then control the positions of the servos. We use threading so that we can insert a delay between messages while the camera code keeps tracking motion continuously.
Check out our motion detection code here.
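The pane logic can be illustrated with a pure-Python analogue of three-frame differencing (the real code uses OpenCV's `cv2.absdiff` on camera frames; the 2×3 pane grid, threshold values, and function names here are assumptions for the sketch):

```python
def frame_diff(prev, curr, nxt):
    """Three-frame differencing: a pixel counts as moving only if it
    changed in both the (prev, curr) and (curr, nxt) pairs.  Taking the
    min of the two absolute differences keeps only such pixels."""
    h, w = len(curr), len(curr[0])
    return [[min(abs(curr[y][x] - prev[y][x]), abs(nxt[y][x] - curr[y][x]))
             for x in range(w)] for y in range(h)]


def pane_message(diff, rows=2, cols=3, thresh=25, min_pixels=10):
    """Split the difference image into rows*cols panes and build the
    serial message: '1' for each pane with enough changed pixels."""
    h, w = len(diff), len(diff[0])
    ph, pw = h // rows, w // cols
    bits = []
    for r in range(rows):
        for c in range(cols):
            changed = sum(1
                          for y in range(r * ph, (r + 1) * ph)
                          for x in range(c * pw, (c + 1) * pw)
                          if diff[y][x] > thresh)
            bits.append("1" if changed >= min_pixels else "0")
    return "".join(bits)
```

In the real program the resulting string would be written to the Arduino over serial, e.g. `ser.write(msg.encode())` with pySerial.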
Similarly, our system's audio detection mode uses Allen Downey's thinkdsp library to build a wave spectrum and extract the amplitudes of different frequencies. Again, we use threading so that we can record sound and process the waves from the previous few seconds at the same time. We find the maximum amplitude within the range of frequencies audible to the human ear and use it to decide what message to send to the Arduino, moving more nodes as the noise in the environment increases.
Check out our sound processing code here.
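The mapping from loudness to node count might look something like this. The frequency and amplitude arrays would come from a thinkdsp `Spectrum` (its `fs` and `amps` attributes); the threshold values here are illustrative, not our real calibration:

```python
AUDIBLE_LOW, AUDIBLE_HIGH = 20, 20_000   # Hz, approximate human hearing range


def nodes_to_move(freqs, amps, thresholds=(0.1, 0.3, 0.6, 1.0, 1.5, 2.0)):
    """Find the loudest audible frequency component and map it to a
    node count from 0 to 6: each threshold the peak clears adds one
    more moving node group.  (Threshold values are placeholders.)"""
    peak = max((a for f, a in zip(freqs, amps)
                if AUDIBLE_LOW <= f <= AUDIBLE_HIGH), default=0.0)
    return sum(1 for t in thresholds if peak >= t)


def message_for(n, total=6):
    """Serial message moving the first n of the total node groups."""
    return "1" * n + "0" * (total - n)
```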
The demo mode (shown in the video below) sends similar messages of 0s and 1s indicating which sections should move at which times, producing interesting and exciting patterns. These patterns can easily be adjusted by changing the messages sent in our demo program.
Check out our demo code here.
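A demo loop of this kind could be sketched as follows. The pattern strings are illustrative examples, and `send` stands in for whatever writes a message to the Arduino (e.g. `ser.write(msg.encode())` in the real program):

```python
import itertools
import time

# Illustrative patterns: each string says which of the six node groups move.
PATTERNS = [
    "100000", "010000", "001000", "000100", "000010", "000001",  # chase
    "101010", "010101",                                          # alternate
    "111111", "000000",                                          # all / none
]


def run_demo(send, patterns=PATTERNS, interval=1.0, cycles=1):
    """Send each pattern in turn, pausing `interval` seconds between
    messages, and repeat the whole sequence `cycles` times."""
    total = cycles * len(patterns)
    for msg in itertools.islice(itertools.cycle(patterns), total):
        send(msg)
        time.sleep(interval)
```

Changing the patterns is just a matter of editing the list of strings, which is what makes the demo easy to adjust.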