During this sprint we came up with the initial idea for our project: A swarm of robots that communicate with each other using colored light, inspired by Conway's Game of Life where simple rules of interaction produce complex and unpredictable behaviors.
We started the sprint by determining the basic functionality we wanted our bots to have: the ability to differentiate between colors of light and drive toward or away from specific colors. At this point, the robot system consisted of photodiodes to detect red, green, and blue light, motors for the wheels, an Arduino for processing, and a chassis to house all of these components. For prototyping we used an existing chassis from a line-following robot as our platform and 3D-printed a holder for the photodiodes that also held a piece of colored film in front of them to filter for the color they should detect.
In addition to the ability to sense colored light, the robots needed an external light source to interact with. This came in the form of our "beacons." For this two-week sprint we made a quick-and-dirty beacon out of 3D-printed parts and red LEDs for testing purposes. The Arduino software compared the magnitudes of the photodiode readings and took the channel with the highest value as the color being sensed. When the robot sensed red, it drove forward towards the light. A gif of this first-pass integrated prototype can be seen here.
After the first sprint, where we proved that we could differentiate between colors of light and drive the motors in response, we shifted our focus to creating a custom chassis with more compact electronics and sensor mounting in order to reach the goal of a small-scale robot platform. The first iteration of our custom chassis retained the general shape of the line-follower chassis but was scaled down, and included mounting features so the photodiodes could look in multiple directions.
However, we quickly did a literal pivot of our bot design to a more vertical layout that hid much of the interior wiring, giving an aesthetically cleaner design. The circular head allows the photodiodes to be mounted with a 360° field of view, and it also comes off to allow easy servicing. This chassis was prototyped and outfitted with motors and wheels to test its driving capabilities during this sprint.
Electronics-wise we decided to move from an Arduino and breadboard for our circuitry to a custom-made printed circuit board that could be slotted into our chassis. This board was designed and sent to be manufactured during this sprint, but we did not receive the boards and components in time to integrate it with the rest of the chassis.
While we waited, we prototyped our PCB circuit on a breadboard to make sure that we could read from all of our sensors correctly. This breadboard was then strapped to the back of one of our prototype chassis to prove out the color detection and reaction functionality of our little robot so far.
We realized at the beginning of this sprint that we didn't have a concrete idea of what success would look like for our project -- we knew generally what we wanted our bots to do, but didn't have any metrics or definite goals to meet. So, during a long brainstorming session, we came up with the idea of Freeze Tag. Having the robots simulate playing a game of Freeze Tag, with bots that chase and bots that run away, would demonstrate all of the functionalities we've been working towards while giving a more easily understandable goal for what our robots were trying to accomplish.

The new circuit boards arrived just before winter break, and those of us who stayed at school started populating them and printing the final iterations of the chassis and beacons. By the time we all got back, we were able to assemble a robot in its semi-final stages and begin testing and debugging. After adding LED strips to the tops of the robots and integrating an accelerometer for bump detection, we demonstrated all of the functionalities we needed to play a game of tag!
After demonstrating all of the functionalities that we've been working towards, it was time to manufacture more robots and teach them the rules of tag.
We compiled the code we'd used to test individual functionalities into "Chasing" and "Chased" robot behaviors -- the Chasing bot was the equivalent of "it" in tag, while the Chased bot was "not it." We also constructed an arena for the robots to operate in so they could be demoed in a constrained space. Now, our robots could react to both each other and the beacons, and were ready for demo! We had a few issues with the robots' limited range in registering colored light, but overall we were happy with the endearing way our robots behaved and the fun human interaction they encouraged from those who came to the demo.