An Attempt to Make an Interactive Desktop Companion


Interactive Desktop Companion: A Summary


The Interactive Desktop Companion is a best friend for those who spend extended periods at their desks, in front of their computers. The product is a kinetic sculpture whose movements are based on the Strandbeest, and it interacts with the user by recognizing both ambient and active inputs. Major electrical inputs, such as ambient brightness and vibrations, are tied to major outputs, such as LED brightness and the interactive state of the sculpture. Software inputs include gesture recognition through a webcam, which invokes a response from the sculpture. The mechanical system, with its fin-based movement, brings everything together in a polished form.

Strandbeest-inspired Desktop Companion

In creating our Desktop Companion, we strive to produce as polished a product as possible. Our focus is not solely on features, but also on the appearance, safety, and appeal of the finished product. In this process, we attempt to minimize bugs as we bring the different subsystems together. Integration and interaction between subsystems is a major theme of the project, as different subsystems act as inputs, computation, and outputs. Below is the system diagram of inputs and outputs.

System Diagram

System Diagram for Desktop Companion

The major inputs for the Desktop Companion are the webcam, the touch screen, the photodiode, and the accelerometer. The webcam recognizes people and enables interaction, as it can detect faces and smiles. The touch screen detects touches; when touched, an emoji face, presented as an output on the touch screen, blushes. The data from the touch screen and the webcam are processed by the Raspberry Pi.
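As a rough illustration of how the Raspberry Pi might combine the webcam detections and touch events into a response, here is a minimal Python sketch. The function name, state names, and emoji labels are all assumptions for illustration, not the project's actual code.

```python
def companion_state(face_detected: bool, smile_detected: bool,
                    touched: bool) -> dict:
    """Combine webcam and touch-screen inputs into a display state
    for the on-screen emoji face (hypothetical logic)."""
    if touched:
        # A touch makes the emoji blush, regardless of webcam input.
        return {"state": "blushing", "emoji": "blush"}
    if smile_detected:
        return {"state": "happy", "emoji": "smile"}
    if face_detected:
        return {"state": "attentive", "emoji": "neutral"}
    return {"state": "idle", "emoji": "sleepy"}
```

In a sketch like this, touch takes priority over webcam input so the blush response always wins when both occur at once.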

The photodiode is an input that senses the surrounding brightness. Its data is processed by an Arduino, which adjusts the brightness of the LEDs accordingly. The accelerometer registers knocks on the front panel as inputs; the impulse is processed by the Arduino and toggles the LED patterns.
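The two Arduino-side behaviors above can be sketched as plain functions. This is a hedged Python mockup of the logic only: the 10-bit ADC range, the direction of the brightness mapping, and the knock threshold are assumptions, not measured values from the project.

```python
def led_brightness(light_reading: int) -> int:
    """Map a 10-bit ADC photodiode reading (0-1023) to an 8-bit PWM
    duty cycle (0-255). Assumes brighter surroundings mean brighter
    LEDs; the real mapping could just as well be inverted."""
    clamped = max(0, min(1023, light_reading))
    return clamped * 255 // 1023

def detect_knock(accel_samples, threshold=2.0):
    """Return True if any acceleration sample (in g) exceeds the
    impulse threshold. The threshold value is illustrative."""
    return any(abs(a) > threshold for a in accel_samples)
```

A simple threshold on the acceleration magnitude is enough to distinguish a deliberate knock from ordinary desk vibration, though a real sketch would likely also debounce so one knock does not toggle the pattern twice.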

The LEDs and the output motors, which control the fin movements, are driven by the Arduino. The patterns the LEDs and motors generate depend on which 'state' the Desktop Companion is in. The state can be altered by the different inputs available, mainly the webcam. The Arduino continuously communicates with the Raspberry Pi, and that is how all the inputs and outputs are integrated together.
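The state-and-pattern coordination described above could be organized as a small state machine driven by one-character messages on the serial link. The sketch below is a Python approximation of that idea; the command letters, state names, and pattern list are invented for illustration.

```python
# Hypothetical pattern list; the real companion's patterns may differ.
LED_PATTERNS = ["solid", "breathe", "rainbow"]

class Companion:
    """Toy state machine coordinating inputs relayed over serial."""

    def __init__(self):
        self.state = "idle"
        self.pattern_index = 0

    def handle_command(self, cmd: str) -> str:
        """Apply one single-character command and return the active
        LED pattern (command letters are assumptions)."""
        if cmd == "K":      # knock detected: advance to the next pattern
            self.pattern_index = (self.pattern_index + 1) % len(LED_PATTERNS)
        elif cmd == "F":    # face detected by the webcam
            self.state = "attentive"
        elif cmd == "I":    # no active inputs
            self.state = "idle"
        return LED_PATTERNS[self.pattern_index]
```

Keeping the state machine in one place, fed by short serial messages, is one way to let the Arduino's sensor inputs and the Pi's webcam inputs update the same shared behavior.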