EmotiSphere

Mapping emotions to music

with Sara Burns and Galen Chuang

I was in charge of:
- creating the visual materials
- 3D modelling and physically engineering the prototype
- coding the volume-control interactions

image
Defined interactions
image
The orb reads the user's emotion (e.g. happy or sad)
image
Using Arduino and Processing, we algorithmically send the emotion reading to Pure Data (Pd), which composes music. The resulting file is saved for the user to play back.
image
Currently, the sound plays through nearby speakers; in the future, a speaker could be built into the base
image
The nuts and bolts (the Arduino) inside the orb
image
The 3D-printed base and stopper house reed switches and a magnet that track rotation to increase or decrease the volume

Low Fidelity Prototype Demo

image
Bonus: initial concept sketches