This work was initially titled Eco-Song and was to be the focus of my major project this term. During the research phase I realised that my initial goal, to sonify a larger ecosystem, was not a clear enough target to aim towards. Also, the area has been colonised by tech-bro startups, always a great sign that you should immediately run the other way. As a result I put this project back on the shelf and focused on robotic systems.
Throughout class feedback sessions it became clear that to make this more interesting it would be good to consider: "if plants could talk, what would they actually say?". If nothing else, 2020 is doing a fantastic job of giving humanity a bit of customer feedback. It seems fair to say that the biosphere has "some thoughts" about its current relationship with humanity.
With this in mind, I revisited the concept. I aimed to balance the pleasing equilibrium of my work last term, the quality of the music and how it evolves from sensor data, with an added dramatic response when a human intervenes by touching the plants.
On a technical level, I wanted to work with multiple Huzzah 8266 WiFi nodes to create a simple IoT network, pushing all the sensor data to MaxMSP. To achieve this I adapted the WiFi example from the Adafruit library and used the OSC code Nathan Adams created for the OSCIMU.
It would be fun to take this idea further and give plants unique voices and languages – they are citizens of the world, after all. In this prototype they're all sweary Australians.
2 x Huzzah 8266