Harpreet Sareen

The Plant in the Machine

I imagine that over the next decade humans will form a kind of symbiosis with the plant kingdom, where bioengineered plants change colors to signal the presence of pollutants in the air, land, and waterways. Rainbows of color in the foliage might reveal the myriad ways we humans have impacted the natural world. They might warn us when a factory is cheating and releasing contaminated runoff or emitting hazardous fumes into the air. At home, when you water the plants, their leaves might change colors to tell you the concentration of toxic metals or harmful bacteria in the tap. In city squares, urban gardens might provide living information about local environmental health. They might even help us navigate, pointing us to our destinations with their branches using the same cellular mechanisms that point them to the sun.

I am a professor of interaction design, a field that usually investigates relationships between people and products. My practice, however, focuses on people’s interactions with plants; I create plant technologies, or what I call bio-digital hybrids. My designs may seem whimsical and at odds with our current ideas about nature, but I argue that the status quo persists because our relationship with nature has been stunted.

For thousands of years, humans have harnessed biological materials for fabrication, medicine, and agriculture. In India, I grew up seeing and using reed baskets, thatched roofs, grown bridges, seagrass cots, and jute textiles. These materials were culturally specific—they grew near our homes and we had relationships with them. But since the Industrial Revolution, we’ve become alienated from our biological materials. They’re no longer intertwined with our daily lives and rituals.

Early in my career, designers like me used the phrase “human-centered design” while designing for digital environments. Something always seemed wrong with that phrase since so much of human experience is lost when we enter the digital realm.

I remember one project where my team and I developed a virtual garden for local communities. When I shared the project with my family back in India, I could sense their bafflement. They reminded me that as a child I would play with the Venus flytrap we had in the house by triggering it to snap its jaw. Memories of dissolving rice paper in water and chasing bubbles came back to me. I realized that all of these tangible experiences meant more to me than anything that ever happened onscreen.

The conversation forced me to reckon with myself. What about humanity were we designing for? My work had migrated to a medium that had no hope of simulating an actual relationship with the living world. And so over time I realigned my career. Today my work asks, how can technology carry forward the relationships with the botanical world that evolved over generations? Can we expand the range of signals plants can sense in their environment, speed their reactions, and devise novel methods to display environmental impacts in their petals and leaves?

Thirteen years ago, I started to study plants that move: the leaves of the Mimosa pudica, known as the touch-me-not, fold up when you brush them with your fingers; the smaller leaves of the Codariocalyx motorius, the telegraph plant, sweep along an elliptical path to catch the ideal angle of sunlight and signal larger leaves to change their orientation. Darwin called plant movement “the animalness in plants.”

All plants have an “animalness.” To demonstrate, I created a plant-robot hybrid I call Elowan, which means “good light” in Celtic. Fitted with wheels, Elowan can drive itself toward a window for a sip of light. I fabricated it from a potted Anthurium andreanum (commonly called a laceleaf) with a base fitted with a microprocessor and motorized wheels. Electrodes from the base connect to Elowan’s stems. When a leaf senses light, its stem floods with calcium. This reaction typically redirects growth toward light, but in Elowan it also raises conductivity levels, which signals the wheels to drive toward the light.
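For readers curious about the control logic, the sensing-to-motion loop described above can be sketched in a few lines of code. This is a minimal illustration, not Elowan’s actual firmware: the baseline, threshold, and function names here are all hypothetical, standing in for the essay’s description of a conductivity rise triggering the wheels.

```python
# Hypothetical sketch of an Elowan-style control loop.
# Electrodes report the plant's electrical conductivity; a rise above
# a resting baseline is treated as the light-triggered calcium signal
# and is mapped to a drive command for the motorized base.

BASELINE = 1.0    # assumed resting conductivity (arbitrary units)
THRESHOLD = 0.2   # assumed rise that counts as a light response

def wheel_command(conductivity: float) -> str:
    """Map one conductivity reading to a drive command."""
    if conductivity - BASELINE > THRESHOLD:
        return "drive_toward_light"
    return "stay"

# Example readings: a resting plant vs. one responding to a sunlit window
print(wheel_command(1.05))  # small fluctuation: stay put
print(wheel_command(1.40))  # strong signal: drive toward the light
```

In a real device this comparison would run continuously on the microprocessor, with readings smoothed over time to filter out electrical noise from the plant.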

Elowan subverts the way people generally relate to plants. Rather than using technology to shape or control the growth of plants to our own desires, as we do in gardens, greenhouses, and farms, this technology empowers the plant.

More recently, colleagues and I injected a Spathiphyllum (peace lily) plant with tiny fluorescent nanosensors that sit harmlessly in the space between the cells of the leaves. We call the project Argus, after the all-seeing guardian of Greek myth. These sensors stop glowing when the plant takes up water contaminated with lead. We envisioned a plant offering a warning before you drink your next glass from the tap.

Prototypes like Argus and Elowan hint at a symbiotic future and perhaps a completely different way to relate to the verdure around us. For me it’s an antidote to the self-importance of human-centered design and a nod in the direction we humans ultimately must go.

Harpreet Sareen is an Associate Professor of Interaction and Media Design at Parsons School of Design where he directs the Synthetic Ecosystems Lab. His research reimagines human-nature relationships through hybrid technologies that integrate living organisms with digital systems. His work has earned prestigious awards including CHI Golden Mouse, Edison Gold, SXSW Interactive Innovation, MIT Technology Review Under 35 Innovator, and Fast Company World Changing Ideas. He has exhibited internationally at Ars Electronica Festival, Somerset House, CID Grand Hornu, and MIT Museum.
