
Plant thinking uses artificial materials to simulate the way plants interact with the outside world, providing an immersive and interactive experience by revealing the inner workings of plants through sound and motion.

Year

2019/2020

Type

Interaction design

Duration

5 months

Inspiration

Wet: there is enough water, and the plants stay healthy.

Dry: there is little water, and the cells change to absorb more water.

I have always believed that plants are intelligent and have developed a strong ability to adapt to their environment over time. By observing how plants behave differently when absorbing and losing water, I realised that they adapt spontaneously to their surroundings and are far more adaptable than I had imagined.


Research question

How can a plant's ability to adapt be presented in a tangible way that lets the audience experience a sense of resonance with the plant?

Experience in three ways

Using three inputs generated by the audience, the digital plant displays changes in real time as feedback, simulating the behaviour a living plant would present (a rough sketch of this control loop follows the list below).


Touch - Deformation

Light - Sound

Noise - Motion
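
To make these mappings concrete, here is a minimal Arduino-style sketch of one possible control loop. It assumes the touch input drives an air pump that inflates the TPU surface through the airway described in section 1; the pin numbers, sensor modules, and thresholds are my own illustrative choices, not documentation of the exhibited build.

```cpp
// One control loop handling the three audience inputs.
// Pins and the pump-driven inflation are illustrative assumptions.
const int TOUCH_PIN   = 2;   // capacitive touch module (digital)
const int LIGHT_PIN   = A0;  // light sensor (analog)
const int SOUND_PIN   = A1;  // sound sensor (analog)
const int PUMP_PIN    = 8;   // relay for the air pump inflating the TPU
const int SPEAKER_PIN = 9;   // piezo speaker

void setup() {
  pinMode(TOUCH_PIN, INPUT);
  pinMode(PUMP_PIN, OUTPUT);
  Serial.begin(9600);        // volume readings go to the visual software
}

void loop() {
  // Touch -> deformation: inflate while the surface is being touched.
  digitalWrite(PUMP_PIN, digitalRead(TOUCH_PIN));

  // Light -> sound: higher light intensity, higher pitch (see section 2).
  tone(SPEAKER_PIN, map(analogRead(LIGHT_PIN), 0, 1023, 220, 880), 40);

  // Noise -> motion: stream the volume level to the image (see section 3).
  Serial.println(analogRead(SOUND_PIN));

  delay(50);
}
```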

1. TOUCH AND DEFORMATION

Output: Deformation material experiment


The purpose of this experiment was to find a language of deformation that would give the audience obvious, organic variations they could easily associate with plants. I tried Tyvek, combinations of soft and hard deformations, and other options; in the end, I chose TPU as the deformation material.

Output: Deformation degree experiment

I then noticed that the size of the airway changes the amount of deformation, so through further experiments I determined the maximum deformation size.

Variation


Input: Touch


2. LIGHT AND SOUND

Input: Light

The light sensor measures the intensity of the incoming light, and each light-intensity level corresponds to a pitch.

Output: Sound


The system absorbs light and, depending on the illumination it receives, converts the light into music.
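
As a rough sketch of this light-to-pitch mapping, assuming a photoresistor on an analog input, a piezo speaker, and a note table of my own choosing (the project's actual scale and hardware are not documented here):

```cpp
// Light -> sound: each measured light level is quantised to a note.
// The sensor pin, note table, and timing are illustrative assumptions.
const int LIGHT_PIN   = A0;  // photoresistor on an analog input
const int SPEAKER_PIN = 9;   // piezo speaker

// A pentatonic-style note table (Hz) so successive readings sound musical.
const int NOTES[] = {262, 294, 330, 392, 440, 523, 587, 659};
const int NUM_NOTES = sizeof(NOTES) / sizeof(NOTES[0]);

void setup() {
  pinMode(SPEAKER_PIN, OUTPUT);
}

void loop() {
  int light = analogRead(LIGHT_PIN);               // 0..1023
  int index = map(light, 0, 1023, 0, NUM_NOTES - 1);
  tone(SPEAKER_PIN, NOTES[index], 200);            // play the note for 200 ms
  delay(250);
}
```

Quantising the reading to a fixed set of notes keeps the output sounding like music rather than a continuous sweep as the light changes.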


3. NOISE AND MOTION

Input: Noise

Output: Motion

Noise generated by visitors to the exhibition is converted into changes in the image: the sound sensor alters the image according to the volume level.
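
A possible sketch of this noise input, assuming an analog microphone module whose peak-to-peak amplitude is streamed over serial to the software that animates the image; the sampling window and baud rate are illustrative assumptions, not the exhibited setup.

```cpp
// Noise -> motion: measure the room's volume level and stream it over
// serial so the image-generating program can animate the digital plant.
const int SOUND_PIN = A1;            // analog microphone module
const unsigned long WINDOW_MS = 50;  // sampling window per reading

void setup() {
  Serial.begin(9600);
}

void loop() {
  // Peak-to-peak amplitude over the window approximates loudness.
  unsigned long start = millis();
  int minVal = 1023, maxVal = 0;
  while (millis() - start < WINDOW_MS) {
    int sample = analogRead(SOUND_PIN);
    if (sample < minVal) minVal = sample;
    if (sample > maxVal) maxVal = sample;
  }
  int volume = maxVal - minVal;      // 0..1023, larger = louder
  Serial.println(volume);            // the visuals read this each frame
}
```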


Fabrication and control
