Join us for a cup of tea and some Machine Learning with Vivent’s in-house expert Tommy Meacham

Published on April 30, 2021

Tommy, What is Machine Learning and how did you get into it?

I first dipped my toes into the world of learning algorithms at the University of Edinburgh. There I was taught the theory behind teaching a machine to learn. I left with a collection of models under my belt, and absolutely no way of using them in the real world! I decided to cut my teeth as a Backend Software Engineer to learn how all the pieces fit together. Some years into this journey, I read a blog post by Martin Fowler (somewhat of a personal hero of mine) describing the complexity of deploying Machine Learning algorithms.

“They are subject to change along three axes: the code itself, the model, and the data.”

It’s a daunting challenge, but it boils down to applying robust software engineering principles (Continuous Delivery, Automated Testing, and Version Control) to the world of ML. I transferred into a Data Science division and learnt first-hand about the challenges involved. I’m extremely excited to apply similar ideas here at Vivent, where both the Product and the Data Science research potential are so high!
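To make that concrete, here is a minimal, hypothetical sketch of what one of those principles (Automated Testing) could look like for a model: a pytest-style check that a deployment pipeline might run after every retrain. The training routine and data here are placeholders invented for the example, not Vivent’s actual code.

```python
# Hypothetical quality gate a CI pipeline could run before a retrained model
# is deployed. The data and training helper are stand-ins for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split


def train_model(features: np.ndarray, labels: np.ndarray) -> LogisticRegression:
    """Stand-in for the real training routine."""
    return LogisticRegression(max_iter=1000).fit(features, labels)


def test_model_meets_accuracy_threshold():
    # Synthetic data stands in for a versioned validation dataset.
    rng = np.random.default_rng(42)
    features = rng.normal(size=(200, 5))
    labels = (features[:, 0] + features[:, 1] > 0).astype(int)

    train_x, val_x, train_y, val_y = train_test_split(
        features, labels, test_size=0.25, random_state=0
    )
    model = train_model(train_x, train_y)

    # The pipeline only promotes the retrained model if it clears the bar.
    assert model.score(val_x, val_y) >= 0.8
```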

Machine Learning lets us identify patterns in our data and apply them to future examples. For instance, once a model has learned the pattern associated with a nutrient deficiency, it can identify that deficiency in unseen plants. In practice, there are many complexities we need to consider while building these new models; it takes time and patience.
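As a purely illustrative sketch (the features and data below are invented, not our real measurements), training a classifier on labelled plant-signal summaries and then applying it to an unseen plant might look like this:

```python
# Illustrative only: invented per-plant signal features and labels.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Pretend each row summarises one plant's recording
# (e.g. mean amplitude, signal variance, dominant frequency).
healthy = rng.normal(loc=[1.0, 0.5, 2.0], scale=0.2, size=(100, 3))
deficient = rng.normal(loc=[0.6, 0.9, 1.4], scale=0.2, size=(100, 3))

features = np.vstack([healthy, deficient])
labels = np.array([0] * 100 + [1] * 100)  # 0 = healthy, 1 = nutrient deficient

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(features, labels)

# Apply the learned pattern to a recording from an unseen plant.
new_plant = np.array([[0.65, 0.85, 1.5]])
print(model.predict(new_plant))        # e.g. [1] -> flagged as deficient
print(model.predict_proba(new_plant))  # class probabilities
```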

 

Who left the lights on? 

We know that a plant signal is a reaction to an external stimulus, so by using ML to learn what an individual plant’s response looks like, we can automatically tag portions of a recording. This opens the door to insights such as tagging irrigation and “lights on/off” moments, and identifying soil-borne pathogens that invisibly attack the plant from under the ground. By learning to “listen” to what plants are saying, we transform complicated line graphs into an information-rich view of what is happening to the plant right now.
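One hedged sketch of that idea: slice the recording into fixed-length windows, summarise each window with a few simple features, and let a trained classifier put a tag on every window. The feature choices and label names below are assumptions for illustration, not our production pipeline.

```python
# Sketch of window-by-window tagging of a 1-D recording. The classifier is
# assumed to have been trained elsewhere on labelled windows.
import numpy as np


def window_features(signal: np.ndarray, window_size: int) -> np.ndarray:
    """Split a 1-D signal into windows and compute per-window summary features."""
    n_windows = len(signal) // window_size
    windows = signal[: n_windows * window_size].reshape(n_windows, window_size)
    return np.column_stack([
        windows.mean(axis=1),                   # average level in the window
        windows.std(axis=1),                    # how "active" the window is
        np.abs(np.diff(windows)).mean(axis=1),  # rate of change
    ])


def tag_recording(signal, window_size, classifier, label_names):
    """Return one human-readable tag per window, e.g. 'lights on', 'irrigation'."""
    feats = window_features(signal, window_size)
    return [label_names[i] for i in classifier.predict(feats)]
```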

 

How smooth is your plant’s ride? 

It’s common knowledge that a consistent growing environment yields the healthiest plants. A cold day is compensated for by increasing the greenhouse temperature. Irrigation schedules are fine-tuned and run like clockwork. All of this optimization happens without a direct data stream from the plants themselves. It’s like going to the doctor, but the only questions they ask you are about your lifestyle and diet. Then you come in with a Fitbit, and suddenly the doctor has a clear view of your heart rate over the past weeks to inform the diagnosis. By measuring how consistent a plant signal has been over the past X days, we can spot events that cause the plants to deviate from their daily rhythm, and suggest changes to those events to make it a smoother ride.
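As a rough illustration of that idea (with synthetic data, an invented disturbance, and an arbitrary threshold, not our actual method), you could build a typical daily profile from the previous days and flag the hours where today’s signal strays too far from it:

```python
# Rough illustration only: synthetic hourly data and arbitrary choices throughout.
import numpy as np
import pandas as pd

# Fake hourly signal over 8 days: a daily rhythm plus noise, with a disturbance
# injected into the last few hours of the final day.
hours = pd.date_range("2021-04-01", periods=8 * 24, freq="h")
rhythm = np.sin(2 * np.pi * (hours.hour.to_numpy() / 24))
signal = pd.Series(rhythm + np.random.default_rng(1).normal(0, 0.1, len(hours)),
                   index=hours)
signal.iloc[-6:] += 2.0  # the "event" that knocks the plant off its rhythm

history = signal.iloc[:-24]   # the previous days
today = signal.iloc[-24:]     # the day we want to check

# Typical value at each hour of the day, learned from the history.
baseline = history.groupby(history.index.hour).mean()
residual_std = (history - baseline.reindex(history.index.hour).to_numpy()).std()

# Hours where today's behaviour strays far from the learned daily rhythm.
deviation = today - baseline.reindex(today.index.hour).to_numpy()
mask = deviation.abs() > 3 * residual_std
print(today.index[mask.to_numpy()])
```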

 

More tea and pass me the CRISPR, thanks

Personally, my favourite “use case” for plant information flow is how it applies to growing crops that are resistant to, for example, cold weather, pests, or water stress.

Right now, if a gene sequence is known to increase crop resistance to cold weather, CRISPR is used to modify the genes of the plant to make it more resilient to cold. What if we want to modify plants for a new type of stressor, say soil salinity? We don’t know the specific “soil salinity” resistant gene sequence, so the process looks like an A/B test, where 1000 different strains grow together and are exposed to the same stress. If the plants start to wilt, their genes are excluded from further analysis. Imagine how much quicker this process could be if, instead of waiting for them to wilt, you could quickly rule out plants by listening to their electrical signal!
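Purely as a thought experiment, that “rule plants out early” step might look something like the sketch below: a model trained to predict a later stress response from early signal features scores every strain, and only the most promising candidates stay in the trial. Everything here (the model, the features, the numbers) is invented for illustration.

```python
# Thought experiment only: invented features, model, and numbers.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(7)

# Pretend we already trained a model that maps early-recording signal features
# to the probability that a plant later shows a visible stress response.
train_x = rng.normal(size=(500, 4))
train_y = (train_x[:, 0] - train_x[:, 2] > 0).astype(int)
stress_model = LogisticRegression(max_iter=1000).fit(train_x, train_y)

# Early-recording features for 1000 candidate strains in the trial.
strains = pd.DataFrame(rng.normal(size=(1000, 4)),
                       index=[f"strain_{i:04d}" for i in range(1000)])

# Score every strain early and keep only the most resilient-looking candidates,
# instead of waiting for the rest to wilt.
p_stress = stress_model.predict_proba(strains.to_numpy())[:, 1]
shortlist = strains.index[np.argsort(p_stress)[:50]]
print(list(shortlist[:10]))
```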

 

Speak to me!

And finally, from a pure Data Science point of view, I’m extremely interested in gaining biological insight into the language of plants using Deep Learning methods. The field of Natural Language Processing focuses mostly on learning human language. Can we extend the field to include plant language as well?

 

Related article: https://www.phytlsigns.com/detecting-venus-flytrap-action-potentials-with-phytlsigns/

Watch our new video about Vivent and our electrical signaling monitor here – PhytlSigns

 


About the Author: Jose Ojeda