AgTech360
Engineering Autonomy: John Deere’s AI-Powered Tractors at Work
Aaron Wells, Director of Engineering & Autonomy at Blue River Technology, a John Deere company, discusses the evolution of agricultural machinery—from traditional horsepower to advanced hardware and artificial intelligence. He explains how John Deere’s next-generation perception system, known as the “eyes and brain” of autonomous tractors, is transforming modern farming. This episode explores how AI, machine vision, and retrofit kits are making autonomy in agriculture possible, the engineering challenges of scaling autonomous tractors, and the future of precision ag technology for large-scale farming.
Speaker 1:
Ag Tech 360 discusses breakthrough technologies that are impacting growers, businesses, and consumers. Hear from industry and academic experts about what's on the horizon.
Adrian Percy:
So welcome to Ag Tech 360. Today we're diving into the evolution of big ag machinery, from horsepower to hardware and AI. Joining me is Aaron Wells, Director of Engineering and Autonomy at Blue River Technology, which is a John Deere company. Aaron leads the teams behind Deere's next generation perception system, the eyes and brains that transform traditional tractors into fully autonomous field vehicles. We'll explore how advanced sensors, artificial intelligence, and retrofit kits are reshaping large scale farming. What it takes to ensure that autonomous tractors operate safely and effectively and what the future holds for autonomy in agriculture. So welcome, Aaron. Thanks for joining me.
Aaron Wells:
Hey, awesome to be here. Thanks for having me.
Adrian Percy:
Absolutely. So let's start with your journey. What was your path that shaped your approach to engineering autonomy at scale?
Aaron Wells:
Totally. No, that's a great question. So my background originally was military aviation. That's sort of what I did coming out of college. And the reason I guess that we're not talking about airplanes today, or at least I don't think we're going to be talking about airplanes today, is I really fell in love with the problem space of agriculture. It's hard to find a more compelling problem than using advanced technology to go feed the world. That's a really neat problem statement to get up every day and go contribute towards. So leaving university, leaving the military behind and getting more exposed to this domain space was really exciting. So I've spent the last 15 years working in the ag tech side, largely on the John Deere side. Anytime that we were interested in putting camera or sensing on a vehicle, those are programs and projects that I tended to gravitate towards.
So for those that maybe know the John Deere portfolio really well, if you think about products like Vision Row Sense, the ability to take a camera and see down a row, that's a great example. And really that all culminated with the See & Spray technology: taking a whole bunch of cameras, equipping those on a self-propelled sprayer, and the idea that we can spray weeds, not crop. So what bridged that to autonomy is, right around that time, we joined forces with Blue River Technology and we started thinking about the problems from an AI mindset: instead of trying to describe what a weed looks like and what the leaf shape and size is, let's train an AI classifier.
And the performance gains from that approach just made it such a compelling option. So coming out of the See & Spray era, thinking about what's next, autonomy was just such a vibrant opportunity to go participate in, both from a customer needs side and an applicability of what we did well and what the machines were ready for. So I've been working on the autonomy side for seven years now, and that has brought us to where we are today.
Adrian Percy:
I mean, everyone knows John Deere, huge legacy of building powerful machines. And now from the sounds of you are laying in advanced technology sensors, cameras and AI. And so as we shift from this more mechanical approach to a combined mechanical and digital approach, what does autonomy at scale truly mean for row crop agriculture?
Aaron Wells:
Yeah, absolutely. So if you think about autonomy at scale from first principles and you want to have a scaled impact, you want to start with something that's at scale. So if you think about some of the larger row crop vehicles in the John Deere portfolio, I'm talking about some of the eight or nine series tractors, tractors that are very at home in places like Iowa and Nebraska, some of the I-states in the Midwest where corn and soy are very prevalent. What we wanted to do is explore the opportunities to retrofit as many of those as possible to get you autonomy in the seat that you're in today. So if you have that model year 23 tractor, we wanted to be able to go back, add the additional controllers, harnesses, some of the smaller bits and bobs, such that we can plop the perception system right on top and enable autonomy for you where you're at today, without having to make a big leap into a different or unfamiliar machine.
So that's been the core of how we've thought about autonomy: hey, let's start with something that's at scale, like our fleet today, and let's leverage a lot of the technology that we already have. So if you think about, again, some of the things that John Deere's been on the journey of for the last 20 years, we can think about things like auto steer through a product called AutoTrac. That journey started 20 years ago, and we've been refining our ability to drive really precise lines in fields ever since. We've been adding functionality where now we can do the turns, we can plan the field. So how do we build on top of that the capability of autonomy, leveraging all the off-board communications, the planning, the digital components that the customers are already familiar with today?
Adrian Percy:
And so what have been the biggest challenges in retrofitting existing systems and getting them to work in this way?
Aaron Wells:
Yeah, that's my favorite question. Let's get to the hard parts. I'm an engineer by trade, right? This is the good question. I think when customers have talked to us, the reason we went on the autonomy journey is they have told us that labor pains, labor challenges, labor availability are so crucial, especially during planting and harvest. There are many ways that we could go from an engineering or product side to solve that problem. And quite frankly, I don't know the best way. I'm not out farming several thousand acres. So creating the connection to customers is the way to solve that hard problem of figuring out the right approach. So maybe an example and a great story. I think I said we started this journey about seven years ago. Seven years ago, we made our first prototype. It was a unit of one that fit maybe in a backpack or a suitcase.
We took that first prototype, we packaged it up in that backpack or suitcase. We flew that from our office in Santa Clara to Minnesota. We bolted that to the front of a customer machine and we said, "Hey, tell me what you think. Does this work, is this going to solve the problem that you told us about last year? Do you think that this is the right approach? You told us that fall tillage is really hard to get done in Minnesota because you're trying to get the crop out of the ground before it freezes. Is this a viable approach? Yes, no, give us that feedback."
And since then, we've done it every single season since. So spring and fall, we're out there with the latest and greatest saying, "Hey, tell us what we got right. Tell us what we got wrong." Let's be a partner in this journey and really develop this product together. So that culminated, we're now actually on our second generation, we call it, of perception capabilities that we launched at the Consumer Electronics Show earlier this year. And it's all from that feedback. From the understanding that we don't know better than our customers. So we need that direct link and that information highway maybe to have those conversations.
Adrian Percy:
So in regards to that next generation perception system that you've just referenced, so what does it include and can you kind of give us a little bit of a teaser about the additional capabilities or how it extends the autonomous capabilities in the machines that you're referencing?
Aaron Wells:
One of the things that customers told us, in maybe the story I just told you, is like, hey, meet us where we're at today. There are a lot of tools that you use to do tillage, lots of combinations of tractors and tillage tools, and we want as many of those supported as possible. I'll tell you everything I know right out of the gate, there's no secret: I don't want to have to change my tillage practice, which is where we started autonomy, in fall and spring tillage. I don't want to have to change that practice to be able to take advantage of some of these autonomous capabilities. So our generation one perception system was very focused on an 8R, so think like a medium-large size tractor, with two tillage tools. Our second perception system enables four different tractor types.
So all the large row crop tractors with a variety of families of tillage tools of all different shapes and sizes. If you have that 43-foot field cultivator, great, there's a path to get you autonomy on that field cultivator. How we did that was through moving all the cameras up to the roof. So if you see an autonomous tractor in the field, you actually have to squint to tell, because there aren't a lot of extra bits and bobs. But there are 16 cameras all along sort of the eyebrow of the cab, four on every side. And the really neat thing is we do what's called a camera array, which is where, if you look at them, they're all kind of cross-eyed in different ways. And the reason for that is so that we can get great depth at different ranges. And the reason we did that is because when we're standing in the field, the customer says, "Hey, I want to use my 23-foot chisel plow and my 60-foot field cultivator."
Those are two very different problems that we need to solve. So we need that flexible vision range to be able to do that. The decision-making is all on board. We have two NVIDIA-powered GPUs that sit towards the rear of the vehicle, and all those decisions are made in real time to keep the vehicle safe and productive.
Speaker 1:
The North Carolina Plant Sciences Initiative impacts lives through innovative applications and discoveries. By leveraging cutting-edge research and technology, we address global challenges related to agriculture, sustainability, and human health.
Adrian Percy:
What customer needs do you feel you are addressing with these autonomous vehicles?
Aaron Wells:
We have the same method of getting customer needs as we had back at the origins of John Deere: we go out, we get our boots dirty, and then we ask customers how it's going. When you do that today, what they tell you is that the machines are big, the machines are productive. That's great. The hardest part is getting somebody to sit in the seat and get the job done on time. Our operators, our customers today, are on average 56 years old. They're putting in 12 to 18 hours during planting and harvest times, right? Those pinch points during the year where there's just nobody around.
So if we can make that vehicle autonomous overnight for you, so it never stops working, never needs to take a break, never needs to take lunch, never calls in sick; if we can do that job so you can focus on a more valuable part of your operation, whether that's overseeing harvest while we do tillage, that's what we want to go do for you. We want to focus on the jobs that you can't get done but that agronomically really impact your bottom line, and continue to drive focus there.
Adrian Percy:
What are the emerging technologies that you personally are most excited about? Whether they're coming from Deere or not, I mean those that haven't yet reached commercial scale or actually made it to market, but that you feel could be game-changers in the row crop farming industry as we move forward.
Aaron Wells:
I think I come from a company of AI enthusiasts that are very pragmatic about finding applications for that. An example is the machine learning model that runs on those NVIDIA GPUs that makes, I think I did the math, about 4 million decisions every single day. That model is based on the same sort of transformer-based approach that powers ChatGPT today. That's one place where we're looking where the industry is going, and we're saying, "Hey, does this tangibly impact farmers?" It doesn't have to be splashy, it has to be pragmatic. So certainly we are deeply watching like, hey, what things can we do and run real time on the vehicle to impact the performance to solve something practical for the farmer? That's maybe one. And the second, frankly, is why tractors? Why start with tractors at all? And it's because outside of all the great features of a tractor, they have refrigerators.
They have heated and cooled seats. They've got some of the best guidance in the world, sub-inch accuracy. It's incredible. But maybe the most exciting thing about a tractor in general is the hitch. You can use this incredible power unit to do such varied work around the farm. So again, from us going out talking to customers, what we're always asking is, "Hey, what's next? What's that next pain point for you? You have this autonomous tractor. Let's work together to figure out what that portfolio of options look like to keep reducing the pain points when you can't find somebody to sit in the seat." So I think those two things are what has me really excited every week.
Adrian Percy:
And are there other areas where you feel autonomy will ultimately play a role in making agriculture more efficient or more sustainable?
Aaron Wells:
Yeah, for sure. Again, I think back to the tractor and I think back to the other jobs that tractor can do; that's certainly a great place to start. But one of the elements that we came up with when we thought about our generation two perception system is that we can scale it to other vehicle platforms within the John Deere umbrella. So we're using very similar technology in construction. We're using very similar technology in commercial mowing, in orchard autonomy. These are vehicles that have different problems to solve, but it's the same labor need, the same labor scarcity problem, that they're trying to solve in those industries, and we could leverage the same technology. Same thing for agriculture.
As I think through the different production steps, from preparing the soil to planting, to harvesting, to moving grain, wherever there's the opportunity and customers are raising their hand saying, hey, once again, I'm having trouble at this step, this is something that you could really help us with, that's going to be the next step on our list. The idea of making this a scalable architecture of camera arrays that we can shrink or expand and sort of plop on different vehicles is the way we're going to get it done.
Adrian Percy:
Aaron, I wanted to say a big thank you for giving us some insights into what's going on at Blue River and John Deere, of course, and some really inspiring thoughts for the future. Thank you.
Speaker 1:
Ag Tech 360 shares relevant news and breakthroughs with audiences across the globe. Stay connected and join the conversation by following NCPSI on social media.