These next ten years are going to be packed with technological improvements bringing us closer to a utopian self-driving future. We are going to see big announcements from large and small players alike. In this article, I discuss the automotive technologies that I’m most looking forward to in the next 10 years. Feel free to comment below on the technologies you’re anticipating most.
We all like to imagine some not-so-distant utopia where we’re chauffeured around the world while we accomplish our everyday tasks. Automakers have been hinting at a revolution in the form and function of vehicles for years, and have hired many talented designers to give us something to hang on our bedroom wall:
So you’ve got the basics down from Part 1. Now you want to know which software to learn for the specific role you want at the autonomous vehicle company you’re pursuing. As we mentioned previously, those roles are plentiful, and there are dozens of tools available for each task in any given position.
So, you want to be an Autonomous Vehicle Engineer? Or maybe you already are one, looking to see which other skills to pick up? This post is intended to provide an overview of the various software tools that engineers use to build autonomous vehicles. Depending on exactly what you want to do in the world of autonomous vehicles, some of these may be more pertinent than others. We will not dive too deep into any one topic, but rather try to take a holistic view of the autonomous vehicle software landscape.
Congratulations on making it through the series! The last thing we’ll discuss is what it takes to make all this a reality. There’s lots of work still to be done to guarantee the safety of autonomous vehicles, so if you’re working in this field, good luck!
Part 5: What it takes to make self-driving cars objectively safe
In our opinion, there are a few things that need to happen before we can objectively call autonomous vehicles safe:
So, the car has its plan, and we’re pretty confident in that plan based on Part 2 and Part 3. But it’s still just a plan; the vehicle now needs to execute it. We call this the “Act” phase, where “Act” can be short for “Action” or “Actuation.” The car needs to give commands to each actuator, most commonly the brake, accelerator, and steering box. However, this can also include other actions like changing the suspension settings in real time (active or semi-active suspension) or applying (vectoring) more brake or acceleration to one wheel over another. It can even include actuation modes we haven’t thought of yet, which become possible precisely because a robot, not a person, is driving the vehicle.
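To make the “Act” phase concrete, here is a minimal sketch of what a per-tick actuator command might look like. The field names, units, and limits below are illustrative assumptions, not any real vehicle’s or vendor’s API; the one real idea shown is that raw planner outputs get clamped to the actuators’ physical limits before being sent.

```python
from dataclasses import dataclass

@dataclass
class ActuationCommand:
    """One tick of commands from the planner to the actuators (hypothetical fields)."""
    steering_angle_rad: float  # positive = left, measured at the steering box
    throttle_pct: float        # 0.0 (coast) .. 1.0 (full accelerator)
    brake_pct: float           # 0.0 (released) .. 1.0 (full braking)

def clamp(value: float, lo: float, hi: float) -> float:
    return max(lo, min(hi, value))

def sanitize(cmd: ActuationCommand, max_steer_rad: float = 0.61) -> ActuationCommand:
    """Clamp a raw planner command into the actuators' physical limits."""
    return ActuationCommand(
        steering_angle_rad=clamp(cmd.steering_angle_rad, -max_steer_rad, max_steer_rad),
        throttle_pct=clamp(cmd.throttle_pct, 0.0, 1.0),
        brake_pct=clamp(cmd.brake_pct, 0.0, 1.0),
    )
```

A real drive-by-wire interface would add rate limits, watchdogs, and redundancy checks on top of this, but the clamp-before-send pattern is a common first line of defense.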
Some of these crazy actuation modes might include turning each wheel independently, or leaning the wheel along with turning it. It might include an actuator that controls the angle of a trailer that follows the vehicle. It might be self-balancing a vehicle by moving a ballast. It could be anything! This concept can lead to futuristic vehicle designs that can still navigate the road safely, thanks to the diligence the engineers performed in the sensing, planning, and acting software development stages:
So now that you’ve read Part 2, you should feel OK about how well an autonomous vehicle can sense both the world around it and itself. What does the car now do with this information? Well, it thinks about it and spits out some command, usually in the form of a steering, braking, and acceleration amount. But before it gets to brake, acceleration, and steering outputs, the vehicle must first plan its path or route. This is called “Path Planning,” and it requires all the information from the sensing stack to get right.
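As a toy illustration of the planning step, here is a breadth-first search over a 2D occupancy grid. This is a deliberate simplification: production planners reason about lane geometry, vehicle dynamics, and other road users, but the core idea is the same one described above — turn sensed world state into a drivable route before any steering or braking commands are computed.

```python
from collections import deque

def plan_path(grid, start, goal):
    """Shortest path by BFS on an occupancy grid (0 = free cell, 1 = blocked).

    Returns the list of (row, col) cells from start to goal, or None if the
    goal is unreachable. A toy stand-in for a real path planner.
    """
    rows, cols = len(grid), len(grid[0])
    frontier = deque([start])
    came_from = {start: None}  # also serves as the visited set
    while frontier:
        cell = frontier.popleft()
        if cell == goal:
            # Walk the parent links backwards to reconstruct the route.
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in came_from):
                came_from[(nr, nc)] = cell
                frontier.append((nr, nc))
    return None  # no route found
```

Once a route like this exists, the “Act” phase can convert each segment into steering, brake, and acceleration commands.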
There’s an ongoing battle between LiDAR, Automotive Radar, and Cameras (and a few others too) for the title of the self-driving car’s “eyes.”
But self-driving cars do more than just “see” the world. The cars are also equipped with onboard sensors that tell the vehicle more about the surrounding world and itself. These sensors report how fast the car is moving; the G-force acceleration experienced by the vehicle in the forward-back, side-to-side, and up-and-down directions; the current steering angle; and lots more. It is the combination of these perception systems (camera, LiDAR, Radar) and sensor systems (GPS, IMU, wheel speed, etc.) that makes up the inputs to the “sense” block of the self-driving car AV stack.
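A tiny example of how such sensors get combined: the complementary-filter sketch below blends integrated IMU acceleration (good at tracking fast changes) with wheel-speed readings (good at correcting long-term drift) into one speed estimate. The blend weight `alpha` and time step `dt` are illustrative assumptions; real AV stacks use far richer estimators (e.g. Kalman filters over GPS + IMU + odometry), but the idea of fusing several imperfect sensors into a single state estimate is the same.

```python
def fuse_speed(wheel_speed_mps, imu_accel_mps2, dt=0.01, alpha=0.98):
    """Complementary filter: blend dead-reckoned IMU speed with wheel speed.

    wheel_speed_mps: list of wheel-speed measurements (m/s), one per tick
    imu_accel_mps2:  list of forward accelerations (m/s^2), one per tick
    Returns a list of fused speed estimates, one per tick.
    """
    estimate = wheel_speed_mps[0]  # initialize from the first wheel reading
    estimates = [estimate]
    for wheel, accel in zip(wheel_speed_mps[1:], imu_accel_mps2[1:]):
        predicted = estimate + accel * dt                    # dead-reckon via IMU
        estimate = alpha * predicted + (1 - alpha) * wheel   # correct via wheels
        estimates.append(estimate)
    return estimates
```

In practice each sensor also carries a noise model and a timestamp, and the fusion runs over the full vehicle state (position, heading, velocity), not just speed.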
Would you let an unmanned Uber take you to the airport? Or a driverless school bus drive your kids to elementary school? You’re probably wondering how we can be sure that the cars of the future are safe, especially if there’s a robot behind the wheel. This series will dissect how the designers and engineers building self-driving cars can objectively ensure that autonomous vehicles make the right decisions at the right time to save lives. The goals of the mobility revolution are to reduce or eliminate traffic fatalities, to decongest cities and roadways, and to ultimately make better use of our natural resources.
This series is intended for all readers, and I apologize to the engineer readers for the high-level nature of the technical content. Enjoy!