Ride in NVIDIA’s Self-Driving Car

Today, in a special edition of DRIVE Labs, we're taking you on an autonomous drive and showing you the pieces of software we're building, running together in the car, enabling the vehicle to drive itself. Our pilot is Dennis. I'm your copilot. Let's go.

We are now on the road, and we'll be engaging autonomy once we get on the highway. But before we do that, I want to show you our perception functionality already in action in the car. Perception is basically what enables the car to see: we take in raw sensor data and translate it into a semantic understanding of the world, of the scene we're in.

So take a look at that happening on our front camera. We have DriveNet detecting obstacles, the bounding boxes around the cars. We have WaitNet detecting the intersection, the yellow box around everything. WaitNet is also detecting traffic lights and traffic signs, and LightNet is correctly classifying the traffic light state as red. We also have sign classification going on using SignNet. At the same time, DriveNet is detecting pedestrians, in the cyan bounding boxes on the far side of the intersection. We also have OpenRoadNet tracing out the free space around obstacles in the scene. On top of that, we have object tracking from frame to frame; you can see the track IDs on top of each bounding box. We also have our camera-based DNN distance estimation running, so you see the distance in meters displayed at the bottom of each box. ClearSightNet is also running in the background, assessing whether and how well the cameras can see in our four-camera surround perception setup on our embedded AGX platform. All of this rich perception functionality is what our planning and control software will use to execute the autonomous driving maneuvers you're about to see.

We're now getting onto the highway on-ramp and entering the coverage area of the high-definition map we're going to use today for the car to create the route plan we'll follow. Basically, the car will localize itself on the map and create a lane plan that tells us when it needs to stay in the lane, when it needs to make a lane change to stay on the route, and when it needs to take a highway interchange.

The second thing that's about to happen is that we're going to transition out of human-driven mode, driven by Dennis, into autonomous machine-driving mode, where the car is going to drive us. Taking a look at the top right of our screen, we see adaptive cruise control (ACC) and Lane Keep (LK). When they're both off, Dennis is driving. When they come on, the car will be driving us. So here we go. Taking a look at the screen: Lane Keep is now on. ACC is now on. We're driving fully autonomously. Dennis's hands are off the wheel, but staying close for safety reasons, and we are officially starting our autonomous drive.

Okay, we are now in full autonomy. The car is keeping us in the lane. Let's take a look at how that is happening. That thick green center path that you see is the Path Perception Ensemble, from DRIVE Labs episode one. It is computing not just the center path and the edges of our lane, but also the center paths and edges of the left and right adjacent lanes. We visualize those with different colors: green is our ego lane, the left adjacent lane is red, and the right adjacent lane is blue.

Next, we need to determine which of the obstacles belongs in which of these lanes. The way we do that: we have the bounding-box detections from DriveNet and the free-space boundary detections from OpenRoadNet. Where those two meet is what is called the object fence, and that fence tells us where the object is in space. We combine this object-fence information with the lane geometry information from the Path Perception Ensemble, and that enables us to do obstacle-to-lane assignment. Each car's fence takes on the color of its assigned lane.

We are now approaching our first autonomous maneuver. The car is letting us know that, based on our route plan, we need to make a lane change to the right. Here we go. The car performs a surround radar and camera lane-change safety check, and we move from Lane Keep mode into Speed Adaptation, to figure out the speed profile for getting into the next lane, and then into Lane Change mode, moving from the center path of the current lane onto the center path of the target lane. And we have now completed that lane change.

Okay, we're now getting ready for our second set of autonomous driving maneuvers: going straight into the highway interchange onto 280. Now, although we know this is coming up based on localization to the HD map, we will not be using any clues from the map to actually navigate this maneuver. We are handling it using the Path Perception Ensemble only. Lane Handling mode on the screen is Split, because this is a lane-split interchange, and the challenge is for the Path Perception Ensemble to maintain confidence throughout the interchange, because it has both high curvature and high grade. But take a look at the Path Perception Ensemble: it's still green, meaning it has high confidence that it's navigating this difficult curved, graded highway interchange correctly.

We are now coming up on our next set of autonomous driving maneuvers, to get onto Highway 87. The first thing we need to do is another lane change to the right, to get into the correct exit lane; then handle another lane-split highway interchange; followed by another lane change under time pressure. So here we go. First lane change: you see Lane Handling mode go into Speed Adaptation, finding the lateral path into the next lane, and the Ensemble going from red to green as it lands in the target lane, gaining confidence that it has found the lane.

We have just handled another lane merge, and we are going to have some grade-profile changes in the road coming up. Right there: this is why it's important to have calibration continuously running in the car. We see the Lane Handling mode move into Split mode. The car needs to correctly take that lane split to the right so it doesn't unintentionally exit the route. The Path Perception Ensemble is now navigating another high-curvature interchange; we see the center path staying green, and we are moving right into that third maneuver.

This is a lane change under time pressure. We don't have a lot of time here to move from the right lane into the adjacent left lane in order not to exit our planned route incorrectly. So here we go: we're switching from Lane Keep mode into Speed Adaptation, into Lane Change mode, and landing in the center of the target lane to complete that set of maneuvers. And we are now going to complete the rest of our autonomous route and head back to the garage.

And we're back. We hope you enjoyed our autonomous drive today and enjoyed seeing how our software is enabling the car to drive itself. For any questions, reach out to us through the comments section. Check out our other DRIVE Labs videos, and we'll see you next time.
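
The obstacle-to-lane assignment described in the drive (intersecting DriveNet bounding boxes with the OpenRoadNet free-space boundary to get an "object fence", then matching that fence point against lane geometry from the Path Perception Ensemble) could be sketched roughly as follows. This is an illustrative Python sketch only: the function names, the nearest-point fence approximation, and the polyline lane representation are assumptions for the example, not NVIDIA DRIVE APIs.

```python
import math

def object_fence_point(bbox, free_space_boundary):
    """Approximate the 'object fence': the point where the obstacle's
    bounding box meets the free-space boundary. Here we simply take the
    boundary point nearest to the bottom-center of the bounding box."""
    x_min, y_min, x_max, y_max = bbox
    anchor = ((x_min + x_max) / 2.0, y_max)  # bottom-center of the box
    return min(free_space_boundary, key=lambda p: math.dist(p, anchor))

def assign_to_lane(fence_pt, lanes):
    """Assign an obstacle to the lane whose center path passes closest to
    its fence point. `lanes` maps a lane name ('ego', 'left', 'right') to
    a polyline of (x, y) sample points; distance is measured point-wise
    against those samples as a simple approximation."""
    def dist_to_polyline(pt, polyline):
        return min(math.dist(pt, q) for q in polyline)
    return min(lanes, key=lambda name: dist_to_polyline(fence_pt, lanes[name]))
```

With straight vertical lane center paths at x = -3 (left), 0 (ego), and +3 (right), an obstacle whose fence lands near x = 3 would be assigned to the right adjacent lane, which is what colors its fence blue in the visualization.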

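The lane-change mode sequence narrated during the drive (Lane Keep, then Speed Adaptation to find the speed profile, then Lane Change, then back to Lane Keep once the car lands on the target lane's center path) can be viewed as a small state machine. The sketch below is hypothetical: the mode names and trigger flags are inferred from the narration alone and are not NVIDIA's implementation.

```python
from enum import Enum, auto

class Mode(Enum):
    LANE_KEEP = auto()
    SPEED_ADAPTATION = auto()
    LANE_CHANGE = auto()

def next_mode(mode, lane_change_requested, safety_check_ok,
              speed_profile_ready, in_target_lane_center):
    """One step of the lane-change mode sequence described in the drive."""
    if mode is Mode.LANE_KEEP:
        # Leave Lane Keep only after the surround radar/camera
        # lane-change safety check passes.
        if lane_change_requested and safety_check_ok:
            return Mode.SPEED_ADAPTATION
    elif mode is Mode.SPEED_ADAPTATION:
        # Speed profile found: start moving toward the target lane.
        if speed_profile_ready:
            return Mode.LANE_CHANGE
    elif mode is Mode.LANE_CHANGE:
        # Landed on the target lane's center path: resume Lane Keep.
        if in_target_lane_center:
            return Mode.LANE_KEEP
    return mode
```

Modeling the maneuver this way makes the safety gate explicit: the request alone never triggers a lane change; the safety check must pass first, matching the order of events in the narration.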
100 thoughts on “Ride in NVIDIA’s Self-Driving Car”

  1. nice 😀

  2. Thanks a lot for wasting my time with his crappy video!!

  3. Sometime in the future… Your car tells you that you’re drunk and won’t let you take the manual control. So you have to use the autonomous driving service which costs you $1 per mile using the software that is provided by NVIDIA. But you can’t cause you have an outstanding balance of $200 from last week by playing couple games on NVIDIA’s cloud. Then you apply for the NVIDIA gpu credit program to cover the bills. So you’ll be driving your car that is a lease using the credit from NVIDIA because of the beer you bought with your Apple credit card at the bar 😂
    And then Google will analyze your activity and give you ads in Gmail to sign up for the alcohol addiction program. The complete cashless future!

  4. goddamn it move the cursor it's annoying it's right in the centre…

  5. i thought nvidia was for gaming

    ok sorry

  6. Useless feature for me, but whatever… some people actually like driving cars with an automatic transmission too, so who knows, this might be of interest to that kind of people!

  7. And how much sensors/camera's did you use? And how much/big is your computer system in the trunk? That's vital info being left out

  8. I believe this video RTX ON Ray-tracing because 7:00 !

  9. I thought you were more advanced at driving than tesla though…

  10. Ok so the system scans and responds using cameras and sensors. so is anyone working on just interlinking automobiles driving in the same path. have stationary street devices that also communicate with traffic. have them synchronize and communicate with each other. all you need are sensors that identify the four corners of the car and one connected to the automobile computer. changing lanes and decelerating will be communicated to all cars and traffic will be controlled by the stationary devices. these cars would be able to drive mostly without visual scans.

  11. does your car see the other car lights ( brake light / turn ) ?

  12. I am super excited about the advances made by Nvidia in developing safe autonomous vehicle operation but… When it rains, when it snows, when the fog rolls in, glare hits the camera or dust crosses the road does the computer know hoe to deal with it? Personally I would be more comfortable if @ProHawkVision were integrated into the solution enabling vision which is superior to even Clark Kent! www.prohawkgroup.com

  13. (Translated from Hindi, expletives omitted) You NVIDIA guys never release a single decent update; my fps won't come up.

  14. What are you using to run the AI and how many fps does it process?

  15. RTX ON the hood 2:00

  16. funny how nVidia expects us to believe those cheap camera feeds are the actual sensors tracking the stuff on the screen.

  17. I feel like these are going to cause some accidents because of the nonhuman like behavior (i.e. slowing down to match speed to change lanes)

  18. Didn't Tesla ditch you guys?

  19. with all this shit on the Roof, your Destination could have at least been the moon…

  20. Stop endangering south bay traffic with driving this slow! Staring from 3:10, every cars on the road were passing you but you still driving in the middle of the road. Tbh, 80yo gramma drive better than your current AI.

  21. You need a pilot and a Co pilot on a nvidia self driving car lol

  22. Elon Musk have told you, using a LIDAR is a mistake.

  23. Dam she's excellent at explaining tech clearly

  24. CEO has to be at the back of the video filming. He likes publicity. 😀

  25. Needs a driver update every 30 seconds

  26. IT'S SO COOL YOU STARTING THIS BUZNEZ CUZ YOU ARE WAY BEHIND BUT AT LEAST YOU UNDERSTAND WHAT THE GAME IS

  27. only highway driving thats easy

  28. Show a video wherein Nvidia's car run over people.

  29. crab crab bs

  30. what GPU card i should install ? RTX 2080 with rayTracing ?

  31. why can't the traffic lights be this long in the uk

  32. Can I do that too If I put RTX 2080 Ti in my Toyota Prius ?

  33. so.. you're couple years behind tesla.. how is this a news?

  34. guess they forgot about tesla

  35. Ok… and how is this impressive? One car with a million sensors and a safety driver doing what 500 000 Tesla’s do every day. They use all the fancy words and visualization, but doing this on highways are really not impressive anymore. Show us the same capabilities in city driving and I’ll give you a standing ovation.

  36. I love how this algorithm works. I'm a senior studying computer science and my minor is in data science. Is there any school for autonomous driving algorithm for CS grad students? Thank you for this amazing work.

  37. Looks Like I'm using an ESP Hack in Any Games.

  38. Wait wut

  39. Tesla doing this for years

  40. yes yes. it also has the additional functionality of being able to cook food on your processor enclosure. Will never get in a production car anytime soon.

  41. woah, that's kind of a sucky graphics, for being nvidia

  42. Fake

  43. her voice started shaking a lot towards the end of the video beginning at around 6:20 she was scared as shit.

  44. A current Nissan Altima SL can do this without the on-ramp capability. Oooooh. Make it navigate without defined lines and barriers, then you'll have true innovation.

  45. The Department of Defense would like to know your location

  46. U should overclock Nvidia car to 1000hp.

  47. How's this different than tesla?

  48. Can Auto pilot change a flat tire or stops when gets pulled over ?

  49. but can it run crysis?

  50. Putting your life in the hands of computer algorithms and AI is way too big of a risk

  51. Where to buy and how much does it cost?

  52. it looks like minecraft hacks, esp

  53. brake thot

  54. no thanks.

  55. how does it handle a random dog running into the street

  56. its looks like esp cheat

  57. Can it Raytrace tho? And can I overclock it?

  58. Jensen everything just works!

  59. Great video and great programming, I have more trust in this system then what Tesla is cooking, although I like their cars. I also like how NVIDIA's CEO has been downgraded to car mechanic in the last scene of this video haha.

  60. she's so freaking CUTE

  61. Tesla is just years ahead of everyone

  62. But can it run minecraft?

  63. but does it turn on blinkers?

  64. Do you really want to be spam in a can

  65. what graphics card do i need to run this…

  66. Chances are NVIDIA is just going to license their product to car manufacturers..

  67. I will wait for AMD's cheap version

  68. OMG 20FSP!!! this is stupid!! You need upgrade procesor to Quantum 30-1-0 potęga 3

  69. this movie is 1080P??? WTF??? are u try to .. yes cya NVIDIA

  70. Good luck buying that and having the car's OS crash every few hours of operation after another "successful driver update". Especially as you are about to carmageddon some pedestrians.

  71. Some parts were omitted, my guess it didn't work perfectly. And this is just a show case event. Need some big improvements to go on.

  72. ESP Hack?….

  73. There are some things that a human brain can do better than technology…. Keep that in mind….

  74. Andrew Yang 2020

  75. Tesla: laughs in richness

  76. Can the system predict a child or dog chasing a ball into the road?

  77. Did you install the safety force technology on the car? Do you have statistics for it usage? Do you have some articles about the GAP between this technology and 5 level of autonomy?

  78. Wooohoo north san jose!!

  79. Tesla-Hahahahahhaha
    Thousands of teslas are soing these tasks very efficiently everyday….
    They had to make a complete video of such a simple task they are doing….

  80. Jah is driving it smh

  81. Very interesting video. But the system doesn't look very reliable to me.

  82. bull shit liers

  83. most of all , it was not me who liked black … … and i don't know why the german guys are with you … … and i know why tesla is not with you … ( elon thinks he's good ) … that's why tesla can't do it ( right ) … … you guys are good … but when it comes to car hardwear ( like camera ) ,,, you find your way out ^^ , ned

  84. Love the Jensen cameo. Thanks for creating this path.

  85. These people look like evil guys from a james bond movie, or an 80s edm music vid.

  86. ugly ass shit car with tons of cameras on top of it. Mobileye car is way better. It looks like a regular car not some nightmare. The vidoe game boxes are awful mobileye tech is way ahead of them.

  87. The yellow and green color together is not friendly for color blind。

  88. Bad… just work with high def maps

  89. I've seen self-drive cars demonstrated on nice, wide, well maintained, USA-type roads with bright, clearly-marked lanes. That is they are demonstrated in, what seems to me to be, idealistic conditions.

    This is nothing like the reality of the rural, muddy, poorly lane-marked, pot-holed, sometimes single track roads here in Devon, UK – often with confusingly marked junctions. How close are self-drive cars from being usable in not idealistic conditions?

  90. But can it run crysis?

  91. full self driving artical @t

  92. How can I contact your team?

  93. Great work guys

  94. Can you make fun for blind pupil who are blind I have 4pesent vision left

  95. Best car 🚘

  96. NVIDIA I LOVE YOU <3 😉 RAY TRACING <3

  97. 1. I don't see hands off from the driver.
    2. Only highway was demoed, which was easier than local.
    3. Too many hardware on top, ugly.
    Conclusion: Mobileye is far better….

  98. AMD: when I grow up I wanna be just like NVIDIA.

  99. I think these cars need much better cameras. look at how blurry 1:18 is. Analyzing the extra pixels requires more processing power, but I am sure NVIDIA would be happy to supply the market with additional parallel processing.
