Technical off-roading is pretty much impossible without a spotter. I’ve been off-roading for about two decades now, and even with all that experience, when I see a rock garden ahead of me, I need someone to step outside and use their hands, along with the words “driver” and “passenger” to tell me where and when to steer so I can not only traverse the obstacle, but also avoid vehicle damage/personal injury. But, as with many jobs these days, this “spotter” job is at risk of being taken over by robots.
I recently received a Facebook message from an old engineering colleague from my Chrysler days; he pointed out a YouTube video showing a Jeep patent for an off-road drone, and suggested I might find it interesting (i.e. worth writing about). It turns out, it’s his patent. Sneaky stuff there, friend. Sneaky stuff.


Still, I actually do find it interesting, and since it’s something a number of automakers are pursuing, I figured I’d talk about it a bit. It’s an off-road drone, though Stellantis is technically patenting a “METHOD TO CONTROL A VEHICLE AS A FUNCTION OF DATA FROM A VEHICLE-BASED DRONE.” GM’s patent application is a bit more readable — “DRONE-BASED OFFROAD VEHICLE ROUTING” — and Rivian’s is simply “Vehicle spotter drone.”
They’re similar in nature; they use inputs from an external source — cameras and sensors mounted to a drone — to help provide information to a driver or vehicle trying to navigate a tricky off-road obstacle like I’m doing below with real, human spotters:
Jeep’s Off-Road Drone Patent Application
Here’s a bit about Stellantis’ 2023 patent application:
In at least some implementations, the secondary navigation data includes information about one or more of the location of roads or paths along an intended path of travel of the vehicle, and the location of one or more obstacles relative to the location of the vehicle. In at least some implementations, the method also includes commanding the drone to fly away from the vehicle to an area of interest.
In at least some implementations, the method includes providing images or video from a camera of the drone to the vehicle. In at least some implementations, the images or video includes at least a portion of the vehicle and part of the area surrounding the vehicle. In at least some implementations, the method includes displaying the images or video on a screen in the vehicle.
In at least some implementations, the secondary navigation data is communicated with a control system of the vehicle and the control system determines a path of travel of the vehicle as a function of the secondary navigation data, and the control system operates the vehicle along the path of travel. In at least some implementations, the need for secondary navigation data is determined when the vehicle is within an area that has incomplete or no primary navigation data.
In at least some implementations, the drone includes a sensor and the secondary navigation data includes signals from the sensor. In at least some implementations, information from the sensor is used to determine the size and location of objects in an intended or optional path of travel of the vehicle. Other information like, but not limited to, the color, reflectivity and any movement of an object may also be determined from information from the sensor.
In at least some implementations, a method of controlling a vehicle includes using primary navigation data to at least in part control operation of the vehicle, determining a need for secondary navigation data within a sensing area of one or more vehicle sensors to supplement, in place of or in the absence of the primary navigation data, commanding a drone carried by the vehicle to depart from the vehicle, receiving at the vehicle secondary navigation data from the drone, and operating the vehicle at least in part as a function of the secondary navigation data.
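If you strip away the legalese, the claimed method is a pretty simple loop: drive on your normal (primary) nav data, notice when that data is missing or blocked, send the drone out, and fold whatever it sees back into the steering and speed decisions. Here’s a rough, hypothetical Python sketch of that flow; every class and function name in it is a placeholder I made up, not anything from Stellantis:

```python
# Minimal, hypothetical sketch of the claimed control loop: primary nav data,
# detect a coverage gap, deploy the drone, blend in its "secondary navigation
# data." All names here are placeholders, not a real vehicle or drone API.

from dataclasses import dataclass, field

@dataclass
class NavData:
    obstacles: list = field(default_factory=list)  # obstacle positions, vehicle-relative
    complete: bool = True                           # False if terrain is occluded or unmapped

@dataclass
class Drone:
    deployed: bool = False

    def launch(self):
        self.deployed = True
        print("Drone departs the vehicle")

    def scan(self, area):
        # Stand-in for the drone's camera/sensor feed of the area of interest.
        print(f"Drone scanning: {area}")
        return NavData(obstacles=["boulder just past the crest"])

def needs_secondary_data(primary):
    # The patent's trigger: primary data is incomplete or absent within the
    # area the vehicle's own sensors are supposed to cover.
    return not primary.complete

def control_step(primary, drone):
    """Return the obstacle list the path planner (or the driver) should work from."""
    if needs_secondary_data(primary):
        if not drone.deployed:
            drone.launch()  # "commanding a drone carried by the vehicle to depart"
        secondary = drone.scan("terrain beyond the crest")
        return primary.obstacles + secondary.obstacles
    return primary.obstacles

if __name__ == "__main__":
    # Nearing a crest: forward sensors can't see the decline, so primary data is incomplete.
    print(control_step(NavData(complete=False), Drone()))
```

Whether the output of that loop goes to a screen, a seat buzzer, or straight into an autonomous planner is, per the patent, an implementation detail.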
Here are some images from the patent, with captions describing what’s going on. Don’t worry about reading the captions fully unless you’re really geeky/interested, as they’re kinda jargon-y:
Here’s the patent application’s caption for the figure above:
FIG. 1 is a diagrammatic view of a vehicle 10 that travels over land and has a drone 12 that is releasably carried by the vehicle 10. The drone 12, which is an unmanned vehicle 10, may be decoupled from the vehicle 10 and may fly in the general area near the vehicle 10 as well as away from the vehicle 10 to provide information to the vehicle 10 via one or more sensors carried by the drone 12. The vehicle 10 may then be operated at least in part as a function of the information provided to the vehicle 10 by the drone 12. In the example of a vehicle 10 that may be operated autonomously (e.g. steered and advanced along a path of travel, without regard to a particular level of autonomy), one or more vehicle controllers which may be called a vehicle control system 14 may determine, as a function of the information from the drone 12, one or more of a path of travel and a rate of travel along one or more portions of the path of travel. In the example of a human controlled vehicle 10, at least some information from the drone 12 is relayed to a driver to assist the driver in navigating a desired path of travel. The information may be provided to a driver on a screen, audibly and/or by tactile feedback (e.g. vibrations in areas that may be sensed by the driver). It is understood that a vehicle 10 may operate at certain times under human control and sometimes autonomously, and that some autonomous modes involve or may permit some human interaction.
Patents can be tricky to read, but basically the drone gives info to the driver on the screen or via noise/vibrations in the steering wheel/seat. The info from the drone could be used by the driver or the car to change driving behavior.
Here’s the caption for figure 4:
FIG. 4 illustrates an example in which the vehicle 10 is travelling up an incline 50, and is near a peak 52 of the incline. In this orientation or the vehicle 10, the forward-facing sensors 20, which may include but are not limited to a camera, have a sensing area 54 (e.g. a field of view of a camera) that is oriented upward relative to a flat area (e.g. the top of the peak 52) and/or relative to a decline 56 the follows the peak. In at least some circumstances, the sensing area 54 does not include the peak 52 or decline surface 56 and so the vehicle does not have information about the upcoming terrain at the peak and on the decline surface. Similarly, nearby terrain can be obscured by a turn in the path, as shown in FIG. 6, with or without an incline or decline, and, for example with a wall, ground (e.g. side of a hill), tree or other object or objects 58 between the vehicle onboard sensors 20 and the nearby area inhibiting or preventing sensing of at least part of the upcoming, nearby terrain, as shown by the shaded area 60. This may occur even when the obscured nearby terrain 60, and an obstacle/hazard 62 therein, is within a normal sensing area 54 of one or more sensors 20 (i.e. within a maximum distance and angle range of one or more of the sensors) but the obstacle 58 is in the way, as shown in FIG. 6. In such situations, the drone 12 may be employed to provide feedback regarding the nearby, upcoming terrain.
In the example shown in FIG. 4, an obstacle/hazard 62 exists on the decline surface 56 within a distance from the vehicle 10 that would normally be detectable by one or vehicle sensors 20, but the obstacle is outside of the sensing area of the vehicle sensors/cameras 20 in that location of the vehicle. In other words, it isn’t that the obstacle 62 is too far away from the vehicle 10 to be sensed or detected, but the vehicle sensors 20 cannot sense or detect the obstacle 62 because of an object in the way or in between (where the hill in the example of FIG. 4 is an object). In instances in which the vehicle 10 is traveling at a rate of speed that makes steering around the obstacle 62 difficult if such steering occurs when the vehicle is close to the obstacle (in the example of FIG. 4, only after the vehicle has crested the peak 52 and is on the decline 56), the vehicle 10 may either hit the obstacle 62 or need to make a sudden, sharp turn to try and avoid all or part of the obstacle. With feedback from the drone 12 while the vehicle 10 is still sufficiently far from the obstacle 62, however, maneuvering of the vehicle can be accomplished sooner to better navigate the decline surface 56 and obstacles 62 thereon, where such maneuvering may include both steering and speed control prior to the obstacle being detected by the vehicle sensors 20.
The feedback may include information (e.g. video/images) of the nearby terrain and that information may include the terrain by itself or it may include the position of the vehicle 10 relative to the nearby terrain. That is, the vehicle or part of the vehicle may be within the sensing area of the drone 12, for example, part of the vehicle 10 may be within a field of vision of a camera of the drone 12. The feedback is usable by the control system 14 or the driver, or both, to navigate the vehicle 10 over the nearby terrain, and/or to determine if the vehicle 10 may be navigated over the nearby terrain where in some instances the upcoming terrain may be too severe for the vehicle to traverse, requiring a different path of travel to be determined and followed.
In short: Some terrain isn’t visible by sensors on the car, so the drone could come in clutch.
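If you want to picture why “within range but invisible” matters, it’s just sightline geometry. Here’s a toy example (completely made-up numbers, nothing from the patent) showing a rock that sits only a few meters past the crest, well within camera range, but hidden below the camera’s line of sight:

```python
# Toy illustration of the FIG. 4 scenario: an obstacle can sit well inside the
# range of the forward sensors and still be invisible because the crest of the
# hill blocks the sightline. Simple 2-D profile with made-up numbers.

def is_occluded(sensor, target, terrain):
    """True if any terrain point between sensor and target rises above the sightline."""
    (x0, y0), (x1, y1) = sensor, target
    for x, ground_y in terrain:
        if min(x0, x1) < x < max(x0, x1):
            sight_y = y0 + (y1 - y0) * (x - x0) / (x1 - x0)  # sightline height at x
            if ground_y > sight_y:
                return True
    return False

# Climb to a 5 m crest at x = 20 m, then a steep drop on the far side.
terrain = [(0, 0.0), (10, 2.5), (20, 5.0), (22, 3.0), (25, 1.0), (30, 0.0)]
camera = (18, 6.0)  # hood-mounted camera just below the crest, ~1.5 m above the ground
rock = (25, 1.5)    # half-meter rock on the decline, only 7 m away

print(is_occluded(camera, rock, terrain))  # True: within range, but the crest hides it
# A drone hovering past the crest has a clear line to the rock, which is exactly
# the "secondary navigation data" the patent is talking about.
```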
And here’s the caption for figure 5:
In FIG. 5, the vehicle 10 is shown navigating large boulders 64 or similar uneven terrain. The particular path of the vehicle 10 relative to the specific boulders 64 or other obstacles/uneven surface features (e.g. stumps, craters, ruts, etc) may be important for successfully traversing the terrain. In this example, the drone 12 may be deployed to provide direct feedback of the position of the vehicle 10 relative to the terrain on which the vehicle is currently located and some upcoming, nearby terrain ahead of the vehicle in the vehicle’s path of travel. More specifically, the drone 12 may be positioned as desired around the outside of the vehicle 10 and, at least in some instances, the drone may provide a video feed of the position of one or more vehicle wheels 66 to enable more precise control of the vehicle path of travel and the specific area that the vehicle wheel(s) 66 roll over as the vehicle moves. That is, the drone 12 may show at least part of an obstacle 62, 64 that the vehicle is to drive over or around, and part of a vehicle wheel 66 as the wheel 66 rolls over or around the obstacle 62, 64. This may facilitate precise positioning of one or more wheels 66 relative to one or more obstacles. The drone 12 may provide feedback regarding one wheel 66 or more than one wheel 66, as desired. This direct visual guidance can provide real-time feedback to the vehicle control system 14 or driver, or both, to facilitate maneuvering the vehicle 10 over the uneven terrain.
In FIG. 5, the drone 12 is shown in two, representative positions. The drone 12 may be positioned at different altitudes or heights, and at different distances and fore-aft locations relative to the vehicle 10 or part of the vehicle, including with a view at least partially beneath the vehicle, or with part or all of the drone 12 under part of the vehicle 10, as desired. The drone 12 may remain in the same general location relative to the outside terrain (e.g. the vehicle moves relative to the drone), or the drone 12 may move relative to the vehicle 10, or the drone 12 may stay in the same general location with respect to the vehicle 10 by moving at the same general rate as the vehicle, for all or part of the vehicle’s movement over a given area. The drone 12 can also provide images or video of portions of the underbody of the vehicle and/or the vehicle wheels, to, among other things, show clearance of the vehicle relative to one or more objects, or to permit inspection of areas or components under the vehicle (e.g. to determine if some part of the vehicle has been damaged). Further, while shown as providing imagery to help navigate over objects, the drone could also show clearance of the vehicle sides relative to objects and/or the roof as the vehicle drives beneath an object (e.g. a tree limb).
In at least some implementations, the drone 12 is commanded to fly to provide navigation data in an area that includes the vehicle, and in a nearby area that is at a distance that would be within a sensing area of one or more vehicle sensors 20 but due to an object, is not detectable by the vehicle sensors 20. That is, the nearby area is within the distance range of one or more vehicle sensors 20, but at least part of the area is obscured from the sensors by one or more objects between the vehicle and a nearby area of the path of travel. Such objects could be trees, rocks, a hill or a wall, etc, that orients, obscures or blocks the vehicle sensors 20 such that features or data about a nearby area is not detectable by the vehicle sensors 20.
Further, secondary data may be needed in an area that was within a sensing area of one or more sensors 20 before the vehicle 10 was in that area, and for which real-time information is helpful to navigation of the vehicle on a particular portion of the path on which the vehicle 10 is located, to get the vehicle around or over obstacles currently being encountered by the vehicle. In one example, the area was not blocked by an object when the vehicle was at a distance from the area, but the area is now outside the sensing area of the vehicle sensors because the vehicle currently is on top of that area and the vehicle sensors 20 have a forward projected sensing area. Additionally, the back side of an obstacle, or things behind an obstacle that is within the sensing area might not be detected by the sensors 20 and so secondary navigation data may be needed to see behind or the back side of obstacles like rocks, stumps, or the like.
The vehicle/drone system enables a method of operation of the vehicle 10 equipped with a drone 12 that has one or more sensors 26 or systems that provide secondary navigation data to a vehicle 10 that may include information obtained from perspectives and views not obtainable by vehicle mounted sensors 20 and systems, including, for example, views around blockages or obstructions, views of terrain above/below vehicle sensor area or field of view. Thus, the information my relate to not only road information like the presence or a road or path, but the particular terrain of the road, path or area being traversed by the vehicle 10 including the location(s) of obstacles relative to the vehicle 10, both in a static manner and as the vehicle moves through the terrain, as desired. This collaborative approach for vehicle control can improve or extend autonomous driving options and adds advantages over vehicle mounted sensors.
That caption talks about how the camera could be used for off-road spotting, how the drone can be positioned in different locations, how it can move with the vehicle, etc. You get the idea.
Rivian’s Off-Road Drone Patent Application
Now let’s look at Rivian’s 2022 patent application. Here’s the abstract:
A visual guidance system may include a drone having a sensor. The drone may be configured to autonomously position the sensor outside a vehicle with a line of sight to a current path of the vehicle to collect path data. The visual guidance system may also include a controller in communication with the sensor and configured to provide visual guidance to a vehicle operator based on the path data. Example methods of providing visual guidance may include autonomously positioning a sensor outside a vehicle using a drone such that the sensor has a line of sight to a current path of the vehicle to collect path data. Example methods may also include using a controller to determine visual guidance from the path data and provide the visual guidance to a vehicle operator.
Here’s a bit of a deeper discussion that gets into specifically how a drone could work as an off-road spotter:
Off road maneuvers in a vehicle may require the assistance of a spotter outside the vehicle. For example, when traversing extreme terrain, e.g., large rocks or significant inclines or declines, a driver may not be able to see areas immediately surrounding the vehicle. Accordingly, a spotter may stand outside the vehicle with a view of the areas of the terrain that cannot be directly seen by the driver. The spotter may signal the driver, e.g., with hand gestures or the like, to help the driver guide the vehicle to traverse the terrain. A spotter may be needed to assist the driver when particularly extreme or challenging terrain is encountered, e.g., tight rock formations, large boulders, etc. Some off-road vehicles have, in an effort to address the inability of the driver to see certain areas of the terrain, been provided with underbody cameras for directly viewing the terrain. Merely providing views of the terrain taken directly from the vehicle, however, may not provide adequate views of the vehicle’s path, e.g., particularly where front and rear wheels of the vehicle are in contact with the ground surface(s). Some vehicles use a surrounding view of a vehicle with cameras along the outside/body of the vehicle, artificially generating a “360-degree” view of vehicle, and may also provide a projected path for vehicle tires. The field of view from these perspectives is limited to areas that are within a few inches or, at most, a few feet from the vehicle. Moreover, even when the vehicle can provide views of areas surrounding a vehicle or an underneath the vehicle, the driver may have difficulty determining a best path for the vehicle over/around obstacles.- [0003]
Accordingly, in at least some example approaches, a visual guidance system is provided comprising a drone having a sensor. The drone may be configured to autonomously position the sensor outside a vehicle with a line of sight to a current path of the vehicle to collect path data. The visual guidance system may also include a controller in communication with the sensor and configured to provide visual guidance to a vehicle operator based on the path data.- [0004]
In at least some example approaches, the visual guidance includes a direction to guide the vehicle.- [0005]
In at least some example illustrations, the drone further comprises a display configured to provide the visual guidance to the vehicle operator from outside the vehicle.- [0006]
In at least some examples, the drone is configured to autonomously position the display within a field of view of the vehicle operator.- [0007]
In at least some example approaches, the visual guidance includes an image of the current path and a representation of a desired path overlaid upon the image.
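That “direction to guide the vehicle” bit is basically the human spotter’s “driver”/“passenger” call, computed from the drone’s top-down view. Here’s a hypothetical sketch of how simple that reduction could be; the names, the dead band, and the sign convention are all mine, not Rivian’s:

```python
# Hypothetical sketch of turning drone path data into spotter-style guidance.
# The function name, the 0.15 m dead band, and the sign convention are invented
# for illustration; they don't come from the Rivian filing.

def spotter_call(front_axle_offset_m, dead_band_m=0.15):
    """
    Spotter-style call from the drone's top-down view: how far the front axle
    center sits from the desired line. Convention assumed here: positive offset
    means the truck has drifted toward the passenger side, so the call is
    "driver" to bring it back (and vice versa).
    """
    if abs(front_axle_offset_m) <= dead_band_m:
        return "hold your line"
    return "driver" if front_axle_offset_m > 0 else "passenger"

# Drone sees the nose 0.4 m off the line it plotted between two boulders.
print(spotter_call(0.4))    # "driver"
print(spotter_call(-0.05))  # "hold your line"
```

The overlay claim is the same idea rendered visually: draw the desired line on top of the drone’s video feed instead of (or alongside) barking directions.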
Here’s a peek at an image:
The caption (I’m using the shortened version instead of the long ones above from Jeep) for the image above reads:
FIG. 3 shows a guidance system providing guidance to a vehicle traversing a hill, with a drone being temporarily out of a line of sight of a vehicle operator, in accordance with some embodiments of the present disclosure;
General Motors’ Off-Road Drone Patent Application
Let’s look at GM’s 2022 patent application while we’re at it; it refers to the drone as an “AI Drone” that uses artificial intelligence to collect information and optimize routes.
GM’s abstract reads:
In one aspect, a vehicle and uniquely assigned an aerial drone embedded with sensors, radar, lidar, etc. may establish a data link. The driver or occupant may input information into the user interface in the vehicle about a region or destination of interest. The vehicle’s processor may instruct the drone to scout a target region and receive data about the topographic features of the terrain. The drone may scout the region, receive the data, and forward the data to the vehicle. The processor may compute optimal routes based on data from both the drone and the vehicle sensors. In some aspects, the drone performs calculations of the route and may communicate in or near real time with the vehicle. The system may take into account vehicle data, the skill level of the driver, or both.
The short version of the caption for the above image reads:
[the figure] is a conceptual diagram of a system including a vehicle in an offroad area using drone technology to aerially scout a target region.
FIG. 2 is a flow diagram illustrating an example integrated process for drone-based offroad vehicle navigation.
Here’s the short-caption for figure 3:
FIG. 3 is a flow diagram illustrating example criteria taken into account by the drone or the vehicle in using artificial intelligence to identify optimal routes.
Here’s a little bit of the description of figure 4:
FIG. 4 is a flow diagram 400 illustrating an example energy management system of the drone-based routing determinations. As in previously discussed aspects, a drone may be instructed by the vehicle processor to aerially scout the terrain to provide sensor data to the vehicle and/or to perform independent computations based on the data to assess the alternative routes that appear to be available. The drone’s controller may communicate an estimated energy consumption to the vehicle over the data link for each route that compares the relevant road/path conditions and its obstacles with the available remaining range of the vehicle.
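At its core, that energy-management piece is a filter: estimate how much energy each scouted route would burn, toss anything that doesn’t fit within the remaining range (plus a reserve), and rank what’s left. Here’s a rough, hypothetical sketch; the consumption numbers, difficulty multipliers, and 20 percent reserve are invented for illustration, not GM’s:

```python
# Hypothetical sketch of the FIG. 4 energy-management idea: the drone scouts
# candidate routes, energy use is estimated per route from distance and terrain
# difficulty, and routes that won't fit the remaining range are dropped.
# All constants below are made up for illustration.

BASE_KWH_PER_KM = 0.35  # easy graded-road consumption (invented)
DIFFICULTY_FACTOR = {"graded": 1.0, "rutted": 1.4, "rocky": 2.0, "sand": 2.5}

def route_energy_kwh(distance_km, surface):
    return round(distance_km * BASE_KWH_PER_KM * DIFFICULTY_FACTOR[surface], 2)

def feasible_routes(routes, remaining_kwh, reserve=0.20):
    """Return (name, kWh) for routes that fit within remaining energy minus a reserve."""
    budget = remaining_kwh * (1 - reserve)
    costs = {name: route_energy_kwh(d, s) for name, (d, s) in routes.items()}
    return sorted(((n, e) for n, e in costs.items() if e <= budget), key=lambda t: t[1])

# Drone-scouted options: (distance in km, dominant surface)
scouted = {
    "wash bottom": (12.0, "sand"),
    "ridge line": (9.0, "rocky"),
    "old mining road": (15.0, "rutted"),
}
print(feasible_routes(scouted, remaining_kwh=11.0))
# The ridge line and the mining road fit the budget; the sandy wash doesn't.
```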
FIG. 5 is a flow diagram 500 illustrating a plurality of alternative activities achievable by a driver and a vehicle occupant using the drone-based system. The system in FIG. 5 may also be performed by different components of the drone-based system. These include the processor, the controller, the various sensors, the user interface, and other components of the vehicle and the drone. The example diagram 500 may be relevant to a situation where the vehicle (or the occupants) instruct the drone to explore a perceived scenic area identified previously to the occupants in a video or photograph. The area may include a waterfall or other area of natural scenic beauty. Upon performing a flight to scout the target region, however, the drone concludes that there is no paved or otherwise commonly used road to the destination associated with the attraction of interest.
I Actually Like The Idea
I’ve been in a number of off-road scenarios where I’ve had to stop my vehicle and get out just to see what’s ahead. In fact, just last week, while on the Toyota 4Runner press drive, I faced that very situation multiple times. It’s a little annoying, especially if you have to do it over and over and over on the same trail over a short period.
One of the biggest challenges is seeing over crests, especially when dune-driving. You could easily crest a dune — which you really have to speed up to make it over — and find that the back side is a sheer drop. Having a drone guiding me could be useful in a number of scenarios.
Do I think I could really get a great idea of the relative size and position of the rocks compared with my vehicle? Will I really have a great understanding of my vehicle’s heading compared to the obstacles ahead? And will it be distracting looking down at the screen as I try to approach a rock-garden? I’m not sure.
I’m doubtful there’s much of a market for such a contraption, and I’m also doubtful that any of these three companies will actually launch it (Ford filed a patent in 2016 for a drone that deploys in order to help a stranded vehicle flag down its tow truck), but who knows. I’d really like to try this out.
Just wait until they share the drone footage with your insurance carrier to deny a claim.
Does anyone know if Garmin or any other companies in the mapping and outdoor recreation space have also filed similar patents? This seems a lot like the automakers jumping into the patent space to try to get there first and then rake in the sweet, sweet patent licensing money once the tech is truly feasible.*
*Right now, there’s no way a driver could safely control both their car and a drone (or drones) simultaneously. The drone has to be able to move about autonomously. And it has to be able to orient itself correctly to provide a visual assist to the driver (or automated assist to a vehicle), which requires more complex programming and protocols. It’s a cool idea, and it will probably be real in the not-too-distant future — but not quite today or necessarily next year.
I kind of see this as something that will take hold first as an add-on device system, like the old Garmin GPS units. Clip the screen somewhere and send the drone out to have it beam back a spotter’s-eye view based on commonly used sightlines, with touch controls on the perimeter of the screen to tell the drone to shift viewpoints or fine-tune its location and view. An expensive new tool/toy to add to off-road enthusiasts’ kits.
Also, I can’t wait for the legal disclaimer that will inevitably have to appear when the system boots up. Probably something like “This is not a substitute for an experienced human spotter. Not for use without a human assistant. Do not attempt technical trails alone using only this device. Not liable for damage to vehicles or trails, or injury or death of vehicle occupants due to incorrect use of this device. Using this device constitutes your acknowledgement of the manufacturer’s terms of use and binding arbitration agreement. Touch OK to continue.”
You hit the nail on the head.
It would need to be fully autonomous (including deployment and retrieval), and even then, I think it would face similar challenges to GoPro, but even worse: very expensive since they’re specialized, an even smaller market to quickly saturate, and then no major room for growth, even if they’re good. I don’t think such a thing is feasible under capitalism, or at least not for a company that focuses solely on this technology. But even for a big company to make it, I don’t see it being profitable.
“You get the idea.”
Not really. Just a lot of word salad.
I don’t rock crawl and I don’t use drones so I’m having a hard time visualizing how this might work. Is the drone building a map that includes the vehicle as a point of reference? If so can it do so quickly enough and present the data intuitively enough to a distracted, maybe panicking driver to be useful? Will it need to flip the image for Jeep owners to understand?
Cool idea, but can it be used to spot the curbs in the mall parking lot?
I’d be more interested if it could grab a parking space and fight off claim jumpers till I got there.
Another Speed Racer button on a steering wheel has been made real. How many Speed Racer features are left? I presume it is down to the saws out front, the stilts, and the SCUBA cover?
Trunk monkey. I’m still waiting for that one.
What I’d rather have is something to tell me how deep the water crossing is without having to use a downed log as a dipstick and subsequently falling into the water.
Came down to make this comment. But then thought of the limitations: a lightweight flying drone won’t have the power to probe the bottom like a human with a 1” pole.
Having been (foolishly) stuck in a deceptive puddle once, I’m pretty gun-shy of water hazards.
One of the reasons I stopped off-roading was the number of guns I was seeing on the trails. I’d be concerned about my fancy drone being downed by some gun-toting, angry-grill-owning jack-off.
I can see this being super useful for the side-by-side market for dunes: not so much for navigating, but making sure there isn’t a family of ATVs on the other side of the dune you’re about to launch off of.
Also, the muted mono-chromatic dune colors and general lack of features for interpreting scale really play tricks on your depth perception in dunes. Having a drone that can relay “Hey just FYI, it’s 200′ to the bottom of the dune you’re cresting” would be super handy.
Since dunes are always shifting, making detailed terrain maps impossible, a drone is a great way to get immediate, accurate information that even a spotter might not see.
I already know the simplest and easiest way to safely go off road on a heavily rutted trail with lots of rocks, etc.
Park the truck, get out and walk.
Oh. Hell. No. No no no no no. There are already too many dipshits flying their drones around national parks and forests (usually illegally); now the plan is to have vehicles launch personal drones to buzz around them on obstacles? If you can’t get your vehicle over an obstacle with your front-facing camera, 360 view on the dash, and undercarriage camera, combined with your traction control, ride height control, crawl control, hill descent control or… I don’t know, learning to drive and using the Mk I Eyeball – maybe you should take your expensive computer on wheels and go back to the mall parking lot. Adding drone-view isn’t going to make you any more skilled.
At what point is off roading no longer about pitting your skills against the terrain? You might as well fly to the top of the mountain as a helicopter passenger at this point.
It’s nice to see old flow charts in use again.
Flag on the play. You ever hear of bringing a gun to a knife fight? This is bringing an intercontinental ballistic missile to a kid’s birthday party. First, who is controlling multiple drones and who has control of the language used? I first thought lasers and cameras would be cheaper and much better. They would be, and cheaper. But then it occurred to me as an old fart: why not use the same technology as Google and the USPGA? Most people use the same trails. Put a 4wd Google car on all of the trails, measure height and rocks and divots and things. And just drive the course with the computer telling you, like GPS: left, right, gear up, gear down. In fact you could probably just program the car to drive itself and sit in the back playing a video game. Because let’s be honest, at what point are we no longer in control?
Or just have an expert driver drive it and program a video game of it. Spend quarters not dollars.
And then it rains or tons of people driving the trail change it through wear or manipulation. (Looking at you trolls who dug holes and moved rocks to make the Rubicon “more challenging”.)
I’ll stick to light duty trails myself. I just have a two wheel drive anyway.
Neat trick, but still not as cool as the Homing Robot in the Mach 5.
It’s really interesting and cool, but it absolutely must lock drivers out when they’re not off-road.
I hate the idea of a dozen Rubicons used as status symbol commuter cars suddenly sending drones up because the drivers are bored or want to see what’s stopping traffic. If traffic was stopped because a guy was threatening to jump off of a bridge (or really any emergency) these would definitely make the situation worse.
A guy actually did that in my area. The cop accidentally pushed him off the bridge. (He survived)
But will it have hands to frantically wave when it gets the directions wrong and can’t think of the right words for “NONONONONOOHMYGODNO!”?
Excellent use for drones. I love this idea and look forward to seeing it work. We already have drones that will follow us on our walk. I can see this where you place it in “geostationary orbit”. So cool.
The offroad game Expeditions features a similar, optional in-game drone module, useful for completing scouting tasks or just to help plot a path through terrain. Pretty fun!
I can certainly see how the real-life version could come in handy, streaming footage through an in-dash display. I can’t imagine actually using one like that myself, but I can see the advantage.
What is how to add more electrical and programming faults?
I’ll take Bad Automotive Ideas for $500, Alex
“This manufacturer decided to do away with gaskets”
What is Austin?
“I’m sorry that’s incorrect, they never had any to begin with.”
Didn’t read a word of the article, BUT have been thinking about how cool it would be to have a drone that could fly in front of you on a rural 2 lane to verify nothing is coming the other way so you could pass when the road doesn’t allow line of sight. We used to use radios when running in a convoy, but a drone would be a much better option.
I don’t do a lot of off-roading but having a drone or two pop off my car as soon as I get downtown and finding me a parking space without me having to circle the blocks for half an hour would be awesome.
Till that’s what everyone is doing. Then you have your drone fight their drones to the death and the winner gets the space.
That would be AWESOME
And I have the perfect name for a system of lethal, flying networked battle drones:
Skynet.
I wonder what the default mode is if it detects a crash or rollover while deployed? It would be cool to have a signal go out.
I mean, it isn’t like the first one of these isn’t going straight to Bear Pass by someone that has no business on Bear Pass. They will be grievously injured, probably whilst livestreaming on Insta, but will have a little beacon telling people where to come find them.
2 big rules:
if your margins for success are THAT small then maybe have a good spotter instead because if you are alone you could probably just as easily be dead or stranded.
Speed racer did it first.
https://images.app.goo.gl/cK8odaaMoZedpUQZA
IMO, looking through a camera just can’t give you the same spatial awareness that your eyes do. Yes, in easy situations, popping up a drone to see what’s over the hill might be useful, or maybe flying it up high to give a better lay of the land to make sure you’re going the right way.
Agreed.
This. Though there are plenty of drones with terrain scanning capabilities that can map out the trail in 3D, it’s just hard to visualize that 3D information from a 2D dash display or HUD.
I have the same issue looking at parts in 3D CAD where a part looks massive until you have some physical object for scale. My company literally has a banana CAD file we use during design reviews to “show a banana for scale”.
two words – LIDAR drone
… feverishly filing patent