The Future on Your Way: Self-Driving Cars
What Is a Self-Driving Car?
Self-driving vehicles (cars, bikes, and trucks) are vehicles in which a human driver is never required to take control, or which travel to a predetermined destination over ordinary roads without human intervention. They are also known as smart or "driverless" vehicles. A self-driving car combines sensors, cameras, a navigation system, artificial intelligence (AI), and software to control and drive the vehicle safely.
Companies Developing the Driverless Car
Many major automotive manufacturers, including Ford, Mercedes-Benz, General Motors, Volkswagen, Audi, Toyota, Volvo, BMW, and Nissan, are in the process of testing driverless car systems. BMW has been testing driverless systems since around 2005.
SAE International's "Levels of Driving Automation" Standard for Self-Driving Vehicles
The standard defines six levels of driving automation, ranging from Level 0 to Level 5. Let's take a brief look at each level.
- Level 0 does not feature any self-driving tech at all.
- Level 1 cars offer at least one system that helps the driver brake,
steer, or accelerate, but if there are multiple systems, they are not
capable of communicating with each other.
- Level 2 cars can simultaneously control steering and speed for short periods while the driver supervises. Think lane-centring technology combined with advanced cruise control, for example.
- Level 3 vehicles can drive themselves under certain conditions but still require the driver to stay attentive and ready to take over. These cars aren't yet widely available but are being tested by some tech start-ups.
- Level 4 cars, once programmed to a destination, will not need driver
input, but the controls are available should the driver wish to intervene.
- Level 5 cars will be fully autonomous without any driver input.
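For reference, the six levels can be captured as a small lookup in code. This is only an illustrative sketch; the enum names, the one-line summaries, and the `driver_required` helper are my own shorthand, not part of the SAE standard.

```python
from enum import IntEnum

class SAELevel(IntEnum):
    """SAE driving-automation levels 0-5, summarised very loosely."""
    NO_AUTOMATION = 0           # no self-driving tech at all
    DRIVER_ASSISTANCE = 1       # one assist system (braking, steering, or speed)
    PARTIAL_AUTOMATION = 2      # combined steering and speed control, driver supervises
    CONDITIONAL_AUTOMATION = 3  # drives itself in limited conditions, driver must take over
    HIGH_AUTOMATION = 4         # no driver input needed on programmed routes
    FULL_AUTOMATION = 5         # fully autonomous, no driver input at all

def driver_required(level: SAELevel) -> bool:
    """A human must stay engaged at Level 3 and below."""
    return level <= SAELevel.CONDITIONAL_AUTOMATION

print(driver_required(SAELevel.PARTIAL_AUTOMATION))  # True
print(driver_required(SAELevel.HIGH_AUTOMATION))     # False
```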
Core Technologies Used in Self-Driving Cars
Cameras
Cameras used in self-driving cars have the highest resolution of any sensor. The data processed by cameras and computer-vision software can help identify edge-case scenarios and detailed information about the car's surroundings.
All Tesla vehicles with Autopilot capability, for example, have eight external-facing cameras, which help them understand the world around the car and train their models for future scenarios.
Unfortunately, cameras don't work as well when visibility is low, such as in a storm, fog, or even dense smog. Thankfully, self-driving cars are built with redundant systems to fall back on when one or more systems aren't functioning properly.
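To make the idea of "computer vision on camera data" concrete, here is a minimal sketch using OpenCV to pull a simple edge map out of one camera frame. It is a toy illustration only: production perception stacks use deep neural networks rather than classical edge detection, and the camera device index is an assumption.

```python
import cv2  # OpenCV; install with `pip install opencv-python`

def extract_edges(frame):
    """Return a simple edge map from one camera frame.

    Toy example of processing camera data with computer vision;
    real self-driving stacks use learned models, not Canny edges.
    """
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)   # drop colour
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)      # suppress noise
    return cv2.Canny(blurred, 50, 150)               # detect edges

# Usage sketch: read one frame from a (hypothetical) front-facing camera.
cap = cv2.VideoCapture(0)   # device index 0 is an assumption
ok, frame = cap.read()
if ok:
    edges = extract_edges(frame)
cap.release()
```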
ADAS System
ADAS stands for Advanced Driver Assistance System, a technology developed to make driving safer and a building block for driverless vehicles. It uses cameras, sensors, and sophisticated algorithms to notify the driver of a potential problem. This might be a weaving cyclist, a stopped vehicle in the road, drifting across lanes, or sudden braking by the vehicle in front. All of these hazards can be instantly communicated to the driver, allowing them to take the necessary action to avoid an accident.
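As a rough illustration of the kind of logic behind one ADAS feature, a forward-collision warning, here is a hedged sketch that estimates time-to-collision from the gap to the vehicle ahead and the closing speed. The 2.5-second threshold and the input values are invented for the example, not industry figures.

```python
def time_to_collision(distance_m: float, closing_speed_mps: float) -> float:
    """Seconds until impact if neither vehicle changes speed.

    Returns infinity when the gap is not closing.
    """
    if closing_speed_mps <= 0:
        return float("inf")
    return distance_m / closing_speed_mps

def forward_collision_warning(distance_m: float, closing_speed_mps: float,
                              warn_threshold_s: float = 2.5) -> bool:
    """Warn the driver when estimated time-to-collision drops below the threshold."""
    return time_to_collision(distance_m, closing_speed_mps) < warn_threshold_s

# Example: a 30 m gap closing at 15 m/s gives a 2.0 s time-to-collision -> warn.
print(forward_collision_warning(30.0, 15.0))  # True
```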
GPS
However, an image of the surroundings is not enough to drive safely around the streets of a city. A GPS system is also present to help the car position and navigate itself, but the accuracy of commonly available GPS is only about 4 metres RMS.
LiDAR and RADAR
Therefore, to improve the accuracy of navigation, the car also uses sensors such as Light Detection and Ranging, commonly referred to as LiDAR. A LiDAR works by illuminating a target with pulsed laser light and measuring the reflected pulses with a sensor; differences in laser return times and wavelengths are then used to build digital 3D representations of the target. LiDAR can achieve an accuracy of up to 2.5 cm. Multiple LiDAR modules around the body of the car help create an accurate map of the entire surroundings and eliminate blind spots. LiDAR and RADAR also play an important role in collision avoidance.
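The distance measurement described above follows directly from the speed of light: the pulse travels out and back, so the range is half of speed times round-trip time. A minimal sketch, with an illustrative 200 ns return time:

```python
SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def lidar_distance(round_trip_time_s: float) -> float:
    """Distance to a target from the round-trip time of one laser pulse.

    The pulse travels to the target and back, so the one-way distance
    is (speed of light * time) / 2.
    """
    return SPEED_OF_LIGHT * round_trip_time_s / 2.0

# A return received 200 nanoseconds after emission corresponds to about 30 m.
print(round(lidar_distance(200e-9), 2))  # 29.98
```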
LiDAR can also detect micro-topography hidden by vegetation, which helps archaeologists understand the surface beneath. Ground-based LiDAR can likewise capture the structure of a building; this digital information can be used for 3D mapping on the ground and to create models of the structure.
Other Sensors
Self-driving cars also utilise traditional GPS tracking, along with ultrasonic sensors and inertial sensors, to gain a full picture of what the car is doing as well as what's occurring around it. In the realm of machine learning and self-driving technology, the more data collected, the better the system can learn and perform.
Using the concept of transfer learning, a pre-trained model can be adapted to detect and classify different kinds of objects. This functionality is very important for real-world autonomous navigation by a vehicle.
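A minimal sketch of transfer learning, assuming PyTorch with torchvision 0.13 or later: an ImageNet-pre-trained ResNet-18 keeps its learned visual features, and only a new classification head is trained. The four object categories are an assumption for illustration.

```python
import torch.nn as nn
from torchvision import models

# Load a network pre-trained on ImageNet and reuse its learned visual features.
backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)

# Freeze the pre-trained layers so only the new head gets trained.
for param in backbone.parameters():
    param.requires_grad = False

# Replace the final layer to classify the object categories we care about,
# e.g. vehicle, pedestrian, cyclist, traffic sign (4 classes is an assumption).
num_classes = 4
backbone.fc = nn.Linear(backbone.fc.in_features, num_classes)

# The model can now be fine-tuned on a labelled driving dataset with a
# standard training loop (optimiser over backbone.fc.parameters(), etc.).
```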
Deep learning using Convolutional Neural Networks (CNNs) is used to detect and classify traffic lights, which convey important navigation information to an autonomous vehicle.
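A small CNN classifier for cropped traffic-light images might look like the following sketch, assuming PyTorch. The layer sizes, the 64x64 input crop, and the three output classes (red, yellow, green) are illustrative assumptions, not a published architecture.

```python
import torch.nn as nn

class TrafficLightCNN(nn.Module):
    """Toy convolutional classifier for 64x64 traffic-light crops."""

    def __init__(self, num_classes: int = 3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 16 * 16, num_classes)

    def forward(self, x):             # x: (batch, 3, 64, 64)
        x = self.features(x)          # -> (batch, 32, 16, 16)
        return self.classifier(x.flatten(1))  # class scores: red / yellow / green
```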
Another area where deep learning is used in autonomous vehicles is identifying lanes at the pixel level using Fully Convolutional Networks (FCNs). This helps ensure that an autonomous vehicle follows all lane and traffic rules.
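As a sketch of pixel-level lane identification, torchvision's FCN model can be configured with two output classes ("lane" and "not lane" is my illustrative choice) and run on a camera frame. Untrained weights are used here, so the mask is meaningless until the model is fine-tuned on a lane-marking dataset; the 512x512 input size is also an assumption.

```python
import torch
from torchvision.models.segmentation import fcn_resnet50

# Two output classes, e.g. lane vs. not-lane; the model still needs training.
model = fcn_resnet50(weights=None, num_classes=2)
model.eval()

# One 3-channel camera frame, resized to 512x512 (size is an assumption).
frame = torch.rand(1, 3, 512, 512)
with torch.no_grad():
    logits = model(frame)["out"]      # (1, 2, 512, 512) per-pixel class scores
lane_mask = logits.argmax(dim=1)      # (1, 512, 512) mask: 1 where a lane is predicted
```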
How self-driving cars work
AI technologies power self-driving car systems. Developers of self-driving cars use vast amounts of data from image recognition systems, along with machine learning and neural networks, to build systems that can drive autonomously.
The data, which includes images from cameras on self-driving cars, is fed to machine learning algorithms, and neural networks identify patterns in it. From these images the networks learn to identify traffic lights, trees, curbs, pedestrians, street signs, and other parts of any given driving environment.
For example, Google's self-driving car project, Waymo, uses a mix of sensors, LiDAR (light detection and ranging, a technology similar to radar), and cameras, and combines all of the data those systems generate to identify everything around the vehicle and predict what those objects might do next. This happens in fractions of a second. Maturity is important for these systems: the more the system drives, the more data it can incorporate into its deep learning algorithms, enabling it to make more nuanced driving choices.
How does the vehicle travel from one location to another?
- The driver (or passenger) sets a destination, and the car's software calculates a route.
- A rotating, roof-mounted LiDAR sensor monitors a 60-metre range around the car and creates a dynamic 3D map of the car's current environment.
- A sensor on the left rear wheel monitors sideways movement to detect the car's position relative to the 3D map.
- Radar systems in the front and rear bumpers calculate distances to obstacles.
- AI software in the car is connected to all the sensors and collects input from Google Street View and video cameras inside the car.
- The AI simulates human perceptual and decision-making processes using deep learning and controls actions in driver control systems, such as steering and braking.
- The car's software consults Google Maps for advance notice of things like landmarks, traffic signs, and lights.
- An override function is available to enable a human to take control of the vehicle.
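Taken together, these steps form a repeating sense-perceive-plan-act loop. The sketch below shows that structure only; the sensor, vehicle, and override objects are hypothetical placeholders, the perception and planning bodies are stubs standing in for the neural networks and planners described above, and the 20 Hz loop rate is an assumption.

```python
import time

def sense(sensors):
    """Collect one snapshot of raw readings from every sensor (LiDAR, radar,
    cameras, GPS, wheel sensors). `sensors` is a hypothetical name->object map."""
    return {name: sensor.read() for name, sensor in sensors.items()}

def perceive(snapshot):
    """Fuse the raw readings into a model of the world: obstacles, lanes, signs.
    Placeholder only; the real step runs the perception networks described above."""
    return {"obstacles": [], "lanes": [], "signs": []}

def plan(world_model, destination):
    """Choose the next steering, throttle, and brake commands toward the destination.
    Placeholder values; a real planner searches over candidate trajectories."""
    return {"steering": 0.0, "throttle": 0.1, "brake": 0.0}

def drive(sensors, vehicle, destination, override):
    """Repeat the sense -> perceive -> plan -> act loop until arrival or human override."""
    while not override.requested() and not vehicle.at(destination):
        snapshot = sense(sensors)
        world_model = perceive(snapshot)
        commands = plan(world_model, destination)
        vehicle.apply(commands)   # act: send commands to steering, throttle, brakes
        time.sleep(0.05)          # ~20 Hz control loop; the rate is an assumption
```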
What are the
Challenges with Autonomous Cars?
The challenges range from the technological and
legislative to the environmental and philosophical.
LiDAR and Radar
LiDAR is expensive and is still trying to strike the right balance between range and resolution. If multiple autonomous cars were to drive on the same road, would their LiDAR signals interfere with one another? And if multiple radio frequencies are available, will the frequency range be enough to support mass production of autonomous cars?
Weather Conditions
What happens when an autonomous car drives in heavy
precipitation? If there’s a layer of snow on the road, lane dividers disappear.
How will the cameras and sensors track lane markings if the markings are
obscured by water, oil, ice, or debris?
Traffic Conditions and Laws
Will autonomous cars have trouble in tunnels or on bridges? How will
they do in bumper-to-bumper traffic? Will autonomous cars be relegated to a
specific lane? Will they be granted carpool lane access? And what about the
fleet of legacy cars still sharing the roadways for the next 20 or 30 years?
Artificial vs. Emotional Intelligence
Human drivers rely on subtle cues and
non-verbal communication like making eye contact with pedestrians or reading
the facial expressions and body language of other drivers to make split-second
judgement calls and predict behaviors. Will autonomous cars be able to
replicate this connection? Will they have the same life-saving instincts as
human drivers?
Predicting agent behavior: It’s currently
difficult to entirely understand the semantics of a scene, the behavior of
other agents on the road and appearance cues such as blinkers and brake lights.
Not to mention, predicting human error such as when a person signals a left
turn but actually turns right.
Understanding perception complexity: Self-driving vehicles can fail when objects are blocked from view, such as during snowstorms, when objects are seen only in a reflection, when fast-moving objects appear from a blind spot, and in other long-tail scenarios.
Cyber security threats: Software is written
by humans, and humans write code with vulnerabilities. Although very few people
understand neural networks well enough to exploit these vulnerabilities, it can
and will be done.
Continuous
development and deployment: One problem facing self-driving
vehicles is the process of re-validating changes to the software. If and when
the code base changes, does this require testing for another 275 million miles
to validate performance?
The future of self-driving cars
Despite these very real problems, self-driving car companies are moving forward and improving every day.
Considering
an estimated 93%
of car accidents are
caused by human error, the opportunity for self-driving cars to remove a major
threat in the daily lives of billions of humans is too great to pass up. There
will be many debates over the efficacy of self-driving cars as well as
regulatory hurdles before we see Level 5 autonomy deployed globally.
Advantages of Driverless Cars
1. Travelers would be
able to journey overnight and sleep for the duration.
2. Speed limits could
be safely increased, thereby shortening journey times.
3. There would be no
need for driver’s licenses or driving tests.
4. Presumably, with fewer associated risks, insurance premiums for car owners would go down.
5. Efficient travel
also means fuel savings, simultaneously cutting costs and making less of a
negative environmental impact.
6. Greater efficiency
would mean fewer emissions and less pollution from cars in general.
7. Reduced need for
safety gaps, lanes, and shoulders means that road capacities for vehicles would
be significantly increased.
8. Self-aware cars would lead to a reduction in car theft.
9. Passengers should
experience a smoother riding experience.
10. Difficult
manoeuvring and parking would be less stressful and require no special skills.
The car could even just drop you off and then go park itself.
11. Human drivers
notoriously bend rules and take risks, but driverless cars will obey every road
rule and posted speed limit.
12. Entertainment technology, such as video screens, could be used without
any concern of distracting the driver.
Disadvantages of Driverless Cars
1. A self-driving car would be unaffordable for most people, likely costing over $100,000.
2. Self-driving cars would be great news for terrorists, as those vehicles could be loaded with explosives and used as moving bombs.
3. As drivers become
more accustomed to not driving, their proficiency and experience will diminish.
Should they then need to drive under certain circumstances, there may be
problems.
4. If the car crashes without a driver, whose fault is it: the software designer's or the vehicle owner's? Driverless systems will definitely trigger many debates about legal, ethical, and financial responsibility.
5. Human behaviour, such as heavy foot traffic, jaywalking, and hand signals, is difficult for a computer to understand. In situations where drivers need to deal with erratic human behaviour or communicate with one another, the driverless vehicle might fail.
6. Reading road signs
is challenging for a robot. GPS and other technologies might not register
obstacles like potholes, recent changes in road conditions, and newly posted
signs.
7. The road system and
infrastructure would likely need major upgrades for driverless vehicles to
operate on them. Traffic and street lights, for instance, would likely all need
altering.
8. Hackers getting
into the vehicle's software and controlling or affecting its operation would be
a major concern.
9. Truck drivers, taxi drivers, Uber and Lyft drivers, and other delivery workers will eventually lose their jobs as autonomous vehicles take over.