Remember when you used the rearview mirror to back into a parking space? Some of us still do, but many vehicles now include, or offer as an option, automated parking aids. In practice, such a system conveniently works alongside another increasingly common AI-based feature: automated braking that avoids backing over unsuspecting pedestrians or hitting inanimate objects.
This is just one example of what basic driver-assistance systems can do now. Yes, some consumers are still skeptical of fully autonomous vehicles (AVs) and the automated driving systems (ADS) and advanced driver assistance systems (ADAS) behind them. They question the practicality, yet they also like the idea of being able to do other things to pass the time while stalled in traffic congestion. The many heavyweight companies developing AVs are counting on the latter. The expected decrease in traffic accidents and reduced congestion are benefits that insurance companies and governments applaud.
Dashboard infotainment systems, the laptops and other mobile devices brought onboard for consuming media and information, and seemingly ubiquitous smartphones all dangerously distract drivers now. AVs that could safely transport distracted occupants should therefore become more popular, and better for all concerned.
Perhaps the practical limits of AV system implementation are “driver” and passenger sensory overload, plus the thought of simply not being in control. Significant AV system costs and technology maturity are also concerns. Even when AV systems become commonplace, the cost of the components alone will be substantial. As Tesla Inc. indicated recently, the cost differential between assisted driving and full AV capability is several thousand dollars. All of these systems depend, to varying degrees, upon affordable and reliable artificial intelligence (AI) implementations with associated sensors and actuators. [Also see earlier blogs on AI & Virtual Reality—Let The Games (and Work) Begin and AI & Autonomous Vehicles—Awesome Implications]
Some AV Reality Checks
At this point, it would be difficult to find an automobile manufacturer that is not actively working to integrate sophisticated autonomous operation into some, or all, of its vehicles. At the same time, these manufacturers are increasingly focusing on electric vehicles (EVs) or hybrid gasoline-electric vehicles. Within a decade, internal-combustion engine vehicles will begin disappearing.
Since many “technology” companies regard the automobile industry as lacking in computer, microelectronics and software expertise, the established computer hardware and software companies are either developing AV systems independently or in joint ventures with familiar automotive companies. Alphabet’s Google, Apple, Intel, Microsoft, and others are very active, with massive ongoing investments. Recently, NVIDIA Corp. has been releasing AI chips specifically for AV applications, with impressive early evaluations. Realistically, not all of these efforts will succeed, but differing AV systems will eventually enter mass production.
General Motors Co. recently acquired the LIDAR technology company Strobe, Inc. As part of the deal, Strobe’s engineering talent joins GM’s Cruise Automation team to define and develop next-generation LIDAR solutions for self-driving vehicles. GM thinks its AV team is complete now.
Lidar (also written LIDAR, LiDAR, and LADAR) originated as a surveying technology that measures distance by illuminating a target with pulsed laser light and detecting the reflected pulses with a sensor. Differences in laser return times and wavelengths are then used to build digital 3D representations of the target.
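The time-of-flight principle just described reduces to a simple relation: distance is half the round-trip travel time multiplied by the speed of light. A minimal sketch of that calculation (the function name and example numbers here are illustrative, not from any particular lidar unit):

```python
# Time-of-flight distance estimate for a pulsed-laser (lidar) return.
# A minimal sketch of the principle described above; real lidar units
# also handle beam geometry, noise filtering, and multiple returns.

C = 299_792_458.0  # speed of light, m/s

def lidar_distance(round_trip_time_s: float) -> float:
    """Distance to a target from the round-trip time of one laser pulse."""
    # The pulse travels to the target and back, so divide by two.
    return C * round_trip_time_s / 2.0

# A return received about 667 nanoseconds after emission implies a
# target roughly 100 meters away.
print(lidar_distance(667e-9))  # ~100 m
```

The hard part in practice is not this arithmetic but resolving nanosecond-scale timing reliably across millions of pulses per second.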
LIDAR is said to create higher-resolution images that provide more accurate views of the world than cameras or radar alone. As self-driving technology continues to evolve, LIDAR’s accuracy may play a critical role in its deployment.
“The successful deployment of self-driving vehicles will be highly dependent on the availability of LIDAR sensors,” said Julie Schoenfeld, Founder and CEO, Strobe, Inc.
GM is planning to test its vehicles in “fully autonomous mode” in New York state in early 2018, according to New York Governor Andrew Cuomo. However, the planned testing by GM and its self-driving unit, Cruise Automation, will initially involve Level 4 autonomous vehicles.
A Level 3 car still needs a steering wheel and a driver who can take over if the car encounters a problem, while Level 4 promises driverless operation in dedicated lanes. A Level 5 vehicle is capable of navigating roads without any driver input and, in its purest form, would have no steering wheel or brake pedal.
The many AV systems use very sophisticated sensors, which vehicle owners and automobile mechanics may not fully understand, to evaluate the immediate surroundings. Looking forward, the mainstream vacuum-based deposition and etching/cleaning processes used to manufacture these devices will likely need refinement and evolution. Unlike a PC or smartphone failure, which may be merely annoying, a driverless vehicle that goes awry because of a microprocessor or sensor manufacturing defect can have tragic and very expensive consequences.
Those using ICs, MEMS and other micro devices for AI and AV have at least two options to minimize liability: build in considerable redundancy, or use fail-safe components and systems. In IC manufacturing, although the devices on a wafer may be very similar, they are not absolutely identical. That means the patterning lithography must be nearly perfect and each deposited layer of material must be incredibly uniform. What was acceptable yesterday may not be good enough going forward.
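The redundancy option mentioned above is commonly implemented as majority voting among independent components, with triple modular redundancy (TMR) being the classic form. A simplified, hypothetical sketch (the tolerance value and function name are illustrative, not a certified safety design):

```python
# Triple modular redundancy (TMR): three independent sensors vote, so a
# single faulty reading is outvoted. A simplified sketch of the
# redundancy approach discussed above, not a certified design.

def majority_vote(a: float, b: float, c: float, tolerance: float = 0.5):
    """Return the value two sensors agree on, or None if no pair agrees."""
    for x, y in [(a, b), (a, c), (b, c)]:
        if abs(x - y) <= tolerance:
            return (x + y) / 2.0  # average the agreeing pair
    return None  # all three disagree: fail safe and flag a fault

# Two good readings outvote one faulty sensor.
print(majority_vote(99.8, 100.1, 42.0))
```

The fail-safe option is visible in the final branch: when no two readings agree, the system refuses to produce a value rather than guess.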
AI, in the most basic terms, means that a (dedicated) computer is cognizant of a specified environment (using sensors) and performs tasks within its defined decision-making capabilities (using actuators). In a vehicle, AV control is tied into an already extremely complex electrical/electronic system with its own central computer and with dedicated embedded computers within specific components, forming a complex network.
Per Wikipedia: “Artificial intelligence (AI, also machine intelligence, MI) is apparently intelligent behavior by machines, rather than the natural intelligence (NI) of humans and other animals. In computer science, AI research is defined as the study of ‘intelligent agents’: any device that perceives its environment and takes actions that maximize its chance of success at some goal. Colloquially, the term ‘artificial intelligence’ is applied when a machine mimics ‘cognitive’ functions that humans associate with other human minds, such as ‘learning’ and ‘problem solving.’”
“Intelligent agents are often described schematically as an abstract functional system similar to a computer program. For this reason, intelligent agents are sometimes called abstract intelligent agents (AIA) to distinguish them from their real world implementations as computer systems, biological systems, or organizations.”
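The perceive-then-act cycle these definitions describe can be sketched as a minimal agent loop. Everything below (sensor, policy, actuator names, and the threshold values) is a hypothetical illustration, not any real AV control stack:

```python
# Minimal sense-decide-act loop, illustrating the "intelligent agent"
# pattern described above. All names and numbers are illustrative.

def read_distance_sensor(state):
    """Stand-in for a real sensor: distance (m) to the obstacle ahead."""
    return state["obstacle_m"]

def decide(distance_m):
    """Policy: brake if the obstacle is close, otherwise cruise."""
    return "brake" if distance_m < 20.0 else "cruise"

def actuate(state, action):
    """Stand-in for real actuators: apply the chosen action."""
    if action == "brake":
        state["speed_mps"] = max(0.0, state["speed_mps"] - 5.0)
    return state

def agent_step(state):
    """One perceive -> decide -> act cycle."""
    distance = read_distance_sensor(state)  # perceive (sensor)
    action = decide(distance)               # decide
    return actuate(state, action), action   # act (actuator)

state = {"obstacle_m": 12.0, "speed_mps": 25.0}
state, action = agent_step(state)
print(action, state["speed_mps"])  # brake 20.0
```

A production AV runs loops like this at high frequency across many sensors and actuators at once, which is what makes the in-vehicle network described above so complex.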
A Broader Perspective of AI and AV
Microelectronics technology is invading many products, which are usually interconnected via the Internet. Consider a self-driving car: will its passengers be viewing live TV?
Recently, Gartner, Inc., the Stamford, CT technology research company, highlighted the top strategic technology trends that will impact most organizations in 2018. Creating systems that learn, adapt and potentially act autonomously will be a major battleground for technology vendors through at least 2020. The ability to use AI to enhance decision making, reinvent business models and ecosystems, and remake the customer experience will drive the payoff for digital initiatives through 2025.
“Currently, the use of autonomous vehicles in controlled settings (for example, in farming and mining) is a rapidly growing area of intelligent things. We are likely to see examples of autonomous vehicles on limited, well-defined and controlled roadways by 2022, but general use of autonomous cars will likely require a person in the driver’s seat in case the technology should unexpectedly fail,” said David Cearley, vice president and Gartner Fellow in his Top 10 Strategic Technology Trends for 2018 presentation. “For at least the next five years, we expect that semiautonomous scenarios requiring a driver will dominate. During this time, manufacturers will test the technology more rigorously, and the nontechnology issues such as regulations, legal issues and cultural acceptance will be addressed.”
“AI techniques are evolving rapidly and organizations will need to invest significantly in skills, processes and tools to successfully exploit these techniques and build AI-enhanced systems,” added Mr. Cearley. “Investment areas can include data preparation, integration, algorithm and training methodology selection, and model creation. Multiple constituencies including data scientists, developers and business process owners will need to work together.”
The first three strategic technology trends explore how artificial intelligence (AI) and machine learning are seeping into virtually everything and represent a major battleground for technology providers over the next five years. The next four trends focus on blending the digital and physical worlds to create an immersive, digitally enhanced environment. The last three refer to exploiting connections between an expanding set of people and businesses, as well as devices, content and services to deliver digital business outcomes.
Cearley adds, “Intelligent things are physical things that go beyond the execution of rigid programming models to exploit AI to deliver advanced behaviors and interact more naturally with their surroundings and with people. AI is driving advances for new intelligent things (such as autonomous vehicles, robots and drones) and delivering enhanced capability to many existing things (such as Internet of Things [IoT] connected consumer and industrial systems).”
Next time: our examination of widespread ADS/ADAS (advanced driver assistance systems) efforts worldwide continues.