Autonomous driving has already found its way into public traffic in the USA, with vehicles that can operate largely independently. In Germany, however, autonomous vehicles are only available to a limited extent, mostly in pilot projects or under specific conditions such as on motorways. Technological progress, though, continues worldwide. In this article, we present new technical developments for self-driving cars as well as our own first-hand experience.
Autonomous driving is at the heart of current developments in the automotive industry. Vehicles with partially autonomous driving functions, such as the Mercedes-Benz EQS with Level 3 autonomy, are already on the roads. These systems allow the driver to give up control under certain conditions.
The next step, fully autonomous vehicles at Levels 4 and 5, is being intensively researched and tested. Pilot projects with autonomous shuttles and special use cases are showing promising results. However, experts assume that fully autonomous road traffic could only become a reality in Germany in the 2030s, as technological, legal and infrastructural challenges still need to be overcome.
The situation is quite different in America. Here, autonomous vehicles are already on the road. I had the opportunity to enjoy such a driving experience myself.
15.08.2024 | While drivers in Germany can still only dream of 100 percent autonomous driving, I have already experienced this future mobility first hand in America. When I recently drove a Tesla with a friend for about 50 miles through town and country, completely autonomously and in every imaginable traffic situation, I was particularly impressed by the ease with which this groundbreaking paradigm shift is taking place.
At first I was very impressed with how smoothly the computer handled the driving. After just a few miles I no longer even noticed that the car was driving itself. It drove just like I drive myself: there was no jerking or unnecessary braking, no delayed starting or sudden stopping. The car drives as if it were the most normal thing in the world and cars had always been autonomous.
The eight cameras do a good job. They recognize that in America you can turn right on a red light when traffic allows. The computer makes the right decisions when turning left without a dedicated left-turn light, and the car also correctly handled a flashing yellow light. At four-way stops it behaved absolutely correctly and drove when it was its turn. Finally, parking also worked smoothly.
During the entire journey, there was only one tiny situation, when reversing out of a parking space, that required human intervention, and it was resolved in no time. The driver must remain alert throughout the journey. If he is not, or if he looks at the monitor for too long, for example, the car stops to warn him. If this happens several times, the autonomous mode is blocked for a week – an automotive disciplinary measure, so to speak.
All in all, I wasn't nervous at all and certainly not scared. It was really quite remarkable. Would I buy a car like that myself? I don't think so (yet?). Because if I'm sitting behind the wheel anyway, I might as well do the driving myself. That way I save the $8,000 for buying the autonomous function for the life of the car, or the $100 monthly license fee for the service.
30.03.2023 | With the Scala 1 and 2 from Valeo, conditionally automated driving in traffic jams became a reality. Now, with the Scala 3 lidar, the area of application in private vehicles can be expanded thanks to a larger field of use and support for higher speeds.
31.03.2021 | Researchers at TUM have developed a new early warning system for autonomous vehicles that uses artificial intelligence (AI) based on recurrent neural networks to learn from thousands of real traffic situations. In today's self-driving development vehicles, the system can warn of a potentially critical situation seven seconds in advance with more than 85 percent accuracy – something self-driving cars cannot yet manage on their own. The study was carried out by TUM together with the BMW Group.
To make cars self-driving, many developers rely on sophisticated models with which the cars can assess the behavior of all road users. However, there are complex, unforeseen situations in which such models are currently still inadequate.
The system was developed by a team from the Technical University of Munich (TUM) led by Prof. Eckehard Steinbach, holder of the Chair of Media Technology and member of the board of directors of the Munich School of Robotics and Machine Intelligence (MSRM) at TUM. Its artificial intelligence enables it to learn from previous situations in which autonomous test vehicles reached their system limits in real traffic. In such situations, people take back control of the vehicle, either because they decided to do so for safety reasons or because the car asked them to intervene.
The new technology uses cameras and sensors to capture the surroundings and records the vehicle and environmental state – for example the position of the steering wheel, the condition of the road, the weather, the speed and the visibility. The AI, based on recurrent neural networks (RNN), learns to recognize patterns in this data. If it recognizes a pattern in a new driving situation that has already overwhelmed the automated control under similar circumstances in the past, it warns the driver of the potentially critical situation at an early stage.
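The article gives no implementation details, so the following is only a minimal sketch of how such an RNN-based takeover predictor could look: a short history of driving-state vectors is classified as "likely to need human intervention soon" or not. The feature set, network size and training setup are assumptions for illustration, not the TUM/BMW implementation.

```python
# Minimal, illustrative sketch of an RNN-based takeover-warning model.
# NOT the TUM/BMW implementation: feature set, architecture and
# hyperparameters are assumptions for demonstration only.
import torch
import torch.nn as nn

class TakeoverWarningRNN(nn.Module):
    """Classifies a short history of driving-state vectors as
    'likely to require human intervention soon' or not."""
    def __init__(self, n_features: int = 8, hidden_size: int = 64):
        super().__init__()
        # GRU over the last few seconds of vehicle/environment state
        self.rnn = nn.GRU(n_features, hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time_steps, n_features), e.g. steering angle,
        # speed, visibility, road condition, weather codes ...
        _, h_n = self.rnn(x)              # final hidden state
        return self.head(h_n.squeeze(0))  # raw logit per sequence

# Toy usage: 32 sequences of 70 time steps with 8 features each
model = TakeoverWarningRNN()
batch = torch.randn(32, 70, 8)
labels = torch.randint(0, 2, (32, 1)).float()  # 1 = intervention followed
loss = nn.BCEWithLogitsLoss()(model(batch), labels)
loss.backward()  # one illustrative training step (optimizer omitted)
```

In practice, each recorded intervention from the test fleet would provide one labelled sequence of this kind.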
“In order to make vehicles more autonomous, many of the previous methods examine what cars have understood about traffic so far and then improve the models the cars are based on. The big advantage of our technology is that we completely ignore the opinion of the car and instead look purely at the data of what is actually happening and find patterns,” says Professor Steinbach. “In this way, the AI also discovers potentially critical situations that models may not recognize, or have not yet recognized. Our system thus offers a safety function that knows when and where the cars have weaknesses.”
The researchers developed the technology together with the car manufacturer BMW and tested it on public roads with BMW's automated development vehicles. Around 2,500 situations in which the drivers had to intervene were evaluated. The study found that potentially critical situations could be predicted with 85 percent accuracy up to seven seconds before they occurred.
A large amount of data is required for the technology to work, because the AI can only recognize and predict encounters with the system limits that have already occurred. In view of the large number of development vehicles, this data is generated almost by itself. Study author Christopher Kuhn says: “Every time a potentially critical situation arises during test drives, we gain a new training example.” Storing the data centrally makes it possible for every vehicle to learn from the records of the entire fleet.
15.10.2020 | Modern driver assistance systems use radar technology. There are many systems for adaptive cruise control, lane-change support, collision avoidance and the detection of pedestrians and cyclists. All of them pave the way to the self-driving car. Fraunhofer FEP has now developed a radar sensor that can be integrated into the car headlight.
The installation of an ever increasing number of sensors limits the availability of exposed measuring points; there is hardly any space left for mounting additional sensors. The Fraunhofer Institute for Organic Electronics, Electron Beam and Plasma Technology FEP therefore developed a radar sensor that can be integrated into the headlight. The underlying project, funded by the Federal Ministry of Education and Research (BMBF), is called Radarglass.
The integration of the radar sensors in the headlights of a vehicle protects them from snow, ice and rain. The outer vehicle shell is not affected. Designers of future generations of automobiles are no longer restricted by additional sensor structures on the vehicle.
Together with project partners, the researchers investigated which thin-film system can steer radar waves with little loss without restricting the lighting function of the headlamp. They developed a thin, transparent functional coating for an assembly mounted in the headlight, with which the radar beams can be specifically shaped and directed.
The coating can manipulate the beam propagation differently depending on the type of use. When detecting pedestrians, for example, the radar beams are directed to the side; like an eye, the beam shape can be adjusted to the near or far range. To direct the propagation and shape of the radar beams, small areas of the coating are precisely structured using lasers so that they act as antennas for the radar waves.
“As part of the project, we developed a thin-film system that is almost transparent in the visible range and can also shape high-frequency waves. The manufacturing process has been optimized to such an extent that the coating leaves the color of the light source unchanged and withstands temperature fluctuations between −30 °C and +120 °C,” explains Dr. Manuela Junghahnel, project manager at Fraunhofer FEP.
A demonstrator was designed for the long range. With it, the radar can be focused with an antenna gain of 20 dBi into a narrow beam width of 5° in the direction of travel. Obstacles up to 300 m away can be detected with the radar.
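As a rough plausibility check, the standard radar range equation shows how strongly antenna gain influences detection range. The transmit power, target cross-section and receiver sensitivity below are assumed, illustrative values, not figures from the Radarglass project; only the 20 dBi gain and the 76–81 GHz band are taken from the article.

```python
# Rough plausibility sketch using the standard radar range equation.
# Pt, sigma and Pmin are ASSUMED illustrative values, not project data.
from math import pi

def max_range_m(pt_w, gain_dbi, freq_hz, rcs_m2, pmin_w):
    """Maximum detection range for a monostatic radar (same antenna
    for transmit and receive), from the classic radar equation."""
    g = 10 ** (gain_dbi / 10)          # linear antenna gain
    lam = 3e8 / freq_hz                # wavelength in metres
    return (pt_w * g**2 * lam**2 * rcs_m2 / ((4 * pi) ** 3 * pmin_w)) ** 0.25

# Assumed values: 10 mW transmit power, 77 GHz, 10 m^2 car-sized target,
# -120 dBm effective sensitivity after FMCW processing gain;
# antenna gain 20 dBi as in the demonstrator -> roughly 300 m.
print(round(max_range_m(0.01, 20.0, 77e9, 10.0, 1e-15), 1), "m")
```

With these assumptions the result lands in the same order of magnitude as the 300 m quoted above; the real figure depends on the actual link budget of the sensor.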
In addition to Fraunhofer FEP, the Radarglass project also involves the Institute for High Frequency Technology at RWTH Aachen and the Fraunhofer Institute for Laser Technology ILT. The experts from RWTH Aachen simulated the antenna layout and verified it with measurements in the 76 GHz to 81 GHz band; in this way, the suitability and performance of the radar reflector could be determined. The scientists at Fraunhofer ILT developed a high-precision laser ablation process for structuring the antennas on the coating.
Radarglass opens up many applications in the automotive and supplier industries, and the current development trend towards autonomous vehicles is expected to provide a wide range of further impulses. In addition to licensing agreements, those responsible are seeking further cooperation with industry in order to be able to produce the radar sensors in series.
08.09.2020 | In the area of assistance systems for self-driving cars, many manufacturers are relying on lidar technology. Ibeo Automotive Systems has developed a new lidar sensor for this purpose: the Ibeonext uses a new type of photon laser measurement technology and generates a high-resolution 3D point cloud for the reliable detection of objects.
14.08.2020 | When will self-driving cars really be available across the board? Will the paradigm shift to autonomous driving be within reach in 2020, or will it remain "just" another top topic in the automotive industry alongside electromobility? Minebea Mitsumi develops products for autonomous vehicles. The automotive supplier shows which level we have reached and which intermediate stations the journey into the mobility of the future leads through.
28.07.2020 | The development time for self-driving cars is to be shortened from weeks to hours. To achieve this, Continental and Nvidia are building a high-performance computer cluster based on the DGX AI system. Virtual data generation, artificial intelligence and the simulation of autonomous vehicles are the future core tasks of the automotive industry's most powerful supercomputer.
01.07.2020 | Modern electrical and electronic systems combine the computing power for complex applications with the flexibility to evolve through updates and upgrades. Autosar Adaptive and high-performance computers (HPC) are the key to such in-vehicle applications. Version 9.5 of Vector's PREEvision development environment facilitates integration into classic architectures and simplifies the design of such systems.
28.02.2020 | In the car of the future, the need for fail-safe electrical and electronic systems is increasing when it comes to electromobility and autonomous or automated driving. Infineon is now presenting an important new development for the automated vehicle: the second-generation Aurix microcontrollers (TC3xx) were the first embedded safety controllers worldwide to receive certification for the highest functional safety level in cars (ASIL D) according to the latest version of the ISO 26262 standard from SGS-TÜV Saar.
The standard describes a globally binding procedure for the development and production of safety-critical systems in vehicles, as is required in automated driving. The current version has been in effect since December 2018. It has replaced the original version from 2011.
"We defined the second-generation Aurix security architecture when the new version of ISO 26262 was not yet available," says Peter Schäfer, Vice President and General Manager Automotive Microcontroller at Infineon. “Nevertheless, it meets all the requirements for an ASIL-D safety controller for automated driving.
The basis is our holistic safety approach, which is reflected in a sophisticated, robust architecture. The second-generation Aurix microcontrollers provide the safety and contribute to the confidence required to bring automated driving onto the road,” Schäfer continued.
The Aurix family from Infineon has long been successful in safety-related applications. Computing platforms for automated driving rely on Aurix as the safety host controller. In addition, the microcontrollers are used for processing sensor data in radar systems, in engine and operating controls, brakes, airbags and steering, in central gateways and domain control units, and in hybrid and electric cars.
In addition to the automotive sector, the product family is increasingly in demand in other safety-critical applications, for example in commercial vehicles and in robotics. For this reason, Infineon is planning certification according to IEC 61508 as the next step. This is a cross-industry, fundamental standard for functional safety that serves as the basis for application-specific standards.
Aurix TC3xx devices have up to six processor cores, each with a 300 MHz clock frequency, and up to four of them have an additional lockstep core. With around 3,000 DMIPS, Aurix sets the standard for computing power among controllers implementing ASIL-D safety. There is also a distributed memory protection system and secure internal communication buses. In addition, the safety controller allows the integration of software with different safety levels from different sources, so that multiple operating systems and applications such as braking, steering, airbag and driver assistance systems can be hosted on a common platform for the automated vehicle.
Infineon recently launched the second-generation Aurix TC3xx family for a wide range of applications on the distribution market. The company offers extensive hardware and software support to enable customers to implement it quickly and efficiently. In addition to entry-level kits and evaluation boards, there are a number of application kits. The Aurix Development Studio is a free toolkit for software development and testing, and in the Aurix forum developers can exchange ideas with one another.
17.09.2019 | Mitsubishi Electric has established its High-Precision Positioning Systems business unit, headquartered at the company's German branch in Ratingen. The new business unit offers German and European customers key technologies to accelerate the introduction of centimeter-accurate autonomous driving and safe driving assistance.
13.09.2019 | The Joyson Group is presenting a cockpit concept in Frankfurt for the first time that combines expertise from all three divisions of the group. In addition to solutions for active and passive safety, the demonstrator features operating systems/HMI as well as backlit surfaces and a joint air vent. The guiding principle of the cockpit concept is an "adaptive interior": an interior that is also designed for the requirements of autonomous driving.
The concept study, developed by Joyson Safety Systems, includes a steering wheel suitable for different levels of autonomous driving which, thanks to the steer-by-wire technology integrated in the steering wheel body, does not require a fixed steering column connection. In the highly automated driving mode, the concept offers the option of folding the steering wheel away and using a large touchscreen productively while driving.
A major challenge in autonomous driving is the safe transfer of vehicle control. This is accompanied by the need to monitor the driver's condition during the handover. This is ensured by the interaction of three technologies: the camera-based Driver Monitoring System (DMS), which senses head inclination and direction of gaze; Hands-on Detection (HoD), in which sensors in the steering wheel record the position of the hands and evaluate vital functions such as pulse and skin conductivity; and a light bar positioned in the driver's field of vision, which provides visual feedback in critical driving situations or indicates the degree of autonomous driving.
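How these three signals might be combined into a handover decision is not detailed in the article; the following is a purely hypothetical sketch with assumed signal names and thresholds, meant only to illustrate the interaction of gaze tracking, hands-on detection and the warning light bar, not the Joyson implementation.

```python
# Hypothetical sketch of combining driver-monitoring signals for a
# control handover. Signal names and thresholds are assumptions, not
# the Joyson implementation.
from dataclasses import dataclass

@dataclass
class DriverState:
    eyes_on_road: bool      # from camera-based Driver Monitoring System (DMS)
    hands_on_wheel: bool    # from Hands-on Detection (HoD) sensors
    pulse_bpm: float        # vital sign measured via the steering wheel rim

def handover_decision(state: DriverState) -> str:
    """Return which feedback the light bar should give before the
    vehicle hands control back to the driver."""
    if not state.eyes_on_road:
        return "light bar: pulsing red - look at the road"
    if not state.hands_on_wheel:
        return "light bar: amber - place hands on the steering wheel"
    if not (40 <= state.pulse_bpm <= 180):   # crude plausibility window
        return "light bar: red - driver vitals implausible, keep automation"
    return "light bar: green - driver ready, control can be handed over"

print(handover_decision(DriverState(True, False, 72.0)))
```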
Joyson's system supplier expertise is also reflected in the steering wheel: the multifunctional steering wheel switches, with a closed surface and active haptic feedback, were contributed to the concept by the Preh Group. The close cooperation between the various Joyson divisions is particularly evident in the center console: the joint vent with invisible slats and the backlit surface decor come from Joysonquin, while the touch module integrated into the center console comes from Preh.
Joyson Safety Systems also presents a seat with a special motorized seat belt and a belt-integrated airbag, which provides additional safety even in alternative seating positions, e.g. lying down, and can replace the conventional front airbag. Together with the illuminated belt buckle and the belt presenter, these technologies help to improve occupant protection and comfort. A sensor for seat occupancy recognition, required for coordinating belt tensioning and airbag deployment, is also part of the seat, as is a pre-crash side airbag.
17.07.2019 | Mitsubishi Electric's Information Technology R&D Center in Japan has developed a new sensor technology that enables highly precise detection of a vehicle's surroundings even in dense fog or heavy rain. The technology is intended to enable robust functioning of autonomous and assisted driving systems even in harsh weather conditions, under which the detection accuracy of conventional sensors decreases.
14.06.2019 | The mobility industry is moving rapidly towards the so-called era of "smart mobility", which in principle will also mean networked and sometimes even completely autonomous driving. Mitsubishi Electric is developing multi-layer cyber defense technology for connected cars in this age of dynamically developing smart mobility.
01.02.2019 | With the VN4610 network interface, Vector is introducing its first dedicated solution for IEEE 802.11p and CAN (FD)-based applications. As a connection to the CANoe.Car2x test tool, the interface quickly brings 802.11p-based control units to series production readiness.
The VN4610 is a network interface with a USB connection for access to IEEE 802.11p and CAN (FD) networks. When implementing their Car2x/V2X applications, users benefit from the simple reception and transmission of IEEE 802.11p messages. The received IEEE 802.11p messages are transferred to the application synchronously with the CAN (FD) messages, and the built-in GNSS receiver provides the GNSS time and the current GNSS position.
The VN4610 meets all hardware requirements for reliable testing of car manufacturers' DSRC applications via IEEE 802.11p radio channels. For analysis, it forwards all messages received on the two radio channels unfiltered to the test tool, such as CANoe.Car2x. The advantage for the developer is that even messages that would be discarded in a control unit due to timing, geo-information or Car2x/V2X protocol errors are analyzed. In addition, latency measurements can be carried out because the time stamps of the messages on the bus channels are synchronized.
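The basic idea behind such a latency measurement is that both channels share one synchronized time base, so reception and transmission times can simply be subtracted. The sketch below illustrates this with a hypothetical log record; the message fields and matching key are assumptions, not the CANoe.Car2x or VN4610 API.

```python
# Basic idea of a Car2x latency measurement over synchronized timestamps.
# The record format and matching key are hypothetical; this is not the
# CANoe.Car2x or VN4610 API.
from typing import NamedTuple

class Msg(NamedTuple):
    station_id: int     # identifier linking an 802.11p message to its CAN echo
    timestamp_s: float  # timestamp from the common, synchronized time base

def latencies(rx_80211p: list[Msg], tx_can: list[Msg]) -> dict[int, float]:
    """Latency per station: time from 802.11p reception to the related
    CAN (FD) transmission, valid because both channels share one clock."""
    first_rx = {}
    for m in rx_80211p:
        first_rx.setdefault(m.station_id, m.timestamp_s)
    return {m.station_id: m.timestamp_s - first_rx[m.station_id]
            for m in tx_can if m.station_id in first_rx}

# Toy data: station 7 received over 802.11p at t=1.0 s, echoed on CAN at t=1.5 s
print(latencies([Msg(7, 1.0)], [Msg(7, 1.5)]))  # {7: 0.5}
```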
Autonomous driving (full self-driving) refers to the ability of a vehicle to drive independently, without a human having to actively intervene and, of course, without causing accidents. The vehicle uses a combination of sensors, cameras, radar systems and artificial intelligence to recognize its surroundings, make decisions and get safely from one point to another. The aim is to relieve the driver, increase road safety and improve comfort. While the first autonomous vehicles are already in public traffic in the USA, this technology is still in the testing phase in Germany and only available to a limited extent.
The five stages or levels of autonomous driving define the degree of vehicle autonomy:
Level 1: assisted driving – individual systems such as adaptive cruise control support the driver.
Level 2: partial automation – the vehicle can steer, accelerate and brake in defined situations, but the driver must monitor it at all times.
Level 3: conditional automation – the driver may temporarily hand over the driving task under certain conditions.
Level 4: high automation – the vehicle handles defined routes or operating areas entirely on its own.
Level 5: full automation – the vehicle drives itself in all situations, without a driver.
While the first autonomous vehicles are already in public traffic in the USA, this technology is still in the testing phase in Germany and only permitted to a limited extent under certain conditions. Vehicles with Level 3 autonomy, such as the Mercedes-Benz EQS, are allowed to drive autonomously on certain roads, such as motorways. However, fully autonomous driving (Levels 4 and 5) is not yet permitted across the board and is currently being tested in special pilot projects and test areas.
Source: This article is based on information from the following companies: Continental, Fraunhofer, Ibeo, Infineon, Joyson, Minebea Mitsumi, Mitsubishi Electric, Nvidia, TU Munich, Valeo, Vector.
Angela Struck is editor-in-chief of the development scout, a freelance journalist and managing director of Presse Service Büro GbR in Ried.