Tactile Mobility launches virtual sensor solution to prevent ‘runovers’ (20 July 2021)

Tactile Mobility has announced the launch of its first-of-its-kind runover virtual sensor solution, which gives vehicles the ability to identify an initial runover of an object in real time and prevent a full runover.

The software-only solution uses “sense of touch” technology to detect objects of different heights, sizes, shapes and materials on the road – both organic and hard – such as a human body or road debris. The virtual sensor then prevents the vehicle from running over the object, which could harm human life or damage the vehicle. The safety-level virtual sensor will be added on top of the Tactile Processor Platform, which already includes the company’s suite of virtual sensors for grip estimation, tire health, surface sensing, vehicle health and more.

According to NHTSA, hundreds of children are killed and thousands are injured every year in nontraffic crashes in parking lots, driveways and private roadways. Runover virtual sensors in both autonomous vehicles and vehicles with ADAS can help reduce this death toll by sending signals that alert the car and the driver at distinct stages of a runover. The new virtual sensor enables another critical safety function, allowing vehicles to sense the road, identify the type of material under their tires and flag an initial runover, preventing them from fully running over objects in a way that could lead to a fatal incident.

Three in four Americans say they are afraid to ride in fully self-driving vehicles. For autonomous vehicles to be trusted by the mainstream, they must be far safer than human-driven vehicles. To achieve this, they must respond to vehicle-road dynamics as well as – or better than – human drivers do; they must be able not only to “see” the road that lies ahead, but also to “feel” the friction, roughness, curves, grades, distresses and objects on the road under their tires. Runover sensors enable vehicles to sense the road and react to obstacles, hazards and vulnerable objects, significantly mitigating the damage.
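Tactile Mobility has not published the internals of the virtual sensor, so the following is only a minimal sketch of the general idea of inferring a runover stage from existing chassis signals. The signal names, thresholds and staging logic are invented for illustration and are not the company’s algorithm.

```python
from dataclasses import dataclass
from enum import Enum

class RunoverStage(Enum):
    NONE = 0
    INITIAL_CONTACT = 1   # a front wheel has begun to climb an object
    FULL_RUNOVER = 2      # a rear wheel is passing over the same object

@dataclass
class WheelSample:
    wheel: str              # e.g. "front_left"
    speed_rpm: float
    vertical_accel_g: float

def classify(samples, accel_threshold_g: float = 0.35,
             speed_drop_ratio: float = 0.10) -> RunoverStage:
    """Flag an initial runover when one wheel shows a vertical-acceleration
    spike plus a momentary speed drop that the other wheels do not share
    (an assumed, highly simplified signature)."""
    mean_speed = sum(s.speed_rpm for s in samples) / len(samples)
    for s in samples:
        spike = s.vertical_accel_g > accel_threshold_g
        drop = s.speed_rpm < mean_speed * (1.0 - speed_drop_ratio)
        if spike and drop:
            return (RunoverStage.INITIAL_CONTACT if s.wheel.startswith("front")
                    else RunoverStage.FULL_RUNOVER)
    return RunoverStage.NONE

if __name__ == "__main__":
    frame = [
        WheelSample("front_left", 395.0, 0.58),   # climbing over an object
        WheelSample("front_right", 468.0, 0.07),
        WheelSample("rear_left", 465.0, 0.05),
        WheelSample("rear_right", 466.0, 0.06),
    ]
    if classify(frame) is RunoverStage.INITIAL_CONTACT:
        print("ALERT: initial runover detected - request brake hold")
```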

ibeo introduces ibeo.Reference system solution for validating ADAS sensors (20 May 2021)

Ibeo Automotive Systems GmbH has developed a new automated total solution for validating the sensors required for ADAS and autonomous driving. The ibeo.Reference tool chain also automates and standardizes the manual data-labeling process, which is conventionally very personnel- and time-intensive, ensuring considerable time and cost savings with consistently high performance.

Almost all conventional sensor systems can be referenced. With the ibeo.Reference system, Ibeo is responding to strong demand from OEMs and Tier 1 suppliers in the race for ever more powerful systems.

Today and in the future, advanced driver assistance systems (ADAS) and autonomous driving (AD) systems must be validated in a complex process before they can ultimately be installed in series-production vehicles. Regardless of the sensor technology (camera, radar or LiDAR), reference data is required for each system and must be compared with the data acquired during the test process. Ibeo’s recording system is installed on the vehicle alongside the manufacturer’s prototype system and records data in parallel. It can also be used to create scenarios for ADAS or AD algorithm development.

The data recorded in validation drives can run to the low-to-mid thousands of hours. And for every hour of test driving, manual evaluation results in up to 300 hours of labeling activity – a Sisyphean task that is not only tedious and costly but can also be inaccurate in places.
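To put the 300:1 ratio quoted above into perspective, here is a quick back-of-the-envelope calculation; the campaign size and annual working hours below are assumed round numbers, not figures from Ibeo.

```python
# Illustrative arithmetic only: the 300:1 ratio comes from the article,
# the other figures are assumptions.
drive_hours = 3_000                  # assumed size of a validation campaign
labeling_ratio = 300                 # manual labeling hours per hour of driving
manual_hours = drive_hours * labeling_ratio
person_years = manual_hours / 1_800  # assuming ~1,800 working hours per year

print(f"{manual_hours:,} labeling hours ≈ {person_years:,.0f} person-years")
# 900,000 labeling hours ≈ 500 person-years
```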



With the new ibeo.Reference toolchain, the user receives all of the necessary building blocks for sensor validation from a single source:

  • The recording system, consisting of hardware and software, for recording the reference data, which now also supports selected third-party LiDAR sensors for the first time
     
  • ibeo Auto Annotation as scalable, cloud-compatible software for automated creation of labeled reference data
     
  • Editors for further processing of reference data to ground-truth data

To classify the large amounts of data recorded, the industry traditionally uses outsourced service providers. In many cases, hundreds of employees manually evaluate every meter of road to determine whether the sensor being tested has detected cars, people, lane markings or guard rails, for example, to reconcile the data.

OmniVision launches advanced image sensor for automotive viewing cameras (2 June 2020)

OmniVision Technologies, a developer of advanced digital imaging solutions, announced the OX03C10 ASIL-C automotive image sensor – the world’s first for viewing applications to combine a large 3.0-micron pixel size with 140 dB high dynamic range (HDR) and best-in-class LED flicker mitigation (LFM) performance with minimized motion artifacts. It is also the first viewing image sensor with HDR and LFM that can deliver 1920 x 1280 resolution at 60 frames per second (fps), enabling greater design flexibility and faster camera-view switching for drivers. Additionally, the OX03C10 has the lowest power consumption of any 2.5 MP LFM image sensor – 25% lower than the nearest competitor – along with the industry’s smallest package size, enabling cameras that run continuously at 60 fps to be placed in even the tightest spaces to meet stringent styling requirements.

Basic image processing capabilities were also integrated into this sensor, including defect pixel correction and lens correction. Furthermore, the integration of OmniVision’s industry-leading HALE (HDR and LFM engine) combination algorithm uniquely provides top HDR and LFM performance simultaneously. With such industry-leading features, the OX03C10 provides the best image quality for automotive viewing applications, including rearview cameras (RVC), surround view systems (SVS), camera monitoring systems (CMS) and e-mirrors.

Hyundai Mobis develops radar-based rear seat passenger detection system (23 March 2020)

Hyundai Mobis announced that it has successfully developed a radar-based Rear Occupant Alert (ROA) system for detecting rear-seat passengers and plans to propose the system to global automakers.

The ROA is designed to prevent passengers from being left unattended in rear seats. Previously, weight sensors in child seats or ultrasonic sensors were generally used; Hyundai Mobis vastly improved detection accuracy by replacing them with a radar sensor. The system is expected to help prevent the heatstroke deaths and other accidents that occur every summer when children are left unattended in vehicles.

If a passenger is left in the rear seat, the system alerts the driver with a warning sound when the door is closed, or through the instrument panel or a smartphone notification. The system developed by Hyundai Mobis is reported to have secured electromagnetic reliability, so that it works normally near high-voltage lines and railroad tracks, and to be precise enough to distinguish adults, infants and pets. Hyundai Mobis plans to develop a radar capable of measuring passengers’ heartbeats and to expand the biometric function this year.
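Hyundai Mobis has not detailed the alert logic beyond the stages mentioned above, so the snippet below is only a sketch of how such staged alerting might be wired up; the stage names, time limit and data fields are assumptions, not the supplier’s specification.

```python
from dataclasses import dataclass

@dataclass
class CabinState:
    doors_locked: bool
    driver_present: bool
    rear_occupant_detected: bool   # radar return classified as a living occupant
    minutes_since_exit: float

def rear_occupant_alert(state: CabinState) -> list:
    """Staged alerting: chime at door close, then instrument panel warning,
    then a smartphone notification if the occupant remains (assumed escalation)."""
    actions = []
    if not state.rear_occupant_detected or state.driver_present:
        return actions
    actions.append("chime_on_door_close")
    if state.doors_locked:
        actions.append("instrument_panel_warning")
    if state.minutes_since_exit >= 1.0:
        actions.append("smartphone_notification")
    return actions

if __name__ == "__main__":
    print(rear_occupant_alert(CabinState(True, False, True, 2.0)))
    # ['chime_on_door_close', 'instrument_panel_warning', 'smartphone_notification']
```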


Hyundai and Kia develop ICT connected shift system (20 February 2020)

  • Information and communication technology (ICT) automatically shifts to optimal gear based on road and traffic conditions ahead
  • Improves driving comfort and fuel efficiency by minimizing unnecessary shifts through predictive gear-shifting control system
  • Future developments to include consideration of driver’s preferences and traffic signal status
Hyundai Motor Company and Kia Motors Corporation announced that they have developed the world’s first predictive Information and Communication Technology (ICT) Connected Shift System, which enables the vehicle to shift automatically to the optimal gear after identifying the road and traffic conditions ahead. Hyundai and Kia plan to apply the technology in future vehicles. During system development, the companies filed about 40 major patents in South Korea and abroad.

While existing automatic-shift technologies, such as Smart Drive Mode – available on most current Hyundai and Kia models – depend on the driver’s selected preferences, the ICT Connected Shift System is the first to shift gears automatically according to road and traffic conditions.

    ICT Connected Shift System uses intelligent software in the Transmission Control Unit (TCU) that collects and interprets real-time input from underlying technologies, including 3D navigation equipped with a precise map of the road as well as cameras and radar for smart cruise control. The 3D navigation input includes elevation, gradient, curvature and a variety of road events as well as current traffic conditions. Radar detects the speed and distance between the vehicle and others, and a forward-looking camera provides lane information.

Using all of these inputs, the TCU predicts the optimal shift scenario for the real-time driving situation through an artificial intelligence algorithm and shifts the gears accordingly. For example, when a relatively long slowdown is expected and the radar detects no speed irregularities from the car ahead, the transmission clutch temporarily switches to neutral to improve fuel efficiency.
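The actual TCU logic is an AI algorithm whose details Hyundai and Kia have not disclosed; as a rough illustration of the kind of rule the article’s examples describe, here is a toy decision function (all thresholds and field names are assumptions):

```python
from dataclasses import dataclass

@dataclass
class DrivingContext:
    # Inputs named after the article's sources: 3D navigation, radar, camera.
    distance_to_slowdown_m: float     # from map and traffic data
    gradient_pct: float               # road gradient ahead (negative = downhill)
    lead_vehicle_closing_mps: float   # relative speed to the car ahead
    merging_onto_highway: bool

def select_shift_strategy(ctx: DrivingContext) -> str:
    """Toy rules echoing the article's examples, not the production algorithm."""
    if ctx.merging_onto_highway:
        return "sport_mode"            # hold lower gears to ease merging
    long_coast = ctx.distance_to_slowdown_m > 500
    lead_steady = abs(ctx.lead_vehicle_closing_mps) < 0.5
    if long_coast and lead_steady and ctx.gradient_pct <= 0:
        return "neutral_coast"         # switch to neutral to save fuel
    if ctx.gradient_pct < -3:
        return "engine_brake"          # downshift before downhill stretches
    return "normal_shift_map"

print(select_shift_strategy(DrivingContext(800.0, 0.0, 0.2, False)))  # neutral_coast
```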

    When Hyundai and Kia tested a vehicle with an ICT Connected Shift System on a heavily curved road, the frequency of shifts in cornering was reduced by approximately 43 percent compared to vehicles without the system. Accordingly, the system also reduced the frequency of brake operation by approximately 11 percent, thereby minimizing driving fatigue and brake wear.

    When rapid acceleration was required to enter a highway, the driving mode automatically switched to Sport Mode at the merge, making it easier to join the traffic flow. After merging with traffic, the vehicle automatically returned to its original driving mode, enabling safe and efficient driving.

In addition, engine braking was applied automatically when the accelerator pedal was released, based on detected speed bumps, downhill slopes and upcoming speed-limit changes. The front radar detected changes in the distance to the car ahead so that the appropriate gear could be selected automatically, improving driving quality.

The system also complements autonomous driving technology, which is developing rapidly. The ICT Connected Shift System will deliver both improved fuel efficiency and a stable driving experience in the era of autonomous vehicles by providing improved performance in response to real-time road and traffic conditions.

    Hyundai and Kia are planning to further develop the ICT Connected Shift System into an even more intelligent transmission technology that can communicate with traffic signals based on LTE or 5G communication and identify drivers’ tendencies, resulting in further refinement of gear-shift control.

AISIN and Vayyar Imaging partner to develop exterior sensing solutions for vehicles (8 January 2020)

Aisin Seiki and Vayyar Imaging, a leader in 4D imaging sensor technology, are collaborating to provide high-resolution short-range radar (SRR) for exterior vehicle sensing.

    By providing a single-chip radar sensor, Vayyar will contribute to the safer functionality and advanced control that Aisin Seiki aims for.

    The collaboration will combine AISIN’s in-vehicle system expertise with Vayyar’s 4D high-resolution short-range radar to develop exterior sensing capabilities for vehicles, such as blind spot detection for low-speed driving support.

ON Semiconductor and Pony.ai collaborate on image sensing and processing technologies for AVs (7 January 2020)

ON Semiconductor and Pony.ai announced that they will collaborate on developing next-generation image sensing and processing technologies for machine vision to enhance the reliability and scalability of mass-produced autonomous vehicles.

    Pony.ai is working to revolutionize the future of transportation by building safe and reliable technology for autonomous mobility. Since late 2018, Pony.ai has pioneered autonomous mobility deployment to benefit people in both the US and China. Pony.ai is the first company to roll out daily robotaxi operation in China, starting in the city of Guangzhou in December 2018. In collaboration with Hyundai and Via, the company is also the first to launch and offer a robotaxi service to the general public in California, which has been operating in Irvine since November 2019.

ON Semiconductor has developed a leading position in the automotive sector through its innovative products and solutions in imaging, radar, LiDAR and ultrasonic sensing. The company is unique in offering all four sensor modalities key to autonomous driving’s perception systems. Pony.ai is leveraging ON’s image sensing and processing chips to capture and process a large amount of camera data every day with its global autonomous fleet. Utilizing data-driven algorithms, the next-generation image sensing and processing models could maximize the information extracted from cameras and vastly expand the perception capabilities of autonomous vehicles.

Arbe raises $32M to step up productization of first HD radar chipset for ADAS and AVs (17 December 2019)

Arbe, provider of a next-generation 4D imaging radar chipset solution enabling high-resolution sensing for ADAS and autonomous vehicles, announced the closing of $32 million in Series B funding from new and corporate venture (CVC) investors Catalyst CEL, BAIC Capital, AI Alliance (Hyundai, Hanwha, SKT) and MissionBlue Capital, and from earlier investors Canaan Partners Israel, iAngels, 360 Capital Partners, O.G. Tech Ventures and OurCrowd. Arbe will use the funding to move its breakthrough radar chipset into full production; the chipset generates an image 100 times more detailed than any other solution on the market today.

With the new funding, Arbe will focus on expanding its team to support global Tier 1 customers in moving radar systems based on Arbe’s development platform into full production. The delivery of radars based on Arbe’s proprietary chipset is a game changer for the automotive industry, as Arbe’s technology is the first to enable highly precise sensing in all environmental conditions. The radar produces detailed images and separates, identifies and tracks hundreds of objects at high horizontal and vertical resolution, over a long range and a wide field of view, enabling OEMs to offer uncompromised, all-conditions safety in next-generation cars with a sensor affordable enough for mass-market implementation.

    The technology developed by Arbe resolves some of today’s most pressing radar challenges, which include eliminating false alarms, processing massive amounts of information generated by 4D imaging in real time, and mitigating mutual radar interference. By achieving high-resolution object separation in both azimuth and elevation, Arbe supports safer and more accurate decisions for all levels of autonomous driving.

Velodyne Lidar announces “Puck 32MR” sensor for autonomous systems (13 August 2019)

Velodyne Lidar introduced the Puck 32MR sensor to address key markets in the autonomous industry. This product offers a cost-effective perception solution for low-speed autonomous markets including industrial vehicles, robotics, shuttles and unmanned aerial vehicles (UAVs). The Puck 32MR bolsters Velodyne’s robust portfolio of patented sensor technology, delivering rich perception data for mid-range applications.

    In addition to featuring Velodyne’s patented surround-view perception capability, the Puck 32MR boasts a range of 120 meters and a 40-degree vertical field of view to enable navigation in unfamiliar and dynamic settings. Generating a high-resolution point cloud with minimal noise in all light conditions, the Puck 32MR accurately detects crosswalks, curbs, vehicles, pedestrians, bicycles and obstacles for safe and efficient operation in roadway, commercial and industrial use cases. Along with outstanding perception performance, this sensor delivers reliability and durability in a compact form factor.

    The Puck 32MR is designed for power-efficiency to extend vehicle operating time within broad temperature and environmental ranges without the need for active cooling. The sensor uses proven 905 nanometer (nm), Class 1 eye-safe technology and is assembled in Velodyne’s state-of-the-art manufacturing facility. The Puck 32MR is designed for scalability and priced attractively for volume customers. Velodyne provides world-class technical support for the sensor across North America, Europe and Asia.

MY 2019 Ford Ranger equipped with radar for towing capabilities (1 August 2018)

    The new Ford Ranger is equipped with Blind Spot Information System with trailer coverage. After hooking up a trailer, radar housed in Ranger’s taillights monitors blind spots all the way to the back of the trailer. The system keeps drivers informed of the presence of a vehicle in the truck’s blind spots until that vehicle passes. Ranger can store up to three trailer profiles, including a trailer’s length, to let the radar system know how far back to provide warnings when another vehicle is traveling next to the trailer.
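Ford has not published how the stored trailer length feeds the warning logic; the minimal sketch below only illustrates the obvious geometric idea (warning zone length = truck blind spot plus trailer length), with made-up numbers rather than Ford calibration values.

```python
from typing import Optional

def blind_spot_zone_length_m(truck_zone_m: float,
                             trailer_length_m: Optional[float]) -> float:
    """Extend the radar warning zone to the back of the trailer when a
    trailer profile is active; without a trailer, warn over the truck zone only."""
    return truck_zone_m + (trailer_length_m or 0.0)

# Up to three stored trailer profiles (names and lengths invented for illustration).
trailer_profiles = {"box_trailer": 5.5, "boat_trailer": 7.3, "utility": 3.0}

for name, length in trailer_profiles.items():
    print(f"{name}: warn over {blind_spot_zone_length_m(3.0, length):.1f} m")
```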

In addition, the Blind Spot Information System can tell the driver when vehicles are in the truck’s blind spots even when no trailer is attached, and its cross-traffic alert technology can warn drivers of an oncoming vehicle when they are backing out of a parking spot. Blind Spot Information System is standard on Ranger XLT and Lariat models.

    Source: Ford

Valeo partners with Baidu’s Apollo driving platform (4 July 2018)

Valeo will contribute its expertise in sensor cleaning systems (essential for optimal sensor functioning), connectivity between autonomous vehicles, and control and optimization of air quality in the vehicle interior.

    Valeo will benefit from the full range of software, hardware and data tools (operating systems, high-precision positioning and HD mapping services, simulation engines, cloud, algorithms, etc.) provided by Apollo to its ecosystem partners through its open, reliable and secure platform. Launched in April 2017, Apollo accelerates the development, testing and deployment of autonomous vehicles through a collaborative approach.

    Source: Valeo

SADA project consortium works on sensor fusion for autonomous driving (23 April 2018)

    In the joint project SADA, a project consortium with the participation of the German Research Center for Artificial Intelligence (DFKI) developed solutions for the smooth fusion of heterogeneous sensor data from vehicles with data from the environment. At HANNOVER MESSE, April 23 to 27, 2018, the consortium will present the results of the project at the booth of the Federal Ministry for Economic Affairs and Energy (BMWi) – Hall 2, Stand C28 – using the robotic vehicle EO smart connecting car 2.

    SADA was funded by the BMWi with a total volume of 3.8 million euros from 1 February 2015 to 30 April 2018.

    Intelligent linking of heterogeneous sensor data in road traffic

The DFKI Robotics Innovation Center, under the direction of Prof. Dr.-Ing. Dr. h.c. Frank Kirchner, worked with BaseLabs GmbH, NXP Semiconductors, fortiss GmbH and ALL4IP TECHNOLOGIES GmbH & Co. KG, coordinated by Siemens AG, on a system that enables dynamic integration and evaluation of data from different, non-matched sensors. The adaptation and fusion process developed for this purpose recognizes in real time which data is available to it, selects the data relevant to the application, and can thus react quickly.

DFKI implements real-time sensor data fusion and data prioritization

    As a central system component, the Robotics Innovation Center has developed a back-end in the project, which is used to manage and process sensor data, as well as to correlate mobile and stationary sensors. The back-end manages formalized semantic metadata, describing the properties and contextual relationships of the available sensors and datasets, forming the basis for an automated ad-hoc fusion of heterogeneous sensor data. In addition, the DFKI scientists implemented an efficient and fail-safe communication infrastructure that ensures at all times that the data and processing capacities required to implement complex tasks are available in real time.
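The announcement does not spell out the SADA metadata schema; the sketch below only illustrates the kind of semantic sensor description and ad-hoc selection the back-end is said to perform, with invented field names.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class SensorDescription:
    sensor_id: str
    modality: str                    # "camera", "radar", "lidar", ...
    mobile: bool                     # vehicle-mounted vs. stationary infrastructure
    position: tuple                  # (lat, lon): fixed site or last known vehicle fix
    provides: set = field(default_factory=set)   # e.g. {"object_list", "occupancy"}

class SensorRegistry:
    """Holds formalized sensor metadata so a fusion job can be assembled
    ad hoc from whatever sensors happen to be available."""
    def __init__(self):
        self._sensors = {}

    def register(self, desc: SensorDescription) -> None:
        self._sensors[desc.sensor_id] = desc

    def select_for(self, needed: set, modality: Optional[str] = None):
        return [s for s in self._sensors.values()
                if needed <= s.provides and (modality is None or s.modality == modality)]

registry = SensorRegistry()
registry.register(SensorDescription("veh1_front_radar", "radar", True, (53.08, 8.80), {"object_list"}))
registry.register(SensorDescription("junction_cam_07", "camera", False, (53.09, 8.81), {"object_list", "occupancy"}))
print([s.sensor_id for s in registry.select_for({"object_list"})])
# ['veh1_front_radar', 'junction_cam_07']
```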

    Range extender for increased vehicle range and improved environmental awareness

In addition, the DFKI research area in SADA developed a so-called range extender, i.e. a trailer for electric cars equipped with additional batteries and sensors. These allow the vehicle to travel longer distances and record additional environmental data. The project partners will mark the completion of the project with their appearance at HANNOVER MESSE and demonstrate the results on the robotic vehicle EO smart connecting car 2, developed at the Robotics Innovation Center, which serves as the demonstration platform in SADA.

    Source: DFKI

ANSYS acquires Optis to enhance sensor simulation capabilities for autonomous vehicles (23 March 2018)

    ANSYS has entered into a definitive agreement to acquire OPTIS, a premier provider of software for scientific simulation of light, human vision and physics-based visualization. The acquisition of OPTIS will extend ANSYS’ industry-leading multiphysics-based portfolio into the increasingly important area of optical simulation.

    The transaction is expected to close in the second quarter of 2018. Management will provide further details regarding the transaction and its impact on the 2018 financial outlook after the closing.

    OPTIS develops physics-based software that simulates light and human vision. Other vendors consider the aesthetics of light, but OPTIS bases its solutions on real-world physics to give the most accurate simulation possible. Its customers include a who’s who of the automotive industry, including Audi, Ford, Toyota and Ferrari.

    Engineering simulation plays an increasingly important role in sensor development as the industry races to develop safe autonomous vehicles. ANSYS has created simulation solutions specifically for autonomous vehicle sensor development, and will extend its market leadership with the OPTIS acquisition. The addition of OPTIS’ capabilities to the ANSYS portfolio will result in a comprehensive sensor solution for the market, covering visible and infrared light, electromagnetics and acoustics for camera, radar and lidar.

    OPTIS has also developed a photo-realistic virtual reality and closed-loop simulation platform, which will help speed the development of autonomous vehicles. Using this VR backbone – combined with other ANSYS solutions – automotive manufacturers can simulate the environment driverless vehicles are navigating, including road conditions, weather and one-way streets.

    Source: ANSYS

Autoliv to develop Level 3 ADAS systems for Geely (8 March 2018)

Autoliv and Zenuity have been selected to develop and produce the first Level 3 advanced driver assistance systems for Geely.

    Autoliv was selected as supplier for Geely’s Level 3 project, which includes ADAS electronic control units and software, radar systems, as well as mono vision and stereo vision camera systems. Geely selected Autoliv, including Zenuity, for its hardware and software capabilities and flawless execution in China.

    “This award is a milestone for Autoliv, marking a new customer for the Zenuity software solution and also an expansion in the rapidly growing Active Safety market in China,” says Jan Carlson, Chairman, President and CEO of Autoliv. “We are honored to have been selected to embark on this project with Geely and look forward to the future collaboration towards autonomous driving,” he continued.

Together, Autoliv and Zenuity offer combined knowledge spanning everything from the sensor to the vehicle, along with cloud-based software technology.

    Source: Autoliv

Harman demonstrates anti-spoofing solution for autonomous vehicle sensors (9 January 2018)

[Image: HARMAN SHIELD sensor spoofing demo at CES 2018]

HARMAN International is demonstrating new detection capabilities at the Consumer Electronics Show in Las Vegas, part of the HARMAN SHIELD solution, that protect autonomous and semi-autonomous vehicles against cyber-attacks aimed at the vehicle’s sensors.

    The technology demonstration shows how an adversary image with a speed limit sign was able to fool the on-board traffic sign recognition system of a production vehicle, to present false information to the driver and impact other vehicle systems, such as adaptive cruise control. That spoofed traffic sign was successfully detected and reported to HARMAN Cybersecurity Analysis Center – a full dashboard and analytics solution which provides 24/7 visibility of broad vehicular security-related events from HARMAN SHIELD Agents.

    Adversary images are images that have been intentionally manipulated in such a way that a human recognizes them correctly but a neural network or computer vision-based system will misclassify them. The use of such images opens a way to new attacks on autonomous cars and ADAS systems, requiring non-conventional cybersecurity techniques to detect and possibly mitigate. These attacks do not require any physical access to the car or tampering with the communication system and the consequences of such attacks can be grave, including major traffic disruptions.
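HARMAN has not disclosed how SHIELD decides that a recognized sign is adversarial. One generic plausibility check – an assumption for illustration, not HARMAN’s method – is to cross-check the camera’s reading against map data and the current driving context:

```python
def sign_looks_spoofed(camera_limit_kph: int,
                       map_limit_kph: int,
                       current_speed_kph: float,
                       max_plausible_jump_kph: int = 40) -> bool:
    """Flag a recognized speed-limit sign as suspicious when it strongly
    disagrees with the HD-map limit or implies an implausible jump.
    Thresholds are invented for illustration."""
    disagrees_with_map = abs(camera_limit_kph - map_limit_kph) > 30
    implausible_jump = camera_limit_kph - current_speed_kph > max_plausible_jump_kph
    return disagrees_with_map or implausible_jump

# A spoofed 130 km/h sign on a 50 km/h urban road would be reported, not obeyed.
print(sign_looks_spoofed(camera_limit_kph=130, map_limit_kph=50, current_speed_kph=45))  # True
```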

    A strong cyber security infrastructure that protects our cars is necessary for the deployment of autonomous driving on public roads. This September, the House of Representatives passed the SELF DRIVE Act bill that lays out a basic federal framework for autonomous vehicle regulation, with specific mention of cybersecurity measures, such as having an on-board intrusion detection system. The bill, Safely Ensuring Lives Future Deployment and Research In Vehicle Evolution Act (SELF DRIVE Act), will create a base regulation for vehicles across the United States and is expected to pass the Senate soon before reaching the president’s desk to sign.

    Source: Harman

Toyota introduces automated driving research vehicle at CES 2018 (5 January 2018)

    • Platform 3.0 features long-range 200-meter perception around 360-degree perimeter of vehicle
    • Compact packaging of sensory components accentuated with sleek look from CALTY Design Research
    • Research vehicles being created at Toyota’s Prototype Development Center

    Toyota Research Institute (TRI) will show its automated driving research vehicle, Platform 3.0, at CES. The new platform, which is built on a Lexus LS 600hL, combines greater technological capabilities with new harmonized styling that integrates the automated vehicle technology into the LS model’s design.

    TRI approached development of a new research platform with three core principles:

    (1) Elevate perception capabilities to be an industry pacesetter among automated vehicles.

    (2) Blend the sensing equipment into the vehicle design with a distinct appearance that is sleek and elegant.

    (3) Package the automated vehicle technology in a manner that is easy to reproduce for building a fleet at scale.

    PERCEPTION TECHNOLOGY

Platform 3.0 represents the maturing of TRI’s automated vehicle research. Experimentation has transitioned to narrowing in on a technology package with a more defined sensor configuration and level of performance, helping to catapult proficiency in understanding the world around the car.

    Platform 3.0 has a very sensor-rich package that makes it one of the most perceptive automated driving test cars on the road. The Luminar LIDAR system with 200-meter range, which had only tracked the forward direction on TRI’s previous test platform, now covers the vehicle’s complete 360-degree perimeter. This is enabled by four high-resolution LIDAR scanning heads, which precisely detect objects in the environment including notoriously difficult-to-see dark objects.

    Shorter-range LIDAR sensors are positioned low on all four sides of the vehicle – one in each front quarter panel and one each on the front and rear bumpers. These can detect low-level and smaller objects near the car like children and debris in the roadway. The new platform remains flexible for incorporating future breakthrough technology as it becomes available.

    DESIGN

TRI engaged the expertise of CALTY Design Research in Ann Arbor, Mich. and engineers at Toyota Motor North America Research and Development (TMNA R&D) to compact and conceal the sensors and cameras. They created a new weather- and temperature-proof rooftop panel, cleverly using available space in the sunroof compartment to minimize overall height. Their ingenuity eliminates the look of equipment as bolt-on appendages and replaces the “spinning bucket” LIDAR sensor that has historically characterized autonomous test vehicles.

    CALTY designed the confident image of the rooftop panel, defined as intelligent minimalism, which is inspired by off-road motorcycle helmets. The forward area has a crisp technical look that becomes more fluid and aerodynamic toward the rear of the car, unifying with the contour lines of the LS. The panel is embellished with chrome trim that runs along the side where it meets the roofline.

The vehicle’s computational architecture for operating the automated vehicle components, which previously consumed nearly all trunk space, has also been consolidated. The electronics infrastructure and wiring are condensed into a small box exquisitely adorned with an LED-lit TRI logo.

    PRODUCTION

    Production of Platform 3.0 vehicles begins this spring. The Prototype Development Center at TMNA R&D headquarters in York Township, Michigan, which has expertise in low volume, specialized production, will create Platform 3.0 cars from stock Lexus LS models.

    Production volume is intentionally low to allow for continued flexibility given the quickness with which TRI has progressed in updating its test platform. There have been three major updates, including two new generation test models, in less than a year, and TRI anticipates continued rapid developments.

    A share of the new test vehicles will be assembled with the dual cockpit control layout that TRI debuted last summer. This arrangement is for testing TRI’s Guardian approach to automated driving, experimenting with effective methods to transfer vehicle control between the human test driver and the automated system while maintaining a safety driver as a backup. Single cockpit vehicles, like the one on display at CES, are used to test Chauffeur, which is TRI’s approach to full vehicle automation.

    Source: Toyota Research Institute

AImotive raises $38 million Series C funding (5 January 2018)

    AImotive announced the closing of its $38 million USD/€32 million Series C funding round, led by B Capital Group and Prime Ventures, with participation from Cisco Investments, Samsung Catalyst Fund, and Series A and B investors Robert Bosch Venture Capital, Inventure, Draper Associates and Day One Capital.

    AImotive will use this new round of funding to continue developing its proprietary autonomous driving technology, which relies primarily on affordable, off-the-shelf camera sensors and artificial intelligence-based vision processing. AImotive’s technology is inherently scalable, due to its low cost modularity and flexibility, while also open to the fusion of non-vision based sensors for additional safety in poor visibility conditions. With relatively little additional cost, AImotive’s software can be ported into various car models, driving in diverse locations around the world.

    After receiving autonomous testing licenses on public roads for multiple locations, the company started testing its car fleet in Hungary, France, and California in summer 2017. It has plans to further expand testing to automotive hubs in Japan, China, and other US states this year.

    “The auto industry is moving rapidly toward autonomy, and AImotive’s vision-first strategy for solving perception and control is far more scalable than LiDAR-based approaches as an industry standard,” said Gavin Teo, partner at B Capital Group. “We’re excited to support Laszlo and the AImotive team in building a lasting brand in the AV space.”

    AImotive has taken inspiration from the aviation industry in designing its in-house development process and tools.

    “We want to reach similar levels of safety on the roads as have been reached in the skies,” said Laszlo Kishonti, founder and CEO of AImotive. “The aviation industry, which is 10,000 times less dangerous per mile than road transport, relies heavily on simulation. We are applying that methodology to develop safe autonomous software systems, and our team has already realized astonishing time and cost benefits, allowing us to quickly bring new features to road testing.”

    AImotive’s solution is designed for automotive OEMs, mobility service providers and other mobility players, and the company is working extensively with established automotive players including Groupe PSA, SAIC, and Volvo.

    “As a fund with many years of investment experience in deep technologies, we are excited to be part of shaping the next chapter in automotive,” said Monish Suri, partner at Prime Ventures. “What the AImotive team has been able to achieve to date, leveraging simulation technology, is very impressive, and we are looking forward to helping the company grow further and bring their technology into production.”

    Kishonti added, “We are really excited to build the next chapter together with our incredibly strong investor group. AImotive is very fortunate to have partners from such diverse backgrounds, each bringing unique expertise and networks.”

    Source: AImotive

HERE to deliver BMW drivers safety services created from live vehicle sensor data (4 January 2018)

    Intelligent and connected vehicle technology took a significant step forward with the commercial launch of the HERE Safety Services Suite.

    This cloud-based services suite, developed by HERE Technologies, is unique because it is the first to aggregate real-time, rich sensor data generated by cars of different brands on the road. HERE then transforms this data into useful live road safety information that is delivered to drivers and passengers through the car’s head unit display, or to the car’s Advanced Driver Assistance Systems (ADAS) to support automated safety functions.

    BMW will be the first automaker to offer HERE Safety Services in production vehicles beginning in mid-2018. The services will first become available to drivers and passengers across North America and Western Europe.

    “Digital real-time maps and location-based services form the basis for the mobility of tomorrow. In summer 2017, the BMW Group introduced the first stage of local hazard warning based on intelligent connectivity and car-to-car communications,” said Dieter May, Senior Vice President Digital Services and Business Models at BMW Group. “We are delighted that the next stage will follow from mid-2018 and that BMW drivers will be the first to benefit from this enhanced service. All this data on local hazards, such as the scene of an accident or dangerous weather conditions, can be shared on an anonymous basis to warn drivers in good time and so further improve safety.”

    The fleet providing live sensor data to HERE is expected to grow quickly after launch, surpassing more than ten million vehicles in 2019.

    The HERE Safety Services Suite consists of HERE Hazard Warnings and HERE Road Signs. HERE Hazard Warnings, which has been developed using algorithms and know-how from both HERE and BMW, provides drivers and passengers with information about potential road hazards, accidents and extreme weather events, such as slippery roads and reduced visibility. HERE Road Signs provides up-to-date traffic signage information, including permanent and dynamic speed limits.

    “The HERE Safety Services Suite demonstrates how HERE transforms singular pieces of sensor data into valuable new services that make driving safer and more comfortable,” said Ralf Herrtwich, Head of Automotive at HERE. “As cars become increasingly connected and intelligent, the HERE Open Location Platform is the place where the entire industry can contribute to and access differentiating products and services that not long ago seemed out of reach.”

    The services are built on the HERE Open Location Platform, which enables multiple automakers to transmit live, anonymized sensor data that is then aggregated, enriched with high-precision location data, and transmitted back to cars in the form of near real-time, geo-targeted, contextually-relevant information about changing road conditions.
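HERE has not published the aggregation rules of the Open Location Platform; purely as an illustration of the “many anonymized reports in, one geo-targeted warning out” idea described above, here is a toy aggregator (grid size, threshold and record layout are assumptions):

```python
from collections import defaultdict

def grid_cell(lat: float, lon: float, size_deg: float = 0.01) -> tuple:
    """Bucket a position into a coarse grid cell (illustrative resolution)."""
    return (int(lat / size_deg), int(lon / size_deg))

def aggregate(reports, min_vehicles: int = 3):
    """Publish a hazard for a cell only when several distinct (anonymized)
    vehicles report the same event type there."""
    witnesses = defaultdict(set)
    for token, lat, lon, event in reports:     # token = anonymized vehicle id
        witnesses[(grid_cell(lat, lon), event)].add(token)
    return [(cell, event, len(v)) for (cell, event), v in witnesses.items()
            if len(v) >= min_vehicles]

reports = [
    ("a1", 48.1372, 11.5751, "slippery_road"),
    ("b2", 48.1381, 11.5762, "slippery_road"),
    ("c3", 48.1365, 11.5773, "slippery_road"),
    ("d4", 48.3002, 11.9001, "fog_lights_on"),
]
print(aggregate(reports))   # one slippery_road warning, backed by 3 vehicles
```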

The HERE Safety Services Suite utilizes data emitted by an array of on-board sensors, including hazard lights, fog lights, camera, emergency brakes and electronic stability control.

    The HERE Safety Services not only benefit road users – and highlight new possibilities for automakers to create value through the HERE Open Location Platform – but they also support mapping technology for automated driving. The services build the foundation for HERE HD Live Map to be entirely self-healing, where car sensor data is used to detect change in the real world. The result is an accurate and real-time representation of the road network.

    “HERE is driving the next generation of connected car services built on crowd-sourced sensor data,” said James Hodgson, Senior Analyst at ABI Research. “This most recent announcement with BMW demonstrates how HERE is leveraging existing sensor and cellular connectivity technologies, in tandem with their market-leading location intelligence expertise, to deliver connected services that are both compelling and which contribute to driver safety.”

    Source: HERE

NXP and Baidu partner on Apollo open autonomous driving platform (2 January 2018)

     

    • NXP joins Baidu’s Apollo, a leading open autonomous driving platform
    • NXP to provide millimeter wave radar, V2X, security and smart connectivity
    • Partners to leverage NXP BlueBox development platform for low-power, high performance and functional safety benefits

NXP Semiconductors and Baidu announced a cooperation on autonomous driving. Under the terms of the agreement, NXP will join Baidu’s open autonomous driving platform, Apollo, and provide semiconductor products and solutions including millimeter wave radar, V2X, security, smart connectivity and in-vehicle experience technologies.

    First announced in April 2017, Apollo is Baidu’s open autonomous driving platform which provides a comprehensive, secure and reliable all-in-one solution supporting all major features and functions of an autonomous vehicle. Baidu refers to Apollo as the Android of the autonomous driving industry, but more open and more powerful, allowing partners to go from zero to one and quickly assemble their own autonomous vehicles and start their product R&D. Apollo has now attracted over 70 global partners.

    Details of the collaboration include:

    • NXP will provide semiconductor products and solutions for autonomous driving, millimeter wave radar, V2X, security, smart connectivity and in-vehicle experiences
    • Companies will leverage the NXP BlueBox development platform’s low energy consumption, high-performance and functional safety benefits
    • NXP and Baidu will collaborate on sensor integration and high-performance processors for deep learning networks
    • Baidu’s conversational in-car system, DuerOS for Apollo, will incorporate NXP infotainment solutions for faster time to market and enhanced performance

    Source: NXP

Nissan releases the technology and functionality of ProPILOT Park (7 December 2017)

    By combining advanced image processing technology using four high-resolution cameras and information from 12 ultrasonic sensors around the car, ProPILOT Park guides the car into a space safely and accurately.

    ProPILOT Park is a fully fledged system that helps drivers park by automatically controlling acceleration, brakes, handling, shift changing and parking brakes to guide the car into a parking spot.

    All steering, braking and throttle inputs for various parking maneuvers, such as parallel parking, are automated. The system can automatically identify a parking space around the car so that the driver doesn’t need to set a target parking position. Requiring only three easy steps for activation, this technology liberates drivers from one of the most tedious, and at times the most challenging, tasks of driving.

    Function Overview

    In just three steps, ProPILOT Park will automatically park the car by controlling the steering, accelerator, brakes, gear shift and parking brake. When the parking process is complete, the system will shift the transmission to “P” and activate the electronic parking brake.

    1. In the vicinity of the desired parking space, press the “ProPILOT Park” button once, then slowly approach and stop the car in front of the desired parking space.

    2. The car will automatically detect the parking space and notify the driver the system is active with a [P] icon on the navigation display. Confirm that the system has recognized the desired parking space by pressing the “Start Park” button on the navigation monitor.

    3. Hold down the “ProPILOT Park” button until the parking process has been completed. If the button is released, or if the driver steps on the brakes or adjusts the steering wheel, the system will deactivate.

    Once the parking process is completed, the system automatically sets the electronic parking brake and shifts the transmission to the “P” position.

    • Compatible for front and back-in parking, as well as parallel parking. The system automatically turns the steering wheel while reversing to complete the parking maneuver into the parking space.
    • The sensors and controls have limitations, therefore, the driver should always monitor the surrounding area and apply the brakes if necessary to ensure the well-being of the car and its occupants.
    • The system is programmed to stop the parking maneuver if any hazard is detected, however, it does not fully eliminate chances of collision.

    Technology Overview

    ProPILOT Park uses four high-resolution cameras capable of real-time image processing and 12 sonar sensors placed around the vehicle to assess the vehicle’s surroundings. After processing the collected information, the system can then safely park the vehicle by controlling the accelerator, brakes, steering and transmission.
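As a compact way to restate the three-step flow from the Function Overview above, here is a toy state machine; the state and event names are invented for illustration and are not Nissan’s software interface.

```python
# Transitions follow the three activation steps and the cancel conditions
# described in the article (hold the button; brake or steering input cancels).
TRANSITIONS = {
    ("idle", "propilot_button_pressed"):             "searching_space",
    ("searching_space", "space_detected"):           "awaiting_confirmation",
    ("awaiting_confirmation", "start_park_pressed"): "parking",
    ("parking", "button_released"):                  "cancelled",
    ("parking", "driver_brake_or_steer"):            "cancelled",
    ("parking", "maneuver_complete"):                "parked",   # shift to P, set parking brake
}

def step(state: str, event: str) -> str:
    return TRANSITIONS.get((state, event), state)

state = "idle"
for event in ["propilot_button_pressed", "space_detected",
              "start_park_pressed", "maneuver_complete"]:
    state = step(state, event)
print(state)   # parked
```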

    Source: Nissan
