Helm.ai, a developer of AI software for advanced driver assistance systems (ADAS) and autonomous driving, has introduced Helm.ai Driver, a vision-only neural network designed for real-time path prediction. The system, which supports Level 2 to Level 4 autonomous driving in both highway and urban environments, operates solely using camera-based inputs, eliminating the need for HD maps, Lidar, or additional sensors.
The Helm.ai Driver uses the company’s Deep Teaching™ methodology, which allows the system to learn complex driving behaviors such as navigating intersections and avoiding obstacles without explicit programming. This enables the neural network to replicate human-like driving patterns, improving its performance in a variety of driving conditions.
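Helm.ai has not disclosed the internals of Deep Teaching or of the Helm.ai Driver network. Purely as an illustration of the general idea, the Python sketch below shows a generic vision-only setup: a small convolutional network maps a camera frame to a handful of future path waypoints and is trained by behavior cloning on logged human driving. Every name in it (VisionPathPredictor, NUM_WAYPOINTS, the synthetic tensors) is hypothetical and is not Helm.ai’s architecture or training method.

```python
# Hypothetical sketch of vision-only path prediction trained by behavior cloning.
# This is NOT Helm.ai's architecture; it only illustrates mapping camera pixels
# directly to predicted future waypoints, with no HD map or lidar input.
import torch
import torch.nn as nn

NUM_WAYPOINTS = 10  # assumed: predict 10 future (x, y) points along the path

class VisionPathPredictor(nn.Module):
    def __init__(self):
        super().__init__()
        # Small CNN backbone over a single RGB camera frame.
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # Regress a flat vector of future waypoint coordinates.
        self.head = nn.Linear(64, NUM_WAYPOINTS * 2)

    def forward(self, frames):                      # frames: (B, 3, H, W)
        features = self.backbone(frames)
        return self.head(features).view(-1, NUM_WAYPOINTS, 2)

if __name__ == "__main__":
    model = VisionPathPredictor()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

    # Behavior cloning on synthetic stand-in data: camera frames paired with
    # the waypoints a human driver actually followed.
    frames = torch.randn(8, 3, 180, 320)
    human_waypoints = torch.randn(8, NUM_WAYPOINTS, 2)

    predicted = model(frames)
    loss = nn.functional.mse_loss(predicted, human_waypoints)
    loss.backward()
    optimizer.step()
    print(f"behavior-cloning loss: {loss.item():.4f}")
```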
See also: Helm.ai Secures Automotive SPICE Level 2 Certification for Autonomous Driving Software
The system’s capabilities were validated through closed-loop simulations on the CARLA platform, with Helm.ai’s GenSim-2 providing realistic camera outputs. The system also integrates with Helm.ai’s perception software to ensure compatibility and enhance interpretability, factors that are essential for safe autonomous driving.
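Helm.ai has not published the details of that validation setup. For readers unfamiliar with closed-loop testing, the sketch below shows roughly what such a loop looks like against CARLA’s public Python API: a camera frame is rendered, fed to a placeholder path-prediction model, and the resulting steering command is applied back to the simulated vehicle before the next simulation tick. The predict_steering stub, camera placement, and control values are illustrative assumptions, not Helm.ai’s pipeline, and GenSim-2 is not involved here.

```python
# Hypothetical closed-loop evaluation sketch against the CARLA simulator.
# Not Helm.ai's pipeline; it only shows the generic loop:
# render frame -> predict -> apply control -> advance simulation -> repeat.
import queue

import carla          # CARLA Python API; requires a running CARLA server
import numpy as np


def predict_steering(rgb_frame: np.ndarray) -> float:
    """Placeholder for a vision-only path-prediction model.

    A real system would map the frame to a predicted path and derive a
    steering command from it; here we simply return 'drive straight'.
    """
    return 0.0


def main() -> None:
    client = carla.Client("localhost", 2000)
    client.set_timeout(10.0)
    world = client.get_world()

    # Run in synchronous mode so the model sees every simulated frame.
    settings = world.get_settings()
    settings.synchronous_mode = True
    settings.fixed_delta_seconds = 0.05
    world.apply_settings(settings)

    blueprints = world.get_blueprint_library()
    vehicle = world.spawn_actor(
        blueprints.filter("vehicle.*")[0],
        world.get_map().get_spawn_points()[0],
    )

    # Front-facing RGB camera as the only sensor input (vision-only).
    camera_bp = blueprints.find("sensor.camera.rgb")
    camera_bp.set_attribute("image_size_x", "320")
    camera_bp.set_attribute("image_size_y", "180")
    camera = world.spawn_actor(
        camera_bp,
        carla.Transform(carla.Location(x=1.5, z=2.0)),
        attach_to=vehicle,
    )
    frames: "queue.Queue" = queue.Queue()
    camera.listen(frames.put)

    try:
        for _ in range(200):                      # ~10 simulated seconds
            world.tick()                          # advance the simulation
            image = frames.get(timeout=2.0)
            # BGRA byte buffer -> (H, W, 3) RGB array for the model.
            bgra = np.frombuffer(image.raw_data, dtype=np.uint8)
            rgb = bgra.reshape((image.height, image.width, 4))[:, :, 2::-1]
            steer = predict_steering(rgb)
            vehicle.apply_control(
                carla.VehicleControl(throttle=0.3, steer=steer)
            )
    finally:
        camera.destroy()
        vehicle.destroy()
        settings.synchronous_mode = False
        world.apply_settings(settings)


if __name__ == "__main__":
    main()
```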
Vladislav Voroninski, CEO of Helm.ai, stated, “We believe this vision-only system represents a step forward in urban path prediction. By combining AI with generative simulation, we aim to develop scalable and adaptable autonomous driving solutions.”
See also: Helm.ai Unveils Enhanced VidGen-2 AI Model for Autonomous Driving
Founded in 2016 and based in Redwood City, CA, Helm.ai continues to focus on AI-driven autonomous driving technology, working with global automakers to advance the development of autonomous vehicles.