Autonomous Driving and Software-Defined Vehicles: Insights from EE Times Automotive Tech Forum 2026

While electrification continues to reshape vehicle platforms, the real architectural shift in automotive is unfolding in autonomy and software-defined systems. At the EE Times Automotive Tech Forum 2026, discussions moved beyond incremental ADAS updates and into the structural challenges of achieving Level 4 autonomy and scaling software-defined vehicle (SDV) architectures.

From deployment timelines and redundancy strategies to LiDAR-enabled perception and centralised compute models, one thing was clear: vehicles are rapidly becoming long-lifecycle, safety-critical compute platforms. Here are the key autonomous driving insights from this fascinating virtual event, and what they mean for semiconductor design and verification teams.

Automotive Tech Forum 2026 at a Glance

  • Event: EE Times Automotive Tech Forum
  • Date: February 25 – 26, 2026
  • Format: Virtual technical conference
  • Focus Areas: EV power electronics, SiC and GaN devices, drivetrain optimisation, autonomous vehicles, LiDAR, Level 4 roadmaps, and SDVs

Autonomy’s Next Phase Is an Architectural Challenge

The opening keynote – Autonomous Vehicles Are on Their Way: A Practical Timeline and Roadmap – presented by Shihao Fu, Technology Analyst, IDTechEx, provided a pragmatic assessment of where autonomous driving truly stands in 2026.

Level 3, long described as “technically ready but commercially slow”, is now entering early commercialisation. Regulatory and liability barriers remain significant but momentum is building, particularly in China, where the first mass-produced Level 3 passenger vehicle has been approved under tightly defined operating conditions. Looking ahead to 2026 and beyond, L2+ systems are increasingly converging toward L3 capability. However, the pace of adoption will remain closely tied to regulatory clarity rather than purely technical readiness.

For Level 4 robotaxis, the priority is no longer pilot feasibility but scaling beyond controlled deployments. Real-world driving data accumulation is emerging as a critical competitive differentiator, shaping training, validation and deployment confidence. During the Q&A portion, Shihao Fu emphasised that scaling beyond multi-city pilot operations requires more than incremental performance gains. The underlying technology must be replicable across regions, capable of adapting to different infrastructure types, traffic behaviours and regulatory environments. He also noted that sustained capital support and ecosystem integration will be just as important as perception accuracy.

Clearly, autonomy progression is becoming an architectural question. Over the next five years, Shihao Fu argued, the defining challenge for OEMs will be how they design and evolve their software-defined vehicle (SDV) platforms – the foundation that enables scalable compute, redundancy and continuous software updates. For semiconductor design and verification teams, that shift places an increasing emphasis on system-level validation and long-lifecycle design readiness.

LiDAR and Perception Reliability at Automotive Scale

In a keynote – LiDAR Is Key to Safer ADAS and Autonomous Vehicles – delivered by Elad Hofstetter, Chief Business Officer at Innoviz Technologies, the focus shifted to perception reliability.

LiDAR (Light Detection and Ranging) enables high-resolution 3D environmental mapping using laser-based distance measurement. Unlike camera-based systems, it is less affected by low-light conditions, glare or certain visual distortions, and can maintain strong performance at night and in challenging weather conditions such as rain or snow. High-performance LiDAR systems can accurately detect obstacles, identify lane markings and road boundaries, and generate detailed scene perception in real time.
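The laser-based distance measurement mentioned above boils down to time-of-flight: the sensor times a reflected pulse and halves the round trip. A minimal sketch (the 667 ns example value is illustrative, not from the talk):

```python
# Speed of light in a vacuum, in metres per second.
C = 299_792_458.0

def tof_distance_m(round_trip_s: float) -> float:
    """Convert a measured round-trip pulse time to a distance in metres.

    The pulse travels to the target and back, so the one-way
    distance is half the total path: d = c * t / 2.
    """
    return C * round_trip_s / 2.0

# A return pulse arriving ~667 ns after emission corresponds to ~100 m.
print(tof_distance_m(667e-9))
```

Real automotive LiDAR adds pulse encoding, scanning optics and noise rejection on top of this core relationship, but the geometry is the same.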

However, the session emphasised that LiDAR does not replace other sensors. Cameras, radar and LiDAR each offer distinct strengths across functions such as colour and semantic interpretation, small-object detection, stationary object recognition, depth accuracy and adverse-weather performance. Together, they form a complementary perception stack, described by Hofstetter as the “eyes, ears and nose” of autonomous vehicles.

For semiconductor teams, this multi-sensor approach increases architectural complexity. Perception stacks are no longer isolated subsystems; they are tightly integrated into centralised compute architectures, raising the bar for system-level verification.

From Hardware-First to Software-Defined: The New Architecture of Automotive Systems

In a session on SDV architecture, presented by Manish Vora of eInfochips, the discussion focused on how automakers are shifting from traditional hardware-centric designs toward centralised, modular computing platforms.

Historically, vehicles evolved as mechanical products with electronics added as supporting components. Software-defined vehicles redefine that. As Vora described it, the industry is moving from a “frozen hardware state to a living, evolving platform” – one capable of continuous software updates and feature expansion over its operational lifetime.

This transformation is driven by the consolidation of compute functions. Instead of hundreds of small, isolated controllers, SDVs rely on centralised computing architectures, often described as a vehicle’s digital brain. This enables learning, adaptability and long-term relevance, but it also introduces significant architectural demands.

To support these capabilities, vehicles require substantial processing power and high-performance computing infrastructure. Automotive zone controllers and centralised compute modules increasingly rely on ASIC-based designs to handle safety-critical data and real-time decision workloads, delivering the performance and determinism required for advanced autonomy and system-level reliability.

For semiconductor and verification teams, this shift underscores a fundamental truth: automotive platforms are no longer static systems. They are evolving compute environments that demand rigorous system-level validation, long-term maintainability and hardware architectures capable of supporting continuous software evolution.

The Road to Autonomous and Software-Defined Systems

The automotive industry is undergoing a structural transformation. Autonomy and software-defined architectures are redefining what a vehicle is – from a static mechanical product to a living, evolving computing platform.

At the EE Times Automotive Tech Forum 2026, this shift was evident across every discussion: scalability beyond pilot deployments, sensor fusion for reliable perception, and centralised compute models capable of supporting long-term software evolution.

For semiconductor and verification teams, the implications are clear. Vehicles are increasingly long-lifecycle, safety-critical systems requiring system-level validation, heterogeneous compute architectures and hardware foundations designed to support continuous adaptation.

As autonomy progresses and software-defined architectures mature, success will depend on the ability to integrate performance, safety and flexibility at scale. The future of automotive innovation is no longer defined solely by mechanical engineering, but by software, silicon and system-level reliability.

At AsicPro Solutions, we specialise in solving complex verification and design challenges for next-generation systems. If you’d like to discuss how emerging automotive architectures intersect with semiconductor innovation, get in touch.
