Innovation

Netradyne: Teaching machines to understand driving

Before Netradyne, fleet management was largely reactive.


Every day, millions of commercial vehicles move across roads carrying goods, people, and services. Behind each vehicle is a driver, making hundreds of small decisions every hour—when to brake, when to accelerate, when to check mirrors.

Most of these decisions go unrecorded. And when something goes wrong, the system reacts after the fact—through accidents, insurance claims, or investigations.

Netradyne is built around a simple but ambitious idea: what if driving could be understood in real time, and improved while it is happening?

The origin

Netradyne was founded in 2015 by Avneesh Agrawal and David Julian, both former Qualcomm engineers. The starting point was not fleet management. It was computer vision—the ability of machines to “see” and interpret the world.

The founders believed that if machines could understand images and video at scale, they could understand driving behaviour. And if they could understand driving behaviour, they could reduce accidents.

This was not an obvious path at the time. Dashcams already existed, but they were passive tools—they recorded footage for later review. Netradyne wanted to build something active: a system that watches, understands, and responds.

What Netradyne built

Netradyne’s core product is called Driver•i, an AI-powered camera system installed inside commercial vehicles. At a basic level, it looks like a dashcam. But it does much more. The system uses multiple cameras—facing the road, the driver, and the surroundings. These cameras capture video continuously. That video is processed using artificial intelligence.

The system identifies events such as: harsh braking, overspeeding, lane changes, tailgating, distraction, or mobile phone use. But it also identifies positive behaviour—safe driving, proper following distance, smooth braking.

This is an important difference. Most systems focus only on what drivers do wrong. Netradyne tracks both good and bad behaviour. The output is not just footage; it is a scoring and feedback system.

Drivers receive real-time alerts inside the vehicle. Fleet managers receive dashboards showing performance trends. Over time, the system builds a profile of each driver’s behaviour.
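The scoring idea can be sketched in a few lines. Netradyne does not publish its scoring formula, so the event names, weights, and baseline below are purely illustrative: risky events subtract from a driver's score while positive behaviour adds to it, building a profile over time.

```python
from dataclasses import dataclass, field

# Hypothetical event weights -- illustrative only, not Netradyne's formula.
EVENT_WEIGHTS = {
    "harsh_braking": -5,
    "tailgating": -4,
    "phone_use": -8,
    "safe_following_distance": +2,
    "smooth_braking": +1,
}

@dataclass
class DriverProfile:
    """Accumulates scored events into a per-driver behaviour profile."""
    driver_id: str
    score: float = 100.0                      # assumed neutral baseline
    events: list = field(default_factory=list)

    def record(self, event: str) -> None:
        self.events.append(event)
        self.score += EVENT_WEIGHTS.get(event, 0)

profile = DriverProfile("driver-42")
profile.record("harsh_braking")            # risky: score drops
profile.record("safe_following_distance")  # positive: score recovers
print(profile.score)  # 97.0
```

Because good behaviour raises the score, the same mechanism supports positive reinforcement rather than only penalties.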

How the system works

You can think of Netradyne as adding a “layer of intelligence” to driving.

First, cameras capture everything that happens during a trip. Second, AI models analyse this data in real time. Third, the system classifies behaviour—safe or risky. Fourth, it gives feedback.

For example: If a driver is too close to another vehicle, the system alerts them immediately. If a driver maintains safe distance consistently, the system records it as positive behaviour. This happens continuously, across 100% of driving time—not just selected moments.
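The four-step loop above can be sketched as code. This is a minimal illustration of the capture → analyse → classify → feedback cycle; the two-second gap threshold, event labels, and alert wording are assumptions, not Netradyne's actual parameters.

```python
# Illustrative sketch of the loop: the vision model (steps 1-2) yields a
# following-distance estimate per frame; the system classifies it (step 3)
# and responds immediately (step 4).

SAFE_GAP_SECONDS = 2.0  # assumed minimum safe following gap

def classify_following_distance(gap_seconds: float) -> str:
    """Step 3: label the behaviour as safe or risky."""
    return "risky" if gap_seconds < SAFE_GAP_SECONDS else "safe"

def feedback(gap_seconds: float) -> str:
    """Step 4: alert immediately on risky behaviour, log positives."""
    if classify_following_distance(gap_seconds) == "risky":
        return "ALERT: increase following distance"
    return "recorded: safe following distance"

# Simulated per-frame gap estimates from the vision model.
for gap in [1.2, 2.5]:
    print(feedback(gap))
```

Running this loop on every frame, rather than on sampled moments, is what "100% of driving time" means in practice.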

What changes after deployment

Before Netradyne, fleet management was largely reactive: accidents happened → reports were filed → actions were taken. With Netradyne, the process becomes proactive. Drivers receive guidance during the trip, not after. Managers see patterns across drivers and routes. Training becomes data-driven instead of generic. The results show up in operational metrics.

Some fleets report significant reductions in risky behaviour. In one case, distraction alerts dropped by 25% and collision-related alerts by 50% within months. Another reported outcome is a 98% reduction in preventable incident costs for a logistics company.

These numbers matter because they translate directly into fewer accidents, lower insurance costs, and safer roads.

Scale, funding, and growth

Netradyne has grown into a global company, headquartered in San Diego with major operations in India. It has raised over $300 million in funding across multiple rounds, including a $90 million Series D in 2025. This funding pushed the company into unicorn status, with a valuation above $1 billion.

It processes billions of miles of driving data, building one of the largest datasets on real-world driving behaviour.

What makes the approach different

There are many fleet tracking and telematics companies. Netradyne’s difference lies in how it uses vision.

First, it analyses video, not just sensor data. Traditional systems rely on GPS and speed data. Netradyne sees what is actually happening on the road.

Second, it focuses on positive reinforcement. Instead of only penalising bad driving, it rewards good driving. This changes how drivers respond to the system.

Third, it captures 100% of drive time. This creates a complete picture rather than sampled insights.

Fourth, it uses edge AI. Much of the processing happens inside the device, enabling real-time feedback without relying entirely on cloud connectivity.

Finally, it combines hardware and software into a single system—camera + analytics + dashboard.
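The edge-AI split described above can be sketched simply: analysis runs on the device, and only flagged event clips are queued for upload instead of raw video. The function and field names here are illustrative, not Netradyne's API.

```python
# Sketch of an edge/cloud split: the on-device model labels each frame
# locally, so feedback is immediate and only event-worthy clips leave
# the vehicle. Labels and structure are assumptions for illustration.

def process_on_device(frames):
    """Run local (edge) analysis and keep only event-worthy clips."""
    upload_queue = []
    for frame in frames:
        label = frame["label"]  # produced by the on-device model
        if label != "normal":
            upload_queue.append({"label": label, "clip": frame["clip"]})
    return upload_queue

frames = [
    {"label": "normal", "clip": "f1"},
    {"label": "tailgating", "clip": "f2"},
    {"label": "normal", "clip": "f3"},
]
print(process_on_device(frames))  # only the tailgating clip is queued
```

Keeping classification on the device is what allows real-time alerts even when cloud connectivity is intermittent.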

Market response and challenges

The strongest adoption has come from logistics, transportation, and delivery companies. For these businesses, safety is directly tied to cost. Accidents lead to vehicle damage, insurance claims, delays, and reputational risk. Netradyne offers a way to reduce these risks. Feedback from customers often highlights improved safety culture. Drivers become more aware of their behaviour, and companies can link incentives to performance scores.

However, the model also faces challenges. One major concern is privacy. Drivers may feel uncomfortable being constantly monitored. Adoption requires building trust and demonstrating that the system is fair and beneficial.

Another challenge is behaviour change. Technology can provide feedback, but drivers still need to act on it.

The global landscape

Netradyne operates in the broader category of video telematics and fleet safety technology. Globally, companies like Samsara, Lytx, and Motive provide similar solutions—combining cameras, sensors, and analytics. The market is growing rapidly as logistics networks expand and regulations around safety tighten. What differentiates players is not just hardware, but the intelligence layer—how well the system interprets data and improves outcomes.

  • Our correspondent