Seeing Machines improves driver safety by monitoring for signs of fatigue

Seeing Machines, an Australian business founded in 2000, is developing systems that could one day determine what drivers are thinking and feeling.

When people ask what he does for work, mechanical and systems engineer Damien Balachandran MIEAust says he is employed by a computer vision company that specialises in smart cameras. 

It’s an easy way to explain that, in his work at Seeing Machines, he develops technology that constantly analyses a driver’s face, and particularly their head, eyes and eyelids, to help detect when they’re distracted or in danger of falling asleep. 
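One widely published cue for this kind of monitoring is PERCLOS: the proportion of time, over a rolling window, that the eyelids are mostly closed. The Python sketch below shows the idea in simplified form, using the well-known eye aspect ratio computed from facial landmarks. The landmark layout, thresholds and window length are illustrative assumptions, not Seeing Machines’ proprietary algorithms.

```python
# Illustrative sketch only: a simplified drowsiness cue from eye landmarks.
# Thresholds and window length are assumptions for demonstration.
from collections import deque
import math

def eye_aspect_ratio(eye):
    """eye: six (x, y) landmarks ordered corner, two top, corner, two bottom."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    vertical = dist(eye[1], eye[5]) + dist(eye[2], eye[4])
    horizontal = dist(eye[0], eye[3])
    return vertical / (2.0 * horizontal)  # small value -> eye mostly closed

class PerclosMonitor:
    """Tracks PERCLOS: the fraction of recent frames with the eyes closed."""
    def __init__(self, window_frames=900, closed_ear=0.2, alert_level=0.15):
        self.history = deque(maxlen=window_frames)  # ~30 s at 30 fps
        self.closed_ear = closed_ear
        self.alert_level = alert_level

    def update(self, left_eye, right_eye):
        ear = (eye_aspect_ratio(left_eye) + eye_aspect_ratio(right_eye)) / 2.0
        self.history.append(ear < self.closed_ear)
        perclos = sum(self.history) / len(self.history)
        return perclos >= self.alert_level  # True -> raise a fatigue alert
```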

In doing so, he engineers machines that save the lives of drivers and other road users.

“As we move into a world where there will be more autonomous vehicles on the road, it will be important to know the level of a driver’s engagement, assessing whether their mind is ‘on the road’, and how they might be feeling,” Balachandran, Team Leader for the Applications Group at Seeing Machines, explained. 

“You can tell a lot by analysing a person’s face, and we train machines to recognise that.”

Industries including automotive, aviation, fleet management, resources and rail are applying this technology.

Already, the Seeing Machines system, known as Guardian, has travelled more than 5.4 billion kilometres with drivers, detecting more than 7.7 million events along the way. It is connected to more than 23,000 commercial vehicles in 30 countries and has been proven to reduce fatigue events by more than 90 per cent and mobile phone usage by 80 per cent. 

“You can tell a lot by analysing a person’s face, and we train machines to recognise that.”
Damien Balachandran MIEAust

Guardian has made more than 157,000 fatigue interventions over the past 12 months alone.

What does an intervention involve? When the system detects fatigue or distraction, it activates seat vibration and audio alerts to warn the driver. Human analysts in a central support room then review the footage and notify a manager, depending on the severity of the fatigue. 

The manager can help to determine the best course of action — asking the driver to take a break or ending a shift, for example — then implement new processes, such as changed shift lengths or training programs, to improve safety.
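That flow maps naturally onto a severity-based escalation policy. The sketch below is a minimal illustration; the severity tiers, event names and actions are hypothetical assumptions, not Seeing Machines’ actual rules.

```python
# Hypothetical escalation policy matching the flow described above.
from enum import Enum

class Severity(Enum):
    LOW = 1       # brief distraction
    MODERATE = 2  # repeated distraction or early fatigue signs
    CRITICAL = 3  # microsleep or sustained eye closure

def handle_event(severity: Severity) -> list[str]:
    actions = ["seat_vibration", "audio_alert"]   # immediate in-cab warning
    actions.append("queue_footage_for_analyst")   # human review of the clip
    if severity is not Severity.LOW:
        actions.append("notify_manager")          # escalate by severity
    if severity is Severity.CRITICAL:
        actions.append("recommend_immediate_break")
    return actions

print(handle_event(Severity.CRITICAL))
# ['seat_vibration', 'audio_alert', 'queue_footage_for_analyst',
#  'notify_manager', 'recommend_immediate_break']
```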

“I work within a team of hardware and systems specialists who build the electronics,” Balachandran said.

“So, I can discuss our product from a circuit board design point of view, all the way up to a systems design point of view.

“Part of what I do is talk to car manufacturers, to interact with them around where’s the best place to put a camera, what performance they can get with our system, what can the system do, etc. And we coordinate processes among our global teams.”

Algorithmic engineering

Matthew Hampsey, who graduated as an electrical and electronics engineer and is now a software engineer, calls himself a “code monkey”. He originally worked with animation software for films such as The Lego Batman Movie.

Now he works with the algorithm embedding team at Seeing Machines, taking the complete driver-monitoring algorithms from the developers and “squeezing them down” into the embedded platform that goes into each car.

“The algorithm itself — for example the gaze-tracking algorithm that works out where the driver is looking — is developed by researchers on their big, beefy, powerhouse desktop machines. But in the vehicle, it’s got to run on the infotainment unit,” Hampsey explained. 

“Basically, it’s taking the software and making it run on cheaper and less powerful machines inside a car. It’s not simple. One of the things the computers in the cars usually have is some kind of hardware accelerator — it might be a digital signal processor or a [graphics processing unit]. Part of my job is taking those algorithms and working out which parts can be run on those accelerators.”
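That partitioning step can be pictured as assigning each stage of the processing pipeline to a target. The sketch below illustrates the idea; the stage names and the set of accelerator-supported operations are hypothetical, since which operators a given DSP or GPU runtime can execute varies by platform.

```python
# Hedged sketch of the partitioning idea: decide, per pipeline stage,
# whether it can run on a hardware accelerator (DSP/GPU) or must stay
# on the CPU. Stage names and capabilities are invented for illustration.
PIPELINE = ["face_detect", "landmark_fit", "gaze_estimate", "fatigue_classify"]

# Which stages the accelerator's runtime supports (an assumption; real
# platforms document this per operator).
ACCELERATOR_SUPPORTED = {"face_detect", "gaze_estimate"}

def assign_targets(stages):
    plan = {}
    for stage in stages:
        # Offload when supported; otherwise fall back to the CPU.
        plan[stage] = "dsp" if stage in ACCELERATOR_SUPPORTED else "cpu"
    return plan

print(assign_targets(PIPELINE))
# {'face_detect': 'dsp', 'landmark_fit': 'cpu',
#  'gaze_estimate': 'dsp', 'fatigue_classify': 'cpu'}
```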

And there is a further complication. For every car model and every manufacturer, Hampsey and his team need estimates of what else will be running on the vehicle’s system. The Seeing Machines product is only allotted a fixed share of the platform’s resources: a set slice of compute, memory and bandwidth.

“We do some projections and see if it’s going to be feasible, or even possible,” he said. 

“From there, we do a lot of profiling of the software. We have tools to work out which parts are going to struggle, and we use that to guide which parts need to be optimised or moved on to the accelerators.”
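In essence, that profiling pass compares the measured cost of each stage against the allotted budget and ranks the worst offenders. The figures, budget and stage names below are invented for illustration.

```python
# Illustrative profiling pass, assuming per-stage timings measured on the
# target hardware: compare total cost against the product's allowance and
# flag the stages that need optimising or offloading.
FRAME_BUDGET_MS = 12.0  # hypothetical per-frame allowance on the shared SoC

measured_ms = {  # hypothetical timings from an on-target profiling run
    "face_detect": 6.5,
    "landmark_fit": 3.1,
    "gaze_estimate": 4.8,
    "fatigue_classify": 0.9,
}

total = sum(measured_ms.values())
print(f"total {total:.1f} ms vs budget {FRAME_BUDGET_MS} ms")
if total > FRAME_BUDGET_MS:
    # Worst offenders first: candidates for optimisation or the accelerator.
    for stage, ms in sorted(measured_ms.items(), key=lambda kv: -kv[1]):
        print(f"  {stage}: {ms} ms ({100 * ms / total:.0f}% of frame time)")
```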

It’s a great example of the way physical disciplines such as structural and mechanical engineering are merging with electrical and software engineering across numerous applications and industries. Increasingly complex and intelligent systems have to work within ever smaller, tightly constrained and more demanding platforms.

Such smart engineering must then work flawlessly with the engineered and technological environment in which it operates. 

“It’s taking the software and making it run on cheaper and less powerful machines. It’s not simple.”
Matthew Hampsey

System-wide challenges

In creating vision-based monitoring technology that enables machines to see, understand and assist people, Seeing Machines employs around 140 engineers. The work involves AI algorithms and local data processing that takes its feed from a device specifically engineered around optics and power. 

In its automotive application, sending intervention alerts back to a central monitoring centre for analysis by human specialists involves interaction with the local mobile network.

Ahira Martin, a system engineer with Seeing Machines, works with a team that develops the full product in which the algorithm is embedded. 

“We have hardware engineers co-designing the hardware parts with third-party suppliers, firmware engineers developing the firmware that runs on those parts, and application engineers developing services that enable our IP solution to monitor the driver at maximum availability; that is, minimum system failure,” Martin explained.

“Each unit we have in the field then has to communicate with cloud applications so that human analysts can remotely review and monitor what’s happening on these fielded units.”

With so many different teams and domains involved, it’s important to keep them aligned at all times, she said. A seemingly tiny fault in one part of the product could cascade into other parts of the system and create severe failures.

Ahira Martin sits in a Seeing Machines-monitored car with Balachandran alongside.

As lead system engineer, Martin ensures those risks are identified when new features come in, that their impact is understood across all the domains, and that engineering requirements are well documented to address these features and issues.

“As our focus has always been on our algorithms, we’ve had critical internet-related faults that didn’t get the attention they needed. Internet connectivity is vital for the end-to-end system to work,” she said.

“If the internet fails, the system will be unable to communicate with the cloud applications and analysts will receive delayed notification of fatigue and distraction events. If an event is delayed by two hours, for example, it may not be actionable any longer. For a while, we had issues with losing internet and an inability to recover until the next ignition cycle.”
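A common generic mitigation for this class of failure is store-and-forward: buffer events locally, then upload them with exponential backoff once the link returns, so a dropout delays events rather than losing them. The sketch below illustrates the pattern; it is a simplified assumption, not Seeing Machines’ implementation.

```python
# Generic store-and-forward uplink sketch, not Seeing Machines' design.
import time

class EventUplink:
    def __init__(self, send_fn, max_backoff_s=300):
        self.queue = []            # a real unit would persist this to flash
        self.send = send_fn        # so events survive an ignition cycle
        self.max_backoff_s = max_backoff_s

    def record(self, event):
        self.queue.append(event)   # always buffer locally first

    def flush(self):
        backoff = 1
        while self.queue:
            try:
                self.send(self.queue[0])   # upload oldest event first
                self.queue.pop(0)
                backoff = 1                # link healthy: reset backoff
            except ConnectionError:
                time.sleep(backoff)        # link down: wait, then retry
                backoff = min(backoff * 2, self.max_backoff_s)
```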

Many factors were at play in the issue, and many were out of the hands of the technicians and engineers at Seeing Machines. 

“Our customers use different network providers,” Martin said. “Our trucks are driven to different locations, and many of these may be outside coverage areas. We use third-party libraries to connect to the internet, and our modem’s firmware is supplied by a third party as well.”

To find the root cause of this issue, Martin dug deep into the system’s different network layers, consulted the third-party library owners, and coordinated with the modem supplier and the network providers.

“The systematic issues have been fixed, but some remain that we’ve isolated to the interaction of our modem with specific networks,” she said.

“As I previously worked in telecommunications, I know that these modem suppliers and network providers have the tools to get to the bottom of the issue, and so I rallied to get that support.

“In the short-term, the reality is that autonomous technology isn’t mature enough for drivers to not be required completely.”
Damien Balachandran MIEAust

“It’s a challenge to stay on top of all of the technologies we use, but it also means I’m not in too deep with any particular one of these components. I need a high-level understanding of where things fit and, from time to time, I dig down deeper to understand how things work and why.”

The future of drivers

Why does the type of technology being developed by businesses like Seeing Machines matter in a world that is hurtling towards a driverless future? Why do we invest in monitoring drivers when drivers will soon be redundant?

“In the short-term, the reality is that autonomous technology isn’t mature enough for drivers to not be required completely,” Balachandran said.

“What we need to do is ensure the driver is available to help the vehicle navigate complex scenarios when required. Perhaps at some stage we’ll be able to take our mind off the driving task, read a book or do something else; however, until then, drivers will be required to assess complex situations, such as someone or something running across the road, or encountering challenging weather or road conditions.

“We don’t want situations where someone falls asleep on a long autonomous drive and the vehicle then encounters a complex situation. That is extremely dangerous. We need to know whether the driver is able to make a decision or not.”

The long-term view is that, once technology has matured enough to allow the existence of properly autonomous vehicles, the purpose of a camera-based monitoring system broadens from a focus on safety to aspects of convenience, Balachandran said.

That convenience could come in the form of technology that detects whether a driver is happy or sad, and responds accordingly, he explained. 

“The car could recognise you and know your favourite seat settings, change the lighting to suit your mood and perhaps play some jazz because you appear stressed,” he said. “As the role of the driver disappears, the technology would refocus to ensure the comfort and safety of all passengers.”

This is a niche market in which Seeing Machines considers itself the leader. Working in uncharted territory, the business is having to solve numerous problems on the fly, many of which have never been encountered before.

Counting the cost

  • 1.35 million: Annual global death toll from road accidents — about 3700 deaths per day.
  • Worldwide, road traffic injuries cost $712 billion annually.
  • For people aged 5 to 29, road traffic injuries are the leading cause of death. 
  • The road traffic accident death rate is more than three times higher in low-income countries than in high-income countries.
  • In 2018 in the US, more than 2800 people died and around 400,000 were injured as a result of a driver being distracted.
  • In 2019 in Australia, 1195 people died in road accidents.
  • In Australia, between 2010 and 2019, the national annual road accident fatality rate per 100,000 population decreased from 6.1 to 4.7, but hospitalised injuries increased by 3.3 per cent per annum.