CEO, Cyborg Dynamics Engineering; Bachelor of Engineering (Aerospace), Monash University; Member of Engineers Australia; Chartered Engineer
Athena AI is an engineering breakthrough that uses computer vision to identify protected objects, people and symbols, such as hospitals, in near real time for military operations, with a very high probability of correct identification.
This gives military commanders and decision-makers using the technology a “third eye” that can watch an environment and flag changes in the scenario that would require protection of a given piece of infrastructure, such as a hospital in a warzone.
The system can also process large amounts of information to create a “no-strike” list of United Nations and medical or refugee areas in a given location.
The project involved the development of an artificial intelligence (AI) system to help soldiers on the battlefield identify protected objects and persons in accordance with international humanitarian law.
The team, led by Cyborg Dynamics Engineering CEO Stephen Bornstein CPEng, trained an object classifier to identify various signs of surrender, non-combatants, protected symbols and signs of injury.
This could then be used to alert an operator that a particular object or person could not be targeted on the battlefield unless the protection status was lost.
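The alert logic described here can be sketched in a few lines of code. This is purely an illustrative sketch: the class names, confidence threshold and `protection_lost` flag are assumptions for the example, not details of the actual Athena AI system, which is not public.

```python
from dataclasses import dataclass

# Hypothetical protected categories under international humanitarian law;
# the real system's class list is not publicly documented.
PROTECTED_CLASSES = {
    "medical_facility",
    "red_cross_symbol",
    "surrender_flag",
    "non_combatant",
}

@dataclass
class Detection:
    label: str                      # class predicted by the object classifier
    confidence: float               # classifier confidence, 0.0 to 1.0
    protection_lost: bool = False   # e.g. a protected site being used for military purposes

def protected_alerts(detections, threshold=0.8):
    """Return detections that should raise a 'do not target' alert."""
    return [
        d for d in detections
        if d.label in PROTECTED_CLASSES
        and d.confidence >= threshold
        and not d.protection_lost
    ]

alerts = protected_alerts([
    Detection("medical_facility", 0.95),
    Detection("vehicle", 0.90),          # not a protected class
    Detection("surrender_flag", 0.55),   # below confidence threshold
])
# alerts contains only the medical_facility detection
```

The key design point the article describes is the conditional nature of the alert: protection under international humanitarian law can be lost, so the filter must account for status changes rather than treating the class label alone as decisive.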
This AI classifier and software development tool is a world first, combining a legal and ethical framework with an AI decision support tool that is compliant with Article 36 of Additional Protocol I to the Geneva Conventions, which no current system in the Australian Defence Force has.
The solution was workshopped with computer scientists, legal officers, military ethicists and user groups to develop a functional product that improved humanitarian outcomes for targeting applications.
“This project is an AI decision support tool to provide Australian Defence Force members with information to identify civilians and other non-military people and objects.
“Combining a legal and ethical framework into the artificial intelligence tool involved many diverse stakeholders, who had to be managed sensitively. There are obvious benefits to the community in protecting people who might otherwise be treated as collateral damage, but this tool also potentially benefits the mental health of military personnel, as they can be more confident that they are not targeting non-combatants.”