
New report asks what kind of relationship Australians want with AI in the future

by Nadine Cranenburgh
18 August 2019
in Policy
3 min read

All Australians should be asking themselves and their governments what role artificial intelligence will play in society, according to Chief Scientist Dr Alan Finkel.

Finkel made the comment while launching a recent report by the Australian Council of Learned Academies (ACOLA). The report urges the nation to reflect on what AI-enabled future it wants, as crucial decisions are currently being made about the future impact of AI.

“This report was commissioned by the National Science and Technology Council, to develop an intellectual context for our human society to turn to in deciding what living well in this new era will mean,” Finkel said.

The report’s findings highlight the importance of a national strategy, a community awareness campaign, safe and accessible digital infrastructure, a responsive regulatory system, and a diverse and highly skilled workforce.

“By bringing together Australia’s leading experts from the sciences, technology and engineering, humanities, arts and social sciences, this ACOLA report comprehensively examines the key issues arising from the development and implementation of AI technologies,” said Professor Hugh Bradlow, Chair of the ACOLA Board.

Setting an example

Co-chair of the ACOLA expert working group, Professor Toby Walsh, said that AI offers great opportunities, provided we ensure it does not compromise our human values.

“As a nation, we should look to set the global example for the responsible adoption of AI,” he said.

Walsh himself has been active in setting an example, campaigning locally and internationally for a ban on autonomous weapons, or ‘killer robots’.

According to Walsh, his activism started when he realised how many of his colleagues in the AI field were dismissing killer robots as a problem of the distant future.

“From what I could see, the future was already here. Drone bombers were flying over the skies of Afghanistan. Though humans on the ground controlled the drones, it’s a small technical step to render them autonomous,” he explained.

To counter this apathy, Walsh organised a debate about autonomous weapons at a scientific conference, and was asked by the head of the MIT Future of Life Institute to help him circulate a letter calling for the international community to ban emerging robot weaponry. Walsh gathered over 5000 signatures, including those of Elon Musk and Steve Wozniak.

According to Walsh, the key issue is that we can’t let machines decide if we live or die.

“Machines don’t have our moral compass, our compassion and our emotions. Machines are not moral beings,” he said, adding that unlike other banned weapons of mass destruction, autonomous weapons could use facial recognition to discriminate between victims.

Walsh has previously told create that engineers need to be aware of their responsibility to produce AI-enabled tools that meet the expectations of society. This applies not just to robotic killing machines, but also to more mundane applications such as smart home tech, facial recognition and news algorithms.

“There are some decisions we should make about where technology shouldn’t be in our lives, not just where it should be in our lives,” he said. 

Tags: machine learning, public policy, autonomous weapons, artificial intelligence, autonomous systems, ethics

Nadine Cranenburgh

Nadine Cranenburgh is an electrical engineer with postgraduate qualifications in environmental engineering, and professional writing and editing. She works as a freelance writer and editor specialising in complex topics that draw on her experience in the engineering, local government, defence and environment industries.

