
Just like the human brain, artificial intelligence can be biased

by Nadine Cranenburgh
23 July 2019
in Software
4 min read

Think artificial intelligence is unbiased? Think again. Researchers are finding the technology can reflect the flaws of the humans who build it, and are trying to counteract this effect.

Sometimes the issue isn’t with the algorithm, but with the data used to train it, according to electronics engineer and computer scientist Professor Vasant Honavar, who directs the Artificial Intelligence Research Laboratory at Pennsylvania State University.

In a statement, Honavar said AI systems are trained on large data sets, but if the data is biased, this can affect the system’s recommendations.

For example, Amazon retired an experimental AI recruiting tool after finding it favoured men over women – because most of the applications it had learned from over the previous decade had come from men.

Honavar explained that in cases such as this, the machine learning algorithm is doing what it’s supposed to do, which is to identify good job candidates based on certain desirable characteristics.

“But since it was trained on historical, biased data, it has the potential to make unfair recommendations,” Honavar explained.
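
As a rough illustration of the mechanism (a minimal sketch using synthetic data and hypothetical features, not Amazon’s actual system), a model fitted to historical hiring decisions that favoured men will faithfully learn that preference, even though the learning algorithm is doing exactly what it was asked to do:

```python
# Minimal sketch: a model trained on biased historical hiring data
# reproduces the bias. All data here is synthetic and illustrative.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000

# Candidate features, drawn from the same distribution for everyone.
experience = rng.normal(5, 2, n)
skills = rng.normal(0, 1, n)
is_male = rng.integers(0, 2, n)

# Historical hiring decisions: skill and experience matter, but past
# recruiters also favoured men -- the bias is baked into the labels.
logit = 0.8 * skills + 0.3 * experience - 2.0 + 1.5 * is_male
hired = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([experience, skills, is_male])
model = LogisticRegression().fit(X, hired)

# The learned model now recommends men more often, even for
# candidates with identical experience and skills.
print(model.predict_proba([[5.0, 0.0, 1]])[0, 1])  # male candidate
print(model.predict_proba([[5.0, 0.0, 0]])[0, 1])  # otherwise identical female candidate
```

In practice the protected attribute is rarely an explicit input; a model can pick it up through proxies such as the wording of a résumé, which is reportedly how Amazon’s tool came to penalise female candidates.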

To address this issue, Honavar and a team of researchers have developed an AI tool to detect discrimination on the basis of characteristics such as race and gender.

Estimating fairness

The tool was designed to detect discrimination using the principle of cause and effect.

Researcher Aria Khademi explained that to tackle the question of whether gender affected salaries, it could be reframed as “does gender have a causal effect on salary?”

“Or in other words, ‘Would a woman be paid more if she was a man?’,” added Khademi.
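
The researchers’ tool is built on causal reasoning, but a simplified probe conveys the intuition behind Khademi’s question. The sketch below (an illustration only, not the team’s actual method) flips the protected attribute for otherwise identical individuals and measures how a trained model’s prediction changes:

```python
# Rough intuition only: flip the protected attribute for each row and
# measure how the model's prediction changes, holding all else fixed.
# This simplified probe is NOT the researchers' actual causal method,
# which reasons about cause and effect among the variables.
import numpy as np

def counterfactual_probe(model, X, gender_col):
    """Mean change in predicted probability when gender is flipped."""
    X_flipped = X.copy()
    X_flipped[:, gender_col] = 1 - X_flipped[:, gender_col]
    p_original = model.predict_proba(X)[:, 1]
    p_flipped = model.predict_proba(X_flipped)[:, 1]
    return float(np.mean(p_flipped - p_original))
```

Applied to the biased hiring model sketched above, this probe would report that switching a candidate from female to male raises the predicted outcome. A genuine causal analysis must also account for variables that gender influences indirectly, which is why the researchers frame the question in terms of cause and effect rather than simple attribute flipping.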

The researchers tested their tool on two data sets: US income data, and demographic data about drivers pulled over by the New York State Police.

They found evidence of gender-based discrimination in salary, with women having around a two-thirds lower chance than men of earning more than US$50,000 (AU$71,000) per year. In the police data, the researchers found some evidence of possible racial bias against Hispanic and African American individuals in specific cases, but no evidence of discrimination against these groups on average.

The researchers’ findings were published in the Proceedings of the 2019 World Wide Web Conference in May.

Their paper stated there is a pressing need to make sure real-world algorithmic decision-making systems do not become vehicles of unfair discrimination, inequality and social injustice. Doing so requires effective tools for detecting discrimination, Honavar said.

“Our tool can help with that,” he added.

Inside the ‘black box’

Another pressing issue as industry and government continue to collect personal data – including biometric data such as facial images – is how this data will be used by AI algorithms. This means the engineers who develop AI technology also need to think about how their work will be put to use.

This issue was recently brought into the spotlight when Curtin University and the University of Technology Sydney announced that they are reviewing links to Chinese companies and research that use facial recognition tech to track and detain members of the minority Uyghur ethnic group.

To spur discussion on the issue, researchers from the University of Melbourne developed a tool called the Biometric Mirror. This is an interactive application that compares users’ photos to thousands of facial images and crowd-sourced evaluations, where a large number of people have rated how they perceive each face’s personality.

Biometric Mirror uses this comparison to rate the user’s personality characteristics, including attractiveness, aggression, responsibility, emotional stability and ‘weirdness’ – and asks them to imagine a world where this information is shared with their employer or insurer.
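
The article does not detail Biometric Mirror’s internals, but one plausible (hypothetical) way to turn crowd-sourced evaluations into a personality rating is nearest-neighbour averaging: find the rated faces most similar to the user’s photo and report the average of how crowds perceived them.

```python
# Hypothetical sketch of a Biometric Mirror-style rating step; the
# nearest-neighbour averaging here is an assumption for illustration,
# not the developers' published pipeline.
import numpy as np

def rate_face(query, face_embeddings, crowd_ratings, k=10):
    """Return per-trait scores for a query face embedding.

    face_embeddings: (n_faces, d) embeddings of the rated faces.
    crowd_ratings: (n_faces, n_traits) crowd-sourced trait scores.
    """
    # Distance from the query face to every face in the rated set.
    distances = np.linalg.norm(face_embeddings - query, axis=1)
    # Take the k most similar faces...
    nearest = np.argsort(distances)[:k]
    # ...and report the average of their crowd-sourced ratings as the
    # user's "personality" -- strangers' perceptions of lookalikes.
    return crowd_ratings[nearest].mean(axis=0)
```

The developers’ point is visible in the code: the output is an average of how other people perceived similar-looking faces, not a measurement of the user’s actual character.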

According to Dr Niels Wouters, who developed the application with Professor Frank Vetere, the experience can be confronting.

“It starkly demonstrates the possible consequences of AI and algorithmic bias, and it encourages us [to] reflect on a landscape where government and business increasingly rely on AI to inform their decisions,” he said.

Tags: ethics, software, AI, artificial intelligence

Nadine Cranenburgh

Nadine Cranenburgh is an electrical engineer with postgraduate qualifications in environmental engineering, and professional writing and editing. She works as a freelance writer and editor specialising in complex topics that draw on her experience in the engineering, local government, defence and environment industries.
