The Conversation USA
Does your AI discriminate?

  • Written by Julie Manning Magid, Professor of Business Law, IUPUI

AI may not cut discrimination out of the hiring process.

Women leaders like New Zealand Prime Minister Jacinda Ardern and San Francisco Mayor London Breed are receiving recognition for their quick action in the face of the COVID-19 pandemic.

But men are chosen as leaders of government around the world in vastly greater numbers.

This disparity is not confined to political leadership. In 2019, Forbes chose 100 of America’s “Most Influential Leaders,” and 99 of them were men.

The lack of diversity is not limited to gender. A survey of nonprofit sector chief executives found that 87% of survey respondents self-identified as white.

As the executive and academic director of a leadership center, I study employment discrimination and inclusion. I’ve seen that many organizations want a selection process that removes bias from identifying leaders. Investors want to invest in businesses with diverse workforces, and employees want to work in diverse organizations.

My research indicates that relying on data analytics to eliminate human bias in choosing leaders won’t help.

AI isn’t foolproof

Employers increasingly rely on algorithms to determine who advances through application portals to an interview.

As labor rights scholar Ifeoma Ajunwa writes, “Algorithmic decision-making is the civil rights issue of the 21st century.” In February 2020, the U.S. House of Representatives’ Committee on Education and Labor convened a hearing called “The Future of Work: Protecting Workers’ Civil Rights in the Digital Age.”

Hiring algorithms create a selection process that offers no transparency and is not monitored. Applicants struck from an application process – or as Ajunwa refers to it, “algorithmically blackballed” – have few legal protections.

For instance, in 2014, Amazon reportedly began developing a computer-based program to identify the best resumes submitted for jobs. The idea was to automate a process and gain efficiency, much as it has done with other aspects of its business.

However, by using computer models to observe patterns in the previous 10 years of submitted resumes and choose the best, the system taught itself that resumes from men were preferable, penalizing resumes that included the word “women’s,” as in a women’s club or organization. Amazon subsequently abandoned the project, according to reports.
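The mechanism behind the Amazon story can be made concrete with a deliberately simplified sketch: a toy model scores resume keywords by the hire rate in historical data, so a keyword that past, biased decisions disfavored ends up with the lowest score. The data, keywords and scoring rule here are invented for illustration; this is not Amazon's actual system.

```python
# Toy illustration (fabricated data): a keyword-scoring "model" trained on
# historically biased hiring outcomes reproduces that bias.

from collections import Counter

# Synthetic history: resumes as keyword sets, with past hiring outcomes
# (1 = hired). The built-in bias: past decisions disfavored resumes
# containing the word "women's".
history = [
    ({"python", "leadership", "women's"}, 0),
    ({"python", "chess", "women's"}, 0),
    ({"python", "leadership"}, 1),
    ({"java", "chess"}, 1),
    ({"java", "leadership", "women's"}, 0),
    ({"python", "chess"}, 1),
]

def keyword_weights(history):
    """Score each keyword by the hire rate of resumes containing it."""
    seen, hired = Counter(), Counter()
    for words, outcome in history:
        for w in words:
            seen[w] += 1
            hired[w] += outcome
    return {w: hired[w] / seen[w] for w in seen}

weights = keyword_weights(history)
# The model "learns" the historical prejudice: "women's" scores lowest,
# so any future resume containing it is ranked down.
print(sorted(weights, key=weights.get))
```

No keyword here carries meaning on its own; the model simply mirrors whatever pattern the historical labels contain, which is exactly how an automated screen can inherit past discrimination.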

Although historical biases are often inadvertently built into algorithms, reflecting human prejudices, recent scholarship by Philip M. Nichols has identified an additional threat: intentional manipulation of the underlying algorithms to benefit third parties.

Whether the bias is inadvertent or intentional, detecting it with advanced data analytics is extremely difficult, because bias can enter at any stage of AI development, from data collection to modeling.
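One way analysts screen hiring outcomes for this kind of bias, though the article does not name any specific method, is the “four-fifths rule” from the EEOC’s Uniform Guidelines on Employee Selection Procedures: a group’s selection rate below 80% of the highest group’s rate is treated as evidence of adverse impact. The numbers below are hypothetical.

```python
# Sketch of a disparate-impact screen based on the EEOC "four-fifths rule".
# All applicant counts below are hypothetical.

def selection_rates(outcomes):
    """outcomes: {group: (selected, applicants)} -> {group: selection rate}"""
    return {g: sel / total for g, (sel, total) in outcomes.items()}

def four_fifths_check(outcomes, threshold=0.8):
    """True if a group's rate is at least `threshold` times the best rate."""
    rates = selection_rates(outcomes)
    best = max(rates.values())
    return {g: rate / best >= threshold for g, rate in rates.items()}

# Hypothetical screening results: (advanced to interview, total applicants).
outcomes = {"group_a": (48, 100), "group_b": (27, 90)}
# group_b's ratio is 0.30 / 0.48 = 0.625, below the 0.8 threshold.
print(four_fifths_check(outcomes))
```

A check like this only inspects outcomes after the fact; as the article notes, bias introduced earlier in the pipeline, in the training data or the model itself, is much harder to see.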

Therefore, although organizations have access to leadership analytics tools grounded in research on leadership traits, the white male leader stereotype is deeply ingrained, and it is sometimes perpetuated even by people who are themselves diverse. It cannot be eliminated simply by developing an algorithm that selects leaders.

After the interviews

The data available to build these algorithms are growing exponentially.

One video interview service, HireVue, boasts of its ability to detect thousands of data points in a single 30-minute interview, from sentence structure to facial movements, to determine employability against other applicants.

Imagine the opportunity, then, for a current employer to collect data continuously to assess the leadership potential and promotion prospects of its workforce. For instance, workplace cameras can capture employees’ facial expressions throughout the day, particularly as they enter and exit the building.

Increasingly, the data are collected not just during the workday or at work, but during off-duty conduct as well. In a recent article, professor Leora Eisenstaedt identified workplace programs that gathered massive amounts of data on employees’ off-duty conduct, from Facebook posts and Fitbit usage, for example, without transparency about how the data would later be used. Employers then used those bits of data to draw correlations that predict workplace success.

As Eisenstaedt notes, most workers “will likely chafe at the notion that their taste in beer, love of indie rock and preference for the Washington Post, along with thousands of other variables, can be used to determine professional development opportunities, leadership potential and future career success.”

Nonetheless, that potential exists in workplaces today, and the law simply has not caught up with the vast amounts of data that employers collect and use when deciding whether investment in an employee’s promotion and leadership development is supported by the data.

In many cases, employees agree to the collection of metadata without a thorough understanding of what that data can reveal and how it can be used to help or hamper a career.


Julie Manning Magid does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

Authors: Julie Manning Magid, Professor of Business Law, IUPUI

Read more https://theconversation.com/does-your-ai-discriminate-132847
