
Sunday, 30 October 2022

Hitting the Books: AI could help shrink America's gender wage gap

Women have faced gender-based discrimination in the workforce throughout history, denied employment in all but a handful of subservient roles, regularly passed over for promotions and pay raises — and rarely compensated at the same rates as their male peers. This long and storied socioeconomic tradition of financially screwing over half the population continues largely unabated into the 21st century, where women still make 84 cents for every dollar that men do. In her new book, The Equality Machine: Harnessing Digital Technology for a Brighter, More Inclusive Future, Dr. Orly Lobel, Professor of Law and founding member of the Center for Intellectual Property Law and Markets at the University of San Diego, explores how digital technologies, often maligned for their roles in exacerbating societal ills, can be harnessed to undo the damage they’ve caused.

The Equality Machine cover (Public Affairs)

This article has been excerpted from The Equality Machine: Harnessing Digital Technology for a Brighter, More Inclusive Future by Orly Lobel. Copyright © 2022. Available from PublicAffairs, an imprint of Perseus Books, LLC, a subsidiary of Hachette Book Group, Inc.


For years, the double standard was glaring: employers demanded secrecy about salaries while asking prospective employees for their salary histories. Now, we can tackle both ends of this asymmetry. Just as digitization is helping to reverse information flows to foster more transparency in the market about employees’ worth, new laws are also directing employers to not rely as much on past pay levels, which can be tainted by systemic inequality. In 2016, Massachusetts became the first state to pass a law prohibiting employers from asking job candidates about their salary histories. Since then, more than a dozen states have followed suit.

Barring employers from asking prospective job candidates about their salary histories has two goals. The first is breaking the vicious pay gap cycle, which emerges when women are paid less at a previous job and that gap is then replicated by the next employer. The second is addressing gender differences in the negotiation process. Salary figures are plagued by gender disparity, and they can perpetuate and further exacerbate existing market disparities. When a woman discloses that she currently earns less than a man, she could be harming her salary trajectory — both in the applied-for position and for the rest of her career. Each time she discloses her current salary to a potential employer, that gap is likely to grow, as recruitment offers and promotions are often framed as a percentage increase over current base salary. Rather than relying on biased figures, bans on salary history inquiries induce employers to use other ways to determine a potential employee’s worth, including a shift to automated computation. Employers using market and internal data can consider merit-related characteristics when determining pay, such as experience, training, education, skill, and past performance.
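To make the idea of merit-based computation concrete, here is a minimal sketch of what such an automated estimate might look like, assuming a small internal compensation dataset. The column names, numbers, and model choice are illustrative assumptions, not anything drawn from the book.

```python
# Hypothetical sketch: estimating a pay offer from merit-related features only.
# All column names and figures are invented for illustration.
import pandas as pd
from sklearn.linear_model import LinearRegression

# Internal compensation data; note that the candidate's salary history is deliberately absent.
comps = pd.DataFrame({
    "years_experience": [2, 5, 8, 3, 10, 6],
    "education_level":  [1, 2, 2, 1, 3, 2],   # 1 = BA, 2 = MA, 3 = PhD (illustrative coding)
    "skill_rating":     [3.2, 4.1, 4.5, 3.8, 4.7, 4.0],
    "past_performance": [3.0, 4.0, 4.5, 3.5, 5.0, 4.2],
    "current_pay":      [62000, 78000, 95000, 68000, 120000, 88000],
})

features = ["years_experience", "education_level", "skill_rating", "past_performance"]
model = LinearRegression().fit(comps[features], comps["current_pay"])

# Score a new candidate on merit alone, with no anchoring on what they earned before.
candidate = pd.DataFrame([{"years_experience": 4, "education_level": 2,
                           "skill_rating": 4.0, "past_performance": 4.1}])
print(f"Suggested offer: ${model.predict(candidate)[0]:,.0f}")
```

Note that the internal pay data used for training can itself carry historical bias, which is exactly the feedback-loop risk the next paragraph describes.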

And yet, as we have seen, human bias can creep into our algorithms, and an algorithm that is fed data tainted by salary bias is likely to perpetuate that bias itself. Feedback loops are digital vicious cycles that can result in self-fulfilling outcomes. Once again: bias in, bias out. The risk is that an algorithm will learn that certain types or categories of employees are on average underpaid, and then calculate that into salary offers. This is the wrong that recent policy has been designed to eliminate — and that we can program AI to avoid. Removing the anchored numerical figure encourages employers to proactively assess pay based on the company’s needs and the candidate’s fit rather than on a tainted number. At the same time, having pay scale information for a job but not having a salary history on the table can embolden women to ask for more.
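One way to catch such a feedback loop is to audit a pay model's recommendations for a group-level gap at comparable merit. The sketch below is a hedged illustration of that kind of check; the data, field names, and merit-band grouping are assumptions for demonstration only.

```python
# Hypothetical "bias in, bias out" audit: after a pay model produces offers,
# check whether it reproduces a gender gap already present in historical data.
import pandas as pd

offers = pd.DataFrame({
    "gender":      ["F", "M", "F", "M", "F", "M"],
    "merit_score": [4.1, 4.0, 3.8, 3.9, 4.5, 4.4],   # composite of experience, skill, performance
    "model_offer": [81000, 86000, 74000, 79000, 93000, 97000],
})

# Compare average offers within narrow merit bands; a persistent gap at equal merit
# signals that biased historical pay has leaked into the model's recommendations.
offers["merit_band"] = offers["merit_score"].round(0)
gap = offers.pivot_table(index="merit_band", columns="gender", values="model_offer")
gap["gap_pct"] = (gap["M"] - gap["F"]) / gap["M"] * 100
print(gap)
```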

What’s more, AI can also help in the future — maybe not even the distant future — by replacing some of the negotiation that takes place in unequal settings. Empirical studies on negotiation differences between men and women have repeatedly shown that women on average negotiate less, and that when they do, employers react negatively. Women don’t ask for higher salaries, better terms, promotions, or opportunities nearly as frequently as men do. In my research, I’ve called this the negotiation deficit. In one study at Carnegie Mellon University, 93 percent of female MBA students accepted an initial salary offer, while only 43 percent of men did. In another study, female participants simulating salary negotiations asked for an average of $7,000 less than male participants. Economists Andreas Leibbrandt and John List have also found that while women are much less likely to negotiate with employers over salary, this difference disappears when all job seekers are explicitly told that pay is negotiable, mitigating the pay gap. My own experimental research with behavioral psychologist and law professor Yuval Feldman, my longtime collaborator, has found that women in some work environments act less as “homo economicus” — that is, as rational economic actors — and more as altruistic social actors, such that women do not demand as much for themselves as men do, and are more likely to value non-monetary benefits, such as a good corporate culture.

Can these research insights offer us clues for developing new software tools that will spur women to negotiate? Digital platforms can serve employees by providing advice and information on asking for a raise or preparing for an interview. Information on pay—and especially an explicit expectation that pay can and should be negotiated—can empower applicants to negotiate higher salaries before accepting job offers. The digital platform PayScale conducts annual surveys asking thousands of job seekers whether they disclosed their pay at previous jobs during the interview process. PayScale’s 2018 survey found that women who were asked about their salary histories and refused to disclose were offered positions 1.8 percent less often than women who were asked and disclosed. By contrast, men who refused to disclose when asked about salary history received offers 1.2 percent more often than men who did disclose.

Even when women do negotiate, they are treated differently. In my research, I call this phenomenon the negotiation penalty. Women are told to “lean in” and make demands, but the reality is that for centuries, women have been universally viewed as weaker negotiators than their male counterparts. In one series of experiments, participants evaluated written accounts of candidates who did or did not initiate negotiations for higher salaries. The results in each experiment showed that participants penalized female candidates more than male candidates for initiating negotiations, deeming women who asked for more not “nice” or too “demanding.” While qualities such as assertiveness, strength, and competitiveness culturally benefit male negotiators, women who display such characteristics are often considered too aggressive. Another study looked at data from a group of Swedish job seekers and found not only that women ended up with lower salaries than equally qualified male peers, but also that they were often penalized for negotiating like those peers. Virtual settings may shift these dynamics. Nick Yee and Jeremy Bailenson have shown that attractive avatars lead to more intimate behavior with a confederate in terms of self-disclosure and interpersonal distance. In a second study, they also observed that tall avatars lead to more confident behavior than short avatars in a negotiation task. They term this the Proteus Effect (the Greek god Proteus was known to have the ability to take on many self-representations). The Proteus Effect suggests that the visual characteristics and traits of an avatar are associated with correlating behavioral stereotypes and expectations, including those that affect the way we negotiate.

The eleventh annual competition for artificial intelligence trained to negotiate — the Hagglebot Olympics, as it’s been termed in the popular media — was held in January 2021; universities from Turkey and Japan won this time. In some experiments involving negotiations with bots, most people did not even realize they were talking to a bot rather than another person — the bots had learned to hold fluent conversations that completely mimicked humans. Using game theory, researchers are steadily improving the ways bots can negotiate on behalf of humans, eliminating some of the aspects in which we humans are fallible, like trying to factor in and weigh many different aspects of a deal at once. AI can now predict the other side’s preferences quite quickly. For example, an AI listening to the first five minutes of a negotiation can predict much of the eventual deal just from the negotiators’ voices. Tracking these speech patterns through machine learning, researchers have found that when a negotiator’s voice varies a lot in volume and pitch, they are likely to be the weaker player at the table, and that when the negotiating sides mirror each other, they are closer to reaching an agreement.

Using AI has also helped uncover the ways in which women are penalized at the negotiation table. A new study out of the University of Southern California used a chatbot that didn’t know the gender identities of participants to evaluate negotiation skills. The study showed that most of us — both men and women — do quite badly at negotiating salaries: over 40 percent of participants didn’t negotiate at all, and most people left money on the table that they could have received. Women valued stock options less than men did as part of their compensation packages, affecting women’s likelihood of accumulating wealth over time.

These advances can also help with negotiation disparities across different identities. A group of Israeli and American researchers looked at how a smart computer can negotiate with humans from different cultural backgrounds. Without telling the machine anything about the characteristics of people from three countries — Israel, Lebanon, and the United States — they let the AI learn about patterns of cultural negotiation differences by playing negotiation games. They found that the computer was able to outperform people in all three countries. These developments are promising. We can envision bots learning about negotiation differences and ultimately countering those differences to create more equitable exchanges, level the playing field, and achieve fair outcomes. They can be designed to tackle the specific distributive goals we have.
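As a rough illustration of the multi-issue weighing these negotiation bots automate (and not a reconstruction of the Hagglebot systems described above), the sketch below scores an offer against a weighted utility over several issues and decides whether to accept or counter. The issue names, weights, and thresholds are all assumptions for the example.

```python
# Illustrative multi-issue negotiation agent: weighs several aspects of a deal
# at once and either accepts or counters. All parameters are invented.
from dataclasses import dataclass

@dataclass
class Offer:
    salary: float        # dollars
    equity: float        # percent
    remote_days: int     # days per week

# The agent's preferences: normalized weights over the issues it cares about,
# and target values that would fully satisfy it on each issue.
WEIGHTS = {"salary": 0.6, "equity": 0.3, "remote_days": 0.1}
TARGETS = {"salary": 120_000, "equity": 0.5, "remote_days": 3}

def utility(offer: Offer) -> float:
    """Score an offer in [0, 1] as a weighted fraction of each issue's target."""
    return sum(
        WEIGHTS[issue] * min(getattr(offer, issue) / TARGETS[issue], 1.0)
        for issue in WEIGHTS
    )

def respond(offer: Offer, accept_threshold: float = 0.85) -> str:
    """Accept good offers; otherwise counter on the issue furthest from its target."""
    if utility(offer) >= accept_threshold:
        return "accept"
    worst = min(WEIGHTS, key=lambda issue: getattr(offer, issue) / TARGETS[issue])
    return f"counter: ask for more {worst}"

print(respond(Offer(salary=105_000, equity=0.25, remote_days=2)))
```

Because the agent applies the same utility function to every counterpart, it does not ask for less on behalf of one group than another, which is the sense in which such tools could help level the playing field.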



Via Engadget: https://ift.tt/ksiAPD9
