Machines Can Save Us from the Mistakes of the Crowd

Categories: Big Data, Editorial


The classic example of ‘crowd wisdom’ dates back to 1906, when Sir Francis Galton observed a contest at a country fair in England in which attendees were asked to guess the weight of an ox. In what many consider to be the first experiment on crowd wisdom, the average of the 800 guesses was within one pound of the ox’s actual weight.

Consider that these kinds of experiments can now be run digitally, across cultures and time zones, almost instantaneously. The classic experiment was recently re-enacted with a digital crowd: a photo of a cow named Penelope was posted online, and viewers were invited to guess her weight. More than 17,000 votes were cast, and the average guess was within 5 percent of her true weight.
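The statistical effect behind both contests is easy to reproduce: individual guesses are noisy, but their errors tend to cancel when averaged. The simulation below is a minimal sketch of that idea; the true weight, number of guessers, and error spread are illustrative assumptions, not Galton's or Penelope's actual data.

```python
import random

random.seed(42)

TRUE_WEIGHT = 1355   # hypothetical true weight in pounds (illustrative only)
N_GUESSERS = 800     # same crowd size as Galton's ox contest

# Each guesser is unbiased but individually noisy (Gaussian error, SD 150 lbs).
guesses = [TRUE_WEIGHT + random.gauss(0, 150) for _ in range(N_GUESSERS)]

crowd_estimate = sum(guesses) / len(guesses)
individual_error = sum(abs(g - TRUE_WEIGHT) for g in guesses) / len(guesses)
crowd_error = abs(crowd_estimate - TRUE_WEIGHT)

print(f"average individual error: {individual_error:.1f} lbs")
print(f"crowd average error:      {crowd_error:.1f} lbs")
```

The crowd's averaged error comes out far smaller than the typical individual's, which is exactly what the wisdom-of-crowds result predicts, provided the individual errors are independent. The bias research discussed below is about what happens when they are not.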


This post by Alanna Lazarowich originally appeared at ChicagoInno under the title “Wise But Biased: How Machines Can Save Us from the Mistakes of the Crowd.” It is reposted by permission.

So crowds can be collectively ‘wise’, and with online access to large groups of individuals and real-time sharing of opinions, the power and frequency with which we can harness this wisdom are ever increasing.

New research, however, adds a kink to the benefits of crowd wisdom: the incredible power and ubiquity of modern crowds demand further scrutiny of their biases.

Market and Crowd Gender Bias
One might argue that the stock market is the ‘ultimate crowd’, with company valuations based on the collective perception of all investors. Advances in computational research show that even if individual actors are not biased, if they believe that others are, they will price this bias into the market.

Ned Smith, a professor at the Kellogg School of Management, studies decision-making in financial markets and teaches on the influence of crowds. In a recent study, he demonstrates that, all else held constant, greater media coverage of a CEO appointment results in negative market reactions for female CEOs but positive reactions for male CEOs. So much for an unbiased crowd.

In this new take on bias and markets, Professor Smith’s research suggests that it is not individuals attributing their own biases to the female CEO appointment. To be sure, there is plenty of research demonstrating the benefits of female executives, particularly increased innovation, collaboration, and even returns.

Rather, his research suggests that the negative market reaction is likely the result of individual assumptions that ‘others’ in the market will make biased investment decisions.

Professor Smith and PhD student Kevin Gaughan studied female CEO appointments at publicly traded companies from 2000 to 2015 and found that, while female appointments accounted for only 1 percent of all appointments, they received, on average, three times the media attention of male appointments. When a female appointment received little media attention, the market responded with a favorable return on the day of the announcement (holding other factors constant).

However, when media attention was high enough for investors to presume that all other investors had heard the news, the announcing company’s stock traded at a discount on the day of the announcement. Professor Smith proposes that investors discount the company’s value on the assumption that ‘others’ are negatively biased, a phenomenon referred to as ‘second-order bias’.


Note: The research of Professor Ned Smith was presented at the 2nd International Conference on Computational Social Science. Archived video presentations are available here.

Professor Smith also wrote about this in Fortune when GlaxoSmithKline announced the appointment of Emma Walmsley as CEO.

This kind of analysis is possible by applying sociological insights to large datasets, coupled with machine-learning techniques. In fact, his study had four times as many observations as any prior study of market response to female executive appointments.

Enter Machines and Natural Language Processing to Help Address Gender Bias
As machines allow us to analyze bias, we can also enlist computational partners to help identify, and even overcome, gender bias, even when it is not overt. A few innovative companies are using machines to address bias, and they are starting at what one might argue is the first step in any woman’s career: the recruitment process.

Textio uses machine learning to detect patterns in job postings, including evidence of bias. Its natural language processing engine has consumed over 54 million job postings to date, and corporate partnerships with companies as varied as CVS, Capital One, and Apple allow it to track outcomes: which candidates applied, and which received and accepted job offers.

Textio offers language suggestions that can help attract a more diverse group of applicants. For example, the service notes that the word ‘manage’ tends to draw more male job seekers and suggests alternatives such as handle, lead, or run. On average, companies whose job listings score highly in Textio recruit people who are 24 percent more qualified and 12 percent more diverse, and they do it 17 percent faster.
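The core idea, flagging words that skew applicant pools and proposing alternatives, can be sketched in a few lines. The word lists and suggestions below are illustrative assumptions for the sake of the example, not Textio's actual (proprietary) model or vocabulary; only the ‘manage’ → handle/lead/run substitution comes from the article.

```python
import re

# Hypothetical lexicon: masculine-coded words mapped to suggested alternatives.
# Only the 'manage' entry comes from the article; the rest are made up.
MASCULINE_CODED = {
    "manage": ["handle", "lead", "run"],
    "dominant": ["leading"],
    "competitive": ["results-oriented"],
}

def flag_posting(text: str) -> list[tuple[str, list[str]]]:
    """Return (flagged word, suggested alternatives) pairs found in the text."""
    words = re.findall(r"[a-z]+", text.lower())
    return [(w, MASCULINE_CODED[w]) for w in words if w in MASCULINE_CODED]

posting = "Seeking a dominant candidate to manage a competitive sales team."
for word, alternatives in flag_posting(posting):
    print(f"'{word}' may skew masculine; consider: {', '.join(alternatives)}")
```

A production system like Textio's presumably learns these associations from applicant-outcome data rather than a hand-built list, but the interface, text in, flagged phrases and suggestions out, is the same.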

According to Textio CEO Kieran Snyder, “addressing this kind of subtle gender bias is important in fields that are traditionally male dominated – such as technology or finance – but bias is everywhere and Textio helps customers from the arts, healthcare, and manufacturing too.” By crafting job postings that resonate more with women, companies seeking to attract more women now have machine partners that can help them overcome the implicit bias that was creating unintentional obstacles to female recruitment.

Addressing hiring practices is an obvious start to tackling underlying gender bias, especially when that bias is not immediately evident to the human eye. And while machines are not free of the inherent biases of their human collaborators, future human-machine partnerships may target other forms of implicit and unintentional bias as well.
