An algorithm is meant to perform routine, code-based tasks without human interaction. In theory, this automation is supposed to make life easier. But it also comes with a dark side.
The most recent example of algorithms gone awry comes from the airline industry, where bots are being used to squeeze more money out of families. Passengers in Europe took (irate) notice when airline booking algorithms charged families a premium for trying to purchase multiple seats together.
The UK’s Civil Aviation Authority (CAA) has been investigating airline pricing algorithms for over a year. Its latest findings show that premium pricing for groups seated together varies from airline to airline, but is widespread nonetheless. (The main offender was discount airline Ryanair.)
The negative perception of algorithms goes further. Pew Research Center data shows 57 percent of respondents think using algorithms for résumé screening is “unacceptable,” even though the technology is widely used by recruiters and hiring managers. Interviewing.io’s Aline Lerner says bots may be to blame for that perception, and that anonymizing candidates yields better results for employers: anonymous candidates get more interviews because bots can’t scan for items that might inadvertently disqualify good candidates, such as a non-traditional education.
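Lerner’s anonymization approach can be sketched as a simple pre-processing step that strips the fields a screening bot might misuse before it ever sees a candidate. This is a hypothetical illustration; the field names and candidate record below are invented, not taken from interviewing.io or any real screening product:

```python
# Hypothetical sketch of candidate anonymization before automated screening.
# Field names are illustrative only.

SCREENING_BLIND_FIELDS = {"name", "education", "graduation_year", "photo_url"}

def anonymize(candidate: dict) -> dict:
    """Remove fields a screening bot could use to inadvertently
    disqualify candidates with non-traditional backgrounds."""
    return {k: v for k, v in candidate.items()
            if k not in SCREENING_BLIND_FIELDS}

candidate = {
    "name": "Jane Doe",
    "education": "coding bootcamp",
    "years_experience": 5,
    "skills": ["python", "sql"],
}

# Only experience and skills remain for the screener to evaluate.
print(anonymize(candidate))
```

The point of the design is that the bot never receives the disqualifying signal in the first place, rather than being trusted to ignore it.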
Pew’s data suggests people blame people for the way algorithms are used: 58 percent of respondents say bots taught by humans will always contain some bias. Parsed by age group, most older Americans believe algorithms are biased, and even among 18- to 29-year-olds, nearly half (48 percent) hold a healthy distrust of our digital stand-ins.
We’ve taught algorithms to be biased. Whether that’s premium pricing for a family of four to sit together on a plane, or disqualifying a job candidate who didn’t go to Stanford, we have only ourselves to blame. The natural assumption is that wonky algorithms can be cured by human oversight, but that’s no proven panacea. In fact, regulation may be needed to ensure algorithms don’t act in ways that humans otherwise wouldn’t.
If premium pricing for families isn’t a hard-and-fast rule for an airline, an algorithm shouldn’t try to make it happen. If a master’s degree isn’t required, maybe the bootcamp graduate is just as good as that Stanford alum. Those are rules a human would be able to follow, and it’s time algorithms and bots were held to the same standard.
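One way to hold a bot to that standard is to let it apply only charges a human has explicitly approved. The sketch below is hypothetical (the rule names and amounts are invented, not any airline’s actual pricing code), but it shows the idea: if group seating isn’t a stated surcharge, the algorithm simply cannot invent one.

```python
# Hypothetical sketch: a pricing function that can only apply
# surcharges drawn from an explicit, human-approved rule list.

APPROVED_SURCHARGES = {
    "checked_bag": 25.00,
    "extra_legroom": 40.00,
    # No "group_seating" entry: if seating families together
    # isn't a stated rule, the algorithm can't charge for it.
}

def quote(base_fare: float, requested_extras: list[str]) -> float:
    """Price a ticket using only explicitly approved surcharges."""
    total = base_fare
    for extra in requested_extras:
        if extra not in APPROVED_SURCHARGES:
            raise ValueError(f"No approved rule for surcharge: {extra}")
        total += APPROVED_SURCHARGES[extra]
    return total

print(quote(100.0, ["checked_bag"]))  # 125.0
```

The design choice is deliberate: unapproved surcharges fail loudly instead of being silently applied, so the bot can never act in a way its human operators wouldn’t.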