Unethical algorithms are taking your airplane seat for a ride

Algorithms have been blamed for everything from promoting gender bias in employment to racial stereotyping in criminal sentencing. Now there is another issue to throw at them – the planned separation of families on airlines. The charge is that airlines aren’t separating families out of spite: they are doing it so that family members will pay more to be seated back together.

“Some airlines have set an algorithm to identify passengers of the same surname travelling together,” the UK’s Digital Minister Margot James told a committee. Those family members, she explained, are then distributed around the plane and charged extra when they have the temerity to want to sit together. She described this as an example of algorithms and software being misused to exploit and hoodwink the public.
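The surname check Ms James describes could be as simple as grouping a booking’s passengers by last name and flagging any name that appears more than once. A minimal sketch of that idea (the function and data below are hypothetical illustrations, not any airline’s actual code):

```python
from collections import defaultdict

def find_family_groups(passengers):
    """Group passengers by shared surname.

    `passengers` is a list of (surname, seat) tuples; any surname
    appearing more than once is flagged as a potential family group
    (hypothetical logic for illustration only).
    """
    groups = defaultdict(list)
    for surname, seat in passengers:
        groups[surname].append(seat)
    return {name: seats for name, seats in groups.items() if len(seats) > 1}

booking = [("Smith", "12A"), ("Smith", "12B"), ("Jones", "14C"), ("Smith", "12C")]
print(find_family_groups(booking))  # {'Smith': ['12A', '12B', '12C']}
```

A real system would presumably match on booking reference rather than surname alone, but the principle – detect the group, then price the adjacency – is the same.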

While some people might want to sit a long way from their relatives, aviation experts say the practice poses a safety risk. In an emergency, many passengers’ first inclination is to rejoin their family rather than exit through the nearest door.

Every breath you take, every place you sit

Seat splitting has already been the subject of a UK Civil Aviation Authority investigation over the past year. Its most recent survey, of 4,296 people in October 2018, found that the odds of being separated vary wildly by airline. The devil is in the details, and the research suggests some airlines take a much more aggressive stance than others: Flybe and TUI Airways separated only about 12 per cent of passengers, a figure that nearly triples for Ryanair. Ryanair has denied having any such family-separation and revenue-enhancement policy.

The idea of paying extra for a particular seat, beyond the usual first-class and economy split, is relatively new. It dates back to 2011, when budget airline easyJet first introduced the concept; it has since been adopted by a wide range of carriers.

Here come the algorithm ethicists

The airline seating case is one of the first that the newly created UK Centre for Data Ethics and Innovation is set to examine. And while ethics may seem rather soft and squishy, competition and consumer-rights agencies will also be scrutinising these practices for signs of bias.

Once upon a time, it looked like mathematical formulas might take the bias out of pricing and selection. The opposite seems to be true. Algorithms actually reflect the biases of the people who write them, said Sascha Eder, founder and COO of NewtonX, which bills itself as the world’s first AI-powered expert network. His basic suggestion: take algorithms out of their hallowed black boxes so people can take a harder look at the biases baked into them.

New York City may be ahead of the pack in this respect. The city passed a law this year, the Algorithmic Accountability Bill, which set up a task force to examine the fairness and validity of algorithms used by city agencies. After all, if the public is directly affected by these formulas, it is entitled to know what went into the math.

As a PR consultant and journalist, Frink has covered IT security issues for a number of security software firms, as well as providing reviews and insight on the beer and automotive industries (though usually not at the same time). Otherwise, he’s known for making a great bowl of popcorn and extraordinary messes in a kitchen.