The problem of fairness in an automated world
What makes a machine's decision "fair"? So far, public debate has mostly focused on the issue of bias and discrimination. This is understandable: most people would expect machines to be less biased than humans (indeed, this is often given as a reason for using them in processes such as hiring), so it is right to pay attention to evidence that they can be biased too.
But the word "fair" has many interpretations, and "unbiased" is only one of them. Recently, I found myself on the receiving end of an automated decision, which made me think about what it really means to feel you have been treated fairly, and how hard it would be to uphold those principles in an automated world.
I have a personal Gmail account that I use for correspondence about a book project I am working on. I woke up one morning in November to discover that I could no longer access it. Google's message said my access was "globally restricted" because "Gmail is being used to send unwanted content", and that spamming violates Google's policies. The note stated that the decision had been made by "automatic processing", and that if I thought it was a mistake, I could submit an appeal.
I hadn't sent any spam and couldn't imagine why Google's algorithm thought I had. That made it hard to know what to write in the "appeal" box, beyond a panicked version of "I didn't do it (whatever it was)!" and "Please help, I really need access to my emails and files." (To my relief, I later realised I had not lost access to my Drive.)
Two days later, I heard back: "After reviewing your appeal, your account's access remains restricted for this service." I was given no further information about what I had supposedly done or why the appeal had been rejected, but I was told that "if you disagree with this decision, you can submit another appeal." I tried again and was rejected again. I did this a few more times, curious by this point about how long the loop could continue. A look at Reddit suggested other people had been through similar experiences. I eventually gave up. (Google declined to comment on the record.)
Among regulators, one popular answer to the question of how to make automated decisions "fairer" is to insist that people can ask for a human to review them. But how effective is that remedy? For one thing, people are prone to "automation complacency": a tendency to place too much trust in the machine. In the UK Post Office scandal, for example, in which sub-postmasters were wrongly accused of theft because of a defective computer system called Horizon, a judge concluded in 2019 that people at the Post Office had shown "institutional obstinacy or refusal to consider any possible alternatives to their view of Horizon".
Ben Green, an expert in algorithmic fairness at the University of Michigan, says there can also be practical problems within organisations. "Often human overseers are on a tight schedule; they have many cases to review," he told me. "Many of the cases I've looked at are ones where a decision is based on some kind of statistical prediction," he said, but "people are not very good at making those predictions, so why would they be good at assessing them?"
Once my impotent rage over my email had faded, I found I had some sympathy for Google. With so many customers, an automated system is the only practical way to detect violations of its policies. And while it felt deeply unjust to have to plead my case without knowing what had triggered the system, or being given any explanation of the pitfalls to avoid in an appeal, I could see that the more detail Google offered about how the system works, the easier it would be for bad actors to get around it.
But that's the point. In an automated world, the goal of procedural fairness, that is, people's sense that the process deciding their case treats them fairly, often comes into conflict with other goals, such as the need for efficiency, privacy or security. There is no easy way to make these trade-offs disappear.
As for my account: when I decided to write about my experience for this column, I emailed Google's press office with the details to see if I could talk to someone about the problem. By the end of the day, my access to the account had been restored. I was pleased, of course, but I doubt many people would regard that as particularly fair.