Thursday 19 November 2020

It Was Not Me, It Was My Algorithm -それでも僕はやってない。アルゴリズムのせいでした。-

I recently had a problem with my bank.  I have been buying audio books from a company owned by Amazon for many years.  I used to make monthly credit card payments, which allowed me to listen to one book each month.  Then I noticed that it was better value for money to buy 24 book credits at the same time. 

So I tried to buy 24 book credits in one payment.  But my bank blocked the money transfer from my credit card.  I couldn’t understand why, because there was plenty of money in my account.  When I called up the bank to find out what had happened, they said that a computer algorithm had stopped the payment.  Because there was a change in my behaviour, the computer programme decided that something was suspicious. 
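The bank never explained its rules, but the kind of check the clerk described could be as simple as comparing a new payment with your past payments.  Here is a toy sketch of such a rule; the function name, the 3x threshold, and the amounts are all invented for illustration, not the bank's actual system:

```python
# Toy sketch of a rule-based fraud check: flag a payment if it is much
# larger than the customer's usual payments to the same company.
# The 3x multiplier is an invented threshold; real systems are far
# more complicated (and far less explainable).

def looks_suspicious(past_payments, new_amount, multiplier=3.0):
    """Return True if new_amount is more than `multiplier` times the
    average of the customer's past payments to this company."""
    if not past_payments:
        return True  # no history at all: treat as unusual
    average = sum(past_payments) / len(past_payments)
    return new_amount > multiplier * average

# Years of identical monthly charges, then 24 credits bought at once:
monthly = [15.0] * 36
print(looks_suspicious(monthly, 15.0))       # the usual payment passes
print(looks_suspicious(monthly, 24 * 15.0))  # the bulk purchase is blocked
```

Notice that the rule has no idea *who* the company is or *why* the amount changed; it only sees that the number is different from before.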

“But surely you can see that it is not suspicious,” I said.  “I have been making payments to this same company for years.  There is plenty of money in my account.  And anyway, it is a company owned by Amazon.  They are one of the biggest companies in the world.  Surely Jeff Bezos doesn’t need to steal a little money from one customer.” 

“Yes, I see,” said the bank clerk.  “But it can’t be helped.  The computer algorithm decides by itself what looks suspicious.” 

This is the new world that we live in.  Human beings who don’t want to take responsibility for making mistakes can just blame the computer algorithm.  But who set the rules for the algorithm?  Who decided to have decisions made by a computer? 

My problem was a very minor one compared to some of the other problems caused by an over-reliance on algorithms.  I listened to a radio programme recently about “predictive policing,” which is used in America. 

In some police forces, a computer algorithm tries to predict which citizens are likely to commit crimes in the future.  The police are then warned to be careful of Mr. X and Ms. Y, who have not yet committed any crimes, but who might commit crimes in the future.  The computer uses a points system.  If you contact the police for any reason, then you are given some points.  In other words, if someone steals your car and you inform the police, the computer algorithm becomes a little suspicious of you.  If you witness a crime and help the police to put a criminal in jail, you are given points.  If one of your relatives or neighbours commits a crime, then you are given points.  After your points total goes above a certain level, the police are warned about you, and start to harass you. 
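The points system described in the radio programme can be sketched in a few lines of code.  The point values and the warning threshold below are my own inventions; the real scoring rules were not published, which is part of the problem:

```python
# Toy sketch of a predictive-policing points system.  Every value here
# is invented for illustration; the point of the sketch is that each
# event can be completely innocent, yet the total still flags you.

POINTS = {
    "reported_crime": 1,      # e.g. your car was stolen and you told the police
    "witnessed_crime": 2,     # you helped put a criminal in jail
    "relative_convicted": 3,  # a relative or neighbour committed a crime
}
WARNING_THRESHOLD = 5  # invented cut-off

def risk_score(events):
    """Sum the points for each recorded contact with the police."""
    return sum(POINTS.get(event, 0) for event in events)

def flagged(events):
    """True once the total goes above the threshold, regardless of
    whether any event involved wrongdoing by this person."""
    return risk_score(events) > WARNING_THRESHOLD

# A perfectly innocent citizen: crime victim once, witness once,
# and has a relative with a conviction.
events = ["reported_crime", "witnessed_crime", "relative_convicted"]
print(risk_score(events), flagged(events))
```

The sketch makes the absurdity visible: being a victim, being a good witness, and having an unlucky relative are all treated as evidence against you.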

In the radio documentary, there were perfectly innocent people who were being woken up at 3 o’clock in the morning by police officers carrying guns, just to make sure that they were not committing any kind of crime.  Of course, if the police look hard enough, they can probably find some kind of crime.  One of these “suspects” was arrested because he had not written his name on his letter box, which is required by law in Florida.  So, in a very stupid way, the computer algorithm’s prediction that Mr. X would be found guilty of a crime was proven true. 

Some people hate people because people are not perfect.  We make mistakes.  But it is stupid to think that computers will do any better.  After all, they are designed and built by people.  They have all of the flaws that people have, but none of the charm.


Vocabulary:

predictive policing – the use of mathematics or statistical analysis to guess where crime might happen

to harass someone – to unfairly annoy, bother, trouble, etc. someone

a suspect – someone believed to be or suspected of being involved in a crime

a letter box – a place where mail can be delivered

a flaw – something imperfect or slightly wrong

