
When algorithms get the wrong results

By Sibahle Malinga, ITWeb senior news journalist.
Johannesburg, 11 Nov 2016
We live in an amazing world where algorithms actually understand us better than we understand ourselves, said Eighty20's Andrew Fulton.

Big data analytics can be a powerful way to better understand our world and the people in it. Yet none of the online polling conducted in the run-up to the US election predicted that Donald Trump would be elected president.

This was according to Andrew Fulton, director of big data consulting company Eighty20, speaking at "The truth about data sets" forum organised by MediaCom yesterday. Fulton said the election of Donald Trump as US president this week was a pivotal event, and that it, together with the Brexit online predictions, serves as an extreme cautionary tale about how algorithms can get results wrong.

"What did we get wrong in data analytics? This will go down as one of the biggest mistakes in big data history. The problem with big data analysis in SA and internationally is that we're so obsessed with predicting people's opinions instead of using big data to listen to them.

"Big data is made up of data sets with sizes beyond the ability of commonly used software tools to capture, curate, manage and process the data within a tolerable lapse time. The three V's which are used as characteristics of data are volume, velocity and variety," he explained.

The big data revolution, explained Fulton, started when Google pioneered the idea that, given enough data and computing power, an all-encompassing system could understand individuals better than they understand themselves.

"For instance if you ask Google the question: Is my husband gay? The irony of this is that Google is able to answer that question with an 80% accuracy rate. Google is able to determine whether or not one is straight or gay, based on the things users "liked" on social media and the posts they put up," he continued.

Fulton made reference to a Microsoft research study, which analysed 70 000 tweets written by people suffering from depression. The researchers then created an algorithm that could determine whether people were depressed, based on what they were tweeting.
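Again as an illustration only: a common baseline for this kind of task is TF-IDF features over the text plus a linear classifier. The tweets and labels below are invented placeholders, not Microsoft's data or method:

    # Toy sketch of flagging depression-related text; not the actual study.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    tweets = [
        "can't sleep again, everything feels pointless",  # placeholder text
        "great run this morning, feeling strong",
        "another day I couldn't get out of bed",
        "dinner with friends tonight, can't wait",
    ]
    labels = [1, 0, 1, 0]  # 1 = flagged, 0 = not flagged (toy labels)

    classifier = make_pipeline(TfidfVectorizer(), LogisticRegression())
    classifier.fit(tweets, labels)
    print(classifier.predict(["feeling exhausted and alone lately"]))

A real system would draw on far richer signals than a few sentences of text; the point of the sketch is only that the underlying pipeline is ordinary supervised learning.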

"We live in an amazing world where these systems actually understand us better than we understand ourselves and they are more honest than we are. The problem is we got led astray with this whole thing of algorithms and unfortunately an algorithm is just a sledgehammer without a lot of accuracy or thought process that goes into it.

"There are creepy mathematical models that have been deployed often against people who are already struggling, in order to make things better for businesses, such as credit agencies that score consumers to determine how much they can afford to pay for insurance, and also determine whether or not they should get credit, and these tools are also doing the majority of share trading on stock exchanges now," he asserted.

We are putting some of the most important decisions of our lives into the hands of Tinder and other dating applications, which use algorithms to determine the partner we spend the rest of our lives with, observed Fulton.

But algorithms, he warned, can also get it wrong.

"Big data sets, especially predictive analytics have great potential, but organisations must have beyond a small number of experts to deliver value and high quality results. Using data analytics must be led by ethical behaviour and be used to achieve objectives that go beyond just spamming people," he concluded.
