Lunch and artificial intelligence: Part 2

I will concede up front that this article was much harder to write than I anticipated without getting into how AI/machine learning/deep learning are implemented, which I had promised not to do. At least I mostly steered clear of code, even though the temptation was there. I still feel I haven’t done AI/machine learning/deep learning much justice, so I will write another post that skips the explanations and just gives examples of where these systems are being applied in the real world. Without further ado, let’s get into it:

Last week, I published an article about a lunch I had with my friend, who wanted to know more about this new buzzword: artificial intelligence. If you haven’t read the article yet, you can read it here. Thank you to everyone who has already done so, and especially those who asked me questions in person, via inbox, WhatsApp, etc. I was glad to hear that you learnt something new from that post. In this second part, I promised to talk a bit about machine learning and deep learning. Unless you live under a rock, you have seen outlets publish tons of articles about how these systems are changing the world. But to someone who has never bothered to read them, or who assumed this is stuff for nerds only, what exactly are they?

This morning I had an interesting conversation with a colleague when I got to work. I was telling him a joke that my AI told me before I left home. “Don’t even get me started on Velcro. What a rip off…” the AI said. Good one, right? Instead of giggling at the silly joke, my friend asked, “Wait, you have an AI at home?” “Huh? Of course I do. Don’t you have one too?” I responded. “Yoh, yoh, yoh. How did you get hold of that, my man?” he questioned me. I knew where this was going. His mind was already picturing a robot or something futuristic, so I quickly replied, “Oh no, man. It came with my phone. I didn’t build it. I just prompted it this morning by saying ‘Hey Siri. Tell me a joke’ and it told me a joke.”

Besides hoping you will giggle at the joke too, I recounted this conversation because AI systems are all around us, whether we care to understand them or not. With the proliferation of rich data being generated everywhere, they are only going to become even more ubiquitous.
Just like the previous article, the aim during that lunch, and right now, is to help you get a basic understanding of AI/machine learning/deep learning, hopefully enough to get you excited about the disruption coming to many industries in the next few years.

What we know about computers is that they are extremely good at doing some tasks much better than us humans, for example crunching large numbers as quickly as possible. If you were to add every single integer/whole number between 1 and 1 billion without any computer (no calculator either, because that is a basic computer), it would take you quite a bit of time, wouldn’t it? A computer won’t even break a sweat (or a circuit?). The ability to do these tasks much better than us means computers can consume and analyse data in quantities, and at speeds, that humans will never match without them. The implication is that any industry where data can be collected, analysed and used to inform business decisions can, and probably will, benefit from computers, because they do these tasks far faster and more efficiently than humans. But how can computers do this? This is where AI/machine learning/deep learning come in. Last week we discussed AI at a high level. This week, let’s start with machine learning.
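If you are curious just how unfair that contest is (feel free to skip the code bits in this post), here is a tiny optional sketch. The brute-force loop is what a human would be stuck doing; the shortcut formula n × (n + 1) ÷ 2, known since Gauss was a schoolboy, gives the answer instantly:

```python
# Summing every integer from 1 to n, two ways: the brute-force loop
# a human would dread, and the closed-form shortcut n * (n + 1) // 2.

def sum_by_loop(n):
    total = 0
    for i in range(1, n + 1):
        total += i
    return total

def sum_by_formula(n):
    return n * (n + 1) // 2

# On a small range, both agree:
assert sum_by_loop(10_000) == sum_by_formula(10_000) == 50_005_000

# For 1 billion, the formula answers before you can blink:
print(sum_by_formula(1_000_000_000))  # 500000000500000000
```

Even without the shortcut, an ordinary laptop grinds through the billion-step loop in seconds, which is exactly the point.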

To plagiarise Wikipedia, machine learning is defined as,

“…a field of computer science that often uses statistical techniques to give computers the ability to “learn” with data, without being explicitly programmed.”

In very basic terms, this says that machine learning algorithms are a set of techniques that help a computer analyse data and draw conclusions from it. That’s it. Feed the algorithm data, it will work its magic (learn from the data) and give you a result. Let’s discuss an example of how machine learning works.

Would you have survived the Titanic? Machine learning has the answer

Picture us getting into my imaginary time machine and going back to 15 April 1912, the fateful day the Titanic sank. Imagine we are part of White Star Line, the owners of the ship, and a few days after the disaster, we are in our disaster room trying to figure out who survived and who didn’t, because we cannot account for some people. On the table, we have files of those who made it and those who were unfortunate. In our hands, we have the record of a young lady, aged 25, who was staying in a cabin on the second level and was travelling alone. Did she survive or not? We have no idea. What about the father who, with his wife and three children, booked into the family cabin on level 4? There is no way to tell with 100% accuracy who made it to the lifeboats and who didn’t, BUT with machine learning, we can get very close.

Here’s the high-level view. We can take the data of those we have already accounted for – their age, gender, cabin, number of siblings they had on the ship etc. and, very importantly, whether they survived or not – then code an algorithm (sounds intense, but it may actually be just 5 lines!) which will take in this data and analyse it in some clever ways. The computer will figure out by itself (“…without being explicitly programmed” from the definition above) what factors contributed to someone surviving or not surviving. Was it the gender? Was it the age? Cabin level? Number of siblings on board, because maybe as the ship was sinking, those who had many siblings aboard spent time looking for their family members and didn’t rush to the lifeboats? Or is it a combination of all these factors, plus some more I didn’t state here that could be in our data? The computer does all this “thinking” by itself and creates a general profile of a person who survived and another of a person who didn’t.
Without getting into the detail of how it does this, or how we would test its accuracy, say we are happy with the general profiles it has defined and we have tweaked our code to increase accuracy. We can then take the records of passengers whose fate we do not know and ask the computer to tell us whether each person survived. The computer will say, “Okay, based on the general profiles I created from analysing the data of people whose fate was known, I am x% sure this person, with this age, gender, number of siblings and cabin level, survived or did not survive.” That is machine learning! The computer learnt by itself what factors contributed to someone surviving or not, based on the data we gave it, and was then able to make predictions about the people we cannot account for. We leave our disaster room and, unfortunately, break the news to the families of those who we think, with x% confidence, lost their loved ones at sea.

This example is usually the first one you will code when you start studying machine learning. What it shows is that any industry that collects data with distinct classes (here two classes: survived and did not survive) can benefit from machine learning algorithms, which help decision makers drill down into what factors determine those classes.
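If you want a peek at what “maybe just 5 lines” could look like, here is a deliberately tiny, from-scratch version of the idea. The passenger data is completely made up for illustration (the real exercise uses the public Titanic dataset, usually with a library like scikit-learn); this one just copies the outcome of the most similar accounted-for passenger, a classic technique called nearest neighbour:

```python
# A toy, from-scratch "nearest neighbour" take on the Titanic exercise.
# The records below are invented for illustration only.
import math

# Each record: (age, is_female, cabin_level, siblings_on_board, survived)
accounted_for = [
    (25, 1, 2, 0, 1), (48, 1, 1, 1, 1), (19, 1, 3, 2, 1),
    (30, 0, 4, 3, 0), (52, 0, 5, 0, 0), (22, 0, 4, 1, 0),
]

def predict_survival(passenger):
    """Find the most similar accounted-for passenger and copy their outcome."""
    def distance(record):
        return math.dist(passenger, record[:4])
    nearest = min(accounted_for, key=distance)
    return nearest[4]  # 1 = survived, 0 = did not

# The 25-year-old lady travelling alone in a level-2 cabin:
print(predict_survival((25, 1, 2, 0)))  # 1, i.e. survived (per this toy data)
```

A real model would be trained on hundreds of records and would report that “x% sure” confidence score rather than a bare yes/no, but the shape of the idea is the same: learn from the labelled data, then predict for the unknowns.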

  • Do you work for a bank, have data on the lending deals clients have done, and want to know what factors contribute to clients going into arrears? It is virtually impossible to come up with these factors by manually looking at the data. Is it their age, employment status, number of deals they have internally and externally, education, number and type of dependents, or a combination of all these and more? Machine learning helps here by employing some very clever algorithms that learn from the data and highlight the most influential factors affecting a person’s credit record. This then allows you to decide which types of potential clients you can, with x% confidence, feel comfortable lending to.
  • If you run a spaza shop and you want to know what goods people buy together, say maybe teabags and sugar, machine learning algorithms can help you figure out those associations and optimise your restocking efforts.
  • Estate agents can run certain algorithms that help them predict cycles in house prices based on historical data.
  • You have probably heard of Cambridge Analytica, a company that ran machine learning algorithms on US citizens’ Facebook data, classified them by which party they were probably going to vote for in the 2016 presidential election, and then targeted the “opposition” voters with adverts.
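The spaza-shop idea above is the easiest of these to sketch in code. Real systems use dedicated association-rule algorithms (Apriori is the famous one), but the heart of it is just counting which goods keep showing up on the same receipt. The receipts here are made up:

```python
# A minimal sketch of market-basket analysis: count which pairs of goods
# appear together on the same receipt. The receipts below are invented.
from collections import Counter
from itertools import combinations

receipts = [
    {"teabags", "sugar", "milk"},
    {"teabags", "sugar"},
    {"bread", "milk"},
    {"teabags", "sugar", "bread"},
]

pair_counts = Counter()
for receipt in receipts:
    # Every pair of items bought together on this receipt
    for pair in combinations(sorted(receipt), 2):
        pair_counts[pair] += 1

# The most frequently co-purchased pair:
print(pair_counts.most_common(1))  # [(('sugar', 'teabags'), 3)]
```

With real sales data the counts run over thousands of receipts, and the shop owner learns to shelve (and restock) sugar next to the teabags.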

Apple’s Siri, Microsoft’s Cortana, Amazon’s Alexa and Google Now walk into a bar….

To explain deep learning, I am going to use Siri as an example. Bear with me even more on this part, because deep learning is where the really big boys play. I don’t want this post to go on and on, so I have tried to be as succinct as possible. Anyway, in biology class we were taught about the human brain and the nervous system. Remember that? We were told how information, or signals, are passed from one neuron to the next if certain conditions are met. For example, when you look at something, the neurons connected to the back of your eye pass that information on to the next neuron, then to the next one, and the next, until it reaches the occipital lobe, where the visual processing part of your brain concludes that you are looking at a cat.

This is essentially what happens in deep learning. According to MIT Technology Review,

“Deep-learning software attempts to mimic the activity in layers of neurons in the neocortex, the wrinkly 80 percent of the brain where thinking occurs. The software learns, in a very real sense, to recognize patterns in digital representations of sounds, images, and other data.”

Deep learning makes use of algorithms called deep neural networks, which are designed to mimic the human brain. They have layers set up one after another, just like the neurons in your brain. And just as with neurons, calculations are done at each layer and the results are forwarded to the next layer, then the next one, and so on, until the information reaches the end point and a conclusion is made. According to Apple’s website, “The “Hey Siri” detector uses a Deep Neural Network (DNN) to convert the acoustic pattern of your voice at each instant into a probability distribution over speech sounds. It then uses a temporal integration process to compute a confidence score that the phrase you uttered was “Hey Siri”. If the score is high enough, Siri wakes up and proceeds to initiate the task.” That’s a lot of words, so I drew this image to guide the explanation. The red circles indicate stages, and from (2) up to (4) is our deep neural network. Each set of white circles is a layer where calculations are done and the results are passed on to the next layer, for example from layer (2) to layer (3).

What Apple is essentially saying is that, just as your eye takes in a signal/input, Siri’s first layer will take in the sound wave your words have made (1) as input, then do some fancy calculations to figure out what those individual words are in layer (2). After figuring out the individual words you said, the neural network passes those words to the next layer (from 2 to 3). At the next layer (3), it might try to figure out the context by putting those individual words together – is it a question or an instruction? – and then pass the conclusion on to the next layer (from 3 to 4). At the output layer (4), it decides what to do with what you have told Siri, executing the task and giving you an output. For Siri, this final layer will say, “The user asked me to find out what is happening in Jozi on a Sunday afternoon, so let me load up Safari, do a Google search and show them the result.” Or, “The user asked me to set an alarm for 5am, repeated every day, so let me do just that.”
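For the curious, the “each layer passes its result to the next layer” idea is just arithmetic, and a miniature version fits in a few lines. Everything here is made up for illustration: Apple’s real detector is vastly larger, and its weights are learned from huge amounts of audio rather than typed in by hand:

```python
# A miniature feed-forward network in plain Python, showing how each
# layer's output becomes the next layer's input. All weights here are
# invented; real networks learn theirs from training data.
import math

def layer(inputs, weights, biases):
    """One layer: weighted sums squashed through a sigmoid, passed onward."""
    outputs = []
    for neuron_weights, bias in zip(weights, biases):
        total = sum(w * x for w, x in zip(neuron_weights, inputs)) + bias
        outputs.append(1 / (1 + math.exp(-total)))  # sigmoid "activation"
    return outputs

# Input (1): a crude stand-in for a slice of sound wave, as 3 numbers.
signal = [0.9, 0.1, 0.4]

# Layers (2) and (3): each takes the previous layer's output as its input.
hidden = layer(signal, weights=[[0.5, -0.2, 0.1], [0.3, 0.8, -0.5]], biases=[0.0, 0.1])
hidden = layer(hidden, weights=[[0.7, -0.3], [0.2, 0.9]], biases=[0.05, -0.1])

# Output layer (4): a single confidence score, like "was that 'Hey Siri'?"
score = layer(hidden, weights=[[1.2, -0.6]], biases=[0.0])[0]
print(f"Confidence the phrase was heard: {score:.2f}")
```

The sigmoid squashes every neuron’s result into a number between 0 and 1, which is why the final score reads naturally as the confidence Apple describes.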

Deep learning is very fascinating because, unlike with many machine learning algorithms, here we are close to making a system that actually thinks for itself and understands what a person is saying, even without specifically training it on every class of thing people might talk about.

  • Self-driving cars make use of deep learning algorithms when detecting objects around them, in a field called computer vision. It is possible to write an algorithm that you can run on your computer to detect objects on your desk, and it will tell you, “With 92% confidence, that is a pencil…” This uses deep learning with layers that each analyse different aspects of the object and then reach a conclusion at the end of the deep neural network, telling you what your laptop is “seeing”.
  • Did I mention that these algorithms are all around us? Google search results are also powered by deep neural networks, which group results similar to what you searched for.
  • Don’t even get me started on the new Google Assistant that was announced at the recent Google I/O event. If you don’t know what I’m talking about, watch this video, because this capability is coming to an Android phone near you soon. The Google Assistant uses deep neural networks, i.e. deep learning, to understand what the person on the other end of the phone call is saying, a tough area of research called speech recognition. Deep neural networks allow the Google Assistant AI to understand speech nuances, accents and the words themselves, and what it means to book an appointment or reserve a table.
  • If you have used filters on Snap(chat) or Instagram, you have made use of simple deep neural networks!

Wow, you actually read the whole article! I doubt you would want me to get into examples of how these systems are practically disrupting different industries in this article, so, with the explanations out of the way, the next post will just be examples of what is being implemented in various fields, potentially taking some jobs and creating new ones. Look out for Lunch and artificial intelligence: Part 3! Here’s another bad joke from our AI, Siri: