Outsmarting AI. Brennan Pursell

healthy skepticism, maintain humility, and consider our neighbor throughout. This mind-set will produce better outcomes. Humanize your AI.

      Chapter 2

      AI in Plain English

      AI is cutting-edge, proliferating technology, devised by people, so it can be understood and used. There is no magic and no mystery involved. It cannot work wonders, but it can be applied to many, many tasks that we find in workplaces across the globe. As in other technological “revolutions,” research and development are leading the way.

      In 2017, the number of AI-related patent applications worldwide rose to more than 55,000, up from 19,000 in 2013. Since 2013, as many AI patents have been awarded as in the preceding sixty years. IBM and Microsoft lead the pack in applications, followed closely by Japanese and Korean tech companies. Of the 167 universities and research institutes that apply for patents, most are in China, the United States, and South Korea.[1] In 2019, the United States awarded twice as many AI patents as in the year before.

      The patent explosion follows a similar boom in the publication of scientific papers. In 2016, scientific papers outnumbered commercial patents three to one, down from eight to one in 2010. Patents are filed for applications in telecom, transportation, the life and medical sciences, personal devices, computers, banking, entertainment, security, manufacturing, and agriculture.

      AI tech has left the lab for the world market in goods and services, ready to be used, bought, and sold, not just by corporate giants but by millions of small- and medium-sized businesses and organizations like yours.

      The most commonly patented AI technology is machine learning, which frequently relies on “deep learning” and “neural network” algorithms. The number of machine-learning patents has lately grown by 175 percent per year on average. You need to know what these are and how they work.

      This section will explain how these algorithms work, without getting too wonky about the computer code or math involved.

      The Math of AI

      As we said in the introduction, AI is software at work on computer hardware, and it performs sophisticated statistical analysis of your digitized data. AI is just math.

      So let’s start with the math. Now don’t close the book!

      I want to equip you against the torrents of numbers and statistical calculations coming from data scientists. I’ll explain the essential principles and spare you the formulas. The math, in some cases, is centuries old, and computers do it all today anyway, but you as the human have to understand what it’s doing, because it doesn’t.

      Remember that the goal is to obtain business value from your data. The terms below will empower you as you get to know AI tools and implement them in your organization.[2]

      At the heart of it all is conditional probability, which is just a percentage. What is the chance that something is going to happen, or not happen, given what has already taken place? For example, given the data, what is the chance that this or that transaction will occur? And these percentages are constantly changing. There is, for example, no single, fixed percentage chance that you will develop colon cancer at some point in your adulthood. Doctors hawking colonoscopies won’t show you that your chance of having cancerous polyps is not simply the national average; it is that average adjusted over time by your age, your weight, the prevalence of the disease in your family, your daily diet, your level of physical activity, any intestinal inflammation, and whether you have already been checked and cleared before.
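      To make the idea concrete, here is a minimal sketch in Python. Every number in it is hypothetical, invented purely for illustration; the point is only that a conditional probability is a ratio of counts, and that the ratio shifts as soon as you condition on more information.

```python
# Conditional probability as a simple ratio of counts.
# All figures are hypothetical, for illustration only.

total_patients = 10_000         # patients in an imaginary screening program
with_polyps = 450               # of those, patients found to have polyps
over_60 = 3_000                 # patients over age 60
over_60_with_polyps = 270       # patients over 60 who have polyps

# Unconditional ("national average") probability: P(polyps)
p_polyps = with_polyps / total_patients                 # 0.045, or 4.5%

# Conditional probability: P(polyps | over 60)
p_polyps_given_over_60 = over_60_with_polyps / over_60  # 0.09, or 9.0%

print(f"P(polyps)           = {p_polyps:.1%}")
print(f"P(polyps | over 60) = {p_polyps_given_over_60:.1%}")
# Condition on more of the patient's data (weight, family history, diet, ...)
# and the percentage changes again. That is all "personalization" means here.
```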

      Netflix’s recommendation system, a major part of its market success, is likewise based on conditional probability. Given the films you have seen and liked, what other films should be recommended to you? Your own viewing history, however, is a very limited data set. What about everyone else who liked the films that you liked? What other films did they like? The recommendations Netflix makes for you are based on a vast range of data about other people’s viewing histories and many other factors as well. AI algorithms process that data, calculate, and update those recommendations for each user. Netflix paid $1 million in prize money for the winning algorithm model in 2009. The paper about it is available online for free.[3]
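      The prize-winning model itself is far more elaborate, but the conditional-probability core can be sketched in a few lines. The viewing histories below are invented, and the “viewers who liked X also liked Y” ratio is only a toy stand-in, not Netflix’s actual method.

```python
# Toy "viewers who liked X also liked Y" recommender.
# Viewing histories are hypothetical; the real system uses far more data.

histories = {
    "ann":   {"Film A", "Film B", "Film C"},
    "bob":   {"Film A", "Film C"},
    "carla": {"Film A", "Film B"},
    "dev":   {"Film B", "Film D"},
}

def p_likes_given_liked(candidate: str, liked: str) -> float:
    """Estimate P(likes candidate | liked `liked`) from co-occurrence counts."""
    liked_it = [h for h in histories.values() if liked in h]
    if not liked_it:
        return 0.0
    both = sum(1 for h in liked_it if candidate in h)
    return both / len(liked_it)

# A new viewer liked "Film A". Which film should be recommended next?
for candidate in ("Film B", "Film C", "Film D"):
    p = p_likes_given_liked(candidate, "Film A")
    print(f"P(likes {candidate} | liked Film A) = {p:.2f}")
```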

      Conditional probability is a key component of the math-mix that allows AI to constantly update and improve its predictive calculations in just about every imaginable application. Many call it “personalization,” but the software and hardware that calculate it could not be more impersonal.

      Prediction rules are just mathematical equations that describe the relationship between input data and the calculated output. You can also call them models. The easiest example is your maximum heart rate, given your age. Subtract your age (the input) from 220, and you have your maximum number of heartbeats per minute (the output). As you age, your maximum heart rate declines. Add more and more inputs, and the prediction rules necessarily become more complex. They are the “patterns” that AI can “detect.” When you “train” an AI system on a data set, and it “learns a pattern,” that means that it fits the prediction rules to match the data inputs and outputs.[4] I’ll come back to this idea with “backpropagation,” below.
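      As a rough sketch of what “fitting” means, the Python below writes the heart-rate rule with two adjustable numbers and measures how far its predictions fall from recorded outputs. The measurements and parameter values are hypothetical, and the error comparison is only an illustration of the idea, not the author’s procedure.

```python
# A prediction rule (model) is just an equation from inputs to an output.
# "Training" means adjusting the numbers inside the rule until its outputs
# match the recorded data as closely as possible.

data = [(25, 197), (40, 184), (55, 168), (70, 152)]  # (age, measured max heart rate), hypothetical

def predict(age, base, slope):
    """Prediction rule: estimated max heart rate = base - slope * age."""
    return base - slope * age

def mean_squared_error(base, slope):
    """Average squared gap between the rule's outputs and the recorded outputs."""
    gaps = [(predict(age, base, slope) - hr) ** 2 for age, hr in data]
    return sum(gaps) / len(gaps)

# Two candidate rules: the familiar "220 minus your age" and a tweaked one.
print(mean_squared_error(base=220, slope=1.0))   # 8.25  (fits this data less well)
print(mean_squared_error(base=223, slope=1.0))   # 0.75  (fits this data better)
# A training algorithm automates the search, nudging the parameters until
# the error is as small as it can make it.
```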

      Regression analysis calculates the statistical relationship between variables, usually an input and an output. You’ve probably seen a regression graph before, perhaps many times. A picture is worth a thousand words.[5]

      Regression Analysis

      Source: Wikipedia.org, “Regression analysis.” Image by Sewaqu, November 4, 2010. Public domain.

      The line, expressed as a mathematical equation, is the prediction rule for the relationship between the X and Y values of each data point. The best-fitting line is the one that minimizes the total of the squared vertical gaps between the points and the line, the method of “least squares” published by Adrien-Marie Legendre in 1805.
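      For readers who want to see that calculation, here is a minimal sketch of ordinary least squares for a straight line, using the standard closed-form formulas. The data points are invented for illustration; in practice the computer, or a statistics library, does this for you.

```python
# Ordinary least squares for a straight line y = a + b*x, computed by hand.
# The (x, y) points are hypothetical; a library would normally do this.

xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 2.9, 3.7, 4.2, 5.1]

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n

# The slope b minimizes the total of the squared vertical gaps to the line.
b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / sum(
    (x - mean_x) ** 2 for x in xs
)
a = mean_y - b * mean_x        # the fitted line passes through (mean_x, mean_y)

print(f"prediction rule: y = {a:.2f} + {b:.2f} * x")
for x in (6.0, 7.0):           # use the rule to predict outputs for new inputs
    print(f"predicted y at x = {x}: {a + b * x:.2f}")
```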

      Regression analysis can be linear, showing the relationship between a dependent variable and an independent variable. If more than one independent variable influences a dependent variable,

