Autonomy: The Quest to Build the Driverless Car - And How It Will Reshape Our World. Lawrence Burns

Stanford teams. Around the same time that Levandowski was selling this integral piece of technology to as many teams as he could, however, he was also advising the Stanford team for the Urban Challenge. There’s no indication he did anything unethical at this point; Stanford apparently knew about his role with Velodyne, but it is easy to see how the arrangement could be perceived as a conflict of interest. And just to make all this even more incestuous, Google’s Street View played a key role in how it all went down.

      By 2006, Thrun was itching to do a start-up. The atmosphere of Silicon Valley played a role, as did the Stanford AI prof’s developing relationship with Google cofounders Larry Page and Sergey Brin. But what should the start-up do?

      Thrun was fascinated with the data that he and Montemerlo had collected while testing Stanley for the second Grand Challenge. Teaching Stanley to drive, motoring in the Touareg around the Mojave Desert, Thrun and his teammates had struck on the idea of fixing cameras, pointed in several different directions, on the roof of the vehicle. Not as sensors—rather, the imagery helped them re-create the circumstances that triggered bugs in Stanley’s programming. Over time, Thrun realized how interesting it was to just go through the imagery collected by the multidirectional rooftop cameras.

      That year, Thrun happened to be teaching a class at Stanford about computer vision. He assigned the class’s most brilliant student, Joakim Arfvidsson, the task of creating a program that could stitch the camera footage together so seamlessly that it created the illusion of an unlimited field of vision, which, Thrun figured, would give the impression of actually being at the original location. “Joakim recorded a street in San Francisco,” says Thrun. The resulting computer program gave the feeling of actually standing on the street. “You could look up, look down—it was completely amazing.”
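
      As a rough illustration of the idea (this is only a generic sketch using OpenCV’s off-the-shelf stitcher, with made-up file names, and not Arfvidsson’s actual software), the core step is to blend overlapping frames from cameras pointed in different directions into one continuous panorama that a viewer can then pan around:

```python
import cv2  # assumes the opencv-python package is installed

# Hypothetical frames captured at one spot by cameras pointed in
# several directions; the file names are illustrative only.
frame_files = ["cam_front.jpg", "cam_left.jpg", "cam_right.jpg", "cam_rear.jpg"]
frames = [cv2.imread(f) for f in frame_files]

stitcher = cv2.Stitcher_create()           # OpenCV's general-purpose stitcher
status, panorama = stitcher.stitch(frames)

if status == 0:  # cv2.Stitcher_OK
    # Panning and zooming within this panorama is what gives the
    # "standing on the street" impression described above.
    cv2.imwrite("street_panorama.jpg", panorama)
else:
    print("Stitching failed with status", status)
```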

      Thrun gave Arfvidsson an A-plus in the class. During the summer of 2006, alongside his oversight of Stanford’s Urban Challenge team, Thrun assigned a second team the task of building a version of the street-visualization software that could work on a cellular phone. The team included Hendrik Dahlkamp, Andrew Lookingbill and Arfvidsson. As the start-up’s CEO, Thrun installed his good friend Astro Teller, whom he had known since Teller did his PhD in artificial intelligence at Carnegie Mellon.

      In the first months of 2007, Levandowski also joined the team. Preparing to show the technology to venture capital firms for seed funding, Thrun aimed to stage a really impressive demo—he wanted to be able to place the VCs on any street in San Francisco, which required driving their camera rig along every street in the city. Levandowski was the one who determined how to do that quickly. He figured out that rental cars were really cheap if you hired them by the month. Next, he procured large numbers of drivers by advertising on craigslist. It took about two weeks to complete the map. “He was a really great person, I would say, to get shit done,” Thrun recalls.

      March 2007 was the month Thrun made his approach to VCs for funding. The effort became all-consuming. “Just strategizing with these venture capitalists becomes this amazing, intense game,” recalls Thrun. “My life is completely taken over. I have no social life—my wife thinks I’m a moron.” The effort paid off, though. Two of the valley’s top VCs, Sequoia Capital and Benchmark, were both interested. Thrun set the bidding for a Sunday, April 8, 2007. Soon the bids were climbing: a $5 million round of seed funding, which became $10 million. Then $15 million.

      That evening, as he mulled over which venture capital firm to choose, Thrun invited himself over to Larry Page’s place for dinner. Sergey Brin showed up. The men discussed Thrun’s technology, which he called VueTool. Page had already sponsored something similar. Some time before, he, Brin and Marissa Mayer had gone out and taken some footage that they then stitched together. (In fact, according to Thrun, the idea of immersively stitching together camera imagery so that it was possible to click through it had been invented in 1979 by an MIT scientist named Andrew Lippman.) And Page and Brin had a similar project happening at Google, one run by a guy named Chris Uhlik. After dinner, Thrun took Brin and Page to his office at Stanford to demo the footage of the San Francisco streets. The Google cofounders were impressed; they saw that Thrun’s team had accomplished a lot more, much more cheaply and in a lot less time, than their people had. For example, Google’s in-house Street View team was using custom-built camera rigs that cost $250,000 each, according to Mark Harris’s reporting in Wired. Thrun and Levandowski, Harris writes, were getting images of similar quality using a setup of off-the-shelf panoramic webcams that cost $15,000.

      The next day, Google’s head of mergers and acquisitions called Thrun, who agreed to sell the VueTool technology to Google. As part of the deal, Thrun, Levandowski and the rest of the team joined the company to accelerate the Street View project. “We got fairly lavish bonuses,” Thrun explains. In keeping with the Red Whittaker approach of setting an ambitious goal to motivate his team, Thrun set up an arrangement with Google that would trigger another bonus payout if Thrun, Levandowski and the team were able to map a million unique road miles for Street View. Using the method that Levandowski pioneered, with lots of cars, Thrun and his team ended up meeting the milestone in just seven months.

      Thrun’s obsession with meeting the Street View goal meant the day-to-day work on Stanford’s entry to the DARPA Urban Challenge was led by Mike Montemerlo. On the other side of the country, Montemerlo’s former officemate Chris Urmson led the day-to-day work on Carnegie Mellon’s robot vehicle. Urmson approached his effort on this race like it was the most important quest of his life. Every so often he recalled the conversation with his wife, Jennifer, nearly four years before, when he promised that he’d just do the first desert challenge, before going off and getting a real job outside academia, where he could start making a good living to support his growing family. (He and Jennifer had since had a second boy.) Sandstorm’s rollover before the first race had provided him with the sense that he could have won it, if not for that accident. H1ghlander’s mystifying mechanical issues in the second DARPA challenge provided a similar sense of the team’s tantalizing proximity to victory. This urban event, Urmson knew, likely represented his last chance at victory. His last, best shot.

      Leading up to the race, Team Tartan often discussed the rules. How difficult would DARPA make this challenge? Did DARPA even want a winner? It would be a simple enough matter to make the race so tough that no team could ever win. That would be the most cost-efficient option. If the government’s intention was to outsource development of autonomous vehicles, to prove the possibility of self-driving cars while investing the minimum amount of money, then one way to do that would be to stage a race that prompted universities and research centers all over the country to work on the problem, while making it so tough that DARPA wouldn’t have to actually hand out the prize money.

      By this point, building an autonomous robot had become nearly routine. Transforming a Chevy Tahoe into the self-driving Boss resembled the maturation of a human being, in some respects. The vehicle started blind and dumb, unable to sense, to navigate, to move on its own. Then Urmson and his team installed sensors—the LIDAR, the radar—as well as the computer processors. In their earliest tests the team taught the robot not to walk but to drive, supplying it with a list of GPS waypoints, similar to the sort that earlier generations of CMU vehicles had been required to follow in the desert challenges. Once the robot could trace the dots of a mile lap around the grounds of the old steel mill, the team set up longer waypoint-finding tests. In November 2006, a full year before the competition, Boss completed a fifty-mile route, achieving speeds of 28 mph.
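
      The waypoint tests can be pictured with a short sketch. What follows is a hypothetical illustration in Python, not Tartan Racing’s code: it assumes a vehicle object that reports its position and accepts steering commands, and it simply drives toward each GPS waypoint in an ordered list until that point is reached, which is essentially what those early laps around the old steel mill asked Boss to do.

```python
import math

# A minimal, hypothetical waypoint follower; the vehicle interface
# (position, distance_to, steer_toward, step, stop) is assumed, not real.

WAYPOINT_REACHED_M = 3.0   # how close, in metres, counts as reaching a point

def bearing_to(pos, target):
    """Planar approximation of the heading from pos to target, in radians."""
    dx, dy = target[0] - pos[0], target[1] - pos[1]
    return math.atan2(dy, dx)

def follow_waypoints(vehicle, waypoints):
    """Drive through an ordered list of (x, y) waypoints, one at a time."""
    for wp in waypoints:
        while vehicle.distance_to(wp) > WAYPOINT_REACHED_M:
            vehicle.steer_toward(bearing_to(vehicle.position(), wp))
            vehicle.step()   # advance one control cycle
    vehicle.stop()
```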

      In parallel with Boss’s mechanical testing, Salesky’s programming team worked to incorporate perception and planning systems into Boss, so that the robot could make sense of the input coming from its rudimentary eyes, the Velodyne LIDAR and radar sensors. In December, Boss completed a multi-checkpoint mission, running at night along the cold Pittsburgh riverside. Tartan Racing also coded rules that would dictate how the vehicle should behave in the situations it encountered while driving. Intersection handling was an early module in which the programmers created a set of directions for the various situations the robot might encounter at an all-way stop. What if Boss arrived first at an intersection, followed by someone else to the right? What if Boss arrived second? Salesky’s team created rules for each scenario.
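
      The flavor of that rule-writing can be suggested with a small sketch. This is not the actual module, only a hedged illustration in Python of all-way-stop precedence, assuming each vehicle’s arrival order at the stop is known: anyone who stopped before Boss goes first, and a vehicle on the right that stopped at the same time also gets priority.

```python
from dataclasses import dataclass
from typing import List

# A hypothetical illustration of all-way-stop precedence, not Boss's code.

@dataclass
class StoppedVehicle:
    arrival_order: int    # 1 = first vehicle to stop at the intersection
    on_our_right: bool    # is this vehicle immediately to Boss's right?

def boss_may_proceed(boss_arrival_order: int,
                     others: List[StoppedVehicle]) -> bool:
    """Return True if Boss has precedence at an all-way stop."""
    for v in others:
        if v.arrival_order < boss_arrival_order:
            return False   # they stopped first, so they go first
        if v.arrival_order == boss_arrival_order and v.on_our_right:
            return False   # tie in arrival order: yield to the right
    return True

# Boss arrived first and someone then pulled up on the right: Boss proceeds.
print(boss_may_proceed(1, [StoppedVehicle(2, True)]))    # True
# Boss arrived second: it waits for the vehicle that stopped first.
print(boss_may_proceed(2, [StoppedVehicle(1, False)]))   # False
```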

