Mike Angiulo, a former Microsoft vice president, is now in his third year at the University of Washington law school. (GeekWire Photo / Todd Bishop)

Mike Angiulo worked at Microsoft for 25 years as an engineering manager and vice president for products including Windows PCs, Microsoft Outlook, Xbox, Surface, and cloud and artificial intelligence technologies. But it wasn’t the work Angiulo originally envisioned. He had planned to become a lawyer, deferring those plans after he started at Microsoft in his early 20s.

Now, at age 47, nearly three decades later, he’s circling back to his original plan — going back to school and preparing for a second career, as a lawyer focused on some of the most interesting and difficult questions facing the same types of technologies that he helped to create for so many years.

“I am a big believer that just the prevalence of big data, the speed of the cloud platforms, the modernity of the algorithms, combine to the point where every single business is going to be relying on deep data insights, probably to some automated extent,” he says. “And increasingly, people won’t be able to explain how they work.”

And that raises all sorts of interesting questions about the future of technology and the law, as we learn on this special episode of the GeekWire Podcast.

Listen below, or subscribe to GeekWire in any podcast app, and continue reading for an edited transcript.

Todd Bishop: After more than two decades at Microsoft, Mike Angiulo’s decision to change careers resulted from an epiphany, courtesy of his daughter.

Mike Angiulo: “The thing that changed in me, it was a little bit spurred by watching my older daughter, Emily. She’s at George Washington University in a pre-med program. I was watching her write her college applications. And she was writing these essays about how she really wants to change the world and make it better, specifically for women and healthcare. And it reminded me of the feelings I had when I was young, saying, I want to help people be safe, or I want to be on the leading edge of this area of law. Normally parents are there, prodding their children to think big. But it was really the other way around, you know, she kind of like unsettled me a little bit.”

Todd Bishop: The law runs in his family. His grandfather immigrated from Italy and became an immigration judge with the Department of Justice. His dad was a doctor and a lawyer who eventually became a judge in Arizona, and who worked for years in a county hospital that served the indigent population on the south side of Tucson. There, Angiulo saw disparities in care depending on patients’ financial situation, national background, and immigration status.

Angiulo: “Growing up, I always saw law as a way to be able to effect some big change, something important. And I went to engineering school with the first thought that I was going to do engineering law. And I didn’t know what that really meant at the time. The obvious route for that is to work on patents or intellectual property. But I was growing up in an era that was just kind of a big expansion of consumer products and technologies. And I was thinking about product liability and how things worked and how people were kept safe. What changed along the way was, I had an internship, it was really just supposed to be a summer internship. It’s like Gilligan’s three-hour tour.”

TB: He arrived at Microsoft for that internship in 1993 … and he was hooked.

Angiulo: “And I got to work on a project directly for Bill Gates himself. He was going to do a keynote address at COMDEX. I got to work on it personally, and I got to meet him and do a little work with him. And I’m telling you, as a 20-year-old it’s intoxicating to not be treated like an intern. And that was kind of the magic thing about Microsoft at the time, for sure, and even to this day, that it’s not a seniority-based organization. If you’re young and bright, and you’ve got something to say, people will listen, and I really fell in love with that. So I decided to put off going to law school for a year. I did a master’s in chemical engineering at the University of Washington as my cover story for why I was up here.”

Mike Angiulo at Microsoft in 2008. (Microsoft Photo/Chiang Ying-ying)

TB: He knows that was a pretty unusual cover story!

Angiulo: “Well, I didn’t want to tell my parents that I was dropping out of my educational plans for this job at a software company! And that didn’t even make sense, given that family history, so by being in a master’s program, I was buying myself time just because I wanted to work on the stuff that I was working on at Microsoft, and that kind of ran its course and I graduated and now I needed another reason.”

TB: He kept putting off law school for another year … and then another year … and then another year.

Angiulo: Funny story is that to go to law school, you have to take the LSAT, the LSATs.

TB: LSATs are the Law School Admission Test … and he got to know them well.

Angiulo: “Those scores expire. So when you go to apply to a law school, you have to have taken the LSATs within the last, I think, three years. So I had LSATs ready to go because I was about to go to law school. And then three years later, they expired, and I took them again and again. So I’ve been about to do this for about 20 years. I was always willing to kind of wait till next year, wait till next year, because the stuff that was happening at Microsoft was changing the world.”

TB: And then, one day, there was his daughter and her pre-med school application nudging him back to his original goal … and there was another thing steering him toward the law — an observation about the speed of innovation in software versus the law’s ability to keep up.

The gap between tech and the law

Angiulo: I realized that the law, the law in general — you can look at it in terms of civil law, in terms of regulation, so I’m just using a very general term — our legal structures and the innovation speed that was going on in the software space seemed out of whack. And I wanted to work on the intersection of those two things somehow. And I didn’t leave with a specific plan, because frankly, that’s not how my career even worked. As long as you were learning, as long as you had something smart to add, opportunities were there, but I could have never planned where I was going to be in two or three or four years along the whole way at Microsoft.

TB: He started to look at the growing field of artificial intelligence.

Angiulo: I’m by no means a leading expert in AI and how AI works. But I am a big believer that just the prevalence of big data, the speed of the cloud platforms, the modernity of the algorithms combine to the point where every single business is going to be relying on deep data insights, probably to some automated extent. And increasingly, people won’t be able to explain how they work.

TB: One area where these AI products are being used is predictive policing. It’s not like the Tom Cruise film “Minority Report,” where the prediction focused on individuals. These systems create something more like a weather report, he says, except instead of predicting rain they predict crime.

Angiulo: “For example, there’s a parking lot where, when it was frozen the night before, people would leave their cars running in the morning. They’d be running unattended, puffing exhaust while they warmed up and melted the ice off. And people would steal those cars. And this system realized that there was this pattern that, on these kinds of days with these temperatures, [theft would be more likely to occur]. It’s the kind of insight that an experienced beat cop would know on their own, which is the reason why these systems weren’t so troublesome at the start. They seem like they’re just kind of helping the decisions that people would ordinarily feel justified and be accountable for making.”

TB: But these systems are getting more sophisticated and taking on more complex tasks related to crime.

Angiulo: “Now they’re starting to be used to issue a threat rating or a risk rating on serving a warrant. So if a police officer is going to go serve a warrant on a property, which they’re required to do when a judge signs a warrant … they can serve it by knocking on the door, or they can serve it by not knocking, or a no-knock warrant, where you see them rolling heavy with those big metal things that knock the doors down and whatnot. And there are times where serving a no-knock warrant makes a lot of sense. If you know you’re going to have someone that’s going to offer active resistance, if you know someone is going to start flushing drugs down the toilet as soon as you do the gentle knock, you might have this election. The challenge is, as soon as you start serving a no-knock warrant, you’re serving it with guns drawn and the probability of violence gets much higher.

So you can use a system that takes a bunch of factors into account and predicts the risk of a particular warrant. The challenge is, what happens when it gets that wrong, and police roll heavy, knock down the front door, surprise someone who’s got a remote control in their hand, and someone at the wrong address or whatever is shot. Or you do that process and someone later challenges that decision and says, ‘Did you take into account the socioeconomic factors? Did you take into account race?’ So for example, there are a lot of laws that are very specific, that apply heightened standards of scrutiny to state action that took race into account. Now normally, you can actually answer the question, and then the law can be applied. But what happens when no one really knows which factors were taken into account in that black box?

One of the cool things about these algorithms is, as they develop, they’re better than an algorithm that you would have been able to think of on your own. And in fact, they almost get to the point where you can’t quite understand it. So imagine just a simplistic case, where you’re asking a person, did you take this factor into account in your decision? You know, you’re giving someone a loan. Did you take into account the location of their neighborhood, the redlining stuff? Well, there’s an answer to that, it’s yes or no. I mean, the person may lie or not, but it’s still a fact whether the person took that into account. But now you start asking, what did you take into account when you made this deviation around this weather pattern or around this accident? And it would take an expert, an AI expert, to even understand what factors were weighted in what way. And so now you’ve got this challenge where, in liability cases like this, you’re often going to a jury. So you’ve got to go all the way from a technical expert that has to explain things within a certain limited legal framework, because the law is very careful about allowing expert testimony at trials, because a jury could give too much weight to an expert that simply says, ‘Yes, this caused that.’”

TB: What if there is a major storm coming, like a tornado or a hurricane, and you want to use an AI system to predict potential injuries and prepare in advance for the recovery phase? There are possible pitfalls here, too.

Angiulo: “Well, wouldn’t it be smarter, instead of keeping all of the ambulances in the garage at the hospital and the trucks at the fire department, to have them staged in places where they would have optimum response times to places where you know you’re going to have a problem? It seems like some combination of an Uber-type system plus a predictive-type system like this could let you stage ambulances closer to the places of probable injury. Well, that just seems smarter. I mean, you could just predict lives that you could save by predictively allocating scarce resources of any kind. But here’s the challenge. Maybe that system has all of this data and says you need ambulances over by this community development, because the people in that community development actually have health care. And because they have health care, they have a bunch of health records. And because they have health records, we know that there are people there who are going to need particular services. But now you might have this other neighborhood over here, of a different socioeconomic class. They don’t have healthcare, so they don’t have health records. So the system doesn’t think about them. And so now you’ve just prioritized an ambulance towards one direction or another. If you ignored the system, you’d have a suboptimal result, so it’s not like you’re deploying the system to, in effect, reinforce race- or class-based outcomes, but you might be, and how do you know?”

Artificial intelligence and airplanes

TB: There’s something we haven’t mentioned yet about former Microsoft executive Mike Angiulo. He’s not just a law student — he’s a pilot. And he has a particular interest in how AI systems impact aviation. This summer, he interned at Perkins Coie, a law firm that handles aviation cases and counts Boeing among its clients. Because of that, he wasn’t able to talk about the Boeing 737 MAX case, but we did talk about aviation issues in general.

Angiulo: “Now, with aviation cases, jurisdiction is really complicated. You’ve got an aircraft maybe made in one or more states, delivered to either the military or an airline that’s operating in another state, they may have their headquarters in yet another state. The people involved in an incident or accident themselves may be U.S. citizens or not, may be citizens of an individual state or not. And then the location of the accident may have nothing to do with any of those states altogether. So there’s always a lot of very complicated work to understand the jurisdictional aspects of these cases. For a simple example, in choice of law, where you learn which state’s law is going to apply in, say, a car crash, one of the key factors is, where did the crash happen? It’s not always that that’s where the law is going to come from, but you can imagine that the state where a highway accident happens, that state has an interest in making sure that their laws are being followed on the highway. And so that site-of-the-crash factor is really important. So what happens in a Malaysia Airlines case where the plane straight up disappears? There isn’t a site. And so the laws weren’t written in a way to even be able to handle some of the complexity of aviation-specific accidents. And so that’s one of the few reasons that I think it’s a really interesting and important space.”

TB: He feels that it may be better — and safer — to innovate with AI in air travel first rather than on the nation’s highways. As an example, he talked about Garmin’s new Autoland system, which has been certified for single-engine turboprop airplanes. In an emergency, anyone on board can push a single button, and the system takes over and lands the plane automatically at the closest airport. He says that kind of innovation is easier to do in the air than on the ground.

Angiulo: So you have one set of standards and one set of bodies for operating in the National Airspace System. So it’s a lot easier to get certified for something like that, because you only have one set of standards. You also have a lot more money going into R&D, chasing safety improvements. You have this really well-balanced regulatory and innovative partnership between the government and aviation. You could just look at this and say you can spend $400-500 to buy a ticket to fly over the ocean, and you have a bigger risk of choking to death on your meal than being in an accident. The safety statistics are so incredibly high, yet the public has this really low-cost, very reliable access, because the regulation and the innovation have gone sort of in lockstep ever since the invention of the airplane. But if you go to the highway system, it’s none of those things. Every state has regulations on how things are allowed to operate. You have a patchwork of legal approaches across the country. You’re going to have people behind the wheel of vehicles for at least the next 40 or 50 years, even if fully autonomous vehicles were available tomorrow. So you’re going to have a coexistence problem, and you’re going to have a really challenging legal framework. Then on top of it, you’ve got a bunch of companies that themselves are not inherently regulated the way aviation technology is. And so they’re slamming technologies together because of market pressures in a way that you would never do in designing aircraft. [AI] will revolutionize transportation. But I feel like aviation is the best first place to make that progress and then have it sort of trickle out to other environments.

TB: As you might expect, his time at Microsoft influenced his view of the law and product liability in some specific ways.

Angiulo: There are a couple of things that I really came away with. One of them is just the absolute, almost religious belief in the relentless innovation that’s going to come from software development. And as I said, it’s going to come. Even when we look at today, at the devices and the access to information, it almost seems unbelievably complete. And meanwhile, if you just go back 10 or 12 years or more, you realize how far we’ve come. The rate of innovation is accelerating. The cost of doing a new startup, with a cloud service backing it, means it really can be anybody, anywhere in the world. All the MIT OpenCourseWare for learning how to program is free on YouTube. Any human with a broadband connection can change the world in a way that would have required millions of dollars of institutional backing just minutes ago. So if you look at the innovation curve, and you look at the energy, and you look at how much money gets saved and how much value gets created, that’s for sure. So I just know that. I saw it. I participated in it firsthand.

Another thing that I learned is what it’s like when a big corporation is making big decisions. And look, the threat of legal action is one of the things that helps corporations make responsible decisions. And there are a few areas of law where that’s not good enough. In environmental law, for example, the federal government makes a point of prosecuting criminally when people have intentionally dumped pollutants hoping that they just wouldn’t get caught, because you don’t really want people doing a financial calculation of saying, “Well, I’m only going to get caught 10% of the time, but I saved 20x the money. So it’s a good deal, go for it.” But outside of those crazy cases, the balance of risk and reward does play into the thought process as products are being developed. So having that pressure be right-sized, balanced, healthy, productive, and aligned with facts is a good thing. And I saw that firsthand. I also saw that as soon as you get five smart people together, or 20, or 100 smart people together, you start getting some crazy things happening. And when you get 1,000, 10,000 or 100,000 smart people, it’s possible for companies to make mistakes, even though lots of intelligent, bright actors are all in there at the same time. Whether it’s information flow, organizational politics, or different kinds of pressures around the world, I saw what big-company thinking and life looks like. And for sure, from a lawyer’s perspective, understanding that is really important, because you know what to look for in terms of the evidence. You understand how the accountability flow works in a large organization.

TB: Did he ever imagine himself at this stage in his life making this big career change?

Angiulo: “I’ve been married a little over 20 years and I remember telling my then-girlfriend, now wife, that I wanted to be a product liability attorney one day. She just thought that was a very odd thing to hear from a teenager — very oddly specific. And so even the whole time I was at Microsoft, I always had my eye on the law, because the logic problems, the thought behind it, it just fascinates me. So in a way, this is still my Plan A, but it is a really goofy timeline. It’s kind of funny to be there in school and be twice as old as the classmates around me, but it makes me feel young, to tell you the truth. I’m absolutely loving it. You do get to a certain point where you’ve been doing something for 25 years and you start thinking, ‘OK, I guess this is what I do.’ That guy was never gonna be me.”

TB: Mike Angiulo is now in his third year at the University of Washington Law School, and he has accepted a post-graduation job at Perkins Coie here in Seattle, where he’ll be focusing on … you guessed it … aviation and software.

Podcast editing and production by Curt Milton. Music by Daniel L.K. Caldwell.
