GeoLegal Weekly #38 - Will PeopleLaw AI Save Society from Collapse?
I speak with Bill Henderson about the intersection of legaltech and societal collapse - then connect that to DoNotPay's FTC settlement. Also Israel/Iran, VP debate, longshoremen and fentanyl shipping.
The world is becoming increasingly complex. Businesses have global exposures at a moment when the infrastructure governing globalization is breaking down. Citizens have festering discontent with their governments - and new ways to organize given new technologies. AI tools are being created that can barely be explained, and society must grapple with how to regulate them. Deepfakes spread like wildfire and make us question if real things are also false.
Will this lead to collapse or will AI save us all? To understand the implications of this complexity for society and the legal sector, I sat down with Bill Henderson, professor at Indiana University’s Maurer School of Law and editor of Legal Evolution. You can watch the full interview and read my commentary on the discussion below. I conclude by linking it to the recent action against DoNotPay.
Societal Collapse
Bill and I went deep into the potential for societal collapse because of a failing legal system, at least for ordinary people, and the rise of AI. Bill pointed to Joseph Tainter’s 1988 book The Collapse of Complex Societies, which argues that societies are problem-solving machines that come together to conquer challenges like food, shelter, and safety. When the complexity of society grows too great, factions begin peeling off. At some point, usually visible only through the lens of history, the disintegration becomes a collapse.
This raises two thoughts for me:
a) AI adds a huge amount of complexity to society. It can be a black box, it can be weaponized, it is challenging to regulate, and it will re-organize the economy. This will create pressure on democratic capitalist societies: the losers from new tech will push for subsidies and benefits (paid for by the winners), which is pressure toward socialism, while governments will increasingly look to use technology to maintain order, which is pressure toward authoritarianism. It’s not clear how any of this will play out, but a political system that is increasingly under stress will become more so.
b) The law is too complex for most people to navigate today. Supreme Court Justice Neil Gorsuch’s new book Over Ruled: The Human Toll of Too Much Law is a great exposition on this. For instance, he gives the example of a fisherman who went to jail for violating Sarbanes-Oxley because he threw fish overboard, and it took eight years before the Supreme Court considered and reversed the decision. Gorsuch points out that the law is so complicated that every one of us probably breaks it many more times than we realize. That complexity begets further complexity to comply with, enforce, or change the rules.
LegalAI to the Rescue?
Now, AI may well create marginal benefits to society that outweigh its complexity. Many of us are betting our careers on it. One way this would be the case is if it could deal with said legal complexity.
For companies, this is coming or already here. Billions of dollars are poured into “Organizational Law” solutions—technology for legal departments and the law firms that serve them. These solutions make lawyers faster, increase compliance, and (as tools like our Geolegal AI product do) enable businesses to monitor and deal with an increasingly volatile world.
But rich businesses getting richer doesn’t save society from collapse. Bill told me he realized he had spent most of his career focused on the Organizational Law sector without appreciating the imperative around building technology for real people - what he calls “PeopleLaw.” The January 6th riots woke him up to the fact that people are angry in part because they don’t have a functioning legal system, and that’s often their primary interface with the government. He is now focusing this phase of his career on solving challenges in this domain, which he explains in his essay “Period of Discernment (350),” from Legal Evolution, Dec. 31, 2023.
And we’re lucky to have him because too few people are working on this. In fact, only $1 of legaltech investment flows to PeopleLaw for every $40 that flows to Organizational Law. As one lawyer quipped to me recently, “you can’t get rich solving the law for poor people.” While I could show you some business models that might plausibly make you think twice, the reality is that funding legal innovation for the bottom half of society may be a market failure that needs alternate solutions.
Bill suggests we could solve this problem through non-profits owning a societal challenge (like tenants’ rights) as well as the mandate to build technology to solve it. Governments could invest in the development of these solutions, which could function like utilities and be available for societal good. I suggested that a potential US sovereign wealth fund (as both parties have suggested) could be a source of funding, though more likely governments would redirect legal aid budgets toward more technology investment (comment below if you think that’s a good or bad idea!).
But I think a bigger shift is to reconceptualize PeopleLaw challenges as societal challenges rather than legal challenges. Sure, legal is one process to resolve a dispute between you and your landlord, but there are many other ways to solve for keeping you in your house.
Consumer Champions vs Robot Lawyers
The biggest hurdles in PeopleLaw are regulatory and political. Until rules are modified about unauthorized practice of law and non-lawyer ownership of law firms, innovation will be stifled. I’m optimistic those rules will change soon (as I wrote for Bloomberg Law) but, until then, state bar associations are spoiling for a fight to keep AI lawyers out of court rooms. Tech founders, of course, are often happy to pick fights they can afford to lose.
No doubt political factors came to bear on the FTC’s selection of DoNotPay as one of the first enforcement actions it has brought for so-called AI washing. DoNotPay admitted no guilt and settled for a very small amount of money given just how much money it has raised and the fact that it is profitable enough to return dividends to investors.
The FTC claimed it improperly marketed itself as “the world’s first robot lawyer” even though it hadn’t been trained on a corpus of law or been supervised by lawyers. The FTC pointed to website copy that, as one commentator notes on his excellent Substack, “is pitchdeck stuff - all-encompassing and definitive statements that tech startups use to get investors to throw piles of money at them. It’s stuff that I see all the time in legal tech, like companies saying that their legal AI performs ‘better than human levels of accuracy’ and ‘without hallucinations.’” It’s hard to pick up a branded stuffed animal at a legal technology conference without falling over some of these claims.
I actually think branding is one of the biggest challenges in PeopleLaw today: if you say you are building AI to solve legal challenges for real people, you’ll raise less money from investors who doubt the market (as Bill discussed) and you’ll draw the ire of lawyers who will try to ensure you’re shut down one way or another (as DoNotPay is experiencing).
But DoNotPay offers a broader lesson here. It has raised eight figures from some of the top VCs in the world to create software that has the potential to help real people in their entanglements with governments and companies - software it more generally brands as “Your AI Consumer Champion.” It was only when the company started talking about the law that it got in trouble.
The fact is that software like DoNotPay solves consumer challenges whether or not it emulates lawyers. For instance, when I get a parking ticket, I want that parking ticket to go away. A lawyer is one tool for making it go away, and a DoNotPay-generated letter to the parking authority is another. It means little to me whether DoNotPay actually emulates a lawyer or does something completely different; if it makes the ticket go away, I’m better off.
Thus, perhaps one of the keys to unlocking PeopleLaw is a new narrative about what it actually is. At the end of the day, citizens have rights. Because of complexity, it is very hard for average citizens to fully enjoy those rights when they intersect with companies or governments who have power over them. Software that levels the playing field will reduce complexity and probably dampen a general sense that society is unfairly organized to the benefit of companies and government. Technology tools that do that may solve some of the societal problems Bill is worried about.
Thus, software that levels the playing field will potentially serve an important role in the future. As one of the FTC Commissioners noted in his concurring statement, “If a company were to create a computer system capable of giving accurate legal advice and drafting effective legal documents, or honestly advertise a system that provides something less, I doubt that the aggressive enforcement of lawyers’ monopoly on legal services would serve the public interest.”
I doubt that too. In order to create that software, however, founders may need to first act as if they are creating something else.
In Other News
Middle East Meltdown: I’ve been more negative than most about the Middle East and I feel depressingly vindicated in that negativity. To me, we are now entering the most fraught moment for citizens and business in the US (the moment has been especially fraught for the Middle East population for nearly a year now). One of the big fears in this wave of regional unrest has been the drawing of Iran and the US into the conflict. Iran is now firing missiles directly at Israel, which the US is supporting in shooting them down. Israel is promising retaliation and the US is not calling for restraint. There are various supply chain and oil scenarios worth worrying about from escalation, but one that doesn’t get enough attention is the risk of a cyber attack on US assets during the balance of election season, especially if the US is drawn further into the conflict. Expect to hear more about that in coming weeks, particularly as Iran has already been active (in support of Harris).
VP Debate: VP debates don’t matter unless something goes terribly wrong. Nothing went terribly wrong last night. In fact it felt like an odd throwback to when presidential campaigns and debates were about policies. Trump’s VP candidate JD Vance was polished (even when deflecting) and reintroduced himself to the American people in a way that might wipe off the negative press and memes he’s been subject to. Harris’ VP pick Tim Walz fumbled and misspoke a couple times, but nothing needle-moving and he exuded a folksy charm that will appeal to some. If you are predisposed to the politics of either, you probably thought they did well. If you didn’t know who to vote for for president, I don’t think the debate solved that for you.
Longshoremen strike - in part against technology: The longshoremen strike in the East and South will cause pretty substantial damage if it continues beyond a couple of weeks. A world where products, automobiles, and other packages are not getting to their destinations would be both an inflationary economic shock and a political one as the election approaches (though interestingly, President Biden has weighed in supporting the workers, couching it in terms of unfair foreign companies vs. US labor). One element of note: the strikes are in part an effort to stop automation of the docks. I expect much more of this across both manual and knowledge work as AI begins to eat many forms of work.
Long-read on fentanyl and trade law: Reuters has a well-reported story on how a shift in US trade law has reduced inspections of shipping cartons so much that fentanyl precursors are now routinely shipped from China through the US to Mexico because of America’s efficient infrastructure. It’s a really good story about how corporate lobbying for small wins can have big, second-order consequences, which can then put the companies that advocated for the changes in tough positions.
-SW