GeoLegal Weekly #34 - Terms & Conditions
I contemplate terms and conditions while taking an autonomous taxi. Plus, Hence launches a fashion fundraiser to support the next generation of female Rwandan coders!
Last Saturday night, I was chauffeured around Los Angeles in an autonomous Waymo taxi with a friend. The taxi had no driver but picked us up and dropped us off wherever we were going. We could play whatever music we wanted, keep the temperature however we wanted and roll down the windows without worrying about disrupting the professional in the driver’s seat. Because there was no professional in the driver’s seat. There was a certain bliss to the simplicity and control of it all.
That was until the car made a couple of herky-jerky movements. Nothing dangerous - just quick shifts as the car tried to determine whether something was a hazard. But it was enough to get me thinking about my own liability as a passenger. If the car veered off and did the unthinkable, could my friend or I possibly be at fault? Probably not, but if I was injured, did I have any idea who would be liable?
The honest answer is I have absolutely no idea. I can see how the car manufacturer or the autonomous ride-hailing app could be liable for failures or mistakes. I could see how either of us passengers could be liable if we physically interfered with the car. I could imagine the software being hacked in ways that could leave any of us, or none of us, at fault. But the simple fact was that I was rolling around town in a vehicle that could put my life and others’ lives and property at risk without really understanding the terms and conditions.
This is a state of the world that is hard to avoid today. Research by Deloitte found:
91 percent [of consumers] willingly accept legal terms and conditions without reading them before installing apps, registering Wi-Fi hotspots, accepting updates, and signing on to online services such as video streaming. For ages 18 to 34, the rate of acceptance of terms and conditions, without reading them, reaches 97 percent.
This reminded me of the Black Mirror episode “Joan is Awful” where the main character Joan discovers her life can be recreated by streaming service Streamberry because she agreed to that in the terms and conditions of joining the streaming service. Bad thing after extremely bad thing happens to Joan while a stream of lawyers tell her she can’t sue because she agreed to the terms.
Terrible things could never happen to you just for signing up for a video streaming service, right? Well, I’m not so sure anymore.
A few weeks ago, a man’s wife had a fatal allergic reaction after eating at a restaurant on property owned by Disney. When the man included Disney in his lawsuit against the restaurant operator, Disney argued that the man must submit to arbitration instead because he had once taken a Disney+ streaming trial and agreed to a term there that “any dispute between you and us, except for small claims, is subject to a class action waiver and must be resolved by individual binding arbitration.”
Of course, the public relations tone-deafness of the legal strategy was kind of wild and it’s worth noting that Disney has since reversed this position. But what’s really interesting to me was the concept that sprawling companies, who are increasingly in control of more and more of our lives, could theoretically take consent in the context that has the lowest bar and apply that in ways we’ve never imagined.
Which brings me back to Waymo, owned by Alphabet, the parent of Google. If my Waymo had crashed and injured me, did I cede all rights to a jury trial when I signed up for Gmail in 2007? Or, more perversely, if a person was struck by an errant Waymo while following Google Maps across the street, did their assent to the mapping app jeopardize their claim for injury?
These may seem like edge cases, but we live in a world of great consolidation. We work and play on the same Apple devices. We post about the real world and interact in the metaverse using the same Meta accounts. We use our Amazon logins to stream content and to steam dumplings from Whole Foods. We use ChatGPT to draft code for apps we’re building and draft groomsmen speeches we’re due to give (shhh.)
It’s actually remarkable when we think about how many products in our houses, at work, or in the cloud are owned by so few companies - firms that have access to all our personal information and even the capacity to cause us physical harm. This is especially true in a context where government is woefully behind in understanding the technology and developing meaningful regulations.
The reality is these platforms deliver the convenience and innovation customers want. So how do we strike a balance?
As I wrote about in my 2024 GeoLegal Outlook, the undefined space - where governments struggle to regulate emerging tech - creates openings for alternative forms of policing. Sure, companies can attempt to shift liability to their users knowing they don’t read their agreements - that is, until AI empowers customers to quickly digest their obligations.
Companies may instead self-police and hold themselves to a higher standard than the current law because they think it's the right thing to do or because they don’t want to take public relations risks. Industry groups can come together to raise standards in a technology sector version of the example I wrote about in GeoLegal Weekly #30. Or, as Gillian Hadfield has written, rather than regulating directly, government could allow for-profit regulators to compete to design the best regulatory systems that technology companies want to use, and to get paid for the regulation they have designed.
And let’s say that subsequent court decisions allocate liability for new technology products in ways that hamper innovation. We as a society could decide we still want more autonomous cars - or whatever the technology is - and set up a fund like the National Vaccine Injury Compensation Program. That program provides no-fault compensation, funded by a user fee, to those injured by vaccines, and was developed after skyrocketing legal risk threatened the vaccine supply chain. The same could be applied to tech products.
Until there is more resolution, I think it’s worth stepping back and realizing that in-house legal teams in innovative companies hold gatekeeping power over this innovation. Ultimately, it’s these lawyers who are defining what is acceptable. That is to say that certain technologies may achieve government clearance but be held back by their inventors because the legal risk of the product is too high. Conversely, some products may have too-clever-by-half solutions built in, like the theory that Teslas deactivate autopilot one second before accidents in order to shift liability back to drivers.
As the world moves to a new equilibrium, I suspect that, as in the Disney example, more and more of this will play out in the court of public opinion.
Supporting Female Coders in Rwanda
As many of you know, our company has its engineering hub in Kigali, Rwanda. I’m really excited to announce a project I’ve been working on for over a year with my good friends Walé Adeyeme MBE, a top UK fashion designer, and Judith Kaine, who runs Komezart, the premier online platform for Rwandan art.
Inspired by Hence’s philosophy to “change the narrative” with respect to global talent, Walé worked with Rwandan artist Imufashe Olivier to create a one-of-a-kind work of art in support of the next generation of Rwandan female technologists. This art is available globally as a t-shirt or a print.
All proceeds will go to SheCanCode, a transformative tech bootcamp for Rwandan women aged 18-25. It costs about $200 to sponsor training for one woman to become equipped to enter the IT sector. Let’s see how many women this community can sponsor!
If you want to help us change the narrative and support the next generation of female technology talent in Rwanda, please consider getting yourself or a loved one a very fashionable tee here.
That’s it for this week.
-SW