The Evolving Role of Data Privacy in M&A Disputes

With M&A disputes on the rise, Daniel Ryan and Amy Worley discuss the findings of BRG's 2023 Mid-Year M&A Disputes Report and the global impact of emerging data privacy and digital asset regulations.

Transcript

[00:00:34] Daniel Ryan: Hi, my name is Daniel Ryan. I'm a BRG managing director in our London office. In this episode of BRG's ThinkSet Podcast, I'll be speaking with Amy Worley, a managing director at BRG and associate general counsel. Amy leads our Privacy and Information Governance practice group and serves as a fractional data protection officer for other organizations worldwide.

Today, we're going to discuss data privacy issues in relation to our 2023 Mid-Year M&A Disputes Report, which surveyed 162 M&A-focused lawyers, private equity professionals, and corporate finance advisors on the current M&A disputes landscape.

Amy, thank you for joining me today.

[00:01:27] Amy Worley: Sure. Well, thanks for having me.

[00:01:29] DR: Why don't we begin by talking about digital assets and services, one of the key focus areas in our M&A report. Four in ten survey respondents expect that new digital asset regulations, including those governing data privacy, are likely to increase the prevalence of M&A disputes this year. Why do you think that is? And how has today's shifting regulatory landscape impacted the viability of new M&A deals?

[00:02:03] AW: That is a big question. I will break my answer into a couple of pieces. One reason I believe we're seeing that response is that, worldwide, there are several different approaches to the regulation of personal information or personal data. They create some uncertainty and can conflict with one another, which makes implementation and operationalization challenging.

The other thing we're seeing is that the first rule of privacy is that the last rule of privacy just changed. I have notifications set for privacy regulation around the world. It has been my practice for years to get up in the morning and, with my coffee, read what's happening, and I can no longer get through it in that amount of time. Markets prefer predictability. Markets prefer stability. Disputes arise when things become uncertain and different interpretations can apply to different sets of facts. As someone who lives in the world of data protection and information compliance daily, I can tell you that the brightest minds on this don't agree. And that creates an environment that is ripe for conflict.

The last thing I'll mention is that we are starting to see fines get larger. For a long time, the fines were—especially on the M&A scale—a risk you could accept. We're seeing that change. Obviously, the most prominent cases are in Europe, but even in the United States, we're starting to see more fines, larger verdicts or settlements, and greater regulatory action. That increases the risk, and it's going to create an opportunity for disputes in the M&A space.

[00:04:00] DR: Amy, you mentioned that Europe is one area where those fines are increasing. Some of our survey respondents called out particular regulations, such as the EU's Digital Services Act and the UK's Digital Markets, Competition and Consumers Bill—both of which are quite new, and so everyone is finding their way. How do you see their potential to drive disputes? And is it really just the threat of fines that's an issue, or is it also whether the fundamentals of an acquired company's business model are threatened by some of these regulations?

[00:04:39] AW: I think it's a “yes, and” issue. We certainly see, with the entire suite of data-related laws in Europe, including the AI regulation that will likely be passed at the end of this year, that Europe is trying to build a comprehensive approach to regulating the data marketplace. One of the tools that they use is fines, but another is limiting a business's ability to operate if it doesn't comply with the laws.

I was actually on the phone with a data privacy regulator earlier today who is taking the view that certain uses of personal data for advertising simply cannot be done without consent. Now, this is not a view that's been accepted by the European Court of Justice yet, but it's an example of that kind of uncertainty. We will see that across these different legislative frameworks, where the regulators are developing their own doctrinal views, and there may be things like algorithmic disgorgement: you can't put this data in your algorithm. Or you may not use this data in pricing. Or, ultimately, you can't operate in Europe. And I'm only picking on Europe because they're ahead of the game on digital regulation; I expect we'll see this in other countries as well. We see regulators globally using not just financial fines but also restrictions, for example, on transferring data out of the jurisdiction. And as all of that evolves, it creates the opportunity for controversy.

[00:06:23] DR: And do you perceive a difference in how regulators in Europe versus the US and Asia approach this issue?

[00:06:37] AW: Yes. And I would say that my answer to this question would have been different two years ago than it is now. The US has historically looked at data and regulated it by industry. So, finance had a regulation, and health had a regulation. That actually made a lot of sense in the pre-global-internet world, because those verticals already had established regulatory bodies, and so it was a matter of adding requirements onto those. What we're now seeing, with all of the new US state comprehensive privacy laws, is a move toward comprehensive privacy rules in the US.

One difference that I don't think will ever be fully resolved is constitutional. In the US, we have constitutional protection for some types of commercial speech. That creates a jurisprudential and philosophical difference versus Europe, which has fundamental human rights, the rights and freedoms of individuals, as the basis for its regulation. Because of that, data is handled a bit more strictly there: it relates to fundamental rights and freedoms. Socially, there are differences too. Americans lived through World War II on this side of the ocean and didn't experience the same data collection abuses that arose in Europe during that period. And so, in my experience working on both sides of the pond, I often find myself telling World War II stories to US boardrooms to get them to understand how these two types of laws have developed and why.

Asia and Latin America are very consent-based. They have constitutional rights to privacy. Individuals have the right to waive those rights, but that creates a big administrative burden on companies to make sure they have gathered consent for the various uses of data, especially in emerging markets, where you may have low literacy rates or multiple languages in a small area. So there, it's handled very differently, and it's very complicated. I would hate to show you my spreadsheet where we crosswalk the whole world, but yes, philosophically these regimes are very different.

[00:08:55] DR: I've certainly experienced that, having done some work for the European Commission on IP-related issues. And in the UK as well, where there was the question of whether a Google could ever happen in the UK, and one of the issues was around IP protection. But there, of course, you are still dealing with a business-to-business issue. A lot of the concerns around data privacy are more business-to-consumer, where technological advances have clearly made it possible to aggregate lots of that data and use it in powerful ways. And when one looks at the successful companies in some spaces, it is the algorithms analyzing user behavior that are driving their success. So to some extent there is potentially a conflict between successful innovation and regulation that prevents you from doing things that are new and different.

[00:10:00] AW: So, two things. One, currently the UK government is trying a kind of fourth way, positioning itself as more digital friendly than Europe but not going all the way to the free-market US perspective. That's brand new, and we'll see how it evolves.

I would push back a little on the idea that regulation and innovation are in conflict. Done well, implementation of these regimes is like brakes on a Ferrari. A Ferrari is great; it goes really fast. But if the brakes go out, you don't want to be in the Ferrari.

Our approach, particularly in my practice, is to be uber-pragmatic: how can we do what is required in a way that provides a good service, like brakes, without stopping the Ferrari? So, I think you can do both. You just have to be very, very mindful. And honestly, it also depends on the company's leadership: can they see it that way?

I know a lot of our Silicon Valley clients would prefer federal privacy legislation in the US because, right now, they're having to comply with the patchwork, and they feel like if they knew what was expected, it would be much easier to meet that obligation.

[00:11:22] DR: And I guess that's part of it: knowing what's expected and how it is going to change. So you have to be very forward thinking so that you can adapt quickly to changes in the regulation.

[00:11:35] AW: Yes, and the challenge in most democratic nations is that the technology is usually a decade ahead of the law. Because of the way our representative governments work, people have to care, then they tell their representatives to care, then a government is in power or not in power, and they compromise, and ultimately they do something, and now they're regulating, you know, DOS. So the other challenge will be to see whether lawmakers can do this faster. Because right now—I'll take HIPAA in the US, for example—it is increasingly difficult to apply HIPAA to a digital health information landscape. It wasn't written for that. And so I think what we're going to see is more change, faster change, and that creates complexity.

[00:12:30] DR: And that pace of change is again going to vary, so the patchwork becomes even more complex. How does that mishmash of different regimes create dispute risks in the context of M&A, particularly deals that are cross-border and global?

[00:12:50] AW: So, I'm going to pick on Europe again. There are, right now, two different views among European regulators about the use of personal data for advertising. Let's say you're an e-commerce provider and you're using clickstream and behavioral data to serve up items that you think a user might want on your platform. The Germans and the Dutch do not think that you can do that based on legitimate interest, that is, simply because you have a business interest in doing so. Other regulators don't share that view, even within Europe. So, you do a deal. There are representations and warranties made about compliance. And you find out that maybe those reps and warranties are true in one part of a jurisdiction and not true in another.

I also think, because of how fast these things can happen and the way due diligence works, diligence may have been conducted with a certain set of rules in mind, and by the time the deal closes—or fails to close—there's a new set of rules, or new decisions have been made by regulatory bodies. I'm sure that the Meta legitimate interest data transfer decision caused some heartburn for some in this space.

[00:14:11] DR: I mean, due diligence is obviously one way of managing risk. Are there best practices in this particular space that you would recommend?

[00:14:21] AW: I do still see an over-focus on security and an under-focus on privacy. I suspect that's because breaches get a lot of press, they can hurt the brand, and they are bad for consumer trust. It's a more recent phenomenon that privacy violations create risk, and so those doing the deals are having to evolve their thinking toward the more contemporary regulatory issues. That means looking not just at security, but really asking: On what basis is this personal data being processed? Which regulators are we going to be dealing with? Are we in Germany and Austria, or are we in France, Italy, and Spain? Taking a more holistic view of the privacy component, in addition to the security component, would really be helpful.

[00:15:19] DR: So, one thing that you mentioned at the beginning was regulation around AI, which is clearly an area where a lot of governments are saying a lot of things. What are the early indications as to the direction that different jurisdictions are taking in relation to AI?

[00:15:38] AW: So, still early days. A couple of US states have put forward legislation, and in the US regulatory space, the FTC, the SEC, the CFPB—we love alphabet soup over here—have all issued guidance. Then, of course, Europe is taking the comprehensive regulation approach. And just for anybody who doesn't know, a regulation must be applied in all EU countries, as opposed to a directive, which directs individual countries to create legislation that complies with its principles.

But right now I'm seeing more commonality than difference. There are some basic principles that have been circulating in the AI community for over a decade. The big one is explainability, so that you don't have a black-box algorithm. So, transparency about the data that's being ingested. Reliability: are the results reliable? Are they explainable? And then, of course, bias. That's really where I think there's going to be a lot of room for dispute and debate. One, because it depends on how the algorithm learns. It also depends on what the algorithm is learning from. We in the US have a particularly nasty history with respect to race, so if your algorithm is learning from older American texts, it may be ingesting those views. That might be quite different in a more homogeneous culture that perhaps has a more troubling history with respect to class or socioeconomic status. I can only imagine that you then get a legislative framework informed by that.

So, I think we all agree that we don't want AI to amplify the worst of us, but we don't all have the same worst. And so, it will be very interesting to see how that plays out.

[00:17:36] DR: Which, again, goes back to algorithms: social media sometimes amplifies the best and the worst. And clearly a huge amount of data comes through that, particularly in relation to individuals and how they communicate with the outside world, mostly through social media and similar channels. Presumably there will be some sort of interaction there between AI regulation and data privacy regulation.

[00:18:08] AW: Absolutely. And the other part will most likely shake out in the courts, and we're seeing it right now—I believe Sarah Silverman filed a lawsuit in the US. Creatives, people who own IP, copyright holders, are saying, “You have trained this to sound like me, and you haven't paid me for it.” Sometimes that is easy to see, as when you go into an AI tool and ask it to sound like a famous person. Sometimes it's not as clear. And we will be waiting on courts all over the world to decide what the property rights are to the intellectual property that's feeding these algorithms. That will be incredibly dynamic and will also depend very much on where in the world you're talking about.

[00:18:57] DR: Yes. I mean, that's an interesting intersection between AI regulation and copyright. But again, quite different in terms of how different jurisdictions look at those types of issues.

[00:19:12] AW: Well, and I would add another layer, which is professional accountability. I was listening to a podcast I love this morning on the intersection of tech and the law. One thing they raised was algorithm-enhanced medical diagnosis, algorithm-enhanced legal services, and so on. Right now, most of those professions tie accountability for decisions to someone with credentialing, training, and so forth. How are we going to manage that? The AI is not going to be kicked out of the medical association or the bar association. So it's an amazing time. It's fascinating. It is certainly going to require unique governmental solutions, but it will be a while before all of the different areas impacted play out.

And what I tell people is that I have zero fear of AI. I think it's amazing. I only worry about the people who use it.

[00:20:12] DR: One of the other things going on in a rapidly changing world is the increasing importance of ESG: the sustainability and social governance issues that companies are now starting to take much more seriously. Potentially, again, you have an interaction there between privacy, data-related issues, possibly AI, and the social governance dimensions, which, going back to our M&A disputes survey, were seen as one of the big drivers of potential disputes over the next twelve months. Where do you see that interaction?

[00:20:54] AW: Mostly, at least from the privacy perspective, it's on the assurance side. When you are providing assurances about supply chains, whether it's making sure that workers are of a proper age and being paid a proper wage or, in agricultural spaces, verifying farming practices, gathering all of that involves a lot of personal information. And thus far, we haven't reconciled the two: the privacy rules and the information we would like to have in order to make those assurances about supply chains. That leaves an opportunity for disputes.

The other part, on the data side, is that data's not green. As we ramp up our compute power to previously unimagined levels, we burn gas. If I make a payment in crypto, I pay a fee that's basically gas: I'm paying for the electricity used to mine the coin. We really haven't reckoned with that, but I expect we will soon, and I think it will impact things like decentralized finance and how we look at a company's overall carbon footprint: how digital are they, and what are they doing to offset that? I always tell people data comes from trees, or data comes from dinosaurs, or data comes from gas. And I don't think that's discussed as much as maybe it should be.

[00:22:34] DR: Well, this is clearly a hugely complex area, with so many interacting elements that are themselves new and constantly evolving. Before we end, do you have any final thoughts about your practice and how you see it evolving as the landscape changes?

[00:22:58] AW: Sure. One of the reasons I went into consulting, as opposed to going back into a law firm, is that I really think the most interesting and powerful work is inside the companies: trying to create processes and procedures and implement technologies to make these legislative requirements actually happen, and to do so in a way that doesn't stifle innovation. I am fascinated. I get up every morning thinking what a cool space this is to work in.

I will say that it's sometimes very hard for me to describe what I do. I've really started moving more toward “information compliance,” because the information itself doesn't care where it is, and it doesn't care whether you're talking about privacy or copyright or security. It's a question of how we solve problems around data within a business to reduce risk without hurting innovation. And that's really what we try to do.

[00:24:02] DR: Excellent. Well, that is definitely a great note to end on. Amy, thank you again for taking the time to speak with us today. If anyone out there would like more information on our 2023 Mid-Year M&A Disputes Report, then please visit the M&A and Private Equity Disputes page on the BRG website, which is available in the podcast episode description. Thanks again.

[00:24:29] AW: Thank you, Danny.