AI is “not open in any sense,” the battle over encryption is far from won, and Signal's principled (and uncompromising) approach could complicate interoperability efforts, warned company president Meredith Whittaker. But it's not all bad news.
(Well, actually, this part is all bad news; I wrote up the good news separately.)
Whittaker, who spoke with me onstage at StrictlyVC LA, called the resurgence of legal attacks on encryption “magical thinking.”
“We are seeing a lot of parochial and deeply politically motivated legislation, much of it predicated on the idea of protecting children but actually being used by security agencies and authoritarian governments to advance a very old aspiration: to systematically backdoor strong encryption,” Whittaker said. “The ability to communicate digitally and privately could be effectively eliminated, often swept along by well-meaning people who don't have the knowledge or education to understand the implications of what they're doing.”
Ironically, one of the galvanizing factors has been the last decade of calls for more accountability from technology companies.
“The overall theme I see is that the deep desire for accountability in the tech industry, which became a kind of galvanizing force in the mid-2010s, has been weaponized. I think we're looking at surveillance wine in an accountability bottle,” she said.
“'Accountability' has come to mean more monitoring, more surveillance, more backdoors, and fewer places where people can interact and express themselves freely, instead of actually checking the business models that gave rise to these huge platforms: the information operations, the collection of personal data, whatever can easily be weaponized, right? At the root of the problem is an unwillingness to attack those models. Instead, what we are seeing are de facto proposals to extend surveillance to the government and NGO sectors in the name of accountability.”
One example of such a proposal is the UK's Investigatory Powers Act, under which the government is threatening to block, globally, updates to apps it deems a threat to national security.
“[The IPA] effectively asserts that the UK has the power to require technology companies in any jurisdiction to check with the UK government before distributing security patches, because the government may be exploiting the very flaw a patch would fix in some operation it wants to continue. Again, it's a kind of parochial, magical thinking,” Whittaker said.
“This is very dangerous because it threatens a return to the paradigm of the early '90s, before the liberalization of encryption in 1999, when governments had a monopoly on encryption and on the right to digital privacy: any ability to deploy encryption, privacy updates, or anything else that protects and hardens a service would have to be cleared by the government.”
“And honestly, I think the VC community and the big tech companies need to be much more involved in recognizing what a threat this is to the industry, and pushing back,” she added.
One regulation that seems to make sense is the messaging interoperability requirement being pursued in the EU through the Digital Markets Act. But there are dangers lurking there, too.
“I think the spirit of it makes a lot of sense. But of course, Signal can't interoperate with another messaging platform without that platform significantly raising its bar for privacy. Because with the Signal protocol, we don't just encrypt the contents of your messages. We encrypt the metadata: your profile name, your profile photo, who's in your contact list, who you talk to, and when you talk to them. Whoever we interoperate with would have to meet that level of privacy and security before we would agree to interoperate.”
She explained that the risk is that security and privacy get watered down in the name of convenience, accomplishing the opposite of what the rule intends. “It could actually lower the standard for privacy, creating a kind of interoperable monolith that further marginalizes those demanding honest privacy standards.” (Apple, whose permitted exemption would leave such a regime hopelessly fragmented anyway, has scoffed at the idea.)
On the private sector side, Whittaker was quick to call the dominant Nvidia a monopoly.
“It's a chip monopoly, and it's a CUDA monopoly,” she said, referring to the proprietary compute architecture at the heart of much of today's high-performance computing.
I asked her whether she thought the company's accumulation of power had made it dangerous.
“So there's a lot of Spider-Men pointing at each other, right? You have Microsoft pointing at Nvidia, saying, if you're worried about monopoly, don't look at poor Microsoft, look at Nvidia. And we see Nvidia saying, look at Google, too. Microsoft put out this sort of PR document last week, its AI access principles, talking about how Google is the only company that's vertically integrated, from app stores to chips. That's true, right? But then a few days later, Google fired back that it's actually Microsoft that has a kind of monopoly, through OpenAI and Azure.
“I mean, no one's innocent here. There's a lot of 'we're all trying to find the guy who did this.'” (A quote, I believe, from the famous “hot dog guy” sketch on Tim Robinson's “I Think You Should Leave.”)
“I think we need to recognize that AI depends on big tech. It requires massive resources. It's not open in any sense,” she said. “Let's be honest: if it takes $100 million to train, that's not an open resource. If it takes $100 million a month to deploy at scale, that's not open, right? So we need to be honest about how we use these terms. But I don't want this week's fixation on Nvidia as the culprit to blunt our response to this massively concentrated power.”
You can read the full interview below.