Amid ongoing concerns about how big tech companies repurpose personal and corporate data to train their AI services, a storm of anger has erupted among Slack users over how the Salesforce-owned chat platform is pushing ahead with its AI vision.
The company, like many others, is using user data to train some of its new AI services. But it turns out that if you don't want Slack to use your data, you have to email the company to opt out.
And those terms were tucked away in an outdated, confusing privacy policy that no one was paying attention to. That was the case with Slack, until an aggrieved user posted about the terms on a community site hugely popular with developers, and the post went viral. That's what happened here.
It all started last night, when a post on Hacker News raised the question of how Slack trains its AI services by way of a direct link to Slack's privacy principles; no additional comment was needed. That post kicked off a longer conversation, and what appeared to be news to current Slack users: that Slack opts users in to its AI training by default, and that to opt out you have to email a specific address.
That Hacker News thread has since spurred multiple conversations and questions on other platforms. There is a newish, generically named product called "Slack AI" that lets users search for answers and summarize conversation threads, among other things, so why is it never mentioned by name on the privacy principles page, even to make clear whether the privacy policy applies to it? And why does Slack refer to both "global models" and "AI models"?
Between the confusion over where Slack applies its AI privacy principles, and people's surprise and frustration at the idea of having to email to opt out, at a company that touts the notion that you control your data, Slack does not come off well.
The shock may be new, but the terms are not. According to pages on the Internet Archive, the terms have been in effect since at least September 2023 (we've asked the company to confirm).
Per its privacy policy, Slack uses customer data to, among other things, train "global models," which it uses to power channel and emoji recommendations and search results. Slack says its use of that data has specific limits.
"Slack has platform-level machine-learning models for things like channel and emoji recommendations and search results. We do not build or train these models in such a way that they could learn, memorize, or be able to reproduce some part of customer data," a company spokesperson told TechCrunch. However, the policy does not appear to address the overall scope of the company's broader plans for training AI models.
Slack's terms say customers who opt out of data training would still benefit from the company's "globally trained AI/ML models." But in that case, it's unclear why the company is using customer data to power features like emoji recommendations in the first place.
The company also said it does not use customer data to train Slack AI.
"Slack AI is a separately purchased add-on that uses large language models (LLMs), but we do not train those LLMs on customer data. Slack AI uses LLMs hosted directly within Slack's AWS infrastructure, so customer data remains in-house and is not shared with any LLM provider. This ensures that customer data stays in that organization's control and is used exclusively by that organization," the spokesperson said.
Some of the confusion may be addressed sooner rather than later. In a reply to one critical take on Threads from engineer and writer Gergely Orosz, Slack engineer Aaron Maurer conceded that the company needs to update the page to reflect "how these privacy principles play with Slack AI."
Maurer added that the terms were written at a time when the company didn't yet have Slack AI, and that the rules reflect its work on search and recommendations. Given the confusion around what Slack is currently doing with AI, it will be worth examining the terms for future updates.
The problems at Slack are a stark reminder that, in the fast-moving world of AI development, user privacy should not be an afterthought, and that a company's terms of service should clearly spell out how and when data will, or won't, be used.