A Startup Is Selling Dead Companies' Slack Messages and Emails to Train AI
Last updated: April 19, 2026
SimpleClosure, a startup dissolution service, is now packaging dead companies’ Slack messages, emails, and code repositories and selling them as AI training data, sourced from employees who never consented to that use. Forbes reported the news on April 16, 2026, and the details should concern anyone who has ever worked at a company that went under.
The demand is coming from a new corner of the AI industry: “reinforcement learning gyms.” These are simulated corporate environments built from real company data where AI agents practice navigating actual workplaces. Not synthetic scenarios. Not made-up org charts. Real conversations from real companies that no longer exist.
How This Works
When a company uses SimpleClosure to wind down, the process involves liquidating assets, distributing proceeds to investors, and dissolving the entity. SimpleClosure’s new tool adds a step: packaging up the company’s digital footprint and selling it to AI data buyers.
The data includes Slack messages between employees, internal emails, code pushed to repositories, project management boards, and other workspace artifacts. SimpleClosure positions this as another way for founders to extract value from a dead company during dissolution. Instead of that data just getting deleted, it gets sold.
From SimpleClosure’s perspective, this is efficient asset liquidation. From an AI company’s perspective, this is gold. Real enterprise data, with real communication patterns, real decision-making chains, and real workplace dynamics, is exactly what AI agents need to learn how to operate in corporate environments. Synthetic data can teach an AI to write an email. Real data teaches it how people actually communicate at work.
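To make “digital footprint” concrete, here’s a sketch of what such a package might look like, expressed as a Python structure. SimpleClosure hasn’t published a schema, so every field below is an assumption extrapolated from the data types described above.

```python
# Purely illustrative: a hypothetical shape for a packaged "digital
# footprint." SimpleClosure has not published a schema; every field here
# is an assumption based on the data types named in this article.
corpse_data_package = {
    "entity": {"name": "ExampleCo, Inc.", "dissolved": "2026-03-31"},
    "slack": {
        "channels": ["#general", "#eng", "#random"],
        "messages": 148_302,  # full history; does it include DMs?
    },
    "email": {"mailboxes": 42, "messages": 310_117},
    "code": {"repos": ["example-api", "example-web"], "commits": 9_844},
    "project_boards": ["Q1 roadmap", "Incident tracker"],
    # The legal gap in one line: nothing in the package records whether
    # any of those 42 mailbox owners consented to this use.
    "consent_records": None,
}
```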
The Legal Gaps Are Enormous
Start with the obvious question: who consents to this?
When employees join a company, they use corporate tools (Slack, email, project management software) under the assumption that the company controls that data for business purposes. Most employment agreements and acceptable use policies give the company ownership of communications made on company systems. That’s standard.
But “the company owns your Slack messages for business operations” is a very different proposition from “the company can sell your Slack messages to train AI after the company dies.” The original consent framework, if one existed, almost certainly didn’t contemplate this use case. Most privacy policies written before 2024 don’t mention AI training at all.
Then there’s the question of what’s in those messages. Slack conversations contain personal information, health discussions, salary negotiations, complaints about managers, job search conversations, and the kind of unfiltered communication people have when they think only their coworkers are reading. Selling that to AI companies creates exposure under state consumer privacy laws, potential HIPAA issues if health information was discussed, and a raft of problems under the GDPR if any European employees were involved. And this isn’t a niche concern: Mercor already faces five lawsuits over exactly this kind of unauthorized use of workplace data for AI training.
The Reinforcement Learning Gym Problem
The “reinforcement learning gym” concept deserves its own scrutiny. These are environments where AI agents practice doing things inside simulated companies. They learn to navigate Slack channels, respond to emails, manage tasks, and interact with simulated coworkers, all based on patterns from real defunct companies.
The practical effect is that an AI agent trained on your dead company’s data might learn your communication style, your project management patterns, your decision-making shortcuts, and your workplace culture. That agent then gets deployed at other companies, carrying the behavioral patterns of people who never consented to having their work habits turned into training data.
For companies building these gyms, the appeal is obvious. AI agents trained on synthetic data act synthetic. They don’t understand that people respond differently to a Thursday 5pm Slack message than to a Tuesday 10am one. They don’t know that certain team leads prefer bullet points and others want narrative updates. Real data captures all of that.
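For readers who want the mechanics, here is a minimal sketch of what a gym like this could look like. Everything in it, the environment class, the message schema, the reward rule, is a toy assumption for illustration, not any vendor’s actual product. The structure is the point: observations come from a real archive, and the reward signal is derived from what real employees actually did.

```python
# Toy, Gym-style environment built over archived workplace messages.
# All names, the message schema, and the reward rule are illustrative
# assumptions, not any real product's design.
import random
from dataclasses import dataclass

@dataclass
class ArchivedMessage:
    channel: str
    author: str
    text: str
    reply_style: str  # how the real team actually replied: the "label"

class DeadCompanyGym:
    """One decision per episode: read an archived message, pick a reply
    style, get rewarded for matching what the real employees did."""

    ACTIONS = ("bullet_points", "narrative", "one_liner")

    def __init__(self, archive: list):
        self.archive = archive
        self.current = None

    def reset(self) -> ArchivedMessage:
        self.current = random.choice(self.archive)
        return self.current

    def step(self, action: str):
        # The reward comes straight from real human behavior in the
        # archive, which is both the appeal and the consent problem.
        reward = 1.0 if action == self.current.reply_style else 0.0
        return reward, True  # episode ends after one decision

# Usage with a two-message (hypothetical) archive:
archive = [
    ArchivedMessage("#eng", "alice", "Weekly update on the migration", "bullet_points"),
    ArchivedMessage("#eng", "bob", "Why did the deploy fail?", "narrative"),
]
env = DeadCompanyGym(archive)
obs = env.reset()
reward, done = env.step(random.choice(DeadCompanyGym.ACTIONS))
```

A production gym would be vastly richer, with multi-turn episodes and simulated coworkers, but the dependency is the same: the better the archive captures real behavior, the better the agent, and the bigger the consent question.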
What This Means for Current Employees Everywhere
If you work at a startup right now, your Slack messages, emails, and code contributions might end up as AI training data if your company fails. That’s a real possibility for any venture-backed company, given that roughly 90% of startups eventually shut down.
Your employment agreement probably doesn’t address this. Your company’s privacy policy probably doesn’t address this. The terms of service for Slack and Google Workspace and Microsoft 365 probably have provisions about data ownership that were never written with this scenario in mind.
The law hasn’t caught up. There is no federal statute that specifically prohibits selling employee communications from defunct companies for AI training purposes. State privacy laws like the CCPA give consumers rights over their personal information, but the intersection of employee data, corporate dissolution, and AI training rights is untested territory. The FTC’s case-by-case enforcement approach means there’s no clear regulatory signal until someone gets hit.
What to Do Now
If you’re a founder: Before your company ever touches SimpleClosure or any similar service, understand what data you’re selling and whether you have the legal right to sell it. Check your employee agreements, your privacy policy, and any promises you made to employees about how their data would be handled. Selling their Slack messages for AI training after shutting down is the kind of move that generates class actions.
If you’re an employee at a startup: Ask your employer what happens to internal communications data if the company shuts down. Get it in writing. Better yet, push for a data destruction clause in the wind-down plan.
If you’re a GC or outside counsel: Update your template employment agreements and privacy policies to address AI training use of internal communications data, both during the company’s life and after dissolution. The current gap between what employees expect and what companies can technically do with their data is a liability factory.
If you’re building AI training datasets: The companies selling you this data may not actually have the consent chain you need. When the first employee lawsuit lands over unauthorized AI training use of their workplace communications, the data buyer is going to be named right alongside the data seller. Just ask Match Group how that turned out when the FTC came knocking over AI training data.
The market for corporate corpse data is only going to grow. The legal framework for it doesn’t exist yet. The companies that get ahead of this will be the ones that build consent mechanisms before the courts force them to.
Your employment agreements and data policies probably have a gaping hole where AI training rights should be. That hole becomes a lawsuit when someone else fills it for you. Book a diagnostic and we’ll tell you exactly where your exposure is.