LinkedIn, the professional networking platform owned by Microsoft, is facing a proposed class-action lawsuit alleging that it disclosed Premium users' private messages to third parties for training artificial intelligence (AI) models without their consent.
According to the lawsuit, filed in federal court in San Jose, California, LinkedIn introduced a privacy setting in August 2024 that allowed users to opt in to or out of data sharing. The platform then quietly updated its privacy policy the following month to state that user data could be used to train AI models. In a linked FAQ section, LinkedIn reportedly clarified that opting out would not prevent the use of data that had already been processed for AI training.
According to a Reuters report, the complaint, which seeks to represent millions of LinkedIn Premium users, alleges that the platform violated its promise to use personal data solely to improve its services. The plaintiffs claim that LinkedIn’s actions were an intentional effort to minimize public scrutiny and potential legal consequences, accusing the company of attempting to “cover its tracks.”
The lawsuit, filed on behalf of Premium customers who used LinkedIn's InMail feature, seeks unspecified damages for breach of contract and violations of California's unfair competition law, as well as $1,000 per class member for violations of the federal Stored Communications Act.
In response to the allegations, LinkedIn issued a statement dismissing the claims as “false” and “without merit.”
This legal development comes shortly after U.S. President Donald Trump announced a high-profile joint venture among Microsoft-backed OpenAI, Oracle, and SoftBank, with a potential $500 billion investment to build AI infrastructure in the United States.
The case, De La Torre v. LinkedIn Corp (No. 25-00709), highlights growing concerns over the use of personal data to train AI systems. As AI technologies rapidly evolve, scrutiny of how companies handle user data is intensifying, with privacy advocates and regulators closely monitoring these practices.