NEW YORK (AP) — OpenAI whistleblowers have filed a complaint with the Securities and Exchange Commission and asked the agency to investigate whether the ChatGPT maker illegally restricted workers from speaking out about the risks of its artificial intelligence technology.

A letter to SEC Chair Gary Gensler representing “one or more anonymous and confidential” whistleblowers asks the agency to swiftly and aggressively enforce its rules against non-disclosure agreements that discourage employees or investors from raising concerns with regulators.

The July 1 letter references a formal whistleblower complaint recently filed with the SEC. The Washington Post was the first to report on the letter.

U.S. Sen. Chuck Grassley's office shared a copy of the letter with The Associated Press, noting it was provided to his office by legally protected whistleblowers.

“OpenAI’s policies and practices appear to cast a chilling effect on whistleblowers’ right to speak up and receive due compensation for their protected disclosures,” said Grassley, an Iowa Republican, in a written statement. “In order for the federal government to stay one step ahead of artificial intelligence, OpenAI’s nondisclosure agreements must change.”

OpenAI and the SEC didn’t immediately respond to requests for comment Monday.
