A nonprofit group is joining Elon Musk’s effort to block OpenAI’s conversion to a for-profit company
Encode, the nonprofit that co-sponsored California’s SB 1047 AI safety legislation, has requested permission to file an amicus curiae brief in support of Elon Musk’s motion for an injunction to stop OpenAI from converting to a for-profit company.
In a proposed brief filed in the U.S. District Court for the Northern District of California on Friday afternoon, counsel for Encode said that converting OpenAI into a for-profit enterprise would “undermine” the company’s mission of “developing and deploying… transformative technology in a way that is safe and beneficial to the public.”
“OpenAI and its CEO, Sam Altman, claim to be developing society-transforming technology, and those claims should be taken seriously,” the brief reads. “If the world truly is on the cusp of a new age of artificial general intelligence (AGI), then the public has a profound interest in having that technology controlled by a public charity legally bound to prioritize safety and the public benefit rather than an organization focused on generating financial returns for a few privileged investors.”
In a statement, Sneha Revanur, Encode’s founder and president, accused OpenAI of “internalizing the profits [of AI] but externalizing the consequences to all of humanity,” adding that “[t]he courts must intervene to ensure AI development serves the public interest.”
The Encode brief received support from Geoffrey Hinton, an AI pioneer and 2024 Nobel laureate, and Stuart Russell, a professor of computer science at UC Berkeley and director of the Center for Human-Compatible AI.
“OpenAI was founded as a nonprofit with an explicit focus on safety and made a variety of safety-related promises in its charter,” Hinton said in a press release. “It receives many tax and other benefits from its nonprofit status. Allowing it to tear all of that up when it becomes inconvenient sends a very bad message to other players in the ecosystem.”
OpenAI was launched in 2015 as a nonprofit research lab. But as its experiments became increasingly capital-intensive, it created its current structure, which accommodates outside investments from venture capital firms and corporations, including Microsoft.
Today, OpenAI has a hybrid structure: a for-profit side controlled by a nonprofit, with a “capped profit” share for investors and employees. But in a blog post this morning, the company said it plans to begin converting its existing for-profit arm into a Delaware public benefit corporation (PBC), with ordinary shares of stock and the OpenAI mission as its public benefit interest.
The nonprofit OpenAI will remain in existence but will cede control in exchange for shares in the PBC.
Musk, an early backer of the original nonprofit entity, filed suit in November seeking an injunction to halt the proposed change, which has long been in the works. He accused OpenAI of abandoning its original philanthropic mission of making the fruits of its AI research available to everyone, and of depriving rivals, including his own AI startup, xAI, of capital through anticompetitive practices.
OpenAI has called Musk’s complaints “baseless” and simply a case of sour grapes.
Meta, Facebook’s parent company and an OpenAI rival in AI, is also supporting efforts to block OpenAI’s conversion. In December, Meta sent a letter to California Attorney General Rob Bonta, arguing that allowing the shift would have “seismic implications for Silicon Valley.”
Encode’s counsel said that OpenAI’s plan to transfer control of its operations to a PBC would “convert an organization bound by law to ensure the safety of advanced AI into one bound by law to ‘balance’ its consideration of any public benefit against ‘the pecuniary interests of [its] stockholders.’”
Encode’s counsel notes in the brief, for example, that the nonprofit OpenAI has committed to stop competing with any “value-aligned, safety-conscious project” that comes close to building AGI before it does, but that OpenAI as a for-profit would have less (if any) incentive to do so. The brief also notes that the nonprofit OpenAI’s board will no longer be able to cancel investors’ equity if needed for safety once the company’s restructuring is complete.
OpenAI continues to experience an outflow of high-level talent, due in part to concerns that the company is prioritizing commercial products over safety. One former employee, Miles Brundage, a longtime policy researcher who left OpenAI in October, aired such concerns in a series of posts on X.
“OpenAI’s touted fiduciary duty to humanity would evaporate, as Delaware law is clear that the directors of a PBC owe no duty to the public at all,” Encode’s brief continued. “The public interest would be harmed by a safety-focused, mission-constrained nonprofit ceding control over something so transformative at any price to a for-profit enterprise with no enforceable commitment to safety.”
Encode, which Revanur founded in July 2020, describes itself as a network of volunteers focused on ensuring that the voices of younger generations are heard in conversations about AI’s impacts. Encode has contributed to various AI-related pieces of federal and state legislation in addition to SB 1047, including the White House’s AI Bill of Rights and President Joe Biden’s executive order on AI.
Updated Dec. 30, 10:10 a.m. PT with statements from Revanur and Hinton.