Few people can force a change of governance at OpenAI, the crisis-stricken artificial-intelligence company, and the head of Microsoft, a major financial backer, is not one of them, according to legal experts.

The nonprofit board overseeing the maker of the popular ChatGPT chatbot sent shock waves through Silicon Valley on Friday by abruptly firing Chief Executive Sam Altman. Nearly all of the company's 700 employees have signed a letter threatening to resign if the board does not step down, and Microsoft CEO Satya Nadella has called for governance changes.

The turmoil has left investors weighing their legal options and has illustrated a divide over how the potentially disruptive technology can be developed safely.

Because OpenAI is a nonprofit, the only people who could force its current board to step down or change are judges or state attorneys general, said Alexander Reid, an attorney at BakerHostetler who counsels nonprofit organizations.

Attorneys general oversee and investigate nonprofits and have wide latitude to seek reforms.

"Even if they don't go to court, their mere presence typically gets results," he said.

Attorneys general can compel everything from leadership changes to the complete shutdown of an organization, usually after finding fraud or illegal conflicts of interest.

Hershey Co is one example. The trust that controls the candymaker agreed in 2016 to replace certain board members after the Pennsylvania attorney general challenged the trust's spending.

Darryll Jones, a law professor at Florida A&M University, said the U.S. Internal Revenue Service is another source of accountability.

"There is a whole boatload of scholarship noting that nonprofit enforcement is severely lacking, but for the most part nonprofits are pretty good at self-policing if only to avoid scandal that would impact donations," he said.

OpenAI's for-profit arm is under the full control of a nonprofit, an arrangement meant to insulate decisions about a potentially powerful technology from being driven by corporate greed.

Because of that, investors who have collectively plowed billions of dollars into the startup face hurdles to suing the board over Altman's firing, though sources have told Reuters some are considering legal action.

Under OpenAI's bylaws, only directors can remove or elect new board members. The arrangement, known as a self-perpetuating board, is very common in the nonprofit world, Reid said.

There are currently four people on the board: three independent directors and OpenAI chief scientist Ilya Sutskever. Sutskever worked with the other board members to remove Altman and former President Greg Brockman, but has since said he "deeply regret(s)" the action.

Outside of government enforcers, Sutskever may now be the only person in a position to formally challenge the board's decision.

Board members can sue other board members, either directly or on behalf of the organization, for failing to exercise their duties, Reid said.

But such court battles are typically fought only when there is suspected malfeasance connected to spending or compensation, he said.

In fights over organizational direction or control, the more common play is for the organization to split.

"You just form another nonprofit that does it slightly differently," he said.

OpenAI has already survived one such break.

The cofounders of Anthropic were OpenAI executives until 2020, when they broke from their employer over disagreements about how to ensure AI's safe development and governance.

Whether OpenAI survives the rift between its board and its employees will likely be determined in the next few days.

(Reporting by Jody Godoy in New York; Editing by Tom Hals and Matthew Lewis)