AI may regret imitating Wall Street's regulatory resistance

There are two ways to impose rules on an industry: before it breaks something, or after. Artificial intelligence takes the latter route.

California Gov. Gavin Newsom vetoed a bill last weekend that would have imposed relatively lenient standards on AI models like OpenAI's ChatGPT or Google's Gemini, requiring their developers to take reasonable precautions to avoid catastrophic harm.

The measures – including publishing a safety framework and testing new models for their potential to cause devastating harm – were hardly a bureaucratic nightmare. They would probably have applied only to AI models larger than any that exist today. Most large firms already say they prioritize safety. Moreover, the proposed standards had already been significantly watered down.

Still, some of the biggest names in tech, including venture capital firms Andreessen Horowitz and Y Combinator, cried foul. Some argued that rules should come from Washington – unlikely in a divided Congress – or should instead be based on evidence of actual harm from AI, which remains largely theoretical.

There are parallels here with another industry that has threatened to wreck civilization: banking. Major lenders like JPMorgan and Goldman Sachs spend a great deal of time and money fighting plans to keep them in check. They, too, recently won a victory, forcing the Federal Reserve to water down proposed rules that would have made them hold more capital.

Regulatory objectors across industries tend to make similar arguments, claiming that hasty rules stifle innovation and squash upstarts. Both Wall Street and Silicon Valley worry that heavy-handed bureaucracy will prevent them from doing good. Banks would be unable to offer Americans mortgages; AI companies would pull back from building models that developers can freely tinker with.

It's true that AI regulation could be better designed. Newsom complained that the bill's size threshold would spare smaller firms that might still cause major harm. That is what happened in banking, when the mid-sized and lightly regulated Silicon Valley Bank collapsed chaotically in 2023. Still, it is healthier to set the bar high and correct it later. When it comes to crisis prevention, the perfect is the enemy of the good.

The minds behind AI and their backers would do well to pay closer attention to the Wall Street analogy. For one thing, the hated rules have only made big banks stronger. Despite strict oversight, U.S. institutions are prized because customers consider them safer. Meanwhile, innovation in finance continues apace.

Furthermore, regulation born of an unexpected crisis can be very burdensome indeed. Just consider the strict Dodd-Frank rules drawn up after the financial trauma of 2008, which still curb the animal spirits of bankers today. In time, the AI overlords who defeated California's bill may wish they had been less successful.
