SB-1047: Governor Gavin Newsom’s Ambivalent Approach To AI Regulations


It is an undisputed fact that California has a mystical allure that makes it one of the most populous states in the United States (US). In a search for factors behind California’s growth into a global economic powerhouse, one might point to ‘Hollywood’ as a major contributor; however, that would be an oversimplification. California’s avant-garde status is primarily attributable to ‘Silicon Valley’, the global hub for technology and innovation. California has been the provenance of some of the most influential tech companies in the world, such as Apple, Alphabet, Meta, Nvidia and OpenAI, to name a few. It would not be fallacious to accede to the notion that California, home to all the major players in the realm of artificial intelligence (AI), is destined to be the battlefield where AI bloodshed is imminent in the coming years.

If you have had an opportune moment to gaze into the world of Frank Herbert’s novel ‘Dune’, you may be familiar with the ‘Butlerian Jihad’, a massive, galaxy-wide conflict between humans and AI machines that led to the destruction of nearly all advanced AI as well as computers. Leaving the extravagant sci-fi references aside, California is indeed in the midst of a battle royale in the AI space with the introduction of Senate Bill 1047 (SB-1047), which is slated to be one of the most stringent pieces of legislation on AI safety.

What makes SB-1047 all the more problematic and frustrating for the Governor of California, Gavin Newsom, is the timing of its introduction and passage through both houses of the state legislature in late August 2024.

Considering California’s propensity towards the Democratic Party, it would be very unusual for Governor Newsom to muster even an ounce of hesitation before making SB-1047 a reality with the stroke of a pen. In a surprising turn of events, however, Governor Newsom finds himself in a rather difficult position and is biding his time before acting on SB-1047. One might wonder where the hesitation is coming from. The answer is not very exciting and is simply attributable to tactful politics: there is a looming presidential election to worry about, which will have a bearing on Governor Newsom’s own ambition of running for the highest office in the US in the near future. Another pain point for Governor Newsom is that while institutions like the Screen Actors Guild – American Federation of Television and Radio Artists (SAG-AFTRA), the National Organization for Women (NOW) and other motivated and influential individuals are pressuring him to sign SB-1047, Silicon Valley, on the other hand, is completely opposed to the notion.

The issue is further exacerbated by the fact that a major chunk of the Democratic Party’s funding comes from tech executives in Silicon Valley, and in California these executives are deeply entrenched in state politics. Furthermore, in recent years there has clearly been an ideological discrepancy between Silicon Valley and the Democratic Party when it comes to AI innovation, since the growing outcry over the dangers of AI is pushing the latter towards favouring regulation, even at the risk of stifling innovation.

The first iteration of SB-1047 required developers to conduct safety tests and third-party evaluations on their AI models. The bill went further, requiring developers to build in a ‘kill switch’ to shut down rogue models and to report safety incidents to a proposed ‘regulatory agency’. Furthermore, the attorney general was empowered to sue developers for any safety failures, with significantly high penalties. Any endeavour to conceal such failures also attracted severe penalties, including prison time in exceptional cases. After facing considerable criticism over how stringent the first iteration was, the bill underwent many changes, culminating in a second draft of SB-1047 that was significantly milder, yet still impactful enough for the committee to pass it on August 15, 2024. The changes included scrapping the proposed ‘regulatory agency’, and the attorney general can no longer sue developers for safety failures.

Additionally, instead of going through heavy compliance requirements to procure safety certifications, developers were offered the more flexible option of issuing public statements about their safety practices. Lastly, startups and entities spending less than $10 million on fine-tuning their models were excluded from the ambit of the bill, to protect smaller startups and open-source developers.

While the current SB-1047 may seem watered down to staunch opponents of AI, it is still one of the most significant and impactful developments in AI regulation to date, barring the EU’s AI Act. It seems that the lawmakers are convinced that the benefits of passing SB-1047 are likely to outweigh its costs, and that it is the next logical step towards bringing harmony to the perturbing ‘AI innovation / AI regulation dichotomy’ which has left even the smartest and most influential lawmakers around the world scratching their heads in utter indecision. With the bill already passed in both houses by an overwhelming majority, the ball is now in Governor Newsom’s court: the deadline for him to sign or veto SB-1047 is September 30, 2024. Whether SB-1047 will become a reality in California, and what the consequences thereof will be, we will find out by September 30, 2024.

Author: Amartya Mody

