California Gov. Newsom Mandates AI Harm Assessment in Government Contracts

Published 1 hour ago · 2 minute read
By Uche Emeka

California Governor Gavin Newsom has signed a new executive order establishing a framework for the state's engagement with artificial intelligence, particularly in state government operations. The order comes on the heels of a notable dispute in which the federal government labeled San Francisco-based AI company Anthropic a supply-chain risk. Under Newsom's directive, California will now independently review such federal designations and make its own determination about whether to do business with the affected entity. The broader objective of the order is twofold: to implement essential guardrails on state employees' use of AI while simultaneously fostering and accelerating their adoption of the technology.

The incident involving Anthropic stemmed from a disagreement with the Department of Defense over contract terms. Anthropic sought to bar the military from using its systems for domestic mass surveillance or fully autonomous weaponry. In response, the Department of Defense designated Anthropic a supply-chain risk, effectively barring the startup from competing for certain military contracts and subcontracts. A judge has since issued a temporary injunction blocking the designation.

Highlighting California's prominent role in the AI landscape—the state is home to many of the world's largest AI companies and leads the nation in AI regulation—the executive order outlines several critical mandates for state agencies. These include developing recommendations for state contract standards concerning AI's potential to generate child sexual abuse material, violate civil liberties and civil rights laws, or infringe upon legal protections against unlawful discrimination, detention, and surveillance. Agencies are also tasked with facilitating employee access to
