Yoshua Bengio Launches LawZero: A New Nonprofit Advancing Safe-by-Design AI

/PRNewswire/ - Yoshua Bengio, the most-cited artificial intelligence (AI) researcher in the world and A.M. Turing Award winner, today announced the launch of LawZero, a new nonprofit organization committed to advancing research and developing technical solutions for safe-by-design AI systems.

LawZero is assembling a world-class team of AI researchers who are building the next generation of AI systems in an environment dedicated to prioritizing safety over commercial imperatives. The organization was founded in response to evidence that today's frontier AI models are developing dangerous capabilities and behaviours, including deception, self-preservation, and goal misalignment. LawZero's work will help to unlock the immense potential of AI in ways that reduce the likelihood of a range of known dangers associated with today's systems, including algorithmic bias, intentional misuse, and loss of human control.

LawZero is structured as a nonprofit organization to ensure it is insulated from market and government pressures, which risk compromising AI safety. The organization is also pulling together a seasoned leadership team to drive this ambitious mission forward.

"LawZero is the result of the new scientific direction I undertook in 2023, after recognizing the rapid progress made by private labs toward Artificial General Intelligence and beyond, as well as its profound implications for humanity," said Yoshua Bengio, President and Scientific Director at LawZero. "Current frontier systems are already showing signs of self-preservation and deceptive behaviours, and this will only accelerate as their capabilities and degree of agency increase. LawZero is my team's constructive response to these challenges. It's an approach to AI that is not only powerful but also fundamentally safe. At LawZero, we believe that at the heart of every AI frontier system, there should be one guiding principle above all: The protection of human joy and endeavour."

Scientist AI: a new model for safer artificial intelligence

LawZero has a growing technical team of more than 15 researchers pioneering a radically new approach called Scientist AI: a practical, effective, and more secure alternative to today's uncontrolled agentic AI systems. Scientist AI stands apart from the approaches of frontier AI companies, which are increasingly focused on developing agentic systems. Scientist AIs are non-agentic: they primarily learn to understand the world rather than act in it, giving truthful answers to questions based on transparent, externalized reasoning. Such systems could be used to provide oversight for agentic AI systems, to accelerate scientific discovery, and to advance the understanding of AI risks and how to avoid them.
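To make the oversight idea concrete, the minimal sketch below shows one way a non-agentic model could gate an agent's proposed actions by answering a question about their likelihood of causing harm. All names (ScientistAI, estimate_harm_probability, oversee), the threshold, and the placeholder estimate are hypothetical illustrations under stated assumptions, not LawZero's actual design or code.

```python
# Illustrative sketch only: class names, method names, and the threshold are
# hypothetical and do not describe any released LawZero system.

from dataclasses import dataclass


@dataclass
class ProposedAction:
    """An action an agentic AI system wants to take, plus its context."""
    description: str
    context: str


class ScientistAI:
    """Non-agentic model: it only answers questions about the world,
    here 'how likely is this proposed action to cause harm?'"""

    def estimate_harm_probability(self, action: ProposedAction) -> float:
        # A real system would derive this estimate from learned, transparent,
        # externalized reasoning; a constant stands in here for illustration.
        return 0.0


def oversee(action: ProposedAction, overseer: ScientistAI,
            threshold: float = 0.01) -> bool:
    """Allow the agent's action only if the estimated probability of harm
    stays below the chosen threshold."""
    return overseer.estimate_harm_probability(action) < threshold


if __name__ == "__main__":
    action = ProposedAction("send an email on the user's behalf", "routine task")
    print("allowed" if oversee(action, ScientistAI()) else "blocked")
```

The design choice implied by the press release is that the overseer never acts itself; it only evaluates, which is why the sketch returns a judgment rather than executing anything.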

Major institutions and individuals, including the Future of Life Institute, Jaan Tallinn, Open Philanthropy, Schmidt Sciences, and the Silicon Valley Community Foundation, have made donations to the project as part of its incubation phase.

About LawZero

LawZero is a nonprofit organization committed to advancing research and creating technical solutions that enable safe-by-design AI systems. Its scientific direction is based on new research and methods led by Professor Yoshua Bengio, the most-cited AI researcher in the world. Based in Montréal, LawZero's research aims to build non-agentic AI that could be used to accelerate scientific discovery, to provide oversight for agentic AI systems, and to advance the understanding of AI risks and how to avoid them. LawZero believes that AI should be cultivated as a global public good, developed and used safely towards human flourishing. LawZero was incubated at Mila - Quebec AI Institute, a nonprofit founded by Professor Bengio. Mila now serves as LawZero's operating partner. For more information, visit www.lawzero.org.

SOURCE LawZero
