Incentivizing Urgency, Speed, and Scale to Support Future U.S. Innovation
Convened October 15–16, 2024
The funding and incentivizing model for research and development (R&D) has evolved rapidly within the past decade, with philanthropy playing an increasingly significant role alongside government, university, and industry partnerships to accelerate discovery and innovation. Traditional academic research processes are inadequate to facilitate a rapid transition to real-world applications and markets—and growing national security concerns have affected how universities participate in the innovation ecosystem. On October 15–16, 2024, members and guests of the National Academies of Sciences, Engineering, and Medicine's Government-University-Industry-Philanthropy Research Roundtable (GUIPRR) convened in Washington, DC, to consider these and related questions, including how collaborations across sectors are shaped by research environments and culture, how to overcome barriers, and how to incentivize risk-taking to address urgent societal technology challenges.1
GUIPRR director Michael Nestor (the National Academies) began by acknowledging philanthropic organizations' growing role in U.S. science and technology (S&T) since the creation of the Roundtable 40 years ago, as reflected in the Roundtable's new name and mission statement.2 In video remarks, Harvey Fineberg (Gordon and Betty Moore Foundation) and GUIPRR council member France Córdova (Science Philanthropy Alliance) welcomed the opportunity to bring philanthropy's attributes—including speed and flexibility—to work with government, industry, and universities to advance science through the Roundtable. GUIPRR co-chair Danielle Merfeld (QCells) noted other changes over the Roundtable's past four decades, including that S&T has become more global, involves diverse institutions, and that the U.S. federal government no longer solely drives the domestic S&T agenda; it does so with increased input from private industry, philanthropy, and universities.
The first panel examined the challenges that universities face in managing security risks as they grow international collaborations, import global talent, and develop models of externalizing talent beyond their campuses. Judi Brown Clarke (Stony Brook University) facilitated a discussion with Tam Dao (Rice University), Nadya Bliss (Arizona State University), and David Brown (University of Texas at San Antonio).
Dao noted the different perspectives he brings from experience in law enforcement, including the Federal Bureau of Investigation, and academia. Bliss offered four observations from her work in closed and open environments to seed the discussion: first, universities play a vital role in both research and national security; second, security requirements operate with nuance; third, emerging government policies require new infrastructure and processes that burden institutions, especially those with fewer resources; and fourth, guidelines are often implemented inconsistently and disincentivize innovative people from participating. Brown, who previously worked in highly secure settings, said universities shape the “first assets of national security,” i.e., students and faculty, to develop the workforce of the future. Brown stated that industry and national labs have advantages by conducting research in secure environments, but those security requirements restrict the talent pool and can slow the accumulation of knowledge. He suggested finding thoughtful security measures that welcome university researchers so their knowledge results in the highest value to the nation.
Brown Clarke asked how recent White House Office of Science and Technology Policy (OSTP) guidelines may reshape institutions' approaches to partnerships with international collaborators.3 Dao commented that the guidelines serve as a good starting point but can be vague and inconsistent in their interpretation across agencies, as well as when the guidance trickles down to the university and then to his research security office. Guidance from his office to faculty greatly depends on the faculty member's funding source. They try to provide the most accurate and current information so faculty can decide whether to pursue a collaboration. The most challenging cases are faculty with multiple funders or those who may be considering funding from the Department of Defense in the future.
Bliss cited inconsistent Controlled Unclassified Information (CUI) requirements among federal agencies as an example of how security concerns impact the landscape of innovation. She has, for example, seen faculty pursue a project, realize it has CUI requirements, and then pull back. This has a disproportionate impact on junior, pre-tenure researchers, who cannot afford to start down a research track that may ultimately encounter restrictions. Security requirements are necessary, she clarified, but there needs to be consistency and a centralized infrastructure to remove the burden from individual faculty members. Her institution and other R1 institutions have the resources to develop this infrastructure, but she suggested that the burden of putting it in place is higher for non-R1 institutions. Better guidance for students who are pursuing a research area that might fall under export control is needed, Dao noted. For example, a professor may be averse to working with a student from a “country of concern” on what could otherwise be innovative, groundbreaking research. Bliss agreed, emphasizing that the large percentage of international students across U.S. institutions makes this issue important to address. Non-U.S. citizens become professors who cannot pursue some grants, so it is a longer-tail issue, she added.
Brown stressed the importance of trust among partners as a key characteristic of international collaborations. That means going into relationships with “eyes fully open.” Industry and secure lab settings establish clear separations between those who have clearance and those who do not, but universities have more fluid environments. He invited agency representatives to speak with his faculty and students to educate them directly on security sensitivities and “danger points” to avoid. International collaborations are an added complication of research, and there is not as much awareness within universities about the intricacies as there is within industry and government. Offshoring research occurs all the time, but he warned that seeming allies can have “connections to connections to connections” with potentially problematic countries. “Trust but verify,” he urged. Brown Clarke noted that mistakes in compliance with regulations can have reputational costs to an institution. She emphasized the importance of ongoing knowledge-building and the sharing of best practices amongst universities as they navigate international collaborations in an evolving landscape.
Dao identified National Security Presidential Memorandum (NSPM)-33 and OSTP's Guidelines for Research Security Programs at Covered Institutions as policy guidance that greatly affect research, but noted there are many others from the White House and funding agencies.4 He observed that current research security policy often leans more toward reaction than science-based assessment, leading to potential overreach without clear understanding of whether the policies effectively mitigate risks. For example, how reliably do policies identify the bad and good actors? He called for applying science to understand the nature and scope of the problem. The National Science Foundation (NSF) has been at the forefront, supporting a workshop at Rice about research on research security.5 “Long-term, we'll be in the same position we are now if we don't apply science to what we are doing,” Dao said.
Bliss commented that perceptions of bad actors change over time, and there is now a “diversity of villains.” Partnering with universities while in a classified environment requires guidance from people who understand both settings, as well as qualified professionals, to figure out the separation and research exemptions. She suggested current policies may lead to a degree of over-correction. Experts are needed to advise on the gray areas, rather than leaving professors to navigate on their own. International collaborations on fundamental, non-sensitive research should not be entirely shut down for people without clearances, she said. Brown agreed that universities have an important role in the continuity of research through their collaborations.
To foster an environment of innovation, Bliss reiterated, university leadership needs to create not just security and physical infrastructures, but other types as well. ASU, for example, has organizational infrastructure for interdisciplinary and mission-driven research. Faculty can collaborate within these large institutes and labs, in addition to their academic units, to solve big problems. Centralized, secure information management systems within universities are needed for successful research security, and clear communication around security policies will enhance faculty engagement and compliance.
Brown said the decision to commit to secure research incurs friction points and transactional costs—and that if an institution is caught breaking rules, its research portfolio and reputation can be severely damaged. There is a misperception, he added, that having a more secure environment stymies innovation. Many of the most innovative companies in the world are very secretive, even if not classified.
Asked about how to manage the friction between sectors, Brown pointed to a mismatch in the speed and scale of innovation. Industry needs something in a hurry but misses opportunities when they are not aware of relevant university research. In the opposite case, universities might trumpet a piece of research because they are not aware that it is already well underway within industry. Industry and government are well informed in their own lanes, and universities are not always clued in, he said. Talking directly with each other with the same level of urgency can move things further along.
Dao commented on stakeholders' different missions. For example, law enforcement agencies may not want scientists to engage with international researchers if those individuals have links to their country's military, while a university may be less concerned. Brown encouraged a “centralized place” where universities, industry, and government can align on their respective goals and understand each other's primary challenges. Addressing communication barriers could create a more unified approach to securing sensitive research without impeding innovation. Bliss said that different missions for different entities are appropriate—and all those entities are necessary for a country's innovation system. Many times research has dual uses, but stopping research is not the answer.
Looking ahead, Brown reiterated his call for government to better convey rules and boundaries. Careful sharing of information should be a trend. He lauded the willingness of sectors to come together, rather than operate in separate stovepipes of communication.
Bliss agreed with the value of understanding each other's incentives and priorities. Research security typically is not a top priority of universities, she said, and access for international students is not typically the first priority for agencies. She called for groups such as GUIPRR to communicate across the gap and provide input on policies, so they are not overly burdensome to the research ecosystem.
Dao observed that concerns about foreign influence, shadow labs, and outside employment have grown since 2012. Security policies in academia are becoming more stringent, potentially leading to inconsistencies, frustration, and workforce attrition if the pendulum swings too far. It is better to have a balanced, consistent approach to regulation, avoiding extreme fluctuations that harm students and researchers, he said.
An audience member from a university countered that universities have already prioritized investments in research security. Professors do not undertake risk analyses themselves, he said, but the challenge is a lack of guidance and inconsistent policies from the government, not a failure of universities to invest. Bliss clarified her earlier remarks, stating that research security is not the mission of a university, but agreed that it is a priority. She called for universities to clearly articulate the stressor points that the policies place on them. She also suggested sharing information, for example about what a university needs to do to implement CUI infrastructure.
An audience member noted that the Office of the Director of National Intelligence published a toolkit on safeguarding science and asked for feedback.6 Dao said the toolkit is useful, as are other resources to navigate the complex environment, but help is still needed in communicating a consistent message to faculty. Bliss added that while variability in risk assessments will occur across agencies, basic principles about safeguarding science should be provided. Dao said faculty also have varying levels of risk tolerance.
A participant with experience in entrepreneurship expressed concern about limiting researchers and scientists. Science brings hope to society, she said, and it should have an impact and not just sit on a shelf. As a model, she suggested the International Thermonuclear Experimental Reactor program, which has members including China, India, Japan, Korea, Russia, and the United States working toward a common goal. Another audience member suggested the NSF-funded SECURE (Safeguarding the Entire Community of the Research Ecosystem) Center as a resource and asked panelists what factors SECURE should consider in its rollout.7 Bliss underscored development of consistent, practical guidelines for universities, ideally with scoped-out budgets and resource needs. Dao cited a survey they conducted when preparing to host a SECURE meeting for the southern region: respondents said they needed training, risk assessments, and tools to assess international collaborations.
A participant returned to the idea that a research misstep can have consequences for the whole university. She asked the panelists how they successfully connect and communicate the nuances with their faculty. Dao said his office of three people is approached for 40–50 risk assessments per week, evidence that researchers want information and guidance. They give their best assessment of a particular collaboration, and then the faculty member weighs the risk and makes the decision. Rice's philosophy is to “collaborate until it hurts” internationally but to plan for mitigation in case the relationship sours. Brown suggested that universities can also learn lessons in international collaboration from the Department of Energy national labs.
GUIPRR co-chair Darryll Pines (University of Maryland) commented that education is needed to really understand that research is a shared resource. He added that streamlined research security protocols are important for the country, in addition to easing the burden at universities. Pines pointed out that in the last 20 years, compliance legislation and policies have increased 20-fold. No university has enough resources to manage this level of compliance to unfunded mandates. One idea could be a system in which universities are pre-qualified, akin to Global Entry for international travel, and he invited other ideas. “If we don't get this right, we will fall behind,” he said.
GUIPRR co-chair Darryll Pines (University of Maryland) reflected on the meeting's theme of how to incentivize urgency, speed, and scale to support innovation. One well-known model to do that has been the Defense Advanced Research Projects Agency (DARPA). Pines engaged in conversation with DARPA Director Stefanie Tompkins about DARPA's mission and how DARPA-inspired thinking can be applied more broadly to speed up innovation.
DARPA was created in response to the “massive strategic technological surprise” of the Soviet Union's Sputnik in 1957, Tompkins explained. Part of achieving DARPA's mission is for it to prevent technological surprises by instead creating them. DARPA makes pivotal investments to accelerate technologies “and then we get out of the way,” she said. They support short, focused programs, and if metrics are not met, resources are diverted elsewhere. In any given year, 40 programs may be started, and many end, although often a “DARPA failure” still succeeds when other entities move the technology forward. “We are looking for complete disruption,” she stressed. “The goal is DARPA-scale impact, which is something that will change the world.” Successes, which may happen only once or twice a decade, include ARPANET (precursor to the internet), self-driving cars, mRNA vaccines, GPS technology, and drones.
When Tompkins returned to DARPA in 2021 as director after several years away, she found it had become slower and less risk tolerant. Previously, DARPA program managers had been expected to propose a new program within three months of starting their four-year assignment, but that expectation had stretched to a year or longer. A review of 200 projects showed that all were meeting their metrics, but she found the metrics were too conservative for the DARPA high-risk model. Her goal has been to make people “less comfortable” to get DARPA back to its roots.
DARPA's value is in looking into the future, Tompkins underscored. For example, many years ago, the agency made investments to develop artificial blood for the battlefield. The program “failed” in the sense that the product was technically possible but too expensive for regular use. Then, a new program manager asked an innovative question: if artificial blood cannot be used for the battlefield, what can it be used for? A series of workshops stemming from this question led to DARPA's investments in Moderna and the mRNA technology behind its COVID-19 vaccine.
Elite staff are DARPA's great asset, Tompkins said. Program managers have a sense of mission, and their short windows of time on staff create a risk-taking environment. The entire culture is based on outcomes. Support staff (not bound by the four-year limit) take pride in their creativity, and they get rewarded when they help enable breakthroughs. However, she cautioned that DARPA may be missing out on good ideas that come from outside the organization. One way to open new opportunities is through DARPAConnect, which provides resources, coaching, and advice on how to get involved with DARPA.8 She also noted a DARPA-like disruptive model should be only one element in the S&T ecosystem, although most problem spaces can benefit from ARPA-type thinking. DARPA fosters mutually beneficial relationships with newer “ARPA-like” entities across other fields (e.g., ARPA-E for energy, ARPA-H for health). For example, DARPA has incorporated ARPA-E's commercialization strategies, recognizing the importance of bringing innovations directly to the market.
When asked about her legacy as director, she highlighted her role in ensuring DARPA remained a high-risk, fast-moving organization. She also expressed the hope that 30 years from now, some of the current projects that seem implausible today will be inevitable and obvious.
Tompkins praised the creativity and capacity of people who are passionately committed to doing the right thing. Unlocking barriers can lead to greater accomplishments, she said. She found it difficult to say what DARPA's “next big thing” is with 250 to 300 active programs, but solving supply chain issues could help not only the military but also other areas, such as rural economies. Technology does not solve social issues but can help remove friction to allow conversations necessary to tackle underlying problems, she suggested.

A participant observed the public often sees potentially dire consequences to high-risk technologies. Tompkins recognized this concern and described an agency-wide commitment to the ethical, legal, and societal implications (ELSI) of emerging technologies, which she noted came from recommendations of a National Academies study.9 To ensure its technologies positively impact society, DARPA has integrated expertise from ethicists, philosophers, and behavioral scientists so that ethical evaluation is built into its rapid innovation process. When unsure how to handle these issues, program managers can consult with these experts. DARPA may not get things right all the time, but the effort continues, she said. She also noted the value of identifying ELSI-related issues early to inform policy makers. She suggested that university and commercial environments could use this model as they navigate ethical concerns in their research and innovation efforts. For example, universities could enlist students and professors in philosophy and other departments. To the perception that ELSI slows down a program, she responded, “if you didn't have brakes, you would drive very slow.” ELSI develops brakes so programs can move more quickly.
To further innovation in the United States, Tompkins suggested having better conversations between government and industry. Silicon Valley will not solve all problems, nor will entirely government-driven efforts, she said. Conversations can lead to incredible innovation to create a stronger ecosystem. DARPA continually evaluates itself, as does Congress, with different methods. Rather than measure how many programs were started or failed, she urged focusing on outcomes. “Fundamentally, it's about having impact at our core mission,” she said.
Merfeld noted the next session built on Pines's conversation with DARPA Director Tompkins (see above), examining attributes of DARPA-inspired organizations and the model's current status, evolution, and potential alternatives. Ivan Amato (Zuckerman Mind Brain Behavior Institute) facilitated a conversation with Benjamin Reinhardt (Speculative Technologies), Tamara Carleton (Innovation Leadership Group), John Paschkewitz (Boston Consulting Group), and Daniel Cunningham (ARPA-E).
When asked to share initial thoughts about the ARPA model, Reinhardt provocatively asked if ARPAs are even needed. “The answer depends on what we mean by the ARPA model,” he continued. It can take many forms. Some insist program managers should play a coordinating role aimed toward a particular end, while others insist the program manager should find talented people and “get out of the way.” Successful programs operate at each end of the spectrum, he added, although one constant is an empowered program leader. Reinhardt said he agreed with the overall conclusions of a 2023 article titled “No, We Don't Need Another ARPA,” co-authored by Paschkewitz,10 but for different reasons than outlined in the article. He cited changes in the structure of the awardee base and more stringent requirements for justifying funding within the federal government since DARPA's origins.
Carleton has followed DARPA and its “cousins” in the United States and globally for many years. It is important to distinguish between an ARPA model and mindset, she observed. As a model, DARPA sits in a specific context, with the Department of Defense as its primary customer—and a sizable budget. As a mindset, it has inspired variations. For example, ARPA-E applied the DARPA model but has focused on transformative breakthroughs in energy. In Europe, Mario Draghi (economist, politician, and author of the “Draghi Report” on competitiveness) has argued for an ARPA-like entity to support “moonshot boldness,” inspiring bigger and more cohesive thinking. Carleton encouraged celebration of all stakeholders willing to pursue radical innovation, though she noted that sometimes small steps and quick wins are needed.
Paschkewitz is a former DARPA program manager now in the private sector, which he said gives him an inside and outside perspective. He said he is often asked how to instill an ARPA spirit in other efforts. His experience is that attaining this goal is often constrained by market incentives. He described his post-DARPA work with companies that make consumer products and have committed to decarbonize their supply chains, noting there were no ARPA models available for these problems—so different approaches to incentivize big leaps in technology that break current tradeoffs are therefore required.
Cunningham said he believes ARPAs need to tailor their approach to fit the unique demands of their respective sectors. ARPA-E is built on the same core as DARPA in terms of empowered program leaders but focuses on transformational technologies in the energy space. The area is difficult because it is dominated by low-cost and high-volume technologies, and solutions must be able to scale. ARPA-E has a technology-to-market commercialization team that supports ARPA-E awardees, especially those without prior commercialization experience, in scaling technology. ARPA-E does not have a single road map but rather relies on bringing teams and communities together to accelerate R&D.
Amato, who previously worked in a communications capacity at DARPA, asked the participants for their perspectives on the importance of a government agency like DARPA, what its metrics of success should be, and how its success translates or changes when looking at other ARPA-inspired organizations.
Cunningham said each ARPA differentiates itself by the kinds of innovation and transformation it pursues, and each should identify opportunities in its own area. Programs should aim to solve a problem rather than focus on a particular technology, which differs from traditional research approaches. For example, ARPA-E seeks long-duration energy storage to stabilize the power grid, which has resulted in a portfolio of technologies to meet ARPA-defined metrics.
Carleton stressed that ARPAs create the opportunity to “wander and ask hard questions.” Most organizations and companies must be very focused, but widening to frame problems before finding solutions is valuable. Some of the journeys take a long time. One important element, although hard to measure, is the ability to provoke larger dialogues and see potential new fields. ARPA-H, for example, started with “what if?” as a technique of discovery. Forums in the medical and health community posed different what-if questions. Some questions have been converted into ARPA-H programs.
Paschkewitz noted that there are different interpretations of an ARPA's purpose. While some emphasize breakthroughs in national security, others point to the creation of transformative technologies for the general public. Directors may use different, and sometimes contradictory, frameworks to define success. Reinhardt added that the true success of an ARPA is realized long-term, as its impact becomes clear in retrospect through the success of the technologies it incubates. However, he acknowledged that this timeframe poses challenges in establishing relevant measures of success in the short term.
Paschkewitz offered his inside- and outside-DARPA viewpoints on collaborating with ARPAs. From a DARPA-centric lens, finding program managers and awardees who want to do the bold work that the model requires is challenging. There are barriers to entry in working with DARPA, although he lauded current outreach efforts such as DARPAConnect.11 In addition, strategic planning for a program's end is often lacking, and many technologies are not commercialized. Industry often avoids a specific technology unless it sees an immediate commercial application. From an outside lens, he identified a “collective action problem,” in which no single company wants to take the risk of adopting higher-cost new technologies and being out-competed by cheaper rivals.
Reinhardt reflected that alignment of incentives among sectors determines success of an ARPA program. A good program manager finds that alignment to achieve the project's overarching goals. To this point, Cunningham said ARPA-E does outreach to learn if potential solutions really solve problems for stakeholders. One gap in the first 10 years of the agency was that technically successful projects were not breaking through into the marketplace. Canvassing the sector, they found the technologies needed to be “de-risked” to attract private investment. In 2019, SCALEUP (Seeding Critical Advances for Leading Energy Technologies with Untapped Potential) was created to help teams de-risk and validate their solutions to attract private investment.12
Carleton has examined how the ARPA model and spirit have influenced innovation ecosystems globally. Other countries have adapted the ARPA model, but in some cases, differences in local talent, resources, and cultural attitudes toward risk and failure can make it harder to implement bold, high-risk projects like those seen in the U.S. Both Carleton and Paschkewitz re-emphasized how critical good program managers are. The culture in the U.S. tends to be more entrepreneurial, while other cultures value large industry that attracts the top performers who would have been good program managers in the U.S., Cunningham added.
As a best practice, Reinhardt said program leaders should be empowered to do what they think is best and not have to justify everything to a committee. It can be deeply uncomfortable because people will make mistakes, which can be hard to justify especially to a bureaucracy or taxpayers. Carleton stressed the value of experimentation. DARPA is now a mature organization, and its average program manager is older and more experienced. She appreciated Tompkins's plans to reach out to new people and communities. Cunningham agreed that the freedom to learn and make mistakes is core.
A university-based audience member wondered how faculty could seek funding from an ARPA in an efficient way, given that program managers and priorities change. To Paschkewitz this challenge is intrinsic to the ARPA model. ARPA agencies need to make it easier to have an audience with program managers, but, he stressed, ARPA is not like the traditional model in which lasting relationships are built. The opportunity is fleeting. Cunningham suggested researchers study each ARPA's mission to determine the best fit.
Another participant asked how ARPA can inform the broader ecosystem. Carleton urged boldness, challenging all the stakeholders present to consider catalytic approaches to accelerate research with societal impact. Reinhardt suggested finding ways to reduce friction and reduce barriers. Carleton said this point gets at the psychology of any cultural transformation. As a simple tactic, she urged fostering a culture of openness that invites real exploration of underdeveloped ideas rather than immediately turning them down.
A participant observed that deployment of a technology can constitute a “second valley of death” and asked how ARPAs can support solutions that go beyond discovery, development, and delivery to actual deployment. Reinhardt said the program leaders should aim to create deployable technologies from Day 1. Cunningham suggested getting technology into the field to de-risk it. Carleton commented that this weakness exists not only in ARPAs but all innovations that must bridge to deployment.
When asked about any unintended consequences of more ARPAs versus more long-term research, Reinhardt said a healthy ecosystem should have both; he likened this to a combination of “strike teams and boiling cauldrons.” The bottom-up, long-term research is needed, as well as a strike team to push hard and fast for innovation and commercialization once it's ready. Carleton called for more consideration of the ARPA spirit for any team to ask hard questions and consider alternative scenarios.
The final panel, facilitated by Pips Veazey (University of Maine), focused on culture—that is, the shared values, beliefs, practices, and social norms of different entities. Panelists from the private, venture capital, and philanthropic sectors discussed how to design successful partnerships that consider and develop a shared understanding of each collaborator's overlapping and distinct culture. They included Eric Toone (Breakthrough Energy Ventures), Michal Preminger (Johnson & Johnson Innovation [J&J]), Khara Ramos (Dana Foundation), and Stacey Adam (Foundation for the National Institutes of Health [FNIH]).
In Toone's view, the challenge of a collaboration is how to take fundamental scientific knowledge, turn it into a technology, and deploy it meaningfully at scale to impact people's lives. He has seen this journey from multiple perspectives, as an academic researcher, company founder, university administrator, in government, and now investing on behalf of institutional investors. What's clear, he said, is “this is a team sport.” All entities must be involved from the get-go. People behave based on how they are incentivized, and understanding those incentives is important. Stepping outside one's own wants, needs, and desires to understand those of others is an important part of success.
Preminger has also circulated in the innovation ecosystem in different roles. She has seen different cultures and incentives and is curious to examine the set of components required for successful, disruptive products. Innovation starts with curiosity-driven fundamental knowledge, she said, agreeing with Reinhardt from the previous panel. During the R&D process, one must think about the incentives to move to the next phases. Ideally, all are working together from the creation of fundamental knowledge to the point when a product is brought to market. She has found the biggest companies are the biggest team players. In contrast, academic research is incentivized individually.
Ramos said she wanted to bring society into the discussion. The Dana Foundation supports programs in neuroscience and society, with the vision of "brain science for a better future." The pace of technology sometimes outpaces the speed of public dialogue, she pointed out. Partnering with Research!America, the foundation conducted a survey that found many people are affected by brain health conditions and believe brain health research is beneficial to society, but have concerns about the misuse of brain data, unwanted manipulation, or stigma from participating in brain science research. People have concerns that must be heeded, and partnering with the public is a key dimension of innovation. For example, COVID-19 vaccines are an incredible technology that saved millions of lives—but hundreds of thousands of people died because they refused vaccination or lacked access to it. Ramos quoted former U.S. Centers for Disease Control (CDC) Director Rochelle Walensky, who said, "Vaccines do not save lives, vaccinations do." She urged better connections between science and society throughout the R&D pipeline.
Adam explained that the FNIH and similar nonprofits allow federal agencies to become more involved in public-private partnerships. In the past two years, the FNIH has developed and built on its core values to unite partners across sectors; these values include equity, ingenuity, collaboration, and, especially, gratitude. Patients, companies, scientists, and other stakeholders help design projects, often with little or no pay, because they believe in the mission. Every partnership developed considers the question of incentives. No one model solves everything, and there is always a new way to "solve the puzzle." One example is Accelerating COVID-19 Therapeutic Interventions and Vaccines (ACTIV), which not only had a hand in creating the vaccines but also became the therapeutic testing arm for Operation Warp Speed.13
When asked how his organization makes investments, Toone said his perspective differs from that of the other panelists because of its emphasis on quantitative measures, including evaluation of his own performance. Investment decisions are based on the potential for making money and the timeline. As an academic, he commented, he thought poorly of venture capitalists; now he recognizes that capital is critical for bringing innovations to market. The name of the game is to have impact, whether from private or taxpayer funding. Deploying an innovation at scale that makes a difference takes billions of dollars, much of which comes from pensions and sovereign wealth funds. Investors' contributions thus affect everyday people, and those within the venture and investment communities should have empathy for their perspectives. The reality is that most innovations will not get to market, for a variety of reasons; this must be part of decision making and collaboration with investors and universities. The most important part of a successful collaboration is to manage expectations, understanding that most efforts will not work, he said.
Preminger harkened back to J&J's 1943 credo, which prioritizes patients and their families, health care providers, employees and their communities, and business partners. The last line says that if these are managed well, shareholders should make a fair profit. This credo still drives employees to create transformational products at her company and others, she said, although this mindset is not visible to the public. It translates into strategy in the search for the most transformational ideas to move the needle for patients. Every idea starts with great promise; then, over time, problems related to safety, cost, scale, effectiveness, or other characteristics are identified. J&J starts the journey to de-risk early with innovators through JLABS,14 an incubator to shape and tease out ideas. More than 1,000 small companies have gone through the program, and more than 83 percent are still active.
The Dana Foundation's Data Center Initiative (DCI) speaks to successful partnerships at different levels, Ramos said.15 Planning grants were awarded to recipients who then competed for a larger grant. Foundation staff spent a lot of time talking with planning grant teams and their universities' leadership to share the DCI vision as a new model for community neuroscience and training the next generation of interdisciplinary, socially minded scholars and practitioners. She shared two lessons about aligning incentives. First, you shouldn't just make a list and hope it aligns with your partners'; alignment requires in-person relationships built over time. Second, building values is an important part of alignment. Grant funding is assessed against the foundation's core values. As an example of human-centered design, UCLA and CDU held “idea salons” in South Los Angeles about community concerns that could be addressed by neuroscience. Community members brought up the noise from helicopter surveillance that affects their sleep. Neuroscientists worked with a UCLA anthropologist and environmental researcher to review helicopter traffic data. Noise pollution was modeled in the lab to show the impact of this noise on decision making, memory, and other functions in animal models. The last step is working with a community partner to reach policy makers about how to balance safety with neurobiological effects of helicopter surveillance.
Adam said that unlike the investigator-driven research at NIH, the FNIH works with a cross-section of stakeholders to address an unmet or underserved niche for communal benefit. Everyone comes with an innovation mindset. Transparency, level setting of expectations, trust, and a proven track record are all important. Over 26 years, 122 partnerships have been developed. Examples include the 10 Accelerating Medicines Partnerships (AMPs), which have 50:50 public-private funding.16 Other partnerships are entirely funded by the private sector to serve the NIH mission.
Ramos quoted management consultant Peter Drucker, who said "culture eats strategy for breakfast," to emphasize that cultural misalignment can be a point of friction between partners, even when they agree on the mission. Preminger described how large companies and early innovators have different sets of questions and concerns. Managing different timelines is an enormous issue, Toone said. He noted shortcomings of the venture capital model in bringing some technologies to market, such as large-scale energy technologies that take longer to mature than venture capital expects. The planet is awash in money, he stated, but there is a shortage of innovative financial models that can take technologies with longer-term horizons to market. New strategies are needed, he said.
When asked how COVID-19 affected FNIH's work, Adam cited positive and negative examples. On the positive side, it demonstrated what was possible: ACTIV came together in two weeks, in part because credibility had already been established. On the negative side, it revealed gaps in community health care engagement that they are now working to close. Preminger agreed that COVID-19 enabled excellent collaboration across companies, but there was a failure to educate the public about how much previous work was involved in bringing the vaccine to market.
When asked to suggest different funding models that might address the issue of timelines and incentives, Toone said a good example is a cost-share model in which the public sector pays down the risk so the private sector invests earlier. He also suggested that people involved in private equity and business schools help develop new models for investing that allow people to earn a return commensurate with risk. Another participant asked how universities and their spinouts should think about talking to venture capitalists. Toone reiterated the importance of understanding different perspectives and aligning incentives. He suggested involving venture capitalists in programs such as NSF Regional Innovation Engines,17 where they can see technologies early and technology developers can understand what venture investors are looking for. Preminger added that new entrepreneurs can learn from experienced founders, and deliberately pairing them together can speed innovation to market. She also noted the value of pairing business entrepreneurs with scientific or technical experts, which she commented is part of the Cambridge, Massachusetts culture.
Toone cautioned against trying to turn most faculty into entrepreneurs. Their graduate students are the people to encourage, Toone said. However, an audience member countered that graduate students are in vulnerable positions career-wise and not equipped to take on that level of risk, a perspective echoed by Adam. Toone also suggested programs where undergraduates can receive a foundation of entrepreneurship education to apply later in their education or careers.
Reflecting on the workshop, Pines highlighted conversations about the need for researchers and other scientists to better communicate the value of evidence-based research and ensure it is trusted. The session on balancing research security and innovation pointed to some best practices related to education, security measures, and models that foster complex collaborations and partnerships. He recalled participants' caution that “you have to go in with a fully eyes-open approach to partnerships,” that security measures must be understandable to university researchers, and that education is an evolving process to ensure faculty stay within the guard rails of conflict of interest and research security.
DARPA director Stefanie Tompkins spoke of the role of DARPA in the U.S. research enterprise and how it has achieved its mission with the Department of Defense services, Pines recalled. In considering DARPA and other ARPA models, he noted that panelists distilled a key element of success to “reduce friction.” Asking provocative questions, focusing on good performers, and empowering program leaders to do great things are also critical, with some improvement needed in identifying “the next player” in the ecosystem to make things happen.
Pines reiterated the importance of culture in large companies, foundations, start-ups, agencies, and other entities, and the notion that “culture eats strategy for breakfast.” There is no shortage of money, but it must be used correctly. The COVID-19 pandemic demonstrated that crisis can also create urgency to solve problems. Returning to his initial comment, he observed that the pandemic was not used to communicate the value of science, which suggests a direction for further discussion.
DISCLAIMER
This Proceedings of a Workshop—in Brief was prepared by Paula Whitacre as a factual summary of what occurred at the workshop. The statements made are those of the rapporteur or individual workshop participants and do not necessarily represent the views of all workshop participants; the planning committee; or the National Academies of Sciences, Engineering, and Medicine.
PLANNING COMMITTEE
Megan J. Anderson, IQT; Jane M. Homeyer, Office of the Director of National Intelligence; Ahmad M. Itani, University of Texas at El Paso
REVIEWERS
To ensure that it meets institutional standards for quality and objectivity, this Proceedings of a Workshop—in Brief was reviewed by Nahid Chalyavi, Agilent Technologies, and Julianne McCall, California Council on Science & Technology. Marilyn Baker, National Academies of Sciences, Engineering, and Medicine, served as the review coordinator.
STAFF
Michael Nestor, GUIPRR Director; Jennifer Griffiths, Senior Program Officer; Grace Ezyk, Program Officer; Sierra Jackson, Program Officer; Delaney Bond, Senior Program Assistant; Cyril Lee, Finance Business Partner
For additional information regarding the workshop, visit http://www.nas.edu/guiprr
National Academies of Sciences, Engineering, and Medicine
The National Academies provide independent, trustworthy advice that advances solutions to society's most complex challenges.
SPONSORS This workshop was supported by a contract between the National Academy of Sciences and the National Institutes of Health (HHSN263201800029I/75N98021F00017). Any opinions, findings, conclusions, or recommendations expressed in this publication do not necessarily reflect the views of any organization or agency that provided support for the project.
Suggested citation:
National Academies of Sciences, Engineering, and Medicine. 2025. Incentivizing Urgency, Speed, and Scale to Support Future U.S. Innovation: Proceedings of a Workshop—in Brief. Washington, DC: National Academies Press. https://doi.org/10.17226/29121.