Could AI potentially cause major epidemics or pandemics, and if so, when?
Government urged to enact rules preventing AI models from causing widespread harm.
AI models could potentially cause major epidemics or pandemics, warn experts researching advancements in artificial intelligence.
A paper published in Science by co-authors from Johns Hopkins University, Stanford University, and Fordham University claims that AI models can manipulate substantial biological data, accelerate drug and vaccine design, and enhance crop yields.
"As with any powerful new technology, biological models will also pose considerable risks. Because of their general-purpose nature, the same biological model that can design a benign viral vector to deliver gene therapy could also be used to design a more pathogenic virus capable of evading vaccine-induced immunity," the researchers wrote in their abstract.
"The paper emphasized the significance of voluntary commitments among developers to assess biological models' potential risks but noted that they are not enough on their own. Instead, the paper proposed that national governments, including the United States, enact legislation and establish mandatory rules to prevent advanced biological models from contributing to large-scale dangers, such as the creation of novel or enhanced pathogens capable of causing major epidemics or pandemics."
The paper's authors stated that although today's AI models do not significantly contribute to biological risks, the necessary components to create highly concerning advanced biological models may already exist or soon will, according to Time.
The authors reportedly recommend that governments establish a battery of tests for biological AI models to pass before they are made available to the public, after which appropriate access restrictions can be determined.
"Anita Cicero, the deputy director at the Johns Hopkins Center for Health Security and co-author of the paper, stated in Time that planning is necessary now to reduce risks associated with powerful tools. Government oversight and requirements will be required to achieve this goal."
Cicero reportedly stated that biological risks from AI models could become a reality "in less than 20 years" without proper oversight.
According to Paul Powers, an AI expert and CEO of Physna, AI has the potential to engineer pandemics, and it is advancing faster than most people are prepared for.
"While governments and large businesses have access to increasingly powerful capabilities, individuals and small businesses also have access to these capabilities. However, the problem with regulation is that it is enforced nationally, and regulation cannot keep up with the speed of AI."
Powers stated, "Their proposal is for the government to approve specific AI training models and applications. However, the challenge lies in enforcing compliance."
"Powers stated that certain nucleic acids are crucial for any potential pathogen or virus and advised starting with identifying who can access the building blocks first."