In just six hours, an AI proposed 40,000 candidate chemical weapons


Today, artificial intelligence (AI) technologies are used mainly for beneficial purposes. But what happens if they are turned to harmful ends? Researchers at Collaborations Pharmaceuticals, a company that develops new medicines, asked this question and were stunned by the answer. To find out, they ran an experiment with an AI built specifically to search for new drug candidates.

In the experiment, the researchers reconfigured the AI to search not for new medicines but for toxic compounds suitable as chemical weapons. In just six hours, the machine learning algorithm identified 40,000 candidate molecules.

“We had never thought about it before. We are well aware of the risks of working with pathogens and toxic substances, but this had never concerned us, since we work mostly virtually. Our work builds machine learning models of therapeutic and toxic properties to help design new molecules for drug discovery. We have spent decades using computers and AI to improve human health, not to harm it. We were rather naive about the potential misuse of our craft, since we have always sought to avoid molecules that might interact with the classes of proteins essential to human life,” the researchers reported.

Even their work with Ebola and neurotoxins, which might have raised concerns about possible misuse of their machine learning models, had not worried the scientists. They had no inkling of the harm that could actually be caused if their AI were used with malicious intent.

The experiment went as follows. The researchers routinely use computational machine learning models to predict toxicity and steer drug candidates away from it. All they had to do was invert that methodology: instead of discarding toxic compounds, the model was directed to select for them. The thought experiment thus became a computational proof of concept for designing a biochemical weapon.
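The inversion described above can be illustrated abstractly. The sketch below is not the researchers' actual system; it is a toy example under assumed names (`predict_activity`, `predict_toxicity` stand in for learned models) showing how flipping the sign of a single penalty term in a search objective changes which candidates a generative search selects.

```python
# Toy sketch (hypothetical names): a search scores candidates with a
# predicted-toxicity model. In normal drug discovery the objective
# penalizes toxicity; inverting the sign of that one term is the kind
# of change the researchers describe.
def objective(candidate, predict_activity, predict_toxicity, invert=False):
    """Score a candidate; the search keeps the highest-scoring one."""
    activity = predict_activity(candidate)   # desired effect
    toxicity = predict_toxicity(candidate)   # predicted harm
    sign = 1.0 if invert else -1.0           # invert=True rewards toxicity
    return activity + sign * toxicity

# Toy stand-ins for learned prediction models:
candidates = ["A", "B", "C"]
act = {"A": 0.9, "B": 0.5, "C": 0.2}
tox = {"A": 0.1, "B": 0.8, "C": 0.9}

normal = max(candidates, key=lambda c: objective(c, act.get, tox.get))
inverted = max(candidates, key=lambda c: objective(c, act.get, tox.get, invert=True))
print(normal, inverted)  # prints "A B": the inverted search picks a toxic candidate
```

The point is how small the change is: the same models and the same search loop, with one sign flipped, optimize for the opposite outcome.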
