“It’s the perfect tool for disinformation”
The head of the Pentagon office responsible for artificial intelligence development said that generative AI could be an ideal tool for disinformation.
“Yes, I am scared to death. This is my opinion,” said Craig Martell, the Department of Defense’s Chief Digital and Artificial Intelligence Officer, when asked about generative artificial intelligence.
He stated that AI language models like ChatGPT pose a real problem: they don’t understand context, and people will take what they say as fact.
“That is my biggest fear about ChatGPT,” he said. “It was trained to express itself fluently. It speaks fluently and authoritatively, so you believe what it says even when the information is wrong. And that means it’s the perfect tool for disinformation… We really need tools to detect when that is happening, and we don’t have those tools. We are falling behind in this fight.”
He urged experts to develop tools that can verify the accuracy of output from generative AI models of all kinds, from text to images.