The bad news about AI for cybersecurity: Hackers have access to the same tools as hospitals

While security is among the use cases with the biggest potential for artificial intelligence and machine learning, the underlying reality is that both hospitals and hackers have access to the same technologies.

“AI is a dual-use technology that can be deployed defensively or offensively,” said Lee Kim, Director of Privacy & Security at HIMSS. “There are malicious uses of AI.”

Indeed, the report “The Malicious Use of Artificial Intelligence: Forecasting, Prevention and Mitigation,” published by the Future of Humanity Institute and others, pointed out that “systems that examine software for vulnerabilities have both offensive and defensive applications,” and likened the situation to drones: one built to deliver medications need not differ much from one built to drop explosives.

“Malicious use of AI could threaten digital security (e.g. through criminals training machines to hack or socially engineer victims at human or superhuman levels of performance), physical security (e.g. non-state actors weaponizing consumer drones), and political security (e.g. through privacy-eliminating surveillance, profiling, and repression, or through automated and targeted disinformation campaigns),” the report said.

“The malicious use of AI will impact how we construct and manage our digital infrastructure as well as how we design and distribute AI systems, and will likely require policy and other institutional responses.”

Among the ways hackers can harness AI are enhanced versions of the common tactics of phishing, spearphishing and whaling.

“Phishing and getting in through the human is a soft spot in organizations,” said Kim. “Phishing is effective — we all know about it, but how good are we at detecting it and not falling victim? Not very, according to the results of our HIMSS Cybersecurity Survey.”

Kim will present the results of the 2019 HIMSS Cybersecurity Survey during a session at HIMSS19 in Orlando in February 2019.

The problem of phishing is well known at this point, of course, but the role artificial intelligence now plays in it is perhaps less well understood.

“AI-fueled techniques are lower cost, higher accuracy, with more convincing content, better targeting, customization and automated deployment of phishing emails,” she added.

During the session, Kim will also discuss the psychology of phishing from the perspective of both attackers and victims, outline the anatomy of a phishing attack, share insights about how phishing has evolved with the use of AI, explain how to recognize advanced attacks and outline mitigation techniques.
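Neither Kim nor the report shares code, but the dual-use point can be made concrete with a small illustration: the same machine-learning building blocks that can help attackers tailor phishing messages are what defenders use to flag them. The sketch below, written with invented example messages rather than anything from the article or the HIMSS survey, shows a toy text classifier of the kind that underpins phishing filters; real detection systems weigh many more signals, such as headers, links and sender reputation.

```python
# Minimal sketch (not from the article): a toy text classifier that flags
# phishing-style email wording. The training examples below are invented
# purely for illustration; production filters use far richer features.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical training messages: 1 = phishing, 0 = legitimate.
emails = [
    "Urgent: your account will be suspended, verify your password now",
    "Wire transfer required today, reply with routing details immediately",
    "Quarterly staff meeting moved to Thursday at 10 am in the boardroom",
    "Please review the attached patient education brochure before printing",
]
labels = [1, 1, 0, 0]

# TF-IDF features plus a simple linear model -- enough to show the idea.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(emails, labels)

suspect = ["Verify your password immediately or lose account access"]
print(model.predict_proba(suspect))  # estimated probability the message looks like phishing
```

The same pipeline, pointed the other way, hints at why AI lowers the cost of attacks: models that learn what convincing, well-targeted messages look like can just as easily be used to generate them at scale.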

What’s more, hackers could ultimately use AI to launch attacks whose consequences reach individual patients.

“We may feel the kinetic effects of phishing in the physical world soon,” Kim said. “The nexus between cybersecurity and patient safety may become even more apparent.”

Kim’s HIMSS19 session, “Don’t be Phooled!: What you need to know about phishing,” is scheduled for Tuesday, February 12, from 1:30-2:30 p.m. in room W320.

Twitter: @SullyHIT
Email the writer: [email protected]

Healthcare IT News is a HIMSS Media publication. 
