ChatGPT can be tricked into telling people how to commit crimes, a tech firm finds

Reported by Anna Cooban, CNN

ChatGPT can be duped into providing detailed advice on how to commit crimes ranging from money laundering to the export of weapons to sanctioned countries, a tech startup found, raising questions over the chatbot’s safeguards against its use to aid illegal activity.

Norwegian firm Strise ran two experiments asking ChatGPT for tips on committing specific crimes. In the first experiment, conducted last month, the chatbot came up with advice on how to launder money across borders, according to Strise. And in the second experiment, run earlier this month, ChatGPT produced lists of methods to help businesses evade sanctions, such as those against Russia, including bans on certain cross-border payments and the sale of arms.

Strise sells software that helps banks and other companies combat money laundering, identify sanctioned individuals and tackle other risks. Among its clients are Nordea, a leading bank in the Nordic region, PwC Norway and Handelsbanken.

Marit Rødevand, Strise’s co-founder and chief executive, said would-be lawbreakers could now use generative artificial intelligence chatbots such as ChatGPT to plan their activities more quickly and easily than in the past.

“It is really effortless. It’s just an app on my phone,” she told CNN.

Strise found that it is possible to circumvent the blocks put in place by OpenAI, the company behind ChatGPT, which are meant to stop the chatbot from answering certain questions. The safeguards can be sidestepped by asking questions indirectly or by having the chatbot take on a persona.

“It’s like having a corrupt financial adviser on your desktop,” Rødevand said on the company’s podcast last month, describing the first experiment.

Read full report: https://www.wral.com/amp/21685804/