
Scams warning as criminals turn to AI to defraud consumers

Posted on Wednesday 17th July 2024
Stock image representing AI

Staffordshire consumers are being warned to be alert to increased risks from fraudsters as criminals turn to AI to make their scams more believable.

With advances in Artificial Intelligence (AI), criminals are becoming more sophisticated with their scams, using software to make themselves more convincing.

Examples include deepfake videos, convincing voice cloning and phishing messages, all of which are worrying trading standards officers at Staffordshire County Council.

Victoria Wilson, Cabinet Member responsible for Trading Standards at the authority, said:

“We’ve all seen the huge developments with AI in recent years but sadly we’re now starting to see more and more criminals using it to scam people.

“Our Trading Standards Team are seeing the usual types of scams, but AI is making them more convincing and harder to spot, which is worrying. This means it’s important to know what to look out for and to stay one step ahead of the scammers.

“If you think you’ve been scammed, call your bank immediately using the number on the back of your bank card and report it to Action Fraud, or call the police on 101.”

Deepfakes, which use video cloning software to create fake video content, are used to produce bogus celebrity endorsements and messaging. These try to fool consumers into believing a bogus or non-existent product is genuine.

Voice cloning works in a similar way, using software to create artificial voice recordings from samples of someone’s speech. Again, criminals create phoney messages that can be used in conjunction with existing scams, such as the WhatsApp “hi mum” messaging fraud.

Advice on spotting and reporting scams is available on the Citizens Advice Consumer service website.
