Tech News : Robo-Calls Now Illegal In The US

Written by: Paul | February 14th, 2024

The US Federal Communications Commission (FCC) has announced that robocalls using AI-generated voices are now illegal.

What Are Robocalls And Why Make Them Illegal?

A robocall is a telemarketing call that uses an automatic telephone-dialling system to deliver an artificial or prerecorded voice message. The calls targeted by the FCC’s ruling are those in which that voice is generated using AI voice cloning technology.

This type of call is now common practice in scams targeting consumers, hence the move by the FCC to make such calls illegal.

Escalation As Voice Cloning Technology Becomes More Widely Available

These types of calls have escalated in recent years as this technology has improved and become widely available. The FCC says it’s reached the point where the calls now have the potential to effectively confuse consumers with misinformation by imitating the voices of celebrities, political candidates, and close family members.

How Changing The Law Will Help

The FCC’s Chairwoman, Jessica Rosenworcel, has explained how making such calls illegal will help, saying: “Bad actors are using AI-generated voices in unsolicited robocalls to extort vulnerable family members, imitate celebrities, and misinform voters. We’re putting the fraudsters behind these robocalls on notice.” She also said the move means that “State Attorneys General will now have new tools to crack down on these scams and ensure the public is protected from fraud and misinformation.”

What Will It Mean For Telemarketers?

The law used by the FCC to make these robocalls illegal is the Telephone Consumer Protection Act (TCPA), which is broadly used to restrict the making of telemarketing calls and the use of automatic telephone-dialling systems and artificial or prerecorded voice messages. The updated FCC rules made under this act will mean:

– Telemarketers need to obtain prior express written consent from consumers before robocalling them using AI-generated voices.

– The FCC has civil enforcement authority to fine robocallers and can take steps to block calls from telephone carriers facilitating illegal robocalls.

– Individual consumers or an organisation can bring a lawsuit against robocallers in court.

– State Attorneys General have their own enforcement tools which may be tied to robocall definitions under the TCPA.

Countrywide

A coalition of 26 State Attorneys General (more than half of the nation’s AGs) recently wrote to the FCC in support of this approach, which means the FCC will have partnerships with law enforcement agencies in states across the country to identify and eliminate illegal robocalls.

Worry In An Election Year

The move by the FCC is timely in terms of closing avenues for the spread of political misinformation. For example, in January, an estimated 5,000 to 25,000 deepfake robocalls impersonating President Joe Biden were made to New Hampshire voters urging them not to vote in the Primary.

What Does This Mean For Your Business?

Advances in AI voice cloning technology and its wide availability have given phone scammers and deep-fakers a powerful tool that can be used to spread misinformation and scam consumers. In light of what happened in New Hampshire, the FCC wants to clamp down on any possible routes for the use of deepfakes to spread political misinformation, as well as to protect consumers from scams.

The prevalence of these types of calls makes life more difficult for legitimate telemarketers (who are likely to be pleased by the action taken by the FCC). The fact that 26 State Attorneys General covering more than half the country support the FCC’s rule change gives the move power and reach, but whether it will be an effective deterrent for determined scammers in what may become a very messy election remains to be seen. Also, the telephone is just one route: voters and consumers can be targeted with misinformation in many other ways, perhaps more widely and effectively, e.g. through social media and shared deepfake videos. That said, the rule change is at least a step in the right direction.