
US regulators have officially declared fraudulent "robocalls" that use artificially generated voices to be unlawful. The issue drew widespread attention recently when a robocall impersonating US President Joe Biden urged people not to vote. An estimated 5,000 to 25,000 calls were placed using the impersonated Biden voice.


"Bad actors are using AI-generated voices in unsolicited robocalls to extort vulnerable family members, imitate celebrities, and misinform voters," Federal Communications Commission (FCC) chairwoman Jessica Rosenworcel said in a release. "State Attorneys General will now have new tools to crack down on these scams."

In a unanimous decision, the FCC concluded that AI-generated voices count as "artificial" voices under the Telephone Consumer Protection Act (TCPA), the primary law the agency uses to curb unwanted calls such as telemarketing solicitations and automated dialing.

The ruling makes voice-cloning technology used in robocall scams illegal, allowing authorities to prosecute the people behind such operations, the FCC said. Previously, law enforcement could only pursue perpetrators for the consequences of a scam, such as fraud carried out through the robocalls, rather than for the calls themselves.

Such fraudulent calls have surged in recent years, driven by automated calling systems, some of which spoof caller IDs to make calls appear local. A coalition of 26 state attorneys general was among those urging the FCC to act to curb the use of AI-generated voices in telemarketing calls.