(TNS) — Pennsylvania lawmakers are proposing safeguards against deceptively accurate voice impersonations and “deepfake” visuals created with artificial intelligence, as the threat of AI-related scams increases.
Artificial intelligence technology can generate an audio clone of a person from as little as three seconds of a recording of the person’s real voice, according to state Rep. Robert Merski, D-Erie. Hypothetically, he said, someone could receive a call from a family member claiming to be stranded with a disabled vehicle and in need of a credit card number — only to learn later the caller was an AI-generated voice.
“Everybody should be on guard with this technology. If they can impersonate the president of the United States, they can impersonate anybody,” Merski said. He was referring to news last month that a robocall voice apparently generated by AI faked President Joe Biden’s voice well enough to trigger an investigation in New Hampshire.
Merski is the author of a bill pending in the House Judiciary Committee that would criminalize the unauthorized dissemination of audio impersonations or deepfakes.
At least eight other bills related to AI have been filed. Merski’s is likely to be the subject of a Judiciary Committee discussion and vote soon, according to committee Chairman Tim Briggs, D-Montgomery. Briggs called it “a big concern” and said that AI regulation “is becoming more relevant in our society.”
Rep. Chris Pielli, D-Chester, who is working with Merski on several AI-related bills, said, “The whole aspect of how this affects our daily lives is frightening.”
Interest in taking action appears to be bipartisan. Sen. Tracy Pennycuick of Montgomery County, the Republican chair of the Communications and Technology Committee, said it was important to have a law protecting against the faking of an individual’s image, likeness and voice.
“Right now that is probably number one,” Pennycuick said. “We also need to look at how AI is used in government.”
Connecticut — one of a growing number of states that have enacted AI-related legislation — last year adopted a measure that called for an inventory of all state-used systems that employ artificial intelligence. It also requires an assessment to make sure they are not causing discrimination.
“Technology moves faster than our ability to regulate it,” said Democratic Connecticut state Sen. James Maroney. With AI, he said, the pace of change seems especially fast.
“There is urgency to get some form of guard rails or regulations in place,” said Maroney, considered a leader in that state in AI lawmaking.
In the New Hampshire incident involving a robotic fake of Biden’s voice, one 73-year-old woman who received such a call was quoted by the Associated Press as saying, “I didn’t think about it at the time that it wasn’t his real voice. That’s how convincing it was.”
A well-connected staffer in Pennsylvania law enforcement said the very nature of the scam — the impersonation of a person’s voice — may make it difficult to know exactly how often it is occurring. But Berks County District Attorney John Adams, who is a spokesperson for the Pennsylvania District Attorneys’ Association, said it is happening.
“AI has made it possible to mimic someone’s voice, and that has allowed some scams to take place,” Adams said.
A spokesperson for Pennsylvania Attorney General Michelle Henry, Brett Hambright, said the office had received few complaints about that sort of activity. On Monday, though, Henry called attention to a new Federal Trade Commission rule that prohibits the use of AI to pose as a government agency or business — and the FTC’s desire to expand the rule to cover individuals.
Henry said in a statement that her office helped lead a coalition of attorneys general seeking the rule. “Consumers deserve transparency when they receive a phone call, and this rule will ensure they get it,” she said.