The Federal Communications Commission (FCC) has proposed a $6 million fine against Steve Kramer, who has confessed to orchestrating the fake call sent to thousands of people in New Hampshire on the eve of the state’s January primary. Kramer used voice-cloning technology to impersonate President Biden in a series of illegal robocalls.
According to NPR, Kramer has also been indicted on criminal charges in four New Hampshire counties, including 13 counts of voter suppression, a felony, and 13 counts of impersonating a candidate, a misdemeanor.
In a statement, the FCC said it proposed the fine for apparently illegal robocalls made using deepfake, AI-generated voice-cloning technology and caller ID spoofing to spread election misinformation to potential New Hampshire voters prior to the January primary. Kramer faces the proposed $6 million penalty for apparent spoofing violations.
Two days before New Hampshire’s 2024 presidential primary election, illegally spoofed, malicious robocalls carried a deepfake audio recording of President Biden’s cloned voice telling prospective voters not to vote in the upcoming primary.
Political consultant Steve Kramer was responsible for the calls and now faces a $6 million proposed fine for perpetrating this illegal robocall campaign on January 21, 2024. The calls apparently violated the Truth in Caller ID Act by maliciously spoofing the number of a prominent local political consultant.
The robocalls, made two days prior to the election, used a deepfake of President Biden’s voice and encouraged voters not to vote in the primary but rather to “save your vote for the November election.” Commission rules prohibit knowingly causing the transmission of inaccurate caller ID information with the intent to defraud, cause harm, or wrongly obtain anything of value. Kramer’s conduct apparently runs afoul of this rule.
“We will act swiftly and decisively to ensure that bad actors cannot use U.S. telecommunications networks to facilitate the misuse of generative AI technology to interfere with elections, defraud consumers, or compromise sensitive data,” said Loyaan A. Egal, Chief of the Enforcement Bureau and chair of the Privacy and Data Protection Task Force. “We thank our partners at the New Hampshire Attorney General’s Office for their help with this investigation.”
To transmit the calls, Kramer engaged Voice Broadcasting Corp., which used the services of Life Corp. to route the calls through voice service provider Lingo Telecom. Lingo Telecom transmitted the calls and incorrectly labeled them with the highest level of caller ID attestation, making it less likely that other providers could detect them as potentially spoofed. The Commission brought a separate enforcement action today against Lingo Telecom for apparent violations of STIR/SHAKEN rules by failing to use reasonable “Know Your Customer” protocols to verify caller ID information in connection with Kramer’s illegal robocalls.
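For context on what that attestation label is: under STIR/SHAKEN, the originating provider attaches a signed token (a PASSporT, carried in the call’s SIP Identity header) whose “attest” claim records the attestation level, with “A” for full attestation, “B” for partial, and “C” for gateway. The Python sketch below is purely illustrative and uses a fabricated, unsigned token with made-up numbers, not anything from the FCC’s filings; it simply decodes the token’s payload and reads the “attest” claim, which is the field Lingo Telecom apparently set to the highest level for these calls.

```python
import base64
import json

def attestation_level(passport_token: str) -> str:
    """Return the STIR/SHAKEN attestation level ("A", "B", or "C") claimed in a PASSporT."""
    # A PASSporT is structured like a JWT: base64url(header).base64url(payload).signature
    header_b64, payload_b64, _signature = passport_token.split(".")
    # base64url segments usually omit "=" padding; restore it before decoding.
    padded = payload_b64 + "=" * (-len(payload_b64) % 4)
    claims = json.loads(base64.urlsafe_b64decode(padded))
    return claims.get("attest", "unknown")

# Fabricated, unsigned token for illustration only (not a real call record).
payload = {"attest": "A", "orig": {"tn": "16035550100"}, "dest": {"tn": ["16035550199"]}}
fake_token = ".".join([
    base64.urlsafe_b64encode(json.dumps({"alg": "ES256", "typ": "passport"}).encode()).decode().rstrip("="),
    base64.urlsafe_b64encode(json.dumps(payload).encode()).decode().rstrip("="),
    "signature-placeholder",
])
print(attestation_level(fake_token))  # prints "A" (full attestation)
```

In real networks, a downstream provider also verifies the token’s signature against the originating provider’s certificate before trusting that claim; the sketch only shows where the attestation level lives.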
In February, the FCC’s Enforcement Bureau, in coordination with the office of the New Hampshire Attorney General, ordered Lingo Telecom to cease and desist from carrying the suspicious traffic. The Commission has taken such actions to block active robocall scam campaigns, in addition to imposing financial penalties like those proposed today. These efforts to stop active campaigns have had important impacts, including FCC actions that resulted in a 99% drop in auto warranty scam robocalls, an 88% month-to-month drop in student loan scam robocalls, and an end to a predatory mortgage robocall campaign targeting homeowners nationwide.
The FCC continues its work to understand and adjust to the impacts of AI on robocalling and robotexting. The Commission has made clear that calls made with AI-generated voices are “artificial” under the Telephone Consumer Protection Act (TCPA), confirming that the FCC and state Attorneys General have the tools they need to go after bad actors behind these nefarious robocalls. In addition, the FCC has launched a formal proceeding to gather information on the current state of AI use in calling and texting and to ask questions about new threats, such as robocalls mimicking the voices of people we know.