A congressional crackdown on deepfakes continued this week with the introduction of a bipartisan Senate bill targeting financial scams that leverage artificial intelligence to trick people out of their money.
The Preventing Deep Fake Scams Act from Sens. Jon Husted, R-Ohio, and Raphael Warnock, D-Ga., would create a task force led by federal financial regulators to study fraud, data and identity theft powered by AI. The bill has a companion in the House from Reps. Brittany Pettersen, D-Colo., and Mike Flood, R-Neb., that was introduced in February.
Husted said in a statement that the legislation is aimed at protecting seniors, families and small business owners “from malicious actors who take advantage of their compassion.”
“Scammers are using deep fakes to impersonate victims’ family members in order to steal their money,” the Ohio Republican said. “As fraudsters continue to scheme, we need to make sure we utilize AI so that we can better protect innocent Americans and prevent these scams from happening in the first place.”
The bill calls for the task force to be chaired by the Treasury secretary and filled with the heads or designees of the Federal Reserve, the Consumer Financial Protection Bureau, the Office of the Comptroller of the Currency, the Federal Deposit Insurance Corp., the National Credit Union Administration, and the Financial Crimes Enforcement Network.
The task force would be charged with examining proactive measures that financial institutions could take to use AI to prevent fraud, in addition to flagging possible risks related to the misuse of the technology. The report, which would be delivered to Congress within a year of the bill’s enactment, would also detail best practices to protect consumers from deepfake financial crimes and provide regulatory and legislative recommendations.
According to Federal Trade Commission data, fraudsters stole more than $12.5 billion from consumers last year, a 25% jump from 2023. AI tools are increasingly being used by scam artists to craft emails, text messages and phone calls that trick people into thinking their loved ones are in danger and that payment is the only way to guarantee their safety.
Deepfakes and other AI-fueled scams have been a hot topic on Capitol Hill this congressional term. In April, the House passed a bill that would criminalize using a person’s likeness to create nonconsensual deepfake pornography. The FBI revealed last month that malicious actors have been creating text messages or deepfaked audio messages impersonating top government officials as part of a scam targeting current or former senior federal and state government leaders.
A bipartisan Senate bill introduced last month, meanwhile, takes aim at the growing ubiquity of AI-fueled scams, calling for a Commerce Department-led education and awareness campaign to help Americans better identify deepfakes and warn them of their dangers.
The post Financial deepfake scams targeted in bipartisan Senate bill appeared first on CyberScoop.