A proposed rule by the Federal Communications Commission to limit voice calls and text messages made using artificial intelligence would restrict banks’ ability to communicate important information to customers, such as suspicious activity alerts and one-time passcode requests, the American Bankers Association said Friday in comments to the agency.
The FCC in September issued a notice of proposed rulemaking that would require callers to obtain consumers’ prior consent before placing AI-generated calls and to provide a disclosure at the beginning of any call that uses an AI-generated voice. In a letter, ABA said it supports the commission’s efforts to eliminate illegal calls, but argued the rule as written is too broad. The proposed definition of an “AI-generated call,” the association said, is so expansive and vague that it captures many technologies banks already use to reach customers with vital information.
“If the commission adopts its expansive definition of ‘AI-generated call’ as proposed, consumers may also be confused and decline to consent to receive these important, consumer-benefitting calls,” ABA said.
ABA suggested that if the agency wants to stem the flow of illegal calls, it should withdraw the proposed rule and instead pursue actions the association has previously recommended. Those include prohibiting voice service providers from displaying data on the consumer’s caller ID device when the authenticity of a call cannot be adequately verified through a direct and verified relationship with the call originator; increasing enforcement against voice service providers that improperly sign calls with “A-level” attestation; and mandating that non-IP network providers implement a commercially available authentication solution within six months of an FCC order imposing the requirement.