NYDFS Guidance Highlights Cybersecurity Risks in AI Adoption
October 31, 2024
The Case
On October 16, 2024, the New York State Department of Financial Services (NYDFS or the “Department”) published an industry letter (the “Guidance”) addressing the industry’s increasing reliance on artificial intelligence (AI) and the cybersecurity risks that accompany it. The Department identified several risks arising from both legitimate and malicious uses of AI and recommended controls and measures to mitigate them, including enhanced procedures, technical tools, and training.
While the Department notes that the Guidance does not impose new requirements beyond NYDFS’s cybersecurity regulation codified at 23 NYCRR Part 500 (the “Cybersecurity Regulation”), the Guidance points to the Cybersecurity Regulation as a framework to assess and address AI-related cybersecurity risks. Entities regulated by the NYDFS (“Covered Entities”) would be well-advised to incorporate the Department’s guidance into their risk assessments, a core component of the Cybersecurity Regulation.
Regulatory Implications
The NYDFS Guidance highlights critical cybersecurity risks tied to the industry’s growing reliance on AI and urges Covered Entities to adapt their cybersecurity frameworks to address both malicious and legitimate uses of the technology. Key takeaways include:
Malicious Use of AI: Threat actors increasingly use AI to enhance social engineering, for example by creating deepfakes for phishing attacks or by automating stages of an attack. This raises the stakes for implementing stronger access controls, such as multi-factor authentication (MFA) that is resistant to AI manipulation.
Legitimate Use Risks: Organizations leveraging AI tools often process large volumes of sensitive data, making them prime targets for cyberattacks. Additionally, dependency on third-party AI providers introduces supply chain vulnerabilities that require proactive management.
Integration with the Existing Regulation: While the Guidance does not introduce new rules, it ties AI-related risks to the existing Cybersecurity Regulation, 23 NYCRR Part 500, emphasizing risk assessments, incident response plans, and board-level accountability.
With these developments, NYDFS signals that AI-related risks are not a future concern but an immediate challenge requiring urgent attention.
Practical Guidance for Firms
Adapting to the NYDFS Guidance requires proactive measures that align with existing cybersecurity frameworks. Firms can focus on the following areas:
Revise Risk Assessments:
Incorporate AI-related vulnerabilities, including those from third-party providers, into existing risk evaluations.
Update these assessments regularly to reflect advances in AI technologies and evolving threats.
Strengthen Access Controls:
Implement MFA solutions that can counter AI-driven spoofing attempts.
Consider biometric authentication technologies with anti-spoofing capabilities.
Enhance Data Governance:
Maintain detailed inventories of AI-integrated systems.
Minimize data collection and ensure robust disposal policies for nonpublic information (NPI).
Train Your Workforce:
Educate staff on identifying AI-enhanced attacks, such as deepfakes or AI-driven phishing.
Include specific training for leadership to ensure they thoroughly understand and effectively oversee AI-related risks.
Bolster Third-Party Oversight:
Update vendor contracts to include AI-specific security guarantees.
Conduct regular audits to verify third-party compliance with secure data practices.
InnReg specializes in helping firms integrate these measures into their operations, offering services such as customized risk assessments and vendor oversight strategies.