3 Takeaways from MBA Servicing’s Data Privacy Panel
Expert panels at industry trade shows are usually excellent, but they aren’t known for their emotional range. Dry to arid (depending on subject matter) is the norm, but this panel at MBA Servicing ‘23 ranged from hilarious to sober to downright terrifying. We’ll tell you more, but first, let’s introduce the panelists:
- Amy Mushahwar, Partner, Alston & Bird
- Gabriel Acosta, Regulatory Specialist, MBA
- Wendy Lee, EVP, Chief Legal Officer, Sagent
Below are 3 quick highlights from this panel: one that’s imminently applicable (with a deadline), one best practice that applies to any organization’s reporting structure, and one that’s a little scary for data-security-conscious folks with even an iota of imagination.
One more thing before we jump in: if you’re attending MBA’s Legal Issues and Regulatory Compliance Conference in Austin, TX on May 7-10, be sure to join Wendy, Amy, Gabriel, and their colleagues for a panel on May 9 at 1:00 PM CT. These industry pros are sure to bring insights that will advance your compliance and data-security measures.
FTC Safeguards Rule + Data Privacy
The Safeguards Rule itself is two decades old, but with new provisions going into effect on June 9 this year (Wendy Lee’s birthday, for anyone who wants to send a note of congrats or sympathy), the panel spent most of their time discussing what’s involved.
They explained that the updates to this rule are harmonized with the New York Department of Financial Services’ cybersecurity requirements and guidelines from NIST. Of course, the impacts are gargantuan, so they focused on a few key applications for servicers.
The Rule includes prescriptive requirements for data security, including that a designated qualified individual must implement the information security program, ensure its enforcement, and report to the company’s board. More on that later.
It also dictates the need for granular risk assessments along with plans for mitigation. As Gabriel described it:
This is a way of identifying foreseeable risks that could lead to unauthorized access or compromised data. And it’s not just about conducting the risk assessment then putting the results on a desk. You also have to identify how the risks are going to be mitigated.
Another huge consideration is the need for written records of identified risks, incident response plans, vulnerability management reports, and more. Amy calls them “artifacts of compliance,” saying,
Typically, the FTC or the NY DFS is looking for at least several quarters of compliance reports. So if you have not started asking for information security and artifacts of compliance… don’t just have IT or security tell you you’re compliant. They need to show and prove that you are compliant.
This segment was full of valuable info that we won’t try to capture here, but the focus on these two areas is indicative of their importance, so they’re worth researching further if you’re not sure about your organization’s readiness.
Organizational structure
Wendy asked her fellow panelists, “What do you think is best practice in terms of the organizational structure surrounding CTO and CISO?”
Amy answered:
The CISO and CTO should be two different people. If you’re a small organization where the chief technology officer is also the chief of security, consider using a consultant or virtual CISO so those individuals can dialogue and cross-check each other.
These two individuals should not report to each other; they should question each other and report to officers such as the CEO or CFO. If one outranks the other, that risks prioritizing the technology platforms over security, or vice versa.
AI, ML, or by any other name…
Gabriel started this segment by pointing out that AI presents a “really great opportunity for efficiency,” especially in the servicing space. But with an acknowledged “bad rap” that might be attributable to the sci-fi genre, he asked Wendy, “How scared should I be about Skynet?”
Wendy, who admitted to knowing nothing about AI, prepared for the panel by submitting a query to a headline-making tool called ChatGPT about the ways AI could be applied to servicing. The answers from this large language model ranged broadly, including automation of repetitive, mundane tasks such as the application of payments, loan amortization calculations, and escrow analysis… anything that can free up servicing operators to do the “hard things” that are best attended by humans. ChatGPT also pointed out that AI-powered chatbots and virtual assistants promise to improve customer service and detect fraud.
The panel noted that predictive analytics is squarely in AI’s wheelhouse: analyzing data from various sources, including credit, employment records, public records, and so on. The possibilities are endless, and among those possibilities are some grave (and this is the terrifying part) risks.
One question Wendy raised was, “How do I make sure AI doesn’t cause harm?” And she’s not the only one asking. To that end, the panel discussed the AI Bill of Rights, a guide produced by the White House Office of Science and Technology Policy to protect people from the threats posed by technology that can meaningfully impact the public’s rights, opportunities, or access to critical needs. They walked through several fundamental principles included in this guide to protect Americans:
- We should be protected from unsafe or ineffective systems;
- We should not face discrimination from algorithms, and systems should be used and designed in an equitable way;
- We should be protected from abusive data practices by design;
- We should know when an automated system is being used and understand how it will impact us;
- We should be able to opt out or have access to a human who can remedy an AI-generated decision.
This was, subjectively, the most entertaining and thought-provoking segment: equal parts inspiring and terrifying as we learn to coexist with this nascent technology in the immediate future.
The compliance ramifications of this entire panel discussion are dizzying. Thankfully, Wendy Lee (and her cadre of highly qualified colleagues) is advocating for and educating servicers while ensuring these considerations are at the core of every solution Sagent creates as we build the future of servicing.
PS: If you’d like to learn more from Wendy’s contributions to the consumer-data and privacy-protection realms, check out her previous webinars here and here. And USFN members can find her at the Compliance and Legal Issues Seminar in Chicago on July 13-14, in addition to the MBA panel mentioned earlier.