Common weak spots in institutional data quality ahead of CECL

Jan 31, 2017

Financial institutions preparing to adopt FASB’s accounting standards update for estimating and recording credit losses must weigh various methodologies available for calculating their losses under the new current expected credit loss model, or CECL.

A key consideration in methodology selection is whether the bank or credit union already has, or will have, the data necessary to support a particular approach in its allowance for loan and lease losses (ALLL) calculation. “Depending on the estimation method or methods selected, institutions may need to capture additional data and retain data longer than they have in the past on loans that have been paid off or charged off to implement CECL,” regulatory agencies recently advised.

Garver Moore, director of special research for Abrigo Advisory Services, says banks or credit unions assessing their data and planning for changes to ALLL methodology ahead of CECL will be well served to first ensure there is “one view of the world” as it relates to the institution’s existing loan data. One approach is to create, either formally or informally, a “data dictionary”: a document that outlines which data elements have been captured, exactly what each element means and where it is represented or housed in the enterprise.
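For illustration only, a data dictionary does not need to be elaborate. A minimal sketch in Python, with hypothetical field names and source systems, might record each element’s definition, where it lives and how often it is refreshed:

```python
# A minimal, illustrative data dictionary: each entry records what a data
# element means, which system it lives in, and how often it is refreshed.
# Field names and source systems are hypothetical examples.
DATA_DICTIONARY = [
    {
        "field": "orig_date",
        "definition": "Date the loan was originally booked; never updated after booking.",
        "source_system": "core",
        "refresh": "at origination only",
    },
    {
        "field": "dscr",
        "definition": "Debt service coverage ratio from the most recent credit review.",
        "source_system": "credit_analysis",
        "refresh": "annually, at review",
    },
    {
        "field": "appraisal_value",
        "definition": "Collateral value from the original appraisal.",
        "source_system": "collateral_tracking",
        "refresh": "at origination; updated on re-appraisal",
    },
]

def describe(field_name: str) -> dict:
    """Return the entry for a field, so everyone shares one definition."""
    return next(entry for entry in DATA_DICTIONARY if entry["field"] == field_name)

print(describe("dscr")["definition"])
```

Even a simple structure like this gives the institution one agreed-upon answer to “what does this field mean and where does it live,” which is the “one view of the world” Moore describes.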

“What we see at a lot of banks is that the information is being kept in different places and in different ways,” he says. Asking 10 different people for a fact and getting 10 different answers creates a “special hell” for anyone trying to make preparations. As a result, Moore says, “Having an idea of where the data is, exactly what the data is and what can be relied on is an important exercise — not just for a compliance or accounting measurement, but so you can provide that same unified view to your customers.”

The quality and accuracy of data is especially important when measuring expected credit losses. “It’s not good enough to look up something on your data warehouse and compare it to your core,” Moore says. “Compare the data in the core to the underlying business reality. So you might identify a field where the debt service coverage ratio is housed, but then you have to know how often that data is updated: Does the DSCR at origination match the credit review, and does it reflect reality?”

Many loan-level attributes captured by institutions are intended to be representative of real-time data, but institutions need to verify whether that is the case. “You may decide to do a segmentation analysis based on the DSCR,” Moore says. “What you don’t want is for DSCR data for half of the loans to be recorded at origination and for half to be based on loans that have migrated through various loan classifications – and for you not to know this.”
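As a rough sketch of the kind of check Moore describes, an institution could measure how many loans still carry an origination-era DSCR before segmenting on that field. The column names below (such as dscr_as_of) are hypothetical:

```python
import pandas as pd

# Hypothetical loan tape: which loans carry an origination-era DSCR versus a
# value refreshed at a later credit review?
loans = pd.DataFrame({
    "loan_id":    [101, 102, 103, 104],
    "orig_date":  pd.to_datetime(["2013-04-01", "2014-07-15", "2015-02-20", "2015-09-30"]),
    "dscr":       [1.35, 1.10, 1.42, 0.95],
    "dscr_as_of": pd.to_datetime(["2013-04-01", "2016-06-30", "2015-02-20", "2016-11-15"]),
})

# A DSCR dated at origination has never been refreshed.
loans["dscr_is_origination_only"] = loans["dscr_as_of"] == loans["orig_date"]

share_stale = loans["dscr_is_origination_only"].mean()
print(f"{share_stale:.0%} of loans carry an origination-only DSCR")

# Segmenting on DSCR is only meaningful once you know how mixed the vintages are.
print(loans.groupby("dscr_is_origination_only")["dscr"].describe())
```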

“Your data is only as good as your understanding of its strengths and its limitations,” he says.

To identify common weak spots in institutional data that affect banks’ ability to implement CECL effectively, Abrigo analysts examined client data stored in the Abrigo system. The good news: across the sample, quality was generally strong for many data fields tied to origination, such as the original loan amount and the origination date.

Data quality on the collateral side of operations was also generally good in the Abrigo review of client data. Many institutions retain the original appraisal value and original appraisal date from when the loan was booked.

However, even for the field “origination date,” some banks had data quality and integrity issues. The value of this field for a specific loan should never change; there can be only one origination date. Yet among the Abrigo clients, 5 percent of loans on average had origination dates that were either missing or had been changed at some point. For some clients, that share was as high as 35 percent of loans.
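One way to quantify this kind of problem, sketched below with pandas against a hypothetical archived snapshot of the loan tape, is to measure how often the origination date is missing today or differs from what was recorded earlier:

```python
import pandas as pd

# Hypothetical comparison of today's loan tape against an archived snapshot,
# measuring how often the origination date is missing or has drifted.
current = pd.DataFrame({
    "loan_id":   [1, 2, 3, 4, 5],
    "orig_date": pd.to_datetime(["2014-03-01", None, "2016-01-15", "2017-06-30", "2015-11-05"]),
})
snapshot = pd.DataFrame({
    "loan_id":   [1, 2, 3, 4, 5],
    "orig_date": pd.to_datetime(["2014-03-01", "2013-08-20", "2016-01-15", "2015-06-30", "2015-11-05"]),
})

merged = current.merge(snapshot, on="loan_id", suffixes=("_now", "_then"))
missing = merged["orig_date_now"].isna()
changed = merged["orig_date_now"].notna() & (merged["orig_date_now"] != merged["orig_date_then"])

print(f"Missing origination dates: {missing.mean():.0%}")
print(f"Changed origination dates: {changed.mean():.0%}")
```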

This suggests that when some loans were renewed or experienced another event, such as a restructuring or a move to non-accrual status, the origination date was overwritten with the new event date. At a high level, Abrigo analysts say, data integrity at financial institutions seems to break down over the life of the loan.

Renewal dates and renewal balances are especially important for capturing the life-of-loan information needed for vintage analysis under CECL. Yet among institutions with loan renewals recorded, renewal dates were archived for only 2.6 percent of loans, and accurate renewal balance data was stored for less than 1 percent of renewals, the Abrigo analysis of client data found. That doesn’t necessarily mean institutions aren’t tracking this data at all, but it strongly suggests they aren’t keeping it in the core and aren’t consistently providing it to Abrigo, despite evidence that it will be important under CECL, according to Abrigo integration specialist Danny Sharman.
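A similar completeness check can be applied to renewal records. The sketch below uses hypothetical column names to compute the share of renewals with an archived date and a usable balance:

```python
import pandas as pd

# Hypothetical renewal records: how many renewals have an archived date and a
# usable renewal balance? Column names are illustrative assumptions.
renewals = pd.DataFrame({
    "loan_id":         [10, 11, 12, 13],
    "renewal_date":    pd.to_datetime(["2016-05-01", None, None, "2016-09-15"]),
    "renewal_balance": [250_000.0, None, 0.0, None],
})

has_date    = renewals["renewal_date"].notna()
has_balance = renewals["renewal_balance"].notna() & (renewals["renewal_balance"] > 0)

print(f"Renewals with archived dates:  {has_date.mean():.0%}")
print(f"Renewals with usable balances: {has_balance.mean():.0%}")
```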

“By and large, information seems to be really good at origination and kind of lacking when there’s any sort of major change to the loan during the loan life,” Sharman says.

For more information on common data problems institutions are facing for the CECL transition and for immediate steps to ensure data needed for CECL calculations is accessible and sound, listen to the recorded webinar, “Data Quality Considerations for CECL Measurement.”