CECL Data: Mind the gaps and start today
Nov 17, 2016
FASB’s June issuance of the new current expected credit loss (CECL) standard has sparked questions from institutions seeking to understand what type, and how much, historical data is needed to divine future losses for loan and lease allowances under the new rule. The data question has caused angst among those who worry about the reliability of their data, especially where previously ignored fields take on new significance under CECL. But before digging through the archives of the core system, it is important to take stock of what is available and what is not, and what that could mean for CECL calculations.
Mind the gaps
In a recent webinar, “Data Quality Considerations for CECL Measurement,” Danny Sharman, an integration project manager at Abrigo, discussed trends he sees in client data. The high-level trend among Abrigo clients is that the quality of their data starts strong: origination dates, loan amounts, as well as the original collateral appraisal amounts and dates are typically collected and reliable. The changes that can occur over the life of a loan introduce variability, however, in the completeness of data. As Sharman notes, “throughout the loan life, through any sort of major events — whether that’s a renewal, a restructure, the loan getting set to nonaccrual status — updates don’t seem to be made.”
While these missing data points may seem insignificant for a single loan, the insights gleaned from these changes could provide the documentation needed to underpin projections made in CECL calculations. Understanding changes to loans in the context of external factors like unemployment, and having the data points to justify how those factors are quantified, can mean a difference of crucial basis points in the allowance.
More data, more flexibility
While a lack of granular loan-level data will not make CECL calculations impossible, more data will allow for more flexibility in methodologies across the board. The type of data available will determine an institution’s ability to perform certain calculations, and the amount of data will determine how much flexibility there is within the chosen methodology.
Having access to more data presents those performing CECL calculations with a choice: use only data from time periods that are compositionally similar to the one being evaluated. For example, rather than performing a regression analysis to quantify what a certain unemployment level will mean for losses, an abundance of data allows an institution to select only what is relevant. “If unemployment is projected to be between 7 and 8 percent, throw out data that is from periods where the rate is higher or lower,” advises Garver Moore, Abrigo’s Director of Special Research. With more data, better forecasts are possible and easier to make.
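The period-matching approach Moore describes can be sketched in a few lines of Python. This is a minimal illustration, not a prescribed CECL methodology: the field names, periods, unemployment rates, and loss rates below are all hypothetical, and a real calculation would draw on an institution's own loan-level history.

```python
# Hypothetical sketch of period matching: keep only historical periods whose
# unemployment rate falls inside the projected band, then average their loss
# rates. All records and figures below are illustrative, not real data.

def relevant_loss_rates(history, low, high):
    """Return loss rates from periods with unemployment in [low, high] percent."""
    return [p["loss_rate"] for p in history if low <= p["unemployment"] <= high]

history = [
    {"period": "2009Q1", "unemployment": 8.3, "loss_rate": 0.012},
    {"period": "2010Q3", "unemployment": 9.5, "loss_rate": 0.015},
    {"period": "2013Q2", "unemployment": 7.5, "loss_rate": 0.008},
    {"period": "2015Q4", "unemployment": 5.0, "loss_rate": 0.004},
]

# Unemployment projected between 7 and 8 percent: discard periods outside the band.
rates = relevant_loss_rates(history, 7.0, 8.0)
avg_rate = sum(rates) / len(rates)
print(f"Comparable-period average loss rate: {avg_rate:.4f}")
```

The trade-off Moore highlights is visible here: with only four periods on file, the filter leaves a single comparable observation; with a deep history, it leaves enough to estimate from relevant experience alone.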
Don’t panic, start now
The key to ensuring an institution has the best possible data for CECL implementation is to start collecting and archiving financial data today. Running preliminary parallel calculations under the current expected credit loss model will help identify holes in the data that can be addressed over the next few years.
Currently, the new CECL standard calls for calculations that make use of an institution’s “reasonably available” data. Starting to collect granular, loan-level data today will provide at least three years’ worth of good, useful data by implementation. And if best efforts cannot recover much meaningful historical data, institutions will be able to defend what is not reasonably available. After all, as Moore points out, “barring access to a time machine, there’s only so much you can do.”
But there’s no time like the present, and, right now, there is much to be done to prepare.
See the full webinar: Data Quality Considerations for CECL Measurement
Download FASB’s CECL Prep Kit