Written By: Gordon Dobner

Most financial institutions understand that Accounting Standards Update (ASU) No. 2016-13, Financial Instruments–Credit Losses, and more specifically applying the CECL model to their loan portfolios, represents the most significant accounting change for financial institutions in recent memory. However, there is less certainty about how the standard will specifically affect each institution, and the conversation goes well beyond day-one increases in the allowance. The focus of this article is not to discuss every potential effect of the standard, but to focus on what appears to be the most substantial immediate one. Few will be shocked to hear that the answer is data. However, many institutions we work with, especially those early in their CECL implementation process, are uncertain about why or how data becomes such a significant issue. To address those concerns, this article focuses on what we have seen in real-life CECL implementations. We have broken data issues and concerns into the following four main areas:

  • What you don’t have
  • What you do have
  • How you got it
  • How you will maintain it going forward

Before we dive into each of the above areas, a word of caution: each institution’s implementation is unique, and the complexity of its data needs and data integrity is a product of many variables. These include pooling and segmentation decisions, the models and methodologies being considered, the strength of your institution’s internal controls and how you store and maintain historical data today, to name a few. Therefore, you may not experience all of the issues discussed below, but you will likely face challenges related to data.

What you don’t have

Data issues in this area are usually the easiest for institutions to grasp, and many reveal themselves quickly in the process. They relate to data fields you don’t have or don’t consistently capture. For example, an institution may conclude that loan-to-value or debt service coverage ratio is a good indicator of future expected credit loss on commercial loans and that incorporating these into its CECL models makes sense, only to quickly realize it either doesn’t capture those data points in its loan systems or doesn’t regularly and consistently update them. As a result, those data points can’t be obtained without significant cost to the institution. This typically leads institutions to consider other data fields, such as loan grading, that may be more reliable and still incorporate those factors. Other common issues in this area are:

  • Inability to access historical data after a certain time frame
  • Lack of quality historic data from acquired institutions
  • Missing data points needed to support future forecasts

What you do have

CECL is forcing a lot of institutions to reconsider the quality of the data they can obtain in their loan systems and data warehouses. As many institutions are considering using more or different data in their CECL models, they are asking themselves the question, “How comfortable am I with the completeness and accuracy of the data?” Common issues we have seen in this area are:

  • Accuracy and consistency of how data is input and recorded into loan systems
  • Changes in underwriting or grading systems causing lack of consistency in historic data sets

A few specific problem areas include how loan renewals and modifications are input, loan payments being recorded as a single amount rather than separate principal and interest amounts, and a lack of charge-off detail to determine how much was principal versus remaining premium, discount or other components of the amortized cost basis. Institutions should consider and document how they gain comfort with the accuracy of the information in their loan systems by evaluating the controls over the input and maintenance of the data fields their CECL models and methodologies require. To help auditors and regulators get comfortable with these conclusions, it is a good idea to reference how internal audit and controls testing provide management with independent validation of that accuracy. If certain data fields have not been consistently tested or validated, or have shown a history of issues, additional testing may be necessary to gain comfort over their reliability.
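The kind of field-level testing described above can be automated as simple exception reports. The sketch below is purely illustrative, assuming hypothetical field names such as `ltv` and `ltv_updated` that would vary by core system; it flags loans with missing, implausible or stale loan-to-value data.

```python
# Illustrative sketch: simple data-quality checks on loan-level extract fields.
# Field names ("ltv", "ltv_updated") are hypothetical, not from any particular
# core loan system.
from datetime import date

def check_loan_record(rec, as_of=date(2023, 12, 31), max_stale_days=365):
    """Return a list of data-quality exceptions for one loan record."""
    issues = []
    if rec.get("ltv") is None:
        issues.append("missing LTV")
    elif not (0 < rec["ltv"] < 2.0):
        issues.append("LTV outside plausible range")
    updated = rec.get("ltv_updated")
    if updated is None or (as_of - updated).days > max_stale_days:
        issues.append("LTV not updated within the last year")
    return issues

loans = [
    {"id": "A1", "ltv": 0.72, "ltv_updated": date(2023, 6, 30)},
    {"id": "A2", "ltv": None, "ltv_updated": None},
]
exceptions = {loan["id"]: check_loan_record(loan) for loan in loans}
# Loan A1 passes all checks; loan A2 is flagged for missing and stale LTV data.
```

Running checks like these across the full historical data set helps quantify how often a candidate field is missing or stale before committing to it in a CECL model.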

How you got it

Many institutions that have made progress in their CECL implementation have quickly realized they will need to obtain and store loan-level data in a different manner, and for a longer period of time, than today’s incurred loss approach requires. Many are creating data warehouses of historical loan-level data to address this issue. Institutions that already have data warehouses, especially those whose external auditors already audit the controls over the warehouse’s completeness and accuracy, should face a much lower risk of data gaps or errors in this area. For everyone else building a new data warehouse for CECL, however, issues can arise if the completeness and accuracy of the uploaded data are not validated. This will require management to create internal controls over the completeness and accuracy of data uploads into the warehouse.
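One common form of that upload validation is a reconciliation of the warehouse back to the source extract by record count and control totals. The sketch below is an illustrative example only, with hypothetical field names; a real control would reconcile whatever balance and count fields the institution’s systems produce.

```python
# Illustrative sketch: reconciling a data-warehouse upload to the source
# loan-system extract by record count and total balance. Field names are
# hypothetical.
def reconcile(source_rows, warehouse_rows, tolerance=0.01):
    """Compare record counts and balance totals between source and warehouse."""
    src_total = sum(row["balance"] for row in source_rows)
    wh_total = sum(row["balance"] for row in warehouse_rows)
    return {
        "count_match": len(source_rows) == len(warehouse_rows),
        "total_match": abs(src_total - wh_total) <= tolerance,
        "source_total": src_total,
        "warehouse_total": wh_total,
    }

source = [{"id": 1, "balance": 100_000.00}, {"id": 2, "balance": 250_500.25}]
loaded = [{"id": 1, "balance": 100_000.00}, {"id": 2, "balance": 250_500.25}]
result = reconcile(source, loaded)
# Both the record count and the balance total tie out for this upload.
```

Documenting and retaining the output of each reconciliation also gives auditors evidence that the completeness and accuracy control operated for every period loaded.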

How you will maintain it going forward

As noted above, many institutions have realized they need to create data warehouses to store historical data sets outside their existing loan systems and leverage that data for CECL. This raises the question of how the integrity of the data in the warehouse will be maintained and protected going forward. If an institution is considering maintaining an in-house data warehouse in Excel as a long-term solution, robust spreadsheet controls need to be developed to protect the data from being manipulated, whether accidentally or intentionally. Implementing these controls can be challenging and may lead institutions to seek alternatives. Institutions moving to an outsourced vendor solution that includes a data warehouse will likely have less to worry about, since those vendors should have well-defined controls and control testing over data integrity. Whatever avenue an institution takes, it should ask what could go wrong in the process of uploading new data and maintaining existing data in its warehouse and make sure adequate controls cover any significant risks.
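One way to detect accidental or intentional changes to stored historical data is to fingerprint each period’s data set with a cryptographic hash when it is loaded and re-verify that hash periodically. The sketch below illustrates the control concept only; the data and field names are hypothetical.

```python
# Illustrative sketch: detecting unintended changes to stored historical data
# by hashing each period's data set at load time and re-verifying it later.
import hashlib
import json

def fingerprint(rows):
    """Stable SHA-256 digest of a list of record dicts."""
    canonical = json.dumps(rows, sort_keys=True).encode()
    return hashlib.sha256(canonical).hexdigest()

q4_2023 = [{"id": "A1", "balance": 100000.0}, {"id": "A2", "balance": 250500.25}]
stored_digest = fingerprint(q4_2023)  # recorded when the period is first loaded

# Later: recompute and compare to confirm the historical data was not altered
tampered = [{"id": "A1", "balance": 99000.0}, {"id": "A2", "balance": 250500.25}]
unchanged = fingerprint(q4_2023) == stored_digest   # data is intact
altered = fingerprint(tampered) == stored_digest    # any edit breaks the match
```

A mismatch on re-verification would flag the period for investigation, giving management an automated complement to access restrictions and change logs.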


The ways data can become a barrier to CECL implementation are too numerous to list. A best practice for overcoming these barriers is to work through the exercise of evaluating what you don’t have, what you do have, how you got it and how you will maintain it going forward. To complete this exercise, institutions will need adequate education on the CECL standard, a thorough understanding of potential models and methodologies and a general idea of how they will pool their loans under CECL. By taking time to think through these areas, institutions can identify their data gaps and decide what actions need to be taken.


About Andrew Wallace

A member of BKD National Financial Services Group, Andrew has worked with public and closely held financial institutions ranging from $50 million to $8 billion in total assets. He provides various audit and consulting services, such as current expected credit loss model implementation, asset and liability management, control environment process and procedure assessment and other industry-specific consulting services.

He also has experience with U.S. Securities and Exchange Commission reporting and integrated audits under section 404 of the Sarbanes-Oxley Act of 2002.

Andrew is a member of the American Institute of CPAs and The Ohio Society of Public Accountants. He also is active in the Ohio Bankers League and is the Vice President of the Cincinnati Chapter of the Financial Managers Society. He is a 2015 graduate of Butler University, Indianapolis, Indiana, with an M.Acc. degree.