With the expansion of anti-money laundering (AML) obligations under the Bank Secrecy Act of 1970 (BSA), and other types of fraud detection, more technical, risk-related training has become increasingly important. Within the AML space (now a front-burner issue, especially for larger institutions), detection solutions are increasingly based on sophisticated statistical modeling and voluminous data processing. Top-vendor AML systems deploy models that require not only expert management and independent validation, but also rich and timely data flows that can test the capability of the institution's overall data infrastructure. Even the Customer Identification Program (CIP), formerly a rule-based exercise with a tendency toward weak performance measurement, continues to evolve in this direction. The Patriot Act clarification on CIP includes this statement:
The Agencies wish to emphasize that a bank's CIP must include risk-based procedures for verifying the identity of each customer to the extent reasonable and practicable. It is critical that each bank develop procedures to account for all relevant risks including those presented by the types of accounts maintained by the bank, the various methods of opening accounts provided, the type of identifying information available, and the bank's size, location, and type of business or customer base. Thus, specific minimum requirements in the rule, such as the four basic types of information to be obtained from each customer, should be supplemented by risk-based verification procedures, where appropriate, to ensure that the bank has a reasonable belief that it knows each customer's identity.6
To summarize, regulators now expect financial institutions to bring the full weight of modern data management and creative, advanced analytics to bear on issues for which compliance had traditionally been a matter of minimally following highly prescriptive rule sets. Given the emphasis on risk-based techniques requiring advanced, industrial-strength data processing support, chief risk officers and chief compliance officers will be challenged to lead these efforts without a technical risk-analytics and IT-oriented experience base.
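To make the idea of "risk-based verification procedures" concrete, the fragment below sketches one way a bank might layer a simple risk score on top of the four basic CIP data elements. It is a minimal sketch only: the risk factors, weights, and thresholds are invented for illustration and are not drawn from the rule or from any vendor system.

```python
# Illustrative only: a toy risk-based CIP verification triage.
# The factors, weights, and thresholds are hypothetical assumptions,
# not regulatory requirements or any vendor's actual logic.
from dataclasses import dataclass

@dataclass
class Customer:
    name: str
    date_of_birth: str
    address: str
    id_number: str
    country_risk: float      # 0.0 (low) .. 1.0 (high), e.g., geography/sanctions exposure
    product_risk: float      # riskiness of the accounts or products being opened
    doc_verification: bool   # did documentary verification succeed?

def cip_risk_score(c: Customer) -> float:
    """Combine the illustrative risk factors into a single score in [0, 1]."""
    score = 0.5 * c.country_risk + 0.3 * c.product_risk
    if not c.doc_verification:
        score += 0.2  # a failed documentary check pushes the customer up the scale
    return min(score, 1.0)

def verification_path(c: Customer) -> str:
    """Map the score to escalating verification procedures (thresholds are assumptions)."""
    s = cip_risk_score(c)
    if s < 0.3:
        return "standard documentary verification"
    if s < 0.6:
        return "add non-documentary verification (database / bureau checks)"
    return "enhanced due diligence and manual review before account opening"

if __name__ == "__main__":
    applicant = Customer("A. Example", "1980-01-01", "1 Main St", "XXX-XX-1234",
                         country_risk=0.7, product_risk=0.4, doc_verification=True)
    print(round(cip_risk_score(applicant), 2), "->", verification_path(applicant))
```

Even a toy scheme like this makes the regulators' point visible: the minimum data elements are only an input, and the institution must be able to justify, measure, and maintain the scoring and escalation logic itself.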
Outsourcing and the Culture of Failure
Unfortunately for many firms, the problem of an inadequate experience base is self-reinforcing. How many stories have we heard about giant IT projects that were catastrophic failures? Without adequately experienced leaders in place (who could potentially prevent some of these disasters), it can be extremely difficult to get accurate assessments of why the projects failed or what could have been done better. The experience deficit has also created an information asymmetry in which business decision makers, often not completely clear about what their current and future needs are and scarred by past IT project failures, are squared off against software vendors who are often very well informed about the firm's knowledge, current capabilities, and history, and can tailor their sales pitches accordingly. Ironically, many large-scale IT failures occurred because the projects weren't nearly large enough – that is, as big as they may have been, they weren't part of a holistic redesign of the overall information processing infrastructure of the firm. At the same time, many IT-related outsourcing relationships have helped financial institutions improve performance and efficiency, creating a tremendously appealing perception that more outsourcing is better, and that financial institutions need to get out of the information processing business. But as we will discuss in more detail below, institutions need to consider carefully what aspects of their information-processing complex are truly core to their identities and competitive positions in the marketplace, and invest in and further develop these internal capabilities instead of outsourcing them.
Outsourcing issues aside, all IT infrastructure projects expose the firm to some risk. In the absence of a clearly communicated overall vision, the risks associated with piecemeal infrastructure projects are elevated for a number of reasons. In the first place, even well-meaning and experienced project managers are at an informational disadvantage. They are solving a problem – or a narrow set of problems – without knowing whether the design choice will be complementary to other software and system projects also underway. The only way to ensure strong complementarity of such projects is to have a clearly articulated vision for the overall system and to evaluate each project for consistency with that vision. For many large institutions, particularly those that have grown by acquisition, the underlying system is effectively a hodgepodge, and there may be no clearly articulated vision. Under these conditions, the chance that any one project will make the problem worse is high. This can lead decision makers to embrace min-max strategies7 with respect to high-visibility infrastructure projects – often strategies that can be supported with information from industry experts, including consultants and the software vendors themselves, who certainly do not have a long-term vision for the firm's competitive position as a goal. In many cases, both the requirements for a given infrastructure build and the design choices made in order to meet those requirements are partly or wholly outsourced to vendors, consultants, or both. From asset/liability management systems, to Basel II/III systems, to AML systems, to model governance systems, to general purpose database and data processing systems, key expertise and decision making are routinely outsourced. Recognizing this, the sales presentations from the major software firms increasingly involve selling the vendor-as-expert, not just the product. Consulting firms, too, have increasingly oriented their marketing strategies toward this approach under the (frequently correct) assumption that the audience is operating on the short end of an information asymmetry, understanding primarily that it has a problem and needs a solution. Vendors' increasing focus on integrated solutions reflects their perception that institutions are now aware that they have bigger problems and are increasingly willing to outsource the vision for how the firm manages its analytic assets in the broadest sense.
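Footnote 7 defines min-max strategies as minimizing the maximum regret. As a purely illustrative worked example (the project options, scenarios, and payoff figures below are invented, not taken from the text), the following sketch shows the arithmetic behind a minimax-regret choice among hypothetical infrastructure options:

```python
# Minimax-regret choice over a hypothetical infrastructure-project decision.
# Payoffs (higher is better) under three invented future scenarios.
payoffs = {
    "patch existing system": {"stable": 5, "growth": 2, "new regulation": 1},
    "buy vendor platform":   {"stable": 4, "growth": 6, "new regulation": 3},
    "full internal rebuild": {"stable": 1, "growth": 8, "new regulation": 7},
}

scenarios = ["stable", "growth", "new regulation"]
best_per_scenario = {s: max(p[s] for p in payoffs.values()) for s in scenarios}

# Regret = how far an option falls short of the best achievable payoff in each scenario.
regret = {
    option: {s: best_per_scenario[s] - outcomes[s] for s in scenarios}
    for option, outcomes in payoffs.items()
}

# The minimax-regret option minimizes the worst-case regret.
worst_regret = {option: max(r.values()) for option, r in regret.items()}
choice = min(worst_regret, key=worst_regret.get)

print(worst_regret)   # {'patch existing system': 6, 'buy vendor platform': 4, 'full internal rebuild': 4}
print("minimax-regret choice:", choice)
```

The point of the example is simply that such a rule limits how badly any single decision can turn out; it says nothing about whether the chosen project advances a long-term vision for the firm.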
The outsourcing approach can be expedient, particularly when an institution does not have the immediately required technical knowledge. And since design choices require both technical/product knowledge and a deep understanding of the particular institutional needs and constraints (and history), adding internal expertise through hiring may not be a quick solution either, since new hires may know less than consultants about the internal workings of a given firm. But such knowledge and expertise gaps may be symptoms of a deeper underlying problem and, particularly when regulatory expectations are involved, the persistence of the expertise gaps that led to the outsourcing can come back to haunt the firm. One illustrative example is that of AML. At many banks, sophisticated vendor-provided transaction-monitoring software is used for AML alert processes. In fact, the sophistication of these systems has been increasing rapidly over the past several years, to the delight of institutions that see them as solving a real problem created by rising regulatory expectations. But both the understanding of how these systems work and the back-testing and tuning of the many settings required to operationalize them have, in many cases, been outsourced to the vendors who supply the product. Predictably, regulators who applaud the installation of such capable technology have been highly critical of the banks' actual deployment of it. Generally, negative feedback from regulators about the implementation of these systems has included the following complaints (an illustrative back-testing sketch follows the list):
• The company is not using the full functionality of the system.
• The system is not adequately customized to align with the company's risk profile.
• The program does not capture all of the company's products and services.
• The scenarios or rules used do not adequately cover the company's risks.
• There are no statistically valid processes in place to evaluate the automated transaction monitoring system.
• There are insufficient management information systems (MIS) and metrics to manage and optimize the system.
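To make concrete what a "statistically valid process" for evaluating and tuning such a system might involve, here is a minimal, purely hypothetical sketch of threshold back-testing for a single monitoring scenario. The alert data, candidate thresholds, and productivity measure are invented assumptions and do not reflect any vendor's product or any bank's procedures.

```python
# A minimal sketch of threshold back-testing for one transaction-monitoring scenario.
# Historical alerts: (aggregate transaction amount, was the alert ultimately productive?).
# All figures and candidate thresholds are invented for illustration.
historical_alerts = [
    (12_000, False), (18_500, False), (25_000, True), (31_000, False),
    (47_500, True), (52_000, True), (68_000, False), (90_000, True),
]

candidate_thresholds = [10_000, 25_000, 50_000]

def evaluate(threshold: float):
    """Count alerts the scenario would fire at this threshold and how many were productive."""
    fired = [(amt, productive) for amt, productive in historical_alerts if amt >= threshold]
    productive = sum(1 for _, p in fired if p)
    missed = sum(1 for amt, p in historical_alerts if p and amt < threshold)  # below-the-line misses
    rate = productive / len(fired) if fired else 0.0
    return len(fired), productive, missed, rate

for t in candidate_thresholds:
    fired, productive, missed, rate = evaluate(t)
    print(f"threshold {t:>7,}: {fired} alerts, {productive} productive "
          f"({rate:.0%}), {missed} productive cases missed below the line")
```

In practice this kind of above-the-line/below-the-line analysis would be run across many scenarios and customer segments, with documented sampling and statistical criteria; the absence of that documented process is exactly what the complaints above describe.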
More recently, the Fed has issued explicit statements that it considers such systems to be models and that therefore the requirements of SR 11-7 apply8 – requirements that few compliance teams or model validation teams are fully prepared to meet. Clearly, the regulatory community is quite sensitive to the fact that outsourced solutions without close internal expertise and oversight may be ineffective in achieving their goals, and in the AML area this sensitivity is backed up by firsthand experience obtained in the course of examinations. But AML is just one example. And regulators' concerns aside, the firms themselves should be very sensitive to these same issues. For any large information processing project for which a significant amount of planning, design specification, vendor selection, implementation, tuning, or validation
6. Board of Governors of the Federal Reserve System, “FAQs: Final CIP Rule” (2005), www.federalreserve.gov/boarddocs/SRLETTERS/2005/SR0509a1.pdf.
7. Min-max strategies are intended to minimize the maximum regret, or alternatively, to narrow the range of outcomes that could be called a failure.
8. SR 11-7, the Fed's guidance on model risk management, will be discussed further in Chapter 4.