JUNE 2010

Tracking the evolution of Credit Value Adjustment


Establishing a National Institute of Finance




Exploring the gap between confidence and certainty

Four perspectives on compliance

Understanding Whole Enterprise Risk

The beauty of art as investment

Credit Value Adjustment
The changing environment for pricing and managing counterparty risk
Download our latest whitepaper at http://www.algorithmics.com/en/CVA.cfm

© 2010 Algorithmics Software LLC. All rights reserved.

As the world’s leading provider of enterprise risk solutions, Algorithmics conducts ongoing research into industry trends and challenges. Our latest white paper on Credit Value Adjustment (CVA) contains survey results based on in-depth interviews with a cross-section of financial institutions that actively manage counterparty credit risk. These interviews reveal how CVA is being measured, where CVA fits into systems today, and how CVA practices are expected to evolve. Download your free copy at http://www.algorithmics.com/en/CVA.cfm
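For readers new to the measure, CVA is commonly introduced as the discounted, default-weighted expected exposure to a counterparty. The sketch below is a standard textbook discretization with a flat hazard rate; it is not taken from the whitepaper, and every input value is invented for illustration.

```python
import math

# Textbook unilateral CVA discretization (not from the Algorithmics whitepaper):
#   CVA ~ (1 - recovery) * sum_i DF(t_i) * EE(t_i) * PD(t_{i-1}, t_i)
# where a flat hazard rate h gives PD(s, t) = exp(-h*s) - exp(-h*t).
def cva(expected_exposure, times, hazard=0.02, recovery=0.4, rate=0.03):
    total, prev_t = 0.0, 0.0
    for ee, t in zip(expected_exposure, times):
        df = math.exp(-rate * t)  # risk-free discount factor to t
        pd = math.exp(-hazard * prev_t) - math.exp(-hazard * t)  # default in (prev_t, t]
        total += (1.0 - recovery) * df * ee * pd
        prev_t = t
    return total

# Invented expected-exposure profile on a 2-year semi-annual grid.
charge = cva(expected_exposure=[1.0, 1.2, 0.9, 0.5], times=[0.5, 1.0, 1.5, 2.0])
```

With these made-up inputs the charge works out to roughly 2% of peak exposure; real CVA desks would replace the flat hazard rate with a credit curve bootstrapped from CDS spreads and simulate the exposure profile.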

Proven Risk Management Solutions



What financial risk management can learn from the nuclear energy and demolition industries


How risk management tools can help financial institutions extend their lifecycle



Profiling Allan Mendelowitz and the Committee to Establish a National Institute of Finance

How replicating portfolios can improve the effectiveness of firm-wide risk management


Portfolio updates in Haiku? On a smartphone, anything is possible

CONTENTS JUNE 2010

17 SURVEY SAYS! In-depth interviews reveal changing attitudes towards Credit Value Adjustment (CVA) and the pricing and managing of counterparty risk
28 THE SOLVENCY SCENARIOS Best practices approaches on the road to Solvency II compliance

DEPARTMENTS
04 OPENING BELL The pieces and puzzles of ERM
05 IN CONVERSATION: JOHN MACDONALD Algorithmics' Executive VP on the realities of counterparty risk, technology and the "Basel Billions"
08 IN REVIEW Art and the emergence of an alternative investment
10 READING ROOM New and noteworthy titles for the practitioner's bookshelf: Paul Samuelson, economic calamities, financial models, zombies and more
44 THE LAST WORD Risk management apps we'd like to see for our smartphones

OPENING BELL

Like an intricate jigsaw puzzle, financial markets contain a number of smaller pieces whose relationships to each other may not be obvious. The difference between the puzzles lies in our expectations. We expect jigsaw puzzles to fit together neatly. We may look inside the box or check around our feet for missing pieces, but as long as a blank area remains we don't wonder if the puzzle is irrational or if we have relied too heavily on historical analysis. With financial markets, we know in advance we will not end up with a complete image, but, due to the complexity of the subject matter, we strive to fill in as many of these blank spaces as we can. What we know is that the more pieces we assemble the more insight we can gain, and that all these pieces matter. Each time risk professionals interact with a piece of this puzzle, we have the opportunity to ask ourselves how it fits into the larger picture and whether prior assumptions are still relevant.

This curiosity is reflected in "When 99% is not enough," this edition's cover story, in which risk management practices between the social and physical sciences are compared. Another puzzle considered in this issue is how to prevent future instances of systemic risk. The Committee to Establish a National Institute of Finance (NIF) believes the answer lies in the collection and analysis of granular transaction and position data. Acquiring accurate data remains an important component of risk management, and establishing this discipline can only help risk management play a larger and more meaningful role within the financial services industry. The NIF and Committee co-founder Allan Mendelowitz are profiled in "The Big Picture." Other featured content includes a look at how banks can benefit from the use of replicating portfolios, the evolving role of Credit Valuation Adjustment, and an intriguing look at how risk management tools can enhance the lifespan of financial institutions.

As industry thought leaders, we devote a great deal of our energy to engineering and assembling different pieces that will create more effective enterprise risk management solutions. Welcome to the fourth issue of THINK, an Algorithmics publication created by and for risk practitioners. As always, your comments and feedback regarding THINK are welcome.

Dr. Michael Zerbs
President and Chief Operating Officer, Algorithmics

PUBLISHER John Macdonald
EDITORIAL DIRECTOR Maria Raposo
EDITOR-IN-CHIEF Erin Williams
CONTRIBUTING EDITOR David Bester
CONTRIBUTORS Jeremy Asprey, Andrew Aziz, Andrew Barrie, Bob Boettcher, Curt Burmeister, Penny Cagan, Jon Gregory, Mario Onorato, Diane Reynolds, Inga Rottmann, Fraser Schad, Michael Zerbs
PRODUCTION AND DISTRIBUTION COORDINATOR Nalinie Sharma
ART DIRECTOR & DESIGN Ma design Studio

CONTACT INFORMATION
Algorithmics, 185 Spadina Avenue, Toronto, Ontario, Canada M5T 2C6
416-217-1500
think@algorithmics.com
www.algorithmics.com/think

© 2010 Algorithmics Software LLC. All rights reserved. You may not reproduce or transmit any part of this document in any form or by any means, electronic or mechanical, including photocopying and recording, for any purpose without the express written permission of Algorithmics Software LLC or any other member of the Algorithmics group of companies. The materials presented herein are for informational purposes only and do not constitute financial, investment or risk management advice.

IN CONVERSATION: JOHN MACDONALD

John Macdonald, Executive Vice President, Sell-Side Business and Marketing at Algorithmics, shares his thoughts on the nature of banking, the "Basel Billions" and the evolution of technology.

THINK: Many banks are cautiously optimistic about a return to stability. Is that a fair characterization?

John: Yes. I think that there is a clear understanding emerging of what regulations are coming and the impact they will have. This awareness enables banks to anticipate and focus their energies on which practices must be modified or changed in light of the financial crisis and concerns about capital markets. They are also seeing an increase in business volumes, suggesting that there are promising opportunities ahead for the banks that want to win them. A better understanding of risk is certainly helping banks to price and assess transactions in ways that really weren't being done before.

THINK: Can you give an example of where you are seeing a change in practice?

John: Counterparty risk remains a high-priority issue that banks are looking to minimize across the board. One idea gaining traction, particularly among regulators and policy makers, is the introduction of central counterparties (CCPs). CCPs are complex entities, however, and if they are not properly implemented, they could produce their own set of all too familiar problems.

THINK: Let's start with the benefits. What advantages can a CCP provide?

John: A CCP can improve market resilience by reducing the potential harm from a major dealer's failure, and it can help reduce the volume and severity of counterparty risk in OTC derivatives markets. Through the use of multilateral netting, CCPs offer banks enhanced flexibility to enter into new transactions and end existing ones, leading to greater liquidity. Centralizing counterparty transactions through CCPs automates the clearing process, which helps to prevent confirmation backlogs and can reduce the risk of legal disputes from unconfirmed trades. The multilateral aspect and its support for anonymous trades can reduce barriers to market entry and mitigate counterparty risk, enhance transparency and deliver operational efficiencies. Size plays a role as well: a CCP can create best practices and procedures in partnership with regulators more efficiently than institutions working individually or in alliance.

THINK: What are the disadvantages?

John: A centralized clearing function requires a certain degree of product conformity. Yet OTC products can be heavily customized, limiting their ability to be cleared through an automated system. Standardizing some product elements, such as valuation approaches and documentation, may be required. Consolidation would also require standardized rules and mechanisms, but this aspect has not been needed in bilateral markets. In bilateral markets dealers compete for business; without this level of competition, the motivation to accurately price and manage counterparty risk when entering a trade is reduced. Should regulators favor a particular CCP, this could induce institutions to enter into trades with less accurate pricing of counterparty risk, leading back to market instability. And we know that even with supervision and strict capital requirements, no institution is invulnerable to unforeseen circumstances. Amidst all the talk of institutions that are too big to fail, it is interesting that CCPs are being cited as an improvement because their size would bring added stability to the process. I had mentioned that size plays an advantageous role in the benefits a CCP can provide, but it is also a concern.

THINK: How can banks navigate between these benefits and challenges?

John: The form CCPs will take is still unknown, but they do reflect the growing presence of automated transactions across the banking landscape. Banks will need to develop scenarios and run simulations on margin calls generated through a CCP. Market changes will impact the size and timing of margin calls and, effectively, banks' capabilities of managing counterparty risk. As the trend to transform credit risk into tradable market risk grows, this form of risk replaces some of the counterparty risk that banks hold today, and the ability to accurately price and understand the risk and return at the time of the transaction becomes paramount. By taking these steps, banks can prepare themselves and leverage technology to integrate their infrastructure and develop more innovative and sophisticated products with this emphasis on automated transactions in mind.

THINK: The "Basel Billions?"

John: It's an informal reference, but banks individually invested millions of dollars into their compliance programs. And what did they get for it? After living through the "Basel Billions," banks are justifiably cautious about making an extensive investment in their risk infrastructure without having a clear picture of what the returns will be beyond satisfying the regulator reporting requirements.

THINK: Where should these resources be focused?

John: Any investment in a risk infrastructure that can give future flexibility is a positive one. One area that provides the greatest benefit is the ability to acquire data and make it available across the organization. A philosophy of "acquire once, add value and then use many times" is the best way to ensure that decision makers within the bank can access data and use it as required. This need to acquire and analyze data will only increase as the diversity of instruments continues to grow. Each time a new instrument or approach is introduced – which is demanded by a competitive market place – new correlations need to be examined and understood. This must take place at a deep level of granularity. Sharing consistent and accurate risk information across the organization has been proven to be a more effective approach than trying to manage risk by components through a silo-based mentality. Senior Supervisors' reports have been quite clear in the wake of the financial crisis: those banks that relied on a broader range of risk measures and challenged some of the assumptions underlying their methodologies tended to do better than those institutions that viewed their risk infrastructure purely as a means to address compliance. Outside of compliance, the investment in a risk infrastructure itself offers its own rewards. That's adding value to the business, not just a cost.

THINK: Speed of response is linked to real-time analysis. Where does acquiring on-demand analytic capabilities fit within a bank's priorities?

John: The knowledge and technology to support real-time analysis exists. But like all things in life, the latest and greatest technology comes with a price. This cost can be more easily justified for certain aspects, less so for others. At a decision-making level, it is not necessary or even desirable to have everything in real time. I like to think of "Right Time" rather than real time alone as what is needed. The essence of a "Right Time" approach is to make sure that an integrated infrastructure of data, technology and applications is able to deliver the right information to the right person when they need it. Providing decision makers with the information they need is not the same as handing over all the data at once. An overload of data is counterproductive: it becomes meaningless, or may even provide a false sense of security. In today's climate the speed of response, not just through use of technology but also of decision making, is crucial to effective management.

THINK: Where will the next generation of technology solutions come from: internal development or solution providers?

John: Banks are major consumers of technology. Keeping ahead of the competition requires continuous improvements to infrastructures, a constant pressure to speed up trading processes and their associated checks, and the maintenance of data that seems to constantly increase in volume and importance. Leveraging vendor technology can deliver a savings of costs and scale, and deliver quick wins for organizations seeking to demonstrate compliance or transparent processes. It also lets a bank optimize its own IT resources on creating software or tools which give it proprietary advantage over rivals. But as the market place and the corresponding needs of banks evolve, new techniques and business solutions are required to support them, so the ongoing answer will likely be a combination of both. In many cases, it is better to put resources towards systems that can deliver on a cost/benefit analysis. The question needs to be asked: "Is there a logical fit here between need and application?" There has to be an ongoing, internal dialog within banks to ensure that resources are diverted to the areas that will deliver the best return on their risk investment.

THINK: Regulators are certainly concerned about transparency, but perhaps even more so about liquidity. How high a priority is it for banks to address liquidity risk?

John: Unlike efforts to address counterparty risk, managing liquidity is closely tied to the bank's traditional business model. In a theoretical world, a bank would lend and borrow money on the same terms, by that I mean the same amount over the same period of time. But in the real world this happens rarely. There are more variations on timing, currency, product type, methods of transacting and other characteristics of a product or trade. Following the credit crisis, a consensus has developed among regulators, political bodies and investors that additional transparency is required. This relates particularly to exotics and securitization. We now have a better understanding of the nature and challenges of managing liquidity, and so the Liquidity & Capital Requirements Directive is being revised, which will likely affect compliance requirements. The interconnected nature of global markets and by extension institutions is a factor when addressing liquidity risk, so banks will be much more cautious about the potential of overextending themselves in the market and to other institutions. On an institutional level, you certainly want to take any steps you can to ensure stability. But at the same time, you don't want to be in a position where the costs to take these steps are unclear or difficult to quantify. Now, let me qualify this by saying there's no visible cliff where systemic failure occurs, no signpost that an institution can see and say: "Yes, we are now protected." These issues can be supported to an extent by technology, but what system could one bank put into place that provides a benefit? What would it cost? What actionable information could it provide? On this issue it is best to let regulators and governing bodies take the lead. The ability to outsource transactional processing solutions may provide some relief.

THINK: We have discussed a number of shifts in thinking about risk in the banking industry, from basic compliance to technology and its role in maximizing the availability of useful information. Has the sum of these changes altered the nature of banking?

John: The nature of banking hasn't changed – only its complexity. It goes back to the idea of getting quality information at the right time to the right place. An advanced risk framework can be a competitive differentiator for banks.

THINK: That seems to lead us towards the future. What steps can banks take to position themselves for long-term success?

John: To ensure stability and lay the groundwork for profitability, banks need to ensure that they have a thorough understanding of the real drivers deep in their business model. They must also have a firm grasp of the constraints they have to operate under to meet the capital adequacy goals set out both internally and by regulators. So technology is important, but it is really secondary to vision. Frequent reorganizations and consolidation can make it challenging to sustain multi-year strategies, and given the speed at which the industry evolves, a greater emphasis on the continuity of risk planning, leadership and organizational structure is required as these new opportunities are pursued; these will play an even larger role in determining the success of technology-based projects. High-level dialog and a willingness to engage on all aspects of the bank's risk policy will be even more important to ensure that the bank's strategic and business goals are met. Banks with the vision to implement solutions and maintain the high-level support required to see them through will be more likely to enjoy a successful outcome. Algorithmics has evolved in a similar way. We have always advocated the benefits of measuring and managing risk on an enterprise level. As an organization, and as vendors in an enduring partnership with our clients, we continue to research, innovate and engineer solutions that will help banks to better understand and manage the actual risks associated with the known challenges of today and the emerging opportunities of tomorrow.
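Multilateral netting is the mechanism behind several of the CCP benefits described above. A toy sketch, with obligation amounts invented purely for illustration, shows how routing trades through a central counterparty compresses the payments that bilateral netting alone leaves outstanding:

```python
# Hypothetical gross obligations between three dealers: (payer, payee) -> amount.
# All figures are invented for illustration.
obligations = {
    ("A", "B"): 100, ("B", "A"): 80,
    ("B", "C"): 60,  ("C", "B"): 90,
    ("C", "A"): 50,  ("A", "C"): 40,
}

# Bilateral netting: each pair of dealers nets to a single payment.
pairs = {tuple(sorted(p)) for p in obligations}
bilateral = sum(abs(obligations[(x, y)] - obligations[(y, x)]) for x, y in pairs)

# Multilateral netting: a CCP nets each dealer down to one position vs the CCP.
dealers = {d for pair in obligations for d in pair}
net = {d: sum(v for (payer, _), v in obligations.items() if payer == d)
          - sum(v for (_, payee), v in obligations.items() if payee == d)
       for d in dealers}
multilateral = sum(abs(v) for v in net.values()) / 2  # count each payment once

print(bilateral)     # total payments after pairwise netting
print(multilateral)  # total payments through the CCP
```

Here bilateral netting leaves 60 in payments across three dealer pairs, while the CCP settles the same obligations with 50, and the saving grows quickly as more dealers and trades are added.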

Over the past few years. many of these spectators have become speculators that view art as an asset class with its own unique risk and return measures.3 million February 2010 . The British Rail Pension Fund was the first institutional investor to treat art as an investment vehicle.IN REVIEW ART AS INVESTMENT For centuries. 1948 Sold for $140 million November 2006 Eight Elvises Sold for $100 million December 2009 08 JUNE 2010 THINK Andy Warhol For the Love of God Sold for $100 million August 2008 Damien Hirst Walking Man Sold for $104. Many financial institutions now offer specialized art advisory and insurance services. Corporate art collecting has grown substantially. rare books. Progressive Insurance and UBS in particular. The extensive media coverage garnered by these sales keeps the spotlight on art as investment. Original art pieces are unique objects that support diversification strategies while adding great prestige and beauty to a portfolio. No. These pieces were ultimately sold in the late 1980s. and a number of banks have taken a leading role in amassing the most impressive collections. Deutsche Bank. 5. Old Master drawings and African tribal art. making it likely that the audience of institutional art investors will continue to grow. The potential to realize extraordinary returns is also hard to ignore. But the interest in art continues to grow. The attraction to art is easy to understand. In 1974. a successive stream of pieces has shattered individual records for art sales.3% to the fund. Since the British Rail Pension Fund tested the waters with their art escapade. Over the last few decades. art has had a profound impact on viewers around the world. delivering an annual compound return of 11. a number of art funds have followed suit with mixed results. the fund dedicated around 3% of its holdings (approximately $70 million) to a diversified collection of pieces including Impressionist paintings.

INVESTORS AND CONNOISSEURS.6 million November 2007 Titian Diana and Actaeon Sold for $91 million January 2009 Alberto Giacometti . HERE IS A LIST OF 8 NOTABLE TRANSACTIONS FROM RECENT YEARS.5 million November 2006 Jackson Pollock Willem De Kooning Boy with a Pipe Sold for $104 million May 2004 09 JUNE 2010 THINK Pablo Picasso Jeff Koons Hanging Heart Sold for $23.BREAKING THE BANK RECORD-BREAKING ART SALES RAISE EYEBROWS AND EXPECTATIONS AMONG COLLECTORS. Woman III Sold for $137.

READING ROOM
A ROUNDUP OF RECENTLY RELEASED AND NOTEWORTHY TITLES.

How Markets Fail: The Logic of Economic Calamities
Author: John Cassidy

For 50 years or more, economists have been busy developing elegant theories of how markets facilitate innovation, wealth creation and the efficient allocation of a society's resources. But what about when markets don't work? What about when they lead to stock market bubbles, real estate crashes and credit crunches? In How Markets Fail, John Cassidy seeks to answer these questions by connecting two storylines. The first recounts the history of modern economics. The second details the recent financial crisis and how it was shaped by what he calls "utopian economics." For Cassidy, utopian economics involves turning a willful blind eye toward the behavior of real people and the possibility that an unregulated free market can produce disastrous and unintended consequences. Cassidy argues that, over the course of a generation, utopian economics have become accepted mainstream theory for policymakers and investors. However, he argues that markets are "rationally irrational" and can fail to capture all the information necessary to sustain the assumptions required by utopian economics. This disconnect can lead to a host of individual behavioral biases, from overconfidence to envy, copycat behavior and myopia, which give rise to market bubbles and crashes. Cassidy, an economics writer for The New Yorker, combines on-the-ground reporting with historical analysis to make his point that conforming to antiquated orthodoxies isn't just misguided: in today's economic crisis, it's dangerous. How Markets Fail offers a new, enlightening way to understand the force of the irrational in our volatile global economy.

An Engine, not a Camera: How Financial Models Shape Markets
Author: Donald MacKenzie

Milton Friedman famously referred to economic theory as an engine intended to analyze the world, not a camera expected to faithfully reproduce its every detail. Donald MacKenzie tweaks the meaning of this phrase in the title and content of An Engine, not a Camera. He argues that the engine of modern finance theories has actively transformed the environment around it, affecting markets in fundamental ways. In 1970, there was almost no trading in financial derivatives. By 2004, $273 trillion in derivatives contracts were outstanding worldwide. MacKenzie proposes that this growth could not have occurred without the development of theories that legitimized derivatives, such as futures, and explained their complexities. He suggests that the emergence of Nobel Prize-winning theories, based on elegant mathematical models of markets, were not merely external analyses but intrinsic parts of economic processes. The role of finance theory in the stock market crash of 1987 and the market turmoil tied to the fall of Long-Term Capital Management in 1998 is also examined. MacKenzie points out that these scenarios took place during a period when an authoritative theory of financial markets emerged, and questions whether these events could have occurred without the availability of such a theory. These failures remain unexpected behavior to those who believe in a purely efficient market. If the impact of finance theories is not appreciated among most individuals, it becomes easier to understand why many businesses struggle to effectively manage risk, despite the availability and variety of information age tools. Donald MacKenzie is Professor of Sociology at the University of Edinburgh. His work in the social studies of finance will interest anyone who wants a better understanding of how American financial markets have evolved into their current form.

This Time is Different: Eight Centuries of Financial Folly
Authors: Carmen M. Reinhart, Kenneth S. Rogoff

For This Time is Different, economists Carmen Reinhart and Kenneth Rogoff have compiled financial data covering 66 countries over 8 centuries to study financial crises and analyze their causes. Reinhart and Rogoff define what constitutes a financial crisis, explain the various forms a crisis can take (for example, inflation crises, banking crises, sovereign debt defaults, exchange rate crashes, currency debasement or the bursting of asset price bubbles), gauge their intensity and their transmission, and emphasize that the different types of crises tend to occur in clusters. The authors' analysis sets guidelines for conditional thresholds that must be weighed in the context of historical time periods, when smaller historical movements constituted huge surprises and were therefore hugely disruptive. In more recent times, where there is an international dimension to markets, new methods are needed to identify potential surprises, and the authors attempt to define the most relevant parameters that contribute to a global financial crisis. Preceding almost every crisis was a common belief that conditions at the time were unique, and that potential risks had been properly identified and managed. The repeated inability of governments and market participants to recognize analogies and precedents was so frequent that the authors name the blind spot the "this-time-is-different syndrome." The last observation leads the authors to suggest that it may be possible to have a systemic definition of crises that would improve our ability to recognize conditions and identify a potential crisis. We are left with the message that we must resist the temptation of "this-time-is-different syndrome" if we are to reduce the risk of future financial crises or better manage catastrophes when they do happen.

Paul Samuelson: On Being an Economist
Authors: Michael Szenberg, Aron Gottesman, Lall Ramrattan

Paul Samuelson, who passed away in December 2009, considered himself the last generalist in a discipline that was gradually taken over by specialists. As author of the bestselling economics textbook in history (Economics) and the first American to win the Nobel Prize in Economics, Samuelson maintained both a popular and influential platform to share his views and enthusiasm for neoclassical economics. Over seven decades, his research and ideas spanned trade, public finance, business trends and consumer behavior, and his early work on speculative pricing covered ground that eventually produced the efficient markets hypothesis in financial theory. The book traces Samuelson's career from his days as an undergraduate at the University of Chicago through his evolution as a theoretician, a magazine columnist and the architect of the influential economics department at Massachusetts Institute of Technology. The book balances personal anecdotes from his colleagues and students with a functional overview of his philosophies and the mathematical language of his methodology. Part of Samuelson's appeal was his spirited writing; his "wise sayings" and role as a mentor are also covered. Over 90% of Samuelson's published work appeared after he turned 50. This prolific output later in life is impressive, particularly in the sciences, where it is generally assumed that all of a person's good work takes place in their 30s. Shortly before his death at 94, the Annual Review of Financial Economics published Samuelson's brief memoir, An Enjoyable Life Puzzling Over Modern Finance Theory. In it, Samuelson wrote: "Because economic history at best obeys only quasistationary probabilities, no sure-thing formulas will ever be definable. Excess returns – excess 'alphas' – can result only from early new 'insider' knowledge, however acquired – legally or illegally." His oversized wit and intellect deserve to be remembered and celebrated. His legacy and engaging personality are documented in On Being an Economist, a concise profile of an original thinker and one of the great economists of the 20th century, and a fine starting point for practitioners and general interest readers.

When Zombies Attack!

THINK invites Algorithmics thought leaders to review titles of interest from their own bookshelves. This issue's contribution is by Dr. Andrew Aziz, Executive Vice President, Buy-Side Business.

In the movie Zombieland only a handful of humans remain to fight a world overrun by flesh-craving zombies. Boo hoo. The film takes a light-hearted if gory approach to surviving a zombie attack. For those craving a more intellectual perspective, a group from the University of Ottawa and Carleton University have come to the rescue with When Zombies Attack!: Mathematical Modelling of an Outbreak of Zombie Infection. In the article abstract, authors Philip Munz, Ioan Hudea, Joe Imad and Robert J. Smith? introduce a basic model for zombie infection, determine equilibria and their stability, and illustrate the outcome with numerical solutions. The model is then modified to introduce a latent period of zombification, whereby humans are infected, but not infectious, before becoming undead.

For more information on When Zombies Attack! and to download a copy, visit www.mathstat.uottawa.ca/~rsmith/

Zombieland images courtesy of Columbia Pictures Industries, Inc. All rights reserved.
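The basic model in the paper tracks susceptibles (S), zombies (Z) and removed (R). The sketch below integrates a simplified version of those SZR dynamics with Euler steps; the birth and natural-death terms of the paper are omitted, and the parameter values are invented for illustration, not taken from the paper.

```python
# Simplified SZR-style outbreak: susceptibles are bitten at rate beta*S*Z,
# destroyed zombies enter the removed class at rate alpha*S*Z, and the
# removed rise again at rate zeta*R. Parameters are invented for illustration.
def simulate(S=500.0, Z=1.0, R=0.0, beta=0.0015, alpha=0.001, zeta=0.05,
             dt=0.01, steps=10_000):
    for _ in range(steps):
        dS = -beta * S * Z                              # humans bitten
        dZ = beta * S * Z + zeta * R - alpha * S * Z    # new + risen - destroyed
        dR = alpha * S * Z - zeta * R                   # destroyed, awaiting rising
        S, Z, R = S + dS * dt, Z + dZ * dt, R + dR * dt
    return S, Z, R

S, Z, R = simulate()
```

Because the dead can rise again, the zombie-free state is unstable: with any positive infection rate the susceptible population is eventually wiped out, which mirrors the paper's "doomsday" equilibrium.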


The recent financial crisis has unambiguously demonstrated the importance of and need for effective firm-wide risk management that builds on responsive risk analysis and informative risk reporting. In a January 2008 interview with the Wall Street Journal, Merrill Lynch CEO John Thain explained one of the reasons why the company had suffered substantial losses in the previous year: “Merrill had a risk committee. It just didn’t function.” Thain further stated that once these and other changes had been made, “it [the risk committee] functions and functions across the business. So now when we have a weekly meeting, I show up, the head of fixed income and equities show up, and the risk heads show up,” and that potential risks had been properly identified and managed. Thain’s statement was made before the full impact of the financial crisis was understood. Yet it does not appear that additional hindsight has inspired institutions to take a deeper look at their risk practices. As a somewhat discouraging example, the Senior Supervisors Group re-iterated those concerns in their updated October 2009 report, “Risk Management Lessons from the Global Banking Crisis”: supervisors continue to see insufficient evidence of board involvement in articulating and monitoring institutions’ risk appetite more than two years after the onset of the financial crisis. Where such processes exist, they are often fragile and not actionable.

MUCH EFFORT, LITTLE BENEFIT

Financial institutions are making considerable investments in their risk infrastructure to allow for the comprehensive aggregation and monitoring of exposures across counterparties, businesses, risk strands and other dimensions to improve their overall risk capabilities and processes. Undoubtedly, these investments will help financial institutions advance the quality of their risk measurement. What is less clear though is whether those changes will truly improve corporate risk governance. It would be disappointing and in fact highly dangerous to the future stability of the financial system if the many improvements that have been recommended and are now being implemented ultimately fail to enhance risk governance in a meaningful way. Regrettably, such an outcome is entirely possible. There is one fundamental reason for my pessimism, which is insufficient focus on how boards and senior decision makers can make sense of the risk analysis and reports they receive in order to formulate actionable responses. Institutions face a difficult dilemma: they are asked by regulators and industry working groups alike to provide multiple relevant measures of risk “to develop a range of perspectives and consider a broad distribution of possible outcomes,” yet it may well be that enhanced transparency in the form of more and at times conflicting information will overwhelm the consumer of information. In other words, details may obscure the bigger picture.

A POTENTIAL OUT-OF-THE-BOX SOLUTION

Replicating portfolios, which have been adopted widely in the insurance industry as a computationally efficient and robust proxy for complex and unwieldy liability portfolios, can help the broader financial services industry resolve this dilemma. Properly constructed, replicating portfolios can provide a succinct and intuitive representation of large and multi-dimensional portfolios, allow for responsive, on-demand risk analysis, enable actionable hedging strategies and facilitate the articulation of a firm’s overall risk appetite.

SMALL IS BEAUTIFUL

The main idea is simple: replicating portfolios are hypothetical, small portfolios of financial instruments that have been constructed to have essentially the same risk characteristics in a risk model as actual large and complex portfolios over a range of scenarios and multiple time periods. In other words, they “replicate” or mimic the risk of the large portfolio for the purposes of risk analysis. Typically, scenario optimization techniques are used to identify effective replicating portfolios. The benefits of replication are directly related to the concise representation of large portfolios: interpretation, analysis, decisions and hedging all become much more straightforward when the modeled portfolio is a slimmed down version of an actual “enterprise” portfolio.

In its communication objectives the approach is similar to the concept of a “risk equivalent” position that has been popular since the 1980s. For example, complex interest rate books would have been expressed in two-year or five-year government bond futures positions with the same local interest rate sensitivity as the original book. Equity portfolios would have been expressed as beta-weighted positions in a general market index. In both cases, the objective was to find a simple and intuitive way to communicate risk information about complex portfolios, which is the same objective as in replication. In contrast to such earlier approaches, replication can capture a much richer set of risk characteristics, as the ability to identify a portfolio of replicating instruments offers greater flexibility for matching non-linear behavior over multiple scenarios and across several risk factors.

This discussion is focused on replication in the context of market-wide Black Swan events rather than idiosyncratic events. Where strategic risk management is focused on major threats at the enterprise level, for example the adequacy of risk capital in a market-wide Black Swan scenario, replication would be used primarily to ensure risk equivalence for a range of general market or systemic stress tests. Tactical risk management, on the other hand, would require replication that encompasses both systemic and idiosyncratic risks. The approach naturally extends to more granular risk analyses as well. The concept is particularly applicable to the risk analysis and management in the very banks that are strongly encouraged by supervisors to redouble their efforts to address critical areas for continued improvement resolutely.

SUCCINCTNESS AND INTUITION

Replicating portfolios use different semantics with different intuitive associations than traditional risk reports. Whereas traditional risk reports typically show the “worst case loss” at a certain probability threshold, for instance using VaR or shortfall measures, replicating portfolios translate risk exposures into succinct portfolios of well-understood plain vanilla instruments that support robust intuition.
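The scenario-optimization step that identifies a replicating portfolio can be illustrated with a minimal least-squares sketch. Everything below (the instrument count, the simulated scenario data, the plain `numpy` workflow) is an illustrative assumption rather than Algorithmics' actual methodology: given the scenario-by-scenario P&L of a large portfolio and of a few candidate plain vanilla instruments, choose weights so the small portfolio mimics the large one across all scenarios.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical inputs: P&L of a large "enterprise" portfolio and of four
# candidate plain vanilla instruments, evaluated over the same 1,000
# simulated scenarios. In practice these vectors come from the risk model.
n_scenarios = 1000
candidate_pnl = rng.normal(size=(n_scenarios, 4))
true_weights = np.array([2.0, -1.5, 0.5, 3.0])
target_pnl = candidate_pnl @ true_weights + rng.normal(scale=0.01, size=n_scenarios)

# Least-squares replication: pick weights that minimize the scenario-by-scenario
# tracking error between the small portfolio and the large one.
weights, *_ = np.linalg.lstsq(candidate_pnl, target_pnl, rcond=None)
tracking_error = np.std(target_pnl - candidate_pnl @ weights)

print(weights.round(2))   # recovers weights close to [2.0, -1.5, 0.5, 3.0]
print(tracking_error)     # small residual risk left unexplained
```

Matching P&L over many scenarios, rather than matching a single local sensitivity, is what lets replication capture non-linear behavior across several risk factors.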

Probabilistic assumptions have a great capacity to be misunderstood. Statements like, “You can expect to lose more than this amount less than once in a hundred years,” with all the assumptions that are left unsaid and the many incorrect inferences that can be made by non-experts, are replaced with a statement such as, “Fundamentally, you should think of your market exposure as a short call option on the equity market combined with a long commodity position and an interest rate spread position.” When markets are in free fall, the “short call” representation of risk tells us more than a VaR representation with probabilistic assumptions that have been overtaken by events does. Similarly, it is easier to explain risk exposures in terms of plain vanilla representative positions in, say, Eurostoxx or Euribor futures or options than to disclose detailed tables of deltas, gammas, rhos and other greeks or PV01s. Plain vanilla representations encourage intuition that is aligned with the underlying risk characteristics.

RESPONSIVE, ON-DEMAND RISK ANALYSIS

Running firm-wide risk analyses is a major computational challenge. Even with advanced hardware and risk architectures that have been optimized for performance, comprehensive risk reports can take several hours of simulation time on grids with hundreds of CPUs. This timeframe is not fast enough when risk management needs to be responsive to market shocks that may be developing rapidly throughout the day. Concise replicating portfolios allow for the rapid evaluation of the implications of changing market conditions and the assessment of alternative scenarios in an on-demand mode of operation. As turnaround times shrink from hours to minutes, replicating portfolios change a reactive reporting perspective that often lags events to a pro-active and interactive mindset that enables risk managers to participate with fresh and relevant information in the decision making process.

THE ART AND SCIENCE OF REPLICATING PORTFOLIOS

While replication has many potential benefits, it is not a panacea for reducing complexity and enabling effective risk management decisions based on informed risk assessment. It must be ensured that replicating portfolios are accurate and reliable. Algorithmics has published a number of papers that provide additional insight into the effective creation and usage of replicating portfolios.

The practice of portfolio replication: This excerpt from the Algo Risk Quarterly reviews the basic principles behind replication and several practical examples of its application. http://www.algorithmics.com/EN/publications/whitepapers

Building the Perfect Beast: In this article, the authors provide a detailed insight into the mechanics of building a dependable replicating portfolio. http://www.algorithmics.com/think/January09/beast.html

Strength in Numbers: In this article, the authors consider how employing a closely related series of replicating portfolios can benefit insurers. http://www.algorithmics.com/think/dec09/strength1.html

Replicating Portfolios in Algo Risk: This paper offers a high-level overview of the replicating portfolio construction process available within Algo Risk. http://www.algorithmics.com/EN/publications/whitepapers

HEDGING AND CAPITAL MANAGEMENT

Replication can be used to identify effective hedging strategies, especially in times of market turmoil. In periods of market stress, the desire for a perfect hedge that is finely tuned to each risk sensitivity of a portfolio quickly gives way to a desire to hedge “at least” some of the major exposures to key market factors. The request “show us the combination of five plain vanilla instruments that best represents the main risks in our proprietary trading book” is closely related to the question “out of the following ten highly liquid instruments, which five can we choose to hedge our exposure to a particular set of potential Black Swan events?” The methodology naturally accounts for liquidity constraints and transaction costs.

Replication under tight liquidity constraints is important when hedging large enterprise risks, as well as when hedging specific trading books in smaller and less liquid markets. This is particularly important when hedging strategies bump against liquidity constraints due to the absolute size of certain markets even in normal market conditions, but especially when conditions are not normal. For example, at the height of the recent market turmoil there were only a handful of hedging instruments globally that had meaningful liquidity. As another application of the same concepts, replication has also been used successfully to hedge bond trading exposures in certain local government bond markets with limited market depth. Replication can make the best of such constrained situations and can help in the quantification of trade-offs. Examples of trade-offs that need to be considered include the following questions: Is it better to buy more protection at a higher cost or less protection at a lower cost? How robust are different hedging strategies under different unexpected scenarios? Is it more effective to hedge with three highly liquid and transparently priced products or five less liquid and less transparently priced products?

Because the hedging strategies that replication produces are often non-intuitive to other market participants, they have the potential to change the dynamics between protection buyers and sellers. The buyer’s strategy is no longer obvious to the seller, which can make a critical difference to the buyer’s ability to hedge effectively and at a reasonable cost.

Further, replicating portfolios provide useful insights in the management of risk capital. Business and hedging strategies can be tuned to reduce or cap exposures to specific scenarios that drive risk capital consumption. The approach can distinguish naturally between acceptable risks under normal market conditions, which can remain unhedged, and unacceptable tail exposures which drive capital consumption and free up risk capital if reduced. Replicating portfolios can serve to highlight the key bets that have been taken and in this way complement other probabilistic risk measures or detailed sensitivity reports.

The approach can also be used to effectively reprice large portfolios during times of high market volatility, without waiting for a full revaluation of each individual position. As institutions monitor collateral thresholds, counterparty exposures, capital buffers and risk limits tightly and in near real time during periods of market turmoil, the ability to revalue a proxy portfolio within minutes can provide invaluable information to traders and risk managers alike.

A CALL FOR REVOLUTION, NOT EVOLUTION

Supervisors and government alike have stated repeatedly that a “revolution” (U.K. Financial Services Authority Chairman Lord Turner) and “fundamental change” (U.S. Treasury Secretary Timothy Geithner) are required to create a financial system that is truly robust and sustainable. As the Senior Supervisors Group noted in their recent report: “[We] remain unconvinced that firms are undertaking the full scope and depth of needed improvements. Specifically, certain gaps could potentially undermine the […] progress already made, if left unaddressed.” While these comments were made in the context of supervisory reform, they apply equally to the change in industry practice that is required to improve risk management substantively. In other words, incremental improvement along the same trodden path won’t do, and radically different approaches have to be considered.

One area where fundamental change is needed is in the delivery of complex information to enable the meaningful articulation of risk appetite and the effective oversight of risk by non-expert stakeholders. Communicating at an intuitive level is especially important when the target audience includes senior executives, board members or even other stakeholders who are not experts in finance or quantitative risk management. “More of the same”, in the form of expanded disclosure and deeper transparency, won’t suffice by itself. It needs to be complemented by succinct out-of-the-box communication methods that relate to the audience’s learned concepts and experiences in all of their natural richness and suggestive power. Replicating portfolios are innovative constructs that are well placed to rise to the challenge and provide actionable risk insights.
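The "out of ten liquid instruments, which five" question is a subset-selection problem. For a small candidate set it can even be brute-forced; the sketch below uses entirely hypothetical data, and plain least squares in place of a production scenario optimizer, to pick the five-instrument hedge that leaves the smallest residual risk.

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(1)

# Hypothetical scenario P&L for ten liquid hedge candidates, plus an exposure
# that is (by construction) driven by candidates 0, 2 and 5.
n_scenarios = 500
hedge_pnl = rng.normal(size=(n_scenarios, 10))
exposure_pnl = hedge_pnl[:, [0, 2, 5]] @ np.array([1.0, -2.0, 0.7])

def best_subset(exposure, candidates, k):
    """Try every k-instrument subset; keep the one whose least-squares
    hedge leaves the smallest residual risk across scenarios."""
    best = None
    for idx in combinations(range(candidates.shape[1]), k):
        x = candidates[:, idx]
        w, *_ = np.linalg.lstsq(x, exposure, rcond=None)
        resid = np.std(exposure - x @ w)
        if best is None or resid < best[2]:
            best = (idx, w, resid)
    return best

idx, w, resid = best_subset(exposure_pnl, hedge_pnl, 5)
print(idx)   # the winning subset includes instruments 0, 2 and 5
```

A real implementation would impose liquidity limits and transaction costs as constraints rather than enumerating subsets, but the trade-off being quantified, protection bought versus residual risk retained, is the same.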

SURVEY SAYS! COUNTERPARTY RISK AND THE EMERGENCE OF CREDIT VALUE ADJUSTMENT
by Jon Gregory

The market volatility experienced during the financial crisis has driven many firms to review their methods of accounting for counterparty credit risk (CCR). Credit Value Adjustment (CVA) offers an opportunity for banks to move beyond the control mindset of credit limits by dynamically pricing counterparty credit risk directly into new trades. As part of ongoing research, Algorithmics conducted in-depth interviews with a cross-section of financial institutions to gain insight into their approaches to emerging opportunities for CVA. THINK asked Dr. Jon Gregory, a consultant specializing in the area of counterparty risk, to summarize the interview results and share his analysis of evolving CVA practices.

“I think our CVA calculations follow an 80/20 rule in that 80% of the risk resides in the 20% of the products we don’t yet capture properly.”

The new millennium has been disastrous for derivatives and financial risk management. Some financial institutions have declined or failed, such as the high profile bankruptcy of Lehman Brothers, and even more would have folded were it not for government aid. As a result, institutions no longer take the concept of “too big to fail” for granted. CCR has rapidly become a problem for all financial institutions, big or small. The credit crisis brought CCR to the forefront of people’s thinking in a similar way that VaR did 13 years ago. Common products like credit derivatives, which were thought to be relatively straightforward, have been shown to contain huge elements of CCR. To address today’s considerable financial challenges, an area that needs particular and urgent attention is that of counterparty credit risk. Recognizing the source of CCR is a first step toward controlling risk in this changing environment.

When asked about CCR, the majority of interview subjects reported that the attitude in their institution had changed dramatically in the last two years. No respondents indicated that attitudes remained the same, and these results are hardly surprising. Although improving the measurement and management of CCR is critical, updating approaches and analytics can be part of a long-term overhaul, while collateral requirements can be tightened up almost immediately.

When asked which elements of CCR had become more important as a result of the recent crisis, collateral was the most common answer, with cash as a highly preferred form of collateral. In less liquid markets, institutions do not always get cash guarantees. Securities or bonds can present additional problems when used as collateral because of their complexity, sometimes to the extent that the product itself can be worthless. To reduce the chance of significant losses, many institutions indicated that they are only willing to enter into daily collateral agreements. Many counterparties, particularly corporates, cannot handle the operational workload of daily margining. It would appear that the era of banks calling every two weeks to review terms on collateral is, for the moment, behind us.

Most institutions (81%) cited interest rate products as contributing the most to their overall risk.

“Our credit derivatives counterparty risk was almost completely ignored prior to 2007; now it is the key focus.”

While interest rate swaps may carry a maximum exposure that is a small percentage of their notional value, the sheer size of that value suggests that CCR lurks beneath the surface. In contrast, foreign exchange and credit derivatives, which were ranked the second and third largest contributors to counterparty risk, carry a smaller notional exposure but can be more toxic in terms of their CCR. It could be argued that credit derivatives actually pose a greater threat than interest rates due to the hidden nature of the risk they contain. Therefore, that these products were selected as the top three responses indicates that institutions are looking in the right direction as they attempt to better measure and manage CCR.

A VALUE-BASED ADJUSTMENT

Setting limits against future exposures and verifying potential trades against these limits is the traditional approach financial institutions have used to control CCR. This practice is consistent with portfolio diversification and generally permits trades that moderately reduce or increase exposure, but this traditional approach risks rejecting trading opportunities with large exposures that exceed set limits. Credit Value Adjustment is the fair value price of CCR, or the expected loss arising from a future counterparty default. By fully accounting for the cost of carrying or hedging the risk, banks are able to price CCR directly into trades, which helps reduce the reliance on older practices. Firms that adopt CVA also gain a metric to measure trading desk performance, and can use CVA to create incentives for individuals and departments to choose the most appropriate trades.

Interview results indicated that most institutions currently use CVA for accounting purposes. Fair pricing of new trades was the second most popular response, followed by reducing reliance on credit limits and charging for unexpected losses. The first three answers are somewhat complementary. Many institutions are required to employ CVA for reporting purposes, and these same organizations benefit from the utility of CVA. These drivers can be seen as related as they both provide benefits, but to what degree is likely determined by the relative interests of each institution.

The frequency of calculating CVA provides additional insight into the motives behind its usage. Half the institutions surveyed indicated that they calculate CVA for all counterparties on a monthly basis. The remaining institutions were evenly split (25% each) into daily and pre-deal calculations. Combining this stated usage with the frequency of calculations suggests that interest in CVA is being driven by accounting rules, but that a trend is emerging to use incremental CVA for the fair pricing of new business. This answer demonstrated that institutions are gaining awareness of their true exposures.

SYSTEMS AND CALCULATIONS

CVA is traditionally defined as the difference between the risk-free and risky value of one or more trades.

Jon Gregory is a consultant specializing in the area of counterparty credit risk, previously working for Citigroup, BNP Paribas and Barclays Capital. His latest book, Counterparty Credit Risk: The new challenge for global financial markets (Wiley, 2009), has been described as “(a)n excellent practitioner’s guide and required reading for all risk managers.”
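Under the expected-loss view, CVA is commonly computed as loss-given-default times the sum, over time buckets, of discounted expected exposure weighted by the marginal default probability. The sketch below is a stylized illustration with made-up parameters (hazard rate, volatilities, two offsetting trades), not a production pricing model; it also previews why netting matters, since exposures in a netting set are offset against each other before being floored at zero.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical setup: two trades with one counterparty, quarterly steps to 5y.
n_paths, n_steps, dt = 20000, 20, 0.25
lgd, hazard, r = 0.6, 0.02, 0.03      # loss given default, hazard rate, rate
times = dt * np.arange(1, n_steps + 1)

# Simulated mark-to-market paths; the shared driver z makes the two
# trades partially offset each other.
z = rng.normal(size=(n_paths, n_steps))
mtm1 = np.cumsum(0.05 * z, axis=1)
mtm2 = np.cumsum(-0.03 * z + 0.04 * rng.normal(size=(n_paths, n_steps)), axis=1)

def cva(mtm):
    ee = np.maximum(mtm, 0.0).mean(axis=0)   # expected exposure EE(t)
    df = np.exp(-r * times)                  # discount factors
    pd = np.exp(-hazard * (times - dt)) - np.exp(-hazard * times)  # default prob per bucket
    return lgd * np.sum(df * ee * pd)

standalone = cva(mtm1) + cva(mtm2)   # per-trade CVAs, netting ignored
netted = cva(mtm1 + mtm2)            # one netting set: offset first, floor after
print(netted <= standalone)          # True: netting can only reduce the charge
```

Because max(a + b, 0) never exceeds max(a, 0) + max(b, 0), the netted expected exposure is bounded above by the sum of standalone exposures at every date, which is exactly why a counterparty's total CVA is not the sum of its per-trade parts.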

It is important to understand where CVA resides within the systems of an institution. Trading systems cannot easily be extended since the calculation of CVA is often an order of magnitude more complex than that of the underlying product. Arriving at a value for CVA is always more complex than valuing the underlying instrument, as, by definition, the more complex the instrument, such as in the case of interest rate swaps from a yield curve, the more complex the CVA becomes. The main challenge in computing CVA arises from the impact of netting, which means that the total CVA for a counterparty is not the sum of its parts (the individual transaction CVAs). This means that the incremental CVA of a potential new deal must be calculated with reference to all other existing deals that could be netted with this deal in the event that the counterparty defaults, so the calculation of a CVA incorporating netting becomes a multi-asset calculation. The components for a successful CVA system are scenario generation, pricing and valuation, aggregation and post trade processing, and databases.

Nearly half of respondents compute CVA through simple add-ons, such as lookup tables or formulas. A small number of around 20% calculate CVA more accurately on a product-by-product basis but still ignore netting. The remaining third utilize a full Monte Carlo system incorporating netting. Critically, only a quarter of all respondents indicated that they have the infrastructure to calculate incremental CVA on an intra-day basis (pre-deal CVA). One respondent made an interesting point, referring to an 80/20 rule with CVA: that 80% of the products are captured very easily but this only represents about 20% of the risk involved. This anecdotal response, taken with answers on frequency of pre-deal CVA calculations, suggests credit derivatives haven’t been properly incorporated into these systems because they have only become prominent over the last few years.

“The top priority is to have a real-time CVA calculation for all derivatives trades. To achieve this requires the implementation of a highly efficient Monte Carlo simulation that can calculate CVA numbers ‘on-the-fly’.”

Although CVA offers a more advanced approach to CCR than traditional credit limits, it does not seem to be replacing this more historical approach. This can be seen from the result that CCR pricing works exclusively on an expected loss basis, as less than 10% of participants reported that their CVA calculation includes a charge for unexpected losses.

WHO’S MINDING – AND HEDGING – THE STORE

When asked who in their organization is responsible for the management of CVA, a single front-office group was the most popular response (58%). Multiple groups were cited by 25% of respondents and the remainder indicated a single risk group. These responses are in keeping with current practices, as most large users of derivatives either already have CVA groups dedicated to controlling CCR for their business lines or are quickly developing such groups. These institutions, which tend to be big banks, have an individual running a CVA or CCR team from the front office to complement an environment of fairly active traders. Among less technically advanced institutions, it is more likely that a risk management group or an individual from a market risk or credit risk department is tasked with CVA management. As institutions become more sophisticated about counterparty risk, it is natural to centralize the management of CVA as a typical counterparty can be linked across numerous business areas and trading desks.

Some institutions manage CVA with risk management teams, while others have no single dedicated group for managing CCR. These specialized CVA groups, similar to front-office trading desks, are increasingly seen as being well positioned for such management. Establishing such a specialized group can add enormous value to an institution’s ability to manage risk. Not only can they improve the competitive advantage within transactions, but just as importantly, they can realize when it is best to walk away from a transaction with another counterparty. CVA desks can charge all risk takers consistently for the incremental risks they add and are therefore able to manage the overall volatility of CVA within the institution. Furthermore, CVA desks can also increase the level of business with a reliable counterparty and reduce concentration risk by diversifying credit exposure.

Since CVA is presented as a price for CCR, it is natural to ask what the associated “hedge” is. Hedging CCR poses many challenges due to the number of market variables involved and the linkages between them. CVA desks effectively run multi-asset books and are therefore exposed to many underlying risk factors and their cross-dependencies. Without the ability to hedge, an institution’s total CVA may exhibit severe volatility and potentially lead to large losses. The most common instrument used to hedge CVA is single name credit default swap (CDS), cited by just over half of respondents, and CDS does offer the most logical protection. Other instruments that received multiple mentions are futures/forwards, CDS indices, options/swaptions, correlation products and contingent credit default swaps (CCDS). The key issue for CCR is what might happen if someone defaults or their credit quality deteriorates, however the range of instruments from these answers makes sense. Some participants suggested that hedging was active and continuous with no significant residual risks, a qualification that is not surprising since no trader likes to brag about their unmanaged risks. To some extent, this volume of hedging may be inflated because people understand that CCR should always be hedged, not that it always is. The credit market is a relatively crude asset class and it is impossible to hedge it extremely well – it’s just not liquid enough.

THE FUTURE OF COUNTERPARTY RISK

Institutions were asked to choose the most critical aspect of CCR going forward from a short list of choices. Wrong-way risk received the most responses (just over 50%), followed by collateral, more toxic areas and exposure generation. Interestingly, central counterparties, which are often proposed by regulators as “the solution” to counterparty risk, received the fewest votes. Despite the enthusiasm of regulators, institutions view central counterparties as a solution that re-introduces the concept of “too big to fail” and may end up creating its own set of problems.

In the short term, an increasing number of financial institutions are taking steps to control wrong-way risk and tighten collateral management parameters in an effort to reduce exposure and increase profitability. In the long term, major institutional developments are underway to enhance existing CVA systems or to design new ones with the capacity for real-time calculation, incorporation of debt valuation adjustment and better handling of credit derivative products. This is an indicator that, in the coming years, the implementation of in-house and third-party CVA systems will clearly be a key objective for banks and other financial institutions.

Firms that are facing issues of CCR have many options available to manage their overall exposure. Without these options, enterprises may find themselves severely limited in the type and amount of transactions they can take and the counterparties with which they can trade. The key to remaining competitive in a changing environment is intelligent planning and decision making at all levels.

> VISIT THINK ONLINE TO READ THE CVA WHITE PAPER IN ITS ENTIRETY

Algorithmics’ white paper, Credit Value Adjustment and the changing environment for pricing and managing counterparty risk, contains detailed insights into how CVA is currently being measured, where CVA fits into institutions’ systems, and how CVA practices are expected to evolve. To access your copy today, visit THINK online at: www.algorithmics.com/think

WHEN 99% IS NOT ENOUGH: EXPLORING THE GAP BETWEEN CONFIDENCE AND CERTAINTY
by David Bester & Michael Zerbs

In the world of financial engineering, risk measures are used to estimate the probabilities of unexpected outcomes. VaR is commonly utilized to calculate the worst loss an institution can experience within a certain timeframe up to a confidence level of 99%. For some businesses however, a measure that only covers 99% of any variable is simply unacceptable. THINK takes a closer look at the demolition and nuclear power industries to understand how risk is managed when anything less than a perfect outcome can be a catastrophe.
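The 99% boundary is easy to make concrete. The toy calculation below (an assumed normal P&L distribution, in arbitrary units) shows both what a 99% VaR states and what it leaves out: roughly two to three exceedances in a 250-day trading year, with no statement about how bad those days are.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical one-day P&L outcomes (standard normal, arbitrary units).
pnl = rng.normal(size=100_000)

# 99% VaR: the loss threshold exceeded on only 1% of days.
var_99 = -np.percentile(pnl, 1)
print(round(var_99, 2))            # about 2.33 for a standard normal

# Confidence is not certainty: VaR says nothing about the size of the
# losses on the 1% of days beyond the threshold.
tail = pnl[pnl < -var_99]
expected_shortfall = -tail.mean()  # average loss on the worst 1% of days
print(expected_shortfall > var_99) # True: tail losses exceed the VaR
```

For the industries profiled below, it is precisely that unbounded tail, not the 99% that VaR covers, that defines the risk problem.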

CONTROLLED DEMOLITION

When a financial institution is destroyed, something terrible has happened. When Controlled Demolition, Inc. (CDI) destroys something, it means everything has gone according to plan. CDI is an American company that demolishes structures with the kind of precision and planning usually associated with their creation. For over 60 years, 3 generations of the Loizeaux family have created a commercial explosives demolition industry through innovation, expertise, leadership and a methodology designed to guarantee complete predictability. This history includes the implosion of the Kingdome in Seattle, which holds the Guinness World Record for the largest structure by volume ever felled with explosives.

“Once we decide that we can safely perform a project from a technical standpoint, the first risk we manage is that of our client,” says CDI’s President, Mark Loizeaux. “We need to understand their perspective, their position in the industry and what they have to lose. Our solution to their problem needs to embrace those points, as well as those related to CDI’s scope of work and task at hand. We become the core clearing house for decision making,” observes Mark. “If we aren’t permitted this role, then we aren’t interested in the project.”

CDI takes this position because under common law in the United States, explosives-handling operations fall under strict or absolute liability. This means that operators are not considered innocent until proven guilty in the court’s eyes. Rather, they are guilty until they can demonstrate their innocence. This puts CDI at extraordinary risk with regard to litigation, as the company’s insurance and reputation are first in line for claims.

CDI begins with engineering to see if the numbers match the company’s intuitive analysis for the likelihood of success, based on their experience in the demolition of thousands of structures. Not everything CDI does can be proven mathematically. While construction disciplines are taught in universities and there is a well-documented history of how to control a design/build process when a structure is erected, the same can’t be said for taking a structure apart. This is largely because the data on what actually exists in older, fatigued structures is too uncertain: even with a set of original, as-built blueprints, a finite structural analysis would need to be performed to fully trust the data plugged into structural engineering formulas. “In order to achieve this level of trust, we would have to de-build the structure, and then there would be no need to implode it!” Mark points out. Once this is done, the next step is to break the demolition down into a series of sequential tasks, with critical path management at each level, to ensure absolute control over the project’s successful outcome. This requires tremendous experience, extraordinary observation and management skills, and the ability to communicate clearly with not only the client, but also every single team member involved.

EXPLOSIVE RELATIONSHIPS

Historically, the relationship between regulators and the demolition industry has been a tenuous courtship, and CDI has maintained it through innovation, communication and performance. “Regulatory agencies in our field don’t know as much about what we are doing as we do,” states Mark. “Regulators are accustomed to industries that rely on mathematical analysis and computer programs memorialized in technical publications and books,” and there is no large body of data or industry-sponsored groups to vouch for the data. As a result, regulators tend to be more reactive than proactive, particularly with regard to new and cutting-edge concepts, and show up once a problem has been identified at a demolition project. The National Demolition Association spends a great deal of money to educate regulators and maintain clear lines of communication.

As a result, regulators tend to rely more on the individual reputations and track records of companies requesting permits. This is why CDI treasures and protects its reputation so aggressively. Defense of this reputation extends to the company’s work as consultants. “The problem with many consultants is that they prognosticate from lofty ivory towers with either no hands-on experience or, if the consultant is long retired from the demolition industry, outdated data. A consulting opinion without a commitment to stand behind same is almost useless,” says Mark. To maintain its reputation and support its mandate to understand client needs, CDI offers to perform the work at a future date, up to five years out, for every consulting project it handles. With this cost certainty in place, a developer wanting to borrow $100 million from a major U.S. bank for a new development can move forward knowing they can rely on a fixed price from a solvent, well-insured, internationally-recognized demolition company.

The Kingdome’s unique post-tensioned construction to support the largest concrete dome on earth necessitated a whole level of dynamic control never needed on previous CDI projects. CDI needed to create controlled venting of millions of cubic feet of air from the structure to safely bring it down while preventing damage to 100-year-old historic structures directly across the street.

A POWERFUL REACTION

Gee Sham is a Senior Engineer in the Canadian nuclear industry. Probabilistic risk assessment and project risk management are the two main approaches used by the nuclear industry. “A safety system in a nuclear power plant is in place to prevent a core meltdown,” says Gee. Having a safety system is insufficient: it must be able to actuate in response to a large coolant loss. These standards are in place to prevent an incident like the Three Mile Island accident, where a partial core meltdown resulted in the release of a substantial volume of radioactive gases in 1979.

For any operational system, the target for safety system unavailability is generally between six and eight hours per year. If only 99% system availability were achieved, that facility’s license would be taken away. In the nuclear industry, the concept that 99% is not enough is often used in seminars and presentations.

“A fault tree analysis is a requirement for regulators,” states Gee. “We are required to model every component down to the level of every single switch and relay, or even a spring within a switch, that could contribute to a system failure. We then assign a probability of failure to each individual component, which forms the fault tree.” Every piece of equipment that has the potential to fail must be modeled probabilistically. These elements form the industry’s probabilistic risk model, which dictates how testing, maintenance and replacements are performed. Within the tree framework, engineers know whether an individual component needs to be tested on a daily, weekly or monthly basis.

He offers the example of a pump, which is vital to coolant injections. “The pump can fail, so we need to identify what could lead to this failure.” One way to mitigate a pump failure is to install a secondary pump. But this might also fail. So we install a third pump. Channelization and duplicate redundancies are used to further improve system reliability. “We use sensors to detect low pressure in the Primary Heat Transport system,” says Gee. “The system requires one sensor, but what happens in the case that this one sensor is not operating properly? We address this variable by using three sensors.”
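The redundancy logic Gee describes, pumps behind an AND gate and three pressure sensors voting, can be sketched as a toy fault-tree calculation. This is a minimal illustration rather than plant software: the failure probabilities, trip threshold and sensor readings below are all invented for the example.

```python
# Fault-tree sketch of the redundancy logic described above.
# All probabilities and thresholds are invented for illustration.

def and_gate(probs):
    """Failure requires ALL inputs to fail (independent basic events)."""
    p = 1.0
    for q in probs:
        p *= q
    return p

def or_gate(probs):
    """Failure occurs if ANY input fails (independent basic events)."""
    p_ok = 1.0
    for q in probs:
        p_ok *= 1.0 - q
    return 1.0 - p_ok

# Redundant pumps: the coolant-injection function is lost only if every
# pump fails, so each added pump multiplies the failure probability down.
PUMP_FAIL = 1e-3                       # assumed per-demand failure probability
one_pump    = PUMP_FAIL
two_pumps   = and_gate([PUMP_FAIL] * 2)
three_pumps = and_gate([PUMP_FAIL] * 3)

# Two-out-of-three sensor voting: a low-pressure trip is treated as real
# only when at least two of the three channels agree, which tolerates one
# silent sensor and also suppresses a single spurious channel.
TRIP_THRESHOLD = 9.0                   # hypothetical trip point, arbitrary units

def two_out_of_three(readings, threshold=TRIP_THRESHOLD):
    votes = sum(r < threshold for r in readings)
    return votes >= 2

spurious_channel  = two_out_of_three([8.5, 9.6, 9.7])  # one low reading: no trip
real_low_pressure = two_out_of_three([8.5, 8.7, 9.7])  # two channels agree: trip

# Why "99% is not enough": a six-to-eight-hour annual unavailability target
# is roughly 99.92% availability, versus about 88 hours per year at 99%.
target_availability = 1.0 - 7.0 / 8760.0
```

The point of the sketch is the multiplicative effect: each independent redundant pump drives the AND-gate failure probability down by orders of magnitude, while the voting logic buys tolerance to a single failed channel in either direction.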

“Channelization also reduces the possibility of generating a spurious signal, because it requires two-out-of-three channel logic to make a true signal. Instead of one pump, we will have two or three in operation. Through this process we have protected ourselves against potential channel rejections,” explains Gee.

Taking a reactor offline is expensive, so maintenance is scheduled when demand is low. To minimize the risk that a project runs over time, repairs and maintenance variables are addressed in great detail. If a maintenance or refurbishment project involves opening up a major piece of equipment, the potential to encounter unforeseen problems must also be taken into account. Additional parts and engineering support must be secured on a cost/benefit analysis, as acquiring additional spares and maintaining engineering resources on standby for every conceivable situation is not feasible. All risks of overrunning the schedule and exceeding the maintenance budget are logged through a risk register, which ties realistic numbers to specific components based on operating experience.

SHARED BORDER, DIFFERENT APPROACHES

There is a marked difference between nuclear oversight in the U.S. and Canada. In the U.S., the Nuclear Regulatory Commission specifies how each utility must operate in each power plant. The risk of each component is dictated by the agency and compliance is enforced. Canadian regulators, who oversee a smaller number of plants that operate in different fashions, provide directives and it is up to the Canadian plants to demonstrate compliance. “Let’s say there is one type of valve that is known to be problematic,” explains Gee. “The manufacturer sends out a bulletin and the U.S. regulator would specify what should be done. In Canada the regulator would ask, ‘What are you going to do to demonstrate that this is under control?’”

Operators on both sides of the border share one goal: a target of Zero Defect and Zero Tolerance. This extends to accident rates as well. “We have achieved 6.5 million hours without any lost time accidents, and this applies to cut fingers, back injuries and anything that is an outcome of work-related injuries,” Gee relates of his plant. “We would normally complete between two and three million hours without an incident, so we have achieved well above 99% in this area.” Considering the volume of people working 40+ hours per week, those results are quite an achievement. But the target remains zero. Most nuclear and utility-related industries are on this track.

PARALLEL LINES

How can regulators acquire higher quality information to make better assessments of the true risks facing institutions? The issue of regulators in bordering countries bringing different approaches to industry oversight was referenced for the nuclear industry. An analogy can also be drawn between the concept of guilty until proven innocent cited by CDI and the growth in investor suitability testing in financial services. Prior to the credit crisis, the onus was generally on corporate or individual investors to ensure that they understood the risks of financial products, subject to fairly basic risk disclosure requirements. Today investor suitability testing is being demanded in much more depth, an indication that the industry has moved from a position that things are fine until there is a problem to a more proactive stance. The Markets in Financial Instruments Directive (MiFID) and the latest update to Undertakings for

Collective Investment in Transferable Securities (UCITS IV) attempt to provide harmonized regulations for investment services across various borders of the European Union. These principles seek to deal with the European problem of connecting common passport rules with different perspectives on the nature and objectives of regulation, which can vary from country to country. Under common passport rules, Iceland’s Icesave bank was allowed access to the British market without much oversight by the FSA. More than 400,000 depositors from the UK and the Netherlands deposited funds into Icesave’s high interest accounts, but the credit crunch exposed the failings in the Icelandic banking model. It became clear that oversight by the Icelandic regulator was insufficient. Icesave is a good example of where innocent until proven guilty would have been the standard approach at one time, while today the host regulator would probably insist on its supervision complementing the home regulator’s supervision to a much greater extent.

One outcome is clear from all three industries: if regulators become aware of a problem only once something has blown up, they have become involved too late.

So how much redundancy is needed? One approach is to increase the availability and quality of capital and liquidity in case there is a large problem. Financial institutions need enough liquidity and risk capital to withstand large losses, and must have the additional liquidity and loss-bearing capacity to carry on with their business even when they can’t raise more capital and their balance sheet has large positions in illiquid and complex assets. With this additional liquidity in place, firms can maintain full market confidence. Many banks faced this problem during 2008 and 2009, when they had enough capital to make up for the initial wave of losses but were perceived to be left with insufficient funds to run a business or protect the bank in the event of a second wave.

A complementary approach, similar to the concept of redundancies in nuclear engineering, is to utilize multiple redundant measures to better understand where problems could develop early on, even if any specific model fails. A rigorous and disciplined approach to the implementation and interpretation of several redundant measures is one way in which the financial risk community could learn from other disciplines.

TOP TO BOTTOM, BOTTOM TO TOP

Within risk management and finance there is a debate about whether to go top down or bottom up: is it good enough to have some high-level assumptions about how the system in the aggregate behaves, or do you need to look at each individual position and model it? One luxury the physical sciences have is that they are typically modeling processes with stable, well-defined properties, such as the pressure required to cause concrete columns to collapse. In the social sciences, models are built to approximate how the human mind works, but we don’t understand human behavior to the same degree as we understand the physical properties of steel. Trying to build a set of models with stable statistical properties over a baseline of assumptions is a more complex proposition. And how can an organization know with certainty how much capital is enough when there isn’t enough certainty that its risk model captures the relevant range of outcomes effectively? To make matters more challenging, what is enough during normal market conditions often isn’t enough during periods of turmoil.

NORMALIZATION OF DEVIANCE

About a year before the credit crisis a small episode took place in the CDO market. There was a worry that the debt ratings on Ford and GM would be dropped below investment grade, causing a massive realignment of correlations. Some instruments were subsequently priced far outside of what the models had predicted. For a few weeks the market was quite concerned about these implications, but the expected downgrade did not occur and the market moved on. Since no bank went under in these transactions and markets returned to normal, the models were not examined further. One possibly harmful outcome from the continued use of these models was that a false sense of security took hold. Periodically situations like this occur where a small market disturbance suggests an underlying issue of model failure, but it does not lead to a change in behavior.

Gee cites an example from the aerospace industry that underscores the potential damage a false sense of security brings. “NASA was using foam insulation to protect space shuttles from the heat caused by reentry into the earth’s atmosphere. The foam was known to come loose and damage the craft’s thermal protection system. This had happened four or five times and NASA management was aware of it.” In the nuclear industry this phenomenon is referred to as normalization of deviance: a deviance occurred and became accepted as normal, until an eventual catastrophe occurred and a shuttle disintegrated during reentry. A similar normalization of deviance took place during the Three Mile Island accident, when a control panel light was known to malfunction. The operators got used to it and thought it was okay, but accepting this scenario as normal ultimately became a contributing factor in the reactor’s meltdown, as that light was designed to convey important information.

It’s easy to accept small modifications in protocol, but the stakes are too high to allow any deviance to go unchallenged. “Once we start believing that this deviance will lead to predictable behavior it is the same thing as waiting for an accident to happen,” says Gee. “No matter how small the problem appears to be, it must be reported and it must be reviewed. Identify deviations. Challenge them.” If you give high-risk mortgages and accept that they should be treated the same as other mortgages because everyone else is doing it, or because you believe the housing market is on a continual upswing, you have accepted deviance as normal, a behavior that practitioners in social and physical sciences must seek to avoid.

The financial industry accepts a certain risk tolerance. What practitioners can observe, and perhaps learn from how this tolerance is defined in industries with physical properties, is that operating with a 1-in-1,000 chance that unemployment doubles and the budget deficit goes up to 12% is not good enough. On a relative basis we should try to rethink what an acceptable risk tolerance is and try to move toward that line.

As a final question, Gee was asked what the financial industry could learn from his work. “When I go into a reactor I take a gamma meter, and if that meter goes off I back out even if everything appears perfectly normal,” says Gee. The same question was asked of Mark at CDI, who had a slightly different take: “I think it is more appropriate to say that what I learned from the financial industry is what I apply to the demolition industry: never risk more than you’re prepared to lose.”


What is the best-practice approach for implementing a Solvency II solution? This question may apply to all European Union insurers, but the answer depends on the preferences of the individual institutions. In this article, the authors consider four perspectives on Solvency II, representing different entry points and levels of preparedness, to help insurers determine where to focus their regulatory energies.

Under Solvency II, insurers must implement a risk-based system that aligns capital requirements with the true underlying risks of the company. Unlike other regulations, Solvency II’s principle-based system requires companies to take ownership of their positions and be able to justify them under regulatory scrutiny. This form of compliance will require significant rethinking by institutions of how they treat risk. It will affect elements as varied as how they consistently consider mismatched risk, establish product pricing principles for guarantees, monitor the risk being taken by asset managers, communicate both product benefits and product risks to customers, and acquire, audit and utilize data.

Solvency II will be a significant undertaking for all insurers, even for the institutions that are already valuing liabilities in a market-consistent manner. Many multinationals have made significant advances in their risk measurement methodologies over the past few years. For larger firms, the process of valuing their liabilities in a market-consistent fashion has been underway for many years, and these newly established processes will be a useful step towards a Solvency II implementation. This is particularly true in the U.K., where historic regulatory practice has required companies to invest a great deal of time and effort in the construction of valuations under both base and stressed scenarios. However, the scope of Solvency II goes well beyond existing capabilities.

#1: VISION

The first step for any organization is for senior management to create and articulate a clear vision. Is Solvency II solely about calculating metrics, such as the Solvency Capital Requirement (SCR), or does it also involve a deeper re-engineering of key business processes? A successful vision must address how risk will be measured and managed in a holistic, consistent way across all of the organization’s business. The vision will have implications that go beyond systems and technologies: senior management decisions, from high-level strategy down to a granular level of calibrations and approximations, must be reflected across the organization. Because it is unreasonable to expect any one individual to understand or oversee the detail for all the moving parts, the salient information must be distilled and analyzed in a manner that allows strategic, competitive and business decisions to be more easily embedded in all facets of an organization.

Two determining factors for creating a vision are the size and resources of an organization. It is likely that smaller or less complex firms that either do not want to, or cannot, re-engineer their processes will try to steer themselves toward a compliance-based solution. Firms that fall in the middle may have the most options and choices to make. Their decisions will ultimately reflect a balance of cost and internal resources against the commercial or management benefits that can be derived from the solution they select. It will be interesting to see whether evolving solutions are utilized to address common standards or support a wider range of diverse approaches.

Viewing Solvency II as an opportunity, and not merely an exercise in regulatory compliance, provides organizations with a greater degree of flexibility in how their risk appetite will be defined. However, this approach also demands a greater level of responsibility to ensure this appetite is well understood, observed and managed. Solvency II solutions are often linked into enterprise risk management (ERM) frameworks, and at an ERM level this can present a significant challenge because of the scope of the problem and the number of integrated and interdependent components. An ERM framework with the capacity to adapt to evolving standards and

strategic plans is the most reliable way for insurers to support their vision and ensure that compliance and governance can be managed and demonstrated throughout their organization.

#2: THE STANDARD FORMULA VS. INTERNAL MODELS

Solvency II offers insurers the option of using a Standard Formula or an approved internal model to calculate the SCR. The Standard Formula has advantages that are easier to communicate and may be more readily apparent than those of the internal model. Implementation will require less time, less work, less expertise and a lesser investment. Because the regulator provides the formula, the project risk of acceptance is virtually non-existent. The Standard Formula will also provide faster insight into initial capital requirements, and may make preparing for a “known” value easier than preparing for an “unknown” value. The general nature of the Standard Formula may also help by preventing insurers from regarding formula results uncritically or treating them in a “black box” fashion. In short, the Standard Formula is easier and possibly less costly to implement, but it may require additional management time and may negatively affect market perceptions about an organization’s commitment to understanding its risks.

Internal models promise a greater range of advantages to insurers. They are more complex and require a higher up-front investment, but are expected to deliver long-term capital benefits and help embed risk awareness throughout an organization’s business. It is generally accepted that internal models will track risk more accurately, possibly leading to a reduction in capital requirements. Lower capital requirements lead to a host of benefits, including a lower cost to acquire capital, a competitive advantage through more accurate pricing of products, the opportunity to deploy capital more effectively throughout the organization, and the reputational rewards of demonstrating transparency to investors and regulators.

One barrier for firms considering an internal model approach may be the technological requirements. An internal model is essentially a huge Monte Carlo simulation that requires precise data management and the processing of tens of thousands of scenarios. This approach has historically been time consuming, expensive and computationally demanding, but new options, such as curve fitting and replicating portfolios, offer dramatic performance improvements at a more acceptable cost.

Curve fitting is a technique to approximate the value of an instrument over scenarios based on a limited amount of data. It enables insurers to value a liability using a formula that calculates values of the instrument over a fitted curve, taking into account the values of other risk factors. Under this approach, Monte Carlo or stochastic scenarios are not required for recalculating the market-consistent valuation of liabilities.
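The curve-fitting idea can be sketched in a few lines: run the expensive valuation at only a handful of risk-factor levels, fit a low-order polynomial through the results, and revalue every scenario from the curve. The liability function, fit points and polynomial degree below are invented for illustration; a real implementation would fit against a firm's actual stochastic valuation output.

```python
import numpy as np

# Curve-fitting sketch: a cheap fitted curve stands in for the expensive
# stochastic valuation. full_valuation() is a made-up placeholder for a
# real market-consistent liability model.

def full_valuation(rate):
    """Stand-in for an expensive stochastic liability valuation."""
    return 100.0 / (1.0 + rate) ** 10  # e.g. a 10-year discounted cash flow

# 1. A limited number of full valuations (the "limited amount of data").
fit_points = np.array([0.00, 0.02, 0.04, 0.06, 0.08])
fit_values = full_valuation(fit_points)

# 2. Fit a low-order curve through them.
curve = np.poly1d(np.polyfit(fit_points, fit_values, deg=3))

# 3. Revalue thousands of capital-requirement scenarios from the curve;
#    no Monte Carlo runs are needed at this stage.
scenarios = np.random.default_rng(0).uniform(0.0, 0.08, 10_000)
approx_values = curve(scenarios)

# How closely the cheap curve tracks the full model on this toy example.
max_error = float(np.max(np.abs(approx_values - full_valuation(scenarios))))
```

The trade-off is the usual one in proxy modeling: a handful of expensive valuations buys a curve that prices any scenario almost instantly, at the cost of an approximation error that must be validated against the full model.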

It seems clear from Solvency II’s focus on risk-based systems that insurers are being encouraged to go the route of internal models, and many larger institutions have already committed significant resources to this path. For organizations that deal exclusively in vanilla products, the decision to use the Standard Formula will be a natural one. Institutions that find themselves between these two positions will have to weigh their needs against the known and hidden costs of their choice. These organizations may find that new techniques like curve fitting and replicating portfolios make internal models a more attractive option than previously thought.

#3: THE USE TEST

For firms choosing a partial or full internal model approach, the Solvency II Use Test promises both technical and cultural challenges. According to the requirement, “the internal model is widely used and plays an important part in [the insurer’s] system of governance.” The Committee of European Insurance and Occupational Pensions Supervisors explicitly lists which uses must be incorporated: from pricing a product to reinsurance, mergers and acquisitions, performance measurement and analysis, performance management, portfolio management and business planning. Many of the items on this list entail capital allocation or an equivalent form of risk-adjusted performance measurement that will extend the requirements of the internal model. For example, incorporating support for business and strategic planning will introduce a multi-year projection requirement that goes far beyond the base internal model requirement of just establishing a current valuation capability. New techniques, such as curve fitting and replicating portfolios, can support these additional requirements and make a comprehensive internal model a practical proposition.

Under Solvency II, there will also be an increased demand for historical data to support SCR calculations and model verification. While actuarial teams have traditionally searched for data on an as-needed basis, this method will no longer be feasible. Insurers will be required to demonstrate how data has been reconciled from multiple sources into an approved analytical process, and a similar standard of ownership will need to be demonstrated at every point where data is captured, stored or analyzed. Strict governance is also required to ensure that the data platform is up to regulatory standards. Bringing this level of consistency into practice begins with deriving data from models at a speed sufficient to provide decision makers with relevant, accurate information for effective management decisions.

Portfolio replication is another technique that can be used to reduce model run time. A replicating portfolio is a portfolio of financial instruments chosen to match a portfolio of insurance liabilities as closely as possible. Once derived, replicating portfolios mimic the risk of the larger portfolios for the purpose of risk analysis. These smaller replicating portfolios can be quickly revalued using analytical models, eliminating the need for stochastic pricing models and therefore enabling faster response times for risk monitoring.
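Deriving a replicating portfolio can be framed as a least-squares problem: choose weights on a small set of candidate instruments so that the candidate portfolio's value matches the liability's value across scenarios. Everything below, the candidate instruments, the scenario generator and the "true" liability, is synthetic and for illustration only.

```python
import numpy as np

# Replicating-portfolio sketch: fit instrument weights so the candidate
# portfolio tracks the liability's scenario values (least squares).
# Scenario values here are synthetic stand-ins for real model output.

rng = np.random.default_rng(1)
n_scenarios = 1000

# Candidate instruments, valued in each scenario:
# a 10-year zero-coupon bond, an equity index holding, and a put option.
rates = rng.uniform(0.0, 0.08, n_scenarios)
equity = rng.lognormal(0.0, 0.2, n_scenarios)
zcb = 1.0 / (1.0 + rates) ** 10
put = np.maximum(1.0 - equity, 0.0)
candidates = np.column_stack([zcb, equity, put])

# Synthetic liability to mimic: behaves like a bond plus some equity
# exposure and downside protection, observed with a little noise.
liability = (80.0 * zcb + 15.0 * equity + 5.0 * put
             + rng.normal(0.0, 0.1, n_scenarios))

# Least-squares fit of the instrument weights to the liability values.
weights, *_ = np.linalg.lstsq(candidates, liability, rcond=None)
replicated = candidates @ weights

# Root-mean-square mismatch between the replica and the liability.
tracking_error = float(np.sqrt(np.mean((replicated - liability) ** 2)))
```

Once the weights are known, the small candidate portfolio can be revalued analytically in every new scenario, which is exactly the run-time saving the article describes; the tracking error is the quantity that has to be monitored to show the replica still mimics the liability.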

The title – Use Test – is quite literal in meaning: management will need to demonstrate to regulators that data has been used and tested in an appropriate and consistent fashion, and that results gained from internal models are actively used to make real business decisions. Once this data is acquired in a timely, auditable fashion, it is expected to support risk-informed decision making. For those firms who go through the steps to engineer internal models, these obligations will have been fulfilled. If the proper due diligence is not undertaken, insurers will find meeting the Use Test to be a cumbersome process. But will board members have enough insight into the models and data to effectively challenge results or rely on them to make strategic decisions? Will definitions of risk be clearly understood and shared across multiple business lines, regions and geographies? For these issues to be solved effectively, and to ensure that the spirit and details of the Use Test are met, firms will have to determine what shape future processes will take and the cultural implications they may present.

#4: IMPLEMENTATION AND BEYOND

With a vision, model determination and usage process in place, insurers still face a number of questions regarding the operation of their Solvency II framework. Dave Matcham, Chief Executive of the International Underwriting Association, believes that modernization and Solvency II are connected

issues: “On the face of it they are separate initiatives – one is regulatory, the other technology and process change… however I believe that they are not mutually exclusive.”

Insurers must determine how to implement the components required to address Solvency II relative to existing systems and processes. It may be tempting to rip up existing systems and start from scratch – working with a brand new system designed to connect all aspects of an organization has its advantages – however this approach will likely generate more pain and expense than can be justified in the short term. Instead, organizations seem to be trending toward creating a Solvency II system and wrapping it around existing processes. Determining what to keep and what to dispose of, what should link into this new layer and what must be treated as a standalone application, presents a significant challenge that management must be able to share internally, with vendors and with solutions providers.

The joint aspects of technology and process change highlight one of the main differences between Solvency II and the Basel II process experienced by the banking sector. When Basel II requirements were unveiled, there were fewer and more limited existing processes. In the European Union, many of the larger insurance companies have already been moving to value their liabilities in a market-consistent way. What the directive provides is the requirement to extend these valuations from limited usage to the entire organization.

The need to properly implement a Solvency II solution is crucial for firms of all sizes. Larger firms that choose a more customized approach to Solvency II will not only have to defend their position to regulators, but also live with it as part of their organizational DNA. Smaller firms that rely on the Standard Formula will still have to prove they are embedding processes and methodologies into their overall business. Organizations must be prepared to accept that certain challenges and costs will only emerge during the implementation phase.

Prior to the acquisition of an ERM system, an organization would rely on individual people and groups who understood each part of the system and had an intuitive understanding of whether things were working as expected. With an enterprise-wide system, having one person who understands the entire process is unrealistic. A clear methodology must be in place to ensure the integrity of data gathering and sharing, and the scope of the system – the “whole is greater than the sum of the parts” aspect of ERM – requires a series of audits and checks. Ensuring that the time, resources and understanding of this phase are in place is required to make sure that the systems speak to each other, people are speaking to each other, and all efforts are directed toward a commonly understood goal.

Only by ensuring that the vision, models and usage are properly implemented can insurers benefit from more accurate SCR calculations and the insights afforded by a risk-based system.


In the aftermath of the financial crisis, a general consensus emerged among business leaders and lawmakers that reforms were necessary. More than a year after the near collapse of the financial industry, the nature of these reforms is coming into focus. The U.S. House of Representatives passed a bill in December 2009 designed to impose more oversight and stronger capital requirements on large American banks and investment firms. The bill contained provisions for a new Consumer Financial Protection Agency and a new Financial Services Oversight Council (FSOC). Proponents of the FSOC contend that one reason the credit crisis occurred was the absence of an agency focused primarily on systemic risk. The FSOC, as envisioned by the House, would be responsible for identifying emerging risks in financial markets and intervening when a “too big to fail” bank is considered to pose a great danger to the economy as a whole.

But would the FSOC, which is intended to remedy this omission, have the capacity to prevent a systemic crisis? It seems doubtful. Policymakers will require answers to urgent questions that cannot be foreseen today. Whatever answers are supplied will be acted upon in a crisis, and getting such actions wrong can make any crisis much worse. The next crisis will undoubtedly be just as surprising as the previous ones, and given that building a data collection system can take years, it would be too late to initiate such an effort when the need arises.

COMMITTEE ORIGINS

The Committee to Establish a National Institute of Finance (CE-NIF or Committee) emerged from an explicit recognition that there was a need to seriously address the challenge posed by systemic risk. “What the industry and the country really need is a technical agency that can access granular transaction and position data in order to provide stakeholders with the insights required to deal with systemic risk,” states Allan Mendelowitz, Former Chairman, Federal Housing Finance Board. Allan is not just observing events from the sidelines: he is working with an ever-expanding group of finance professionals, academics and regulators on a volunteer effort to bring into being a National Institute of Finance (NIF). Allan sat down with THINK to discuss the Committee’s mandate and why the NIF is a critical component of the upcoming restructuring of financial regulation in the U.S.

“As the head of a regulatory agency I grew frustrated with the traditional way of gathering data,” says Allan. Examination teams can review the policies of an organization, examine training methodologies and procedures, pull files and test internal controls. But completing an exam once a year in isolation does not provide the best insight into how risk develops on an institution’s balance sheet. In addition, even call data that regulators collect periodically tends to be accounting data, which is not the best data set with which to understand risk. “The most significant weakness is not a lack of legal authorities; it is the absence of necessary data and analytical capability.”

Allan’s concerns about how systemic risk is measured and managed developed over a long career within the federal government. While serving as Managing Director at the U.S. GAO (now known as the U.S. Government Accountability Office), and through an ongoing association with the Federal Housing Finance Board, Allan came to believe that unless the right data was collected remotely on an ongoing basis and subjected to analysis, it would be impossible for regulators to see the risks as they developed in the financial markets.

As early as 2005, Allan foresaw a major credit event tied to housing and finance that would push the U.S. into its worst recession since World War II. He reasons that had he attempted to have a conversation with a financial institution in 2005 and advise them to do X, Y and Z to reduce their exposures, they would have thanked him for his opinion or showed him the door. Had a bank listened to him and followed his advice, the CEO would likely have been fired, because the company would have passed up opportunities for substantial (accounting) profits in 2006 and 2007 as the housing bubble continued to expand.

“I developed an explicit recognition that there was a need to take a different approach if the challenge posed by systemic risk was going to be seriously addressed,” explains Allan. “I did not come to this conclusion alone.” Following a research conference in February 2009, a group of three academics and a financial quant sat down to discuss systemic risk and the importance of collecting relevant data and improved analytics. The idea was brought to Allan, who quickly saw that it addressed many of his own concerns. “I know how to make this happen,” he said to the group. And the CE-NIF nucleus was born.

The Committee’s goal to see the NIF become part of the financial reform bill making its way through Congress has garnered support from a variety of constituents. Statistical associations, solution providers, banking industry leaders and influential academics have endorsed the call for a National Institute of Finance, including Harry Markowitz, recipient of the Nobel Memorial Prize in Economic Sciences and the John von Neumann Theory Prize. Professor Markowitz took the lead in organizing a half dozen Nobel Laureates recognized for their work in financial economics to write to the Senate Banking Committee calling for legislation to create the National Institute of Finance.

UNDER PRESSURE
The NIF is a proposed new U.S. government agency that would act as a resource to gather, clean and provide appropriate data for the financial regulatory community. Rather than act as another regulatory body, the NIF would also provide the analytical capabilities to monitor systemic risk, perform independent risk assessments of individual financial entities, and provide advice to the federal regulatory agencies tasked with ensuring the health of the financial system. In this context, the NIF would fulfill a similar role to the National Weather Service, an agency that connects a lot of data and develops and applies complex models to understand and forecast the weather. In addition, the NIF would play a role similar to the National Transportation Safety Board in that it would be called upon to determine the causes of financial disruptions and recommend regulatory responses. The analytic component of the NIF would also support a sustained research effort into how financial markets work and the causes of systemic risk.

The Committee identified that one of the challenges with regulation in general is pressure. “An important reason (the NIF) needs to be an insulated agency is to maintain a suitable level of independence so that it can speak truth to power,” says Allan. “It’s not just about understanding the right thing to do; it’s about trying to execute the correct strategy in a complex world in which diverse pressures and incentives are brought to bear. Even if you have the right data and analysis, sound conclusions can be rejected in the political arena for all kinds of reasons unrelated to the merits of your case.”

ALL THE PIECES MATTER
One “new” approach to systemic risk is to identify a small number of systemically important firms and focus on regulating them well enough to contain systemic risk. While enhancing the oversight of these firms is a positive development, it ignores the role of concentrations of risk and interconnectivity between market participants that can create systemic risk even in the absence of large systemically important firms. These risks can only be understood through the collection and analysis of granular transaction and position data. From the Committee’s perspective, this approach is fundamentally flawed. When it comes to systemic risk, the whole is greater than the sum of the parts.

Allan provides three examples from the financial crisis to illustrate his point. The first case is Lehman Brothers and the U.S. government’s decision not to intervene in the firm’s demise. “It had been reported, and [Treasury Secretary Henry] Paulson believed, that the problems at Lehman had been understood by the market for many months, and that the counterparties had ample opportunity to adjust their exposures. Well, he was wrong!” Allan notes that Paulson made a monumentally bad decision based on a belief, rather than on a clear picture of what had actually occurred. “He did not have the data that would have enabled him to see that a Lehman collapse would lead a large money market firm to ’break the buck’,” which was what spread the crisis across the breadth of the market.

The second instance is the credit default swaps (CDS) at AIG. “The threat posed by the CDS book at AIG was totally hidden from Treasury and the regulatory community,” he continues. Treasury officials did not realize what was going on at AIG until the week that Lehman actually went down. If the government had collected granular data for the market as a whole, policymakers would have had the data to see the buildup of this dangerously large unhedged concentration of risk and AIG’s linkages to firms across the market. Action could have been taken sooner.

The third issue is forensics. There has been criticism of how the U.S. Securities and Exchange Commission (SEC) failed to uncover the Madoff fraud – the largest and longest running Ponzi scheme in history – despite several investigative attempts. “With hindsight, it’s easy to say that the SEC simply did not do its job,” concedes Allan. “But without access to market-wide transaction data there was no simple way to easily uncover this fraud, which was perpetrated by a ’pillar of the financial community.’” Madoff’s purported investment strategy required large numbers of derivative transactions. If the SEC had access to the granular transaction data as proposed for the NIF, it would have seen immediately that no counterparties existed for Madoff’s reported trades, and the fraud would have been laid bare. Action could have been taken sooner.

DOUBTS ABOUT DATA
The current state of financial data is problematic. “Data in the financial sector is simply a disaster,” says Allan. In many financial firms it is a major weakness and vulnerability. In large part this is because it is hard to focus attention on and get resources for what is dismissed as a housekeeping chore that does not generate revenue. “Every day institutions reconcile their trades, millions of trades, and many of them have breaks. Sometimes these breaks are substantive, such as when each trader has recorded a slightly different record of the trade price. But a huge volume of breaks come down to firms using inconsistent data, such as describing the same counterparty differently. These differences have to be reconciled.” Secondly, when different regulatory bodies collect data, it is through a whole host of different systems with different data structures – and you cannot use this Tower of Babel of data to build any kind of meaningful picture of what is going on in the market as a whole.

To do its job, the NIF would have to establish uniform data standards to assure the accuracy and comparability of the detailed financial data it collects. The NIF’s first order of business would be to develop reference databases for financial entities and financial products. Then the NIF proposes to securely capture data on contractual terms and conditions at the most granular level: as attributes of specific legal contracts. This detailed data will provide the flexibility required to feed a diverse range of risk models and enable new analytical approaches, such as large network simulations of how the effects of stress play out in the financial markets. The resulting insights will be invaluable in identifying future systemic fragilities whose source cannot be predicted today. At the same time, all of the reporting firms would benefit from the standardization of data across the market. Their operating costs would decline and their ability to understand risk would improve.

HOME AND ABROAD
Given the globalization of financial markets, an international approach – analyzing systemic risk data acquired from a multilateral framework – would be the ultimate objective. Yet the Committee is aware of the political realities involved in establishing global standards: the 2009 Climate Change Conference in Copenhagen is a recent reminder that trying to orchestrate a global framework can drag on for decades with no guarantee of success. A U.S.-based system will certainly be a good start towards where everyone should ideally be heading. With its focus, therefore, on a national agency, the Committee proposes that the NIF be created with the authority to oversee the following market components:
• all transactions by American firms
• all transactions that take place in the U.S.
• all transactions that have an American counterparty
These components will generate enough data to have a material improvement over what is currently available to regulators and perhaps lay the groundwork for future participation from other countries. “At the working level, we have enjoyed regular ongoing participation from the European side because they are grappling with similar issues and they appreciate the importance of expanding the range of data collection.”

GAINING WIDER SUPPORT
The Board of Directors of the American Statistical Association (ASA) provided an endorsement of the Committee’s plans, stating, “a new data and analytic infrastructure is required to maximize the effectiveness of any financial regulatory system.” Allan also notes a strong pocket of support from solution providers. “We are finding enthusiasm for the Committee’s plans from vendors like Algorithmics who deal with risk all the time and understand the benefits a higher quality of data can provide for measuring risk and modeling risk,” observes Allan. Perhaps the most unexpected response has been the high-level support received from a small number of financial institutions, including very active participation from leaders of a global financial services firm.
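The reconciliation and counterparty checks Allan describes can be sketched in a few lines of code. This is a purely illustrative example – the record layout, field names and price tolerance are hypothetical, not taken from any NIF proposal or vendor system:

```python
# Hypothetical sketch of the two data checks described above: reconciling
# trade "breaks" between two firms' books, and flagging reported trades
# for which no counterparty record exists at all.

def reconcile(book_a, book_b, price_tolerance=0.01):
    """Match trades by ID across two books; classify each trade in book_a
    as matched, a substantive break (price mismatch), or an orphan with
    no counterparty record."""
    b_by_id = {t["trade_id"]: t for t in book_b}
    matched, breaks, no_counterparty = [], [], []
    for trade in book_a:
        other = b_by_id.get(trade["trade_id"])
        if other is None:
            no_counterparty.append(trade)          # nobody on the other side
        elif abs(trade["price"] - other["price"]) > price_tolerance:
            breaks.append((trade, other))          # substantive break
        else:
            matched.append(trade)
    return matched, breaks, no_counterparty

firm_a = [
    {"trade_id": "T1", "price": 100.00},
    {"trade_id": "T2", "price": 101.50},
    {"trade_id": "T9", "price": 99.00},   # reported, but never booked by anyone else
]
firm_b = [
    {"trade_id": "T1", "price": 100.00},
    {"trade_id": "T2", "price": 101.55},  # each trader recorded a slightly different price
]

matched, breaks, orphans = reconcile(firm_a, firm_b)
print(len(matched), len(breaks), len(orphans))  # 1 matched, 1 break, 1 orphan
```

Against these sample books, T1 reconciles cleanly, T2 surfaces as a substantive break, and T9 appears with no counterparty record – the same signature that market-wide transaction data would have exposed in the Madoff case.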

The Committee’s issues went all the way up to the firm’s CEO. “It’s asking a lot for the financial community to endorse the NIF,” says Allan. “It is easy for them to view the NIF as counter to their financial interests. I wanted to know how it could be in his firm’s interest to support the NIF when the end result could dramatically increase transparency in the market. I couldn’t help but ask why – my view was that their firm’s traders liked selling opaque products for their large profit margins and big commissions.” A lead executive from the firm answered that there was a big difference between how a trader and the firm’s leaders view the market. A trader may prefer opaque products, but as an institution the executive felt that the firm would be better off doing business in a more transparent and stable market in which margins might be smaller, but strong profits could be made off a larger volume of better trades. “But this institution’s response and openness to consider the market in a new light is a positive sign for our ongoing efforts.”

The Committee’s arguments have even influenced proposed legislation. In March 2010, Senator Chris Dodd, Banking, Housing & Urban Affairs Committee Chairman, unveiled the Restoring American Financial Stability Act of 2010. Title I of the Act deals with systemic risk and calls for the establishment of the Office of Financial Research within the Department of the Treasury. Aside from a proposed name change, the Office would essentially fulfill all the requirements endorsed by the Committee. The Office would contain a Data Center and a Research and Analysis Center to support the work of the systemic risk council. It would be headed by a Director, appointed by the President and approved by the Senate, for a six-year term. The Office is designed to be truly independent in terms of operations, budget and testimony. Its budget would be raised based on industry assessments so that funds could not be appropriated for other uses or be subject to political pressures. This protection extends to the Director’s testimony: no one in the U.S. government, including the White House, shall have the authority to require the Director to submit testimony for approval prior to submitting it to Congress.

What happens next will be determined by what happens to the whole bill on the floor of the U.S. Senate. The part of the bill that would establish the Office of Financial Research is supported by both Democrat and Republican Senators. But the 41 Republican Senators, who so far oppose the overall bill, may choose to filibuster, preventing the measure from being brought to a vote. Democrats have 59 Senators, which is one vote short of the number required to pass a cloture vote and end debate on the bill, which will require the support of at least one Republican Senator. Negotiations between the two sides are ongoing, and the Chairman’s goal is to get the Act approved by the end of May. Whatever the fate of this bill, the Committee has achieved much by getting a provision to create an advanced warning system for systemic risk included in the financial reform bill that reached the floor of the U.S. Senate. They have laid the foundation to include such a capability in any legislation that ultimately becomes law.

STATE OF THE UNION
What the Committee has accomplished in just over one year is unprecedented for the type of organization it represents. “There are firms and trade associations who spend millions of dollars every year on lobbying in Washington to try and make things happen,” Allan observes. “We, on the other hand, have worked to sell a new and important idea based on the strength of our arguments, with nothing more than personal energy and a strong belief in the importance of what we are doing.” In fact, the Committee has raised no money to support its efforts. Commercial interests do not own, control or finance the initiative. All committee members are volunteers and are paying out of their own pockets the necessary expenses to accomplish what they believe to be a reform that is critical to the future well-being of the United States. “As the last year has progressed, we have watched as support for this idea has become widespread like the ripples spreading out from a pebble dropped into a still pond.” Now that these ripples have reached the U.S. Senate, the evolution of the NIF from concept to reality could be mere months away.


The human lifespan has a certain predictability. Based on a person’s age and geography, we know intuitively if his or her life was too short or longer than average. Corporations are less predictable. Arie de Geus, author of The Living Company (1997), came to the conclusion that the average lifespan of a Fortune 500 or equivalent multinational company is only 40 to 50 years. In his book, de Geus observed that if one studied the audited statements of corporate “darlings” of the 1970s there would be no indication of the troubles they would face during the next decade. In fact, the financial reports filed by General Motors, Philips Electronics and IBM during the mid-1970s provided no hint of pending problems they would later face. And of course, companies such as Enron were later proven to have released financials that were designed to hide their true condition by pushing liabilities and losses off their balance sheets.

Financial institutions generally have longer life spans than corporations. Their longevity could be a positive consequence of regulation, oversight and their focus on solvency and capital. Financial firms that were entirely unregulated – as opposed to the investment banks that were perhaps too lightly supervised but still regulated – had shorter life spans that more closely resembled the “corporate” average. Countrywide Financial, for instance, was founded in 1969 and existed for 39 years before it collapsed under the weight of nonperforming subprime loans. Countrywide Financial’s stock was characterized by Fortune magazine as the “23,000% stock” based on the returns it earned between 1982 and 2003. Another notorious unregulated firm – Bernard L. Madoff Investment Securities LLC – was founded in 1960 before it was discovered to be a Ponzi scheme and collapsed in December 2008. Fortune magazine also selected Bear Stearns for its list of “America’s Most Admired Companies” between 2005 and 2007.

Although financial institutions tend to have a long life, the length of time they have been in business does not necessarily indicate how long they will survive. Companies with long lifespans may become too complacent and not fully understand the risks they face through new businesses, distribution channels and product innovations. This premise has become especially true as the modern pace of change resulting from technology and evolving societal mores can translate into strategies, risk profiles and business models that need to adapt along with shorter and more extreme business cycles. Barings Bank, which was founded in 1762 and was the oldest merchant bank in England, failed in 1995 after a landmark rogue-trading event that led to a liquidity crisis for the august institution. The Barings case demonstrates how important it is for a financial institution to understand that the “rules of the game” are always changing. This was certainly the case with Barings, which continued to fund Nick Leeson’s trading activities in Singapore despite lacking an understanding of how much overall risk the bank had taken on.

The demise of several of our most respected financial institutions during the past year has led to this consideration of a “corporate lifespan” and how risk management tools and approaches can assist with the fulfillment of a long life. The Great Credit Crisis demonstrated that risk assessments, strategic plans and capital structures that were designed during buoyant market conditions were deficient during such a protracted and pernicious downturn. A continued adherence to silo-driven approaches within risk management perpetuates the tendency of financial institutions to understand their risks in small, contained units rather than in aggregate.

At the heart of fat tail risk scenario exercises is the idea of corporate longevity. Questioning whether a certain risk will occur within a certain time period is essentially asking if this corporation will be around if a certain severe event occurs, which leads to the key question: What do I have to do to survive? The analysis of the survival of an individual institution as a result of one event, a confluence of events or a set of market conditions translates into a strategy that considers survival from an enterprise perspective.

One of the most strategic exercises a financial institution can undertake is the construction, translation and adoption of a catalog of key scenarios that consider an institution’s longevity. The process of constructing scenarios that are multi-disciplinary and are ultimately understood and adopted by an institution’s senior management and board of directors is a powerful and focused method for considering key risks and the overall risk profile of a financial institution with respect to what matters most: survival. It allows a firm to consider not just “what will happen if a particular risk event occurs” but also what would happen if its primary form of income disappears as a result of a fall in demand for its products, a loss of people or materials that support those products, or perhaps passage of regulations that make the sale of such products illegal or impractical because of price restrictions. It also assists with prioritizing risks in a concise manner. Information from past events, experiences and market conditions cannot predict the future and ensure corporate longevity. But such information can certainly inform strategic planning for the future.

Managing risk strategically and across a complex organization involves the construction of a narrative.

Ultimately, the narrative should contain key elements of the story: the event, an analysis of the different cross currents that led to the event, a dissection of the control failings and a listing of both direct and indirect impacts. The narrative should then be transformed into a scenario that is relevant for one’s own organization and accounts for what a related event would look like if it occurred internally. This can be accomplished primarily through a questioning process: What controls would likely fail? What market events would heighten the severity of the event? What direct and indirect costs would result from such an event? What could cause a liquidity crisis? The key to constructing scenarios is to use material from the past, such as the facts surrounding the failure of an organization, and to make the event one’s own. Illustrated below are two excerpts from case studies of operational risk events along with examples of how to construct relevant scenarios through a questioning process.

Case Study One: Failure of Corus Bank
Corus Bank of Chicago, Illinois was closed on September 11, 2009, by the Office of the Comptroller of the Currency, which appointed the Federal Deposit Insurance Corporation (FDIC) as receiver. The FDIC entered into an immediate purchase and assumption agreement with MB Financial Bank to assume all of the deposits of Corus Bank. As of June 30, 2009, Corus Bank had total assets of $7 billion and total deposits of approximately $7 billion. The FDIC estimated that the cost to the Deposit Insurance Fund would be $1.7 billion. The bank’s failure constituted the third largest of 2009 – a year in which there had already been 133 bank failures in the United States. (By the end of the year, the United States experienced 140 bank failures.)

Corus operated extensively outside its Chicago base and was one of the largest condominium lenders in South Florida; only about five percent of the bank’s outstanding loans were associated with Chicago-area projects. It provided $1.2 billion in financing to primarily condominium development projects in 2008, and continued its commitment to financing large condominium projects in the state of Florida even after the dramatic implosion had started. Corus was also heavily exposed to commercial real estate lending in Las Vegas – another victim of the real estate downturn. Corus was characterized in numerous press reports as a poster child for commercial lending in boom-bust markets. A lending strategy that includes extending a large portion of loans outside a bank’s home market can prove to be a risky endeavor, because the institution may not fully understand the implications of extending loans to borrowers that are located far away.

By early June 2009 the bank’s capital base was almost entirely wiped out by loan defaults, and it was operating under regulatory orders to raise additional capital by June 18, 2009. Following the closures of Guaranty Bank and Colonial Bank in August 2009, rumors circulated that Corus would be next.

Questions to consider in constructing a scenario using the Corus Bank case: How cyclical are our investments? How exposed are we to long-term projects that can be started in one economy and finished in another? What is the largest amount we can lose from such a project? Do we understand our markets? Are we operating in non-core businesses that we may not understand? How correlated are our businesses? Could one be hampered by economic conditions while others remain relatively healthy? What if external fraud is present at the same time as deteriorating credit conditions and volatile market conditions? What about a large internal fraud?

Case Study Two: Dwelling House Savings and Loan Association
Dwelling House Savings and Loan Association of Pittsburgh was closed on August 14, 2009, by the Office of Thrift Supervision (OTS), which appointed the FDIC as receiver. The FDIC entered into an immediate purchase and assumption agreement with PNC Bank of Pittsburgh to assume all of the deposits of Dwelling House Savings and Loan. As of March 31, 2009, Dwelling House had total assets of $13.4 million and total deposits of approximately $13.8 million. The FDIC estimated that the cost to the Deposit Insurance Fund would be $6.8 million.

An internet fraud that resulted in about $3 million of capital being stolen from the bank contributed to Dwelling House’s undercapitalization. The crime went undetected for more than a year, which was attributed to poor bookkeeping practices. The stolen funds had been illegally transferred into accounts held with 62 financial institutions that had unknowingly processed electronic checks through an automated clearing house system; Dwelling House recovered about one-third of these funds. The incident led the OTS to order the bank to replenish the missing funds, replace top management, tighten compliance controls and accept supervision from a local bank that would serve as its mentor, find a new owner, or risk being closed down.

Questions to consider in constructing a scenario using the Dwelling House case: Could such a fraud occur within our organization? How big would it have to be to have an impact on our capital base? Could a smaller fraud occur over a period of time without detection? Could the improper withdrawal of small amounts of money over time by employees or external fraudsters add up to an amount that would have a notable impact? Could money be siphoned out of our company without noticing? What control breakdowns would have to occur for that to happen? What is the state of these controls? Are there subsidiaries or branches within our organization that engage in gray market practices that might cause concern to the regulators – perhaps in a very small, far away location? Could these practices cause reputational issues for the parent organization? Could the impact of a technology breach be magnified by difficult market conditions? What conditions could occur to increase the severity of such an event?

A Scenario for Long Life
Large, complex organizations collect a lot of important data from a multitude of departments, geographies and risk and compliance disciplines. This can result in an overwhelming amount of information, which perhaps provides senior management with a false sense of security that they truly understand their risks. But understanding individual risks in isolation of each other may not provide the best window into what can bring an organization down and result in a truncated lifecycle. Scenarios are a powerful method for bringing together all the insights that an organization has gained through data collection, workshops, corporate experience and wisdom into a singular view of risk. The good news is that the process of constructing a catalog of cross-organizational scenarios need not be expensive. And while it may not be the most expensive risk endeavor that an organization undertakes, it can certainly be the most profound. Scenarios provide the opportunity for financial institutions to better understand the true risks that they face and extend their life expectancy even further beyond those of their corporate counterparts.

Algo FIRST tracks the end of corporate lifespans through the indexing term “loss of access to markets.” The following charts provide a view into the increase of corporate “deaths” by both number of events and total loss amount.

THE LAST WORD
Thanks to the availability of highly specialized applications, smartphones can transform into a portfolio tracker, flashlight or fitness coach with a few quick touches. These fake risk management apps may not be available anytime soon, but they’re on a short list of titles we’d gladly download.

Regulator tracker – Follow real-time movements of individual regulators. A discreet ’beep’ notifies you when a regulator comes within 100 meters of your desk.

Counterparty counselor – Counterparty relationships not what they used to be? Rebuild trust and transparency through daily tips and non-judgmental exercises.

Country insolvency watch – Worried that a nation will devalue its currency or default entirely? The national anthem of countries in crisis will automatically play when certain thresholds are breached.

Haiku market watch – The intellectually-driven practitioner can enjoy portfolio updates in the elegant form of Japanese poetry.

Score translator – Translate match scores into a stock ticker format that can be safely accessed during long meetings. Go Manchester United!

iSimplify – Do your executives and directors hate numbers? This app reworks complex analysis into digestible three panel cartoons for the “content-free” crowd.

Solvency II countdown clock – Count down to important implementation milestones. Deadlines can be reset, but only with regulatory approval.

Market stop – When all else fails, hit the big red button to close global markets for 10 minutes. Use sparingly.

RFF: Request for Feedback
THINK was created to inspire a conversation between risk practitioners, and we invite you to take part. Visit us online to share your thoughts and feedback on this issue, and help us shape the future direction of THINK.
www.algorithmics.com/think

Not all risks are worth taking. At Algorithmics, we help clients to see risk in its entirety. This unique perspective enables financial services companies to mitigate exposures and identify new opportunities that maximize returns. Supported by a global team of risk professionals, our proven enterprise risk solutions allow clients to master the art of risk-informed decision making through the science of knowing better.

Proven Risk Management Solutions
algorithmics.com

© 2010 Algorithmics Software LLC. All rights reserved. ALGO, ALGORITHMICS, ALGORITHMICS & Ai & design, Ai & design, ALGO CAPITAL, ALGO COLLATERAL, ALGO CREDIT, ALGO MARKET, ALGO OPVANTAGE, ALGO OPVANTAGE FIRST, ALGO RISK, ALGO RISK SERVICE, ALGO SUITE, KNOW YOUR RISK, MARK-TO-FUTURE, and RISKWATCH are trademarks of Algorithmics Trademarks LLC.