Wednesday, August 19, 2009
However, one of the large Porsche shareholders and a member of its board, Ferdinand Piëch, is also Chairman of the VW board, and he had different plans for the company. He wanted the transfer to work the other way round, i.e. VW takes over Porsche. In the end, he won the power struggle due to Porsche's problems with repaying its loans and securing new funds for the takeover. Wiedeking had to leave the company, together with his CFO.
Now, the stock market reacts to these developments again. This time the hedge funds are betting on further falling share prices, which today led to a drop of almost 20% in the stock price. One of the reasons is that Porsche has sold its options to Qatar. The non-voting preference shares, on the other hand, went up.
That means that the biggest European car maker has lost more than $28 billion of its market value in just two trading days. It remains to be seen whether the hedge funds are right and the downfall of the VW shares will continue.
Tuesday, August 18, 2009
The small and medium-sized businesses (e.g. in Germany) have received their loans mainly from savings and loan institutions. Due to the significant decline in turnover and exports, those companies are struggling and often have to file for bankruptcy. Insolvencies and credit defaults of their clients will cause a lot of financial institutions to sway.
However, the institutions will most likely weather this storm better than the last crisis, according to the chief economist of Allianz, Michael Heise, and I agree with him. The banks have reacted to the crisis and invested heavily in risk provisioning. In addition, the government's “rescue parachute” is already in place to help the financial institutions in need.
Nevertheless, the savings and loans that managed to remain stable through the first wave (due to their lack of risk appetite and their concentration on retail banking) have to ensure – like the other financial institutions – that their equity ratio is high enough and that they fully understand the risk involved and the probability of default of their loans. The right level of detail for this kind of analysis and appropriate reporting is therefore a must.
Monday, August 17, 2009
The bank, which held about $25 billion in assets, was a big lender in real estate development.
The Federal Deposit Insurance Corporation (FDIC) has been made receiver and approved the sale of Colonial's assets and $26.06 billion in deposits. Rival BB&T has taken over the bulk of Colonial's assets, the government banking agency said.
Before the takeover, BB&T wanted to secure fresh money. The bank therefore initiated the sale of new shares worth $750 million. It will be the biggest takeover in BB&T's history.
Three other US banks failed the same day: the Community Bank of Arizona, the Community Bank of Nevada, and the Union Bank, National Association.
The states in the US that are hit hardest by the financial crisis are Florida and Georgia. Colonial operated mainly in Alabama and Florida and was therefore hit very hard by the burst of the real estate bubble. Unfortunately, the problems became too big, so that, in the end, the FDIC had to pull the rip cord.
Thus, the effective management of liquidity risk allows a financial institution to meet its cash flow obligations – in all circumstances. Robust cash flow models as well as sophisticated stress scenarios are important for assessing liquidity risk and deriving the right management measures. Regulators are more and more emphasizing stress testing as a critical component of the risk management tool set of a financial organization because it enables a better understanding of the liquidity risk profile and also helps to model the liquidity risk appetite.
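To make the idea concrete, here is a minimal sketch of such a stress scenario – the cash flow ladder, the inflow haircut, and the outflow shock are all invented for illustration, not taken from any regulatory template:

```python
# Toy liquidity stress test: apply a haircut to contractual inflows and a
# shock multiplier to outflows, then see how long the buffer lasts.
def survival_horizon(inflows, outflows, buffer, inflow_haircut=0.5, outflow_shock=1.25):
    """Number of periods the liquidity buffer survives under the stressed cash flows."""
    cash = buffer
    for t, (inflow, outflow) in enumerate(zip(inflows, outflows)):
        cash += inflow * (1 - inflow_haircut) - outflow * outflow_shock
        if cash < 0:
            return t  # buffer exhausted during period t
    return len(inflows)  # survived the full projection horizon

# Example: four weekly periods, starting buffer of 100
horizon = survival_horizon(inflows=[40, 40, 40, 40], outflows=[50, 50, 50, 50], buffer=100)
```

Varying the haircut and shock parameters across scenarios of different severity is exactly the kind of sensitivity analysis the regulators are asking for.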
The Financial Services Authority (FSA) in the UK recently proposed new rules that are based on recently agreed international liquidity standards – in particular the Basel Committee on Banking Supervision's (BCBS) Principles for Sound Liquidity Risk Management and Supervision, published last June – and that also take into account difficulties faced in the market over the past 18 months.
The FSA’s proposals emphasize the responsibility of firms’ senior management to adopt a sound approach to liquidity risk management, and present the following changes:
- All regulated entities must have adequate liquidity and must not depend on other parts of their group to survive liquidity stresses, unless permitted to do so by the FSA.
- A new systems and controls framework based on the recent work of the BCBS and the Committee of European Banking Supervisors (CEBS).
- Individual liquidity adequacy standards for firms, based on firms being able to survive liquidity stresses of varying magnitude and duration.
- A new framework for group-wide and cross-border management of liquidity allowing firms, through waivers and modifications, to deviate from self sufficiency where this is appropriate and would not result in undue risk to clients.
- A new reporting framework for liquidity, with the FSA collecting granular, standardized liquidity data at an appropriate frequency so the FSA can see firm-specific, sector- and market-wide views on liquidity risk exposures.
The FSA hopes to introduce the new rules in October 2009. These rules will come along with new reporting requirements. I believe that these new rules are important, not just in the UK but for financial institutions as a whole, to manage liquidity risk more efficiently. Therefore I would be surprised if these new regulations were not adopted quickly by other (European) countries.
Friday, August 14, 2009
The business units of the financial institution routinely receive funds from their depositing customers and other third parties. These funds are then invested in loans and investments (sometimes through different business units) to borrowing customers and/or third parties.
The amount, terms, and interest rate of funds collected and invested are described in financial agreements between the organization and its customers. The interest payments on these funds contribute to the overall net interest margin of the institution, defined as the interest revenue earned on funds used to acquire assets less the interest expense on funds gathered. In other words, funds transfer pricing determines the cost of funds for the asset side and the value of funds for the liability side. The net interest margin of the institution and the value of its financial contracts fluctuate as market conditions and the underlying cash flow of the funds change over time.
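In its simplest form, the margin defined above is a one-liner; the figures below are made up purely for illustration:

```python
# Net interest margin: interest earned on assets minus interest paid on
# funds gathered, relative to the earning assets.
def net_interest_margin(interest_income, interest_expense, earning_assets):
    return (interest_income - interest_expense) / earning_assets

# Hypothetical figures: 50 earned, 30 paid, on 1,000 of earning assets -> 2%
nim = net_interest_margin(interest_income=50.0, interest_expense=30.0, earning_assets=1000.0)
```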
Even though business units and customers participate in the continuous intermediation process, the contribution to the net interest margin and value is not equal across all participants. The amounts of funds received and provided rarely match. The task of the funds transfer pricing process is therefore to measure and assign the discrete contribution of funds – when assessing their overall profit contribution – by business unit, product, and customer.
There are usually two methods to calculate the net margin and value contribution of funds: the pooled approach and the specific assignment approach. The pooled approach assigns funds to financial instrument pools created under a predefined set of criteria (e.g. type of balance, term, repricing term, payment frequency, and origination) with transfer rates derived either internally, based on actual rates earned or paid, or, as an alternative, from market-derived interest rates adjusted for risk.
The specific assignment or single rate method uses transfer rates based on asset yields, which favors the net funds provider's contribution. The deficiency of that approach is that it assumes that all funds have equal importance to the financial institution. No differentiation based on fund attributes or on the market conditions at the origination of the transaction is taken into account. However, multiple pool approaches that use contemporary market rates lack the ability to benchmark management decisions made at the time of initial transaction pricing.
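A minimal sketch of the pooled approach might look as follows – the transfer-rate curve and all figures are hypothetical, chosen only to show how a transfer price turns a rate spread into a margin contribution:

```python
# Pooled funds transfer pricing: each instrument is matched to a pool by
# its repricing term and charged/credited the pool's transfer rate.
TRANSFER_RATES = {12: 0.020, 36: 0.030, 60: 0.035}  # assumed curve, by term in months

def margin_contribution(balance, customer_rate, term_months, is_asset):
    ftp = TRANSFER_RATES[term_months]  # cost of funds (assets) / value of funds (liabilities)
    spread = (customer_rate - ftp) if is_asset else (ftp - customer_rate)
    return balance * spread

# A 5-year loan at 5% earns its spread over the 60-month transfer rate
loan_margin = margin_contribution(1_000_000, 0.05, 60, is_asset=True)
# A 1-year deposit at 1% is credited the 12-month transfer rate less its cost
deposit_margin = margin_contribution(1_000_000, 0.01, 12, is_asset=False)
```

Both sides of the balance sheet thus receive a discrete, comparable contribution, which can then be aggregated by business unit, product, and customer.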
In any case, by assigning a transfer price to each component on the balance sheet, you can compare the earnings resulting from the use of each asset to alternative uses, compare the cost of each source of funds (liabilities) to alternative sources, and measure the profit contribution of each asset or liability. The typical reports are either two-dimensional or OLAP reports that initially show the balance sheet and various columns with the comparisons and then offer drill-down capabilities by the various criteria. Dashboards, showing the relevant measures like net interest margin and various visual representations of the balance sheet components, the comparisons mentioned above (e.g. in the form of a bar chart), and the yield in the form of a graph, are also commonly used.
Because all financial instruments need to be pooled and calculated to derive the funds transfer prices, the data volume is very high, and the reporting tool needs to be able to cope with this volume in an efficient way. In addition, the format of the two-dimensional reports is usually prescribed. Hence, the BI tool needs to reflect this by providing “pixel-perfect” reporting, i.e. the ability to arrange all items on the report exactly as needed – something that not many reporting tools can master.
Thursday, August 13, 2009
There is also the fact that all institutions in the financial services have to deal with very sensitive data. Detailed information on credit card holders and their transactions, the medical history of health insurance policy holders, the financial situation of clients (e.g. in wealth management) are just a few examples of the confidentiality of the data these organizations are dealing with on a daily basis. That is why the demand for highly sophisticated security mechanisms is a common theme in this industry.
A third determining factor for financial services is performance. Trading departments need real-time information on the stock market; a hedge fund manager wants to follow up on the performance of his fund with the ability to drill deeper to understand the cause of changes; the CRO requires quick information to manage the risk and minimize the expected loss; a financial analyst in an insurance company is interested in finding hidden patterns in the customer data that help him with new business opportunities.
Thus, financial institutions require a scalable solution that can cope with large data volumes, ensure the right level of security, and handle the magnitude of requests from stakeholders and a growing user community.
These needs resulted in huge investments in IT architecture to manage the amount of data and provide the right framework. One of my customers, one of the biggest banks worldwide, had ordered hardware with multiple petabytes of disk space. Their aim was to report on the consolidated bank at the detailed financial instrument level on a daily basis. The data load for this kind of endeavor into a data warehouse (DW) had to be optimized with sophisticated sort algorithms in order to provide the data on time. Once the data was in the operational data store of the DW, it had to be cleansed, enriched, and then loaded into a data mart that is optimized for reporting.
While the development of a well-defined data warehouse is a good concept and worth doing, in this particular instance the time between when the transactions took place and their reflection in a report was just too long. For the daily group consolidation report it was sufficient, but only if the intercompany transactions matched. If not, an exception report was produced and someone had to check all payables and receivables where the relations did not match. Once this manual review was finished, the corrected results were entered into the enterprise resource planning (ERP) system, which then triggered an update of the DW and the subsequent processes.
That is not optimal for ad hoc reporting and definitely not a solution for some of the demanding business requests mentioned above (time is money). As such, a reporting tool that can access the transactional data directly, combine it with the information from the other sources, and present the information with the help of an integrated metadata layer is preferable.
Depending on the role of the business user, it is not always necessary to have the full low-level detail visible in a report. The CFO, for example, wants to get a quick overview of the business. An aggregated dashboard with the key performance indicators that are relevant for the CFO will work. Wherever a more thorough analysis is required, he can look at the KPIs from different angles (e.g. segments, products, channels), just by clicking on a different tab of the dashboard. If that is still not detailed enough, a deep dive into the transaction report directly from the dashboard is possible. We have implemented that multiple times, and it is always remarkable how well received the variety of visualizations, the flexibility of report development, the ease of use, and the scalability of the reporting tool are. The ability to drill anywhere with great performance, even when handling enormous data volumes, is essential for the business to become as efficient as possible and is a great competitive advantage!
Solvency II introduces a comprehensive risk management framework for defining required capital levels and for implementing procedures to identify, measure, and manage risk levels. It is the updated set of regulatory requirements for insurance firms that operate in the European Union.
The rationale for the European Union behind this framework is the development of a Single Market in insurance services in Europe, whilst at the same time securing an adequate level of consumer protection.
Solvency II is based on economic principles for the measurement of assets and liabilities. Risk will be measured on consistent principles, and capital requirements will depend directly on this, which means that it is a risk-based system, too. It is somewhat similar to the banking regulations of Basel II. It also consists of three pillars:
· Pillar 1 focuses on the quantitative requirements (e.g. the amount of capital an insurer should hold)
· Pillar 2 consists of requirements for the governance and risk management of insurers, as well as the effective supervision of insurers
· Pillar 3 concentrates on disclosure and transparency requirements
A solvency capital requirement may have the following purposes:
· Reduce the risk that an insurer would be unable to meet claims
· Reduce the losses suffered by policyholders in the event that a firm is unable to meet all claims fully
· Provide early warnings for supervisors so that they can react promptly if capital falls below the required level
· Improve confidence in the financial stability of the insurance sector
I think Solvency II is an important step forward in the effort to improve insurance regulation, to foster risk assessments and to rationalize the management of large firms. The directive, especially if complemented by indicators that take the lessons from the crisis into account, would remedy the present fragmentation of rules in the EU and allow for a more comprehensive, qualitative and economic assessment of the risks.
The directive has been agreed by the EU and will be implemented starting in 2012. For the insurance industry this will mean a paradigm shift in their business-related decision processes. The goal is not to create more regulations. It must be in the interest of the industry itself to find solutions, risk models, and other concepts that prepare them for upcoming crises. Solvency II is just a vehicle to articulate those needs and guide the insurers in the right direction.
Right now, the insurance industry has reached the point where it understands the necessity of intelligent risk management. It is recognized as a value-generating process as well as a competitive advantage. However, the implementation of Solvency II is still in the early stages of development.
It is easier for large insurance organizations to budget for the development of company-specific risk models. For small insurers this is a real obstacle, which leads them to the conclusion that they still have to build their models in Excel.
Besides, the main hurdles for the implementation of Solvency II seem to be data availability and quality. Insurance companies are known for their heterogeneous IT system architecture. Data consistency, data integrity, flexibility and good performance of reporting and analysis are not easy to achieve. In fact, the contrary is true!
So what are the consequences?
Even though the insurers have still some time before the implementation of Solvency II is mandatory, they definitely need to start working on a solution that fits their needs for their own sake.
Insurance companies do not have all the answers themselves. They need to reach out to experts who know how to extract their data, identify the right KPIs, build an integrated, consistent data layer of good quality, and deliver a flexible, well-performing reporting solution that meets their needs and the requirements of Solvency II.
An integrated BI system that enables the business to analyze its data quickly down to the deepest level, while at the same time building confidence in the accuracy of the data, is priceless. The feedback that I am receiving from my clients is that easily understandable, state-of-the-art reporting capabilities (like dynamic dashboards) that allow a user-specific visualization of information from different angles – without making the mistake of being excessive – help significantly with the adoption of the new regulations.
Wednesday, August 12, 2009
Typically, the two main sources of revenue for insurers are:
1. Underwriting profits
2. Investment gains
While insurance companies do not suffer from the financial crisis like the banks do, the returns on investment in a barren economic environment have decreased significantly. Thus, companies are forced to maximize their profits from underwriting and ensure that they have optimal strategies to achieve this end.
A critical strategy for insurers to maximize returns from writing new business is creating new sales opportunities for existing customers. That sounds simple, but the challenge is to identify the right products for the proper audience. Therefore insurers strive to gain access to insightful policyholder information that helps them sell more.
Of course it is not possible for the companies to know everything about their customers. Still, they attempt to learn as much as possible in order to segment customers, position products, and target customers most effectively.
The unique problem of insurance companies is the lack of interaction with their customers; they are dependent on loyalty. The relationship between a consumer and an insurer is normally initiated at the time of a significant life event, such as the purchase of a new car, a new house, or the anticipation of a new baby.
Once the customer has evaluated the insurance market to get the best coverage for a low price, he contacts an agent or the insurer directly to get the deal done. Once the policy is signed, contact between the policyholder and the insurer is limited, if it happens at all. Automatic policy administration systems take care of the billing and renewals. Delinquencies and address changes seem to be the only reasons for interaction. When these rare situations occur, most insurers are not prepared to foster the relationship to achieve stronger customer loyalty and/or capitalize on the sales opportunity with intelligent customer insight.
A proper customer analysis for an insurance company usually has to deal with fragmented data sources due to mergers and acquisitions that were necessary to penetrate new markets or utilize new channels. An integrated view of this data is – even when it is tedious to build – important for proper customer care.
Business Intelligence (BI) solutions are the answer to these demands. They are not just reporting tools. They can translate the data into actionable insight and additional revenue opportunities for the businesses.
Today, best-of-breed BI platforms can be cost-effectively deployed to tap into and query a number of different sources of data to produce useful information based upon historical and real-time data, and predictive models. Resulting information about customer demographics, product performance, and next-best actions can empower companies to uncover hidden opportunities and devise strategies to maximize customer value.
What do insurance companies know about their customers? Which customers are most profitable? Can a positive ROI be generated with this customer? What is his growth potential? What product bundles can be marketed to him?
These are just a few questions that BI can help answer. Combining multiple data sources like marketing data, policy statistics, financial data, demographics, etc. with a unified, integrated metadata layer can lead to the right product mix. Clustering of customers and the usage of predictive analytics can identify hidden patterns in loyalty and buying behavior that had been previously overlooked. In addition, managers of multiple lines of business have more data at their disposal, enjoy greater flexibility with more analytic capabilities, and receive unique, targeted offerings from which to choose.
Another advantage of BI – in comparison to the traditional spreadsheets, still very popular with insurers – is the built-in security. The customer data is very sensitive and requires sophisticated security in order to prevent unauthorized data access. Besides, the various intuitive visual representations of information that BI provides (e.g. ad hoc reporting or multiple dashboard books that present all relevant information at a glance) and the fast distribution of reports to a wider audience are key factors for an efficient information delivery strategy.
Last but not least, the demand for scorecards that represent the key performance indicators of the insurance company (branch or department) in a colored scheme – i.e. “traffic lights” where green represents a KPI on target, yellow indicates a possible problem, and red shows a value that is out of range – is becoming more and more evident.
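The traffic-light logic itself is simple; here is a sketch with invented tolerance thresholds (the 5% / 15% bands are illustrative, not industry standards):

```python
# Map a KPI to a traffic-light status based on its relative deviation from target.
def traffic_light(value, target, warn_tolerance=0.05, alert_tolerance=0.15):
    deviation = abs(value - target) / target
    if deviation <= warn_tolerance:
        return "green"   # KPI on target
    if deviation <= alert_tolerance:
        return "yellow"  # possible problem
    return "red"         # value out of range

status = traffic_light(value=90, target=100)  # 10% off target
```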
In the United States, the Emergency Economic Stabilization Act of 2008 (commonly referred to as a bailout of the U.S. financial system) suggested the creation of such bad banks as a response to the ongoing subprime mortgage crisis in the U.S.
Other countries like Germany followed this idea as a result of the financial crisis.
The problem with bad banks is the moral hazard effect. A construct that allows a financial institution to transfer its risks to the bad bank and the government creates an incentive for such institutions to take on higher risks in the confidence that they can pass these risks on. In addition, the bad bank is not a ‘normal’ market participant. The banks may gain trust by transferring their bad assets (e.g. asset-backed securities); the bad bank itself does not. The bad bank is – until it is wound up – dependent on government funds.
A positive example of a bad bank is Securum, a Swedish bank founded in 1992 for the purpose of taking on and unwinding bad debts from the partly state-owned Nordbanken bank during the financial crisis in Sweden 1990-1994. Many of the debts were owed by real-estate companies and it became a goal for Securum to stabilize the property market.
The company took over a quarter of the bank's credit portfolio, comprising 3,000 credits to 1,274 companies, and the management of Securum was given a free hand. By 1994 a large number of credits had been unwound, and by the summer of 1997 Securum itself could be wound down.
The ways bad banks are structured differ. In Germany, for example, banks are now allowed to transfer their bad assets into such a bad bank to adjust and unburden their balance sheets. That will enable the banks to get fresh equity capital. However, it is not that simple. The banks have to set up a special purpose vehicle that buys the toxic papers for 90% of their value as of 30 June 2008. Since their value is currently much less, the banks have to pay the difference in equal installments over a period of 20 years (plus a fee for the government securities). This means they cannot get rid of the problematic assets entirely.
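A back-of-the-envelope illustration of that mechanism, with entirely made-up numbers:

```python
# German bad-bank scheme, illustratively: the special purpose vehicle buys
# the toxic papers at 90% of their 30 June 2008 value; the bank repays the
# gap to the (much lower) current value in equal installments over 20 years.
value_june_2008 = 10_000_000_000          # hypothetical book value, 30 June 2008
transfer_price = 0.90 * value_june_2008   # what the vehicle pays the bank
current_value = 4_000_000_000             # hypothetical current market value
gap = transfer_price - current_value      # shortfall the bank still owes
annual_installment = gap / 20             # equal installments over 20 years
```

In this toy case the bank would carry an obligation of 250 million per year for two decades – the balance sheet relief is real, but far from a free lunch.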
For a financial institution that follows this path (it is voluntary!), the finance and controlling departments as well as the risk management department will have more work to do. Additional reporting requirements will arise.
The normal financial statement will not include the toxic papers, which will reduce the balance sheet. However, the bank has to build accruals for the installments (if its reporting GAAP allows accruals to be built). In addition, it will also build an internal management view of the financial statement which includes the impact of the bad assets, i.e. the installments.
Additional reporting will also be required on the risk management side for banks that are evaluating the possibility of using the bad bank. The calculation of risk needs to be reviewed, and new KPIs have to be included in the dashboards that simulate the impact of the toxic papers on and off the balance sheet.
Furthermore, the executive board, which is now sometimes influenced or controlled by the government, has additional reporting needs that give them better oversight of the business.
The risk appetite of financial institutions (despite the bad banks in place) may have changed but is still apparent. Yet the risk models and the funds transfer pricing calculations are being adjusted to the new circumstances. These will most likely impact the reporting needs as well. More in-depth knowledge of the loans, mortgages, etc. by various attributes / dimensions is required in order to fully understand the loss given default (LGD), the probability of default (PD), and the expected exposure (EE) – to name just a few risk indicators.
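These indicators feed the standard expected-loss decomposition EL = PD × LGD × EAD; a sketch with illustrative inputs (EAD, the exposure at default, stands in here for the exposure measure):

```python
# Expected loss of a credit position: probability of default times loss
# given default times exposure at default. All inputs are illustrative.
def expected_loss(pd, lgd, ead):
    return pd * lgd * ead

# 2% default probability, 45% loss severity, 1,000,000 exposure
el = expected_loss(pd=0.02, lgd=0.45, ead=1_000_000)
```

Reporting that can slice PD and LGD by the attributes mentioned above (product, region, vintage, etc.) is what turns this formula into actionable insight.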
The analysis of risk is complicated and requires a vast amount of data. It is not sufficient to concentrate on a “risk cube” or “risk data mart” for reporting. The financial institutions demand an analysis across multiple source systems, including external benchmarks, as well as low level detail information (probably down to the transaction level).
A reporting tool that can fulfill this multi-sourcing without any problems and can drill anywhere to the detailed data is therefore mandatory, at least that is the feedback that I am receiving from my customers.
Tuesday, August 11, 2009
Top funds with an A-rating, good performance, and a couple of years of market presence are managing much higher volumes than those with bad performance (according to a study from the Feri Eurorating Services rating agency). The probability of a causal relationship between the success of a fund and the willingness of investors to invest, which impacts the volume, is obviously pretty high.
However, that does not necessarily mean that small funds cannot produce a good performance. Smaller funds are preferable especially in niche markets because they allow a more flexible fund strategy.
I think the main differentiator for sustainable success is information and how this information is used.
What does that mean for the fund managers / the investment bank?
- The investment bank should try to diversify their portfolio to mitigate the risk. This can be achieved by issuing a good mixture of larger and smaller funds.
- The fund managers are dependent on the data from different sources, i.e. market data, benchmarks, financial information, etc. to make better informed decisions.
- Fund managers of larger funds cannot react as quickly as fund managers of smaller funds. Both need the right KPIs (e.g. risk measures like volatility and VaR) always at their disposal. They also need to analyze the data from different angles.
- While the strategy may be different for funds of different sizes, the information demand is similar. Transparency of data is always key.
- In order to obtain good ratings for their funds, they need to impress their investors as well as the rating agencies. The best way to do this is good performance and an excellent presentation of the fund's progression.
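The risk measures named above can be sketched in a few lines; the return series is invented for illustration, and the simple historical-simulation VaR shown here is only one of several common estimation methods:

```python
import statistics

def historical_var(returns, confidence=0.95):
    """One-period historical-simulation VaR, reported as a positive loss number."""
    ordered = sorted(returns)
    index = int((1 - confidence) * len(ordered))  # tail observation at 1 - confidence
    return -ordered[index]

# Hypothetical daily returns of a fund over 20 trading days
returns = [0.01, -0.02, 0.003, -0.015, 0.007, -0.03, 0.012, 0.001, -0.005, 0.02,
           -0.01, 0.004, -0.025, 0.008, -0.002, 0.015, -0.018, 0.006, -0.007, 0.009]
volatility = statistics.stdev(returns)  # daily volatility of the fund
var_95 = historical_var(returns)        # 95% one-day VaR
```

A dashboard would recompute exactly these figures per fund and let the manager drill into the days that drive them.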
Detailed information about the performance of the fund and the reasons for the development across multiple data sources, transparency, flexibility of data analysis, and good performance when handling large volumes of data are key factors for the success of fund managers. That is the domain expertise of Business Intelligence.
To sum up, the volume of a fund may be important, but without the knowledge of and the trust in the data a fund manager cannot ensure the future success of his fund. I have seen it multiple times: dashboards that present the fund manager with all relevant information at a glance, with the ability to analyze further and the option to distribute the results easily to other stakeholders, can make the difference between a well-performing and an excellently performing investment.
Monday, August 10, 2009
Most of the time, the issuers of securities are companies, special purpose entities, non-profit organizations, or governments (local, state or national) issuing bonds or other debt-like securities that can be traded on a secondary market.
A credit rating for an issuer takes into consideration the issuer's credit worthiness (i.e., its ability to pay back a loan), and affects the interest rate applied to the particular security being issued.
There are more than 150 rating agencies (with local or industry focus), but the big 3 agencies dominate the market:
• Standard & Poor's (S&P)
• Moody's
• Fitch Ratings
In the early days of the agencies (around 1909), the investors had to pay for the ratings. This changed in the 1970s, when the issuers paid the agencies' bills. Nowadays it is a mixture of both, but most of the time the issuers have to pay the rating agencies.
This is causing problems.
First of all, the 3 big agencies can influence the market quite significantly due to their market position and reputation. Their ratings can become a "self-fulfilling prophecy".
Secondly, the rating agencies receive their money from their clients. Therefore they are dependent on the goodwill of their clients, and an objective rating cannot always be expected.
As a consequence, especially due to the current financial crisis, the demand for more (independent) control of the rating agencies is becoming more important.
A perfect example is the rating of commercial mortgage-backed securities (CMBS).
It is obvious that the real estate bubble / sub-prime crisis was one of the main triggers of the current economic downturn. Thousands of real estate loans with a high risk of default were packaged into very complicated collateralized debt obligations (CDOs), e.g. CMBS, and sold all over the world.
When the real estate market plummeted and the credit default rate increased dramatically, the financial institutions that had invested heavily in the CDOs had to write down their values in their balance sheets dramatically.
The governments around the globe fought against the crisis but also demanded greater control of the agencies, as they had always given these risky financial products top ratings.
The rating agencies reacted in July. S&P downgraded the CMBS to a very low "BBB-". However, the main issuers of the CMBS were Goldman Sachs, JP Morgan Chase, Morgan Stanley, Credit Suisse, and Wachovia.
Without a top rating they cannot deposit their CMBS – within the framework of the existing program of the Federal Reserve – as security in return for credits. Hence they were up in arms about the rating by S&P.
S&P, which had just downgraded these complex financial products, withdrew its ratings one week later due to the enormous pressure from the issuers – its clients – and awarded the CMBS the top rating of "AAA" again!
Of course there are justifications for this move, and yes, in general (e.g. in the years before the crisis) the CMBS may be a safe store of value, but they are so complex that even the bank clerks have difficulties understanding the underlying risk. Some of these financial products require reading up to 90,000 pages to fully understand their structure. Therefore a rating of AAA is more than questionable.
So what is the bottom line?
To my mind, the dependency of rating agencies on their clients has gone too far. They are not objective. An independent control board (formed by the governments and/or the central banks) should be put in place. In addition, the payment of rating agencies needs some review. I think we should go back to the concept that the investors pay for the ratings, not the issuers.
As Ben Bernanke, the chairman of the Federal Reserve, has said, the insolvency risk of this market carries a huge risk for the whole economy. Therefore he demanded that the problematic loans be restructured in a way that the probability of default drops significantly. He is right!