Thursday, November 12, 2009

SEPA – New regulations for financial transactions

Since the beginning of November 2009, new regulations for banking transactions in Europe have been in place – a consequence of the SEPA efforts.

What is SEPA? As the European Payments Council states, the Single Euro Payments Area or SEPA will be “the area where citizens, companies and other economic participants make and receive payments in euro, whether between or within national boundaries, under the same basic conditions, rights and obligations. In the long-term, the uniform SEPA payment instruments are expected to replace national euro payment systems now being operated in Europe“.

SEPA currently consists of the 27 EU Member States, Iceland, Liechtenstein, Monaco, Norway and Switzerland. SEPA is an EU-wide policy-maker-driven integration initiative in the area of payments designed to achieve the completion of the EU internal market and monetary union. Following the introduction of euro notes and coins in 2002, the political drivers of the SEPA initiative - EU governments, the European Commission and the European Central Bank - focused on harmonizing the euro payments market. Integrating the multitude of national payment systems existing today is a natural step towards making the euro a truly single and fully functioning currency. SEPA will become a reality when a critical mass of euro payments has migrated from legacy payment instruments to the new SEPA payment instruments.

The main benefits expected are the creation of conditions for enhanced competition in providing payment services as well as more efficient payment systems through harmonization. Once SEPA is established, it will be possible to exchange euro payments between any accounts in SEPA as easily as is possible today only within national borders. Common standards, faster settlement and simplified processing will improve cash flow, reduce costs and facilitate access to new markets. Moreover, users will benefit from the development of innovative products offered by payment sector suppliers.

According to a recent study conducted by CapGemini Consulting at the request of the European Commission, the replacement of existing national payments systems by SEPA holds a market potential of up to €123 billion in benefits, cumulative over six years and benefitting the users of payments services.

While this potential is extremely interesting and important to financial institutions, it also means that the banking processes, reporting requirements etc. need to be adjusted and the terms and conditions for the customers are changing.

The personal risk consumers bear when making a bank transfer or when losing their debit card has increased. The main changes for customers are:

  • From now on a bank transfer becomes irrevocable on receipt by the bank, i.e. if the customer makes a mistake when filling out the payment order, he cannot recall the transfer himself, even if the bank has not executed it yet.
  • Furthermore, banks are no longer obliged to verify that the name of the recipient of the transfer matches the bank account number. In the past – at least in Germany – courts did not consider the account number alone to be sufficient.
  • It is now the responsibility of the customer to get his wrongly wired money back; this is no longer a task of the bank.
  • Another new rule relates to the “EC card” or debit card. Customers now have to pay up to €150 when their lost debit card is misused. The customer’s liability starts when the card is lost and ends when the bank deactivates it. This is a shift in accountability: in the past, consumers were only liable when they acted carelessly.
  • With the implementation of the new EU rules, a direct debit can be made Europe-wide, i.e. across countries, not just within the customer’s own country.

SEPA forces financial institutions to implement the new European instruments and processes and to tie their payment transactions into the electronic mass-payment systems of the central banks, e.g. via SWIFTNet (used, for example, by the German central bank). In addition, more detailed customer administration and engagement with the customer is needed; adjustments to the master data as well as a more sophisticated debtor analysis are also required.

In order to make these changes as smooth and transparent as possible for the financial institutions and their customer base, business intelligence plays an integral part.

For the customers it is important to mitigate their risk by getting all relevant information about their transactions quickly and at a glance, e.g. showing the name of the recipient along with the account number. Exception reports can also help identify suspicious transactions.
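An exception report of this kind can be sketched in a few lines; the transaction fields, rules, and thresholds below are purely illustrative assumptions, not part of any SEPA specification:

```python
# Illustrative sketch: flag transactions for an exception report.
# Field names and thresholds are assumptions for demonstration only.

def flag_exceptions(transactions, amount_threshold=10000.0):
    """Return transactions that deserve a closer look."""
    exceptions = []
    for tx in transactions:
        reasons = []
        # Under SEPA the bank no longer checks name vs. account number,
        # so surface missing recipient names to the customer.
        if not tx.get("recipient_name"):
            reasons.append("missing recipient name")
        if tx.get("amount", 0) >= amount_threshold:
            reasons.append("unusually large amount")
        if tx.get("country") not in tx.get("usual_countries", []):
            reasons.append("unusual destination country")
        if reasons:
            exceptions.append({"id": tx["id"], "reasons": reasons})
    return exceptions

sample = [
    {"id": 1, "recipient_name": "ACME GmbH", "amount": 250.0,
     "country": "DE", "usual_countries": ["DE"]},
    {"id": 2, "recipient_name": "", "amount": 15000.0,
     "country": "ES", "usual_countries": ["DE"]},
]
print(flag_exceptions(sample))
```

In practice such rules would be tuned per customer segment; the point is only that a handful of simple checks already surfaces the transactions worth reviewing.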

For the banks the process is now more standardized, which means the reporting requirements imposed by the authorities are also stricter. In addition, banks need to anticipate possible issues with customers and their transactions and should prepare for them with the right level of detailed reporting.

Tuesday, October 13, 2009

CRM or CMR?

Is the change from the traditional and well-known Customer Relationship Management (CRM) towards Customer Managed Relationships just a nice idea, or really an interesting concept for the future? Customer orientation and improving the value delivered to the customer is something all financial services institutions have on their radar, but realizing these goals is not simple. In any case, the customer is becoming more and more the key factor for the success of a company.
The market conditions have changed significantly with the internet. The possibilities are immense and the transparency of the market has increased a lot. While it was difficult in the past to compare the basic facts of financial products, such as interest rates and fees, it is now much easier for the consumer to get this information online. As long as we are not talking about structured products like asset-backed securities – which even bank clerks have difficulty fully comprehending – the market is quite transparent for investors and provides an almost unlimited amount of data.

As a consequence, the loyalty of customers towards their bank has declined. Just take the tough competition around interest rates as an example: it tempts consumers to switch quickly.
The answer for the financial services industry therefore lies in even more data and better information and analysis about their target customer groups. Traditional CRM is sometimes not comprehensive enough here, as it only analyzes the existing contracts of the bank's customers. The problem with CRM is that it looks at the customer from the perspective of the company, not from the customer's angle. CRM tries to identify new sales and service opportunities for clients based on the processes of the bank.

CMR – or "Customer Managed Relationships" – uses a different approach. People started talking about the concept of CMR maybe two years ago, but it still gets little attention. "Self service" is a term that is more broadly used and understood, but it misses the power of what customers really want: it looks at the savings from a company's point of view, not the empowerment from the customer's perspective.

CMR means three things:
1. The ability to question and reshape your organization and its knowledge so that it is at the disposal of your customers
2. Internet-enabled management tools which customers use to get what they want
3. The ability to react to the information being generated and used by customers in order to increase profitability.

CMR generates – if executed well – the following major benefits over CRM:
1. It is easier to implement, because the customer does the more complex work
2. It creates stronger customer retention, since customers who have invested their data with the financial services institute will not move easily
3. It allows financial services organizations to move faster than their competitors, since they are in a trusted relationship with their customers

Companies need to understand CMR and then change accordingly. In the words of business strategist Gary Hamel: you need a well-developed view of the future, whether or not it turns out to be true. You have to invest in the competencies to make that future come true. And you need to experiment and learn to see which parts of your view are materializing.

“Customer managed” – a simple thought but with major impacts
The consequence for the company is a loss of control. Customers will be in the driving seat, not the financial institution. Companies have to start thinking and behaving differently.
It may be hard to envision, but it is nevertheless absolutely feasible – with internet-enabled platforms and the right business intelligence – to imagine how whole industry processes can be reconstructed, putting customers in charge of their own needs by giving them the internet-based management tools and data they require. That is what a customer managed relationship is about.

The industry is currently not designed to serve customers that way. Almost every financial services institute puts the customer, and the improvement of the relationship with its customer base, at the top of its mission statement and strategy. The mindset is clear: if they can establish a good relationship with their customers, it will (hopefully) result in cross-sell opportunities and more profit.

However, customers usually do not care that much about a relationship with their bank. They want results. If a customer asks for a loan, a simple yes over the phone would do! In other words, the customer decides when a relationship with his bank is useful.

Customers constantly need to answer the question "How much money do I get and what shall I do with it?" Presenting this dynamic problem to a financial institution would be difficult for it to handle; its CRM systems would not answer the question.

With CMR you give the customer the tools to manage his relationship with his bank. This can be a portal that provides the ability to (securely) key in individual information about the customer's savings, pensions, investments, insurance, salary, etc. The customer can then decide to look at his portfolio from different angles, using dashboards. Benchmark information as well as learning algorithms, based on the data provided and on external market data, will help the customer improve the management of his own finances. Only when he needs to contact a clerk or a bank analyst does he do so, by triggering an action, an email alert, etc.
This flexibility and self-control from the customers’ point of view may be too farfetched right now but it will most likely be the next evolution of customer relations in the financial services industry in an effort to retain their customer base.
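One element of such a portal can be sketched quite simply: compare the customer's self-entered portfolio allocation against a benchmark and trigger an alert to an advisor when the deviation is too large. All numbers, category names, and the tolerance below are illustrative assumptions:

```python
# Illustrative CMR-style check: compare a customer's self-entered
# portfolio against a benchmark allocation and decide whether an
# alert to a bank advisor should be triggered.

def allocation(portfolio):
    """Convert absolute amounts into allocation weights."""
    total = sum(portfolio.values())
    return {k: v / total for k, v in portfolio.items()}

def deviations(portfolio, benchmark):
    """Weight difference per asset class versus the benchmark."""
    alloc = allocation(portfolio)
    return {k: alloc.get(k, 0.0) - w for k, w in benchmark.items()}

def needs_alert(portfolio, benchmark, tolerance=0.10):
    """True if any asset class deviates by more than the tolerance
    (here: 10 percentage points, an assumed threshold)."""
    return any(abs(d) > tolerance
               for d in deviations(portfolio, benchmark).values())

portfolio = {"equities": 70000, "bonds": 20000, "cash": 10000}
benchmark = {"equities": 0.50, "bonds": 0.35, "cash": 0.15}
print(needs_alert(portfolio, benchmark))  # equities are 70% vs. 50%
```

The customer stays in control: he enters the data, sees the deviation on his dashboard, and only then decides whether to involve the bank.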

Friday, September 18, 2009

Is the call for regulation of the financial market just a lip service?

The financial crisis of 2008 – caused by the American real estate bubble, complicated structured finance products, a huge amount of defaulted loans, and the downfall of Lehman Brothers – impacted the world economy like few other events since the Great Depression.

In an unprecedented effort, governments spent trillions of dollars to stabilize the market and their "system-relevant" financial institutions, avoiding a total collapse. Some of the banks are now government-owned; some had to agree to more control and more restrictive bonus and incentive systems for their employees. This quick response (more or less) saved the economy and also helped win back some trust in the credit market. However, it also showed the banks that they can take on risks to an extent that is not backed up by their own equity, because they know that – in case of major problems – the governments will support and save them.

What measures are required in order to prevent this scenario from repeating itself in the future?

At the peak of the crisis, governments around the globe asked for more regulation, for more restrictive rules to better control the financial market (especially hedge funds and structured financial products), the rating agencies, and the key players. The G20 summit in 2009 agreed to more control and discussed more stringent rules, but so far it looks more like lip service. The financial market has recovered, the stock market is reaching new heights almost every day, the financial institutions are chalking up enormous profits again, and the incentive systems for their management have not changed. The bonuses paid out still reach enormous levels and are – most of the time – still not tied to long-term goals.

France made an effort to change the mindset of its major banks. It announced that, for government orders, it will in the future only work with banks that comply with rules of more control, higher equity ratios, and long-term goals as the standard for their incentive plans. While this is a step in the right direction, it can only succeed if it is adopted on a global level. Mr. Sarkozy, the president of France, therefore tried to join forces with Germany. Ms. Merkel, the chancellor of Germany, supports his efforts but also points out that the rules need to be accepted by all economies to avoid disadvantages for the local economy. That is the challenge!

France and Germany are both export-oriented economies. They have a strong industrial base and are not fully dependent on the financial sector. This is different for the UK, which decided in the 1980s to transform its whole economy, away from the industrial sector and towards the service industry. It wanted to become a global player in financial services, and it established London as the second-largest financial market, next to Wall Street. Most of the hedge funds worldwide are located in London!

The British economy was therefore hit extremely hard by the financial crisis, and some of the big banks are now in the hands of the government. Yet this did not lead to a change in government policy. The main concern is losing the UK's position in the global financial sector. Thus, Gordon Brown, prime minister of the UK, is doing his best to avoid strict rules and hardened controls that could torpedo this leading role.

This may be understandable as long as tiger states in the Middle East and Asia are eager to jump in and gain a larger market share of the financial business, but it will not solve the issue and will not protect the global economy against a repetition of such a problematic financial situation.

Hence we can only hope that the next G20 summit, held at the end of September, will reach conclusions and come up with binding solutions for the global market.

Friday, September 11, 2009

Balanced Scorecard for Financial Institutions

The concept of the Balanced Scorecard (BSC) is not new. It was developed in the early '90s by Robert S. Kaplan and David P. Norton. In the beginning it was seen by many as a passing fashion; meanwhile it has evolved into a business standard that more and more companies adopt as their strategic management tool. That is also true for financial institutions, especially in Europe.

Especially in the current economic situation, every department within a company has to take an even more economically focused approach to its daily work, i.e. it needs to define targets and objectives and has to specify the key indicators for managing the department profitably. And that is exactly where the balanced scorecard comes into play.

With the help of a BSC:
• You will find a common bottom line: common goals for all employees
• You will identify current strengths and weaknesses within your organization and derive actions for the future
• You will make binding agreements for the future
• You will monitor the agreed key performance indicators (KPIs) in the sense of self-control
Due to its simplicity and completeness, the BSC is the right tool to model your goals and indicators.

However, a company can change its methods and organization easily – but not always successfully. If you really want to change, you need to get all employees involved so that they change their behavior and attitude. That is a longer but more successful process. The employees need to own the scorecards; they will be measured on the KPIs against the corresponding targets.

The traditional view of a company is backward-looking, purely focused on financial results. While financial indicators are very important and necessary to understand the performance of the company, they are typically lagging indicators. The main achievement of a BSC is that it also takes other, forward-looking perspectives (with leading indicators) into account, making the scorecard "balanced". The BSC also describes the interdependencies between the various KPIs – their cause-and-effect relationships.

The four standard perspectives according to Norton/Kaplan are
· Financial Perspective
· Customer Perspective
· Internal Processes
· Learning & Growth

For most companies these four perspectives may be sufficient. In order to keep the BSC manageable and efficient, the key is to define only a very small number of truly important KPIs per perspective (typically 4 to 5). For a financial institution, four perspectives are usually too restrictive; what I have seen at my customers are 5 to 6 perspectives. In addition to the ones mentioned above, two other perspectives are becoming more and more common in the financial industry:
· Risk Perspective
· Image
(In the manufacturing or retail industry the “supplier” perspective is often used)

The process of a BSC is clearly defined and nowadays an integral part of business intelligence. It is the combination of an easy-to-use BSC framework that supports functional users in defining their perspectives, objectives, and KPIs on the one hand, and sophisticated reporting and dashboard functionality to visualize the scorecard results and trends on the other, that gives your balanced scorecard initiative the edge. With the ability to manage and distribute your scorecards via the web to all required users, the sustainability and adoption of management by balanced scorecard is much easier to achieve.
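The core mechanics of such a framework – comparing each KPI's actual value against its target and deriving a traffic-light status per perspective – can be sketched as follows. The perspectives, KPI names, targets, and thresholds are illustrative assumptions, not a prescription from Kaplan and Norton:

```python
# Minimal sketch of scoring KPIs on a balanced scorecard.
# Perspectives, KPI names, targets, and the traffic-light
# thresholds are illustrative assumptions.

def kpi_status(actual, target, higher_is_better=True, warn_band=0.05):
    """Return 'green', 'yellow', or 'red' for one KPI.
    'yellow' means within 5% (the assumed warning band) of target."""
    ratio = actual / target if higher_is_better else target / actual
    if ratio >= 1.0:
        return "green"
    if ratio >= 1.0 - warn_band:
        return "yellow"
    return "red"

# (name, actual, target, higher_is_better) per perspective
scorecard = {
    "Financial": [("Net interest margin %", 2.1, 2.0, True)],
    "Customer":  [("Customer satisfaction index", 72, 80, True)],
    "Risk":      [("Loan default rate %", 1.8, 1.5, False)],
}

for perspective, kpis in scorecard.items():
    for name, actual, target, hib in kpis:
        print(perspective, "|", name, "->", kpi_status(actual, target, hib))
```

Note the "Risk" KPI: for a default rate, lower is better, which is exactly the kind of per-indicator detail a scorecard framework has to capture so that the cause-and-effect picture stays consistent.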

Thursday, September 10, 2009

New investments in secondary bonding

A bond is a debt security in which the authorized issuer owes the holders a debt and, depending on the terms of the bond, is obliged to pay interest (the so-called coupon) and/or to repay the principal at a later date (the maturity of the bond). In other words, a bond is a formal contract to repay borrowed money with interest at fixed intervals.

Thus, a bond is similar to a loan: the issuer is the borrower (debtor), the holder is the lender (creditor), and the coupon is the interest. Bonds are often used to provide the borrower with external funds to finance long-term investments, or, in case of a government bond, to finance current expenditure.

There are two types of bonds: those with fixed interest rates and those with variable interest payments.

The following bonds fall under the first category (fixed-interest):
Zero bonds – bonds without interest payments/coupons. The issue price must therefore be well below the nominal value of the bond.
Combined interest rate bonds – two different interest rates are assigned to the bond: a lower interest rate in the beginning and a higher interest rate later. That way the bond value goes up over time.
Straight bonds – the holder of the bond is entitled to a fixed payment by the issuing institution (in Germany normally on a yearly basis, in the U.S. half-yearly).
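The discount on a zero bond follows directly from compounding: the issue price is the nominal value discounted over the full maturity. A minimal sketch, with illustrative numbers:

```python
# Price of a zero bond: since it pays no coupon, its issue price
# is the nominal value discounted over the full maturity.
# Nominal, yield, and maturity below are illustrative assumptions.

def zero_bond_price(nominal, yearly_rate, years):
    return nominal / (1 + yearly_rate) ** years

# A 10-year zero bond with nominal value 1000 at a 5% yield:
price = zero_bond_price(1000.0, 0.05, 10)
print(round(price, 2))  # 613.91
```

This makes the statement above concrete: at a 5% yield, the issue price is barely 61% of the nominal value, and the holder's entire return sits in that discount.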

The second category (variable) can have the following forms:
Floaters – variable interest rates with a minimum (floor) and a maximum (cap), usually tied to a reference interest rate such as the one for government bonds.
SURF – Constant Maturity Treasury Step-Up Recovery Floating Rate Notes, abbreviated SURF. In contrast to normal floaters, their coupon is tied to the constant maturity Treasury rate rather than a short-term reference rate.
Reverse floaters – like normal floaters, the interest is tied to a reference interest rate, but in such a way that the coupon goes up when the reference rate declines.
Participation or income bonds – besides the redemption, a variable interest is paid based on the revenue or dividends of the issuer.
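The coupon mechanics of floaters and reverse floaters can be illustrated with a short sketch; the fixed rate, floor, and cap below are assumed values for demonstration, not market conventions:

```python
# Illustrative coupon calculations for floaters and reverse
# floaters. Fixed rate, reference rates, floor, and cap are
# assumptions for demonstration only.

def floater_coupon(reference_rate, floor=0.01, cap=0.06):
    """Normal floater: pays the reference rate, bounded by a
    floor and a cap."""
    return min(max(reference_rate, floor), cap)

def reverse_floater_coupon(fixed_rate, reference_rate):
    """Reverse floater: the coupon rises as the reference rate
    falls, and cannot go below zero."""
    return max(fixed_rate - reference_rate, 0.0)

print(floater_coupon(0.08))                            # capped at 0.06
print(round(reverse_floater_coupon(0.07, 0.02), 4))    # 0.05
print(reverse_floater_coupon(0.07, 0.09))              # floored at 0.0
```

The second call shows the defining property: with a fixed rate of 7% and a reference rate of only 2%, the reverse floater pays 5% – the lower the reference rate, the higher the coupon.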

The European market for bonds changed in 2005, when the government liability for bonds was discontinued. After that, no larger unsecured bonds were issued – until now!
This week, four major financial institutions (Deutsche Bank, WestLB, Société Générale, and Lloyds) successfully issued unsecured bonds (senior notes) as well as secondary bonds with a total value of almost €4.5bn ($6bn)!

Secondary bonds – i.e. subordinated bonds – are a mixture of shareholders' equity and borrowed capital. In case of bankruptcy these bonds are served with inferior priority, which makes them risky in the current economic circumstances. That is also true for senior notes, which are straight bonds but without the protection of a government guarantee. This makes them interesting for investors, because the increased risk comes with higher interest payments.

According to analysts, the issuance of unsecured bonds to such an extent – something that was impossible for banks in the spring of 2009 – shows how eager the banks are to prove their financial strength and their independence from government guarantees.

Deutsche Bank also had another reason for issuing its bond: it needs money for the takeover of Sal. Oppenheim.

However, it is a sign that the banks are trying to regain trust and overcome the crisis. Whether they succeed in the long run remains to be seen, but at least it looks like it.

Wednesday, August 19, 2009

The Porsche – Volkswagen deal and the power of hedge funds

The deal between Volkswagen and Porsche is complicated. In the past, Porsche bought VW shares and acquired an interest of around 30% in VW. At that point they were only interested in being a minority shareholder. Then they changed their strategy. The CFO of Porsche, Holger Haerter, developed – together with CEO Wendelin Wiedeking – a complex financing model to take over Volkswagen. Part of this strategy was to invest in options and bet on the stock price of VW. As a consequence, the VW shares were skyrocketing last year and made Porsche's cash till ring. The hedge funds also played a part in this bet. Then, after a couple of weeks, the VW shares went back to normal.

However, one of the large Porsche shareholders and a member of the board, Ferdinand Piëch, is also chairman of the VW board, and he had different plans for the company. He wanted the transfer to work the other way round, i.e. VW takes over Porsche. In the end, he won the power struggle, due to Porsche's problems with repaying its loans and finding new funds for the takeover. Wiedeking had to leave the company, together with his CFO.

Now the stock market is reacting to these developments again. This time the hedge funds are betting on further dropping share prices, which today led, on the one hand, to a drop of almost 20% in the stock price – one of the reasons being that Porsche has sold its options to Qatar. On the other hand, the non-voting preference shares went up.

That means the biggest European car maker has lost more than $28 billion of its market value in just two trading days. It remains to be seen whether the hedge funds are right and the downfall of the VW shares will continue.

Tuesday, August 18, 2009

Do we have to expect a second banking crisis?

The financial institutions in Europe are currently recovering slowly from the economic crisis, while several experts are already stirring up the next horror scenario. According to the economists' assessment, the consequences arising from company insolvencies will hit the financial sector in a second round – this time especially the savings and loans – at full tilt.

Small and medium-sized businesses (e.g. in Germany) have received their loans mainly from savings and loan institutions. Due to the significant decline in turnover and exports, those companies are struggling and often have to file for bankruptcy. Insolvencies and credit defaults of their clients will cause a lot of financial institutions to sway.

However, the institutions will most likely weather this storm better than the last crisis, according to the chief economist of Allianz, Michael Heise, and I agree with him. The banks have reacted to the crisis and invested heavily in risk provisioning. In addition, the governments' "rescue parachute" is already in place to help financial institutions in need.

Nevertheless, the savings and loans that managed to remain stable through the first wave (due to their lack of risk appetite and their concentration on retail banking) have – like the other financial institutions – to ensure that their equity ratio is high enough and that they fully understand the risk involved and the probability of default of their loans. The right level of detail for this kind of analysis, and appropriate reporting, is therefore a must.
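The core of such an analysis is the standard expected-loss calculation: probability of default (PD) times loss given default (LGD) times exposure at default (EAD), summed over the loan book. A minimal sketch with illustrative loan figures:

```python
# Minimal sketch of an expected-loss calculation over a loan book:
# expected loss = PD x LGD x EAD per loan, summed up.
# All loan figures below are illustrative assumptions.

def expected_loss(loans):
    return sum(l["pd"] * l["lgd"] * l["ead"] for l in loans)

loans = [
    {"pd": 0.02, "lgd": 0.45, "ead": 100000.0},  # solid SME loan
    {"pd": 0.10, "lgd": 0.60, "ead": 50000.0},   # struggling SME loan
]
print(round(expected_loss(loans), 2))  # 3900.0
```

It is exactly this per-loan level of detail – PDs that react to declining turnover, LGDs per collateral type – that the reporting has to support, rather than one aggregate number for the whole portfolio.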

Monday, August 17, 2009

Colonial Bank went bankrupt

On Friday, August 14, 2009, regulators of the Alabama State Banking Department shut down Colonial, based in Montgomery, AL – the largest US bank to fail this year.

The bank which held about $25 billion in assets was a big lender in real estate development.
The Federal Deposit Insurance Corporation (FDIC) was appointed receiver and approved the sale of Colonial's assets and $26.06 billion in deposits. Rival BB&T has taken over the bulk of Colonial's assets, the government banking agency said.

Before the takeover, BB&T wants to secure fresh money and has therefore initiated the sale of new shares worth $750 million. It will be the biggest takeover in BB&T's history.
Three other US banks failed the same day: the Community Bank of Arizona, the Community Bank of Nevada, and the Union Bank, National Association.

The US states hit hardest by the financial crisis are Florida and Georgia. Colonial operated mainly in Alabama and Florida and was therefore hit very hard by the burst of the real estate bubble. Unfortunately, the problems became too big, so that, in the end, the FDIC had to pull the ripcord.

Closer look at liquidity risk is crucial

Liquidity risk is the risk that a financial institution does not have sufficient liquid funds to meet its obligations when they fall due, or that it can secure them only at excessive cost.

Thus, effective management of liquidity risk allows a financial institution to meet its cash flow obligations – in all circumstances. Important for assessing liquidity risk and deriving the right management measures are robust cash flow models as well as sophisticated stress scenarios. Regulators are increasingly emphasizing stress testing as a critical component of a financial organization's risk management tool set, because it enables a better understanding of the liquidity risk profile and also helps to model the liquidity risk appetite.
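The idea behind such a stress scenario can be sketched in a few lines: apply a haircut to expected cash inflows, scale up outflows, and check whether the cumulative cash position stays positive in every period. The cash flows and stress factors below are illustrative assumptions, not regulatory parameters:

```python
# Minimal liquidity stress test sketch: haircut the inflows,
# stress the outflows, and check the cumulative cash position
# period by period. All figures are illustrative assumptions.

def survives_stress(inflows, outflows, opening_balance,
                    inflow_haircut=0.3, outflow_stress=1.2):
    """True if the cumulative balance never goes negative under
    the stressed cash flows."""
    balance = opening_balance
    for cash_in, cash_out in zip(inflows, outflows):
        balance += cash_in * (1 - inflow_haircut)   # only 70% arrives
        balance -= cash_out * outflow_stress        # 20% more leaves
        if balance < 0:
            return False  # funding gap in this period
    return True

inflows  = [100.0, 80.0, 120.0]   # expected inflows per period
outflows = [90.0, 110.0, 100.0]   # expected outflows per period
print(survives_stress(inflows, outflows, opening_balance=50.0))
```

Running the same book through scenarios of varying magnitude and duration – exactly what the FSA proposals below ask for – then amounts to varying the haircut, the stress factor, and the horizon.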

The Financial Services Authority (FSA) in the UK recently proposed new rules that are based on recently agreed international liquidity standards – in particular the Basel Committee on Banking Supervision's (BCBS) Principles for Sound Liquidity Risk Management and Supervision, published last June – and that also take into account the difficulties faced in the market over the past 18 months.

The FSA’s proposals emphasize the responsibility of firms’ senior management to adopt a sound approach to liquidity risk management, and present the following changes:

  • All regulated entities must have adequate liquidity and must not depend on other parts of their group to survive liquidity stresses, unless permitted to do so by the FSA.
  • A new systems and controls framework based on the recent work of the BCBS and the Committee of European Banking Supervisors (CEBS).
  • Individual liquidity adequacy standards for firms, based on firms being able to survive liquidity stresses of varying magnitude and duration.
  • A new framework for group-wide and cross-border management of liquidity allowing firms, through waivers and modifications, to deviate from self-sufficiency where this is appropriate and would not result in undue risk to clients.
  • A new reporting framework for liquidity, with the FSA collecting granular, standardized liquidity data at an appropriate frequency so the FSA can see firm-specific, sector- and market-wide views on liquidity risk exposures.

The FSA hopes to introduce the new rules in October 2009. These rules will come along with new reporting requirements. I believe these new rules are important, not just in the UK but for financial institutions as a whole, to manage liquidity risk more efficiently. Therefore, I would be surprised if these new regulations were not adopted quickly by the other (European) countries.

Friday, August 14, 2009

Funds Transfer Pricing

Funds Transfer Pricing (FTP) is an internal measurement and allocation process that assigns a profit contribution value to funds gathered and lent or invested by the bank. It is a critical component of the profitability measurement process of financial institutions, as it allocates the major contributor to profitability: the net interest margin. An intermediary – usually Treasury – is created within the organization (bank or insurance) to manage FTP.

The business units of the financial institution routinely receive funds from their depositing customers and other third parties. These funds are then invested in loans and investments (sometimes through different business units) to borrowing customers and/or third parties.

The amounts, terms, and interest rates of funds collected and invested are described in financial agreements between the organization and its customers. The interest payments on these funds contribute to the overall net interest margin of the institution, defined as the interest revenue earned on funds used to acquire assets less the interest expense on funds gathered. In other words, funds transfer pricing determines a cost of funds for the asset side and a value of funds for the liability side. The net interest margin of the institution and the value of its financial contracts fluctuate as market conditions and the underlying cash flows of the funds change over time.

Even though business units and customers participate in this continuous intermediation process, the contribution to the net interest margin and value is not equal across all participants, and the amount of funds received and provided rarely matches. The task of the funds transfer pricing process is therefore to measure and assign the discrete contribution of funds – when assessing their overall profit contribution – by business unit, product, and customer.
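The basic mechanics can be sketched with a single transfer rate: the asset side earns its yield minus the transfer rate, the liability side earns the transfer rate minus its cost of funds, and Treasury keeps the residual. Balances and rates below are illustrative assumptions:

```python
# Illustrative decomposition of the net interest margin (NIM)
# using a single funds transfer rate. All balances and rates
# are assumptions for demonstration.

def ftp_decomposition(asset_balance, asset_yield,
                      liability_balance, liability_cost,
                      transfer_rate):
    nim = asset_balance * asset_yield - liability_balance * liability_cost
    # The lending unit earns its yield over the transfer rate.
    asset_spread = asset_balance * (asset_yield - transfer_rate)
    # The deposit unit earns the transfer rate over its cost of funds.
    liability_spread = liability_balance * (transfer_rate - liability_cost)
    # Treasury keeps the residual (the funding mismatch).
    treasury = nim - asset_spread - liability_spread
    return {"nim": nim, "asset_spread": asset_spread,
            "liability_spread": liability_spread, "treasury": treasury}

result = ftp_decomposition(asset_balance=1000.0, asset_yield=0.06,
                           liability_balance=1000.0, liability_cost=0.02,
                           transfer_rate=0.04)
for component, value in result.items():
    print(component, round(value, 2))
```

With matched balances and a single rate, Treasury's residual is zero; once balances or terms mismatch, the residual shows exactly how much of the margin comes from the funding position rather than from the business units.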

There are usually two methods to calculate the net margin and value contribution of funds: the pooled approach and the specific assignment approach. The pooled approach assigns funds to financial instrument pools created under a predefined set of criteria (e.g. type of balance, term, repricing term, payment frequency, and origination), with transfer rates derived either internally, based on actual rates earned or paid, or alternatively from market-derived interest rates adjusted for risk.
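A pooled assignment can be sketched as follows: each instrument is mapped to a pool (here by term only, for simplicity), the pool supplies the transfer rate, and the spread is computed per instrument. The pool definitions, rates, and book are illustrative assumptions:

```python
# Sketch of a pooled funds transfer pricing assignment. The pool
# criteria (term only), the transfer rates, and the instruments
# are illustrative assumptions.

POOL_RATES = {
    "short (<1y)":  0.020,
    "medium (1-5y)": 0.030,
    "long (>5y)":   0.040,
}

def pool_for_term(years):
    """Map an instrument's term to its pool."""
    if years < 1:
        return "short (<1y)"
    if years <= 5:
        return "medium (1-5y)"
    return "long (>5y)"

def instrument_spread(instrument):
    """Spread contribution of one instrument against its pool rate."""
    transfer_rate = POOL_RATES[pool_for_term(instrument["term_years"])]
    if instrument["side"] == "asset":
        return instrument["balance"] * (instrument["rate"] - transfer_rate)
    return instrument["balance"] * (transfer_rate - instrument["rate"])

book = [
    {"side": "asset",     "balance": 500.0, "rate": 0.055, "term_years": 7},
    {"side": "liability", "balance": 400.0, "rate": 0.015, "term_years": 0.5},
]
total = sum(instrument_spread(i) for i in book)
print(round(total, 2))
```

In a real implementation the pool criteria would of course include repricing term, payment frequency, and origination as described above; the term-only mapping here just keeps the sketch readable.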

The specific assignment or single-rate approach uses transfer rates based on asset yields, which favors the contribution of net funds providers. The deficiency of that approach is that it assumes all funds have equal importance to the financial institution; no differentiation based on the value of fund attributes or the market conditions at the origination of the transaction is taken into account. Multiple-pool approaches that use contemporary market rates, on the other hand, lack the ability to benchmark management decisions made at the time of initial transaction pricing.

In any case, by assigning a transfer price to each component of the balance sheet, you can compare the earnings resulting from the use of each asset to alternative uses, compare the cost of each source of funds (liabilities) to alternative sources, and measure the profit contribution of each asset or liability. The typical reports are either two-dimensional or OLAP reports that initially show the balance sheet with various comparison columns and then offer drill-down capabilities by the various criteria. Dashboards showing the relevant measures, such as the net interest margin, various visual representations of the balance sheet components, the comparisons mentioned above (e.g. as bar charts), and the yield as a graph, are also commonly used.

Because all financial instruments need to be pooled and calculated to derive the funds transfer prices, the data volume is very high, and the reporting tool needs to cope with this volume efficiently. In addition, the format of the two-dimensional reports is usually prescribed. Hence, the BI tool needs to provide “pixel-perfect” reporting, i.e. the ability to arrange all items on the report exactly as needed – something that not many reporting tools can master.

Thursday, August 13, 2009

Scalability of data

The financial services industry is known for its large data volumes. The need for detailed customer information and extensive financial analysis, the consideration of risk management, regulatory changes, and fragmented liquidity have led to an explosion of (market) data volume.

There is also the fact that all institutions in financial services have to deal with very sensitive data. Detailed information on credit card holders and their transactions, the medical history of health insurance policyholders, and the financial situation of clients (e.g. in wealth management) are just a few examples of the confidential data these organizations handle on a daily basis. That is why the demand for highly sophisticated security mechanisms is a common theme in this industry.

A third determining factor for financial services is performance. Trading departments need real-time information on the stock market; a hedge fund manager wants to follow up on the performance of his fund with the ability to drill deeper to understand the cause of changes; the CRO requires quick information to manage risk and minimize the expected loss; a financial analyst in an insurance company is interested in finding hidden patterns in the customer data that open up new business opportunities.

Thus, financial institutions require a scalable solution that can cope with large data volumes, ensure the right level of security, and handle the magnitude of requests from stakeholders and a growing user community.

These needs have resulted in huge investments in IT architecture to manage the amount of data and provide the right framework. One of my customers, one of the biggest banks worldwide, had ordered hardware with multiple petabytes of disk space. Their aim was to report on the consolidated bank at the detailed financial instrument level on a daily basis. The data load for this kind of endeavor into a data warehouse (DW) had to be optimized with sophisticated sort algorithms in order to provide the data on time. Once the data was in the operational data store of the DW, it had to be cleansed, enriched and then loaded into a data mart optimized for reporting.

While the development of a well-defined data warehouse is a good concept and worth doing, in this particular instance the time between when the transactions took place and their reflection in a report was just too long. For the daily group consolidation report it was sufficient, but only if the intercompany transactions matched. If not, an exception report was produced and someone had to check all payables and receivables whose relations did not match. Once this manual review was finished, the corrected results were entered into the enterprise resource planning (ERP) system, which then triggered an update of the DW and the subsequent processes.

That is not optimal for ad hoc reporting and definitely not a solution for some of the demanding business requests mentioned above (time is money). Therefore, a reporting tool that can access the transactional data directly, combine it with the information from the other sources, and present the information with the help of an integrated metadata layer is preferable.

Depending on the role of the business user, it is not always necessary to have the full low-level detail visible in a report. The CFO, for example, wants to get a quick overview of the business; an aggregated dashboard with the key performance indicators relevant for the CFO will work. Wherever a more thorough analysis is required, he can look at the KPIs from different angles (e.g. segments, products, channels) just by clicking on a different tab of the dashboard. If that is still not detailed enough, a deep dive into the transaction report directly from the dashboard is possible. We have implemented that multiple times, and it is always remarkable how well received the variety of visualizations, the flexibility of report development, the ease of use, and the scalability of the reporting tool are. The ability to drill anywhere with great performance, even when handling enormous data volumes, is essential for the business to become as efficient as possible and is a great competitive advantage!

Solvency II and its consequences

The first regulatory requirements for insurance companies in the European Union, known as Solvency I, were introduced in the 1970s. Since then, sophisticated risk management systems have been developed.

Solvency II introduces a comprehensive risk management framework for defining required capital levels and for implementing procedures to identify, measure, and manage risk. It is the updated set of regulatory requirements for insurance firms that operate in the European Union.
The rationale for the European Union behind this framework is the development of a Single Market in insurance services in Europe, whilst at the same time securing an adequate level of consumer protection.

Solvency II is based on economic principles for the measurement of assets and liabilities. Risk is measured on consistent principles, and capital requirements depend directly on it, which makes Solvency II a risk-based system, somewhat similar to the Basel II banking regulations. It also consists of three pillars:

· Pillar 1 focuses on the quantitative requirements (e.g. the amount of capital an insurer should hold)
· Pillar 2 consists of requirements for the governance and risk management of insurers, as well as the effective supervision of insurers
· Pillar 3 concentrates on disclosure and transparency requirements

A solvency capital requirement may serve the following purposes:
· Reduce the risk that an insurer would be unable to meet claims
· Reduce the losses suffered by policyholders in the event that a firm is unable to meet all claims fully
· Provide early warnings for supervisors so that they can react promptly if capital falls below the required level
· Improve the confidence level in the financial stability of the insurance sector
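To illustrate how a risk-based capital requirement under Pillar 1 can be computed, here is a sketch of the correlation-based aggregation used by the Solvency II standard formula. The risk modules, their capital charges, and the correlation values below are illustrative placeholders, not the official calibration.

```python
# Sketch of the Solvency II standard-formula aggregation of risk-module
# capital charges via a correlation matrix. All numbers are illustrative.
from math import sqrt

modules = ["market", "default", "life"]
scr = {"market": 120.0, "default": 30.0, "life": 80.0}   # illustrative charges
corr = {                                                  # symmetric correlations
    ("market", "market"): 1.0, ("default", "default"): 1.0, ("life", "life"): 1.0,
    ("market", "default"): 0.25, ("market", "life"): 0.25, ("default", "life"): 0.25,
}

def rho(i, j):
    """Look up the correlation in either order."""
    return corr.get((i, j), corr.get((j, i)))

# Basic SCR = sqrt( sum_ij rho_ij * SCR_i * SCR_j )
basic_scr = sqrt(sum(rho(i, j) * scr[i] * scr[j] for i in modules for j in modules))
print(f"Basic SCR: {basic_scr:.1f}")
```

Because the correlations are below 1, the aggregated requirement is less than the sum of the individual charges, reflecting diversification between risk modules.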

I think Solvency II is an important step forward in the effort to improve insurance regulation, to foster risk assessments and to rationalize the management of large firms. The directive, especially if complemented by indicators that take the lessons from the crisis into account, would remedy the present fragmentation of rules in the EU and allow for a more comprehensive, qualitative and economic assessment of the risks.

The directive has been agreed by the EU and will be implemented starting in 2012. For the insurance industry this will mean a paradigm shift in their business-related decision processes. The goal is not to create more regulations. It must be in the interest of the industry itself to find solutions, risk models, and other concepts that prepare them for upcoming crises. Solvency II is just a vehicle to articulate those needs and guide the insurers in the right direction.

Right now, the insurance industry has reached the point where it understands the necessity of intelligent risk management. It is recognized as a value-generating process as well as a competitive advantage. However, the implementation of Solvency II is still in the early stages.

It is easier for large insurance organizations to budget for the development of company-specific risk models. For small insurers this is a real obstacle, which often leads them to build their models in Excel instead.

Besides, the main hurdles for the implementation of Solvency II seem to be data availability and quality. Insurance companies are known for their heterogeneous IT system architectures. Data consistency, data integrity, flexibility, and good performance of reporting and analysis are not easy to achieve. In fact, the contrary is often true!

So what are the consequences?

Even though insurers still have some time before the implementation of Solvency II is mandatory, they definitely need to start working on a solution that fits their needs – for their own sake.

Insurance companies do not have all the answers themselves. They need to reach out to experts who know how to extract their data, identify the right KPIs, build an integrated, consistent data layer of good quality, and create a flexible, well-performing reporting solution that meets their needs and the requirements of Solvency II.

An integrated BI system that enables the business to analyze their data quickly, down to the deepest level, while at the same time building confidence in the accuracy of the data is priceless. The feedback that I am receiving from my clients is that easily understandable, state-of-the-art reporting capabilities (like dynamic dashboards) that allow user-specific visualization of information from different angles – without overwhelming the user – help significantly with the adoption of the new regulations.

Wednesday, August 12, 2009

Profits for insurers in a tough market

It is obvious that we currently live in challenging market conditions. Companies are forced to rethink their strategies and their approach to business. Insurance companies are no exception. Like other industries, insurers are taking a closer look inward and examining the unrealized value of different sources of revenue.

Typically, the two main sources of revenue for insurers are:
1. Underwriting profits
2. Investment gains

While insurance companies have not suffered from the financial crisis as severely as the banks, returns on investment have decreased significantly in a barren economic environment. Thus, companies are forced to maximize their profits from underwriting and ensure that they have optimal strategies to achieve this end.

A critical strategy for insurers to maximize returns from writing new business is creating new sales opportunities for existing customers. That sounds simple but the challenge is to identify the right products for the proper audience. Therefore insurers strive to gain access to insightful policyholder information that helps them sell more.

Of course it is not possible for companies to know everything about their customers. Still, they attempt to learn as much as possible in order to segment customers, position products, and target customers most effectively.

A problem unique to insurance companies is the lack of interaction with their customers; they are dependent on loyalty. The relationship between a consumer and an insurer is normally initiated at the time of a significant life event, such as the purchase of a new car, a new house, or the anticipation of a new baby.

Once the customer has evaluated the insurance market to get the best coverage for a low price, he contacts an agent or the insurer directly to get the deal done. Once the policy is signed, contact between the policyholder and the insurer is limited, if it occurs at all. Automated policy administration systems take care of billing and renewals. Delinquencies and address changes seem to be the only reasons for interaction. When these rare situations occur, most insurers are not prepared to foster the relationship to achieve stronger customer loyalty and/or capitalize on the sales opportunity with intelligent customer insight.

A proper customer analysis for an insurance company usually has to deal with fragmented data sources, a legacy of the mergers and acquisitions that were necessary to penetrate new markets or utilize new channels. An integrated view of this data is – even when it is tedious to build – important for proper customer care.

Business Intelligence (BI) solutions are the answer to these demands. They are not just reporting tools. They can translate the data into actionable insight and additional revenue opportunities for the businesses.

Today, best-of-breed BI platforms can be cost-effectively deployed to tap into and query a number of different sources of data to produce useful information based upon historical and real-time data, and predictive models. Resulting information about customer demographics, product performance, and next-best actions can empower companies to uncover hidden opportunities and devise strategies to maximize customer value.

What do insurance companies know about their customers? Which customers are most profitable? Can a positive ROI be generated with this customer? What is his growth potential? What product bundles can be marketed to him?

These are just a few of the questions that BI can help answer. Combining multiple data sources such as marketing data, policy statistics, financial data, and demographics with a unified, integrated metadata layer can lead to the right product mix. Clustering customers and using predictive analytics can identify previously overlooked patterns in loyalty and buying behavior. In addition, managers of multiple lines of business have more data at their disposal, enjoy greater flexibility with more analytic capabilities, and can devise unique, targeted offerings.
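As a rough illustration of the clustering idea, here is a minimal k-means sketch in pure Python. The customer records, the two features, and the choice of two segments are made-up assumptions; a real project would use many more attributes and a proper analytics library.

```python
# Minimal k-means sketch for customer segmentation (pure Python).
# Customer records (annual_premium, policy_count) are made-up data.
import random

customers = [
    (300, 1), (350, 1), (280, 2), (1200, 4), (1500, 5), (1100, 3),
]

def kmeans(points, k=2, iters=20, seed=42):
    """Cluster 2-feature points into k segments with plain Lloyd iterations."""
    random.seed(seed)
    centers = random.sample(points, k)          # pick k initial centers
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:                        # assign each point to nearest center
            nearest = min(range(k), key=lambda c: (p[0] - centers[c][0]) ** 2
                                                + (p[1] - centers[c][1]) ** 2)
            clusters[nearest].append(p)
        centers = [tuple(sum(dim) / len(c) for dim in zip(*c)) if c else centers[i]
                   for i, c in enumerate(clusters)]   # recompute centroids
    return centers, clusters

centers, clusters = kmeans(customers)
for center, members in zip(centers, clusters):
    print(f"segment center {center}: {len(members)} customers")
```

On this toy data the algorithm separates the low-premium from the high-premium customers, i.e. exactly the kind of segmentation a marketing team could act on.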

Another advantage of BI – in comparison to the traditional spreadsheets still very popular with insurers – is the built-in security. Customer data is very sensitive and requires sophisticated security mechanisms to prevent unauthorized access. Besides, the various intuitive visual representations of information that BI provides (e.g. ad hoc reporting or dashboard books that present all relevant information at a glance) and the fast distribution of reports to a wider audience are key factors in an efficient information delivery strategy.

Last but not least, the demand for scorecards that represent the key performance indicators of the insurance company (branch or department) in a color scheme – i.e. “traffic lights”, where green represents a KPI on target, yellow indicates a possible problem, and red shows a value that is out of range – is becoming more and more evident.
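The traffic-light logic described above is easy to sketch. The tolerance thresholds below (green within 5% of target, yellow within 15%) and the sample KPIs are illustrative assumptions.

```python
# Hypothetical traffic-light classification for scorecard KPIs.
# Thresholds and sample KPI values are illustrative assumptions.

def traffic_light(actual, target, green_tol=0.05, yellow_tol=0.15):
    """Classify a KPI by its relative deviation from target."""
    deviation = abs(actual - target) / target
    if deviation <= green_tol:
        return "green"
    if deviation <= yellow_tol:
        return "yellow"
    return "red"

kpis = {"combined ratio": (0.97, 0.95), "premium growth": (0.02, 0.05)}
for name, (actual, target) in kpis.items():
    print(f"{name}: {traffic_light(actual, target)}")
```

In a real scorecard the tolerances would differ per KPI and per direction (e.g. a combined ratio below target is good, not merely "on target"), but the mapping to colors stays this simple.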

New reporting requirements due to bad banks

Bad bank is a term for a financial institution created to hold nonperforming assets owned by a federally insured bank. The purpose of such institutions is to address challenges arising during an economic credit crunch, wherein private banks are allowed to take problem assets off their books.

In the United States, the Emergency Economic Stabilization Act of 2008 (commonly referred to as a bailout of the U.S. financial system) suggested the creation of such bad banks as a response to the ongoing subprime mortgage crisis in the U.S.

Other countries, like Germany, followed this idea as a result of the financial crisis.
The problem with bad banks is the moral hazard effect. A construct that allows a financial institution to transfer its risks to the bad bank and the government creates an incentive for such institutions to take on higher risks in the confidence that they can pass them on. In addition, the bad bank is not a ‘normal’ market participant. The banks may gain trust by transferring their bad assets (e.g. asset-backed securities), but the bad bank itself does not. The bad bank is – until it is reincorporated – dependent on government funds.

A positive example of a bad bank is Securum, a Swedish bank founded in 1992 to take on and unwind bad debts from the partly state-owned Nordbanken during the Swedish financial crisis of 1990–1994. Many of the debts were owed by real-estate companies, and stabilizing the property market became one of Securum's goals.

The company took over a quarter of the bank's credit portfolio, comprising 3,000 credits involving 1,274 companies, and the management of Securum was given a free hand. By 1994 a large number of the credits had been unwound, and by the summer of 1997 Securum itself could be wound down.

The ways bad banks are structured differ. In Germany, for example, banks are now allowed to transfer their bad assets into such a bad bank to adjust and unburden their balance sheets, which enables them to raise fresh equity capital. However, it is not that simple. The banks have to set up a special purpose vehicle that buys the toxic assets for 90% of their value as of 30 June 2008. Since their current value is much lower, the banks have to pay the difference in equal installments over a period of 20 years (plus a fee for the government guarantees). This means they cannot get rid of the problematic assets entirely.
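The installment mechanics of the German scheme described above can be illustrated with made-up numbers; only the 90% transfer price and the 20-year period come from the scheme itself, and fees and discounting are ignored here.

```python
# Worked example of the German bad-bank installment mechanics.
# Book value and current market value are illustrative figures.

book_value_2008 = 1_000.0                  # value as of 30 June 2008 (made up)
transfer_price = 0.90 * book_value_2008    # bad bank buys at 90% of that value
current_value = 400.0                      # assumed current market value

# The bank repays the difference between transfer price and current value
# in equal installments over 20 years (fees and discounting ignored).
shortfall = transfer_price - current_value
annual_installment = shortfall / 20

print(f"transfer price:     {transfer_price:,.0f}")
print(f"shortfall:          {shortfall:,.0f}")
print(f"annual installment: {annual_installment:,.0f}")
```

Even in this simplified form it is clear why the scheme does not make the problem disappear: the shortfall stays on the bank's internal management view as a 20-year payment obligation.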

For a financial institution that follows this path (participation is voluntary!), the finance and controlling departments as well as the risk management department will have more work to do: additional reporting requirements will arise.

The normal financial statement will no longer include the toxic assets, which shrinks the balance sheet. However, the bank has to build accruals for the installments (if its reporting GAAP allows accruals to be built). In addition, banks will also build an internal management view of the financial statement which includes the impact of the bad assets, i.e. the installments.

Additional reporting will also be required on the risk management side for banks that are evaluating the possibility of using a bad bank. The calculation of risk needs to be reviewed, and new KPIs have to be included in the dashboards to simulate the impact of the toxic assets on and off the balance sheet.

Furthermore, the executive board, which is now sometimes influenced or controlled by the government, has additional reporting needs that give it better oversight of the business.

The risk appetite of financial institutions (despite the bad banks in place) may have changed but is still apparent, yet the risk models and the funds transfer pricing calculations must be adjusted to the new circumstances. These adjustments will most likely impact the reporting needs as well. More in-depth knowledge of the loans, mortgages, etc. by various attributes/dimensions is required in order to fully understand the loss given default (LGD), the probability of default (PD), and the expected exposure (EE) – to name just a few risk indicators.
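One standard way these indicators fit together is the expected-loss relationship EL = PD × LGD × EAD, sketched below with illustrative loan data. EAD (exposure at default) stands in here for the exposure measure mentioned above; all loan IDs and parameters are made-up assumptions.

```python
# Sketch of the standard expected-loss relationship EL = PD * LGD * EAD,
# computed per loan and aggregated. All loan data are illustrative.

loans = [
    # (loan_id, PD, LGD, EAD)
    ("mortgage-001", 0.02, 0.25, 200_000),
    ("sme-017",      0.05, 0.45, 150_000),
    ("consumer-203", 0.08, 0.60,  20_000),
]

def expected_loss(pd, lgd, ead):
    """Probability of default x loss given default x exposure at default."""
    return pd * lgd * ead

for loan_id, pd, lgd, ead in loans:
    print(f"{loan_id}: EL = {expected_loss(pd, lgd, ead):,.0f}")

portfolio_el = sum(expected_loss(pd, lgd, ead) for _, pd, lgd, ead in loans)
print(f"portfolio EL: {portfolio_el:,.0f}")
```

Slicing this calculation by the attributes mentioned above (product, segment, vintage, collateral type) is exactly the kind of drill-down the reporting tool has to support.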

The analysis of risk is complicated and requires a vast amount of data. It is not sufficient to concentrate on a single “risk cube” or “risk data mart” for reporting. Financial institutions demand analysis across multiple source systems, including external benchmarks, as well as low-level detail (probably down to the transaction level).

A reporting tool that can handle this multi-sourcing without any problems and can drill anywhere down to the detailed data is therefore mandatory – at least, that is the feedback I am receiving from my customers.

Tuesday, August 11, 2009

Does size matter for an investment fund?

In the line of business, the size of an investment fund is usually considered the main indicator of its success. However, sometimes the volume can become a burden because it makes it more difficult to respond quickly to changing market conditions.

Top funds with an A-rating, good performance, and a couple of years of market presence manage much higher volumes than those with bad performance (according to a study from the Feri Eurorating Services rating agency). The probability of a causal relationship between the success of a fund and the willingness of investors to invest, which drives the volume, is obviously pretty high.

However, that does not necessarily mean that small funds cannot produce good performance. In niche markets especially, smaller funds are preferable because they allow a more flexible fund strategy.

I think the main differentiator for sustainable success is information and how this information is used.

What does that mean for the fund managers / the investment bank?
  • The investment bank should try to diversify its portfolio to mitigate risk. This can be achieved by issuing a good mixture of larger and smaller funds.
  • Fund managers depend on data from different sources – market data, benchmarks, financial information, etc. – to make better-informed decisions.
  • Fund managers of larger funds cannot react as quickly as fund managers of smaller funds. Both need the right KPIs (e.g. risk measures like volatility and VaR) at their disposal at all times. They also need to analyze the data from different angles.
  • The strategy may differ for funds of different sizes, but the information demand is similar. Transparency of data is always key.
  • In order to obtain good ratings for their funds, they need to impress their investors as well as the rating agencies. The best way to do this is through good performance and an excellent presentation of the fund's progression.
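The two risk KPIs mentioned above, volatility and VaR, can be sketched as follows. The daily return series is an illustrative assumption, and the VaR uses the simple historical method rather than any particular fund's model.

```python
# Sketch of two fund risk KPIs: volatility (sample standard deviation of
# returns) and historical value-at-risk (VaR). Returns are made-up data.
from statistics import stdev

daily_returns = [0.004, -0.012, 0.007, -0.003, 0.010, -0.021, 0.002,
                 0.005, -0.008, 0.013, -0.015, 0.001, 0.006, -0.004]

volatility = stdev(daily_returns)

def historical_var(returns, confidence=0.95):
    """Loss threshold not exceeded with the given confidence (historical method)."""
    ordered = sorted(returns)                      # worst returns first
    index = int((1 - confidence) * len(ordered))   # tail cut-off
    return -ordered[index]                         # VaR quoted as a positive loss

print(f"volatility: {volatility:.4f}")
print(f"95% VaR:    {historical_var(daily_returns):.4f}")
```

With a real fund, the same two numbers would be recomputed daily over a rolling window and shown on the manager's dashboard next to the performance figures.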

Detailed information about the performance of the fund and the reasons for the development across multiple data sources, transparency, flexibility of data analysis, and good performance when handling large volumes of data are key factors for the success of fund managers. That is the domain expertise of Business Intelligence.

To sum up, the volume of a fund may be important, but without knowledge of and trust in the data a fund manager cannot ensure the future success of his fund. I have seen it multiple times: dashboards that present the fund manager with all relevant information at a glance, with the ability to analyze further and the option to distribute the results easily to other stakeholders, can make the difference between a well-performing and an excellently performing investment.

Monday, August 10, 2009

Control of Rating Agencies

A credit rating agency (CRA) is a company that assigns credit ratings to issuers of certain types of debt obligations, as well as to the debt instruments themselves. Sometimes, the servicers of the underlying debt are also given ratings.

Most of the time, the issuers of securities are companies, special purpose entities, non-profit organizations, or governments (local, state or national) issuing bonds or other debt-like securities that can be traded on a secondary market.

A credit rating for an issuer takes into consideration the issuer's credit worthiness (i.e., its ability to pay back a loan), and affects the interest rate applied to the particular security being issued.

There are more than 150 rating agencies in existence (with a local or industry focus), but the big three dominate the market:
• Moody's
• Standard & Poor's (S&P)
• Fitch

In the beginning (around 1909), investors had to pay for the ratings. This changed in the 1970s, when the issuers started paying the agencies' bills. Nowadays it is a mixture of both, but most of the time the issuers pay the rating agencies.

This is causing problems.
First, the three big agencies can influence the market quite significantly due to their market position and reputation. Their ratings can become a "self-fulfilling prophecy".
Second, the rating agencies receive their money from their clients and are therefore dependent on their clients' goodwill. An objective rating can therefore not always be expected.
As a consequence, especially due to the current financial crisis, the demand for more (independent) control of the rating agencies is growing.

A perfect example is the rating of commercial mortgage-backed securities (CMBS).

It is obvious that the real estate bubble / subprime crisis was one of the main triggers of the current economic downturn. Thousands of real estate loans with a high risk of default were packaged into very complicated collateralized debt obligations (CDOs), e.g. CMBS, and sold all over the world.
When the real estate market plummeted and the credit default rate increased dramatically, the financial institutions that had invested heavily in the CDOs had to write down their values dramatically in their balance sheets.

The governments around the globe fought the crisis but also demanded greater control of the agencies, as they had consistently given these risky financial products top ratings.

The rating agencies reacted in July: S&P downgraded the CMBS to a very low "BBB-". However, the main issuers of the CMBS were Goldman Sachs, JP Morgan Chase, Morgan Stanley, Credit Suisse, and Wachovia.
Without a top rating they could not deposit their CMBS – within the framework of the existing Federal Reserve program – as security in return for credits. Hence they were up in arms about the S&P rating.

S&P, which had just downgraded these complex financial products, withdrew its ratings one week later due to the enormous pressure from the issuers – its clients – and awarded the CMBS the top rating of "AAA" again!

Of course there are justifications for this move, and yes, in general (e.g. in the years before the crisis) CMBS may be a safe store of value, but they are so complex that even bank clerks have difficulty understanding the underlying risk. Some of these financial products require reading up to 90,000 pages to fully understand their structure. A rating of AAA is therefore more than questionable.

So what is the bottom line?
In my mind, the dependency of rating agencies on their clients has gone too far. They are not objective. An independent control board (run by the governments and/or the central banks) should be put in place. In addition, the payment model of rating agencies needs review. I think we should go back to the concept that investors pay for the ratings, not the issuers.
As Ben Bernanke, the chairman of the Federal Reserve, has said, the insolvency risk of this market carries a huge risk for the whole economy. He therefore demanded that the problematic loans be restructured in a way that reduces the probability of default significantly. He is right!