German halt casts shadow over European GDP recovery

Europe’s post-recession recovery has hit a roadblock as German economic growth unexpectedly ground to a halt in the final quarter of 2009, though France made up for some of the damage.

The news comes at a hard time for Europe’s single currency bloc as governments struggle to sort out Greece’s debt difficulties and contain financial market fears that have driven the euro lower and government bond yields higher.

German GDP was flat in the final quarter of last year, the statistics office said, following expansions in the two previous quarters that ended a year-long recession.

Growth of 0.2 percent had been forecast.

France, the Eurozone’s second largest economy, fared better in the same period with a GDP increase of 0.6 percent versus the third quarter, a rise that, unlike in Germany, was driven in large part by healthy consumer spending.

That meant France maintained its recovery pace, after a GDP rise of 0.5 percent in the third quarter.

Those figures followed confirmation that Spain, one of the hardest hit by the end of housing booms across the globe, stayed in recession with a fourth quarter GDP dip of 0.1 percent versus the previous three months.

“The Eurozone growth engine has taken a break in the fourth quarter but it should return soon,” ING bank economist Carsten Brzeski said.

“Today’s numbers, however, were a good reminder that recoveries can not only be bumpy but also capricious.”

Part of Europe’s problem is that it needs a reasonable pace of economic growth to help limit the surge in sovereign debt caused by the recession of 2008-2009.

Forecasters for now believe that the Eurozone will have a weaker recovery than the US this year, just as it fell harder than the US in 2009.

Devilish details
In France, where the recession knocked 2.2 percent off GDP in 2009 as a whole compared to a five percent hit in Germany, the end of year news was marginally more positive than expected, once again as a result of domestic demand.

French data showed a 0.9 percent quarter-on-quarter increase in household consumption in the fourth quarter.

That helped to offset the damage from tumbling investment, a major casualty of recession, which fell 1.2 percent in the final quarter versus the preceding one and dropped 6.9 percent over 2009 as a whole.

“This is good news, which contrasts with the very disappointing figures out of Germany,” said Jean-Louis Mourier, economist at Aurel BGC. “The main support came from consumption and stocks.”

Germany’s statistics office said only that falling investment and consumption offset firmer foreign trade in the fourth quarter.

ECB President Jean-Claude Trichet has been warning for some time that the recovery in Europe will be “bumpy” and at times “chaotic”.

Severe winter weather at the turn of the year may well blur the picture further. Economists say that could hit GDP in the first quarter but that much of such immediate losses are usually made up in the ensuing months.

German Economy Minister Rainer Bruederle has already said growth in the first quarter of 2010 could be near zero.

EU assembly rejects US bank data deal

A nine-month interim agreement went into force provisionally at the start of February but deputies from the assembly’s Liberal, Socialist and Green groups opposed it on the grounds it failed to protect the privacy of EU citizens.

Washington will now have to seek other ways to access information on money transfers in Europe until it can negotiate a permanent agreement with the EU. It says such data is vital to track terror suspects.

Underscoring US concerns, Treasury Secretary Timothy Geithner and Secretary of State Hillary Clinton wrote to parliament president Jerzy Buzek to ask for support.

EU governments also made last-minute appeals, pledging to give deputies better access to future talks with Washington.

But Buzek said more data privacy protection was needed.

“The majority view … is that the correct balance between security, on the one hand, and the protection of civil liberties and fundamental rights, on the other, has not been achieved,” Buzek said in a statement after the vote.

Washington previously had access to the data, collected by the Society for Worldwide Interbank Financial Telecommunication (SWIFT), which registers money transfers among states. But it lost it when SWIFT changed its server infrastructure in recent months. It now wants a permanent agreement on data sharing.

One way to regain access could be to seek bilateral agreements with states hosting SWIFT servers, Switzerland and the Netherlands, EU diplomats say.

“This outcome is a setback for counter-terrorism cooperation (between the US and the EU). We are disappointed and are evaluating our options how to proceed from here,” a US government spokesperson said, without elaborating.

Power struggle in Brussels
Concerns over data protection have dogged data transfers since SWIFT said in 2006 it was cooperating with the US authorities as part of their anti-terror activities after the September 11, 2001 attacks.

The rejection of the interim deal by the European Parliament is also part of a drive to ensure that its expanded powers, gained under the EU’s Lisbon treaty, are respected in practice.

Among other provisions, the treaty allows the parliament to decide jointly with EU governments on justice and home affairs.

To allay parliament’s concerns, EU Member States pledged to deepen privacy protection in any new agreement and allow deputies better access to classified documents subject to the negotiations.

Under the agreement rejected by parliament, investigators would already have been required to justify each request for information and could have sought access only to data on people suspected of terrorist activities.

The Spanish government, which holds the EU’s six-month rotating presidency, also sought to press the benefits of data sharing. A cabinet minister told parliament that information exchange had helped thwart an attack on Barcelona and data was used to investigate the 2004 Madrid train bombings.

US foreclosures drop in Jan but more loom

US mortgage foreclosure filings dropped in January but the decline may prove only temporary as housing-rescue efforts fall short of addressing current drivers, a report released on Thursday showed.

Foreclosures remain one of the biggest threats to the US housing market, which is still highly vulnerable to setbacks and heavily reliant on government intervention. If foreclosures continue dropping it would be one of the strongest signals yet that the market is on the path to recovery.

Foreclosure filings – including mortgage default notices, house auctions and home repossessions by banks – were reported on 315,716 properties in January, a decrease of nearly 10 percent from December, but up 15 percent from the year-earlier month, real estate data firm RealtyTrac said.

One in every 409 US housing units received a foreclosure filing in January, Irvine, California-based RealtyTrac said in its January 2010 US Foreclosure Market Report.

Furthermore, January was the 11th straight month in which more than 300,000 properties received foreclosure filings, the firm said.

While January’s decrease may indicate foreclosure prevention efforts are gaining traction, the data has been volatile in recent months and foreclosures appear poised to rise again.

“January foreclosure numbers are exhibiting a pattern very similar to a year ago: a double-digit percentage jump in December foreclosure activity followed by a 10 percent drop in January,” James J Saccacio, chief executive officer of RealtyTrac, said in a statement.

“If history repeats itself we will see a surge in the numbers over the next few months as lenders foreclose on delinquent loans where neither the existing loan modification programs or the new short sale and deed-in-lieu of foreclosure alternatives works,” he said.

REO, or real estate-owned, activity nationwide was down five percent from the previous month but still up 31 percent from January 2009; default notices were down 12 percent from the previous month but up four percent from January 2009; and scheduled foreclosure auctions were down 11 percent from the previous month and up 15 percent from January 2009, RealtyTrac said.

High unemployment and wage cuts have hurt the ability of many home owners to pay monthly mortgage payments. Unemployment was 9.7 percent in January, according to the Labor Department.

Many lawmakers, advocacy groups and housing experts say the government’s Home Affordable Modification Programme, or HAMP, has fallen short because of its failure to adequately address negative equity, or “underwater” mortgages.

Negative equity has been one of the biggest banes of many homeowners, making many unqualified for home loan refinancing and preventing some from selling their homes. Borrowers in negative equity are more prone to defaults and foreclosures.

Sun Belt still hurting
Despite a year-over-year fall in foreclosure activity of nearly 18 percent, the foreclosure rate in Nevada, once one of the hottest US real estate markets, remained highest among US states for the 37th straight month.

One in every 95 Nevada housing units received a foreclosure filing during the month – more than four times the national average.

A four percent month-over-month increase in foreclosure activity boosted Arizona’s foreclosure rate to second highest among the states in January. One in every 129 Arizona housing units received a foreclosure filing during the month.

Foreclosure activity decreased by double-digit percentages from the previous month in both California and Florida, and the two states registered nearly identical foreclosure rates – one in every 187 housing units receiving a foreclosure filing.

California, the most populous US state, had a foreclosure rate that was higher by a slim margin, ranking third highest among the states, while Florida’s foreclosure rate ranked fourth highest.

Other states with foreclosure rates among the nation’s top 10 were Utah, Idaho, Michigan, Illinois, Oregon and Georgia, the report showed.

Myanmar turns to bartering

The bartering illustrates the combined effects of sanctions, surging inflation and the military junta’s curious decision to stop printing small notes in one of the world’s most isolated, repressive countries, experts say.

“How shall I give it to you? You want coffee-mix, cigarettes, tissues, sweets or what?”

That question is heard often in shops and restaurants in the former Burma, where coins and small notes disappeared years ago and other notes have now started to follow suit.

State banks were the main source of small notes for shop-owners, but they stopped issuing new currency several years ago. Today, beggars who collect money on the street provide shops with the bulk of their small notes, often in return for food.

Rampant inflation also plays a role. Consumer prices rose by an average 24 percent a year between 2005 and 2008, according to the Asian Development Bank. That has taken a toll on Myanmar’s currency, the kyat.

Officially, the kyat is pegged at 5.5 per dollar. But it fetches nowhere near that, trading instead at about 1,000 per dollar. The cost of printing small notes is now far more expensive than the face value of the notes themselves.

A Yangon government high school teacher said most of her pupils had never even seen coins or small notes.

Sweet currencies
In the commercial capital, Yangon, 100 kyat (around 10 US cents) is worth a sachet of coffee-mix or a small container of shampoo. Tissue packets or a cigarette or sweets are the equivalent of 50 kyat.

“The shopkeeper gave me three sweets for change of 150 kyat when I bought a bottle of cough mixture last week,” said Ba Aye, a Yangon taxi driver.

“When I told her that sweets would make my cough worse, she offered me a Thai-made gas lighter. When I said ‘I don’t smoke’, she then asked me to accept three packets of tissues that would be useful for my runny nose.”

General-store owner Daw Khin Aye said most of her customers preferred small items like sweets to notes.

“The small notes that are in circulation are in very bad shape – worn out, torn, stained, dirty and in most cases stuck with tape,” she said.

In Sittwe, the capital of western Rakhine State, teashop owners manufacture their own coupons to use as currency.

“It’s far more convenient to use these self-circulated notes instead of small items,” teashop owner Ko Aung Khine said.

“But you need to make sure coupons can’t be forged. Mostly we use a computer to print it with the name of the shop, face value and signature of the shop owner,” he added.

Officially there are 13 denominations of notes in circulation – starting from 50 pya (one cent) up to 5,000 kyat. But only the three big notes (200, 500 and 1,000 kyat) are common. The rest are growing scarcer by the month.

“So far as I know, they print only 1,000 kyat notes now,” said a retired economist from Yangon University. “The cost of printing is far higher than the face value of most small notes… so they now print just the biggest ones.”

How much money is in circulation is anyone’s guess. Myanmar has not publicly released money supply data since 1996-97, when it put the value at 179.82 billion kyat.

Asked by reporters for the latest figure, a senior government official replied: “We cannot tell you. It’s a state secret.”

ArcelorMittal sees only slow steel recovery in Q1

The firm, which has about eight percent of the global market and capacity some three times greater than that of its nearest rival, Nippon Steel, said it expected core profit, or EBITDA (earnings before interest, tax, depreciation and amortisation), of between $1.8bn and $2.2bn in the first quarter.

The figure compared with an average forecast in a Reuters poll of $2.6bn, albeit with a wide range of estimates, and with a fourth quarter result of $2.1bn.

Chief Executive Lakshmi Mittal said 2010 would continue to be challenging, although capital expenditure would rise by 43 percent to $4bn this year.

“We therefore start the year in a good position to benefit from the progressive, albeit slow, recovery that is under way,” he said in a statement.

ArcelorMittal said its shipments were expected to be higher in the first quarter of this year than at the end of 2009, but it would face lower average selling prices and increased costs. Net debt was expected to increase over the period.

The mixed picture chimed in with results and comments from other major steel companies in recent weeks.

The $500bn steel industry took a heavy beating in the 2008/2009 downturn, with demand from key construction and auto consumers sharply down and destocking magnifying the negative effect. Producers cut output by as much as a half.

Capacity utilisation increased to 70 percent in the last three months of 2009. It was set to rise gradually to 75 percent in the first quarter.

ArcelorMittal is among the most exposed companies to spot steel prices, which should rise with expected restocking.

Its EBITDA of $2.1bn in the final quarter of 2009 missed the average $2.23bn forecast of a Reuters poll of 21 analysts. The company had given a range of $2.0 to $2.4bn in October.

ArcelorMittal returned to net profit in the third quarter after three consecutive quarterly losses and was again profitable in the fourth.

Cash flow good, guidance poor
Hermann Reith, analyst at BHF Bank in Frankfurt, said the company’s cash flow was good and net debt better than expected, meaning it would easily meet its loan covenants. Planned capital expenditure was higher than expected.

“What is disappointing is the guidance. With that start, it will be tough to meet market estimates for the full year. I expect them to be revised down,” he said.

In the US, AK Steel topped market expectations and forecast higher prices with improved demand, but larger rival US Steel made a heavier fourth-quarter loss than expected and said it saw a similar first quarter.

Nippon Steel forecast a first annual net loss in seven years, but world number eight Tata Steel more than doubled quarterly profit at its Indian operations.

European steel body Eurofer said recently the sector in the region was recovering slowly on the back of an improving outlook for car and engineering companies, despite a continued slump in construction.

Charges added in Rajaratnam insider trading case

Additional criminal charges have been filed against Galleon hedge fund founder Raj Rajaratnam as prosecutors alleged he and his co-defendant reaped $49m from illegal insider trading, up from an earlier claim of $40m.

In the parallel civil case against Rajaratnam and co-defendant Danielle Chiesi, a judge ordered the two defendants to turn over wiretaps to the US Securities and Exchange Commission. In a letter to Manhattan federal court Judge Jed Rakoff, Rajaratnam’s lawyers asked him to stay the order pending an appeal.

Thousands of wiretaps involving Wall Street and Silicon Valley firms were made between 2003 and 2009 in the criminal probe, which was announced last October, but lawyers for the defendants and the SEC have been tussling over their use in the parallel civil fraud case.

Rajaratnam, 52, and Chiesi, 44, a former employee of New Castle Funds LLC, were arrested last October and indicted in December on charges of securities fraud and conspiracy in what prosecutors have described as the biggest hedge fund insider trading case in the US.

The new indictment adds two more counts of securities fraud against Rajaratnam. In a letter to the court on Tuesday, his lawyers said they would ask the judge presiding over the criminal case to order a separate trial from Chiesi.

The indictment alleges that Rajaratnam made a total of $45m and Chiesi $4m in a wide-ranging scheme that also led to charges against a score of other traders, lawyers or fund managers. Rajaratnam and Chiesi face possible prison sentences of up to 20 years if convicted.

They pleaded not guilty to the original indictment.

“Mr Rajaratnam is innocent and looks forward to his day in court when a jury of his fellow citizens will examine and evaluate all of the evidence,” his lawyer, John Dowd, said in a statement.

Chiesi’s lawyer Alan Kaufman said the superseding indictment “has nothing new with regard to the allegations against my client.”

Separately, Rakoff ruled that Rajaratnam and Chiesi, who are fighting to keep the wiretap evidence out of both criminal and civil cases, must provide the SEC with recordings they received from criminal prosecutors by February 15.

“The notion that only one party to a litigation should have access to some of the most important non-privileged evidence bearing directly on the case runs counter to basic principles of civil discovery in an adversary system,” Rakoff’s written order said.

While the SEC and criminal prosecutors often coordinate with each other, there are limits under the law on the information they can share in parallel civil and criminal cases, which is why the defense was ordered to provide the material and not the prosecutors.

“We are obviously disappointed and respectfully disagree with the ruling,” Chiesi’s lawyer Kaufman said.

In the wider insider trading probe, 21 people have been criminally or civilly charged. Nine have pleaded guilty. Eight of those are cooperating with the government’s investigation, including two longtime friends of Rajaratnam, former McKinsey & Co executive Anil Kumar and Rajiv Goel, a former director of the treasury group at Intel Capital, the investment arm of Intel Corp.

“Rajaratnam, Chiesi and others repeatedly traded on material, nonpublic information pertaining to upcoming earnings forecasts, mergers, acquisitions, or other business combinations,” the office of the Manhattan US Attorney said in a statement.

It said the superseding indictment charges trading based on inside information in Intel Corp, International Business Machines Corp, Akamai Technologies Inc, Polycom Inc, Hilton Hotels Corp, Google Inc, Sun Microsystems Inc, Clearwire Corp, Advanced Micro Devices, ATI Technologies Inc and eBay Inc.

The cases are USA v Raj Rajaratnam and Danielle Chiesi, U.S. District Court for the Southern District of New York, No. 09-01184 and SEC v Galleon Management LP et al 09-cv-08811.

Dynamic Financial Analysis

Dynamic financial analysis (DFA) is an application of mathematical modelling to businesses. DFA models the key elements that impact an organisation’s operations and simulates thousands of potential situations, determining the firm’s financial condition for each outcome. The output from DFA is the distribution of potential financial results for the next few years.

DFA is used for a variety of reasons, including solvency regulation, assigning financial ratings, evaluating a change in operations, and determining the value of an organisation for an acquisition. All DFA models are based on a number of key assumptions that govern how the values of future financial variables are determined. Key assumptions include interest rates, inflation, stock returns and catastrophe claims. DFA models generally have several interrelated modules that interact to generate the results. Separate modules for an insurer can include underwriting gains and losses, catastrophe losses, reinsurance agreements, investments, and taxation. When all the modules are combined, the total after tax operating statement and balance sheet for the insurer are calculated for the next few years. A single iteration is likely to require thousands of random variables and tens of thousands of calculations, with each run of a DFA programme performing thousands of iterations to determine a probability distribution for future financial positions. This type of simulation could not be done prior to the development of powerful computers that perform these operations.
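To make the mechanics concrete, here is a deliberately minimal DFA-style simulation for a hypothetical insurer with just three modules (underwriting, catastrophes, investments). The module structure, distributions and every parameter value are illustrative assumptions for this sketch, not figures from the entry.

```python
import numpy as np

rng = np.random.default_rng(42)

N_ITER = 10_000          # number of simulated scenarios
N_YEARS = 3              # projection horizon
START_SURPLUS = 500.0    # illustrative starting capital ($m)
PREMIUM = 400.0          # annual premium income ($m)

def simulate_one_path():
    """One DFA iteration: project surplus over the horizon."""
    surplus = START_SURPLUS
    for _ in range(N_YEARS):
        # Underwriting module: attritional losses around a 65% loss ratio.
        attritional = rng.normal(0.65, 0.08) * PREMIUM
        # Catastrophe module: Poisson event count, lognormal severities.
        n_cats = rng.poisson(0.4)
        cat_losses = rng.lognormal(mean=3.5, sigma=1.0, size=n_cats).sum()
        # Investment module: return earned on surplus plus premium float.
        inv_return = rng.normal(0.04, 0.07) * (surplus + PREMIUM)
        expenses = 0.30 * PREMIUM
        pre_tax = PREMIUM - attritional - cat_losses - expenses + inv_return
        surplus += pre_tax * (0.75 if pre_tax > 0 else 1.0)  # 25% tax on profits only
    return surplus

results = np.array([simulate_one_path() for _ in range(N_ITER)])

# The DFA output is a distribution of future financial positions.
print(f"median surplus after {N_YEARS} years: {np.median(results):.0f}")
print(f"1st percentile (adverse tail):       {np.percentile(results, 1):.0f}")
print(f"probability of impairment (< 0):     {(results < 0).mean():.2%}")
```

A production model would use many more interrelated modules calibrated to the insurer’s own experience; the point here is only the workflow of simulating many scenarios and examining the resulting distribution.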

Output from a DFA model is used to determine a firm’s financial condition based on the likelihood of its developing financial difficulties on the basis of its current operating conditions. For each iteration associated with financial difficulties, all the relevant variables can be captured and analysed. For example, an insurer could discover that most of the incidents of financial impairment are associated with large catastrophe losses. This could lead the insurer to revise its reinsurance contracts or change the geographical areas in which it operates.

Despite the power of an effective DFA model, care needs to be used in applying these models. DFA, like any model, is a simplified representation of reality. Many factors that can influence results are not included in the models. Thus, not all uncertainty is reflected in the model; actual results will vary more widely than the model indicates. Models are built on the basis of what has happened in the past, not on what new conditions can arise. Firms that do not recognise this inherent uncertainty, and accept levels of leverage that are based on the results of a DFA model, are more exposed to risk than they and their regulators realise. Also, situations can change so that factors not included in the model will need to be added. DFA is dynamic in the sense that it will continually evolve. In general, DFA models can be useful as a guide for firms, but they must be used with a healthy dose of skepticism.

This article is an edited version of
an entry in the “Encyclopedia of Quantitative Risk Analysis and
Assessment”, Copyright © 2008 John Wiley & Sons Ltd. Used by
permission.

www.wiley.com

Crisis restrictions in trade

Measures to support troubled institutions will need to be unwound carefully to avoid distorting competition, the WTO said in a restricted report to members dated February 3.

“The financial crisis does not seem to have prompted a widespread introduction of trade restrictions in financial services,” said the report by the WTO secretariat, prepared at the request of members.

With only a few exceptions, countries maintained policies regarding typical market access limitations such as foreign equity caps or incorporation requirements.

While some countries such as India slowed down previously announced liberalisation plans, others continued to open up financial services markets; Malaysia, for example, announced a broad liberalisation package in April 2009, it said.

Coordinated exit strategies
The WTO said support measures for banks such as nationalisation, recapitalisation or expanded government guarantees often constituted some form of state aid or subsidy and so could affect competition in financial services.

Such measures and other regulatory moves enjoy a waiver from the WTO’s non-discrimination rules under the “prudential carve-out” which allows members to take whatever action necessary to defend the stability of their financial system.

The WTO noted that expanded bank guarantees had often been prompted by similar moves in other countries, leading governments to follow suit for fear that inaction could result in even healthy banks suffering a competitive disadvantage.

As a result such measures must be unwound carefully.

“A persuasive case can be made in favour of countries’ coordination of exit strategies, particularly where there is potential for financial and regulatory arbitrage across jurisdictions,” it said.

The US and EU are the biggest importers and exporters of financial services, but have developed differently since the crisis.

US exports of financial services seem to have stabilised in the second quarter of 2009, falling only one percent year-on-year after a 13 percent drop in the first quarter.

But EU exports were 26 percent lower than a year earlier in the second quarter after falling 30 percent in the first three months, it said.

In the first quarter of 2009, the latest period for which estimates are available, world exports of financial services fell 26 percent year-on-year, after growing only two percent in 2008, when year-on-year declines began in the third and fourth quarters.

Besides the US and EU, some Asian countries suffered particularly big drops in the first quarter of last year, with Hong Kong falling 32 percent year-on-year, Taiwan 53 percent and South Korea 56 percent.

World exports of financial services, including insurance, had reached $370bn in 2007, after growing 10 percent a year since 2000, the WTO said.

The WTO said the origins of the crisis lay in a variety of factors, including lax monetary policy, risky lending and poor regulation. That suggested trade liberalisation in financial services – granting market access and national treatment – was not a cause of the crisis but could help transmit it faster.

ECB Trichet departure sparks Greece rescue talk

European Central Bank President Jean-Claude Trichet is cutting short a trip to Australia to attend a special EU summit, prompting market speculation initiatives are in the works to help resolve Greece’s debt problems.

EU heads of state are due to meet this week in Brussels for a special summit on the economy under pressure to restore confidence among investors worried that rising debt in Greece, Portugal and other weaker states in the Eurozone could undermine a global recovery.

The summit was called in early January and Trichet had been expected to spend both Tuesday and Wednesday in Australia at central bank meetings. Instead, he is leaving early, officials at the Reserve Bank of Australia and the ECB said.

He will fly on Tuesday to make sure he returns in time for the main session of the European summit, prompting speculation over the meaning of his early departure.

“There is a possibility that the EU could get the ECB involved and support Greece,” said Ayako Sera, market strategist at Sumitomo Trust Bank. “Fiscal concerns that have also spread to Spain and Portugal could temporarily ease if we get something on Greece.”

The euro inched up on news of Trichet’s changed travel plans as dealers speculated about European support for Greece.

“Investors may begin to think that a policy measure directed at Greece’s fiscal situation is potentially in the works,” said Barclays Capital in a research note.

The euro has fallen more than six percent since mid-December, when ratings agencies first downgraded Greece.

Investors have shifted funds out of riskier assets into so-called safe havens, including the Japanese yen and the Swiss franc. Yields on Greek, Portuguese and Spanish debt and the cost of insuring against default have risen sharply.

Summits
The EU usually holds four summits a year, when all 27 heads of state or government gather in Brussels. The first summit, scheduled for March, normally focuses on economic issues.

But EU President Herman Van Rompuy, who can convene a special summit at any time if there are pressing issues, called for the meeting saying the bloc needed more economic growth in order to finance its social model on a sound basis.

Trichet has regularly taken part in the main sessions of EU summits, but does not always attend.

Before his departure, Trichet told the central bank gathering it was important to monitor global developments, not just local ones, and to anchor inflationary expectations.

“Keeping inflation expectations anchored remains of paramount importance, under exceptional circumstances even more than in normal times. Our framework has been successful in this regard thus far,” Trichet said. “But the lessons of the past fifty years – and, in particular, our success in anchoring inflation expectations – should remain uppermost in our minds.”

Eurozone finance ministers, facing the bloc’s first debt crisis in the 11-year-old currency union, tried to calm investor fears over the risk of sovereign default in peripheral states at a Group of Seven meeting in Canada over the weekend.

They said they would ensure Greece kept to a plan to cut its budget deficit to below three percent by 2012 from 12.7 percent in 2009, the Eurozone’s biggest gap.

Market sentiment
Trichet, who attended the G7 meeting, expressed confidence in the Greek plan. But the G7 comments did little to lift investor appetite for risk.

“Sentiment is still weak amid deepening concerns about southern European nations’ sovereign rating risks,” said Juhn Jong-kyu, a market analyst at Samsung Securities in Seoul.

Governments in Athens, Lisbon and Madrid are pushing through budget cuts to tighten their fiscal belts and restore confidence in their economies and ability to service their debt.

But they face domestic opposition to the plans.

Greek civil servants recently threatened to stage more strikes in protest at the austerity measures, raising worries over the government’s ability to rein in its deficit that has been swollen by the global crisis and billions of euros in stimulus spending.

A failure to press ahead with austerity measures is likely to increase pressure on the three states’ bonds and push up borrowing costs in a vicious circle that economists say could force the bloc to bail out one of its members or even prompt a country to be expelled from the EU.

Model risk

Reliance on models to price, trade, and manage risks carries risk. Models are susceptible to errors.

In liquid and more or less efficient securities markets, the market price is, on average, the best indicator of the asset’s value. In the absence of liquid markets and price discovery mechanisms, there is no alternative but to use theoretical valuation models to mark the position to model, to assess the risk exposure along the various risk factors, and to derive the appropriate hedging strategy.

The pace of model development has accelerated to support the rapid growth of financial innovations such as caps, floors, and others. These have been made possible because of developments in theory, which allow a better capture of financial risks. At the same time, these models would have never been implemented in practice, had the growth in computing power not accelerated so dramatically. The main causes of model risk are model error and implementing a model wrongly.

Model error
A model is incorrect if there are mistakes in the analytical solution. A model is also incorrect if it is based on wrong assumptions about the underlying asset price process. Finance is littered with examples of trading strategies based on shaky assumptions – some model risks are really just a formalisation of this.

The most frequent error in model building is to assume that the distribution of the underlying asset is stationary when in fact it changes over time. The ideal solution would be to acknowledge that volatility is stochastic and to develop an option-pricing model accordingly, but option-valuation models become difficult to handle when any sort of stochastic volatility is included.
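One way to see why the stationarity assumption matters: price an out-of-the-money call assuming volatility is fixed at its average level, then reprice it by Monte Carlo in a toy world where volatility is random with the same average. All numbers below are illustrative assumptions, not a calibration to any market.

```python
import numpy as np
from math import exp, log, sqrt
from scipy.stats import norm

def bs_call(s, k, t, r, sigma):
    """Black-Scholes price of a European call under constant volatility."""
    d1 = (log(s / k) + (r + 0.5 * sigma ** 2) * t) / (sigma * sqrt(t))
    d2 = d1 - sigma * sqrt(t)
    return s * norm.cdf(d1) - k * exp(-r * t) * norm.cdf(d2)

rng = np.random.default_rng(0)
s0, k, t, r = 100.0, 120.0, 1.0, 0.02   # out-of-the-money call, illustrative terms
mean_vol = 0.20

# The desk's model: volatility fixed at its average level.
price_constant_vol = bs_call(s0, k, t, r, mean_vol)

# A toy "true" world: volatility differs from scenario to scenario
# (drawn once per path), averaging 20% but ranging from 10% to 30%.
n_paths = 200_000
vols = rng.uniform(0.10, 0.30, n_paths)
z = rng.standard_normal(n_paths)
st = s0 * np.exp((r - 0.5 * vols ** 2) * t + vols * sqrt(t) * z)
price_stochastic_vol = exp(-r * t) * np.maximum(st - k, 0.0).mean()

print(f"constant-volatility price:   {price_constant_vol:.3f}")
print(f"stochastic-volatility price: {price_stochastic_vol:.3f}")
```

Away from the money the option price is convex in volatility, so fixing volatility at its mean understates the value even though the average level is correct.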

Implementing a model wrongly
With models that require extensive programming, there is always a chance that a bug may affect output. Some implementations rely on techniques that exhibit inherent errors and limited validity. Many programmes that seem error-free have been tested only under normal conditions.

For models evaluating complex derivatives, data are collected from different sources. The implicit assumption is that for each period, the data for all relevant assets and rates pertain to exactly the same instant, and thus reflect simultaneous prices. Using nonsimultaneous price inputs may be unavoidable, but it can lead to wrong pricing.

When implementing a pricing model, tools are used to estimate parameters. But how frequently should input parameters be refreshed? Should the adjustment be made on a periodic basis, or should it be triggered by an event? Should parameters be adjusted according to qualitative judgements, or should these be based on statistics?

Statistical estimators are subject to errors involving inputs. A major problem in the estimation procedure is the treatment of outliers. Are the outliers really outliers, in the sense that they do not reflect the true distribution? Or are they important observations that should not be dismissed? The results of the estimation procedure will be vastly different depending on how observations are treated. Each bank may use a different procedure to estimate parameters. Some may use daily closing prices, while others may use transaction data.
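A small illustration of how outlier treatment alone moves a parameter estimate, using a simulated return series; the data and the four-sigma trimming rule are assumptions of this sketch, not a recommended procedure.

```python
import numpy as np

rng = np.random.default_rng(1)

# One year of simulated daily log returns at 20% annualised volatility...
returns = rng.normal(0.0, 0.20 / np.sqrt(252), 252)
# ...plus a single crash-day observation.
returns[100] = -0.15

def annualised_vol(r):
    return r.std(ddof=1) * np.sqrt(252)

raw_vol = annualised_vol(returns)

# One common (and debatable) treatment: discard observations beyond 4 sigma.
sigma = returns.std(ddof=1)
trimmed = returns[np.abs(returns - returns.mean()) < 4 * sigma]
trimmed_vol = annualised_vol(trimmed)

print(f"volatility keeping the outlier: {raw_vol:.1%}")
print(f"volatility after trimming:      {trimmed_vol:.1%}")
```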

The quality of a model depends on the accuracy of inputs and parameter values. This is particularly true in the case of new markets, where best-practice procedures and controls are evolving. Volatilities and correlations are the hardest parameters to judge.

Most institutions use internal data as well as external databases. The responsibility for accuracy is often not clearly assigned.

Adding observations improves the power of tests and tends to reduce errors; but the longer the sampling period, the more weight is given to obsolete information.

The gap between the bid and ask prices may be large enough to complicate the process of finding a single value. Choices made about the price data at the time of data selection can have a major impact on the model.

How can we mitigate model risk?
One way is to invest in research to improve models and to develop statistical tools. It is critical for an institution to keep up with developments.

An even more vital way of reducing model risk is to establish a process for independent vetting of how models are constructed.

The role of vetting is to assure management that any proposed model for the valuation of a given security is reasonable. It provides assurance that the model offers a reasonable representation of how the market itself values the instrument, and that the model has been correctly implemented.
1. Documentation
This should be independent of any implementation, such as a spreadsheet or code, and should include:
(a) the term sheet or an equivalent description of the transaction; and
(b) a statement of the model
2. Soundness of model
The vetter needs to verify that the mathematical model is a reasonable representation of the instrument.
3. Independent access to financial rates
The vetter should check that the bank’s middle office has independent access to an independent market-risk-management rates database.
4. Benchmark modelling
The vetter should develop a benchmark model based on the assumptions and on deal specifications. The vetter may use an implementation that is different from that proposed.
5. Health check and stress test
6. Build a formal treatment of risk into the overall risk-management procedures

Large trading profits tend to lead to large bonuses for senior managers, and this creates an incentive for these managers to believe the traders who are reporting the profits. Often, traders use their expertise in formal pricing models to confound internal critics, or they may claim to have some sort of profound insight. Senior managers should approach any model that seems to record or deliver above-market returns with a healthy skepticism, insist that models are made transparent, and make sure that all models are independently vetted.

This article is an edited version of an entry in the “Encyclopedia of Quantitative Risk Analysis and Assessment”, Copyright © 2008 John Wiley & Sons Ltd. Used by permission.

www.wiley.com

Asset–liability management

ALM is universally defined as a comprehensive analysis of the asset portfolio in light of current liabilities and future cash flows of a going-concern company, incorporating existing asset and liability portfolios as well as future premium flows. ALM also considers additional risk factors beyond interest rate changes such as inflation, credit, and market risk. ALM also considers actions beyond the characteristics of a fixed income portfolio and seeks to identify and exploit hedges.

Insurance companies can benefit from a more integrated analysis of the asset and liability portfolios in seeking better risk-return decisions. An enterprise-wide analysis of potential risks and rewards affords an opportunity to analyse the company’s investment portfolio and underwriting portfolio. Since insurance liabilities are far less liquid than assets, such analysis and management activity tend to focus on adjustments to the investment portfolio, given the constraints of the reserves and underwriting portfolio, to improve the risk-return characteristics. In this respect assets can be thought of as a way to hedge liability risk. However, management activity need not be confined to fine-tuning investment strategy. Future underwriting considerations, along with other hedges such as reinsurance, are risk-management variables at their disposal.

Venter et al. presented a series of simple examples illustrating that the optimal risk-return portfolio decisions are very different as the asset and liability considerations become more realistic and complex. The authors started with a standalone asset portfolio, then successively added a constant fixed-duration liability, a liability that varied in timing and amount, and finally the cash flows from current underwriting. As the various layers of complexity are added to the illustration, the nature of the inherent risks changes, as does the optimal investment portfolio.

The study did not address tax considerations, which can have a profound impact on investment decisions. Recent studies have found that insurers consider cyclical changes in the portfolio between tax-exempt and taxable fixed income securities over the course of the underwriting cycle to be one of the principal drivers of investment strategy. In addition to the integration of underwriting and investment results, such strategies rely on reallocation of assets to maximise income while avoiding alternative minimum taxes (AMT).

Consideration of equities, too, adds complexity and richness to the asset–liability analysis. Equities are considered risky in their own right and will imply a potentially worse downside risk to capital. Some believe that equities may provide a better inflation hedge for liabilities in an increasing loss cost environment. This proposition may be tested through the enterprise risk model, although the conclusion will be sensitive to input assumptions of the incorporated macroeconomic model.

In 2002, the Casualty Actuarial Society Valuation, Finance, and Investment Committee (VFIC) published a report testing the optimality of duration matching investment strategies for insurance companies. VFIC attempted to tackle Venter’s most complex scenario discussed above.

Where Venter et al. focused on changes in GAAP pretax surplus as the risk measure, VFIC looked at several different risk measures on both a statutory and a GAAP basis. Return, too, was considered on both accounting bases. In doing so, VFIC’s conclusion as to optimality was what one might expect in the real world: it depends. Duration matching was among a family of optimal strategies, but the choice of specific investment strategies was dependent on the company’s choice of risk metrics, return metrics, and risk-return tolerances or preferences.

An asset–liability modelling approach
It has been asserted that an enterprise-wide model is the ideal way to model and ultimately manage an insurance company investment portfolio. ALM makes for an excellent application of such an integrated model.
1. Start with models of asset classes, existing liabilities, and current business operations.
2. Define risk metrics for the analysis.
3. Similarly, management must define what constitutes return.
4. Consideration must be given to the time horizon of the analysis and the relevant metrics.
5. The model will have to consider relevant constraints.
6. The model should be run for a variety of investment strategies, underwriting strategies, and reinsurance options under consideration.
7. An efficient frontier – a plot of the return metric versus the risk metric – can be constructed across the various portfolio scenarios (a simple sketch of steps 6 and 7 follows this list).
8. Since liabilities are more illiquid, the asset–liability analysis and management can be largely asset centric given the existing liabilities.
9. Having selected a targeted point on an efficient frontier and a companion reinsurance strategy, simulation output should be analysed to identify those scenarios where even the preferred portfolio(s) performed poorly.
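A minimal sketch of steps 6 and 7, assuming a toy balance sheet, a handful of candidate equity allocations, and illustrative return, inflation and liability sensitivities; each allocation’s simulated surplus distribution yields one risk-return point from which an efficient frontier could be drawn.

```python
import numpy as np

rng = np.random.default_rng(7)

N_SIM = 20_000
ASSETS0 = 120.0       # market value of assets ($m)
LIABILITY0 = 100.0    # best-estimate liabilities ($m)

def simulate_surplus(equity_weight, n=N_SIM):
    """One-year surplus distribution for a given equity allocation."""
    bond_w = 1.0 - equity_weight
    bond_ret = rng.normal(0.04, 0.05, n)    # illustrative bond returns
    eq_ret = rng.normal(0.07, 0.18, n)      # illustrative equity returns
    inflation = rng.normal(0.02, 0.01, n)
    assets = ASSETS0 * (1 + bond_w * bond_ret + equity_weight * eq_ret)
    # Liabilities unwind at 3% and carry an assumed inflation sensitivity.
    liabilities = LIABILITY0 * (1 + 0.03 + 1.5 * (inflation - 0.02))
    return assets - liabilities

print("equity weight | mean surplus (return metric) | 1-in-100 surplus (risk metric)")
for w in (0.0, 0.1, 0.2, 0.3, 0.4, 0.5):
    surplus = simulate_surplus(w)
    print(f"    {w:4.0%}     |          {surplus.mean():6.1f}          |         {np.percentile(surplus, 1):6.1f}")
```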

Future research
While enterprise modelling is perhaps the only way to adequately address asset–liability management issues, there are a number of real-world issues that are the subject of continuing research. Correlations can materially alter the risk of the optimal portfolio. Also, models of unpaid losses have not been developed as explanatory models. That is, unlike asset models, reserving models do not predict future loss payments with parameters linking losses to economic indices. Inflation sensitivity is often hypothesised on an accident year, a calendar year, or a payment year basis, but rarely explicitly developed from historic economic data and projected on the basis of, say, an economic scenario generator.

This article is an edited version of an entry in the “Encyclopedia of Quantitative Risk Analysis and Assessment”, Copyright © 2008 John Wiley & Sons Ltd. Used by permission.

www.wiley.com

Actuary

The word actuary derives from the Latin actuarius, who was the business manager of the Senate of Ancient Rome. The term was first applied to the mathematician of an insurance company in 1775, at the Equitable Life Insurance Society of London.

Actuarial science
This provides a structured and rigorous approach to modelling and analysing uncertain outcomes of events that may impose or imply losses or liabilities upon individuals or organisations.

Given that uncertainty is a main characteristic of actuarial events, it follows that probability must be the cornerstone in the structure of actuarial science. Probability in turn rests on pure mathematics.

In order to enable probabilistic modelling of actuarial events to be a realistic and representative description of real-life phenomena, understanding of the “physical nature” of the events under consideration is a basic prerequisite. Pure mathematics and pure probability must therefore be supplemented with and supported by the sciences that deal with such “physical nature” understanding of actuarial events.

Actuarial practice
The main practice areas for actuaries can broadly be divided into: life insurance and pensions; general/nonlife insurance; and financial risk.

There are certain functions in which actuaries have a statutory role. Evaluation of reserves in life and general insurance and in pension funds is an actuarial process, and it is a requirement under the legislation in most countries that this evaluation is undertaken and certified by an appointed actuary. The role of an appointed actuary has long traditions in life insurance and in pension funds. A similar requirement has been introduced by an increasing number of countries since the early 1990s.

The involvement of an actuary can be required as a matter of substance in other functions. Then there are functions where actuarial qualifications are neither a formal nor a substantial requirement, but where actuarial qualifications are perceived to be a necessity.

Life insurance and pensions
Assessing and controlling the risk of life insurance and pension undertakings is the origin of actuarial practice and the actuarial profession. The success in managing this risk comprises the following:
•understanding lifetime as a stochastic phenomenon, and modelling it within a probabilistic framework;
•evaluating the diversifying effect of aggregating the lifetimes of several individuals into one portfolio (illustrated in the sketch after this list);
•estimating individual death and survival probabilities.
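A minimal simulation of these points, under an assumed one-year death probability: as the number of independent lives grows, the relative variability of the aggregate outcome shrinks, which is the diversification effect; the last lines show the simple estimator for the third point. All figures are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)

q = 0.01          # assumed one-year death probability for the cohort
N_SIM = 50_000    # simulated portfolio outcomes

for n_lives in (100, 1_000, 10_000):
    deaths = rng.binomial(n_lives, q, N_SIM)   # aggregate deaths per scenario
    rate = deaths / n_lives                     # deaths per policy
    cv = rate.std() / rate.mean()               # relative variability
    print(f"{n_lives:>6} lives: mean rate {rate.mean():.4f}, coefficient of variation {cv:.2f}")

# Estimating the death probability from observed experience (the third point):
observed = rng.binomial(10_000, q)
print(f"estimated q from one 10,000-life portfolio: {observed / 10_000:.4f}")
```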

By analysing historic data and projecting future trends, life insurance actuaries constantly maintain both first-order and second-order bases. Premium and premium reserve valuation tariffs and systems are built on the first-order basis. Over time emerging surplus is evaluated and analysed by source, and in due course reverted as policyholders’ bonus.

General insurance
Over the years actuaries have attained growing importance in the running of nonlife insurance operations. The basis for the insurance industry is to accept economic risks. An insurance contract may give rise to claims. Both the number of claims and their sizes are unknown to the company. Thus insurance involves uncertainty, and this is where actuaries have their prerogative: they are experts in insurance mathematics and statistics.

Finance
Financial risk is a relatively new practice area for actuaries. Those who practice here are called “actuaries of the third kind”.

It is different from ordinary insurance risk in that increasing the size of a portfolio does not in itself provide a diversification effect. For actuaries it has been a disappointment that the law of large numbers does not come to their assistance.

The most fundamental and innovative result in the theory of financial risk/mathematical finance is probably that risk associated with contingent claims can be eliminated by appropriate portfolio management.
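The standard one-period binomial example makes this concrete: the call’s payoff is replicated exactly by a stock-plus-borrowing portfolio, so the risk is hedged away rather than diversified away. The numbers below are purely illustrative.

```python
# One-period binomial replication of a call option (illustrative numbers).
s0, up, down, r = 100.0, 1.2, 0.8, 0.05   # stock price, up/down factors, interest rate
k = 100.0                                  # strike

payoff_up = max(s0 * up - k, 0.0)          # option payoff if the stock rises
payoff_down = max(s0 * down - k, 0.0)      # option payoff if the stock falls

# Solve: delta * S_T + (1 + r) * bond = payoff in both states.
delta = (payoff_up - payoff_down) / (s0 * (up - down))
bond = (payoff_down - delta * s0 * down) / (1 + r)
call_price = delta * s0 + bond

print(f"hedge ratio (delta): {delta:.3f}")
print(f"replicating price:   {call_price:.3f}")

# The portfolio pays exactly the option payoff in both states.
assert abs(delta * s0 * up + bond * (1 + r) - payoff_up) < 1e-9
assert abs(delta * s0 * down + bond * (1 + r) - payoff_down) < 1e-9
```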

This theory is the cornerstone in a new practice field, called financial engineering. Activities include quantitative modelling and analysis, funds management, interest rate performance measurement, asset allocation, and model-based scenario testing. Actuaries may practice financial engineering in their own right, or they may apply financial engineering as an added dimension to traditional insurance-orientated actuarial work.

A field where traditional actuarial methods and methods relating to financial risk are beautifully aligned is asset liability management (ALM). The overriding objective of ALM is to gain insight into how money is best allocated among given financial assets, in order to fulfill specific obligations represented by a future payment stream. The analysis of the obligation’s payment stream rests on traditional actuarial science, the analysis of the asset allocation problem falls under the umbrella of financial risks, and the blending of the two is a challenge that requires insight into both and the ability to understand and model how financial risk and insurance risk interact. ALM is today a key component in the risk management of insurance providers, pension funds, and other financial institutions.

A new challenge on the horizon is the requirement for insurers to prepare financial reports on a market-based principle, which the International Accounting Standards Board has had under preparation for some time. In order to build and apply models and valuation tools that are consistent with this principle, actuaries will be required to combine traditional actuarial thinking with ideas and methods from economics and finance.

This article is an edited version of an entry in the “Encyclopedia of Quantitative Risk Analysis and Assessment”, Copyright © 2008 John Wiley & Sons Ltd. Used by permission.

www.wiley.com

Consumer products

Each year consumer products are involved in millions of injuries and thousands of fatalities. Responsibility for this rests with the manufacturers. Most developed countries have also established government agencies that provide regulatory oversight. In the US, the Consumer Product Safety Commission (CPSC) maintains regulatory jurisdiction over more than 15,000 types of consumer products.

Consideration for public safety can improve product development and product quality. Formal tools of quantitative risk assessment add value by improving decision making at different stages.

Product characteristics and analysis of historical data on product-related injuries and deaths can inform safety assessment of a product design before it is manufactured.

Unintended declines in the quality of consumer items can pose safety hazards to consumers, causing unexpected financial burdens and potential legal liabilities to manufacturers. At the beginning of the production process, companies may implement acceptance sampling plans to ensure the quality of raw materials or components. The acceptable quality limits (AQLs) used in such plans can be chosen on the basis of a quantitative risk assessment. Standard tools for statistical process control are routinely deployed by manufacturers to provide checks on quality. A motivating principle is that successful business operations require continuous effort to reduce variation in process outputs. End-of-line testing is typically performed before manufacturers approve products for shipping to customers. In some industries such testing may be required by government regulation.
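As an illustration of how a sampling plan can be assessed quantitatively, the sketch below computes the operating characteristic of a hypothetical single-sampling plan (inspect 80 units, accept the lot if at most two are defective); the plan and the quality levels are assumptions for the example, not taken from any standard.

```python
from scipy.stats import binom

# Hypothetical single-sampling plan: inspect n units, accept the lot if defects <= c.
n, c = 80, 2

print("true defect rate | probability the lot is accepted")
for p in (0.005, 0.01, 0.02, 0.04, 0.08):
    p_accept = binom.cdf(c, n, p)   # P(at most c defectives in the sample)
    print(f"      {p:5.1%}      |            {p_accept:.3f}")
```

A risk assessment would choose n and c so that lots at the acceptable quality limit are accepted with high probability while clearly substandard lots are almost always rejected.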

Information about the performance of a product after release to consumers comes from returns, warranty programmes, and complaints. Records of incidents involving failure of a product during operation typically include information such as the model and serial number of the unit, production date or date code, incident date, and location. In monitoring field performance, a manufacturer needs to determine whether an adverse change has occurred and, if so, assess its implications for product reliability and safety. If a background level of event risk is accepted, statistical methods can determine whether adverse events can be explained by random variation or require attention. Production, shipping, and sales records typically provide the raw data for such analyses.

Confirmation of a problem raises a series of questions: How bad is it? Do field data point to a particular production period or facility? A quantitative risk assessment can address these questions using statistical estimation and hypothesis testing. Risk analysts review the information about production history and seek changes in time period, plant, process, equipment, design, or supplier that are associated with subsequent problems. Parametric statistical models are frequently used to fit time-to-failure distributions in engineering applications.
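A common form of such a test, sketched here with invented numbers: given the installed base and an accepted background incident rate, a Poisson model (one standard choice, not the only one) asks whether the observed count is plausibly random variation.

```python
from scipy.stats import poisson

units_in_field = 250_000
background_rate = 1e-5          # accepted incidents per unit-year (assumed)
observed_incidents = 8          # incidents reported this year (illustrative)

expected = units_in_field * background_rate          # 2.5 expected incidents
# One-sided p-value: chance of seeing this many or more if nothing has changed.
p_value = poisson.sf(observed_incidents - 1, expected)

print(f"expected incidents: {expected:.1f}, observed: {observed_incidents}")
print(f"p-value = {p_value:.4f}")
if p_value < 0.01:
    print("unlikely to be random variation; review production and shipping records")
else:
    print("consistent with the accepted background level")
```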

Product recalls require identification of the units affected, typically by time or source of production, and determination of whether affected units will be repaired or replaced. Increasing globalisation challenges manufacturers and regulators of consumer products, as a product may contain components from different nations and be sold all over the world.

The process for implementing corrective action varies from country to country. In 1997 the US CPSC adopted a “Fast Track Product Recall Programme” for reports filed according to Section 15(b) of the Consumer Product Safety Act (CPSA). This programme requires manufacturers, distributors, and retailers of consumer products to notify the commission of certain defects, unreasonable risks, or noncompliance with voluntary or mandatory standards. Under the CPSC Fast Track programme, the staff refrains from making a preliminary determination when firms report and implement an acceptable corrective action plan. The plan submitted to CPSC must describe the recall action (refund, repair, or replace) that the company will take to eliminate the identified risk, and provide sufficient information on the product design, incident, and testing to allow the CPSC staff to determine whether the proposed action can correct the identified problem.

An effective risk assessment must link measured levels of risk to specific corrective actions. Supporting tools have been developed, particularly in the EU.

A community rapid information exchange system known as RAPEX assesses the risk of potentially hazardous consumer products by first considering (a) the probability of health damage from regular exposure, and (b) the severity of injury from the product. Depending on the probability and severity of damage the RAPEX method then classifies the overall gravity of an adverse outcome on a five-point ordinal scale ranging from very low to very high. The final judgment of whether the risk requires corrective action considers three additional factors: (a) the vulnerability of people exposed to the product, (b) whether the hazard is obvious to nonvulnerable adults, and (c) whether the product has adequate warnings and safeguards. The RAPEX methodology is sometimes viewed as the preferred approach in countries adopting a risk-averse approach to consumer product safety based on such rough scoring.
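The scoring logic described above can be mocked up in a few lines; the matrix below is a simplified illustration and does not reproduce the official RAPEX tables.

```python
# Illustrative (not official) RAPEX-style gravity classification:
# combine a probability band with a severity band on a five-point scale.
GRAVITY = ["very low", "low", "medium", "high", "very high"]

# Rows: probability of damage (very unlikely -> very likely)
# Columns: severity of injury (minor -> critical)
MATRIX = [
    [0, 0, 1, 2],
    [0, 1, 2, 3],
    [1, 2, 3, 4],
    [2, 3, 4, 4],
]

def gravity(prob_band: int, severity_band: int) -> str:
    """Overall gravity for 0-based probability and severity bands."""
    return GRAVITY[MATRIX[prob_band][severity_band]]

# Example: a fairly likely hazard (band 2) causing a serious injury (band 2).
print(gravity(2, 2))   # -> "high"
```

In practice the resulting gravity class would then be weighed against the vulnerability of those exposed, the obviousness of the hazard, and the adequacy of warnings before deciding on corrective action.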

Although consumer products are required to be safe, safety does not mean zero risk. A safe product is one that has reasonable risks, given the magnitude of the benefit expected and the alternatives available. In managing consumer product risk, quantitative risk assessment and associated statistical methods are used to frame substantive issues in terms of estimable quantities and testable hypotheses, to extract information on product performance from data on field incidents and manufacturing records, and to communicate findings to upper management, regulatory authorities, and consumers.

This article is an edited version of an entry in the “Encyclopedia of Quantitative Risk Analysis and Assessment”, Copyright © 2008 John Wiley & Sons Ltd. Used by permission.

www.wiley.com

Enterprise Risk Management

Enterprise risk management (ERM) is a recent technique, practiced increasingly by large corporations in industries throughout the world. Sensible risk management flows from the recognition that a dollar spent on managing risk is a dollar cost to the firm, regardless of whether this risk arises in the finance arena or in the context of a physical calamity such as fire. ERM thus proposes that the firm addresses these risks in a unified manner, consistent with its strategic objectives and risk appetite.

Most corporations adopt the definition of ERM proposed by the Committee of Sponsoring Organisations of the Treadway Commission (COSO) in their 2004 ERM framework, which was intended to establish key concepts and techniques for ERM. In this framework, ERM is defined as “a process, effected by an entity’s board of directors, management and other personnel, applied in strategy setting and across the enterprise, designed to identify potential events that may affect the entity, and manage risk to be within its risk appetite, to provide reasonable assurance regarding the achievement of entity objectives”. This definition highlights that ERM reaches the highest level of the organisational structure and is directed by corporations’ business strategies. The concept of risk appetite is crucial: risk appetite reflects a firm’s willingness and ability to take on risks in order to achieve its objectives.

As a rising management discipline, ERM varies across industries and corporations. The insurance industry, financial institutions, and the energy industry are among the industry sectors where ERM has seen relatively advanced development in a broad range of corporations. Recently, even public-sector organisations have become aware of the potential value of ERM, and risk managers are increasingly bringing it to top executives’ agendas.

Notwithstanding the attractiveness of ERM conceptually, corporations are often challenged to put it into effect. One of the main challenges is to manage the totality of corporation risks as a portfolio in the operational decision process, rather than as individual silos, as is traditionally done.

Operationalisation of ERM
The core of the challenge lies in operationalising ERM. Integration of risks is not merely a procedure of stacking all risks together, but rather a procedure of fully recognising the interrelations among risks and prioritising risks to create true economic value. Important components of this procedure include risk identification, risk measurement, risk aggregation/other modelling approaches, risk prioritisation, and risk communication.

The four major categories of risks considered under an ERM framework are hazard risk, financial risk, operational risk, and strategic risk.

Under ERM, the identification of individual risks in different categories should facilitate successive prioritisation and integration of risks to best achieve business objectives within the corporation’s risk appetite. Any event that may adversely affect the corporation’s achievement of its objectives is considered a risk under ERM. Proper objective identification is a prerequisite for risk identification. For example, business objectives can be described by certain key performance indicators (KPIs), which are usually financial measures such as ROE, operating income, earnings per share (EPS), and other metrics for specific industries, eg, risk adjusted return on capital (RAROC) and risk-based capital (RBC) for financial and insurance industries. Risks are then recognised by means of these company performance metrics.

Prioritisation
To realise effective risk integration, ERM also promotes risk prioritisation. Risk prioritisation stems from the fact that risks are not equally important to corporations. Prioritisation should reflect different aspects of the company’s strategies and risk-management philosophy, eg, the cost of tolerating a risk versus reducing it, and management’s elicited risk preferences.
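As a toy example of prioritisation against a KPI, the sketch below simulates three partially correlated loss drivers, measures their combined impact on an earnings-per-share target, and ranks them by average contribution in the worst scenarios. Every distribution and number is an assumption made for the illustration.

```python
import numpy as np

rng = np.random.default_rng(11)
N = 100_000

# Illustrative annual losses ($m) from three risk categories; the financial and
# strategic drivers are correlated, the hazard losses are independent.
hazard = rng.lognormal(1.0, 0.8, N)
z = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.6], [0.6, 1.0]], N)
financial = np.exp(1.5 + 0.7 * z[:, 0])
strategic = np.exp(1.2 + 0.9 * z[:, 1])

total = hazard + financial + strategic
baseline_eps, shares = 2.50, 100.0        # planned EPS and shares outstanding (m)
eps = baseline_eps - total / shares

# Focus on the worst 5% of outcomes for the EPS KPI and rank the drivers.
tail = eps <= np.percentile(eps, 5)
print(f"5th percentile of simulated EPS: {np.percentile(eps, 5):.2f} (plan {baseline_eps:.2f})")
for name, losses in (("hazard", hazard), ("financial", financial), ("strategic", strategic)):
    print(f"{name:>9}: average loss in the worst EPS scenarios = {losses[tail].mean():5.1f}")
```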

ERM and compliance
ERM at first arose from corporations’ efforts to comply with laws and regulations. In that role, it is seen more as an efficient internal control process. Within a corporation, it is often conducted alongside internal control functions and supervised by internal auditors. The most significant regulatory forces behind the rise of ERM are the Sarbanes-Oxley Act of 2002, the Basel II Capital Accord, and rating criteria set forth by rating agencies such as Standard & Poor’s (S&P).

ERM future – value creation
ERM practices may have been initially driven by compliance needs, but as the discipline develops it should continue to serve as an internal control function for better corporate governance. One common objective for corporations is to maximise firm value, and ERM provides a framework for corporations to consciously optimise risk/return relationships. This optimisation is achieved through the alignment of corporate strategic goals and risk appetite. At the operational level, this alignment guides virtually all activities conducted by the corporation. Specific risks are identified and measured. They are prioritised and integrated by recognising the interrelations and relative influences affecting different risky outcomes. Risk-management strategies are developed for the entire portfolio of risks, and their effects are assessed and communicated.


This article is an edited version of an entry in the “Encyclopedia of Quantitative Risk Analysis and Assessment”, Copyright © 2008 John Wiley & Sons Ltd. Used by permission.

www.wiley.com