
A new breed of banker?

26 Apr

 ‘Difficult but talented’ replaced by ‘Pedestrian but hard-working’.

A slew of regulatory reforms has made banks much safer places. Dodd-Frank in the US, and the Markets in Financial Instruments Directive in Europe (Mifids 1 and 2, as they are affectionately known), have made it harder for banks to market and sell risky securities to ‘unsophisticated’ investors, and have regularised professional trading standards to combat volatility.

But now that over-the-counter derivatives and so-called ‘dark pools’, where large trades can be placed anonymously without the onerous reporting obligations of licensed exchanges, have been strait-jacketed by legislation, is there any place for the dashing, risk-loving daredevil of popular stereotype?

According to eFinancialCareers, which keeps a close eye on trends in the industry, the typical banker has become almost… boring.

“A lot of people who had successful careers – the most talented people of my generation – have left the industry to do other things,” Kerim Derhalli, the ex-Deutsche Bank MD who has himself quit banking to head Invstr, a social network for amateur investors, told the recruitment firm.

It is certainly true that investment banking still pays the highest starting salary for graduate recruits, according to a UK survey by graduate recruitment research company High Fliers. Their report, ‘The Graduate Market in 2016’, put the welcome package at investment banking firms at £47,000, the top-ranking sector, with banking and finance ranked third at £36,000.

The number of positions at banking and finance companies, however, grew at only a fraction of the rate seen in IT & Telecommunications, which went up 219% compared with just a 37% rise in finance positions.

The High Fliers report recorded that “The number of entry-level positions available for graduates in IT & telecommunications and in the public sector has more than doubled over the last ten years, whilst recruitment at the top consulting firms has increased by two-thirds.”

This is corroborated by another eFinancialCareers source, former head of rates sales at Deutsche Bank Chris Yoshida. “When I went into banking [in 2000] my graduate class comprised the best students from the best universities in the world,” said Yoshida, who now advocates for the Kairos Society, an organization that helps young entrepreneurs effect global change. “This is no longer the case – the very top students now want to work for Google and Facebook. Banks are attracting the students who are in the top 50% to 75% (rather than the top quartile).”


Even banking interns, such as those enrolled in the Goldman Sachs 2016 ‘summer analyst class’, which gives those keen on a financial career a view of its inner workings before they have even graduated, have become less motivated by materialism.

Goldman Sachs, which is lampooned by a spoof Twitter account under the label ‘GSElevator… Straight to Hell’, reported on its blog that its interns were predominantly interested in saving to buy a house (46%), while just 3% wanted to own luxury items. I suppose it might come across as presumptuous to say ‘I just want a Ferrari’ before you’d even been offered the highly desired and competitive job, but… the tone of the industry has definitely changed.

Perhaps this is partly due to the emergence of multiple sources of alternative finance, like crowdfunding and peer-to-peer lending platforms, and alternative payment systems such as PayPal and now blockchain-linked crypto-currency mechanisms.


Gottfried Haberler’s contribution to trade theory

17 Feb

Gottfried Haberler was a member of what is loosely termed the ‘Austrian’ school of economics – the group of theorists who opposed centralised (government) intervention in money creation, which they argued artificially distorted capital flows and created structural inefficiencies.

He was most closely tied to the Austrian school at the beginning of his career, when, while in that country, he was a regular contributor to the seminars organised by Austrian economist Ludwig von Mises as part of the Mises-Kreis, the celebrated circle of economic, sociological and philosophical thinkers.

In what was, for the time, a departure from the orthodox theory of value, which quantified it in terms of labour and output, the Austrian theory of value focused on the process of production itself; and on how, in electing to lend part of a finite stock of money to certain industries over others, there was a danger of creating structural inefficiencies which would self-correct over the course of the business cycle.

A Pioneer in Trade Theory

In the 1930s Haberler was instrumental in creating an alternative framework for analysing cost and value, moving away from the theory of comparative costs (advantage) on the single-product model which had underpinned trade theory since Ricardo. Haberler’s framework mapped out the relationship between the opportunity cost of producing two competing goods, under a given supply of productive factors. This he performed both for constant and fluctuating opportunity costs.

Previously, orthodox theory had been based on the ‘real-cost’ theory of value, which saw prices as quantified largely in units of labour. The new approach enabled the determination of relative prices to be analysed under more realistic production circumstances, such as variable factor proportions, in a much simpler and more direct manner than under a real-cost approach.

This paradigm shift triggered a wave of writings by other academics who incorporated and expanded Haberler’s theory, like Lerner (1932, 1933 and 1934), Leontief (1933) and Viner (1937), who introduced ‘social’ or ‘community indifference’ curves. When these two curves – those of opportunity cost and social indifference – are plotted together, Marshall’s reciprocal demand curves can be derived, and most general equilibrium effects of trade on relative commodity prices, production and consumption then shown.

What Haberler’s analysis did not include was an attempt to model consumer preferences for the commodities being produced. Nor an explanation of how productive factors evolve as an economy moves along its production-possibilities curve. This would have to wait until Stolper and Samuelson (disciples of Haberler) published their ground-breaking article in 1941, which elucidated more fully the way the production-possibilities curve is determined, and how factor proportions fluctuate along the curve.

Where Trade Theory Fails

Haberler’s 1950 work, ‘Some Problems in the Pure Theory of International Trade’, examined the less-than-ideal situation of real wage rigidity, which can be caused by insufficient mobility of labour between a developed and less developed sector; one of the scenarios examined by economists who later expanded his model. This article formed the precursor for an extensive body of literature on ‘domestic distortions’ in which orthodox theories of trade relations might be non-applicable.

The consensus that in the majority of situations free, unimpeded trade has a net benefit did not change dramatically. So-called Hicksian optimism rehabilitated the argument for free trade, largely on the basis that the wider availability of goods, and increased competition leading to cheaper prices, would yield a welfare gain; the need for protection arises only when there is a market failure in the domestic economy. Where there is a domestic divergence between prices and marginal costs, foreign competition can hurt some domestic industries.

In the event of real wage rigidity, the opening-up of trade – whether in a full customs union or a free trade area – could cause loss of output. Industries which for whatever reason are unable to pay a competitive rate which attracts new workers would be threatened by removing tariff barriers, which would allow unimpeded entry of competing products.

As the marginal return on these products became unviable, but wages were not flexible enough to change accordingly, labour would move out of these struggling industries and into more competitive ones. Often the industries that suffer are those at the leading edge of new technology and development, which lack a mature labour pool with the necessary skill set.

Let’s Get Technical

In his model Haberler demonstrated that the increased availability of products, and a wider market to stimulate output, had a net benefit, provided this increase lay to the right of the domestic indifference curve.

In trade theory the state of ‘autarky’ is one in which the factors of production are deployed to their maximum potential without foreign trade, given two limiting factors: the opportunity cost of manufacturing one product over another, which is assumed to increase; and a community indifference curve which has an inverse relationship to the opportunity cost curve (increasing where there is scarcity of a particular good).

The material gains from trade are represented in graphical form by the international trade ratio. In a model comprising two exchangeable commodities, this describes the amount of commodity A that can be exchanged for commodity B. If one unit of commodity A buys two of commodity B abroad, but at home you need two of commodity A to get one of B, then domestically B is the more valuable good and A is relatively cheap. Therefore A should be exported.
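
The ratio comparison above can be sketched in a few lines of Python. This is an illustrative helper, not anything from Haberler or the sources cited; the function name and convention (prices of A measured in units of B) are my own:

```python
# Illustrative helper: decide which commodity to export by comparing the
# domestic and international exchange ratios, both expressed as the
# price of A in units of B.

def export_choice(domestic_price_a_in_b, world_price_a_in_b):
    if world_price_a_in_b > domestic_price_a_in_b:
        return "A"  # A fetches more B abroad than at home: export A
    if world_price_a_in_b < domestic_price_a_in_b:
        return "B"  # A is relatively dearer at home: export B instead
    return None     # ratios are equal: no gain from trade

# At home, two A buy one B (A is worth 0.5B); abroad, one A buys two B.
print(export_choice(0.5, 2.0))  # -> A
```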

However, Haberler adds a caveat. If T, the international trade ratio representing the increased availability of goods from trade, is such that there is a net outflow of goods – if “these imperfections are persistent, … and that they persistently operate in such a direction as to weaken (rather than to strengthen) the case for free trade” – protection might be justified.

His idea of a desirable welfare position is not an overly naive one in which all individuals are necessarily better off, but “it is sufficient that everybody could be better off.” He distances himself from the idea that “perfect mobility of factors within each country is a necessary condition for the ideal classical model”, going on to assert that “what really causes trouble and may make trade detrimental and justify protection is rigidity of factor prices, which may or may not be associated with immobility of factors.” The most likely factor to experience difficulty transitioning between industries is that of labour.

Expanding this theory further, Brecher (1974) examined a number of scenarios involving real wage rigidity, starting with one in which free trade was combined with unemployment, and analysed the consequences of using different policy instruments. If the importable good is labour-intensive to produce, a tariff would increase employment and output by shielding domestic industry. But capital and labour would move disproportionately into the protected industry, and there would also be a by-product consumption distortion.

When the demand for a product, reflected in its price, is proportionate to the marginal cost for each firm and product, there is zero distortion. But protectionism can mean the output swells beyond sustainable consumer demand, as that industry is protected from foreign competition and, more indirectly, may benefit from tariff revenue.

The Austrian school holds that distortions of this kind inevitably self-correct over the course of the business cycle, and ‘creative destruction’ can mean boom-time companies do not survive when they lose policy protection.

The second scenario Brecher modelled was polarised between a subsistence sector and an advanced sector, where high skills and/or costly technology necessitated a wage rate in excess of the opportunity cost of labour – i.e. higher than the marginal product of labour in the subsistence sector. To phrase it in plain English, these advanced sectors would have trouble attracting capital which could be profitably employed in more basic industries.

This form of domestic distortion, he argued, necessitates subsidies in place of tariffs or taxes on trade. Because the revenue effect is negative, and higher distorting consumption taxes are therefore needed, he acknowledges that the offsetting subsidies may have to be incomplete.

The distortions not offset are weighed against the new distortions created as a consequence of financing the subsidies. While not a perfect solution, he concludes it is ‘first-best’, the most beneficial option, to deal with distortions in this way.

This theory of ‘domestic distortions’, in which Haberler led the field, is admittedly a far cry from his origins as an Austrian School disciple and staunch defender of the principle of unimpeded trade. This just goes to demonstrate his intellectual versatility and ability to break with orthodoxy and form new approaches.

But today it might be time for a new school of thought on the subject. As is so often the case in institutions where collective decision-making is skewed by relative economic and political weight, the WTO is governed to a large extent by the vested interests of the countries with the biggest economic muscle. Trade or customs unions like that existing within the EU, and the proposed Trans-Pacific Partnership (TPP), whose guillotining Trump made one of his first official presidential acts, only yield a net benefit to those participating in them.

For countries outside the golden circle, the obstacles to equitable trade are not limited to tariffs; customs scrutiny of imports is far lower, for example, within the EU, which has a unified legislative framework to enforce commercial and legal standards. Trump’s announcement of his intention to renegotiate the North American Free Trade Agreement is another step towards his avowed position as a champion of ‘free trade’[1] and greater competitiveness.



‘The Normative Theory of International Trade’ – W. M. Corden, Australian National University, Canberra  (Seminar Paper no.230)

‘Gottfried Haberler (1900 – 1995)’ – Joseph T. Salerno.

‘Some Problems in the Pure Theory of International Trade’ – Gottfried Haberler, 1950

‘Gottfried Haberler’s Contributions to International Trade Theory and Policy’ – Robert E. Baldwin, The Quarterly Journal of Economics, Vol. 97, No. 1 (Feb 1982), pp. 141–48



The Austrian School in a Nutshell

8 Dec


The Austrian School’s theory of credit and capital has a reputation for being complicated, and some posit that the reason Keynes’ ‘General Theory of Employment, Interest and Money’ holds such weight with policymakers is that it draws on the psychology of lending. Keynes saw the amount of capital channelled into investment as being inherently uncertain, a product of employer confidence in demand for goods, which depended on wider factors like the level of employment, income distribution and wage levels.

Austrians have been caricatured as prophets of ‘efficient markets’, holding essentially that artificial credit injections interfere with the natural pattern of investment. They believe investment should stem naturally from normal saving behaviour, as this creates sustainable demand for the goods and services produced through borrowing; loans or debt instruments initiated by banks on margin, and not fully offset by the bank’s stock of available capital, represent ‘artificial’ credit whose long-term effect is to cause price increases.

Austrian School economists are sceptical about policymakers’ belief that they can engineer optimal market conditions, believing that artificial intervention causes market inefficiencies. Broadly speaking, the argument runs that inefficient loan allocation to companies which are not sufficiently equipped to use them, or for whose products there is not enough demand, may in the short-term stimulate ‘growth’ – employment and consumption.

But the consequence of this ‘growth’ – the increase in wages and, concordantly, prices – will be priced into existing and future loans. The unregulated growth of credit will eventually cause a systemic shock, as confidence fails and many of the riskier debt instruments are shed by investors trying to exit their positions en masse. The value of higher-risk instruments plummets further, as some of the underlying assets – e.g. homes in the case of sub-prime mortgages – undergo foreclosure and are sold at a discount at auction. Sound familiar?…

Disciples of Hayek, who was more concerned with the structural allocation of credit across the ‘vertical’ chain of production – from commodity mining and production to equipment manufacture to the factories where this equipment is used – would focus more on the credit tied up in capex by companies investing in new technology or expansion.

The central bank is supposed to moderate growth, preventing unsustainable expansion and stimulating consumption and/or investment. Setting a minimum interest rate is just one of many ways in which it does this; asset purchase schemes and quantitative easing are two others. Note, in support, that the ‘inflation premium’ detailed by economist Irving Fisher is priced by default into interest rates by commercial issuers; the government traditionally just provided a benchmark.
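
Fisher’s premium can be made concrete with the exact form of his relation, (1 + i) = (1 + r)(1 + π). The rates below are invented for illustration:

```python
def fisher_nominal_rate(real_rate, expected_inflation):
    """Exact Fisher relation: (1 + i) = (1 + r)(1 + pi).
    The gap between i and r is the 'inflation premium'."""
    return (1 + real_rate) * (1 + expected_inflation) - 1

# Invented figures: a 2% real rate with 3% expected inflation.
print(round(fisher_nominal_rate(0.02, 0.03), 4))  # -> 0.0506 (5.06% nominal)
```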

These asset purchase and what we’re going to nickname ‘capital injection’ schemes (when the government creates bonds it then buys back from banks, creating liquidity but also increasing the public deficit) have the net result of more reserves ending up deposited with the central bank. As more capital circulates, some of it inevitably ends up in current accounts with the banks, and they are required to hold a certain ratio of this capital as reserves with the central bank for security.

If and when the central bank decides to raise interest rates, in fairness it must apply the policy rate to its own reserves, effectively paying interest on its own debt at the taxpayer’s expense. If it does not, this will act as an ‘opportunity cost’ – effectively a tax – to banks who could have invested the money elsewhere at a profit. This trend has been analysed in more depth by those such as Claudio Borio, Head of the Monetary and Economic Department of the BIS. For a brief introduction to the argument, see the speech given by him and the Bank of Thailand’s Mr Piti Disyatat on ‘Helicopter Money’.

One of those historical Austrian economists, transported to the present day, might find that their essential doctrine – that interest rates, when tampered with, can have a distorting effect on growth – still has relevance. Unquestionably, QE has boosted spending and new, sometimes innovative investment. But the cost has fallen on those financial institutions that must take on new risks as they are forced to chase ever higher yields: pension funds struggling to climb out of their own deficits, as former ‘safe haven’ assets provide insufficient income to meet their liabilities.

The rise of alternative finance seemed for some time to provide another option to the legislation-hampered banks, whose rigorous screening tests and core capital obligations prevented them from extending loans to all comers. But the marketplace lenders’ models which were heavily reliant on ‘diversifying’ the risks they did not properly analyse, by aggressively seeking new loan applicants to replace the loans that had gone bad, are now looking likely themselves to face the price of over-expansion.

The USA’s Lending Club is a prime example, having suffered losses of $36.5m in the third quarter of this year – an improvement, at least, on the $81.4m lost in the second quarter – as it confronts the need to tighten up its due diligence operations and tackle non-performing loans.

Reversing the secular and government-sponsored decline in interest rates may not be a painless process, but the alternative is to enshrine a borrowing climate which is not profitable for the majority of lenders, or for investors in the resulting securities. Can the market rely forever on the government to prop it up, even as the government’s own debt must increase to fund its stimulus measures? Does this rhetorical question even need a response?…


Ludwig von Mises

The ‘Austrian’ Theory of the Trade Cycle

Borrowed the capital theory developed by Carl Menger and elaborated by Eugen von Böhm-Bawerk. Mises attempted to prove that when, in an unsustainable credit expansion, interest rates are forced down, capital is allocated inefficiently. Because loans are granted in an indiscriminate fashion, the production process ties down capital for too long a period in relation to ‘the temporal pattern of consumer demand’. In the end, the discrepancy means the markets for both consumer and capital goods (loans) readjust to counteract the misallocation.


Friedrich A. Hayek

Can We Still Avoid Inflation?

Plotted a series of right-angled triangles to show the two factors of time and money as capital flows through the production process. The approach is generally agreed to have been overly simplistic in imagining capital as ‘tied down’ in development loans when in reality it still circulates fairly freely. But Hayek pioneered the use of time as a vital factor in analysing the boom-and-bust sequence.





Interest rate risk? What interest rate risk?

27 Oct


Growth in the third quarter of 2016 seems on track to remain consistent with that of Q2, at an identical 2.1%. This will doubtless engender much self-congratulatory patting of backs among the BOE’s Monetary Policy Committee for steering the country through a potential capacity surplus and dip in employment. But do they really deserve this round of applause?

There is an opinion held in some quarters that central bank control is the main obstacle standing in the way of the perfect equilibrium that would be achieved if interest rates were allowed to compete in a free market. While I do not necessarily believe that free-market capitalism always acts in favour of the greatest good of the greatest number of people, the tried-and-tested sticking plaster of QE does not reach the deeper systemic problems – something the BOE itself admits.

As well as announcing the continuation of its asset repurchase policy (quantitative easing), the BOE’s Monetary Policy Committee decided in August to cut Bank Rate to a growth-stimulating – at least in theory – 0.25%. The negative effect this would have on banks’ lending margins would, they hoped, be countered by the Term Funding Scheme (TFS) offered to the institutions affected.

They elaborated in the August Inflation Report that “the Term Funding Scheme (TFS)… will provide funding for banks at interest rates close to Bank Rate.  This monetary policy action should help reinforce the transmission of the reduction in Bank Rate to the real economy to ensure that households and firms benefit from the MPC’s actions.  In addition, the TFS provides participants with a cost-effective source of funding to support additional lending to the real economy, providing insurance against the risk that conditions tighten in bank funding markets.”

This is all very well-meant and of noble intention, but it is doubtful it alone would counteract the many factors working against free lending by banks to needy applicants. For one, the alternative finance market is proving serious competition: the nimbler new competitors are far more proactive in sourcing customers, using sophisticated data analysis to find leads, and their pricing algorithms bypass the more thorough risk profiling that traditional banks must perform. In addition, they often avoid taking direct liability for issuing the loans, doing this instead through partner banks. By keeping their balance sheets free of these liabilities, they avoid the imposition of the tiered capital ratios required along the lines of Basel III.

Let’s get Specific

Interest Rate Risk in the Banking Book (IRRBB) is a quantified metric which is itself the subject of constant, international regulatory scrutiny. In a report released April 2016, the Basel Committee on Banking Supervision succinctly explains the complex relationship between different loan products:

“IRRBB arises because interest rates can vary significantly over time, while the business of banking typically involves intermediation activity that produces exposures to both maturity mismatch (e.g. long-maturity assets funded by short-term liabilities) and rate mismatch (e.g. fixed rate loans funded by variable rate deposits). In addition, there are optionalities embedded in many of the common banking products (e.g. non-maturity deposits, term deposits, fixed rate loans) that are triggered in accordance with changes in interest rates.”

An option is linked to a specified contingency which, if it occurs, gives you the right but not the obligation to exercise. For example, if you are stuck with a loan at 4% interest originated when base rates were at 1.2%, and base rates then fall to 0.5%, your interest payments will be overpriced relative to the rest of the market. The option might specify that if rates fall below a certain point, you can switch to a floating rate linked to a benchmark like LIBOR (the London Interbank Offered Rate, administered by the ICE, or Intercontinental Exchange, group).
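
That embedded option can be sketched as follows; the loan size, rates, trigger and spread are all invented for illustration, and the spread over the benchmark is my own assumption:

```python
def annual_interest(balance, fixed_rate, benchmark, trigger, spread=0.01):
    """Loan with an embedded option: once the benchmark rate falls below
    the trigger, the borrower may switch from the fixed rate to a
    floating rate of benchmark + spread. All parameters are invented."""
    if benchmark < trigger:
        return balance * (benchmark + spread)  # exercise: pay floating
    return balance * fixed_rate                # hold: stay on fixed

# Fixed at 4% when the base rate was 1.2%; the base rate falls to 0.5%.
print(annual_interest(100_000, 0.04, 0.005, trigger=0.012))  # -> 1500.0
```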

Definitions, definitions, definitions

‘Option risk’ is just one of three defined types of interest rate risk; the others are ‘gap risk’ and ‘basis risk’. Automatic option risk arises when an optionality is included as part of a synthetic product, be it an asset, a liability and/or an off-balance-sheet item. Behavioural option risk is where the change in cash flow results from a human decision to exercise an option.

So-called gap risk describes the shortfall that arises when the rates on banks’ debt instruments change at different times. Some loans, particularly for construction or infrastructure projects, are staggered so that interest payments increase towards the end of the loan’s term, since during the early set-up phase there is little to no cash flow or income.
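
A minimal way to picture gap risk is the classic repricing-gap table: bucket rate-sensitive assets and liabilities by when they reprice, then multiply each bucket’s gap by a rate shift for a first-order earnings impact. Bucket labels and figures below are invented:

```python
# Repricing-gap sketch: bucket labels and figures are invented (in GBP m).
assets      = {"0-3m": 40, "3-12m": 25, "1-5y": 60}   # rate-sensitive assets
liabilities = {"0-3m": 70, "3-12m": 30, "1-5y": 10}   # rate-sensitive liabilities

def repricing_gaps(assets, liabilities):
    """Per-bucket gap: assets repricing minus liabilities repricing."""
    return {b: assets[b] - liabilities[b] for b in assets}

def nii_impact(gaps, rate_shift):
    """First-order change in net interest income for a parallel shift."""
    return {b: g * rate_shift for b, g in gaps.items()}

gaps = repricing_gaps(assets, liabilities)
print(gaps)                  # -> {'0-3m': -30, '3-12m': -5, '1-5y': 50}
print(nii_impact(gaps, 0.01))  # a +100bp shift hurts the short buckets
```

A negative gap in the short buckets means liabilities reprice first, so a rate rise squeezes income before the assets catch up.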

Basis risk is classified as the impact of relative changes in interest rates between instruments which are linked to different pricing benchmarks – say, EM bonds against Treasuries, which are priced by different market factors and whose yields might move at different rates even though their tenors are the same. One might even be used to hedge the other, and basis risk quantifies the effectiveness – or not – of this hedge.
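
A toy numerical sketch of that hedge, with all yields invented: what the hedge fails to cover is the residual move in the spread between the two benchmarks, i.e. the basis.

```python
# Basis-risk sketch with invented yields: an EM bond hedged with
# Treasuries. The hedge covers parallel moves; the residual move in
# the spread between the two benchmarks is the basis risk.
em_before, em_after   = 0.062, 0.071   # EM bond yield
ust_before, ust_after = 0.018, 0.024   # Treasury (hedge) yield

basis_before = em_before - ust_before  # 440bp
basis_after  = em_after - ust_after    # 470bp
residual_bp  = round((basis_after - basis_before) * 10_000)
print(residual_bp)  # unhedged move, in basis points -> 30
```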

The regulator admitted it was having difficulty persuading banks to fully assess the interest rate risk on their books: in their internal assessments they tended to focus on earnings measures over the period under review, rather than economic-value measures which examine assets and liabilities right up to their expiration. In its guidance the Basel supervisory committee stressed the importance of having the two types of assessment complement each other.

If the recent troubles of Deutsche Bank are any indicator, it is that banks can struggle to reconcile their regulatory obligations with holding assets risky enough to turn a viable profit; that, and the importance of a smoothly running infrastructure and back end. On the other hand, JP Morgan’s recent triumphant quarter shows you can make a killing out of trading bonds, if you know what you are doing.


Why Recycling is the new Moisturising.

19 Oct

Of all the business workings you must archive and report, ‘waste’ is probably the least appetising. Trying to tot up the margin of product that fell off the production line, the bits you’d like to pick back up off the scrapheap… it’s not an auditor’s most exciting way to spend a day.


There are a number of different regulations currently in force in the UK which, as many are derived from EU guidelines, might change over the course of the Brexit negotiations. They govern aspects as diverse as a ‘tax’ on packaging for prolific producers of paper, polyethylene-and-cardboard hybrid coffee cups, glass and so on; and punitive fines and even jail time for companies which engage in unlicensed disposal of ‘controlled waste’.

Wrapping-paper Tax

Any UK-listed or UK-operating producer which handles more than 50 tonnes of packaging a year, and which has a turnover of over £2 million, is obliged to submit a Packaging Tax Return to HMRC. It then has to offset its obligation by funding a commensurate amount to the government’s packaging recycling programme.
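
The two thresholds reduce to a simple joint test. This is a sketch of the rule exactly as stated above, not of the underlying regulations’ full detail:

```python
def packaging_obligated(turnover_gbp, packaging_tonnes):
    """Both thresholds from the rule as stated must be crossed:
    turnover over GBP 2m AND more than 50 tonnes of packaging a year."""
    return turnover_gbp > 2_000_000 and packaging_tonnes > 50

print(packaging_obligated(3_500_000, 120))  # -> True
print(packaging_obligated(3_500_000, 20))   # -> False (under the tonnage bar)
```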

The private sector too has cashed in on packaging, with a wealth of innovative initiatives to minimise waste. Probably the greatest expansion has been in devices to prevent food wastage, from the now fairly commonplace ethylene absorbers to special types of bacteria-fighting film. Ethylene is a hormone produced by metabolism in most fruit; it initiates and accelerates ripening and causes vegetables to start decomposing. Several companies now provide packaging with ethylene absorbers to increase produce shelf life.

Still more exciting are some of the patented inventions now seeking corporate sponsorship. For example, the wrappers with built-in anti-microbial properties recently developed by the Fraunhofer Institute for Process Engineering and Packaging IVV in Freising. Sorbic acid is the active component of the bacteria-fighting film, which in trials reduced an E. coli colony cultivated on day-old pork loin to around a quarter of its initial size. Crucially, in the concentration of the lacquer applied to the film, sorbic acid is neither poisonous nor allergenic, and is virtually odourless and tasteless.

Dodge the Plastic Bag Tax

The resource-efficiency bar has been raised still higher in the compostable plastic department, where a number of global competitors jostle for supremacy. In the UK there are several competing providers of biodegradable plastics, including Scotland-based BioBags, and Biopac, the self-described ‘leading developer’ of a very wide range of eco-friendly food packaging and catering disposables.

In Australia, where ‘sustainability’ is a buzzword even for the big mining companies, one player dominates the market. Publicly listed Secos was formed in a reverse merger of Cardia Bioplastics Ltd with Stellar Films Group Pty Ltd in April 2015. Post-merger, its preliminary annual report for December 2015 showed total assets – including cash, trade and other receivables, and prepayments – of $9,076,829. The most recent figures available show that the Australian stock market looks favourably on its prospects: its P/B (price-to-book) ratio is 2.33, compared with 1.43 for the market benchmark and 1.54 for the sector.

UK-based Biopac’s impressive range of products enables catering and hospitality companies to proudly declare their green credentials; not only can they cite their sustainable container purchases in their annual reports, the claim is often branded on the product itself. There is the ‘I am not a plastic cup’, made from renewable cornstarch, which also carries the government-approved CE marking (£130 for a case of 2,100); and the 12oz single-use* ‘I’m a Green Cup’, made from certified FSC (Forest Stewardship Council) board with a starch material, which is actually 100% compostable (£57.45 for a case of 1,000). Various PLA (polylactic acid) clear tumblers …

If you needed further proof that this was a growth trend that has become impossible to ignore, there’s even a site called ‘Biodegradable Plastic Glasses’ (insert domain name here).

Or, if you prefer a more official stamp of approval, a market research report by MarketsandMarkets entitled ‘Biodegradable Plastics Market by Type (PLA, PHA, PBS, Starch-Based Plastics, Regenerated Cellulose, PCL), by Application (Packaging, Fibers, Agriculture, Injection Molding, and Others) – Global Trends & Forecasts to 2020’ states that the biodegradable plastics market is projected to grow from more than USD 2.0 billion in 2015 to USD 3.4 billion by 2020, at a CAGR of 10.8%.
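
The report’s arithmetic roughly checks out: growing exactly USD 2.0 billion to USD 3.4 billion over five years implies a CAGR a little above the quoted 10.8%, which is consistent with the ‘more than USD 2.0 Billion’ starting figure. A quick check:

```python
def cagr(start, end, years):
    """Compound annual growth rate implied by start and end values."""
    return (end / start) ** (1 / years) - 1

# USD 2.0bn (2015) to USD 3.4bn (2020):
print(round(cagr(2.0, 3.4, 5) * 100, 1))  # -> 11.2 (percent per year)
```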

That’s a nice return on your investment.




Why ‘Defined Benefit’ should become ‘Pre-Defined Benefit Re-Defined in Light of Changing Conditions’

4 Sep


A recent well-intentioned but technically inaccurate FT article laid out the pitfalls of employers continuing to offer Defined Benefit (DB) schemes, in light of the recent bout of QE and interest rate revisions. Briefly, a separate article reported that consultancy Hymans Robertson had conducted research hypothesising that the BOE’s announcement of a £70bn QE programme would lead directly to another £70bn DB shortfall.

The BOE also announced it would cut interest rates by 0.25%, which hits the returns on gilts and other ‘safe haven’ government securities by driving their yields lower, and correspondingly increases the demand for, and price of, riskier fixed-income products. Pension trustees had better make some shrewd investment choices.

The writer argued that in order to solve the deficit overhang, the government needed further legislation to prevent the proliferation and even the continuation of DB schemes. He stopped just short of saying they should be outright banned.

A Lighter Touch

What the government has done, under the 2014 Pensions Act, is more of a soft-touch approach which controls market incentives to withdraw money from schemes, and window-shop continually for better ones. This provides more security for scheme managers, in terms of the available capital for investment.

To issue a ban on Defined Benefit schemes would be a hostile move, which might be resisted by certain interest groups and create a conflict both sides are surely keen to avoid. When a legislator has a choice between prohibition measures and providing incentives for industry participants to behave a certain way, a collaborative approach is usually more effective.

Though that is not to say that the government hasn’t laid down some firm rules on scheme governance and providing Value for Money.

Can Members Leave a Scheme if they Want to?

Although members are technically free to leave the scheme at any time, a number of barriers impede them from doing so. Many funds impose exit fees, to try to keep investments safely enclosed in the fund and prevent a deficit from occurring.

Another barrier is that administrative provisions for transfer of their pension pot might be insufficient or incomplete. Scheme member data storage and integrity has recently been the subject of widespread audits and reforms, because so much information was missing or compromised. For example, employees would be classed as ‘absent’ from the scheme and hence the records, when in fact they had a Personal Pension Plan that the employer made regular contributions to.

Pre-existing pension legislation held that if you had been in the scheme for less than two years, and the scheme rules permitted it, you were entitled to a refund of the contributions already made, though having received the refund you would not be entitled to any benefits for the period to which it relates. However, the Pensions Act 2014 made it harder to leave or withdraw money from a scheme in several new ways, one of which was the abolition of these ‘short service’ refunds for people who leave a money purchase occupational pension scheme after the mandatory 30-day period but within two years of joining.

Mandatory Disclosure

Money purchase schemes are simply Defined Contribution (DC) schemes by another name, where an employee contributes an agreed percentage of their salary at set intervals, with no guaranteed level of return. Targets and benchmarks are shared with scheme members, and indeed it is a legal requirement that trustees provide members with an annual statutory money purchase illustration (SMPI), which states the likely pension at retirement based on in-house assumptions about market conditions, including inflation, along with details of contributions credited (before deductions) to the member in the preceding scheme year.
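
A minimal sketch of the kind of projection such an illustration performs, assuming hypothetical growth and inflation rates (real statutory illustrations follow a prescribed basis, and every figure below is invented):

```python
def project_pot(current_pot: float, annual_contribution: float,
                years: int, nominal_growth: float, inflation: float) -> float:
    """Project a DC pot to retirement, expressed in today's money."""
    pot = current_pot
    for _ in range(years):
        pot = pot * (1 + nominal_growth) + annual_contribution
    # Deflate to today's prices, as money purchase illustrations require
    return pot / (1 + inflation) ** years

pot_real = project_pot(current_pot=20_000, annual_contribution=3_000,
                       years=25, nominal_growth=0.05, inflation=0.025)
print(f"Projected pot in today's money: £{pot_real:,.0f}")
```

The deflation step is the important part: quoting the pot in today’s money is what lets members judge whether the promised outcome is realistic.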

So a major advantage of DC schemes is that the governing directors’ expectations and projections of expected returns are continually revised. In the current fixed-income investment climate, with the BOE still steering interest rates on a tight rein particularly in the uncertainty surrounding Brexit, to make the kind of gold-plated promise to employees which Defined Benefit schemes make does not seem… prudent.

This enhanced level of disclosure was also introduced in the recent reforms governing trustee accountability. Not that I’m biased, but the now fairly stringent requirements on trustees of money purchase schemes seem to make them the better option. Among the other provisions are that:

–              Investments made with each contribution should be documented, recording the date of each. Best practice is that every new contribution be invested within five working days; where a member’s contributions are invested in more than one fund, and “the total amount contributed in a period is recorded explicitly”, the sum of the individual transaction elements should be verified against the total contribution.

–              To this end, there should be a record of every investment sold, date sold and amount realised. This does not have to be recorded separately for each contributor, but must be categorised by investment fund.
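
The reconciliation step described above – checking that the individual fund investments sum to the member’s total contribution – can be sketched as follows (the fund names and amounts are hypothetical):

```python
from decimal import Decimal

def reconcile(total: str, allocations: dict) -> bool:
    """Return True if per-fund investment amounts sum exactly to the
    member's total contribution; use Decimal, never float, for money."""
    return Decimal(total) == sum(Decimal(a) for a in allocations.values())

# Hypothetical member contribution split across two funds
print(reconcile("250.00", {"Global Equity": "175.00", "Gilt Fund": "75.00"}))
# → True
```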

Reasons not to Shop Around

The 2014 Pensions Act contained a number of other measures relating to private pensions, many of which strengthen existing legislation. Many seem calculated to try and ring-fence the contributions to existing schemes. They include provision for:

  • a new power to make regulations to prohibit the offering of incentives to transfer pension scheme rights
  • the introduction of a new statutory objective for the Pensions Regulator, to minimise any adverse impact on the sustainable growth of sponsoring employers when exercising its functions relating to scheme funding
  • measures to restructure the Pension Protection Fund compensation cap to better protect long serving scheme members
  • an amendment to the Public Service Pensions Act 2013 to allow smaller public body pension schemes to transfer accrued rights into one of the larger public service schemes

So in conclusion, measures have been taken to make DB schemes vastly less attractive. But an outright ban would itself infringe on the rights of existing DB scheme-holders, the very people whose schemes these lobbyists are trying to save from folding due to insufficient funds.

The important thing is for DB scheme managers to have the freedom to adjust their expectations and projected returns to a level which is compatible with their income stream and all their liabilities. Which I suppose would mean they were no longer ‘Defined Benefit’ but ‘Pre-Defined Benefit Re-Defined in Light of Changing Conditions’. If only someone would outline the circumstances in which this decision would be permissible.



MyWorkpapers releases game-changing software Connect – ‘like Facebook for Accountants’

17 May

As a seasoned accountant who has both managed his own practice and worked in-house on the corporate side, MyWorkpapers CEO Richard Neal has used his years of experience to create the complete set of tools for the busy accountant.

The newest addition to the MyWorkpapers toolbox is ‘Connect’, a collaborative platform between client and accountant which complements its bookkeeping package. Richard believes the efficiency gains from this service could be as high as 30-50%; an ambitious claim but one which may just be borne out.

Its most obvious function is to provide a place for accountants and clients to communicate – “like Facebook, but for accountants,” he jokes. The ‘traditional’ communication methods of email and telephone can be inefficient, as calls and messages can be missed or lost in translation. Here queries can be organised alongside the client worksheets, and supporting information can be uploaded instantly to the relevant section.

Because it is a collaborative platform, everyone involved in managing the ‘numbers’, be it bookkeeper, business or accountant, is able to literally drag and drop supporting documents where the need arises. This process also leaves a clear audit trail.

In the gap between an accountant emailing for explanations of issues arising from the latest batch of figures and the responses finally coming back, vital information might be missed.

Pro-Active Accounting

Richard is keen for accountants to take a more active role in offering clients advice on how to optimise their business processes. The danger, he says, especially with the advent of bookkeeping software, is that accountants will be relegated to processing end-of-year accounts, because clients can learn and perform more of the bookkeeping, monthly reconciliation tasks and VAT returns themselves.

“Cloud accounting software has been picked up and adopted by thousands of businesses in the last few years because of the data feeds and intuitive use. Accountants are also promoting it to their clients, and assisting with implementation and rollout. The consequence, however, is that the ‘honeymoon’ is soon over because the client becomes educated and think they don’t need their accountant as much… Businesses become more independent and choose to do more themselves, and are less reliant on their accountant for processing and checking of the numbers.”

He is concerned that among the older generation at least, there is a reluctance to adopt time-saving technology because their methods have stood the test of decades. Why change now? Well, for one because they are in danger of having revenue poached by the tech-savvy twenty-somethings, who leap on promising apps and software with gold-rush enthusiasm.

And Richard is confident there is nothing quite like this on the market. A core attribute of Connect is that Workpapers are dynamically generated by integration connections. There is no exporting or importing of CSV files. Queries and tasks are also done through the product.

When his own company switched to using the Connect package, Richard says they saved around £20,000 on bookkeeping costs, and £5,000 on accounting fees. Another part of this efficiency gain is thanks to the capability for creating personalised worksheet templates and tasks and processes for different clients. By stipulating monthly tasks and required documents, the practice can keep tabs on problem areas like delinquent debts or payments; create realistic cashflow forecasts; spot and track potential areas of inefficient spending. Duplicated information is easily identified and corrected.  And because all problems have been resolved as and when they arise, “year-end becomes a breeze both for the accountant and the client.”

A question of security

What differentiates this package from other accounting software offerings is the attention to detail: for example, they have hired an external cyber-security company – “basically hackers” is how Richard Neal describes them – to do penetration testing, so their encryption protection is independently verified. All the data is stored in the UK, in accordance with the Data Protection Act.

Furthermore the data is not just backed up, it is duplicated on three levels, and stored both on and off-site with another provider. Where the industry standard for data duplication is 15 minutes, “ours is truly instantaneous… if one server goes down, it will come straight back up and you won’t even notice.”

Perhaps the most telling indication of how big a deal this company actually is, though, is that its services are now employed by one of the ‘Big Four’ accounting firms. Naturally, assessment of its IT security policy formed part of the six-month due diligence process.

And how much is this revolutionary software?

Just £5 a month for each individual user.


Bringing Science Back to Surveys

27 Mar

Increasingly it is becoming obvious to the financial community that CSR criteria cannot be taken lightly: they form an important part of how commercial companies and investment funds report their respective ‘successes’.

But simply surveying ‘investor attitudes’ is not an effective or, at any rate, empirically sound way of gauging their CSR priorities. Should a company focus on managing its carbon emissions, or having a flawless labour relations record?

Literature Review

Indeed, academic consensus shows that organisations that tackle stakeholders’ concerns, rather than chasing public approbation, achieve greater returns than firms that fail to address these interests. Manrai and Manrai (2007) demonstrated the success of this priority in reducing customer churn; Sweeny and Swait (2007) showed how it increased market share and profits. McDonald and Rundle-Thiele contended in their paper that customers, though, were satisfied not so much by corporate social responsibility (CSR) initiatives – which might be costly – as by dramatic-sounding but essentially superficial actions like preventing child labour, or other human rights abuses.

These considerations ranked highest in customer satisfaction levels, according to the authors’ examination of banking industry surveys. Furthermore, they found community support, i.e. “offering customers in low socio-economic groups fee-free accounts and low-interest loans, banks’ support of their employees’ volunteer activities via paid leave and flexible working arrangements”, resulted in the lowest customer satisfaction. Other important sustainability measures, “reduction of water and energy consumption, carbon offset programmes, recycling and use of recyclable materials,” delivered the third-highest level of customer satisfaction.

Questions of Reliability

Survey responses, though, are not the most reliable means of gauging an individual’s true opinion, as respondents usually know the purpose of the survey so they might be inclined to slant their responses in a ‘helpful’ direction. Another potential – and major – source of bias arises from participants’ desire to present themselves and their firm in the best possible light. This could be an unconscious source of bias even when responses are anonymous.

Examples abound of surveys linked to plush events with a hefty price tag, where participants are wined and dined and soothed into a compliant mood by keynote speakers who promote a sense of inclusion around common issues. An even more blatant example of ‘priming’, as psychologists term the process of activating particular emotions and short-term memories, is the award ceremony where everyone unites in an orgy of back-patting and self-congratulation.

Making Surveys more Scientific

Isn’t it time someone initiated a rigorously scientific study into stakeholders’ real, unbiased opinions? Academic psychology studies are carefully crafted to minimise all possible sources of bias; and the results, when analysed, are not simply tested for correlation or fitted with regression analysis, which imposes a series of ‘logical’ mathematical assumptions to determine constants that support its own self-generated model: assuming, for instance, that the ‘error term’ has zero mean and constant variance, conditions rarely if ever met in real-life data.

Let us say that we initiated an experiment with controlled variables, dividing participants into three groups, each exposed to one of the following conditions:

  1. Played video/ shown slides about two companies’ efforts at reducing and effectively reporting their carbon emissions.
  2. Played video/ shown slides about two companies’ efforts at providing favourable work conditions – flexible hours, home working, travel bursaries, sponsoring further vocational qualifications.
  3. Played video/ shown slides about two companies’ efforts at providing grassroots investment to local communities affected by their activities.

At the end, each participant was given an assessment card on which they recorded their perception of each company’s level of achievement in each area, on a scale of 1 to 10. Then – and this would be the experiment’s overarching objective – they would be questioned about whether they thought the company scored highly on CSR criteria, whether they thought it represented a good long-term investment, and its perceived risk level.

These experimental conditions would need to be replicated across a number of samples, such that the results started to follow a normal distribution. The assumption of the homogeneity of variances between samples is a key tenet of the statistical test you are about to perform, but corrections could be made if this is not entirely the case.

Levene’s Test tests the null hypothesis that the variances of the groups are the same. If Levene’s test is significant (i.e. the significance value is less than 0.05), then the variances are significantly different, meaning that one assumption of the analysis of variance has been violated.

Levene’s test statistic is computed from the absolute deviations of each observation from its group mean (or median), and compared against an F-distribution.
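
As a sketch of how both checks might be run in practice with scipy (the group scores below are simulated for illustration, not real survey responses):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Simulated 1-10 perception scores for the three experimental conditions
emissions = rng.normal(6.5, 1.2, 30).clip(1, 10)
workplace = rng.normal(7.0, 1.2, 30).clip(1, 10)
community = rng.normal(5.8, 1.2, 30).clip(1, 10)

# Levene's test: null hypothesis is that group variances are equal
w_stat, p_levene = stats.levene(emissions, workplace, community)
print(f"Levene W = {w_stat:.2f}, p = {p_levene:.3f}")

if p_levene < 0.05:
    print("Equal-variance assumption violated; apply a correction")
else:
    # One-way ANOVA: do mean scores differ between the three conditions?
    f_stat, p_anova = stats.f_oneway(emissions, workplace, community)
    print(f"ANOVA F = {f_stat:.2f}, p = {p_anova:.4f}")
```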

You could then perform an analysis of variance, to determine the difference between systematic and unsystematic variance. For a full walk-through of how Analysis of Variance is performed, with or without the aid of stats software, stay tuned for my next blogpost …


Remember also that psychologists have to control for multiple sources of bias, for example:

  1. Selection bias. Measures should be taken to ensure that demographic factors which might influence a subject’s opinion or response, such as age, income bracket, ethnicity, etc., are controlled for.

If different participants are used for different experimental conditions, a method for allocating interventions to participants must be laid out in the report, based on some chance (random) process, i.e. sequence generation. Moreover, steps should be taken to prevent foreknowledge by participants of the forthcoming allocations.


  2. Performance bias, defined as systematic differences between groups in the care that is provided, or in exposure to factors other than the interventions of interest. Many studies are designed such that the actual thing being measured is concealed under an ulterior objective which is presented as the subject of study.


The aim is to reduce the risk that knowledge of which intervention was received, rather than the intervention itself, affects the results. Often the assessors are also ‘blinded’ as to which participants have received which condition, to prevent them unconsciously biasing the outcomes.


  3. Detection bias, defined as systematic differences between groups in how outcomes are determined. In recording a subject’s reactions, if the evidence is qualitative rather than quantitative an assessor can unconsciously predicate the desired result. Again, blinding (or masking) of outcome assessors may reduce the risk that knowledge of which intervention was received, rather than the intervention itself, affects outcome measurement.


  4. Reporting bias refers to systematic differences between reported and unreported findings. Within a published report, those analyses with statistically significant differences between intervention groups are more likely to be reported than non-significant differences.



How does a “soft pull” affect your credit score, and your ability to participate in P2P marketplaces?

14 Jan


A ‘soft pull’ or ‘soft inquiry’ is when an institution, or indeed you yourself, does a credit check on you without it affecting your credit score. If you were applying for a loan and the bank did a ‘hard pull’ on you, and you were subsequently denied the loan, this would stay on your permanent credit record.

A hard pull resulting in a failed application would likely lower your credit score, because if you already have debts owed and are making further loan applications, this would make you a less attractive applicant.

Many organisations can ask for a ‘soft pull’ on your credit record, including P2P lenders. A potential employer can ask your permission to do a superficial credit check on you. Financial institutions you already have an account or relationship with check your credit; and credit card companies that want to send you preapproved offers check your credit.

What kind of checks do P2P lending forums perform?

The level of scrutiny a marketplace lender will put a potential applicant under varies greatly depending on the loan provider. Some target the upper tier of borrowers, while others offer sub-prime loans to those with credit scores too low to allow them to qualify anywhere else.

Some hire bespoke credit database companies to do an in-depth background check on applicants, usually if they are the company CEO and it is a business loan; other, perhaps less discerning, lenders stick to the three main credit reporting bureaus, Equifax, Experian and TransUnion.

Avant is an example of a company which deliberately targets applicants with low credit scores, offering them the chance to “repair” their credit score with a history of prompt loan repayments. Naturally applying for a loan with Avant, the company assures consumers, will not affect their credit score.

FICO (Fair Isaac Corporation), the independent analytics company whose scoring model underpins the credit scores issued by Equifax, Experian and TransUnion, warns that loan companies promising quick-fix solutions to a credit score are making empty promises.

The company does warn applicants that the interest rate on the loan they take out will be more favourable if they have a good credit record. It uses this as an incentive to borrow, in the hope of ‘saving money’ in the future:

“We’ll send notice of payment history to the major credit bureaus which may improve your credit score with timely payments. As your credit improves, you may be eligible for lower rates on subsequent loans through AvantCredit.” Note that the representative APR is a hefty 48.5%.

It is true though that timely reporting to credit bureaus of prompt repayments on a loan might in the long run make a borrower seem more trustworthy. But if you are only looking to improve your credit score, remember that much of your scoring comes from paying bills on time (about 35%) and how much outstanding debt you have. Factors like drawing on a range of different forms of credit (e.g. credit card and long-term loan) comprise around 10% of your score.
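
Those category shares are the approximate weightings FICO itself publishes. The sketch below simply blends hypothetical 0-100 sub-scores with them; the scaling to a 300-850 range is illustrative, not FICO’s proprietary formula.

```python
# Approximate category weightings FICO discloses publicly
FICO_WEIGHTS = {
    "payment_history": 0.35,   # paying bills on time
    "amounts_owed": 0.30,      # outstanding debt and utilisation
    "length_of_history": 0.15,
    "credit_mix": 0.10,        # range of credit types in use
    "new_credit": 0.10,        # recent applications, i.e. hard pulls
}

def weighted_score(subscores: dict) -> float:
    """Blend 0-100 sub-scores into a 300-850 style figure (illustrative)."""
    blended = sum(FICO_WEIGHTS[k] * v for k, v in subscores.items())
    return 300 + (blended / 100) * 550

score = weighted_score({"payment_history": 90, "amounts_owed": 70,
                        "length_of_history": 60, "credit_mix": 50,
                        "new_credit": 80})
print(f"Illustrative score: {score:.0f}")  # → Illustrative score: 710
```

Even in this toy version, payment history and amounts owed dominate: nothing a lender reports about one new loan can outweigh them quickly.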

Social Selection at Social Finance

Social Finance Inc. (SoFi) carefully selects its borrowers, assessing a range of financial and cultural factors to determine not only the applicant’s creditworthiness, but also effectively their social status. It asks questions about their education and their career experience, as well as monthly income vs expenses, and obviously their financial history.

SoFi likes to keep loans within the SoFi community, operating a subtle social streaming process. It offers cash rewards for successfully referring a friend. Previous loan applicants can share a referral link to let someone else refinance their student loan or take out a personal loan.

The platform also mentors newly graduated entrepreneurs through the SoFi Entrepreneur Program, and here some of the application questions are indicative of the parallel socio-cultural assessment. The company is asking itself, “Is this individual a long-term investment?”

Such questions include details like the name of the school the applicant graduated from, and details of their employment such as “Are you a founder/ co-founder?” and “Are you working full-time?”.

In Conclusion

The question is how thoroughly the organisation manages its data, and if it sells it to a third party. Information in this industry is currency, and there is no guaranteeing the privacy of everything you disclose in an application.

Consider that, while the Federal Housing Administration (FHA) says anyone with a credit score of 500 can apply for a mortgage loan, 97% go to those with credit scores of 620 or over. While a ‘soft pull’ will not affect your permanent credit score, there is no guarantee the information unearthed will in no way affect your application for a marketplace loan.



Marketplace Lending – A High-Risk Investment? Too Soon to Tell

13 Jan


Where banks must make full disclosures of their capital adequacy ratios (under Basel 3) and, in their annual reports, the current market value of all their assets and liabilities, including derivatives, marketplace lenders face far fewer transparency obligations.

Key names such as Borrowize, Accion, Fundera, Multifunding and others are seemingly not financially significant enough to be required to publish annual reports on the SEC’s electronic filing system EDGAR. The information is not readily available on their websites. And they do not willingly give such information to journalists or inquisitive citizens.

Those whose investors are mostly private institutional clients admittedly have no real public obligation. But marketplace lenders whose primary stakeholders are retail investors arguably do have a duty to give some detail on how they manage the key risks of conducting their business.

Market Risk

This we will broadly define as the risk that an asset will decline in value due either to macro conditions, e.g. change in interest rates; or some underlying change in the asset class, e.g. a certain number of loans defaulting on payments.

Lending Club is typical of many marketplace lenders, in that it offsets its exposure to the loan pool by selling notes, equivalent at the time of issuance to the value of the loan, to its institutional partner. Lending Club collaborates with Utah-registered WebBank, partly to take advantage of Utah’s lax stance towards interest rate-capping. But many lenders have a wider range of institutional partners.

In its 2014 annual report, Lending Club explains away its market risk thus:

“Because balances, interest rates and maturities of loans are matched and offset by an equal balance of notes and certificates with the exact same interest rates and maturities, we believe that we do not have any material exposure to changes in the net fair value of the combined loan, note and certificate portfolios as a result of changes in interest rates. We do not hold or issue financial instruments for trading purposes.

 The fair values of loans and the related notes and certificates are determined using a discounted cash flow methodology. The fair value adjustments for loans are largely offset by the fair value adjustments of the notes and certificates due to the borrower payment dependent design of the notes and certificates and due to the total principal balances of the loans being very close to the combined principal balances of the notes and certificates.”
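
A minimal sketch of that matched-book logic under a discounted cash flow valuation (the cash flows and discount rate below are invented; the point is that identical streams on both sides of the balance sheet leave net exposure at zero):

```python
def fair_value(cash_flows: list, discount_rate: float) -> float:
    """Present value of a stream of annual cash flows."""
    return sum(cf / (1 + discount_rate) ** t
               for t, cf in enumerate(cash_flows, start=1))

# A 3-year loan paying $400/year; the borrower-payment-dependent notes
# entitle holders to the same stream, so revaluations offset
loan_pv = fair_value([400, 400, 400], discount_rate=0.08)
note_pv = fair_value([400, 400, 400], discount_rate=0.08)
print(f"Loan PV = {loan_pv:.2f}, net exposure = {loan_pv - note_pv:.2f}")
# → Loan PV = 1030.84, net exposure = 0.00
```

If rates move, both present values move together, which is exactly the claim Lending Club makes in the passage above.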

In order for the loans’ value to continue to equal that of the notes and certificates, the debt trading forum must ensure prompt resolution of any delinquent debts. But lenders do not generally have the right to forcibly repossess goods to the sum of what is owed. Fixed charges placed over assets owned by the debtor would recover part of the money owed.

But how actively are the loan issuers ensuring collateral is posted?

Collateral Posted – A Mixed Bag

Intersect Fund and Copperline, in response to our inquiry, volunteered information about their policy on collateral, and on their credit checks. The contrasting policies of these two respondents clearly demonstrate that there is no fixed industry standard on either point.

When asked under what circumstances they would require an applicant to post collateral, Intersect Fund explained: “We use collateral as a compensatory factor for recent credit blemishes and overdrafts. We don’t have a LTV (loan to value) minimum and it depends on how strong the applicant is in other areas.”

Intersect Fund makes a policy of taking contact numbers for four character references, in addition to running a personal credit check through TransUnion. For Copperline, “Personal credit reports (from Experian) serve as character references”.

The ‘insurance policy’ of Copperline is also more relaxed, and typifies the more liberal end of the lending market. A spokesperson summarised, “We only require collateral if the client is purchasing equipment in which case we take the said equipment as collateral. We never take additional collateral.”

In sum

While all marketplace lenders take measures to counter the risk of delinquency and default, there are limits to the measures they can take to recover missed interest payments. Fortunately there is an ever-expanding supply of fresh loan applicants to keep their portfolios at full value.

Furthermore, the actual sums at stake seem to indicate this risk is for now fully under control. To draw again on Lending Club’s 2014 annual report – this time a detail from the auditor’s notes – we can see that its ‘loan loss contingency fund’ of $1,824,739 is more than sufficient to cover the losses in its three main portfolios over the preceding two years. In fact, the largest annual shortfall, in 2012, was $512,395.

So the management has just cause to consider the loan loss contingency fund “sufficient” to cover all potential future losses from its portfolios.