The building of a bullshit economy

In 2013, anthropologist David Graeber, now a professor at LSE, crashed the website of a small magazine with a short essay that struck a chord all over the world: ‘On the Phenomenon of Bullshit Jobs’.

Graeber, who later extended the article into a book, was struck by the number of jobs thought pointless even by those who did them. He pondered Keynes’ much-quoted prediction that we would all work 15 hours a week by the year 2000, and noted capitalists’ aversion to spending money on unnecessary jobs (or even necessary ones: ‘No – give me back my fucking money!’ Trump reportedly raged on finding he was supposed to employ a transition team when moving into the White House). So what was going on?

Graeber was acute in nailing the proliferation of non-jobs, but less so at explaining it. In fact the situation is more insidious than his version, if admittedly duller. It is not, as he suggested, primarily the result of ‘managerial feudalism’ (employing flunkies to big up your status), nor a dark plot by the ruling class to keep workers out of mischief by insisting on the sanctity of work even when it is valueless, although that is an outcome. Instead it is the predictable consequence of our current destructive management beliefs and the work designs they lead to.

The reasons are fairly simple. Since companies put their own short-term interests above those of society, there is constant friction at the margins of what’s legal or at least acceptable. Pushing too far leads to scandal (Enron), crash (Lehman) or both (2008), and, as sure as night follows day, regulation to bolt the door after the departed horse. As John Kay wearily explains, ‘We have dysfunctional structures that give rise to behaviour that we don’t want. We respond to these structures by identifying the undesirable behaviour, and telling people to stop. We find the same problem emerges, in a slightly different guise. So we construct new rules. And so on. And on. And on.’

As regulation gets ever more complicated, it evolves into an industry in its own right, with its own vested interests and bureaucracy – a monstrously growing succubus symbiotic with the industries it is supposed to control. You can watch the process playing out again in Silicon Valley now. ‘Facebook puts profits above care for democracy’, proclaimed the FT in a recent article. Of course it does: that’s what managers have been taught to do. The demand for regulation is steadily building as a consequence.

Don’t get me wrong – Big Tech needs reining in as urgently as Big Finance. But as a manifestation of a bigger problem – the ‘dysfunctional structure’ that generates regulation that is simultaneously necessary and useless – the only solution is to reduce the need for regulation in the first place by placing a duty of care on companies for the society they form part of. In other words, regulatory jobs are net energy and value-sapping jobs which shouldn’t exist – the creation of philosopher John Locke’s madman, ‘someone reasoning correctly from erroneous premises’. As Peter Drucker put it, ‘There is nothing quite so useless as doing with great efficiency something that should not be done at all’.

And here’s the thing. The dysfunctional structure is fractal, replicated at every level down through the organisation. Since it assumes at least some workers, including managers, will shirk and skive, management is geared for control rather than trust. Low-trust organisations run on rules, surveillance and performance management – which through the process of self-fulfilling prophecy actually makes untrustworthy, or at least unengaged, behaviour more likely. Look no further for the cause of the apparent paradox, noted by Graeber, that bureaucracy proliferates just as much in the supposedly lean and efficient private sector as in the public. In effect, each company carries the burden of its own regulatory apparatus. In 2016 Gary Hamel estimated that excess bureaucracy was costing the US $3tr a year in lost productivity, or 17 per cent of GDP. Across the OECD, what we might call the ‘bullshit tax’ amounted to $5.4tr. ‘Bureaucracy must die!’ proclaims Hamel. Yet he concedes that despite his campaign, it seems to get worse, not better.
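As a quick consistency check on Hamel’s numbers – a minimal sketch, assuming his $3tr of lost productivity and the 17 per cent share refer to the same year’s US GDP – the implied base is an economy of a little under $18tr, which is roughly where US GDP stood in 2016:

```python
# Sanity check on Gary Hamel's estimate quoted above: if $3tr of lost
# productivity represents 17 per cent of GDP, the implied GDP base is
# about $17.6tr, in the same ballpark as actual 2016 US GDP.
lost_productivity_tr = 3.0   # Hamel's estimate of the cost of excess bureaucracy, $tr
share_of_gdp = 0.17          # the share of GDP he attributes to it

implied_gdp_tr = lost_productivity_tr / share_of_gdp
print(f"Implied US GDP: ${implied_gdp_tr:.1f}tr")  # -> about $17.6tr
```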

Finally, with the ideology of Public Choice, the same pessimistic assumptions and stultifying management structures have been visited on the public sector in the form of New Public Management, with much the same results. Marketisation has added a further overlay of bullshit. Symptomatic is the experience of the university sector: compare the stagnant salaries and worsening conditions of academic staff with burgeoning jobs (and salary levels) in administration and management (especially at the top) and the creation of entirely new departments concerned with branding, PR and massaging the all-important student satisfaction figures – an enormous increase in pointless overhead on the real work of turning out critical citizens who can distinguish real value from hot air.

Putting all this together, it is hardly surprising that the US and UK, as the most extreme proponents of deregulation and privatisation, are, with delicious irony, more subject to this systemic bureaucratisation than other less laissez-faire economies. So much so that it is tempting to characterise the UK in particular as a bullshit economy. Having largely abandoned manufacturing, it prides itself on being a purveyor of financial and professional services selling advice and other products whose social value is dubious, to say the least. The extreme and paradigmatic case is advertising. ‘The UK advertising industry,’ a recent House of Lords report solemnly intoned, ‘is a success story. Advertising fuels the economy by helping businesses to grow and compete against one another. It is also a significant sector of the economy on its own. The UK, especially London, is a global centre for advertising, exporting services to clients around the world,’ and plenty more in the same vein.

Well, maybe. But in its own terms, as senior adman Dave Trott succinctly told a BBC Radio 4 audience recently, of £23bn worth of ads purchased annually in the UK, ‘4 per cent are remembered positively, 7 per cent are remembered negatively, and 89 per cent are neither noticed nor remembered at all’. Let that sink in a minute. £20bn of ads that might as well never have been created – that is bullshit of an awesome order.
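For anyone who wants to see where the £20bn comes from, here is a back-of-envelope check – a minimal sketch assuming Trott’s £23bn annual spend and the percentage splits quoted above:

```python
# Back-of-envelope check of Dave Trott's figures as quoted above
# (assumes the £23bn annual UK ad spend and his percentage splits).
total_spend_bn = 23.0          # annual UK ad spend, £bn
remembered_positively = 0.04   # share of ads remembered positively
remembered_negatively = 0.07   # share remembered negatively
unnoticed = 1 - remembered_positively - remembered_negatively  # 0.89

wasted_bn = total_spend_bn * unnoticed
print(f"Neither noticed nor remembered: {unnoticed:.0%} of spend, "
      f"or about £{wasted_bn:.1f}bn of the £{total_spend_bn:.0f}bn total")
# -> roughly £20.5bn a year: the ads that might as well never have been created
```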

Bullshit generates more bullshit. ‘The best minds of my generation are thinking about how to make people click on ads’, one Silicon Valley techie accurately noted. ‘And that sucks.’ Or about spin and fakery – another British ‘success story’ that bloats as newsrooms shrink. PR people now outnumber reporters five to one, compared with two to one 15 years ago. Which is why this kind of bullshit/bureaucracy is so hard to root out. It’s what happens when economic incentives are out of line with society’s interests. It’s not a bug in the system – it’s a feature. It won’t change, in other words, until everything changes.

Corporate reform grows in unlikely places

It’s mea culpa time. After a grief and denial phase, the growth of populism is producing a rare outbreak of handwringing among the liberal elite, as we now have to call them, as they own up, at least partially, to their part in bringing about the angry, polarised world that we now inhabit. Theresa May was the first to put her hand up with her ‘barely managing’ and ‘capitalism for all’ on the doorstep of No 10 after the botched election of 2017, but all that has long since disappeared into the black hole of Brexit (from which, pace the late Stephen Hawking, nothing ever returns).

More recent owners-up include former Treasury Secretary Larry Summers, who in a bizarre and awkward FT piece recounted his astonished discovery of ‘the way of life’ of ‘the rest of America’ on a two-week transcontinental car trip this summer – a wonderful example of class cluelessness. And The Economist, where editor Zanny Minton Beddoes penned a 10,000-word manifesto for liberalism in which she lamented, rightly, that too many liberals had turned conservative, shunning calls for bold reforms to an economic neoliberalism out of which they had actually done rather well.

Beddoes puts forward a number of proposals to put liberalism on track, ranging from upholding free trade to moderating immigration, enforcing competition policy and dreaming up a new social contract. But although she is right about the need to do something about ‘left behind’ places and people, nowhere, surprisingly, is there an acknowledgement of the extent of the challenge to liberal ‘business as usual’ from the pincer jaws of neoliberal global financialisation on one side and what economic historian Carlota Perez insists is the burgeoning fifth (not fourth) industrial revolution comprising the internet and mobile communications on the other.

What no one has picked up is one of the most obvious things of all. Whether what we are going through is the fourth or fifth revolution, it is different in one crucial sense from all those that have gone before. Right up till this one, economic incentives have largely been aligned with the interests of wider society. Broadly speaking, corporate growth led to prosperity through the creation of well-paid, full-time jobs.

Two important things have happened to change that. First, economic incentives have been yanked round to pull in a different direction, encouraging businesses to treat human employment as just another means to the end of enriching shareholders – another cost to be minimised.

Now, companies are still growing all right; but they only employ people when they have no alternative, and then preferably on minimum pay and zero hours. Look at Uber, the totemic platform enterprise, which is rushing as fast as it can to perfect autonomous vehicles that would spare it from employing anyone at all, bar a few economists and quants to fine-tune its surge-pricing algorithms. This kind of work is not a reliable route out of poverty, and growth can no longer spread wider prosperity when big companies are spending 90 per cent of their earnings on stock buybacks for the benefit of shareholders. The sums are stupendous: over the last decade Apple has spent $102bn (with another $210bn to come!); Microsoft $878bn; Cisco $228bn; Oracle $67bn; JPM Chase $63bn; Wells Fargo $56bn; Intel $55bn; Home Depot $51bn. Meanwhile real US wage levels have barely budged in getting on for half a century.

The second thing that has changed since the last great growth surge is the power and wealth of the largest corporations, and the monstrous accumulation of vested interest that has resulted, knitting together a formidable fellow-travelling ecology of consultancies, business schools and investment funds, which together have effectively captured the political process. ‘There is no force on earth that can stand up effectively, year after year, against the thousands of individuals and hundreds of millions of dollars in the Washington swamp aimed at influencing the legislative and electoral process,’ former Fed chairman Paul Volcker declared in the New York Times recently.

As long as economic incentives conflict with society’s interests, prospects of dealing with the immediate Frankensteins of rampant inequality and the desertification of the jobs market, let alone resolving major problems like climate change and shrinking bio-diversity, are grim. Other adjustments, both social and institutional, will be needed too. But there will be no lasting solutions until business and social needs are pulling in the same direction. That means altering the incentives. And given the weight of the aforementioned vested interests and lobbying power, only the most determined effort will prevail.

This is why Elizabeth Warren’s Accountable Capitalism Act, now before the US Senate, is cautiously encouraging. Warren, a hard-nosed Democrat law professor who is mulling a run at the presidency in 2020, knows business, having worked in bankruptcy and consumer protection; she also grasps that the big issue is not identity – it’s the economy, stupid.

Warren’s bill is radical and simple. It would set up an Office of US Corporations which would require the biggest firms to adopt a federal charter mandating them to consider the interests of all stakeholders – workers, customers and communities – and not just shareholders. Workers would elect 40 per cent of the board, and there would be restrictions on the way executive stock options (which have played a huge part in the systematic enrichment of the executive class) are exercised.

Warren’s bill of course has little chance of making it into law any time soon. But it is being taken seriously and draws on a number of strands of public approval, notably worker representation in boardrooms. Not only that: the concept of ‘accountable capitalism’ can be read as a legislative response to the much-remarked call of Larry Fink, head of the world’s largest investor, BlackRock, for firms to show they were making a positive as well as financial contribution to society. ‘Companies must benefit all of their stakeholders, including shareholders, employees, customers, and the communities in which they operate’, he wrote to CEOs earlier this year.

Warren and Fink are a powerful duo. Their initiatives should give much-needed heart, and sinew, to British progressives, who are now paying the same high price as US Democrats for backtracking from corporate reform before the Crash (yes, Tony Blair, we do mean you). A similar approach by the two major shareholder-dominated economies would give a much better chance of making reforms stick – and is probably the only hope of overcoming the abject funk of Westminster and Whitehall at the possibility of being labelled anti-business. Perhaps the UK has more riding on the outcome of the US mid-terms than we thought.

CEO activism is a dangerous game

Time was when trying to get CEOs to talk about anything more controversial than last year’s profits or the new marketing campaign was like drawing teeth. Even on matters that affected them directly, like tax or budget changes, they preferred to let their organisations ventriloquise for them. But that is changing – not so much in the UK (where even Brexit and Corbyn’s worker-share proposals have failed to evoke more than a few strangulated yelps from the nation’s boardrooms) as in the US, where a recent wave of ‘CEO activism’ has caused excited comment.

Several factors are driving the phenomenon. On one side, in a bitterly divided post-trust world, the demand for figures able (or at least willing) to supply answers to existential questions has never been more intense, especially among needy, brand-conscious younger employees – which puts their CEOs squarely on the spot. Who, on the supply side, are increasingly pleased to step into the spotlight – partly because they feel they have an undeniable right to strong opinions on controversial issues, and perhaps more cynically because they are confident of reaping the rewards in terms of exposure and lobbying influence available to those able and unsqueamish enough to exploit an increasingly competitive celebrity culture. Whatever the reasons, over the last two years CEOs have become ever bolder in speaking out on issues including immigration, the environment, gun law, LGBT rights, inequality and race and gender relations, to name the most prominent.

They could plausibly argue that they have little choice. In the world we live in, controversy is unavoidable. Yet it is uncomfortable, even dangerous, territory. As too many have come to realise, the first law of celebrity is that it is as easy to die as to live by. In the age of social media, a reputation can be, and often is, swept away by a twitterstorm overnight. Ask Elon Musk, Travis Kalanick, Elizabeth Holmes at Theranos. United Airlines only survived the ‘unfortunate incident’ when it brutally removed a doctor on an overbooked flight from Chicago by making a grovelling apology and paying an undisclosed sum in compensation, sacking two officials and upping the offer to passengers willing to give up their seat from $400 to $10,000. To state the obvious, high-profile CEO interventions come with consequences, sometimes costly, first and foremost for direct stakeholders and rippling out to the wider society. In turn, that requires them to weigh those consequences carefully – which only gets them into even deeper moral waters. Calculating relative costs and benefits of standing up for a cause for, say, shareholders and society is not only difficult – even in the unlikely event of it being clear cut, it is a hopelessly unreliable guide to action. As John Kay (for the record, an economist) helpfully put it, ‘ethics is about what you do when good behaviour and profitable business are not necessarily the same thing.’ Damned if you do; even more damned if you don’t.

There is an even more dangerous side to CEO activism, however – and it is playing out in front of us every day. It’s a fairly small step from activism in public affairs to wanting to shape or even control them (that’s surely the point). For where this can lead, look no further than the US White House, the antics of whose current occupant suggest that the direct import of business into government makes for explosive, possibly nuclear, results. The temptation to wish Trump out at any cost is strong; but is the prospect of the only viable opponents consisting of other businessmen (Bezos, Zuckerberg, Bloomberg, anyone?) much more enticing? Giant companies and the very rich already have much too much sway, both direct and indirect, over our lives. What we need is a straight alternative to the money- and profit-centric view of life that has got us into the current mess, not just a less voracious version.

The irony is that there is one obvious area where CEO activism would not only be welcome but has been awaited in vain for years. It is an area that, unlike many other topics, business is uniquely well qualified both to speak and to act on – and it would benefit everyone, not just sectional interests. Give up? It’s business itself.

Long before the Great Financial Crash in 2008, it was clear to everyone not wearing earplugs that the tocsins were ringing. Shareholder value maximisation, that simplistic and treacherous mantra, was a corrupt, busted flush, benefiting only chief executives loaded with share options and hedge funds that take no thought for the health of the overall system. In an angry review of Deborah Hargreaves’ ‘devastating’ take-down of executive pay, Are CEOs Overpaid?, Margaret Heffernan, herself an entrepreneur and business leader, charges that the real failing of the current generation of Anglosphere CEOs is not, as so often posited, greed. She writes: ‘I’m not sure chief executives are merely greedy. What I’ve seen, in the US and UK, is more disheartening than greed. These men – and they are mostly men – are not leaders but followers. They are afraid to step out of line and set a better example. Instead, accepting their huge salaries, they hide behind an old, discredited alibi: everyone’s doing it.’

Coming out for minorities is well and good, but away from the public eye many of those worst treated are toiling within the firms that CEOs speak for. From rock ‘n’ roll to grunge, commerce has always been quick to spot the possibilities in coopting rebellion – see Nike’s recent ‘Just Do It’ ad campaign featuring Colin Kaepernick, the mixed-race American footballer who initiated the practice of black players kneeling during the national anthem as a protest against racial injustice. For Nike, causing controversy was part of the point. CEO activism could, and almost certainly will, be read as something similar – cynical virtue-signalling or personal brand advertising – unless CEOs first deploy it to put their own damaged house in order.

And we’ll take the low road

On 15 September, economist Mariana Mazzucato tweeted:

Great 2 days in San Francisco for West Coast launch of Value of Everything [her new book]. But the number of suffering homeless people on the street was much greater than I have ever seen and left bitter taste. Only one word can describe it: barbaric. Humans in 21st capitalism deserve better.

Her equally eminent colleague at UCL’s Institute for Innovation and Public Purpose, Carlota Perez, the analyst of great technological surges, replied:

@MazzucatoM Homelessness, precarious zero hours and gig economy contracts plus the many that have dropped out of the workforce make a mockery of the current celebrations of full employment in the US and UK. We had better start looking at real reality in the face

Their vignettes made a deft counterpoint to a piece the same week by Sarah O’Connor, the FT’s first employment correspondent for decades, entitled ‘Workers have right to gig economy that delivers for 21st century’. It carried the self-explanatory subhead, ‘Flexible working is touted as the future but too often resembles an exploitative past’.

It is worth reading, and sits easily (or make that ‘profoundly uneasily’) beside recent books like James Bloodworth’s Hired: Six Months Undercover in Low-Wage Britain or Jeff Pfeffer’s angry Dying for a Paycheck. Besides detailing the daily tribulations of those trying to make a living in the new precarious economy, she made the wider point that for many of them the tech-enabled ‘flexible labour market of the future’ resembled nothing so much as the bad old piecework system of the past.

Quite starkly, the labour market is dividing in two. In future a minority will enjoy high-paid, full-time jobs, while the lot of a growing segment of the working population will be self-employment, part-time work and gig employment with little security and precious few prospects. Worryingly, as O’Connor wrote,

‘The UK’s low-paid sectors are 20 to 57 per cent less productive on average than the same sectors in Belgium, France, Germany and the Netherlands. Take the low-productivity activities that have recently been “reshored” to the UK, from manufacturing screws to stitching £7 dresses. A developed economy that finds itself sliding back down the global value chain is an economy where something is going awry’.

What’s going awry shouldn’t come as a surprise. In 2012 Guardian economics editor Larry Elliott and Dan Atkinson summed up their view of the situation in a book called Going South: Why Britain Will Have a Third-World Economy by 2014. Widely brushed off as an exaggeration at the time, the book may have got the date wrong, but otherwise the jeremiad looks depressingly accurate. One of the points the authors made was that given the option the UK economy unerringly took what looked like the easy way out in the short term, only to find that in the long term it simply reinforced the narrative of decline.

Not one but two examples of this lethal addiction figured in the press last week. The first was by Elliott himself, who brought his 2012 analysis up to date by noting that it took a rather special interpretation of success to qualify UK labour-market flexibility as such. He pointed out that in the much less flexible market of 1975, at least work paid – unlike today, when it is anything but a reliable route out of penury: as witness the fact that two-thirds of the UK poor live in households in which at least one person works. True to form, employers have seized on the negative, cost-cutting possibilities of ultra-flexibility with relish, accentuating the downward slide that Elliott-Atkinson identified in 2012. Britain, Elliott wrote last week, ‘now appeared to be permanently locked into a low-wage, low-skill, low-productivity economy, in which workers compensate for a lack of earnings power by taking on more debt’.

Meanwhile, back in the FT, economics editor Chris Giles was mercilessly rehearsing the sorry story of British devaluation, the nation’s fall-back macro-economic easy way out. In 1948, he recounted, one pound bought more than $4 and DM13.5. That compares with today’s $1.30 and the equivalent of DM2.2. Over the 70-year period, only Canada has grown more slowly than the UK among the G7, and we have underperformed the eurozone since it was formed in 1999.
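To put those exchange-rate numbers in proportion, here is a rough sketch of the scale of the slide, using only the 1948 and present-day rates Giles quotes:

```python
# Rough scale of sterling's 70-year slide, using only the rates quoted above.
usd_1948, usd_now = 4.0, 1.30   # dollars per pound, 1948 vs today
dm_1948, dm_now = 13.5, 2.2     # Deutschmarks (or the DM-equivalent) per pound

def decline(old, new):
    """Percentage fall in what one pound buys."""
    return (1 - new / old) * 100

print(f"Fall against the dollar: {decline(usd_1948, usd_now):.0f}%")  # about 67%
print(f"Fall against the D-Mark: {decline(dm_1948, dm_now):.0f}%")    # about 84%
```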

Cheaper sterling is ‘no route to prosperity’, concluded Giles, qualifying the most recent in the series, the 20 per cent depreciation of sterling since late 2015, as particularly disappointing. There has been no boost to exports, and no import substitution. Confirming the week’s low-road consensus, he noted that ‘a mini revival in manufacturing employment is overwhelmingly in making simple, low-productivity products such as food or metal goods such as radiators, cutlery and screws’; all in all, ‘the latest depreciation is challenging to be the worst in British history’.

That was all in one week. And we haven’t even mentioned Brexit. It was left to O’Connor among our doom correspondents to attempt to pull something positive out of the wreckage. The ‘good news’, she opined bravely, was that ‘the best possible time to reform a labour market is when unemployment is low and bargaining power is naturally on the rise’. With the labour market at its tightest since the 1970s, surely now was the time to drag it into a high-pay, high-productivity future. ‘Brexit uncertainties are not a reason to drift and dither, but an impetus to act’.

Mmmm. Well. She’s right in theory, of course. But in a week which has picturesquely reminded us that we seem unable to implement a railway timetable, let alone a political one, all one can really say is: good luck with that.

Branch economy blues

One of the craziest things about Brexit (I know, there’s plenty of competition) is choosing to impose it on what is now a branch economy of the rest of the world.

With its chronic trade gap (a record 5.2 per cent of GDP in 2015), the UK has long been dependent on ‘the kindness of strangers’, as Bank of England governor Mark Carney memorably put it last year, to cough up the difference when confidence in the economy is weak and sterling under pressure.

This vulnerability has long been camouflaged under weasel words about Britain being ‘open for business’ and ‘a magnet for inward investment’ because of its free-for-all market for corporate control and ‘flexible’ labour market. Indeed, a bit like Northern Rock basing its business model on tapping the day-to-day money markets rather than doing the work of attracting long-term depositors, successive governments have made a deliberate policy of soliciting easy money from fair-weather friends by offering low corporation taxes and non-dom residence deals for individuals.

But relying on inward investment is a dangerous game. In this case, foreign investors have enthusiastically taken us at face value (why wouldn’t they?), snapping up many of our most advanced firms and leaving behind a few branch assembly plants and, if we’re lucky, localised R&D facilities to take advantage of the UK’s cheap labour. The family silver having long been flogged off, what remains is a few workaday pots and pans.

As a former adviser to George Osborne, Lord Jim O’Neill, recently noted, as a strategy to improve Britain’s competitiveness the policy has been a flop. Investment has remained ‘very weak absolutely and relatively to many other countries. Despite this remarkable drop in ongoing corporation tax and rising profitability’ – not to mention a weak exchange rate – ‘… it’s not done the job it’s supposed to do.’ These days the biggest manufacturing sector of the erstwhile workshop of the world is food processing.

Meanwhile, few substantial companies today are wholly domestically focused, and to that extent their loyalty to one geographical territory is limited. For some large firms (WPP and HSBC come to mind) switching nationality is barely more emotionally charged than changing an overcoat. But as President Trump has discovered, even for an entity as iconically and profoundly American as Harley-Davidson there comes a point when hard economics has to take precedence over national sentiment, however deeply felt.

In the UK’s case, the City, the motor industry and suppliers to the aerospace and defence industries – in other words, some of the country’s last purveyors of traditionally skilled jobs and regular pay packets – are first in line if tariffs and border controls go up when we leave Europe, making a mockery of the idea of Brexit as a means of taking back control of our economic destiny. Rather, it reveals its fragility.

Airbus – which directly employs 14,000 in the UK and whose supply chain supports an estimated 110,000 more – has spelled out the consequences. ‘Until we know and understand the new [post-Brexit] relationship,’ it said in a recent internal risk assessment, ‘Airbus should carefully monitor any new investments in the UK and should refrain from extending its UK suppliers/partners base.’ If the UK leaves without a deal, it will be obliged to ‘reconsider its footprint in the country, its investments in the UK and its dependency on the UK’. Even a planned departure will impose a penalty in terms of red tape and operational friction.

The wider significance of this is hard to overstate. As James Bloodworth eloquently recounts in Hired: Six Months Undercover in Low-Wage Britain, his sobering account of spells working for Amazon, Uber and as a carer for the elderly, swathes of the UK’s small-town hinterland have never recovered from the last great industrial retreat in the 1980s.

What replaced factories and mines, and their supporting services, were first call centres, then retail and warehouses, now increasingly delivery and care homes; full-time jobs became part-time, then self-employed and finally zero-hour gigs; and pay packets have shrunk accordingly. For all sorts of reasons, those industrial jobs aren’t going to come back, but their careless abandonment has come back to bite us with a vengeance. The delayed price we are all now paying in terms of personal disengagement and despair, the unravelling of communities and local pride, and not least political unrest is higher than anyone imagined at the time. Call it ‘shit life syndrome’.

Ironically, Theresa May’s instincts after the botched election in 2017 were the right ones. Her espousal of industrial policy and workers on boards apparently aimed to address some of the long-term underlying economic discontents that were eventually manifested in Brexit. But the fraught process of leaving has sucked so much energy out of politics that none remains to do any of the things necessary to make it a success, or at least limit its failure. When Northern Rock’s access to short-term cash was cut, its business model collapsed and it went broke. The UK economy won’t fold like an insolvent building society. But make no mistake: Brexit puts its business model (such as it is) on the line. In the absence of deliberate action to strengthen it, all we have is hope that in the long term it doesn’t suffer the same fate.

Signals at red for UK management

You have to admit that there’s nothing, not even a royal wedding, that the British do better than a management cock-up.

The great rail timetable revision was already an Olympic-class act to set alongside other collectors’ items such as Big Ben and the crumbling Houses of Parliament, Grenfell Tower, computerisation disasters like NPfIT and Universal Credit, and of course the mother of them all, Brexit.

The story of a Newcastle to Reading cross-country train getting lost and ending up in Pontefract may have been a bit embroidered, but not the accounts of undelivered trains, untrained drivers, the wrong kind of holes for the masts bearing overhead cables on newly electrified lines, pantographs that were too short to touch the power lines once they were installed, not to mention communications that could have been bettered by pigeon post between a hapless Department for Transport, Network Rail, where timetabling is centralised, and train operating companies which seemed unable to tell one end of their units from the other.

All this was bad enough, making the UK a laughing stock in modern European nations like France, Germany, Switzerland, Belgium and even Italy where efficient, on-time public transport is taken for granted rather than a matter for dazed congratulation (even with a strike on, the French SNCF does a better job of timekeeping than we do running normally). But the award of a CBE to the outgoing chief executive of Network Rail in the middle of what one commentator described as ‘the most chaotic, fundamental, and humiliating failure it has been my misfortune to witness in 40 years as a rail journalist’ took the biscuit, adding a grade-A PR disaster to the unending litany of operating failures. Do we laugh or cry?

Whatever, we shouldn’t be surprised. If there’s one sector that sums up all the debilitating British talent for dither, fudge, short-termism and political zigzag it is the railways. Even the excuses put forward for chronic underperformance and overcost are as rickety as the bus-on-bogies ‘Sprinter’ trains left over from the 1980s.

The fact that the UK’s rail network is the oldest and reputedly most complex in the world should have given us a priceless advantage in terms of both managerial expertise in running a railway and industrial expertise in building an industry around it.

But while France, for instance, singlemindedly electrified its network starting in the 1950s, and launched its high-speed network (with trains that were partly British designed) in 1981, the UK bumbled along on a mixture of steam, diesel (preferred, because it was cheaper) and various kinds of electrification until the 1970s. Even now only 40 per cent of the network is electrified, and as late as last year the government was cancelling long-promised projects to complete an ‘electric spine’ in Wales and the Midlands because it would cost too much. Some observers believe electrification is now so expensive that even for main lines it will never be completed.

The legacy of these stop-start projects and failure to invest long term is not just a network that is incoherent and overcomplex technologically (it sometimes seems surprising that it’s all standard gauge) but also a paucity of engineering knowhow that has to be recreated or reimported for every new programme. This is directly responsible for some of the timetabling chaos. Instead of experienced teams using familiar technology moving seamlessly from one electrification project to another, each has to be created anew every time. The result: huge time and cost overruns, leading to more cancellations and disruption. As for train-building innovation and capacity, it is now almost nil.

The final and perhaps biggest handicap for the UK rail industry is its byzantine structure and governance, if it can be called that. Based on a mixture of ideology, political expediency and operating convenience, it features a sort-of-nationalised Network Rail that owns the infrastructure, in semi-permanent warfare (and expensive monthly legal negotiation) with train operating companies that have more regard for shareholder profits than passenger needs, all overseen by a jittery DfT that interferes at every turn. The result is a uniquely British muddle that once again manages to lock in the worst of both private and public sectors and the advantages of neither. As John Harris put it in The Guardian: ‘[It] is yet another example of one of the great ironies of recent history: that Thatcherite believers in the liberating wonders of markets have proved to be very good at creating byzantine, top-down, endlessly failing systems rather suggestive of the worst aspects of the old Soviet Union’. Small wonder that the UK rail industry’s operating costs are reckoned to be 40 per cent higher than in the rest of Europe.

There’s another irony. There is a respectable case for arguing that the broken-down physical infrastructure mirrors, and is at least partly responsible for, the social and political fracturing that divides the rest of the country, particularly the north that has suffered the brunt of the great rail timetable meltdown, from the better-provided capital. If the politicians had noticed, we might never have become embroiled in the management omnishambles that is Brexit, which will not do one jot to remedy the real grievances of the austerity-hit regions. Yes, many weary Britons would surely be tempted to vote for someone who just promised to make the trains run on time.

Corporate killing fields

Layoffs are in the news. BT is axing 13,000 middle managers and back-office staff. Marks & Spencer is closing 100 of its high-street shops, mainly big ones, with as yet unquantified job losses. In the US Tesla followed up a spate of executive departures with the announcement of a broad reorganisation in an attempt to bring down record losses at the company as it ramps up production of its Model 3 saloon.

There is a glaring omission in the reporting of these events. Although they merit plenty of coverage, it all centres on the future of the company, the future of the CEO and even the future of middle management. But completely absent from media or public concern in such episodes, points out Jeff Pfeffer in his urgent and angry new book, Dying for a Paycheck, are those most directly affected by the decision – people losing their jobs.

As anyone who has suffered it will testify, being sacked is one of life’s more bruising experiences, up there with divorce and loss of a family member, and its personal sequels – anxiety and depression, substance abuse, family breakdown – are often similar.

Yet as Pfeffer, one of the most respected US academic researchers, shows, this is far from the full price exacted. Those who keep their jobs pay heavily for the redundancies too. Because it is easy and there is no disincentive – companies do not pay the knock-on costs of redundancy; society does – firms invariably take out more people than they do work. The result is added pressures on the survivors, and in some cases dysfunction for the company as it transpires later that the amputated middle managers were an essential repository of corporate memory and order. What’s more, if the company’s underlying problem is lack of orders rather than excess costs – as at M&S – it’s not obvious that getting rid of people is a useful or relevant response. A smaller company is just smaller, not necessarily better. The damage layoffs do often far outweighs the ‘benefits’.

As Pfeffer demonstrates, the toxic effects of layoffs are replicated in other common workplace arrangements such as shiftwork, long hours, jobs with low control, little social contact and poor employment security. All of which are on the increase as work becomes more contingent and fragmented. To put it bluntly, the human consequences of modern performance management in the form of overwork, stress, lack of work autonomy, low pay, inequality and precarity, mean that for many just going to work is as dangerous to health as breathing second-hand smoke.

The costs are mind-bending. Pfeffer and his co-researchers estimate that toxic workplaces are responsible for 120,000 excess deaths a year in the US, making employment the country’s fifth largest killer, at an additional $2bn cost (at least) to the health system. Let that sink in. Most absurd and tragic, these effects are not only preventable, but preventing them would benefit employers too. Writes Pfeffer: ‘Unhealthy workplaces diminish employee engagement, increase turnover and reduce job performance, even as they drive up health insurance and health-care costs. All too many workplaces have management practices that serve neither the interests of employees nor their employers, truly a lose-lose situation’.

Europe is not as bad as the US, where employees largely depend on employers for healthcare coverage, but the UK with its proud emphasis on ‘flexibility’ (aka ‘insecurity’ for employees) is not far behind. So how on earth have we come to this pass? After all, we’ve abolished child labour. Physical accidents at work have been steadily reduced by health and safety regulation: workplace fatalities have fallen by two-thirds in the US since the 1970s. No one argues that health and safety, food and drug quality, or environmental pollution should be left to CEO discretion. So how come they can slave-drive their employees at work without anyone even noticing?

‘What kind of a company keeps you away from your family for that amount of time?’ asked a senior executive bitterly of a routine that had him travelling 200,000 miles a year and on the road for three weeks at a time (GE, since you ask). Or pressures an executive on maternity leave to return to work two weeks after giving birth to make a keynote presentation at a corporate event (Salesforce)? Or apparently without irony recommends taking ‘a call from a client while having sex’ (freelance site Fiverr)?

Pfeffer caustically details the care lavished on tree-planting in the grounds of his university, Stanford, at a time when hundreds of people were being made redundant in the financial crisis. He comments: ‘At Stanford, you were better off being a tree than an employee. At too many workplaces, trees…fare better than people.’ Why? Why is there no business school research or teaching on this stuff? Why is it apparently of no interest to HR departments, whether in public or private sector?

The obvious answer is that this stunning erasure of humanity is the delayed cost of the still unfolding catastrophe that was triggered by the hijacking of the corporate form by the shareholder interest in the 1970s.

The doyen of management gurus, Peter Drucker, warned that the corporation was too important for society as a whole for its control to be monopolised by any one constituency. He was right. When the sole end of corporate activity is to maximise shareholder value, everything else becomes a means. In company accounts, humans are a cost, not an asset – just another resource to be exploited and disposed of, with the wider costs relentlessly exported to society as a whole.

In such a context, technology will intensify and speed the process. In the longer perspective, the gig economy is nothing new – just the logical end point of the unpicking of companies as human communities and their reconstitution, as Pfeffer notes, as the affectless, loyalty-less nexus of labour market contracts that the shareholder-primacy theory proposed – a remarkable if depressing piece of self-fulfilling prophecy.

One of the more poignant ironies – which Pfeffer himself has pointed out in another context – of this narrative is that according to repeated Gallup polls, more than anything else, most people in the world today desire a job and a paycheck. Perhaps that’s why we’re willing to put up with such shit – not to mention bullshit – when we get one. Pfeffer ends his important, indignant book as he begins it. If there is something special and sacrosanct about human life, he argues, then we should accept that it is indefensible morally to trade it for organisational considerations of cost and efficiency. The fact that cost and efficiency are also served by treating people properly – so that, to spell it out, the latter comes at no extra cost – is important but subsidiary. Most people have come to accept that environmental sustainability is essential for planetary survival. What about human sustainability?

British immigration policy is a monument to British mismanagement

The Windrush debacle – there is no word in English that does justice to its multilayered disastrousness – is some kind of apotheosis of hapless British management. First, there is the whelk-stall political version. As FT commentator Janan Ganesh notes, with its ceaseless ministerial musical chairs and unshakable faith in intellectual generalism, at the highest level the British state is unerringly ‘set up for low-key [sometimes not so low-key] shambles’.

It’s hard not to suspect that the replacement of Amber Rudd at the Home Office by Sajid Javid owes more to opportunism – smart move to have the son of an immigrant in charge of immigration policy – and the fact that, this being his fourth ministerial post in as many years, he is used to switching portfolios at short notice, than to any managerial prowess. After all, with less than a year in any one job, how could anyone tell?

So day-to-day continuity is provided by civil servants, a bleakly anonymous policy inherited from previous incumbents – and the uniquely unpleasant culture of the Home Office. You could hardly invent a better recipe for disaster.

It will not have escaped notice that Windrush is yet another in the UK’s unmatched line of grim targets fiascos. As usual, none of the right lessons will be learned. No matter how often it happens, ministers always dismiss the unacceptable consequences of target regimes as surprising one-offs that are the fault of bad apples, bad managers, or bad luck rather than their own management blindness. It’s true that Rudd was unlucky to be left holding the can – it could easily have been May – when the disaster that was waiting to happen actually occurred. But nothing she has said subsequently suggests she has a clue that the trouble with targets is systemic, or that she did anything to soften the fierce target culture that reigns at her now former department.

Many of the press reports on the saga fulminate about the incompetence of Home Office case workers. Of course the episode is incompetent politically, falling flat on its face in the court of public opinion, which in timely fashion has reasserted ‘British values’ of fair play and tolerance that Whitehall officialdom has casually abandoned. In fact it is much more sinister than that. The ‘hostile environment’ for illegal immigration and a ‘deport first, appeal later’ priority that can make someone illegal for making a minor mistake on a form are the outward features of an immigration policy whose purpose is effectively to find a way of saying ‘no’. To serve this end, incompetence, whether wilful or not, is an extremely effective means, and used as such along with inscrutable bureaucracy, arbitrary decision-making, excessively detailed forms and exorbitant fees charged for every form submitted. Since the purpose is to make people give up and go away, the fact that so many of the decisions are ‘wrong’ and overturned later by tribunal is beside the point. Officials have obeyed instructions and said ‘no’. We know that some Home Office staff receive bonuses; we don’t yet know to what extent they reflect achievement of targets set for enforced expulsions.

Of course, under austerity, saying ‘no’ has become the de facto purpose of many social services and indeed much of the public sector, including the NHS, where protecting budgets has become paramount. When Rudd admitted that a casualty of the Home Office regime was ‘the individual’ and that the process had taken priority, she was right, but also stating the bleeding obvious. That was the point. As she should have realised, this is a travesty of management, management used as a force for ill rather than good. For individuals, many of them UK citizens, what they receive from their government is not just ‘no’, but a side helping of indifference to individual circumstance that splits families, destroys livelihoods and leaves others homeless and destitute. It needs hardly saying that the corruption of government is just as deep. In a truly Orwellian inversion, under this reductive regime the Home Office, which ought to be the custodian of British values, has become its opposite, a purveyor of fear and paranoia worthy of the Stasi, while the DWP now dispenses not welfare but illfare.

Both are simply unworthy of a civilised nation. There must be a serious case for breaking them up and reconstituting them as institutions with a positive purpose in keeping with a country which once had a history of enlightened treatment of migrants.

At least there are three cheering things to come out of the whole sorry Windrush episode. The first is the astonishingly dignified response of the Caribbean leaders to the unfolding revelations. In the face of their restraint, UK ministers’ desperate attempts to find someone to blame simply shrivelled, leaving the government looking even more shifty and contemptible than before. The second was the public outrage at the story, which only strengthened the same effect. The third was the trigger for the furore, in the shape of brave and persistent reporting by the Guardian’s Amelia Gentleman – a vindication of the values of proper journalism at a time when surveillance and restrictive legislation have pushed the UK down to 40th place in current world press freedom rankings – another British value that badly needs reasserting.

Understanding measurement madness

Modern-day management is subject to several debilitating diseases, but the most damaging and pervasive may be the measurement fetish. As Jerry Z. Muller puts it in his new book The Tyranny of Metrics, we don’t just live in an age of measurement – ‘we live in an age of mismeasurement, over-measurement, misleading measurement, and counter-productive measurement’. Despite the manifest and mounting costs of measurement failure, there’s no sign of the fetish diminishing; if anything the reverse.

The first reaction on reading Muller’s concise and non-strident study is relief: we’re not actually alone, or mad, to believe we are surrounded by measurement madness. As Muller confirms, it is rampant in both the public and private sectors throughout the US and UK, the chief subjects of the book, and its unintended consequences – distortion of purpose and effort, fake figures, short-termism, discouraged risk-taking, innovation and cooperation, burgeoning bureaucracy, degradation of work and worker demoralisation, and costs both direct and indirect – are the same, equally huge, and equally disregarded, everywhere. As Muller remarks: ‘A question that ought to be asked is to what extent the culture of metrics – with its costs in employee time, morale and initiative, and its promotion of short-termism – has itself contributed to economic stagnation?’

Leading on from the first, the second reaction is a mixture of stupefaction and rage: in view of the evidence, including ‘a large body of literature’ dealing with, say, the problems with goal-setting, pay for performance (P4P), and performance rankings, why does this dysfunctional obsession persist and even grow, spreading like a virus from the source of infection in the US and UK to the rest of the Anglosphere and thence to other countries in Europe and the rest of the world?

Muller attempts to answer this question. ‘In a vicious circle,’ he writes, ‘a lack of social trust leads to the apotheosis of metrics, and faith in metrics contributes to a declining reliance on judgment’. He shrewdly notes that a metrics of accountability appeals equally but for different reasons to both the political right and left – the right being suspicious of public-sector empire building and protectionism (Yes, Minister, public choice theory), the left from the 1960s onwards distrusting authority and being convinced that leaving things to experts was to limply accept the prejudices of established elites. Both wings therefore wanted to make institutions more accountable and transparent, ‘using the purportedly objective and scientific standards of measured performance’, while the institutions themselves had no choice but to use similar performance data to defend themselves.

These tendencies were turbocharged both positively and negatively by related developments on the business front, where growing distrust of managerial motives led to the recasting of management into a system of goal-setting, monitoring and incentivisation, all dependent on standardised measured performance. New Public Management applied the same principles to the public sector, supplemented by consumer choice theory which held that giving people the right information to make rational choices was the basis of economic democracy. When this didn’t work (as it couldn’t), the response was not to change course but to try harder, using technology to measure more things and invent new rules about how to do it, giving another savage twist to the measurement ratchet.

The massive irony is that the choice of measures may be the most important thing that management does. And that being so it is subject to measurement’s all-powerful Catch-22: the point of measurement is assumed to be to supplant fallible human judgment and thus enable better decisions; but the choice of measures itself requires human judgment, and without it measurement is actually worse than useless – it misleads and hides the real situation, both from management and everyone else. This is why no one knows the real demand on the NHS or social care, for example; why politicians are puzzled to see services that win stars and plaudits from regulators and inspectors getting a resounding thumbs down from their constituents; and why none of the figures reported by the public sector – whether schools, universities, medicine or the police, all covered in the book – can be relied on.

The truth is that today’s metrics fixation has produced a culture of gaming and manipulation worthy of the Soviet Union – a resemblance that Muller does not fail to observe: ‘Just as Soviet managers responded by producing shoddy goods that meet the numerical targets set by their overlords, so do schools, police forces and businesses find ways of fulfilling quotas with shoddy goods of their own: by graduating pupils with minimal skills, or downgrading grand theft to misdemeanor-level petty larceny or opening dummy accounts for bank clients.’

In the age of Big Data and the gig economy where tasks are measured by the second, it is hard to see the measurement obsession diminishing any time soon. It’s not that the alternative to dysfunctional measurement – choosing measures that relate to purpose and that support rather than degrade professional judgment – is conceptually or practically more difficult; it just requires a different kind of thinking (and an implicit admission that previous methods were wrong) – which is why companies that model the alternatives are treated as one-off curiosities rather than exemplars to emulate.

But as a sobering reminder of how far we have strayed from the ideal of minimal (self-)management, consider the example of the Medical Research Council’s Laboratory of Molecular Biology in Cambridge. The most successful biological research lab in history, the LMB was set up after the war by Max Perutz, who had arrived in the UK in the 1930s as a penniless refugee from Vienna. Recognising that creativity, in science as in the arts, can be fostered but not organised, still less planned, since it arises from individual talent, Perutz saw his task as removing anything that got in the way of his recruits following their desire to do the best possible science. (Compare with Peter Drucker’s rueful observation that ‘So much of management consists of making it difficult for people to work.’) Lab administration was kept to a bare minimum: ‘No politics, no committees, no reports, no referees, no interviews – just highly motivated people picked by a few men of good judgment,’ as pharmacologist Sir James Black, another Nobel prizewinner, described it. The result of this recipe for anarchy? Not surprisingly, the lab became a magnet for scientific talent, and while the MRC may not have got much paperwork from Perutz in return for its cash, by the time he died in 2002 it had chalked up the extraordinary total of nine Nobel Prizes, four Orders of Merit and nine Copley Medals (the highest honour from the Royal Society). London Business School’s Jules Goddard comments that Perutz deserves to be feted as much for the brilliance of his management as for his scientific example.