My article on ‘The slow murder of the listed company’ for the winter 2017 edition of Professional Manager is here. Happy Christmas!
My talk on ‘The 21st Century Corporation’ at the Ellen MacArthur Foundation’s 2017 Disruptive Innovation Festival is here.
The 2017 Global Peter Drucker Forum, which took place earlier this month in Vienna, saved the best until last. Under the title ‘Growth and Inclusive Prosperity’, for the first day and a half it was its usual lively, eclectic and social self; Steve Denning’s notes on some major themes here. But on the final afternoon it burst alight.
The match was struck by Carlota Perez, the economic historian and development scholar, who painted a tantalising picture of what-could-be. Taking a 250-year view, Perez sees the four previous great technology revolutions – factories and canals, coal and railways, steel and heavy engineering, and cars and plastics – following a strikingly similar pattern. The first phase is enthusiastic uptake as entrepreneurs pile into the new technology, leading to a speculation-fuelled bubble (think successive canal, railway and auto manias in past surges). Irrational exuberance is followed by a sharp recession, even depression, as reality kicks in and ambition is scaled back.
This is the point we have reached in the fifth great revolution wrought since the 1970s by IT, telecoms and the internet. The initial internet bubble burst at the millennium, to be followed by the great casino bust of 2008. As the dust settles on the financial crash, the question now is: can we hope for a second phase of stable, sustained, broad-based growth – a new ‘golden age’, as Perez calls it – as has happened at each of the technological surges in the past? Yes, replies Perez. But if – and only if – in the new phase investment in the technology is production- rather than finance-led; there is a guiding ‘direction’ for the innovation effort; and supporting institutions evolve, or are put in place by governments, to underpin and ease the jolting social change such transformations set in motion.
Those are evidently big ‘buts’. Production and manufacturing, such as they are, still dance to the tune of Wall Street and the City. The only guiding direction for Silicon Valley’s new masters of the universe seems to be technology for its (and their) own sake. And most business today, at least in the Anglo-Saxon world, channels Milton Friedman in asserting it has no social obligation except to increase its profits. Its interests have diverged so far from those of the societies in which it is nominally embedded that it cannot be relied on to provide a way forward. Governments meanwhile are supine and timid, having bought hook, line and sinker the neo-liberal line that the market alone can provide, and are in any case way behind the curve. Supply-side education and training (as I keep saying), while valuable in themselves, have limited effect when companies only employ humans as a last resort. The one nod to institution-building is the emerging debate about a universal basic income – which although interesting looks increasingly like Silicon Valley’s attempt to offload what should be its own wealth- and job-creating responsibilities on to those who pay taxes, a category which does not include itself.
Yet the outlines of a new positive-sum game between business and society are tantalisingly clear. Says Perez: ‘The direction for innovation is clear: smart, green growth’ that would work within environmental constraints to bring further areas of the world into a new-look, carbon-free prosperity. For that, governments need to rouse themselves to extend the fiduciary duties of corporate managers and directors to a wider stakeholder group, prodding enterprise to embrace its proper function of innovating, as Peter Drucker put it, ‘to tame the dragon, that is, to turn a social problem into economic opportunity and economic benefit’. Management’s ‘job-to-be-done’ would switch from the paradigm of the last 100 years, efficiency, the foundation for mass-production and consumption, to effectiveness, which redefines prosperity as the availability of solutions to pressing human needs, and growth as the increasing pace with which they can be generated. Imaginative institutional reform would underpin the changes with a new social contract reshaping obligations and entitlements for the new era.
So far so good. Perez’s outline was greeted with enthusiasm. But that left one large question outstanding: how would we get from here to there? Above all, who was going to make it happen? Perez couldn’t answer that, and, perhaps predictably, the reaction was a resort to what might be called the great leadership lament: where were the great leaders of old, and who would step up to lead us to the promised land? It seemed for a moment as if the conference was stumbling to an uncharacteristically downbeat end. But that was to reckon without a second coup de théâtre courtesy of the very last speaker, social philosopher Charles Handy. Handy brought the curtain down and the hall to its feet (literally) by providing a frame for Perez’s ideas that was at the same time more daring historically and a direct call to action. Like the Catholic Church 500 years ago, Handy said, management was ripe for its own ‘95 theses’, or manifesto for reform, that would take on a corrupted institution and reinstate basic human values at its centre. As to where to start and who should lead – what better than the Drucker Forum to stand in for Wittenberg and Peter Drucker for Luther, or even Luther King, their message spread and amplified by every person in the room? His message received a standing ovation. A defining moment for the Drucker Forum? Judge for yourself: watch here.
Many Saturdays, I attend a furniture restoration workshop. Restoring battered pieces to life is inordinately satisfying in itself (as is the interchange with a richly varied, even eccentric cast of characters united only by their craft motivation). But it is also fascinating to reflect on in the context of current concerns over work and automation.
Could this kind of work ever be automated? Unlikely. It’s simply too analogue, too human, the timber (sometimes literally) too crooked.
John, the instructor/boss, has an array of carpentry and upholstery skills that can only be marvelled at. Some of these could be reproduced in machines, although only up to a point. But as impressive as the craft skills are the prodigies of improvisation brought to bear on some of the repairs. Sometimes I laugh aloud with incredulity at the exuberant ingenuity of the resolutely low-tech means used to accomplish a repair that to me looks impossible. ‘Nothing’s impossible,’ John instructs. Thus the contraption rigged up to mend and straighten the leg of a delicate chest brutalised in a previous repair came straight out of Heath Robinson, involving clamps, stray bits of wood, a steel straightedge, sellotape, twine, and a bit of blanket. This in turn illustrates why all material offcuts – everything – are kept, not in a spirit of meanness but the exact reverse: because of their potential, almost always eventually fulfilled, for imaginative and joyful reuse.
Two contrasting recent stories about industrial automation bring this extraordinary everyday human ingenuity into sharp perspective. One concerns Tesla, Elon Musk’s electric car company, as it attempts to ramp up production of its smaller Model 3 vehicle. Model 3 is crucial for Tesla’s ambition to move out of its high-end niche to compete as a mass-market manufacturer, and it had confidently predicted that by the end of 2017 it would be pumping out 5,000 of them a week at state-of-the-art California and Nevada facilities. Needless to say, automation is a key part of Silicon Valley’s assumption that it can reinvent car manufacturing, and Musk has boasted of his vision of lights-out plants with almost no humans in attendance at all.
That now seems a distant prospect. In its most recent update Tesla conceded that a series of teething problems and production bottlenecks had left workers struggling to produce cars by hand, with the result that just 260 Model 3s were completed in the last quarter. Closer reading suggests that Tesla is waiting to commit to the huge capital outlays necessary to get production up to the planned 10,000-a-week capacity until it knows it can hit the lower target – that is, until it can make the current levels of automation work. In other words Tesla – which is burning money, reporting a larger-than-expected loss in 2017 Q3 – is betting the farm on robots doing things better than humans. Some investors are beginning to think it is not a foregone conclusion. As Gary Hamel tweeted: ‘Software may eat the world, but hardware is eating Tesla. Turns out making cars is harder than coding. Who knew?’
Well, Toyota for one, which as Fast Company reports is taking a remarkably different approach to advanced automation. Backstory: Toyota, while vying to be the largest, has always been the most profitable of the major automakers. The Toyota Production System is one of the undisputed wonders of the management world, a living, evolving thing that represents more than half a century of organisational learning.
Nonetheless, in the early 2000s, Toyota went through a bad patch. It ran into quality problems as it chased the global number one spot, culminating in a loss in 2009, followed by hugely embarrassing product recalls and a $1.2bn penalty imposed by the US Justice Department. Out of the debacle emerged a revised production system, now known as the Toyota New Global Architecture (TNGA), and in 2015 a new head of manufacturing, Mitsuru Kawai.
But in a startling case of back to the future, the new regime represents a bold advance not towards more automation but less – a return to the human craftsmanship that the TPS was built on but which was neglected in the go-go years. Kawai worked his way up from the shop floor, and in line with the conviction that automation should ‘grow organically out of human innovation’, he has launched training exercises using string-and-sealing-wax methods to devise small improvements to workplace activity – a bit more sophisticated than the rigs in my furniture workshop, certainly, but recognisably similar in their reliance on craft and human ingenuity.
Kawai’s view is straightforward and radical: ‘Humans should produce goods manually and make the process as simple as possible. Then when the process is thoroughly simplified, machines can take over. But rather than gigantic multi-function robots, we should use equipment that is adept at single simple purposes’.
These sentences should be stamped on the brow of ministers, civil servants, CEOs – anyone in danger of succumbing to the idea that digital is the automatic answer to a business problem. If they had been we might not have wasted more than £12bn on failed NHS IT and hundreds of millions more on the grotesque inhumanities of Universal Credit, among many other examples. Robots are the apprentice, the servant, not the master; they are used not to cut costs but to free up people to do things better for customers. To rub it in: in tests, Toyota consistently finds that people can assemble cars faster than robots. What’s more, unlike machines they can improve their own efficiency and work quality.
Human by default, supplemented by frugal automation to do the boring bits that humans can’t improve on. It’s a formula that works for Toyota’s vast Georgetown plant in the US. It’s one we’d recognise in our weekly workshop in North London, too.
The FT’s John Gapper recently wrote a perceptive piece calling time on the exceptionalism of the internet. In its early days, online was treated as a kind of la-la-land where information wanted to be free, new business models would proliferate and the laws of economic gravity did not apply. This magical thinking has served the big internet firms well, and they have done their best to keep it alive – not least in consumer naivety about the extent to which their online lives are tracked and sold on to others.
But as those companies grow bigger and fatter it becomes harder to ignore their real-world effects, and, to their indignation, regulators are increasingly taking a hand. As they should, if you think of Uber as a minicab firm, Airbnb as a hotelier and Deliveroo as a courier. Through the same lens Google and Facebook are publishers (with an advertising business model at least as old as newsprint). Amazon is a logistics company, albeit an incredibly efficient one.
Viewed from this angle, a number of things come sharply into focus. Increasingly, ‘disruptive’ looks like la-la lingo for regulatory, legal, tax or just semantic arbitrage. So part of the reason Uber is cheaper is that it books its UK orders through a Dutch subsidiary, which enables it to avoid charging 20 per cent VAT on fares. Uber London really is just a minicab firm. Another is that, like Deliveroo, Airbnb, and Facebook, sheltering under the claim that it is an enabling not a transport (or food, hospitality or publishing) company, it can shrug off the responsibilities and costs of materials, insurance and above all employing people to drive passengers (or deliver pizza, wash sheets or write news).
But slowly, as Gapper notes, the real world is catching up. Uber and Deliveroo drivers are demanding employment rights through the courts – they may get them. Transport for London (TfL) wants Uber to observe the same standards for crime reporting, driver background checking and competition that it requires from other minicab firms. It is, it explains, nothing to do with the app or preserving in aspic the cab trade, which it and its predecessors have been regulating with some success since the 17th century. TfL has to bear in mind the added (and self-defeating) congestion that Uber generates, just as planners justifiably have something to say about ‘Rooboxes’ (Deliveroo’s ‘dark kitchens’ for cooking takeaways for posh restaurants) causing real-world noise and nuisance in residential areas – or for that matter whole swathes of cities turning into tourist dormitories through Airbnb (which is now getting into the business of developing apartment blocks for let).
If it complies with the rules, will Uber get its London licence? It would be interesting to be a fly on the wall for that conversation. It could of course be argued that in a sane world Uber’s reckless management would be disqualified from being in charge of any sort of company. But leaving that aside, the regulator will have legitimate questions to ask in that most real of real-world domains: finance.
So far Uber has spent seven years and $13bn in its quest for global domination. Growth has been spectacular – but so has the cost. In the words of one academic investigation, ‘The growth of Uber is entirely explained by massive predatory subsidies that have totally undermined the normal workings of both capital and labor markets. Capital has shifted from more productive to less productive uses, the price signals that allow drivers and customers to make welfare maximizing decisions have been deliberately distorted, and the laws and regulations that protect the public’s interest in competition and efficient urban transport have been seriously undermined.’ In purely financial terms, the losses make the eyes water. In Q1 2017 Uber was applauded for ‘narrowing’ its losses to a mere $708m compared to nearly $1bn in the previous quarter. The total loss for 2016 amounted to around $3bn, after more than $2bn in 2015.
No one has satisfactorily explained how Uber could claw back deficits of that magnitude through normal means in an industry with razor-thin margins and a commodity product. (There are similar doubts about losses piling up at Deliveroo, albeit on a less massive scale.) So how come investors haven’t pulled the plug? One answer might be that venture capitalists in Silicon Valley have all succumbed to magical thinking. OK, me neither. The other explanation is that they really believe Uber’s fair-means-or-foul methods will take it to the promised land where network effects and increasing returns turn the ugly duckling into a winner that takes all, or at least most, of the money in its industry, before the cash runs out.
What’s riding on Uber therefore is much more than today’s riders and drivers. In most analyses, the only way Uber can succeed is by establishing the critical mass (read, quasi-monopoly) that would allow it to jack prices up to cover the full cost of rides – a jump of a beefy 40 per cent on present levels, according to some reckonings. At the same time it also needs to slash costs – which is why it is pouring money into self-driving (SD) technology, which will permit it to dispense with human drivers altogether. At that stage, of course, it won’t give a toss what anyone, including the courts, thinks of its employment practices – although it may come as a bit of a surprise to the thousands of London drivers whose support for its licence it is happy to claim today.
Uber, in short, is in a race against time, and, some would say, to the bottom. If it succeeds (or is allowed to) it will have legitimised an innovation model that destroys more value than it creates and turns what it does create into rents for the very few winners at the top. What riders gain as consumers they lose as producers – with SD they won’t even get gigs. To say that Uber should be subject to regulation is not anti-innovation – rather, it’s a call for regulation to get to grips with the new tech-enabled monopolies that are in reality old-style robber-baron rentier capitalism brought up to date. London is the theatre for this test case, in which we are both spectators and participants.
We’re familiar enough with product and process innovation – the ‘how’ of management – but, as many business writers have noted, innovation in management and organisation itself is frustratingly rare. I would argue that this is largely due to the tramlines of shareholder value, which lock in command and control and lead undeviatingly to the straitjacket of budgeting, targets and performance management.
But even within those guidelines, innovation is possible – although it probably helps to start from (and in) a different place, and in ignorance of the accepted rules. Take the Chinese white goods manufacturer Haier, which, beginning with relatively conventional tactics in the 1980s, has made business model and management innovation its central differentiator. As the academic Bill Fischer, professor of innovation management at business school IMD, tells it, Haier has gone through four incarnations, each more radical than the last.
Back in the 1980s, as Deng Xiaoping opened up China to the outside world, one of the first objects of Chinese consumer desire was the refrigerator. Consumer goods were scarce – Fischer remembers seeing would-be purchasers besieging delivery trucks with bundles of renminbi before they even reached the stores. Quality was even scarcer. Fischer also recalls the day when the CEO of a small manufacturing cooperative grabbed headlines by publicly lining up 76 defective fridges – nearly a month’s production – outside the factory and setting his workers loose to smash them up with sledgehammers.
The CEO was Zhang Ruimin and the company Haier. Zhang was a young town official and Haier a near-bankrupt municipal enterprise that he ended up running because he couldn’t refuse. Faced with an emergency rescue, Zhang decided that the only way the company could survive against worldly global rivals was to build brand and quality, both nearly non-existent in China at the time.
After trashing faulty fridges to show what he thought of them, Zhang’s first step was to instill basic discipline through a tough performance management regime (sample directions: ‘Urinating or defecating in workshops is prohibited’, ‘Stealing company property is prohibited’). That set in train a period of rising quality and work discipline, on the back of which in the 1990s he reengineered the company to privilege customer-responsiveness and innovation, which moved it into services as well as products. The front line took more responsibility, disciplined by an internal market that rewarded the best ideas and performers.
The third phase took a further step in the same direction. In an effort to get closer to customers Haier literally turned itself upside down, adopting demand ‘pull’ as its market-facing mode and splitting itself up into 200 or more self-managing, customer-facing teams supported by management from below – the so-called ‘Rendanheyi’, or ‘win-win’ model. Under Rendanheyi, new-product development times and costs were slashed by orders of magnitude. Between 2005, when the model took shape, and 2014 group profits jumped twelvefold.
Haier’s latest move – dubbed Rendanheyi 2.0 – is the most startling, however. Testing uncharted waters, Haier is remaking itself as a platform organisation, opening itself up to an internet-enabled world in a way few companies have ever imagined, let alone executed.
Yet the change is anything but arbitrary, according to Fischer. The key to understanding Haier’s successive transformations, he says, is that each is a fresh structural expression of Zhang’s commitment to two core principles: serving customers and making best use of the talent of employees.
On the customer side, the internet had made customers better informed and faster to react than ever before. But now, as Zhang saw it, the Internet of Things (IoT) was about to take connectivity, speed of reaction and service to a new dimension. The future would be different. A smart fridge connected to wearables was just the start of it. But while Haier was by now managerially experienced and savvy enough to make a sideways move into adjacent product areas, there was no way it could envisage the dramatic oblique shifts into otherwise unrelated domains that the internet was opening up. To be able to do that, everything about the business – business model, supply chain, organisation and the way to manage it – would have to change all over again.
In this new world, it wasn’t enough for customers to be close: they had to be inside.
Zhang’s model for Haier’s next structural makeover was an unlikely one: the iPad. Here was a standardised hardware product, turned out in millions by a consumer products company, that had altered the way people used computing devices and viewed content – and also the company that created it. But the secret of the iPad wasn’t a conventional killer app for users. The killer app was Apple’s when it opened the device to developers who could turn a standard platform into anything anyone wanted it to be.
What if, Zhang mused, Haier could become an organisational iPad, a spine or skeleton on which anyone could graft any kind of commercial operation, inside or outside Haier’s traditional spheres of activity? That would change what a company could be used for, just as the iPad had changed the scope of computing. Instead of being a closed system, an ‘iPad company’ would dissolve boundaries and act as a ‘co-creation platform’ for a myriad of micro-enterprises (more than 2000 at last count) in which Haier would take an equity stake. It anticipated that some start-ups would be in fields very different from its usual beat. One new Haier micro-enterprise is using fintech to reengineer the Chinese egg industry and aiming to do the same with pigs; another is tearing apart conventional household appliances and reengineering them as smart connected devices.
Launching Rendanheyi 2.0 in 2015, Zhang noted: ‘The traditional mission for companies is to pocket profits in the long term. Our mission following the transformation is to become a shareholder in our micro-enterprises’. At the same time the initiative turns Haier employees into entrepreneurs. He added: ‘Under Rendanheyi 2.0 all of our employees can become entrepreneurs with decision-making authority, able to distribute benefits and optimally unleash talent. Haier as a company is no longer providing jobs to employees; we are instead offering a platform to become an entrepreneur’.
This might seem like a leap into the unknown. For many managers in established businesses it is scary and incomprehensible. But Fischer takes a different view. It’s certainly true that the changes Haier is currently making seem way more radical than those of a decade ago. It’s also the case that in a company that always seems to be in motion, some of Haier’s moves have been more successful than others. But taking the rough with the smooth, they have worked: otherwise Haier, having taken over GE’s appliances arm in 2015, wouldn’t be the biggest white-goods purveyor in the world. Even so, this move, says Fischer, may be less risky than meets the eye. ‘I think Zhang Ruimin would say that today’s radical changes are all of a piece with, and in some ways an inevitable consequence of, many smaller changes made in the past, which have all been in the direction of pushing P&L responsibility down the organisation,’ he says. Making workers entrepreneurs and leaders in micro-enterprises is just a further step in devolving that accountability.
It also begins to make sense of an apparent paradox. As Zhang acknowledges, Haier in its present radically devolved shape is largely his creation. So how will it fare without a great leader to guide it?
Rendanheyi 2.0, says Fischer, modifies Zhang’s job too. ‘One of the attributes of great leadership is creating a safe base from which employees can do unusual things without feeling exposed’, he notes. That’s what the reforms have been designed to do. Now the task is to convince Haier’s people that coming through the transformations has changed them as much as the company. They are already entrepreneurs with success stories to tell. What’s more, while the future is unknown, as leaders of nimble, customer-oriented enterprises on Haier’s platform they are already further into it and may be better placed to seize its opportunities than their equivalents in almost any other large company in the world. Western rivals would be unwise to bet against it.
See my article for LBS Strategy Review, ‘Adhocracy – a new management approach’, with Julian Birkinshaw and Jonas Ridderstrale, here.
Almost every article printed about robots and jobs starts and ends the same way. Here’s a recent example from the FT. The first hallmark of the genre, previewed in the headline ‘Poor education leaves emerging markets vulnerable to automation shock’, is a dire prediction of job losses – in this case in the developing world, where ‘the replacement of workers by machines threatens two-thirds of jobs’, according to a UN report. Then, as always – ‘As always, the only answer is education.’
Of course, prediction and ‘solution’ vary slightly. The looming job loss can be in particular sectors, countries or continents, and the answer can be training or other preparation, or, more frequently nowadays, some form of universal basic income. But both diagnosis and cure are characterised by the same infuriating mixture of fatalism and complacency.
Past technological surges have always ended up creating more jobs than they destroyed, albeit in unpredictable areas, the argument runs; all we can do now is to tempt employers by giving them more skilled, willing and flexible workers. There are better and worse variations on this argument – for instance, this by Tim Harford is fine. But what most share is the unquestioned assumption that the only half of the equation that can be operated on or influenced is the offer – the workers. So the FT article above is more about Indian and African education than employment. As for the demand for workers – well, it’s just what demand will be.
But this is pushing on a piece of string. What if companies don’t want to employ people? After all, it’s not ‘the whirlwind of automation’ or even ‘machines’ that create or eradicate jobs. It is investment decisions made by human beings in company boardrooms. And those decisions do not take place in a vacuum.
A recent White House report on AI, automation and the economy underlines that ‘Technology is not destiny… The direction of innovation is not a random shock to the economy, but the product of decisions made by firms, governments, and individuals’. As Brian Arthur shows in his excellent book on the nature of technology, technology is an integral part of the evolving economy, both shaping and being shaped by it. In some areas, such as medicine, the avenues science pursues are already economically determined – cures for first-world conditions are more lucrative than those for poorer countries. In others, management motivations decide how discoveries, once made, are diffused. Technologies such as the internet, voice recognition, touchscreen, and GPS – all developed in the public sector – could have been combined in countless different ways, or none at all: it took an inspired Steve Jobs to bundle them together into the familiar shape of the iPhone. The platform economy the iPhone then made possible is a further techno-economic evolutionary twist.
How the platform economy evolved, and why it is now the go-to model for every budding start-up, is in turn in part the result of developments in other socio-technological areas, in this case the company and management. In 2014 Martin Wolf wrote in the FT: ‘Almost nothing in economics is more important than thinking through how companies should be managed and for what ends’. He went on: ‘Unfortunately, we have made a mess of this. That mess has a name: it is “shareholder value maximisation”. Operating companies in line with this belief not only leads to misbehaviour but also militates against their true social aim, which is to generate greater prosperity’.
This is the first time in history that one of the great technological spurts has taken place when companies are being operated in line with this anti-social belief: that is, under a regime where one stakeholder is supposed to maximise its returns at the expense of the others, including society, and where the most widely taught and practised version of strategy is largely about preventing other stakeholders from eating the shareholders’ lunch.
Now put today’s gig and task-based economy in perspective. It hasn’t suddenly popped up at random from the blue. It is just the latest step in a process of corporate dis-employment which began in the 1980s. Simply put, under shareholder value employees are costs to be minimised like any other. So responding to their new incentives, managers began to pass up employment-creating initiatives they would have undertaken in the past in favour of cost-minimising measures to benefit shareholders.
The downsizing and outsourcing trends initiated then have expanded steadily to the present day. Collateral damage was first lifetime employment, then defined-benefit pensions, then corporate responsibility for career. In recent years automation and AI have further eroded the full-time permanent employment bond, with a corresponding upturn in the growth of freelance, short-term and zero-hours contracting.
Enter in 2007 the iPhone.
There are many ways the smartphone could have been used for economic and social gain – including the enabling of a real sharing economy. But in the labour-historical context there is an inevitability about executives driven by shareholder value deploying it to dismantle the last element in the traditional employment regime: the job. With labour a commodity to be contracted as easily as any other, one of the traditional justifications for the traditionally shaped company collapsed, and with it the last link between corporate growth and employment. At best new-generation companies are job-neutral (Instagram: $19bn in value, 19 employees); at worst, like Uber, they are job killers.
The blunt truth is that companies today have no intention of employing people unless they absolutely have to (which explains why the Silicon Valley titans are hypocritically rallying to the cause of the universal basic income). Next down the road: driverless cars. In this situation, expecting better qualifications to improve employment chances is a bit like hoping that faster, stronger horses would stave off the advance of the combustion engine.
So do we sit passively while employment continues to wither until it becomes the preserve of a privileged few?
Or do we decide to act on what we can still influence, the demand side of labour?
Here are some suggestions.
The first step is for governments to reinstate employment as a central plank of economic policy, as it was up to the 1980s when it was hastily abandoned as politicians let themselves be persuaded that markets know best. Tax policies should be adjusted accordingly. Companies that fail to pay a living wage should not be considered for state contracts.
Employment policy should go alongside a root-and-branch rethink of ‘how companies should be managed and for what ends’, to requote Martin Wolf. Far surpassing the timid and irrelevant tweaks envisaged by Theresa May, the aim would be to prise companies from the grasp of short-term shareholders (and executives) and restore them to their proper mission of generating greater prosperity for society.
Individuals also have their part to play. Strikingly, Gallup surveys show that globally what people want more than anything else – more than security, family and peace – is a full-time job with a pay cheque. OK: their responsibility then is to prepare themselves to engage with the employment they want as both workers and citizens – including in the responsible trade unions that governments (and companies) should foster as a counterweight to the current corporate dominance.
Employment is one of the time-bombs left un-defused by the failure to reform business-as-usual after the financial crash of 2008. It would be better to act now than to wait for a chance detonation to set it off.
Read my account of the latest Foundation forum here
In C.P. Snow’s Strangers and Brothers sequence of novels, several of which are set in a fictional Cambridge college in the 1930s and 1940s, older dons could remember the time when college fellows weren’t allowed to marry. As late as the 1960s universities remained a world apart. There were just 22 in the UK, reserved for a privileged 5 per cent of the population – and in some of them students still had to be in by 12 at night.
Of all our enduring institutions (Oxbridge dates back to the 12th-13th centuries), over recent years the universities have perhaps travelled farthest in the shortest period of time. There are now 150 of them in the UK, and Tony Blair’s target of 50 per cent participation has pretty much been met. The all-important student satisfaction survey naturally ensures that no vestiges of restraint on night-time entertainment endure (today’s equivalent may be ‘safe spaces’, but that’s another story).
That is certainly an achievement, but as recent political headlines over tuition fees and teaching attest, it is far from an unalloyed or uncomplicated one. For the recent history of the university sector is an object lesson in the unexpected consequences of opportunistic policy-making, a number of which are now coming destructively home to roost.
The first is perhaps the most straightforward. One powerful justification for university expansion was the supply-side argument that boosting educational levels would respond to employers’ demands for a more capable workforce and thus benefit the economy as a whole. Fifty years on, employers are still whingeing about the wrong kind of qualifications – and these days they’d often rather not employ anyone at all, particularly expensive graduates. It was always the demand side too, stupid.
As commentators such as Alison Wolf have consistently argued, the reverse of the coin of privileging university education is the scandalous neglect of further and vocational education such as apprenticeships. In the resulting mismatch, overqualified graduates are being employed for jobs which previously would have gone to the less well qualified, compressing the latter’s employment and social chances, and everyone’s wage rates. This not only stokes the political pressures that are now all too evident, but also ensures that the expensive loans taken out to buy an income boost that hasn’t materialised will never be repaid.
There have been internal growing pains, too. As with much in Britain, universities, in the words of Andrew Adonis, head of Blair’s No 10 policy unit, have developed haphazardly, with ‘one thing leading to another, in a typically unplanned British way, on the part of successive governments’. As the university estate has grown, notions of choice, competition and latterly value for money, encouraged by one government after another, have steadily come to the fore. Particularly consequential (and not in a good way) has been the regime of targets in the shape of the Research Assessment Exercise and its successors, instituted by the Thatcher government in the 1980s.
As with all such measures, the consequences were predictably unintended. Thus making a large chunk of university funding dependent on research quality provoked massive gaming on one hand (a burgeoning transfer market for prolific researchers, exclusion of weaker colleagues from assessment, huge expansion of conferences and learned journals to report on and in), together with reduced emphasis on teaching on the other. Teaching quality hasn’t suffered – competition for posts among a vastly increased supply of young PhDs has seen to that – but quantity has. Teaching is now to be subjected to its own assessment, with no doubt similar perverse effects, not to mention spiralling bureaucratic demands: university head of department is in practice now a full-time management job, with no time for either research or the teaching that the incumbent was taken on to perform.
Meanwhile, tuition fees and the insidious marketisation of education generally have increasingly displaced competition between institutions from the academic to factors such as facilities and ‘the student experience’. Universities now think of themselves as ‘brands’, with a massive increase in spending on marketing and related activities – and managerial types to do it. While lecturing salaries are subject to a 1.1 per cent annual growth cap, no such restriction applies to management. Adonis points to soaring vice-chancellors’ pay, which drags other managerial ranks up with it. Bath University employs 13 managers on more than £150,000 a year, and 67 on £100,000. Many universities actually make a ‘profit’ on tuition fees: dispiritingly, this is where the money goes instead.
Which brings us back to the bottom line: who pays? Adonis is right that fees set at £9,000 a year (a cynically and supremely opportunist move by George Osborne to fund tax cuts for the better off), soon to go up again, are a ‘Frankenstein’s monster’ and in the long term untenable. No one should have to begin adult life with debts of £50,000 hanging over them. But the effects ramify out from individuals to the entire macroeconomy. For a start, debt levels are now so high that even after 30 years, three-quarters of students won’t have paid off their loans, according to the IFS. On the government’s own estimates total unpaid loans will hit £100bn next year and double again in a decade.
That’s unconscionable enough. But there are huge knock-on effects too. There was always a sneaking suspicion that one intended side-effect of making young adults pay for their university education might be to curb student activism. At least in the US, that seems to have been the intention. But the indirect costs are now mounting vertiginously. Indebted graduates are delaying having families while they search for reasonably paying jobs. As for buying homes, forget it – as also the furniture and other stuff that go in them. In fact, impaired credit ratings make it quite hard for them to make any substantial purchases at all. Dampened spirits and high anxiety levels are being connected with health issues such as depression, and marital failure. Finally, the debt overhang is also reckoned to be a factor in the worrying fall-off in rates of entrepreneurship and new business formation.
It takes a special kind of management to transform a policy that was presented as having only upsides into something now characterised as ‘unsustainable’, ‘in tatters’ and a ‘substantial economic headwind’. It will take something equally special to unravel it, but the other way round. But that would require an ability to register and learn the lessons of past mistakes – so don’t hold your breath.