Deliveroo or Analogic?

Deliveroo or Analogic. With those two names, at the 2016 Global Peter Drucker Forum in Vienna in November, Columbia Business School’s Rita McGrath put her finger on the fault-line running straight through today’s business economy.

Both companies are worth about £1bn. One of them, Analogic, is 40 years old, modestly profitable and employs 1,500 people worldwide making highly-rated medical and security equipment. The other, Deliveroo, a UK-based food delivery internet start-up, is three years old, made a loss of £18m last year on undisclosed sales, and uses 6,500, 13,000 or 20,000 couriers (take your pick) to deliver restaurant meals to home diners. Deliveroo’s couriers are ‘independent contractors’, not employees. Amid criticism of ‘Slaveroo’ conditions, its UK riders are campaigning for union recognition and the minimum wage.

What the world wants is more companies like Analogic. In Vienna, Stanford’s Jeff Pfeffer quoted Gallup surveys – ‘one of the most important discoveries Gallup has ever made’ – showing that, whereas in the past people most desired peace, freedom and family, today what most people in the world want above everything else is good jobs for themselves and their children. A good job (not a great job) means a steady 30-plus hours a week for the same employer and a pay packet. In another finding, Gallup observes a ‘perfect correlation’ between good jobs and economic wellbeing: the higher the percentage of employees in steady full-time employment, the higher the per capita GDP.

What the world is getting, meanwhile, is Deliveroo. As a variety of management big cheeses lined up to lament at the Forum, big companies these days are effectively on innovation strike. In so far as they invest at all, they prefer predictable efficiency gains (those that reduce headcount) to more speculative longer-term entrepreneurial efforts that might generate new markets and industries – and jobs. ‘Companies start out as equity and end up as bonds,’ as Roger Martin put it.

This is not an accident. The mechanism is crystal clear: financialisation has decoupled corporate growth from job creation, with the consequence that new-economy companies like Google, Facebook, WhatsApp and others can be radically large in reach, market share and market capitalisation but radically small in headcount. ‘Under our current conditions,’ concludes the University of Michigan’s Jerry Davis in a recent paper, ‘creating shareholder value and creating good jobs are largely incompatible. Corporations are “job creators” only as a last resort’.

For some in Vienna and the wider world, especially Silicon Valley, the decline of corporate employment is a cause for celebration. They see the end of wage slavery as the forerunner of a truly entrepreneurial economy in which every individual is empowered to become anything they want to be. For MIT professors Erik Brynjolfsson and Andrew McAfee, predictions that technology will lead to a workless future are false. The internet of things and 3D printers will enfranchise a new generation of entrepreneurs – human ingenuity and tastes are boundless. In the same vein, ‘The entrepreneurial society is going to happen,’ Tammy Erickson told Drucker participants enthusiastically. ‘Artificial Intelligence will take over our jobs. Organisations will shrink as transaction costs diminish. We can’t stop it… There’s a fabulous opportunity with great technology to transform the world of work.’

Before we reach entrepreneurial nirvana, however, there are two giant roadblocks to navigate. The first is that, as Gallup documents, while entrepreneurship may appeal to those with social and intellectual capital to leverage, it’s not what most people want. ‘I feel very uncomfortable with [the notion of the all-embracing entrepreneurial society],’ said Maëlle Gavet, herself an internet entrepreneur of repute. She noted that there will be a large proportion of people who are unable or unwilling to be entrepreneurs – and that is not necessarily their fault, or indeed a fault at all. ‘We welcome and encourage disruption, but we need to remember that disruption disrupts real lives.’ In the absence of government measures to think innovatively about distributing risk, and given the increasing flimsiness of societal safety nets – ‘The UK hasn’t a clue what the state is!’ declared Sussex University’s Mariana Mazzucato – it’s hard to argue that the overriding desire for employment is anything but rational.

The second awkward reality is that even if people were keen to embrace entrepreneurship, this is actually not the current direction of travel. Economies like the US and UK are becoming less entrepreneurial, not more. Companies are getting older, more concentrated and less dynamic. Investment and productivity are lagging. In the US, home of Silicon Valley, the rate of new business formation has dropped by 50 per cent since the 1970s, Pfeffer pointed out. Corporate deaths outnumber births. Bureaucratic, industrial-age assumptions about people and organisations still rule. ‘The innovation engine is disappearing in the US,’ said über-guru Clayton Christensen. Curt Carlson of R&D institute SRI, while adamant that innovation and entrepreneurship can be learned and taught, acknowledged that ‘we’re doing a pretty terrible job at the moment. There are thousands of billion-dollar opportunities out there waiting to be taken’.

Instead, we have Uber, Airbnb and Deliveroo; and any number of Tinders, Grindrs and Tumblrs. ‘We wanted flying cars; instead we got 140 characters,’ as entrepreneur Peter Thiel has caustically observed. The first two are disrupters all right, but as the McKinsey Global Institute has pointed out, they create less value than they destroy. Defenders trumpet the benefit to consumers from the increased efficiencies, less so the losses to the same people as workers. On the evidence so far, the idea of maximising consumer value, as some are beginning to advocate, is as unlikely to lead anywhere good as maximising shareholder value.

None of this is inevitable. As the forum heard, cities and self-confident states that don’t deny themselves the opportunity (as the UK or the US do) can do a great deal to foster innovation (see Singapore and Israel respectively). In the private sector, it also heard about the thriving German ‘Mittelstand’, a large cohort of medium-sized, often family-owned companies harbouring a disproportionate number of ‘hidden champions’, below-the-radar world beaters that are quintessential providers of good jobs and community glue, often over generations. German manufacturing still accounts for 22 per cent of GDP, 10 points more than in the US and UK.

All this, let’s underline it, is the result of choice – choice about the way companies are run and governed, and crucially about their responsibilities and obligations to the wider society too. In other countries, companies like Analogic still hang in there – just. But even if manufacturing, once lost, is difficult to rebuild, the management principles that sustain these customer- and community-focused firms are known, proven and available to anyone to use. Is management going to bridge the social gaps that have opened up to admit the current monsters, or widen them? Analogic or Deliveroo?

Business-as-usual gave us Brexit and Trump. Now we have to change it

Brexit and now Trump are the delayed detonations of the unexploded bombs left behind by the Great Crash of 2008-2009. It seemed clear then that the financial meltdown was the logical end-point of a fundamentally flawed version of capitalism that had, for ideological reasons, inverted the real order of things, placing finance and shareholders at the centre of the universe round which the productive economy revolved, and patronisingly advising everyone else to wait for the benefits to trickle down. Brexit voters and the half of Americans who are worse off than they were in 1999 – and barely better off than in 1967 – have decided the wait is over.

The explosion didn’t go off in 2009 because an equally petrified left and right, despite rhetorical ferocity over marginal differences, united to assure their followers that despite the glaring flaws there was no alternative to the restoration of bankrupt ‘business as usual’, on both political and economic fronts. As in the 1930s (think Weimar Republic) it was a hopeless failure. ‘Quae non possunt non manent’ – things that can’t last, don’t. It’s the borrowed time of the previous consensus, based on the easy assumptions of social and economic liberalism, that has just come to a noisy and vituperative end.

It’s been too glibly assumed that liberal social attitudes – to race, gender and sexual orientation, immigration, welfare, crime and punishment – which are now under such attack in the US and much of Europe, go hand in hand with democracy. Only up to a point. They are much more, perhaps only, sustainable in a healthy, balanced economy in which jobs, income and new resources funding some kind of social safety net ease the pinch-points that aren’t caused by, but are blamed on, social liberalism. As we know to our cost, austerity is sooner or later death to tolerance and fellow feeling as well as to economic wellbeing, and best friends with resentment and anger over what’s felt to be lost, fear of the other and fear of what’s to come.

This is why real economic change is now both the priority and a possibility. As Paul Mason has noted, ‘It is entirely possible to construct a humane pro-business version of capitalism without…austerity, inequality, privatisation, financial corruption, asset bubbles and technocratic hubris’ – provided we go beyond defeatist determinism that sees the middle and working classes as victims of inevitable globalisation and technological advance, as if these were ineluctable forces of nature over which we have no agency. This is simply false.

It wasn’t abstract economic flows that caused the derivatives bubble that led to the Great Crash, but catastrophic management decisions, bent by unrealistic assumptions about human nature, about what companies are for, and about how they should behave. In exactly the same way, it’s not globalisation itself that is (in part) responsible for stagnant wages and the lack of good jobs, but what financialised, short-termist companies and managers have done with it. As former Greek finance minister Yanis Varoufakis expressed it recently on the radio, globalisation in the shape of the international movement of goods, capital and people is one thing; globalisation as the ability of giant corporations to play hide-and-seek with international profits, arbitrage tax regimes and domicile, and lobby for international treaties allowing them to sue countries for actions that damage their profitability, is something that no one signed up to.

It’s no use economists vaunting globalisation and free trade as ‘goods’ in the abstract. They are only good if they are designed to be. Perhaps it’s that conditionality that J. M. Keynes, not noted as a narrow thinker, had in mind when he wrote in the 1930s: ‘I sympathise with those who would minimise, rather than with those who would maximise, economic entanglement among nations. Ideas, knowledge, science, hospitality, travel – these are things that of their nature should be international. But let goods be homespun whenever it is reasonably and conveniently possible, and above all, let finance be primarily national.’

Similar considerations apply to technology, perhaps even more so. The reason for pessimism over the current direction of technological travel does not lie in the nature of technology itself, nor in the belief that no other direction is possible. Precisely the contrary. It is that the world is experiencing the first great wave of technological advance to take place under a regime in which the managers who make resource allocation decisions are enjoined, not to mention highly incentivised, to privilege investments that benefit shareholders (among them themselves), whatever the consequences for other stakeholders. This is why they favour low-risk efficiency gains over less certain but potentially much higher returns from more ambitious and expensive innovation. Consider in this context the ‘sharing’, or better, ‘gig’ economy. If you thought it appeared by virgin birth out of the blue of cyberspace, think again. Following logically on from downsizing, outsourcing, offshoring and the end of career, task-based employment (or the end of the job) is just the latest efficiency-driven, technology-aided manifestation of managers’ ongoing determination to bring market mechanisms into the company, in the (wrongly) presumed interest of shareholders.

Peter Drucker believed that corporations were far too important for the health of the wider society to be under the control of any one interest. He also believed that when institutions and beliefs outlive their founding assumptions, as they do, they become afflictions, threatening the whole of civil society with upheaval and unrest. Which is why innovation and entrepreneurship – ‘pragmatic rather than dogmatic, modest rather than grandiose’ – ‘are needed in society as much as in the economy, in public-service institutions as much as the economy’. No one can fail to see the relevance to the events of today. This year’s Global Peter Drucker Forum has as its subject ‘The Entrepreneurial Society’, its subtext the need for self-renewal drawing on the combined practical capabilities of state, civil and private sectors. Never has a major conference theme been so apt or so urgent.

The tyranny of the minority

The main idea behind complex systems is that the ensemble behaves in ways not predicted by the components. The interactions matter more than the nature of the units. Studying individual ants will never (one can safely say never for most such situations), never give us an idea on how the ant colony operates. For that, one needs to understand an ant colony as an ant colony, no less, no more, not a collection of ants.

This is the intro to an intriguing online essay by Nassim Nicholas Taleb, best known as the author of The Black Swan: The Impact of the Highly Improbable, and one of the most interesting (if also infuriating) writers about systems there is.

The piece is called ‘The Most Intolerant Wins: the Tyranny of the Small Minority’, and as the title suggests it shows how in some circumstances the interactions of the parts cause outcomes for the system as a whole that, looking at the components, might seem impossible.

For instance: in the US most soft drinks are, apparently, kosher. Strict kosher observers make up only a tiny fraction of the population – so how come? Because while kosher-eaters won’t touch non-kosher lemonade, the reverse is not the case: non-kosher people will happily quaff kosher drinks. The asymmetry means that for the shopkeeper it’s a no-brainer: to satisfy most customers he needs only stock a range of kosher soft drinks. Gradually, what was a minority choice comes to dominate.

Hence Taleb’s minority rule: in given conditions, a majority will find itself adapting to, or dominated by, the preferences or will of an intransigent minority, even a tiny one.

Once you get this, there are implications everywhere. One of the conditions for minority rule to work is that complying should not entail extra cost for the more flexible (or indifferent) majority. That’s the case for kosher lemonade, which is no more expensive than other varieties – but not for kosher (or halal) meat, whose slaughter methods levy a cultural cost which many non-kosher eaters are not prepared to pay. If they were aware of it, organic and Fair Trade producers could conceivably benefit from the minority rule much more than they do. Currently their goods are typically much more expensive than the competition. But if they could bring prices down to the level of non-organics, many more outlets might quietly make them the norm, avoiding the cost of stocking both organic and non-organic ranges.
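The asymmetry is simple enough to sketch. The toy model below is a minimal illustration with invented names and numbers (nothing here comes from Taleb’s essay): a vendor picks a stocking strategy given an intransigent minority that accepts only option A, a flexible majority happy with either, a possible price premium on A, and the extra cost of carrying two ranges.

```python
# Toy sketch of Taleb's minority rule. All shares, premiums and costs below are
# invented for illustration; the point is the asymmetry, not the numbers.

def vendor_choice(minority_share, premium_for_a, dual_stock_cost):
    """Return the stocking strategy with the best customer reach net of cost penalties."""
    payoff = {
        "A only": 1.0 - premium_for_a,    # everyone accepts A, but A may cost more to offer
        "B only": 1.0 - minority_share,   # the intransigent minority walks away
        "both":   1.0 - dual_stock_cost,  # full reach, but two ranges to carry
    }
    return max(payoff, key=payoff.get)

if __name__ == "__main__":
    # Kosher lemonade: no premium on A, so even a 2% minority tips the whole shelf to A.
    print(vendor_choice(minority_share=0.02, premium_for_a=0.00, dual_stock_cost=0.05))
    # Organic produce today: a hefty premium on A, so the majority's cheaper option wins.
    print(vendor_choice(minority_share=0.05, premium_for_a=0.30, dual_stock_cost=0.05))
```

With no premium, a tiny unbending minority decides what everyone drinks; with a 30 per cent premium, the majority’s option wins, which is exactly why closing the price gap matters.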

Taleb’s rule can also work in more sinister ways. In Michel Houellebecq’s brilliantly imagined novel Submission, a minority Muslim Brotherhood party gets itself into power by making it easy for a disillusioned, jittery French electorate to accept a relaxed Sharia law in return for a quiet material life and an almost Gaullist programme of family respect, educational reform and strong leadership.

Or how about a striking commercial example that I came across as I was thinking about Taleb’s piece? For two decades, the giant US chemicals and seeds firm Monsanto has successfully sold a strain of soyabeans genetically modified to resist its potent (and lucrative) Roundup weedkiller. The snag is that as Roundup use has intensified, so has the growth of ‘superweeds’ resistant to it, with drastic consequences for farmers’ livelihoods. In response, Monsanto has developed, and marketed, a new soyabean strain called Xtend, which can withstand not only Roundup but a second type of herbicide. The newer chemical has yet to gain regulatory approval. But this has not prevented some hard-pressed farmers from planting the seeds and spraying them with older, highly volatile formulations of the chemical – laying waste to thousands of acres of neighbouring fields planted with the older, pre-Xtend seeds. It is now reported that many farmers using Monsanto’s older soyabean strain are being forced to adapt to the minority and adopt the new one to avoid being wiped out by their neighbours.

Actually, although this is a battle that Monsanto might win in the short term, Big Agriculture may be in the process of losing the GM war – ironically, because its scientific intuition does not stretch to the workings of complex systems. Its hardball tactics (lobbying, propaganda, smear campaigns against opponents) have radically misfired. Convincing a majority with no strong feelings to eat GM is beside the point; what matters much more is that its methods have created an immovable obstacle in the shape of an irreducible nucleus of people who will never, ever touch GM food, and who, like Taleb, actively proselytise against it. The fact is that the flexible majority who don’t mind eating GM will also consume non-GM. So, since there is minimal cost penalty, food manufacturers will eventually bow to the obstreperous minority and remove GM ingredients from their products – and this is exactly what is beginning to happen.

Finally, consider soaraway executive pay. There is no evidence, empirical or moral, in favour of current pay levels (on some calculations, the higher the pay the worse the corporate performance), and almost no one outside the charmed circle even tries to defend them. Yet they keep on rising inexorably year on year.

On the other side, of course, sits an obdurate blocking minority (those receiving those amounts) who will never voluntarily surrender their licence to pocket a fortune – that, after all, is the unspoken prize for reaching the top. Colleague CEOs on remuneration committees are naturally loath to rock the boat they are also sitting in; fund managers with the same incentives as corporate managers are more concerned with what’s happening to the share price than to the CEO’s wallet, and the same goes for individual shareholders. Which is why it is safe to bet that self-regulation will continue to fail, and that short of an explicit change in corporate governance or company law the pay ratchet will continue to click merrily upwards.

There is much food for thought here. Leveraging this kind of systems dynamic isn’t easy or obvious, particularly where it involves hard-to-manage cultural or other prohibitions. Some of the most potent examples are (fortunately) difficult to replicate. As Taleb points out, the steady rise of Islam in the Middle East was driven not only by Islamic marriage rules (to marry a Muslim woman a non-Muslim man must convert, and any child with even one Muslim parent is Muslim) but also by the fact that conversion is a one-way street. As Salman Rushdie can testify, public apostates risk death. A favourable asymmetrical rule plus draconian enforcement makes for a pretty powerful ratchet.

But other minorities can take heart. As we have seen, the softer version – unyielding at the core, emollient at the edges – can also take non-systems-aware opponents by surprise. The key is utter intransigence on the central principles, and what Taleb calls ‘skin in the game’, or commitment. Perhaps the anthropologist Margaret Mead was reflecting on the minority rule in action when she famously wrote: ‘Never doubt that a small group of thoughtful, committed citizens can change the world; indeed, it’s the only thing that ever has’.

Ruined by regulation

‘The proposed solution, so characteristic of the modern state, is regulation. In place of internal mission comes external monitoring’.

These arresting sentences occur at the start of a recent piece by the FT’s Martin Wolf attacking the government’s proposals to open up the UK university sector to market forces. In this case, the problem to which regulation, in the shape of an Office for Students (OfS), is meant to be the answer is ensuring quality of provision, and to do this it has a battery of nuclear-powered weapons at its disposal – including, extraordinarily (this is a directly government-appointed body, after all), the ability, by order, to revoke any university’s historic right, granted by Royal Charter or Act of Parliament, to award degrees.

In Innovation and Entrepreneurship, Peter Drucker observed that in time all ‘institutions, systems, policies outlive themselves, as do products, processes and services’, and turn into a new problem to be solved. This is surely the case with regulation. One reason is what has been called ‘the tyranny of misapplied doctrine’ – the indiscriminate application of faddish nostrums (privatisation, lean, digital are three that instantly come to mind) to any institution or organisation thought to need reform, whether the remedy is appropriate or not.

Wolf does a good job of showing why turning the UK’s highly regarded university sector into a competitive market-driven one is a thoroughly bad idea, and I won’t rehearse it here. But it’s worth pausing to consider where ‘regulation’ is heading and the issues it is leaving behind.

Recall that the idea behind the formula ‘privatisation plus regulation’ was to put essentially commercial activities such as telecoms, energy generation and the railways out of reach of political meddling so that markets and competition could work their impartial magic unhindered. In spheres that couldn’t be privatised, regulation was intended to extend the positive functions of the state in a predictable, rule-based rather than arbitrary fashion.

Good luck with that. Relocated in the private sector, energy policy is a complete political and practical shambles. The railways are never out of controversy. ‘Big Bang’ and light-touch regulation were heavily implicated in the Great Crash of 2008 – regulators never even twigged that a crisis was brewing. In the public sector, by contrast, ever-tightening regulation has signally failed to prevent tragedies and scandals such as Mid-Staffs, Baby P and countless others. The truth is that not only has regulation not improved public services; it has actually prevented improvement. To put it another way, regulation fails the bottom-line test. It costs more than the sum of its benefits.

Fallacies live on both sides of the privatisation-regulation equation. Privatisation and markets aren’t always the right or only answer, as in universities (and education generally), health and other social fields. Even when they are a conceivable answer, regulation often doesn’t help, mostly, as John Seddon explains in The Whitehall Effect, because ‘regulators bring with them their own theories of management and control’. These are as ideologically based as privatisation itself, reflecting a fixed belief in, for example, efficiencies of scale, command and control, and incentives to get the work done. Thus regulation arbitrarily decrees that public services should be target-driven, outsourced where possible and ‘digital by default’; in the private sector, all that matters is that there should be competition on price.

Regulation-by-decree largely explains how we have ended up with a command version of the market economy which is more flexible than that of Soviet Russia, but only in degree. So we may have a market economy, and therefore ‘choice’. But ‘choice’ in the regulatory sense has a special and limited meaning. It doesn’t mean getting a service that you actually want or that meets your individual needs; it means getting a service from providers using methods that the regulator approves.

Those methods rely heavily on compliance, budgeting and performance management activities which add little or no value, and the emphasis on unit cost and scale efficiencies effectively locks services into the industrial-age paradigm of low-cost, low-quality mass production, as (shamefully) in care services. If you ever wondered how, in an age when ‘adtech’ boasts of its ability to micro-target ads according to everything from your age to your weight to your gastronomic or sexual tastes, your long-time banking, financial services, retailing, energy and public service providers are all indistinguishable from each other in their mediocrity and inability to tell you from Adam, here is your answer. By extension, this is also the reason why all our High Streets resemble each other – for all these organisations it is the regulator who is king, not the customer, and regulators care about economic competition, not distinctiveness or quality.

There is a role for regulation – but it categorically does not include decreeing method, which on the contrary is exhibit No 1 in the museum of regulatory howlers. For a start, there is no such thing as ‘best practice’, which is contingent on purpose and measures; and setting the means in stone is Soviet-style management that stops innovation dead in its tracks, which is why we have fragmented, lowest-common-denominator public (and often private) services, hopeless banks and dismal High Streets.

Regulation – where needed – should be about holding organisations to account against their ‘internal mission’ (Wolf’s words) or purpose. Seddon proposes that regulators should be entitled to ask one and a half questions: What are your measures and how do they show that you are improving against your purpose? After 1000 years, British universities have developed a fairly good idea of their internal mission, and have powerful traditions of defending it, which is one reason they rate highly in international comparisons. That mission has already been compromised by government pushing of market forces: UK (like US) universities now have more, and more highly paid, managers and administrators than lecturers, and compete through marketing and ‘credentialling’ as much as by teaching and research. Using the screen of a regulator to justify and enforce what threatens to be a full-scale government takeover of the university sector is another large step in the Sovietisation of UK society, and the exact opposite of the advancement of the public interest that regulation was supposed to achieve.

Bottom of the class

The silly season used to consist of two or three weeks of summer torpor when newspapers used up their stock of daft and inconsequential stories that couldn’t make it into the paper on normal days. This year the silly season started early and shows every sign of becoming permanent – it’s now not the daft but the normal that’s in short supply.

After the mega-daftness of Brexit and the accompanying rounds of governmental musical chairs, the latest sign of soaring unreality is the much-trailed suggestion that Theresa May’s government is about to sanction a return to selective education and grammar schools. If you take the view that part of Brexit was a cry to stop the world and return to the imagined insular certainties of the 1950s, there’s perhaps a certain mad logic in taking steps to try to bring that world about. But in any other sense it’s a policy that would give even the Monster Raving Loony Party (‘Vote for Insanity!’) pause.

It’s quite hard to know where to begin, but let’s give it a try.

In an age of headlong technological advance which eats jobs for breakfast, lunch and tea (more than 40 per cent of all jobs will be automatable by mid-century, according to one much-quoted estimate), one thing that everyone agrees on is that an essential step in easing adjustment and heading off mass long-term unemployment is to equip people (all people) with better qualifications and broader skills.

In a pre-digital era that was the thinking behind the comprehensive movement and then the ambition that 50 per cent of every age cohort should attend university. The application was flawed, and such supply-side measures are, as I keep saying, in any case pathetically insufficient. But that doesn’t make the reasoning wrong.

The second reason why selection at 11 is criminal as well as crazy is that no one these days still believes that intelligence is fixed. Sir John Harvey-Jones always used to say that the root cause of the UK’s disproportionate contingent of unskilled, poorly-paid workers was low expectations. He was right. Countless experiments at school, in the military and at work have shown that those expected to do well perform better than those who aren’t, and conversely that treating people as failures is the surest way of ensuring that that’s what they become. A neglected part of Steve Jobs’ success at Apple (albeit at high personal cost) was his use of impossibly high expectations to force results from people that astonished even themselves. Selecting by ‘intelligence’ at the tender age of 11 is as unacceptable and arbitrary as selecting by class, gender or colour.

The third reason, and one that amplifies all the others by orders of magnitude, is the ‘100-year life’, the prediction that on present trends half of those born today will still be alive in 2116. Increasing longevity makes a nonsense of the straightforward linear progression from education to work to retirement. If the extra years are to be a blessing rather than a grinding burden, multiple mini-careers will have to become the norm, interspersed with education that is lifelong rather than bunched at the start. In this context, a narrow education determined at an early age is a complete disaster, the exact opposite of what is needed. In their book on the new demographics, Andrew Scott and Lynda Gratton emphasise the importance of keeping the widest spread of options open for possible futures. They also warn that it will be up to the individual to do this: neither governments nor corporations (if there are any left in a generation’s time) will do it for you.

This of course widens out into a bigger debate, perhaps the biggest of all, about education seen as a whole, not just schooling. Just as it makes no sense to select academically at 11, so, in a world where human life is lengthening and technological cycles are shortening, loading 20-somethings with huge debts at the start of an uncertain 60-year working life for a one-off, possibly depleting university investment is plainly crackers.

What is really needed is an entirely new look at what education should mean in the age of big data and the truly smart machine. What are the roles of man and machine? What does it mean to be human? In front of such issues, the ‘why’ questions trump those of ‘what’ and ‘how’; as machine capabilities develop and algorithms drive ever more of our world, the study of history and thought may become the key factor in preventing the sorcerer’s apprentice from taking over entirely. A greater distance from today’s instrumentalist, reductive, teach-to-the-test teaching is hard to imagine. As Yuval Noah Harari writes in his new book, Homo Deus: A Brief History of Tomorrow, ‘As long as you have greater insight and self-knowledge than the algorithms, your choices will still be superior and you will keep at least some authority in your hands. If the algorithms nevertheless seem poised to take over, it is mainly because most human beings hardly know themselves at all.’