Regulation - Responding to Innovation

My Introduction to Regulation summarises the factors that have driven much of the explosive growth in regulation since the 1980s. It now looks as though the world may be entering a further Industrial Revolution in which the physical, biological and digital worlds are coming together in the form of new technologies such as machine learning, big data, robotics and gene editing. This web page contains some initial notes on the consequential regulatory issues that are beginning to attract attention.

First, the regulation of this new world is likely to be very complex, very varied and very detailed. Writing in the New Scientist, Douglas Heaven pointed out that ...

The trouble is, robotic intervention into human affairs is going to require something far more comprehensive than the highway code. Legislation that protects passengers and pedestrians from driverless cars, for example, will be powerless to stop data-scraping algorithms from influencing our vote. Medical robots programmed to diagnose and treat people will need different regulations from those sent onto the battlefield. ... There is another hurdle, too. Interpreting rules requires common sense. It’s not clear how you would encode even the most obvious of commandments in a way that a machine could be trusted to follow ...

Another preliminary comment:- It is vital that regulatory frameworks are pro-competitive so that innovators can test their ideas. It is a mistake to have a regulatory framework which requires every innovation to be challenged before it can be put into practice. But it is equally important that potentially dangerous technologies are properly evaluated before deployment.

(There was an interesting Policy Exchange roundtable in July 2017 which considered how regulation might keep up with disruptive innovation. Follow this link to download a note of the discussion.)

Please send me further information and articles etc. which might help other readers keep track of interesting regulatory developments in these or other areas. If and when the notes get too long, I will create separate pages to carry the detail - as has already happened for some of the topics below.

Algorithms & AI

There is a bit of a theme running through many of the issues listed on this web page. Modern technologies, including Artificial Intelligence (AI) and algorithms, cut costs and facilitate activities which would otherwise be impossible. But they remove human involvement from the decision-making. These issues are explored in my Artificial Intelligence web page.

Increased unregulated use of AI may also have profound social consequences. Virginia Eubanks talks of 'the Digital Poorhouse' and argues that

'We all live under this new regime of data analytics, but we don’t all experience it in the same way. Most people are targeted for digital scrutiny as members of social groups, not as individuals. People of color, migrants, stigmatized religious groups, sexual minorities, the poor, and other oppressed and exploited populations bear a much heavier burden of monitoring, tracking, and social sorting than advantaged groups.'

Her full Harper's article is here.


Some of us, but far from all, know that our browsing history, saved in the form of cookies, can significantly affect the way in which companies deal with us. The most obvious examples are the cookies saved by travel companies, which reveal whether we have previously enquired about a particular flight or holiday. If we have, then the company is less likely to offer us their best deal, believing that our repeated searches suggest that we are very likely to become a customer. (See for instance Travel site cookies milk you for dough in the Sunday Times - 24 June 2018.)

Does it matter? I would argue that this is no more than an automated example of the behaviour of any salesperson who has the ability to price a product (such as a car) according to their estimate of the customer's keenness to buy. But it's a bit less obvious - sneaky even.
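The mechanism can be sketched in a few lines - a purely hypothetical illustration, in which the fare, discount and search-count threshold are all invented for the example:

```python
# Hypothetical illustration of cookie-informed pricing: a site counts how
# often a visitor has searched for the same flight, and withholds its
# discount once repeated searches suggest strong intent to buy.

BASE_FARE = 200.0
DISCOUNT = 0.15   # introductory discount for apparently casual browsers

def quote_fare(cookie: dict, flight: str) -> float:
    """Return the fare offered, updating the visitor's search count."""
    searches = cookie.get(flight, 0)
    cookie[flight] = searches + 1
    if searches < 2:
        # Early searches: tempt the browser with the discounted fare.
        return round(BASE_FARE * (1 - DISCOUNT), 2)
    # Repeat searches: the visitor looks committed, so quote full price.
    return BASE_FARE

cookie = {}                           # stands in for the browser cookie
print(quote_fare(cookie, "LHR-NCE"))  # first search: 170.0 (discounted)
print(quote_fare(cookie, "LHR-NCE"))  # second search: 170.0
print(quote_fare(cookie, "LHR-NCE"))  # third search: 200.0 (full fare)
```

The human salesperson sizing up a customer does much the same thing - the cookie simply makes the sizing-up automatic and invisible.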

Data Protection

New EU rules - the GDPR - came into force in May 2018 but one wonders whether any regulations can adequately protect the interests of consumers faced with increasing monetisation of personal data. The following extracts from a letter to the FT summarised concerns very well:

... The consumer will never own the data or the algorithms. ... Every moment, your data relating to browsing, calling, online, social media, location tracking and so on is being churned through a multiverse of data warehouses. If you have been browsing about a certain medicine, correlating to a call to an oncologist and a search for a nearby pharmacy, this can consequently be packaged as a data intelligence report and sold to your medical insurance company. This is just one of the myriad ways monetisation is being unleashed on unsuspecting consumers across the world.

The data protection regulations, although a step in the right direction, are usually still heavily tilted in favour of the corporate giants and still focused [more] on cross-border transfers than on the real risks of monetisation. The fines imposed on the Silicon Valley giants are minuscule compared with the money they have made from data monetisation efforts. And this is all achieved in the age that is still a forerunner to the era of artificial intelligence and quantum computing.

The very concept of data privacy is archaic and academic. The tech giants are moving faster than this philosophical debate about data privacy. All the sound-bites from the tech giants are mere smoke and mirrors. Unless we revisit our concepts of what is data privacy for this new age of data monetisation, we will never really grapple with the real challenges and how to enforce meaningful regulation that really sets out to protect the consumer.

Syed Wajahat Ali

Zeynep Tufekci drew attention to one interesting example in 2018. One medical company was buying individuals' temperature data that had been uploaded to another company which had supplied Internet-connected thermometers. It all sounded very benign but Ms Tufekci was concerned that the customers hadn't thought through the implications of having health data sold to anyone at all, including advertisers, and/or integrated into countless databases.

Facebook was fined the pre-GDPR maximum of £500k in 2018 for allowing Cambridge Analytica to access users' personal data without their explicit consent.

Can data be owned? It is interesting, by the way, to consider whether data can be (or is) owned like other property. It is hard to see that anyone can own the fact that something happened (Ms X bought item Y). And an electronic or other record of that fact can easily be copied - many times. And yet data is sold, which suggests that it has indeed become property. But who does it belong to? Does anyone have the right to control its destination?

Autonomous (Self-Driving) Vehicles (& Ships) (& Planes)

Lots of interesting issues here. Autonomous vehicles seem certain to be much safer (on average) than those controlled by humans. But their successful introduction requires advances on many fronts.

It has already taken a long time for the technology to become truly road-ready. I remember, in around 1993, being driven in a technology-demonstrating Jaguar on a fairly quiet French motorway. The car was able to stay in lane by tracking the two white lane markers, and could adjust its distance from the vehicle in front.

And Elliott Sclar argued, rather persuasively, that ...

All transport is interaction between moving vessel and travel medium. Land-based modes require major capital investments in suitable physical infrastructure and administrative co-ordination between vehicle and travel medium. To date it has been implicitly assumed that existing configurations of streets, roads, traffic signalling, sidewalks and so on remain about as they are; only the physical driver disappears. It is important to remember that the present configurations evolved assuming human-operated rolling stock. The effectiveness, efficiency and safety of the new mode will greatly depend upon the degree to which the infrastructure is modified to accommodate its technology. All the adjustment cannot and will not be done in the vehicle.

This seems likely to be particularly true in major British cities where cautious autonomous vehicles will find it very difficult to understand when it is safe to change lane (think Hyde Park Corner!) or slip out of a side road into a busy High Street. Indeed, driverless cars will need to be programmed - on occasion - to mount the pavement so as to drive round a broken-down vehicle or make way for an oncoming car or an emergency vehicle. And they will need to be able to edge their way through pedestrians after major sports events, for instance. Programming such behaviour will be far from easy.

And will we hold autonomous vehicles to higher standards? For instance:

Research (such as the Moral Machine Experiment) demonstrates that people will generally say that they prefer to spare a young life in preference to an older one - but such preferences vary considerably between cultures, so whose morality should be programmed into the vehicles?

But vehicle manufacturers could be required to design safety into their products:- soft bumpers, say.

Anyway, if vehicles were programmed to follow the rules of the road, and were never driven by tired or inattentive or drunk drivers, that would instantly save loads of lives.

The government announced in November 2017 that self-driving cars would be in use in the UK by 2021, and that insurers would be required to cover injuries to all parties whether or not a human driver had intervened before his or her vehicle was involved in a collision. This implies a fundamental shift in road traffic law away from personal liability to product liability. And which manufacturer will be held liable - that of the hardware (the car) or the software?

Follow this link to read about the psychology involved in our attitude to Risk and Regulation.

Autonomous ships may be on the way, too, though the main aim will be to reduce the number of marine accidents, 80% of which are currently caused by human error. But the savings that can be achieved by removing the final few crew members will be pretty small, and crewless ships could be more vulnerable to piracy. It was therefore interesting to read that 79% of maritime professionals expected the first autonomous ships to arrive by 2028, and a further 14% by 2038 - with the remainder thinking it would take longer or would never happen. The corresponding figures for autonomous ships being commonly deployed were 16%, 56% and 28% respectively. We shall see!!

I suspect we shall see autonomous aircraft rather sooner than that - though it is currently hard to imagine a plane without someone up front.

Bitcoin and other Crypto-Currencies

Decentralized digital currencies, which use blockchain technology, feel like they are only a small and attractive step from where we are now.

Apart from my share in our house and car, all my significant assets are represented by bits and bytes in the IT systems of various financial institutions. I trust those institutions, of course, partly because they are so heavily regulated, and backed by the Government in the form of the Financial Services Compensation Scheme. But then I think about the financial crisis, and the way in which the true value of my financial assets is affected by interest rates and inflation, over which I have zero control. I remember all too clearly how the value of my financial assets fell by nearly one-fifth following the Brexit referendum.

Crypto-currencies, too, are no more than bits and bytes, but they are registered in a peer-to-peer database that is controlled by no-one and with which no-one can meddle. That can't be bad. On the other hand, their value, too, currently fluctuates wildly in response to real world events such as Brexit. And can we truly trust those that control the creation of the currencies and the associated databases?

The key difference, I guess, is that Bitcoin and the rest are truly international. Unlike Sterling, the Dollar or the Renminbi, they are not linked to any one country or influenced by any one government. If their use continues to grow, will governments seek to regulate them or their users? And could they succeed? Many say not.

Blockchain, by the way, is a harmless - and indeed valuable - technology. It is essentially a database which stores records of transactions in units called blocks. New blocks get added to the end of ever-growing chains of blocks, which are in turn copied to many computers on the internet. This elaborate, distributed process makes the blockchain tamper-proof - an unfalsifiable record of the transactions that made it.
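The chaining just described can be sketched in a few lines of Python - a toy illustration only, with the distribution and consensus machinery of a real blockchain left out:

```python
import hashlib
import json

# A toy hash-chained ledger: each block stores some transactions plus the
# hash of the previous block, so altering any earlier block changes its
# hash and breaks every subsequent link in the chain.

def block_hash(block: dict) -> str:
    # Canonical JSON so the same block always hashes to the same value.
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def add_block(chain: list, transactions: list) -> None:
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev_hash": prev, "transactions": transactions})

def is_valid(chain: list) -> bool:
    """Check every block records the correct hash of its predecessor."""
    return all(
        chain[i]["prev_hash"] == block_hash(chain[i - 1])
        for i in range(1, len(chain))
    )

chain = []
add_block(chain, ["Alice pays Bob 5"])
add_block(chain, ["Bob pays Carol 2"])
print(is_valid(chain))                             # True
chain[0]["transactions"] = ["Alice pays Bob 500"]  # tamper with history
print(is_valid(chain))                             # False
```

Because each block commits to the hash of its predecessor, rewriting history would require recomputing every later block on most of the copies held across the network - which is what makes tampering impractical.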

CRISPR and Gene Editing

CRISPR technology is now widely available. Tweaking individual letters of genetic code, it takes just hours to adjust what evolution has fashioned over billions of years. A detailed discussion is here.

The Environment

Climate change is clearly a big regulatory issue. My Energy Regulation pages look at the implications for UK energy policy, where price rises caused by pro-renewable and pro-nuclear policies have met considerable resistance.

Environmental protection costs also loom large in Water Industry Regulation. And the UK government in 2018 started pressing for international action to clean plastics from the oceans.

But there is bound to be more to come. The world's population is still increasing very rapidly, having reached about 7.6 billion in 2018. The average fertility rate - around 2.5 children per woman in 2018 - is still way above the steady state figure of 2.1. There is loads of room for argument about the maximum population that the world could sustain, but - whatever it is - it can only be reached as a result of international regulatory cooperation.
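A very crude projection shows how quickly fertility above replacement compounds. This sketch treats each generation as (fertility rate ÷ 2.1) times the size of the last, deliberately ignoring age structure, mortality and the 25-30 year length of a generation; the figures are approximate:

```python
# Crude illustration of compounding fertility: each generation is roughly
# (TFR / replacement rate) times the size of the previous one.

TFR = 2.5            # average births per woman, c. 2018
REPLACEMENT = 2.1    # steady-state ('replacement') fertility rate
population = 7.6e9   # approximate world population, late 2010s

for generation in range(1, 4):
    population *= TFR / REPLACEMENT
    print(f"After {generation} generation(s): {population / 1e9:.1f} billion")
```

Even this modest-looking gap of 0.4 births per woman multiplies the population by about 1.19 per generation - around 9 billion after one generation and nearly 13 billion after three, on these simplifying assumptions.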

Games & Gaming

Competitive computer games are big business nowadays. Valve Corporation, for instance, hosts an annual international DOTA tournament where professional teams compete for a prize pool which had reached $25m by 2017. Intriguingly, DOTA is free to play - but the prize pool is funded through the purchase of 'compendiums' - self-updating, interactive booklets that track the tournament's statistics. It is less obvious, however, that Valve creams off 75% of the purchase price before transferring the balance to the prize fund. Nice work if you can get it! But is it abusive? Or are customers deserving of greater protection?
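Using the figures in the text - a $25m pool, with only 25% of each purchase reaching it - a quick back-of-envelope calculation shows the scale of the sales involved:

```python
# Back-of-envelope arithmetic for the compendium split described above:
# if only 25% of each purchase reaches the prize pool, and the pool hit
# $25m, gross compendium sales must have been around $100m.

PRIZE_SHARE = 0.25        # fraction of each purchase reaching the pool
prize_pool = 25_000_000   # 2017 prize pool, in dollars

gross_sales = prize_pool / PRIZE_SHARE
valve_share = gross_sales - prize_pool
print(f"Gross sales: ${gross_sales:,.0f}")   # Gross sales: $100,000,000
print(f"Valve's cut: ${valve_share:,.0f}")   # Valve's cut: $75,000,000
```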

It is also interesting that many in-game purchases open 'loot boxes' whose contents are unpredictable. It's a relatively gentle form of gambling - but it has, as such, been banned in Belgium and the Netherlands. One (perhaps slightly exaggerated) comment explained players' reasoning as follows:- "99% of people who open these things are going for the rares; otherwise it'd be cheaper 90% of the time to just buy it off the market and not play the odds."

The Gig Economy inc. Uber

Information technology is facilitating new ways of ordering goods and services to be delivered to the door, including books (and much more) from Amazon, taxis (in particular from Uber), food etc. (from supermarkets), and meals (Deliveroo etc.). This is to be welcomed. (See for instance Robert Hahn and Robert Metcalfe's The Ridesharing Revolution.) But there are some unwelcome consequences.

First, the gig economy facilitates new and arguably onerous ways of employing those who prepare and deliver many goods and services. They can be required to enter into contracts under which they are treated as self-employed contractors rather than as employees or workers.

These arrangements can be tax efficient for both 'employer' and 'employee' and they suit many individuals very well. But they can also be exploitative, leaving workers without essential protections. It is far from clear that one-sided contracts are in the long term interests of the individuals or society. Numerous cases are working their way through the courts as lawyers seek to define the boundary between being a 'worker' and being truly self-employed.

The self-employed also pay much less by way of National Insurance Contributions (NICs) despite the fact that, since the introduction of the new state pension in 2016, they get pretty much the same state benefits as employed people. The government attempted to address this by increasing NICs in 2017 but this proved very unpopular and was abandoned. The proposal also appeared to pre-judge or forestall the recommendation of the Taylor Review of modern working practices published a few months later. (See also my web page commenting on weaknesses in HMRC.)

The gig economy can also be economically devastating for those previously sheltered from such competition. Here is an extract from a report in the New York Times.

For decades there had been no more than 12,000 to 13,000 taxis in New York but now there were myriad new ways to avoid public transportation, in some cases with ride-hailing services like Via that charged little more than $5 to travel in Manhattan. In 2013, there were [already] 47,000 for-hire vehicles in the city. Now [in 2018] there were more than 100,000, approximately two-thirds of them affiliated with Uber.

While Uber has sold that “disruption” as positive for riders, for many taxi workers, it has been devastating. Between 2013 and 2016, the gross annual bookings of full-time yellow-taxi drivers in New York, working during the day when fares are typically highest, fell from $88,000 a year to just over $69,000. Medallions, which grant the right to operate a taxi in New York City, were now depreciating assets and drivers who had borrowed money to pay for them, once a sound investment strategy, were deeply in debt. [NY Taxi Drivers representative] Ms. Desai was routinely seeing grown men cry and she had become increasingly concerned about the possibility that they would begin taking their lives.

There is a separate issue concerning the companies' willingness to adjust to local culture and regulation. The BBC commented in September 2017 that "Throughout its short, tempestuous life, Uber has clashed with regulators around the world - and more often than not it has come out on top. Its tactic has often been to arrive in a city, break a few rules, and then apologise when it's rapped over the knuckles. Some regulators have backed down, others have run the company out of town."

In the UK, Transport for London (TfL) have been prominent in attempts to regulate Uber in the same way as they regulate other taxi providers. The Court of Appeal, for instance, backed TfL's insistence that Uber provide a facility for passengers to talk to the company over the phone (or via a voice call over the internet) if the passenger does not want to use Uber's app for this purpose. And, having been threatened with the removal of their licence, Uber eventually conceded a string of safety failings before magistrates gave it a 15-month probationary licence running from June 2018. The judge criticised the company's "gung ho" attitude and its attempts to "whip up public outcry" by launching a public attack on TfL rather than accepting blame for its mistakes and misbehaviour. Uber's probation was made subject to 14 conditions including adding independent members to its Board.


Neuro-Technology

We have all grown up believing that, although our physical behaviour can easily be constrained and dominated by others, our minds, thoughts, beliefs and convictions are to a great extent beyond external constraint. As John Milton said, "Thou canst not touch the freedom of my mind". But advances in neural engineering, brain imaging and neuro-technology mean that the mind may soon not be such an unassailable fortress. Elon Musk and others are developing tools such as brain-computer interfaces which aim to read neural activity directly.

This suggests that we will, at the very least, require improvements to laws around data analysis and collection. But some scientists argue that human rights law will need to be updated to take into account the ability of governments not only to peer into people's minds but also alter them.

Protecting Key Infrastructure

Remember the stories about the Russian hacking of Western databases, and the Stuxnet attack on Iranian nuclear industry centrifuges? Much Western infrastructure is nowadays in private hands, so whose responsibility is it to defend it? Government is understandably reluctant to take on such a massive task, but industry is understandably unwilling to foot the bill. The answer, in the UK at least, is that the owners of designated Critical National Infrastructure have a legal duty to safeguard it, advised and monitored by the Centre for the Protection of National Infrastructure or the National Cyber Security Centre.

Sex Robots

The Foundation for Responsible Robotics published an interesting report Our Sexual Future with Robots in July 2017. The report discussed a number of questions raised by increasingly lifelike robots such as Sophia.

The pace of change in this area certainly seems likely to require some form of regulatory response before too long.

The Technology Giants

Numerous issues associated with Google/YouTube, Facebook, Amazon, Airbnb, Uber etc. are explored here.

Designer Babies (Polygenic IVF Screening)

DNA and other testing has helped numerous parents to avoid giving birth to children with Down's Syndrome, the BRCA1 breast cancer gene and a few other inheritable traits. But it is now (as of November 2018) possible for polygenic techniques to analyse several genetic regions at the same time and so begin to predict other diseases - and intelligence. It will not be long before a clinic somewhere in the world begins to offer parents the ability to screen for embryos that are more or less likely to have features such as high or low intelligence, sexuality, autism, or susceptibility to depression.

Someone is going to have to decide whether such screening is ethical. Should this be societies as a whole, or prospective parents?


Martin Stanley


Spotted something wrong?
Please do drop me an email if you spot anything that is out-of-date, or any other errors, typos or faulty links.