The UK AI Strategy: are we listening to the experts?
The emerging UK National AI Strategy is out of step with the needs of the nation’s technical community and, as it stands, is unlikely to result in a well-functioning AI industry. The Data Science & Artificial Intelligence Section (Royal Statistical Society) asks whether the government has actively sought the views of expert practitioners.
The UK government has released plans for a new AI Strategy, with the stated goal of making 'the UK a global centre for the development, commercialisation and adoption of responsible AI'. We asked our members—UK-based technical practitioners of artificial intelligence—their opinion of the plans. Our results point to a fundamental disconnect between the roadmap for the Strategy and the views of those actually building AI-based products and services in the UK.
The basis of the AI Strategy is the AI Council's 'AI Roadmap', which was developed with input mainly from public sector leaders and university researchers. The AI Council does not appear to have engaged with engineers and scientists from the commercial technology sector.
Tech companies commercialise AI, not universities. Yet among the 52 individuals who contributed to the Roadmap, only four software companies are represented. There are 19 CBEs and OBEs, but not one startup CTO.
Hoping to fill this gap, we surveyed our community of practising data scientists and AI specialists, asking for their thoughts on the Roadmap. We received 284 detailed responses; clearly the technical community cares deeply about this subject.
Only by direct engagement with technical specialists can we hope to uncover the key ingredients of a successful AI industry. For example, while the AI Roadmap focusses on moonshots and flagship institutes, the community seems to care more about practical issues such as open-source software, startup funding and knowledge-sharing.
The economic opportunity of AI represents at least 5% of GDP (compared with fisheries, at about 0.05% of GDP). If the National AI Strategy does not correctly identify the challenges that lie ahead, this opportunity will be squandered.
We will publish our findings in four parts, covering the different sections of the AI Roadmap. This first part covers AI research and development.
Comparison with the AI Roadmap for R&D
Three areas are central to the Roadmap's plans for R&D: the Alan Turing Institute, moonshots (such as 'digital twin' technology) and 'AI to transform research, development and innovation'. These topics were scarcely mentioned by our respondents, despite being listed as potential subjects for discussion.
For example, the Alan Turing Institute was mentioned only four times by respondents; two of those mentions were negative.
There were seven responses on the topic of moonshots, three of them negative, and 'digital twins' were not mentioned at all. One respondent wrote:
"moonshotting" [...] without a solid foundation and shared values would destroy the field in perpetuity.
The central concerns of the Roadmap may sound plausible on paper but they don't resonate strongly with the technical community.
Better collaboration between academia and industry
By far the most frequently mentioned topic was better collaboration between academia and industry, which was addressed by 52 respondents. To summarise: knowledge transfer between academia and companies is not currently working. The UK's strength in academic research will be wasted if industry and academia cannot easily learn from each other.
The Roadmap barely addresses this topic, other than one mention of the pre-existing Knowledge Transfer Partnerships (KTP) scheme. Yet our practitioner community thinks that clearing this obstacle should be at the core of the Strategy. A typical request was:
Better sharing of knowledge and experience between universities and industry, specifically industry use case examples.
Many voices suggested that knowledge transfer should also operate in the opposite direction:
The knowledge transfer deficit is in the opposite direction: industry making investment and research headway while universities cannot compete.
Encourage adoption of good software engineering practices amongst researchers.
Another key concern is the brain drain from academia to industry:
UK universities were leading in the AI space until the industry (Google, Msft, Amz, FB) started poaching all the top professors [...]
There needs to be strong support for this area in academia to stop 'brain drain' to big tech companies and allow UK to make research advances that will allow competitive advantages for startups.
Open source
40 respondents recommended that the Strategy focus on open source, making it the second most mentioned issue in the entire survey. Strikingly, the AI Roadmap doesn't contain a single mention of the term 'open-source'.
Many respondents suggested that funding positions for contributors to key open-source projects would bring substantial benefits. This is well founded: when Columbia University hired core developers of the Scikit-learn open-source project, it facilitated knowledge transfer and training on cutting-edge techniques.
Open source should be embraced by the Government, it sends a positive message about intent and helps to draw in the right talent to the field (most people learning practical machine learning will start their experience in open source).
Support for startups
40 responses called for support for startups through direct funding, incubators, tax breaks and other measures such as access to compute infrastructure.
More funding and assistance for AI startups, and assisting their collaboration with UK-based research and universities.
Funding for AI and Deep Tech startups.
Funding/grants for startups for the use of cloud computing infrastructure.
Ethics
26 respondents wanted to see consideration of ethics at the heart of future AI innovation. For example:
Finally, I think governance of how AI and DS are used by the private sector is very important, and something that, in my opinion, should be a priority for any government AI roadmap.
If you fail to identify and analyze the obstacles, you don’t have a strategy
We draw attention to the work of UCLA strategy researcher Richard Rumelt, who offers a specific warning: 'If you fail to identify and analyze the obstacles, you don’t have a strategy'. Has the AI Roadmap made this mistake? Its 37 pages contain no clear analysis of the obstacles standing in the way of a strong AI industry.
Identification and analysis of these obstacles requires close and sustained collaboration with AI practitioners; our survey is just a starting point. We urge the Office for AI to engage directly with the technical community before finalising the AI Strategy.
Sign up to the Data Science & AI Section if you are interested in this topic
Data Science and AI Section (Royal Statistical Society) Committee
Chair: Dr Martin Goodson (CEO & Chief Scientist, Evolution AI)
Vice Chair: Dr Jim Weatherall (VP, Data Science & AI, AstraZeneca)
Trevor Duguid Farrant (Senior Principal Statistician, Mondelēz International)
Rich Pugh (Chief Data Scientist, Mango Solutions (an Ascent Company))
Dr Janet Bastiman (Head of Analytics, Napier AI. AI Venture Partner)
Dr Adam Davison (Head of Insight & Data Science, The Economist)
Dr Anjali Mazumder (AI and Justice & Human Rights Theme Lead, Alan Turing Institute)
Giles Pavey (Global Director – Data Science, Unilever)
Piers Stobbs (Chief Data Officer, Cazoo)
Magda Woods (Data Director, New Statesman Media Group)
Dr Danielle Belgrave (Senior Staff Research Scientist, DeepMind)
Appendix: Analysis
Our survey was designed to bring out the voice of the technical community. We asked leading questions, prompting respondents with topics from the AI Roadmap as well as other topics we thought might be of interest to the community. We collected free-text responses.
Our analysis is subjective and we will make our full dataset available for independent analysis. We do not make any quantitative claims, because our sample is biased (for example, geographically).
We included a single quantitative question: 'To what extent do you agree that these are the top priorities for the UK in AI Research, Development & Innovation? (5 means 'Strongly agree')'. Responses could range from 0 to 5. The average response was 3.4 (neither agree nor disagree).
We received 284 responses in total. We selected qualified respondents by requiring:
They declared they were either “a practising data scientist” or “used to be a practising data scientist”
They declared they were “an individual data science contributor”, “a line manager of data scientists” or “a senior leader involved in data science”
After applying these requirements, 245 qualified responses remained. 118 (47%) of respondents identified as either 'Managers' or 'Senior leaders'.
To interpret our results, we made a crude manual classification of every comment and focused on those topics mentioned by at least 20 respondents.
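For readers who want to repeat this selection and counting step once the dataset is released, a minimal sketch in Python is given below. The file name and the column names (practitioner_status, role, topics) are illustrative placeholders rather than the actual field labels in our dataset, and the topic labels are assumed to be stored as a semicolon-separated list per response.

    import pandas as pd
    from collections import Counter

    # Load the survey export; the file name and column names are placeholders.
    responses = pd.read_csv("ai_roadmap_survey.csv")

    # Keep only qualified respondents: practising (or former) data scientists
    # who are individual contributors, line managers or senior leaders.
    practitioner_ok = responses["practitioner_status"].isin([
        "a practising data scientist",
        "used to be a practising data scientist",
    ])
    role_ok = responses["role"].isin([
        "an individual data science contributor",
        "a line manager of data scientists",
        "a senior leader involved in data science",
    ])
    qualified = responses[practitioner_ok & role_ok]
    print(f"{len(qualified)} qualified responses")

    # Count the manually assigned topic labels and keep those mentioned
    # by at least 20 respondents.
    topic_counts = Counter(
        topic
        for labels in qualified["topics"].dropna()
        for topic in labels.split(";")
    )
    print({topic: n for topic, n in topic_counts.items() if n >= 20})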
The declared demographics of our qualified respondents were primarily male (77%) and white (75%). We note that only 60% answered the demographic questions.
The Data Science and AI Section is grateful for the support of our partner communities PyLadies London, PyData London, PyDataUK, London Machine Learning and the Apache Spark+AI Meetup, representing a combined (overlapping) membership of 27K data scientists and technologists.