
Public support for data innovation depends on regulation, report finds
08 March 2021
by Sarah Wray
A new report from the UK’s Centre for Data Ethics and Innovation (CDEI) highlights the range of ways data is being used to respond to the COVID-19 pandemic, as well as public attitudes towards these uses.
Although attention in the UK has largely centred on applications such as contact-tracing apps and a controversial algorithm used to determine qualifications in the absence of exams, broader uses have included drones delivering medical supplies to remote regions and health equipment databases that monitor the availability of NHS assets.
City applications around the world have encompassed artificial intelligence (AI) algorithms which monitor mask-wearing and social distancing; use of mobility data to measure ‘busyness’; digital innovation matchmaking tools; and apps to help residents buy from local businesses.
Novel uses of AI and data to counter and mitigate the effects of the pandemic are being tracked in the CDEI’s COVID-19 repository.
The latest analysis finds that aside from advancing vaccine research, AI did not play the outsized role many thought it would in relief efforts, in part due to a lack of access to data on COVID-19 to train algorithms. Instead, conventional data analysis, underpinned by new data-sharing agreements, has made the biggest difference to the work of health services and public authorities.
Public perception
The research suggests the public could support more advanced data innovation.
The longitudinal study, with a representative sample of over 12,000 people, ran from June to December 2020. Almost three-quarters (72 percent) of respondents – across all demographic groups – felt that digital technology had the potential to help in the response to the outbreak. A majority of the public (an average of 69 percent) also showed support, in principle, for a number of specific use cases, including technologies that have not been widely adopted, such as wearables to aid social distancing in the workplace and the use of personal data to inform local lockdowns.
Many respondents felt that the potential of data-driven technology was not being fully realised. Fewer than half (42 percent) said digital technology was making the situation in the UK better, while just seven percent said it was making matters worse.
Respondents cited concerns about whether people and organisations would be able to use the technology properly (39 percent). This was more than double the number who pointed to problems with the technology itself (17 percent).
The research found a strong relationship between trustworthy governance and support for the adoption of new technologies. Controlling for all other variables, the CDEI found that ‘trust that the right rules and regulations are in place’ is the single biggest predictor of whether someone will support the use of digital technology. Fewer than half (43 percent) said existing rules and regulations were sufficient to ensure the technology is used responsibly, while almost a quarter (24 percent) disagreed. Older respondents tended to have lower levels of trust in the existing rules and regulations.
Governance
The CDEI, which was set up by the UK government as an independent advisory body in 2018, urges action to build trustworthy governance that earns the confidence of citizens over the long term, pointing to principles outlined in its Trust Matrix, such as enhancing accountability and transparency.
Edwina Dunn, Deputy Chair of the Centre for Data Ethics and Innovation, said: “Data-driven technologies including AI have great potential for our economy and society. We need to ensure that the right governance regime is in place if we are to unlock the opportunities that these technologies present.”
The research comes as the pandemic has prompted a greater appreciation, at all levels of government, of the power of data to address pressing problems.
As digital technology touches more areas of people’s lives, trust and governance are now key areas of focus for many smart city programmes around the world. Examples include London’s Emerging Technology Charter and Helsinki and Amsterdam’s AI Registers.
Image: Anna Shvets/Pexels