
Core Requirements


Working closely with communities will be vital for ensuring that automated decision-making systems are used appropriately. To achieve this, public sector organisations need to:

Summary infographic of the 8 Core Requirements listed below.

1. Ensure there are clear procedures for making sure automated decision-making systems are used responsibly. Clearly define who is accountable for their appropriate use, and for ensuring reliable governance processes are in place.

a. Mirroring requirements for Data Protection Officers, there should be a senior accountable officer within organisations who has responsibility for ensuring that automated decision-making systems are used appropriately, and that adequate governance processes are in place. They should be appropriately qualified and publicly contactable.

b. Governance processes should include proportionate measures for monitoring and evaluating whether automated decision-making systems continue to function properly and to be used appropriately.

c. Governance processes should be based on ensuring mistakes will be acknowledged, documented, and learnt from.

d. Governance processes should complement existing requirements such as Data Protection Impact Assessments, and Equality Impact Assessments.

e. Deployment of new automated decision-making systems should allow for the right people within organisations to be involved at the right point, so that their expertise is seen as constructive, not obstructive.

2. Use governance processes to make sure there is proportionate human oversight and scrutiny over how automated decision-making systems are being used, and how these systems work.

a. As part of this, there will often need to be meaningful human involvement in decisions, rather than leaving these entirely to automated systems. Evolving legal standards, such as those in the UK GDPR, define when this is necessary; these should be treated as a minimum standard.

b. Governance measures around an organisation’s use of automated decision-making systems should include scrutiny from people external to the organisation who have adequate knowledge and skills. This scrutiny should be proportionate to the risks involved in particular applications.

3. Publish clear, accessible and understandable information about how automated decision-making systems work, and why they are being used, whenever they have a significant impact on decisions with a ‘public effect’.

Also publish this information for algorithmic tools in general wherever they interact directly with the public.

Make clear, at the point where individuals interact with algorithmic systems, that these systems are automated. For example, clearly label at the start of a chat that someone is interacting with a chatbot rather than a human.

a. The UK Government’s Algorithmic Transparency Recording Standard, and its accompanying guidance, should be used as a minimum to decide what information should be shared, and when this is required.

b. Ideally this should include sharing the algorithms used in their entirety, though this may not always be feasible.

It should be noted that there are existing legal obligations for transparency around data use, such as requirements for data protection impact assessments as laid out in the UK GDPR. Meeting those obligations may help with providing the information asked for here.

4. Use proportionate public engagement, and apply what is learned from it, to ensure automated decision-making systems are employed appropriately. Whether public engagement is necessary, and the level of engagement required, will vary based on what is at stake.

a. As part of this, public engagement should also inform the review of existing uses of automated decision-making systems.

b. Existing legal standards around the need for public consultation, including those specified in the Equality Act and the UK GDPR/Data Protection Act 2018, should be treated as a minimum.

c. Public engagement should not be planned solely around the needs of individual projects and teams. Instead, organisations should think about how to create meaningful and joined-up opportunities for participants.

d. Through this public engagement, organisations should make sure that they are not the only ones judging levels of risk, but instead involve a wider range of perspectives.

e. Where public engagement is required, adequate time and resources for this must be factored-in before deploying new technologies.

5. Put in place mechanisms for raising concerns about automated decision-making systems, and for appealing decisions, wherever these systems have a significant impact on decision-making with a ‘public effect’.

a. These should be meaningful opportunities that are realistically accessible for those who may be affected by these technologies.

b. Concerns and appeals should be reviewed and responded to in a timely way, with clear information provided upfront about how quickly responses will be given.

c. Rights around the use of personal data, such as those laid out in the UK GDPR, may be part of achieving this. Existing mechanisms for appealing specific types of decision may also play a role. Raising awareness of these existing rights and mechanisms may be an important aspect of meeting this requirement.

6. Grow awareness across teams of the benefits and risks of automated decision-making systems, the need to think carefully about how they are used, and the role of public engagement within this. Without this awareness, there is a risk that none of the measures described here will happen.

a. There should be clearly assigned responsibility for growing this awareness, using steps like awareness-raising campaigns and training.

b. Ideally this should include growing public engagement skills across teams to foster collaboration with residents around how to use these systems.

c. This topic should not be viewed as just a technical issue. Instead, teams need to think about the impacts on people’s lives.

7. Ensure other organisations partnered with, or commissioned by, the public sector are aware of their responsibilities under these principles, and have the support necessary to live up to them.

a. When public sector bodies partner with, or commission, others as part of delivering services, any automated decision-making systems involved in those services should still be subject to the requirements of these principles. This includes obtaining services from other providers, or using third-party solutions.

b. Public sector organisations should ensure those they partner with, or commission, understand their responsibilities for achieving this. They should consider providing resources, support and training where appropriate to help organisations less far along this journey to meet these requirements.

c. Public sector organisations should collaborate with a wide range of organisations to make good use of automated decision-making systems, including partners in the private sector and the voluntary, community, faith, and social enterprise (VCFSE) sector. This should be done in a way that protects people’s data rights and is deserving of their trust.

8. Reflect on a regular basis on how well the standards laid out in these principles are being met, including how well public engagement is being used as part of this.

a. This reflection should be built into governance structures around the use of automated decision-making systems.

b. Organisations that sign up to these principles are encouraged to provide a public statement reflecting on how they will meet the standards set out in these principles, and to commit to regularly reviewing how well they have met them (for example, on a yearly basis). Where the principles have not been adhered to, this should include reflection on why this has happened and what steps are being taken to change it.

Read the full What Public Engagement Should Look Like values

Read the full Guiding Values