"Automated individual decision-making" under Article 22 of the UK GDPR refers to making significant decisions about individuals based solely on automated processing - without meaningful human involvement - often using algorithms or artificial intelligence tools. It can include profiling, i.e. the automated processing of personal data to evaluate an individual's behaviour, preferences, or status.
In practice, many decisions described as automated involve human intervention at some stage. If a human reviews the output and has genuine discretion to alter the decision, rather than simply rubber-stamping the machine's determination, the processing falls outside the ADM rules. The ICO is expected to expand on what counts as meaningful human involvement in fresh guidance.
The ADM rules apply only where the processing results in "significant decisions" - those with legal or similarly significant effects, such as decisions affecting someone's employment opportunities, financial position, access to essential services, health, reputation, behaviour or choices. Typical examples include automatically shortlisting or rejecting job applicants in recruitment, approving or declining loan or credit applications in financial services, determining eligibility for welfare benefits, and dynamically adjusting insurance premiums based on risk scoring.
Routine business activities that do not have a material impact on individuals' rights or circumstances - such as prioritising customer service tickets, displaying website content, or recommending TV programmes based on user preferences or viewing habits - generally fall outside the ADM rules. Context matters, however: if automated marketing specifically targets a vulnerable group (such as children), the ADM rules may apply because of the significant potential impact on that group's choices or behaviour.