The UK's Online Safety Act creates a framework for addressing online harms that includes provisions potentially applicable to AI addiction. As Ofcom develops guidance for implementing the Act, the treatment of AI services — particularly those designed for emotional connection — will shape how AI addiction is addressed in British law.

Duty of care framework

The Act establishes a duty of care for online services, requiring platforms to assess and mitigate risks of harm to users. If AI addiction is recognized as a form of online harm, AI platforms would need to implement measures to identify and reduce addictive dynamics.

Child safety provisions

The Act includes strong provisions for protecting children online. AI companion services used by minors may face requirements for age-appropriate design, content controls, and usage limitation features that directly address dependency concerns.

Transparency and accountability

The Act requires certain transparency and accountability measures that could extend to AI services: regular reporting on user harms, independent audits of safety measures, and accountability for design decisions. Each of these could, in principle, be applied to addictive AI design.

Implementation challenges

Applying the Online Safety Act to AI services presents practical challenges: defining what constitutes harmful AI engagement, measuring addiction-related harm, and establishing standards appropriate to AI-specific risks rather than those designed for conventional social media platforms.

Staying informed

As the Act's implementation develops, UK users of AI services should stay informed about their rights and the protections available to them. Regulatory frameworks only protect users who know those protections exist.

Understand how AI affects you. Our assessment provides personal insight.