UAE’s Child Digital Safety Law: Key Takeaways for Businesses and Service Providers
The UAE’s Child Digital Safety Law introduces a comprehensive, risk-based framework regulating digital platforms, internet service providers, and custodians, with broad extraterritorial reach and significant obligations relating to child protection, data privacy, age verification, content moderation, and reporting.
The United Arab Emirates (UAE) has introduced a new federal framework aimed at strengthening the protection of children in the digital environment through Federal Decree-Law No. 26 of 2025 on Child Digital Safety (hereinafter, the Child Safety Law).
The legislation reflects the UAE’s growing focus on regulating online spaces that are increasingly accessed by children, and on holding digital service providers accountable for the risks associated with digital content, data processing, and online engagement.
The UAE’s Child Safety Law entered into force on January 1, 2026, with a one-year transitional period granted to in-scope entities to align their operations with the new requirements. Full enforcement is therefore expected from January 2027, allowing businesses time to assess their exposure, review existing safeguards, and prepare for the introduction of more detailed obligations through implementing regulations.
At a policy level, the Child Safety Law seeks to protect children from harmful digital content, enhance privacy and personal data protections, and establish a more structured governance framework for digital platforms operating in or targeting the UAE. By introducing age-based safeguards, restrictions on certain online activities, and reporting and monitoring obligations, the law signals a shift toward a more proactive and risk-based approach to child digital safety, with significant implications for platforms, internet service providers, and custodians alike.
In this article, we outline the key features of the Child Safety Law, examine its scope and core obligations for digital platforms, internet service providers, and custodians, and highlight the main compliance considerations for businesses ahead of the issuance of the implementing regulations.
Key takeaways
The Child Digital Safety Law establishes a high-level, principles-based regulatory framework, with many of its operational and technical requirements to be clarified through forthcoming Cabinet decisions and implementing regulations. While detailed compliance standards are still pending, the law already introduces binding obligations and clear regulatory expectations for in-scope entities.
Its scope is deliberately broad, covering digital platforms, licensed internet service providers, and custodians of children. Importantly, the law applies not only to entities operating within the UAE, but also to foreign platforms and service providers that target users in the country, significantly expanding its jurisdictional reach.
A central feature of the framework is a risk-based approach to platform regulation, under which digital platforms will be classified according to factors such as content, scale of use, and potential impact on children. This classification will determine the intensity of obligations, including age verification, content controls, reporting, and parental safeguards.
Taken together, these elements create substantial compliance implications for digital service providers that offer services accessible to children in the UAE. Even in the absence of a physical presence in the country, platforms targeting UAE users may be subject to enforcement actions, including blocking or closure, for non-compliance.
Scope of application
Digital platforms
The Child Digital Safety Law adopts an expansive definition of “platforms,” encompassing any digital platform or entity that enables access to digital content or services. This includes, among others, social media platforms, streaming services, e-commerce platforms, gaming platforms, smart applications, search engines, and websites.
A defining feature of the law is its extraterritorial reach. The Child Safety Law applies not only to platforms operating within the UAE, but also to those operating outside the country where their services are directed at users in the UAE. While the law does not expressly define what constitutes “directed” services, its consistent use of a dual nexus (operating within the State or targeting users in the State) indicates a broad interpretation. This is particularly relevant for platforms whose services are accessible to children or whose content may be consumed by child users in the UAE.
Further clarification on the meaning of “directed” services, including relevant technical or commercial indicators, is expected to be provided in the implementing regulations and related Cabinet decisions.
Internet service providers
The Child Safety Law also applies to internet service providers licensed under the UAE Telecom Law. At present, this includes the UAE’s licensed telecom operators, namely du and Etisalat (e&).
Internet service providers are subject to a distinct but complementary set of obligations aimed at safeguarding children at the infrastructure level. These include the implementation of content filtering mechanisms, the provision of parental control tools, and compliance with reporting obligations relating to harmful content and child sexual abuse material. Oversight of internet service providers’ compliance falls primarily under the Telecommunications and Digital Government Regulatory Authority (TDRA), which is empowered to issue policies, conduct reviews, and enforce ongoing compliance.
Custodians of children
The Child Safety Law extends certain obligations to custodians of children, defined as parents or legal guardians of individuals under the age of 18. Custodians are required to take reasonable steps to monitor their children’s digital activities, use available parental control tools, and prevent access to content that is inappropriate for the child’s age.
Custodians are also prohibited from exposing or exploiting children in ways that threaten their safety, privacy, or dignity, and are required to report any child sexual abuse material or harmful digital content to the competent authorities. While these obligations are clearly articulated at a policy level, open questions remain regarding enforcement mechanisms, evidentiary thresholds, and the practical application of sanctions against custodians. Further guidance is expected through implementing regulations.
Risk classification framework
A cornerstone of the Child Safety Law is the introduction of a platform risk classification system, to be issued by Cabinet decision. This system will apply to both domestic and foreign digital platforms and will serve as the basis for determining the scope and intensity of compliance obligations.
Platforms will be classified based on a range of criteria, including platform type, nature of content, scale of use, user demographics, and potential impact on children. The classification regime is intended to ensure proportionality, aligning regulatory requirements with the level of risk posed to child users.
The framework will also introduce age-based access controls, with differentiated safeguards depending on the age groups accessing a platform. Higher-risk platforms, particularly those with significant child engagement or exposure to potentially harmful content, are expected to face enhanced obligations, including stricter age verification measures, more robust content moderation systems, and heightened reporting and transparency requirements.
Until the classification system is formally issued, platforms should assume that services with substantial child user bases or child-facing content will be subject to closer regulatory scrutiny and more stringent compliance expectations.
Key obligations for digital platforms
While the Child Digital Safety Law establishes a principles-based framework, it already imposes a wide range of substantive obligations on digital platforms, with further specificity expected through implementing regulations and Cabinet decisions.
Gambling and commercial gaming restrictions
Digital platforms are prohibited from allowing children to access online commercial games, including through advertising, promotion, or indirect exposure. This prohibition extends beyond gameplay itself to encompass marketing communications and embedded content. Platforms are required to implement appropriate technical and administrative safeguards (including age verification, access controls, and content blocking) to ensure that children are effectively prevented from accessing gambling and commercial gaming services.
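By way of illustration only, the simplified sketch below shows one way a platform’s engineering team might enforce this kind of gating at the application layer. The content categories, the 18-year threshold, and the function names are assumptions made for the example; the law itself leaves the precise technical measures to the forthcoming implementing regulations.

```python
from dataclasses import dataclass

# Hypothetical content categories a platform might gate for child users.
RESTRICTED_FOR_MINORS = {"gambling", "commercial_gaming", "gambling_advertising"}

@dataclass
class User:
    user_id: str
    age_verified: bool   # whether the platform has verified the user's age
    age: int | None      # verified age, if known

def can_access(user: User, content_category: str) -> bool:
    """Deny restricted categories to anyone not verified as an adult.

    Defaulting to "deny" when age is unknown reflects the requirement that
    children be effectively prevented from accessing such content.
    """
    if content_category not in RESTRICTED_FOR_MINORS:
        return True
    if not user.age_verified or user.age is None:
        return False          # unverified users are treated as children
    return user.age >= 18

# Example: an unverified account is blocked from a gambling-related page.
print(can_access(User("u1", age_verified=False, age=None), "gambling"))  # False
print(can_access(User("u2", age_verified=True, age=21), "gambling"))     # True
```

The deny-by-default posture for unverified users is a design choice consistent with the law’s emphasis on effective prevention, rather than a prescribed standard.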
Child data privacy and protection
The Child Safety Law introduces strict protections for children’s personal data. Platforms are prohibited from collecting, processing, publishing, or sharing the personal data of children under the age of 13 unless explicit, documented, and verifiable consent is obtained from the custodian. Platforms must clearly explain how such data is used through transparent privacy policies addressed to both the child and the custodian.
Data access must be strictly limited on a data-minimisation basis, and the use of children’s data for commercial purposes, behavioural profiling, or targeted advertising is expressly prohibited. Limited exemptions may apply to platforms operating for educational or health purposes, but only where Cabinet approval is obtained and appropriate safeguards are implemented.
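The following sketch illustrates, at a very high level, how the consent and data-minimisation rules described above might be expressed as an internal policy check. The field names, consent model, and allow-list are hypothetical and are not terms taken from the Child Safety Law or its implementing regulations.

```python
from dataclasses import dataclass, field

# Illustrative allow-list and prohibited purposes; not drawn from the law itself.
ALLOWED_CHILD_FIELDS = {"display_name", "age_band", "language"}   # data minimisation
PROHIBITED_PURPOSES = {"behavioural_profiling", "targeted_advertising", "commercial_resale"}

@dataclass
class CustodianConsent:
    custodian_id: str
    child_id: str
    verified: bool                      # consent must be explicit, documented, verifiable
    purposes: set[str] = field(default_factory=set)

def may_process(child_age: int, requested_fields: set[str], purpose: str,
                consent: CustodianConsent | None) -> bool:
    """Return True only if processing would be permissible under this illustrative policy."""
    if purpose in PROHIBITED_PURPOSES:
        return False                    # never allowed, regardless of consent
    if child_age < 13:
        if consent is None or not consent.verified or purpose not in consent.purposes:
            return False                # under-13 data needs verifiable custodian consent
    return requested_fields <= ALLOWED_CHILD_FIELDS   # only minimal fields permitted

consent = CustodianConsent("c1", "k1", verified=True, purposes={"service_delivery"})
print(may_process(10, {"display_name", "language"}, "service_delivery", consent))  # True
print(may_process(10, {"display_name"}, "targeted_advertising", consent))          # False
```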
Age verification requirements
Platforms are required to implement age verification mechanisms proportionate to the risks associated with their services. The nature and robustness of these mechanisms will be directly linked to the platform’s classification under the forthcoming risk classification framework and the potential impact of platform content on children. Higher-risk platforms are expected to face more stringent age-assurance requirements.
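As a purely illustrative example, a platform might map an assumed internal risk tier to progressively stronger age-assurance methods along the following lines. The tiers and methods shown are placeholders; the actual classification criteria will be set by Cabinet decision.

```python
from enum import Enum

class RiskTier(Enum):
    LOW = "low"
    MEDIUM = "medium"
    HIGH = "high"

# Hypothetical mapping from an assumed risk classification to age-assurance strength.
AGE_ASSURANCE_BY_TIER = {
    RiskTier.LOW: "self_declaration",           # e.g. date-of-birth entry
    RiskTier.MEDIUM: "account_signals_check",   # e.g. cross-checking existing account data
    RiskTier.HIGH: "document_or_id_verification",
}

def required_age_assurance(tier: RiskTier) -> str:
    return AGE_ASSURANCE_BY_TIER[tier]

print(required_age_assurance(RiskTier.HIGH))  # document_or_id_verification
```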
Content moderation and advertising controls
Digital platforms must deploy blocking and filtering tools, content classification systems, and safeguards to restrict children’s access to harmful or age-inappropriate material. Controls must also extend to advertising practices, including restrictions on targeted advertising to children, particularly where advertising may exploit children’s vulnerabilities or encourage excessive engagement.
Custodian control tools
Platforms are required to provide custodians with effective tools to manage children’s use of digital services. These include mechanisms to set daily time limits, manage and supervise accounts, and monitor usage patterns. Such tools are intended to support custodians in fulfilling their own obligations under the Child Safety Law.
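A minimal sketch of what such custodian-facing controls could look like in practice is set out below, assuming a simple daily time limit per child profile. The structure, field names, and defaults are illustrative only.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class CustodianControls:
    """Illustrative per-child controls a custodian might configure; not a requirement quoted from the law."""
    daily_limit_minutes: int = 120
    usage_by_day: dict[date, int] = field(default_factory=dict)

    def record_session(self, day: date, minutes: int) -> None:
        self.usage_by_day[day] = self.usage_by_day.get(day, 0) + minutes

    def limit_reached(self, day: date) -> bool:
        return self.usage_by_day.get(day, 0) >= self.daily_limit_minutes

    def remaining_minutes(self, day: date) -> int:
        return max(0, self.daily_limit_minutes - self.usage_by_day.get(day, 0))

controls = CustodianControls(daily_limit_minutes=90)
controls.record_session(date.today(), 75)
print(controls.remaining_minutes(date.today()))  # 15
print(controls.limit_reached(date.today()))      # False
```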
Notice-and-takedown mechanisms
User-friendly notice and takedown systems must be implemented to allow users to report child sexual abuse material (CSAM) and other harmful content. Platforms are expected to deploy technical capabilities, including automated detection tools, to proactively identify and address such content, and to act promptly upon receiving reports.
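For illustration, a notice-and-takedown workflow of the kind described above might be modelled roughly as follows. The report categories, statuses, and escalation logic are assumptions made for the example rather than requirements quoted from the law.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from uuid import uuid4

@dataclass
class AbuseReport:
    report_id: str
    content_id: str
    category: str            # e.g. "csam", "harmful_content" (illustrative labels)
    received_at: datetime
    status: str = "received"

def submit_report(content_id: str, category: str) -> AbuseReport:
    """Accept a user report and queue it for review. A real system would also
    preserve evidence and, for CSAM, escalate immediately to the authorities."""
    return AbuseReport(str(uuid4()), content_id, category, datetime.now(timezone.utc))

def triage(report: AbuseReport) -> AbuseReport:
    # CSAM is removed and escalated immediately; other reports go to human review.
    if report.category == "csam":
        report.status = "removed_and_escalated"
    else:
        report.status = "pending_moderation"
    return report

print(triage(submit_report("post-123", "csam")).status)  # removed_and_escalated
```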
Reporting and disclosure obligations
Platforms must disclose their policies on content moderation, user engagement, and child protection, and submit periodic reports to the competent authorities detailing measures taken to comply with the Child Safety Law. Any CSAM or harmful content must be reported immediately, together with relevant information concerning the individuals or platforms involved.
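The shape of such periodic reporting is not yet prescribed; purely as a sketch, a platform’s compliance report could be serialised along the following lines, with the actual format, frequency, and fields to be determined by the regulator.

```python
import json
from datetime import date

# Illustrative structure only; the regulator will define the real reporting template.
def build_periodic_report(period_start: date, period_end: date,
                          reports_received: int, items_removed: int,
                          csam_notifications: int) -> str:
    payload = {
        "period": {"from": period_start.isoformat(), "to": period_end.isoformat()},
        "moderation": {
            "user_reports_received": reports_received,
            "items_removed": items_removed,
        },
        "csam_notifications_to_authorities": csam_notifications,
    }
    return json.dumps(payload, indent=2)

print(build_periodic_report(date(2026, 1, 1), date(2026, 3, 31), 420, 37, 2))
```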
Key obligations for internet service providers
Internet service providers are subject to a complementary set of obligations aimed at safeguarding children at the network level.
Content filtering and moderation
Internet service providers must implement content filtering systems aligned with national policies prohibiting harmful digital content. These measures are intended to prevent children from accessing prohibited material irrespective of the platform used.
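As a simplified illustration of network-level filtering, an internet service provider might combine a blocklist with stricter rules for child profiles, roughly as sketched below. The domains, categories, and matching logic are assumptions for demonstration and do not reflect the TDRA’s actual policies.

```python
# Illustrative blocklist; real filtering policies would be set at a national level.
BLOCKED_DOMAINS = {"example-gambling.test", "example-harmful.test"}

def is_request_allowed(hostname: str, child_profile_active: bool) -> bool:
    """Block listed domains outright; apply stricter filtering when the
    subscriber has enabled a child profile."""
    if hostname in BLOCKED_DOMAINS:
        return False
    if child_profile_active and hostname.endswith(".adult-content.test"):
        return False
    return True

print(is_request_allowed("example-gambling.test", child_profile_active=False))  # False
print(is_request_allowed("news.example.test", child_profile_active=True))       # True
```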
Custodian safeguards
Internet service providers are required to adopt safeguards to ensure the safe use of services by children, including custodian consent mechanisms embedded in terms of service. Internet service providers must also provide tools enabling custodians to monitor and supervise the digital content accessible by children.
Reporting duties
Internet service providers are subject to mandatory reporting obligations and must promptly notify the competent authorities of any CSAM or harmful content detected, including information relating to the persons or platforms involved.
Obligations of custodians
The Child Safety Law places direct responsibilities on custodians of children, defined as parents or legal guardians of individuals under the age of 18. Custodians are required to monitor their children’s digital activities, use available parental control tools, and prevent access to content that is inappropriate for the child’s age.
Custodians must also refrain from exposing or exploiting children in ways that threaten their safety, privacy, or dignity, and are obligated to report any CSAM or harmful content to which their children are exposed. While these obligations are clearly articulated at a policy level, further clarification on enforcement mechanisms and liability thresholds is expected through implementing regulations.
Governance and enforcement framework
Oversight and enforcement of the Child Safety Law fall primarily under the responsibility of the Telecommunications and Digital Government Regulatory Authority (TDRA), which is empowered to monitor compliance, issue regulatory instructions, and coordinate enforcement actions.
The law also establishes a Child Digital Safety Council, chaired by the Minister of Family, tasked with coordinating national child digital safety efforts, proposing legislation, and advising on standards, tools, and indicators. Platforms are expected to cooperate with the TDRA and other competent authorities, participate in policy development initiatives, and support awareness-raising programs.
Enforcement and penalties
Non-compliance with the Child Safety Law may result in administrative sanctions, including partial or full blocking of services, suspension, or closure. While the law provides the enforcement basis, detailed penalties and procedural rules are expected to be set out in a separate Administrative Penalties Regulation, to be issued by the Cabinet.
Practical next steps for UAE businesses
With the Child Safety Law now in force and a one-year grace period running until January 2027, businesses should begin early compliance planning. This includes reviewing existing governance, content moderation, age verification, and data protection frameworks, and identifying gaps in child-safety controls.
Particular attention should be paid to the forthcoming platform risk classification system, as this will determine the scope and proportionality of obligations applicable to different services. Digital platforms and internet service providers (whether established in the UAE or targeting UAE users) should closely monitor the issuance of implementing regulations and Cabinet decisions, and prepare strategically for differentiated, risk-based compliance obligations under the new regime.
About Us
Middle East Briefing is one of five regional publications under the Asia Briefing brand. It is supported by Dezan Shira & Associates, a pan-Asia, multi-disciplinary professional services firm that assists foreign investors throughout Asia, including through offices in Dubai (UAE). Dezan Shira & Associates also maintains offices or has alliance partners assisting foreign investors in China (including the Hong Kong SAR), Indonesia, Singapore, Malaysia, Mongolia, Japan, South Korea, Nepal, The Philippines, Sri Lanka, Thailand, Italy, Germany, Bangladesh, Australia, United States, and United Kingdom and Ireland.
For a complimentary subscription to Middle East Briefing’s content products, please click here. For support with establishing a business in the Middle East or for assistance in analyzing and entering markets elsewhere in Asia, please contact us at dubai@dezshira.com or visit us at www.dezshira.com.