{"id":8232,"date":"2026-02-26T10:19:16","date_gmt":"2026-02-26T09:19:16","guid":{"rendered":"https:\/\/www.okaju.lu\/?post_type=document&#038;p=8232"},"modified":"2026-02-26T10:23:49","modified_gmt":"2026-02-26T09:23:49","slug":"eurochilds-position-on-age-restrictions-on-social-media","status":"publish","type":"document","link":"https:\/\/www.okaju.lu\/de\/document\/eurochilds-position-on-age-restrictions-on-social-media\/","title":{"rendered":"Eurochild&#8217;s position on age restrictions on social media"},"content":{"rendered":"\n<p><strong><em>A call to rethink the business model of social media to address risks for children<\/em><\/strong>.<\/p>\n\n\n\n<p>The debate on age restrictions on social media should be used to push for&nbsp;<strong>substantive reform of how social media operates, and discuss how&nbsp;<\/strong>companies can prioritise children\u2019s rights over profit and&nbsp;<strong>uphold them as a condition for operating in European countries<\/strong>.&nbsp;<strong>The choice is not simply between a \u201cban\u201d and \u201cno ban\u201d. That framing obscures the real issue.<\/strong><\/p>\n\n\n\n<p><strong>The real choice is whether we accept a digital environment designed around profit and attention capture, or insist on platforms that are accountable, transparent, and safe-by-design for children. This paper argues that age restrictions alone won\u2019t keep children safe, so the EU should prioritise children\u2019s rights-based, safe-by-default regulation that tackles platforms\u2019 risk-driving business models and design choices, making platforms safer for children and therefore safer for everyone.<\/strong><\/p>\n\n\n\n<p>Age restrictions alone won\u2019t keep children safe unless platforms are held accountable for&nbsp;<strong>risk-driving business models and design choices<\/strong>&nbsp;(attention extraction, profiling, addictive features) through&nbsp;<strong>child-centred and safe-by-design legislation&nbsp;<\/strong>and enforcement. We do&nbsp;<strong>not<\/strong>&nbsp;call for a blanket ban, but for a&nbsp;<strong>rights-based framework<\/strong>&nbsp;where any age-gating is&nbsp;<strong>necessary, proportionate and privacy-preserving<\/strong>, paired with stronger&nbsp;<strong>independent risk assessments<\/strong>, researcher access to data, and states investing in offline support rather than outsourcing children\u2019s rights to platforms.<\/p>\n\n\n\n<p><strong>Key messages and recommendations:<\/strong><\/p>\n\n\n\n<ol class=\"wp-block-list\">\n<li><strong>Age restrictions can never replace regulation or company responsibility<br><\/strong>Even if age limits exist, they will not address harms on their own. Existing rules (like the\u00a0<strong>DSA, GDPR and AI Act<\/strong>) must be enforced strongly, and new rules (including on online child sexual abuse) are vital because harms happen beyond \u201csocial media\u201d, and they are solely driven by whether a child can access given platforms.<br><\/li>\n\n\n\n<li><strong>Children\u2019s rights apply to everyone under 18, with safeguards that evolve by age<\/strong><br>Your rights don\u2019t switch off because of age gates. 
Safeguards should increase for younger children, while older teens should be empowered in line with their evolving capacities.

3. Platform power requires structural accountability
Children may experience social media differently, but it is neither fair nor realistic to expect every child and caregiver to "self-manage" services intentionally designed to be hard to disengage from. Protective design must be required by law and set as the default.

4. Data extraction for behavioural advertising and engagement optimisation must end
The current business model treats children's identities, emotions and behaviours as monetisable assets. Companies must not use minors' data for commercial practices, and it is equally important to raise awareness of sharenting and "childfluencers", protecting children's privacy and dignity and shielding them from exploitation.

5. The business model must change: reduce harm at the source
Eurochild calls for action against features that fuel compulsive use and risky exposure (e.g. infinite scroll, autoplay, manipulative nudges) and against illegal content. Social media must be fundamentally transformed and independently monitored.

6. Social media must be safe by default
Some people will always get around age checks. That is exactly why platforms must not keep a "wild west" experience for anyone who is not logged in. High privacy and safety should be the default for everyone.

7. Regulation must be strengthened to make risk assessments independent and much more robust
Independent, detailed and compulsory standards are needed to review risks robustly and to set proportionate minimum-age and age-assurance requirements using privacy-preserving technology.

8. Independent access to platform data is essential, and more research is needed
After years of controversies and scandals, trust cannot be rebuilt without independent scientific scrutiny and interdisciplinary research that combines survey data with objective platform data and, where appropriate, neuroscientific evidence. Researchers must be given meaningful access to platform data.

9. Age assurance and verification should protect privacy and avoid discrimination
If age checks are used, they must be reliable, non-intrusive and non-discriminatory. Eurochild points to the EU Digital Identity Wallet as a potentially more privacy-preserving option, but only if it is properly tested and safe for marginalised children.

10. Governments cannot outsource their responsibilities to platforms
If social media is filling gaps in safe spaces, youth services and mental health support, that is a warning sign.
States must invest in real offline and online support; platforms cannot replace public responsibility.