The question of how to foster digital wellbeing among youth has become increasingly urgent in a rapidly evolving digital landscape. Stakeholders across the spectrum, from government to private industry to advocacy groups, have taken different approaches to the challenges surrounding minors’ rights, parental controls, and young people’s overall digital wellbeing.
This year, Aspen Digital set out to better understand the varying approaches to digital wellbeing for young people. What we found is that federal and state governments have adopted protection measures, such as restricting harmful content for minors and creating research task forces. While federal and state efforts focus mostly on protections and restrictions, local governments tend to combine protection with digital literacy tools for young online users and their families.
Major tech firms have implemented solutions such as age verification and content restriction, and some even offer digital literacy tools on their platforms. Youth-led initiatives and advocacy organizations take a different approach, centering the leadership and development of young people through programs, fellowships, dialogue, and research models. They assert that it is crucial not only to have youth at the table in decision-making, but also to recognize that protection measures and restrictions are not enough: young people need the tools to navigate online spaces and to shed light on digital inequities and marginalization.
But first, what exactly is digital wellbeing?
Aspen Digital’s Empowered Communities team defines digital wellbeing as a practice that “…dismantles systemic barriers to full participation in our increasingly digital world.”
That means centering the experiences of those most impacted by and historically left out of ever-evolving technology and information spaces, and helping ensure that people have equal access to essential digital infrastructure and the skills to use it.
In this sense, digital wellbeing is not merely a set of tools or protections codified in legislation but also a practice of grounding systems and decision-making in lived experience, and of understanding how different identities, cultures, and communities experience and interact in online spaces.
The Kids Are Online: Quick Facts
~67% of adolescents are exposed to hate-based content on social media (the US Surgeon General’s 2023 Social Media and Youth Mental Health advisory).
According to Pew Research data, digital integration in teens’ lives, particularly through mobile devices, has increased significantly in the past 10 years, which has implications for digital literacy, education, and digital wellbeing.
~37% of youth between the ages of 12 and 17 have experienced cyberbullying.
50% of LGBTQ+ youth face online harassment, a disproportionately high rate.
Multiple Approaches to Digital Wellbeing
Federal, State, and Local Levels
On the federal level, the Kids Online Safety Act (KOSA) is a major piece of legislation introduced in Congress in 2022. It focuses on protecting minors from harmful online content by creating requirements for social media platforms, including disabling features that encourage addictive platform use. KOSA decisively passed the Senate in July 2024, led by Senator Cantwell (D-WA), who argues that social media companies profit significantly from targeted advertising to young users and from data collection, and that platforms should therefore play a role in advancing youth safety online. KOSA and its ‘sister’ legislation, COPPA 2.0, “will give parents new tools to protect their kids online, hold social media companies accountable for harm, require consent before data can be collected and ban targeted advertising to kids under 17.” The House Energy and Commerce subcommittee passed KOSA in September 2024, but only after serious revisions. Today, it no longer requires tech companies to remove design features linked to harms such as substance abuse, suicidal behaviors, or addictive platform use. Essentially, the stripped-back KOSA will apply only to cases of physical harassment and/or violence.
States are taking varied approaches. In 2023, 35 states introduced youth protection legislation, including measures establishing research task forces, digital literacy initiatives, and age verification requirements for platforms. Notable examples include Colorado’s House Bill 24-1136 (enacted), which calls for more research on social media and brain development, and Georgia’s Senate Bill 351 (enacted), which requires the Department of Education to update its model programs for youth digital education.
At the local level, initiatives and ordinances have focused on the collection, use, and management of sensitive personal information, along with digital literacy resources for parents and youth on cyber harassment and internet safety. Examples include the Digital Wellness Ordinance in Chula Vista, CA, and the Responsible Tech Use for Youth Act in Portland, OR.
Youth-Led Initiatives and Advocacy Groups
Organizations like the Aspen Institute’s NextGen Network, Civics Unplugged, the Born This Way Foundation, and Youthprise are championing youth empowerment in digital spaces. The NextGen Network equips young leaders with the tools to build solutions for global challenges, such as those related to the design, development, and deployment of AI. Civics Unplugged provides young leaders with funding and training through intergenerational alliances and fellowships. The Born This Way Foundation builds campaigns and programming that raise awareness of the issues that youth face online and beyond. Youthprise centers youth in leading community research initiatives through Youth Participatory Action Research (YPAR).
The emergence of youth-led approaches sheds light on the importance of not only including young people in decision-making but also nurturing youth leadership and development through programming and intergenerational alliances.
Youth advocacy groups point to shortcomings in current online safety legislation, warning that KOSA’s content censorship could lead to the erasure of LGBTQ+ youth. Furthermore, the framing of digital wellbeing among youth seems mostly focused on implementing parental controls. While parents play a crucial role, they may not always be the ideal advocates for every young person’s online safety. Teachers, mentors, and peers often also foster safe spaces where youth feel comfortable discussing their digital wellbeing and developing effective solutions. These groups call for a nuanced approach to online safety that acknowledges that young people need more than just safeguards – they also need the tools to recognize and respond to harmful content on their own.
Tech Industry Responses
Major platforms like Meta and Microsoft have been rolling out enhanced safety tools and digital literacy initiatives to promote digital wellbeing among youth. Meta has adopted measures across its apps, including Facebook, that give youth and parents built-in tools to track time spent on the apps as well as age-appropriate content controls. Microsoft offers a family safety toolkit that educates users on digital safety, including issues like sextortion and grooming, how to avoid harmful content, and how to use AI responsibly.
An interesting tension has arisen between Meta and the app stores: Meta claims that the Google Play Store and Apple App Store, rather than individual social media platforms, should be responsible for implementing age controls and parental consent requirements. Meta argues this approach would create a more consistent experience across platforms, since current age verification methods vary by state. The idea is that federal legislation would require parental consent for users under 16 when downloading apps; app stores would handle the verification process, and platforms like Meta would then focus on providing age-appropriate features and settings.
More tensions arise as tech firms oppose age-appropriate design codes. For example, the California Age-Appropriate Design Code (CAADC), passed in 2022, requires companies to consider youth digital wellbeing in their product design. The CAADC was met with a lawsuit filed by major tech firms, including Meta, Amazon, and Google, claiming that it violated the First Amendment. The court upheld key provisions, which “reaffirms that the core principles of safety-by-design and privacy-by-default are constitutional” (5Rights Foundation).
A Path Forward: Intergenerational Alliances
Digital wellbeing among youth is complex, and current approaches from a variety of stakeholders demonstrate the urgency of protecting young people from harmful online content and experiences. However, government and private sector approaches fall short of adequately representing the experiences, needs, and leadership of young people.
Charting a path forward requires moving the focus beyond implementing protections and restrictions to creating opportunities for youth to be involved in decision-making. The future of youth digital wellbeing lies in creating ecosystems that balance protection with empowerment and that bring together policy, technology, and community-based expertise to support holistic development and wellbeing.