In February 2025, Aspen Digital hosted a roundtable to discuss the impact of technology on systemically marginalized communities and their place in a thriving democracy. A follow-up to our Algorithmic Suppression convening in August 2024, this meeting brought together leaders from civil rights and community-serving organizations, academics, civil society, top tech companies, and groundbreaking AI startups. Though we came from diverse backgrounds, we shared a common belief: that building an inclusive future relies on collaboration between communities and those on the cutting edge of innovation.
Community leaders have much to learn about emerging tech, especially when it comes to navigating the myths versus the reality on the ground. A lack of contextual understanding of how these technologies actually work can keep communities at arm's length from the product development lifecycle, even when those products can have an outsized impact on their lives. Roundtable participants agreed that closing this gap is essential not just for improving products for all users, but for preserving democratic rights and civic engagement.
A few insights from the discussions include:
Five of our participants have shared their reflections on what a meaningful collaboration between the tech industry and excluded communities might look like.
Vernon E. Jordan, Jr. Endowed Chair in Civil Rights, Howard University School of Law
This moment that we're in now as a country is perhaps the most dangerous we've experienced since the aftermath of the Civil War. If there is any silver lining to the risk of complete collapse we're seeing across so many of our institutions, it's that the cracks that so many of us have been trying to compel Americans to see for over 30 years are now unavoidable. No single person can walk into a truly healthy democracy and completely upend it unless seeds were planted long ago and have just been waiting to be watered.
Corporations, including tech companies, cannot be agnostic about democracy. In this country, they are created by democracy and enjoy its protections. Whether they embrace it or not, tech platforms that have purported to create "the public square" have very clear obligations to not just love America but to truly know it. As civil rights leaders, it's a necessity for us to understand tech. But it's just as important that tech leaders fully understand the critical importance of civil rights and their responsibility to uphold those hard-won protections. Remember that the physical public square not too long ago included separate water fountains and bathrooms and the exclusion of people based on race. Women were and still are harassed in the public square. The digital public square is no different, and our responsibility to guard against those same kinds of inequities is more important than ever.
So gatherings like this that bring together tech and communities are massively important, but we need an ongoing effort that continually asserts an equal power balance between those who are building our future and those of us who have to deal with its consequences. And real change can only come when all-powerful individual leaders are not empowered by their boards to dismantle a company's commitments, and the infrastructure built to honor those commitments, based merely on a whim or changing political winds. Board members and those with fiduciary responsibilities to the continued business success of these companies must realize that along with their fiduciary responsibility to shareholders, they have an equal fiduciary responsibility to democracy itself. Until we make that foundational shift in how we engage, we will remain at risk of total collapse.
Co-Founder, Kentucky Student Voice Team
Co-Founder, Responsible Technology Youth Power Fund
Consider the role young people and technology played in the 2024 presidential election: a 20-year-old radicalized in online forums attempted to assassinate the president, a teenage Barron Trump helped his father tap into the “manosphere”—a shift that stunned political veterans but was no surprise to digital natives—and Gen Z voters swung 29 points toward Donald Trump.
I joined Aspen Digital's post-election convening with these moments in mind. I was prepared to discuss where we go from here, but I wasn't prepared to be the only voice in the room explicitly there to represent youth perspectives. At 29, I can occasionally get away with being considered Gen Z, but even so, the lack of intergenerational diversity (yes, I said it) was impossible to ignore.
This dynamic, one I'm all too familiar with, led me and several peers to co-found the Responsible Technology Youth Power Fund (RTYPF), seeking to ensure young people have a say in shaping the digital world that impacts them most. Over the past two years, RTYPF has engaged 15+ funders, raised more than $4 million, and supported dozens of youth-led organizations working at the intersection of youth power and responsible technology. Notably, this year the Fund's steering committee invited five young people with voting rights, ensuring the grantmaking process is as intergenerational as the organizations we seek to fund. Already, RTYPF grantees are driving real change—filing FTC complaints, pushing for laws to protect against online exploitation, and challenging Big Tech's governance decisions.
So, what will it take to improve the relationship between young people, democratic institutions, and technology?
None of this is easy, but the first step is ensuring young people are in the room. Thank you, Aspen Digital, for the platform. And yes, AI did help write this reflection.
Tech has abandoned equity, doubling down on policies that empower unsafe spaces, lack accountability, and help build dangerous movements. Whether through social media platforms, AI, or other products, tech has infiltrated our lives with limited potential for benefit as it plummets into a harmful agenda, one that was largely the result of advocacy in opposition to civil rights and equity, and that is now threatening our democracy and destabilizing the world.
Following this convening, I am clear-eyed that we must take three actions:
These may seem simple, yet we are not all aligned around these actions. We need a cohesive strategy that centers local communities to build our power from the ground up. There's no more time.
Project Lead, Disability Rights in Technology Policy, Center for Democracy & Technology
We are in an immensely challenging moment for those who value inclusion and social justice. But challenges can, and often do, exist alongside moments of significant opportunity. Technology, for example, can be extremely helpful for people with disabilities, and can also be a lever of discrimination against disabled people. The guiding principle here must be to maximize benefits, and minimize harms.
To build trust, it’s vital to prioritize marginalized communities, by actively including marginalized people at every stage of any decision-making process, and by valuing their perspectives. In my work, I speak often about the importance of centering people with disabilities in tech policy — this means not only ensuring that disabled people are present during conversations, but also that their needs and lived and learned experiences are properly taken into account. I am excited by the incredibly diverse group of stakeholders that conveners like the Aspen Institute are able to bring together, as well as advocates’ increased emphasis on looking at technology as something that can impact civil rights and liberties. By continuing to invest in this type of community engagement, we create spaces where we can find ways, together, to reap the benefits of technology while minimizing its risks, especially for marginalized people.
General Manager & Director, Product Equity, Adobe
Technology shapes our world; the law determines its boundaries. Walking away from this convening that brought together grassroots civic leaders, policymakers, and tech professionals—that statement was my greatest takeaway. This room was full of brilliant minds utilizing their skills to drive policy, protect civil rights, and challenge systemic inequities.
However, technology was perceived as a big, amorphous thing that we all recognize has a global impact yet seems to lack boundaries. The convening revealed a stark disconnect: those shaping legislation often do not understand the product development process.
At the same time, those designing technology fail to grasp the real-world consequences of their decisions. Policy is disconnected from practice, and the mistrust between the two stems from a lack of mutual understanding, one that ultimately harms the most marginalized communities.
I also witnessed a passionate attempt to learn and grow. Attendees centered on community experiences and deeply desired to democratize power to push these systems, technologies, and policies to work better for historically and systemically underinvested communities. I saw a collective of people all sharing stories, excitement, and empathy—ideating on better collective futures.
I saw curiosity mixed with conviction. Our biggest challenge is to stop seeing technology as monolithic, a Mt. Everest that only a few can reach. To stop seeing mis/disinformation as a brand-new and unsolvable challenge. We must stop seeing ourselves as "not technologists" because we use these technologies daily. "Algorithm" has become a commonly used term whenever we see changes in our digital experiences.
We are ALL technologists, and if we approach our work and teach our communities this, then we increase our collective power to create necessary expectations and outcomes for the technologies we use. Systemic change happens at the intersection of advocacy, governance, and law. This convening opened new pathways to deepening my impact at the intersection of technology, human rights, and policy.
The views represented herein are those of the author(s) and do not necessarily reflect the views of the Aspen Institute, its programs, staff, volunteers, participants, or its trustees.