Thoughts on Democracy and the Future of A.I.

Where Do We Go From Here?

[Image: a close-up of the Capitol dome, representing democracy and the future of AI]
April 17, 2025

What was so clear to me throughout the arc of our conversations that day, from the formal panel sessions to the casual banter, is that we stand at a critical inflection point where specific decisions about AI governance, platform regulation, and data protection will directly determine who participates in our democracy and how.

We also wrestled with the central question of where we go from here. Tech executive TB Bardlavens eloquently noted that our collective work should transcend “grievance politics.” Rather, it must focus on building products that represent our diverse world, ensuring digital products are “useful, usable, and accessible” for a global community. As Bardlavens has written previously, “Our work is about building brand love and scaling the business while also co-creating with community—with our past, present and future customers—to drive deep insights and broad product innovation.” These words articulate why community engagement is so needed in tech product design and why product equity is foundational to innovation. Democracy and the future of AI are inextricably tied together, each reinforcing the power of the other.

The brilliant lawyer and civil rights legend Sherrilyn Ifill offered a way forward. She reminded those in civil society and tech alike that, historically, the physical public square has been both a place where our democracy’s greatest promise has been strengthened and the site of our democracy’s shortcomings. The same public square that has platformed free expression has also been one where the voices of women and people of color were muted or even denied. Ifill urged us not to be ahistorical and to understand the responsibility of the digital public square to center inclusive civil discourse. Ifill also powerfully articulated that democracy and notions of racial, ethnic, and gendered supremacy cannot co-exist: the work of American democracy is fundamentally about fairness, representation, and equality.

Democracy cannot thrive in the context of supremacy, and that must be reflected in our AI systems. When we feed AI models data shaped by centuries of supremacist thinking, where some voices and experiences are centered while others are marginalized, we shouldn’t be surprised when these systems encode and amplify those same hierarchies. In most cases of algorithmic harm, the AI system didn’t create the hierarchy; it inherited it, and then hardened it into seemingly objective, technological fact. The supremacy is laundered through algorithms until it appears as neutral, technical truth rather than a contestable power arrangement.

This isn’t just about making AI “fair” or “unbiased”; it’s about recognizing that the very possibility of democratic governance of AI technologies depends on our ability to move beyond frameworks of supremacy in how we design, deploy, and regulate these increasingly powerful systems. We cannot cede the architecture of AI technologies to broken and dangerous supremacist ideologies. To do so would compromise a technology that ought to serve everyone and would dangerously tether AI’s innovation to undemocratic narratives and regimes.

This convening—by building connections across sectors, disciplines, and communities—demonstrates that even as some forces appear to reverse progress, we will still insist on and create new pathways toward a more inclusive, democratic digital future.

No doubt, the challenges are significant. But the work continues, and we continue it together. Democracy and the future of AI depend on it.

The views represented herein are those of the author(s) and do not necessarily reflect the views of the Aspen Institute, its programs, staff, volunteers, participants, or its trustees.
