May 5th, 2026
At a recent Open Forum for AI talk at Purdue Libraries and School of Information Studies, Dr. Sayeed Choudhury framed a central tension shaping the future of AI: it’s not just about building more powerful systems—it’s about who gets to shape them, and how open that process really is.

He began by grounding AI in a familiar idea—infrastructure. Just like the internet, AI isn’t a single tool but an ecosystem built through a balance of industry, academia, and government. Right now, he argued, that balance is off. The private sector is moving fast and investing heavily, but without enough counterweight from public and academic institutions, the long-term direction of AI risks becoming too narrow.
That’s where the Open Forum for AI (OFAI) comes in. Rather than a formal organization, it’s a loose, global network of collaborators working across research, policy, technical prototypes, and community engagement. The goal is breadth over depth—connecting insights across domains to build a more holistic view of AI systems and their impacts.
A major theme was what “open” actually means in AI. In software, open source has a clear definition: the freedom to use, study, modify, and share. AI complicates that. Models depend not just on code, but on training data and model weights—components that are often difficult or impossible to fully share due to privacy, scale, or legal constraints. Choudhury emphasized that openness in AI isn’t all-or-nothing; it exists on a spectrum, and transparency about data and methods matters as much as access itself.
He also pushed back on the idea that bigger is always better. While massive foundation models dominate headlines, there’s growing evidence that smaller, more specialized models—often open—can be just as effective, especially when combined in agent-based systems. This shift mirrors an earlier trend in computing: the move from centralized mainframes to distributed, flexible architectures.
Ultimately, the talk returned to a simple but unresolved idea: openness isn’t just a technical choice—it’s a design principle with real consequences. Making AI more transparent, participatory, and accountable may slow things down in the short term. But without that balance, the systems being built today could define access, power, and knowledge in ways that are hard to undo.
Filed under: general, News and Announcements