Wikimedia’s CTO: In the age of AI, human contributors still matter

Selena Deckelmann has never been afraid of people on the internet. With a TV repairman and CB radio enthusiast for a grandfather and a pipe fitter for a stepdad, Deckelmann grew up solving problems by talking and tinkering. So when she found her way to Linux, one of the earliest open-source operating systems, as a college student in the 1990s, the online community felt comfortingly familiar. And the thrilling new technology inspired Deckelmann to change her major from chemistry to computer science. 

Now almost three decades into a career in open-source technology, Deckelmann is the chief product and technology officer (CPTO) at the Wikimedia Foundation, the nonprofit that hosts and manages Wikipedia. There she not only guides one of the most turned-to sources of information in the world but serves a vast community of “Wikipedians,” the hundreds of thousands of real-life individuals who spend their free time writing, editing, and discussing entries—in more than 300 languages—to make Wikipedia what it is today. 

It is undeniable that technological advances and cultural shifts have transformed our online universe over the years—especially with the recent surge in AI-generated content—but Deckelmann still isn’t afraid of people on the internet. She believes they are its future.  

In the summer of 2022, when she stepped into the newly created role of CPTO, Deckelmann didn’t know that a few months later, the race to build generative AI would accelerate to a breakneck pace. With the release of OpenAI’s ChatGPT and other large language models, and the multibillion-dollar funding cycle that followed, 2023 became the year of the chatbot. And because these models require heaps of cheap (or, preferably, even free) content to function, Wikipedia’s tens of millions of articles have become a rich source of fuel. 

To anyone who’s spent time on the internet, it makes sense that bots and bot builders would look to Wikipedia to strengthen their own knowledge collections. Over its 23 years, Wikipedia has become one of the most trusted sources for information—and a totally free one, thanks to the site’s open-source mission and foundation support. But with the proliferation of AI-generated text and images contributing to a growing misinformation and disinformation problem, Deckelmann must tackle an existential question for Wikipedia’s product and community: How can the site’s open-source ethos survive the coming content flood? 

Deckelmann argues that Wikipedia will become an even more valuable resource as nuanced, human perspectives become harder to find online. But fulfilling that promise requires continued focus on preserving and protecting Wikipedia’s beating heart: the Wikipedians who volunteer their time and care to keep the information up to date through old-fashioned talking and tinkering. Deckelmann and her team are dedicated to an AI strategy that prioritizes building tools for contributors, editors, and moderators to make their work faster and easier, while running off-platform AI experiments with ongoing feedback from the community. “My role is to focus attention on sustainability and people,” says Deckelmann. “How are we really making life better for them as we’re playing around with some cool technology?”

What Deckelmann means by “sustainability” is a pressing concern in the open-source space more broadly. When complex services or entire platforms like Wikipedia depend on the time and labor of volunteers, contributors may not get the support they need to keep going—and keep those projects afloat. Building sustainable pathways for the people who make the internet has been Deckelmann’s personal passion for years. In addition to working as an engineering and product leader at places like Intel and Mozilla and contributing to open-source projects herself, she has founded, run, and advised multiple organizations and conferences that support open-source communities and open doors for contributors from underrepresented groups. “She has always put the community first, even when the community is full of jerks making life unnecessarily hard,” says Valerie Aurora, who cofounded the Ada Initiative—a now-defunct nonprofit supporting women in open-source technology that brought Deckelmann onto its board of directors and advisory board.

Addressing both a community’s needs and an organization’s priorities can be a challenging balancing act—one that is at the core of open-source philosophy. At the Wikimedia Foundation, everything from the product’s long-term direction to details on its very first redesign in decades is open for public feedback from Wikipedia’s enormous and vocal community. 

Today Deckelmann sees a newer sustainability problem in AI development: the predominant method for training models is to pull content from sites like Wikipedia, often generated by open-source creators without compensation or even, sometimes, awareness of how their work will be used. “If people stop being motivated to [contribute content online],” she warns, “either because they think that these models are not giving anything back or because they’re creating a lot of value for a very small number of people—then that’s not sustainable.” At Wikipedia, Deckelmann’s internal AI strategy revolves around supporting contributors with the technology rather than short-circuiting them. The machine-learning and product teams are working on launching new features that, for example, automate summaries of verbose debates on a wiki’s “Talk” pages (where back-and-forth discussions can go back as far as 20 years) or suggest related links when editors are updating pages. “We’re looking at new ways that we can save volunteers lots of time by summarizing text, detecting vandalism, or responding to different kinds of threats,” she says.

But the product and engineering teams are also preparing for a potential future in which Wikipedia may need to meet its readers elsewhere online, given current trends. While Wikipedia’s traffic didn’t shift significantly during ChatGPT’s meteoric rise, the site has seen a general decline in visitors over the last decade as a result of Google’s ongoing search updates and generational changes in online behavior. In July 2023, as part of a project to explore how the Wikimedia Foundation could offer its knowledge base as a service to other platforms, Deckelmann’s team launched an AI experiment: a plug-in for ChatGPT’s platform that allows the chatbot to use and summarize Wikipedia’s most up-to-date information to answer a user’s query. The results of that experiment are still being analyzed, but Deckelmann says it’s far from clear how—or even whether—users will want to interact with Wikipedia off the platform. Meanwhile, in February she convened leaders from open-source technology, research, academia, and industry to discuss ways to collaborate and coordinate on addressing the big, thorny questions raised by AI. It’s the first of multiple meetings that Deckelmann hopes will push forward the conversation around sustainability.

Deckelmann’s product approach is careful and considered—and that’s by design. In contrast to so much of the tech industry’s mad dash to capitalize on the AI hype, her goal is to bring Wikipedia forward to meet the moment, while supporting the complex human ecosystem that makes it special. It’s a particularly humble mission, but one that follows from her career-long dedication to supporting healthy and sustainable communities online. “Wikipedia is an incredible thing, and you might look at it and think, ‘Oh, man, I want to leave my mark on it.’ But I don’t,” she says. “I want to help [Wikipedia] out just enough that it’s able to keep going for a really long time.” She has faith that the people of the internet can take it from there.

Rebecca Ackermann is a writer, designer, and artist based in San Francisco.