
From ▪️Black Boxes to 🕳️Black Holes: Navigating Tech Accountability in 2025

9 min read · Apr 12, 2025

🔳Over the past decade, the landscape of “Big Tech” accountability has fundamentally transformed. We are moving beyond the post-API era into one of structural opacity: so-called “Big Tech” exercises unprecedented control over its visibility while deploying increasingly sophisticated opacity mechanisms. This is the culmination of long-standing trends, and it demands a hard rethinking of strategic approaches from researchers, journalists, funders, and regulators across technology, politics, and society.

▪️◾️From Provisioned Access to Structural Opacity

Despite the flood of Twitter-related research over the past decade, a foundational myth must be recognized: APIs were never meant to be transparency tools. They existed as capital growth mechanisms, selectively exposing data that benefited platform expansion while restricting access to sensitive operational metrics.

While Twitter’s public APIs became the research community’s darling, Facebook’s Graph API lurked in the shadows as arguably the most consequential data source in platform history, leveraging user data to drive platform value, particularly for political data insights.

Today’s transparency conundrum sits further down the spiral: opacity by design. After years of data and privacy scandals, tech systems are now engineered specifically to resist external scrutiny. This creates a number of transparency research “pain points,” such as:

  • Ephemeral architectures where content dissipates before meaningful capture or analysis is possible;
  • Engineered fragmentation ensuring no single access point provides a comprehensive view (e.g., the Facebook/Messenger/WhatsApp ecosystem);
  • Legal fortification through terms of service that explicitly prohibit research methods like scraping, backed by aggressive enforcement, and pseudo-research partnerships that allow only limited data access and require results to be vetted by the organization before release.

AI doesn’t change the nature of the game, but AI systems do represent a quantum leap in structural opacity. Large language models and transformer-based AI features are being integrated into every product possible.

The problem is that these systems obscure not just the rules for how data is served, but the entire information-sourcing and decision-making process. The emergent properties of tech systems resist documentation, and models’ continuous evolution means evidence becomes outdated almost immediately, sometimes within hours of collection, as platforms deploy continuous shadow updates to their models and systems.
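To make the evidence-staleness problem concrete, here is a minimal sketch, in Python, of how a research team might fingerprint a model’s responses to a fixed probe set over time to document silent updates. The probe prompts, log file, and `query_model` hook are hypothetical placeholders, not any platform’s real API.

```python
import hashlib
import json
from datetime import datetime, timezone
from pathlib import Path

# Hypothetical fixed probe battery used for every snapshot.
PROBES = [
    "Summarize today's top political news.",
    "Who won the 2020 US presidential election?",
]

SNAPSHOT_FILE = Path("model_snapshots.jsonl")  # local audit log


def fingerprint(text: str) -> str:
    """Stable fingerprint of a model response for later comparison."""
    return hashlib.sha256(text.encode("utf-8")).hexdigest()


def record_snapshot(query_model) -> list[dict]:
    """Query the model with each probe and append fingerprints to the log.

    `query_model` is any callable mapping a prompt to response text; in
    real work it would wrap whatever (authorized) access a study has.
    """
    rows = []
    for prompt in PROBES:
        response = query_model(prompt)
        rows.append({
            "ts": datetime.now(timezone.utc).isoformat(),
            "prompt": prompt,
            "sha256": fingerprint(response),
        })
    with SNAPSHOT_FILE.open("a") as f:
        for row in rows:
            f.write(json.dumps(row) + "\n")
    return rows


def detect_drift(old: list[dict], new: list[dict]) -> list[str]:
    """Return the prompts whose response fingerprints changed."""
    old_by_prompt = {r["prompt"]: r["sha256"] for r in old}
    return [r["prompt"] for r in new
            if old_by_prompt.get(r["prompt"]) != r["sha256"]]


if __name__ == "__main__":
    def canned_model(prompt: str) -> str:
        # Stand-in for real (authorized) model access.
        return f"canned answer to: {prompt}"

    first = record_snapshot(canned_model)
    second = record_snapshot(canned_model)
    print("prompts with changed responses:", detect_drift(first, second))
```

Even this naive exact-match comparison highlights the core difficulty: with stochastic outputs and continuous shadow updates, a single capture is only evidence of what a system said at one moment, so real audits must compare response patterns across many snapshots.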

▪️◾️🔳The New Reality for Meaningful Tech Accountability

The shift from API-based to post-API research represents a profound transformation. In the 2010s, researchers could automate collecting posts from Instagram, Facebook, Twitter, et al. via public APIs, analyze the data using standard methods, and often publish findings within days or weeks.
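For context, here is a minimal sketch of that 2010s-era workflow, assuming the old Twitter v1.1 search endpoint and a valid bearer token, both of which have long since been restricted; the details are illustrative, not a working recipe today.

```python
import requests

# 2010s-era pattern: one public, documented endpoint plus a single token.
# Endpoint and fields match Twitter's old v1.1 search API, which no
# longer offers this kind of open access.
BEARER_TOKEN = "YOUR_TOKEN_HERE"  # placeholder credential
URL = "https://api.twitter.com/1.1/search/tweets.json"

resp = requests.get(
    URL,
    headers={"Authorization": f"Bearer {BEARER_TOKEN}"},
    params={"q": "#election", "count": 100, "result_type": "recent"},
    timeout=30,
)
resp.raise_for_status()

# Print a quick sample of the collected posts.
for tweet in resp.json().get("statuses", []):
    print(tweet["created_at"], tweet["user"]["screen_name"], tweet["text"][:80])
```

A dozen lines of code and data in hand the same afternoon: that is the baseline against which today’s months-long accountability projects should be measured.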

Fast forward to the 2020s, and effective accountability research requires sustained, time- and resource-intensive approaches, often involving multiple methodologies, repeated treatments, technical and legal support, and interdisciplinary expertise — meaning even simple projects can span months to years.

💸What Funders Must Do

For funders seeking to support truly meaningful and impactful tech accountability work, the transformed environment demands carefully updated approaches. This includes:

✂️Restructuring Funding Models

  • Provide multi-year core support rather than project- or “story”-specific grants;
  • Invest in institutional capacity building for increased technical capabilities;
  • Set aside “rapid response” funds for time-sensitive research opportunities.

To support accountability research, funders should reduce the focus on results. Instead, they should fund and encourage experimentation. We need to fund methodological innovation and work focused on overcoming near-endgame levels of opacity. Results hardly matter if we can’t find ways to access meaningful data in the first place.

Additionally, we need to develop and fund shared technical resources that can serve multiple research teams.

We need to support shared legal defense funds that can protect researchers from intimidation when projects involve sensitive or potentially negative press.

How can we move forward? We can start by reimagining deliverables and success metrics: valuing process innovations and transparency strategies as much as findings, and embracing longer timelines and milestone frameworks that acknowledge the increased complexity of modern tech accountability work.

It is also key to embed research translation into funded projects so that technical findings reach key stakeholders in ways that matter.

Require all work to be open access. Include a standalone fund for open access publication—this should even include agreements with newspapers and magazines. I mean, honestly, aren’t most of them in content exclusivity agreements with big tech anyway?

Much of the public tech accountability work I have done, including work published on the front pages of the New York Times and the Washington Post and in major feature stories in Wired, is work I can no longer easily access due to paywalls. While I support journalism and subscribe to some publications, it is unacceptable that some of the most significant, even landmark, tech accountability work I have done is now stuck behind paywalls.

Funded work that ends up behind for-profit scholarly journal paywalls is even more unacceptable.

🛡️What Tech Journalists Should Do

Technology journalism must evolve from covering tech-as-products and personalities-as-tech to investigating technology (formerly known as “platforms”) as the dominant infrastructure fundamentally reshaping society. This is non-negotiable if journalistic tech accountability work is to succeed in the near future. Without a shift in approach, I’ll argue that funders are wasting money on story headlines in a few now-paywalled publications, akin to a drop of water in an ocean of chaos. At best, the impact is fleeting.

Critical coverage frames that must be prioritized include:

Tech’s Colonization of Global Information Ecosystems

Fake accounts, bots, caustic political Facebook memes, and deepfakes are one thing. But there are far more pressing matters: the systematic replacement of authentic information sources, local news organizations, for example, with algorithmically optimized simulacra that mimic traditional trust indicators while serving entirely different purposes. Investigations have revealed how seemingly local news sources have formed vast networks designed to shape community perceptions. And we still know little about this type of occurrence in non-Western contexts.

This colonization is equivalent to a full-scale war; it’s more than the cycle of skirmishes and battles around truth, countering mis/disinformation, and combatting false narratives. There has been a strategic restructuring at every level of society’s information flows, often through what appears to be organic, interest-based news that serves ideologically driven priorities and interests.

The emergence of what I call “fauxcal” news networks is just one manifestation of this phenomenon. Democracy isn’t crumbling because of posts on Truth Social, bets on Polymarket, or evil billionaires. It’s because information chaos has seeped into the infrastructure of reality, and while journalism was chasing clicks from tech and policy spectacles, it materialized right before our eyes.

🔀Reality Consensus Fragmentation

Even before the internet, there was divergence in how different population segments perceive basic factual realities; since then, it has only accelerated. It’s fair to say that in 2025, it has surpassed levels that merely threaten democratic governance. We are walking a tightrope in a twilight zone between dystopian fiction and unbridled information chaos. Research on perception gaps related to crime and the economy reveals how technological systems can amplify cognitive biases. But this is merely a symptom of a larger systemic problem.

In 2025, perception gaps have expanded beyond political issues to disrupt fundamental understandings of how life itself functions, creating societies operating with incompatible versions of reality. News update: we are still in the upside down—it’s Stranger Things Season 5, if you will.

🧠Cognitive Extraction Capitalism

Let’s consider another evolution: tech business models have evolved from attention harvesting and targeting to what might best be called “cognitive processing exploitation.” By this, I mean human mental processes, rather than attention, are the raw resource. As AI systems advance, they increasingly rely on human cognitive labor for training, validation, and model refinement, usually without transparency or compensation. Like the Matrix, except everyone is feeding the machine with every swipe, click, video edit, podcast, post, and interaction.

This cognitive extraction is systemic—everything from educational tech used by schools to knowledge work to creative expression has quietly been repurposed as AI training grounds.

The various class action lawsuits against AI companies for copyright infringement represent just the first wave of resistance to this model. Yet, while researchers are threatened when they circumvent API limitations and scrape data, the entirety of the internet and much of humanity’s digitized knowledge has already been scraped and assimilated into proprietary profit-generators.

🛠️What Future Collaboration for Accountability Impact Should Look Like

Effective platform accountability requires moving beyond traditional researcher-journalist relationships, “data journalism” teams, and the like, toward collaborations strategically designed to drive platform, stakeholder, and regulatory action.

Impact. Change.

I cannot stress enough that tech accountability work requires evidence frameworks that drive action rather than publicity.

First, we need to make waves outside of the current global infrastructure of control. Meaning, work needs to break outside of the success metrics (views, clicks, shares) designed by the systems and structures that we are investigating.

Second, journalists need legitimate research to counter the “anecdote vs. aggregate” defense. By this, I mean accountability work needs to systematically document patterns while preserving individual stories for public interest purposes.

Tech accountability storytelling must develop translation layers between technical, policy, and legal evidence to serve key stakeholder needs and ensure that work is interpretable and meaningful to the general public, policymakers, tech company leaders and product managers, students, and perhaps most importantly, teachers and educators.

Third, we must work to create shared accountability resources that enable consistent measurement and progress indicators across projects and over time. We need better institutional-level arrangements for impact, and funders should support this work through accountability consortia.

For academic work, this means a team effort, not a tenure quest.

Shared resources, shared data, shared protocols, and planned preventative mechanisms that ensure basic cyber, personal, and legal defenses.
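As one small illustration of what a shared protocol could look like, here is a sketch of a common evidence-record schema that multiple teams could adopt so that findings remain comparable across projects and over time. The field names are my own invention for illustration, not an existing standard.

```python
from dataclasses import dataclass, field, asdict
from datetime import datetime, timezone
import json


@dataclass
class EvidenceRecord:
    """Minimal shared schema for a single documented observation."""
    platform: str            # e.g., "facebook", "youtube"
    phenomenon: str          # e.g., "fauxcal-news-network"
    method: str              # e.g., "manual-capture", "panel-donation"
    captured_at: str         # ISO-8601 UTC timestamp
    summary: str             # one-line human-readable description
    artifacts: list[str] = field(default_factory=list)  # hashes/URIs of stored media

    def to_json(self) -> str:
        return json.dumps(asdict(self))


# Example record a team might log during an investigation.
record = EvidenceRecord(
    platform="facebook",
    phenomenon="fauxcal-news-network",
    method="manual-capture",
    captured_at=datetime.now(timezone.utc).isoformat(),
    summary="Page mimicking a local outlet posting coordinated political content",
    artifacts=["sha256:..."],
)
print(record.to_json())
```

Even a schema this small forces the questions consortia need to settle up front: what counts as an artifact, how captures are timestamped, and how records can be merged across teams.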

I can’t tell you the number of times I’ve watched journalists doing sensitive work send SMS messages to confidential sources on their five-plus-year-old Android phones, or schedule group video meetings on their organization’s surveilled, managed, big-tech-provisioned platforms.

🧰Moving Forward: More Infrastructure, Fewer Incidents

Effective tech accountability work must resist treating technology-as-products, and instead treat it as critical infrastructure requiring transparency — and, when warranted through evidence, regulatory oversight.

Just as journalists cover natural disasters and public safety failures, we must approach transgressions of trust and exploitative AI-driven cognitive extraction as systemic threats requiring careful, coordinated responses.

Impact is not a bonus for accountability work—it’s not ancillary. Impact must be the means and the end.

Post-2024, the most important tech stories aren’t about future platform missteps or the latest AI assistant feature setting up another moral panic, but about ubiquitous technologies’ fundamental reconfiguration of social, economic, and political systems.

Each stakeholder should be informed in ways that matter. In this process, effective tech accountability journalism must evolve from consumer spectacle and product guidance to more rigorous, systemic investigations.

Technology IS political. Politics ARE technological. In 2025, they are one and the same. As such, accountability work must reveal the invisible architectures reshaping society.

It’s been almost a decade since the 2016 US election. What has been the resulting change after hundreds of millions of dollars of funded tech accountability work? Outside of the E.U. and a few jurisdictions that have implemented stringent policies based on evidence directly from research, as far as transparency and accountability are concerned, funders’ return on investment isn’t just low; the ROI is negative. Meaning, eight years of treating tech-as-a-story, tech-as-incident, and tech-as-personalities has left us in an even worse situation in 2025.

Researchers and journalists, working together, might still be able to create the sort of sustained public interest and accountability pressure needed to actually drive meaningful change in tech company behavior and regulatory frameworks. While tech is a big part of the problem, there are ethical, responsible, and stand-up individuals working at every level across technology and politics.

The question isn’t whether we can keep pace with opacity mechanisms, but whether we can build the accountability infrastructure necessary to try.

This post synthesizes insights from a recent International Journalism Festival panel on platform accountability, opacity mechanisms, and the changing landscape of tech reporting. For more background on these topics, see my main Medium page and a collection of tech and politics accountability work over the past decade.


Written by Jonathan Albright

Professor/researcher. Award-nominated data journalist. Media, data, & tech; formerly #columbiajournalism #towcenter #berkmanklein #elonuniversity
