Weaponizing the Digital Influence Machine: The Political Perils of Online Ad Tech identifies the technologies, conditions, and tactics that enable today’s digital advertising infrastructure to be weaponized by political and anti-democratic actors.
“We consider it weaponization whenever an advertising system is used to prioritize vulnerability over relevance.”
Authors Anthony Nadler, Matthew Crain, and Joan Donovan (Media Manipulation Research Lead, Data & Society) define the “Digital Influence Machine,” or DIM, as the “infrastructure of data collection and targeting capacities” developed by ad platforms, web publishers, and other intermediaries.
The DIM encompasses consumer monitoring, audience targeting, and automation technologies that extend its reach and, ultimately, its power to influence.
The authors identify three key shifts in the U.S. media landscape that have created the conditions for weaponizing the DIM: the decline of professional journalism, the expansion of financial resources devoted to political influence, and the growing sophistication of targeted advertising conducted with little oversight.
The report argues that political and anti-democratic actors weaponize the DIM by targeting selected audiences at “weak points,” when they are most vulnerable to manipulation, using three main strategies:
- Mobilizing supporters through identity threats;
- Dividing an opponent’s coalition; and
- Leveraging influence techniques informed by behavioral science.
Rather than seeking to change long-standing beliefs, actors use these techniques to amplify existing resentments and anxieties, sow distrust, influence political decisions, and deepen divisions.
To combat this weaponization, Nadler, Crain, and Donovan recommend interventions in the technical structures, institutional policies, and legal regulation of the DIM. They suggest that:
- Ad tech companies should refuse to work with dark money groups;
- Platforms should require explicit, non-coercive user consent for viewing political ads that are part of split-testing; and
- Further ethical guidelines for political advertising should be developed with input from independent committees representing diverse community stakeholders.
This report comes from the Media Manipulation Initiative (MMI) at Data & Society, which provides news organizations, civil society, platforms, and policymakers with insights into new forms of media manipulation, working to ensure a close and informed relationship between technical research and socio-political outcomes.