2026-03-08
Cross-Spectrum Calibration: Detecting Influence Regardless of Ideology
Bouncer's influence analyzer evaluates YouTube videos for covert persuasion — techniques that operate beneath the viewer's conscious awareness. We found a measurable gap in detection sensitivity across the political spectrum and fixed it. Here's how.
The Problem
Across 728 analyses spanning 38 channels, we measured average transparency and intensity scores by political category:
| Category | Channels | Avg Transparency | Avg Intensity | Std Dev (Transp.) |
|---|---|---|---|---|
| Conservative | 13 | 0.66 | 0.63 | 0.23 |
| Independent | 7 | 0.78 | 0.57 | — |
| Progressive | 10 | 0.86 | 0.47 | 0.09 |
| Mainstream | 8 | 0.85 | 0.33 | — |
The 0.20-point transparency gap between conservative and progressive channels is notable, but the more telling signal is the standard deviation. Conservative transparency scores were spread across the full range (0.23 stddev), while progressive scores clustered tightly around 0.86 (0.09 stddev). That tight clustering suggests a detection blind spot — the model wasn't differentiating between progressive channels at all.
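The blind-spot diagnosis above can be sketched as a small script. The per-channel scores below are invented for illustration (only the conservative and progressive aggregates are chosen to land near the reported 0.66/0.23 and 0.86/0.09), and the 0.10 flag threshold is an assumption, not Bouncer's actual rule:

```python
import statistics

# Hypothetical per-channel average transparency scores (invented data,
# not Bouncer's corpus; aggregates roughly match the table above).
scores = {
    "conservative": [0.31, 0.52, 0.88, 0.95, 0.64],
    "progressive": [0.80, 0.84, 0.88, 0.90, 0.88],
}

summary = {}
for category, vals in scores.items():
    mean = statistics.mean(vals)
    sd = statistics.pstdev(vals)  # population stddev over channel averages
    summary[category] = (mean, sd)
    # A tight cluster (low stddev) can mean the model isn't
    # differentiating between channels in this category at all —
    # a possible detection blind spot rather than genuine uniformity.
    flag = "  <- possible blind spot" if sd < 0.10 else ""
    print(f"{category}: mean={mean:.2f} stddev={sd:.2f}{flag}")
```

Spread, not just the mean, is the diagnostic: a model that genuinely discriminates should produce variance within every category.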
The Diagnosis
The original prompt's examples and detection patterns were all modeled on conservative-coded manipulation — manufactured anxiety leading to a product pitch, fear escalation driving a call to action, explicit us-vs-them framing. These patterns are real, well-documented, and correctly detected.
But progressive media uses different techniques that were passing undetected:
Moral Authority as Neutrality
Positioning the channel's perspective as objective truth rather than a viewpoint. "We're just telling it like it is" makes the framing invisible. Test: does the content acknowledge its own perspective as a perspective?
Institutional Appeal as Cover
Citing studies, experts, or fact-checkers selectively to manufacture the appearance of neutral authority. The manipulation is in which evidence is selected, not that evidence is cited. Test: are dissenting experts acknowledged or silently excluded?
Empathy as Compliance
"If you care about X group, you must support Y policy" — where the emotional bond is manufactured to make policy disagreement feel like cruelty. Test: is the viewer given space to care about the group AND disagree on policy?
Consensus Manufacturing
"Everyone reasonable agrees" used to make dissent feel illegitimate rather than engaging counter-arguments. Test: is the consensus claim used to inform or to foreclose debate?
Righteous Outrage
Same emotional intensity as conservative fear/anger, but coded as moral clarity. It passes the "does the emotion match the content?" test because the outrage feels justified — but the function is the same: bypass critical thinking via emotional arousal. Test: is the outrage proportional, or engineered to make not sharing it feel like a moral failing?
We also identified anti-establishment techniques used across left and right populism: revelation framing ("what they don't want you to know" creates artificial insider status) and systemic enemy (naming a vague villain that can't defend itself, making claims unfalsifiable).
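For concreteness, the technique catalog above could be encoded as structured data before being rendered into the analyzer prompt. The technique names and diagnostic tests come from this post; the `CALIBRATION` dict and `render_section` helper are hypothetical illustrations, not Bouncer's actual prompt format:

```python
# Hypothetical structure for the cross-spectrum calibration vocabulary.
# Each technique maps to its diagnostic test from the post.
CALIBRATION = {
    "conservative-coded": {
        "manufactured anxiety":
            "Does escalating fear resolve into a product pitch or call to action?",
        "us-vs-them framing":
            "Is an out-group defined explicitly to drive in-group loyalty?",
    },
    "progressive-coded": {
        "moral authority as neutrality":
            "Does the content acknowledge its own perspective as a perspective?",
        "institutional appeal as cover":
            "Are dissenting experts acknowledged or silently excluded?",
        "empathy as compliance":
            "Is the viewer given space to care about the group AND disagree on policy?",
        "consensus manufacturing":
            "Is the consensus claim used to inform or to foreclose debate?",
        "righteous outrage":
            "Is the outrage proportional, or engineered to make not sharing it feel like a moral failing?",
    },
    "anti-establishment": {
        "revelation framing":
            "Does 'what they don't want you to know' create artificial insider status?",
        "systemic enemy":
            "Is the named villain specific enough that the claim is falsifiable?",
    },
}

def render_section(label, techniques):
    """Render one calibration category as a prompt fragment."""
    lines = [f"## {label} techniques"]
    for name, test in techniques.items():
        lines.append(f"- {name.title()}: {test}")
    return "\n".join(lines)

prompt_block = "\n\n".join(
    render_section(label, techniques) for label, techniques in CALIBRATION.items()
)
print(prompt_block)
```

Keeping the vocabulary as data rather than prose makes it easy to verify that each category carries comparable detection coverage.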
The Fix
We added explicit cross-spectrum calibration to the influence analyzer prompt (version 2026-03-08a). The calibration gives the model equal detection vocabulary for conservative-coded, progressive-coded, and anti-establishment techniques, each with concrete examples and diagnostic tests.
The key principle added:
A transparent advocate openly argues "here's my position and why." A covert manipulator of any ideology makes their position feel like the only reasonable conclusion while hiding that framing work is happening.
Ongoing Work
This is a single calibration pass on 15 videos. We plan to run broader re-analysis across the full corpus and publish updated aggregate statistics. The prompt, model, and methodology are open to scrutiny — we believe transparency about our own detection system is as important as transparency in the content we analyze.
Prompt version: 2026-03-08a · Model: google/gemini-3-flash-preview · Sample: 15 videos across 12 channels