Similar to how we analyzed Twitter in our self-generated CSAM report, we did a brief analysis of the public timelines of prominent servers, processing media with PhotoDNA and SafeSearch. The results were jaw-dropping: our first PhotoDNA alerts started rolling in within minutes. And the true scale of the problem is likely much larger than those hash matches alone indicate, which we infer by cross-referencing CSAM-related hashtags with SafeSearch level 5 nudity matches.
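As a rough illustration of the kind of pipeline involved, the sketch below pulls a server's public timeline via the standard Mastodon API and checks attached images against SafeSearch's highest ("level 5" / VERY_LIKELY) adult-content rating, cross-referenced against a hashtag list. This is a minimal sketch under several assumptions: `mastodon.example` and `SUSPECT_HASHTAGS` are hypothetical placeholders (not the servers or hashtag list from our analysis), the Google Cloud Vision API stands in for the SafeSearch classifier, and PhotoDNA matching is omitted entirely, since that service requires authorized access and is not publicly callable.

```python
# Minimal sketch: cross-reference suspect hashtags with SafeSearch level-5 matches.
# Assumptions: hypothetical instance URL and hashtag list; Google Cloud Vision
# as the SafeSearch backend; PhotoDNA hashing omitted (requires authorized access).
import requests
from google.cloud import vision

INSTANCE = "https://mastodon.example"    # hypothetical server
SUSPECT_HASHTAGS = {"tag1", "tag2"}      # placeholder, not the real list

vision_client = vision.ImageAnnotatorClient()


def fetch_public_timeline(limit=40):
    """Pull the most recent public statuses from one instance."""
    r = requests.get(f"{INSTANCE}/api/v1/timelines/public",
                     params={"limit": limit}, timeout=30)
    r.raise_for_status()
    return r.json()


def adult_likelihood(image_bytes):
    """Return the SafeSearch adult likelihood (5 = VERY_LIKELY)."""
    resp = vision_client.safe_search_detection(
        image=vision.Image(content=image_bytes))
    return resp.safe_search_annotation.adult


def flag_statuses(statuses):
    """Flag posts that carry a suspect hashtag and a level-5 nudity match."""
    flagged = []
    for status in statuses:
        tags = {t["name"].lower() for t in status.get("tags", [])}
        if not tags & SUSPECT_HASHTAGS:
            continue
        for media in status.get("media_attachments", []):
            if media.get("type") != "image":
                continue
            img = requests.get(media["url"], timeout=30).content
            if adult_likelihood(img) == vision.Likelihood.VERY_LIKELY:
                flagged.append(status["url"])
                break
    return flagged


if __name__ == "__main__":
    print(flag_statuses(fetch_public_timeline()))
```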