During a podcast interview with Vox editor Ezra Klein, Mark Zuckerberg made a startling admission.
He said he had received a phone call from staff at Facebook's Menlo Park headquarters, who told him the company's systems had blocked attempts to send "scandalous" messages about ethnic cleansing in Myanmar.
"In that case, our systems detect what’s going on. We stop those messages from going through."
According to Bloomberg, Facebook does scan Messenger conversations, but says it doesn’t use the data from messages for advertising.
Maybe that's why all the weird sh*t we talk about pops up as ads on our FB.
Facebook went on to tell the publication that it uses the same automated scanning to "prevent abuse" in messages.
"For example, on Messenger, when you send a photo, our automated systems scan it using photo matching technology to detect known child exploitation imagery or when you send a link, we scan it for malware or viruses.
"Facebook designed these automated tools so we can rapidly stop abusive behaviour on our platform."
Meaning ALL THE NUDES have been scanned by Facebook's automated systems.