Social media giant Twitter refused to take down images and videos of a teenage sex trafficking victim after its investigation “didn’t find a violation” of the company’s “policies,” a shocking lawsuit has alleged.

According to the New York Post, a federal suit filed by the victim and his mother in the Northern District of California also alleges Twitter made money off the clips, which were allegedly widely shared. The then-13-year-old was shown engaging in sex acts, and the images are a form of child sexual abuse material, or child pornography, the legal action states.

The teenager was duped by sex traffickers, who posed as a 16-year-old female classmate to strike up conversations with him on Snapchat, the lawsuit alleges. The traffickers then convinced the boy to send nude photos, which they used in turn to blackmail him into sending increasingly explicit material. They threatened him to the point where he acceded to their demands to include another child in the videos.

The Post says:
Eventually, Doe blocked the traffickers and they stopped harassing him, but at some point in 2019, the videos surfaced on Twitter under two accounts that were known to share child sexual abuse material, court papers allege.

Over the next month, the videos would be reported to Twitter at least three times — first on Dec. 25, 2019 — but the tech giant failed to do anything about it until a federal law enforcement officer got involved, the suit states.

Doe became aware of the tweets in January 2020 because they’d been viewed widely by his classmates, which subjected him to “teasing, harassment, vicious bullying” and led him to become “suicidal,” court records show.

While Doe’s parents contacted the school and made police reports, he filed a complaint with Twitter, saying there were two tweets depicting child pornography of himself and that they needed to be removed because they were illegal, harmful and in violation of the site’s policies.

A support agent followed up and asked for a copy of Doe’s ID so they could prove it was him, and after the teen complied, there was no response for a week, the family claims.

Around the same time, Doe’s mother filed two complaints to Twitter reporting the same material and for a week, she also received no response, the suit states.

Finally, on Jan. 28, Twitter replied to Doe and said they wouldn’t be taking down the material, which had already racked up over 167,000 views and 2,223 retweets, the suit states.

“Thanks for reaching out. We’ve reviewed the content, and didn’t find a violation of our policies, so no action will be taken at this time,” the response reads, according to the lawsuit.

“If you believe there’s a potential copyright infringement, please start a new report. If the content is hosted on a third-party website, you’ll need to contact that website’s support team to report it. Your safety is the most important thing, and if you believe you are in danger, we encourage you to contact your local authorities.”

The teenager responded in shock, saying: “What do you mean you don’t see a problem? We both are minors right now and were minors at the time these videos were taken. We both were 13 years of age. We were baited, harassed, and threatened to take these videos that are now being posted without our permission. We did not authorize these videos AT ALL and they need to be taken down.”

He included his case number from a local law enforcement agency, but claims the tech giant ignored him and refused to do anything about the child sexual abuse material, which continued to rack up thousands of views.

Eventually, the boy’s mother reached the Department of Homeland Security through a mutual contact, and the videos were finally removed on Jan. 30.

“Only after this take-down demand from a federal agent did Twitter suspend the user accounts that were distributing the CSAM and report the CSAM to the National Center on Missing and Exploited Children,” states the suit, filed by the National Center on Sexual Exploitation and two law firms.

“This is directly in contrast to what their automated reply message and User Agreement state they will do to protect children.”

Twitter declined to comment when contacted by The Post.