Yup.. my bad. Wrong thread.
Musk has inherited the platform’s longtime battle with the scourge of child sexual abuse material. Since 2019, Twitter has reported some 200,000 pieces of such illegal content to the National Center for Missing and Exploited Children (NCMEC), a nonprofit that funnels these “cybertips” from tech companies to law enforcement. Former law enforcement officials told Forbes that even under Dorsey, the platform had a worse grasp of the issue and weaker reporting processes than competitors. Now, in the wake of Musk’s chaotic takeover of the company, they worry the problem will intensify.
“If it was bad when there were a bunch of people [working on child protection], imagine nobody there taking care of it,” said former police officer Yami Pence, who served as a detective in Florida’s Internet Crimes Against Children unit until last year. As part of that role, she investigated social media companies’ tips to NCMEC, including many from Twitter. She told Forbes that even under Twitter’s previous leadership, its reports to NCMEC were typically so delayed that the lag often prevented law enforcement from finding and prosecuting those involved.
Still, Twitter's trust and safety team—responsible for monitoring the platform for illegal content—has been eviscerated in the last two weeks. Many of its staffers have either been let go or resigned, most notably former head of safety and integrity Yoel Roth. With the exodus of employees following a Musk ultimatum that staff either sign up to an “extremely hardcore,” long-hours Twitter or leave, it’s unclear how many from Roth’s old team remain. But a recently departed Twitter employee told Forbes some staff directly handling the monitoring of CSAM and child grooming on the site were let go in the last two weeks. More decided to leave yesterday rather than sign up for Musk’s Twitter by his 5 p.m. ET deadline.
My question to you is, how does Twitter plan to be more responsive to these ancillary issues with a skeleton crew? It takes more than replying to tweets of two-year-old stories all day to make social media platforms safe, especially when you eliminate the part of your org tasked with monitoring content.
Maybe this is why there is a "decrease" in objectionable activity on the platform, according to Musk: nobody is reading the reports.
Easiest job he's ever held.
Jack makes a good point. Too bad Elon is only interested in making the left look bad.
So we're just going to ignore the post that Elon actually replied to?
I guess Twitter's council of deciders was too busy worrying about the spread of "misinformation"
"by relying on engineers and machine learning"
Somebody has to gather the information needed to train the AI continuously, which means that somebody has to evaluate user reports. Engineers don't do that. In addition, someone has to evaluate the AI results to separate actual hits from false positives.
If Apple can do it with end-to-end encryption on private messages, then Twitter can do it without E2E on public messages.
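For context, the scanning usually referenced in comparisons like this is hash matching: uploads are checked against a list of hashes of already-known illegal images (PhotoDNA-style, or Apple's proposed NeuralHash), rather than a model trying to "recognize" abuse on its own. Below is a minimal sketch of that idea; `KNOWN_HASHES`, the SHA-256 placeholder, and the exact-match lookup are all simplifications, since real systems use perceptual hashes and fuzzy matching so resized or re-encoded copies still hit.

```python
# Minimal sketch of hash-list screening for known flagged images.
# KNOWN_HASHES and the exact-match lookup are illustrative stand-ins;
# production systems use perceptual hashes with distance thresholds.

import hashlib
from typing import Set

# Hash list as distributed to platforms by industry/NCMEC programs (assumed here to be a plain set).
KNOWN_HASHES: Set[str] = set()

def content_hash(image_bytes: bytes) -> str:
    # Placeholder: SHA-256 only catches byte-identical copies; a perceptual
    # hash would also survive resizing and re-encoding.
    return hashlib.sha256(image_bytes).hexdigest()

def screen_upload(image_bytes: bytes) -> bool:
    """True if the upload matches a known hash and should be blocked,
    reported, and queued for human confirmation."""
    return content_hash(image_bytes) in KNOWN_HASHES
```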
Just because the team that was largely involved in the process has been depleted doesn't mean that all of a sudden no one understands what child porn is.
That's the same type of thinking that went around during the mass layoffs.
LOL. "So and so is gone, therefore these critical systems will fail because no one is left who understands them..."
There are a lot of support activities that go into making the job of engineers effective, and there's a lot of additional work that goes on after a computer spits out a result.
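To make that loop concrete, here is a rough sketch of the cycle described in the last couple of posts: humans triage user reports and their labels become the classifier's training data; the classifier only ranks content, so its hits go into a human review queue; and reviewers separate real hits from false positives before anything is escalated (for example, to NCMEC). Every name below is illustrative, not Twitter's actual pipeline.

```python
# Rough sketch of the human-in-the-loop moderation cycle described above.
# Report, ModerationLoop, etc. are illustrative names only.

from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Report:
    content_id: str
    human_label: Optional[bool] = None   # set by a trained reviewer
    model_score: Optional[float] = None  # set by the classifier

@dataclass
class ModerationLoop:
    training_set: List[Report] = field(default_factory=list)
    review_queue: List[Report] = field(default_factory=list)

    def triage_user_report(self, report: Report, reviewer_says_violation: bool) -> None:
        # 1. Humans evaluate incoming user reports; their decisions become
        #    the labels the classifier is (re)trained on.
        report.human_label = reviewer_says_violation
        self.training_set.append(report)

    def queue_model_hits(self, scored: List[Report], threshold: float = 0.9) -> None:
        # 2. The model only ranks content. Anything above the threshold still
        #    goes to a human to separate real hits from false positives.
        self.review_queue.extend(
            r for r in scored if (r.model_score or 0.0) >= threshold
        )

    def confirm_hit(self, report: Report, is_actual_hit: bool) -> bool:
        # 3. The reviewer's verdict decides escalation (e.g. an NCMEC report)
        #    and feeds back into the training set as a fresh label.
        report.human_label = is_actual_hit
        self.training_set.append(report)
        return is_actual_hit
```

Gut the people doing steps 1 and 3 and the model keeps scoring things, but nothing gets labeled, confirmed, or reported, which is the point being made above.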
LOL.
Spoken like someone who doesn't know what makes complex systems functional.
If you don't understand the resources it takes to design, maintain, and improve the systems the general public interacts with on a daily basis, none of what I've been saying will make sense to you.
What are you even arguing, dude?
That the people who left were the only people on earth capable of gathering information for engineers?
"... But you need people to train machines"
I wonder if anyone has told Elon that
The guy pushing for self-driving cars has no idea that machines need info to learn