Here we go again
"Here we go again"
With what?
"Are message boards social? Are message boards a form of media? Of course NT is social media"
I agree but if others don't see it that way okay they can have that pov too.
"I agree but if others don't see it that way okay they can have that pov too."
He doesn't understand how to live amongst those with differing opinions.
"He doesn't understand how to live amongst those with differing opinions."
Ignore him
"I never understand cats like you that are still on messageboards talking about man flip social media. I see this as something similar honestly."
This is waaay different from social media... yeah it's message based, but anonymity is optional and the instant gratification of a like or whatever isn't given out here...
In a recent essay published in The Washington Post, a mother explained her decision to continue writing essays and blog posts about her daughter even after the girl had protested. The woman said that while she felt bad, she was "not done exploring my motherhood in my writing".
One commenter criticized parents like the essay's author for having "turned their family's daily dramas into content".
Another said the woman's essay surfaces a "nagging – and loaded – question among parents in the age of Instagram. … Are our present social media posts going to mortify our kids in the future?"
These questions are valid, and I've published research about the need for parents to steward their children's privacy online. I agree with critics who accuse the woman of being tone-deaf to her child's concerns.
However, I believe the broader criticism of parents and their social media behavior is misplaced.
I've been studying this topic – sometimes called "sharenting" – for six years. Too often, public discourse pits parents against children.
Parents, critics say, are being narcissistic by blogging about their kids and posting their photos on Facebook and Instagram; they're willing to invade their child's privacy in exchange for attention and likes from their friends. So the story goes.
But this parent-versus-child framing obscures a bigger problem: the economic logic of social media platforms that exploit users for profit.
A Natural Impulse
Despite the heated responses sharenting can evoke, it's nothing new. For centuries, people have recorded daily minutiae in diaries and scrapbooks.
Products like baby books explicitly invite parents to log information about their children.
Communication scholar Lee Humphreys sees the impulse parents feel to document and share information about their kids as a form of "media accounting".
Throughout their lives, people occupy many roles – child, spouse, parent, friend, colleague. Humphreys argues that one way to perform these roles is by documenting them.
Looking back on these traces can help people shape a sense of self, construct a coherent life story and feel connected to others.
If you've ever thumbed through an old yearbook, a grandparent's travel photos or a historical figure's diary, you've looked at media accounts.
Same if you've scrolled through a blog's archives or your Facebook Timeline. Social media may be fairly new, but the act of recording everyday life is age-old.
Writing about family life online can help parents express themselves creatively and connect with other parents. Media accounting can also help people make sense of their identities as parents.
Being a parent – and seeing yourself as a parent – involves talking and writing about your children.
Surveillance Capitalism Enters the Equation
Framed this way, it becomes clear why telling parents to stop blogging or posting about their children online is a challenging proposition. Media accounting is central to people's social lives, and it's been happening for a long time.
But the fact that parents are doing it on blogs and social media does raise unique issues.
Family album photos don't transmit digital data and become visible only when you decide to show them to someone, whereas those Instagram pictures sit on servers owned by Facebook and are visible to anyone who scrolls through your profile.
Children's opinions matter, and if a child vehemently opposes sharenting, parents could always consider using paper diaries or physical photo albums. Parents can take other steps to manage their children's privacy, such as using a pseudonym for their child and giving their child veto power over content.
However, debates about privacy and sharenting often focus on a parent's followers or friends seeing the content. They tend to ignore what corporations do with that data.
Social media didn't cause parents to engage in media accounting, but it has profoundly altered the terms by which they do so.
Unlike the diary entries, photo albums and home videos of yore, blog posts, Instagram photos and YouTube videos reside on platforms owned by corporations and can be made visible to far more people than most parents realize or expect.
The problem is less about parents and more about social media platforms. These platforms increasingly operate according to an economic logic that business scholar Shoshana Zuboff calls "surveillance capitalism".
They produce goods and services designed to extract enormous amounts of data from individuals, mine that data for patterns, and use it to influence people's behavior.
It doesn't have to be this way. In her book on media accounting, Humphreys mentions that in its early days, Kodak exclusively developed its customers' film.
"While Kodak processed millions of customer photos," Humphreys writes, "they did not share that information with advertisers in exchange for access to their customers. … In other words, Kodak did not commodify its users."
Social media platforms do just that. Sharenting tells them what your child looks like, when she was born, what she likes to do, when she hits her developmental milestones and more.
These platforms pursue a business model predicated on knowing users – perhaps more deeply than they know themselves – and using that knowledge to their own ends.
Against this backdrop, the concern is less that parents talk about their kids online and more that the places where parents spend time online are owned by companies that want access to every corner of our lives.
In my view, that's the privacy problem that needs fixing.
Tips for committing suicide are appearing in children’s cartoons on YouTube and the YouTube Kids app.
The sinister content was first flagged by doctors on the pediatrician-run parenting blog pedimom.com and later reported by the Washington Post. An anonymous “physician mother” initially spotted the content while watching cartoons with her son on YouTube Kids as a distraction while he had a nosebleed. Four minutes and forty-five seconds into a video, the cartoon cut away to a clip of a man, who resembles Internet personality Joji (formerly Filthy Frank). He walks onto the screen and simulates cutting his wrist. “Remember, kids, sideways for attention, longways for results,” he says and then walks off screen. The video then quickly flips back to the cartoon.
Man giving kids wrist-slitting tips in the middle of a cartoon found on YouTube.
“I am disturbed, I am saddened, I am disgusted,” the physician wrote. “But I am also relieved that I was there to see this video with my own eyes, so that I could take the appropriate actions to protect my family.” Those actions included deleting the YouTube Kids app and forever banning it from the house.
That particular video was later taken down from YouTube Kids after the doctor reported it to YouTube. However, parents have since discovered that several other cartoons contain information about how to commit suicide, including the same spliced-in video clip. In a subsequent blog post, pediatrician Free Hess, who runs pedimom, reported another cartoon—this time on YouTube—with the clip spliced in at four minutes and forty-four seconds. That cartoon was also later taken down, but Hess captured a recording of it beforehand, which you can view on the blog.
In an emailed statement, a spokesperson for YouTube told Ars:
We work to make the videos in YouTube Kids family-friendly and take feedback very seriously. We appreciate people drawing problematic content to our attention, and make it possible for anyone to flag a video. Flagged videos are manually reviewed 24/7 and any videos that don't belong in the app are removed. We’ve also been investing in new controls for parents including the ability to hand pick videos and channels in the app. We are making constant improvements to our systems and recognize there’s more work to do.
Nadine Kaslow, a past president of the American Psychological Association and professor at Emory University School of Medicine, told the Post that simply taking down the videos isn’t enough. “For children who have been exposed, they’ve been exposed. There needs to be messaging—this is why it’s not okay.” Vulnerable children, perhaps too young to understand suicide, may develop nightmares or try harming themselves out of curiosity, she warned.
Suicide is the third leading cause of death among individuals between the ages of 10 and 24, according to data from the Centers for Disease Control and Prevention. However, more youths survive suicide attempts than die. Each year, emergency departments nationwide treat self-inflicted injuries in 157,000 youth between the ages of 10 and 24. Sixteen percent of high-school students reported seriously considering suicide in a nationwide survey.
Suicide tips stashed in otherwise benign cartoons are just the latest ghastly twist in the corruption of kids’ content on YouTube and YouTube Kids. For years, the video-sharing company has struggled with a whack-a-mole-style effort to keep a variety of disturbing and potentially scarring content out of videos targeting children. Videos have been found with adult content ranging from foul language to depictions of mass shootings, alcohol use, fetishes, human trafficking stories, and sexual situations. Many contain—and attract clicks with—popular cartoon characters, such as Elsa from the 2013 animated Disney film Frozen. This chilling phenomenon has been referred to as Elsagate. Though YouTube has deleted channels and removed videos, Hess points out that it’s still easy to find a plethora of “horrifying” content aimed at children on YouTube Kids.
Last week, YouTube lost several advertisers, including Fortnite maker Epic Games, Disney, and Nestle, over a “wormhole into a soft-core pedophilia ring.”
If you or someone you know is feeling suicidal or in distress, please call the Suicide Prevention Lifeline number: 1-800-273-TALK (8255), which will put you in touch with a local crisis center.
Been working in soc services for a minute and one thing I realized is black folks aren’t anti family as media portrays. They love hard and show empathy way more than other ethnic groups.

Probably the parents are products of getting "whooped" and clowned on themselves back in the day so they extra protective so their kids won't have as bad as them. That's what I think goes on with most of this stuff folks too hype about.
This Is The Real Problem With Constantly Posting About Your Kids Online
https://www.sciencealert.com/the-re...s-sharing-their-child-s-lives-on-social-media