S12 Episode 4: Scrolling to Death // Nicki Reisberg

February 5, 2025

Hosted by Hillary Wilkinson

"(a mom) …found screenshots on her son's phone of the chatbot suggesting that he kill his parents because of his screen time limits."

~ Nicki Reisberg

As a former social media executive turned social media reform advocate, Nicki Reisberg hosts Scrolling to Death, a podcast for parents who are worried about social media. It's a safe space to amplify stories of harm while educating parents on how to keep their kids safe in a world that is trying to addict and manipulate them.


In this episode, learn all about the broken system of tech in our schools and the new threat of Character.AI. Listen now!



Show Transcript

Hillary Wilkinson: (00:02)

If you're into golf, you go to the Golf Channel; sports, you hit ESPN; economy or politics, you have your own trusted resources. My point is that whatever specialty, niche, or area you're interested in, you can find a news source. And that is where my guest today comes in. As a former social media executive turned social media reform advocate, she hosts Scrolling to Death, a podcast for parents who are worried about social media. It's a safe space to amplify stories of harm while educating parents on how to keep their kids safe in a world that is trying to addict and manipulate them. Welcome, and thank you for being here, Nicki Reisberg.


Nicki Reisberg: (01:16)

Thank you, Hillary. You have such a good podcast voice. I feel like I need to hire you for my intro.


Hillary Wilkinson: (01:22)

No, and I listen to you so much. I'm like, oh, there's the pro.


Nicki Reisberg: (01:27)

Oh no. Nobody likes their own voice, you know?


Hillary Wilkinson: (01:31)

Exactly. Nicki, you had a career in media and tech prior to turning to this role of advocacy, so you were probably used to seeing yourself a little bit.


Nicki Reisberg: (01:43)

What? No, I was behind the scenes. I was pretending to be companies or executives on social media, acting as them, writing their posts, posting and interacting. So I was like a social media manager. Yeah.


Hillary Wilkinson: (02:01)

Wow. So you were like a mole on the inside for digital wellness.


Nicki Reisberg: (02:08)

Yeah.


Hillary Wilkinson: (02:08)

What brought you to this point in digital wellness?


Nicki Reisberg: (02:12)

Oh my gosh. So 


Hillary Wilkinson:

You've crossed over.


Nicki Reisberg: (02:15)

Yeah. Um, I ran my own social media marketing business for 10 or 15 years, I always get confused on the timeline. But I was just starting to realize how manipulative the whole thing was and how much I was able to learn about these people who were giving away all this information on social media. And, you know, my kids were getting older and I was starting to think about how much I was sharing them on social media, and what I was gonna do when they started to ask me for devices and platforms like this. So I started to do research, and I was just shocked to learn about the issues around social media use, particularly for young people, and how much anxiety it was causing them. And so I thought, you know, I'm not feeling good anymore about what I'm doing in my social media role.


Nicki Reisberg: (03:09)

And I switched it up. You know, I wanted to create a space where we can learn what's really happening behind the scenes of social media, as parents, in order to inform our decisions. One of the big things I try to do is share real stories from parents who have suffered some sort of social media harm or even lost a child. Those are extremely hard stories, but they are so similar to what a normal family is dealing with in their home. And so what I created is a safe space for those stories to be shared.


Hillary Wilkinson: (03:42)

Mm-hmm. And so, just so I know what your timeline was, when did you start Scrolling To Death?


Nicki Reisberg: (03:51)

2023. So just September of last year, I guess, depending on when this goes out. September of 2023. But I had deleted my personal social media in 2020, so I had not been active on social media personally in three and a half years at this point. So I am not used to being on screen and being the talking head, and that's been an interesting transition for sure.


Hillary Wilkinson: (04:16)

Yeah. No, it's super interesting to hear everybody's path. We've got a lot of folks who will call themselves accidental activists or intentional activists, you know? And sadly, a lot of the accidental sorts come from a place of loss, like you've said. Those are really hard stories for me to do, and I feel like you are right in the thick of that. With all the research you've done and the stories you cover, I feel like we're in the same boat in believing that social media definitely has a role to play in the anxiety, the depression, the hopelessness that a lot of the kids we work with, or that we hear stories about, are dealing with.


Hillary Wilkinson: (05:47)

What do you find are the biggest threats that kind of fly under the radar of most parents? 


Nicki Reisberg: (06:00)

I think that most parents are still giving their kids iPhones, which is just a base issue. It's a societal change that needs to happen. These Apple iPhones are not safe for kids. They weren't built for kids. The parental controls suck. And so parents need to start thinking about, if that's the phone they're gonna choose, they have to give it way later, way, way, way later. Sixteen, you know. And if you wanna give your kid a device earlier, it needs to be a kid-safe device, and there are plenty of companies that do that, or even a watch. So that's something where you can stop the harms at the source. And if and when you do give your child an iPhone, those App Store ratings are not to be trusted. This is something I'm learning a lot about right now: something can be rated 12-plus in the App Store but will expose your child to sexual exploitation or anything else you don't want them accessing.


Hillary Wilkinson: (06:57)

And the companies write their own ratings. The companies do the age ratings. There was an act back in the day called the EARN IT Act that kind of got shelved, I mean, I'm talking pre-COVID days, but what it was supposed to do was write age-based ratings into legislation.


Nicki Reisberg: (07:19)

Go ahead. Yes. And the issue with it is that it's the app company, right? They're filling out a little form to say what their age rating should be, and then the App Store reviews it but very rarely challenges the app, because Apple's App Store makes them a ton of money. I think about 20% of their revenue comes from the App Store. So Apple benefits from having a lower age rating within the App Store for any of these apps. There should be a third party reviewing these apps, and reviewing them regularly.


Hillary Wilkinson: (07:55)

Yeah. A third party who specializes in child development. 


Nicki Reisberg: (08:00)

Correct. You're right. I mean, this is a huge broken system that parents are just trusting. And why wouldn't we? We don't think companies are lying to us or trying to deceive us, but that is exactly what is happening. So I would say do your homework on Apple and its products and the App Store, and don't trust that what they're telling you is safe truly is. One more threat I think is flying under the radar is AI chatbots.


Hillary Wilkinson: (08:30)

Absolutely.


Nicki Reisberg: (08:31)

One of the largest companies that offers this product is called Character.AI. It's an app in the App Store, and it's also available on internet browsers. It came out in 2022, and we're just now realizing that these chatbots are extremely unregulated and unsafe and are actually emotionally and sexually abusing children. If you just look up Character.AI, you'll find these stories. I've been able to interview a couple of parents who have either lost their child because the chatbot encouraged them to take their life, or another child who is self-harming and actually in the hospital currently because the chatbot taught him to hurt himself.


Hillary Wilkinson: (09:16)

How to cut? Yeah. Mm-hmm. NPR just had a, we're recording in early December of, what year are we, 2024.


Nicki Reisberg: (09:27)

Yeah. It's almost done. So, yeah.


Hillary Wilkinson: (09:29)

And NPR just broke a story this morning about a chatbot that encouraged a child to murder his parents due to screen time limits. Yeah.


Nicki Reisberg: (09:43)

So that is the interview that I have with that mom, who found these screenshots on her son's phone of the chatbot suggesting that he kill his parents because of his screen time limits.


Hillary Wilkinson: (09:56)

Yes.


Nicki Reisberg: (09:57)

You've never heard of anything as horrible as that. And ever since I shared that story, I'm now getting messages from other parents with screenshots of literal abuse and grooming happening to their children on this Character.AI chatbot app. And there are others. And so we need to be very careful about


Hillary Wilkinson: (10:13)

This. Oh, yeah. I went into Character.AI and created a false account as a 15-year-old girl. I deliberately kept my interactions to just a couple of words; I was trying to be very 15, you know? Each interaction was no more than five or six words. And within four lines of chat, it turned highly sexual and highly, for lack of a better word, misogynistic. And when I was saying, I want to go, let me go, it kept coming.


Nicki Reisberg: (11:03)

 Oh my gosh.


Hillary Wilkinson: (11:04)

So it's a, it's a dangerous space. Mm-hmm


Nicki Reisberg: (11:07)

Yeah. It was rated 12-plus in the App Store from 2022 until, I think, April of 2024. So parents are trusting this, and that's what their kids are getting: access to something that's gonna exploit them and groom them. Any parent should go on there and test it. I'm not trying to get them more downloads, but I think we should test it for ourselves.


Hillary Wilkinson: (11:28)

Yes, I agree. Before


Nicki Reisberg: (11:29)

We let our kids use it. Or don't. Definitely don't let your kids use it.


Hillary Wilkinson: (11:32)

Yeah. Yeah, for sure. When we come back, we're going to talk a little bit about ed tech and how tech in our schools is affecting our kids.


__________________

Ad Break: HSH Workbook

__________________


Hillary Wilkinson: (12:24)

I'm speaking with Nicki Reisberg, the host of the podcast Scrolling To Death and an ardent activist surrounding online harms. It's the mission of Scrolling to Death, and also its listeners, to force change through kids' online safety legislation and influence societal shift. And I think the core belief, or the core mission, is to just let kids be kids for as long as possible. This includes delaying smartphones and social media, which we covered earlier, but also educating our kids on the fact that they're a product to these companies and nothing more. Nobody is looking out for them. Nobody will ever look out for you like your mom will. That's true. So, Nicki, let's talk a bit about ed tech: education tech and school-issued devices, which I know you have had your own line of experiences with. Can you share?


Nicki Reisberg: (13:37)

Sure. I've been having an issue with those things for years. You know, I think that these big tech companies, mainly Apple and Google, who make the Apple iPad and the Google Chromebook, which are being handed to children as young as four years old in TK or pre-K at public schools and even private schools, are making $60 billion off of these government contracts to provide technology to schools that is not only harmful, it's not very educational. They gamify learning, which is proven to not work very well. Kids are getting access to super inappropriate things on these devices. Dangerous advertisements are popping up. Our children's data is being stolen from them without our approval and sold to third parties. So there are a bunch of issues with it. But personally, I've been trying to advocate within my own children's public school for years to limit their use and just cut back on the amount of screen time.


Nicki Reisberg: (14:38)

I had a particular issue with YouTube and them having access to YouTube, 'cause I know how dangerous that platform can be. And so I was going back and forth with the school for over a year to try to get them to remove it. And then one day I decided to just do a spot check on my third grader's iPad. So I went into the classroom, went to YouTube, and clicked on Shorts, which is like the short, TikTok-style videos that they recommend for you. And I scrolled for under two minutes and was served a blatant self-harm video, a video saying hurting yourself is easy, and showing a cut with blood dripping down. Luckily I found that, but that is something that would've absolutely been served to my daughter had she just visited YouTube. She would've watched that, and then YouTube would've served her more and more of this type of content until she thought it was normal to hurt herself.


Hillary Wilkinson: (15:31)

Right.


Nicki Reisberg: (15:32)

So I think the big lesson for me is: just because this tech is at school does not mean it is safe. Some school districts are doing the best that they can given the resources that they have, but the safety features are so limited that I've had to try to opt out at this point, because I don't trust these devices to be safe enough for our children. And through sharing this story, I've had hundreds of thousands of parents reach out with very similar concerns. So I am now creating, and it'll be live by the time this goes out, something called the Tech Safe Learning Coalition, a whole website where I will be offering resources to parents on how to advocate, ask the questions, and make requests of their school to help their kids use these devices more safely.


Hillary Wilkinson: (16:24)

Oh, you totally preempted my next question. Oh, sorry. No, no, it's kind of a perfect segue. Good, good. Because what I was thinking was: how can parents effectively communicate their wants, wishes, and desires surrounding tech use at school, both with the school and by building their own groups of parenting-support friends? I feel like it helps if you have several parents together. I mean, I'm a mom, I know what it's like. You don't wanna be that mom, even though I'm that mom.


Nicki Reisberg: (17:09)

Hillary


Hillary Wilkinson: (17:10)

I know you are. I know, I know. But Nicki, honestly, you're a total baller. You're running this Scrolling To Death podcast, you do the hard stuff. Not everybody is cut from Nicki Reisberg cloth, is what I'm saying. You know what I mean? There's plenty of people who are like, oh gosh, I don't wanna do that, but I don't wanna be that mom. You know? So how can people get support and forward movement?



Nicki Reisberg: (17:43)

Yeah. And I don't like confrontation. I don't like this role. When it comes to the school, it is uncomfortable having to speak up, for some reason. I know it should be our right. I mean, it's our kids using these devices for hours every day; it should be our right to ask questions about that and speak up. But for some reason it feels very uncomfortable. So, you know, I think that talking about it with people at pickup, talking about it with your kids' friends' parents, is completely appropriate and actually necessary, to compare notes. And that's why I thought to actually check my daughter's iPad that day: the dad of another boy at school, who I'm friends with, had told me his son received a pop-up ad of a naked woman while he was at school.


Nicki Reisberg: (18:26)

And so you'll compare stories and probably be shocked to learn what you are hearing or seeing. I think that parents should not be afraid to ask questions of their principal: how are their kids being kept safe on these devices? What apps do the kids have access to? What data is being taken from their children and logged, and who is it being shared with? So I've created resources with some of these questions. But I think, too, that device does not belong to your child. You have every right to walk into that classroom and say, I'm gonna take a quick look at my child's device and check that search history. Type some websites that may be problematic into the search bar and test it out yourself, especially if they bring it home; you can have lots of time with it to poke around, really dig in, and see what they can access. And then help out the district: hey, I was able to access this and this and this, and I need to let you know about that. So it's all about being brave enough to speak up and have those conversations, and remembering that it's really important that we do so, 'cause our kids are spending time on that thing and we don't know what they're looking at.


Hillary Wilkinson: (19:39)

Yeah, and it's interesting, because when I've had conversations about the devices at school, the thing the admin will typically fall back on is the school firewall, which is fine and well as long as the device is on school property, and that's not to say the firewall is completely effective. But I don't think many parents realize that once the tech leaves the property, kids aren't necessarily protected off grounds.


Nicki Reisberg: (20:11)

So I've heard different things on that. I've heard that, yes, they are protected at home as well. I've heard they're not. I've heard parents can't add any additional protections, that's another thing, or maybe only until a certain time of night. That's not cool. I should be able to shut that thing down and add additional parental controls. So yeah, it's confusing. It's very confusing, and I think it is maybe meant to be that way. The product designers don't want parents messing around on it; they wanna get as much data as they can. It's still all about our kids' data. They want that data. Data is gold. And so that's all it comes down to.


Hillary Wilkinson: (20:50)

Right. When we're talking about school devices and school usage, it's the streaming that is the most problematic. It's the YouTube, it's the YouTube Shorts, it's all of that. When you go in and see the minutes allotted, how much time they have spent, I mean, there are people who are realizing that their child has essentially spent four and a half hours of a six-hour school day watching YouTube videos. And it's not an uncommon story at all.


Nicki Reisberg: (21:28)

No, not at all. And we can't blame them for that.


Hillary Wilkinson: (21:32)

Oh, no!


Nicki Reisberg: (21:33)

So one of the reasons why they're so ineffective is because of the distractions.


Hillary Wilkinson: (21:38)

For sure.


Nicki Reisberg: (21:38)

Access to addictive platforms like YouTube, for sure. YouTube cannot and should not be available at school. That's crazy.


Hillary Wilkinson: (21:46)

I agree. And here's what I don't get, Nicki. I mean, as taxpayers, public schools are something that we fund.


Nicki Reisberg: (21:57)

Yeah.


Hillary Wilkinson: (21:58)

Why aren't taxpayers livid at this blatant misuse of funding? I mean, we both live in Southern California; it's not cheap to live here. How is it that our taxes are going to having kids watch YouTube for four and a half hours a day?


Nicki Reisberg: (22:25)

We've gone way too far with this. I think people are coming to and realizing that we were just trusting the schools, that because that device is at school, it must be kept safe. And we're finally realizing, oh, that's not the case. There are entire law centers now, ed tech law centers, that I talk to a lot because kids are getting so harmed by these devices at school that they need legal counsel. So, you know, parents, we have to use our voices, whether that's just speaking up to our principals. I've gone to board meetings and spoken there. It sucks. It's intimidating. You have a three-minute limit and then your buzzer goes off, and it's not even a back-and-forth conversation. So I find that process, I think it's necessary, I guess it's the only option, but I don't find it to be productive. So I've been emailing the board members with backup and data around my request, now, to opt out of devices entirely. I've been told I cannot, so I'm actually having to take legal action against the school. I think there's gonna be a lot changing around this in 2025, given the results of some of these lawsuits.


Hillary Wilkinson: (23:34)

I hope we see some success in putting kids and learning at the center of education rather than big tech profits. Right?


Nicki Reisberg: (23:48)

Sure.


Hillary Wilkinson: (23:49)

Makes sense. So, before we go to our next break, I just have to ask you real quick, because you're sort of uniquely qualified. I know what our answer is to this, but I'm wondering what yours would be. What do you find is the most problematic social media app?


Nicki Reisberg: (24:09)

Oh, Hillary, this is a hard one. I don't like any of them. Um, number one is probably Snapchat. Number two, TikTok.


Hillary Wilkinson: (24:17)

Okay. Yeah.


Nicki Reisberg: (24:18)

I'll just leave it at that.


Hillary Wilkinson: (24:23)

Yeah. No, we're aligned on Snapchat being the number one most problematic app, for so many reasons.


Nicki Reisberg: (24:33)

So many. Too many.


Hillary Wilkinson: (24:34)

So many, which have been covered on previous episodes of this podcast as well. Yep. But I had to ask, 'cause I thought, ooh, I don't know when I'm gonna get this opportunity to chat with you again.


Nicki Reisberg: (24:48)

I know, I know.


Hillary Wilkinson: (24:49)

Okay. So we have to take a short break, but when we come back, I'm going to ask Nicki for her healthy screen habit.


_________

Ad Break : Bark

_________


Hillary Wilkinson:

I'm speaking with Nicki Reisberg, host of the Scrolling to Death podcast, whose mission is to force change through kids' online safety legislation, influencing societal shift to let kids be kids. And Nicki, on each episode of the Healthy Screen Habits podcast, I ask every guest for a healthy screen habit. This is going to be a tip or takeaway they can put into practice in their own home. What is yours?


Nicki Reisberg: (25:36)

What I've been thinking a lot about is a tech-together approach. So it is never letting your child do tech alone. I know that can be hard to maintain through the teen years, but this is especially applicable when they're young. So this means if they're playing a video game, they're playing it with you. If you're watching a movie, you're watching it together. There is no taking tech into the bathrooms or the bedrooms and doing it alone. No headphones. This is so seriously important, and I think a great lesson for them as they get into the teen years, when they are more integrated with their tech. I'll share one little story. Someone I work out with, she's a mom who has three teenagers. She's been listening to my podcast and recently started taking the kids' phones away at 8:00 PM. It's only been four or five days. She's noticed withdrawal symptoms, but also noticed that they are talking more to each other. They're engaging, they're waking up early, making their own breakfast, they're sleeping better, they're happier. And that's just in a few days. So I think the lesson there for the teen years is also tech together whenever possible, but it is no devices alone. No devices in the bedroom, bathroom, those kinds of areas. It's common areas only.


Hillary Wilkinson: (26:54)

Yeah. Just in that recommendation alone, you've hit two of our core five healthy screen habits, which are giving your phone a bedtime and no tech, no connected devices, I should say, in bedrooms or bathrooms. Huge. As always, you can find a complete transcript of this show, as well as a link to Scrolling to Death, by visiting the show notes for this episode. Do this by going to healthyscreenhabits.org, clicking the podcast button, and finding this episode. Nicki, thank you so much for being here today and for working so hard to keep all the news on online harms and everything else right at the top of our feeds.


Nicki Reisberg: (27:44)

Thank you, Hillary. This was so fun.



About the podcast host, Hillary Wilkinson


Hillary found the need to take a big look at technology when her children began asking for their own devices. Quickly overwhelmed, she found that the hard and fast rules in other areas of life became difficult to uphold in the digital world. As a teacher and a mom of 2 teens, Hillary believes the key to healthy screen habits lies in empowering our kids through education and awareness. 


Parenting is hard. Technology can make it tricky. Hillary uses this podcast to help bring these areas together to help all families create healthy screen habits.

