Hosted by Hillary Wilkinson
On March 28th of this year, Children and Screens: Institute of Digital Media and Child Development announced the release of a landmark report unveiling the significant impacts of the UK's Age Appropriate Design Code (AADC) on digital platforms.
The Institute's review identified an unprecedented wave of 91 changes made across leading social media and digital platforms. Find out what they are when you listen to this episode!
UK Information Commissioner: Introduction to the Children's Code
Healthy Screen Habits Podcast: Season 8, Episode 13 with Kris Perry
Hillary Wilkinson (00:38):
On March 28th of this year, Children and Screens: Institute of Digital Media and Child Development announced the release of a landmark report unveiling the significant impacts of the United Kingdom's Age Appropriate Design Code on digital platforms. The Institute's review identified an unprecedented wave of 91 changes made across leading social media and digital platforms, all aimed at fostering a safer, more secure, and age appropriate online environment. This underscores the crucial role of regulation in improving the digital landscape for children and teens. We at Healthy Screen Habits speak with families and parents every day who are battling over technology in their homes, and we firmly believe that without legislative backing and regulation, big tech has not proven itself trustworthy to be its own gatekeeper. Kris Perry, the Executive Director of Children and Screens, is here today, and I can think of no better person to both review this landmark report and kind of translate it for me. If you're interested in learning more about Kris and the important work that Children and Screens does, please go back and listen to season eight, episode 13, when we chatted about the outstanding work they do supporting digital wellness. I don't know a ton about the UK's Age Appropriate Design Code, and I am really looking forward to learning more. Welcome back to Healthy Screen Habits, Kris.
Kris Perry (02:30):
Thanks, Hillary. Hi.
Hillary Wilkinson (02:33):
Hi. Okay, so let's just get right into it. There's a ton of media coverage around tech and kids and mental health, and I'm wondering, so because we're gonna talk about something super specific, I wanna just back it up a little bit and set a real definition around what it is that we're talking about. Can you explain what the age appropriate design code is and how, how it impacts child data privacy and their safety as well?
Kris Perry (03:06):
Sure. So the Age Appropriate Design Code, or the AADC, uh, is a set of 15 interlinked standards which establish a statutory code of practice to ensure that youth data privacy and safety are handled in accordance with the European Union's General Data Protection Regulation (GDPR) and the 2018 Data Protection Act (DPA). Um, so these interlinked policies were intended to have a positive impact on the digital landscape for children and teens by helping to guide specific changes that would ultimately make platforms safer.
Hillary Wilkinson (03:58):
Okay. And, uh, it went into place in 2021, correct?
Kris Perry (04:03):
They started in 2018 with
Hillary Wilkinson (04:06):
The, oh gosh. Okay.
Kris Perry (04:08):
GDPR, and then it was followed by the AADC, the Age Appropriate Design Code, in 2021. So you were right, but these are stacking policies, and one preceded the other.
Hillary Wilkinson (04:21):
Okay. Okay. So as a result of these stacked policies, have we seen any specific improvements made by tech companies in areas like youth safety, age appropriate design, privacy, time management, those things? You talked about the 15 interlinked standards, but are they, are they addressing the needs <laugh>, I guess is what I'm asking?
Kris Perry (04:54):
Yes, they, they are. Okay. And you noted those four key areas, but just to highlight some examples within each: there are youth safety and wellbeing improvements, and that would include mental health tools, stronger reporting mechanisms, um, tools to reduce cyberbullying and other harmful content, as well as limits on personalized advertising. Those would all fall into the youth safety and wellbeing category. The next one, age appropriate design, um, is where a user's developmental stage is considered in any design, including default settings. These include things like age specific settings for content or time allowances. The third area is privacy and security. This includes giving minors more control over their data, um, such as the ability to delete information about themselves, reducing the data that is collected and transferred, and default settings that prioritize privacy. And then the fourth category is time management. And this includes things like turning off push notifications for youth and tools to help them manage their time online.
Hillary Wilkinson (06:12):
I feel like these are tools that wouldn't just benefit youth, but would benefit everyone. I mean, I, I know I go in and do those things manually, but having them set by default would be incredibly helpful, because I know with so many of our updates, things get reset and the gates open again. So it would help me manage my own time <laugh>. Um, okay. So it's been in place a couple of years now. Uh, do you wanna talk about results? What are we seeing?
Kris Perry (06:51):
Uh, yes, there are these changes and they are improvements, and yet there's still so much more work to do. Um, one of the areas we are concerned about at the Institute is just the need for more research. Mm-Hmm, <affirmative>. There's what we can observe happening in the real world, what children are doing online, how long they're online, but we don't have access to what the platforms have: the data that's collected, all of the design features of their products. And we are at a point where it would really be helpful if researchers could fully examine, across all platforms, how changes are made, what the experiences of youth are, and what some of the downstream impacts or outcomes are, such as mental health or safety and wellbeing. That kind of research, I feel, is still very necessary and was not yet addressed in these, um, changes we just talked about.
Kris Perry (07:56):
And as you pointed out in your opening, it's also unlikely that the platforms will go beyond Mm-Hmm. <affirmative> what's required in the AADC, despite perhaps knowing that there are many other design features or changes they could make that would improve the safety of their products. Back to my point about research: with the ability to see what all of the features are, what data is collected, and how it's being used, researchers could also weigh in and give input to the platforms, or to policymakers, or even to parents about changes they could make, or assist the platforms in making, so that the products are safer. It's really all about having access to all of the information the platforms have, so that we're making, you know, evidence-based recommendations to them about how to improve the safety of their products.
Hillary Wilkinson (08:53):
Sure. Sure. So we need a request for transparency, kind of opening up the black box, is what it sounds like. Excellent. Okay. So when we come back, we're gonna talk about how the UK's AADC, I'm gonna use the fancy acronym <laugh>, compares with digital safety standards for children in the United States.
Ad Break - Thank you donors!
Hillary Wilkinson
I'm speaking with Kris Perry, the Executive Director of Children and Screens: Institute of Digital Media and Child Development, and today we are exploring the UK's Age Appropriate Design Code, which we also refer to as the UK's AADC. So Kris, how does the approach of the UK in implementing this age appropriate design code compare with actions that are being taken in the United States?
Kris Perry (10:37):
Great question. There are many examples of policy change at the federal level in the United States, as well as within states, just in 2024 alone. But before we talk about some of those examples, it's important to give credit to the UK and the EU for shining a light on the importance and the impact that legislation can have on the safety of online products for kids, and on how we can have healthier digital spaces for young people. Um, the UK's approach is national, and, um, in the United States, while there is some federal level policy change pending, as I mentioned, there are a number of, uh, state level changes in the works as well. One example would be California: in 2022, they passed an age appropriate design code bill that happens to be challenged right now in the courts by the platforms. And in just the last few months, we've seen similar codes emerging in Minnesota, Maryland, and New Mexico.
Kris Perry (11:40):
And there are even other policies emerging from Vermont and New York in just the last few days. So in the absence of federal legislation in the United States, states are starting to take action themselves. But let's talk a little bit about what is pending at the federal level. There really are two major pieces of legislation. One is called the Kids Online Safety Act, and the other is called the Children's Online Privacy Protection Act 2.0. The acronyms are KOSA and COPPA. KOSA is most similar to the AADC in the sense that it really is, um, giving guidance to companies about the design of their products, much like the AADC. COPPA 2.0 is really more of a privacy bill. Mm-Hmm. <affirmative> And both are important. They're addressing different issues related to these online products, but they are national, which we now know from the UK has positive effects on the whole country, which could be great for the United States.
Hillary Wilkinson (12:48):
Yeah. Yeah. And, um, if you'd like any more information on KOSA or COPPA 2.0, we have that on our website as well. So, just because we were talking about transparency, uh, I just want it to be known that the group that has stymied the California Kids Code by filing the lawsuit is NetChoice, which is a coalition of trade organizations representing the country's largest tech companies. I think it's kind of important that we understand where the stumbling blocks are happening, because across the board we have seen KOSA and COPPA 2.0 to be a bipartisan issue. It's something that everybody agrees on, except for maybe the tech companies <laugh>, who are giving some pushback <laugh>. So there are some kind of common misconceptions about content moderation under the UK's AADC, and NetChoice has pushed back with concern about freedom of speech. Can you talk about that? What are some common misconceptions about content moderation, and does it go against freedom of speech?
Kris Perry (14:22):
These are such great questions, and I appreciate that you're pointing out more of the detail about where there is agreement and disagreement. It's important to note that there might not be any other pending legislation right now in Congress with this level of bipartisan support. Last time I checked, there was a filibuster-proof majority of senators, something like 68, in agreement that KOSA should be voted on and approved, and the House last week created its own, um, KOSA bill. And the hope is that they'll all align and we will see something passed this year. Um, where we are seeing some disagreement is between the policies at the federal and state level and the platforms themselves. Mm-Hmm. <affirmative> And one of the concerns the platforms have raised is content moderation. And some people might see that as a euphemism for First Amendment or free speech rights.
Kris Perry (15:21):
And it's important to note that the UK AADC didn't require any content moderation, and yet it could still very effectively protect children's data and privacy rights. So, um, we don't believe that there is a requirement for the companies to enforce anybody else's, uh, content standards. They can choose how to moderate content themselves. And, you know, we think it sometimes muddies the issue for them to say there are content moderation issues when that really hasn't been proposed up to this point. There have also been, um, groups of kids, marginalized youth and others, who have expressed concern about some online safety bills, insofar as they have positive experiences online building community and connecting with peers. And the authors of KOSA and others have addressed those concerns through changes to how it would be enforced and, um, how the child would be able to control some of the content. So we're at a point where the concerns of the groups who would be affected by the law have been addressed. Members of Congress have come to agreement on major provisions within KOSA, and yet there is disagreement between the platforms and Congress on how to move forward. It really remains to be seen at this point how it'll all be resolved this year.
Hillary Wilkinson (16:59):
Okay. Yeah. Yeah. I'm glad, thank you for touching on that. I know, like you said, special interest groups had concerns about KOSA, and those have been addressed and cleared, but some people have not received updated information along those lines, because I continue to hear that same argument that got a lot of press around, you know, earlier versions of KOSA, and people just haven't realized that those concerns have been addressed and adapted to now. So, um, like we were saying, the big tech platforms have responded to the UK's code, and, you used the term standards earlier, it just made me think: is there variability in how different platforms are complying? Is there standardized compliance? Are they creating their own paths? How, how is it going <laugh>?
Kris Perry (18:13):
Well, let's, let's back up and talk just a minute about the methods we used Okay. to do the report, and then I can talk to you a little bit about how the different companies are going about addressing the code requirements. So the way we went about this was we started, um, checking to see when companies made public statements about changes they'd made. We knew when the AADC went into effect, but the only way you could really tell if a change had been made to be in compliance with the AADC was to check press releases or other public information where the companies announced that they had made a change. So that was the methodology used, and we looked for those statements between May of 2018 and September of 2023. As we talked about earlier, some laws went into effect before others, so we were checking over a five-year period. And we really focused in on the four most popular social media platforms among youth in the UK and the US, so that we were sure to capture the companies that were gonna have the greatest impact if they made changes, if that makes sense.
Hillary Wilkinson (19:23):
Can you name drop? Can you, can you tell us what they were? <laugh>
Kris Perry (19:25):
Um, so now we can go and talk about how these four companies went about making changes, and those were Instagram, Google Search, YouTube, and TikTok. Mm-Hmm. And among the four, they did make changes in the categories we talked about at the beginning, for example, youth safety and wellbeing, right? That was a category we talked about at the very beginning. And when we went and looked for public statements from those four platforms, we would learn how they had approached that category themselves, independently from each other. For example, Instagram, to address youth safety and wellbeing, in May of 2018 put in a new filter that hides comments containing attacks on a person's appearance or character, as well as threats to a person's wellbeing or health. That's amazing. That's really great. In June, one month later, Google Search announces users can report a search result for being spam or phishing or malware.
Kris Perry (20:30):
That's good. A couple months later, Instagram again announces, around youth safety and wellbeing, using machine learning technology to proactively detect bullying in photos and their captions and send them to their community operations team to review. In other words, they're using an upstream technology to capture online abuse rather than only leaving it to the child to find it and turn it in. So those are a couple really tangible results that happened in that one category. We can talk about age appropriate design, that's a separate category, and a couple of the changes. Let's just use TikTok. In April 2019, they upgraded the optional restricted viewing mode that limits inappropriate content, and the feature is activated via password and valid for 30 days. In other words, they're really curtailing how long you can view something that they perceive as inappropriate. Mm-Hmm. <affirmative> TikTok then, a couple months later, also says only users aged 18 and over are allowed to purchase, send, or receive virtual gifts.
Kris Perry (21:36):
These are, again, small changes, but when you hear them, you think, yeah, that would help cut down on spending Mm-Hmm. <affirmative> or, you know, other kinds of behavior online that the child may be just far too young to be engaged in. And so that's a good thing. A couple more examples under privacy, security, and data management. Um, let's talk about Instagram again. They announced new in-app features to help users better control the data they share with third parties. And TikTok, um, believe it or not, disabled direct messaging for users under 16. That was in April of 2020. That's TikTok, and that's a major design change. Mm-Hmm. <affirmative> And one that, you know, kind of addresses some of the concerns we hear about, um, predatory behavior online from, you know, individuals children don't know being able to contact them.
Kris Perry (22:33):
Mm-Hmm. <affirmative>, um, this really is an attempt to protect children from that. Um, the last category was time management, and you know, YouTube is famous for its, um, instant autoplay feature that makes it very difficult to turn off. In August of 2021, YouTube turned off the autoplay feature by default for users under 18. And, um, there were break reminders and bedtime reminders built in by default. Um, which is another great, you know, way of reminding the user that it's getting late and how long they've been on. And, um, those are just a few examples of things that, uh, companies announced over the five-year period.
Hillary Wilkinson (23:18):
Okay. Okay. So yeah, I think that each of the platforms has a different, um, presentation. I'm gonna use that word, I don't know if it's the correct one. So I can see how it might not be a standardization of each of these areas across the board, because some won't apply to others. So it's been tailored, you know, to the platform. But just so we're very clear, they've pulled all of this stuff together and enacted it in the UK, but am I correct in understanding it's not been done in the US?
Kris Perry (23:59):
That's correct. In fact, the one state that passed an age appropriate design code bill that would've gone into effect this past January is being challenged in court by the platforms. So it's on hold. It is not yet in motion.
Hillary Wilkinson (24:16):
Mm-Hmm. <affirmative>. Mm-Hmm. <affirmative>. So yeah. Now that we know that the changes have been put in place, um, is there any data that reflects any changes in, say, children's mental health or time spent online? Has that type of research been done yet?
Kris Perry (24:42):
That is such a great question. And, um, we don't have that data yet. Let me put it differently: we didn't look into that as part of this report. Mm-Hmm. <affirmative>. But it is interesting. I think now that we've set this baseline for when changes were made, it would be easier for another researcher to come in and say, oh, okay, now I can see on what day these changes were made by which platform, and I could do a deeper dive into TikTok or YouTube to see, um, you know, what other features are going on. But more importantly, maybe you could start, you know, with archival data from the ABCD Study and other big studies in the United States or the UK, and you might be able to backtrack and see how mental health rates have changed, emergency room visits, that sort of thing. Is there any correlation there? That's something a future researcher could do now that we've established when the code changes went into effect. Right. And yeah, there may be a correlation.
Hillary Wilkinson (25:49):
Yeah. Yeah. And I think that's the kind of research that, um, hopefully is happening. Maybe people are putting studies in place as we speak <laugh> Yeah.
Kris Perry (26:02):
Because those, those downstream impacts.
Hillary Wilkinson (26:04):
Mm-Hmm. <affirmative>
Kris Perry (26:06):
You know, you're really putting your finger right on what the overall intent of the age appropriate design code was, which is that these downstream impacts would improve, that youth outcomes would improve: that they would have better mental health, better sleep, better academic routines, lower levels of problematic use, that it would really improve their lives by allowing them to return to those other activities or aspects of daily life that are disrupted Yeah. by the platforms, the way they're currently designed. Uh, so I think those downstream impacts are critical to be researched, and I mentioned this at the beginning, how important it would be to have more data on changes that are in the works or have been attempted, so that researchers can gain a better understanding of whether or not those changes actually have an impact downstream.
Hillary Wilkinson (27:06):
Right, right, right. So we have to take a short break, but when we come back, I'm going to ask Kris Perry for a different type of healthy screen habit.
Ad Break - HSH Website
Hillary Wilkinson (28:06):
I'm speaking with Kris Perry, Executive Director of Children and Screens, who has been discussing the UK's Age Appropriate Design Code. And Kris, as you know as a former guest <laugh>, on every episode of the Healthy Screen Habits podcast I typically ask the guest at this point for a healthy screen habit. However, today I want to do something different, and you know, you are setting a new precedent. I have never done this before! <laugh> I live and die by my healthy screen habit! I feel like it's the cheese at the end of the maze for the people who made it through the whole episode. However, this is such a critical topic, and I just, I would like to know: what major takeaway would you like listeners to have about the impact of legislation on digital safety for our children, for young users?
Kris Perry (29:11):
That there would be a positive impact on youth, and that the platforms would be inspired to make additional changes to continue to protect youth. That there would be a virtuous cycle of, of good outcomes from the youth back to the platform, so that we're creating a healthier digital online experience for young people that enhances their lives and doesn't detract from a healthy life.
Hillary Wilkinson (29:44):
Oh, you're, you're the best.
I love that, a virtuous cycle. I think that beautifully illustrates what we are striving for here with digital wellness. So as always, you can find a complete transcript of this show and a link to that previous episode I referenced, as well as any resources discussed, by visiting the show notes for this episode. You do this by going to healthyscreenhabits.org, clicking the podcast button, and finding this episode. Kris, thank you so much for everything you do. It is always an honor to speak with you.
Kris Perry (30:20):
My pleasure, Hillary. Thank you.
About the podcast host, Hillary Wilkinson
Hillary found the need to take a big look at technology when her children began asking for their own devices. Quickly overwhelmed, she found that the hard and fast rules in other areas of life became difficult to uphold in the digital world. As a teacher and a mom of 2 teens, Hillary believes the key to healthy screen habits lies in empowering our kids through education and awareness.
Parenting is hard. Technology can make it tricky. Hillary uses this podcast to help bring these areas together to help all families create healthy screen habits.
Email:
info@healthyscreenhabits.org
Mailing address:
144 W. Los Angeles Ave. #106-362, Moorpark, CA 93021
All Rights Reserved | Healthy Screen Habits is a registered 501(c)(3) nonprofit