March 3, 2026

Teenagers, AI, Nudes and Online Blackmail: What You Need to Know

Ask Rachel anything

There's been a dramatic increase in reports of grooming, sextortion and AI-generated child sexual abuse material in recent years, and most parents believe politicians and technology companies aren't doing enough to protect kids.

The UK government recently announced that makers of AI chatbots that put children at risk will face massive fines, or even see their services blocked in the UK, under changes to the law.

And the French offices of Elon Musk's X were recently raided by the Paris prosecutor's cyber-crime unit, as part of an investigation into suspected offences including complicity in the possession of child sexual abuse material (CSAM). 

Four in five EU citizens support requiring online service providers to detect, report and remove child sexual abuse material. But while governments and technology companies wrangle over a fast-developing issue, we parents need accurate information and support on how best to keep our kids safe online.

The Internet Watch Foundation has been around for 30 years and works alongside the UK charity Childline to protect children who have been affected, offering emotional support and a means of tagging and removing the images that predators use to extort and make money online.

THE BEST PROTECTION:

Keep devices out of bedrooms and bathrooms.

Read my devices guide, with links to all of the relevant episodes, here

KEY RESOURCES:

The Internet Watch Foundation: iwf.org.uk

Report Remove (run by the IWF with Childline): search "Report Remove" on the Childline website

The IWF's TALK guide for parents and carers: talk.iwf.org.uk

Childnet's family agreement, and the platform guides from the UK Safer Internet Centre and Internet Matters

Support the show

Please hit the follow button if you like the podcast, and share it with anyone who might benefit.

You can review us on Apple Podcasts by going to the show page, scrolling down to the bottom, clicking on a star, and leaving your message.

Please don't hesitate to seek the advice of a specialist if you're not coping. There's no shame in reaching out for support. When you look after yourself your entire family benefits.


My email is teenagersuntangled@gmail.com
My website has a blog, searchable episodes, and ways to contact me:
www.teenagersuntangled.com

Find me on Substack: https://teenagersuntangled.substack.com/
Instagram: https://www.instagram.com/teenagersuntangled/
Facebook: https://m.facebook.com/teenagersuntangled/

You can reach Susie at www.amindful-life.co.uk

01:26 - 312,000 Reports of Child Abuse Images: What It Really Means

02:40 - How the IWF Uses Digital Fingerprints to Block Abuse Images

04:17 - Are Teen Nudes in Relationships Classed as Child Abuse Images?

05:05 - How Report Remove Works for Teens in the UK

06:40 - Grooming via Webcam: When “Fun” Games Turn Sexual

10:56 - Why 11–13s (and Even 7–10s) Are Most at Risk Online

11:39 - AI-Generated Child Abuse: Deepfake & Nudifying Tools Explained

13:45 - Nudifying Apps & Sextortion: How Boys and Girls Are Targeted

17:16 - Should Parents Take Kids’ Photos Offline Completely?

18:38 - The Double Bind of Social Media: Safety vs Inclusion

19:29 - Why There’s “No Joke” in Nudifying Apps & Upcoming UK Ban

20:27 - How to Talk to Teens About Nudes Without Scaring Them

21:20 - The ‘Rude Fruit’ Campaign & Why Conversation Is Your Superpower

23:50 - TALK Framework: Practical Steps for Family Online Safety (Talk–Agree–Learn–Know; boundaries, devices downstairs, family agreements)

27:16 - Preparing Kids’ “Script” for Sextortion and Peer Pressure (Pre-loading responses, comparison to alcohol conversations, Chimp Paradox)

29:34 - If the Worst Happens: How Parents Should React

30:53 - Where to Get Help: IWF, Childline, and Teenagers Untangled

WEBVTT

00:00:02.399 --> 00:01:25.400
Hello, and welcome to Teenagers Untangled, the audio hug for everyone supporting anyone going through the tween and teen years. I'm Rachel Richards, journalist, mother of two teenagers and two bonus daughters. Now, I don't know about you, but I sometimes find it hard to know what I should think about the online world, given the non-stop feed of information about harms done to our kids and the rise of AI. You know, I like to take a positive approach to our journey as parents, keep our heads held high and believe that we can do good, but sometimes we have to talk about the bad stuff. Now, a while ago, I created an episode looking into why we need to really educate our kids about the cameras on their devices, and why sending nudes, even to someone they trust, can open them up to abuse. I've also created an episode about grooming and sextortion, but I feel like the changes brought in by AI have fundamentally shifted the landscape, and we do need to understand what's happening for the sake of our kids. So who better to talk to about it than the organization that is a world leader in removing illegal and abusive images from the internet: the Internet Watch Foundation. Along with Childline, they run the Report Remove service for anyone under 18 years old in the UK to anonymously report sexual images and videos of themselves on the internet and stop them from spreading. Emma Hardy, a director at the IWF, joins us today.

00:01:22.219 --> 00:01:25.819
Emma, thanks so much for being with us.

00:01:26.239 --> 00:01:27.259
Really pleased to join you.

00:01:27.920 --> 00:01:33.140
Yeah, and Emma's a mother too. So this is really, really helpful to us.

00:01:29.659 --> 00:01:58.959
Now, last year, you dealt with a record-breaking total of 312,000 reports of child abuse images in the UK, which was a 7% increase on the year before. What do these figures tell us about the online world our children are growing up in, especially compared to, you know, when I was parenting young teens and tweens five or ten years ago, and they're a bit older now? So for people who are starting to come through, what should we be thinking about this? So everything I

00:01:58.959 --> 00:03:07.020
say now comes from my experience having worked at the IWF for almost 15 years, which is an incredibly long time to work in this area, and I can say that it feels like the problem is getting worse despite our best efforts. So that figure you gave, the 312,000, that is where we have received a report to our hotline, staffed with analysts who actually sit in Cambridge, and they have looked and said, yes, we can see at least one, or many, images and videos that show the sexual abuse of children on that web page. So the amount of content associated with each report could vastly exceed that 312,000, and this is content that we trace all over the world. So whilst we do this work in the UK, we work globally. So much of what we see is repeatedly shared over and over and over again, and our job is to try and stop that sharing, stop that imagery. And we are very good at what we do. When we find a child sexual abuse image or video, we can create from it something called a hash, which is simply a digital fingerprint.

00:03:07.080 --> 00:04:13.620
It's a string of code that relates only to that image. And that's an amazing thing to be able to do, because we have over 3 million such codes that equate to individual images and videos, and we supply those to tech companies and platforms, and when they use them, that can prevent a matching image from being uploaded. So if someone downloads that image and then says, right, I'm going to upload it to this social media platform, let's just say that. I think they'd be a fool to try, but let's just say someone has tried to do that. Then instantly, the back end of that platform would see if it matches anything on that list, and if it does, it would stop it from being uploaded. And then, of course, there's a whole load of questions about what would happen to that person who had tried to upload it, but we can stop the repeated uploading of images like that, where a company works with us and takes that service that we're able to provide them. And we are really, really successful at getting tech companies to use that. But the internet is huge.
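
For anyone curious what that matching step looks like mechanically, here is a minimal sketch in Python. It is an illustration only, not the IWF's actual system: the blocklist entry and function names are hypothetical, and real hash lists generally rely on perceptual hashes so that re-encoded or slightly altered copies still match, whereas the cryptographic hash below only catches exact byte-for-byte copies.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """Create a 'digital fingerprint' (hash) from an image's raw bytes."""
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical blocklist of known fingerprints, as supplied to a platform.
known_hashes = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def allow_upload(image_bytes: bytes) -> bool:
    """At upload time, reject any file whose fingerprint is on the blocklist."""
    return fingerprint(image_bytes) not in known_hashes
```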

00:04:09.539 --> 00:04:17.100
It's absolutely huge, and we only see a certain part of the internet. So when

00:04:17.100 --> 00:04:25.220
we're talking about child sexual abuse, if it's a nude that a child has shared with somebody who's contacted them, would that qualify? Or is that something completely

00:04:25.220 --> 00:05:01.319
different? That could potentially qualify as a child sexual abuse image. So I think there's lots of ways in which imagery could be gained from children, and of children, that's then shared. Let's start with a scenario where perhaps you've got a couple of teenagers. They're both under the age of 18. They're in a relationship. They trust each other, they have devices, and they're sharing nude imagery as part of their relationship. Technically speaking, that could well be child sexual abuse imagery; it might pass the criminal threshold. But for them, it's within a trusted relationship.

00:05:01.980 --> 00:05:14.879
Sometimes we find that imagery out in the wild, on the open internet, and of course, that signals that somebody in that relationship wasn't very trustworthy, and they've shared it outside that sphere of trust.

00:05:11.160 --> 00:06:09.779
And that does happen, and the way that could be quite successfully dealt with is by that young person going to the Report Remove service that you've mentioned already, which we run in conjunction with the NSPCC's Childline. You would go to the Childline website, or you can just type "Report Remove" into Google and it will come straight up. That young person can submit their image or video, or multiple images or videos, and it will come through to the analysts in our hotline in Cambridge. Now, our analysts in Cambridge don't know anything about that young person. All they know is it's come through Report Remove, and this is the image. Childline does have details about the person, but they can't see the imagery; what they can do is wrap that blanket of care around the child, offer them counselling services should they need it, and give them that reassurance that they are doing the right thing. When it comes to us, we'll assess that image: does it contravene UK law?

00:06:06.180 --> 00:07:21.139
And we can then run that system of creating a hash from it. So we can create that digital fingerprint, it goes on our list of digital fingerprints that goes out to tech companies, and it will stop it from being uploaded to platforms where that list is deployed. So that is something that can give that teen that sense of control back. Now I want to come back to the issue of sextortion, which you mentioned earlier having done a podcast on, because Report Remove is a really, really important tool for anyone in that scenario. The other thing that we've seen for many years, and we started tracking this really in about 2012, is the type of imagery that's created and captured where a child has no adult physically present in the room, but what that child has is a device. It could be a laptop, it could be a phone, it could be a tablet, a webcam like I'm talking to you on now. And that child needs to have a certain environment around them, so somewhere private. They might have another child, a sibling, in the room with them, but it needs to be somewhere private; it's typically a bedroom or a bathroom. And somebody will contact them over the internet and will groom and coerce and encourage them into some kind of sexual activity.

00:07:21.680 --> 00:07:45.519
Now somebody is capturing that along the way. Whether it's the child or the offender on the other side of that camera doesn't matter; that material has got into the wrong hands. And we've found it, once it's gone onto the open internet, being shared quite often with people who have a sexual interest in children.

00:07:41.079 --> 00:08:11.279
It enters this marketplace. Now we know from research that we did a few years back with BritainThinks, where we were asking young people and parents about this, and from what we can see of offenders and the conversations they're having online, there's a number of things going on here. So for that child or young person, they might think they're playing a game. Maybe they're a great dancer, and they just want an audience to show off their dancing. Who doesn't remember doing that when they were young?

00:08:11.279 --> 00:08:42.580
Except they're being goaded into something where they might not be fully dressed. Or maybe they're good at gymnastics and they're asked to do a handstand, but, you know, do a handstand wearing a skirt. So very often, things will be dressed up as a game. Predators, offenders, sexual abusers also prey on this popularity thing. I mean, we all want to feel popular. We all want to be liked; it's no different regardless of how old you are. And the offenders will tell the young people, oh, you're so beautiful, aren't you

00:08:42.580 --> 00:08:50.799
pretty. Certain platforms will allow them to offer them gems or hearts. It makes them feel good.

00:08:47.259 --> 00:09:37.460
It's just reinforcing that feeling of affirmation. You know, when you're growing up, let's say you're 12 years old, you're starting to understand what it is to be popular and liked, and somebody is giving you that feeling, and offenders know this, and they prey on it. So you can see how very, very quickly you can go from a scenario that's completely innocuous to one where, all of a sudden, clothes are being removed and topless pictures, pictures of genitals, for example, are being shown, because they're following instructions. There are some videos, heartbreaking videos, where the child is going, what? What's that? I don't understand. What's that word?

00:09:37.759 --> 00:10:08.519
Because they're being asked to do things when they don't know what these things are, and then they're reading again, oh, okay, you mean like this, and they're responding back to somebody. But the environment matters: you need the tech, you need some kind of camera that's capturing it, and the child needs to feel like they're in a private space when they're doing that. Eleven-to-13-year-old children are most at risk. That's who we see most often in this type of imagery.

00:10:09.179 --> 00:10:56.919
But more and more, we are seeing the seven-to-10-year-olds coming up fast behind the 11-to-13s. So as parents we have to think, and educators have to think, how do we help them when we're often incapable? We can't change how a platform is designed, and we can't influence the activities of a sexual offender or predator that we know nothing about. But what can we do as parents and carers in the family home? What can educators do to help our children in those scenarios to have that critical thinking? Because it's awful to say, but how do we help them pull up their defences when they might not even recognize what's going on? Because it feels like a game. It feels fun. They're getting this adoration. That's

00:10:56.919 --> 00:11:00.039
pretty hard.

00:10:56.919 --> 00:11:39.080
It's very hard. And actually, I did have an excellent conversation with a woman who's worked with offenders, and she has some excellent steps for understanding how they operate and what the flags are, which is incredibly helpful, because this is the difficulty: very often, it feels nice until it doesn't. Yeah, and your figures also show a big step up in AI abuse images. Yeah, because what we've been talking about so far, which is really important and utterly critical, is totally different to what we're now starting to see, which is nudifying apps. And, you know, some parents wouldn't even understand

00:11:39.080 --> 00:12:47.980
what that is. It feels like AI is everywhere, but generative AI tools, new technology, have the ability to create child sexual abuse imagery without there needing to be a real, living, breathing, physical child there. But that doesn't mean real children aren't involved. So we see how, with existing victims of sexual abuse, or this type of imagery that's already circulating online, AI can take the likeness of that child and place them into new scenarios. So we know that some sex offenders collect images of the same child, like you might collect football cards or whatever it is that you collect. But let's just say that child has never been seen in a sexual abuse image on the beach, and they really want to see that. There are tools out there that will enable those people to create whatever they have in their head as a fantasy. You can type it.

00:12:43.539 --> 00:14:35.539
You can give it as an instruction to AI and it can create it. So you can create new material that shows children who are existing victims of sexual abuse, so their being a victim just continues, continues, continues, in imagery that lives on, which they might not have actually appeared in. Nudifying apps are a whole new thing altogether. And I think anyone who has any imagery of themselves online needs to be aware of this. And we saw this with Grok, X's AI chatbot, at the beginning of the year. If there's a picture or image of you out there, it can be fed into a nudifying app or a nudifying tool, and you can be declothed. So what this means for children and young people, for example, is if there's a picture of them online anywhere, someone can take that picture, feed it through an app, and tell it to declothe them, and it will. The AI will imagine what that child's body looks like, or what your body looks like, or whoever it is. It won't be your body, but very often, why does that even matter? Because the shame, the humiliation, is there. There's that power that that person has to declothe you, to then use that image. If we come back to this idea of sexually coerced extortion, or sextortion, I'll explain what that is, and then I'll explain how this comes together with nudifying apps and what we're seeing. So we are seeing more and more how boys are being made victims through somebody contacting them, perhaps a love interest, and they get talking and they share intimate imagery.

00:14:31.460 --> 00:15:43.419
So this is a teen boy, for example. That intimate image will be played back to him very soon afterwards with a threat written across it, very often something incredibly nasty. And it will say, I know your school, I know your social media contacts, because they've been sharing all this information in what he thought was a trusted relationship. I will share this with your school and with these people unless you pay me money: $30, £100, $500, whatever it is that's being asked for. Now we know that some boys have taken their own lives because of this. It's absolutely horrendous, heartbreaking. This is where Report Remove comes in. It can almost act as an emergency service. When a report comes through, we prioritize those reports, and we can give that young man the knowledge that we have done everything we can to stop that image from going anywhere and that he's done the right thing, and Childline can wrap their big arms around him and give him the support that he needs. When it comes to nudifying apps, it may well be that no image needs to have been exchanged.

00:15:43.600 --> 00:17:02.279
You can just have a profile picture, but this app can declothe it. We see this most often with girls' images rather than boys', because most of these nudifying apps are built to work on female bodies, not male bodies. So let's say you're a female teacher in a school: someone could put your image through it, and all of a sudden you've got a naked female teacher. Anyone, anyone whose images are out there. When it comes to girls, the girl doesn't have to exchange any intimate images. All there needs to be is one picture of them on the internet, a good enough quality picture, that's fed through a nudifying app, and then all of a sudden she can be blackmailed. Typically, we see offenders blackmail girls for more imagery, because there's this marketplace that actually generates money. Boys will be blackmailed straight for the money, and we've seen handbooks online that explain to people how they can do this. So coming back to how we have to think about it as parents: firstly, the child is never to blame. If they find themselves in this scenario, they've been a victim in some way. They are up against really organized people, adults who know exactly what they're doing, who know what buttons to press.

00:16:58.240 --> 00:17:13.440
So how do we ensure that we're having those correct conversations and doing what we can in the family home to help prepare our young people for whatever they may, or may not,

00:17:10.319 --> 00:17:13.980
let's hope, face on the internet?

00:17:16.440 --> 00:17:27.200
It sounds really dark. I mean, as a parent, I listen to this and I think, God, do I need to get my kids to just take all their images down? Because it's quite upsetting, isn't it?

00:17:27.440 --> 00:17:39.380
Yeah. And actually, I ask myself a lot, as a parent, when you put your kids into a holiday club. And just so there's a bit of context to my life: I have two children. They're both boys.

00:17:36.319 --> 00:18:37.220
They are seven and 10 years old, and when they go into holiday clubs, and you get this from school too, you have this little tick box that says, can we take pictures of your child for marketing purposes? I find myself going, oh my God, what do I tick? I want to tick no, and I'm increasingly ticking no, you can't take pictures of my child. But I know full well, a couple of summers ago, my son came home and he said he felt, and I'm putting this word in his mouth, segregated, pulled aside from the rest of the group, because he was the child who's not allowed his picture taken, and therefore he felt like he was being punished for something. So there's a disconnect there with how this materializes in the settings children are in. But absolutely, I am really concerned about pictures of my children going online now. And I guess, in some ways, being a public figurehead who talks about this, maybe I should be more concerned. Maybe I'm making myself a target in some way.

00:18:38.358 --> 00:19:29.058
Well, I hope not. I mean, I do think what you've just said there plays into the problems we have as parents when it comes to social media or any kind of connection with the internet and being online, because we're almost in a double bind: when you try and exclude your children to keep them safe, you then exclude them from all the other benefits, and it's almost like we don't have much choice about some of these things. I mean, we do ostensibly, we can say no, but it does make it very difficult. And it's interesting, because some people would justify the nudify apps, or the creation of these images, as, well, it's fake, so it's not really hurting anybody. What would be the impact on somebody of seeing images of themselves in these compromising situations?

00:19:29.778 --> 00:19:37.098
I don't think there is any legitimate reason for these nudifying apps whatsoever.

00:19:32.538 --> 00:20:04.439
I can see how, on one level, maybe in a bit of a lad culture from the 90s, if we could look at it like that: oh, it's just a joke, it's a bit of fun. But these are actually targeting women; they work on women's bodies. They're being used for the purposes of bullying, humiliation, shaming, blackmail. Where is the room for a joke in that? And then when you're talking about them being used to create illegal imagery, child sexual abuse imagery.

00:20:05.159 --> 00:20:20.659
There is no justification for these. And actually, these tools are going to be banned. The government is changing the rules. I mean, there's lots of detail, I'm sure, to work through about how that happens operationally, but I think that's a good signal.

00:20:20.719 --> 00:20:27.199
That is a really good signal, as a society, of what we won't tolerate. Yeah, no, no.

00:20:27.558 --> 00:20:37.759
And so for parents, how can they talk about these risks without making their children feel frightened, or feeling they have to constantly monitor them? What do we say?

00:20:38.480 --> 00:21:55.720
Yeah, this is a really tough one. And actually, I think that what we say will be different for every family environment, because no family is a carbon copy of another family. So maybe we'll just put that out there. We actually have some resources, and we also run some campaigns, which I'm feeling slightly lighter being able to talk about, because they're more light-hearted; I've said some really gloomy, horrendous stuff. But actually, we can look at this in quite a fun way, and we try to do this at the Internet Watch Foundation. So we have a campaign that goes out to parents, and it goes out to teens and educators, where we simply use fruit: rude fruit. So it's like a peach that looks a little bit like a bottom, and a banana with a bit of cream on it. And I have to say, we got complaints about the banana from parents. But, and this is me getting in first before anyone complains, if they go and look at this, it was created by children and young people themselves; they co-created the banana with cream because it was funny. So anyway, we have some funny imagery, and we raise this issue of sharing of nudes. We use language like nudes; we don't need to talk about child sexual abuse. Nudes is absolutely fine, and that's a word that is typically used among young people. Yes.

00:21:55.720 --> 00:22:10.079
They will use it, yes. Keep well away from a scary term like child sexual abuse. And the campaign is designed, when it comes to parents and carers, to get parents and carers to realize that these conversations need to be had.

00:22:10.200 --> 00:23:10.140
And actually, we did some work with the International Policing and Public Protection Research Institute at Anglia Ruskin University, and what we discovered is that good quality conversations within the family home, regardless of the makeup of your family home, are the best superpower that you have as a parent to keep your children safe online. Because you need to have that trust there, and if something goes wrong, the best thing you can do is establish yourself as a trusted adult to take that issue to, not to get angry. And in fact, in a lot of our materials, the first message we give is: don't get angry. Because, you know, you could just think, oh my God, you stupid, whatever, what have you done, sending this image? Oh my goodness. When you actually realize that they've been groomed, coerced, encouraged, they've been flattered, there's all these tricks employed by the offender on the other side of the webcam.

00:23:10.559 --> 00:23:45.160
My gosh, give them a break for a second. So, good quality conversations. And you can start this little and often: talk about what you like doing online; don't go straight into child sexual abuse and nudes. Create an understanding that we can talk about stuff that happens online in the family home. Maybe, while we've sat down at dinner, I could just say something like, gosh, you know, I saw something really funny on the internet today, it was this. Or, you know what, I saw this thing today when I was scrolling through Instagram, and I don't know why, but it just unsettled me a little bit. What do you think?

00:23:45.640 --> 00:24:09.660
Do you ever see stuff on, whatever, TikTok, or whatever the platform of choice is? Do you ever see stuff that makes you feel a bit unsettled as well? What do you think I should do about it? Because very often, young people have a fairly good idea about what you should do. And you can see where there are gaps. Like, I don't know, my kids, as young as they are, they have gamer names.

00:24:06.059 --> 00:24:13.920
That's what we call it. Don't use your real name; you use a made-up name.

00:24:09.660 --> 00:24:30.140
You have your gamer name. And I'll just emphasize: you don't give real details about where you live, ever. We don't share passwords. So I guess when you have this dialogue, look for where the gaps are. And can you just go, oh, you know what?

00:24:28.039 --> 00:25:05.819
It's a really good idea that you just don't share that level of detail, because you never really know who anyone is. Or, and this is something that my kids said to me: oh, I know when I'm in the kid area for this game, whatever the game is, that it's only kids in there. And I was like, do you know that? I said, I wonder. How do you know that someone isn't a 40-year-old man just pretending that they're 10 years old? And they're like, oh, I don't know. And I was like, maybe, maybe you can't know that. Maybe we don't actually know that everybody is as young as you think they are.

00:25:06.420 --> 00:25:57.099
That does come back to our research from a few years ago, where one of the key things we found was there's a myth, a misguided belief, among some young people: they put too much trust in the platform that they're using, that there is a barrier to stop older people getting in. And of course, there isn't. That doesn't exist; that's not a thing. Coming back to the conversation: we have a whole resource on this. If you go to talk.iwf.org.uk, or if there's a way that you can easily link to that from your notes, that'll be perfect. You can download a resource that's had loads of research go into producing it, and it gives you little bite-sized tips. And actually, TALK is a mnemonic. So T: talk about the things that your child likes on the internet. So you talk about the good, talk about the bad: good, bad, ugly.

00:25:57.160 --> 00:26:02.880
You know, we do this around the dinner table: what's your good, bad, ugly from today? To get the conversation going.

00:26:02.880 --> 00:26:12.539
A: agree digital boundaries. How will you as a family agree to use technology?

00:26:08.940 --> 00:26:23.240
There's an organization called Childnet that has this family agreement, which I think is a great way to approach this, where you agree as a family how you're going to use technology.

00:26:18.480 --> 00:26:32.480
So very much in my house, it's no digital devices upstairs, because of that scenario I described: a digital device with a camera used in a private space.

00:26:32.839 --> 00:27:59.559
So we've already spoken about the risk of a child having a private space with technology with a webcam: if they are contacted by anybody online, you don't know what's going on. So that's something that we certainly do, and I love the idea of an agreement. So, agree how we use technology together as a family. Next is L: learn about the apps and the platforms that your child loves, which, my gosh, I know can be really, really overwhelming. But there are some great guides out there, so you can go to the UK Safer Internet Centre website, or another organization called Internet Matters, which has some fabulous guides, to help you understand the platforms that your child loves, so that you can have those conversations. Did you know you can turn off the location settings in, name your app, for example? If you know about those things, then you can have the conversations. And the K is to know how to use the tools and the settings. The resource that's available goes through little conversation starters that you can have, and there's also a section in there for if you have a child with a special educational need, and how you could approach that as well, because that's incredibly important. I think, though, you can't go far wrong if you just have little and often conversations and you take an interest. If the worst happens, you need them to know that you will be there for them and that they can come to you. They just need a trusted adult in their lives.

00:28:00.278 --> 00:28:49.358
Yeah, I think that's absolutely brilliant advice. And the people I've spoken to who have children who have been impacted by this, each one of them, thankfully, went to their parent. One of them got caught out: he paid the money, and then they demanded more, because this is how they operate. And that's when he went to his parents, and he was just looked after. And the more we can have these conversations with our kids and prepare them, the more there's a speed bump in the road. Because the other thing that's interesting, and I found this with drinking alcohol, is that having those conversations beforehand helps your child have a script in their head that they can use as information. Because if you haven't laid down any foundations, they don't have a script to go to; they just have to make it up as they go along, and that's much harder to resist. So I love what you're saying.

00:28:50.019 --> 00:28:58.240
That's a really good point. Yeah, I guess you just need to have that familiarity: this is a path I've trodden before, in my mind or through conversation.

00:28:58.240 --> 00:29:06.420
Yes. Professor Steve Peters talks about it in The Chimp Paradox, or rather one of his other books, where he says we have a chimp, we have a human, and we have a computer.

00:29:06.539 --> 00:29:34.940
And what happens is, the chimp always responds first, but the chimp will look in the computer first. The chimp is your kind of emotional side, and the best thing to do is to put something in that computer. So the chimp will look in the computer and go, oh yeah, no, my mum said don't ever send nudes. And then they'd have to make a decision that would go against what they already knew would be the right thing. And that can happen, but at least they've had something there that gives them some guidance. So yeah, it's brilliant. Is there anything else you'd like to say

00:29:34.940 --> 00:30:26.660
to parents? The biggest piece of advice to parents is try not to get angry. Or, you can be angry, but just manage how that manifests on your face and in your words in that moment. That doesn't mean you can't go off screaming into a pillow or something, but for that young person, being calm and helping them to feel in control is really important. And if you've done that little bit of thinking, to go: I know there's Report Remove, I know there are people out there who can help us, I know that you aren't at fault, even though I'm still bloody annoyed (keep that bit to yourself), I know you're not at fault. Then, I guess, it's rehearsing for a parent how you might react in that scenario. Maybe that's just as important as a parent: think about how you react and how you want to react, and try and live up to how you want to react.

00:30:26.720 --> 00:30:48.640
That's great advice. Emma, I love that. Thank you so much. Thank you for giving us the time and really deep insights as a parent, because I think that does make a massive difference. When we listen to someone who's also walking the same walk with us. I will put all the links in the podcast notes, because it's always good to know, oh, I can go look there and I can get that. And I love all the resources that you've shared as well. That's very, very helpful.

00:30:48.640 --> 00:30:51.759
So if you'd like to contact the Internet Watch Foundation,

00:30:51.759 --> 00:30:53.079
What's the best way? I would say, go

00:30:53.079 --> 00:31:02.519
to the website, which is iwf.org.uk, and you can find loads of information there, including contact details for different parts of the organization.

00:31:02.519 --> 00:31:04.259
Brilliant.

00:31:02.519 --> 00:31:16.380
Thank you. And of course, Childline is another wonderful resource. And if you want to contact me, it's teenagersuntangled@gmail.com. I have a Substack where I do a lot more writing, more detailed analysis of the things that we've been talking about.

00:31:16.500 --> 00:32:42.980
And I also have PDF printout notes of the podcast, so that you've got a little checklist, which can be helpful. That's teenagersuntangled.substack.com. That's it from me. Have a great week. Big hug. Bye bye.
