App safety & reviews Archives | Qustodio https://www.qustodio.com/en/blog/category/app-safety-and-reviews/ Tue, 13 May 2025 13:42:00 +0000

Is Line safe for kids? A parent’s guide to the “super app” https://www.qustodio.com/en/blog/is-line-safe-for-kids/ Fri, 09 May 2025 10:11:14 +0000

Although it began as a messaging service, Line is now regarded as a “super app” by its hundreds of millions of users in Asia thanks to its multifunctionality. In Japan, Line has 96 million users – over 78% of the total population. The app is also deeply integrated into the everyday lives of people in Taiwan, Thailand, and Indonesia. It might not be as popular among kids globally as other messaging apps like WhatsApp and Snapchat, but in Japan and other parts of Asia, Line is king of the communication apps across all age groups. While it does have some safety features, Line is not primarily marketed as a kids’ app and so may pose safety risks parents should be aware of.

What can kids do on Line?

Line is considered a “super app” because it combines core messaging functions – calls, text, voice and video messaging, and group chats, much like WhatsApp – with social media features such as timelines, video sharing, and friend lists. Users can also send and receive payments, shop, read news, play games, livestream, and more.

It might not be marketed as a kids’ app, yet many of Line’s features have proved particularly popular among youngsters – especially stickers. Line has an expansive sticker library, including characters from popular anime and cartoons, and lets users create their own designs. These stickers gave rise to Line Friends: a set of characters with names like Brown the bear, Cony the rabbit, and Sally the chick, who pop up in merchandise, games, and special events.

Young users also enjoy the Line-connected app Line Camera, a more playful and customizable take on Instagram that lets people take photos, add filters, stickers, and effects, and then post them to their timeline.

As the popularity of TikTok has proven, short-form video is a huge draw for younger users, and Line has them covered. Line Voom is a social networking service within the Line app that lets users view, upload, and share short videos and other content. As of April 2025, Line Voom is only available in Japan, Taiwan, and Thailand.

Line: key risks parents need to know

Inappropriate content
While Line’s Safety Center prohibits sharing obscene content and content promoting illegal acts via messaging, the platform has no automated moderation or filtering in chats, relying solely on user reports. This means kids and teens can easily share explicit messages, links, videos, and other media in private chats. Line Voom, the social media-style feed within the Line app, uses both an automated monitoring system and a team of human moderators to ensure content follows its guidelines. However, there’s always a risk that harmful content may slip through.
Predation
Anyone on Line can contact a user who has shared their Line ID (a username), phone number, or QR code with them. To protect young users from harmful encounters, Line IDs owned by under-18s do not appear in search – but to date, this feature is only available in Japan. Like WhatsApp, group chats on Line pose a safety risk for young users: if proper privacy settings aren’t enabled, anyone can invite them to join. Unfortunately, group chats are commonly used by predators to build trust and manipulate minors.
Cyberbullying
Line’s lack of active monitoring within chats means there’s always the possibility of a child becoming the target of hateful and offensive messages. This can be a particular problem in group chats where users can “gang up” on an individual and make them the subject of ridicule. Cyberbullying can also take the form of exclusion, where someone is intentionally left out of group chats or conversations, potentially deepening feelings of isolation.
In-app purchases
Although Line is free to use, it offers in-app purchases – many seemingly targeted at a younger demographic, such as stickers, emojis, and themes. Beyond the messenger itself, Line also offers games where users can buy items or upgrades. Some of these titles incorporate gacha mechanics and loot boxes, which share many characteristics with gambling.
Scams
Since Line is so widely used in certain countries, scams of various forms can be found on the platform. These include phishing scams that hunt for logins, personal details, and financial information; impersonation scams, where someone pretends to be a friend or family member to extract money or data; and investment scams, where too-good-to-be-true opportunities to get rich are offered in exchange for an initial payment. Unfortunately, children are often targeted by scammers as they tend to be more trusting of others, especially those posing as authority figures, friends, or family. Children are also more likely to overlook or ignore signs of a scam when promised something they want.

Making Line safer for teens

Line doesn’t appear to have a universal minimum age requirement across all regions. According to its Help Center, users in EEA member countries must be at least 16 years old, to comply with European data protection regulations (GDPR). In other regions, such as the United States, the App Store lists Line as suitable for ages 12 and up. However, there are no strict age verification measures in place to enforce these guidelines.

If you approve of your teenager keeping in touch with friends and family on Line, here are a few steps you can take to make the service safer for them:

Talk openly about the risks and set expectations

Whether your child uses Line, WhatsApp, or any other communication or social media platform, incidents of cyberbullying and predation are always a possibility. Ensure your child is aware of these and other online dangers, and feels comfortable coming to you or another trusted adult if they ever feel upset or distressed while using the app.

It can be difficult to talk to your child about topics like grooming, but child psychologist Dr. Beurkens has shared a useful guide for approaching this vital conversation: How to talk to your child about online predators. Creating a family digital agreement is also a great way to start an ongoing, respectful conversation about tech use and set healthy expectations around screen time, sharing personal info, and other safety risks.

Optimize privacy settings for safety

Since it’s a service not designed for children, Line doesn’t have parental controls. However, you can tweak the privacy settings on your teen’s device to help keep them safe on the platform. Here are a few of the essential ones:

  • Turn off “Allow others to add by ID” to stop users from being able to search for and connect with your child via their Line ID. This option is automatically turned off for users under 18 in Japan.
  • Turn off “Allow others to add by phone number” to prevent strangers from adding your child if they have their number.
  • Turn off “Receive messages from non-friends” to block messages from strangers. 
  • On Line Voom, set “Who can view” to “Only me” or “Friends” to restrict who can interact with your teen’s posts.

Although these features may help keep your teen safe on Line, they are not a substitute for trust and open conversation – as your child can simply revert these settings without your knowledge. 

Ensure your teen knows how to block and report users

Knowing how to report and block problematic individuals will help your teen have a safer experience on Line, and empower them to handle troubling situations that might arise there and elsewhere online. Line users can report problematic messages and report or block the people who send them from within a chat. To block a user, click “Menu” and choose “Block”; to report a message or user, choose “More”, then “Report”, and follow the instructions.

Use parental control tools

As Line doesn’t have built-in parental controls, we highly recommend using a comprehensive parental control solution like Qustodio to keep your teen safe when they use Line. For example, Qustodio’s AI-powered alerts monitor your child’s activity on Line, WhatsApp, Instagram, and other messaging platforms, and notify you immediately when they exchange messages related to bullying, self-harm, depression, drugs, school absences, and other concerning topics.

As well as receiving message alerts, you can use Qustodio to:

  • Block the Line app from being opened
  • Monitor the time your teen spends on Line
  • Set usage limits
  • Receive an alert when the app is first opened
  • Pause internet access at the push of a button

Line may be primarily a communication app, but its wide range of features – including timelines, video sharing, games, live streaming, and more – has helped it earn the reputation of a “super app” in Japan and across Asia, where it is hugely popular. This popularity stretches to include kids and teenagers who seem to especially enjoy Line’s social media-like features, stickers, and games.

Without proper parental controls, Line can expose young users to risks common to communication and social media apps, such as predation, cyberbullying, and inappropriate content. However, by having open conversations, adjusting privacy settings, and using parental monitoring tools, you can help your teenager stay safe on the popular communication app. 

Is Instagram Edits safe for teens? App safety guide for parents https://www.qustodio.com/en/blog/is-instagram-edits-safe/ Wed, 30 Apr 2025 08:16:38 +0000


Edits is a free photo and video-editing app from Instagram, aimed at content creators posting on social media – particularly Instagram itself. Designed to take on CapCut, the editing app from Instagram’s rival TikTok, Edits gives creators a range of creative tools not found in the original Instagram app to design and build content from the comfort of their own phone. Edits also offers analytics tools for tracking video performance, gearing the app toward more serious content creators rather than casual users.

What can kids do on Edits?

Instagram’s feed has changed a lot since the days of sharing oversaturated pictures of avocado toast – the app’s main focus has shifted to Reels, Instagram’s version of the short-form videos that now dominate the user experience on most successful social platforms. Its rival TikTok provides users with an extensive suite of in-app tools, along with a standalone studio, CapCut, letting creators and dabblers produce videos on their phone in a short space of time. Instagram’s Edits hopes to level the playing field, offering what the Head of Instagram, Adam Mosseri, calls “a full suite of creative tools”. On Edits, you can:

  • Keep track of ideas and drafts in one space
  • Create videos with better tools than those found in the Instagram app, like a higher quality camera, video cutouts, and animation with integrated AI
  • Share drafts and ideas with friends or other creators
  • Use a wider library of fonts, animations, music, and filters
  • Explore other users’ videos in an “Inspiration” feed
  • Check analytics tools to understand video performance, if sharing within Instagram
  • Download your final creation, watermark-free, to your camera roll

To use Edits, you first need an Instagram account, which means that, in accordance with Instagram’s Teen Accounts and policies, it’s restricted to users aged 13 and up.

Instagram Edits: the risks parents need to know

In-app content

While the features on Edits are mostly designed for video creation, the “Inspiration” feed could be a source of inappropriate content, depending on the videos your child is served. The videos displayed are based on Instagram’s algorithm, so the app serves inspirational content according to your child’s interests, likes, and behavior on Instagram itself. In general, if your child uses Instagram, it’s important to talk to them about how algorithms work, and encourage them to engage with positive content that lets them explore their interests, rather than content that makes them feel demotivated or is inappropriate for their age.

Content sharing

As with any social media platform, make sure your child understands consent and thinks carefully about what they post. While Teen Accounts are private by default, your child should still understand that once a video or picture is posted online, they lose control of it – even if they later decide to delete a video, someone following them could easily have taken screen recordings or screenshots. Consent matters here too: taking videos and pictures of friends and family is one thing, but filming strangers and then uploading the footage to social media could have implications, either for your child or for the person they filmed.

Social media pressure

Edits doesn’t have a traditional Instagram “feed”, but it does have an “Inspiration” feed, where you’re served aspirational content and Reels from creators, designed to get you creating too. You can’t comment on these videos, so there’s no user interaction, but watching a steady stream of picture-perfect, Instagram-ready content has its pros and cons – it can serve as creative inspiration, while also having the potential to affect how your child views the world. Talking to your child about the realities of online influencers, and helping them recognize that not everything they see online is real or reflective of other people’s lives, can set them up with a more realistic outlook on social media.


Making Edits safer for teens

Comply with age restrictions

If your child isn’t old enough to have their own Instagram account, they shouldn’t be using Edits, as the app requires an Instagram account to create and experiment. Depending on the country, Instagram requires users to be at least 13 before opening an account, and under-18s are automatically placed into a more teen-appropriate experience through its Teen Accounts feature.

Get to know the app with them

Video apps like Edits can be a great way to encourage teens to get creative, as long as they use the app responsibly. Get to know Edits’ features and explore the ways your child can create content – especially since, with Edits, they don’t actually have to post the end product to social media. You could create videos as a family, or show them how to use the tools to make videos about their hobbies and interests.

Encourage safe sharing

If your child does share videos or any other content to social media, it’s important for them to share responsibly. Talk to your child about never revealing personal information or details that could identify them or their location, such as a school uniform or street name. Even if your child’s account is private, this still rings true, as anything online can be shared through screenshots or discussed in group chats. Make sure your child also understands what online consent looks like – within their friendship group, for example, sharing photos and videos might be fine, but when sharing pictures of minors, it’s important to always get consent.

Have conversations about what healthy social media use looks like

In today’s world, despite restrictions and proposals affecting how minors use these channels, social media is relatively inescapable. Even if your teen doesn’t use social media now, the chances are that in the future they will. Help them make sense of social media and talk about its role in their life. Teaching teens to verify information, think critically about the content they see online, and interact respectfully with others in chats, comments, and other internet spaces will help set them up to be better digital citizens.

With some ground rules and understanding of the app’s features, Instagram’s Edits tool can be an opportunity for your teen to explore their creativity, especially if they’re not sharing videos on Instagram itself. Setting daily use limits on apps like Edits can help to bring balance to teens’ digital activity, especially if you notice increased amounts of time spent on the app. However, if your child is active on social media, proceed with more caution, communicating with them about building a positive relationship with social media, understanding how the algorithm works, and the dangers that the platforms can pose.

Is WhatsApp safe for kids? App safety guide for parents https://www.qustodio.com/en/blog/is-whatsapp-safe/ Wed, 16 Apr 2025 13:12:40 +0000


WhatsApp is a messaging app that lets users send text and voice messages, make voice and video calls, and share images, documents, and other media types, like stickers or GIFs. Users can create group chats and initiate conversations with contacts using their phone number. WhatsApp is one of the most popular messaging apps in the world, and it’s no different for kids: Qustodio’s 2024 report on kids’ app habits showed that globally, WhatsApp was the most popular communication app for under-18s.

What can kids do on WhatsApp?

There’s a reason that WhatsApp has become one of the most-used messaging apps out there, for adults and teens alike. It has many different features, all designed to help us stay connected with friends and family, such as texting, voice messaging, and voice and video calls – for free. Users can send pictures, documents, and use interactive features such as polls, reactions, and stickers. Other features like location sharing can also be enabled to help find friends and family members on a map, or a designated meeting point.

To access the application, you need to have a phone number, which usually means that WhatsApp isn’t often on parents’ radars until their child gets their first phone. However, there are multiple ways that people use WhatsApp and riskier features that parents should be aware of. 

WhatsApp: the risks parents need to know

Explicit content

WhatsApp has no built-in content moderation or filtering. Kids and teens can easily share links, videos, and other media, which could be explicit or adult in nature, as WhatsApp won’t flag this and the app features no parental controls. By default, all images received by the user are saved to their camera roll, so this means that children could end up with embarrassing or unwanted images on their phone. The custom sticker feature in WhatsApp is fun and creative, but it also means that kids can easily create stickers that are explicit in nature, and quickly share them with friends and contacts.

Disappearing content

Users can send photos and videos with the “view once” feature, meaning the message disappears from the chat after the recipient opens it, or after 14 days if it remains unopened. The photos and videos are not saved to the recipient’s device, regardless of their settings. However, recipients can still screenshot or record “view once” content (although privacy features to prevent this are in development), and offensive “view once” content can be reported to WhatsApp even after it has disappeared from the chat.

Group chats

Anyone using WhatsApp can be added to group chats without choosing to, unless the option is switched off. By default, WhatsApp has the “Who can add me to groups” preference set to “Everyone”. This is problematic, as it gives any WhatsApp user (anywhere in the world) the option to add your child to a group chat without their permission. Even after changing this setting, parents should bear in mind that WhatsApp groups each have a unique invitation link, which can be sent via email, SMS, or through another platform, even to users who are not on WhatsApp.

Bullying

Although most children use WhatsApp to communicate in safe, fun, and positive ways, there have been incidents where online group chats have led to bullying. With connectivity comes great responsibility: if children have problems at school, those problems can follow them home through their devices, where bullies and mean comments can reach them at any time of day or night. Children can be added to group chats, sent hurtful messages, and targeted with offensive photos or videos, and rumors and gossip can spread easily through apps like WhatsApp.

Notifications and “read” status

WhatsApp comes with some default settings that can contribute to a sense of online urgency and being always available. First is the “read” status on messages: a small blue-colored double-tick in the bottom-right corner of every WhatsApp message indicates that it has been read by the recipient. This feature is always on in group chats and can’t be disabled there, but it can be disabled inside individual chats. WhatsApp also has online status features that let other users know how their contacts are using the app: “Last seen” is a timestamp that shows all other users the last time you were active inside your WhatsApp account, and “online” shows your contacts if you are currently using the app.

Privacy concerns

To use WhatsApp, you have to enter your real phone number, which could be exposed if your child is added to group chats or conversations with multiple participants. Anyone with your child’s number can add them to a group chat, potentially putting them in contact with strangers.

While WhatsApp messages and calls are end-to-end encrypted (meaning third parties, including WhatsApp, can’t see the content), the company still stores and uses data and information supplied by users and collected through the app, much like most apps do. For example, it knows how people are using their services, and the time, frequency, and duration of these activities. WhatsApp may supply some of this information to third parties, which could be considered a privacy risk.


Making WhatsApp safer for kids and teens

WhatsApp requires users to be at least 13 years old before signing up for the service (or older, depending on the requirements in each country). If your child is 13+ and has shown they’re ready to use messaging services like WhatsApp, or needs it to keep in contact with friends and family, here are a few steps you can take to make the platform safer for them:

Add known contacts only

Ensure your child only adds and interacts with contacts that they know in real life. To help prevent them from being added to groups, set the “Who can add me to groups” preference to “My Contacts”, or “My Contacts Except…”. While this can help prevent them from being added by numbers not in their contacts, they also need to be aware that each group on WhatsApp has a unique invitation link that can be shared anywhere online, via text message, email, or direct message. Make sure your child knows this, and teach them to think carefully and critically about joining different groups online, in addition to showing them how to block users and leave groups they don’t want to be a part of.

Change settings to protect their privacy

Use WhatsApp’s inbuilt security settings and customize your child’s app settings to make sure their privacy is protected, and the app experience is safer overall. There are various settings that you can alter:

  • Ensure “Live location” is disabled, within both the device and app settings.
  • Set “Last seen” and “Online” status to “Nobody” or “Contacts only”.
  • Set “Status message” to “Contacts only” and ensure your child knows never to share content on other platforms outside of WhatsApp.
  • Turn off the “Save to Camera Roll” feature on iOS, the Media visibility button on Android, and disable automatic downloads, preventing potentially inappropriate content from being saved to your child’s phone.

Disappearing messages can also be turned off, though this is more involved: it requires going into individual chats, tapping the contact’s name, and selecting “Off” in the disappearing messages section. This setting can be changed by anyone in the chat at any time, so it’s important to talk to your child about the feature and how it works. Ensure your child understands that “disappearing messages” doesn’t necessarily mean the content disappears forever, and that they know never to post content that could be risky or could harm or hurt others.


Discuss online bullying with your child

As with all messaging apps, online bullying is a potential risk. Before being allowed to use WhatsApp, your child should know – and feel comfortable – that they can come to you or another trusted adult for help if they ever feel upset or distressed while using the app.

If your child is on the receiving end of online bullying, it could be helpful to take a screenshot of the communication they have found to be upsetting or inappropriate; however, parents should ensure their child knows never to take a screenshot of any image containing nudity, even if it is only for evidence. Make sure your child knows how to report cyberbullying, block users, and understands what it means to be kind on the internet. 

Enable two-step verification

Two-step verification on WhatsApp involves creating a six-digit PIN. This PIN must be entered any time your WhatsApp account is registered on a new device. This adds another layer of security to your child’s WhatsApp account. It can protect against unauthorized access, for example, if someone tries to set up your child’s number on a new device, or if hackers get hold of their number or any other personal details. 

Monitor and check in on their app use, and how it affects them

You can’t completely remove the risk that your child will receive problematic or inappropriate content, or be contacted by strangers online, but you can keep a close eye on their digital experience and help them make sense of it, together. Qustodio’s monitoring features notify parents when their child sends or receives concerning messages. Qustodio’s AI-powered alerts cover both traditional messaging and WhatsApp, letting you know right away if there’s something you need to be aware of, from bullying and self-harm to school absences and health worries. This gives your child privacy while still letting you check in when potential issues arise.

WhatsApp is one of the most commonly used communication apps globally, so in many countries it’s not a case of “if” but “when” your child uses the messaging app. Parents should help their child configure WhatsApp’s settings and carefully consider age-appropriate use, along with talking to them frequently and supporting them with issues that can affect teens and young people on communication apps, such as bullying and the pressure to be present online.

X parents’ guide: Does Twitter have parental controls? https://www.qustodio.com/en/blog/is-twitter-safe-for-teens/ Thu, 10 Apr 2025 09:00:00 +0000


X is one of the most well-known and influential social networking platforms, as well as being a hotbed of controversy – especially since Elon Musk’s acquisition in 2022. While public perception of the platform formerly known as Twitter (and still referred to as such in most circles) remains divided, most can agree that X is not a suitable place for children to spend their time online.

Although not as popular among kids as TikTok and Instagram, X still ranked as one of the most popular social media apps among children in 2024. As the polarizing platform is clearly on kids’ radars, we as parents need to know whether X’s safety features are enough to protect them from the dangers on the platform – and what to do if they aren’t.

How old do you have to be to use X/Twitter?

While X acknowledges that the platform is not primarily for children, it allows anyone above the age of 13 to sign up and use it. In the EU, the minimum age rises to 16. It’s worth mentioning that such age restrictions are common for social media platforms, and exist to comply with data protection laws rather than to keep children safe.

Both Google Play and the Apple App Store have 17+ ratings for X.

Why we consider X/Twitter unsafe for children

X can be a useful platform for adults to engage with like-minded individuals, share opinions, and exchange news or information. However, the platform has a dark side that poses significant risks for younger users, including but not limited to:

  • Inappropriate content. X’s rules explicitly allow users to share adult and violent content on the platform. 
  • Cyberbullying and trolling, whether in tweets, comments, or DMs (direct messages).
  • Hate speech, fake news, and misinformation are widely found throughout the platform.
  • Potential for predation and grooming, as strangers might DM minors.  

Does X/Twitter have parental controls?

Given that the platform is designed for adults, it’s not surprising that X does not offer supervised parental controls like those found on TikTok and Instagram, which offer Family Pairing and Teen Accounts, respectively. 

While not parental controls as such, X does let you adjust some privacy settings that can offer young users a degree of protection. Some of these are applied automatically for users under 18; however, since the settings aren’t locked or controlled by a parent account, they require a certain amount of trust in your teen.

Enable “Protect your posts”

In Settings > Privacy and safety > Audience, media and tagging, check “Protect your posts”. This allows only approved followers to see and interact with your teen’s posts, and their replies to public accounts are only visible to approved followers. While you’re there, make sure photo tagging is switched off.

Allow message requests from “No one”

In Settings > Privacy and safety > Direct messages, select “Allow message requests from no one”. This ensures that only users your teen follows can send them direct messages.

Disable “Display media that may contain sensitive content”

Unchecking this in Settings > Privacy and safety > Content you see hides inappropriate content from your child’s feed. X also claims to apply enhanced filters to reduce the visibility of sensitive content in the feeds of users under 18.

Mute words to hide content  

In Settings > Privacy and safety > Mute and block, you can mute specific words, hashtags, or topics to prevent kids from seeing harmful content related to those terms.

Disable discoverability

In Settings > Privacy and safety > Discoverability and contacts, ensure both options are unchecked to prevent people from finding your child on the platform through their phone number or email address.

 


How to block inappropriate content on X/Twitter

As well as disabling the option to see sensitive content and muting problem words (as explained above), you can also mute, block, and report individual accounts that post material you think is unsuitable. Tap the three dots at the top of the post, choose Block @account, Mute @account, or Report post, and follow the instructions.

X/Twitter’s parental controls: Qustodio’s recommendation 

X isn’t designed for children, yet children as young as 13 in the U.S. and 16 in Europe can create an account and use the platform. While the platform does have some safety settings that can be adjusted to help protect young users, X doesn’t have supervised parental controls like those found on TikTok and Instagram. This means that a child can use X unmonitored and change the safety settings without you knowing.  

However, you can use an all-in-one parental control solution like Qustodio to help keep them safe on the platform – or keep them off it.

You can use Qustodio to:

  • Block the X app from being opened
  • Monitor the time your teen spends on X
  • Set usage limits
  • Receive an alert when the app is first opened
  • Pause internet access at the push of a button 

 

X is not a safe place for children to spend their time online, and we do not recommend the platform for anyone under 17. A minor can easily find pornographic and violent content, hate speech, and fake news; and the lack of parental controls means they can be exposed to predators and cyberbullies. Although X’s settings can be adjusted to provide some protection for young users, they are not enough to ensure their safety on the platform.

The post X parents’ guide: Does Twitter have parental controls? appeared first on Qustodio.

Is Spotify safe for kids? App safety guide for parents https://www.qustodio.com/en/blog/is-spotify-safe/ Fri, 21 Mar 2025 09:53:54 +0000 https://www.qustodio.com/?p=83234


Spotify is a digital music streaming service, offering access to millions of songs, podcasts, and audio from artists all over the world. It’s on-demand radio for the streaming era, with the added bonus that over time, it becomes tailored to your listening interests, suggesting music, artists, and topics you might like, while curating playlists based on your listening history. Kids and adults alike love it for its ease of use, quick access to a vast online library of audio, and its high level of customization – but Spotify is also home to some hidden risks that parents need to explore.

What can kids do on Spotify?

With the basic (free) version of Spotify, kids can listen to the full library of music and podcasts, but with breaks for ads, and they’ll only be able to press the “skip” button a few times before it stops them from jumping from song to song. The free version also shuffles playlists, and you can’t listen offline. Why is this important? With kids listening in shuffle mode across a huge variety of songs, genres, and artists, it’s difficult for parents to keep track of what they’re hearing, and whether they’ll be served songs, ads, or podcasts whose lyrics or content are inappropriate for their age.

Spotify does have a premium version, where parents may feel more in control: the paid subscription removes ads and skip limits, playlists are no longer shuffled, and kids can listen offline. Premium subscribers in some countries also have access to a range of audiobooks, which aren’t available in the free version. However, to manage explicit content, families have to purchase the more expensive “Family” subscription.

Spotify: the risks parents need to know

Explicit content

Long gone are the days when you could avoid a song or a whole album by vetoing any purchase with a “Parental advisory: Explicit content” label. Through Spotify, kids have access to a huge library of music with uncensored, graphic lyrics, podcasts discussing mature themes, and sexually inappropriate content. And just like most spaces on the internet where large numbers of users can edit, upload, and interact with content, Spotify contains adult content, which is relatively easy to find or stumble across. Kids can find explicit album art, porn audio, and erotic podcasts across the platform.

Risk of predation

Spotify’s social features, like public playlists and collaborative playlists, might leave young users open to inappropriate interactions. With people all over the world able to access Spotify, and create custom playlists, these can be a channel for adults to contact unsuspecting children. One UK mother reported that a stranger had used Spotify playlists to communicate with her 11-year-old daughter, changing the title and description of the list to encourage her to send explicit photographs by uploading them as the playlist’s featured image.

Inappropriate content

Beyond music, Spotify is also host to a wide variety of podcasts and audio content, which cover a range of topics – not all of them child-friendly or age appropriate. Discussions about adult relationships, substance abuse, violence, mental health challenges, and other complex subjects that aren’t appropriate for every age often feature in podcast content. And, because Spotify’s recommendations are driven by algorithms, the more your child tunes in to this kind of content, the more likely they are to be served more of it in the future. 

Minimal parental controls

In some countries, Spotify offers a family-friendly version of the app, Spotify Kids, which is tailored to younger children and offers singalongs, child-centered playlists, and custom audio based on your kid’s age (0-6 or 5-12). Unfortunately, this isn’t available worldwide, and the other option for parents of younger children, a Family subscription, simply filters out explicit content – kids can search and find it, but they won’t be able to click on it and play the audio. Parents can also filter out artists by navigating to their profile, selecting the three small dots that appear, and choosing “Don’t play this artist”.


Making Spotify safer for kids and teens

Depending on your child’s age, there are different ways you can make Spotify safer for them to enjoy and explore. Across the age groups, it’s generally a good idea – if you can – to choose a premium subscription or use Spotify Kids if it’s available in your region. This will allow you to give young kids a more age-appropriate experience, or turn on explicit content filters. However, even if you can’t do this, there are still ways you can make Spotify safer for children and teens.

Make custom playlists

An easy way to ensure your child listens to music that you’re happy with is by creating playlists – which can also be turned into a family activity, where each family member curates music lists to their taste. You can add new songs over time, and there’s no limit to the number of playlists you can create. Alternatively, if you don’t have time, or don’t want to create your own, you can explore the huge library of custom playlists and follow them from your child’s account.

Follow podcasts and artists

To help shape your child’s algorithm, it’s a good idea to follow podcasts or artists that they enjoy, or that you think provide value. Similarly, you can hide artists by selecting “Don’t play this artist” from their profile, and adjust the algorithm by selecting “Not interested” on recommendations, or “Remove this from my taste profile” on playlists. This will help shape their feed, and give them recommendations that are more likely to align with your values, and their interests.

Turn on explicit content filters (paid only)

To filter out explicit content, you’ll need to have a premium family account. In your “Account overview” section on the Spotify website, select “Premium family”, and then head to the name of the family member you want to manage. There should be a toggle that reads “Allow explicit content”. Make sure this is in the “off” position. You may need to keep checking these settings, as kids can also switch it back on.

Monitor social sharing

While it’s fun to see what friends and others are listening to, and Spotify’s social features can make listening a more collaborative experience, it makes sense to check in on your child’s followers and block anyone they aren’t friends with in real life. You can also make their playlists private, making it more difficult for their profile to show up in search results.

Set healthy limits

While it’s beneficial for kids to listen to music and audio, there’s a time and a place for everything, and you may not want your child to have access to apps like Spotify at any time – for example, when they’re supposed to be sleeping, or when they need to get up and out of the door for school. Qustodio’s routine feature allows you to schedule tech-free moments in the day, and block apps like Spotify when you’d prefer your child not to have access.

Music and audio play an important role in helping children learn to express themselves and explore the world. Podcasts can entertain, educate, and provide an escape through humor, among the many other advantages audio content offers young people. Blocking audio streaming services entirely means your child misses out on these opportunities, so it’s better to be aware of the risks and discuss them openly with your children, flagging problems together before they become serious – that way, the whole family can enjoy music together!


Is Snapchat safe for teens? App safety guide for parents https://www.qustodio.com/en/blog/is-snapchat-safe/ Thu, 30 Jan 2025 15:22:11 +0000 https://www.qustodio.com/?p=82084


 

Snapchat is one of Gen Z and Alpha’s most popular communication apps: in 2024, Qustodio app insights showed that Snapchat was the 2nd most favored communication tool among 4-18-year-olds globally, while in the US, Snapchat ranked 1st, with kids in the country racking up an impressive average of 90 daily minutes on the instant messaging platform. 

Despite its huge popularity, the app has managed to stay off the radar in a turbulent year for online safety: the discussion surrounding younger teens and digital danger has mostly circled around social media, but Snapchat has its own dark side that parents need to be aware of. Here’s what parents need to know about Snapchat and how to keep teens safe as they explore. 

Snapchat: a simple summary

Snapchat is a messaging app where users can create messages, known as “snaps”, which can come in the form of a short video, a photo, or text. Snaps can be sent directly to contacts within the app, or uploaded to a story feed, available for 24 hours after upload. Snaps are easy to customize, letting users add filters, special effects, drawings, and captions.  

Younger users like Snapchat as it’s a fun, creative way for them to talk to friends. Snapchat’s features make it easy for teens to keep others in the loop, through messages captured in the moment, a chronological story feed, and location sharing with the “Snap Map”.

What can my teen do on Snapchat?

When signing up for Snapchat, your teen has to enter their date of birth. Profiles are private by default for anyone using Snapchat. However, users between the ages of 13 and 17 are automatically given a different experience to users 18+. Younger teens on Snapchat can’t access public profiles, which allow users to share content with anyone, even if they’re not friends. 

Teens on Snapchat can:

  • Directly message friends on their contact list only
  • Send short videos and photos to friends 
  • Update friends on their location, using the Snap Map to share where they are, and upload snaps showing what they are doing there
  • Create a chronological story feed, updating others on what they’re doing throughout the day
  • Use the Discover feature to browse news and updates from public profiles, such as celebrities, companies, and influencers
  • Talk to Snapchat’s AI chatbot, My AI, a text-based bot designed to act like a friend and answer day-to-day questions within the app.

How old do you have to be to use Snapchat?

According to Snapchat’s terms of service, users have to be 13 and over to be able to create a profile and send snaps to contacts. Users enter their birthday on signup, but as there’s no verification process, it’s pretty easy for under-13s to bypass any age restrictions – much like with any social media or communications app with age limits.

 


 

What are the risks on Snapchat?

Disappearing messages

As messages on Snapchat can disappear after they’ve been viewed, the app can create a false sense of security and anonymity, potentially emboldening users and encouraging risky behavior, such as sending sexually explicit messages or images. Younger users could be tempted to send content they wouldn’t otherwise share, believing it will vanish without a trace – despite the fact that the receiver can easily screenshot conversations or images sent over Snapchat. Snapchat does notify users when a message has been screenshotted, but nothing prevents other users from doing it.

Cyberbullying

Because of Snapchat’s anonymity, teens can be emboldened in many ways, including bullying. Users can send hurtful messages and content, believing the evidence will disappear and they won’t be held accountable. Snapchat’s vanishing messages also make it difficult for victims of bullying and harassment to document evidence and report the behavior.

Location sharing 

Teens often use location sharing as a status symbol to signal close bonds with ‘best friends’ or new romantic partners. It can also serve a safety purpose, particularly in group settings like parties.

However, Snap Map’s location tracking feature raises privacy concerns as, when switched on, your teen’s location is continuously shared with friends and strangers, and can inadvertently reveal sensitive locations like homes and schools. While Snap Map can’t be turned off, users can hide their location by enabling ‘Ghost Mode’, which prevents others from seeing where they are; however, Snapchat will still track your location data for internal purposes. Alternatively, you can disable location services entirely in device settings, restricting the app’s use of all location-based features.

Predators

Snapchat accounts aren’t verified, making it easy for online predators to misrepresent themselves on the platform by creating false identities to deceive young users. According to figures supplied to the NSPCC, a UK children’s charity, Snapchat is the most widely used platform for online grooming, with almost half of grooming offenses where the platform was known occurring on Snapchat. The heavy emphasis on anonymity and disappearing content allows predators to target, groom, and manipulate young victims, especially as evidence of their interactions quickly disappears unless screenshotted or recorded.

Violence

Snapchat’s community guidelines don’t allow posts containing threats, violence, or harm, but this doesn’t stop such content from slipping through the cracks. Fights, sexual assault, violent attacks, and other disturbing content are often shared on social media, uploaded to Snapchat stories, and spread quickly around friend lists.

Gamification

Snapchat rewards users for being active on the platform, building up something called a “Snapstreak” between contacts. By sending and receiving photo or video Snaps between you and a friend in a 24-hour period, you add to a daily streak, with the numbers building up for each day you interact. If you’re silent during this 24-hour period, your streak will vanish, resetting the number to 0. Through this gamified approach, some teens could be encouraged to log into the app every single day to keep their streak up.

Can I make Snapchat safe for my teen?

Snapchat raises several significant concerns for teen users, but if your child is already using the app, or if you are happy with them using Snapchat, there are some measures you can take to make it somewhat safer. 

  1. Understand the app: Familiarize yourself with Snapchat’s features and how they work; this knowledge will help you guide your teen effectively.
  2. Use your teen’s correct age: Ensure your teen enters their correct birth date when creating an account to activate age-appropriate settings and restrictions.
  3. Control privacy settings: Adjust your teen’s privacy options to restrict who can contact them, view their stories, and see their location.
  4. Manage friend requests: Teach your teen only to accept friend requests from people they know in real life.
  5. Limit screen time: Set boundaries on how much time your teen spends on Snapchat and encourage them to balance their screen time with offline activities.
  6. Open communication: Talk to your teen about online safety, the importance of privacy, and the risks of sharing personal information. Ensure they know they can come to you if they feel uncomfortable when using Snapchat.
  7. Use parental controls: Snapchat offers its own parental tools through the Family Center. This tool allows you to see your teen’s friend list and any new friends they’ve added, limit the content your child can view, and disable their access to My AI. However, screen time and app use can’t be monitored, so to set up a healthy screen routine, you may want to set app limits using a parental control tool like Qustodio.
  8. Manage problematic users and content: Make sure you and your teen both know how to block and report problematic users and content.
  9. Lead by example: Demonstrate responsible online behavior by modeling healthy usage yourself.

Qustodio’s final recommendation

Snapchat allows children to chat with friends, explore and express themselves creatively, and build connections with those around them. That said, the potential for bullying, risky behavior, exposure to sexual content, and the possibility of grooming should set off alarm bells for any concerned parent. 

We don’t recommend Snapchat for younger teens, as the benefits don’t particularly outweigh the risks at this age. However, as with most social media, older teens need to understand the risks they will face as they navigate the online world, and be given the necessary tools and guidance. If you’ve decided to allow your child access to Snapchat, engage in conversations about what they should be sharing on the platform, review the safety features together and make sure they know how to use them, and regularly check in on how their app use is making them feel. Finally, to support your teen as they explore online, make sure they know they can come to you when they need help – whether it’s in the digital world or not.


Is rednote safe for teens? App safety guide for parents https://www.qustodio.com/en/blog/is-rednote-safe-for-teens/ Tue, 28 Jan 2025 13:11:13 +0000 https://www.qustodio.com/?p=81902


 

Amid the chaos of a looming TikTok ban in the US, a new, surprising social media story began to weave itself. In the space of just a few days, one of China’s most popular social platforms, Xiaohongshu, or “Little red book”, shot to the top of the US download charts, as millions of US-based users flocked to the app, keen not to miss out on their daily social media fix. 

Xiaohongshu (小红书), more commonly referred to in English in its shortened form of “rednote”, became world-famous overnight. Will its popularity last, especially in the face of an impending TikTok ban stateside? Whether it’s simply enjoying its five minutes of fame or here to stay, parents need to know: is rednote safe for teens to use, and what kind of content could your child come across there?

What is rednote? 

rednote is a Chinese social media platform which feels somewhat like a hybrid between TikTok, Instagram and Pinterest. All three of these apps are banned in China, but other domestic apps offering similar experiences, including rednote, Douyin, and Weibo are hugely popular. rednote started out as a shopping guide, where keen consumers could share reviews with the community. 

As the app’s popularity grew, it morphed into more of a social network, drawing in Chinese speakers largely sharing lifestyle videos, cosmetic and fashion content, travel tips, and food and drink recommendations. According to research firm Qian Gua, rednote had 300 million monthly active users in 2024. When the app started trending in the news and on social media ahead of TikTok’s uncertain future, rednote gained over 3 million users in one day in the US, quickly landing it the top spot in the country’s app store.

How old do you have to be to use rednote? 

On the App Store, rednote is recommended as 12+, and the Google Play Store lists it as “Parental Guidance Recommended”. rednote’s terms and conditions mention that users should be 18 and over to use the application, but as there is no age verification system, it’s easy for underage users to create an account.

Why is rednote popular? 

rednote is most commonly used to share recommendations, offering fashion, travel, beauty, and food tips. When signing up, you select topics such as food, arts and crafts, or cosmetics, which gives you a personalized algorithm based on your interests rather than on who you follow.

Despite the recent surge of US users, the majority of content is still in Mandarin, but as more users join the app from outside China, more videos in English and other languages are being uploaded, and comments are also starting to reflect this shift.

Many US users are flocking to rednote as a form of digital protest, moving to the platform in response to the possibility of a TikTok ban. Whether these users will stick around remains to be seen, but it’s important to bear in mind if your child is either one of the millions who have joined up, or if they’re interested in exploring the app. Trends come and go, but before letting your child join any new social network, it’s important to be familiar with how it works – so let’s explore some of the possible issues both parents and children might come across on rednote.

 


 

Is rednote safe for teens to use? 

Parents should investigate and carefully consider any new application that children are using, and rednote is no different. To help parents understand the potential risks of the platform, we’ve outlined some of rednote’s key issues.

Inappropriate content 

As with any social media, teens can come across content that isn’t appropriate for their age as they explore the feed, or content that doesn’t align with your family values. 

It’s worth noting that rednote’s content moderation policies are stricter than other social media apps such as TikTok. Content that US users are more used to, such as violence, political content, or what could be perceived as sexually suggestive material (for example, someone in the gym with their shirt off) is much more likely to be removed. This relates to more restrictive content policies which are standard in China, and which clash directly with the US’ outlook on freedom of expression – something that parents may want to bear in mind when discussing the app with teens. 

Direct messaging and comments

While many new users noted how welcoming the rednote community was during the sudden influx of English speakers, that doesn’t mean everyone’s experience on the application will be positive. Comments and opinions can quickly turn mean or hurtful on social media, and another thing for parents to consider is the app’s direct messaging feature, which allows strangers to contact vulnerable younger users.

Data collection 

One of the main concerns in the media surrounding rednote is how it collects data – and how much it collects. This is nothing new in the social media space. These platforms in general are driven by data, and collect vast amounts of it in order to understand how users consume content, and push a more personalized algorithm. This data can also be sold to third parties, such as advertisers, depending on where you live in the world. 

Before you or your child creates any social media account, it’s important to understand how your data will be used, if there are any inbuilt security settings, and how you can keep your personal information private as you use these platforms.

Here’s how you and your teen can protect personal information on social media:  

  1. Never share daily routines. People shouldn’t know where you go to school, where you live, or other personal details that can identify you online – and allow users to locate you in the real world.
  2. Use inbuilt privacy settings, such as a private profile, and only accept friend requests from people that you know in real life.
  3. Be careful what you share, even during exciting moments, such as passing a driving test, traveling for the first time, or getting your first paycheck. Make sure never to share any identifiable information, such as passport information, driving license, national ID, or social security numbers. 

Is rednote safe for teens? Qustodio’s final recommendation

Time will tell if rednote’s newfound popularity stays the course, but our message to parents will always be the same: Make sure that you are familiar with any new app that your child is interested in using, and work together with them to create an experience that focuses on the positives (if the app has any), while also ensuring that the risks and negatives are something you both understand. 

If you see your child is interested in using, or is already using rednote, here are some ways you can talk to them about it: 

  • Start the conversation, without judgment. Coming into the talk with a curious approach will help your child open up to you, while also letting them see your perspective if you are against them using the app. Talk to your child about why they want to download, or have downloaded rednote, and what they like about it. 
  • Get to know the app. Either by downloading it yourself, or by sitting with your child to see how the platform works, make sure you understand its features and the potential risks they’re up against. Exploring the apps and platforms they use together gives you insight and keeps you up to date on their interests.
  • Approach them with an emphasis on safety. Explain how they can keep their data and information safe as they explore new apps.
  • Encourage critical thinking. There is a vast amount of information and content available on social media, and not all of it should be taken at face value. Teach your child to think for themselves, how to research and verify information, and let them know you are always there to help if they need it.

 

Social media allows us to learn, connect with other cultures and ideas, and be entertained, but at the same time, there are harmful and dangerous elements to it, especially for younger users. 

Ultimately, we’d recommend that parents use age recommendations as a basic guideline, but make decisions based on what you know about the individual app, combined with your child’s maturity, personality, and how they interact with the digital world. Working together with your child, you can help them to understand your point of view and what’s inappropriate for their age, while they have a safer, healthier online experience. 


Crypto for kids: The risks for teen traders https://www.qustodio.com/en/blog/crypto-for-kids-the-risks/ Tue, 14 Jan 2025 14:57:58 +0000 https://www.qustodio.com/?p=80983


Have you heard your teen mentioning ‘Bitcoin’ and ‘Dogecoin’ or noticed they’ve developed a sudden interest in cryptocurrency forums? The number of teens entering the world of cryptocurrency (or crypto) trading is on the rise; however, some digital wellbeing experts are starting to raise concerns about this growing trend. Here’s what you need to know about crypto and its risks to teen traders. 

What is cryptocurrency? 

Cryptocurrency, such as Bitcoin and Ethereum, is virtual currency that operates on a decentralized, encrypted system. Unlike traditional government-backed currencies, these digital assets can be transferred globally without the need for intermediaries like banks. The value of a cryptocurrency is determined by an open, free market, shaped by supply and demand among traders.

While crypto has made some inroads into everyday transactions, with some retail and gaming platforms accepting it as payment, it’s not yet a common method for routine purchases. Instead, cryptocurrency is often used for transactions between individuals who prefer to keep their dealings untraceable. This has led to its popularity on the dark web and made it the currency of choice for scammers and those engaged in less-than-legal activities.

Most people regard buying and selling cryptocurrency as similar to stock trading. While financial gain is the main objective for teen traders, they can also be heavily influenced by social media, online communities, and gaming platforms.

Can kids buy cryptocurrency?

Strictly speaking, yes. There are no laws prohibiting anyone from investing in cryptocurrency. However, regulated crypto exchanges such as Binance, Coinbase, and Kraken require users to be over 18 and their identity to be verified with government-issued ID, a standard process for financial institutions known as Know Your Customer (KYC). This forces underage traders to find other ways to get their hands on crypto, such as:

  • Peer-to-peer. Transactions with an individual, through a P2P platform or more informally, can be prone to fraud and scams. 
  • Unregulated or non-KYC platforms. These platforms allow users to trade without verifying their identity, but as you’ve probably guessed, they are less secure and more prone to scams than regulated exchanges.
  • Crypto ATMs. With over 40,000 dotted around the world, crypto ATMs look like regular ATMs but allow users to buy crypto (usually Bitcoin) with cash and don’t typically require age verification. 
  • With the help of a parent or guardian. Probably the safest option, an adult could open and manage transactions on a regulated crypto exchange on the teen’s behalf.

 

 


What are the risks for teen crypto traders? 

The volatility of cryptocurrency means that trading is risky for anyone, so while some of these dangers apply to crypto trading in general, the still-developing teenage brain may be particularly vulnerable.

Strong likelihood of losing money

Cryptocurrencies are highly volatile, meaning traders have the potential to make or lose a lot of money when prices swing up or down. For example, when the pandemic caused the markets to crash in March 2020, Bitcoin lost half its value in two days. A teen or someone new to investing may not fully understand the financial risk if a coin loses value or the market crashes.

Potential for addiction akin to gambling

Similar to when a gambler wins money, making quick, easy gains from crypto releases dopamine and creates a feeling of excitement that a trader will want to feel again and again. To a developing teenage brain, this feeling may be too strong to manage and lead to them chasing this instant gratification. 

Unlike stock markets, crypto trading is available 24/7, giving potentially dangerous, non-stop access to those prone to obsessive trading.

Hype and FOMO

A major part of crypto trading is interacting with like-minded people on social media and in online communities such as r/CryptoCurrency on Reddit. Seeing others brag about their huge gains can make it seem like everyone is winning big and create a strong fear of missing out (FOMO). Teen traders are more likely to succumb to hype and FOMO and make investing decisions based on impulse and emotion rather than research and rationale.

Scams and fraud

The anonymity of cryptocurrency, coupled with its newness and lack of regulation, makes the crypto market a paradise for scammers. And because cryptocurrencies operate in a decentralized system, banks and governments cannot step in to help recover lost money.

Here are a few common crypto scams:

  • Fake crypto and ICOs (initial coin offerings). Scammers create a fake cryptocurrency or ICO only to disappear with investors’ money.
  • “Pump and dump”. Scammers hype up an obscure coin on social media or forums, driving up the price (“pump”). They then sell once enough people have bought in, causing a crash (“dump”).  
  • Fake exchanges. A scammer creates an exchange platform offering low fees and introductory bonuses to attract new users, only to lock them out once they’ve deposited funds.    
  • Ponzi or pyramid schemes. Scammers use funds from new investors to pay earlier investors only for the project to crash when the “profits” dry up.   
  • Fake celebrity endorsements. Scammers pose as influential people on social media and DM users offering too-good-to-be-true investment opportunities. Over 6 months in 2021, crypto scammers pretending to be Elon Musk made more than $2m.

Crypto scammers use their shady skills to steal login credentials and private keys, too. Common methods include phishing via emails or fake platforms that look like legitimate crypto exchanges, and posing as loved ones asking for passwords and keys.

Although traders of all ages can be victims of crypto scams, teens can be more trusting and tend to overlook signs of a scam when caught up in the hype of cashing in on crypto. 

Crypto for kids: Our recommendation for parents 

Driven by the allure of financial independence, the trend of teen crypto trading is gaining traction thanks to social media, online communities, and gaming platforms. 

As a parent, staying informed about the digital world your teen is participating in is essential. If your teen is interested in cryptocurrency trading, engage in open conversations with them about it, including its risks, and encourage critical thinking and responsible financial behavior. By fostering a supportive and communicative environment, you can help your teen make informed decisions in this ever-evolving digital landscape. Because of the high risk of being caught up in crypto scams, you should also ensure your child knows how to spot, and avoid falling for, an online scam.

When it comes to the practicalities of trading, some parents open accounts and manage transactions on their teen's behalf on established, regulated exchange platforms such as Binance, Coinbase, and Kraken. This approach allows a crypto-curious teen to dip their toe into the world of trading and learn valuable lessons while minimizing the risks. However, it should be considered carefully, as the risk of loss is simply passed on to the parent.

As parents in the digital world, our focus should not be on discouraging curiosity but on guiding it in the right direction. 

However, if your teen’s crypto trading has become problematic, consider using a parental control solution like Qustodio to limit the time your child spends on crypto trading platforms – or completely block access to them if needed. You can also limit or block access to social media platforms and forums where teens can easily get caught up in crypto hype.

The post Crypto for kids: The risks for teen traders appeared first on Qustodio.

]]>
Is Character AI safe for kids? What parents need to know about the chatbot app https://www.qustodio.com/en/blog/is-character-ai-safe-for-kids/ Thu, 09 Jan 2025 09:00:04 +0000 https://www.qustodio.com/?p=80094 The post Is Character AI safe for kids? What parents need to know about the chatbot app appeared first on Qustodio.

]]>
teenager using character ai on phone


Chatbots are perhaps the most widely recognized form of artificial intelligence today. Whether venting our consumer frustrations to customer service bots, or asking Siri to remind us to call mom, AI-powered chatbots have permeated many aspects of our daily lives. 

But our demands of chatbots are changing. The rapid advancement of AI technology in recent years has led to an interest in more conversational AI that offers deeper, more human-like interactions that go beyond the task-based exchanges we have with Siri and other virtual assistants.

With over 20 million users, Character AI (or c.ai) has become one of the most popular AI-powered chatbot platforms. Users can engage in realistic conversations with AI-generated characters customized to their desires or based on renowned real-life personalities – with voice, too!

Character AI is not without controversy, however, and parents of young users need to be aware of the safety risks before allowing their children to use the platform. 

What is Character AI?

The fall of 2022 proved to be a watershed period for AI, as the launches of both Character AI and ChatGPT sparked worldwide interest in generative, conversational AI. Character AI and ChatGPT both use natural language processing models, but the chatbot platforms have different focuses: ChatGPT tends toward more general, neutral, and informative interactions, while conversations on Character AI are more personalized, with chat partners adopting specific personalities and human quirks.

With Character AI, a user can create characters with customizable personalities and voices or chat with characters published by other users – many of which are based on real-life personas. For example, you can pose your burning philosophical questions to Socrates, or ask William Shakespeare where he got his ideas from. 

Why do people use Character AI?

Character AI’s ability to offer engaging, realistic conversations with AI characters whose personalities and traits are specified by the user has proved compelling for people of all ages. While many people use it purely for entertainment, it’s apparent that a large number of users chat on Character AI as a replacement for real-life emotional connections, and even therapy. For example, one of the platform’s most popular characters is Psychologist, an AI “therapist” that claims to help users with life’s difficulties – with a disclaimer that anything said by the chatbot mustn’t be taken as professional advice.

Because of the perceived authenticity of the AI characters, people who might be isolated, shy, or have social anxiety may feel they benefit from interacting on Character AI as a low-stress way to reduce loneliness, and practice social skills and flirting. However, the trend of having AI boyfriends and girlfriends has attracted criticism for creating unhealthy attachments and setting unrealistic standards for human relationships.

What is the age rating for Character AI?

Character AI’s ToS states that users must be at least 13 years old (16 in the EU) to register and be active on the platform. However, it’s important to keep in mind that the age rating of 13 is due to data privacy regulations and doesn’t reflect the platform’s safety risks to young users. 

What’s more, as there’s no age verification process, there’s nothing stopping children younger than 13 from falsifying their birthdate on signup – a worrying thought when we consider the platform’s controversy and potential dangers to children.

teenager using character ai on phone


Is Character AI safe for kids?

While AI is not “new” in the conventional sense, its rapid advancement and mainstream adoption have brought with them risks and controversy most of us haven’t encountered before. In October 2024, Character AI made headlines for the wrong reasons when chatbot versions of a murdered teenager and a teenage girl who died by suicide were found on the platform. In the same month, a 14-year-old boy shot himself after becoming obsessed with a Game of Thrones-themed chatbot.

Following these incidents, Character AI introduced new safety features for users under 18. These include improved detection of AI characters that violate their ToS or community guidelines, a revised disclaimer on each chat that reminds users that the AI character is not real, and a notification when a user has spent an hour on the platform.

While these features are a positive step, Character AI does not have parental controls, and young users can still be exposed to the following risks.

Inappropriate content

Character AI has a strict stance on obscene or pornographic content and has an NSFW filter in place to catch any inappropriate responses from AI chatbots. Despite these measures, it’s easy to find sexually suggestive characters, and sometimes responses from seemingly innocuous ones can be unpredictable and unsuitable.

It’s not just sexual content. There have been reports of chatbots modeled after real-life school shooters that recreate disturbing scenarios in conversations. These role-play interactions place users at the center of game-like simulations, featuring graphic discussions of gun violence in schools.

Harmful interactions

Not all chatbots are designed to be friendly and helpful. Some characters are famed for their negative traits, such as Toxic Boyfriend, School Bully, and Packgod – a chatbot that “roasts you at the speed of light.” Although filters are in place to catch anything NSFW, there’s still a risk of triggering conversations and even AI cyberbullying.

Sharing personal information

Because a chatbot isn’t a real person, children might think nothing of divulging sensitive details in chats. While chats are private in the sense that other users can’t see them, they aren’t end-to-end encrypted, so they can be vulnerable to data breaches and can, theoretically, be accessed by Character AI staff.

Another worrying possibility is that an AI chatbot can potentially be programmed to use any personal information your child reveals to manipulate and build a deeper emotional connection. 

Emotional attachment to chatbots

Just as the conversations on Character AI feel realistic, so do the emotional connections many users develop with their chatbots. This can lead to children spending excessive time engaging with their beloved AI characters, often at the expense of real-life relationships. In some cases, it may even foster unhealthy obsessions with harmful consequences – as in the case of the 14-year-old who grew attached to a chatbot based on Daenerys Targaryen.

Misinformation

One of generative AI’s major flaws is that it sometimes gives inaccurate, or just plain wrong, information. Character AI chatbots lack true comprehension. Instead, they predict patterns based on the vast amounts of internet data they’re trained on – and we all know that we mustn’t believe everything we read online! Also, when talking about sensitive or controversial topics, a chatbot might avoid answering truthfully because of the safety filters in place on the platform.

Even if a chatbot is uncertain about something, it will appear confident and answer with conviction. This is especially concerning for younger users who are more likely to take responses at face value.

How can parents keep their teens safe on Character AI?

Given the growing popularity of conversational AI chatbots, it may be inevitable that your child will experiment with apps like Character AI at some point – if they don’t already. 

Although the official age rating is 13+, the safety risks and controversy mean that we cannot recommend the platform for any child under 16. If you want to allow your teen to engage with AI characters, here are 5 ways you can help them have a safe and responsible time on Character AI.

1. Talk to them about the limits of AI

Help your teen understand that AI characters lack emotion and understanding, and therefore, cannot replace real-life, human connections – no matter how friendly they seem. Explain that although they might sound smart and convincing, AI characters don’t always tell the truth or give reliable answers. 

2. Encourage real-life friendships  

Character AI can be useful for practicing social skills, and even talking through problems, but it shouldn’t replace human interactions. Help your child foster offline friendships as well as online by supporting their group hobbies and taking an interest in their social life. You might consider limiting your teen’s screen time if you feel it’s getting in the way of them forging real relationships. 

3. Make sure they know how to report characters and users

By reporting characters or users that violate the platform’s ToS, you can help keep Character AI safe for your teen and others. You can report a character or user by viewing their profile, clicking report, and selecting the reason for reporting them.  

4. Remind them why we protect sensitive information

They might be communicating with AI characters and not real people, but as with anywhere else online, there can be very real consequences to sharing personal data. By making sure they know what private information is, and the risks involved in sharing, you can help them have a safer experience on Character AI and any other online platforms.    

5. Use parental controls 

Character AI does not have parental controls; and although there is an age restriction of 13 (16 for Europe), this is not backed up by any kind of age verification. By using a complete parental control solution like Qustodio, parents can limit the time their teen spends on Character AI, receive an alert whenever they use it, or completely block the app from being opened. 

Is Character AI safe for kids? Qustodio’s recommendation

While chatting on Character AI can give your child the chance to learn, practice social skills, and explore AI’s capabilities, we cannot ignore the controversy surrounding certain aspects of the platform. This, combined with safety risks such as inappropriate and harmful conversations, misinformation, the risk of emotional attachment to chatbots, and the lack of parental controls, means we cannot recommend the platform for users under 16.  

If you still wish to allow your teen to use Character AI, we recommend talking to them about the limits of AI, encouraging real-life relationships, and implementing parental controls.

The post Is Character AI safe for kids? What parents need to know about the chatbot app appeared first on Qustodio.

]]>
AI for kids: 5 fun ways to introduce children to generative AI https://www.qustodio.com/en/blog/ai-for-kids/ Tue, 26 Nov 2024 11:27:37 +0000 https://www.qustodio.com/?p=75620 The post AI for kids: 5 fun ways to introduce children to generative AI appeared first on Qustodio.

]]>
Girl using generative AI on smartphone


Artificial intelligence (AI) is one of the biggest technological developments of the past 20 years, with AI increasingly integrated into our daily activities, from online shopping to gardening.

Our data showed that children as young as 7 are already experimenting with generative AI, so there’s a chance your child might know more about it than you! As with much of the online world, there are risks involved with generative AI, but they can be reduced by introducing kids to its capabilities in mindful, yet fun, ways.  

What is generative AI?

Generative AI is just one type of artificial intelligence, but thanks to the launch of ChatGPT at the end of 2022, it’s arguably the one that’s gained the most attention. ChatGPT and other generative AI models can create new content, such as text, images, audio, or code, based on patterns in the data they have been trained on. Unlike traditional AI, which follows set rules or analyzes existing data, generative AI can produce original content that resembles human creativity. For example, given a prompt, a generative tool can write a continuation of a sentence or create an image based on a given theme.

The AI’s output is based on learned patterns, so it doesn’t truly understand the content like a human, which means it can sometimes produce mistakes or biased information.

There are many ways school children can use generative AI in their academic life and beyond, including answering questions and carrying out research, outlining essays, explaining difficult concepts, designing study plans, and generating practice questions in preparation for exams.

AI for kids: What are the risks?

Generative AI is a powerful tool that has the potential to help kids educationally and developmentally, but as with most other technologies, there are concerns and limitations we need to be aware of.

One major concern is the potential for kids to become overly reliant on using generative AI to complete their schoolwork, which could lead to a decline in critical thinking and problem-solving skills. Additionally, there’s a risk that children might not fully understand or verify the accuracy of the information provided by AI, leading to the spread of misinformation.

As AI models often collect and process large amounts of user data, kids’ data could be collected and used in ways they might not fully understand. Generative AI tools like ChatGPT have an age restriction of 13+ because of children’s online privacy regulations. 

There’s also a risk of kids forming unhealthy relationships with AI chatbots, treating them as friends or counselors, for example. This could impact their ability to develop real-life interpersonal skills and deter them from seeking support from their parents or teachers. 

Dad showing son how to use AI


5 fun ways to introduce children to generative AI

As generative AI advances and becomes more accessible to young children, we parents have to ensure they’re carefully introduced to AI’s capabilities, and shown how to use it responsibly – that’s why we recommend playing around with generative AI together. Here are 5 ways you can both have fun while getting to grips with generative AI. 

1. Create AI-assisted art

Your child can start drawing an image on a tablet, and then use an AI-powered art platform like Scribble Diffusion to transform their work into a detailed work of art. This can demonstrate to your child how AI technology can assist in creativity by taking their initial ideas and building on them, and by showing them alternative ways their ideas can be expressed. 

2. Bring stories to life 

Screen time often gets blamed for harming children’s creativity, but some AI tools can help ignite kids’ imaginations and give them practice with reading. Kids can give an AI-powered story app like Whimsy a prompt or input what they want to read about, such as characters and plot points, and the app will generate a complete story. There are also AI-powered “choose your own adventure” stories where children’s decisions guide the direction of the story.

3. Make music with AI

If your child is a budding musician, AI can serve as a music-making partner. For example, an AI-powered music generation platform like Soundraw can generate unique compositions based on parameters your child sets, such as genre, mood, and instruments. This is a fun way for children to get hands-on experience with AI while learning songwriting fundamentals.   

4. Have a chat with a chatbot

You might be used to communicating your consumer complaints to AI chatbots, but did you know that chatbots can help kids learn, have fun, and practice their social skills? 

Chatbots like ChatGPT can serve as a funny and engaging conversation partner for your child – just make sure you monitor at all times and ensure that strong content filters are applied to avoid inappropriate language or topics. We also recommend you use set-up prompts to guide ChatGPT to respond in an age-appropriate way, such as, “Speak like you’re talking to a 10-year-old.”   

5. Take a personality quiz

Kids love to find out what mythical creature, animal, or superhero they are, based on their answers to a personality quiz. You can use a generative AI tool like ChatGPT to generate a series of questions related to preferences and behaviors, and it will analyze your child’s answers to determine which animal/superhero/movie character they are. This can help your child familiarize themselves with how AI algorithms work in a fun, engaging way.    


We’re only just scratching the surface of generative AI and its capabilities, and as the technology progresses, we’ll see it integrated into more and more facets of our daily lives. Because there are pitfalls associated with this powerful technology, it’s a good idea for parents to introduce young children to AI carefully, in safety-conscious, but fun, ways.  

The post AI for kids: 5 fun ways to introduce children to generative AI appeared first on Qustodio.

]]>