Instagram to default young teens to private accounts, restrict ads and unwanted adult contacts – TechCrunch


As it prepares to expand access to younger users, Instagram this morning announced a series of updates designed to make its app a safer place for teens online. The company says new users will now default to private accounts when signing up if they are under 16, or under 18 in certain places, including the EU. It will also prompt existing users under 16 to switch their accounts to private, if they haven't already. Additionally, Instagram will implement new technology aimed at reducing unwanted contact from adults — such as those who have already been blocked or reported by other teens — and will change the way advertisers can reach its teen audience.

The most visible change for younger users will be the switch to private accounts.

Historically, when users signed up for a new Instagram account, they were asked to choose between a public or private account. But Instagram says its research found that 8 out of 10 teens selected the “private” option during setup, so it will now make it the default option for those under 16.

Image credits: Instagram

However, it will not force teens to remain private. They can switch to public accounts at any time, including during sign-up. Those with existing public accounts will be alerted to the benefits of going private and will receive instructions on how to make the change via an in-app notification, but Instagram won't force them to switch, it says.

This change follows a similar move by rival platform TikTok, which in January announced updated privacy settings and defaults for users under 18. In TikTok's case, accounts for 13-15 year olds were changed to private by default, and other controls related to how teens use the app were also adjusted, covering comments, video downloads, and other TikTok features, such as Duets and Stitches.

Instagram isn't going so far as to restrict other settings beyond defaulting the account type for teens, but it is taking steps to address some of the problems that result from adults using the same app that minors do.

The company says it will use new technology to identify accounts that have exhibited “potentially suspicious behavior,” such as those that have recently been blocked or reported by other teens. This is just one of several signals Instagram uses to identify suspicious behavior, but the company says it won't publicize the others, as it doesn't want people to be able to game its system.

Once an adult's account is flagged as “potentially suspicious,” Instagram will restrict it from interacting with young people's accounts.

For starters, Instagram will no longer show young people's accounts to these potentially suspicious adults in Explore, Reels, or the “Suggested Accounts for You” feature. If such an adult instead finds a young person's account through search, they won't be able to follow it. They also won't be able to see teens' comments on other people's posts, nor will they be able to leave their own comments on teens' posts.

(Any teenager planning to report and block their parents probably won't trip the algorithm, Instagram tells us, as it uses a combination of signals to trigger its restrictions.)

These new restrictions build on technology Instagram introduced earlier this year, which restricted adults' ability to contact teens who didn't already follow them. That made it possible for teens to still interact with family and family friends, while limiting unwanted contact from adults they didn't know.

Cutting problematic adults off from young teens' content like this actually goes beyond what's available on other social media, such as TikTok or YouTube, where disturbing comments are often left on young people's videos — in many cases, girls being sexualized and harassed by adult men. YouTube's comment section was even once home to a pedophile ring, which pushed YouTube to disable comments entirely on videos featuring young children.

Instagram isn't shutting down its comment section entirely; instead, it's more selectively identifying bad actors, then making child-created content that much harder for them to find in the first place.

The other big change, rolling out in the coming weeks, affects advertisers looking to target ads to teens under 18 (or older in certain countries).

Image credits: Instagram

Previously available targeting options, such as those based on teens’ interests or activity on other apps or websites, will no longer be available to advertisers. Instead, advertisers will only be able to target their ads based on age, gender, and location. This will take effect on Instagram, Facebook, and Messenger.

The company says the decision was influenced by recommendations from youth advocates, who argued that younger people may not be as well equipped to make decisions about opting out of interest-based advertising, leading to the new restrictions.

In reality, however, Facebook's multi-billion dollar interest-based ad network has been targeted by regulators and competitors alike, and the company has been working to diversify its revenue beyond ads to include things like e-commerce, with the expectation that potential changes to its business are just around the corner.

In a recent iOS update, for example, Apple restricted Facebook's ability to collect data from third-party apps by asking users whether they wanted to opt out of being tracked. Most people said “no” to tracking. Meanwhile, attacks on the personalized advertising industry have included those from advocacy groups, which have argued that tech companies should turn off personalized ads for everyone under the age of 18, not just those under 13, who are already protected by today's children's privacy laws.

At the same time, Instagram has been toying with the idea of opening its app up to children under 13, and today's series of changes could help show regulators that it's moving forward with youth safety in mind — or so the company hopes.

On this front, Instagram says it has expanded its group of “Youth Advisors” to include new experts such as Jutta Croll at the Digital Opportunities Foundation, Pattie Gonsalves at Sangath and It's Ok To Talk, Vicki Shotbolt at ParentZone UK, Alfiee M. Breland-Noble at the AAKOMA Project, Rachel Rodgers at Northeastern University, Janis Whitlock at Cornell University, and Amelia Vance at the Future of Privacy Forum.

The group also includes the Family Online Safety Institute, the Digital Wellness Lab, MediaSmarts, Project Rockit, and the Cyberbullying Research Center.

The company is also working with lawmakers on age verification and parental consent standards, which it hopes to say more about in the coming months. In a related announcement, Instagram said it is using artificial intelligence technology to estimate people's ages. It can look for signals like people wishing someone a “happy birthday” or a “happy quinceañera,” which can help narrow down a person's age, for example. This technology is already being used to stop some adults from interacting with youth accounts, including through the new restrictions announced today.
