Social media platforms brace for hit to user numbers from age checks

Social media companies expect that age verification measures in the UK’s Online Safety Bill will reduce user numbers, hitting advertising revenue on platforms including TikTok and Instagram.

The long-awaited legislation, which will begin its final stages in the House of Commons next week, would not only remove underage users from the platforms but also discourage individuals without identification or with privacy concerns, people involved with policy at leading social media companies said.

The fear of falling user numbers comes as the platforms grapple with declining advertising revenue, their primary source of income, brought on by the global economic slowdown, and as legislation introduced around the world places stringent new demands on tech giants to police content on their platforms.

“More vetting of users means fewer users,” said a person familiar with advertising at Instagram. “That means fewer users to advertise to, less inventory and fewer clicks and views for business”.

In 2022, revenues at Meta, Facebook and Instagram’s parent, and Snap grew at their slowest pace since the companies went public in 2012 and 2017, respectively, while TikTok also slashed global revenue targets by 20 per cent.

The UK bill, aimed at protecting children from harmful content online and removing all illegal content, is considered to be leading the agenda for big tech regulation globally, along with the EU’s Digital Services Act.

TikTok, Instagram, Facebook and Snapchat only allow users aged 13 and over on their platforms, but the proposed law will require businesses to enforce these limits more stringently, including by explaining their measures in their terms of service. UK regulator Ofcom will have the power to levy heavy fines on companies that fail to comply with the new rules.

The UK government department responsible for the bill said there was public support for keeping underage children off platforms.

“Companies can use their vast resources and ingenuity to develop their own solutions or use the range of age assurance technologies already on offer which protect people’s privacy and are used in other industries,” the Department of Digital, Culture, Media and Sport said.

Platforms’ current checks involve asking users for their date of birth and scanning text in comments, captions and bios for anything that may indicate an age.

Additional verification involves requesting ID or estimating age through face-scanning technology. Instagram users can choose to upload a photo of their identification, record a video selfie for scanning, or ask mutual friends to verify their age.

However, Snapchat and TikTok have expressed concerns that current age estimation technology is unreliable.

Leading social networks have also warned the government that ID checks would freeze out people without identification or those who have legitimate reasons for not wanting to share it, such as survivors of domestic abuse or the transgender community.

“We are talking about really serious data collection and privacy issues, along with the potential disproportionate impact of needing to collect this data from a demographic that doesn’t [. . . ] all have ID and communities that don’t have ID,” said Nona Farahnik Yadegar, director of platform policy at Snap, the parent company of Snapchat.

Just over 70 per cent of 13-year-olds have valid UK passports, according to calculations by Yoti, a British provider of age estimation technology, based on freedom of information requests.

In comments published by Meta this week, Nick Clegg, the company’s president of global affairs, said improving child safety online is “a space where I think it is totally legitimate and normal for regulators to act.”

But in a submission to the Online Safety Bill in June, Meta said the British measures risk “dividing the UK online space into three strata: verified adults, unverified adults, and users under 18s” and could exclude “young Brits from being able to participate in the digital world freely and safely”.

As well as limiting access for certain types of users, the Online Safety Bill could affect the user experience, a person familiar with public policy at TikTok said.

TikTok’s “For You” page, which recommends videos and content from around the world, would draw on a much smaller pool of content if people chose to see only verified users.

This would hit freedom of speech on the app and lead to “filter bubbles”, with users seeing a narrower range of content from verified users with similar interests, the TikTok employee said.

Several companies, including US firm Jumio and Estonian company Veriff, provide technology to verify IDs, while age estimation technology, which can involve face scanning or hand measurement, is still emerging in the sector.

Providers charge about £0.10 per face scan, according to a government impact statement for the Online Safety Bill, although this can be reduced for larger user numbers.

Facebook Dating, Instagram and adult content host OnlyFans have recently contracted Yoti to conduct face scans and verify users’ ages. Yubo, a livestreaming social media site aimed primarily at teenagers, said it had seen a drop-off of between 10 and 20 per cent in users since it introduced Yoti’s technology in May.

Yoti said it could accurately assess 6 to 11-year-olds as being under 13 in 98.9 per cent of cases, but there is a two-year error range for children aged 13 and 14, with non-white children less likely to be assessed correctly.

Those familiar with public policy at Snapchat and TikTok have suggested age verification could be done at an app-store level, which would place the onus on providers such as Apple or Google rather than the platforms themselves.

“[Tech companies’] worry is that by doing it, they are going to end up with a much smaller audience number for advertisers, which is going to massively dent their revenues,” said Tony Allen, chief executive of independent auditor Age Check Certification Scheme. “My argument is that it will adjust itself if it is done across the marketplace.”

TikTok and Meta declined to comment.

Additional reporting by Anna Gross in London
