Colman Noctor: We need to reboot our approach to internet safety

Many programmes focus on teaching young people how to use technology safely, but they should also demonstrate how technology uses them.
"My concerns about young people’s mental wellbeing in the technological era are not so much about what they are doing online but more about what they are ā€˜not doing’ when they spend so much time online."

Online safety fatigue is on the rise. Many young people I talk to dismiss warnings, saying "that won't happen to me". Appealing to young people's rebellious nature and informing them about 'surveillance capitalism' could be a good place to start re-engaging them.

Our children are growing up in a hyperconnected world where every online move is tracked, analysed, and monetised. Social media platforms, search engines, and even gaming apps operate within a vast system known as surveillance capitalism. This business model thrives on the collection and sale of user data. Yet, many young users remain unaware of how their personal information is used to manipulate their behaviour and shape their digital experiences. Educating the next generation about these issues is crucial to fostering digital literacy and ensuring they have control over their online lives.

Most of us are unaware of how much control over our choices we hand over to online platforms. For example, Google Maps and Uber Eats use AI to suggest where to eat based on location, past visits, and reviews. Tinder, Bumble, and Hinge use AI to rank potential matches based on swipes, chat patterns, and interests. Netflix and YouTube recommend shows and music tailored to our tastes, while Instagram and Facebook curate our feeds, deciding which content we see first. AI-driven recommendation engines (Amazon, Instagram, Pinterest) influence purchasing decisions, and apps like MyFitnessPal create personalised diet plans.

The term surveillance capitalism, coined by Harvard scholar Shoshana Zuboff, describes how tech companies extract personal data from users to generate profit. Unlike traditional capitalism, where businesses sell products or services, surveillance capitalism turns human experiences into raw data to predict and influence future behaviour.

It goes far beyond our Google queries. Every 'like' and every moment spent scrolling through a social media feed generates data points that feed sophisticated algorithms. These algorithms don't just learn what users prefer; they anticipate what users want next and craft a targeted digital experience. This data is then sold to advertisers, who bid for the opportunity to capture users' attention with hyper-specific ads.

The most high-profile example of surveillance capitalism was identified in 2018 when it was revealed that Cambridge Analytica, a political consulting firm, harvested data from millions of Facebook profiles without consent. The company built psychological profiles to deliver highly targeted political advertisements, influencing elections worldwide by analysing users' likes, shares, and social connections. The scandal underscored how harvested data can be used to manipulate individual choices.

Impact of data harvesting

We need to explain to young people how data harvesting impacts their lives, using examples like 'dynamic pricing', where online retailers adjust prices based on browsing history, location, and even spending habits. Another example of how our data is unwittingly collected is the behavioural tracking in Google Maps. The app doesn't just provide directions; it collects vast amounts of location data. By analysing movement patterns, it can predict where you go to school, where you shop, and even when you're likely to be home. Google then uses this data for hyper-local ad targeting, such as showing you an ad for a coffee shop just before your morning commute.

The music platform Spotify doesn't just recommend songs; it analyses users' listening habits and moods, producing data on their emotional states that can be leveraged for advertising. A company selling sleep aids, for instance, might target users who listen to a lot of relaxation or sleep playlists late at night.

Some platforms use deceptive designs known as "dark patterns", which bury privacy settings in complicated menus or use misleading wording to disguise data-tracking prompts. These data manipulation methods are becoming more concerning with the emergence of AI tools, which can generate hyper-realistic deepfake content to make endorsements and product recommendations seem more trustworthy.

Beyond data collection, online platforms are keen to hack our attention. Psychologists and engineers in Silicon Valley work together to ensure that apps and websites maximise user engagement, often employing techniques similar to those used in gambling and behavioural psychology. This manipulation, sometimes called "attention hacking", keeps users hooked and ensures they spend as much time as possible on a platform.

One of the most well-known examples is the infinite scroll function popularised by social media apps like Twitter, Instagram, and TikTok, which continuously supplies fresh posts, making it difficult for users to break away. This scrolling taps into a psychological phenomenon known as the "dopamine loop", where each new post provides a tiny pleasure hit, compelling users to keep scrolling for the next rewarding piece of content.

Value of personal data

My concerns about young people's mental wellbeing in the technological era are not so much about what they are doing online but more about what they are 'not doing' when they spend so much time online. The social atrophy caused by living predominantly in the virtual world has profound implications for our collective sense of meaning and purpose. It's essential that young people value their attention and have a say in where they direct it.

As technology increasingly dictates how young people interact with the world, we must equip them with the knowledge and skills to navigate these platforms more critically. While much of our internet safety and digital literacy narrative has taught young people how to use technology, it should also demonstrate how technology uses them.

One of the first lessons young people must learn is that their personal data has immense value. While platforms like Google, TikTok, and Snapchat may appear free, users pay with their data. We must introduce the "data economy" concept to young people and illustrate how companies turn their online interactions into profit. Young users should be informed about the benefits of practical actions such as reviewing app permissions or checking ad settings. By seeing firsthand how their data is categorised and used for advertising, they can begin to understand the implications of surveillance capitalism.

When students understand the psychological tricks these platforms use, they will be better equipped to resist them. By reflecting on their digital consumption, young people can make more intentional choices about engaging with technology.

Beyond personal habits, young people should also be encouraged to think critically about algorithmic bias, misinformation, and digital privacy to better understand the tech landscape.

Taking back control

Data protection is not easy to sell to young people. Still, fostering their inbuilt sense of activism and encouraging them to resist having their choices controlled by tech companies might be an approach worth trialling. However, if these sessions turn into discussions about regulation and data protection laws, such as the European Union's General Data Protection Regulation (GDPR), we will not help students understand how policy influences digital rights; we will simply lose the room. It's not necessarily about inspiring civic engagement or fighting for a fairer internet; it is about telling young people that these companies do not have their best interests at heart and that ignoring the realities of online manipulation may leave them vulnerable.

Surveillance capitalism and attention hacking have reshaped the internet into a battleground for user engagement, where every click, like, and interaction is commodified. While tech companies have become increasingly sophisticated in collecting data and employing persuasion tactics, young people don’t have to participate passively.

Instead of internet safety interventions consisting of lectures about screen time, cyberbullying, and pornography, we need to look at fostering young people's practical, everyday digital literacy skills.

By encouraging young people to think critically about their online lives, we can empower them to take control of their online experiences rather than be controlled by them.

Knowledge is the best defence in an era when data is more valuable than oil. If young people can become more capable of navigating the online world with curiosity and caution, they may be able to use AI and the internet as tools for empowerment, not exploitation.
