Trust, Safety and Organic’s Blurring Lines
Creativity in the 2020s is no longer centred on pictures in a feed with clearly outlined areas for ads to appear.
It’s video. It’s immersive media. It’s generative AI.
It’s new ways of creating content, and new business models interwoven with them.
Amplified by these macro-trends, the traditional borders between organic and monetised content are blurring. At the same time, users rightly demand authenticity and safety from brands.
This is creating an interesting, under-appreciated shift in Trust and Safety. Bluntly stated: how do users know what to trust?
Ad integrity sits within this emerging space of new challenges and unanswered questions.
And this shift matters most for younger users. People are entering this new internet at ever earlier ages. All platforms have a duty to protect them. But more than that: to raise standards in how industries speak to children and teenagers when promoting their brands and products.
Despite the latest noise about moderation at Twitter, platforms need more moderation. Not less.
— — —
This post is part of the Trust, Safety & Integrity Sequence, a series of short thoughts inspired by my time working on Monetisation Integrity at TikTok. The posts focus mostly on the experience of trust and safety and ad integrity online, with particular emphasis on the larger digital platforms.