Ads from major companies including Verizon, Amazon, Popeyes, Shell, and Roblox appeared alongside sexually suggestive and racially offensive content on the XShorts app, which currently ranks among the top entertainment downloads on Google Play. The app promotes itself as a platform for “short and hot videos” and carries a “Teen” rating despite featuring explicit and inappropriate material.
An ADWEEK investigation found that ads from prominent brands were displayed next to problematic videos, placed primarily through automated ad networks including Facebook Audience Network, Epsilon, and InMobi. Many brands said they were unaware of the placements, citing a lack of transparency in ad algorithms and campaign settings.
Following ADWEEK’s inquiry, Google blocked XShorts from monetizing through ads, and Meta removed the app from its ad network. Several ad tech providers have since taken similar action, while brands are reassessing their media strategies and introducing stricter controls.
The case once again underscores the brand-safety challenges of digital advertising, especially on fast-growing platforms that rely on automated ad distribution. Experts stress that marketers and agencies must closely manage campaign settings to avoid placements in unsuitable environments.
