Instead of testing individual ad variations, advertisers can use the "Dynamic Creative" (for leads) or "Flexible Creative" (for sales) toggles. These toggles let advertisers combine multiple top-performing images, videos, headlines, and text variants into a single ad unit, which Meta's algorithm then mixes and matches to find the optimal combination for each user.

Related Insights

Contrary to the prevailing "video-first" narrative, Meta's own data shows that 60-70% of ad conversions still come from static images. Furthermore, carousel ads are experiencing a significant resurgence, making them a top-performing format that advertisers should prioritize for the new algorithm.

The next evolution, the Generative Ads Recommendation Model (GEM), aims to fully automate ad creation. Marketers will simply provide an image and a budget, and the AI will generate the entire ad library. This shifts the marketer's primary value from ad creation to optimizing the post-click customer journey and offer.

The largest advertisers on platforms like Meta launch over 10,000 new creatives a year, equating to more than 40 per workday. Most companies cannot sustain that scale of experimentation manually, creating a clear market need for AI platforms that automate and scale video production.

AI can now analyze video ads frame by frame, identifying the most compelling moments and justifying its choices with sophisticated creative principles like color theory and narrative juxtaposition. This allows for deep qualitative analysis of creative effectiveness at scale, surpassing simple A/B testing.

Ridge automates ad creation using a custom GPT and N8N, producing 500 static ads daily. Even if 90% are unusable, the remaining 50 ads provide a constant stream of testable creative, increasing the chances of finding winning variants for personalized campaigns at scale.
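The generate-then-filter approach above can be sketched in a few lines. This is a hypothetical illustration, not Ridge's actual pipeline: the `Ad` record, the `quality_score` field, and the 10% keep-rate are assumptions standing in for whatever automated review step a real workflow would use.

```python
# Hypothetical sketch of a generate-then-filter loop: produce a large daily
# batch of ads, discard ~90%, and keep the top-scoring remainder for testing.
# All names here (Ad, quality_score) are illustrative assumptions.
from dataclasses import dataclass
import random

@dataclass
class Ad:
    ad_id: int
    quality_score: float  # stand-in for an automated review/scoring model

random.seed(0)
batch = [Ad(i, random.random()) for i in range(500)]  # one day's generated batch

KEEP_FRACTION = 0.10  # ~90% judged unusable, per the article
keep_n = int(len(batch) * KEEP_FRACTION)
usable = sorted(batch, key=lambda a: a.quality_score, reverse=True)[:keep_n]
print(len(usable))  # 50 ads enter testing each day
```

The point of the sketch is that the filter can be crude: as long as generation is cheap, even a 10% yield produces a steady stream of testable variants.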

Previously, marketers told Meta who to target. With the new AI algorithm, marketers provide diverse creative, and the AI uses that creative to find the right audience. Targeting control has shifted from human to machine, fundamentally changing how ads are built and optimized.

Top creators like Mr. Beast relentlessly A/B test thumbnails and video intros to maximize views. AI video platforms now bring this data-driven experimentation to SMBs, allowing them to rapidly test variations of spokespeople, demographics, and creative elements to optimize ad performance.

Simply swapping headlines or colors on the same image is now penalized with higher CPMs. The Andromeda algorithm demands a wide variety of creative formats (static images, UGC, carousels, memes) and angles (pain points, testimonials, curiosity), viewing minor iterations as a single, less valuable creative piece.

With Meta's Andromeda algorithm automating audience targeting, the primary reason for poor ad performance is no longer incorrect targeting settings. Wasted money is now almost exclusively a result of insufficient or non-diverse creative, making creative strategy the most critical component of a successful campaign.

For products where A/B testing lacks signal, Resident uses a robust naming protocol and AI to analyze creative elements in aggregate. They tag attributes like room color, music BPM, and even mattress angle to identify winning trends across all ads, bypassing the need for direct tests.
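Aggregate attribute analysis of this kind can be sketched simply: parse each ad's tagged attributes, then average a performance metric across every ad sharing an attribute value. The attribute names below (room color, music BPM) come from the article; the data, the `roas` metric, and the function name are illustrative assumptions, not Resident's actual system.

```python
# Hypothetical sketch: find winning creative attributes in aggregate, without
# direct A/B tests, by averaging performance across all ads sharing a tag.
from collections import defaultdict
from statistics import mean

# Illustrative ads with attributes parsed from a structured naming protocol.
ads = [
    {"name": "vid_blue_120bpm",  "room_color": "blue",  "music_bpm": 120, "roas": 2.1},
    {"name": "vid_blue_90bpm",   "room_color": "blue",  "music_bpm": 90,  "roas": 1.8},
    {"name": "vid_beige_120bpm", "room_color": "beige", "music_bpm": 120, "roas": 3.0},
    {"name": "vid_beige_90bpm",  "room_color": "beige", "music_bpm": 90,  "roas": 2.6},
]

def aggregate(ads, attribute, metric="roas"):
    """Average the metric across every ad sharing each attribute value."""
    groups = defaultdict(list)
    for ad in ads:
        groups[ad[attribute]].append(ad[metric])
    return {value: round(mean(values), 2) for value, values in groups.items()}

print(aggregate(ads, "room_color"))  # {'blue': 1.95, 'beige': 2.8}
print(aggregate(ads, "music_bpm"))   # {120: 2.55, 90: 2.2}
```

Because each attribute value appears across many ads, the averages accumulate signal even when no single ad has enough conversions for a statistically valid head-to-head test.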