Adobe ‘Catchy Content’ Shows How Content Intelligence Powers Personalization

Marketers are obsessed with data. But when that obsession turns into tunnel vision on customer data only, it becomes impossible to achieve personalization at scale. Instead, brands need to know both their customer and their content, and Adobe took the wraps off a prototype that aims to do just that.

At the company’s fully virtual Adobe Summit, it showed off prototype software dubbed “Catchy Content” during Sneaks — an hour-long look into the R&D projects at Adobe that may or may not become products. Early beta customers included a professional services company in APAC and a B2B software-as-a-service company in the US. The companies were able to understand how different audiences responded to language characteristics like length and tone and whether imagery like infographics, photojournalistic images, or stock art performed better.

We’ve told you that there’s no chance of personalization without understanding your content. The Adobe prototype aims to understand customers’ content preferences at scale and then fold those insights back into the creation process to deliver more relevant, more engaging content.

The process works in three steps:

  1. It gathers data on the types of content that engage the customer. This could be product copy, images, and videos of specific products or lifestyle images that align with the brand voice. It does this using Sensei, Adobe’s proprietary AI system, to “look” at images and text to extract meaning.
  2. It understands the attributes of the content. In the Adobe example (screenshot below), it uses a mock surfing website and extracts the tone and sentiment of the copy and the types of imagery.
  3. It matches up relevant content with the segment, persona, or individual that brands are looking to target. In the screenshot below, you can see how different types of content engage the “female surf enthusiast” persona.
This screenshot from the dashboard of the prototype “Catchy Content” demonstration shows how different types of content align with customer affinities. Image: Adobe
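The three-step loop above can be sketched as a simple affinity-scoring model: extract attributes from each asset, keep per-persona engagement weights for those attributes, and rank assets by how well they match. Adobe has not published the prototype’s internals, so everything below — the class names, attribute labels, and weights — is hypothetical and purely illustrative.

```python
from dataclasses import dataclass

@dataclass
class Asset:
    """A piece of content plus the attributes extracted from it (step 2)."""
    asset_id: str
    attributes: dict  # e.g. {"imagery": "lifestyle", "tone": "energetic"}

@dataclass
class Persona:
    """A target audience with learned engagement weights (step 1)."""
    name: str
    affinities: dict  # attribute -> {value: engagement weight}

def score(asset: Asset, persona: Persona) -> float:
    # Sum the persona's learned weight for each attribute value the asset has.
    return sum(
        persona.affinities.get(attr, {}).get(value, 0.0)
        for attr, value in asset.attributes.items()
    )

def rank_assets(assets: list[Asset], persona: Persona) -> list[Asset]:
    # Step 3: match the most relevant content to the persona.
    return sorted(assets, key=lambda a: score(a, persona), reverse=True)

# Illustrative data, loosely echoing the article's example persona.
persona = Persona(
    "female surf enthusiast",
    {"imagery": {"lifestyle": 0.9, "stock": 0.1},
     "tone": {"energetic": 0.7}},
)
surf_video = Asset("surf_video", {"imagery": "lifestyle", "tone": "energetic"})
stock_photo = Asset("stock_photo", {"imagery": "stock", "tone": "formal"})

best = rank_assets([stock_photo, surf_video], persona)[0]
```

In practice the affinity weights would come from engagement data rather than being hand-set, but the matching step itself reduces to a ranking like this.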

This is an important milestone on the personalization journey, because it:

  • Shows how content intelligence is just as important as customer data. Understanding content at scale is hard, especially with a vast library that spans multiple silos. When brands can consolidate their assets into a content hub or digital asset management (DAM) solution, they create a single source of truth across the organization. Then they can begin to understand that content and match it up with experiences.
  • Drives better customer experiences. Personalized content isn’t just about selling (though it does help with conversions): more relevant experiences show customers that brands value their time, one of the hallmarks of a strong customer experience (CX) score. And great CX, powered by great digital experiences, means profitable growth.
  • Paves the way to content automation. When a brand knows that a particular image or type of image performs well, it can surface other relevant images to customers. When it knows that the tone or sentiment of copy resonates with a type of buyer, it can match variants of that copy with different segments. Does this mean content creators are out of jobs? No, of course not. But it means the mundane work of creating content variants or filling in metadata could be handled by a bot, while humans focus on higher-level creative work.

Want to talk more about AI, personalization, and content? Set up an inquiry with me.