Does using price tags in ad creatives for jewelry products in Meta ads increase purchase intent?

Brands selling high-end consumer products such as jewelry increasingly use digital media channels to increase their exposure and even sell products online. In fact, the worldwide online personal luxury goods market was valued at 1.1 billion euros in 2004 and rose to 72.4 billion euros by 2023 (Statista, 2024).

Although these platforms publish guidelines on best practices for creating engaging content, many gray areas remain when it comes to specific products and services. This study aims to answer the following question: Does using price tags in ad creatives for jewelry products in Meta ads increase purchase intent?

Research Method:

Objective:

This study will focus primarily on the creative aspect of the ads rather than the campaign setup. It will examine whether a change in ad creative can attract traffic from a more relevant audience and ultimately convert that traffic into customers. An A/B test will be used: A/B testing, often known as online controlled experimentation or continuous experimentation, involves comparing two versions of an application in a real-world setting from the perspective of the end user (Quin et al., 2024).
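In its simplest form, an A/B test randomly assigns each user to one of two variants and compares a conversion metric between the two groups. The sketch below illustrates that idea in plain Python; the visitor count and conversion odds are placeholders for illustration, not data from this study.

```python
import random

# Minimal A/B assignment sketch: each visitor is randomly bucketed
# into variant "A" or "B", and conversions are tallied per variant.
# The simulated conversion odds below are placeholders only.
random.seed(42)
counts = {"A": 0, "B": 0}
conversions = {"A": 0, "B": 0}

for _ in range(10_000):  # hypothetical visitors
    variant = random.choice(["A", "B"])
    counts[variant] += 1
    # Placeholder behaviour: variant B converts slightly more often.
    converted = random.random() < (0.010 if variant == "A" else 0.012)
    conversions[variant] += converted

for v in ("A", "B"):
    rate = conversions[v] / counts[v] * 100
    print(f"Variant {v}: {counts[v]} visitors, conversion rate {rate:.2f}%")
```

The key property is the random split: because assignment is independent of user characteristics, any sustained difference in the metric can be attributed to the variant rather than to the audience.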

Hypotheses:

1- Compared to an ad creative without a price tag, an ad creative with a price tag will increase the purchase rate, since customers already know the price range of the product.

2- An increase in the purchase rate will lower the cost per purchase and therefore result in more purchases with the same budget (for example, a £225 budget yields nine purchases at a £25 cost per purchase but only five at £45).

Null hypotheses:

1- Ads with price tags will show no difference in purchase intent compared to ads without price tags.

Design:

Since we are focusing on Meta ads, the most reliable platform to run the A/B test will be Meta’s Ads Manager tool. This tool enables advertisers to run custom tests with their preferred setup without requiring coding. The study aims to measure purchase intent, so the campaign objective will be set to “Sales” with the optimization goal of “Maximize number of conversions.” There are three layers in Meta Ads Manager campaigns:

  1. Campaign Level (Objective)
  2. Ad Set Level (Audience, Schedule, Placement, and Spend)
  3. Ad Level (Ad Creative, Data Tracking Inputs)

The test will run at the Ad Set level, using identical settings for Groups A and B. The first ad set will show the creative without price tags, while the second ad set will show the creative with price tags. The results from these two ad sets will determine which ad creative increases purchase intent.
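The test described here was configured entirely in the Ads Manager UI, but the same three-layer structure can also be expressed programmatically. The sketch below assumes Meta's facebook-business Python SDK; the access token, account ID, budget, and targeting values are illustrative placeholders rather than the settings used in this study.

```python
from facebook_business.api import FacebookAdsApi
from facebook_business.adobjects.adaccount import AdAccount

FacebookAdsApi.init(access_token="YOUR_ACCESS_TOKEN")  # placeholder
account = AdAccount("act_YOUR_AD_ACCOUNT_ID")          # placeholder

# Campaign level: the objective lives here.
campaign = account.create_campaign(params={
    "name": "Jewelry price-tag A/B test",
    "objective": "OUTCOME_SALES",
    "special_ad_categories": [],
    "status": "PAUSED",
})

# Ad Set level: identical audience, schedule, placement, and spend for
# both groups; only the creative attached at the Ad level will differ.
# A conversion-optimized ad set also needs a promoted_object pointing
# at the website pixel, omitted here for brevity.
shared = {
    "campaign_id": campaign["id"],
    "daily_budget": 800,  # in minor currency units; illustrative
    "billing_event": "IMPRESSIONS",
    "optimization_goal": "OFFSITE_CONVERSIONS",
    "targeting": {"geo_locations": {"countries": ["GB"]}},
    "status": "PAUSED",
}
adset_without = account.create_ad_set(params={**shared, "name": "Ad Set 1 - no price tag"})
adset_with = account.create_ad_set(params={**shared, "name": "Ad Set 2 - price tag"})
```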

Data Collection:

Platform data (impressions, reach, engagement) will be gathered from Meta Ads Manager. Landing page data (content views, add-to-carts, checkout initiations, and purchases) will also be collected through Meta Ads Manager, using a Meta Pixel embedded in the website code for tracking. This data can be cross-verified with Google Analytics.
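For teams that prefer to automate this cross-check, the figures reported in Ads Manager can also be pulled through Meta's Marketing API. The sketch below assumes the same facebook-business Python SDK; the access token, ad set ID, and dates are placeholders.

```python
from facebook_business.api import FacebookAdsApi
from facebook_business.adobjects.adset import AdSet

FacebookAdsApi.init(access_token="YOUR_ACCESS_TOKEN")  # placeholder

# Pull delivery and pixel-event data for one ad set so that it can be
# compared line by line against Google Analytics exports.
insights = AdSet("YOUR_ADSET_ID").get_insights(
    fields=["impressions", "reach", "inline_link_clicks", "actions"],
    params={"time_range": {"since": "2024-11-17", "until": "2024-11-30"}},
)
for row in insights:
    data = row.export_all_data()  # plain dict of the reported fields
    print(data.get("impressions"), data.get("reach"))
    # "actions" lists pixel events (content views, add-to-carts,
    # checkouts initiated, purchases) broken out by action_type.
    for action in data.get("actions", []):
        print(action["action_type"], action["value"])
```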

Process:

The A/B test was created on the 17th of November and ran until the 30th of November, for a total duration of two weeks. The total spend was £225, split evenly between the two ad sets (£112.50 each). The platform automatically split the audience into two test groups, showing the creative with price tags to one group and the creative without price tags to the other. Because this type of setup limits the algorithm's ability to find new audiences, overall campaign effectiveness was very low.

Audience:

The ads were shown to 30,989 people in total: 24,617 female, 6,060 male, and 312 uncategorized. The largest portion of the audience was between the ages of 25 and 34. Since custom audiences were not used, audience selection was driven by the previously chosen interest keywords.

Results:

The ads were shown 88,939 times in total, with a frequency of 2.87. Overall CTR (click-through rate) was 1.05%, which is lower than the account average for sales campaigns (3.04%). The campaign generated 937 link clicks, 382 of which converted into landing page views. On the website, the campaign generated 1,075 content views, 19 add-to-carts, 8 checkouts initiated, 2 add-payment-info events, and 0 purchases.

For Ad Set 1 (without price tags), the ads reached 15,335 people and were shown 44,441 times, for a frequency of 2.9. The CTR was 1.22%. The ad set generated 541 link clicks, 209 of which converted into landing page views. On the website, it achieved 598 content views, 13 add-to-carts, 6 checkouts initiated, 2 add-payment-info events, and 0 purchases.

For Ad Set 2 (with price tags), the ads reached 16,109 people and were shown 44,498 times, for a frequency of 2.76. The CTR was 0.89%. The ad set generated 396 link clicks, 173 of which converted into landing page views. On the website, it achieved 477 content views, 6 add-to-carts, 2 checkouts initiated, 0 add-payment-info events, and 0 purchases.
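The funnel rates and per-event costs used in the findings below can be reproduced directly from these reported counts and the even £112.50 spend split:

```python
# Reported results per ad set; the £225 spend was split evenly.
SPEND = 112.50
adsets = {
    "Ad Set 1 (no price tags)":   {"impressions": 44_441, "clicks": 541,
                                   "lpv": 209, "views": 598, "atc": 13},
    "Ad Set 2 (with price tags)": {"impressions": 44_498, "clicks": 396,
                                   "lpv": 173, "views": 477, "atc": 6},
}

for name, m in adsets.items():
    ctr = m["clicks"] / m["impressions"] * 100      # link CTR
    click_to_lpv = m["lpv"] / m["clicks"] * 100     # click -> landing page
    print(f"{name}: CTR {ctr:.2f}%, click->LPV {click_to_lpv:.1f}%, "
          f"£{SPEND / m['clicks']:.2f} per click, "
          f"£{SPEND / m['atc']:.2f} per add-to-cart")
```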

Key Findings:

Even though the campaign generated 0 purchases, it generated enough data to comment on which ad set produced more purchase intention. The main metric for determining the winner of the test was cost per purchase; the secondary metrics were cost per link click, cost per landing page view, cost per content view, cost per add to cart, cost per checkout initiated, and cost per add payment info. Across all of these secondary metrics, the ad without the price tag was cheaper, as seen in the table below (costs derived from the £112.50 spend per ad set):

Metric                      | Without price tags | With price tags
Cost per link click         | £0.21              | £0.28
Cost per landing page view  | £0.54              | £0.65
Cost per content view       | £0.19              | £0.24
Cost per add to cart        | £8.65              | £18.75
Cost per checkout initiated | £18.75             | £56.25
Cost per add payment info   | £56.25             | n/a (0 events)

Limitations:

Because of the budget limitation and the campaign setup (the setup did not follow best practices, since it was a test), no purchases were generated. Although there is a clear pattern in the events that lead up to a purchase, the test did not produce the purchase events needed to evaluate the primary metric.

Conclusion:

Based on the results, this test did not gather enough data to draw definite conclusions about purchase intent, so the hypotheses can be neither confirmed nor rejected. Although it is safe to say that visuals without price tags generate cheaper results for the website events that lead visitors toward the purchase action, there is no proof that they will generate more purchases or a better return on ad spend.
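One way to see why the data falls short is to run a quick significance check on the strongest intent signal that did accumulate counts, the add-to-cart events (13 of 598 content views without price tags versus 6 of 477 with). The sketch below uses SciPy's Fisher exact test, a reasonable choice for counts this small; treating content views as the denominator is an assumption made for illustration.

```python
from scipy.stats import fisher_exact

# Add-to-cart events vs. content views that did not add to cart.
# The denominator choice (content views) is an assumption.
no_tag = [13, 598 - 13]   # Ad Set 1: without price tags
with_tag = [6, 477 - 6]   # Ad Set 2: with price tags

odds_ratio, p_value = fisher_exact([no_tag, with_tag])
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.3f}")
# A p-value well above 0.05 means the observed gap could easily be
# chance at this sample size, matching the conclusion above.
```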

In conclusion, the test should be repeated with a bigger budget and a retargeting audience, since it is hard to find customers when the audience targeting is broad. The duration of the test could also be extended to give the algorithm more room to optimize and generate results at a lower cost per result. In the meantime, the company should start using visuals without price tags for traffic objectives, since the cheaper cost per link click will generate more visitors for the same spend. An audience built from those visitors could later be used for retargeting.
