Meta (Facebook, Instagram) Advertising Tips

"Is there anything that you have analyzed more thoroughly than this (Meta Ads)?"

✔️ Tests

We conducted a total of three tests.

- Meta (Facebook) Test A: First steps in ad optimization
- Meta (Facebook) Test B: Is it possible to optimize advertisements?
- Meta (Facebook) Test C: Don't blindly trust the A/B test

🅰️ Conclusion

- Meta campaigns can have the most dramatic effect for mobile-friendly services (products).
- 'Ad optimization' should be carried out based on the ads themselves.
- Rather than relying on the 'confidence level' from A/B test results, manage your ads based on actual advertising experience and numbers.
- If you are unsure of the best ad creative and format during initial setup, try setting up two ad sets with three to six ads each.
- Did you know there are different types of CTR?
- An image's CPC can be the most expensive!
- Try using an ad format (design) that can showcase various designs and products at once.
- The most effective way to improve ad performance is to turn things 'OFF' at the 'ad set' level rather than turning 'OFF' individual 'ads'. (Turning off individual ads is still useful for day-to-day management.)
- As far as possible, use 'Advantage+ Placement' instead of 'Manual Placement' for ad placements.
❓ Curiosity

You may have heard that creating many ads in Meta (Facebook) can yield better results. In fact, the maximum number of ads that can be created in an 'ad set' is 50. So, can we get the best results by creating 50 ads in one ad set?

Meta (Facebook) recommends keeping the number of ads per ad set to six or fewer for performance. While an exact figure is not documented, the general guidance is that anything beyond six makes little difference. (For ASC (Advantage+ Shopping Campaigns), Meta recommends using at least 10 to 20 ads.)

Based on the above, we ran a test to see how much difference the number of creatives (ads) makes. We also tested and analyzed related variables based on those results, so we recommend this as a reference for anyone operating Meta (Facebook) ads.

💻 Learn more about the conclusion!

Meta campaigns are expected to have the most dramatic effect for mobile-friendly services (products).

Placements by exposure device:

| Exposure Device | Placements | Total |
| --- | --- | --- |
| Mobile | Audience Network native, banner and full-page ads; Audience Network rewarded video; Facebook profile feed; Facebook Reels; Facebook Stories; Facebook Feed; Facebook In-Stream Video; Facebook search results; Facebook Feed: Video Feed; Instagram Browse/Explore tab; Instagram Explore home; Instagram Reels; Instagram Stories; Instagram Feed; Messenger inbox | 15 |
| Desktop | Facebook Feed; Facebook In-Stream Video; Facebook Right Column; Instagram Stories; Instagram Feed | 5 |

The table above breaks placements down by device. Despite variations depending on the ad format (image, video), you can easily see that mobile is overwhelmingly dominant. Looking at the test results conducted so far, over 80% of impressions occur on smartphones when analyzed by exposure device.
Based on this data, we can expect that customers who come through ads run on Meta will naturally arrive at your product (service) on mobile. Is your product (service) mobile-friendly? If the answer is "no," consider whether you really need Meta (Facebook) ads, and:

- For services providing high-value products or information, use traffic and awareness campaigns instead of sales campaigns, since conversions often occur on PC rather than mobile.
- For products (services) that receive many inquiries, actively use the 'Call' feature in your ads.
- For e-commerce services, set up the landing page so that purchases can be completed immediately on mobile.
- Use separate landing pages optimized for mobile UX and UI.

'Optimization' should be conducted based on the ads.

How long do you spend on optimization when running Meta ads? Do you keep optimizing until satisfactory figures come out? Or do you skip it because you don't know how to optimize? If you don't have clear criteria, try optimizing based on the ads themselves.

| Campaign | Campaign CTR (Total) | Collection CTR (Total) | Collection CPC (Total) | Best Ad Creative |
| --- | --- | --- | --- | --- |
| Test A Campaign | 1.44% | 2.65% | RM 0.99 | Collection |
| Test B Campaign | 2.24% | 2.72% | RM 0.82 | Collection |
| Test C Campaign | 2.23% | 2.55% | RM 0.86 | Collection |

The optimization process involved changing campaign and ad set settings, text, age targeting, and so on, and ultimately the 'Test B Campaign' and 'Test C Campaign' achieved better results than the 'Test A Campaign'. However, the setting changes themselves produced no significant movement in the ads' performance metrics. From these results, we found that performance gains in a specific campaign or ad set are driven largely by changes to the ads (creatives), while subsequent setting changes have far less influence.
Based on this, if you set the goal of your optimization work to 'the best ad's metrics', you can operate against concrete, measurable numbers rather than vague expectations.

| Setting | CTR (Total) | CPC (Total) |
| --- | --- | --- |
| Test D Campaign (Advantage+ Placement), single collection ad | 2.44% | RM 1.24 |
| Test D Campaign (Manual Placement), single collection ad | 1.44% | RM 2.00 |
| Test E Campaign (Advantage+ Placement), single image ad | 0.50% | RM 1.70 |
| Test E Campaign (Manual Placement), single image ad | 0.47% | RM 1.62 |

However, as the data above shows, performance can differ depending on the 'Advantage' features in the ad set and the number of ads. It is therefore worth analyzing the performance differences caused by these features before optimizing.

Operate ads based on actual advertising experience and numbers, rather than the 'confidence' that comes from A/B test results.

We ran this test three times and saw high confidence levels of 86% (Test A), 86% (Test B), and 87% (Test C). However, 'Test B' and 'Test C', which had identical settings, produced opposite results. There could be various reasons for this, but surprisingly, the overall ad metrics came out very similar: the differences in 'cost per result', 'winning ad CTR', and 'losing ad CTR' were 0.01 RM (about 3 KRW), 0.06%, and 1.93% respectively, and in the case judged not to be the winner the figures were identical. As these A/B test results show, you need to analyze the numbers in Ads Manager alongside the A/B test data.

If you don't know the best ad creative and format in Meta Business Manager, try setting up 2 ad sets with 3 to 6 ads each.

When running Meta (Facebook) ads, each ad set can contain anywhere from 1 ad to a maximum of 50 ads, but Meta (Facebook) recommends setting 6 or fewer.
The reason is that with too many ads, Meta (Facebook)'s performance-optimization 'machine learning' spreads delivery thin: each individual ad is shown less often, and more budget is spent on the optimization itself. Based on this, we set up two ad sets, one with 4 ads and one with a single ad, to give machine learning a wider choice and run the most efficient test. (If you want to run different ads by language and country, it is better for performance to combine ad sets and use the multi-language features.)

| Test | Impressions | Results | CTR (Total) |
| --- | --- | --- | --- |
| Test A, 4-ad set | 33,553 | 658 | 1.95% |
| Test A, 1-ad set | 20,473 | 97 | 0.60% |
| Test B, 4-ad set | 4,743 | 119 | 2.49% |
| Test B, 1-ad set | 3,727 | 74 | 1.93% |
| Test C, 4-ad set | 4,821 | 93 | 1.93% |
| Test C, 1-ad set | 4,634 | 120 | 2.55% |
| Test D, 1-ad set (Advantage+ Placement) | 1,882 | 46 | 2.44% |
| Test D, 1-ad set (Manual Placement) | 2,013 | 30 | 1.49% |

In the results, 'Test A', run before ad optimization, had a relatively low CTR (Total), while the optimized 'Test B', 'Test C', and 'Test D' showed no big difference in CTR (Total) regardless of the number of ads. However, the CPC in 'Test D' was higher than in 'Test A', which ran before optimization. In particular, compared with the CPCs of RM 0.86 and RM 0.82 at similar CTR (Total) in 'Test B' and 'Test C', a difference of about 100 KRW can be seen.
| Test | Impressions | Results | CTR (Total) |
| --- | --- | --- | --- |
| Collection ad from the 4-ad set, Test A | 22,097 | 595 | 2.65% |
| Collection ad from the 4-ad set, Test B | 4,086 | 112 | 2.72% |
| Collection ad from the 1-ad set, Test B | 3,727 | 74 | 1.93% |
| Collection ad from the 4-ad set, Test C | 4,466 | 93 | 2.08% |
| Collection ad from the 1-ad set, Test C | 4,634 | 120 | 2.55% |
| Collection ad from the 1-ad set, Test D (Advantage+ Placement) | 1,882 | 46 | 2.44% |
| Collection ad from the 1-ad set, Test D (Manual Placement) | 2,013 | 30 | 1.49% |

Looking at 'Collection', the best ad in each ad set, CTR (Total) differed by only 0.2 to 0.3 percentage points; the figures were all similar. CPC (Total) was a different story. For this reason, we judged that with an optimized ad setup, performance would not differ much even with only one ad per set. Nevertheless, based on this test we recommend '2 ad sets' with '3 or more ads' each, because machine learning can still produce differences in CPC (Total), and this appears to be the best way to reach maximum performance. Be aware, though, that too many ad sets can cause auction overlap, which can slightly hurt performance.

Did you know there are different types of CTR?

You may have seen high click counts and good results when running ads while the number of people who actually buy or visit the website stays unusually small. There can be various reasons, but it commonly happens when one person clicks multiple times. 'Unique CTR' is one way to minimize this gap.

The 6 types of CTR distinguished by Meta (Facebook):

- CTR (Total): the ratio of clicks to total impressions.
- CTR (Link Click-Through Rate): the ratio of link clicks to total impressions.
- Unique CTR (Link Click-Through Rate), estimate: the ratio of unique people who clicked a link among those who saw the ad.
- Outbound CTR (Click-Through Rate): the ratio of outbound clicks to total impressions.
- Unique Outbound CTR (Click-Through Rate), estimate: the ratio of unique people who made an outbound click among those who saw the ad.
- Unique CTR (Total), estimate: the ratio of unique people who clicked among those who saw the ad.

You can check these 6 types of CTR in the 'Performance and Clicks' column preset of Ads Manager.

Are you monitoring ads in real time?

You may be watching your ads in real time right after launch. Note, however, that the initial data may not be accurate because the ads are still in the learning phase. It is better to wait at least 24 hours after launch before judging ad performance. Of course, if there is a critical operational issue, such as the ads not being delivered at all or the cost per result running far too high, check the ads immediately.

Consider using ad formats (designs) that can showcase various designs and products at once.

In this test, we used the same image across video, image, carousel (slide), and collection ads. Interestingly, all three tests showed significantly higher performance for the 'Collection' format. Results will of course vary by service (product), industry, and environment, but it seems worth running ads in designs and formats that can showcase multiple designs and products at once.

Image CPC can also be the most expensive!

Many people assume that CPC is lowest for image ads. The correct answer, however, is "it depends on the results." In our actual ads, the image CPC was relatively high at RM 1.34, while the collection and video CPCs were cheaper at RM 0.99 and RM 0.54 respectively. If you are planning ad creatives with CPC in mind, you will find that what matters most is not the content format but creating the most optimized ad.
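Putting numbers to the CTR distinctions made earlier: the 'Total' metrics divide clicks by impressions, while the 'Unique (estimate)' metrics divide distinct clicking people by distinct viewers. A minimal sketch (the parameter names are illustrative, not Ads Manager column names):

```python
def ctr_report(impressions, reach, clicks_all, link_clicks,
               outbound_clicks, unique_clickers):
    """Sketch of Meta's CTR variants as plain ratios.

    'Total'-style metrics divide click counts by impressions;
    'Unique (estimate)' metrics divide the number of distinct
    people who clicked by reach (distinct people who saw the ad),
    so repeat clicks from one person no longer inflate the rate.
    """
    return {
        "CTR (Total)": clicks_all / impressions,
        "CTR (Link)": link_clicks / impressions,
        "Outbound CTR": outbound_clicks / impressions,
        "Unique CTR (Total), est.": unique_clickers / reach,
    }

# One enthusiastic person clicking 5 times out of 100 viewers
# (120 impressions, since some people saw the ad more than once):
r = ctr_report(impressions=120, reach=100, clicks_all=5,
               link_clicks=5, outbound_clicks=5, unique_clickers=1)
print(f"{r['CTR (Total)']:.1%}")               # 4.2%
print(f"{r['Unique CTR (Total), est.']:.1%}")  # 1.0%
```

This is exactly the gap described above: CTR (Total) looks healthy at 4.2%, but Unique CTR reveals that only 1 in 100 people actually clicked.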
The best way to improve ad performance metrics is to turn off at the 'ad set' level rather than turning off individual ads. (Turning off individual ads is still a good method for operational management.)

If you currently improve performance metrics by adding specific ads or switching them on and off, try switching at the ad set level instead. Meta Ads runs machine learning per ad set: during the learning process, it finds the best-performing ad within the set and concentrates delivery on it. Deliberately turning specific ads off, or adding ads, during this process can incur additional opportunity cost and may even switch off an already-optimized ad.

For these reasons, the most reliable way to improve ad performance is to manage at the ad set level: turn off only clearly underperforming ad sets (low performance, limited machine learning) and create new ad sets for new content. If you need to set up or continuously operate many ad sets, 'Advantage Campaign Budget' makes operation easier, although A/B testing then becomes unavailable.

For ad placement, use 'Advantage+ Placement' rather than 'Manual Placement.'

Through three tests, we identified the most popular placements and then tested 'Manual Placement' against 'Advantage+ Placement.' Based on previous ad results, we set up the tests using the best-performing 'Collection' ad and the worst-performing 'Image' content.
| Test Name | Impressions | Results | CTR (Total) | CPC (Total) |
| --- | --- | --- | --- | --- |
| Test D (Collection): Advantage+ Placement | 1,882 | 46 | 2.44% | RM 1.24 |
| Test D (Collection): Manual Placement | 2,013 | 30 | 1.49% | RM 1.93 |
| Test E (Image): Advantage+ Placement | 7,972 | 31 | 0.39% | RM 2.19 |
| Test E (Image): Manual Placement | 8,536 | 27 | 0.32% | RM 2.40 |

In conclusion, 'Advantage+ Placement' delivered the best performance. Continuous testing is still needed per campaign, industry, and service, but in this test 'Advantage+ Placement' brought higher performance for traffic campaigns: CTR (Total) was up to about 1 percentage point higher, and CPC (Total) was up to about 200 KRW lower. While it is ideal to find meaningful placements through actual user-behavior analysis and target them directly, if your goal is simply to increase traffic or to get through the optimization process, try operating as described above.

📖 References

- Campaign, ad set, and ad limits per ad account
- Ad volume management information
- Machine learning phase information
- Machine learning phase guide
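As a closing sanity check, the headline deltas quoted in the placement section ("up to 1 percentage point higher CTR", "about 200 KRW lower CPC") can be reproduced from the Test D (Collection) row. The only assumption is the exchange rate the article itself implies (0.01 RM ≈ 3 KRW, i.e. 1 RM ≈ 300 KRW):

```python
# Test D (Collection) figures from the placement comparison table.
adv = {"impressions": 1882, "results": 46, "cpc_rm": 1.24}  # Advantage+ Placement
man = {"impressions": 2013, "results": 30, "cpc_rm": 1.93}  # Manual Placement

ctr_adv = adv["results"] / adv["impressions"]   # ≈ 2.44%
ctr_man = man["results"] / man["impressions"]   # ≈ 1.49%
ctr_gap_pp = (ctr_adv - ctr_man) * 100          # gap in percentage points

cpc_gap_rm = man["cpc_rm"] - adv["cpc_rm"]      # RM 0.69
cpc_gap_krw = cpc_gap_rm * 300                  # assumed rate: 1 RM ≈ 300 KRW

print(f"CTR gap: {ctr_gap_pp:.2f} pp")    # ~0.95 pp ("up to 1% higher")
print(f"CPC gap: {cpc_gap_krw:.0f} KRW")  # ~207 KRW ("about 200 KRW lower")
```

Note that 'Results' divided by 'Impressions' lands exactly on the reported CTRs (2.44% and 1.49%), which confirms that CTR (Total) in these tables is simply clicks over impressions.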