The beginning of the year is a time for forecasts: predicting trends, summarizing the previous year, and drawing conclusions for the future. By April, however, this can already feel tedious. Moreover, few opinions look beyond 2017. So what does digital advertising face beyond the next 12 months? Have a look!
Before we discuss the details, a few remarks. Forecasting what will happen in an industry that evolves as quickly as online advertising is a risky activity. Nobody knows what technologies will emerge in the next few months. That is why it is worth focusing on the macro trends we have been observing for some time now.
Ongoing trend: the battle with latency
This may seem like nothing new, but it is still true. Latency, meaning the time a website or advertisement needs to load, will remain a focus for publishers and marketers for a long time. Even though the computing power of the devices we use, including mobile devices, keeps increasing (remember the famous Moore's law? Things may no longer be exactly as he described 50 years ago, but there is still some truth in it), the size of advertisements and the technical requirements for creatives keep growing too.
Maybe after a while we will hit a wall in the computing power required to display ads. There is growing talk of the invasiveness of rich media advertising, while native ads, on the other hand, generate very good results. Our cooperation with publishers in Asia, Europe and South America has shown that many of them are ready to run native campaigns, both technologically and mentally. I am not sure, however, that advertisers are equally prepared.
Moreover, the programmatic techniques in use affect ad load times. Many publishers still use the waterfall, which may itself be composed of ad exchanges that run their own SSP waterfalls, so sending requests to all of them takes a long time. This is one of the reasons behind the growing popularity of header bidding (in short, its advantage is that it sends requests to multiple SSPs at the same time). A further simplification is header bidding handled at the server level, and it looks set to be a hit in 2017; it will also shorten website and ad load times. Google is developing a similar solution based on server-to-server communication.
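The latency difference between the two approaches can be sketched in a few lines. This is a toy simulation, not a real ad stack: the partner response times are invented, and each "SSP call" is just a timed sleep. It only illustrates why asking partners sequentially (waterfall) costs the sum of their latencies, while asking them in parallel (header bidding) costs only the slowest one.

```python
import asyncio
import time

# Hypothetical SSP response times in milliseconds (invented for illustration).
SSP_LATENCIES_MS = [120, 200, 150, 180]

async def call_ssp(latency_ms: int) -> int:
    """Simulate one bid request that answers after `latency_ms`."""
    await asyncio.sleep(latency_ms / 1000)
    return latency_ms  # stand-in for a returned bid

async def waterfall_ms() -> float:
    """Waterfall: each partner is asked only after the previous one answers."""
    start = time.perf_counter()
    for ms in SSP_LATENCIES_MS:
        await call_ssp(ms)
    return (time.perf_counter() - start) * 1000

async def header_bidding_ms() -> float:
    """Header bidding: all partners are asked at once; we wait for the slowest."""
    start = time.perf_counter()
    await asyncio.gather(*(call_ssp(ms) for ms in SSP_LATENCIES_MS))
    return (time.perf_counter() - start) * 1000

sequential = asyncio.run(waterfall_ms())    # roughly the SUM of latencies (~650 ms)
parallel = asyncio.run(header_bidding_ms()) # roughly the SLOWEST partner (~200 ms)
```

With four partners the waterfall waits roughly 650 ms while the parallel auction waits roughly 200 ms, and the gap widens with every partner added to the chain.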
Nevertheless, it seems likely that once we have optimized website and ad load times, a breakthrough technology will appear that itself needs latency optimization. And the story will start all over again.
The trend for the near future: simplifying code implementation
Until that technology appears, header bidding will stay with us for a while. Alongside its many advantages, it has one basic drawback: it is not at all easy to implement on publishers' websites. It requires programming work on the part of the website owner. It often happens that web developers:
- are external service providers or employees of an external company, so every hour of their work is a significant expense for the publisher;
- as employees of large media groups, have more requests from various departments than time to handle them, which means you wait in a long queue before a programmer is available, and once your turn comes, the work is done in a rush;
- do not know the specifics of the ad tech environment, because they work on many different projects;
- fit more than one of the descriptions above.
Unfortunately, this means that code implementations are not always correct. As a result, the people responsible for optimizing ad placements cannot use the full potential of header bidding.
This is where I look again, with hope, at server-to-server header bidding. With this solution, individual publishers should no longer face the slow task of embedding the right scripts.
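The idea can be sketched as follows. In a server-to-server setup the browser makes a single call to the ad server, and it is the server that fans out to the demand partners, enforces a timeout, and runs the auction. Everything below is hypothetical: the partner names, latencies, and bid values are invented, and the network calls are simulated with sleeps; a real implementation would make HTTP requests to actual SSP endpoints.

```python
import asyncio

# Hypothetical server-side partners: name -> simulated response time (seconds).
SSPS = {"ssp_a": 0.12, "ssp_b": 0.20, "ssp_c": 0.45}

async def fetch_bid(name: str, latency_s: float, timeout_s: float = 0.30):
    """One server-side bid request. Partners slower than the timeout are
    dropped, so a single slow SSP never blocks the page."""
    try:
        await asyncio.wait_for(asyncio.sleep(latency_s), timeout_s)
        return name, round(1.0 + latency_s, 2)  # stand-in CPM value
    except asyncio.TimeoutError:
        return name, None

async def auction():
    """The browser makes ONE call to the ad server; the fan-out happens here."""
    results = await asyncio.gather(
        *(fetch_bid(name, lat) for name, lat in SSPS.items())
    )
    bids = [(name, cpm) for name, cpm in results if cpm is not None]
    return max(bids, key=lambda pair: pair[1])

winner = asyncio.run(auction())  # ssp_c exceeds the timeout and is dropped
```

The publisher's page only needs one lightweight tag pointing at the ad server, which is exactly why this model reduces the implementation burden described above.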
It is also worth adding that new ad tech solutions are now designed with the publisher's or advertiser's actual ability to implement them in mind. The complexity of implementation plays an important role in the latest technologies, even if we are not aware of it.
The trend that we are still facing: website creative
Publishers optimize the size and placement of ad formats on their side, and use A/B tests to check how users react to different content layouts. Everything is done so that the layout matches visitors' preferences. Recently, programmatic creative has become popular: the use of automated processes to create and display ads. What if we applied the same logic to websites themselves? I think this will become possible in time. We already have platforms that use artificial intelligence to design websites: Grid and Wix. However, both are aimed at individuals or small businesses, and each offers a single "best" content layout.
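The A/B testing mentioned above usually rests on one simple mechanism: each visitor is assigned deterministically to a layout variant, so the same person always sees the same version. A minimal sketch, with an invented `user_id` format and variant names:

```python
import hashlib

def ab_bucket(user_id: str, variants: list[str]) -> str:
    """Deterministically assign a visitor to a layout variant by hashing
    their id, so repeat visits always land in the same bucket."""
    digest = hashlib.sha256(user_id.encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

# The same visitor always gets the same layout across sessions.
layout = ab_bucket("visitor-42", ["layout_a", "layout_b"])
```

Comparing click-through or engagement between the buckets then tells the publisher which layout performs better.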
The bottom line is that there is no such thing as a permanently best layout. The optimal arrangement of components depends on the user and also changes over time. Since we are already experimenting with this in programmatic creative, why can't these experiments become more serious?
The easiest way to build such an intelligent website is to use interchangeable modules, or to design several or a dozen similar layouts. A particular layout would then be selected on the basis of the user's browsing history, their interactions with sites, the quality of those interactions, preferences, interests, and so on. Analyzing so many signals would take time, which brings us back to point one: battling latency. And here the story starts again.
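One common way to pick among a handful of prepared layouts while continuing to learn which works best is an epsilon-greedy bandit. The sketch below is illustrative only: the layout names and click statistics are invented, and a production system would fold in the per-user signals the paragraph above describes rather than a single global score.

```python
import random

# Hypothetical per-layout statistics gathered so far: [clicks, impressions].
stats = {
    "layout_a": [120, 3000],
    "layout_b": [95, 3000],
    "layout_c": [140, 3000],
}

def choose_layout(epsilon: float = 0.1) -> str:
    """Epsilon-greedy selection: usually serve the best-performing layout,
    but occasionally explore the others so the estimates keep improving."""
    if random.random() < epsilon:
        return random.choice(list(stats))
    return max(stats, key=lambda name: stats[name][0] / stats[name][1])

def record_impression(layout: str, clicked: bool) -> None:
    """Update the statistics after the visitor has (or has not) clicked."""
    stats[layout][0] += int(clicked)
    stats[layout][1] += 1
```

The selection itself is cheap; the expensive part, as the paragraph above notes, is computing the user signals that should feed into it, which is where latency returns to the picture.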