An advertising industry insider claims that some companies which provide advertisements for web-pages deliberately slow down the loading time of pages in order to drive up the price of the paid ad.

Speaking to Business Insider, the unnamed source – whose placement and credentials in the industry were verified via LinkedIn – claims that the 100ms time-out for competing services to bid for the available space can be set significantly higher in order to accommodate ‘late-comers’ in the automated bidding process.
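
To make the mechanism concrete, the sketch below is a generic, illustrative model of a timed bidding round – not any particular vendor's API, and with invented bidder names, prices and response times – showing how a higher timeout admits slower (and possibly higher-paying) bidders at the direct cost of page latency.

```typescript
// Illustrative sketch of a timed ad auction (not any specific vendor's API).
// Bidders respond asynchronously; only bids that arrive before the timeout
// are considered. Raising `timeoutMs` lets slower bidders into the auction,
// but the page waits correspondingly longer for the winning ad.

interface Bid {
  bidder: string;
  cpm: number; // bid price, cost per thousand impressions
}

type Bidder = () => Promise<Bid>;

// Resolve to null if the bidder does not answer within the timeout.
function withTimeout(bidder: Bidder, timeoutMs: number): Promise<Bid | null> {
  return Promise.race([
    bidder(),
    new Promise<null>((resolve) => setTimeout(() => resolve(null), timeoutMs)),
  ]);
}

async function runAuction(bidders: Bidder[], timeoutMs: number): Promise<Bid | null> {
  const responses = await Promise.all(bidders.map((b) => withTimeout(b, timeoutMs)));
  const valid = responses.filter((b): b is Bid => b !== null);
  // Highest CPM wins; the ad slot (and the page) waits for the whole round.
  return valid.reduce<Bid | null>((best, b) => (best && best.cpm >= b.cpm ? best : b), null);
}

// Hypothetical bidders: one fast and cheap, one slow and lucrative.
const fastBidder: Bidder = () =>
  new Promise((r) => setTimeout(() => r({ bidder: "fast", cpm: 1.2 }), 40));
const slowBidder: Bidder = () =>
  new Promise((r) => setTimeout(() => r({ bidder: "slow", cpm: 2.5 }), 400));

runAuction([fastBidder, slowBidder], 100).then((w) => console.log("100ms cap:", w));
runAuction([fastBidder, slowBidder], 500).then((w) => console.log("500ms cap:", w));
```

Under the 100ms cap the slow bidder is simply excluded; under a 500ms cap its higher bid wins, and the reader absorbs the extra delay.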

A large number of publishers employ the process of ‘daisy-chaining’, wherein the most profitable or preferred ad-publisher is polled first for potential placement, with the chain only turning to less profitable networks if the prime candidate defaults or has no base ad to fill the slot. The time needed to work down the chain can eat into the putative 100ms auction cap, and the suggestion from Business Insider’s informant is that this limit is disregarded in favour of a longer search for a profitable placement among the B and C lists.
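
As a rough illustration of how such a chain consumes the budget, the sketch below uses a hypothetical fillSlot helper with invented network names and timings; each unsuccessful hop down the chain shortens the time left for the rest, which is precisely the pressure said to push the overall limit upwards.

```typescript
// Illustrative 'daisy-chain' (waterfall) fill, with hypothetical networks.
// Networks are tried in order of preference; time spent on each failed
// attempt is deducted from the overall budget for the slot.

interface AdNetwork {
  name: string;
  // Resolves to ad markup, or null if the network declines to fill the slot.
  fetchAd: () => Promise<string | null>;
}

async function fillSlot(chain: AdNetwork[], budgetMs: number): Promise<string | null> {
  const start = Date.now();
  for (const network of chain) {
    const remaining = budgetMs - (Date.now() - start);
    if (remaining <= 0) break; // budget exhausted part-way down the chain
    const ad = await Promise.race([
      network.fetchAd(),
      new Promise<null>((resolve) => setTimeout(() => resolve(null), remaining)),
    ]);
    if (ad !== null) return ad; // first network willing to fill wins
  }
  return null; // chain exhausted or budget blown: fall back to a house ad
}
```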

The source identifies himself as a long-standing Principal Engineer and Solution Architect at a global news company fielding 14 million impressions a month. He commented:

“Part of the problem is that behind the scenes, [the company] actually runs a real-time automated auction for ad slots in which ad buyers (or their software really) bid for the slot. The best bid wins and the slot is placed. Naturally, there is a timeout (and not a massively conservative one) on this auction, so slow-responding bidders hold everything up, and therefore hold up the ad slot, and therefore the page. All this happens as soon as you request an ad and is designed to keep ad prices up…My entire team of devs and testers mostly used Adblock when developing sites, just because it was so painful otherwise!”

There’s no suggestion that any of the submissions, prompt or tardy, are anything other than automated participants assessing various criteria to determine the maximum amount the client company is willing to spend to win a slot on a particular page. But slow servers, or ‘deeper’ automated research into the viability of the host-page, could add extra ‘investigative’ processing and cause lag in the bidding procedure.

Anecdotally, most of us have the acuity to notice when a web-page’s ads are delivered before everything else over a slow connection, where such connections are still found. In the age before broadband, when watching a page ‘build’ up over a 56k modem was a common experience, the content frequently seemed to be the last thing to arrive. The negative impact of such ‘prioritising’ on viewing figures later led to techniques such as preloading, wherein the entire web-page would be downloaded to the viewer’s computer before any of it was shown. Such practices meant that a laggard ad-server, or any other element which failed to load at all, could completely prevent the page from rendering.

In these days of asynchronous loading, these procedures are rarely seen, and stand out as counterproductive to the publishing company’s intent when they are.
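
For contrast, a minimal sketch of the asynchronous approach follows, assuming a hypothetical ad-server URL and slot id; the point is simply that a tardy or failed ad response degrades only its own slot rather than holding up the rest of the page.

```typescript
// Minimal sketch of asynchronous ad loading in the browser (generic, not a
// specific ad network's tag). The script is injected with `async`, so a slow
// ad server delays only the ad slot, not the parsing or rendering of the page.

function loadAdScript(src: string, slotId: string): void {
  const script = document.createElement("script");
  script.src = src;
  script.async = true; // do not block HTML parsing while the ad server responds
  script.onerror = () => {
    // A failed or laggard ad server leaves the slot empty instead of
    // holding up the rest of the page.
    const slot = document.getElementById(slotId);
    if (slot) slot.textContent = "";
  };
  document.head.appendChild(script);
}

// Hypothetical ad server URL and slot id, for illustration only.
loadAdScript("https://ads.example.com/tag.js", "sidebar-ad-slot");
```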

The article’s researchers sought validation for the informant’s theory, but responses were divided between outright dismissal and agreement that the practice of delaying page-loads to maximise ad revenue is a current one in the industry.
