STAQ Handles Google's DCM Issues In Stride

Publishers were deeply affected when Google's DoubleClick Campaign Manager experienced API connection failures in late February. While Google acknowledged the problem and worked toward a fix, STAQ patched the issue and kept data flowing until Google's permanent fix arrived.

Because STAQ handles data collection for many publishers dependent on DCM, we were able to diagnose the problem and provide a temporary solution while Google fixed it for the long term.


Team STAQ Raises Over $6,000 At Bowling For BreastCancer.Org

At the 10th-anniversary event, team STAQ raised over $6,000 for Bowling For BreastCancer.Org. We bowled our hearts out and had a blast. Thank you to our teammates and everyone who contributed to this amazing event. Bowling For BreastCancer.Org has raised over $4 million to date for the world's leading online resource for breast health and breast cancer information and support.


Q4 Top Of The STAQ Winner | Feifan Chen at Daily Mail

Our quarterly Top of the STAQ award is given to those who have shown extraordinary skills and effort in driving insights for their companies that increase performance and yield by leveraging automated reporting.


Our Q4 winner is Feifan Chen - Manager, Global Commercial Data at The Daily Mail.

Feifan has leveraged STAQ to build out an incredible automated data set for The Daily Mail, spreading revenue-driving insights across his organization. Since implementing STAQ to collect and unify reporting from their direct buyers, programmatic exchanges and analytics partners, The Daily Mail has seen a 250% increase in their revenue. Reporting collection and analysis, which previously took three hours a day, has since been reduced to 10 minutes - enabling their business teams to instantly receive the insights they need.

With the success of this approach in the U.S., this top-50 comScore brand with over 200 million unique visitors has adopted the data strategy globally, across the U.K., Australia and the rest of the world. Congrats Feifan!

The State Of Programmatic CPMs - An Analysis

At AdExchanger’s Programmatic IO on October 15th in NYC, Matthew Goldstein presented an overhead analysis of the industry utilizing an anonymized data set fueled by STAQ’s Industry Benchmarks product. This compelling analysis covers the industry’s most important metrics, including the average CPMs of ad sizes, the performance of desktop versus mobile, the market share of the largest exchanges and much more!

STAQ’s Industry Benchmark product enables publishers to compare their own performance against the performance of their peers. With over 20 brand name publishers and $500m in annual revenue, this massive collection of data reveals incredible insights into the industry and can guide publishers to higher revenues!

Click Here To Download The Presentation

Top Of The STAQ Award - Q2 2018


Our quarterly Top of the STAQ award is given to those who have shown extraordinary skills and effort in driving insights for their companies by leveraging automated reporting.

This quarter's winners have not only skillfully reined in the firehose of reporting data that their companies collect on a daily basis, but have also unpacked specific insights that directly impacted their bottom-line revenue.

We're very excited for this quarter's winners and congratulate them on their hard work...



Justin Hansen, Yield Manager

Intermarkets - Reston, Virginia


Justin and the team at Intermarkets have an incredible skill set in utilizing automated reporting for media optimization. So it was no surprise to hear that Justin was able to double the CPM of one of their websites in less than a month! Intermarkets handles a large portfolio of brand websites, serving over 3 billion impressions a month across all screens.


Joy Bian, Platform Specialist

33 Across - New York, NY

Joy Bian streamlined the reporting process for the sales & operations teams at 33 Across, enabling an astonishing 3X ROI on the investment into reporting automation. 33Across delivers deep engagement and increased ROI for buyers with their Attention Platform™ product and a unique monetization solution for publishers that has been named "Best Publisher Technology" by Digiday.

Congratulations Justin and Joy!


Remember: We Are Not A Real Time Industry.

In our industry, we all base our operations and finances on what is reported in a static UI at the end of the day as the billable number, not on a real-time auction price.

Yes, there are real-time auctions, but the actual payment on those auctions changes afterwards in reporting from both the buy and sell sides, and from third parties.

Nearly ALL of the ad tech revenue that is billed is reported by the buyer to the DSP, exchange and the end publisher, based on what the buyer displays in their reporting UI or what their verification partner clears as valid. All of this happens well after the real-time auction is complete. And this reporting changes even after it is published, often many weeks afterwards.

Why and how can billable reporting change?

The IAB industry standard has given the buyer audit control over payments. This came from a need to control the performance issues and fraud that plagued our wild-west industry in the beginning. The buyer has the right to check the performance of the inventory they bought after the auction and change the publisher’s payment if they find any of the traffic they bought was:

  • Not from a human

  • Not “in view”

  • Not within the audience they contracted for

  • Not within the content (or placements) they contracted for

Determining the above takes investigations, third-party verification companies, database changes and re-posted reporting counts. All of this happens frequently in any normal day or week.

This past year, there have been calls for more transparency on both sides, with news on baked-in buyer fees, auction adjustments, and demands for 100% “in view” on human traffic only.

All of these items have been already baked into the numbers in available reporting. I’m not saying any of these practices are right... or misleading. That’s up for you to decide as a buyer or seller.

But in the end, the reporting inside partner interfaces at the end of the month provides the numbers on which we run our businesses. As long as we need to recalibrate our campaigns and adjust our inventory based on daily reporting changes, we are not real time.


James Curran

Co-Founder & CPO

Top Of The STAQ Award

Announcing The First Winners Of The "Top Of The STAQ Award"

We’re happy to announce the Top Of The STAQ Award. This award was created to recognize innovative STAQ users across our global client roster. We are privileged to work with so many people who use STAQ every day to improve their company’s insights, revenue growth and profitability.


The first round of winners were difficult to choose. All of the nominees are experts in automated data and reporting, showing effort and skill when building and maintaining their STAQ accounts, leveraging the product to the fullest extent and working to build an incredibly valuable automated data solution for their companies.

The winners have shown exceptional examples of applying the STAQ product to their organization's unique business needs and conditions by:

 - Showing clear improvements to their organization's business

 - Organizing their data for their company's unique purposes

 - Applying creative ideas at the nexus of data and media to create insights that have driven results


...And our first winners are:

Bud Johnson - HealthGrades

Walter (Bud) Johnson is an experienced data operations professional who took STAQ’s capabilities to the next level. He took advantage of STAQ’s flexible reporting UI and created custom reports to meet HealthGrades’ needs.

Through the implementation and usage of STAQ, Bud Johnson created insights that cut HealthGrades’ buffering levels in half and increased availability of inventory by 17%, by connecting 1st party ad server data with 3rd party and 4th party verification tools. Their Sales teams are now able to sell a higher volume of inventory as a result of these insights.


“Bud just knows how to use the data. He has a vision for how he wants to see his organization use the platform” - Ryan Weber, STAQ


Steve Mummey - AccuWeather

Steve Mummey has leveraged his technical knowledge and data analytics experience, as well as a passion to carry the flag, across projects at AccuWeather. His skill set includes front-end HTML, JS and CSS development and maintenance; SEO strategies and tactics; general analytics and reporting; advertising integration with DFP; and project management as team lead and liaison between external firms, consultants and internal stakeholders.

Steve created a Programmatic Forecasting Report with STAQ, which allowed him to see AccuWeather's actual revenue on a daily basis against the budgeted revenue, enabling the organization to see their pacing against forecasted revenue. Then, using historical data to compare the current month's revenue to last year's, the company is able to watch their performance year over year, on a daily basis... all with complete automation.
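At its core, a pacing report like this reduces to a simple calculation: prorate the monthly budget by day and compare it to month-to-date actuals. Here is a minimal sketch in Python; the numbers and function name are illustrative assumptions, not AccuWeather's actual report.

```python
def pacing(actual_mtd, monthly_budget, day_of_month, days_in_month):
    """Return revenue pacing as a ratio: 1.0 means exactly on pace,
    above 1.0 means ahead of budget, below 1.0 means behind."""
    expected_mtd = monthly_budget * day_of_month / days_in_month
    return round(actual_mtd / expected_mtd, 2)

# Example: $5,000 booked by day 5 of a 31-day month with a $31,000 budget
on_pace = pacing(5000.0, 31000.0, 5, 31)
```

The same ratio computed against last year's month-to-date revenue gives the year-over-year view described above.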

“Not only does Steve do an exceptional job optimizing programmatic and direct for AccuWeather, but he's not afraid to get into the weeds, and he continuously finds new ways to present the data to his organization.”  - Zach Root, STAQ

Criteria for the award are based on knowledge and ability to unify the automated data sources flowing into STAQ, time and usage of the platform, and the operational efficiencies and insights winners have brought to their organizations as a result of their work.

Congratulations Bud and Steve!

-James Curran, STAQ Co-Founder

Don't Drown While Filling Your Data Lake

Publishers are starting to ramp up their investment in data, especially as they gain more prowess in programmatic advertising. Many are throwing around the idea of creating their own data warehouse.

Data is currency. Amazon uses data to give you a more relevant retail experience. Google uses data to organize all of the information on the Internet. Facebook uses data to map your social behavior (and sometimes winds up in trouble for it).

Publishers need to determine what they want to use data for, and also if they have the resources to collect, review and manage that data well enough for an investment in a warehouse to be worthwhile. Otherwise, a data warehouse ends up more like a data lake, with quicksand at the bottom.



Focus on The Goal

Data can be currency, but not every data point is valuable enough to keep. Thinking that you should simply collect and park every data point that comes into your organization is a bad idea. Storing it will be expensive. You will have trouble squeezing insights from a huge data set. And you run a bigger security risk. It’s highly unlikely that this scenario will reap enough long-term reward to overcome the early problems.

Instead, start with a clear goal, and focus on the data points that help you reach your goal. You might have a goal to normalize pricing across your inventory based on advertiser bidding patterns. In that case, you would focus on collecting real-time bid stream data from your demand partners. Or, you might be interested in understanding the market value of various pieces of advertising inventory on your website cross-analyzed with different audience groups. So you’d need a solution that merges data from your DMP and your ad servers. Every business problem is different, and so every data warehouse should look different.
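To make the second scenario concrete, here is a minimal Python sketch of joining ad server metrics with DMP audience segments by placement and deriving a CPM per audience group. The field names and sample rows are illustrative assumptions, not any particular vendor's schema.

```python
# Hypothetical sample rows; in practice these would come from your ad
# server and DMP feeds. All field names here are assumptions.
ad_server_rows = [
    {"placement": "homepage_top", "impressions": 120000, "revenue": 540.0},
    {"placement": "article_mid", "impressions": 80000, "revenue": 200.0},
]
dmp_rows = [
    {"placement": "homepage_top", "audience": "auto_intenders"},
    {"placement": "article_mid", "audience": "sports_fans"},
]

def merge_on_placement(ad_rows, dmp_rows):
    """Join ad server metrics with DMP audience segments by placement."""
    audience_by_placement = {r["placement"]: r["audience"] for r in dmp_rows}
    merged = []
    for row in ad_rows:
        combined = dict(row)
        combined["audience"] = audience_by_placement.get(row["placement"], "unknown")
        # CPM = revenue per thousand impressions
        combined["cpm"] = round(row["revenue"] / row["impressions"] * 1000, 2)
        merged.append(combined)
    return merged

merged = merge_on_placement(ad_server_rows, dmp_rows)
```

A real warehouse would do this join in SQL at load time, but the shape of the problem is the same: a shared key (here, placement) and a focused set of columns chosen to answer your question.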

Keep Your Head Out of the Clouds, Even if Your Data Is In the Cloud

Think of a data warehouse like a real brick and mortar building that will store your stuff. You need a clean, safe, secure storage facility. You need to be able to grant access to certain people and restrict other people. Trucks need to be able to pull up and drop data off at regular times, and you need to find a place to put incoming data that’s organized and works with what’s already there.

For publishers with limited resources, these responsibilities might stretch beyond the reasonable limits of their organization. Don’t let developer hubris get in the way of a prudent decision. You probably do not need your own servers or your own room in a custom data center. Amazon, Microsoft Azure or Google Cloud will likely end up being the best partner because they offer a relatively full service, and that’s OK. The most important points to cover are that your data is secure, organized and accessible, and can accommodate the influx of new data without becoming unmanageable.



Log-level data deserves special mention:

  • Try to get a sample first if you can, so you can answer the questions below. This is hardest if you are collecting across multiple publishers, and still painful for a single publisher, because the data is keyed on custom content key-value pairs or audience key-values.

  • Be aware that the volume of log-level data is exponentially greater than the rolled-up data most digital publishers and marketers are collecting and analyzing on their own.

  • A roll-up strategy is needed to handle it. Before you collect this data, decide what you plan on looking for and getting out of it:

    • Do you need to see bids by page or content?

    • Do you need to see clearing price by advertiser or exchange?

    • Do you need to see certain fluctuations by time (meaning holidays and day of week)?

Only when you roll it up can you actually use it. If you can't answer questions with it, be careful about collecting this data until you have at least figured out your top three questions.

And can you act on the data once you have it? Do you raise floors during winter if your content gets higher bids in sick season? Do you change guaranteed direct advertiser CPMs before that season?
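As a concrete illustration of the roll-up questions above, here is a minimal Python sketch that aggregates hypothetical log-level bid records up to whichever dimensions you choose. The records, field names and values are assumptions for illustration only.

```python
from collections import defaultdict

# Hypothetical log-level bid records; real logs would have millions of
# rows per day and many more fields.
bid_log = [
    {"page": "/news", "advertiser": "acme", "day": "2018-07-01", "clearing_price": 2.10},
    {"page": "/news", "advertiser": "acme", "day": "2018-07-01", "clearing_price": 2.30},
    {"page": "/sports", "advertiser": "zenith", "day": "2018-07-01", "clearing_price": 1.50},
]

def roll_up(records, keys):
    """Roll log-level rows up to the chosen dimensions,
    returning the average clearing price per group."""
    totals = defaultdict(lambda: [0.0, 0])  # group -> [price_sum, count]
    for rec in records:
        group = tuple(rec[k] for k in keys)
        totals[group][0] += rec["clearing_price"]
        totals[group][1] += 1
    return {group: round(s / n, 2) for group, (s, n) in totals.items()}

# Average clearing price by page (the "bids by page" question):
by_page = roll_up(bid_log, ["page"])
# The same function answers "by advertiser" or "by day" by changing keys.
```

The point is that the roll-up dimensions come from your top questions; collecting the raw log without knowing which groupings you need leaves you with storage costs and no answers.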

The Opposite of Set It and Forget It

Speaking of unmanageable, data has a habit of spinning out of control, and you’ll need a lot more than an organized warehouse to keep it in shape. Taking the programmatic example, every day, a typical publisher pulls data in from 10, 20 or 30 different data sources and every day, there are errors in that data. You’ll need to have the resources to address errors within millions or billions of data points before you simply back up the truck and dump the info into your warehouse. On top of that, APIs stop working, field names change, partners change their policies, and you need to be on top of every minute change or you fall victim to the “garbage in, garbage out” problem. At that point, your entire warehouse is compromised.
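Catching these errors before the data lands is essentially a schema-validation problem. Here is a minimal sketch with an assumed, illustrative schema; real pipelines would also check value ranges, dates and duplicates.

```python
# Illustrative expected schema for an incoming reporting row; the field
# names and types are assumptions, not any partner's actual API contract.
EXPECTED_FIELDS = {"date": str, "partner": str, "impressions": int, "revenue": float}

def validate_row(row):
    """Return a list of problems; an empty list means the row is loadable.
    Guards against the field renames and type drift described above."""
    problems = []
    for field, expected_type in EXPECTED_FIELDS.items():
        if field not in row:
            problems.append(f"missing field: {field}")
        elif not isinstance(row[field], expected_type):
            problems.append(f"bad type for {field}: {type(row[field]).__name__}")
    return problems
```

Rejecting (or quarantining) rows that fail checks like this is what keeps the "garbage in" from ever reaching the warehouse.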

This is where the warehouse analogy really matters. There is no brick-and-mortar warehouse that sits unattended where trucks simply back up and dump merchandise. There are people managing which door the trucks come to, people driving the forklifts, people checking, recording and cataloguing each delivery, and janitors keeping it clean. These labor costs are well understood in the world of physical storage, but are often dangerously neglected in the world of data.

I know of one publisher that put all warehouse management responsibility on a single person. When that person left the company, their data warehouse did sit unattended as tons of data piled up. Their storage costs and risk piled up, too. It was several months before the finance department noticed the mounting costs and figured out where they were coming from.

The moral of the story is that collecting and storing data is complicated. It requires a plan and goals, management and oversight. Otherwise, all your valuable insights will be sucked into the quicksand at the bottom of the lake.