• How Accurate are Google AdWords Search Volumes?

    In one of our recent blog posts, we published an ROI calculator that helps you figure out whether you’re getting enough value from what you spend on SEO by targeting the right keywords. The results were based on an extremely conservative “fudge factor,” or estimated margin of error, of about 90%, meaning we used only 10% of the reported search volumes in our calculations. However, the numbers weren’t adding up, so we ran an experiment using several websites. The results showed some major discrepancies in the search volume numbers returned by Google AdWords.

    Currently, Music.com ranks #2 for the keyword “music” on Google. According to Google AdWords, this should give them a high percentage of the estimated 124,000,000 global searches that Google says occur monthly. Based on research by Chitika Research Director Daniel Ruby, a #2 ranking for a keyword earns roughly 17% of all searches (we use 16.96% in the ROI calculator), so Music.com should be getting 21,080,000 visits per month from the keyword “music” alone. This may seem like an attainable goal for a keyword as common as “music,” but based on traffic estimates from Compete.com, the actual number of hits Music.com gets from this keyword is nowhere near 21 million.
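The projection above is simple multiplication; a short sketch makes the arithmetic explicit (the function name is ours, and the 17% share for position #2 is the Chitika figure cited in the text):

```python
def projected_visits(search_volume, rank_share):
    """Monthly visits projected from AdWords' reported search volume
    and the estimated share of searches a ranking position captures."""
    return search_volume * rank_share

# "music": 124,000,000 global searches/month, #2 ranking ~17% share
visits = projected_visits(124_000_000, 0.17)
print(f"{visits:,.0f}")  # 21,080,000
```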
    There appear to be three logical explanations for this discrepancy. First, the percentage of a given keyword's searches you can expect to capture is unrealistically high. Second, the Google AdWords tool is overestimating monthly global searches. Third, both could be true. Our initial hypothesis was that the second option was the most likely, given the large amount of research Chitika has done on the traffic share of Google’s search result positions. Our experiment’s procedure looked like this:

    Step 1: We identified 4,000 sample keywords from 7 websites for which we have Google Analytics data (Peer1.com, Serverbeach.com, Voxeo.com, Music.com, Brandstack.com, Unbounce.com, and Missionrs.com).
    Step 2: We then found the Google rankings for all 4,000 keywords.
    Step 3: We then sorted the 4,000 keywords from best ranking to worst and narrowed the list down to the words that ranked in the top 10 results (roughly 1,000).
    Step 4: For the remaining words, we gathered the individual search volumes (broad match, global).
    Step 5: Next, we went to each site’s respective Google Analytics account and pulled the actual traffic for each keyword.
    Step 6: For each keyword, we compared the actual visits to the number Chitika’s calculations project it should receive. The percentage difference between the two numbers is what we call a “fudge factor”: the percentage by which Google AdWords inflates the global search volume for a given keyword.
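The Step 6 comparison can be sketched as follows (the function name and the example numbers are illustrative, not taken from the study; the 16.96% share for a #2 ranking is the Chitika figure used in the article):

```python
def fudge_factor(actual_visits, adwords_volume, rank_share):
    """Percentage by which AdWords appears to inflate search volume,
    inferred from actual analytics traffic vs. projected traffic."""
    projected = adwords_volume * rank_share
    return (1 - actual_visits / projected) * 100

# e.g. a keyword reported at 10,000 searches/month, ranked #2
# (~16.96% share), that actually delivered 980 visits:
print(round(fudge_factor(980, 10_000, 0.1696), 2))  # ~42.22
```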

    The results of the experiment told us two things: first, that Google AdWords was significantly overestimating search volumes, and second, that this overstatement tended to rise exponentially with the popularity of a keyword. Looking at the necessary deflation of Google AdWords’ search volumes for the top ten rankings on Google, we found that, on average, AdWords is overstating global search volume by 42.29%. This fudge factor is an important part of our ROI calculator and of determining realistic traffic figures. So if, for example, you were to target the keyword “cheap dedicated hosting”, it would be wise to apply this 42.29% fudge factor to Google AdWords’ estimate of 8,100 monthly global searches.

    So ranking #1 for “cheap dedicated hosting” would earn a website 1,606 visits, not 2,782, once this 42.29% fudge factor is applied. This is demonstrated in the graphic below.
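Worked out in code, the deflation looks like this (a sketch; the 34.35% share for a #1 ranking is the Chitika figure implied by the 2,782 number above):

```python
ADWORDS_VOLUME = 8_100  # AdWords estimate for "cheap dedicated hosting"
RANK1_SHARE = 0.3435    # Chitika's share of searches for a #1 ranking
FUDGE_FACTOR = 0.4229   # average AdWords overstatement found above

naive = ADWORDS_VOLUME * RANK1_SHARE       # projection straight from AdWords
deflated = naive * (1 - FUDGE_FACTOR)      # projection after the correction
print(round(naive))     # 2782 visits without the correction
print(round(deflated))  # 1606 visits with it
```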

    The graph below illustrates our second finding: Google AdWords exponentially overstates search volume as the search frequency of a keyword increases. The Y-axis shows the true market share of a given keyword captured by a website ranking #1 for that keyword, and the X-axis shows Google AdWords’ estimated volume. As you can see, once the search volume moves past 3,000, it becomes extremely difficult to obtain any significant share of the searches. The market share is certainly nowhere near the 34.35% warranted by ranking #1. The data for the second through tenth positions tell a similar story. It seems the only way to obtain a decent percentage of hits for a keyword is to target and rank for moderate-volume keywords. So while a fudge factor of 36.8% is a good average to use, it will typically need to be increased for higher-volume keywords and decreased for lower-volume keywords. You can see from the graph below that, on smaller keywords, the market share is as high as 160%. Obviously, this isn’t accurate either, so while a fudge factor is an important part of calculating traffic, it is far from an exact tool.

    The take-away from this experiment is to exercise caution when targeting a specific rank and market share for keywords if you are relying on Google AdWords’ estimation of search volume. While it is not an exact figure, we recommend using 42.29% as your fudge factor. This number becomes more subjective when dealing with extremely high or low volume estimates. For very large keywords, with volumes of 15,000 or greater, an accurate fudge factor would be about 90%, while for smaller keywords, under 1,500 searches, AdWords actually understates volume by about 60%. The range of these percentages, although just an estimation, shows the inconsistency of Google AdWords’ estimates. The bottom line is that if you’re going to spend a majority of your budget targeting a specific position for a high-traffic keyword, just know that the reward is a slice of a much smaller pie than Google would have you believe.
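One way to read this take-away is as a volume-dependent correction. The function below is only a sketch using the rough thresholds from the text, and the 160%-of-estimate reading of “understated by about 60%” is our interpretation:

```python
def corrected_volume(adwords_volume):
    """Apply a rough, volume-dependent correction to an AdWords estimate."""
    if adwords_volume >= 15_000:    # very large keywords: ~90% overstated
        return adwords_volume * 0.10
    elif adwords_volume >= 1_500:   # mid-range keywords: ~42.29% overstated
        return adwords_volume * (1 - 0.4229)
    else:                           # small keywords: understated by ~60%
        return adwords_volume * 1.60

print(round(corrected_volume(8_100)))  # 4675 realistic searches/month
```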

     

    15 comments
    netgainassociates

    Here's a dose of reality for ya. I have a site with a 2-month-old #1 ranking in Google on a keyword with an AdWords local traffic volume estimate of 3,600. Actual visits reported through that keyword via Google Analytics for the past 30 days: 28. Translation: calculating CTR for the #1 position based upon Google's search volume estimate lands at a 0.777% CTR -- that's POINT seven seven seven percent click-through rate based upon Google's reported global/local search volumes. That CTR number is kinda sucky for that golden #1 organic position. This is in contrast to a zealously naive entry into an SEO campaign: watching the ranks climb, excitedly anticipating that this position might get a 25% CTR from 3,600 impressions, which should be 900 visits. With a likely 30% conversion rate in this "wallet out" niche, one could hope for 270 new "hot" leads per month, with 30% of customer contacts becoming sales worth an average of $750 each at 40% profit, or $300 "profit in the pocket" per sale. So... if the AdWords local traffic volume estimate were accurate, that #1 ranking on that single term should have landed 81 sales, $60,750 in revenue, and $24,300 in profit, minus the ~$1,000 worth of SEO effort exerted over the course of the 5 months it took to get that #1 ranking. The profitability seems way too good to be true, because it is. The reality is... only 28 visits came from that keyword's #1 ranking, yielding NO form submissions, and we can't measure phone leads without buying phone tracking systems that the client can't afford to implement until they land some sales from the traffic that a #1 ranking should be capable of generating. Knowing that competitors and vendors routinely do keyword searches, we know that a substantial fixed number of visits will be bogus. For all we can see, this #1 ranking could very well be attributable to no revenue at all.
For business and campaign planning purposes, there is a VERY big difference between scoring nearly $25,000/month in pure profit from this ranking (which would be approximately a 14,000% return on a $175/month SEO investment for this single term) versus not being able to track that #1 ranking to a single sale. It certainly raises doubt about the value of spending the time & money it takes to score #1 rankings on this particular term, as compared with investing funds in other marketing options. My personal guesses about CTR from humans, conversions, and sale percentages might be fairly close to right for this niche. Using my predictions about human behavior in this niche to judge from the 28 visits, the actual number of human searches that occurred on this "volume=3600" k/w is likely around 110, with probably HALF of those visits coming from inter-industry competitors and other online marketing service providers. Why are the AdWords search volume numbers so exaggerated? Perhaps that search volume estimate includes the huge number of automated queries that come from keyword tracking tools. Knowing that most of the tools out there have costs associated with them, it stands to reason that the more common terms will be tracked by a lot more advertisers and their tools, grossly inflating the search volume estimates on the most popular terms within your niche. I haven't confidently decided what number to plug in for estimates, but for now I am conservatively assuming that the reported search volume is 10x higher than it will ultimately turn out to be, then taking whatever CTR I expect and multiplying by 0.1, then estimating actions per visitor and cutting that in half, and then halving the anticipated number of sales per conversion: Real CTR = 0.1 x (expected CTR); Real Conversion Rate = 0.5 x (expected Conversion Rate); Real Sales Count = 0.5 x (expected sales count per conversion). Calculate ROI potential from there.
If this pessimistic number is still quite positive, be skeptical, but consider SEO for that term being worth a try while accounting for the risk of competitors with unrealistic expectations pumping more effort into SEOing the word than the keyword really deserves. SEO is still very worthwhile for many terms within many niches. It's just not the "slam dunk" or "silver bullet" that it used to be.
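The conservative haircuts described in the comment above amount to a chain of multiplications; here is a sketch (function name and example numbers are illustrative only):

```python
def conservative_projection(volume, exp_ctr, exp_conv_rate, exp_sales_rate):
    """Apply the commenter's haircuts: assume reported volume is 10x
    inflated, halve the conversion rate, and halve sales per conversion."""
    visits = volume * exp_ctr * 0.1            # real CTR = 0.1 * expected
    conversions = visits * exp_conv_rate * 0.5 # real rate = 0.5 * expected
    sales = conversions * exp_sales_rate * 0.5 # real sales = 0.5 * expected
    return visits, conversions, sales

# e.g. volume=3,600, 25% expected CTR, 30% conversion, 30% close rate:
v, c, s = conservative_projection(3_600, 0.25, 0.30, 0.30)
# roughly 90 visits, 13-14 conversions, ~2 sales per month
```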

    Ryan Kelly

    @netgainassociates thank you for sharing! You make an interesting (and highly overlooked) point about competitors and automated bots doing searches for keywords, assuming that's about half the volume - which I could believe. Do you think our "fudge factors" are accurate or useful?

    netgainassociates

    @Ryan Kelly @netgainassociates Yes I do. I don't have enough data to cast an accurate judgment on it, but the 90% fudge factor might land well within reason.

    LauLau81

    Will this application defeat the other one?

    Jacqui Jones

    Have you considered using Google Webmaster Tools reporting to review click through rates from organic listings?  

    Troy Martz

    Ryan, Excellent post, and something most SEOs & PPC marketers have suspected for a long time. It's refreshing to see actual data to back this up. I'm wondering how hard it would be to do a similar analysis for EXACT match / GLOBAL on the same data set? Is that something you could provide in a follow-up post? :) Thank you!

    Ryan Kelly

    Hi Justin, We used broad match only because I believe that's the way people really search. No one I know puts exact match syntax into Google, and we also pulled the traffic from GA by doing a broad match on the term, so we essentially got "music" and any keywords with "music" in it. So at least it was apples to apples from search volume to actual traffic.

    Joricam

    lol, the above statement makes you look really bad

    Ryan Kelly

    why? @Joricam

    robdods

    hmm... that was all neatly paragraphed... not sure what happened oh well.

    robdods

    He was referring to the fact that "broad" / general search on the user side is not the same as broad match on the AdWords side. Two totally different animals.

    Yes, not many people use exact match syntax when doing Google searches, but that isn't the issue or even relevant. Exact match on the AdWords side simply means exactly what you typed in the box -- not someone who typed +"cheap dedicated hosting". When doing SEO, you have to target specific results, so you need to use exact match to optimize for one specific term. Then you can use broad match to see how many people are searching for phrases that include, but are not limited to, your specific wording.

    To use the term in the article as an example: 'cheap dedicated hosting' right now has a broad total of 6,600 but an exact match total of only 590. Now, you may be scoring well on related terms such as 'reliable cheap dedicated hosting' and 'cheap dedicated hosting plans'. However, you might be doing horribly on 'cheap dedicated email hosting' and 'cheap dedicated hosting server'.

    I think your fudge factor equations might prove accurate to some measure, but not in the way you intended. It seems you might have reverse engineered a quick & dirty formula for estimating the results you can expect from a broader array of related terms, but this doesn't apply to specifically targeted SEO. This would explain why the popularity / frequency of the term plays such a large part in adjusting your factor. More popular terms (e.g. 'music') have many more possible phrases they could be used in, while low-popularity terms have far fewer (e.g. 'aardvarks for sale').

    So, to conclude: you've come up with some useful data & a workable formula, but it doesn't represent what you thought it did, or in the way you thought it did. Now, please, go run these same calculations using exact match, come up with some more data now that a few key points have been clarified, and let us know the result! I'm sure we'd all appreciate it... and you owe me... I had to take my ADD meds to type this. @Ryan Kelly

    Justin Seibert

    Hi Ryan - Thanks for the post.  One question - were you using exact match figures from within the AdWords estimator? When I ran "music" for example, it showed 277m matches on broad, but "only" 2.2m on exact match.  I don't believe their estimates are accurate, but exact match estimates (if you weren't using them) may provide a better apples to apples comparison.  Thanks!

    Ryan Cahill

    Thanks for the question, Justin. We used broad match and compared it to a broad match on keyword hits for the various sites in Google Analytics to get the same 'apples to apples' comparison you are talking about. Because we were dealing with some smaller keywords, we used broad search because we found the exact match results often didn't show any data for such keywords.

    David Kimberley

    I thought broad match would mean the keyword was used anywhere in the search, so "i like music" would be added to the broad overall count, while exact match would be when someone typed in "music" only. If Google doesn't give a result back on exact match, perhaps nobody is searching the term! You would also need to see how you rank in each country; e.g. China could be responsible for 10% of what is searched, but perhaps the website doesn't rank so high over there.

    July 26
