1. There is very little traffic, but I think this is highly unlikely.
2. There is traffic and Google is not serving our site for all of the search queries.
If the second guess is correct, can we assume this is a type of sandboxing and that the situation will improve over time? In the past I've had experience in this vertical and got more traffic with just a few pages of content, so it's all pretty confusing. My question, when it's all said and done, is this: does Google limit the amount of "exposure" in organic search in any way for new sites?
These terms get lots of searches without any clicks, since most of the searches come from automated keyword-ranking tools and from competitors who are just checking their position against yours.
Asking people in other locations, using different settings, logging out and back in, setting the number of results to 10, 100, and values in between, removing any additional parameters from the query URL manually, adding or removing &filter=0, and so on.
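If you find yourself repeating those URL tweaks, they can be assembled programmatically. A minimal sketch, assuming you open the resulting URLs by hand in a browser rather than fetching them automatically; `serp_url` is a hypothetical helper name, while `num` and `filter` are the query parameters mentioned above.

```python
from urllib.parse import urlencode

def serp_url(query, num=100, host_crowding=False):
    """Build a Google results URL for manual rank checking.

    num: results per page (e.g. 10 or 100).
    host_crowding: when False, appends filter=0 to disable
    Google's duplicate/host-crowding filtering, so you see
    every indexed result for the query.
    """
    params = {"q": query, "num": num}
    if not host_crowding:
        params["filter"] = 0
    return "https://www.google.com/search?" + urlencode(params)

# Example: check the full, unfiltered top 100 for a phrase.
print(serp_url("red widgets"))
```

Comparing the filtered and unfiltered (`filter=0`) versions of the same query is a quick way to tell whether a page is genuinely absent or merely suppressed by duplicate filtering.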
But to be honest...
In these few years - especially for stuff I couldn't relate to - I had quite a few sites where I expected competitive, popular phrases to perform like magic, and they just... didn't. Do you know why?
All the statistics you can check when deciding which variant / word order / synonym to target most intensively have a tendency to fold many long-tail queries into the figures. That means even paid monitoring services will show skewed stats for the most popular, shortest variants. Example: if you enter red widgets into Google Trends, you'll get a pretty impressive graph that would probably compare to a lot of popular searches. But this only means that red widgets, these two words, are often queried together; the stats say almost nothing about word order or whether additional words were present. Unless you use "red widgets" (quoted) instead of red widgets, you won't get an accurate estimate of the number of searches for the exact phrase.
Also, since Google bans automated queries, most rank-checking software mimics browsers to the point where no filter can tell it apart from real users. That means a lot of SEO-related queries are made either by hand or by checker software... I'm sure you knew that, but I just wanted to warn you that the 'exclude bogus' features of certain databases won't do much for you.
And finally, and most annoyingly: in this industry people tend to go for the first three (or two) results. The more generic the query, the more likely this is the case. In some sectors (and ouch, this can be bothersome) people don't look past the top 3, while in others no. 11 is still a hot spot.
...top-ten placement for an ultra-competitive phrase, little traffic, but maximized with great titles... Although your guess was that the problem is more like bouncing back and forth on the edge of a filter, I wanted to share some of the related info in case someone, like me, reads the thread title and jumps at the topic *grin*
If, checked against all parameters, your subdomain/page really IS underperforming in the SERPs (not as solidly top 5 as it seems), I'd suggest you check whether it's linked from the main domain either the way a separate website would be, OR as any other nav element. Just make it consistent.
[edited by: Miamacs at 12:51 pm (utc) on Dec. 26, 2007]