In FI, CW and DC, my site is in the index. Most search terms are performing the same as usual, EXCEPT the top keywords.
Top keywords are keywords used a lot in anchor text and contained in the title of my page.
The reason I am posting this is not to blame Google or anything. I need to at least find out the reason why this is happening.
Anyone here having the same experience? Maybe we can discuss it here and find out the reason. Maybe there is a new filter working against us.
I think Joe Public is getting more sophisticated about searching and beginning to use longer search strings, and sites will still be ranking well for these.
Plus, our traffic is up a bit overall right now. Too many sites/factors to be certain, but logs seem to back up what we've been discussing. Have lost some ground on shorter search terms, gained on other longer terms...pretty much just what GG was saying.
Theming + focus on sites with broader arrays of content + adjustments to fight over-optimization = Dominic, or so it seems to us.
The only major issues I have are these...
Regarding the apparent new direction G is taking: I see some really good, very niche sites are suffering right now (one is in an area that is my hobby - nothing to do with my business - really a shame). Sure hope G does not throw the baby out with the bathwater as far as highly targeted sites go.
Regarding deployment of Dominic: Terrible miscalculation. Bugs, dropped sites, spam everywhere, dup content everywhere. Old pages. It's like someone waved a magnet over the servers. Yes, I understand much of this is expected to evaporate "over time" when bugs are worked out, new data is pulled in, backlinks are added, etc. But why release this disaster publicly? Very bad form.
Today, we decided that if G is not better within two weeks, we'll go to a cheaper search solution...not that it will dent their bottom line much...but thousands of single votes *can* make an election...just ask Al Gore ;-)
P.S. JasonIR - At most of my sites, one or two terms do clearly get more traffic than any others. The thing is, it's rare that those hottest terms by themselves ever exceed 20% of hits...at least in our case. 80% come from all the other combinations taken together.
I don't like your anti-SEO theory, and I sincerely hope that you are wrong.
If I understand properly, you think that if your site is all about blue widgets, and "heavily optimized" for these keywords (title, H1 tag, anchor text, etc.), you will from now on come up lower in the SERPs than sites that are not optimized for blue widgets, but happen to use these words here and there? Let's say, for example, sites that are about blue birds and fuzzy widgets? That doesn't sound good at all for the users! Although my opinion of Google has gone down quite a bit lately, I still don't think they are that stupid.
I hope that they are instead testing a more sophisticated algo (probably with themes playing a major role). If that's true, it's unfortunately far from perfect at this point...
That's for sure. Of all the things we can speculate about, this phantom sophistication is obviously not true. All a person has to do is set up an AdWords campaign(s) and see how many searches are done in a day for different terms. I get boatloads of multi-word referrals, but all those searches together are dwarfed by the raw number of one-word searches people do. And it is almost logarithmic... one-word searches are way more prevalent than two-word ones, and two-word ones are way more prevalent than three-word ones, etc.
People who do poorly on one word searches can comfort themselves with multiple word ones but they are just scratching the surface of markets. (Of course, some niches will be exceptions, where there isn't any sensible single word to search for.)
So I'll just say that even the most finely targeted of *our* sites (one that focuses on a two-word keyphrase) receives only about 19% of its traffic from that one most important phrase. Another 12% contains that phrase. The balance comes from other related terms. That site is up a bit since the new SERPs went live (poor as they may be).
Our experience is reflected by annej's comment from another thread:
just checked my stats and dozens is an understatement. People found my biggest site so far this month using 2701 different phrases. Only 3.8% of searchers came in on the single keyword I've been obsessing about. That was an eye opener!
steveb, it's not that we do poorly on our most important two-word keyphrase - it's simply that we *also* do well on the other 81% ;-)
I think MHes' theory is well constructed, explains much of what we're seeing, and fits with things we already expected Google to be considering. But that doesn't mean it is correct; it's only a guess that we think makes sense so far. *I hope it's wrong too, as it's fraught with pitfalls, and if it exists, seems to be hurting some very legitimate sites with narrow focus right now, as I already noted above.*
The 'seo algo' kicks in if you over optimise for a specific keyword phrase. If you tone down the on page optimisation, the algo does not kick in, and the playing field is more even for all sites, allowing other factors to determine the ranking of sites.
I think this addresses the problem of keyword domains and seo experts flooding the serps..... so yeah, I don't like the algo either :)
How about doing 50,000 variations on relevant adwords and compare to 5 recognised keywords.... I bet you get more traffic from the 50,000 collectively.... and that is what an average content site will provide you. The variations picked up on searches by a simple page of 500 words are huge, if you let the spider in.
We get many top 5 positions on search phrases (two words) out of 2 million + results. Our search engine traffic is around 40,000 uniques per day, but 80% of this traffic comes from almost unique search phrases, ones we would never have dreamt up.
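The head-versus-tail split being debated here is easy to check against your own logs. A minimal sketch, assuming you've already extracted one search phrase per referral into a list (the sample phrases below are invented for illustration):

```python
from collections import Counter

def long_tail_report(phrases, top_n=5):
    """Tally search phrases and report what share of traffic the
    top few phrases account for versus the long tail."""
    counts = Counter(p.strip().lower() for p in phrases if p.strip())
    total = sum(counts.values())
    top_hits = sum(hits for _, hits in counts.most_common(top_n))
    return {
        "distinct_phrases": len(counts),
        "total_hits": total,
        "top_share": top_hits / total,
        "tail_share": 1 - top_hits / total,
    }

# Toy example: one "hot" two-word phrase plus many near-unique variations,
# mirroring the 19% / 81% split described above.
phrases = ["blue widgets"] * 19 + ["buy blue widgets"] * 12 + [
    f"blue widget variation {i}" for i in range(69)
]
report = long_tail_report(phrases, top_n=1)
print(report["top_share"])  # share of hits from the single hottest phrase
```

Run against a real month of referrer data, this makes the "2701 different phrases" kind of observation a one-liner rather than an eyeball estimate.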
We are in a sector which allows a broad range of searches, if you are doing a fan site for "the rolling widgets" then, yes, that phrase is important.
Analyzing my site, here are the things that I think may have caused my site, and even yours, to be "semi-penalized."
Theory: Google's new algo may be computing the density of links versus text/words. Maybe they have a certain percentage that is acceptable for them.
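For what it's worth, the ratio this theory describes is straightforward to measure yourself. A rough sketch using only the standard library; any threshold you compare against is pure guesswork, since nobody outside Google knows what percentage (if any) they use:

```python
from html.parser import HTMLParser

class LinkDensityParser(HTMLParser):
    """Accumulate visible text, separating anchor text from the rest."""
    def __init__(self):
        super().__init__()
        self.in_anchor = 0
        self.anchor_chars = 0
        self.total_chars = 0

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.in_anchor += 1

    def handle_endtag(self, tag):
        if tag == "a" and self.in_anchor:
            self.in_anchor -= 1

    def handle_data(self, data):
        text = data.strip()
        self.total_chars += len(text)
        if self.in_anchor:
            self.anchor_chars += len(text)

def link_density(html):
    """Fraction of visible characters that sit inside <a> tags."""
    p = LinkDensityParser()
    p.feed(html)
    return p.anchor_chars / p.total_chars if p.total_chars else 0.0

page = '<p>Some plain text here.</p><a href="/w">blue widgets</a>'
print(link_density(page))
```

A page that is mostly navigation links would score near 1.0, a text-heavy article near 0.0, which is the kind of signal the theory above supposes Google might be thresholding on.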
hunch meter: 99.9% :)
hunch meter: 50%
hunch meter: 100%
You seem pretty sure that over optimising a page for a specific search term can now harm the ranking for this term in the SERPS.
It does explain a lot of things for me, so I am not doubting your judgement.
Do you have any other facts to back this up?
When should one start to amend pages? I was previously from the "don't change anything until everything settles with new data / algorithms applied etc" club - but now I am wondering?
Does this also mean that lots of inbound links with anchor text are NOT harming rankings, as previously thought in this thread, and that we should be looking more at on-page factors?
I've kept up with most of the posts re. the Dominic update, particularly regarding semi-bans, PR0 pages, H1, H2 tags, page titles etc.
We run a successful travel portal which, until the update, enjoyed a large number of #1 #2 rankings on Google/Yahoo for two/three keyword phrases. Then, while SJ/FI and other datacenters started to be updated (and the new algo introduced) we experienced a drop down to 20-30 (and lower) for the same searches. It was about then that SEOs on this forum generally started having palpitations, attacked GG and predicted that all was not well at G. plex etc. etc.
We know we've got a well-optimised site (that adheres to all the proper methods of achieving good rankings) and one that contains good content (the phrase "content is king" applies even more today than before the update). So we haven't been unduly worried. In fact, new pages on our site (which currently have PR0 rankings) are doing rather well in the new indexes.
So my advice is to sit tight for a bit longer. Even if your rankings appear to be in the doldrums right now, GG has said that elements of the new algo still need to be factored in. Established SEO techniques, such as concise, well-structured page titles, proper use of H1, H2, H3 tags and accurate page descriptions have not suddenly become obsolete. In fact, those who are changing their pages now (on the basis of advice offered by some here), may well suffer significant drops in their rankings 1-2 months from now. And all because they panicked!
Completely agree. I have said throughout this thread:
These ideas/thoughts are pure conjecture, based on observation of the data centers when they show reasonably up to date listings. However, new filters are kicking in all the time, and a future one may make all the above obsolete. However, to date the theory is holding water.
IMHO no one should think of making any changes for at least 4 weeks, as obviously the rule book has been thrown away. Let's just chat about these things, which is fun, but not assume anything is stable.
If you need to worry about something, I suppose you can worry about Google being broken, but it would be wiser to WAIT AND SEE.
But, if you read all of GoogleGuys posts you will see that there is no panic - Google seem pretty happy with the quality of the SERPS - and other algorithm tweaks will be added over time.
Google have changed things for a reason. I don't see them just changing things back! (Although I wish they would)
I think the algorithms are becoming clearer, but feel that the lack of deep crawl data is making things even harder to judge!
GoogleGuy - where are we up to with the overall schedule?
Google is now showing great results without our sites there. They won't care whether your site is there to make up the good results, because it makes no difference to them or to their visitors.
We were no.1 for our main keyword, but a few days ago the homepage disappeared, although interior pages are on page 3. Checked allinanchor and we are still no.1; PR is still the same. It appears we have this semi-penalty, as our site is highly SEO'd for this keyword, with lots of anchor text etc. Other no.1 rankings for other keywords are unaffected. If this sticks, then it is going to take a lot of work to get round this filter!
The semi-penalty is not a "real" penalty; it is only some filter/algo that kicks in when determining your site's ranking for certain specific keywords. Once you fix the problem, it will do well again in the rankings. But knowledge of how to fix it is quite limited at the moment. And I also believe it would be quite complicated, because we are dealing with our links on other people's sites.
p/s: a real penalty would have a site blacklisted, either removed from the index entirely or with its PR revoked. So it is different from a semi-penalty.
I have a site just two weeks old, so its listing is only via freshbot and it has no PR. The domain is keyword1 keyword2.com. The index is optimised in the usual way for keyword1 keyword2. My links page only has the terms appear in a link back to the index page saying "keyword1 keyword2 index". Now if I search for keyword1 keyword2, my links page is returned right before my index page. So Google is now saying my links page, with the term appearing just once, is more important than the homepage with content and keyword1 keyword2 repeated. However you look at it, Google is not serving the most relevant page, and this is bad. It's one thing to ban a site, but another to say we'd rather the user gets a less relevant page than risk rewarding an SEO company.
Today after a few test tweaks yesterday, the about us page is showing up on page 5.
Yesterday I altered the title tag on this page from 'widget widgets - about us' to 'About us - Widget widgets'.
The h1 text says 'About Us' whereas the index h1 says 'widget widgets'
The page content has the 'widget widgets' keywords scattered VERY evenly throughout the text, at the start of paragraphs, within paragraphs and at the end of paragraphs. Not repeated at all within a paragraph.
This page has no external links to it, only internal links.
So although I have no idea if the TITLE tag changes have any relevance, the other factors do seem interesting and worth some analysis. Could indicate there is a definite google filter for overegging the keyword density and overuse of H1 tags, especially where external links are using the same keywords.
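None of this is verifiable from the outside, but the on-page half is at least measurable. A quick sketch for checking how dense a phrase is in your copy before and after a tweak like the one above; the sample copy is made up, and there is no known Google threshold to compare against:

```python
import re

def keyword_density(text, phrase):
    """Fraction of words that belong to occurrences of `phrase`."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    if not words or not phrase_words:
        return 0.0
    hits = sum(
        1 for i in range(len(words) - n + 1)
        if words[i:i + n] == phrase_words
    )
    return hits * n / len(words)

copy = ("Widget widgets are great. We sell widget widgets daily. "
        "Our widget widgets ship worldwide.")
print(round(keyword_density(copy, "widget widgets"), 3))
```

Something north of 40%, as in this toy example, is the sort of "VERY evenly scattered" repetition the post above suspects the filter targets; tracking this number across tweaks at least makes the experiment repeatable.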
Anyone else got any similar breadcrumbs?
<blush> Yes, I removed the <h1>-tag </blush>
If removing the H1 tag really improved your ranking that fast, it would imply that Google recrawled your site in the last several days and, more importantly, recalculated your standing in the SERPs almost immediately.
And if that were true it would say a lot about what's happening with their new 'system' (GG's word, not mine).
Isn't it more likely that some new filter was added or dropped? Can you see no other difference that could account for this?