| 5:51 pm on May 22, 2003 (gmt 0)|
|I think joe public is getting more sophisticated about searching and beginning to use longer search strings and sites will still be ranking well for these. |
Agreed, and of course Google clearly understands this. Virtually any site's logfiles will confirm this (if they get decent traffic from SE's).
Plus, our traffic is up a bit overall right now. Too many sites/factors to be certain, but logs seem to back up what we've been discussing. Have lost some ground on shorter search terms, gained on other longer terms...pretty much just what GG was saying.
Theming + focus on sites with broader arrays of content + adjustments to fight over-optimization = Dominic, or so it seems to us.
The only major issues I have are these...
Regarding the apparent new direction G is taking: I see some really good, very niche sites are suffering right now (one is in an area that is my hobby - nothing to do with my business - really a shame). Sure hope G does not throw the baby out with the bathwater as far as highly targeted sites go.
Regarding deployment of Dominic: Terrible miscalculation. Bugs, dropped sites, spam everywhere, dup content everywhere. Old pages. It's like someone waved a magnet over the servers. Yes I understand much of this is expected to evaporate "over time" when bugs are worked out, new data is pulled in, backlinks are added, etc. But why release this disaster publicly. Very bad form.
Today, we decided that if G is not better within two weeks, we'll go to a cheaper search solution...not that it will dent their bottom line much...but thousands of single votes *can* make an election...just ask Al Gore ;-)
P.S. JasonIR - At most of my sites, one or two terms do clearly get more traffic than any others. The thing is, it's rare that those hottest terms by themselves ever exceed 20% of hits...at least in our case. 80% come from all the other combinations taken together.
| 8:05 pm on May 22, 2003 (gmt 0)|
I don't like your anti-SEO theory, and I sincerely hope that you are wrong.
If I understand properly, you think that if your site is all about blue widgets, and "heavily optimized" for these keywords (title, H1 tag, anchor text, etc.), you will from now on come up lower in the SERPs than sites that are not optimized for blue widgets, but happen to use these words here and there? Let's say, for example, sites that are about blue birds and fuzzy widgets? That doesn't sound good at all for the users! Although my opinion of Google has gone down quite a bit lately, I still don't think they are that stupid.
I hope that they are instead testing a more sophisticated algo (probably with themes playing a major role). If that's true, it's unfortunately far from perfect at this point...
| 8:47 pm on May 22, 2003 (gmt 0)|
>Agreed, and of course Google clearly understands this. Virtually any site's logfiles will confirm this (if they get decent traffic from SE's).
Then my guess is those are logs of sites that don't do well on the 1 or 2 word searches, and all they get are the long searches.
| 8:55 pm on May 22, 2003 (gmt 0)|
| 9:09 pm on May 22, 2003 (gmt 0)|
"Then my guess is those are logs of sites that don't do well on the 1 or 2 word searches, and all they get are the long searches."
That's for sure. Of all the things we can speculate about, this phantom sophistication is obviously not true. All a person has to do is set up an AdWords campaign (or several) and see how many searches are done in a day for different terms. I get boatloads of multi-word referrals, but all those searches together are dwarfed by the raw number of one-word searches people do. And it is almost logarithmic... one-word searches are way more prevalent than two-word ones, and two-word ones are way more prevalent than three-word ones, etc.
People who do poorly on one word searches can comfort themselves with multiple word ones but they are just scratching the surface of markets. (Of course, some niches will be exceptions, where there isn't any sensible single word to search for.)
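For anyone who wants to check the one-word vs. multi-word split in their own logs, here is a rough sketch. It assumes you have already extracted the referral phrases from your logfiles (that step depends entirely on your log format, so it is left out); the sample phrases are made up for illustration.

```python
from collections import Counter

def query_length_distribution(phrases):
    """Bucket search-referral phrases by word count and return the
    fraction of traffic each query length accounts for."""
    counts = Counter(len(p.split()) for p in phrases)
    total = sum(counts.values())
    return {words: n / total for words, n in sorted(counts.items())}

# Toy sample of referral phrases (hypothetical, not real log data):
phrases = ["widgets", "widgets", "blue widgets", "buy blue widgets online"]
print(query_length_distribution(phrases))
# one-word queries make up half of this toy sample: {1: 0.5, 2: 0.25, 4: 0.25}
```

Run that over a month of referrals and you can see for yourself whether the short queries or the long tail dominate for your sites.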
| 9:36 pm on May 22, 2003 (gmt 0)|
Hey guys, perhaps I should back off of my *definitive* stance about where site traffic comes from...it was based primarily on our own sites. Clearly you are having different experiences.
So I'll just say that even the most finely targeted of *our* sites (one that focuses on a two-word keyphrase) receives only about 19% of its traffic from that one most important phrase. Another 12% contains that phrase. The balance comes from other related terms. That site is up a bit since the new SERPs went live (poor as they may be).
Our experience is reflected by annej's comment from another thread:
|just checked my stats and dozens is an understatement. People found my biggest site so far this month using 2701 different phrases. Only 3.8% of searchers came in on the single keyword I've been obsessing about. That was an eye opener! |
steveb, it's not that we do poorly on our most important two-word keyphrase - it's simply that we *also* do well on the other 81% ;-)
I think MHes' theory is well constructed, explains much of what we're seeing, and fits with things we already expected Google to be considering. But that doesn't mean it is correct; it's only a guess that we think makes sense so far. *I hope it's wrong too, as it's fraught with pitfalls, and if it exists, seems to be hurting some very legitimate sites with narrow focus right now, as I already noted above.*
| 9:52 pm on May 22, 2003 (gmt 0)|
The 'seo algo' kicks in if you over optimise for a specific keyword phrase. If you tone down the on page optimisation, the algo does not kick in, and the playing field is more even for all sites, allowing other factors to determine the ranking of sites.
I think this addresses the problem of keyword domains and seo experts flooding the serps..... so yeah, I don't like the algo either :)
| 10:00 pm on May 22, 2003 (gmt 0)|
How about doing 50,000 variations on relevant AdWords and comparing them to 5 recognised keywords.... I bet you get more traffic from the 50,000 collectively.... and that is what an average content site will provide you. The variations picked up on searches by a simple page of 500 words are huge, if you let the spider in.
We get many top 5 positions on search phrases (two words) out of 2 million+ results. Our search engine traffic is around 40,000 uniques per day, but 80% of this traffic comes from almost unique search phrases, ones we would never have dreamt up.
We are in a sector which allows a broad range of searches, if you are doing a fan site for "the rolling widgets" then, yes, that phrase is important.
| 10:24 pm on May 22, 2003 (gmt 0)|
Brief overview: my site did not move from its #1 and #2 positions for my keyword when doing a search. But it shows 0 PR.
Analyzing my site, here are the things that I "think" may have caused my site (and perhaps yours) to be "semi-penalized":
- More link text on the page than plain text (which may also be why my links page got 0 PR, and perhaps your link pages too).
Theory: Google's new algo may be computing the density of links versus text/words. Maybe they have a certain percentage that is acceptable to them.
hunch meter: 99.9% :)
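If anyone wants to eyeball their own pages against this theory, here is a back-of-envelope sketch of what such a "link density" measure *might* look like: words inside anchor tags divided by all words on the page. This is pure guesswork about an unconfirmed filter; nobody outside Google knows what they actually compute, and the sample page is invented.

```python
import re

def link_text_ratio(html):
    """Rough link-density check: count the words that appear inside
    <a>...</a> tags and divide by all words on the page. A crude
    proxy for a hypothetical links-vs-text filter."""
    anchor_words = sum(
        len(re.sub(r"<[^>]+>", " ", m).split())
        for m in re.findall(r"<a\b[^>]*>(.*?)</a>", html, re.I | re.S)
    )
    all_words = len(re.sub(r"<[^>]+>", " ", html).split())
    return anchor_words / all_words if all_words else 0.0

page = '<p>one two three</p> <a href="/x">four five</a>'
print(link_text_ratio(page))  # 2 anchor words out of 5 total -> 0.4
```

A dedicated links page would score close to 1.0 on this, a content page much lower, which would fit the pattern of links pages getting hit hardest.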
- My <title> is also my keywords, which also appear on the page as a footer.
hunch meter: 50%
- My backlinks from ODP/dmoz disappeared, and the site no longer shows in the Google Directory, where it appeared before. (I think this is Google's problem.)
hunch meter: 100%
| 10:29 pm on May 22, 2003 (gmt 0)|
So that means that us amateurs who bumble along and mess up half of the optimisation will beat you pros who get it all perfect.
Don't worry... you can always buy WPG and catch us all up.
| 1:21 am on May 23, 2003 (gmt 0)|
We don't live in an either/or world. The verifiable fact is that searchers generally type as little into the search box as possible. The volume of one-word searches dwarfs that of multi-word queries. Diversifying is important, but the people are at the simplest searches.
| 9:09 am on May 23, 2003 (gmt 0)|
You seem pretty sure that over optimising a page for a specific search term can now harm the ranking for this term in the SERPS.
It does explain a lot of things for me, so I am not doubting your judgement.
Do you have any other facts to back this up?
When should one start to amend pages? I was previously from the "don't change anything until everything settles with new data / algorithms applied etc" club - but now I am wondering?
Does this also mean that lots of inbound links with anchor text are NOT harming rankings, as previously thought in this thread, and that we should be looking more at on-page factors?
| 10:07 am on May 23, 2003 (gmt 0)|
I've kept up with most of the posts re. the Dominic update, particularly regarding semi-bans, PR0 pages, H1, H2 tags, page titles etc.
We run a successful travel portal which, until the update, enjoyed a large number of #1 #2 rankings on Google/Yahoo for two/three keyword phrases. Then, while SJ/FI and other datacenters started to be updated (and the new algo introduced) we experienced a drop down to 20-30 (and lower) for the same searches. It was about then that SEOs on this forum generally started having palpitations, attacked GG and predicted that all was not well at G. plex etc. etc.
We know we've got a well-optimised site (that adheres to all the proper methods of achieving good rankings) and one that contains good content (the phrase "content is king" applies even more today than before the update). So we haven't been unduly worried. In fact, new pages on our site (which currently have PR0 rankings) are doing rather well in the new indexes.
So my advice is to sit tight for a bit longer. Even if your rankings appear to be in the doldrums right now, GG has said that elements of the new algo still need to be factored in. Established SEO techniques, such as concise, well-structured page titles, proper use of H1, H2, H3 tags and accurate page descriptions have not suddenly become obsolete. In fact, those who are changing their pages now (on the basis of advice offered by some here), may well suffer significant drops in their rankings 1-2 months from now. And all because they panicked!
| 11:28 am on May 23, 2003 (gmt 0)|
Completely agree. I have said throughout this thread:
These ideas/thoughts are pure conjecture, based on observation of the data centers when they show reasonably up to date listings. However, new filters are kicking in all the time, and a future one may make all the above obsolete. However, to date the theory is holding water.
IMHO no one should think of making any changes for at least 4 weeks, as obviously the rule book has been thrown away. Let's just chat about these things, which is fun, but not assume anything is stable.
| 1:36 pm on May 23, 2003 (gmt 0)|
If you read the observation posted by my3cents in another thread, you will see that Google itself is suffering from the "half-penalty": it dropped down 3 places for the query "search engine" behind Altavista and My Excite.
So, it's clear that we all need to stop worrying about the placement of our sites in the current index.
If you need to worry about something, I suppose you can worry about Google being broken, but it would be wiser to WAIT AND SEE.
| 2:02 pm on May 23, 2003 (gmt 0)|
Google's drop doesn't relate to the semi-penalty we are discussing here. If it did, it would have dropped more than three places.
| 3:16 pm on May 23, 2003 (gmt 0)|
I'm all for hoping that rankings will come back etc.
But, if you read all of GoogleGuys posts you will see that there is no panic - Google seem pretty happy with the quality of the SERPS - and other algorithm tweaks will be added over time.
Google have changed things for a reason. I don't see them just changing things back! (Although I wish they would)
I think the algorithms are becoming clearer, but feel that the lack of deep crawl data is making things even harder to judge!
GoogleGuy - where are we up to with the overall schedule?
| 4:02 pm on May 23, 2003 (gmt 0)|
Don't dream. Your rank won't come back up. As more filters are added, more sites will vanish. And you are still hoping for your site to reappear?!
Google is now showing great results without our sites there. They won't care whether your site is there to make up the good results, because it makes no difference to them or to their visitors.
| 1:55 am on May 24, 2003 (gmt 0)|
Yep, we are seeing exactly the same problem.
We were no.1 for our main keyword, but a few days ago the homepage disappeared, although interior pages are on page 3. Checked allinanchor and we are still no.1, and PR is still the same.
It appears we have this semi-penalty, as our site is highly SEO'd for this keyword, with lots of anchor text etc. Other no.1 rankings for other keywords are unaffected.
If this sticks, then it is going to take a lot of work to get round this filter!
| 4:06 am on May 24, 2003 (gmt 0)|
I thought I read where GG said that Google would prefer to deal with spam thru an algo. If so, could the penalty that is applied be determined by an algo so that a site that is penalized could come back once the offense is corrected and sites that continue to offend would remain penalized?
| 7:11 am on May 24, 2003 (gmt 0)|
The semi-penalty is not a 'real' penalty; it's only a filter/algo that kicks in when determining your site's ranking for certain specific keywords. Once you fix the problem, the site will rank well again. But knowledge of how to fix it is quite limited at the moment. And I believe it will be quite complicated, because we are also dealing with our links on other websites.
P.S. A real penalty would have a site blacklisted: either removed from the index entirely or having its PR revoked. So it is different from the semi-penalty.
| 7:24 am on May 24, 2003 (gmt 0)|
OK, I find the following very interesting.
I have a site just two weeks old, so its listing is only via freshbot and it has no PR. The domain is keyword1 keyword2.com. The index is optimised in the usual way for keyword1 keyword2. My links page only has the terms appear once, in a link back to the index page saying "keyword1 keyword2 index". Now if I search for keyword1 keyword2, my links page is returned right before my index page. So Google is now saying my links page, with the term appearing just once, is more important than the homepage with content and keyword1 keyword2 repeated. However you look at it, Google is not serving the most relevant page, and this is bad. It's one thing to ban a site, but another to say we'd rather the user gets a less relevant page than risk rewarding an SEO company.
| 9:49 am on May 25, 2003 (gmt 0)|
Just for the record: It seems that the semi-penalty has been lifted for my index page on 8 out of 9 Google centres.
(Still waiting for Google-fi :) )
| 10:08 am on May 25, 2003 (gmt 0)|
Did you do anything to your website for this to happen?
| 10:53 am on May 25, 2003 (gmt 0)|
<blush> Yes, I removed the <h1>-tag </blush>
| 4:00 pm on May 25, 2003 (gmt 0)|
Interesting changes just occurred for one of my sites.
Index page was ranking #4 for a two word product phrase query 'widget widgets' then around May 20th it bombed off the rankings. Only way to find it was a domain query.
Today after a few test tweaks yesterday, the about us page is showing up on page 5.
Yesterday I altered the title tag on this page from 'widget widgets - about us' to 'About us - Widget widgets'.
The h1 text says 'About Us' whereas the index h1 says 'widget widgets'
The page content has the 'widget widgets' keywords scattered VERY evenly throughout the text, at the start of paragraphs, within paragraphs and at the end of paragraphs. Not repeated at all within a paragraph.
This page has no external links to it, only internal links.
So although I have no idea if the TITLE tag changes have any relevance, the other factors do seem interesting and worth some analysis. It could indicate there is a definite Google filter for overegging the keyword density and overusing H1 tags, especially where external links use the same keywords.
Anyone else got any similar breadcrumbs?
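On the density point, here is a quick way to put a number on a page, if anyone wants to compare their penalised pages against unaffected ones. This is only a sketch: the sample text is invented, and whatever threshold the rumoured filter uses (if it exists at all) is anyone's guess.

```python
def keyword_density(text, phrase):
    """Fraction of the page's words accounted for by occurrences of
    a keyphrase. Crude, but good enough to compare two pages."""
    words = text.lower().split()
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    hits = sum(
        1 for i in range(len(words) - n + 1)
        if words[i:i + n] == phrase_words
    )
    return hits * n / len(words) if words else 0.0

text = "widget widgets are great and widget widgets ship fast"
print(keyword_density(text, "widget widgets"))  # 4 of 9 words -> ~0.44
```

Strip the HTML first and run it on the index page versus the about page; if the filter theory holds, the page that dropped should score noticeably higher.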
| 4:10 pm on May 25, 2003 (gmt 0)|
Do you mean your about page appeared after the tweaking of your index page?
| 4:12 pm on May 25, 2003 (gmt 0)|
I have PR4, H1 tags on all 300+ pages, but only a few backward links. I have no serious change in my rankings for any search term.
| 4:18 pm on May 25, 2003 (gmt 0)|
No sorry...the about us page appeared after I tweaked the about us page.
This could be coincidence, but regardless, the difference in keyword density and placement and lack of backlinks between that page and the index page does give clues as to the filters google is using.
| 5:09 pm on May 25, 2003 (gmt 0)|
|<blush> Yes, I removed the <h1>-tag </blush> |
1) Is that all you did?
2) I have trouble believing that the H1 tag removal is responsible for your improvement (and I'm a believer in the likelihood of MHes' seo algo theory).
If removing the H1 tag really improved your ranking that fast, it would imply that Google recrawled your site in the last several days - and more importantly - recalculated your standing in the SERPs almost immediately.
And if that were true it would say a lot about what's happening with their new 'system' (GG's word, not mine).
Isn't it more likely that some new filter was added or dropped? Can you see no other difference that could account for this?
| 6:13 pm on May 25, 2003 (gmt 0)|
Just now coming into this discussion, so apologies for the late comments.
"Hmmm, I think this is the first time Google take actions against ZEUS and Linksmanager."
I gotta chime in. I do use Linksmanager, and can attest that it does not harvest links or email addresses. Each link submitted has to be approved manually by a human (me!). It checks for reciprocation, and allows me to create static HTML links pages without having to continually be coding new links, changes, deletions, etc. I take those HTML pages and upload them to my web server under my own domain just like everyone else does. Linksmanager is a tool, not a PR boosting scheme or program. I have links pages for my website visitors to check out - it adds value to my site.
Just wanted to defend that program, as it's been a real time saver and has improved the quality of my site.
Can't attest for ZEUS as I've never used it, don't plan on it.
Hope this Google thing gets resolved soon.