Forum Moderators: open
Title of page: word1 word2 word3 word4
Result of search for word1 word2 word3: TOP RESULT
Result of search for word1 word2 word3 word4: NOWHERE TO BE SEEN IN FIRST 10 PAGES (and in page 10 none of those words are even mentioned in the title).
Obviously my page is getting a penalty because the title too closely matches the search phrase. How else can this be explained?
Maybe you should try changing your title from "Keyword1 Keyword2" to "Company name - the best source for keyword1 keyword2" and see if you do better in the SERPs?
SWG: I don't find this to be true. I have a site with 1,800 pages, plenty of which disprove this theory.
I don't know whether I've dropped on some keywords and gone up on others but I've got a 30% boost in traffic recently. I've had a more detailed look at my logs. It's all coming from Google. On all my sites. And it's since Florida started.
I don't know if this helps you guys figure out Florida... but here's some info about my sites:
I haven't checked allinurl, allinanchor or any of the other tools all you experts seem to be using. I haven't done a detailed analysis of linkbacks and anchor text. I haven't got any optimisation software. Heck, one of my sites is 200 pages in size without a single H1 tag (only learnt about H1 recently). My sites are whitehat, or at least not intentionally anything but whitehat.

I interlink all of them with a lot of all kinds of cross linking (relevant cross linking, methinks), don't have alt tags on most images, and have never done a keyword density check, so KWD may vary wildly from page to page depending on the content. I'm awful at page titles and descriptions; I must really revise them all to dmoz editor guideline standards. In fact I'm awful at HTML full stop: I use FrontPage with all its attendant problems, like code bloat.

I have a few hundred inward links (PR1-PR7). I have a few FFA-type inward links which I mistakenly put in place some years ago and which still seem to be there. Of the other inward links, about 50% are reciprocal. Links from authority sites aren't reciprocated, but I do have a few of those. And I do like adding content: for the last 2-3 years I've been adding a few new pages every month, usually consisting of my own ramblings on subjects I like to think I know a little about.
All the sites taken together have about 200 PR5 pages, 300 PR4 pages and about 1000 pages of PR0-3. Total unique visitors I guess is about 3-5K per day. Not a big sample so I don't know if anybody can draw any conclusions from any of this. Or whether these sites/topics/confused ramblings have affected SERPS because of some quirk/s. Or even whether people have just been searching for different KWs since the Florida update started.
All this talk of a mighty anti-SEO algo change is poppycock. Hope everyone's foolishly making changes based upon this silly assumption, 'cause it's a zero-sum game, right? I will refrain from saying Google is broken; we're all guilty of an occasional bug, and a bug doesn't mean it doesn't work -- it just needs fixing.
I was a victim of the last allinanchor/index page bug, and I didn't change a thing and came out smelling like a rose when Google got it right. Then, many old-timers were saying major algo change, index pages had been devalued, blah-blah-blah. That simply wasn't the case; it was a bug.
Hopefully, everyone's followed the guidelines & built and marketed plenty of internal content that doesn't rely on the index for traffic. I've taken a hit on rankings but stats are holding up pretty well, all things considered. It's going to be a long couple of weeks before Google fixes the bug; get comfortable with it.
Doing a search for www.mydomain.com shows the same number of pages on www, www2, www3, -dc, -qv and -ab, including 1 obsolete page URL.
-va is showing 1 less page, and the obsolete page has gone.
-ex, -in, -cw, -fi are showing older results.
.co.uk is the same as www etc if searching all the web, but is (as usual) hopelessly wrong for just UK results. (Mine is a .com site hosted in the UK.)
Note: number of pages are those actually shown in SERPs. The "about" numbers are inaccurate on some DCs.
notsleepy, I respect your opinion but to suggest making your page TITLE not reflect the search term for that page is absolutely crazy.
Even if, as suggested before, this is the case when other page/link factors are considered by Google, making your page TITLE not reflect the search term you're targeting is mad IMO.
Seriously Pissed
David
In all the searches for my site (four areas) I see no significant changes. HOWEVER, traffic from Google is down by something like 80% or more - very strange.
Some months ago someone suggested that MS may have a saboteur working at Google. Well, if Google is as badly broken as this thread suggests (from all my searches it still seems OK), then I think the saboteur/sleeper has struck. Certainly, if the public lose faith in Google, I would expect MSN to be the big winner.
This theory may be nonsense, but the timing is very interesting. Also, I don't think many people who've watched MS over the years would doubt that this sort of dirty trick was possible.
Kaled.
PS
I'd be very interested to hear from anyone whose SERPs appear unchanged but whose traffic from Google has dropped significantly. I live in the UK and my site is hosted here. I'm wondering if Google are using some strange geolocation filters.
> Certainly, if the public lose faith in Google, I would expect MSN to be the big winner.
Case example: my fiancee's best friend, who is a hairdresser and non-computer-savvy (doesn't even own a computer), had this to say:
"I was searching for something at work yesterday and I was getting really aggravated because a lot of them had nothing to do with what I was searching for."
It's not just us seeing this folks :)
> different on each one. I will take this as proof that it isn't over
It for sure isn't over. I've been trying different searches from www.google.com, www.google.fi and www.google.se, and the SERPs are all different for some of my keywords.
However, this isn't true for all keywords!
The site I'm watching the closest seems to be getting more traffic since Florida started, but it is a new site where we are adding about 20 pages per day, so it might just be the novelty effect. Older sites also do well, but they are in a language spoken by about 9 million people, most of whom seem to use MSN, so it's hard to tell what effect Florida might have on them.
> Some months ago someone suggested that MS may have a saboteur working at Google.
Best conspiracy theory yet and an evil mole no doubt! :)
I could never stand the tactic I call "anchor text spamming". I refused to do it and, to be honest, am very pleased that G has applied this new filter to stop it from happening.
Funny, I see a low PR site that derives its ranking from anchor text spamming (doorways with 100 links, all the same anchor text) ruling the day--pre-Florida and post-Florida. Anchor text filter my fanny.
Looks to me like it partially has. However, the index has been cooking some more on -in, and the others still haven't picked that up. We are in mid-dance, and nowhere near the end.
>Is -in showing results that I will expect to see in another few days, or is it still shuffling around?
Still shuffling.
Sites which look like they have been dropped way down the SERPS, actually haven't - they are still as high as they were before... for that specific keyphrase.
However, many other sites have moved up above those sites because using broad matching means that there is now much more general competition... because SERPS are now topic-based rather than keyphrase-based.
My latest guess is that Google AI analyses the keyphrase and instead of returning sites highly optimised for that phrase, it ascertains the topic and returns sites related to the topic and only broadly (rather than specifically) related to the keyphrase.
Just a hypothesis...
I suggest that perhaps Google have changed the Freshbot algo as well and needed to flush out all of the old pages?
Followed the sage advice here in the forums over the last year. Slowly built up a dozen sites for self and clients, achieved excellent rankings and lots of #1 returns. Felt proud doing it with hard work and no funny stuff. Esme/dom smacked me pretty good, but all recovered. However, even at the depths, it was never this bad. Sites that were ranked #1 for two word key phrases are now buried deeper than 500, which is as far as I had the stomach to check. However, have noticed two things to give me pause.
Used to show up at #1 for "area widgets"; now that phrase is buried 500+ down, but the same site shows #1 for "area widget" and "area-widgets". Just tried this on another site hammered in Florida, and got the same results. Something regarding plural/singular and/or use of the hyphen has changed. Which leads me to a question: why would an innocuous character like a hyphen, akin to a space, lift a site from the bowels of 500+ to a first-place result, when it has no direct material effect on the RELEVANCY of the search? And why would the plural trigger the site to drop 500+ places?
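For what it's worth, one mechanical guess (purely an assumption on my part, nothing confirmed) is that the hyphen changes how the query is tokenized, so any filter keyed on the exact two-word pair simply never sees a hyphenated compound:

```python
import re

# Assumed tokenizer that keeps hyphenated compounds as one token -- a guess
# at why "area-widgets" might behave differently from "area widgets".
def tokens(query):
    return re.findall(r"[\w-]+", query.lower())

print(tokens("area widgets"))   # two tokens: the word pair exists
print(tokens("area-widgets"))   # one token: no (area, widgets) pair to match
```

If something like this is going on, the hyphenated query would bypass any two-word processing entirely, which would fit what I'm seeing.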
Normally, I’d just take the prescribed walk around the lake and, in the words of Lock, Stock and Two Smoking Barrels, “Chill, Whin-stonn.” But just before retiring last night, I read the GoogleGuy post that basically says it’s done, and “some will like it, some won’t.”
Just let one voice from the desert state: I do not believe there is any kind of conspiracy to boost adwords bids, I do not believe there is an MS saboteur in Google, I do not believe this is the beginning of the end for Google, and I DO believe that Google attempts to provide the most relevant possible search results.
But in my opinion there has been a huge step backwards, or forward into a morass. (Thus far.) A former unabashed Google supporter is now just going to bite his lip and be quiet. Still hopeful it's not over, and that results will return a la esme/dom.
1. There is a word-pair filter in place. This is not a penalty in the old manner of Google penalties -- as, for example, when they gave SearchKing a PR zero. This is a filter that is applied to the SERPs. If a given search produces 500 results, the order in which the results are displayed is determined, as usual, by PR and all the other algos. However, there is a new added filter such that if the searcher's search terms match the word-pair filter of particular pages within those 500, then those particular pages are placed further down in the queue for display purposes. PR stays the same, and a site's ranking for anything other than word-pair hits stays the same. Using a hyphen in your search terms defeats the word-pair matching.
2. This algo was apparently applied only to English-language pages. Perhaps only to dot-coms within English-language pages. Perhaps only to the most-frequent, most-competitive word pairs, as determined by an English-language word-pair dictionary organized by frequency of appearance, across all English-language pages. There is probably a front-end dictionary that's consulted for all two-word searches, and if the word pair is found in the "hit list" dictionary, then the SERPs are put through an extra step of consulting the threshold number for pages that score highly for those two words. It's probably done on the fly for each page in the SERPs -- the page is already decompressed to get the snippet out, and if you confine the scan to word-pairs inside of titles, links, anchors, and headlines, then it would not incur much overhead.
3. Single-word search terms, and most searches that use more than two words, are not affected. It seems to be a word-pair filter. If the string in which the word-pair was found contains three words, it was probably not considered for inclusion. If it contains more than three, it was probably parsed for pairs within the string. Hard to say. Suffice it to say that most SEOs use two-word terms to optimize their sites, and targeting two-word combinations makes a lot of sense from Google's point of view.
4. Anchor text in external links might be included in the word-pair hit count for a page. Hard to say about that. However, it's already clear that anchor text in external links alone will not trigger the filter. On-page word pairs in titles, links, and headlines are of fundamental importance. Internal linking is very important (and it's very easy to pick out whether a link on a page is an internal link to another page on the same site). If it's done on the fly, as suggested in 2) above, then I doubt that external links play any role at all.
5. There's a threshold applied for word-pair density. It might be a moving target, as in a threshold determined as a percentage of all text, or all linking text, or whatever, for a particular page.
Bottom line: Google was so predictable for so long, with respect to the role played by anchor text in links, that almost all SEOs have been doing too much of this in the last six to twelve months. Basically, it's Google's fault that it's been this easy for this long. Now Google is trying to correct it. But the correction leaves a lot of spam that replaces the filtered sites, simply because these sites missed getting hit by the filter.
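To make the theory in points 1-5 concrete, here is a minimal sketch of how such a post-ranking filter could work. Everything here is an assumption drawn from the hypothesis above -- the "hit list" dictionary, the density threshold, and all function names are invented for illustration, not anything known about Google's actual code:

```python
# Assumed "hit list" dictionary of competitive two-word searches (point 2).
COMPETITIVE_PAIRS = {("area", "widgets"), ("cheap", "widgets")}
# Assumed cutoff for on-page word-pair density (point 5).
DENSITY_THRESHOLD = 0.25

def pair_density(page_text, pair):
    """Fraction of adjacent word pairs in the page matching `pair`."""
    words = page_text.lower().split()
    if len(words) < 2:
        return 0.0
    hits = sum(1 for a, b in zip(words, words[1:]) if (a, b) == pair)
    return hits / (len(words) - 1)

def apply_filter(ranked_pages, query):
    """Post-process normally ranked SERPs (point 1): pages that over-use
    the queried word pair are pushed to the back of the queue; everything
    else keeps its original, PR-determined order."""
    terms = tuple(query.lower().split())
    if len(terms) != 2 or terms not in COMPETITIVE_PAIRS:
        return ranked_pages  # filter only fires on "hit list" two-word searches
    kept, demoted = [], []
    for page in ranked_pages:
        if pair_density(page["text"], terms) > DENSITY_THRESHOLD:
            demoted.append(page)
        else:
            kept.append(page)
    return kept + demoted
```

Note that under this sketch a hyphenated query like "area-widgets" splits into a single token, so the two-word check fails and the filter never fires -- consistent with the observation that a hyphen defeats the word-pair matching.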
Well, from what I'm seeing with my results, -in has not spread to the other servers yet. Is -in showing results that I will expect to see in another few days, or is it still shuffling around?
I don't know if this can bring up a kind of answer to that, but from where I check (France), results for site:www.mysite.com -jkzjhekzjeh are:
- ex, va, dc, ab, cw : 927 pages
- fi : 928 pages
- in : 17 pages
- gv : 628 pages
- kr, mc : 624 pages
- zu, sj : servers down
As for backlinks, only VA shows some. New PR shown only on VA too (maybe the only logical thing to be found ;-)).
You guys probably don't see the same things from where you are, but those results can't be finished updating. And I can't see how -in could be leading the way.
Fred
I have title: kword1 kword2 kword3 - company name and I was #2 for the kword1 kword2 kword3 search until this update where I'm nowhere on -in.
The other thing I was thinking is that I have an H1 that matches my title (or at least part of it). I added the tag a few months back and shot right up to the top. Now I show up great for two-word combinations of my keywords, but not the three-word combination. Dropping the '-' won't change the meaning or words in my title, but it should be an interesting test to see what happens.
I'll post my results...
Any could be correct, of course, but the most viable, I guess, would be the AdWords conspiracy (or perhaps the one that has them boosting the big sites with deep pockets to tempt them at IPO).
However, the most likely I think is still that Google has just got it wrong. Maybe they aren't as clever as many people think they are.
They produce a product (algo) and test it with their own test bed. Then they release it. In the real world it sucks. The results are quite a lot worse than they were before.
That sort of thing happens with many products and many companies. It happens in almost every industry.
The issue though is where they go from here. The good companies address the issues very quickly, to limit the damage. They backtrack, or they apply a quick fix.
Google in the past has done exactly this.
The telling factor this time is whether they do it again. If we see a fix or a backtrack soon, fair enough. Everyone makes mistakes.
If they don't, and decide to run with an inferior index... then we'd better start looking at those theories very closely.