Forum Moderators: open
The panic is settling down, the whine of worry is receding to a steady hum in the back of my head, and several recovery plans are forming...
I lost my index page entirely, due to lazy keyword stuffing. My fault! Unfortunately, mine is a very small business: no listing = no food (let alone xmas).
I was planning on overhauling the website anyway, and I've given myself until 1/1/04 before I accept an opening with another business and abandon my own. The question now is: overhaul the index page and resubmit to Google immediately, overhaul the entire website and resubmit the whole thing in a few weeks, or overhaul the website (starting with the index page, of course) and wait for Googlebot. Time is most definitely a factor.
...are any of these plans likely to restore my index page to the directory before I have to throw in the towel in January?
There are also longer range options of starting over with a new website and closing the old.
Mahalo Nui Loa! (Thank you very much!)
My site is a travel-related site for a city whose name consists of 3 words. Practically every Web user conducting a search for a place within this city precedes the name of the place or places he/she is looking for with the 3-word city name. For example, if the searcher is looking for hotels in this area, he would use the search term: keyword1 keyword2 keyword3 hotels. Now, before Florida, when the searcher used this keyword phrase, my site (index page) would come up in the top 3 or 4 pages of the SERPs. Now it is not listed in the SERPs at all, but if you insert a single hyphen between any pair of words, the site is again listed in the top 3 or 4 pages. It appears that all of the other factors, such as PR and backlinks, are the same as before Florida. Obviously the hyphen is an important factor in Florida.
[edited by: Sunset_Jim at 8:02 pm (utc) on Nov. 19, 2003]
Dear Google,
I see you are experiencing serious problems. I thought I might help with a temporary cure until you can get things worked out. Please paste the .html snippet below in the "body" of your search page under the search bar...
<strong><font color="#ff0000" size="4">NOTE: If your search term consists of a two word phrase, please place a hyphen "-" between the words for most relevant results!</font></strong>
:)
I have a page which is a counter example to many theories here.
www.pretty-blue-widgets.com/blue-widget.html
Title: Pretty Blue Widgets
H1: Pretty Blue Widgets
zero internal links to this page.
Every single external link to this page has the exact anchor text "Pretty Blue Widgets"
The site is about as SEO spammy as you can get except for one factor.
It uses the phrase Pretty Blue Widgets only once in the actual content of the page.
It was #1 and remains #1 for Pretty Blue Widgets.
I have another page at
www.brand.com
Title: Brand's Pretty Blue Widgets
H1: Brand's Pretty Blue Widgets
A wide variety of natural and solicited anchor text phrases in the backlinks, the majority being Brand, Brand.com, Brand's Widgets, or Brand's Pretty Blue Widgets.
It used to be in the top 10 on the competitive "Blue Widgets" and a few variations and on page 3 or so for the very highly competitive "widgets."
The catch is that I used the phrase Blue Widgets 25 times on the page in the form of a catalog:
Blue Widget model #A, Blue Widget model #B, etc.
After this Google update, the site has dropped out of all search results for the phrase blue widget. Even Brand's pretty blue widgets does not bring up the site in the first 5 or 6 pages. "Brand's pretty blue widgets" does, however, bring up the site in the #1 slot.
(It's a shame, too, because I can tell from Overture that people use search as a bookmark for the site.)
My strong suspicion is that this update brings purely a keyword stuffing penalty.
Didn't GoogleGuy say something like the cloakers wouldn't like this update? Well, what on-page factors can Google use to detect cloaking? Keyword-stuffed, randomly generated pages?
(I can't seem to find the "find all posts by XXX" feature on this forum anymore, so I can't check exactly what GoogleGuy said.)
Last May and June, I noticed some very extreme examples of sites rocketing to the top for a two-word search with those paired words in the anchor text of two or three external links. This was even true when those paired keywords were not on the target page itself. By "extreme," I mean the first or second spot out of thousands or tens of thousands. That led to the "broken" theories.
Now there's a paired keyword filter for competitive e-commerce sites -- at least for the English language, and for the most competitive keyword pairs.
It's almost as if Google put the cheese into the trap about six months ago. The cheese was so obvious, and so irresistible for anyone earning money from the web, that no responsive SEO firm could ignore it. Everyone got hungry and went after the cheese.
However, using keyword stuffing was the equivalent of signing the page as, "Brought to you by optimization." It made it easy for Google to detect who was doing the optimizing. And using keyword pairs instead of individual words makes it much more targeted toward the deliberate optimizers, since that's how most of them measure success.
Now Google is filtering out these pages from the SERPs. Whether or not external links are being used at this point is debatable. But you have to admit that using the same filtering on paired keywords in external links to a page, even if not done currently, is an option for Google. The reason why this would be effective is that optimizers and webmasters in general have a very hard time changing external links on sites outside of their immediate control.
When was the last time you sent out a few hundred emails asking for a change in a link? I did it almost a year ago, and it was a domain change -- which meant the old link was broken, and it was in the interests of the linker to make the change on their site. I got about a twenty percent success rate. Can you imagine trying to get other sites to change the anchor text only? Very tough to do -- it's probably easier to start over with a new domain.
This latest move by Google is an SEO killer.
Very good advice. When it comes to algo tweaks, it just makes no sense for Google to ever make more links = lower rank. Besides, if this were done, it could be used by competitors to hose a site.
Only problem here is that Google would have to limit this to spammy keyword pairs. Otherwise, there would be a lot of innocent info sites ending up as collateral damage. For example, a lot of instances of "ham radio" on a page about that hobby would be natural.
I know that can't possibly be true, but wow - what a coincidence.
I had about 10 sites that I have been working on for about 8 months. Getting links, tweaking them a bit, etc.
About three days before this happened all my sites jumped up about 5 pages to having top ten rankings.
Now they have all moved back to exactly where they were before they moved up.
Any input would be appreciated. I am just wondering why they moved up at all for those three days. I actually wish they would have stayed where they were all along.
thanks
Of course, I do not want to try and analyze the anchor text for all those backlinks.
This could be a very interesting tool by the way: enter a URI and the keyword density is analyzed for all incoming local and/or external links.
Laurenz
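A minimal sketch of the tool described above, assuming you already have the anchor text of each incoming link in hand (actually gathering the backlinks is the hard part, and is omitted here; the data below is purely hypothetical):

```python
from collections import Counter
import re

def anchor_keyword_density(anchor_texts):
    """Tally how often each keyword appears across the anchor text of all
    incoming links, expressed as a share of all anchor-text words."""
    words = []
    for text in anchor_texts:
        words.extend(re.findall(r"[a-z0-9']+", text.lower()))
    counts = Counter(words)
    total = sum(counts.values())
    return {word: count / total for word, count in counts.most_common()}

# Hypothetical anchor texts for the backlinks pointing at one URI
backlinks = ["pretty blue widgets", "blue widgets",
             "pretty blue widgets", "widget shop"]
print(anchor_keyword_density(backlinks))
```

A heavily optimized page would show one or two keywords dominating the distribution, while naturally acquired links would spread the density across brand names, bare URLs, and assorted phrases.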
Only problem here is that Google would have to limit this to spammy keyword pairs. Otherwise, there would be a lot of innocent info sites ending up as collateral damage. For example, a lot of instances of "ham radio" on a page about that hobby would be natural.
Agreed, but it's not a problem.
When crawling dot-com sites, pull out word pairs. Compile a dictionary of word pairs. Sort by frequency of appearance. Scrape the cream off of the top.
"Ham radio" won't even show up after you scrape.
Now for all two-word searches, check the creamy dictionary (now sorted alphabetically for B-tree access). If the search words hit on a dictionary entry, you know that this search needs the filter treatment.
Meanwhile, start stashing the top one or two keyword pairs for each dot-com page when you compile the external links for that page. Admittedly, using the external links to a page involves significantly more overhead than just using on-page data, and that's probably why it's not being done right now. The on-page data can be done on the fly, at the time of the search, while the external links would have to be done once per crawl. But it's certainly an option for the future.
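The scheme above can be sketched in a few lines. This is purely illustrative: a list of plain-text pages stands in for the dot-com crawl, a Python set stands in for the sorted B-tree, and the cutoff fraction is an arbitrary placeholder.

```python
from collections import Counter

def build_pair_dictionary(pages, top_fraction=0.001):
    """Compile a dictionary of adjacent word pairs across crawled pages,
    sort by frequency of appearance, and scrape the cream off the top."""
    pair_counts = Counter()
    for text in pages:
        words = text.lower().split()
        pair_counts.update(zip(words, words[1:]))
    keep = max(1, int(len(pair_counts) * top_fraction))
    # A set gives O(1) lookup at query time (stand-in for the B-tree)
    return {pair for pair, _ in pair_counts.most_common(keep)}

def needs_filter(query, spam_pairs):
    """Only two-word searches that hit the creamy dictionary get the
    filter treatment; everything else passes through untouched."""
    words = query.lower().split()
    return len(words) == 2 and tuple(words) in spam_pairs
```

At real scale the cutoff would presumably be an absolute frequency threshold tuned over the whole crawl rather than a fixed fraction, but the effect is the same: a pair like "ham radio" never makes the cut, while the heavily targeted commercial pairs do.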
I've spent quite a bit of time just surfing different vertical markets... there are some where the entire top 10 are just horrible... doorways, redirects, flat out and obvious spam..
I've been sending emails to webmaster(@)google.com as GG asked... I think we better bury them in examples before this crap sticks!
About three days before this happened all my sites jumped up about 5 pages to having top ten rankings.
Ditto.. last week www-in showed me in the #1 spot for the keyword I ranked #3 on every other dc. The thread I posted about this eventually turned into the Google update thread.
This could be a very interesting tool by the way: enter a URI and the keyword density is analyzed for all incoming local and/or external links.
There is a tool out there that gets close to doing this, but I won't mention it by name ;P
[edited by: synergy at 9:06 pm (utc) on Nov. 19, 2003]
Before Florida
widget - #31
widgets - #8
Post Florida
widget -?
widgets -?
Current
allinanchor - widgets = #4 widget = #30
allintitle - widgets = #4 widget = #28
allintext - widgets = #5 widget = #30
allinurl - widgets = #2
My conclusion is that the ado about 2 and 3 word search terms is a red herring in terms of figuring out what is going on.
WBF
The problem with using anchor text in external links is that a competitor could set up a network of 100+ sites and link to the competition with the right anchor text to hose them. If limited to on-page data, there can be no possibility of this being done by a competitor or the malicious. Google would definitely need to limit this to keyword pairs and such that are heavily spammed.
The problem with using anchor text in external links is that a competitor could set up a network of 100+ sites, and links to the competition with the right anchor text to hose them. - rfgdxm1
Unless the anchor text penalty is only applied to external links that you have reciprocated from your site - i.e., a mutual consent between both sites.
I just checked a three word search with 3,300,000 results. None of the top ten results had the exact search phrase on the page more than twice. Half of them didn't even have the exact phrase on their page at all.
The back links for the #1 result on this page almost all contain the exact search phrase I used.
9 out of the 10 have the three keywords in their title, just not using the exact phrase.
The #1 spot on this phrase is still the same after the update as it was before. During the update, it had dropped off.
keyword1 keyword2 #2 for allinanchor
keyword1 keyword2 #2 for allintext
keyword1 keyword2 #2 for allintitle
keyword1-keyword2 #2 in SERPS
keyword1 keyword2 NOWHERE IN SERPS
keyword1 2nd page in SERPS (very competitive term 21 Million + results)
keyword2 #1 in SERPS
Single Keyword I'm watching:
keyword #10 for allinanchor
keyword #11 for allintext
keyword #11 for allintitle
keyword NOWHERE IN SERPS
Net effect = Zero sales
I'm new to this game. I purchased sites that someone else SEOed. I'm the beneficiary/victim of his hard work.
Until this update I didn't even know about allinanchor, allintext, or allintitle. Now I'd like to know: what is their purpose?
Are those tools for SEO guys? What do we do with the information?
I hope -gv is the end result of this update. I'm looking good on -gv.
With the amount I spend on adwords every month they owe me their souls!
Seriously though, with a combination of SERPs/AdWords and our affiliate partners we make a reasonable profit. If we lose all our SERPs and our affiliate partners lose theirs too, then we would have to shut up shop, and Google would lose a good AdWords customer - or 50,000 of them.
Food for thought, Google. Commercial websites pay your wages, and if you're going to dump on them and fill the results full of spam/dead sites/news sites and irrelevant results, then we'll go elsewhere. Our category is a joke right now.
As it is, I've upped my AdWords spend and will watch it for the next week, but if things don't improve soon then I'll be looking for other places to advertise, because I can't afford to rely on AdWords and no SERPs.