Forum Moderators: open
The panic is settling down, the whine of worry is receding to a steady hum in the back of my head, and several recovery plans are forming...
I lost my index page entirely, due to lazy keyword stuffing. My fault! Unfortunately, mine is a very small business: no listing = no food (let alone xmas).
I was planning on overhauling the website anyway, and I've given myself until 1/1/04 before I accept an opening with another business and abandon my own. The question now is whether to: overhaul the index page and resubmit to Google immediately; overhaul the entire website and resubmit the whole thing in a few weeks; or overhaul the website (starting with the index page, of course) and wait for Googlebot. Time is most definitely a factor.
...are any of these plans likely to restore my index page to the directory before I have to throw in the towel in January?
There are also longer-range options, like starting over with a new website and closing the old one.
Mahalo Nui Loa! (Thank you very much!)
Under this theory, how would new sites and new content ever get PR? How would new sites ever get indexed if Google only followed links from old data sets?
I have some new sites (less than 90 days old) in Google right now that held their top rankings while others fell. There is no "old" data to compare them to, so you would think they would lose their rankings, but that is not 100% true in my case.
Just another inconsistency I suppose.
I think this theory holds some merit, but I'm trying to poke holes in it.
--On the one hand, G knows (from Adwords, etc.) that the more closely a keyword search phrase matches an ad headline, the higher the click-thru...
--On the other hand, extraordinary steps seem to have been taken in Florida to weed out closely matching paired keywords, in favor of related, but not identical matches.
Talk about pretzel logic...gotta laugh at that one. :-)
I'm hoping this is all just about testing, albeit at the expense of the searcher, to learn some really good stuff! Then, soon we'll be back with SERPs that are bigger and better than ever! G, don't let us down!
A site I have had up for 9 months is MIA.
One I put up a couple years ago is suddenly at #10 for a two word term, out of the blue I might add since it was always a loser.
Another about three years old has risen into the top ten across multiple terms.
If time is a factor, it appears to be sites first indexed around the time of Esmeralda.
And Brett, if it is themes, Google needs to take some lessons from Teoma :)
WBF
This is the sort of thing that has been an issue for Google since the beginning. A lot of old message board scripts were based on how Usenet worked, and thus incorporated .sigs. As a result, Google rankings have long been influenced by .sigs, almost always without the signer even thinking of Google.

The question is whether Google has figured out how to ignore these. Most message boards use standard scripts that are obvious to spot, just like guestbooks. Google has finally managed to learn how to ignore guestbooks, and my guess is that if they haven't done the same for message boards yet, they soon will. However, the spammers may be doing it now even if Google does ignore the links. I noticed that a lot of guestbook spam was blatant, including hyping the site; obviously they wouldn't have done that if they were thinking about Google, since making the spam obvious encourages the site owner to delete it.
When you mentioned theming with Google, Brett, did you mean the Teoma style? There are two possible kinds of theming. One is the Teoma style, which tries to spot communities of sites on a common topic. The other doesn't try to spot communities, but rather comprehensive sites on a topic. The latter is done by seeing how many pages a site has on a topic and boosting the site for searches on it. IOW, if my site has just one page mentioning "purple widgets", it will rank lower than if Googlebot spots I have 500 pages that mention "purple widgets."
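For what it's worth, here's a rough Python sketch of how that second kind might work. The log scaling and the weight are pure invention on my part, just to illustrate the page-count idea, not anything Google has confirmed:

import math

def theme_boost(site_pages, phrase, base_score, weight=0.1):
    # Hypothetical "comprehensive site" boost: the more pages on the
    # site that mention the phrase, the bigger the multiplier.
    mentions = sum(1 for page in site_pages if phrase in page.lower())
    # 1 mention -> no boost at all; 500 mentions -> a noticeable one
    return base_score * (1 + weight * math.log(max(mentions, 1)))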
IOW, if my site has just one page mentioning "purple widgets", it will rank lower than if Googlebot spots I have 500 pages that mention "purple widgets."
Hard to accept this as being the model, given the number of sites with only one page, and/or only a passing reference to the topic, showing up.
Search for tacks - #7 = "Yahoo tacks fees onto e-mail..." Hard to find any theme on that top ten result!
WBF
Looks like my site is not even showing up for keyword1-keyword2 anymore, let alone keyword1 keyword2.
This is confirmed by looking at actual keyword pairs mentioned on other forums days ago.
This supports my suspicion that two-word search terms are being looked up in a new Google dictionary of the most popular two-word phrases used on commercial, English-language sites.
A lookup like this is probably being done semi-centrally, by referring two-word searches to dedicated machines at one or two locations over their internal network. That way they can respond quickly without waiting for the next update.
Until a couple of days ago, it was embarrassing to Google that hyphenated word pairs were not recognized as matches in this lookup: the hyphenated pair didn't flag the search as requiring the new filter treatment, so it produced unfiltered results. This glitch gave us one of the best clues that a special filter was in place.
Now the hyphen matching in that lookup is fixed. Fixing the lookup is a lot easier than rolling back the dance would have been, and it's the only thing that has been definitively fixed. On the other hand, they could have turned the lookup off just as easily, with a simple "return" leaving the flag set to zero. Instead, Google decided to improve the lookup by fixing the matching glitch.
This is strong evidence that the filter is deliberate. Google isn't broken; Google is filtering. The nature of the dictionary is top secret. It might be more sophisticated than just keyword pairs, but it's a dictionary designed for its anti-spam and/or anti-SEO effects, and is therefore limited in size.

A lot of the "my site did this" vs. "you're wrong, my site didn't do that" that we see on WebmasterWorld comes from the fact that we cannot use real keywords here. That means we cannot figure out what's listed in the dictionary and what isn't, so we just go back and forth.
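To make the theory concrete, here's the kind of lookup I'm picturing, sketched in Python. The dictionary entries, the names, and the flag mechanics are all my guesses; nothing here comes from Google:

MONEY_PAIRS = {("cheap", "widgets"), ("purple", "widgets")}  # hypothetical entries

def needs_filter(query):
    # The recent fix: normalize "keyword1-keyword2" to "keyword1 keyword2"
    # before the lookup, so hyphenated pairs no longer slip through.
    words = tuple(query.lower().replace("-", " ").split())
    if len(words) != 2:
        return False             # only two-word searches hit this dictionary
    return words in MONEY_PAIRS  # True -> flag the search for the new filter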
I think this theory holds some merit, but I'm trying to poke holes in it.
--
Google "normally" compares to last month dataset ( I think they started doing this in aug. - sep. ), so if your site is added in august ( before the august update ) you will get PR in sep update. In the "OLD" days you could give a site PR within days ( not any more ).
Google do follow / index and list new sites , but now they do not get any PR before the inlinks has been seen on the same page for 1 - 2 month.
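In other words, something like this, where a link only counts once it has shown up in two consecutive monthly datasets. This is just my reading of the behavior, not a known mechanism:

def pr_eligible_links(last_month, this_month):
    # Each argument is a set of (linking_page, target_page) pairs.
    # Only links present in both monthly datasets count toward PR.
    return last_month & this_month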
And still, absolutely no one has mentioned working towards the Inktomi transition while all of this other stuff is going on - nor studying these odd results with the intention of possibly capitalizing on this situation if it were to ever occur again.
We are... and whether it's a Google-fart or not, why should your optimizing and SE campaigns end at Google? Scared of doing some real SEO?
Relax and enjoy the weekend, it's not an algo tweak, it's not a new filter, it's a good old fashioned Google SNA*U.
Glengara, remind me to buy you a pint at Pubcon FL.
Anyone else tired of bashing their head against the wall?
I never started. Then again, play the game for a certain length of time and you get used to big changes and temporary wormholes.
Do you really think that your site is gone forever from the top of the SERPs - even if you have one of the most relevant sites out there for its subject matter? Google would never shoot itself in the foot - and that's what that frame of mind implies: that Google is no longer interested in providing relevant results.
This can't be good, and my two clients are very likely not the only real businesses in the world who are ticked at Google.
Of course they aren't - but then again, there are a lot of BADLY RUN businesses whose principals have absolutely no stamina for adjusting to change and waiting for better results.
IOW, if my site has just one page mentioning "purple widgets", it will rank lower than if Googlebot spots I have 500 pages that mention "purple widgets."
I am not sure this is far off. All of the datacenters except VA are showing directories and authority sites for 8 of the top 10 results for a particular three-keyword phrase that I watch.
And, more importantly, what was not in the dictionary.
The same thing would work if you have a pre-update search vs. a filtered search.
Too late. Anyone who discovers an equivalent glitch probably won't be posting it.
Are we still theorising that Google checks to see if a given page contains any high-density on-page keyword strings... and if it does, if those same keyword strings are also listed in "the dictionary"... and if they are, Googlebot looks at inbound link text to see if a high proportion of links also contain the same keyword string... and if everything matches a filter is applied to the page (regardless of whether it's a genuine non-spam resource or one of Chef Spam's plats du jour?)
How does that fit in with the theory that Google is broad-matching query strings against pages?
Does it simply look like broad-matching, because the most relevant pages with high on-page keyword-string density and many identical inbound links are being filtered out and leaving behind only those pages which are broadly relevant?
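If that chain is right, the whole test would reduce to something like the sketch below. Every threshold and name here is invented to illustrate the theory, not reverse-engineered from Google:

def is_filtered(page_text, inbound_anchors, phrase, dictionary,
                density_threshold=0.05, anchor_threshold=0.6):
    # Step 1: does the page have a high density of the keyword string?
    words = page_text.lower().split()
    n = len(phrase.split())
    grams = [" ".join(words[i:i + n]) for i in range(len(words) - n + 1)]
    density = grams.count(phrase.lower()) / max(len(grams), 1)
    if density < density_threshold:
        return False
    # Step 2: is the string listed in "the dictionary"?
    if phrase.lower() not in dictionary:
        return False
    # Step 3: do most inbound links use the identical string?
    matching = sum(1 for a in inbound_anchors
                   if a.lower().strip() == phrase.lower())
    if matching / max(len(inbound_anchors), 1) < anchor_threshold:
        return False
    # All three matched: the filter applies, genuine resource or not.
    return True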
Kirby quotes something I wrote as being written by Napoleon, and then says I'm agreeing with Brett. Cold....
"Disappearing" sites is a problem which I don't see at all in my area, but assuming it is true in some areas, people who dropped a lot in the serps are not the same as "disappeared". Lord knows there are plenty of sites in the serps that either should lose hundreds of places in rank or could plausibly be ranked much lower than they are. "Not in the index" is completly different.
This is easily Google's most positive algo move this year. Not counting the disappeared phenomenon, the problems seem to be:
- certain types of spam (often created by one person/entity) have been able to get large blocks of crap ranking right next to each other. Google seems to be removing these when they find them though, so offering them specific feedback on plainly obvious garbage is a good idea.
- problems with figuring out duplicate pages/content
- duplicate anchor text-based crap still does okay, though this junk's days may be numbered
- authority sites are, thank God, finally being valued over anchor-text drivel; however, Google needs to value context better.
There is zero "localrank" sensibility applied now, which means an authority site (like a newspaper) that happens to mention keywords in an article may be doing better than it should, at the expense of "authority on this topic" sites. Said it before, will say it again: the roots of this update are a tremendous improvement. Quality content - and a *history* of quality content - is being rewarded, but the stew isn't done. It needs more topic context and valuation of authority within a niche. If that is the next step (and since Google bought that localrank technology, one would guess it is either the next step or the one after that), then Google could have something truly amazing on their hands.
LOL - The best phrase to come out of this thread :)
The fun and games sort of remind me of INK in 2000, pre-PFI (the AV turkey shoot was not to encourage payment, I don't think) - it looks like it might be get-your-credit-cards-out time! The Google free lunch may soon be over for commercial sites.
Emphasizing page title a bit more would also address the problem of newspaper-like pages ranking highly just because the keywords happen to appear in one article. If title mattered more, and a keyword wasn't in the title, these newspaper examples would have to drop drastically.
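Something as blunt as this sketch would do it. The weights are made up, but it shows how a keyword missing from the title would drag a page down:

def title_adjusted_score(base_relevance, keyword, title,
                         title_boost=2.0, title_penalty=0.3):
    # Hypothetical: reward pages whose title carries the keyword, and
    # sharply demote newspaper-style pages where it only appears in
    # an article body.
    if keyword.lower() in title.lower():
        return base_relevance * title_boost
    return base_relevance * title_penalty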
I see pages that have the keywords listed once, with about a paragraph of content, coming back as number one for that keyword-pair search.
Then I cannot even find my page using those same keywords. I have several paragraphs of content. My page is decently linked and has a fair keyword density, like what's recommended here. The page that is coming back number one has one or two incoming links.
Valuing and awaiting your expert opinion...
The webmasters that register hundreds of domains could just put a different name and business name on each domain they buy... they don't HAVE to use a proxy to hide their identity. Besides, many large, legitimate companies are now making small portals to go after niche-targeted audiences.
Starbucks doesn't have just one coffee shop, why should webmasters have just one website? Don't wait for the people to come to you, go after the people.
Google surely wants to eliminate this activity. I was just pointing out a way to do it.