Forum Moderators: open
My observation so far: little change this month from last. Anchor text of inbound links still counts big time, and PR seems to be worth the same as before. IOW, it's the same old, same old. One aspect that isn't relevant to the SERPs I am most familiar with is "spamminess". I don't see much more spam, but then these SERPs don't tend to be the ones spammers would show up on. So the index may be more spammy and I just wouldn't see it.
Translation: put your URL on your business cards and brochures, as that will be the only way people find it - raw internet marketing will do nothing for you.
Probably a good idea if your customer insists on Flash movie intros. I have a customer who insists on Flash intros. Very irritating when I go there. I should probably bookmark index2.html to avoid the irritation.
A drop in backlinks is noted, but the net effect on position and PR seems negligible. (Wish I had kept track of the total results count in the past.)
Results from pages deeper in the site have appeared. This may be just the effect of time.
Are there freshbot results without dates mixed in there? Things I changed a couple of days ago are appearing like they had just been included in the update. (wait a few days for final results, I guess)
The plague of spammers that was crowding the results a few months ago seems to have been banished entirely. Very clean results - best ever in my little category.
It seems to me that the easiest way for the google algo to "catch" expired domain names would be to simply assign a penalty to all domains that have expired in the past couple years in one fell swoop. I do not see how they could realistically find out which sites got new links, etc.
rjohara:
I've seen the same thing happening myself...on more than one site. I'm hoping next month will be better...
Clark, are many of your pages static pages that you haven't changed in a long while? I'm still trying to understand why my total indexed pages dropped from 300 to 200 last month, and to 110 this month - not a happy result! All pages are spam-free static text pages on my .org site. Nothing has been changed, and the number of indexed pages keeps dropping.
However, Google still has the problem with "expired domains". On my searches I still see these domains ;( Is it possible to filter them in an automated way - i.e., compare the anchor text with the text on the site it points to, and if the texts differ absolutely, just don't count the link? That would solve this problem, and the problem of "unrelated links pages", forever. At least from the searcher's point of view - he wouldn't want to jump from buying a small gift for his girlfriend to a luxury car, or a porn site. What do you think, GoogleGuy?
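The filter suggested above could be sketched as a simple vocabulary-overlap check - this is only an illustration of the idea, not anything Google has confirmed doing, and all names and the threshold are hypothetical:

```python
import re

def tokens(text):
    """Lowercased word set, for a very rough topical comparison."""
    return set(re.findall(r"[a-z]+", text.lower()))

def count_link(anchor_text, page_text, threshold=0.1):
    """Hypothetical filter: discount a backlink when the anchor text
    shares (almost) no vocabulary with the page it points to - as can
    happen when an expired domain is re-registered for a new topic."""
    a, p = tokens(anchor_text), tokens(page_text)
    if not a or not p:
        return False
    overlap = len(a & p) / len(a)  # fraction of anchor words found on the page
    return overlap >= threshold

# Anchor about gifts pointing at a car site: no shared vocabulary, so drop it.
print(count_link("small gifts for your girlfriend",
                 "luxury cars, financing and test drives"))  # False
print(count_link("luxury car dealer",
                 "luxury cars, financing and test drives"))  # True
```

A real system would need stemming and stop-word handling ("car" vs. "cars" already trips up the raw overlap here), but even this crude version separates the two examples above.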
However, the sites with hidden text were completely removed - thank you!
So in general, the change is "less spam".
The web site is relevant to the keywords, but it is spam-heavy: three domain names with largely the same content, cross-links, and a Google cached page that differs from the actual site page, with the cached version optimized for specific keywords.
My Question:
Is it possible that such sites will collapse and fall before the dance ends or do we have to settle for second, third, fourth or fifth place below spam merchants?
I love Google, but this is a bad day :(
I had a small mistake (or whatever!) in my page and did not know it. Something like this example here...
ABCD, def[extra space here] , Neh2008
Now, when I ask for ABCD, def, neh2008, Google suggests my original phrase (with the comma after def). But it doesn't show me ANY page for that suggestion when I click on it.
I hope you are all clear on the GOOGLE SUGGESTION - it's the "Did you mean: ..." line at the top of the SERP.
Please make me think more about the possibilities.
TIA
With this update, the English language/other language balance is a little more in favor of the English language sites, and with a much higher number of English language sites in the top ten positions. I wonder if other people with similar keywords (used in English and foreign languages) have seen similar movement . . .
My opinion is that this update has given more weight to .edu and possibly .org sites... for my main keyword, the rise of those sites is quite obvious. Anyone notice this?
Have a read of my post "I'd sell my wife for PR" - I've reached a similar conclusion by a slightly different route.
e.g. Sites with greater human involvement get higher rank.
The purchase of blogger.com is no coincidence - it's about creating filters to track more human-based recommendations, and how people recommend sites.
With this update, the English language/other language balance is a little more in favor of the English language sites, and with a much higher number of English language sites in the top ten positions. I wonder if other people with similar keywords (used in English and foreign languages) have seen similar movement . . .
For the first time I tried using a foreign language last week (on a totally English site). Not as a "keyword" - just a few words of text buried down in the main page. I am amazed to find I am getting first place for searches on the non-English words.
Since I've never tried this before, I can't comment on any changes with this update, but the results for the foreign words are way better than I expected. Has it always been like this, or is it a result of algo changes?
e.g. Sites with greater human involvement get higher rank.
I know this... but I am talking about this update giving a little "extra" (a possible change in the algo). The .edu & .org sites for my keyword that have NOTICEABLY jumped up are quite static and have not been updated in years (e.g., one page that is ahead of me is an article from 1991, untouched for a long time). So this page has not added new content, not been SEO'ed, not anything - it just jumped up about 7 spots. Just wondering if anyone else has noticed this, or is it just a coincidence?
It can't be good for the searcher to get results that list:
Site #1
Site #1
Site #2
Site #2
Site #3
Site #1
Site #1 affiliate
etc.
I shouldn't complain, as I "enjoy" several #1/#2 pairings for my own site. Still, it can't be good for any of us. Better access to the best results is the goal, isn't it?
If you want to ask about specific sites, maybe drop us a report?
How and where? "Not satisfied with search results?" does not seem to have any effect. There are two software sites that use gibberish phrases and show up inappropriately - one of them shows a directory listing for renewable energy! There are about 10 doorway pages, all displaying the same Flash menu.
I have been wondering if they can somehow "straddle googlebot" by appearing to be fresh content. I use javascript:alert(document.lastModified) in the address bar to check a page's last-modified date, and this technique always displays my system clock time for their pages (but the correct date for other sites). Is that why they cannot be evicted and keep getting this inappropriate exposure?
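For what it's worth, document.lastModified showing your system clock usually just means the server sent no Last-Modified header (typical of dynamically generated pages), so the browser falls back to "now" - it doesn't prove the pages look fresh to Googlebot. A minimal Python sketch of that fallback behavior (the function name and example headers are made up for illustration):

```python
from email.utils import parsedate_to_datetime
from datetime import datetime, timezone

def effective_last_modified(headers):
    """Mimic the browser's document.lastModified: use the Last-Modified
    response header if the server sent one, otherwise fall back to the
    current time - which is why dynamic pages show your system clock."""
    value = headers.get("Last-Modified")
    if value:
        return parsedate_to_datetime(value)
    return datetime.now(timezone.utc)

# A static page that sends the header reports its real date:
static_headers = {"Last-Modified": "Wed, 21 May 2003 10:00:00 GMT"}
print(effective_last_modified(static_headers).year)  # 2003

# A dynamic page without the header just reports "now":
print(effective_last_modified({}) > effective_last_modified(static_headers))  # True
```

So the doorway pages may simply be served dynamically; whether Googlebot treats them as fresh is a separate question about how (or whether) the crawler uses that header at all.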