In doing searches for topics I am interested in, but unrelated to my sites, I have noticed it takes longer to find good pages. Just a year or so ago, I could find four to eight good web pages on the first page. Now, sometimes I find one or none on the first page. Some keyword phrases have become practically useless to search with.
This is not really Google's fault. The web is just getting littered with all kinds of 'high tech' tricks for building web sites easily and getting traffic quickly. I assume this is to make a quick buck. A real driving force is the fact that people can put pay-per-click ads on their very own sites for free. These types of ads, such as Google's own AdSense, are a great idea, but there is bound to be a downside to every new thing.
I am sure Google is trying hard to fix these problems because I am seeing different results every few weeks or so for the same search phrases. The phrases I check the most are ones that bring my own web sites up on the first page. This way I can tell how much of my traffic is coming from the search engines.
Luckily my web sites rank pretty much the same, but the other sites that are ranked above and below mine keep changing. Some sites that are ranked near mine help my traffic while others don't. It seems that with so many 'trick' sites showing up, it can actually be good to be listed next to one. People will notice it is crappy, then visit yours, realize it is better, bookmark it, and come back again later.
With the number of web pages growing fast and the number of 'quick buck' folks showing up, this situation can only get harder for Google to handle. But imagine having a slew of crap pages all around your listing: your quality listing will stand out very well and you will get higher quality traffic, and possibly more of it.
Google is looking into a way to 'register' your domains with them and give you an ever-growing 'boost' in ranking for registering as the system grows and matures. The 'bad folks' can then easily be tracked and removed for breaking ethical rules in web design to cheat their way to the top. If these pesky folks decide not to register, to 'hide' from Google, then their rankings will start to suffer as legitimate registered sites get ranked higher. It is like a hybrid of a search engine and a directory.
As for these latest changes: lots of movement has taken place. My site is the only one that hasn't moved in this update. Every other site has moved, and a few new ones have been added. It saddens me to see that a few 'quick buck' sites are now on the same page as mine. But, as I mentioned above, this isn't always a bad thing. I won't be able to tell whether it helps my site for a week or so. Sometimes these neighbors of mine can have a huge impact on my traffic from the Google search pages!
Another thing I have noticed over the past few years is that more and more of the best quality traffic to my sites comes from links on other sites. Google's search pages only provide about 15% of the income-earning traffic to my sites. If you 'lose' a ranking on the Google pages, that may not affect your income at all. Get links from other sites similar to yours and watch your quality traffic grow. Trust me, it works. I have been doing it for years. People have even said that if you get links, Google will actually place your pages higher on theirs! A double bonus in my book!
Besides those links from other sites, add two or three pages per day with information about your topic. These pages will get indexed very fast, and other sites will link to them. Make sure people can get to the main part of your site from each new page you make. I have read in this forum that adding new pages not only brings links, but once again, Google will move your site above the others just for adding new pages all the time. You don't need to add more than two or three a day, though. If you run out of subjects, start to branch out. Don't write anything boring. People will leave, will forget about the site, and won't link to it. Boring things include sales tactics or repeating the same thing over and over.
One bad thing about building the site with article pages is that you will get TONS of email from sites trying to swap links! It is a royal pain and you should prepare for it. Ironically, most of these people run web sites that try to make a quick buck! (See above.) Someone said that linking to some of these sites will get you ignored by Google! Do not answer these emails and do not swap links! Don't let these bad guys stop you from adding to your site. Also be ready for bad guys who want to buy links from you; Google might ignore you if you link to them. I am not sure if Google is trying to keep these crappy sites from getting links, but the fewer links they have, the fewer people will visit them, and they will give up and go away! We can only hope.
In the time some of us spend checking the Google data servers, we could have written one good article on our very own site.
Good luck with your web site...
[edited by: tedster at 1:57 am (utc) on July 7, 2006]
I notice that on 3 of the 72's - 22.214.171.124, 126.96.36.199 and 188.8.131.52 - my site is at the top for the site:www.domain.com command. However, my rankings are not the same as pre-June 27th: with the exception of one keyword I've checked, they're worse.
The only DC where my rankings remain unchanged is
Do your sites agree with this pattern, or are your rankings actually fully restored on the 72's?
The 72's are providing a real ray of hope today: both site:mysite.com and site:www.mysite.com are showing properly, and I'm back on the first page for my best search term, in the #9 position instead of #3, but much better than the #80 position I was in after the 27th changes.
Proper listings on site:www.mystore.com but not on site:mystore.com, and listings improved only slightly; not indicative of a real fix.
Does everyone see the corrected 72's as a good sign just the same?
Ready? Here goes...
I believe any changes on the 27th were refreshing data used by an existing algorithm.
I guess they switched on an experimental buggy filter by mistake then? Hopefully in another six months or so they might accidentally switch it off again.
Anyway. Nice to know that they are working away feverishly behind the scenes investigating the cause of the problem.
If it was simply a refresh of data used by an existing algorithm, then why would sites doing well before suddenly disappear from the rankings?
Could they have accidentally implemented an old algorithm from a few years ago?
Would love your take on what Matt said.
Occasionally a "roll back" of SERPs to a previous date
Occasionally pages added to inflate the index size
I would also say that this happens in two different ways:
Once at the end of every business quarter (usually not noticed on all data centers), and sometimes a month later (maybe when the previous data refresh propagates).
That's speculation on my part, but I have a year of daily historical data showing every ranking I have on a hundred or so keywords for two different domains, and it has been almost like clockwork since December 27th with the data refreshes.
I really don't understand why a site: search that should exclude www pages returns thousands of www pages that are Supplemental Results.
I have been seeing this effect on most sites for several months now.
which shows normal www pages that are not Supplemental.
I asked Matt Cutts to comment on this, and the comment was deleted.
What also really gets me is that we redirect from non-www to www, yet the site: command without the www returns results that have www, exactly the same as with the www. They are all non-supplemental, and it appears to be the exact same dataset.
Now on some other sites I have checked, where the redirect goes the opposite way (www to non-www), the site: command shows two different sets: searching with www will usually show supplementals with www, and searching without www will show the correct non-www results. The Europeforvisitors site is a good example of this.
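For anyone wanting to set up the kind of non-www to www redirect being discussed, here is a minimal sketch, assuming an Apache server with mod_rewrite enabled and .htaccess allowed (example.com is a placeholder for your own domain):

```apache
# Hypothetical .htaccess sketch: 301-redirect every non-www request
# to the www host, so Google sees only one canonical version.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

The permanent (301) status matters here: a temporary (302) redirect may leave both versions in the index, which is exactly the duplicate www/non-www situation we are seeing in the site: results.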
Now I have seen other sites that redirect non-www to www behaving the same way as ours, and they are having the same problems, but I haven't looked at enough of them to make a judgment.
What I would like to know...Which is correct? Could there be problems here we should know about?
Keep in mind that running the site: command for the www and non-www versions of webmasterworld.com behaves in a similar manner to ours, yet they seem fine.
[edited by: arubicus at 8:58 pm (utc) on July 7, 2006]