If you have the 301 from non-www to www in place, you may want to consider adding a link somewhere pointing at the redirecting URL (e.g. the non-www page) so that Googlebot finds and indexes the 301.
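For anyone checking their own setup, here is a minimal sketch (Python standard library only, with example.com as a placeholder domain, not from this thread) that fetches the non-www root without following redirects, so you can confirm it answers with a 301 pointing at the www version - the same response Googlebot sees:

```python
# Minimal sketch: request the non-www host directly; http.client does
# not follow redirects, so we see the raw response Googlebot gets.
import http.client

conn = http.client.HTTPConnection("example.com")  # placeholder non-www host
conn.request("GET", "/")
resp = conn.getresponse()

print(resp.status)                 # expect 301 (Moved Permanently)
print(resp.getheader("Location"))  # expect http://www.example.com/
conn.close()
```

If the status comes back as a 302 instead of a 301, that is worth fixing too, since only the permanent redirect tells Google to consolidate on the www version.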
One of my sites lists the top 100 widget sites. One of those sites used a keyword in its description - the one keyword I never targeted with my strategy - so this keyword exists once, and once only, in that site's description.
Thanks to Jagger, I now rank in the top 10 for this keyword, which is really competitive. However, I should not be ranking for it; how I rank for this is completely absurd.
I personally don't think I deserve this ranking... Google thinks I do.
Now, unless Google controls the content in its listings, the search engine becomes redundant, because people will look elsewhere for more relevant results.
Because the latest update is such a major one, and, I believe, aimed at creating the most valid and worthy set of search results, it is necessary to shake up the SERPs, delete invalid ones, and put the rest in representative order.
Doing this means that not everyone will end up happy. But it also means that the SERPs will be more valid. However, the update is not yet over, and until it is, the newly formed Google SERPs will not be available.
The alternative would be to continue with a list of search results that could be manipulated by SEOs, scammers, and so on. At least Google is conscientiously trying to make it fair for everyone, and at the same time hoping to provide a concrete base of good-quality results.
It seems to me that the SERPs were getting out of control and Google is trying to fix it. My suggestion is that we all wait and see what the final outcome is, after the Jagger 3 update.
One term I have been watching (medical for a certain city) seems to be completely filled with EASY-TO-RECOGNIZE spam. E.g. when looking for a doctor, I see a tax consultant (!) at #10 on the first page instead of an M.D. (and there are 100s if not 1000s of M.D.s in this city). And a dating site is on page 2...?
This had been eliminated in previous updates because it was so easy to detect. Has Google done a complete rewrite of the engine?
Let's help Google get rid of those scammers forever.
Sure. What makes me curious, though, is the fact that in previous updates Google already got rid of this blatant spam. Why is it now all back? The results of Jagger2 look to me as if I am looking at a newly built engine instead of a mature, fully empowered one. I wouldn't even want to call the SERPs "beta" - in their current state the SERPs are far from being useful to end consumers at all.
And yes, I reported some spammers using the method GG described before, but I started to wonder: shouldn't Google be doing their homework first, given the many obvious spam sites in the SERPs? Why ask webmasters to look for spam sites?
>>"Why ask webmasters to look for spam sites"
because a mature webmaster knows better the way to spot spammers ,because he has knowledge of how Search Engines used to work and rank pages in the past (but not anymore).I believe that jagger way of fighting spam by reports of webmasters ,mainly ,will be followed from Yahoo and MSN as well .<<
It strikes me that if Google can't see the problems for themselves, then relying on spam reports is not going to achieve much overall. I thought they tried to identify tactics and adjust algos to weed out spammers en masse. Spamming is not rocket science, and by now they should know all the tricks. Why have they started looking at it in such an inefficient way?
Have they run out of better ideas?
What I am wondering is: looking at a Jagger2 search result that contains up to 80 per cent pure spam and/or black-hat techniques on the first page, what is the point in reporting spam? It turns out to be endless, and it could take hours to weed out the results - and that is just for one single phrase. And for stuff that HAD BEEN fixed already, i.e. it was machine-detectable before.
The relevance of this is that some people on here suggest that Google wants to increase revenue with the new update as much as improve the SERPs.
I think that Google has realised that it cannot stop spam through algos/filters alone. By encouraging spam reports it is allowing human judgement a bigger role. And human judgement is always going to be better than algorithms at weeding out bad sites in the ever-changing spam game.
Think about it: if Google had stayed algo-only, there would be no end to the algo change > tweak site > mcdar / webceo cycle. Google would never have great SERPs and would waste massive amounts of time and money spidering and tweaking.
As it is, Google will soon have human-verified SERPs in most sectors and can then, once it has trust in the existing sites in the SERPs, shift its energy to policing newcomers.
For me, Jagger (so far anyway) represents a huge leap forward in attitude from Google that has set it apart from the other search engines and will keep it ahead for a long time.
All these spam reports that must be flooding in now give them a vast expanse of information to work with.
It's all about Google trying to establish trust in webmasters.
Perhaps they will see new tactics and adjust the algo, but in this frenzy of reporting I think the bigger picture is being lost. It's a great way to appear 'caring' towards webmasters in this forum, but ultimately it is a stitch-up.
If I wanted to soften the impact of a big mistake I would:
1) Say it's rolling out over several weeks.
2) Ask for key players (webmasters) for help and appear to be responding.
3) Blame canonical technical problems that seem to be very complicated and a bit woolly.