| 8:38 am on Oct 28, 2005 (gmt 0)|
Is there a way to fix the canonical problem yourself?
I just did a site:domain.com -www and I see that my home page is listed.
I have been using a 301 redirect in my .htaccess for a long time, thinking it would solve it... :(
| 8:43 am on Oct 28, 2005 (gmt 0)|
Asher02 - well, hopefully Jagger3 will solve it, so in the first instance I would wait.
If you have the 301 from non-www to www, you may want to consider adding a link somewhere to the redirected URL (e.g. the non-www page) so that Googlebot finds and indexes the 301.
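For reference, a minimal sketch of that non-www to www 301, assuming Apache with mod_rewrite enabled; example.com here is a placeholder for your own domain:

```apache
RewriteEngine On
# Send any request for the bare domain to the www host with a permanent (301) redirect
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

You can check it is working with `curl -I http://example.com/` and look for a `301 Moved Permanently` status line pointing at the www version.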
| 8:44 am on Oct 28, 2005 (gmt 0)|
|guys which DC's apart from 66.102.9 seem to have the jagger2 |
I have the same question..
| 8:46 am on Oct 28, 2005 (gmt 0)|
At the moment it just looks like 66.102.9.* & 66.102.11.* - it has spread back and forth on a couple more C-classes.
| 8:49 am on Oct 28, 2005 (gmt 0)|
Thanks Dayo ;)
| 9:20 am on Oct 28, 2005 (gmt 0)|
These results suck; relevance has gone out the window.
One of my sites lists the top 100 widget sites. One of those sites used a keyword in its description - the one keyword I never targeted with my strategy - so that keyword exists once and once only, in that site's description.
Thanks to Jagger, I now rank in the top 10 for this keyword, which is really competitive. However, I should not be ranking for it; how I do is completely absurd.
I personally don't think I deserve this ranking....Google thinks I do.
| 9:48 am on Oct 28, 2005 (gmt 0)|
Let's take a step back for a minute and think about the logic of an update like this. Google has been running since 1998, and has continuously added billions of websites to its listings.
Now, unless Google controls the content in its listings, the search engine becomes redundant, because people will look elsewhere for more relevant results.
Because the latest update is such a major one, and, I believe, aimed at creating the most valid and worthy set of search results, it is necessary to shake up the SERPs, delete invalid ones, and put the rest in representative order.
Doing this means that not everyone will end up happy. But it also means that the SERPs will be more valid. However, the update is not yet over, and until it is, the newly formed Google SERPs will not be available.
The alternative would be to continue with a list of search results that can be manipulated by SEOs, scammers, and so on. At least Google is conscientiously trying to make it fair for everyone, and at the same time hoping to provide a concrete base of good-quality results.
It seems to me that the SERPs were getting out of control and Google is trying to fix it. My suggestion is that we all wait and see what the final outcome is, after the Jagger3 update.
| 9:55 am on Oct 28, 2005 (gmt 0)|
"At least Google is conscientiously trying to make it fair for everyone............."
Eazygoin... Which planet did you just drop in from?
| 9:59 am on Oct 28, 2005 (gmt 0)|
Obviously, not the same one as you. I am trying to be objective, not negative.
| 10:21 am on Oct 28, 2005 (gmt 0)|
Google is a business whose only responsibility is to its shareholders. Consideration of Google's tactics starts to make sense only when this is taken into account. Google is not the local SEO charity trying to be fair to all of us poor marketers.
| 10:43 am on Oct 28, 2005 (gmt 0)|
JudgeJeffries - how does any company provide value to its shareholders?
By providing a service to its customers, or by saying screw the customers, the money will come in anyway?
My view is that by providing the most effective and up-to-date database possible, Google will continue to be seen as the leader in its field, and thus maintain or increase its customer base.
I don't wish to argue the point with you, but I guess we all have our opinions, and I respect yours.
| 10:51 am on Oct 28, 2005 (gmt 0)|
Googleguy, thank you for the assistance!
Thanks for seeing to my report through the dissatisfied link!
| 10:53 am on Oct 28, 2005 (gmt 0)|
The results of Jagger2 are, erm, sub-optimal.
One term I have been watching (medical for a certain city) seems to be completely filled with EASY-TO-RECOGNIZE spam. E.g. when looking for a doctor, I see a tax consultant (!) at #10 on the first page instead of an M.D. (and there are hundreds if not thousands of M.D.s in this city). And a dating site is on page 2...?
This had been eliminated in previous updates because it was so easy to detect. Has Google done a complete rewrite of the engine?
[edited by: mzanzig at 10:59 am (utc) on Oct. 28, 2005]
| 10:56 am on Oct 28, 2005 (gmt 0)|
I spend hours discovering superspammers in the top 10 listings and reporting them. Let's hope that all of you, instead of moaning, do the same and forget your own pages' problems for a few days. Let's help Google get rid of those scammers forever.
| 11:04 am on Oct 28, 2005 (gmt 0)|
So when will jagger3 begin?
| 11:07 am on Oct 28, 2005 (gmt 0)|
|Lets help Google to get rid of those scammers forever |
Sure. What makes me curious, though, is the fact that in previous updates Google already got rid of this blatant spam. Why is it now all back? The results of Jagger2 look to me as if I am looking at a newly built engine instead of a mature, fully empowered one. I wouldn't even want to call the SERPs "beta" - in their current state the SERPs are far from useful to end consumers at all.
And yes, I reported some spammers using the way described by GG before, but I started to wonder: shouldn't Google be doing their homework first, given the many obvious spam sites in the SERPs? Why ask webmasters to look for spam sites?
| 11:20 am on Oct 28, 2005 (gmt 0)|
"Why ask webmasters to look for spam sites"
Because a mature webmaster knows better how to spot spammers: he has knowledge of how search engines used to work and rank pages in the past (but not anymore). I believe that Jagger's way of fighting spam, mainly through webmaster reports, will be followed by Yahoo and MSN as well.
| 11:30 am on Oct 28, 2005 (gmt 0)|
I also think reports to Google don't make much sense. Why report spammers when they are so easy to detect by a manual check from a Google team? Removing individual pages doesn't achieve much.
Those spam reports to Google may be more of an emotional relief for the white- and grey-hat webmasters who went down than something that will improve the SERPs in the long run. It would only make sense if webmasters knew detection techniques that Google doesn't. Just looking at the top of the results is something Google must be able to do better itself, I think.
Reporting spammers doesn't feel good to me. There is too thin a line between grey-hat spammers and some of my competitors. Bad for the ego. Just my opinion.
| 11:36 am on Oct 28, 2005 (gmt 0)|
Spam-spotting lessons:
1) Check the source code.
2) Check the noframes code.
3) Check Google's text-only cache.
4) Select all (Edit > Select All) and check for hidden text.
5) Download the background image (.gif or .jpg) and check its colour (let's say #FFFFFF); if the text and links are also #FFFFFF, then we have an obvious hidden text-and-links spam case.
6) Check for networks of pages that all point to the main big Puffadder mama.
7) Check for meta refresh redirections, or download the JS file from the scam page (www.mysite.com/red.js) to investigate the JS redirection script.
That's for starters...
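A rough sketch of how a couple of the checks above (the matching-colour test in 5 and the meta refresh test in 7) could be automated over raw HTML. `find_spam_signals` is a hypothetical helper, not any actual Google tool, and real pages would need a proper HTML parser rather than regexes:

```python
import re

def find_spam_signals(html):
    """Flag two crude spam signals in an HTML string:
    a meta refresh redirect, and body text coloured the same
    as the page background (classic hidden-text spam)."""
    signals = []

    # Check 7: meta refresh redirection
    if re.search(r'<meta[^>]+http-equiv=["\']?refresh', html, re.IGNORECASE):
        signals.append("meta-refresh")

    # Check 5: font colour identical to the body background colour
    bg = re.search(r'<body[^>]+bgcolor=["\']?(#?\w+)', html, re.IGNORECASE)
    fg = re.search(r'<font[^>]+color=["\']?(#?\w+)', html, re.IGNORECASE)
    if bg and fg and bg.group(1).lower() == fg.group(1).lower():
        signals.append("hidden-text")

    return signals
```

Run against a page with white-on-white text and a meta refresh, it would report both signals; a page with normal black-on-white text reports none.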
| 11:40 am on Oct 28, 2005 (gmt 0)|
I'm not happy with this 'report a spammer' concept.
It strikes me that if Google can't see the problems for themselves, then relying on spam reports is not going to achieve much overall. I thought they tried to identify tactics and adjust algos to weed out spammers en masse. Spamming is not rocket science, and by now they should know all the tricks. Why have they started looking at it in such an inefficient way?
Have they run out of better ideas?
| 11:42 am on Oct 28, 2005 (gmt 0)|
Until last week my site had a PR5 with 200 backlinks.
Today it is PR0 (all 200 or so pages) with no backlinks - am I in the sandbox?
Any ideas how I get my PR back?
| 11:45 am on Oct 28, 2005 (gmt 0)|
Please do not misunderstand me - I am all for getting rid of spam in SERPs. No discussion about this.
What I am wondering is: looking at a Jagger2 search result whose first page contains up to 80 per cent pure spam and/or black-hat techniques, what is the point in reporting spam? It turns out to be endless, and it could take hours to weed out the results - and that is just for one single phrase. And for stuff that HAD BEEN fixed already, i.e. it was machine-detectable before.
| 11:47 am on Oct 28, 2005 (gmt 0)|
This spam reporting business seems like trying to empty the Atlantic Ocean with a teaspoon.
| 11:50 am on Oct 28, 2005 (gmt 0)|
"I'm not happy with this 'report a spammer' concept"
The spam-spotting list I posted above is almost impossible to detect automatically with algos.
That is the only way I reckon to get rid of spam, and it needs hundreds, even thousands, of people to spot it voluntarily (if we want a better web in future).
| 11:56 am on Oct 28, 2005 (gmt 0)|
Has anyone tried out the new AdWords Keyword Tool yet? You can check your URL for keywords now, but it didn't seem to work for my website, returning no words. So I included all URLs on the website, some 1200 pages, and it returned results for just one page.
The relevance of this is that some people suggest on here that Google wants to increase revenue with the new update as much as provide an update to the SERPs.
| 11:57 am on Oct 28, 2005 (gmt 0)|
For months I have been arguing for more human input into the serps.
I think that Google has realised that it cannot stop spam through algos/filters alone. By encouraging spam reports, it is allowing human judgement a bigger role. And human judgement is always going to be better than algorithms at weeding out bad sites in the ever-changing spam game.
Think about it: if Google had stayed algo-only, there would be no end to this algo change > tweak site > mcdar/webceo cycle. Google would never have great SERPs and would waste massive amounts of time and money spidering and tweaking.
As it is, Google will soon have human-verified SERPs in most sectors, and can then, once it trusts the existing sites in the SERPs, shift its energy to policing newcomers.
For me, Jagger (so far, anyway) represents a huge leap forward in attitude from Google that has set it apart from the other search engines and will keep it ahead for a long time.
All these spam reports that must be flooding in now give them a vast expanse of information to work with.
It's all about Google trying to establish trust in Webmasters.
| 11:57 am on Oct 28, 2005 (gmt 0)|
mzanzig - I agree.
Perhaps they will see new tactics and adjust the algo, but in this frenzy of reporting I think the bigger picture is being lost. It's a great way to appear 'caring' towards webmasters in this forum, but ultimately it is a stitch-up.
If I wanted to soften the impact of a big mistake I would:
1) Say its rolling out over several weeks.
2) Ask for key players (webmasters) for help and appear to be responding.
3) Blame canonical technical problems that seem very complicated and a bit woolly.
| 11:58 am on Oct 28, 2005 (gmt 0)|
>>The top sites have been changing over the past week or so but how can a site with no tags rank so well?
Thanks for the info vBMechanic and McMohan.
Do you think it's worthwhile removing these tags from my sites, or should I maybe wait until Jagger is complete? To date I've always ranked very well with them.