Update Pluto : Back Links Updated on Some Data Centers
Bhavin
msg:3058625
1:23 pm on Aug 24, 2006 (gmt 0)

I have noticed that back links have updated on the 64.233.187.104 and 64.233.187.99 datacenters.

 

reseller
msg:3069808
11:31 am on Sep 2, 2006 (gmt 0)

colin_h

"Reseller, how many months have we been saying this? "

Too many months, indeed. Let's say that we have been waiting at least since BigDaddy was on the move [mattcutts.com].

However, it seems that the deployment of the new infrastructure has been more complicated than the good friends at the plex expected.

I have faith in what our kind fellow member GoogleGuy has told us about ".....at the end of this summer", and my own personal motto has always been: Always Look On the Bright Side of Life ;-)

g1smd
msg:3070016
5:34 pm on Sep 2, 2006 (gmt 0)

>> It must be getting on 18 months for some of the forum participants and still no real light at the end of the tunnel. <<

All the stuff that I put in place 18 months ago was finally fixed about 6 months ago.

All the stuff that I put in place 12 months ago was fixed just last week.

Some types of Supplemental Results need you to take action; others can be ignored, and Google takes them out of the SERPs after a year. All of this now seems very, very clear.

g1smd
msg:3070190
10:12 pm on Sep 2, 2006 (gmt 0)

It wasn't quite so clear six months ago when, having failed to fix everything, I thought that Google had majorly goofed.

It turns out that things take far longer to fix than anyone here ever imagined (but they are fixed in a systematic and logical way), and that some types of Supplemental Results need care and attention, while others can be left to fade away on their own... eventually.

reseller
msg:3070200
10:47 pm on Sep 2, 2006 (gmt 0)

g1smd

"Turns out that things take far longer to fix than anyone here ever imagined"

And to be honest, I guess even the folks at the plex underestimated that.

I recall, if I'm correct, tedster writing a very informative post in the past about how difficult it can be to deploy something like the BigDaddy software update.

g1smd
msg:3070205
11:02 pm on Sep 2, 2006 (gmt 0)

On the issue of timescales, I am talking about how they hang on to a Supplemental Result for almost exactly one year after the URL is redirected or the content is gone, as well as how they hang on to the Supplemental Result for the previous version of the content at a live URL for roughly the same timescale.

When the original Supplemental cleanup occurred last August, it was only partial. The next one back in February or March was only partial again. The latest one, last week, was yet again only partial; but having now seen the same effects three times I suddenly realised how it all fits together, as well as what you should be looking at and correcting, and what is already unimportant and no longer needs to be tracked.

I already know what state various sites will be in at the next supplemental update (in another 6 months or so from now), and I have already taken all the steps necessary to make the best usage of that update.

Bewenched
msg:3070246
1:04 am on Sep 3, 2006 (gmt 0)

So I guess the question should be: if the content is still there and has been updated, will Google ever pull it out of Supplemental, or is it totally doomed for a year? We have items stuck in Supplemental for no apparent reason.

g1smd
msg:3070266
1:49 am on Sep 3, 2006 (gmt 0)

If content has been edited, then searches for the old content should return a Supplemental Result, and searches for the new version of the content at the same URL should return a normal result.

If the "current content" searches return a Supplemental Result, then it is likely that you have more than one URL that can access that content (www and non-www, or multiple domains, or multiple parameters in dynamic URLs) causing you duplicate content issues, or you have used the same title and/or meta desciption on multiple pages causing you pseudo-duplicate content issues. Both are easy enough to fix.

Alternatively, you just haven't got enough PageRank flowing round the site, and you need to get a few more inbound, on-topic, quality links.

Finally, make sure that every page of the site links back to http://www.domain.com/ rather than to /index.html as that can cause duplicate content and PageRank issues too.

Related thread: [webmasterworld.com...]
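
To make g1smd's checklist concrete, here is a minimal sketch of checking it mechanically (illustrative only, not from the thread: example.com is a placeholder, the variant list is an assumption, and it uses only the Python 3 standard library). It fetches the usual duplicate-URL variants without following redirects and reports which ones answer 200 instead of 301ing to the canonical homepage:

    # Hypothetical example: verify that duplicate-URL variants all
    # 301 to one canonical homepage. "example.com" is a placeholder.
    import urllib.error
    import urllib.request

    CANONICAL = "http://www.example.com/"
    VARIANTS = [
        "http://example.com/",                # non-www vs. www
        "http://www.example.com/index.html",  # /index.html vs. /
        "https://www.example.com/",           # https vs. http
    ]

    class NoRedirect(urllib.request.HTTPRedirectHandler):
        def redirect_request(self, req, fp, code, msg, headers, newurl):
            return None  # make urllib raise HTTPError instead of following 3xx

    opener = urllib.request.build_opener(NoRedirect)

    for url in VARIANTS:
        try:
            resp = opener.open(url)
            # A 200 here is an indexable duplicate of the canonical page.
            print(url, "->", resp.getcode(), "(duplicate content risk)")
        except urllib.error.HTTPError as err:
            target = err.headers.get("Location", "?")
            verdict = "OK" if err.code == 301 and target == CANONICAL else "check"
            print(url, "->", err.code, "Location:", target, "[" + verdict + "]")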

colin_h
msg:3070357
7:08 am on Sep 3, 2006 (gmt 0)

I'm sure that Big Daddy is difficult to get working properly, but Google are a billion-dollar company and surely know that they shouldn't implement something unless they can get it to work.

I'm always willing to listen to your insights Reseller, but this time I think you are too forgiving of Google's past 18 months. Please, anyone, tell me I'm not over-reacting here ;-)

All the best

Col :-)

reseller
msg:3070388
8:00 am on Sep 3, 2006 (gmt 0)

colin_h

"I'm always willing to listen to your insights Reseller, but this time I think you are too forgiving of Google's past 18 months. Please, anyone, tell me I'm not over-reacting here ;-)"

You are not over-reacting here :-)

You are right; as I said, the process of deploying BigDaddy has taken looooonger than any of us expected. During that process there have been several, sometimes strange, "Data Refresh", "Data Push" and even "Bad Data Push" events ;-)

But as g1smd mentioned, the folks at the plex have been working on fixing issues. Furthermore, GoogleGuy, Matt Cutts, Vanessa and Adam-BDP (Bad Data Push :-)) have all been here on forum 30 talking to us.

Furthermore, GoogleGuy told us that more changes would be noticed on the DCs at the end of this summer (I can see no more summer here in Denmark, btw :-)).

All these are positive, encouraging signs, IMO.

Whitey
msg:3070407
8:37 am on Sep 3, 2006 (gmt 0)

So we've fixed our sites and done all the basics per the related thread [webmasterworld.com...], and Google has responded; Matt has said:

http://www.threadwatch.org/node/8222
Just to chime in, I agree that site:edited.com is because of the meta description tag; it's got nothing to do with any data push. When all the snippet results look the same, we often collapse them together and show the "click here to see all the duplicates" message that turns on "&filter=0" as a url parameter. For "edited site" , I think it's happening because of the meta description always being the same, and as soon as "edited site" drops the meta tag or makes them all different, we'd immediately show plenty of "edited sites" results again.

So in summary, the power to show up how "edited persons" want for site: is entirely in the power of "edited site" , and they can change it at any time and as soon as we crawl/index the results, more results will show up. No data pushes involved at all. ;)

So does this mean that if this was completed in the last few days, with Googlebot hitting a site, repaired sites can just sit back and watch their results generally return?

[edited by: Whitey at 8:38 am (utc) on Sep. 3, 2006]
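
For anyone who wants to audit their own site for the pseudo-duplicate problem Matt describes, here is a rough sketch (my illustration, not Matt's code; the directory argument and the *.html pattern are assumptions) that scans saved HTML files and flags pages sharing one meta description:

    # Hypothetical example: report groups of pages that share the same
    # meta description. Usage: python dup_descriptions.py /path/to/html
    import sys
    from collections import defaultdict
    from html.parser import HTMLParser
    from pathlib import Path

    class MetaDescriptionParser(HTMLParser):
        def __init__(self):
            super().__init__()
            self.description = None

        def handle_starttag(self, tag, attrs):
            attrs = dict(attrs)
            if tag == "meta" and attrs.get("name", "").lower() == "description":
                self.description = (attrs.get("content") or "").strip()

    pages_by_description = defaultdict(list)
    for path in Path(sys.argv[1]).rglob("*.html"):
        parser = MetaDescriptionParser()
        parser.feed(path.read_text(errors="ignore"))
        pages_by_description[parser.description].append(path)

    for description, pages in pages_by_description.items():
        if description and len(pages) > 1:
            print(len(pages), "pages share:", description[:60])
            for page in pages:
                print("   ", page)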

g1smd
msg:3070610
5:21 pm on Sep 3, 2006 (gmt 0)

Yes, in general they can.

The problem is that many webmasters don't know how to read the Google results to see what actually needs fixing, and they then continue to look at things that Google will hang on to for another year and wonder why the fixes they implemented don't appear to be working: it's because they are looking at the wrong thing.

I was doing that up until 6 months ago too. Not any more.

Bewenched
msg:3070623
5:40 pm on Sep 3, 2006 (gmt 0)

g1smd mentioned

Finally, make sure that every page of the site links back to [domain.com...] rather than to /index.html as that can cause duplicate content and PageRank issues too.

It's very interesting that you bring this up: we have never linked to our default page.. I repeat.. NEVER linked to it. However, if you use Google Analytics it requires you to define that page.. and just last week that default page started showing up in the results. I know I have mentioned this in another post on the forums, and I'd love to hear from GoogleGuy on this one.

I'm so tired of writing 301s to satisfy the Google bot.. I'd much rather be working to improve my site for our customers and employees.

So to wrap it up in a nutshell for our experience:
We tried to legitimize our site with Google by using sitemaps (implemented November 2005), started using Analytics to see where our traffic was from and how we can build a better site for visitors.

We have had damage done to our site by having someone (probably a competitor) link to our site under SSL. We have always been told that no one can damage your site's listings in the search engines, but this is so not true.

Through these actions we have had serious and possibly irreparable damage done to our site's standings and rankings, and have watched pages slip into Supplementals by the tens of thousands.

We are now faced with options of:
Marketing heavily through PPC, banners, etc. (for which the ROI is beyond pathetic) and dealing with the rampant fraud that will occur; or the worst option of all... laying off employees until the mess can be cleaned up.

I'm personally going to wait until the end of September to see if things improve. We've not done anything intentionally shady: we don't use link farms, we don't buy links, we don't cloak, we don't spam, we have 404s and 301s in place.. we simply have a very large ecomm site.

If things do not improve, for whatever reason (unless Sitemaps tells me a reason), I will be removing Sitemaps, taking off the Analytics code and letting the bot find its own way around, using up its own resources instead of giving Google the heads-up of "hey, spider just these pages", which is what I assumed the Sitemaps program was for.

Sorry for the rant, but I'm to the point of saying "let go and let god" but in this case it's "let go and let goog"

trinorthlighting
msg:3070750
9:00 pm on Sep 3, 2006 (gmt 0)

New infrastructure rolling out, good data pushes, bad data pushes, partial PageRank updates, etc.....

These are what we all need to be prepared to deal with in the future as well. Google cannot be perfect, so you should have a backup plan.

colin_h
msg:3071204
12:24 pm on Sep 4, 2006 (gmt 0)

Hi Trinorthlighting,

"Google can not be perfect..." - please note my comments are not meant as an attack on your words, just a critism of Google.

How big an organisation, and how much money does a company have to be to be perfect. Google are constantly changing the rules that call for higher & higher levels of perfection from webmasters, web site owners and amateurs alike ... and yet we're sitting here, 18 months of manic tinkering behind us and still no further forward than before Jagger, Alegra, Old Uncle Tom Cobbly an' all!

IMHO, there is no company in the world better placed to be perfect than Google. If they can't get it right, then they should stop getting us to jump through hoops (i.e. numbers of links per page, link styles to home pages, meta tag formats etc.) ... There used to be a time that I just sat and created cool content, now I have to write junk to fit Google's daft webmaster rules.

As said before, but worth saying again ... not meant as a slur to trinorthlighting, just Google.

kwngian
msg:3071210
12:45 pm on Sep 4, 2006 (gmt 0)

Perhaps it is something that Google has done that can't be undone.

It's just like when I was filtering email spam: I started with a blacklist, which failed terribly, then I switched to a greylist. I know that my greylist may have some false positives (in Google's case, massive numbers of them), but I cannot turn back, because the volume of spam that got through was so huge that I'd rather have those false positives.

oaktown
msg:3071294
3:09 pm on Sep 4, 2006 (gmt 0)

I agree. I still think Google is broken. My theory is that the combination of G's algos/filters (having been tweaked and adjusted countless times) and the complexities of G's new infrastructure has created a mix that the engineers can no longer manipulate with predictability.

When they change something, they HOPE it will do X, and if it does Y instead, they can sometimes roll it back somewhat, but never to the point where they started from. Eventually it becomes a guessing-game for them and we are just along for the ride.

Like I said, it's only a theory. Any takers?

Alex70
msg:3071332
3:38 pm on Sep 4, 2006 (gmt 0)

>>Eventually it becomes a guessing-game<<
IMHO they will win the battle against spammers. Duplicated content can be discovered easily now, and they are also working on cloaking, mass publishing of pages, unnatural link growth, and many other issues. On an optimistic view, all this will be fixed by the end of the summer.. I hope.

trinorthlighting
msg:3071387
4:41 pm on Sep 4, 2006 (gmt 0)

colin_h,

I never take your words as an attack. I do agree Google has a lot of money and always changes the rules. I always ask myself why they change the rules, and I find two answers:

1. To make their search engine better.
2. To fight spam

I do believe that Google does try to serve up good results, but an algo is only as perfect as the engineers who are programming it, and they have to constantly tweak it to fight spam.

We all know people are not perfect, so we can all agree that Google will never be perfect.

Simsi
msg:3071419
5:08 pm on Sep 4, 2006 (gmt 0)

Bewenched: we have 404's and 301's in place

I was under the impression that having a standard 404 page was a bad thing as Google sees it as many pages of duplicate content?

tedster
msg:3071489
6:08 pm on Sep 4, 2006 (gmt 0)

The issue is whether the server returns a 404 header. If it does, the page content can be a standard message, or a highly customized one.

But if the HTTP header does not show a 404 status, and instead shows a 200 or a 302, that's when duplicate content troubles can start -- because you are telling the bot that many different URLs all resolve to the same content. With a 404 header, you are telling the bot "that URL does not exist here."
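
A quick way to test this (a sketch under the assumption that the made-up path below does not exist on the site being checked; example.com is a placeholder) is to request a nonsense URL and look only at the status code the server returns, ignoring what the error page looks like:

    # Hypothetical example: check the status header of a URL that
    # should not exist. Only the status code matters, not the body.
    import urllib.error
    import urllib.request

    def status_of(url):
        try:
            return urllib.request.urlopen(url).getcode()
        except urllib.error.HTTPError as err:
            return err.code  # urllib raises for 4xx/5xx responses

    code = status_of("http://www.example.com/no-such-page-test")
    if code == 404:
        print("Good: missing URLs return a real 404 header.")
    else:
        # A 200 here (urllib silently follows any 302) means every bad
        # URL "resolves", so the bot sees many URLs with one error page.
        print("Warning: missing URL returned", code, "instead of 404.")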

crobb305
msg:3071558
7:11 pm on Sep 4, 2006 (gmt 0)

I don't watch individual datacenters much anymore, as the task has gotten too tedious and the results too inconsistent. But I do still watch for trends. In the past few days, I have seen that the dataset on 72.14.207.104, which represents the "newer infrastructure", has spread to about 6 other datacenters.

Reseller, would you agree? "End of summer" is just 16 days away, based on Matt's "Autumnal Equinox" definition of "end of summer".

hehe :)

steveb
msg:3071594
8:07 pm on Sep 4, 2006 (gmt 0)

End of summer, but probably not THIS summer...

2009 maybe.

They still have not gotten rid of those ancient supplementals on any datacenter. They only hide them sometimes, while other times they are displayed. "Hiding" is not "removing".

Right now the folks at the plex would be lucky to even find a calendar, let alone fix any of their innumerable screw ups in 16 days.

g1smd
msg:3071601
8:16 pm on Sep 4, 2006 (gmt 0)

All of the supplemental results for redirected and 404 pages that I expected to update and disappear have done so.

All of the things that I thought would be updated to be newer supplemental results have updated their data and remained supplemental. I now know they will disappear about next February.

Almost all of the things that I thought might start to show as supplemental have done so. These are results for pages recently edited, redirected, or deleted.

The process is as I described it at [webmasterworld.com...]. What part(s) of that are you seeing differently?

steveb
msg:3071660
8:50 pm on Sep 4, 2006 (gmt 0)

The part where supplementals have not disappeared at all.

Yes, they get hidden for extended periods now, but that means nothing. None (or hardly any) of the supplementals have gone anywhere. The large batch dating to June 2005 and the smaller batch going back to December 2004 are being displayed on every datacenter right now.

No progress has been made on supplementals this summer, and an enormous step backwards has been introduced in that Google now tries to hide them instead of displaying them.

g1smd
msg:3071663
8:53 pm on Sep 4, 2006 (gmt 0)

I am seeing that gfe-eh.google.com is far more advanced in the Supplemental cleanup than other DCs.

Supplemental Results have changed on all DCs in the last few weeks; but that one has more changes.

reseller
msg:3071664
8:56 pm on Sep 4, 2006 (gmt 0)

crobb305

>>Reseller, would you agree? "End of summer" is just 16 days away, based on Matt's "Autumnal Equinox" definition of "end of summer". <<

I recall GoogleGuy mentioning the end of the third quarter too, which means the end of September. I.e. we still have 26 very long days to wait :-)

jrs_66
msg:3071691
9:27 pm on Sep 4, 2006 (gmt 0)

I'm seeing MAJOR changes across the board for a large number of my keywords on 64.233.167.99, 64.233.167.104, 64.233.183.99, 64.233.183.104, 66.102.7.99, 64.233.183.107, 66.102.7.147.

Has anyone else noticed anything funny about these? I'm seeing MANY pages which never were in the top 100 suddenly popping to the top 5 (often #1). Unless there's some sort of testing going on here, it seems to me like my 11-month-old site could be popping out of the sandbox!

please...please...please!

Once again, anyone seeing something strange about these DC's?

Bewenched
msg:3071699
9:44 pm on Sep 4, 2006 (gmt 0)

I wish the supps would drop... A lot of our major pages that are flagged as supplemental have cache dates of August 2005! Sadly, these were the pages that got spidered under SSL mysteriously. At least we now have the 301 in place for them so it cannot happen again.

colin_h
msg:3071705
9:59 pm on Sep 4, 2006 (gmt 0)

So let's stop messing about ... Come the end of summer, the ones who have defended Google stop posting. It's about time things got better, if they don't ...

... Google defenders should beware - you have wasted our time and this will no longer be tolerated.

g1smd
msg:3071711
10:10 pm on Sep 4, 2006 (gmt 0)

>> A lot of our major pages that are flagged as supplemental have cache dates of August of 2005! <<

If those are URLs that are now redirecting, then those dates sound correct. They will stay as Supplemental Results for a while more before disappearing. In the meantime, you have to look at the main "200 OK" URL for that content and ensure that it gets fully indexed in the next few weeks.

If the Supplemental URLs are not already redirected or tagged as noindex, then you need to sort out whatever duplicate content issue caused the problem (www vs. non-www; multiple domains; differing dynamic-parameter URLs; http vs. https). That can be done by adding noindex tags and/or redirects as necessary.
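
As a sketch of that triage (again illustrative: the URL list and placeholder domain are assumptions, and the robots-meta regex is deliberately crude), the following sorts suspect URLs into "already handled" (redirecting or erroring) versus "noindexed" versus "still an indexable duplicate":

    # Hypothetical example: classify suspect URLs along g1smd's lines.
    # A 3xx/4xx response or a noindex robots meta means the URL should
    # fade out on its own; a plain 200 without noindex needs fixing.
    import re
    import urllib.error
    import urllib.request

    NOINDEX = re.compile(
        r'<meta[^>]+name=["\']robots["\'][^>]*noindex', re.IGNORECASE)

    class NoRedirect(urllib.request.HTTPRedirectHandler):
        def redirect_request(self, req, fp, code, msg, headers, newurl):
            return None  # surface 3xx responses instead of following them

    opener = urllib.request.build_opener(NoRedirect)

    for url in ["http://example.com/old-page.html",
                "https://www.example.com/duplicate.html"]:
        try:
            resp = opener.open(url)
            body = resp.read().decode("utf-8", errors="ignore")
            if NOINDEX.search(body):
                print(url, ": 200 + noindex -- will drop out on its own")
            else:
                print(url, ": 200 and indexable -- still a live duplicate, fix it")
        except urllib.error.HTTPError as err:
            print(url, ":", err.code, "-- already handled, just wait")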

reseller
msg:3071716
10:17 pm on Sep 4, 2006 (gmt 0)

colin_h

The 26 days left could be "consumed" in, for example, cleaning up and SEOing your site before the next update at the end of this month. Google is so generous to alert us in good time. Or you can do as Matt Cutts does.. relax, enjoy Labor Day, and "think about hamburger patty optimization (HPO) instead of SEO today" :-)
