
Google SEO News and Discussion Forum

My site has been first, now vanished from Google
My site has been the first of its kind, and I keep dropping off Google
sabine7777




msg:760006
 6:35 am on Sep 20, 2005 (gmt 0)

For the past year I have periodically been completely dropped from Google. My site has been the FIRST of its kind and has held the first spot in the natural search results. I'm just a small business, but since Sept. 2004 I have been vanishing from Google every 6 weeks or so; recently it has been happening more often and for longer periods. Does Google discriminate against older sites? Are they doing it so that we will advertise with them? Any help, advice, or comment would be welcome, from a desperate single mother of 4!

 

textex




msg:760366
 5:53 pm on Sep 26, 2005 (gmt 0)

"66.102.9.99 --- Barcelona & Madrid, Spain - Switzerland - Hannover, Germany - Dublin, Ireland - Prague, Czech Republic - Denmark
66.102.9.104 --- Barcelona & Madrid, Spain - Switzerland - Hannover, Germany - Dublin, Ireland - Denmark - Romania - Brussels, Belgium"

On both of these DCs, all, and I mean ALL, of my sites that have 301s set up got hit with www and non-www duplicates. These 301s have been up since February.
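(For anyone wondering what I mean by the 301s: a minimal .htaccess sketch of a non-www to www redirect, assuming Apache with mod_rewrite enabled and "example.com" standing in for the real domain:

# Hypothetical example only - adjust the domain before using
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

Worth double-checking the R=301 flag, too; a plain [R] defaults to a 302.)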

Hope these DCs are not the end result.

Anyone else experience this?

I am really at a loss.

cleanup




msg:760367
 6:01 pm on Sep 26, 2005 (gmt 0)

This DC seems to produce fewer results.

Lorel




msg:760368
 6:15 pm on Sep 26, 2005 (gmt 0)

Freedom:


When someone using a free web hosting site copies my 4-year-old domain, which doesn't expire for another 5 years, it should be obvious to Google engineers which is the original website and who is copying the content, without even a human looking at the page.

I'm not suggesting that free hosting sites/pages get dropped or punished.

Quite the contrary. It's quite easy to get scraper sites removed if they are hosted with free hosting companies. Gather up all your proof, copy the host's (usually very strict) copyright rules from its own site, and send all of this to the hosting company. In my experience they drop the scrapers instantly, without question. For more info on how to contact these hosts, search for Stop 302 Redirect Hijacking.

Janiss


My site, which was decimated in May by Bourbon and came back in July, has now been decimated again as of September 21. Someone I know who has researched these things extensively says it's because my pages are being framed by others and because of being hijacked; all this adds up to a glitch on Google's part that makes it refuse to acknowledge content sites like mine and leaves them low in the SERPs.

Just add a pop-out-of-frames script to all affected pages. However, check to make sure it works in all browsers.

If you believe your site has been hijacked do the search mentioned above to see actions you can take to stop it.

For all those who have seen their site's page count quadruple in the index and their rankings drop in the SERPs: are you using session IDs? If so you might want to read the following (and the config sketch after the guidelines below):


A Comment from googleguy:

So what's the problem with a session id, and why doesn't Googlebot crawl them? Well, we don't just have one machine for crawling. Instead, there are lots of bot machines fetching pages in parallel. For a really large site, it's easily possible to have many different machines at Google fetch a page from that site. The problem is that the web server would serve up a different session-id to each machine! That means that you'd get the exact same page multiple times--only the url would be different. It's things like that which keep some search engines from crawling dynamic pages, and especially pages with session-ids.

Google can do some smart stuff looking for duplicates, and sometimes inferring about the url parameters, but in general it's best to play it safe and avoid session-ids whenever you can.

Google's Webmaster Technical Guidelines:

*Use a text browser such as Lynx to examine your site, because most search engine spiders see your site much as Lynx would. If fancy features such as JavaScript, cookies, session IDs, frames, DHTML, or Flash keep you from seeing all of your site in a text browser, then search engine spiders may have trouble crawling your site.
*Allow search bots to crawl your sites without session IDs or arguments that track their path through the site. These techniques are useful for tracking individual user behavior, but the access pattern of bots is entirely different. Using these techniques may result in incomplete indexing of your site, as bots may not be able to eliminate URLs that look different but actually point to the same page.
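(If a PHP site is appending PHPSESSID to every URL, one way to keep session IDs out of the URLs the bots see is to switch sessions to cookies only. A minimal .htaccess sketch, assuming PHP is running as an Apache module; treat it as illustrative rather than a drop-in fix:

# Illustrative only - stops PHP from putting session IDs in URLs
php_flag session.use_trans_sid off
php_flag session.use_only_cookies on

Visitors with cookies disabled can lose their session this way, so test before rolling it out site-wide.)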


wiseapple




msg:760369
 6:48 pm on Sep 26, 2005 (gmt 0)

Could those who have had their sites drop please PM me your URL? I am trying to do a comparison to see if I can figure out what is up. We have been dead in the water since Feb. 2nd, and I cannot figure out what is causing us to have a penalty. I have looked at three other sites that are very similar to ours and are also experiencing penalties.

Thanks.

moftary




msg:760370
 7:10 pm on Sep 26, 2005 (gmt 0)

I'll change Fat Lady for Large Woman!

Joking?
There are already applications I have seen that automatically do this for the purpose of avoiding the dupe filters :)

Lorel




msg:760371
 7:15 pm on Sep 26, 2005 (gmt 0)

Taps:


- My provider moved and so my IP address changed.
But I cannot see a reason why my site is banned completely.

If your provider placed your site on a shared IP, this could be the problem. Possibly someone else on that IP was banned, which means all sites on that IP get banned.

Solution: Request a dedicated IP address. Most reputable hosts only charge $1.00 more per month. If your host doesn't provide this then find a new host.


In my case, I think I have found the source of my problem.

After Allegra I used robots.txt and the URL removal console to remove duplicate content. This was in March. After that I continuously had a robots.txt with

User-agent: *
Disallow: /dup1.php
...

If your .php scripts are using Session IDs see my message above re Google not being able to process them correctly.

textex




msg:760372
 7:44 pm on Sep 26, 2005 (gmt 0)

Has anyone with drops in the SERPs done a 'from site' search?

Do you see any new canonical URL issues?
We do.

We have had 301s in place since February. How are we getting hit with the canonical issue now?

I cannot understand how Google got so dumb!

My biggest concern is that, like sites in the past, we are dropping in rankings now but are going to disappear completely once the effects of the canonical URL issue fully kick in.

taps




msg:760373
 7:46 pm on Sep 26, 2005 (gmt 0)

Lorel

thank you for your help. Fortunately my provider is a good friend of mine and my ip is very dedicated ;-)

I was just wondering if moving to another ip could cause a problem.

Meanwhile I found out what caused my problem: it is in fact duplicate content which came up again - I stated that somewhere in this thread...

Concerning the session IDs: that was my mistake three years ago, when I kicked myself out of Google the first time.

It was someone like you who helped me out then. That's why I love being here: People help each other.

stargeek




msg:760374
 7:47 pm on Sep 26, 2005 (gmt 0)

"I can not understand how Google got so dumb! "

one of two possibilities:

a) the "dumber" they get, the more money they make via AdWords

b) they are implementing new features too fast in order to keep ahead of Yahoo and MSN, but aren't testing these features enough.

or a little bit of both?

textex




msg:760375
 7:51 pm on Sep 26, 2005 (gmt 0)

One more rant...

Does anybody remember engines like Excite, AltaVista (remember Black Monday?), or InfoSeek? Are we seeing another slow death?

Google does not care. Why?

I think Google is trying to monetize AdWords. Quality search is not a priority; the priority is developing all of their 'new' ideas.

The majority of the general public is so hooked on 'Googling it' that they don't even realize they are being served crap results. They scam the free listings and jump to AdWords quicker than you can say 'giddyup!'

For Mr. Cutts to claim no knowledge of 301 issues is ridiculous!

aff_dan




msg:760376
 8:38 pm on Sep 26, 2005 (gmt 0)

Hello,

I bought a domain, anysite.co.uk, on 05-Sep-2005.
Added manually made content to the site before 17-Sep-2005.
Got my first link on the Internet on 19-Sep-2005.

Indexed on 20-Sep-2005; all pages appeared in Google, ~60 pages indexed.
25-Sep-2005: last visit from Googlebot.
26-Sep-2005: disappeared from Google.

Can someone explain to me what is going on?
Is it about my IP? My .htaccess 301 redirect from non-www to www?

Regards,
Dan

WebFusion




msg:760377
 8:38 pm on Sep 26, 2005 (gmt 0)

I honestly think they are simply being outsmarted by the spammers. They are hopelessly outnumbered, and they keep applying tweak after tweak after filter in an attempt to regain the quality they lost 2-3 years ago.

It has become painfully obvious (IMHO) that the engines are years away from being able to use a purely technical approach to discern the good sites from the bad. Without human intervention, far too many quality sites will be killed in the effort to clear out the spam. What good is a technical approach (however advanced) if, in the end, the SERPs you are left with are only 50-60% as "good" as they could be because so many quality sites are left out in the cold?

Having said that, the Google gravy train has been nice, but it won't make or break my business. Making a few grand a day is all too easy without Google; you just have to find the traffic elsewhere.

modemmike




msg:760378
 8:47 pm on Sep 26, 2005 (gmt 0)

aff_dan, sounds like classic sandbox to me. Your pages won't be seen in Google for an extended period of time now.

aff_dan




msg:760379
 8:55 pm on Sep 26, 2005 (gmt 0)

modemmike,
How much time do I need to wait? Will no pages show up on Google at all? I'm just thinking, do I need to change all my info and my IP and ISP?

reseller




msg:760380
 9:13 pm on Sep 26, 2005 (gmt 0)

aff_dan

Here are a few interesting threads about the sandbox.

[webmasterworld.com...]

[webmasterworld.com...]

[webmasterworld.com...]

I hope this helps.

modemmike




msg:760381
 9:19 pm on Sep 26, 2005 (gmt 0)

how much time I need to wait?

Wish I knew that answer myself; even a guesstimate would be nice! Sandbox timing is widely debated, but from what I understand it's somewhere in the neighborhood of 6-18 months.

kamikaze Optimizer




msg:760382
 9:19 pm on Sep 26, 2005 (gmt 0)

aff_dan, sounds like classic sandbox to me. Your pages won't be seen in Google for an extended period of time now.

I might disagree with that. I mean, he could very well be in the sandbox, and I hope he is not. But he could also just be in the freshy index, not fully indexed yet.

Not much time has passed since he loaded his site.

I would wait a few more weeks before diagnosing him with a sandbox.

aff_dan




msg:760383
 9:22 pm on Sep 26, 2005 (gmt 0)

Hi Reseller,

One question: does the sandbox also mean that new, freshly indexed sites get removed from the search results and then appear again after a while?

Dan

jeglin




msg:760384
 9:35 pm on Sep 26, 2005 (gmt 0)

Or, you can enter your URL in

[copyscape.com...] and see if your content has been duped.

pescatore




msg:760385
 11:15 pm on Sep 26, 2005 (gmt 0)

Where is GoogleGuy to give us some feedback?

anttiv




msg:760386
 11:48 pm on Sep 26, 2005 (gmt 0)

On his yacht?

Seriously, give Google what it likes. I saw free web space and subdomain accounts at the top for my keywords, so I registered some new accounts myself and am already seeing traffic.

At the moment it is the only way to get Google traffic for my original website (apart from Adwords).

Lorel




msg:760387
 12:39 am on Sep 27, 2005 (gmt 0)


Aff Dan


I bought a domain, anysite.co.uk, on 05-Sep-2005.
Added manually made content to the site before 17-Sep-2005.
Got my first link on the Internet on 19-Sep-2005.

Indexed on 20-Sep-2005; all pages appeared in Google, ~60 pages indexed.
25-Sep-2005: last visit from Googlebot.
26-Sep-2005: disappeared from Google.

If your keywords are highly competitive, you ranked for your major keywords for the first few weeks and then disappeared, and there are no technical blockers deterring the search engine bots, then it is likely Google's sandbox, which may last from 6 months to a year. If that is the case, find alternate ways to advertise your site (forums, press releases, writing articles, taking out ads and other paid advertising) and submit it to directories in your market.

FattyB




msg:760388
 1:55 am on Sep 27, 2005 (gmt 0)

Folks, I have seen a change tonight.

We don't track keywords, as the site is broad in coverage. But when traffic dropped I checked the site name in Google, as in "Site Name", not sitename.com.

Over the weekend this showed 4 pages of results out of 1.6 million. The first few were our main page and a couple of subdomains, and the rest were sites that link to us.

Previously, before traffic plummeted, we had thousands of pages listed, but no listings from other sites; rather, Google showed all our internal links (we use lots of subdomains).

Now tonight I am seeing 1.6 million results with thousands of our pages listed. Our main site is in the first few, with some subdomains listed. But it is now listing all our external links rather than our internal ones. I figure this can only be good news.

Traffic appears to be recovering today....

James

Kimkia




msg:760389
 4:58 am on Sep 27, 2005 (gmt 0)

I'm seeing some changes also, but not all of them positive. I recovered some of my traffic, but not all.

One thing that I tried tonight was dumping a 1,000-plus-page affiliate product catalog that was fed by a remote server into my domain. I used the Google removal tool to take out that directory, and the results were quick: inurl: and site:mysite.com have dropped those 1,000 pages from Google's SERPs, although supplemental results remain. I'm hoping that removing this from my site will help recover any dilution of PageRank or undo other penalties that might have occurred.
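(For anyone trying the same thing: as I understand it, the removal console wants the directory blocked in robots.txt before it will take it out. A minimal sketch, with "/catalog/" as a made-up stand-in for the actual affiliate directory:

# Hypothetical path - use whatever directory you are removing
User-agent: *
Disallow: /catalog/

As far as I know the removal only lasts a while unless the block stays in place, so the Disallow line needs to stay put.)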

Also -- here's an odd one -- in trying out search terms for my site today, and then using &filter=0, I had some surprising results. On at least three searches, with the &filter=0, I occupied the top three or four positions on page one. These pointed to different pages on my site, all of them relevant for the search term used, those being home page, directory index page, and a particular content page or pages.

I can see why these pages would be picked for the SERPs; each one includes the search term in some fashion. I.e., the home page might have "visit our widget index for more homemade widget directions," the directory index itself has a title like "Homemade Widgets" and a short snippet description of the various pages available, and the content pages have titles like "Homemade Widgets: How to Make a Red Widget" with corresponding meta descriptions and page content. But of course it would be ridiculous for Google to return three or four pages from the same site in the top four positions, so they impose a filter and take out ALL the pages. Problem solved.

Anyway, someone suggested (pages back in this thread) that duplicate navigation or duplicate metas might be triggering a filter in this update. I would say so...at least in the example cited above. It also involves duplicate snippets or phrases the pages might have in common.

As sites grow, and have to sub-divide content into different directories, this tendency to have multiple pages with the same snippet or description on it is almost impossible to avoid. It's logical and what readers expect to see -- but apparently the bots can't tell the difference between useful navigation descriptions and duplicate content/spam.

The filter that Google is applying is obviously meant to reduce duplication in the SERPs; the problem is, it is working too well and removing every mention of top-ranking, worthwhile sites. These are sites that have worked hard to have a navigation system easily traversed by readers and robots. Their efforts are tripping Google up, and I think this is why we are seeing respectable, top-ranking sites being totally devastated in this update.

Someone at Google needs to turn down the volume on the trigger button.

I'm also trying to deal with 301 and 302 redirects, so that's also part of it for me.

Sorry this is so long, but I felt this experience was relevant to solving at least part of the mystery we are all trying so hard to understand.

FattyB




msg:760390
 5:15 am on Sep 27, 2005 (gmt 0)

Kimkia,

Well, we certainly have a lot of snippets. Since we are a news site we tend to have a summary, which is the first few lines of the story. There's no way round this, as we publish 400 articles a day and, as you say, you have to display something beyond a headline for punters. We also have the likes of news topic pages, where news from across the site is gathered... so, again, the same snippets. But it is not something I will be changing.

Anyway, it will be an interesting week. If they are now listing all our external links I will be very happy, as we have thousands, many from top-200 sites, partly due to Moreover and Google News, as well as when we get a scoop.

Traffic-wise, as I said, it looks like things picked up in the last part of tonight, and I noticed a ranking boost for our stories on Google News; this must be tied to Google search. But traffic is still below normal... I guess tomorrow and the rest of the week we will see.

I also think there is more to come as you allude... fingers crossed all goes well.

I also think it is odd that this only seems to have hit certain sites. Hopefully GoogleGuy or someone will weigh in with an explanation of what went on / is going on.

arubicus




msg:760391
 6:28 am on Sep 27, 2005 (gmt 0)

"As sites grow, and have to sub-divide content into different directories, this tendency to have multiple pages with the same snippet or description on it is almost impossible to avoid. It's logical and what readers expect to see -- but apparently the bots can't tell the difference between useful navigation descriptions and duplicate content/spam."

I hear ya there. Our directory system uses the title of the article as the link, plus a snippet (description) to give readers an idea of what each article is about. These happen to be the meta title and meta description of the actual articles. To make matters worse, at the bottom of most articles we show the articles most related to that one, each of which uses a title and snippet that are also found in our directory. For visitors this structure is quite logical and easy to understand, and it keeps them within the topics they are most interested in.

If there is some kind of penalty for doing this then I have the option of:

1) Removing the Title and snippets from the related articles as well as directory listings of the articles. This would leave virtually nothing in the directory and take usability away from our visitors by not having the related articles. <- OK stupid idea.

2) Rewriting thousands of descriptions and titles so that the link text does not match the meta title/description or the article title, or both. To keep the related articles we would have to add a third description/title for the snippets on related articles, so they don't match any meta titles/descriptions or any links or snippets in the directory.

I guess #2 would be a good test. But why should any webmaster have to do all of that? (Thank goodness I don't also add some articles to another part of our site, like our shopping directory, or I would have to create a fourth title (link text)/description for each.)

szurma




msg:760392
 6:32 am on Sep 27, 2005 (gmt 0)

Kimkia,

Perhaps the reason is the same. I assume my site got a penalty as well, because I added too much content in too short a time...

I also plan to use the Google removal tool.

Please keep us informed about your recovery.
Thanks,

reseller




msg:760393
 6:44 am on Sep 27, 2005 (gmt 0)

Good morning Folks!

Starting the day by looking at no fewer than 18 supplemental results added for my site when running the site: command.

Those files are something I removed with the removal tool some time ago. So here I am, removing the same files again with the same removal tool, instead of enjoying my morning cup of Java ;-)

This time I'm gonna keep the empty files (with meta NOINDEX, FOLLOW) on the server so I don't need to create them again and again and again...

I love ya, Gooooooooooooogle... but please keep those old supplemental results away, maybe forever if possible ;-)

taps




msg:760394
 7:00 am on Sep 27, 2005 (gmt 0)

arubicus

We have exactly the same pattern: Headline as link, snippet, snippet in meta description and snippets from related articles.

I think this is totally normal, and it is user-friendly. So I hope Google will not consider it dupe content.

However, we are being punished. But I think that's related to another, more obvious problem that was easy to solve via the URL console.

arubicus




msg:760395
 7:23 am on Sep 27, 2005 (gmt 0)

"I think, this is totally normal and it is user friendly. So I hope Google will not consider this as dupe content."

I wouldn't think Google would punish for that, but ya never know anymore.

thecityofgold2005




msg:760396
 7:28 am on Sep 27, 2005 (gmt 0)

In my sector it seems the best way to move up the Google SERPs is to have several sites, all slightly different, and then interlink them vigorously.

Sites with associate networks like this have done very well ever since Bourbon. In my sector the current #1 was nowhere two years ago and then launched 5 near-identical sister sites all at the same time. The result is that they have been firmly rooted at number 1 for almost two years. Plus they get the new sub-topics listed underneath their listings.

Confusing and spammy link networks are a sure sign of authority for the dumb Googlebot.
