Forum Moderators: Robert Charlton & goodroi

Message Too Old, No Replies

Push the reset button and start over?

Remove content and ask for reinclusion...

         

photopassjapan

2:53 pm on Sep 2, 2006 (gmt 0)

10+ Year Member



Hi there.

I'm thinking about a scenario in which I'll ask for my site to be excluded from Google in its entirety and then resubmit. Quite a naive idea I know.

The problem is that the site was indexed through a link we didn't know about or have control over... and at a stage when the site wasn't actually there yet. It was indexed, and got a PR of 0... the full story is on this forum. [webmasterworld.com...]

We also changed the site's profile, because after some months it seemed the wording we chose put us in competition in an industry that is already well over-serviced, and one we had no real intention of entering either. Broadening the theme seemed like a good idea, but we were wrong.

Now that the site has been indexed with its who-knows-how-many-thousand static pages... the old profile and wording is stuck in the cache. After all, these pages are meant to serve as reference, and ideally we wouldn't have updated them just to remove some irrelevant phrases.

After some unsuccessful attempts to contact Google about the utterly bad first impression our site made in their index ( being seen before it was even there ), we thought of the following.

What would happen if we asked Google to remove the cache, or for a matter of fact, the entire website from its index?

Would the PR be reset, the cache deleted, or would the site just be made inaccessible from the SERPs?

Also, if we did such a thing, and perhaps the records would just vanish... would it be re-indexed at all just because of many links pointing to it on the net? Actually we're in a stage where having just a couple of pages indexed properly and their PR calculated again would make more sense than having thousands of pages indexed as they are right now.

So after being removed, we wonder whether we'd be indexed again at all... whether by our own request at the site re-inclusion page, because of the uploaded sitemap, or because of the inbound links?

Or would we practically blacklist ourselves for half a year?

Has anyone been through such a... mess?
Ideas and experiences of either process are most welcome.

Links to the services... not that you didn't know them:
[services.google.com:8882...]
[google.com...]

And thanks in advance

Quadrille

4:17 pm on Sep 2, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



You cannot control Google, and it is usually unwise to try.

If you follow your plan, and your site is eventually relisted (assuming it is), then you'd probably be treated as a new site.

Whatever your GPR is now, it would revert to a grey bar, and your rankings would all start from scratch.

I've read the other thread, but don't understand why you connect your actions to your ranking - your ranking is a fluid thing, updated constantly. It's only TOOLBAR gpr that is unchanging for three months at a time.

If your site is quality, then it has doubtless progressed - you are proposing to throw away that progression and start again.

If your site has a current problem that prevents it progressing, leaving Google for six months will not fix that problem.

What percentage of your visitors actually look at the Google cache, do you reckon?

Apologies if I'm missing something, but from all you say, I'd suggest:

1. Concentrate on the future - keep developing your site

or

2. Abandon the lot and REALLY start afresh, with a new domain.

Either is more likely to work than trying to tell Google how to run their ship!

photopassjapan

5:52 pm on Sep 2, 2006 (gmt 0)

10+ Year Member



The PageRank is 0.
It has been ever since the site was indexed with an admin panel in its place, because a certain site caught the whois info from our domain registration and put up a link to it before launch.

I didn't think so either, until I cross-examined all the logs and compared them to what I saw in Google the day we went live.

I'd love to see the PageRank grey right now. I'm not sure it will move from 0 if we just keep waiting. It has been three months since the site was indexed with its actual content, and four since it was first indexed with the wrong page. And we're trying to rely not only on the toolbar for diagnosing what's wrong, but on other kinds of information as well... with which we keep our hopes up of seeing it improve one day, and calm down the "false alarm" on another. I'm tired of reading nothing but prayers, theories and magic tricks from SEOs about this, and these options are offered on the Google website itself. Our reason for using them would be justified; we can't start up any other way than by starting again. This time without being publicised by a company, in a context that literally decided everything for us before launch.

I'm not sure... yeah, it might change. But you know... it might stay this way as well. Or is there an official "penalties, false penalties, their reasons, and afterlife" document from Google?

Seeing the PageRank grey would mean that it WILL get calculated based on the actual content for once. Someday. I wouldn't mind having to wait two months if it actually happens after all.

My concern isn't that it would be reverted... but rather that it wouldn't be. Even after doing this all.

Our point on the cache: the cache is what Google uses to determine the search strings relevant to our site. Because of supplemental results, even pages that are not even THERE right now can hurt us by having certain phrases displayed on the SERPs. And since the thousands of static innermost pages were ( supposed to be, and really were ) indexed only once ( when the first version went up, without us noticing that we were already IN the index ), getting them to disappear and waiting for reindexing as a new site seems a better idea than having them create the impression that we're something else.

This is a commercial site vs. non-commercial site problem, which really isn't to be underestimated. The site isn't commercial. It seems it looked like one, and now it doesn't; we edited out the words that made people think this was some kind of major company. Also, if those pages weren't there, they wouldn't "soften up" the profile of the site by associating it with an industry that has unbeatable competition.

The only other option is of course the new domain.
We knew this from the start.

That would not only make the amateur but hard work of branding this domain completely obsolete, but also... what to do with the sites that are linking to us? Redirect this domain to the new one? No way. I don't want to preserve this wonderful PR0. And I'm not talking about just random link-exchange partners, but places we like, that are relevant, and that have people who liked our site on their end...

I'd rather try anything else before changing the domain. We still could do so once everything else fails. But even before that we'd need to get this site out of the index, because having it in there on two domains would mean duplicate content. ( Perhaps not, but I don't want to find out... )

I'm not sure this is unethical at all. Asking to be removed and indexed again based on the REAL site, instead of that admin page we had behind the walls? I know, that page is nowhere to be seen right now. But its EFFECT still is. It was indexed by Google through a questionably legal and questionably ethical site that put up a link to us to tell domain brokers about our existence. The problems here are technical, I think. Likewise... being reindexed would of course mean dropping out for some time, and only gradually being allowed back. That timeframe is frightening, but...

But dropping out from where...?
We don't come up for anything at all.
Our traffic right now is more or less random.
Which is funny because there is traffic, and people like the site.

It's like a zombie state where you're not sure why, but with no penalty on the site and several backlinks, it is still underground in Google with PR 0. I've never had a site with PR 0 at launch... I've seen quite a few sites launched, and all of them were PR 3 to 4 at the start, perhaps decreasing later if not updated, but that good-faith kind of PR was always there. And these had the exact same parameters, except for that glitch I discussed in the other thread.

Please does anyone KNOW whether asking for removal is an option in this case at all?

Will the site be entirely removed, along with its PR and its cache, or just not shown on the SERPs? Will starting it up again bring back the PR0, or will it revert to pre-calculated nothingness? Would the site be able to start up again, or would it be cleared from Google for a fixed 180 days?

If there is a precedent or case study of someone with this much reason to do this being blacklisted, we'd be curious to hear it, but I don't think this is unethical, or that it would be treated as such by Google.

Nor am I sure that I want to do this, hence this post.

g1smd

11:54 pm on Sep 2, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



The removal is not a real removal. All the data remains in Google's database, and everything reappears in the SERPs 180 days later as if nothing had happened in the previous six months. All the data appears exactly as it was at the time it was hidden away.

Supplemental Results for URLs that are already redirected or where the content has "gone" from the live site remain in the SERPs for a year. You have no control over that. Starting again is what spammers do once their network of sites has crashed.

Google's latency, and the stickiness of its data, make it far harder to restart at a new URL, or simply to restart at all.

photopassjapan

12:00 am on Sep 3, 2006 (gmt 0)

10+ Year Member



And on a side note... the cache-related paragraph is just something that popped into my mind. Deleting the cache through the urgent removal tool, just for the sake of relevancy, WOULD of course be unethical, if that's what you meant. I'd never intend to do that, not that I think I could get away with it.

It's the other way around... I'm just not worried about the indexed pages' cache. Although I just recently saw that those two are not connected, but still :-)

photopassjapan

12:05 am on Sep 3, 2006 (gmt 0)

10+ Year Member



g1smd: Thanks! That's exactly what I was curious about. I didn't know this was spammer protection too.

Moving... or in other words redirecting, is out of the question; that much I concluded myself as well. How about just a brand new domain for the very same site, on the same server? Starting from scratch as far as its indexing goes. I'm not worried about its content; I'd like it to be seen as it is... and indexed for what it is. And not for what the server-side placeholder was before going live. That's all.

goubarev

12:55 am on Sep 3, 2006 (gmt 0)

10+ Year Member



I have a "gray-hat" method for you ;c)

OK, here is what you do to take advantage of your situation... You've said there are thousands of pages listed in Google for the site you don't want. You re-make those thousands of pages and put redirects on them to the pages that you want. This way the pages that are there will be re-cached by Google, and it will also help Google spider your "real" pages. Once it realizes all those pages are only redirects, it will drop them faster than it would take you to exclude/re-include the site (which I don't think is possible at all)...

I would also recommend making a sitemap, so Google will get to your unwanted pages faster and re-cache them...
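Building a sitemap can be done mechanically from a list of page URLs. A minimal sketch of generating a sitemaps.org-format file; the example.com URLs are hypothetical placeholders, not the poster's real site:

```python
# Minimal sketch: generate a sitemaps.org-format sitemap.xml from page URLs.
from xml.sax.saxutils import escape

def build_sitemap(urls):
    """Return a minimal sitemap XML document listing the given URLs."""
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for url in urls:
        # Escape &, <, > so the URL is valid inside the XML element.
        lines.append('  <url><loc>%s</loc></url>' % escape(url))
    lines.append('</urlset>')
    return "\n".join(lines)

print(build_sitemap(["http://www.example.com/",
                     "http://www.example.com/gallery/"]))
```

A real sitemap can also carry optional `lastmod` and `changefreq` elements per URL; this sketch keeps only the required `loc`.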

Also about the PR0 - I wouldn't worry much, since it doesn't really mean anything...

About the whois: Google, Yahoo, MSN and Ask all use whois data to spider newly registered domains. It shows up as a "survey bot" in the logs, so you didn't even need a single link to get spidered... It's not really the fault of that website you were talking about; you would have gotten spidered even without a link...

g1smd

1:44 am on Sep 3, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



A URL can either issue a redirect (like "301 redirect") to some other URL OR it can serve content (with a "200 OK" status).

It cannot do both at the same time.
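g1smd's point can be sketched in code: for any single request, a URL answers with exactly one status line, so a handler either returns a redirect (301 with a Location header and no page body) or content (200 OK with a body), never both. The handler and paths below are hypothetical illustrations, not anyone's real configuration:

```python
# Sketch: one request, one status -- a URL either redirects or serves content.
def handle_request(path):
    """Return (status, headers, body) for a hypothetical request path."""
    if path == "/index.html":
        # Redirect response: Location header, no content body.
        return ("301 Moved Permanently", {"Location": "/"}, "")
    # Content response: a body, no Location header.
    return ("200 OK", {"Content-Type": "text/html"}, "<html>...</html>")

status, headers, body = handle_request("/index.html")
# The 301 carries only a Location header; the 200 carries only content.
```

This is why goubarev's scheme of pages that both redirect and get re-cached with new content cannot work as described.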

glengara

10:10 am on Sep 3, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



This is an MFA, right?

photopassjapan

11:59 pm on Sep 3, 2006 (gmt 0)

10+ Year Member



Type my username into Google and you'll see how huge and sinister an evil the project harbours. There are still websites on the net that are actual online presentations of something real. You know... content. Images, text, information-like things you send over http to people who're interested :-)

This isn't the only website I'm managing, but it sure is the most troublesome. Being the simplest in structure, the purest in intention ( <:-) and the closest to me personally... makes it even harder to think clearly. But I learn a lot here and there that will come in handy for this and other sites as well.
If nothing else... I've now concluded that if you get caught in the net of domains to be punished with a site that's questionable, you stay low and quiet; whereas when the same happens to you ( perhaps for causes beyond what you can imagine ) you get downright offended ( like in this case ).

Reasons to be punished may be random all the same for both, and the outcome will probably be as well.

But i thank you all for the help so far!

goubarev

12:35 am on Sep 4, 2006 (gmt 0)

10+ Year Member



Dude, what, are you from Japan or something... :c)

If nothing else... i now concluded that if you get caught in the net of domains to be punished with a site that's questionable you stay low and quiet, whilst the same happening ( perhaps by causes other than you can imagine ) you get downright offended ( like in this case ).

This "statement" makes no sense... You are asking for help/advice from the webmasters; we have no time for your blah blah blah. Ask the question, get an answer... next...

photopassjapan

12:01 pm on Sep 4, 2006 (gmt 0)

10+ Year Member



Understood.

Last post minus the blah:

What happens if I register another domain, shut down this one, and put the site up again to start over, while the original site is still in Google's index, cached and indexed?

Assuming that there are no OTHER problems with it...

Will the new one be busted because of dupe content?

twebdonny

2:15 pm on Sep 4, 2006 (gmt 0)



Completely removed our site using URL removal; the site was gone for 180 days and returned with exactly the same PR and exactly the same penalties and/or filters in place.
The site still shows index.htm on page 3 of the SERPs when searching for our unique company name. Never got a forthright answer from G as to why this is occurring, and probably never will. Pulled our AdWords account and will spend our budget elsewhere, where there is some type of reasonable response. G doesn't have time for us, and most likely will not have time for you, as you don't have the big budget monies they are seeking.

IMHO, waste of time

Quadrille

3:22 pm on Sep 4, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Assuming that there are no OTHER problems with it...

But there are other problems; you haven't told us what they are, but there's clearly something that's causing you all this anxiety about your cache.

1. Sort out the site (which I suspect you have done)

2. Get a Google site map to assist in deep thorough spidering

3. Keep developing the site in a methodical and compliant way. Think about it: new (extra) pages will get their own cache, and Google works on pages, not on sites

Google will do the rest, given time and you not checking the cache every five minutes ;)

To cut to the chase; removal from Google cannot be an answer to getting a better listing with Google.

Don't worry about the cache; that's Google's business. Worry about the serps, referrals, visitors and what visitors do when they arrive. That's the future. The past is done.

dataguy

3:34 pm on Sep 4, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Buddy, I've read both of your threads and took a look at your site. I don't know if there is some dark secret you are alluding to and you're expecting us to read between the lines to figure it out, or if you're just so close to this subject matter that you're not thinking straight.

You've got a great site. It appears clean and has solid content, and lots of it. I'd say it's a cleaner site than the majority of the sites represented in these forums. (It doesn't even display any 3rd party ads, so it's definitely not MFA.) When and how Google found your site is irrelevant. Google knows about it and appears to like it. I see nothing to indicate otherwise.

The only thing you've written that would indicate something is wrong is that the home page has toolbar PR0, but then you wrote that you've had 10 solid backlinks for two and a half months. Toolbar PR hasn't been updated in the past two and a half months, so of course it's PR0. Google's cache has been hit-and-miss for months, but currently I see no problems with what Google has cached for your site.

You've got nearly a thousand pages indexed by Google, and with recent cache dates it appears that Google is visiting your site regularly, which would indicate that your links are at least adequate. Shoot, your site would be of value to some of my users, I'd link to you just because it's a useful, well-done site.

I've searched on a few of your main keyphrases, and your site is often listed on page 2 or 3 on G, which is what I'd expect with under a dozen backlinks.

So what's the problem? Chill a bit. Find some more related sites that would link to your site (which shouldn't be hard) and then add some more content. I'd give it a year and people will be coming to you for advice.

glengara

4:28 pm on Sep 4, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Here's what G tells us about the site

Results 1 - 7 of about 896 for "www.com".
related:www.com/ - did not match any documents.
Your search - link:NCZ9ZtdJILsJ:www..com/ - did not match any documents
Results 31 - 32 of about 3,890 from www.com

You've got 4290 pages indexed, with supplementals starting at 866, which is pretty good for what looks like very poor link juice...

photopassjapan

6:27 pm on Sep 4, 2006 (gmt 0)

10+ Year Member



Thank you all for the replies, yeah i need to cool down a bit.

It's not exactly the cache; I'm sorry I brought it up, a completely irrelevant story to tell. The only thing I would like to HIDE about this site is the now-scrapped idea of the phrase "stock photo", which we shouldn't have used at all... bad karma. But I never intended to do anything about it other than wait for it to clear up.

I've sorted out the site, added new pages and features, and sent the sitemap; I don't think I missed any opportunity to get things straight. It's not its current state but the tendencies that I was worried about. The backlinks are there but not one is shown, indexing hasn't moved the PR an inch for months, and so on... but if this is NORMAL, I may just need to shut the... I mean, stay quiet.

It's probably the sandbox, or something like that. The only thing that made me worry is that this was the first site I've ever had that showed PR 0 instead of NO PR at startup. And the meaning of PR 0 is nearly as mysterious as whether PR 10 is subject to editorial review or not.

As for keywords... I have server-side stats from three programs, plus Google Analytics. Both sides show me that while it's true we rank in the top 3 for some phrases, those are the words you can rank high for with PR0, and they attract the thus-predictable ( quite low ) number of visitors.

Quite honestly, now that this BLUR is gone from my eyes, thanks to this forum ( which is the most helpful so far... INCLUDING the seemingly heavily pre-moderated Google Webmaster Help forum )... the best advice seems to be...

Wait for some more.

Then, if nothing happens for another half a year ( the PR0 stays )... register a new domain, with all its bitterness, and delete the old one.

Or have i missed something?

g1smd

7:47 pm on Sep 4, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



>> Site still shows index.htm when searching on page 3 of SERPS <<

Get the 301 redirect from /index.html to / in place, and make sure that every page of your site links to http://www.domain.com/ in exactly that format. The index entry will turn into a Supplemental Result and continue to be listed for another year. Ignore it, it will not be harming things.

.

>> Results 1 - 7 of about 896 for "www....com" <<

Most likely a Duplicate Content problem; and maybe one that you can sort out yourself.

Do you have any pages at www.domain.com/some.page.html that can be accessed as non-www on the same domain, or via another domain, or via another URL that has different parameters or a different path? If so, then fix it now.

Alternatively maybe your title tags and meta descriptions are too similar across the site. Matt Cutts [threadwatch.org] has already addressed this several times recently.

leadegroot

1:15 am on Sep 5, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I'm not sure exactly what your issue is (you're very good at talking around what you want to say, btw :( ) but the following anecdote might be useful to you.
We had a site that was well indexed. It wasn't very important, we thought, and we had a hard disk crash and lost it. No backup. Bugger. Didn't matter we thought, as it wasn't important.
But... there were a couple of pages I had been using as a reference. I had lost that material! Double bummer (improve backup routines...) and then I thought - Google Cache! And sure enough they were there. Hurray!
Being a lazy git I kept referring to the cache rather than taking a copy.
Oops.
Suddenly the cache page was missing.
Compare that to other cached pages that I hadn't been looking at 3 times a day - they are still there...

Conclusion: if you keep hitting the cached copy of a page you have removed from your site then Google will be woken and go and get an update. If they find the page missing they may then remove it from their cache.
Caveat: this was a few years ago, before the big supplemental issues arose.
HIH!

photopassjapan

11:23 am on Sep 6, 2006 (gmt 0)

10+ Year Member



Do you have any pages at www.domain.com/some.page.html that can be accessed as non-www on the same domain, or via another domain, or via another URL that has different parameters or a different path? If so, then fix it now.

Alternatively maybe your title tags and meta descriptions are too similar across the site. Matt Cutts has already addressed this several times recently.

The entire site can be accessed both with and w/o the www.
Is that bad? I thought... or would have hoped, that the "preferred domain" setting in Sitemaps solves this problem, even though they are keen to warn me that it only affects how the site is displayed on the SERPs and nothing else. (?)

And the meta tags... they are grouped.
No two pages have the same title... but there are some that share the same description and keywords ( not that keywords count ), simply because the style and informational value of the content IS the same for them; only the actual content is different.

Should I remove the description tag and see what Google inserts as snippets, or go through the ENTIRE site and not leave two pages with the same description?

For ten thousand pages that's nonsense though... having to do that.
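Rather than rewriting all ten thousand descriptions blindly, a small script can first find which pages actually share one, so only those need attention. A sketch, with hypothetical page data standing in for a real crawl of the site:

```python
# Sketch: group pages by meta description and report only the duplicates.
from collections import defaultdict

def find_duplicate_descriptions(pages):
    """pages is a list of (url, description) pairs. Return a dict mapping
    each description shared by more than one page to the URLs using it."""
    by_description = defaultdict(list)
    for url, description in pages:
        by_description[description].append(url)
    return {d: urls for d, urls in by_description.items() if len(urls) > 1}

# Hypothetical crawl data for illustration.
pages = [
    ("/gallery/1.html", "Photos of temples."),
    ("/gallery/2.html", "Photos of temples."),
    ("/about.html", "About this site."),
]
dupes = find_duplicate_descriptions(pages)
# -> {"Photos of temples.": ["/gallery/1.html", "/gallery/2.html"]}
```

Only the pages listed in the result need a rewritten description; unique ones can stay as they are.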

glengara

11:55 am on Sep 6, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>> Results 1 - 7 of about 896 for "www....com" <<

*Most likely a Duplicate Content problem; and maybe one that you can sort out yourself. *

They're the "contains the term" results, which makes me suspect very poor link juice is the root problem...

Alex70

12:01 pm on Sep 6, 2006 (gmt 0)

10+ Year Member



photopassjapan,

check the photo gallery pages: does each page have different content, a different title, and a different description?

webdude

1:32 pm on Sep 6, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



photopassjapan,

There are a few things that will help you immensely...

Make sure each page has a different title.
Make sure each page has a different description.
Try to use one heading tag per page (on topic).
Pay attention to posts from g1smd on linking. Pick either [foo.com...] or [foo.com...] and stick with it for ALL links. Also set up a redirect to whichever one you choose. If linking to a directory, always include the trailing /.
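The "pick one format and stick with it" advice amounts to canonicalizing every internal link to one form: one hostname, no explicit index filename, trailing slash on directories. A sketch of such a normalizer; the canonical host and the index filenames are assumptions for illustration, not the poster's real setup:

```python
# Sketch: normalize internal links to one canonical URL form.
from urllib.parse import urlsplit, urlunsplit

CANONICAL_HOST = "www.example.com"          # assumed preferred hostname
INDEX_NAMES = ("index.html", "index.htm")   # assumed directory index files

def canonicalize(url):
    scheme, host, path, query, fragment = urlsplit(url)
    # Fold the non-www variant into the canonical www hostname.
    if host in ("example.com", CANONICAL_HOST):
        host = CANONICAL_HOST
    # Drop an explicit index filename so /dir/index.html collapses to /dir/.
    for name in INDEX_NAMES:
        if path.endswith("/" + name):
            path = path[: -len(name)]
    if not path:
        path = "/"
    return urlunsplit((scheme, host, path, query, fragment))

print(canonicalize("http://example.com/gallery/index.html"))
# -> http://www.example.com/gallery/
```

Running every internal link through one function like this keeps the whole site consistent; the matching server-side 301 redirect then enforces the same form for visitors and spiders arriving at the non-canonical variants.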

g1smd

5:22 pm on Sep 6, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



photopassjapan:

Get the redirect from non-www to www installed.

Duplicate meta description tags are also a big problem. I have mentioned this about 50 times in the last month. Matt Cutts [threadwatch.org] has mentioned it too.

Do not remove the meta description. You will be even worse off without one.

See also: [webmasterworld.com...]

photopassjapan

9:15 pm on Sep 6, 2006 (gmt 0)

10+ Year Member



Okay. Will look into these.
Titles are unique to all pages by the way.
But since then I did the "omitted results" test for site:blah.com

And yes, every page that had a unique meta description was listed... all the others were omitted. Thanks for the tip.

It makes me wonder, though; both the bots and people look at the TITLE first, not the snippets.

Most of the time they look at the title ONLY, as in: the title describes the content, and the description fine-tunes its origin... at least from the user's point of view. The HTML and search engine methodology is the opposite, with the title as the origin and the snippet as the actual content preview. I guess. Not that it's okay this way, but I only have control over my own sites.

- Quick question.
To what extent must the title differ from the meta description tag so as not to be considered spam? Yeah, I know that no one knows the threshold, but an approximation based on experience?

Edited: I mean, could the title be made from part of the description, or the description include ( parts of ) the title, and things like that?

goubarev

11:34 pm on Sep 6, 2006 (gmt 0)

10+ Year Member



Not saying it's right or anything... I repeat 3 keywords in the description that are in the title - in a different sequence, though.

One of my targeted pages:
Title: 7-10 words - 3 important keywords/phrases (placed at the start)
Meta Description: 10-30 words - the same 3 important keywords/phrases as the title (placed at the start)
Meta Keywords: 10-15 words - the same 3 important keywords/phrases as the title (placed at the start)
On the page: I repeat the 3 important keywords/phrases at least 3 times - the first time within the first 100 words - once within a header tag... - once as a link to an authority site...