I am very new; I started my site on 26 May, and it's about <widgets>.
I am not good at English, so writing content is like killing me. So I copied a few articles from one of the top <widgets> websites and pasted them in as my content (of course we provide our own services, too).
So far, the site has jumped to PR3 since the July Google update and ranks around 1,000,000 on Alexa.
It's CSS and HTML, no tables, and all the tags are carefully designed (well, I admit I use some hidden text by applying CSS display:none in order to get more keyword density; however, the hidden text is less than 30% of each page).
Now the site is ranking near the top in MSN and Yahoo for almost all my keywords (it's country-targeted); however, in Google, no news. I can only find my site for a few very minor keywords (no competition at all).
As for backlinks, Google shows only 8, Yahoo shows 85, and MSN shows 139... and more than 50% of the pages (40+ in total) are indexed by all three.
I really don't know if my site is still in the sandbox (since I can see my site for those non-competitive keywords, I have my doubts), or whether I was penalized by Google because I use display:none or duplicated content.
[edited by: ciml at 10:56 am (utc) on July 25, 2005]
[edit reason] Widgetised. [/edit]
Do something honest with your site, write your own content, get rid of the hidden text, and you may have a chance... once you get out of the sandbox.
If you are trying to make a site that is guaranteed to fail in Google, you're doing all the right things. Duplicate content pages will basically be ignored: the pages you copied the material from will already be considered the source, so yours will probably never rank. Even if you get away with hidden text for now, with the amount you're putting on the pages you won't get away with it for long. And the chances are good that some time in the future your site will also drop out of Yahoo and MSN.
If you're going to use hidden text, use a more sophisticated method than display:none; that's the easiest one to spot. That said, I don't think Google looks too closely at CSS library files currently, though that could change at any time.
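To make the replies above concrete, here is a minimal sketch of the inline display:none pattern being discussed (the class name is made up for illustration; this is the pattern that gets sites penalized, not a recommendation):

```html
<!-- Illustration only: the inline hidden-text block described above. -->
<style>
  /* "kw-stuff" is a hypothetical class name */
  .kw-stuff { display: none; }
</style>
<div class="kw-stuff">
  cheap widgets best widgets buy widgets online widget store
</div>
<!-- Visitors never see this text, but it sits right in the HTML source
     that a crawler downloads, which is why inline CSS hiding is trivial
     to detect. Moving the rule into an external .css file only hides
     the rule, not the hidden text itself. -->
```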
We had an authority site (yes, added to the root DMOZ category of our language, like World: My Language: My Category, #1 for our keyword; the equivalent site in English is Alexa top 2000, and some other good things).
Our main domain ranks well in Google, of course, and has tons of backlinks, without ever having had any kind of SEO done to it. But we had another domain, a sub-section of our site, with few backlinks but some content, that got penalized in Bourbon.
Would it be a good idea to move the content to our main domain or a subdomain, using 301 redirects for the pages? My fear is that our main domain would get penalized too (a "301 moving penalization").
Sorry for my English, thanks.
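On the mechanics of the question above: a per-page 301 from the penalized domain to the main one is usually done with mod_rewrite in the old domain's .htaccess. A minimal sketch, with placeholder domain names (this shows how the redirect is set up, not whether Google will carry a penalty across it):

```apache
# .htaccess on the penalized (old) domain -- names are placeholders.
RewriteEngine On
# Send every request to the same path on the main domain
# with a permanent (301) redirect.
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```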
I think I'm seeing some changes in my niche.
I honestly hope this time Google gives a full knock-out to all those scrapers.
Like we say in Costa Rica, pura vida Google!
It's good to know the Plex has been working hard on their infrastructure to improve the quality of results.
Just let us know when the new equipment is in place and working. We'll tell you if the new infrastructure is good enough ...
I added the hidden text not long ago, just because I wanted to test whether I am still in the sandbox or whether my on-page factors are not enough...
But how long do I have to wait to get out of the sandbox?
Another thing: I don't think hidden text will get you banned, because I have seen a site with hidden text in the SERPs for years, and they also run Google AdSense. But I would NEVER use those tricks, so if you are serious about your site, make your own content/website and drop this scraping/duplication/hidden text.
I was trying to use hidden text to raise the keyword density, to see whether it would help the ranking or not; it's common trial and error for a beginner like me:
if I do see a change in ranking, that means I am already out of the sandbox and need to do other things to improve;
if there is no change in ranking, then I am still in the sandbox...
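Since "keyword density" keeps coming up in this thread, here is a rough sketch of what the metric usually means: occurrences of a keyword divided by the total word count. The helper function below is hypothetical, not any tool Google or this forum provides:

```javascript
// Hypothetical helper: naive keyword density (keyword hits / total words).
function keywordDensity(text, keyword) {
  const words = text.toLowerCase().split(/\s+/).filter(Boolean);
  if (words.length === 0) return 0;
  const hits = words.filter((w) => w === keyword.toLowerCase()).length;
  return hits / words.length;
}

// Example: "widgets" appears 2 times in 8 words, so density is 25%.
const sample = "buy widgets online best widgets shop free shipping";
console.log((keywordDensity(sample, "widgets") * 100).toFixed(1) + "%"); // "25.0%"
```

Stuffing hidden text to push this number up is exactly what the filters look for, which is the point the other posters are making.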
Here's the site structure:
Product info pages, such as demo, features, contact, etc., and a DIRECTORY.
Since the site is a classic example of an ordinary internet site, I assume that our DIRECTORY is to blame for making us a "scraper" in Google's eyes. One thing though... WE ARE NOT SCRAPING ANYTHING. We have a human-edited directory, with niche links, all submitted by webmasters, and with unique descriptions written by our administrators. We do not use anybody's search results; we are not copying, scraping, or stealing anything. I insist that we have 100% the same kind of (yet smaller) directory as, say, Yahoo or DMOZ, but it is UNIQUE. Now I am sure that Google should consider this and accept that they could have made a mistake. Please tell me what I should do to get the problem fixed fast. How can I let Google know quickly about this particular problem? Would Google even admit that they could make a mistake?
If someone wants a directory for their users, they can link to the Yahoo directory, DMOZ, or Google; the point is that most sites only have a directory because of the added content.
These days I am working on a site where I am deleting a lot of text, because it was hit by the duplicate content filter: some of the info tells the user what to do, and that is repeated on most pages. With the Google filter set that high, I have to remove the text, and the user has to guess what to do.
And what if it's only a part of the site, of minor importance, and could be removed if requested? Why would our sites be banned while the Yahoo directory or DMOZ are still in? I say it's double standards then.
Mine has been in DMOZ for years. And it is not a directory site; it's a site about a specific topic, with articles written every day!
::: sigh ::: This is the day I've been dreading, years of work down the drain.