|Does Google Ban or Filter Web Directories?|
I think the subject is worth a thread of its own. It's only a suspicion so far. Yet I don't see DMOZ, Yahoo, or any major web directory being banned, filtered, or PRed to zero the way my web directory was. I checked it in Alexa (powered by Google) and I see some results from my site. Apparently Alexa returns old results from Google, but the weird thing is that Alexa itself has PR0 now. But that's another story!
If you run a web directory, feel free to post your experience here.
|Without quality serps, there will be no Adwords-Adsense program. |
I completely disagree.
You have to keep in mind the size of your KW list. PPC is, as a general rule, the tool used for large KW lists, whereas SEO is great for sharp-shooting high demand terms (with some exceptions which I won't get into).
Some of my (PPC) clients have keyword lists of 5000 +, which is far easier to manage with PPC than SEO.
You have to go into SEO knowing that, regardless of whether you perceive the SERPs as being "quality", they are going to change because you have competition, which is why a hybrid SEO / PPC strategy is the best way to go.
a hybrid SEO / PPC strategy is the best way to go!
I like this!
Google didn't delete many of their customers.
The end of Free Traffic?
SEO, controlling SERPs? Now many SEO sites show no ranking, sites deleted!
We become Google customers!
Will being a Google customer restore me to the SERPs?
Google became popular as the search engine that could easily dig up really obscure information. (It still does.)
More commercial searches can be less than perfect, which leaves the option of AdWords.
If people find a site through AdWords, they still tell their friends "I found it on Google".
Works out great for them.
"Can you please verify these numbers? What is your source?"
My source is myself: 175 websites, 150,000-200,000 unique visitors per day across my network, targeting about 100 different keywords in 3 languages, always for searches with more than 10 million competing results.
"Most internet surfers either don't know about other search engines or are too lazy to type the URL of another search engine in."
Sure, and this is our fault. The webmasters' fault: we put Google where it is, and we will pay if we do not give users other options. No market is safe for small marketers if more than 25% of it is in the hands of one company.
"Google uses dirty tricks, as for example the Sandbox effect, which is really just a way to make new webmasters pay to be listed for a long time." Can you show any proof that this is true?
Yes I can. I have dozens of examples. DOZENS. And the Sandbox keeps extending the waiting period for relevant words. Some words with more than 100 million results now have an 18-month Sandbox, and it's growing.
"Often older web sites offer more than the new web sites coming out"
Totally false. Older websites follow the rule that if they have a good, relevant position on Google, it's better not to touch anything. They are completely obsolete and never update, or they make irrelevant updates. Yahoo and MSN have different SERP formulas that take fresh updates into account, and the websites listed in Google's results are never there.
Of course, we are always talking about relevant searches, at least those with more than 50 million results.
|Now many SEO sites show no ranking, sites deleted! |
We become Google customers!
I have been reading the argument that Google plays with the SERPs to make everyone PPC customers for years.
I believed that a few years ago. But then I followed many updates, during which I heard the conspiracy theories, but also saw that Google's SERPs were changing NOT to drive people to PPC but to evolve their algo.
They work hard at improving their results (whether or not you like them), and with their changes come changes in the SERPs.
A sophisticated marketer understands this and creates a strategy that is diversified, so that, like a mutual fund, risk is minimized.
|Google has 50 to 60% of SE market, but delivers to websites listed on them more or less same traffic that Yahoo or MSN with only 15 to 20%. |
Maybe that's true for you. It isn't true for me. Which just goes to show that different search engines have different ways of determining what's most relevant to users' queries--just as common sense would suggest.
|My source is myself. 175 websites, 150.000/200.000 unique visitors by day on my network, targeting about 100 different keywords in 3 languages for searches always with more than 10 million competitors results. |
|The activity at more than 60 search sites makes up the total search volume upon which percentages are based -- 4.3 billion searches this month (May). - source: Nielsen |
Your query sample represents: 100 different keywords / 4,300,000,000 searches ~ 0.0000023% of searches. (Please note I did not take foreign searches into account, so the sample size compared to all possible searches is actually even smaller than stated.)
|317,646,084 domains registered - Source Internet Domain Survey, Jan 2005, Internet Systems Consortium. |
Your domain sample represents: 175 domains / 317,646,084 domains ~ 0.000055% of registered domains.
Not sure the sample sizes are quite large enough to be conclusive.
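For anyone who wants to check the arithmetic behind those percentages, here's a quick sketch; the inputs are just the Nielsen and ISC figures quoted above, and the `pct` helper is made up for illustration:

```python
# Back-of-the-envelope check of the sample sizes quoted in this thread.

def pct(sample, population):
    """Return sample as a percentage of population."""
    return 100.0 * sample / population

query_share = pct(100, 4_300_000_000)   # keywords vs. monthly searches (Nielsen)
domain_share = pct(175, 317_646_084)    # sites vs. registered domains (ISC)

print(f"query sample:  {query_share:.7f}% of searches")
print(f"domain sample: {domain_share:.7f}% of domains")
```

It comes out to roughly 0.0000023% of searches and 0.000055% of registered domains; tiny either way.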
|Not sure the sample sizes are quite large enough to be conclusive. |
Not sure? I'm pretty sure you're sure.
1. To play devil's advocate:
I've read posters on WebmasterWorld saying they have sites with pages numbering in the 100,000's. If that's common, Google is doing what it's always done: use software to present the most useful results to users.
Which means dumping pages, and sometimes, sites. 100,000 pages of anything can't _all_ be top-notch.
They're a publicly quoted company now. Their shareholders expect returns. All they have is their SERPs, really, and lately, their brand. The heat is on.
They do an update, spammers get hammered, a few legit webmasters complain, they put 'em back in, tweak the algorithm again, fewer complaints, more junk 'flushed': a good day's work.
That means that any page ...
- That doesn't have a few good external links to it,
- Has sections which are repeated elsewhere in the site, or on other sites,
- Has lots of links to other sites,
- Is autogenerated, so it carries patterns that can be flagged
... is not as 'interesting' as its opposite:
- A quirky hand-made page with authoritative, unique content, with links _to_ it from independent sites.
The trick for SEO is how to mimic the latter as easily as possible.
(Sorry if I'm restating the obvious.)
2. I got my main site back in. Not too bothered about the satellites, which I've nuked, anyway.
a. Removed all the content similar to that on pseudo-directory-Adsense-scraper-autogenerated-link-heavy sites
b. Emailed Google, admitted what I'd done wrong, and what I'd done to correct it. Was very polite, short, and to the point; gave the full URLs of the sites involved. Made it easy for them. No waffle, no 'buts', no carping (I bet they get a lot of that).
c. Got a reply 'your request has been forwarded to
d. A few follow up emails about what I was doing to correct problem; again, short, polite and to the point. I skim long emails; I bet they do too.
Got another email saying 'thank you for your patience', or words to that effect (subtext: stop bugging us(?))
Back in again about one week later. SERP positions seem the same. PR restored.
My site is not a directory. It's an old, eclectic personal site on steroids. That probably helped.
I'm going to make damn sure I don't rely on one site, or search engine, in future, for my traffic.
Thanks to all on MWM that helped with their informative posts, especially (heh, heh) the Contractor.
I'm pretty sure you're sure he's sure....
Googlebot is back on my site. After sending Google an email informing them that they're idiots and that they dropped my site mistakenly, I received the "forwarded to engineers" email.
Last night I see the bot was back in pretty much full action. Looks like Google is trying to clean up their mistakes after all.
Tiger, Junior, others-
The common characteristic on my sites that were dropped was that they either
(1) were the ORIGINATING web site with an external sitewide link (or several links) -- in other words, there was a link on every page of the site (including the home page, all content pages, all link directory pages) which pointed to a different site. This link was usually in the nav bar or footer of every page,
(2) the dropped site was the TARGET (recipient) of a sitewide link coming from a banned site (see #1 above).
Was this true of your sites?
No Girish, I had no footers.
Well, it appears that this ordeal that started with the July 28th massacre is now over for me. My site has now been restored to Google's SERPs with most of its original position and PR. One exception is that instead of 85,000 pages indexed, there are just under 6,000 now found by doing a site: search.
My site is a custom-coded directory, auto-generated with static pages once a month, with my own database of listings which has roughly 10% overlap with my other directories.
I was hoping that Google had found a global solution to reinstate the sites that should not have been removed in the first place, but unfortunately the other directories that I've been watching are still nowhere to be found in Google.
I was probably one of the first to notice that I had a site de-indexed, and so I was also probably one of the first to submit a re-inclusion request. My re-inclusion request was short and courteous, stating that my site could no longer be found using the site: command and that I had followed their suggestions the best I knew how. Hopefully it's just a matter of time until Google gets to the other requests and takes appropriate action.
I must say that over the past week I've taken a long hard look at my sites and my current business models and I've concluded that the best method of ensuring the longevity of my business is to make my sites totally focused on the end-user. There may be times where Google screws up and delists one of my sites, maybe even for a few months, but if my users really find my sites useful, my business will continue and Google will eventually correct the problem.
HAHAHA! I'm back! Happy days are here again...
>Well, it appears that this ordeal that started with the July 28th massacre is now over for me.<
And please don't forget the July 22nd massacre, where my site was hit too and hasn't recovered yet ;-)
I was hit at the beginning of Bourbon (May 16th) and haven't recovered yet. I think the site had a lot of problems with its structure, so I guess the time had to come. We are rebuilding the whole site, but Google still seems to have problems crawling and re-indexing it. We have also rewritten all the content of the site, so I don't know what could be wrong, except that many pages were redirected and are still PR0.
I'm sorry to ask, but what's a "scraper" site?
There hasn't been much discussion of the July 22nd hit. That was when my site was hit as well. Is it your opinion that July 22nd was a rollback, or was it the first of the duplicate filters being imposed on sites?
My site used to rank well for allin* searches. As of July 22, it no longer does.
My site used to show up #1 for "domain.com" searches. As of July 22, it no longer does.
I still see lots of googlebot activity.
My site has machine generated content with duplicate excerpts.
My site is over 200K pages and is mostly indexed, with heavy interlinking of relevant topics. Few external links. All pages are flat .html files.
Every once in a while I see a Google referral from a decent SERP position, but it usually lasts less than a day.
My site first started seeing significant traffic with Bourbon.
My site is 2-3 yrs old.
My site has a link directory that is now obsolete, so I blocked it three months ago with robots.txt.
I have AdSense on 98% of pages, but I spend 2K a month on AdWords.
I have one other site on the same IP that is related in topic, but with different content that I launched in early July.
Lost 95-99% of google traffic July 22. Where I used to rank in the top 10, I now rank between 100 - 500.
Google has said that there is not a penalty on my site.
What similarities do you see? Any?
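On the robots.txt point above, for comparison: blocking an obsolete link directory is normally just a Disallow rule like the following sketch, where the /links/ path is a made-up example standing in for the real directory:

```
User-agent: *
Disallow: /links/
```

Note this only stops compliant crawlers from fetching those pages; URLs that are already indexed can linger as URL-only listings for a while.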
>>There hasn't been much discussion of the July 22nd hit. That was when my site was hit as well. Is it your opinion that July 22nd was a rollback, or was it the first of the duplicate filters being imposed on sites?<<
I think it was Google's attempt to remove scrapers, and the automatic processes caused dramatic shifts in the ranking or indexing of innocent sites too.
>>What similarities do you see? Any? <<
Not much ;-)
My site is from 1997 (i.e. around 8 years old) and was hit first by Allegra (3rd Feb 2005), where it lost 75% of its Google referrals, then got hit on 22nd July 2005, ending up now with less than 10% of its Google referrals. PageRank of the homepage is PR5, and most other pages are PR4.
- show up #1 for "www.mysite.dk" searches.
- show up #1 or #2 for home page title searches
- site:www.mysite.dk shows all pages indexed correctly; no duplicates, no non-www vs www.mysite.dk problem etc...
- link:www.mysite.dk looks ok.
- PR values seems not affected.
- Googlebot is still visiting my site once or twice a day.
- I have adsense on most pages
- Google informed me that my site is not currently banned or penalized by Google
I hope this helps.
So I just did a search for site:www.domain.com and it said 1-10 of about 346,000 pages, which is about 100K pages more than I have.
I don't see any www.domain.com vs. domain.com listed pages. Is there a better way to look for dup pages?
One other thing I noticed today for the first time: in the SERPs, Google shows the title, the description, and the URL at the bottom. In some of the listings the URL at the bottom has an extra space after one of the slashes, which seems weird. If you click on the link in the title, it resolves to the right page.
>>In some of the listings the URL at the bottom has an extra space after one of the slashes, which seems weird. If you click on the link in the title, it resolves to the right page.<<
You may wish to view this thread
Reseller, thanks...I had just found that thread.
So, I've been trying to narrow down the problem. When I do the site:www.domain.com search, it returns many more pages than I have, which makes me suspect a duplicate content penalty. As I go through the results, I notice a large number of URL-only listings. Then it gets a little stranger: if I do an English-only search, I get the right number of pages, and all the results have titles and descriptions.
Do you have url only listings?
If you do an English-only search for site:www.domain.com, do you have a title and description for every page?
I have been trying to figure this problem out forever...
Google reports that we have 76,400 pages. On our site we have only 21,000 pages.
Our site disappeared from Google after Feb. 2nd.
Now, if I search for English-only results, Google returns a total of 38,900.
My question: what are the other 50,000 pages that Google has indexed?
Based on a recommendation, we did the www-to-non-www 301 redirect. I have even moved directories and did a 301 to the new directories to try to get the counts more in line.
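For reference, the www-to-non-www 301 is usually just a couple of rewrite rules; a sketch assuming Apache with mod_rewrite enabled, with example.com standing in for the real domain:

```apache
# Hypothetical .htaccess sketch: send every www request
# to the bare domain with a permanent (301) redirect.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^(.*)$ http://example.com/$1 [R=301,L]
```

The 301 (rather than 302) matters here, since it tells the engines which version of the URL is canonical.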
Can anyone shed some light on what to do in this situation? Do many people have the problem where Google reports many more pages than actually exist? Can this cause a dup penalty? If these are dups, how do I get rid of them? We do not have duplicated content, so is it something on the Google side that has decided to duplicate these?
I have written Google multiple times about this issue; however, I get the same response: "No penalty - sometimes sites get removed from the index..."
The bot visits our site each day... Actually, this week the moz 5.0 Googlebot requested a total of 23,000 pages.
We do have many pages which are URL only. Also, we have many pages which have cache dates of Nov 1 2004, Dec 2 2004, and Feb 5 2004.
Does any of this ring a bell?
If anyone has any ideas... I am all ears.
Ok back to the main topic..
- People who think that Googlebot has been back crawling their sites should make sure that it's not "Googlebot-Image", and certainly not "Mediapartners-Google".
- People who are back in Google's SERPs: please state whether you are back after a re-inclusion request, after an update to your site, or with no action on your side.
- It's now a fact that the massacre of the 29th of July was targeting ODP clones and negatively affected unique directories. It's also a fact that not all ODP clones have been filtered/banned (alexa, opera, directory.net, etc.).
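On the first point: the quickest way to tell which Google bot is actually visiting is to check the user-agent string in the raw access logs. A minimal sketch (the helper name and the log line are made up for illustration):

```python
# Classify a raw access-log line by which Google crawler sent it.
def google_bot_type(log_line):
    ua = log_line.lower()
    if "mediapartners-google" in ua:
        return "AdSense bot"       # fetches pages for ad targeting only
    if "googlebot-image" in ua:
        return "image crawler"
    if "googlebot" in ua:
        return "main web crawler"  # the one that matters for SERPs
    return "not Google"

line = '66.249.66.1 - - [06/Aug/2005] "GET / HTTP/1.1" 200 "Mediapartners-Google/2.1"'
print(google_bot_type(line))  # → AdSense bot
```

The order of the checks matters: "Googlebot-Image" also contains "googlebot", so the more specific strings have to be tested first.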
Google is giving me a "we are sorry" message when I try to search now, from multiple machines. Is anybody else having this issue?
I was trying to dig further into this more-pages-listed-than-I-have issue and Google choked... I must have been getting close.
WOOOO, they're back online....
I have heavy interlinking, and I've noticed that where a page has over 1K links, the URLs it points to show inflated page counts. For example, I have a page that links to every city in Illinois. There are 1,200-ish cities, so I have 1,200-ish pages, yet Google says I have over 4K pages in that subdirectory (by site:www.domain.com/subdir). Only one of these pages has a title and description.
Do people with inflated page counts have over 1K links on a page that points to them? Are all the pages in the same subdirectory?
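If anyone wants to compare notes, counting the links per page is easy to script with Python's standard library; a rough sketch, where the sample page is a stand-in for a real city-index page:

```python
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Count <a href=...> tags on a page."""
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        if tag == "a" and any(name == "href" for name, _ in attrs):
            self.count += 1

def count_links(html):
    parser = LinkCounter()
    parser.feed(html)
    return parser.count

# Stand-in for a city-index page with 1,200 outbound links:
page = "<ul>" + "".join(
    f'<li><a href="/city{i}.html">city {i}</a></li>' for i in range(1200)
) + "</ul>"
print(count_links(page))  # → 1200
```

Running something like this over the pages in a problem subdirectory makes it easy to see which ones are over the 1K mark being discussed here.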
I've seen pages hit on a purely informational content site (with an ODP listing and no dodgy links) that has only a few links on each of several pages, all going to genuine resources. It is not a directory site, not a scraper site, and scraper sites and directories don't link to the types of sites that are being linked to.
But those pages are being given the URL-only treatment just as though they were. Not the site - just those pages.
>>But those pages are being given the URL-only treatment just as though they were. Not the site - just those pages.<<
Have those pages AdSense blocks on?
Those of you who were hit on the 22nd or 28th of July: I'd like to know how your sites are doing on the following DCs. I'm asking because I see my site, which was hit on 22nd July, doing well, and the cache is from 6 Aug 2005.
And you are right.. I wish to see more of such DCs ;-)
Hit on the 28th, and not doing well in any of those.
How many kept their ODP portion and got reincluded?