This is a general question to the whole community:
I've been hired by a company that has a network of about 80 different web sites, all of them containing information related to travel services in the same country, but each one of them covering a different aspect of the travel industry (hotels, tours, transportation, etc.) and also independent sites covering specific travel services available in different towns of the country.
In the beginning the network of sites was doing very well in the SERPs, with more than 200 #1 positions in Google for some of the most important keywords in the topic. But it only worked for a few months and, of course, after November 2003 most sites began to disappear from Google's SERPs, but not from the other SEs ... not yet, at least.
They hired me to try to solve this situation, which looks to me to be the result of the sites being filtered out of Google's SERPs: either by losing 'authority score' once the Hilltop algorithm identified all the sites as affiliate sites, and therefore dropping in the SERPs, or by being considered a network spamming Google. The latter doesn't seem to be the case, as the sites keep their PR 4 & 5 and their backlinks, but they don't show in Google's primary results.
I've recommended that they consolidate all of their sites into one 'megasite' containing all the information that is currently divided among the separate sites, and in this way avoid any future problem, or even being banned by the SEs. This would also add depth and relevancy to the company's internet presence.
Now the question:
Do you think I should contact Google and let them know that we are moving to a 'better', non-abusive approach (as we understand that having a network of sites might be seen as a deceptive technique), and then 301-redirect all 80 domains to the new site? Or should I just make the new site completely independent of the old network and begin from scratch?
I really appreciate your comments on this topic.
Start up a new mega-site and design it for Google. Link a couple of the other sites to it to get it crawled, but do not link back, and do not go overboard.
As you are building it, get links from outside your network, and only add more links from your network at about half the speed that you are getting outside links.
Build the content and links over time and build a stable position in Google. Trying for the quick kill in all sorts of different areas will lead to what happened before. Good content and good links leads to stability.
Are all or most of these 80 sites on the same IP address? That could have triggered the filter you mention.
If so I'd make sure your new mega-site is on a completely different IP address to be safe, and like BigDave said, don't link to any of the 80 sites from your new mega-site.
In fact I'd even steer clear of linking to the mega-site from any of the old sites just in case Google is somehow utilizing whois information to penalise excessive linking between sites owned by the same person/company. Whilst there is no evidence to support such a theory, it is nevertheless a conspiracy theory doing the rounds and if you want to be ultra safe I wouldn't risk it.
The best thing to do would be to set up your new site WITHOUT killing your old ones. In fact, as someone else said, put links to your new one from your old ones. As you work up your rank in the search engines for the new one, ONE OR TWO AT A TIME you can slowly drop off the network sites and start redirecting them to the new one. You'll want to do this slowly and keep an eye on the position and PageRank of the new one - I would say a week or two between redirecting each site. The process may take a year, possibly more, but will be necessary and worth it for keeping your position.
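When the time comes to redirect a retired network site, assuming the old sites run on Apache with mod_rewrite available, a per-domain 301 can be done with a couple of lines in that site's .htaccess (the target domain and path here are hypothetical placeholders, not from the thread):

```apache
# .htaccess on one of the old network sites (e.g. the hotels site)
RewriteEngine On
# Permanently (301) redirect every request to the matching section
# of the new mega-site, preserving the requested path
RewriteRule ^(.*)$ http://www.example.com/hotels/$1 [R=301,L]
```

A 301 (rather than a 302) is what signals to the engines that the move is permanent, which is what you want when asking them to transfer the old listing to the new URL.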
You implied in your original message that your client might be looking for long-lasting results. Well, long-lasting results take a while to build, so don't take shortcuts. Make this a large, autonomous site that is squeaky clean.
Don't kill your old sites for Google. The reality is... that it is extremely difficult to get any results in Google. Their florida/mom-and-pop/adwords/money-word filter is pure idiocy. About the only people left in the results are large directories with DMOZ listings. If you do not qualify as such, Google don't love you no more. If you must get hits from Google break out the wallet and pay for adwords on existing sites IMO. The CEOs at Google will thank you when they deposit their millions in the bank, and you might get some G hits.
(I hope this didn't sound too negative; I am told I am too negative. I should be more cheerful about my sites being trashed by Google. So I am sorry for that.)
[edited by: ILuvSrchEngines at 5:09 am (utc) on April 28, 2004]
Hehe. Nice to see you are now looking on the bright side of death. :)
It feels like a dark cloud has lifted off your shoulders ILuvSrchEngines.
If you've heard about new sites being sandboxed, you'll know this is not a good time to start a new one.
I moved to a new one a month ago with a 301 redirect, but got no SERPs. I lost all the Google traffic to the new site.
I would create a new mega-site from scratch. As BigDave said, be careful when you link. I wouldn't recommend crosslinking. Just one-way linking from high-PR or authoritative sites would be better.
You can rewrite the content from those 80 sites and reorganize it with better and cleaner SEO. That way you won't lose the current traffic from the other SEs while you build one site for all SEs.
It takes longer these days to rank well on Google, especially with a NEW SITE. So, until you see some significant traffic to the newly designed mega-site, I wouldn't put all the eggs in one basket.
Soccer_star: Yes. One of the problems that might have triggered the filter is that all the sites share the first 3 IP octets. In fact, that's not the only thing they share. They even share the navigation bar, which makes it even more amazing that they didn't get filtered sooner. Thus, one of the first measures I was considering is to host the new site on a completely different IP range, even though this means hiring a different hosting provider.
I agree with most of you that killing the old sites while they are still ranking well on the other SEs would be too risky, as we have no guarantee that the new site is going to keep the rankings inherited from the old sites, especially because we would be redirecting several entirely INDEPENDENT URLs to several INTERNAL PAGES of a single URL.
Therefore, keeping both the old and the new sites simultaneously would be the 'safest' approach. But this is not only too complex (we have to maintain updated rates for all the services we offer); it also triggers one of my main doubts: by doing this, we would be continuing the attempt to fool Google, and one of MY most important rules is to do everything aiming for long-term positive results, which in the SEO world can only be achieved by following the basic rules of the SEs, period. And one of the main Google Quality Guidelines is: "Don't create multiple pages, sub-domains, or domains with substantially duplicate content."
Cabbie's idea of putting the googlebot noindex/nofollow tag on the 80 old sites sounds interesting, but I would like to hear the community's thoughts about this option. Will that really prevent Google from seeing the old sites and the new mega-site as duplicates? From my point of view, having duplicated sites online is not per se spamming... it should only be considered spamming IF various duplicate-content sites are indexed on the same SE.
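For anyone following along, the tag cabbie is suggesting goes in the head of every page on the old sites. Because it is addressed to Googlebot by name rather than the generic "robots" name, the other SEs' spiders should ignore it and keep indexing the pages:

```html
<!-- Tells Googlebot not to index this page or follow its links. -->
<!-- Spiders that only honour the generic name="robots" tag are unaffected. -->
<meta name="googlebot" content="noindex,nofollow">
```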
We still don't know if we'll go for the single domain amalgamated site or just reproduce the old sites. Both have their disadvantages. If you build one site and it gets banned, you lose the house. But if you build multiple sites you need to get more incoming links as a whole to get decent SERPs.
Anyhow, to the problem of robots.txt - If we ban Googlebot from the old sites, we'll never know if Google lifted the penalty, since they will all show as unindexed. If we don't ban Googlebot we risk a duplicate content penalty, especially from a manual spam report. But things being as they are, we'll most likely ban Googlebot because we could wait until hell freezes over for the penalty to be lifted.
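For the record, a minimal robots.txt that bans only Googlebot while leaving the other SEs' spiders free to crawl would look like this:

```text
# robots.txt at the root of each old site
User-agent: Googlebot
Disallow: /

# Everyone else may crawl normally
User-agent: *
Disallow:
```

One caveat: robots.txt only stops crawling, so Google can still show the bare URLs it already knows about; a per-page noindex tag needs the page to be crawled before it takes effect, so the two approaches behave differently.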
Just one small note about small vs. large site: Most of our smaller sites were toasted because we had links pages on them and cross linked them. We are now beginning to succeed with a large site for which we obtain links to the internal pages. On the smaller sites we would only ever get links for the home page.
The only logical way to proceed IMO when all of your sites are banned in Google but doing fine everywhere else is to pull out the wallet for AdWords like they want you to. I personally don't believe Google is imposing penalties on everyone.
It is more likely that they are excluding many sites based on some sort of Hilltop like algo which effectively locks people out if they do not pass the Hilltop tests. The bad thing about what they are doing is seen right here on this thread. People are wasting tons of time and money trying to fix sites that aren't even broken.
I'd like to add that, though it was a totally different situation, that was part of what I did for a site hit by Florida that did not come back when most other sites did. Not only did I de-optimize somewhat and increase the scope of the site a bit to be more relevant for the full local phrase, but instead of having all links point to the homepage I got some to link to appropriate internal pages that are relevant for the search term.
I can't pin-point any one particular thing because there were several things done, but that was one I considered primary because I saw it done with the first site that came back for the search in question. The site did come back, and is stronger than ever.
As far as different hosting and IPs are concerned, it might also be worth looking into not having the whois info match up. There are domain registrars that offer private registration.
I've kept the older sites as they continue to do reasonably well across the other SEs. Should other SEs change to "like" megasites better, as G seems to like them now, I'm ahead of the game and am patiently waiting for them. And as BigDave mentioned in an earlier post in this thread - "you never know, Google may tweak their algo next month and those sites may shoot back up to the top."
(It's like when we used to leave old pages, that once did well but dropped, "alone" because you never knew when they might be in vogue again. Only now it is old "sites" we leave alone and not "pages" - who would have ever thunk it?)
And I agree with most of what was said in the replies (keeping the old sites if they are doing fine in the other SEs, creating the new megasite from scratch and on independent IPs, putting the googlebot noindex/nofollow tag on the old sites, etc.), but something really interesting has happened that I believe might be worth considering:
The first step I took when I began working for this company was to re-optimize all the sites and see if they came back to Google SERPS, but they didn’t. The sites began to gain positions in other SE’s, but Google listings just kept on disappearing.
I believe this confirmed that we were being filtered somehow by Google, because the optimization done to the pages was working elsewhere, but not for G.
Then I did something else and the results obtained are the whole reason of this new post:
Almost a week ago I began an AdWords campaign and guess what... many of our long-lost SERPs in Google have started to appear again this week. Coincidence? I honestly don't know, but now I'm even reconsidering the idea of consolidating into one megasite. I mean, maybe ILuvSrchEngines and many others were right, and all Google wants is big companies (basically those companies that can afford to pay for SEO) to pull out the wallet for AdWords.
Your thoughts on this one?
-- this might be a little off topic... should I begin a new thread? --