So why have I not done this until now?
The reason is that our site did very well in Google, so we were too scared to make even the slightest alteration, just in case Google didn't like it.
Now we have lost our rankings in Google, so we have decided to do what we wanted to do all along, and that's make it a much better site than it already was.
I wonder how much Google has slowed down the natural progression of the quality of the internet?
I admit we are also changing the site hoping to get back into G's good books, but this would have been done long ago if Google were not around.
For a site which has not already suffered a ranking loss, I'd recommend an incremental approach -- Go slow. Yes, it's much more difficult, but this is what Google recommends for large changes; Matt Cutts recommended an incremental approach instead of mass-changing all page URLs on a site.
The alternative is a temporary ranking loss ranging from weeks to months to quarters; the potential effect on a company's revenue and viability cannot be ignored.
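When URLs do have to move, the usual mechanism for preserving rankings is a permanent (301) redirect from each old address to its new one, rolled out one section at a time rather than site-wide. A minimal Apache sketch, with purely hypothetical paths (`/oldsection/`, `/new-section/`) used for illustration:

```apache
# .htaccess sketch — migrate one section per batch, not the whole site at once
# (paths here are made-up examples, not from the original post)
RewriteEngine On

# Batch 1: permanently redirect everything under /oldsection/ to /new-section/
RewriteRule ^oldsection/(.*)$ /new-section/$1 [R=301,L]

# Later batches would follow only after rankings for batch 1 have settled
```

The `R=301` flag tells crawlers the move is permanent, and the `L` flag stops further rule processing for the matched request.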
I'd also recommend making a plan -- Complete redesigns and mass-changes of URLs are usually only necessary when the original design was lacking in planning or implementation. It's easy enough to understand how this could happen, especially when a site transforms over time from a hobby site to being a very-popular Web destination (think about Jerry Yang and David Filo's original links collection -- "Jerry and David's Guide to the World Wide Web," which became Yahoo!). But we're now in a different era -- Sites should be designed so that links and pages don't ever need to change -- unless there's a lawsuit forcing the change from outside.
Pages need to be designed, but so do URL-architectures and file-structure architectures -- and the naming conventions used to support them. With the advent of CSS, a good page design can take on "a whole new look" simply by changing a stylesheet, and this lost-rankings issue clearly shows the benefit of separating content from presentation.
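To make the separation-of-concerns point concrete: with content in HTML and presentation in CSS, a redesign can be as small as pointing the page at a different stylesheet, with no URL or markup changes at all. The file names and class names below are invented for illustration:

```html
<!-- The markup (and its URL) never changes; only the stylesheet reference does. -->
<!-- Swapping classic.css for redesign.css restyles every page on the site.    -->
<head>
  <link rel="stylesheet" href="/styles/classic.css">
</head>
<body>
  <h1 class="site-title">Example Site</h1>
  <p class="intro">Content stays put; presentation lives entirely in the CSS.</p>
</body>
```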
Jim
I wonder how much Google has slowed down the natural progression of the quality of the internet?
This is a good point, although it is not just Google but search engines in general, and it has been going on since before Google existed.
If search engines could miraculously be taken out of the Internet equation, I believe we would have a better Internet. No more trying to strike a balance between writing text content for users and writing it for the search engines, for example.
If search engines could miraculously be taken out of the Internet equation I believe that we would have a better Internet.
And an Internet where we had a lot of trouble finding content, too. Along these lines (and slightly off-topic), there are pioneers working on semantic tagging schemas such as RDFa and eRDF.
For any who wonder how this might shift things, see this Yahoo Search interview with Ben Adida [ysearchblog.com]. Ben is the Creative Commons representative to the W3C and also the chairperson for the RDF-in-HTML task force.
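As a rough sketch of what RDFa markup looks like in practice (the title and author values below are invented; the license URL is the standard Creative Commons form): metadata is layered onto ordinary HTML via attributes like `property` and `rel`, so machines can read what humans already see.

```html
<!-- RDFa 1.0 sketch: machine-readable metadata embedded in normal markup -->
<div xmlns:dc="http://purl.org/dc/elements/1.1/">
  <h2 property="dc:title">Example Article</h2>
  <p>Written by <span property="dc:creator">A. Author</span>.</p>
  <!-- rel="license" lets a crawler discover the licensing terms -->
  <a rel="license" href="http://creativecommons.org/licenses/by/3.0/">CC BY 3.0</a>
</div>
```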
------
I agree that the current state of search technology makes me think twice before doing any major overhaul to a website. But maybe that's a good thing, too. "Improvements" made without usability testing often create more trouble than they fix! And to be completely fair to Google et al., it's defending against webspam that throws the spanner into the works, not their indexing technology.