| 10:43 pm on May 20, 2011 (gmt 0)|
tedster, thanks for pointing out that video!
I have a theory and it goes something like this:
Pretend you never read an SEO article. The more you "optimize" your site using nofollow tags, repetitive anchor text, keyword stuffing, etc., the more Google can tell you are trying very hard to "manipulate" their rankings. It sounds messed up, but it is true; to what extent, I do not know.
| 11:32 pm on May 20, 2011 (gmt 0)|
You may be right. For quite a while, Matt Cutts, John Mueller and others have been telling webmasters to "chase their users" rather than chasing the algorithm. Many webmasters have been calling that FUD. I hear it differently.
Those folks know that Google has been shifting their ranking process into a completely different zone, one that is almost impossible for them to describe with any precision to the general public. I hear their words as common sense, trying to save webmasters the frustration of chasing an algo they can't even model in their minds, which leads them to base actions on old advice, outdated formulas, etc.
I remember very well when the "minus 950" penalty first got noticed (December 2006!). It looked like nothing anyone had ever seen. Matt Cutts said in a video that it was being applied against sites where the webmaster just did a lot of things they read on SEO forums. That was when we started calling it an OOP (over-optimization penalty).
Healthy skepticism is one thing, full blown paranoia is something different. It's important to maintain some balance in all things Google. Even the name "over-optimization penalty" that we invented is an oxymoron. If some action loses rankings, then it surely isn't any kind of optimization!
| 2:42 am on May 21, 2011 (gmt 0)|
tedster, before I got into SEO I built websites for businesses, mostly ecommerce. I eventually became interested in developing websites for myself after learning some tricks on how to rank in search engines. I remember one article that outlined all of the main factors that make you rank in Google. It essentially said that you need to put your main keywords and phrases in your title tag, meta tags/keywords, the URL itself, H1 and H2 tags, between bold tags, and in your paragraphs. That is what I did, as well as building backlinks (very over-optimized backlinks, at that).
The site never ranked; it sat around the 100th page for its main keywords no matter how many backlinks it had. I soon realized that some sites I wasn't paying attention to were ranking very well without any SEO at all - sites I had built before I started reading about SEO. I then attempted to apply just enough SEO without going overboard and getting hit with a penalty. This is what we all essentially try to do: optimize our sites enough without hitting the trip wire. It seems Google keeps making those trip wires more and more sensitive over the years.
Over-optimization is a real thing, and it always will be. Google does not want anyone to manipulate their system. Someone told me many years ago that the most important word in real SEO is "natural", because Google is getting better and better at recognizing artificial behavior.
|norton j radstock|
| 5:17 am on May 21, 2011 (gmt 0)|
This is ridiculous. You have a page perfectly suited to its function - quick loading, and it does the job exactly. Why worry?
However if you really must, there are lots of things that might usefully go on a search page.
First, something about why people might want to contact you, e.g. "complete the form below for more technical information on express widgets".
You might make it more personal, with a note from you, or a quote from a customer saying how wonderful your services are. You could also put something about you and your ethos.
You might also include your legal and privacy information here, along with anything else that might answer a reader's questions without requiring an email.
You could add Facebook/Stumbleupon/Delicious/Google +1 buttons, in case somebody is looking to congratulate you on a fabulous website.
Of course, you should also offer ways to navigate back into the site if the reader changes their mind or arrived there by accident - a search box, nav bars, etc.
| 3:54 pm on May 21, 2011 (gmt 0)|
|My "contact us" page is a thin one |
|The more you "optimize" your site using nofollow tags, repetitive anchor text, keyword stuffing, etc., the more Google can tell you are trying very hard to "manipulate" their rankings. |
+1 ... IMO optimization is 'nuanced' and 'subtle', not 'over the top' and 'everything everywhere'. If you think about it from a search engine's perspective, it really has to be 'scored' that way. Otherwise you end up with 'the SERPs of old', where 'cramming and stuffing' were the answer, and you miss the 'high quality' part of the web where people put information online and don't care (or know) much about search engines.
IOW: As a search engine, to be 'inclusive' rather than 'exclusive' in the SERPs, I think you really have to try to detect and rank the 'subtlety' of natural writing and web page creation, not simply the 'overdone' aspects of so-called optimization, or the 'overtness' of 'what optimization used to be'.
| 4:47 pm on May 21, 2011 (gmt 0)|
+1 +1 +1
That was almost perfectly expressed. Now again, how many times should I repeat the keyword in the first paragraph? ;)
| 6:40 pm on May 21, 2011 (gmt 0)|
Thanks tedster! Glad I got it out in 'understandable English', which I may have been known to struggle with on occasion! lol
| 2:38 am on May 22, 2011 (gmt 0)|
I see many sites where one strong section props up the very weak sections - one of the weak and easily exploitable points of Panda. In fact, they are outranking other, better sites thanks to their sitescore, or whatever it's called. These sites are also getting away with thin pages, boilerplate text, and downright spam. Imagine WebmasterWorld adding some webhosting pages with 50 words each and ranking #1 because of what we discuss here. So for some sites, having better pages is not enough, since it's clear to me that even user metrics aren't applied on a per-page basis but sitewide.
But for normal sites, I wouldn't be surprised if it's simple math: bad pages / good pages = x, and if x is more than y, you're screwed. What counts as a good page and what counts as a bad page is up to Google, of course. So, how sure are you?
What do you lose if you noindex the page?
What can you lose if it's "thin" and that page matters?
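The ratio guess above can be sketched as a toy heuristic. To be clear, this is purely speculative: the function name, the per-page quality scores, and both thresholds are invented for illustration, and nothing here reflects Google's actual Panda scoring.

```python
# Toy sketch of the speculative "bad pages / good pages = x" idea.
# All names and thresholds are hypothetical, not Google's.
def site_flagged(page_scores, quality_cutoff=0.5, ratio_limit=0.3):
    """page_scores: hypothetical per-page quality scores in [0, 1]."""
    bad = sum(1 for q in page_scores if q < quality_cutoff)
    good = len(page_scores) - bad
    if good == 0:
        return True  # nothing but bad pages
    # "if x is more than y you're screwed"
    return bad / good > ratio_limit
```

Under this guess, noindexing a thin page removes it from the "bad" count, which is the trade-off the questions above are weighing.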
| 12:44 am on May 23, 2011 (gmt 0)|
I just want to quickly touch on nofollow on internal links once more since it is related to this thread.
Why did Google release the nofollow attribute? Its sole purpose is to let you link out to an external website without passing link juice and without being penalized - for example, on paid links and comment links. It allows you to link out to questionable websites without fearing you might end up hurt.
nofollow was never meant for internal links. If you do not want Google to crawl certain pages, it is always best to use robots.txt.
Using nofollow on your inner pages can be seen as an attempt to game Google's algorithm, something they very much frown upon.
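For reference, here is what the two mechanisms look like in practice (the path and URL below are made up for illustration):

```
# robots.txt - tells crawlers not to fetch a whole section
User-agent: *
Disallow: /private/
```

```html
<!-- rel="nofollow" on a single outbound link, e.g. a paid link -->
<a href="http://example.com/sponsor" rel="nofollow">sponsor</a>
```

Note that robots.txt controls crawling of the section, while nofollow applies per-link.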
| This 39 message thread spans 2 pages |