| 10:26 pm on Feb 8, 2011 (gmt 0)|
I think my single best seo strategy is realizing that Google has evolved so much that a strategy must be multi-faceted to be successful in the long term.
There are some tricks that can exploit temporary algo weaknesses. Google is getting faster at fixing those weaknesses. IMHO it is no longer profitable enough to follow the "churn and burn" method. I see greater revenue potential in long term methods, and that means fewer tricks and more fundamentals.
If you focus solely on links you are going to have problems. You need good content to keep people on your site and convert. Without some content your usage data will be poor and Google seems to be using this more and more in their rankings.
If you focus solely on content you are going to end up building a mansion with no roads pointing to it. Eventually great content can be discovered and garner the links it deserves but it takes too long and too much investment to simply hope that people discover your site & link to it.
If you focus solely on social you are going to get a short term boost. Social websites tend to have very short lives. When a social site loses popularity so goes your success.
I could go on and mention other aspects that can benefit you. The best strategy is to build synergy using a multi-faceted strategy. For example use social marketing to gain user generated content which attracts new links which then exposes you to more people which then feeds into more social popularity.
That's just my thought.
| 10:30 pm on Feb 8, 2011 (gmt 0)|
Quit SEOing and get word of mouth traffic. (lol ... sort of)
| 11:00 pm on Feb 8, 2011 (gmt 0)|
Depending on the site, I think I'd go for valid HTML. I have a client whose HTML was so bad that Google only found 19 of the 50k+ pages. After fixing the HTML the number of pages indexed went through the roof. The number of links soared too because Google could index the pages with links pointing to them and award proper credit. Link count went from 9 to 15,000 in two weeks and there are over 1.6 million links credited now. The site had been up for over a year when I started for them and hadn't accomplished anything.
| 11:08 pm on Feb 8, 2011 (gmt 0)|
|I think I'd go for valid HTML. |
Hear, hear, I'll second that! Actually, valid everything!
| 11:25 pm on Feb 8, 2011 (gmt 0)|
For me too, without question, valid html. Through W3C Unicorn I get 3 back links per valid page from a PR10 domain (root). This usually propels my client sites to first page results within weeks of launch with almost 0 back links.
And you have to link to them in a way that your page becomes a bookmark in their database so it is always available even without being referred by your domain.
I don't mind letting this be known because to take advantage of it one has to produce valid code or there will be no back links on the W3C page. Google is somehow capable of finding those bookmarked back links within W3C and gives (superb) credit for them.
| 4:29 am on Feb 9, 2011 (gmt 0)|
Correction to the above -- it's actually 4 back links not 3. Also, maybe it's best if I provide a more detailed explanation of what I said above.
When most people place the validation link on their page they use the old school URI of validator.w3.org/check?uri=referer -- but that page does not exist on W3C, it's just a throw away reference and it can only be accessed by referral directly from your site.
If instead you use the new school way of understanding that it's all about conformance then you will use validator.w3.org/unicorn/ such as validator.w3.org/unicorn/check?ucn_uri=www.example.com%2F&ucn_task=conformance (me luvs unicorns).
When you see the validation result on the page W3C will provide a bookmark suggestion of:
validator.w3.org/check?uri=http%3A%2F%2Fwww.example.com%2F <------- Don't use this one; it will only give you one back link.
Rather than simply use it as a bookmark, apply it as your link to them.
validator.w3.org/unicorn/check?ucn_uri=www.example.com%2F&ucn_task=conformance <------- This one forces a full conformance check and gives you 4 back links per valid page.
To get that URI above just copy it from the browser address window but DON'T FORGET TO CHANGE THE & TO &amp; -- or else your code will become invalid, oh the irony of it all.
Now place that link on each valid page of your site and wait a few weeks for it to get indexed. Then go to Google and search your site by www.example.com (without the "site:" operator), and you will see a bunch of back links from W3C :)
I don't know why it is like this but by doing it this way your links from your domain persist on the W3C domain so that it appears as if it is a page on W3C rather than just a results page that gets thrown away after you leave. They then get indexed by Google.
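To make the escaping step above concrete, here is a minimal sketch of building that link; the www.example.com URL is the same placeholder as above, and Python is used purely to illustrate the percent-encoding and the & to &amp; substitution, not anything from the posts:

```python
from urllib.parse import urlencode

# Placeholder page URL -- substitute your own site.
page = "www.example.com/"

# Build the Unicorn conformance-check URI described above.
# urlencode percent-encodes the value ("/" becomes "%2F").
query = urlencode({"ucn_uri": page, "ucn_task": "conformance"})
href = "validator.w3.org/unicorn/check?" + query

# In HTML source a raw "&" inside an attribute is invalid markup;
# it must be written as "&amp;" or the page itself won't validate.
html_href = href.replace("&", "&amp;")
print(html_href)
```

The printed value is what goes into the link's href attribute on each page; the browser un-escapes `&amp;` back to `&` before requesting the URL.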
| 9:19 am on Feb 9, 2011 (gmt 0)|
SevenCubed, thank you for sharing this info. I read and reread your post. I may be acting dumb, but I can still see 4 links to my site even if my site doesn't pass the validation test. Should one then go the extra mile of validating the site, or simply stick a link to the validation page somewhere and get it indexed?
When I replace & with &amp; in a valid URL, I get the message "No task specified. Unicorn used its default task: "General Conformance Check"." Why should & be replaced with &amp;?
| 10:26 am on Feb 9, 2011 (gmt 0)|
Right. I applied the following code: validator.w3.org/unicorn/check?ucn_uri=www.example.com%2F&ucn_task=conformance in the footer of my blog and I got 4 backlinks without my blog passing validation... is it that easy?
| 12:14 pm on Feb 9, 2011 (gmt 0)|
Ya'll are phunny!
Did you really think there was value in a backlink there? Come on now, I thought you were joshing us?
|Then go to Google and search your site by www.example.com (without the "site:" operator), and you will see a bunch of back links from W3C. |
Yikes! Really?! No way. If that is the case, it won't be that way for long, not after this topic. ;)
Oh wait, just checked, those are URI-only entries, which is the default behavior for documents that are disallowed via robots.txt. I wouldn't expect one inkling of link value to come from those.
| 12:28 pm on Feb 9, 2011 (gmt 0)|
Geez. WW 138 errors
| 12:56 pm on Feb 9, 2011 (gmt 0)|
My thoughts on this...
|When most people place the validation link on their page they use the old school URI of validator.w3.org/check?uri=referer -- but that page does not exist on W3C, it's just a throw away reference and it can only be accessed by referral directly from your site. |
^ Absolutely the best way to validate pages on the fly.
|If instead you use the new school way of understanding that it's all about conformance then you will use validator.w3.org/unicorn/ such as validator.w3.org/unicorn/check?ucn_uri=www.example.com%2F&ucn_task=conformance (me luvs unicorns). |
Nice try, but I don't believe it is doing what you think it is. Those references are disallowed via the W3C robots.txt file. That means Google will find them, from that new school way of understanding, and show a URI-only listing when performing certain searches.
If these are new sites and you are performing searches and finding those URI only references, that means Google knows about the documents but that is about all.
If anything, I'd go out on a limb and say that you're throwing just a little bit of equity at a black hole. ;)
By the way, the /check? produces the same results so they are both displaying the same behavior, it is not a throw away reference as you say.
| 1:01 pm on Feb 9, 2011 (gmt 0)|
Social networking is one of the best SEO strategies.
[edited by: goodroi at 1:44 pm (utc) on Feb 9, 2011]
[edit reason] Please no personal urls [/edit]
| 1:05 pm on Feb 9, 2011 (gmt 0)|
Parroting TMS: Quit SEOing and start writing for real people. Y'all are fretting too much on the tech. People convert, SE's do not.
| 2:50 pm on Feb 9, 2011 (gmt 0)|
Best SEO strategy: Do not believe everything you read in SEO forums.
Also important, diversify. We only have one site (although we own all similar domains, they are 301'd), so we have diversified our traffic sources and our revenue sources. More subtly, we have diversified different areas of our site in terms of structure so that lost traffic to one area tends to be compensated by gains elsewhere.
| 3:15 pm on Feb 9, 2011 (gmt 0)|
Diversity saved my bacon :).
Little bit of adsense, little bit of SEO, little bit of direct ad sales, little bit of direct sales, all wrapped up makes for stable income. Even when one aspect has the bottom drop out of it. And in my niche I have 3 solid white hat sites that rank OK; one primary but if the primary ever dies I have 2 more ready to go with very little push.
One of the projects that my spouse accused me of "working on stuff that doesn't make any money" is making up almost half my monthly income right now.
(OTOH a couple of things my friends make very good cash on, I'm lucky if I get $100 a month on).
I'm building the structure for another website that's not even in my niche right now, not even sure how I'm going to monetize it 100%, but I think there's a reasonable chance it'll produce an ongoing revenue stream as well. Always good to have one more thing in your back pocket.
| 3:48 pm on Feb 9, 2011 (gmt 0)|
I believe wholeheartedly in validating, and do it where I can, but I don't get all pageoneresults about it. My #1 tool is site architecture. It all has to MAKE SENSE; for search engines sure, but mostly for users. If it doesn't, none of the other stuff will help very long.
(unless you're a big brand. then you can get away with murder)
| 4:04 pm on Feb 9, 2011 (gmt 0)|
Watching internal site search reports for queries that get zero results - this has been excellent for some sites. The idea is this. If visitors are looking for a topic, then we ought to have something for them. It's a real demand from real people. When we create supply for that demand, the amount of SE traffic can also be amazing.
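A minimal sketch of mining those reports; the log format and the sample queries here are hypothetical, just to show the idea of tallying zero-result searches as a content gap list:

```python
from collections import Counter

# Hypothetical internal-search log: (query, result_count) pairs.
# In practice this would come from your analytics or server logs.
searches = [
    ("blue widgets", 14),
    ("widget repair", 0),
    ("widget repair", 0),
    ("green widgets", 3),
    ("widget rental", 0),
]

# Tally the queries that returned nothing -- each one is real
# demand from a real visitor with no supply on the site yet.
gaps = Counter(q for q, hits in searches if hits == 0)
for query, times in gaps.most_common():
    print(f"{query!r}: {times} zero-result searches")
```

The most frequent entries at the top of that list are the topics worth creating pages for first.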
| 4:33 pm on Feb 9, 2011 (gmt 0)|
Oh you're sneaky tedster! That's an interesting idea. Put a search box on the site and collect the data.
| 5:02 pm on Feb 9, 2011 (gmt 0)|
wheel: the autocomplete box is a great tool to figure out what and how people look for things...
| 5:07 pm on Feb 9, 2011 (gmt 0)|
|Watching internal site search reports for queries that get zero results - this has been excellent for some sites. The idea is this. If visitors are looking for a topic, then we ought to have something for them. It's a real demand from real people. When we create supply for that demand, the amount of SE traffic can also be amazing. |
Funny you should mention, I just started paying a lot more attention to this in recent months. I'm also crafting custom "item not found" pages for some of these searches - if it's a product we're out of temporarily, I'll get their contact info to let them know when it comes in, and if it's something we don't carry at all, I'll use a page that suggests some alternatives. Not sure that counts as SEO though.
| 5:13 pm on Feb 9, 2011 (gmt 0)|
If internal searches give you new ideas for content to put on the site, I'd count that as SEO.
I very much agree with Goodroi when he says "a strategy must be multi-faceted." Good SEO is seldom about one Magic Amazing Secret Technique, it's more about getting a gazillion details fine-tuned so they're pushing in the same direction.
I aim for validation but I give up when it comes to "illegal" characters within affiliate URLs!
| 5:14 pm on Feb 9, 2011 (gmt 0)|
Forgetting about SEO and just building the site has worked best for me so far.
| 6:36 pm on Feb 9, 2011 (gmt 0)|
At the very root of my post is that I am saying html validation is one of the surest ways to gain in SERPs. I have taken it further by sharing some of my observations that resulted from validation with a twist. I would not have stated what I have if it was not tested and true. Whether or not Google gives kudos to those back links is up for each individual to decide for themselves. I know it has been beneficial for me, your results may differ.
But, it can be reproduced by everyone and you will see indexed results. Beyond that, interpret it as you want. Regardless of what W3C's robots.txt file says, and I wasn't aware of that, if applied as I have outlined you will get pages indexed by Google from W3C (apparently also non-valid ones, which I also wasn't aware of -- you're right frank72 and McMohan). Why Google ignores the do-not-crawl directive from W3C, or how they do it, is not my concern. I'm simply saying it is so. Nobody was more surprised than me to see back links to my site from W3C indexed in Google SERPs. But I did specifically implement it in that manner to see if it could be done, and much to my delight it was.
For those who are saying that we shouldn't be concerned about the technical stuff and just do what's best for the visitor -- I agree. But for me, high on a good user experience list is the technical stuff, unknown and unapparent to them. Valid html and css, and even the order of recursion of it if you really want to dig down even deeper, makes a page lightning fast to render and load. I could go into even more technical detail but that would lead me too far off topic.
And like netmeg mentioned -- information architecture is also right up there at the top. My mind functions optimally in images and patterns and I build sites, for lack of a way of explaining it very well, in binary form for the search engines (it's a story within a story) and they also serve as a dual meaningful visual presentation for visitors. Often times I cannot easily express in words what I am processing internally. But the sites I work on are produced in patterns not words. The written words become the result of the underlying patterns I have applied. And, most of the time, they are very effective. For me, technical stuff is most important, that is SEO from my way of thinking. But that technicality is not apparent to the reader, based on analytics, the sites are sticky so it must be appealing to them or answering their questions.
Maybe pageoneresults is right. Maybe because of this thread exposing this ability something will be done by someone to prevent it from continuing but even if it does it makes no matter to me because my pages still validate and that's what's most important. I have no doubt that Google assigns a healthy portion of their algorithm to on site performance factors. Those are the things I focus on because they are within my control rather than off-site factors that are not.
| 7:22 pm on Feb 9, 2011 (gmt 0)|
1) A decent business
2) A nice website
3) Some AdWords
4) Some affiliates
5) Some links
6) Some content
7) Other sales leads
8) Even more diversification with even more sales leads from many sources
9) SEO comes after paid search these days on ecommerce sites
| 7:57 pm on Feb 9, 2011 (gmt 0)|
Is it possible that links are now NOT the king of ranking?
I certainly know that content is not king - and now I am beginning to wonder if links aren't as strong as they used to be either.
I just see some sites outranking my own which I watch carefully - and their links just aren't as strong or as many.
Maybe I am wrong about the above sentence - and in honesty I wouldn't know how to measure a given link's strength accurately enough to say whether it's a red hot link or a cold ungiving one.
I would say that a hot one comes
a) with bang on anchor text.
b) from a high pr site (I know I know - that old chestnut)
c) sits on a page with no other links or only a few
d) Is in the middle of a relevant paragraph of text
e) with or without nofollow (still not sure on that one)
f) Is indexed!
And now I run out of options for my list. I would also say that a cold link is the opposite of the above.
But the thing is - I just have like 300 links and they are nice and hot - but my competitor has 40 or so cold ones - he's at 1 and I'm at 7 or so.
My domain is a few months older.
It makes me wonder whether he's done something "else" which I haven't done - maybe there's another OFFSITE factor other than links which is boosting him - one that doesn't show up on Yahoo Site Explorer or the link: operator on Google.
From my observations (and they are limited to my own sites and competitors - so not a very broad observation, I'm afraid to admit) there is another piece of SEO which isn't links, validating, site structure, semantics, bounce rate, adwords, adsense, domain age, domain name, TLD, KW density, title tag, news, articles.
It's something else.
It's out there somewhere but I cannot for the life of me find it - but people are doing it and beating the SERPs in a very stylish and puzzling way.
Whatever it is - it is my submission as my number 1 thing for SEO - it's a year old, maybe 2.
What is it?
| 9:03 pm on Feb 9, 2011 (gmt 0)|
I admit I don't know much of anything about link development, and my efforts have been paltry at best. But I haven't had too much trouble with organics, either for myself or my clients. YMMV.
| 9:35 pm on Feb 9, 2011 (gmt 0)|
I agree it's all about the copy. Kinda like the Kevin Costner statement in a movie "If you build it, they will come."
|good content to keep people on your site and convert. Without some content your usage data will be poor |
I couldn't agree more.
|Social networking is one of the best seo strategy |
I wish I had thought of that.... JK Seriously, Welcome to the hood.
|People convert, SE's do not |
But if it weren't for the SE's we would never get in front of the people.
|Little bit of adsense, little bit of SEO, little bit of direct ad sales, little bit of direct sales, all wrapped up makes for stable income. |
That sounds like a good solid plan. ;)
|If visitors are looking for a topic, then we ought to have something for them. |
I know, we can make some serious cha-ching cha-ching.
I would have to disagree with you and say that quality, authoritative content is and always will be king. It then becomes a matter of getting it found. Now, that's where the waters become a little cloudy.
If you had to wear a hat would it be black or white?
| 10:21 pm on Feb 9, 2011 (gmt 0)|
The best SEO is to prepare for non search engine traffic.
I say that because your time should be focused on producing the highest quality content you can and let the search engines figure out how to recognize it's just that, the best out there.
If the search engines can't, it reflects badly on them, not you. Build sites for people, not engines.
If/when you feel like it's time to hire an SEO, hire the best you can afford just like you would with a contractor or mechanic. SEO isn't for the newb anymore and there is no "just one thing", everything is interconnected in ways the average jane/joe can't get their head around.
By concentrating on content instead of do-it-yourself SEO you make your SEO's job much easier too.
| 10:32 pm on Feb 9, 2011 (gmt 0)|
The single most important thing is your content. Period.
But that's not a useful answer. The other posts here are great, though. The fundamentals do apply. Great thread.
| This 116 message thread spans 4 pages |