I thought everybody mined their on-site search. It's absolutely a gold mine.
My snippets: write good content and provide real value for real people.
And fix your html. I'm still working on that one!
|@netmeg Do you really think that valid HTML helps rankings if the site architecture is otherwise optimized and Google can find and crawl the content? I've heard both sides of the debate and have certainly had no problem ranking sites #1 for competitive terms when the site has a ton of validation errors. Also, valid to which DOCTYPE? |
No, I don't think validation directly helps rankings. I try to validate because it's the quickest way to find errors and warnings about issues that could be preventing users or search engines from seeing the site at its full capacity. And I think stuff like that does affect rankings.
I'm still validating to XHTML Strict. I keep trying to validate mobile pages to Mobile, but the standards are like *impossible*
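For anyone who wants a quick first pass before running pages through the full W3C validator, a basic tag-balance check can be scripted with nothing but the standard library. This is a minimal sketch, not a real validator: it only flags mismatched or unclosed container tags, which are the kind of errors most likely to confuse a parser (the sample markup is made up for illustration).

```python
# Minimal tag-balance checker - catches unclosed/mismatched tags only.
from html.parser import HTMLParser

# Tags that never take a closing tag in HTML
VOID_TAGS = {"area", "base", "br", "col", "embed", "hr", "img",
             "input", "link", "meta", "param", "source", "track", "wbr"}

class TagBalanceChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.stack = []    # currently open tags
        self.errors = []   # human-readable problems found

    def handle_starttag(self, tag, attrs):
        if tag not in VOID_TAGS:
            self.stack.append(tag)

    def handle_endtag(self, tag):
        if self.stack and self.stack[-1] == tag:
            self.stack.pop()
        else:
            self.errors.append(f"unexpected </{tag}>")

    def close(self):
        super().close()
        # Anything left open at end of document was never closed
        for tag in self.stack:
            self.errors.append(f"unclosed <{tag}>")
        self.stack = []

checker = TagBalanceChecker()
checker.feed("<div><p>hello</div>")  # the </p> is missing
checker.close()
print(checker.errors)
```

It won't replace the validator, but it's enough to catch the "forgot a closing div" class of mistakes in a build step.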
@netmeg thanks for the feedback. I'm not a techie and having backup from someone who is helps when I go to the Dev team and recommend that they ensure the code validates.
It has become very easy for everyone to get a feature-loaded website; most pre-made scripts have state-of-the-art internal SEO built in. But you still need to focus on design elements: take care of fonts, layout, and usability. If you manage to keep your bounce rate low, your pages fast-loading, and your design appealing, you're well on your way. But you must not lose sight of the importance of good content and external links.
> But, when done properly, SEO will produce long lasting
> benefits that require only slight upkeep
You are quickly into that gray area where SEO does or does not equal information architecture. I am slowly stepping a foot into the "anything you can do on your own site is not SEO anymore" camp.
New SEO 2011:
- Social signals (Twitter, Facebook, blogs, LinkedIn, 4sq, Gowalla, Flickr, Picasa, YouTube, misc profiles).
- Quality (not quantity) backlinks.
- External brand references (not links).
- Analytic traffic signals (pageviews, uniques, locations, paths)
- Advertising signals (Affiliate Programs, PPC plays)
- Local signals.
Info Architecture (old/dying SEO):
- Using keywords in prominent locations.
- Directory and site structure.
- Valid code.
In all things in life, if we are following the flock we are going in the wrong direction. The things you are interpreting as positive signals may have been positive two years ago, but they have already played out on the stage. Momentum will shift back to solid content and presentation. Building that as a foundation will ensure long-term stability. I'm not concerned about month-to-month or even year-to-year peaks and valleys.
It's like print books. People have been publishing their ideas, theories, and philosophies for eons. Some make it to best-seller status in a select country due to cultural promotion. But the classics transcend borders and time, and will always be classics long after many "phenom" culture-driven successes are out of print.
Here's another strategy that seems to be growing in importance for me: maintaining opt-in email lists.
I've long liked this kind of regular contact for customer retention, because retention is almost always easier than acquiring new customers. But you're bound to get some Gmail addresses on your list, and Google tracks clicks made from within an email. They are working to measure actual market engagement, and email clicks are one factor that shows it.
We're in no way SEO experts, but we've seen some good results since we improved our server response times. We lowered our loading time from 3-4 secs. to 1 sec. (as reported by WMT) and since then we've seen a steady increase in SEO visits. Not only that, but Googlebot is now coming to our site a LOT more. Actually, if we look at the WMT crawling stats, it seems quite apparent that there is an inverse relation between crawling time per page and the number of pages crawled every day.
This is not something we were anticipating. Our (wild) assumption is that the Google crawling system is time-constrained, and it gives preference to sites that can be crawled faster.
For next year, our priorities are to work on the canonicals, as we have some duplicated content, and to rewrite some content to make it unique (it's content that came from an XML feed).
|For next year, our priorities are ... rewriting some content to make it unique ( it's content that came from an XML Feed). |
Article rewriting... yeah it might be "unique" but that doesn't mean it's "good". Write for users, not search engines.
No, it's an online store. That content came from the manufacturer, and as such it's shared by tons of websites. This is what we're trying to solve.
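One cheap way to track which product pages are still verbatim feed copy is to fingerprint the descriptions: normalise the text and hash it, then compare against the supplier's version. A rough sketch (URLs and product text here are invented for illustration):

```python
# Flag pages whose description still matches the manufacturer feed verbatim.
import hashlib

def fingerprint(text: str) -> str:
    # Normalise whitespace and case so trivial edits don't hide a duplicate
    norm = " ".join(text.lower().split())
    return hashlib.sha1(norm.encode("utf-8")).hexdigest()

feed_text = "Blue widget, 3-inch, durable ABS plastic."
our_pages = {
    "/widget-1": "Blue widget,  3-inch, durable ABS plastic.",  # still a copy
    "/widget-2": "Our hands-on review: this 3-inch blue widget held up well.",
}

feed_fp = fingerprint(feed_text)
needs_rewrite = [url for url, text in our_pages.items()
                 if fingerprint(text) == feed_fp]
print(needs_rewrite)  # pages still matching the feed
```

Exact-hash matching only catches verbatim copies; lightly reworded duplicates would need something fuzzier, but this finds the worst offenders fast.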
Best SEO thing I did in 2010?
Was really looking through google analytics and finding terms that people were using to come to our site, but which we really didn't have content / products for.
Finding the pages that had high traffic but high bounce rates, and then either adding products to those pages or adding links to product pages.
Did it increase my position in the SERPs? Probably not. Did it leverage what little traffic I had? Yes, definitely.
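The same triage can be run mechanically on a CSV export from Analytics: keep the landing pages that attract real traffic but lose most of it. A small sketch with made-up rows and arbitrary thresholds:

```python
# Find high-traffic, high-bounce landing pages worth improving.
rows = [  # (page, visits, bounce_rate) - invented example export
    ("/blue-widgets", 1200, 0.82),
    ("/red-widgets",   900, 0.35),
    ("/contact",        80, 0.90),
]

MIN_VISITS = 500   # ignore pages without meaningful traffic
MIN_BOUNCE = 0.70  # "losing most visitors" threshold

candidates = [page for page, visits, bounce in rows
              if visits >= MIN_VISITS and bounce >= MIN_BOUNCE]
print(candidates)  # pages to add products or internal links to
```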
"Depending on the site, I think I'd go for valid HTML. I have a client whose HTML was so bad that Google only found 19 of the 50k+ pages. After fixing the HTML the number of pages indexed went through the roof. The number of links soared too because Google could index the pages with links pointing to them and award proper credit. Link count went from 9 to 15,000 in two weeks and there are over 1.6 million links credited now. The site had been up for over a year when I started for them and hadn't accomplished anything. "
Your profile says you're a senior member, but:
A. More sites that don't pass markup validation have superior organics than those that do.
B. The claim that the number of pages indexed correlates with "fixing" the code needs more clarity; unless site-wide interlinking is what you're talking about, which is SEPARATE from valid code.
C. Your emphasis on links indicates your original proposition should be deleted, as LINK ENHANCEMENT is the only argument you've made.
I've studied the #*$! as well, in depth, in a number of verticals. Valid code, from the start of '09 to today (maybe today), is no more the best strategy than using WordPress for crawling and Google Alerts.
Are there other senior members that know something about SEO here besides Tedster?
|When done properly, SEO will produce long lasting benefits that require only slight upkeep whereas with traditional marketing you have to keep grinding away at it. |
To add in to the inhouse search goldmine suggestion, I've seen good use of HitTail. Reveals in real-time the least utilized, most promising keywords hidden in the Long Tail of your natural search results.
Best strategy is to remain aware through testing stuff YOURSELF -
It's hilarious how predictably circular everything in SEO moves. Two years ago the only one talking about the importance of content was Michael Martinez, while everyone else was building link farms. Throw a bunch of junk up and see what works. Best strategy is scientifically documenting all the tests you run.
Geeze... I was reading a print I manually made of a client SERP from three years ago, then again the same from last year, and again 3 weeks ago. NO RAVEN, NO AUTOMATED PROGRAM. Just taking the time to collect a pretty long period of historical data for one client vertical. You'd be amazed what you can learn from something so basic.
Welcome to the forums balistreri. We encourage all levels of experience here, from new to SEO [webmasterworld.com] to the well seasoned veterans. The designations such as "Senior Member" are based on post count. But yes, we have some very expert senior members here, and we also learn a good bit from helping those who just entered the field, too.
Another solid group of members here are the business owners and managers who are looking to understand technical SEO issues so they can guide their teams more effectively.
I encourage you to read the threads and decide for yourself whose input is valuable to you. I know that I learn a bunch interacting with members who've just arrived as well as those who've been here for years.
The Hot Topics area [webmasterworld.com], which is always pinned to the top of this forum's index page will give you a good idea of the areas we cover, from semantic issues to page speed to... well, the whole gamut really.
And now, back to our regularly scheduled program - "your single best SEO strategy."
|more sites that don't pass markup have superior organics than those that do |
That's to be expected, as sites that don't validate fully are more common than those that do.
Valid HTML is not a magic ticket to the top, no arguments there. However, once in a while the wrong kind of error can prevent a spider from reading the page and that's a problem. Many validation errors can be safely ignored, but it doesn't follow that they all can.
Checking validation and fixing egregious errors should be considered common sense for the conscientious SEO.
I've always wondered whether Google doesn't give a small boost to validated code - if I were them, I'd want to encourage things that make life easier. (Another example of them doing this is penalizing slow load times - I really don't think an extra 5 seconds of loading time affects 'the user experience' to the same degree it affects the 'Googlebot resource allocation team's experience'.)
I've got one client who is punching way above his site's weight, and as much as I'd like to take credit for it, I think his social strategy (which he's doing in house, mainly Twitter and Facebook) is what's actually giving him the push. I, of course, haven't told him that. :)
Google is, at the moment, really sensitive to anchor text abuse. If you are looking to make a quick buck, that's the way I'd go about doing it. As ever with blackhat, that loophole will be closed pretty sharpish. If you're going to flog a crappy ebook, now's the time!
If you're like my clients and have a generally narrow business focus, keyword-heavy domains are working a treat for me at the moment.
|Checking validation and fixing egregious errors should be considered common sense for the conscientious SEO. |
I'll second that one too!
|Info Architecture (old/dying SEO): |
- Using keywords in prominent locations.
- Directory and site structure.
- Valid code.
Really? You mean to tell me that the foundation of the website is old and dying? Maybe I misinterpreted what you said?
Valid Code - Old/Dying?
I would think these days it is at the top of the list. With all the technologies in place - microformats, mobile, etc. - valid code is not old and dying. In fact, some technologies won't allow invalid code. Try producing a semantic outline from an invalid document; it can be very difficult.
If the bots have gotten as smart as I think they have, valid code should be first and foremost with any document - period!
Site Structure - Old/Dying?
Are you saying the Pyramid no longer applies? Isn't that about directory and site structure?
Some folks consider themselves old at age 60.
But they still live on to die 30/40 years later.
To say that something is old and dying implies it was at one time alive. From what I've seen over the years, code validation has never been alive. The great majority of sites never pay any attention to it. From that perspective I've been happy, because it has given me an edge; but now that Google is promoting it via the Page Speed add-on, more people are aware of it and are going to jump on the bandwagon just because Google said jump. I've always maintained it as a foundation of pride in craftsmanship, and I'm content that the finished product is accessible to most devices, including assistive technologies.
I really cannot stress the advantages gained from it any more than I have. My last comment on it for now, though, is that I consider it a misnomer to call it SEO; really it should be referred to as "website optimization".
Well, you asked.
I wear a tin foil hat... and I think that two of the best things I did for SEO in 2010, were around myself or the browser itself.
1 - I will only use Google services in their Chrome browser. Period. Gmail, WMT, Docs, Analytics, essentially ANYTHING that requires a Google cookie, is done in Chrome. That is the ONLY thing I use Chrome for.
100% of everything else I do on the web is done in Firefox, with several Google domains and cookies blocked completely. I access and update my site, research, etc etc.
I feel that Google knows me, they know I am a webmaster and they know I like to do a bit of SEO and sometimes questionable marketing. Why share any more info about my sites than I have to...
2 - I quit talking about what works and what doesn't work, in any public forum where my site(s) or those I manage for others can be attached to my efforts.
Yup... I have decided to go underground, and overall, I think it was the best thing I did in 2010 to help my SEO efforts.
|wondered if Google doesn't give a small boost to validated code |
I've never seen evidence that Google gives any actual boost to validated code, but on the other hand I've seen situations which strongly suggested that bad code was holding a site back in the SERPs.
Cleaning up validation errors could be compared to scraping barnacles off a boat hull. It reduces drag and helps the boat get better performance out of whatever wind is blowing.
Several Google spokespeople have flat out said they don't give valid mark-up a boost in the rankings. And then they went on to encourage validation.
Google wants to give users the best matches for their query - valid code has nothing to do with that.
Hmmmm... I'm rethinking mine: [webmasterworld.com...]
A hint? It's NOT Valid Code.
My single best change - sorry, I have 3, but if I had to choose one it was going dedicated. Couldn't have done 2 if not dedicated; well, I could have, but it would have cost an arm and a leg.
1 - Fixing all my URLs to be more search engine and user friendly, and doing a 301 from each old URL to the new URL.
2 - Moving from a hosted account to a dedicated server where I had the control to do what I needed when I needed to.
3 - Took control of DNS from the hosting company and moved it to an outside source I control.
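For anyone curious how point 1 looks in practice: on an Apache box (a common setup, assumed here - the paths and URLs below are hypothetical examples), the 301 from old URL to new URL is a couple of lines in the server config or .htaccess:

```apache
# Hypothetical example: 301 an old dynamic URL to its friendlier replacement.
RewriteEngine On
RewriteCond %{QUERY_STRING} ^id=42$
RewriteRule ^product\.php$ /widgets/blue-widget? [R=301,L]
# (the trailing "?" drops the old query string from the target)

# Simple one-to-one moves need only a Redirect directive:
Redirect 301 /old-page.html /new-page.html
```

The key part is the permanent (301) status, which tells search engines to transfer the old URL's credit to the new one rather than treating it as a temporary move.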
Best strategy... And I apologize for coming off goonish in my post several days back, Tedster... for client SEO it has tactically always been: do a little bit more than your clients' competitors. Links, micros, on-page, mobile - it's all the same. And every client competitor vertical requires different emphasis areas from whatever you've been working on within the previous client vertical. If they're paying enough, tests should allow you to predict things and develop them out, with no loss except for the time spent.
Best strategy for the SEO practitioner is to realize that, after having successfully crushed, killed, and destroyed the last client's competitor vertical, you still know only slightly more than someone who couldn't spell SEO; an open mind is the best starting point.
And kill off 90% of the social media SEO garbage you feel compelled to still sift through. You'll have more time for productive SEO things, like testing stuff the gurus never bothered to.
1st post was for my ecommerce site.
This post is for work sites.
1 - Preventing duplicate content and canonical URL issues on a site.
Valid code is a really good thing, but I gave up: with so many people working on a site, and with the number of sites I work on, it became an impossible task. I do make sure a site doesn't have any kill areas in the code and that it displays properly in all browsers. The validation thing became impossible.
Validating code is cool...
It's a good thing to do, but when you only have one recommendation for SEO? LOL
Let's have a contest...
The people who are saying 'validate code' is the 'one strategy for SEO' can go build a brand new site that validates, and I'll go build one with 100+ validation errors and thin content, and make my 'one strategy for SEO' building links...
Anyone want to wager whose site ranks better?
Really, off the top of your head who do you think wins the contest?
a.) The one that validates
b.) The one with the inbound links
Most of the time my code validates or only has an error or two, because I care about what I do, but recommend validation as 'the one thing to do for SEO'?
Pleaaaaaaase stop giving people really bad advice, please...
Validated code doesn't even count.
Read the Other Recent Thread I Linked and Click the Following Link Before You Argue Please ... [validator.w3.org...]
Maybe mine would be don't waste your time listening to all the garbage spread around by 'enthusiasts'; build links instead...
I'll assume you are directing that comment at me because I am actively posting in other threads at the moment, but you could have been more specific about who you are debating.
If you could even remotely understand the depth of potential issues that are resolved by validated code, you might understand better. But I will not waste any more time on the matter. I have nothing to prove to anyone. The OP said "your single best SEO strategy" and that is mine. I send my sites out into the wild built for endurance, not for the popularity game.
Absolutely not... It was totally general.
My posts are almost always generally targeted and for the readers who may not know any better. Sorry for sounding harsh, there's just times when I read here and can't help but going WTF? then I vent a bit.
Links to your internal pages solve all your crawl issues... Those pages get crawled from external links, not yours. Who cares what links get it crawled if it ranks... That's what search engine optimization is about, isn't it? Ranking in search engines. ;)
Here's a thread burner
Want great organics? Stop using WordPress.