Forum Moderators: Robert Charlton & goodroi
Now my sites have been through the wringer over the past 16 months or so and his have not been so much as touched. He has held the same position for our "Town Name Web Designer" keywords since I got knocked off.
The only thing we do differently in our designing processes is that I actively SEO my work and he leaves his to luck, time, call it what you will ... not even a title tag, no incoming links ... nada!
My question is this ... Is Google targeting all branches of SEO techniques, not just the dodgy ones?
Many of these specifics could often correlate with an earlier registration date for the domain, even though the date itself was not directly playing into the algorithm. In other words, it's a related factor, but not necessarily cause and effect.
[edited by: tedster at 4:12 am (utc) on Aug. 20, 2006]
Most CMSs, by default, use the headline (H1) as the TITLE. I'd guess 95% of them do that.
So unless you are into hacking CMSs, you have overdone SEO? I don't think Google looks at it that way. Also, media websites all use CMSs, and you will find that the title is usually in the format of <title>Sitename - headline </title>. That's no indication of SEO; it's an indication of using a CMS.
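To make the point concrete, here is a minimal sketch of how a typical CMS theme derives both the TITLE and the H1 from the same headline field. The site name and headline are hypothetical values, not anything from a specific CMS:

```python
# Minimal sketch of the default behavior many CMS themes share:
# <title> and <h1> are both rendered from the same headline field.
# Site name and headline below are hypothetical examples.
from html import escape

def render_head(site_name: str, headline: str) -> str:
    """Build the head/headline markup the way many CMSs do by default."""
    title = f"{escape(site_name)} - {escape(headline)}"
    return f"<title>{title}</title>\n<h1>{escape(headline)}</h1>"

print(render_head("Example Widgets", "Blue Widget Buying Guide"))
# <title>Example Widgets - Blue Widget Buying Guide</title>
# <h1>Blue Widget Buying Guide</h1>
```

The identical title and H1 here are a template artifact, which is exactly why matching titles and headlines by themselves are a poor signal of deliberate SEO.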
Even if you don't validate every last page, at the very least invest some time to check how your main templates validate so that whatever is being replicated throughout your site is as close to valid as you can make it.
If the validator objects to something, be sure you understand why. Some so-called errors can be safely ignored (example: quirky characters in affiliate links), while others could make large sections of your page unspiderable (example: not closing a tag properly).
It all helps ...
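As a rough illustration of the "unspiderable sections" point above, here is a small tag-balance checker. It is not a validator (a real check should go through the W3C service); it only spots the unclosed-tag class of error, and the void-element list is abbreviated:

```python
# Rough sketch of a tag-balance check -- the class of error a validator
# flags that can hide large sections of a page from a crawler.
# Not a substitute for the W3C validator; void-element list is partial.
from html.parser import HTMLParser

VOID = {"br", "img", "hr", "meta", "link", "input", "area", "base", "col"}

class TagBalanceChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.stack = []      # currently open non-void elements
        self.unclosed = []   # elements closed implicitly (i.e. never closed)

    def handle_starttag(self, tag, attrs):
        if tag not in VOID:
            self.stack.append(tag)

    def handle_endtag(self, tag):
        if tag in self.stack:
            # Pop back to the matching open tag; anything popped on the
            # way was opened but never explicitly closed.
            while self.stack:
                top = self.stack.pop()
                if top == tag:
                    break
                self.unclosed.append(top)

def unclosed_tags(html: str) -> list:
    checker = TagBalanceChecker()
    checker.feed(html)
    checker.close()
    return checker.unclosed + checker.stack

print(unclosed_tags("<html><body><div><p>text</body></html>"))
# ['p', 'div']
```

Running it on your main templates is a cheap first pass before putting them through the validator proper.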
1. Code Validation (W3C etc.)
Code validation has absolutely nothing to do with proper SEO (to coin the phrase) in Google. Several examples of this have been given in this forum recently.
Correct. The list from which this came was written as part of a suggestion for non/minimal SEO activities.
If you feel the need to apportion an 'SEO aspect' to code validation, I'd strongly suggest that SE robots will be far more productive and successful in scanning validated page code.
[edited by: Panic_Man at 8:31 am (utc) on Aug. 20, 2006]
Regarding your observation that the original statement had some holes in it ... you're not wrong, mainly due to me not explaining myself properly. The links that we had given to each other were the reason that we actually became friends in the first place (not, as my initial statement implied, that we were undertaking a carefully planned SEO operation ;-)).
It's a bit drippy, but for a number of years we had been linking to each other, and when I became over-worked one time I forwarded some work to him, and we have been friends ever since. So nothing quite on the scale of the billion-page site ;-)
All the best
Col :-)
So .. our titles in the title bar should not be the title on the page? This makes no sense.
I know. I don't like it. I have a nasty habit of using the same thing in titles, h1s, filenames, and internal navigation. It makes sense to a user and myself.
I think Google will dump a site for tons of reasons. I'm talking about things the average webmaster would do these days; things everyone tells them to do based on three-year-old Google knowledge that everyone had at the time. I think those things can usually be overcome / overpowered with other components of the algo. But you really want all of your scoring to be as positive as possible.
Reading a "what to do for Google" article will be a step backward most of the time. If you read current writings from Google, much is focused on making it harder to SEO. They smile and wave at webmasters in forums, but if you try to influence rankings, you're a spammer in their eyes.
They smile and wave at webmasters in forums, but if you try to influence rankings, you're a spammer in their eyes.
Maybe, but it's worth noting that the Google Webmaster Guidelines clearly state:
"Think about the words users would type to find your pages, and make sure that your site actually includes those words within it."
"Make sure that your TITLE and ALT tags are descriptive and accurate."
Writing descriptive titles and headlines isn't "trying to influence rankings," it's just good sense. To use my earlier "doughnut" example, look at Wikipedia's doughnut article: the page title is "Doughnut - Wikipedia, the free encyclopedia" and the article's headline is "Doughnut." What's more, the article comes up in the #1 spot in a Google search for "doughnut." Why? Probably because there's nothing else on the page or site that might be viewed as an attempt to influence Google's rankings (which may not be the case with SEOed sites whose owners think they've been penalized for using the topics of their pages in titles and headlines).
Too optimized: keyword stuffing - the same keyterm repeated too many times on the same page. Hidden text, too many alt tags with keyterms, too many headings with keyterms.
Too many links - as all SEOs know, it makes ranking a lot easier if you have a good PageRank. This is achieved by getting as many links as possible. If Google sees a new site getting too many links too quickly, it does not look very natural, and therefore must be an "artificial" PR boost.
Anything else?
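The "same keyterm repeated too many times" point above can be eyeballed with a back-of-envelope density check. The threshold idea and the sample text are purely hypothetical; Google has never published a number:

```python
# Back-of-envelope keyword-density check for the "keyterm repeated
# too many times" case above. The sample page text is hypothetical,
# and no real threshold is known -- this only measures, it doesn't judge.
import re

def keyword_density(text: str, keyword: str) -> float:
    """Fraction of words on the page that are the given keyword."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

page = ("cheap widgets cheap widgets buy cheap widgets "
        "best cheap widgets online cheap widgets")
print(round(keyword_density(page, "widgets"), 2))
# 0.38
```

When more than a third of the words on a page are the same keyterm, as here, it is easy to see why an algorithm might flag it as stuffed.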
Why would having the title in the bar and the title of the page in h1 that are the same be so bad? That's exactly the way it should be right?
If it has any influence, I think it is part of an overall profile. In other words, it's just one small factor among several things that are common with questionable sites.
On the topic of old sites doing better: I'm not sure if Google looks at the original date the site started up, or if it's just that older sites have collected more natural inbound links over the years. But I can see how age might be considered as one small factor by Google. How many spammy sites have been around since '96, or even 2000?
So .. our titles in the title bar should not be the title on the page? This makes no sense.
For me, the page Title reflects what the page is about; however, the index page also refers to the function of the business.
I have never had an issue with the Title / H1 being the same, nor any penalty on sites I optimize. This isn't to say it isn't a reality; I just haven't seen it. I usually include the second keyword in an interesting manner in the second header, but not always, only if it is important in the scheme of the writing.
Usually a keyword header on its own can be boring, period, so if it makes sense from a visitor standpoint to write more in the H1 and H2 tags, do so ... and ultimately, ironically, you would be "SEO'ing" the website anyway.
If it has any influence, I think it is part of an overall profile. In other words, it's just one small factor among several things that are common with questionable sites.
Sure, and a lot of the time the factors are likely to be obvious. When I see somebody here talking about having 20,000 inbound links for a widgets page, I can't help being skeptical, and I'll bet that most pages with 20,000 inbound links (unless they're Microsoft or the BBC) fit a "questionable site" profile in other ways. Ditto for sites of a million pages or more that come from out of nowhere. It's possible that some such sites are legitimate, but it's easy to understand why an inanimate genie in a black box might decide otherwise until overruled by a manual review after a reinclusion request.
However, what if you have no intention to spam or cheat? What would one do if a legitimate, quality site gets affected by Google's anti-SEO steps? Things like H1s, H2s, and keyword stuffing are easy to identify if one has overdone them - but what if you haven't, and you are caught unawares? Google doesn't offer a checklist, and one doesn't even know if it's a problem with the site or some issue with Google's data / algo. Taking any action may worsen the situation.
I pretty much know for sure that one of my sites, down now, will be back with the next data refresh, and go down again with the one afterwards. What should I check for then? My errors or theirs? I am clueless.
Being alive gets you penalised now!
I remember the good old days ... back in 2001, when I owned all of the first 3+ pages of results for all of my common search terms ... boy, those were the days!
Today is totally different. Google mixes it up, there are no fixed rules to play by, other than links there isn't much at all to play by!
To fix the results today you have to play the percentages. A huge number of pages with great links for a huge number of search terms.
A single site has little chance ... think hundreds to thousands of sites with millions of pages ... anything else and you will be vulnerable to a Google mod/whim!
2. How long is your domain registered for? A good 10-year registration could help you.
3. Who is your domain registered with? Is it a local company in your town?
4. Do you have your phone number on your site? Is it a local number?
5. Can you find your site by searching for your address/phone number on Google?
6. Where are your servers located? Are they in your town? Are his?
I'd be interested to know your answers :)
my sites have been through the wringer over the past 16 months or so and his have not been so much as touched
Do the pages on the client sites of your competitor each contain a link back to his site?
I was doing a similar analysis. My conclusion is that this may be a factor: he has more links back to his site than you do to yours.
There is, however, another possibility - maybe (s)he works for Google? Here's his earlier post
I think Trinorthlighting was simply quoting a Google help file.
And now, BTW, for something completely different: in these times of Google's new communication policy, with its attempts to build a community of trustworthy webmasters, has anyone yet considered the possibility of simply inserting your (main) URL in your WebmasterWorld profile (a field that is in most cases empty) as a positive trust-building or even ranking factor?
Now what does that leave out - only Google's own mysterious ways?
Has anyone yet considered the possibility of simply inserting your (main) URL in your WebmasterWorld profile (a field that is in most cases empty) as a positive trust-building or even ranking factor?
Surely you aren't suggesting that members should offer the same transparency and accountability that they demand of Google? :-)
I just don't feel it's ethical SEO that is the culprit. Bad data which still keeps reappearing? Canonical problems still not fully resolved? Inevitable collateral damage?
** Oh - just noticed this in Sitemaps. Like during the days of canonical issues, my main site's highest-PR page is an internal page, not the homepage. **
Answer: Emphatically YES!
Surely you aren't suggesting that members should offer the same transparency and accountability that they demand of Google? :-)
I really doubt transparency concerning the algos is something we can expect from a search engine, by the very nature of the matter.
If this basic misunderstanding were resolved once and for all, I guess the number of postings in here would shrink by half.