| 11:55 pm on Aug 18, 2006 (gmt 0)|
Years ago, I used to focus extensively on SEO -- check rankings every week and tweak the pages that dropped. I was always right on top of what the latest guesses were on what "best fit the algorithm."
I wasted a lot of time on that. Now I design strictly for visitors and, you know what, my rankings are just as good as they were when I obsessed over seo.
I even had an experience where one site suddenly lost Google rankings site-wide. I was busy with another project, so I just watched this site function at a fraction of its former traffic.
I actually considered going back to seo-ing that site as soon as I had some time for it. But wouldn't you know that a month later, its Google rankings suddenly reappeared, stronger than ever -- without me lifting a finger!
Sometimes I wonder if I'm shooting myself in the foot with some of the things I do. I build filenames from keywords that help me identify the content on the page, which is supposedly a sign of spam. But, hey, it makes logical sense both for me and for visitors, so hopefully it won't hurt me in the long run. It hasn't yet.
My theory is that it hasn't affected me because my sites feature only a couple of spam triggers instead of a preponderance of them, as a heavily seo-ed site would. But I just go on designing for visitors and things seem to turn out all right.
| 2:12 am on Aug 19, 2006 (gmt 0)|
One thing that made me pause and wonder about Google's ability to deal with the "modern" web page was a question answered by Matt Cutts regarding the weighting given to <b> vs <strong> tags.
He indicated that <b> held a very slight advantage.
That makes me wonder if the use of CSS is a shot in the foot.
Maybe Google has had their head buried so far down in the vats of SPAM that the world of "modern" websites has passed them by.
| 2:39 am on Aug 19, 2006 (gmt 0)|
|One thing that made me pause and wonder about Google's ability to deal with the "modern" web page was a question answered by Matt Cutts regarding the weighting given to <b> vs <strong> tags. |
He indicated that <b> held a very slight advantage.
That was in one of his videos. In a later video (the next day?) he said he'd checked this with the technical team and in fact it wasn't true, both <strong> and <b> were treated EXACTLY the same by Google - oops.
| 4:19 am on Aug 19, 2006 (gmt 0)|
I suspect that doing anything at all to my sites is too dangerous: something that was perfectly all right until a few months ago might be wrong now. I could be doing absolutely the wrong thing.
For example, how is one to know whether the problem is caused by older data (with supplementals, canonical problems, site hijackers) being fed to the algo during the data refresh?
| 5:05 am on Aug 19, 2006 (gmt 0)|
So the trick now might be to under-SEO. All the knowledge on what to do to get better rankings for your website has now become what not to do. So what are some of the main pointers for this reverse seo strategy?
| 5:07 am on Aug 19, 2006 (gmt 0)|
You're right, of course, but I just disappeared. We had both been in business for 7 or so years and we had always danced around the top spot ... having a laugh at each other when we dropped a spot or two. Between us we must have designed 1500 websites in our time and we were just as good as each other, then suddenly my site gets walloped and his seems totally unaffected (except for about 1 week during Alexa).
The only point I was trying to make was that, given that our sites are broadly the same kind of site (i.e. my services, past clients, new developments, contact details and prices), the only thing I could think of that could have caused such a change was his disaffection for any kind of SEO and my love for getting it spot on (Obviously not this time ;-)).
All the Best
| 5:15 am on Aug 19, 2006 (gmt 0)|
<< This program is impractical and would be ineffective >>
You're probably right, but isn't it in the interest of all honest Bobs for there to be a tightening of these registration rules? If not, why have them at all if anyone can give false details? What are the registration people getting paid for? Surely not just for running an online form database that has no checks and safeguards.
I stick by my comments and call for a tightening of registration control and the introduction of solid blacklists for abusers.
<< Unlike the current scheme, where, despite some claims made here, there really isn't any collateral damage at all: just regularly shifting collateral benefits (as the positions shift, different websites get temporary placement to which none of them had any just or permanent claim.) >>
Sorry, but I think this statement just flies in the face of many of the displaced webmasters' comments on this WebmasterWorld Google group. Haven't you been reading what people have been saying? The anger they have been showing? Their sites are incurring penalties for no apparent reason, and when they ask Google for advice, they are fed a load of gobbledegook which implies they have broken some 11th commandment and that they should read the Google webmaster rules and confess when reapplying for reinclusion. Then, to top it off, after several weeks of trawling through their sites looking for any trace of a problem, Google slackens its algorithm and we all go back to the way it was before. As my grandfather used to say (although I accept he didn't know much about Google), "What a pile of tripe!"
[edited by: colin_h at 5:23 am (utc) on Aug. 19, 2006]
| 5:18 am on Aug 19, 2006 (gmt 0)|
>>>>>>So the trick now might be to under SEO. All the knowledge on what to do to get better rankings for your website has now become what not to do. <<<<<<<<
I see this all the time in my sectors. Sites that do it all wrong. No doctype... no trailing slashes... links pointing back to index.html... no 301 to fix the canonical www vs. non-www issue... and they stay on top, no matter what.
I am beginning to believe it's not seo at all. G just likes uninformed site builders that still make sites like it is 1995.
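For anyone who does want to close the canonical loophole mentioned above, here is a minimal Apache sketch (assuming mod_rewrite is available and a .htaccess context; example.com is a placeholder domain):

```
# .htaccess - 301 non-www requests over to the www hostname
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

# 301 /index.html back to the root so only one URL carries the link weight
RewriteCond %{THE_REQUEST} \s/index\.html [NC]
RewriteRule ^index\.html$ http://www.example.com/ [R=301,L]
```

The second rule uses THE_REQUEST so it only fires on direct requests for /index.html, not on the internally rewritten root request.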
| 5:29 am on Aug 19, 2006 (gmt 0)|
<< G just likes uninformed site builders that still make sites like it is 1995 >>
That sounds like me ;-) Still working from a cut-and-paste database of code sections instead of using authoring software. I can see the sense in CSS, with the argument for reducing body-text saturation etc ... but old dogs still impress occasionally with an old trick, eh?
All the Best
| 5:44 am on Aug 19, 2006 (gmt 0)|
Make sure Google doesn't have any old pages indexed that you no longer have in your nav.
Do a site:yourdomain.com on Google. If they have any pages indexed that you don't want typical users to see, hide them with the robots.txt OR password protect them.
It sounds kind of crazy, but for me, this has happened several times. Several sites nearly vanished. Then I deleted the junk, or added a "noindex" in some cases, and within a week the sites popped back up.
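As a sketch of the clean-up described above (the blocked paths are hypothetical), the stray pages can be kept out of the crawl with robots.txt:

```
# robots.txt at the site root - /old-pages/ and /test.html are example paths
User-agent: *
Disallow: /old-pages/
Disallow: /test.html
```

Note that robots.txt only stops future crawling; for pages that are already indexed, a `<meta name="robots" content="noindex">` tag in the page head (or password protection) is what actually gets them dropped, since Google has to be able to fetch the page to see the noindex.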
| 6:10 am on Aug 19, 2006 (gmt 0)|
In my opinion, SEO for new sites with low PR does not make sense.
Google might consider it spammy.
| 6:37 am on Aug 19, 2006 (gmt 0)|
|That was in one of his videos. In a later video (the next day?) he said he'd checked this with the technical team and in fact it wasn't true, both <strong> and <b> were treated EXACTLY the same by Google - oops. |
And since [apparently?] he made no comment about CSS, I am still wondering about Google's treatment of CSS designed pages.
"- oops." right back at you.
| 10:41 am on Aug 19, 2006 (gmt 0)|
Nah, just different.
|Why would having the title in the bar and the title of the page in h1 that are the same be so bad? That's exactly the way it should be right? |
Your title is your title.
That made algo sense until it became the easiest way to find an seo, which Google obviously dislikes.
|Haven't you been reading what people have been saying? The anger they have been showing? |
Talking about collateral damage or collateral benefits sounds like a glass half empty or full discussion. Either way, anger does nothing to increase algo points. Seems like Google decided that they didn't really need goodwill from webmasters quite some time ago, and I'd say they've been doing just fine with much less of it.
| 11:29 am on Aug 19, 2006 (gmt 0)|
Plenty of badly optimized sites rank well.
Plenty of 'well' optimized sites rank well.
I still think quality incoming links is the major factor, barring a major screw-up. (by the webmaster OR google)
| 1:46 pm on Aug 19, 2006 (gmt 0)|
>> I am beginning to believe it's not seo at all. G just likes uninformed site builders that still make sites like it is 1995.
what if, and it's only what if, age is the factor that keeps them up high? They made their sites in '97-'98 and left them there. They got plenty of OLD links, and Google is saying that if a page has been there for 8-9 years, has many old links (from before link buying was invented) and still gets some new links, it must be good. Just a thought.
| 2:05 pm on Aug 19, 2006 (gmt 0)|
We are suffering from the same problem as the 9 blind men touching different parts of the elephant.
We each sense different aspects of the Google algorithms, based on our own situation and those of other sites we pay attention to.
With hundreds of factors interacting in the algorithms, and the dials constantly being twisted a bit this way or that, it is nearly impossible for any of us to have a clear picture of what is going on, much less achieve an accurate understanding of the exact cause and effect relationships.
I sincerely doubt that standard SEO techniques are being penalized to the point where all SEO backfires. But, perhaps "excessive" use of SEO techniques is now hurting some sites -- or perhaps those techniques are no longer helping, and the end result looks the same as if they were now hurting.
I am convinced that the age of the site, or the age of its links, makes a huge difference -- if for no other reason than because various spam techniques (e.g. buying links) weren't invented until recently, so Google can treat those sites differently (e.g. if the links are old, it can assume they aren't due to a modern-era link scheme).
Perhaps there is some sort of "profile" of old never-SEO'd sites that looks very different to Google than the profile of an equally old "Heavy-SEO'd" site?
| 2:56 pm on Aug 19, 2006 (gmt 0)|
|G just likes uninformed site builders that still make sites like it is 1995 |
1997, in my case, and it's been working pretty well. :-)
As for the suggestion made earlier that Google doesn't like keyword-rich titles and headlines, I disagree for three reasons:
1) Keywords provide "spider food" to help Google determine what a page is about. (If a page is about doughnuts, a search crawler is going to be much happier with "doughnuts" in the title and headline than with "circles of sweetness" or "enjoy a hole in one with your breakfast.")
2) It's natural and sensible to use descriptive page titles and headlines. (Filenames, too, for that matter--after all, Google often uses descriptive filenames for its own pages, probably because they're more user-friendly than a long string of database gibberish.)
3) It wouldn't make sense for Google to penalize or disregard good HTML structure.
Of course, if a page is titled "Doughnuts Donuts Crullers Beignets Churros Krispy Kreme Dunkin Donut Mr Donut," all bets are off. And if the body text and alt text are stuffed with doughnut-related keywords, the page is likely to fit a spam profile. But just using "doughnuts" in the headline, title, and even the filename shouldn't get you into trouble if doughnuts are what the page is about and the keyword is used in a natural context.
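As an illustration of the "natural use" case described above (the filename and text are invented for the example), the safe version might look like:

```
<!-- doughnuts.html - one descriptive use of the keyword in each slot -->
<html>
<head>
  <title>Doughnuts: Recipes and History</title>
</head>
<body>
  <h1>Doughnuts</h1>
  <p>Everything you wanted to know about making doughnuts at home.</p>
</body>
</html>
```

The keyword appears once each in the filename, title, and headline, with the rest of the copy written for readers rather than crawlers.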
| 3:27 pm on Aug 19, 2006 (gmt 0)|
I've found over many years of successful SEO work, especially with Google, that there remains one tried-n-true SEO technique that works on sites I've SEO'd for even as recently as last month.
If the title tag, and the description meta tag closely match (but not verbatim) the site will do better, provided it's an honest site.
It also helps to have keywords in anchor text for menu items, but not tremendously.
Just my two cents.
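A sketch of the "closely match, but not verbatim" idea above (the widget company and its copy are hypothetical):

```
<head>
  <title>Blue Widgets - Acme Widget Co.</title>
  <meta name="description"
        content="Acme Widget Co. makes handmade blue widgets, with free shipping on all widget orders.">
</head>
```

The description echoes the title's key phrases without repeating it word for word.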
| 3:34 pm on Aug 19, 2006 (gmt 0)|
I think, if they are not doing it already, Google will eventually move away from ranking factors that are easily manipulated by individual webmasters. So the way of the future is to have sites that get clicked on, read, bookmarked, have repeat visitors, etc.
| 3:49 pm on Aug 19, 2006 (gmt 0)|
If EFV is right about headlines and titles and filenames, I am back to square one! :) This is the only potential danger area I can see in any of my sites, and Google or not, I am not going to change what's logical for me.
That is all the SEO I do, and while I research and learn more, it's more an academic activity than hardcore SEO.
I remember Matt Cutts mentioning overoptimization in one of his videos. However, that is extremely tricky to define - and the tolerance levels for overoptimization could change with the age of the site, its links, the area / market etc., I suppose.
I still don't get what data refresh means either - is it new data (indexed pages or something else?) getting fed to an algo? Or old data plus new data? Because I was hit for the first time ever in Google due to what I suspect was a 302 hijack from thousands of sites, and it is theoretically possible that if the algo is fed data from that time, my site would drop in ranking.
| 3:52 pm on Aug 19, 2006 (gmt 0)|
>That makes me wonder if the use of CSS is a shot in the foot.
> ignorance of noise words.
Yes, in LSI and related frameworks. However, the deeper you dive into linguistic analysis, the greyer your hair gets, if you share the pedantry of a typical programmer. The stopwords in particular cover such wide and often ambiguous content; and don't forget how many languages there are. There are modal concepts most of us don't have a clue about: think of the ancient Greek "medium" (middle voice) and extrapolate that into non-European languages.
I haven't had time to read the source you gave yet. Might be interesting.
| 6:21 pm on Aug 19, 2006 (gmt 0)|
Someone very eloquently posted on these forums, a while back now, a gem that has stuck in my mind to this day:
"It usually takes someone a thousand posts before they realise that, in essence, SEO is actually about doing very little at all."
The most recent revamp of my own site used this theory with *very* pleasing results ;)
[edited by: Panic_Man at 6:23 pm (utc) on Aug. 19, 2006]
| 6:27 pm on Aug 19, 2006 (gmt 0)|
Google SEO is about what works in Google SERPs. It could be a little or a lot depending on the type of site. I think I do very little, and that sends me up and down the tree with every data refresh.
| 6:33 pm on Aug 19, 2006 (gmt 0)|
From the first post
> because of this we both have links to each others sites.
> I actively SEO my work and he leaves his to luck, time, call it what you will ... not even a title tag, no incoming links ... nada!
Isn't that a contradiction?
| 6:42 pm on Aug 19, 2006 (gmt 0)|
|Why would having the title in the bar and the title of the page in h1 that are the same be so bad? That's exactly the way it should be right? |
|Your title is your title.
That made algo sense until it became the easiest way to find an seo, which Google obviously dislikes. |
So .. our titles in the title bar should not be the title on the page? This makes no sense.. if I see a title on a search engine.. I expect to find that data on the page.. it tells me .. the end user that in fact I did select the right link and will find exactly what I am looking for.
I totally detest SEO, but I hate it worse for being penalized for doing what's right for my customers and potential customers. If they want to "ding" long-standing websites then they need to do manual site reviews before they penalize.
I can understand somewhat doing "across the board" penalties for new sites until they have been deemed worthy of placement, but to just do it blindly is not only hurting good websites, but also customers using their search engine.
If searchers keep landing on pages that look like they were built in 1996 they are going to start thinking that all of the google searches are out of date and go elsewhere.
Maybe they should implement a way to pay them for a monthly manual review of the site. If the site is garbage .. they still get the money and you get no listing. If the site is good then they lift the penalties and spider it well.
| 6:43 pm on Aug 19, 2006 (gmt 0)|
You can give links to each other for completely non-SEO reasons. But like it or not, links are SEO now unless they're nofollowed.
| 6:49 pm on Aug 19, 2006 (gmt 0)|
Oh and this is a Featured Homepage Discussion now! :-)
Those who suggest doing minimal SEO - could you please explain what you mean by minimal? My minimal SEO doesn't seem to save me from wildly swinging traffic.
| 7:32 pm on Aug 19, 2006 (gmt 0)|
Focus your energy in other areas:
1. Code Validation (W3C etc.)
2. Minimum, but correct use of a wider range of HTML tags
3. CSS (separate presentation from content)
4. Clear navigation and structure - organise your site pages into clearly defined 'topic areas' - I see this so many times - Why talk about the same subject in two or more areas of your site?
5. Quality of writing
I call it - "doing it the way it was always intended to be done."
Primarily an ASP developer, I like to optimize my site code the same way I would optimize my ASP scripts. Get the most done in the least possible code. CONSTANTLY ask yourself: how can I make it do the same thing, but with less code (in this case, HTML tags)?
In short, write your sites for the maximum benefit of your (wider) audience. That's what search engines REALLY want (and will reward handsomely for) - deeply informative, high-quality sites to feed their information-hungry customers.
[edited by: Panic_Man at 7:55 pm (utc) on Aug. 19, 2006]
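A minimal before-and-after sketch of point 3 above, separating presentation from content (the class name and colors are invented for the example):

```
<!-- 1995-style: presentation mixed directly into the markup -->
<font face="Arial" color="#333333"><b>Widget prices</b></font>

<!-- CSS-style: the same heading, with presentation moved to a stylesheet -->
<h2 class="section-title">Widget prices</h2>
<style>
  .section-title { font-family: Arial, sans-serif; color: #333; }
</style>
```

The CSS version also leaves behind a proper heading tag, which is both lighter markup and clearer structure for a crawler.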
| 7:42 pm on Aug 19, 2006 (gmt 0)|
>Sorry, but I think this statement just flies in the face of many of the displaced webmasters' comments on this WebmasterWorld Google group. Haven't you been reading what people have been saying? The anger they have been showing?
The thing is, when someone gets something he doesn't deserve, he generally thinks he deserves it. But when he loses something he doesn't deserve, he gets angry. Not only does he lose something he had, he loses something that's been built into his own self-respect. And this is an especially acute emotion when the gaining of that undeserved thing was a significant part of his own sense of self-achievement.
Google goes about, harming nobody, but giving more-or-less-random help to more-or-less-undeserving people. The people who GET the new random help think, "all my rain-dances have been successful, it rains!" and the people who got the last random help think, "the gods have turned on me!"
The former group keep mousy-quiet, not wanting anyone else to know the secret of the half-elbow twist on the eighth loop around the sacred kumquat (the discovery of which, after years of painful and unproductive choreography, had just made the whole rain dance successful).
But there are no limits to the volume and rancor of the blasphemous rage expressed by the latter group, because the gods have betrayed them, by ignoring that elaborate head-wobble on the fifth backflip over the consecrated camellia, which, LAST year, THEY had discovered after Y.O.P.A.U.C, and which had once made THEIR rain dance successful.
It's the same chorus every month: the only difference is WHICH voices are silent, and which are screeching.
But Google is still putting just as many listings in the first ten search results, as they ever did. Sure, more of those listings are spam now, but now there is more spam to choose from -- many orders of magnitude more! So you can't meaningfully compare what Google is doing now (with say 20 million spammers to choose from) with what they did two years ago (with only 5 million spammers to choose from).
Anyone with a legitimate site is being overwhelmed by more spam than ever: and SEO won't solve the problem, it just makes you LOOK more like the spammers. You have to have human help, which means you have to create content for the benefit of other humans, and you have to design for humans, and you have to allow humans to respond on their own time scale (that is, you're in it for the long term: all that can happen quickly is death). Spammers and fraudsters have more competition than ever -- competition that's more and more skilled at counterfeiting genuine sites.
But there's no point in blaming Google for either of those situations. It is not Google's fault that it's deceived by people deliberately setting out to deceive it. It's a miracle, as much as Google is lied to, that they ever find ANY genuine authoritative websites. After all, most search engines don't.
| 9:22 pm on Aug 19, 2006 (gmt 0)|
I have seen this same situation with many of my clients. What I find upon further digging is that the higher site usually has:
1. Earlier registration date
2. More stable content
3. Different URL structure
4. Better content
5. The lower-ranked site isn't W3C compliant
Are any of these factors present?
[edited by: GaryTheScubaGuy at 9:23 pm (utc) on Aug. 19, 2006]
| 9:29 pm on Aug 19, 2006 (gmt 0)|
Point 1 (Earlier registration date) often appears to have a disturbingly high priority in Google Algos.
[edited by: Panic_Man at 9:49 pm (utc) on Aug. 19, 2006]
| This 88 message thread spans 3 pages |