
Order in which Factors are applied in the Google Algo

Trying to make sense of big changes in SERPs through the update

BikeMan

3:14 am on Jun 21, 2003 (gmt 0)

10+ Year Member



The previous update had seen my site slide from page 2 of the SERPs to nowhere. It didn't bother me too much, because I was sure that the problems with the index/algo would be sorted out and I would be back, probably on page 1, with the links I had added.

Having monitored this update closely, I have seen myself not listed in the top 100, then 94th, 8th, and now 56th for my main two-keyword search term.

Checking allinanchor: for the search term lists me 9th in the SERPs.

Keyword density is no more nor less than other pages on the first page.

PR is similar to other sites listed on the 1st couple of pages of the SERPS.

Links are obtained from almost the same sites.

Title tag contains the search term.

H1 tag with the search term.

Good quality content.

No cloaking or hidden text used.

As I am quite lost as to where I am losing out, I am hoping someone with a clue as to how the algo works can shed some light.

crobb305

6:00 pm on Jun 29, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I've never seen Googlebot download a CSS file, so unless the CSS was in the <head> or inline, how would it know the <H1>'s were CSS modified?

In my html code, the modification appears as <H1><size=2> ...so Google can see that the original H1 font was modified to size=2.

Not sure why it appears that way, though, since I use CSS for this. I never noticed it before, and it may have caused me a penalty. UGH. ;)

[edited by: crobb305 at 6:19 pm (utc) on June 29, 2003]

crobb305

6:06 pm on Jun 29, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Back to the discussion...does anyone else see any similar changes in serps that may be affected by manipulated H1?

[edited by: Marcia at 9:13 pm (utc) on June 29, 2003]

g1smd

6:15 pm on Jun 29, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Still #1 with:

<body>
<table width="100%">
<tr>
<td><img src="image.jpg" height="90" width="90" alt="Blah Blah Blah"></td>
<td><center><h1><font color="#000066" size="+4"><b>Blah Blah Blah Blah</b></font></h1></center></td>
</tr>
</table>

Yeah, this is going to be converted to CSS quite soon.

Problem? Penalty? What problem? None seen here.
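When I do convert it, the CSS version will probably look something like this (the class name and font size are guesses for illustration, not a drop-in replacement):

```html
<style type="text/css">
  .banner { text-align: center; }
  .banner h1 { color: #000066; font-size: 2em; font-weight: bold; }
</style>

<div class="banner">
  <img src="image.jpg" height="90" width="90" alt="Blah Blah Blah">
  <h1>Blah Blah Blah Blah</h1>
</div>
```

Same rendered result, but the heading markup itself is left plain.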

hazardtomyself

6:17 pm on Jun 29, 2003 (gmt 0)

10+ Year Member



I have two different regional sites. The first one, 8 months old has weathered all algo changes. H1 tags on every page. The title of the company has all keywords in H1 tags.

The second site, only 3 months old, representing a different region but with the exact same design, with H1 as the title and company name (which are also keywords), was #2 out of 975,000 results pre-Esmeralda. Now the index page cannot be found, and the weakest page on the site holds a page 3 position.

No explanation at all. Two sites with identical designs (different content altogether).

The only difference is site age. BTW, site #1 is PR4 and very stable; site #2 is PR5 and lost.

Age is the only difference I see with my sites.

Dolemite

6:24 pm on Jun 29, 2003 (gmt 0)

10+ Year Member



If you haven't taken care of the simple things, like moving modification of <Hx> tags to external CSS, then what are you doing trying to analyze an algorithm?

My algorithmic analysis: don't be surprised if Google decides inline-modified <Hx> tags shouldn't be given the full weight.

[edited by: Marcia at 9:16 pm (utc) on June 29, 2003]

crobb305

6:47 pm on Jun 29, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Not everyone in this forum is an html expert. The extent to which google accepts Hx tags (modified or not) has been a topic of debate long before you joined this forum a mere 3 months ago. The debate goes on. Everyone here is worthy of participating in this forum without sarcastic replies.

[edited by: Marcia at 9:18 pm (utc) on June 29, 2003]

crobb305

6:55 pm on Jun 29, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Despite my feelings regarding <Hx> tags, I agree with shaadi that it is a bit early to try to analyze too much. I still think there is a lot of missing anchor text that may explain some drops in rankings.

Beastie

7:31 pm on Jun 29, 2003 (gmt 0)

10+ Year Member



Very weird things for me:

i) I have 54 backlinks showing, and none are PR4+. Some are PR0-1.

ii) If I check "sites that contain the term 'domain.com'" there are hundreds more sites, some of which are definitely HTML links, but these are NOT showing as backlinks. What the hell?!

If Google has found these, why are they not showing as backlinks? Is this totally inconsistent and completely insane, or what?!

For allinanchor:domain.com I am number 1. Under every datacentre and www I am positioned 470. I think this side of things comes down to PR: I might have more actual links containing relevant text, but my PR remains low.

Still, what about that backlinks thing? That is absolutely crazy stuff. More annoying than I can express. When this update started, I was ranked #2. Now I am nowhere. All my other sites have followed suit. GeexAghgh!£W!@!

Dolemite

7:43 pm on Jun 29, 2003 (gmt 0)

10+ Year Member



i) I have 54 backlinks showing, and none are PR4+. Some are PR0-1.

ii) If I check "sites that contain the term 'domain.com'" there are hundreds more sites, some of which are definitely HTML links, but these are NOT showing as backlinks. What the hell?!

Beastie,

Backlinks are still a bit shaky right now, IMO, and PR definitely is. It's possible that those backlinks showing as PR0-1 are actually 4+ but just haven't quite solidified. In any case, the "sites that contain the term..." search should return many more results, since it's not filtering by PR level and the pages don't have to contain a link. I would sure be pissed if I found a PR7 link in there that didn't show up as a backlink, though. ;)

[edited by: Marcia at 9:20 pm (utc) on June 29, 2003]

Symbios

7:49 pm on Jun 29, 2003 (gmt 0)

10+ Year Member



I'm a bit worried about the H tags, but I will wait until G settles down before I make any changes. I just did 10 searches for one of my keyword terms on the 9 datacenters, and the results were different each time; it was almost as if the datacenters were alternating results.

I also agree about the PageRank results being flaky. I have some sites showing PR0, with backward links, holding top place for some search terms.

crobb305

7:54 pm on Jun 29, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Symbios, I am also worried about the Hx tags...
I just temporarily removed my H1 tags, so we will see what happens. I have been #1 across the board for a year... but all of a sudden I dropped 2 to 3 pages on major phrases. On secondary phrases (not contained in the Hx tags) I am still in the top 5.

[edited by: Marcia at 9:21 pm (utc) on June 29, 2003]

tigger

7:55 pm on Jun 29, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I think this should be dropped before the situation needs the moderator to step in and edit postings.

Cheers Guys :)

[edited by: Marcia at 9:24 pm (utc) on June 29, 2003]
[edit reason] Note: Editing done. [/edit]

kstprod

8:25 pm on Jun 29, 2003 (gmt 0)

10+ Year Member



Crobb,

I too am worried that my H1 is causing my index page drop. I also had the CSS on-page in the HTML, rather than external.

I also removed my H1 tag, just to see if anything would happen. Like changing anything could make the current situation WORSE. G crawled deep last night after I made changes, so we should know by tomorrow if it did any good. I'll let you know what happens.

BUT, I do have to say, that I use the same H1 set up on ALL of my pages, including a sub-directory, and NONE of these pages have had an ill effect in this "Google weirdness". So, it's quite possible that by us removing the H1's, we won't see much change, if any. I sure hope so, but I doubt it since all my other pages use it just fine.

OTOH, maybe this H1 problem only has to do with the index page, which is why mine is gone. Only time will tell I guess.

James_Dale

8:30 pm on Jun 29, 2003 (gmt 0)

10+ Year Member



I'm seeing several high-ranking sites using H1 tags. The most convincing case I've heard so far for the 'megaflux' effect (coining that phrase!) is that Google is having problems consolidating domain.com and www.domain.com. It seems that inbound links to domain.com, if found, are sporadically being associated with www.domain.com. When this happens, the real links to www.domain.com are discounted completely. Using a mod_rewrite as a fix has had successful results with some people around WW.
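For reference, the rewrite people have been using is usually something along these lines in an .htaccess file (example.com is a placeholder for the real domain — check your own server setup before copying):

```apache
RewriteEngine On
# Redirect any request for example.com to the www.example.com equivalent
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

The 301 tells spiders the move is permanent, so in theory the two hostnames should get consolidated over time.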

Marcia

9:29 pm on Jun 29, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Brief pause for a TOS [webmasterworld.com] reminder:

Always be respectful of other users, the system, and the moderators. We put the system online in good faith, please use it in good faith.

We need to remember that we're all in this together and all here to help each other.

kstprod

10:43 pm on Jun 29, 2003 (gmt 0)

10+ Year Member



James,

That does make sense, but I don't think it applies to my situation. At first, I thought it was very possible, but upon checking Google, I found nothing when I searched for my domain minus the www. So I don't think I have any internal or external links without it. I do, however, have some backlinks that have the anchor text MyCompany.com, and maybe even some links to www.MyCompany.com. At the time, I thought the capitalizing was a good idea so I would stand out... but now I think maybe it could hurt me?

I was wondering, though: if I'm right and I don't have any domain.com links, would adding the mod_rewrite hurt anything? I thought maybe I should go ahead with it, just in case. Better safe than sorry, especially if it wouldn't hurt to add it, right?

Marcia

10:47 pm on Jun 29, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



There was concern expressed a couple of years back about modifying H1 with font tags. At that time, some people kept the modifying font tags outside, rather than nested within, the <H1>. It's not generally accepted protocol, but some were using <H2> instead for this reason and claimed to be doing just as well as if it were <H1>.

Kirby

10:57 pm on Jun 29, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Marcia, I am one who has used <H2> rather than modify <H1>, but I am seeing sites that greatly abuse the <hx> tags being rewarded since Dominic and now Esmeralda, going from nowhere to page 1.

[edited by: Marcia at 11:11 pm (utc) on June 29, 2003]
[edit reason] No specifics necessary. [/edit]

kstprod

11:21 pm on Jun 29, 2003 (gmt 0)

10+ Year Member



Marcia,

I meant that I modify my H1 with CSS, but I have the CSS in the HEAD section instead of in an external file. Do you think it makes a big difference?
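For anyone else in the same boat, the move is just swapping the in-head <style> block for a <link> to an external file (the rule and filename here are invented examples, not my actual markup):

```html
<!-- before: rule sits in the <head> -->
<style type="text/css">
  h1 { font-size: 14px; }
</style>

<!-- after: the same rule lives in an external file, e.g. styles.css -->
<link rel="stylesheet" type="text/css" href="styles.css">
```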

Kirby,

Holy cow. That is quite a case of overuse. Maybe this is going to be the standard nowadays for achieving page 1. I, for one, would be too scared to even try it, for fear of a ban. Gotta give these guys a little credit; they sure have guts.

Kirby

11:30 pm on Jun 29, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I would be curious to hear if someone gets a penalty for <Hx> abuse, but I don't know how someone would know for sure the real reason for the penalty. I haven't seen consistent applications of any spam filters at this point.


Dolemite

12:42 am on Jun 30, 2003 (gmt 0)

10+ Year Member



I don't think there would be a penalty.

At a certain point of overuse they just have to stop meaning something, though.

pageoneresults

1:06 am on Jun 30, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I would be curious to hear if someone gets a penalty for <Hx> abuse, but I don't know how someone would know for sure the real reason for the penalty.

Overoptimization seems to be popping up more these days. If I were Googlebot, I'd be looking at the character length of <h> tags. I'd also be looking at the surrounding text. Wrapping paragraph content in <h> tags might be an issue. Putting <h> tags back to back might also be an issue, unless there is a logical structure. You've got to look at everything surrounding those <h> tags to effectively determine whether or not you are using them as they were intended.
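Purely as a thought experiment, that kind of check could be sketched like this (the 70-character limit is an arbitrary guess, not anything Google has published):

```python
import re

def heading_warnings(html, max_len=70):
    """Flag <h1>-<h6> contents that look suspiciously long.

    The max_len threshold is a guessed heuristic; this only
    illustrates the kind of check a crawler *could* run."""
    warnings = []
    for tag, body in re.findall(r'(?is)<(h[1-6])[^>]*>(.*?)</\1>', html):
        # Strip any nested tags and collapse whitespace.
        text = ' '.join(re.sub(r'(?s)<[^>]+>', '', body).split())
        if len(text) > max_len:
            warnings.append((tag.lower(), len(text)))
    return warnings

page = ('<h1>Short heading</h1>'
        '<h2>' + 'keyword ' * 20 + '</h2>')
print(heading_warnings(page))
```

A sensible heading passes untouched; a 20-keyword <h2> gets flagged.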

Usually when there is <h> tag abuse, other areas are being abused too. Google is going to have everyone guessing from this point forward. I say keep doing what you are doing if it has proven successful. If you've seen some bumps this last update, wait until the next one, if there is one, to determine whether change is needed. Too many people are rushing to change things, and since this is not typical Google behavior, I'd wait and see. What if everything is back to normal in 2 weeks, or tomorrow?

g1smd

1:17 am on Jun 30, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month




You should run your pages through [validator.w3.org...], taking care to tick the boxes for "show source" and "verbose output" and, most importantly, "show outline".

Validate the page, then scroll down past the error messages (if there are any) until you find the Document Outline. If the list of text there does not look like a summary of your document, then you are abusing the <hx> tags.
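As a rough illustration, a page that passes the outline test keeps its headings in a sensible hierarchy, something like this (placeholder content):

```html
<h1>Widget Guide</h1>
<p>Intro copy...</p>
<h2>Choosing a Widget</h2>
<p>...</p>
<h3>Sizes and Colours</h3>
<p>...</p>
<h2>Caring for Your Widget</h2>
<p>...</p>
```

The Document Outline for that page reads like a table of contents. A page that jumps from <h1> straight to <h4>, or wraps whole paragraphs in <h2>, produces an outline that reads like gibberish.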

willybfriendly

2:12 am on Jun 30, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



From the analysis that I offered earlier, on-page factors do not have a lot of positive influence. Remember, 4 of the 7 sites listed in the top three results of two single-word searches had a combined 1 occurrence of the keyword in their copy.

There may be penalties applied to on-page optimization such as Hx tags, etc. But these are not an outright ban, just a devaluation.

It appears from the examples I gave that off-page is where the action is. Why else would pages with zero copy rank #1, 2, or 3? Title and domain name are factors in gaining a boost. On-page stuff, at best, can cause a loss of position.

It is a new game when image map navigation, framed pages and pure graphic pages with no alt tags can take the top three positions.

Does anyone have evidence that off-page stuff has not taken on new importance? I just did a search that returns 1,150,000 results: discount widgets. Check the cache and it says "These terms only appear in links pointing to this page: discount". No H tags at all. Terrible code bloat. JS galore. But the domain name is "discount-keyword.com".

Hmmm....

I have already picked up a keyword1-keyword2 domain name :) Now I will work on off-page stuff while I wait a month to see what changes the Google gods might make. If these are spam filters, they are far too tight. If nothing changes in a month, I will simply remove all spiderable content from my index page, stuff my title with keywords, use the keyword-stuffed domain name, and let the off-page stuff do its magic :)

WBF

BikeMan

12:52 pm on Jun 30, 2003 (gmt 0)

10+ Year Member



Following the advice to look at on-page factors (since the allinurl was high enough), I did a bit of analysis. I do have the setting <font size=6> in the H1 tag.

Keyword density for the first page of the SERPs for the two keyword phrase range from 0% to 46%. For my page it is at 31%.

Number of occurrences ranges from 0 to 35. For my page it is 17.

The only significant difference is that the text-to-HTML ratio for my site is almost 34%, while it ranges between 6% and 27% for sites on the 1st page of the SERPs.

More text generally equals more content... so where could the problem lie?
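A crude way to compute such a ratio, for anyone who wants to check their own pages (a sketch only; the tools people use likely differ in what they count):

```python
import re

def text_to_html_ratio(html):
    """Visible-text characters divided by total page characters.

    A rough approximation: strips <script>/<style> blocks and all
    remaining tags, then collapses whitespace."""
    if not html:
        return 0.0
    stripped = re.sub(r'(?is)<(script|style)[^>]*>.*?</\1>', '', html)
    text = re.sub(r'(?s)<[^>]+>', '', stripped)
    text = ' '.join(text.split())
    return len(text) / len(html)

page = ('<html><head><title>Widgets</title></head>'
        '<body><h1>Discount widgets</h1></body></html>')
print(round(text_to_html_ratio(page) * 100, 1), '%')
```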

g1smd

6:54 pm on Jun 30, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Is that 34% referring to the HTML or the text on the page?

BikeMan

7:16 pm on Jun 30, 2003 (gmt 0)

10+ Year Member



Text

Janet

1:04 am on Jul 1, 2003 (gmt 0)

10+ Year Member



CWright, back in msg #18, made an interesting point about pages focusing on high-traffic/competitive keywords having more instability in the index than less competitive ones.

I also see these symptoms with some of my sites. Of course, the high-traffic keywords are what make sales for my clients, so the erratic behaviour is causing serious pain. A page that was #1 for a high-traffic keyword is bouncing around in the SERPs; who knows where it will pop up next!
Pages (from the same website) that were #1 for less competitive/lower-trafficked terms have held their placement.

Could there be some problem with Google's database handling results for higher volumes of queries, or for searches where a higher number of pages fit the criteria? I'm clutching at straws, maybe, but there's not much that makes a lot of sense in the results from this update.

Also, there seem to be more problems with sites of mine that were placed in Google recently, i.e. just prior to the Dominic update (first indexed during March/April). Older sites seem to be a lot more stable.

jaffstar

1:58 am on Jul 1, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I watched this afternoon as my keyword started coming across all 9 data centers; it's now back to the top :)

Maybe the algo change is taking time to import all the variables back?

I vanished pre-Dominic, but if you look at my previous post, everything said I should be higher.
