
Google SEO News and Discussion Forum

    
Home page indexing issue - follows a redesign disaster
bbarbeau




msg:4209574
 8:57 pm on Sep 30, 2010 (gmt 0)

I've posted about this problem earlier in the month towards the bottom of this thread:

[webmasterworld.com...]

But things have shown no signs of improvement and so I wanted to get additional input specific to the problem, hence the new thread.

Overview of the problem and what led to it:
A client launched a redesign for his ecommerce site in the first week of August. In the days following, the site slipped from page one to basically nowhere for their most competitive keywords.

Facts about redesign:
-URLs remained the same
-Content (in terms of body paragraph text) remained the same
-Internal link count more than doubled on each page. (The top nav changed from a static, image-only navigation section to a CSS-driven text-link nav, with primary links across the top, groups of secondary links dropping down on hover, and even tertiary links popping out to the side when hovering over secondary links.)
-Content was drowned beneath a sea of code powering the top nav and other parts of the page.
-The h1 tags got dropped by the designer; CSS on other elements is being used to accomplish the same look. :-/ (Just caught this now, will be remedied posthaste.)

How it's affected the site

The result is that, in Google, the home page seems to have been filtered from the results. For site: searches, the home page is down in the 200s. There was a time a few weeks ago when the home page was once again appearing first for site: searches, but it has since reverted to the 200s. To my knowledge, no changes were made to the home page directly before or after this brief fluctuation.

Interior pages continue to rank well on some terms, mostly terms that aren't terribly competitive, but they still are driving some traffic.

In general, Google traffic has been cut roughly in half.

Bing and Yahoo have rewarded the new design with increased rankings on mission-critical keywords, which in turn has improved traffic from both engines.

In general, time on page is up, conversion rate is up, and bounce rate is down, so it seems the users who *are* able to find the site are having an easier time using it.

Steps we've taken since:
-Killed tertiary links in the top nav area.
-Put the main content very high in the source code and repositioned it with CSS (rough sketch below).
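
For the curious, a rough sketch of that source-order technique (illustrative markup and ids, not the actual site's):

<html>
<head>
<style>
/* hypothetical CSS: the nav sits later in the source but is pinned
   to the top of the page visually */
#topnav  { position: absolute; top: 0; left: 0; width: 100%; }
#content { margin-top: 120px; /* leave room for the nav */ }
</style>
</head>
<body>
<!-- main content first in the source, so crawlers reach it early -->
<div id="content">Body copy, product images, etc.</div>
<!-- navigation later in the source, repositioned by the CSS above -->
<ul id="topnav">
<li><a href="/">Home</a></li>
</ul>
</body>
</html>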

Solutions considered

1) Focus on building links to rebuild PageRank for the home page. This approach operates under the assumption that the drastic changes in link structure led to a hemorrhaging of PageRank throughout the site, and the only way to fix it is to just start (re)building more links.

2) Posting the old site back up. This was recently suggested. I hesitate to do it because my concern is that if the home page *is* in some kind of filter/sandbox/penalty/whatever situation, then that is likely attached to the domain name regardless of what the source code reads. Is it erroneous to assume that? If Google recrawls and the old source code (that enjoyed nice rankings) is there again, will the filter go away (relatively quickly)?

Any questions? Advice? Better solutions than those two?

I'm desperate to get this filter lifted but am just unsure how to go about doing it.

[edited by: Robert_Charlton at 9:24 pm (utc) on Sep 30, 2010]
[edit reason] fixed formatting [/edit]

 

tedster




msg:4209591
 9:25 pm on Sep 30, 2010 (gmt 0)

Most important - does the domain root (www.example.com or example.com) resolve directly without any redirects? And if so, does that page have real content, or is it a splash page of some kind?

bbarbeau




msg:4210043
 4:25 pm on Oct 1, 2010 (gmt 0)

Thanks for the reply tedster! (And thanks Robert for fixing the formatting... I went to preview before I posted and hit the submit button instead.)

Yes, www.example.com 200ok's.

example.com 301s to www.example.com, as well.

One thing I briefly suspected was that the sitewide link back to the homepage is coded as "index.html," so I thought that might be a problem. But index.html 301's to www.example.com as well, so I no longer suspect it's an issue.

Also forgot to add that there is lots of genuine content on the home page (product images as well as text)

tedster




msg:4210106
 6:27 pm on Oct 1, 2010 (gmt 0)

How long has this been going on? It sounds like at least a few weeks. There is a Google bug that happens once in a while that causes home pages to go missing, but it doesn't usually last for weeks and weeks. Plus, from what you say your home page isn't missing, just not ranking.

Definitely fix the internal links for Home so they point to the domain root with the "www". It cannot harm and it might help. I would also get as much of that extra code off the page and into external files as possible. Is that "sea of code" for the top navigation still crawler friendly? This new menu system is my first suspect for the problem's source right now.

What happened to your title tags during the redesign - any changes there? Titles are a lot more important than the H1 tags.

Robert Charlton




msg:4210112
 6:41 pm on Oct 1, 2010 (gmt 0)

One thing I briefly suspected was that the sitewide link back to the homepage is coded as "index.html," so I thought that might be a problem. But index.html 301's to www.example.com as well, so I no longer suspect it's an issue.

If I understand you correctly, you're globally linking to "index.html" instead of to "/" (or to your absolute default canonical url)... and then you're assuming that the 301 will undo this.

IMO, this may well be your problem, as the internal links are fighting your redirects.
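
To spell out the fix, a minimal before/after (using example.com as elsewhere in this thread):

<!-- current: every page links to the file, forcing a 301 on the way back -->
<a href="index.html">Home</a>

<!-- suggested: link straight to the canonical root - no redirect hop -->
<a href="http://www.example.com/">Home</a>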

bbarbeau




msg:4210151
 8:04 pm on Oct 1, 2010 (gmt 0)

The redesign launched in early August. Rankings only slipped a bit initially, but within two weeks the home page had disappeared from many search queries. (It still appeared very low in site: searches, and #1 if you searched for [ company name ]. The latter I expected, the former I still don't understand.)

Ill-advised as it may have been, we were making changes during the initial position slippage to try to recover the rankings. These were mainly focused on the navigation menu: killing tertiary links, and properly ordering the navigation and list items (the designer had some very funny stuff going on, where the source was coded such that the menu read right-to-left and child items came before their parents, rather than parents first with children following, reading left-to-right).

I then thought it prudent that we just wait a few more weeks and let Google digest all the changes. But now we're closing in on 8 weeks post-launch and I'm thinking this isn't something about Google digesting changes, and that it's done all the processing it's going to do (after all, interior pages are doing alright).

As far as crawler-friendliness of the menu, the design change was actually expected to make it more friendly, since it went from text embedded in images to hyperlinked text. However, now that you mention it, part of what's causing the "sea of code" is that each link has its own javascript attributes, e.g.

<li class="secondary" onmouseover="this.className+=' overa';" onmouseout="this.className=this.className.replace(' overa', '');"><a href="link_target.html" class="secondarya">Anchor Text Here</a></li>

(Yes, they are relative links :( )

I will see what the designer can do in terms of locating this stuff externally.

In terms of the title tag, I just now noticed that there were minor changes made between the old design and the relaunch. One thing that changed was that a "by Company Name" that used to conclude all the title tags on the old site did not get ported over to the new design. A two-word keyphrase was added to the end of the new title tag as well, so I guess overall the count stayed roughly the same. Could that cause a significant difference?

I agree with you though that I'm looking hard at the menu as the perpetrator. Especially now that you led me to notice this javascript issue. (It's amazing how bad tunnel vision can get when you're faced with a big problem. I feel like I shoulda caught that on day one.)


@ Robert and tedster
re: index.html sitewide link. I agree that this needs to change. When I first started looking at it, I thought I'd found the culprit. But when I looked at the old site, it also used this same methodology, so I disregarded it as a cause for the filtering we are experiencing.

However, even if it isn't the case, I do agree with you both that it makes sense to use the canonical rather than forcing a 301 to bear the load unnecessarily.


So after getting your input, I will get the global homepage link implemented, as well as look into getting the designer to clean up redundant code in the source.

One more thing I'd like your input on specifically: I'm facing pressure to repost the old design until I can sort this out. What are your thoughts on that? If the site is somehow filtered, will another drastic sitewide change help or hurt us?

When someone suggested this, my first response was "*facepalm* why didn't I think of that?"

But the more I think of it, the less appealing it seems to me. Guess I just wanted some outside opinion on that course of action.

Thanks again to both of you for your input.

Robert Charlton




msg:4210166
 8:38 pm on Oct 1, 2010 (gmt 0)

In terms of the title tag, I just now noticed that there were minor changes made between the old design and the relaunch. One thing that changed was that a "by Company Name" that used to conclude all the title tags on the old site did not get ported over to the new design. A two-word keyphrase was added to the end of the new title tag as well, so I guess overall the count stayed roughly the same. Could that cause a significant difference?

The page title element, as it should be called, is generally the most important single onpage factor that influences search engine ranking. Yes, the changes you describe could cause significant differences. A lot depends on the specifics.

But navigation, titles, and headings are pretty significant just by themselves. Take them all together, along with who knows how many other unknowns, and yes, you're likely to be seeing big differences. How about the onpage content itself? I know you say you kept the same body paragraph text, but what about the rest of it?

I would almost be tempted to revert to the old site, put the new one on a password-protected development server, and systematically go through the differences, making sure nothing is changed until you understand what is going on. That's pretty extreme... and in truth something I've never needed to do, so I can't fully predict how Google would react. I generally wouldn't even consider this kind of back step, but I don't think sufficient attention was paid to a lot of potentially crucial details. At the least, you need to assemble, very quickly, a complete list of the changes that were made, and decide whether it's going to be faster to go forward or to go back.

This may be a Humpty Dumpty kind of situation, where you can't reconstruct all the pieces, so I'd make sure that you can go back before you try to do so.

bbarbeau




msg:4210209
 9:54 pm on Oct 1, 2010 (gmt 0)

*shaking my head*... Yes, the paragraph content stayed the same as I mentioned, but product images and text links to the product pages (both the images and the text were hyperlinked) were added to the home page as well. (This linking, of course, compounds the problem of the many new links already added via the menu navigation.)

As of right now, I will be taking your advice of basically conducting a careful examination of the two versions and minimizing the changes on critical technical issues which were frankly overlooked.

As an FYI, I won't revert to the old site yet. The new site has done very well in Bing and Yahoo, and as I think I mentioned earlier, user metrics are up. I still think the design change was positive, but I need to get the site to a place where Google is of the same opinion. Reverting to the old site has too many unknowns, and since we don't know how Google would react, I don't think it justifies the risk. If we can fix the things Google has taken issue with, then I think things will be good all around.

I'll report back on changes made and what effects they've had once we get to that point.

tedster




msg:4210269
 12:42 am on Oct 2, 2010 (gmt 0)

Even though each navigational link includes some onmouseover javascript, the link itself is crawler friendly. What is apparently NOT crawler friendly is any "secondary" link generated by the JavaScript. If the URL for the link is not directly readable in the source code, you usually have a problem.

I will see what the designer can do in terms of locating this stuff externally.

There are more SE friendly menu systems available. They use hover behaviors to switch the visibility of secondary navigation on and off - not onmouseover and onmouseout events.
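
A minimal sketch of that kind of menu (illustrative markup and class names, not the actual site's):

<ul id="nav">
<li><a href="/widgets.html">Widgets</a>
  <ul class="dropdown">
    <li><a href="/blue-widgets.html">Blue Widgets</a></li>
    <li><a href="/red-widgets.html">Red Widgets</a></li>
  </ul>
</li>
</ul>

<style>
/* secondary links are hidden by default and shown on :hover - no
   JavaScript events, and the anchors stay plainly readable in the source */
#nav li .dropdown { display: none; }
#nav li:hover .dropdown { display: block; }
</style>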

So how many total navigational links appear on every page now - with and without the links generated by JavaScript events?

-----

Other members here have reported ranking problems after site-wide changes to the title element. I've never personally run into that kind of problem, but it is worth considering. The theory goes that Google flags a site for widespread title changes because it might be an attempt to reverse-engineer the algorithm. However, even if that's true, it would paint you into a corner - going back to the previous titles would be yet another change. Not sure what to say about that.

Planet13




msg:4210746
 3:51 pm on Oct 3, 2010 (gmt 0)

In general, time on page is up, conversion rate is up, and bounce rate down...


This hasn't been asked yet, so don't mind me butting in...

How is the number of conversions doing?

I just ask because sometimes we get focused on the wrong things. I have been spending a LOT of time looking at my Google Analytics data over the last ten days or so, and I've realized that visits to my home page don't convert nearly as well as visits to my main category pages (one click away from the home page - I guess you'd call them secondary links?).

So while you have people breathing down your neck for results, is it possible you could show them that traffic has increased to your better converting pages? That might buy you some time with the brass until you can come up with a concrete course of action.

Hope this helps.

bbarbeau




msg:4211131
 4:07 pm on Oct 4, 2010 (gmt 0)

What is apparently NOT crawler friendly is any "secondary" link generated by the JavaScript. If the URL for the link is not directly readable in the source code, you usually have a problem.


Can you say a little more about this? I'm not sure I understand what you mean when you say "secondary link generated by the JavaScript," along with the mention of whether the link is directly readable in the source.

You can read all the links in the source, but they all have those JavaScript elements included. And when viewing the page through Lynx, you can see all the links in the navigation menu (primary and secondary links).

So I think I'm a bit confused as to the degree to which this navigation structure is or is not bad.

So how many total navigational links appear on every page now - with and without the links generated by JavaScript events?

There are 6 primary links in the top navigation which are always visible, regardless of mouse activity. In addition, there are 54 total secondary links which drop down from the various primary links.

This total of 60 links is up from the previous 8 total top navigation links in the previous design. I'm definitely going to revise this as I think there is fat that can be cut from the menu, both in terms of link juice and user experience.

Also, interesting info about large scale title tag changes. I'm going to recommend going back to the old tags (since we never recommended they be changed in the redesign, but it is what it is now), so I'll report whether there are any effects or not from doing that.

How is the number of conversions doing?

Unfortunately, conversion numbers are down in general, despite rates being up. Thanks for the input though... I'll definitely be doing some architectural performance comparisons soon, but at the moment everything is going into the recovery plan.

enigma1




msg:4211141
 4:39 pm on Oct 4, 2010 (gmt 0)

For the links - at least in the code you posted - use plain links without any JS mixed in:

<li class="secondary"><a href="link_target.html" class="secondarya">Anchor Text Here</a></li>

You can still have JS firing on the various events (rollovers, clicks, etc.) by using a framework like jQuery. Move as much JS code and as many resources as you can into external files (external CSS and JS files).
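
A minimal sketch of what that could look like (hypothetical external file, reusing the class names from the snippet above, and assuming jQuery is loaded):

// nav.js - replaces the inline onmouseover/onmouseout attributes
// with a single hover binding on the secondary menu items
$(function() {
  $('li.secondary').hover(
    function() { $(this).addClass('overa'); },
    function() { $(this).removeClass('overa'); }
  );
});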

pageoneresults




msg:4211147
 4:53 pm on Oct 4, 2010 (gmt 0)

I'd be looking at technical issues here. If you have GWT enabled, what type of 404 activity is there? I've experienced a site tanking after a malformed rewrite rule that Googlebot managed to find outside of our testing. It put the bot in a loop and caused turmoil during the timeframe involved. Things are back to normal, and better, since correcting it, but I'll tell you what, that was a career-changing experience. ;)

70%+ of webmaster challenges usually stem from technical issues. Many times it is stuff that folks don't see. For example, have you looked at your raw logfiles to see what type of crawl activity is taking place? You can probably pinpoint the when and where, along with the why, if you analyze those files closely. GWT has been a great resource for detecting crawl anomalies and such. I review that area as much as I do GA. :)

bbarbeau




msg:4211174
 5:46 pm on Oct 4, 2010 (gmt 0)

haha, I lived in GWT post-launch and found oodles of stuff that needed fixing: a plethora of 404s, bad URLs in the XML sitemap, etc. etc.

Interesting info about the malformed rewrite. I haven't looked at the .htaccess file since the redesign (assuming nothing changed in it, since there was no need to change anything), but I think I may need to do that stat, since so many other assumptions I've had about the redesign have proven not to be the case.

We tried to get the client to send us raw access logs, but the client either doesn't know how to give us them or is reluctant to do so. I will ask again and see what he has to say.

From GWT though, crawler activity spiked in the days/weeks immediately following the redesign. It has since dropped back to regular levels, but it still seems like pages crawled per day and KB downloaded per day are up compared to before the redesign launch.

The site also slowed down significantly following the redesign. Load time (according to GWT) averaged around 2sec/page pre-launch, and jumped to around 3.5sec/page post-launch. Some of this I figure is increased product image/links on pages (which I'll be recommending be edited down to fewer pics/links).

bwnbwn




msg:4211341
 2:32 am on Oct 5, 2010 (gmt 0)

Are both the links going to the same page? Image and text links?

Sgt_Kickaxe




msg:4211428
 7:16 am on Oct 5, 2010 (gmt 0)

Slow things down a little. You need to make sure Google has absorbed all of your sitewide changes before making new ones; otherwise you may do the right thing and then take it apart, thinking it isn't working.

Here's a tip. Add a noarchive meta tag when you make a change. Wait until you see that none of your pages have a cache in google before making more changes. When the cache is gone sitewide make another change but this time remove the noarchive tag. Wait for every page to be archived to know that your change is done propagating. Rinse and repeat until happy with results.
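
For reference, the tag in question is the standard robots meta directive, placed in each page's head:

<meta name="robots" content="noarchive">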

pageoneresults




msg:4211640
 2:06 pm on Oct 5, 2010 (gmt 0)

I lived in GWT post-launch.


Very wise move. ;)

And found oodles of stuff that needed fixing: a plethora of 404s, bad URLs in the XML sitemap, etc. etc.


If you found oodles of stuff that needed fixing, that isn't a good sign - especially the plethora of 404s.

Interesting info about the malformed rewrite. I haven't looked at the .htaccess file since the redesign (assuming nothing changed in it, since there was no need to change anything), but I think I may need to do that stat, since so many other assumptions I've had about the redesign have proven not to be the case.


I'd be running server header checks on everything. There is a possibility that you may not be returning the proper header for some responses - for instance, the dreaded 404 > 200 response, which may have a direct influence on a site tanking. Especially if there are 100s or 1000s of pages that should be 404 but are returning a 200. Google doesn't like that and will just nix the site until you fix things.
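
In practice: request a URL that shouldn't exist and read the status line. An illustrative transcript (same style as the header dumps later in this thread):

GET /no-such-page-12345.html HTTP/1.1
Host: www.example.com

HTTP/1.1 404 Not Found   <- what you want to see (or 410 Gone)
HTTP/1.1 200 OK          <- the dreaded soft 404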

We tried to get the client to send us raw access logs, but the client either doesn't know how to give us them or is reluctant to do so. I will ask again and see what he has to say.


Let them know that it probably holds the key to their woes.

From GWT though, crawler activity spiked in the days/weeks immediately following the redesign.


A spike would be expected; with major changes in code structure, there is a recalculation that needs to take place.

It has since dropped back to regular levels, but it still seems like pages crawled per day and KB downloaded per day are up compared to before the redesign launch.


Might be an indicator that there is a loop somewhere in a rewrite? Not sure. But if you have a lot going on with dynamics and are relying on a rewrite to handle everything with the URIs, I'd be triple-checking everything. Make sure that 301s are returning a 301 and not a 302, unless of course a 302 is appropriate, which is not too frequently the case. ;)

Make sure pages that no longer exist are returning either a 404 Not Found response or a 410 Gone. I prefer 410. I just experimented with it these past few weeks and Google will remove a document very quickly once it receives a 410 Gone response - or at least it did in this instance. I was a little surprised and expecting it to be treated as a 404 which gets held on to for who knows how long.

The site also slowed down significantly following the redesign. Load time (according to GWT) averaged around 2sec/page pre-launch, and jumped to around 3.5sec/page post-launch. Some of this I figure is increased product image/links on pages (which I'll be recommending be edited down to fewer pics/links).


That's a pretty major shift in load times, especially with Google. If there are competing sites with quicker load times, other things being equal, they may outperform you. From 2.0 seconds to 3.5 seconds is a pretty hefty hit.

Have you validated the pages? (Okay, for those reading this, stop the groaning!) I would strongly suggest that you validate each and every page to make sure there are no blatant parsing errors. There may be certain errors that impede clean indexing; who knows, you could be a victim of something like that, especially in a redesign.

Added: I was reading the history in the other topic along with what you've outlined above. You've got multiple challenges taking place and they may be compounding one another, or they did in the past. I'm not sure what condition the markup is in at this point and that would be another area I'd be focusing on, the quality of the markup.

bbarbeau




msg:4211954
 9:16 pm on Oct 5, 2010 (gmt 0)

Most important - does the domain root (www.example.com or example.com) resolve directly without any redirects?


omg, I am an idiot. tedster, you were right on your very first post.


Sending request:

GET / HTTP/1.1
Host: www.domain.com
User-Agent: Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US; rv:1.9.2.10) Gecko/20100914 Firefox/3.6.10 GTB7.1 ( .NET CLR 3.5.30729; .NET4.0C)
Referer: x
Connection: close

• Finding host IP address...
• Host IP address = x
• Finding TCP protocol...
• Binding to local socket...
• Connecting to host...
• Sending request...
• Waiting for response...

Receiving Header:
HTTP/1.1·200·OK(CR)(LF)
Date:·Tue,·05·Oct·2010·20:21:39·GMT(CR)(LF)
Set-Cookie:·BX=e9f90kt6an26j&b=3&s=5e;·expires=Tue,·02-Jun-2037·20:00:00·GMT;·path=/;·domain=.example.com(CR)(LF)
P3P:·policyref="http://p3p.yahoo.com/w3c/p3p.xml",·CP="CAO·DSP·COR·CUR·ADM·DEV·TAI·PSA·PSD·IVAi·IVDi·CONi·
TELo·OTPi·OUR·DELi·SAMi·OTRi·UNRi·PUBi·IND·PHY·ONL·UNI·PUR·FIN·COM·NAV·INT·DEM·CNT·STA·POL·HEA·PRE·GOV"(CR)(LF)
Cache-Control:·private(CR)(LF)
X-XRDS-Location:·http://www.example.com/ystore/openid/rp.xrds(CR)(LF)
Connection:·close(CR)(LF)
Transfer-Encoding:·chunked(CR)(LF)
Content-Type:·text/html(CR)(LF)
Expires:·Tue,·05·Oct·2010·20:21:39·GMT(CR)(LF)


When I first checked for redirects, my eyes basically got to 200 OK and looked no further. And even if I had continued, I wouldn't have known what I was looking at.

But when checking to see what the load time was at a web page analyzer, I noticed it was returning an analysis for www.domain.com/ystore/openid/rp.xrds instead of the home page. It wasn't until I looked again very closely at the header check that I noticed this same URL appeared in the header under X-XRDS-Location.

Sorry for missing that the first time around. Until today I only knew about 301/302 redirects, meta refresh redirects, and Javascript redirects.

My question now is: Is this X-XRDS-Location actually functioning as a redirect in this capacity?

In the header checker tool I usually use (from rexswain.com), it 200 OKs and then returns the source code. But in this webpage analyzer at weboptimiztion.com, it returns the XRDS file.

It seems to me like it's functioning as a redirect of sorts, since it is being called in the header. And when you go to that URL, you find what looks like some kind of redirect:




<?xml version="1.0" encoding="utf-8"?>
<!DOCTYPE html PUBLIC "-//W3C//DTD HTML 3.2//EN">
<html>
<head>
<title></title>
</head>
<body> http://specs.openid.net/auth/2.0/return_to http://www.domain.com/ystore/openid/FinishAuth </body>

</html>




A few spot checks of the site show that this X-XRDS-Location header item appears only on the homepage. That makes me very suspicious that this could be a primary cause (among the many, many missteps that even today still need to be addressed) of what went wrong with the redesign launch, and one that could be raising red flags with Google and keeping the home page specifically from appearing highly in the SERPs.

The site is a Yahoo Merchant site, and I know OpenID is a legitimate login manager or whatnot. But perhaps the two together (mixed with this strange redirect) caused Google to filter the home page from its results?

Anyone have any more info on XRDS header element? Googling [ "x xrds location" ] (exact phrase match) is not terribly enlightening on the subject, and even [ xrds site:webmasterworld.com ] only has two results.

Anyways, I'm sure the client would rather have rankings back than allow for OpenID logins, so yanking the OpenID stuff is starting to look like a primary course of action. But does anyone know if this can be implemented a different way?

Some of what I came across in my search suggests that you can implement a meta http-equiv tag.
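
If that's right, the equivalent would be something like the following in the home page's head (URL copied from the header dump above; whether Googlebot treats the meta version any differently than the HTTP header is an open question):

<meta http-equiv="X-XRDS-Location"
      content="http://www.example.com/ystore/openid/rp.xrds">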

Anyhow, just curious if anyone had experience with it. Since this was something the client's designer definitely did, I'd like to task him with the job of getting it to work differently. But should he not know how else to work it, it would be good to have a solution up my sleeve. (Again, aside from just pulling that feature altogether.)

-------------


Are both the links going to the same page? Image and text links?

Yes, both point to the same page. The site doesn't use any dynamic URLs or variables in the URL string, so in that sense every page on the site is canonical (except for the aforementioned root vs. index.html issue, and even index.html is 301'd). Despite that, we will still be changing the global homepage links to point to / and not /index.html.


Slow things down a little. You need to make sure Google has absorbed all of your sitewide changes before making new ones; otherwise you may do the right thing and then take it apart, thinking it isn't working.

Here's a tip. Add a noarchive meta tag when you make a change.


Well, after some of the tweaks we made immediately post-launch, the site has been largely untouched (perhaps a dozen pages max had content changed, but no sitewide changes, or even code changes on individual pages aside from body text). The advice I'm getting here will largely be implemented in one round of changes, to minimize constantly touching the pages. The good thing there is that we'll basically touch the pages once with a bunch of changes; the bad thing is we lose the granularity of seeing which change causes which effect. Is there a consensus on how to implement changes like this? I'd love to slowly roll them out in a kind of split test, but we need to recover fast, and I don't want to keep touching the page for fear of upsetting the Google monster more than we already have. ;)

That's an interesting tip though about the noarchive. I may implement that prior to making these major upcoming changes.


@pageoneresults You were also right about technical issues being a source (assuming this XRDS stuff is the source of the filtering--and I hope to god it is). It was your comments about load times that led me to test load times myself, which caused me to trip over this XRDS issue.

I think the increase in load time is also a function of having more images per page. Moving forward, optimizing load times will be an area of much scrutiny, but if we can just get the index page back in the mix, hopefully these other problems that have come to light can be dealt with in a more deliberate, less "panic mode" manner.

[edited by: Robert_Charlton at 8:03 pm (utc) on Oct 6, 2010]
[edit reason] broke line of code to fix side-scrolling [/edit]

Jez123




msg:4212250
 11:07 am on Oct 6, 2010 (gmt 0)

@bbarbeau - is the home page back to where it should be now? Looks like it is from where I am looking (UK)

bbarbeau




msg:4212564
 8:25 pm on Oct 6, 2010 (gmt 0)

@bbarbeau - is the home page back to where it should be now? Looks like it is from where I am looking (UK)


No, not yet for the mission-critical keywords, and it's still not first for site: searches.

I'm hoping correcting the OpenID verification issue (the XRDS header item) will help out.

As mentioned throughout the thread, there's really a rich ecosystem of SEO and design missteps which have occurred. But I'm hoping this is a start towards a recovery.



The issues we are addressing now are the header issue; changing the global link to the homepage from "/index.html" to "/"; and a bit of javascript in the source that occurs after the opening html tag and before the opening head tag (sketch below).
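
For reference, a quick sketch of that last issue (hypothetical script name):

<!-- invalid: nothing is allowed between <html> and <head> -->
<html>
<script src="example.js"></script>
<head>
...

<!-- valid: the same script moved inside <head> (or before </body>) -->
<html>
<head>
<script src="example.js"></script>
...
</head>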

The following thread from Google's webmaster forum ( [google.com...] ) exhibits two of those three issues I just mentioned. Their site also appears to be a Yahoo merchant site, so hopefully what helped them will also help us.

Granted, the OP doesn't mention which of the fixes, if either, he implemented, but I think addressing both those issues along with others will help us see some improvement.
