Google SEO News and Discussion Forum

This 63-message thread spans 3 pages; this is page 1.
Has anybody fixed things AND seen results?
Big Daddy supplemental changes and impact
dtcoates
5+ Year Member
Msg#: 3093165 posted 4:34 pm on Sep 22, 2006 (gmt 0)

Hi,

I'm among those "victims" of Google changes since Big Daddy. I've fixed everything that everyone has talked about - canonical issues, meta descriptions, different domains pointing at the same site, robots.txt etc etc etc. I mean EVERYTHING. The words of g1smd et al have become the guiding force in my life - thank you :-)

BUT, I've seen no real improvement, and it is now worse than ever - ten days ago we had 200 pages, today we have 57. That's after several months of fiddling, analysis, heartache and stress. Frankly, I'm absolutely desperate, depressed and thoroughly tired of the whole thing.

My question is simple - given that I am not the only one who has fixed these issues, can ANYBODY report any success? Or are we all just wasting our time?

If no-one has been successful, might we be better off looking at it from another angle? Is duplicate content really the issue here? Is it too soon to see any results? Should I just give up?

I'd appreciate your thoughts...

 

tedster
WebmasterWorld Senior Member, Top Contributor of All Time, 10+ Year Member
Msg#: 3093165 posted 6:16 pm on Sep 22, 2006 (gmt 0)

Yes, I have worked with several webmasters this year on these kinds of issues and seen improvements in their search results. I also know g1smd has had some successes, as you can read in his posts.

Note, however, that the key to measuring success, at least at this particular moment, must be traffic and the actual URLs ranking in the ordinary search results. Something very odd is going on with the site: operator right now [webmasterworld.com].

dtcoates
5+ Year Member
Msg#: 3093165 posted 8:10 pm on Sep 22, 2006 (gmt 0)

Thanks, Tedster.

I know what you mean about the site operator - that's how I'm getting the figures. However, they're backed up by the fact that the pages that are now supplemental in the site: results are also way down in the results.

Tomorrow, if you don't mind, I will do a thorough analysis of spider visits, supplemental pages etc. and post a bit more. As I said, I have tried EVERYTHING, have closed down EVERY loophole in my site's structure as recommended by you guys. Nothing seems to produce sustained results, and it is simply not possible to run a business under these fluctuating conditions. I can't say to my staff, hey, we're doing good in Google, come to work, and then the next week, hey, Google's gone bad again, stay home.

I know that people will say that if you rely totally on Google's free results, your business model is no good. Agreed, and we do not, but if 50% of your traffic came from Google and that is reduced, any owner of a business with fixed overheads will know that that 50% may be the difference between profit and no profit.

Thanks again.

Ma2T
5+ Year Member
Msg#: 3093165 posted 8:43 pm on Sep 22, 2006 (gmt 0)

I would like to say thanks to tedster and the others on this forum for the knowledge and info.

I was a "victim" last month, and I am doing everything I can to bounce back next month! I hope we have some luck.

AustrianOak
10+ Year Member
Msg#: 3093165 posted 8:48 pm on Sep 22, 2006 (gmt 0)

Unfortunately, no. I never had any of the issues people were trying to "correct", so there was no need.

Hard to fix something when it's not broken according to the guidelines.

miki99
5+ Year Member
Msg#: 3093165 posted 8:56 pm on Sep 22, 2006 (gmt 0)

I only realized yesterday that my site had major "canonical issues" -- a term I had never even heard of until a couple of days ago. I've been getting help with this on the New to Web Development Forum, and I have just put a 301 redirect in place so that all my non-www URLs now redirect to their www versions.

Does anyone have any idea how long it may take before Google again recognizes all my pages as belonging to one site? Next month's update?

g1smd
WebmasterWorld Senior Member, Top Contributor of All Time, 10+ Year Member
Msg#: 3093165 posted 9:26 pm on Sep 22, 2006 (gmt 0)

>> I only realized yesterday that my site had major "canonical issues" <<

GoogleGuy started mentioning this stuff about two years ago, but it has taken a long time to see the full implications of not listening to those hints.

Once the 301 redirect from non-www to www is in place, you should see more and more www pages being indexed within just a few weeks, and any URL-only www entries turning into normal results too.

Make sure that you link back to www.domain.com/ from every page of the site, making sure that the link does NOT include the index file filename itself. Make sure, too, that every page of the site has a unique title tag and a unique meta description, and they both accurately describe what is to be found on that particular page.

Once this work is done, any non-www URLs that are already marked as being Supplemental may well hang around in the index for another year, but they will not be harming things. Once the redirect is in place they are no longer classed as being duplicate content.

You may even find that some non-www URLs reappear in the index marked as Supplemental Results a few months after the redirect is put in place. That is normal and not harming things. In fact, where those Supplemental Results still appear in the SERPs they will still deliver visitors to your site via the redirect that you have already installed.

Your measure of success is in how many www pages appear in the normal index, not in counting how many non-www Supplemental URLs remain listed. Google will clean those Supplemental Results up after a year: you have no control over that.

You do need to investigate any www URLs that stay supplemental. If those are for pages that are 404 then you can ignore them. If they are for URLs that return a "200 OK" then that may still indicate a problem: usually duplicate content of some sort. Duplicate content takes many forms; read the other recent threads that discuss it for more details.

Maybe start with: [webmasterworld.com...] ?
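
For anyone who wants to confirm that their new redirect behaves the way described above - a single 301 from the non-www host to the www host - a quick header check is enough. Below is a minimal sketch using only the Python standard library; example.com is a placeholder for your own domain, and it assumes plain HTTP.

```python
# Minimal sketch: confirm the non-www hostname answers with one 301
# pointing at the www hostname. "example.com" is a placeholder.
import http.client

def check_non_www_redirect(domain, path="/"):
    # One raw request to the non-www host; http.client never follows
    # redirects, so we see exactly the first response a crawler sees.
    conn = http.client.HTTPConnection(domain, timeout=10)
    conn.request("GET", path, headers={"User-Agent": "redirect-check"})
    resp = conn.getresponse()
    location = resp.getheader("Location", "")
    conn.close()
    print(f"http://{domain}{path} -> {resp.status} {resp.reason}  Location: {location}")
    # A clean setup answers 301 and points straight at the www host.
    return resp.status == 301 and location.startswith(f"http://www.{domain}")

if __name__ == "__main__":
    if check_non_www_redirect("example.com"):
        print("Looks good: one permanent redirect to the www version.")
    else:
        print("Expected a single 301 to the www host - check the server config.")
```

A 302 here, or a Location header pointing anywhere other than the www host, is worth fixing before waiting for the index to catch up.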

miki99
5+ Year Member
Msg#: 3093165 posted 9:43 pm on Sep 22, 2006 (gmt 0)

Thanks so much, g1smd, for your detailed reply. I've saved your post for ready reference.

"Make sure that you link back to www.domain.com/ from every page of the site, making sure that the link does NOT include the index file filename itself. Make sure, too, that every page of the site has a unique title tag and a unique meta description, and they both accurately describe what is to be found on that particular page."

I think I already have all this in place, except...regarding your first recommendation, does the www.domain.com need the "/" behind it?

g1smd
WebmasterWorld Senior Member, Top Contributor of All Time, 10+ Year Member
Msg#: 3093165 posted 10:00 pm on Sep 22, 2006 (gmt 0)

Yes, it does, otherwise visitors see a redirect from www.domain.com to www.domain.com/ before the content is actually served.

tedster
WebmasterWorld Senior Member, Top Contributor of All Time, 10+ Year Member
Msg#: 3093165 posted 10:06 pm on Sep 22, 2006 (gmt 0)

Also note, if your fixes require widespread changes such as shifts in your url structure, lots of changes to internal links, many 301 redirects -- and especially if you are removing any dodgy techniques whatsoever -- then your "reappearance" may be rather gradual, dampened by some of the same filters that create the infamous sandbox effect for relatively new domains.

As we've noticed, Google no longer offers instant gratification and has apparently become very keen on historical analysis as well as present-time analysis.

fraudcop
10+ Year Member
Msg#: 3093165 posted 10:21 pm on Sep 22, 2006 (gmt 0)

Make sure, too, that every page of the site has a unique title tag and a unique meta description, and they both accurately describe what is to be found on that particular page.

How different does the description tag have to be?

I have 500 descriptions of 13-14 words each, where the only differences are 3-4 keywords placed one at the beginning, one in the centre and one at the end of each description.

Are those considered unique meta descriptions?

Whitey
WebmasterWorld Senior Member, Top Contributor of All Time, 5+ Year Member
Msg#: 3093165 posted 10:28 pm on Sep 22, 2006 (gmt 0)

It's long and drawn out for us.

Let's call it progress rather than "fixed". But no improvement in G traffic at all, i.e. "zilch", "nix", "zero" - OK, maybe 5% of expectancy :)

Time scale from the major "fixes" is around 7-8 weeks to this point.

We have Googlebot crawling, deep pages cached on some DCs, and between 15% and 75% of pages showing depending on the site, plus a sandbox effect on our results.

Results that do show well are in brackets, i.e. "keyword keyword", which match our meta title text, so my feeling is that some folks could be experiencing what we have found, i.e. a filter of some sort.

miki99
5+ Year Member
Msg#: 3093165 posted 10:34 pm on Sep 22, 2006 (gmt 0)

Thanks again, g1smd and tedster. I was actually thinking of redesigning my website over the next little while, so maybe I'll resist uploading the redesigned version until after I see major recovery (if I even get it finished before then).

I've also wondered about just how "unique" document titles and meta descriptions need to be. In my Amazon store I have many pages of similar items, but I do try to vary the meta descriptions by including the descriptive names of a few of the items on each page--such as book titles, authors, etc.

g1smd
WebmasterWorld Senior Member, Top Contributor of All Time, 10+ Year Member
Msg#: 3093165 posted 10:34 pm on Sep 22, 2006 (gmt 0)

If you have chosen www.domain.com as your base domain for all links, but your server's default sitename is set to domain.com, and you are redirecting all non-www URLs to the www version, beware of what happens when you link to a folder and forget to add the trailing / to the URL in that link.

If you forget the trailing / then your link to www.domain.com/folder will first be redirected to domain.com/folder/ {without www!} before arriving at the required www.domain.com/folder/ page.

The intermediate step, at domain.com/folder/, will kill your listings. You have a redirection chain.

Luckily, this effect is very easy to spot if you use Xenu Link Sleuth to check your site: when you generate the sitemap it reports double the number of pages that you actually have, with half of the pages having a title of "301 Moved".

Always include the trailing / on any folder links.
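
Xenu Link Sleuth is one way to spot the chain; if you would rather script the header checks, here is a rough sketch in standard-library Python that follows a URL hop by hop and counts the redirects. The URL is a placeholder, and query strings are ignored for brevity.

```python
# Rough sketch: follow a URL one response at a time and report a
# redirection chain (more than one redirect before the final page).
import http.client
from urllib.parse import urljoin, urlsplit

REDIRECT_CODES = {301, 302, 303, 307, 308}

def trace_redirects(url, max_hops=5):
    """Follow `url` hop by hop and return a list of (url, status) pairs."""
    hops = []
    for _ in range(max_hops):
        parts = urlsplit(url)
        conn_cls = http.client.HTTPSConnection if parts.scheme == "https" else http.client.HTTPConnection
        conn = conn_cls(parts.netloc, timeout=10)
        conn.request("GET", parts.path or "/", headers={"User-Agent": "chain-check"})
        resp = conn.getresponse()
        hops.append((url, resp.status))
        location = resp.getheader("Location")
        conn.close()
        if resp.status not in REDIRECT_CODES or not location:
            break
        url = urljoin(url, location)  # Location headers may be relative
    return hops

if __name__ == "__main__":
    hops = trace_redirects("http://www.example.com/folder")  # note: missing trailing /
    for u, status in hops:
        print(status, u)
    if sum(1 for _, s in hops if s in REDIRECT_CODES) > 1:
        print("Redirection chain detected - fix the link so only one hop is needed.")
```

If it prints more than one 3xx status for a link you control, add the trailing slash (or the www) at the source so only a single hop remains.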

tedster
WebmasterWorld Senior Member, Top Contributor of All Time, 10+ Year Member
Msg#: 3093165 posted 10:35 pm on Sep 22, 2006 (gmt 0)

I have 500 descriptions of 13-14 words each, where the only differences are 3-4 keywords placed one at the beginning, one in the centre and one at the end of each description. Are those considered unique meta descriptions?

That sounds like plenty - as long as those 3-4 keywords you mention are completely unique to each page and not reused elsewhere. I work with several sites with hundreds of thousands of URLs that need to dynamically generate their meta tags, and so far the kind of variation you describe has been enough for them.
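
A quick way to sanity-check dynamically generated tags is to fetch a sample of pages and make sure none of the titles or descriptions have collapsed into identical strings. A minimal sketch, assuming you can supply the list of URLs you care about (the two shown are placeholders):

```python
# Minimal audit sketch: fetch a handful of pages and flag any that share
# an identical <title> or meta description. Standard library only.
from collections import defaultdict
from html.parser import HTMLParser
from urllib.request import urlopen

class TitleMetaParser(HTMLParser):
    """Pull the <title> text and the meta description out of one page."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.description = ""

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.in_title = True
        elif tag == "meta" and attrs.get("name", "").lower() == "description":
            self.description = (attrs.get("content") or "").strip()

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data.strip()

def audit(urls):
    titles, descriptions = defaultdict(list), defaultdict(list)
    for url in urls:
        parser = TitleMetaParser()
        parser.feed(urlopen(url, timeout=10).read().decode("utf-8", "replace"))
        titles[parser.title].append(url)
        descriptions[parser.description].append(url)
    for label, seen in (("title", titles), ("meta description", descriptions)):
        for text, pages in seen.items():
            if len(pages) > 1:
                print(f"Duplicate {label} {text!r} on: {', '.join(pages)}")

if __name__ == "__main__":
    # Placeholder URLs - swap in a sample of your own pages.
    audit(["http://www.example.com/", "http://www.example.com/widgets/"])
```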

g1smd
WebmasterWorld Senior Member, Top Contributor of All Time, 10+ Year Member
Msg#: 3093165 posted 10:40 pm on Sep 22, 2006 (gmt 0)

I was going to link to this other thread a few posts back, too: [webmasterworld.com...]

Whitey
WebmasterWorld Senior Member, Top Contributor of All Time, 5+ Year Member
Msg#: 3093165 posted 10:41 pm on Sep 22, 2006 (gmt 0)

Make sure, too, that every page of the site has a unique title tag and a unique meta description, and they both accurately describe what is to be found on that particular page.

Since this is so critical, it would be great if anyone could estimate what they believe the "borderline" number of common characters in the meta title and meta description would be.

We all understand that totally unique is ideal - but for dynamic sites this may be less achievable.

It would be helpful if we could keep any new guidelines on [webmasterworld.com...] for ease of reference.
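
Google has never published a "borderline" figure, so any number would be a guess. If you want something measurable to compare against your own results, though, a simple character-level similarity ratio at least puts a number on how close two generated titles or descriptions are. A sketch using the standard library's difflib; the example strings are invented:

```python
# Sketch: put a number on how similar two generated meta descriptions are.
# There is no published Google threshold - this only quantifies overlap.
from difflib import SequenceMatcher

def similarity(a, b):
    # 0.0 means nothing in common, 1.0 means identical strings.
    return SequenceMatcher(None, a, b).ratio()

# Invented examples of two templated descriptions differing by one keyword.
desc_one = "Buy blue widgets online - fast delivery on all blue widget orders."
desc_two = "Buy red widgets online - fast delivery on all red widget orders."
print(f"similarity: {similarity(desc_one, desc_two):.2f}")  # roughly 0.9-something
```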

SuddenlySara
Msg#: 3093165 posted 10:43 pm on Sep 22, 2006 (gmt 0)

I don't believe having a domain that redirects to the www version of your site has anything to do with your placement in Google results.

Some sites that I have have great links coming in even though their HTML code is really bad. It's about being popular and connecting your popularity in and out with any SE nowadays.

g1smd
WebmasterWorld Senior Member, Top Contributor of All Time, 10+ Year Member
Msg#: 3093165 posted 10:52 pm on Sep 22, 2006 (gmt 0)

>> I don't believe having a domain that redirects to the www version of your site has anything to do with your placement in Google results. <<

Some sites might get away with it, but I have seen plenty that have not: a mish-mash of fully indexed and URL-only listings, split PageRank, many pages not listed at all, duplicate content issues, and thousands of Supplemental Results.

Within weeks of adding the redirect, all of the www pages were indexed and fully listed with title and snippet; the sites also ranked higher, and pages that had previously been missing or not ranking appeared in the SERPs.

The non-www URLs hung around in the SERPs as Supplemental Results for a very long time. I thought that Google was broken. It turns out that Supplemental Results for redirected URLs hang around for a year before Google cleans them up. It took a long time to realise that this is how it works.

StarryEyed
5+ Year Member
Msg#: 3093165 posted 11:07 pm on Sep 22, 2006 (gmt 0)

Hello dtcoates -

I do want to let you know there is hope! My site disappeared on June 27th. It went from being number 4 or 5 on page 1 (out of 6,310,000) for my main search term to about 755... all my other terms, for which it ranked well, went exactly the same way. For one term I was pushed from the first page to number 957. It was devastating. Many thanks to the posts of g1smd and others - I found things wrong with my site that I didn't even know were possible! I've worked harder on my site since June 27th than EVER before, and I am now back at #5 on page one out of six million, and for other terms I rank much higher than before June 27th. My site "reappeared" on September 15th. Will it last? Who knows - but I hope so.

I am a part-time webmaster and it appears that the posters here are much more advanced than I am. I don't want to bore you with a long post of all the rudimentary things I had to do to "get right" with Google. There are many things that you guys, I'm sure, won't have to do - but if you're interested, just sticky-mail me and I'll send you my laundry list! :-)

g1smd
WebmasterWorld Senior Member, Top Contributor of All Time, 10+ Year Member
Msg#: 3093165 posted 11:09 pm on Sep 22, 2006 (gmt 0)

Oh, do post it!

It is good to discuss such things.

miki99
5+ Year Member
Msg#: 3093165 posted 11:25 pm on Sep 22, 2006 (gmt 0)

Yes, please! :-)

miki99
5+ Year Member
Msg#: 3093165 posted 11:31 pm on Sep 22, 2006 (gmt 0)

Um, another newbie-type question.

"Always include the trailing / on any folder links."

Other than linking to my base domain, why would I link directly to a folder? I only link to files inside folders.

g1smd
WebmasterWorld Senior Member, Top Contributor of All Time, 10+ Year Member
Msg#: 3093165 posted 11:36 pm on Sep 22, 2006 (gmt 0)

The URLs www.domain.com/folder/index.html and www.domain.com/folder/ are duplicate content. You need to avoid that.

Google usually prefers the www.domain.com/folder/ version. That one is also future-proof: you can change the actual index filename to anything you want, and no one will ever know, because you never use the filename itself when you refer to the page.

When you link to an index page do not include the index file filename in the link. Link to http://www.domain.com/folder/ in exactly that format.

See also: [webmasterworld.com...]
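
If you want to audit a page for links that still carry the index filename, something like the sketch below will list them. It assumes your index files use the common names shown; the page URL is a placeholder.

```python
# Sketch: list links on a page whose href ends in an index filename,
# i.e. links that should point at the folder with a trailing / instead.
from html.parser import HTMLParser
from urllib.request import urlopen

# Assumed index filenames - adjust to whatever your server actually uses.
INDEX_NAMES = ("index.html", "index.htm", "index.php", "default.asp")

class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.hrefs.append(href)

page_url = "http://www.example.com/"  # placeholder
collector = LinkCollector()
collector.feed(urlopen(page_url, timeout=10).read().decode("utf-8", "replace"))

for href in collector.hrefs:
    path = href.split("?")[0].split("#")[0]
    if path.lower().endswith(INDEX_NAMES):
        print(f"Link includes the index filename - point it at the folder instead: {href}")
```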

miki99
5+ Year Member
Msg#: 3093165 posted 11:57 pm on Sep 22, 2006 (gmt 0)

I get that last part. I just checked, and all my site's pages actually had a link going to [mysite.com...]

So I'm in the process of changing them all to [mysite.com...]

But, I still don't understand completely about www.domain.com/folder/index.html and www.domain.com/folder/

I feel pretty dense, sorry. I do understand what you're saying about them being duplicate content, but I'm not familiar with having an index.html file inside a directory other than the root directory. I nearly always have more than one html file inside a folder, so if I don't link directly to an html file, how does the browser know which page I want it to go to? (Unless it's the index file in the root directory?)

Am I misunderstanding something?

g1smd
WebmasterWorld Senior Member, Top Contributor of All Time, 10+ Year Member
Msg#: 3093165 posted 12:04 am on Sep 23, 2006 (gmt 0)

If you have a page like page34.html in a folder, you would link to it using www.domain.com/folder/page34.html for example.

If you link to an index page in a folder then you don't need to specify the actual index file filename itself. When you access that folder the server automatically finds and then serves the index page for you.

In that case you only need to link to www.domain.com/folder/

miki99
5+ Year Member
Msg#: 3093165 posted 12:07 am on Sep 23, 2006 (gmt 0)

I get it, thank you. Interesting!

StarryEyed
5+ Year Member
Msg#: 3093165 posted 12:42 am on Sep 23, 2006 (gmt 0)

OK then, g1smd has twisted my arm... hope there aren't too many typos.

Here are some of the things I've done over the past two and a half months. Bear in mind I have no idea which item helped the most, or if my positive results will even last, but maybe something here will help someone:

1. Cleaned up the code in my index page; reduced its size from 59K to 29K with CSS.
2. Validated the page - it's HTML 4.01 compliant. Yes, I know it doesn't matter based on some of the other stuff you see on the web, but I did it anyway. Also did the same for the main entrance pages to my site, and soon every page will be compliant.
3. Got rid of all of my affiliate links, banners etc. except for AdSense, one book seller and one niche-specific program.
4. Went through my reciprocal link directory - yes, I kept my directory - and "culled" all the dead soldiers. It had been six months since I had done so, and a lot of sites went under in six months. I had used an automated link-page-building program whose name I won't mention, and what I did was get rid of all the code that tells a bot that this program was used. I'll do it all by hand now, and my links are all good - I'm keeping 'em!
5. Went through the entire site and made sure every page had a meta description that made it unique. If I didn't think the page was unique enough, I got rid of the page, then went into the Google URL removal tool and requested it be removed from the index. Made the page titles unique as well.
6. In doing #5, I realized that there were some pages I wanted to keep but that a bot would probably always think of as duplicates no matter what I did. Decided, out of pages A, B, C, D and E, to keep A in the index, put noindex, nofollow on the others, and use the URL removal tool on all but A.
7. Looked through the site for any references to www.mysite.com/index.html rather than www.mysite.com/ and, to my shock, found a few. Fixed those.
8. Ran a site check for orphaned files - found MANY. Fixed that by removing pages from the server and using the URL removal tool.
9. I've been using Sitemaps for a while. Checked it very closely. Found www.mysite.com listed as a page as well as www.mysite.com/index.html. YIKES! That must have been there for a year, but now I think it matters! Got rid of the reference to index.html and resubmitted the sitemap.
10. This one really falls under getting rid of affiliate programs, but it kind of deserves its own spot: got rid of the links to a particular web marketer who used to, and I think still does, really push reciprocal linking. When that was "in" I did do it, but it has been almost two years since I asked for a link. I know there was a problem between this marketer and Google last year, and I decided to get rid of any references to this person's site and program. His site was always number 1 with PR6, but it has been nowhere to be found for a while and has PR0 now.
11. At least twice a month I do a search for "mysite" - no www or .com, just my domain. I have found more made-for-AdSense scraper sites that way than by doing anything else. I report them as spam to Google, just 10 or so at a pop. In 5 days they are gone... replaced by new ones! I report only the most blatant scrapers that make me lose my mind, but I do it regularly. Did it help me? Can't say.
12. About 4 weeks ago, after working hard, I submitted my reinclusion request to Google. Explained my situation to them: that I run a site in my spare time, did nothing wrong intentionally and never meant to violate any guidelines, but that after reading posts here at WebmasterWorld I found many things to fix. Outlined for them all I had done. Asked them to please think about re-evaluating my site, as it has a lot to offer the visitor.
13. Noticed, after submitting the reinclusion request, MANY bot requests in my server log for pages I had recently removed and that I KNEW Google thought were duplicates. They were just old pages, no longer used or linked to. The bots even tried other directories for those pages - I guess they thought maybe I had just moved them. Wrong! Totally clean server now.
14. I also checked site:www.mysite.com, looked at the supplementals, and removed the pages or used noindex, nofollow and the URL removal tool if I agreed a page really didn't need to be in the index.
15. Employed a technique I read here to get rid of sites that use a redirect to link to my site. Don't do this without reading the thread here at WebmasterWorld about it and the trouble it can cause if done incorrectly. If I discovered a site using a redirect to link to me, I did a header check on it to make sure it was a redirect, then put noindex, nofollow in my index page FOR ONLY A MINUTE and immediately submitted the offending redirect URL to Google as spam; it redirected to my index page, which at that moment said noindex, nofollow. After submitting that request and getting the "your request will be processed" message, I immediately uploaded my index page with index, follow again. Use extreme caution with this technique.
16. Added a few new pages of content - not tons of pages, a few - and I'll add a page or two each month. Nothing too extreme.
17. Made sure that internal links to My_Page.html were all the same - not some pages linking to my_page.html and others to My_Page.html. I noticed that My_Page.html had a PR4 while my_page.html had PR0... hmmmmm. (A quick way to spot this kind of mismatch is sketched below.)

Sorry this is so long and rambling. I hope something here helps. There may be nothing new here, but I think doing ALL of these things made more of a difference to Google than any one thing - just my opinion. Good luck :-)
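
On item 17, the case mismatch is easy to miss by eye. Here is a rough sketch that takes a list of internal link paths - however you collect them: a crawl, a log file, a link-checker export - and groups paths that differ only by letter case; the sample list is made up.

```python
# Rough sketch for item 17: group internal link paths that differ only by
# letter case, so PR isn't split between My_Page.html and my_page.html.
from collections import defaultdict

def find_case_mismatches(paths):
    """Group paths that are identical apart from letter case."""
    by_lower = defaultdict(set)
    for path in paths:
        by_lower[path.lower()].add(path)
    return {key: variants for key, variants in by_lower.items() if len(variants) > 1}

# Made-up sample - in practice feed in every internal link path you can collect.
sample_paths = [
    "/widgets/My_Page.html",
    "/widgets/my_page.html",
    "/about.html",
]

for key, variants in find_case_mismatches(sample_paths).items():
    print(f"Pick one casing for {key}: found {sorted(variants)}")
```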

gehrlekrona
5+ Year Member
Msg#: 3093165 posted 12:59 am on Sep 23, 2006 (gmt 0)

After having been shot down on June 27th, I too made a lot of changes.
In August my site came back, so I thought that what I had done was right, and it was all good until September 15th; now I am at a loss. Was all the work I did in vain? Didn't it mean anything to Google? Was what I did totally the opposite of what I should have done?
Well, I guess with Google you'll never know. What is the truth one day is a total lie the next. I have decided to keep doing what I have always done, and eventually Google will catch up - or not!
Thank God for the other search engines, since they are picking up some of the slack from Google, and I am hoping that they get even better at it!

StarryEyed
5+ Year Member
Msg#: 3093165 posted 1:12 am on Sep 23, 2006 (gmt 0)

Yes - I put in a lot of work too and I wonder how long my site's reappearance will last... I feel now that being ranked well is not permanent or to be relied upon.
