|TheBear looked at my site and raised the possibility that I'm tripping a duplicate content filter, which I was not doing previously. |
I noticed that too when I took a look at your site...and mentioned this as a possible problem (as in on-page duplication). I only looked at your main page and not the others, but your headers were all too much the same as far as that page is concerned.
Think in terms of word clusters for each of your headers. I stickied you a website that would give you some ideas.
Clean up your code and rethink your main topics (headers) and organize them better (looking at your content, it should not be an h1 followed by all h2's). Like I said, I went no further than the page you referred me to, but if you are re-hashing the same content on the other pages as you are on the main one...that could trip a dup. filter just as well.
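As a sketch of the heading structure being suggested: one h1 for the page topic, h2 for each major section, and h3 only where a section genuinely has subtopics (the topic names here are placeholders):

```html
<h1>Spinning Widget Plans</h1>

<h2>Beginner Plans</h2>
<h3>Free Plans</h3>
<h3>Paid Plans</h3>

<h2>Advanced Plans</h2>
```

The point is that the heading outline should mirror the actual content hierarchy, not jump from one h1 straight into a flat run of near-identical h2's.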
Validate your code and do whatever theBear suggests. He has the knowledge you are looking for. Then you should make up your lost ground.
Cleaning up coding errors is one thing, but it's a bit premature to think about template changes.
Remember, GG says the update isn't over yet. Also, didn't you previously mention that some changes you made to try to appease G knocked you from page 1 to page 2 on Yahoo? Better to keep and improve the Yahoo traffic, in my opinion, as it's much clearer what Yahoo likes.
japanese - Nice to see more people know about the Greek myth. I think I will go to earth as a wolf soon if nothing changes.
About Alexa, I did see some change but was not sure what was going on. It's nice to see that some engine is working.
302s and scrapers go hand in hand; it's sad to see that Google still sponsors those sites.
I hope things will be ok again, and as said before, they also have to update their supplemental DB soon. I think it's every 2-3 months they update that one.
Meh.. If there is activity going on with part of this update, it looks like it's just getting worse for me. I had some datacenters at least having me at position 352, but now they are all between 543 and 817.
<beating dead horse>
'How to best modify my pages to "untrip" the filter?'
Do what GoogleGuy suggested. It was your obvious issue from the beginning, but you still haven't done anything about it. It may not be your "fault" that you aren't technical enough to really understand the issues, but lots of sites have been hit for more than a year, and the solution is very simple. You just have to do it. Google sometimes, but not always, has problems with sloppy webmastering, so webmasters need to help Google along as best we can, if we want to rank well that is.
</dead horse, carry on not addressing the problem>
"something different I am noticing..."
I've been following this since Allegra, where one of my sites went into oblivion. The Bourbon update only seems to accentuate people's anxiety over Google.
The key thing here is to realise just how big the web is. I have lost out over Allegra, and still fear the rest of Bourbon, but it shouldn't blind everyone to the fact that the amount of 'information' out there right now is huge.
I had a hard lesson after Allegra. After suffering from a different 'binary' as GG likes to put it, I talked to someone in my niche and they were surprisingly forthcoming. They changed my perception of how the web actually works. In essence, for those that believe that Google is 85% of the Internet, you are wrong. Google may be the starting point for some people, but that has nothing to do with how people find your site, enjoy your site, come back to your site and recommend your site.
My sites are all in a competitive area - if the Bourbon update is going to kill me in the next week, so be it... that is my problem, not Google's fault. I would probably stand to lose my livelihood if that happened. I have come to realise that the only way forward is to be positive, form alliances/networks, and perhaps run your business slightly more like an offline one rather than relying on your money keywords.
Got tmp files on your server? I think I may have discovered something important - [webmasterworld.com...] , msg 62
Yes, I like the Greek Myths.
That's why I was an isolated Google fan: ranking in Google is also a myth, enshrouded in a veil of secrecy, operated by an unaccountable, clandestine company staffed with sworn-to-secrecy covert employees who act surreptitiously, with a cloak-and-dagger attitude, armed with stealth algorithms against overt webmasters.
steveb, I'm no dead horse, I don't require a beating. I've been slaving away replacing the repeated tidbit that is the copyright notice on my pages with a .jpg equivalent. I have been hard at work varying the remaining content as fast as I can since the beginning, so you can't say that I "haven't done anything about it!" I am taking all suggestions, and implementing them as fast as I can type.
Can you please tell me which particular GG suggestion you are referring to? You seem to be 100% sure of the reason why my site dropped from #1 to #150, and not coming up for its company name, so I'd be extremely grateful if you were more explicit in your recommendations! Thanks ahead.
My thoughts exactly. I have been taking opinions into account and fixing as much as I can.
If you are so certain what it is we need to do, by all means, share.
Is everyone being cached daily by the way?
Nicely put and of course, you are absolutely correct that one should not rely on google for a living.
This is exactly the point many webmasters are making. To hell with portentous Google and its constantly tweaked algos that are now the bane of many webmasters. What right has Google to monopolise the internet as it wishes?
Let's say they purchased the water purification companies. They can sure afford it. And Google decided you no longer deserved its water. Would you then still be as acquiescent to their domineering and despotic attitude?
Hell, if there were one other single proper competitor, Bourbon would never have come about, let alone the other sensationally named updates. Webmasters need to wake up and fight Google at its own game. Google was helped by webmasters to get where it is. Now webmasters are paying the price of Google's success. Don't bury your head in the sand or sound like a defeatist. Google can be brought to its knees. It is not the God of the internet. It never pioneered anything. All it knows are improvements on previous pioneering search engines.
Atticus and the others could have a good point about Google conversion vs Yahoo. After those comments last night, I checked my Adsense report today. After being dumped from Google and losing about 40% of our total traffic, while still going strong on Y and MSN, the last two days (yesterday and today) set new records for CPM on both pages and individual ads! Of course, with traffic at a low as well, we're not yet breaking even.
Wait until word gets out that GOOGLE TRAFFIC doesn't convert! All those adwords snobs who were previously clicking on the "no-network traffic" box in favor of only google search results will be switching to "network only" and our CPMs will ALL soar as they jockey for all those sites which were dumped by G. :-)
[Okay back from powertrip daydream, now]
we are probably being watched.
from the merchant side to all ya'll adsense people...the epiphany has finally dawned on your side of the equation.
market research...study the google demographic
put 2 and 2 together as they say.
Message #7 [webmasterworld.com...]
(Also helleborine, I don't see how people get to your main page from your interior pages...)
yes...they (google) want you to do well on other search engines with adsense. a backdoor to demographics that convert.
give them the credit they are due. it is rocket science.
quit worrying about bourbon. take googleguy's thinly veiled advice...make you site friendly to all SE's
Are you suggesting that I should change my relative links to absolute links, and that alone would restore my former rank? The 301 "problem" I cannot fix because this site isn't on its own domain name, and the hosting company is totally unresponsive to my requests.
No, you can't get back to the index page from the internal pages.
Before Bourbon, you could. My index was a combination of the current index (which is mostly text about spinning widget plans) and links to the 450 widget plans on the site.
Fearing that Google might find that a front page with 450 internal links looks scraper-like, I changed what should be a good front page for my visitors into a second-tier navigation page.
I'm aware it's stupid navigation, and I'll have to find a more elegant solution after I finish making these 450 pages look more different from one another.
|Message #7 [webmasterworld.com...] |
(Also helleborine, I don't see how people get to your main page from your interior pages...)
i concur with steve...look back at my first sticky to you...pointed that out.
you are getting advice from a lot of people that is worth a fortune...you've got hours of work ahead of you. look at it this way: what have you got to lose at this point?
|Are you suggesting that I should change my relative links to absolute links, and that alone would restore my former rank? |
Yes....change your relative urls to absolutes
No...that alone will not restore your former rank
it will get you going in that direction though
validate your code...get rid of stuff that negates your headers...use proper syntax and eliminate code that confuses the bot.
eliminate dup. content.
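To see why relative links plus an unresolved www/non-www issue multiply into duplicate URLs, here is a small Python sketch (the URLs are placeholders) showing how a crawler resolves the same relative href against whichever version of a page it fetched:

```python
from urllib.parse import urljoin

# The same relative link resolves to two different absolute URLs
# depending on which hostname the crawler arrived on.
with_www = urljoin("http://www.example.com/widgets/index.html", "plans.html")
without_www = urljoin("http://example.com/widgets/index.html", "plans.html")

print(with_www)     # http://www.example.com/widgets/plans.html
print(without_www)  # http://example.com/widgets/plans.html
```

Absolute links sidestep this, because the href stays the same no matter which hostname the page was fetched under.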
apologies for interrupting ... i am lost somewhere around page 60 of this thread. here are some observations of mine regarding this update.
first of all, here are some details for my site
- 10 months old
- 300+ pages with 150+ backlinks indexed
- affiliate sites with G adsense
- G adsense not in homepage
- simple html code
- with css
- clean design with less than 50 links per page
- less than 20k per page throughout the site
- uses .htaccess to redirect non-www to www since the site was built
- the site is in a moderately competitive area with 1.5 to 4 mil results for money phrases.
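For reference, the non-www to www redirect described above is typically a few lines of mod_rewrite in .htaccess; a minimal sketch, assuming Apache with mod_rewrite enabled and a placeholder domain:

```apache
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

The R=301 flag matters: a permanent redirect is what tells crawlers which hostname is the canonical one.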
effects from Bourbon:
massive swings from nowhere to top 5 positions in the SERPs over the last week; it is now settling down in the very top positions for 5-10 of my targeted 3-word keyphrases. yes, this means my site is doing much better: Google referrals jumped from 250/month to 4500/month (estimation: i am getting around 150/day)
yup, i know that's small traffic compared to the big players around here. i am not making much income from all this, but i am really interested in how things go after this update.
Here's what i think is why my site is running high. these are my personal opinions, and i dearly wish you guys would disagree with me and storm my brain with new ideas about ranking in the big G.
- reciprocal links are not dead! it's just that linkage is playing a big part here.
- PR is not dead either! old skool rules: links from high PR = better ranks.
- whois information and site ip are highly related to your ranking! two sites with the same (or around the same?) whois details can't rank high for the same keywords (in fact, i believe it's the whole industry)
- simple is good. static html just does the best ....if you are just a business man in the internet world. G doesn't seem to like sites with massive coding.
- content content content - i still think content is king, and incoming links are queen.
as i am looking at the G SERPs in my field, i must say that G is doing much better than the other two big players. scraper sites and crap sites with old domains are now gone. Sites (well, not all of course - but 90% of them) ranking above me are sites that i respect. i get what i want from G when i am acting as a user ..searching for my engineering technical info.
that's my 2 cents ...thanks for listening.
Just to put a few minds at ease about internal duplicate content.
If google is penalizing internal pages that are up to 95% similar, then google is welcome to tank 3 sites I run that are at the top for their competitive keywords. It's the way I want surfers to see them, and they were not designed for search engines. In fact they are all indexed, all is well, and what's more, they rank independently.
I have almost 80 near-identical pages and I guarantee they work in your favour; duplicate content is not a problem. On another site I have nearly 200 identical pages and the only difference between them is the JPEGs. Even their titles are identical because I could not be bothered, NEW_PAGE_01 etc etc. no description, nothing; to a bot they are identical. And the site is number 1 for its keyword in google, yahoo and many others.
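For anyone who wants to put a number on "near identical", one rough way to score two pages' text is a sequence-matcher ratio. A minimal Python sketch (the sample strings are made up):

```python
import difflib

def similarity(a: str, b: str) -> float:
    """Rough near-duplicate score between two pages' text, 0.0 to 1.0."""
    return difflib.SequenceMatcher(None, a, b).ratio()

page_a = "Spinning widget plan number one. Free instructions inside."
page_b = "Spinning widget plan number two. Free instructions inside."
print(similarity(page_a, page_b))  # well above 0.9
```

Note that a real comparison should strip shared boilerplate (navigation, headers, footers) first, or every page on a site will score as a near duplicate of every other.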
GoogleGuy's recommendation is only his point of view. There are big advantages in relative linking.
GoogleGuy also mentioned a 301, contradicting reports from many webmasters who now understand it to be a deadly procedure. Doing a redirect during volatile updates, when your site has tanked, is dangerous. He neglected to explain whether google will attribute existing pagerank to the resolving page. It is getting worse by the day.
Where does any documentation exist from google, yahoo or msn that stipulates how they attribute link popularity or pagerank from the 301 to the resolving page? I would say that it is not calculated but lost in google, and you could expose your site to being considered unstable by google compared to a similar site that has not done a redirect.
It is all about penalties with google. Zero is the highest and ultimate point. 500+ secret parameters.
I put up a detailed post regarding the 301 which is infinitely more informative and better advice on how and when to do it. He mentions nothing about a missing trailing slash and nothing about what will happen to inbound links pointing to the 301. Will they migrate to the resolving page, or do you lose a few hundred links?
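On the trailing-slash and www points: the fix is to pick one canonical form of each URL and redirect everything else to it. Here is a minimal Python sketch of the kind of normalization a 301 rule would implement (the function name and rules are illustrative, not any search engine's actual behavior):

```python
from urllib.parse import urlsplit, urlunsplit

def canonicalize(url: str) -> str:
    """Force the www hostname and add a missing trailing slash on
    directory-style paths (paths whose last segment has no dot)."""
    scheme, netloc, path, query, fragment = urlsplit(url)
    if not netloc.startswith("www."):
        netloc = "www." + netloc
    last_segment = path.rsplit("/", 1)[-1]
    if last_segment and "." not in last_segment:
        path += "/"
    if not path:
        path = "/"
    return urlunsplit((scheme, netloc, path, query, fragment))

print(canonicalize("http://example.com/widgets"))  # http://www.example.com/widgets/
```

The point of doing this server-side with a 301 is that every variant collapses to one URL, so inbound links and pagerank have a single target instead of being split across duplicates.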
|Yes....change your relative urls to absolutes |
GoogleGuy suggested that people might want to do that for new sites (though not necessarily for existing sites).
I agree with Japanese that there are good reasons for using relative links--especially on existing sites, where changing the links from relative to absolute might be a huge, time-consuming project that could result in errors and problems for both users and search crawlers.
helleborine, if you can't do anything about the www, then you definitely need to use absolute links. Don't make Google try and figure stuff out. It isn't very good at it. Tell Google exactly what the URL is.
In a year of reading these threads, not once has anyone posted of problems after doing all the things GG suggested. Relative links are the root cause of almost every problem posted here. A 301 and other things might mitigate the problem, but GG's post is as clear as can be. Do those things and you will make headaches much less likely.
"GoogleGuy suggested that people might want to do that for new sites"
No, not true, and efv, your problems were because of your relative links. Clearly changing those would be a massive task now, and the 301 mostly deals with it, but the basic problem is the relative links themselves.
This is probably a very stupid question, but can an IP change impact Google? No laughing too hard at my dumb questions please. :)
> This is probably a very stupid question, but can an IP change impact Google
Not likely. One could say so if Google had that IP blacklisted from a previous site, but I doubt Google blacklists IPs; they oughta know that IPs are changed commonly.
I use relative links, but I can't see how that can be my problem. Googlebot indexes every page on my site pretty much daily. (And it's not trying to index non-existent pages, either, except where people have broken links to me.)
I have a dedicated IP address and I've canonicalised my site URL (non-www) since I got the current domain over five years ago, so that's not my problem.
My navigation structure is pretty straightforward - the only thing I'm a bit unhappy about is the extra profile given to authors and titles starting with A (I've contemplated making the Title and Author menu links go to a random letter instead).
There is some duplication of my content out there -- Usenet mirrors, most notably -- but it's scattered and has never been a problem till now. Except for the pages with ODP entries, which have links from all the ODP clones, I can't see that many scraper links. Real incoming links seem to outnumber the junk scraper/spammer ones, anyway.
My current best explanation is "ranking too well"/"over-optimisation". I've never done any optimisation (density shmensity, I say) but I generate both H1s and TITLEs from the authors and titles of books, and use those in internal links as well.
Anyone got any better ideas?
|not one time has anyone posted of problems if they do all the things GG suggested |
This, like the idea that only spammers suffer drastic penalties from Google algorithm changes, is just plain wrong. Apart from my own site, I've seen more than enough other well-built, high-quality sites suffer penalties - including ones from people who don't post here.
'Reality' television programming ain't got nuthin' on a Google update thread, especially this one. :)
|where changing the links from relative to absolute might be a huge, time-consuming project |
Does not have to be if you use find and replace across all pages. FrontPage has this feature if you are using it. I have even found another software program that can find and replace multiple lines of code at one time. Quite handy when trying to change your layout on hundreds of pages.
You can also write perl scripts to search and change a series of .html documents!
Dreamweaver can find and replace also.