Jagger is winding down and life must go on. If Jagger has been kind to your site, congrats. But for the rest of our fellow members who lost rankings or whose sites dropped out of the index, it's time to do some thinking and decide what to improve or change on your affected websites. Ethical measures are still what interest me most.
Some food for thought.
My site was hit by Allegra (2-3 Feb 2005) and lost 75% of its Google referrals, then was hit a second time on 22 July 2005, ending up with only 5-10% of pre-Allegra Google referrals.
My site is now back to around 50% of pre-Allegra Google referrals and growing... until further notice. I say "until further notice" because who knows what the next update or "everflux" will do to my site!
Before my site came back around 19-22 Sept 2005 (very slowly at the beginning), I went through my site several times over several months and did the following:
- removed duplicate pages. In my case these were several test pages (some dating back to 1997) which I had simply forgotten on the server.
- removed one or two 100% frame pages.
- removed some pre-sell affiliate program pages with content provided entirely by affiliate program vendors.
- removed a few (affiliate referral) outbound links which were on the menu bar of all pages (so maybe we are talking about sitewide linking).
- on resource pages, reduced the outbound links to fewer than 100.
- made a 301 redirect non-www to www (thanks to my good Norwich friend Dayo-UK).
- finally filed a reinclusion request in accordance with the guidelines posted on Matt's blog (thanks Mr. Inigo).
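The non-www to www 301 redirect in the list above can be sketched as an Apache .htaccess rule. This is a minimal sketch, not the poster's actual configuration: it assumes mod_rewrite is enabled and uses example.com as a placeholder for your own domain.

```apache
RewriteEngine On
# If the request came in without the www prefix...
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
# ...send a permanent (301) redirect to the www version of the same URL
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

The 301 (permanent) status matters here: it tells search engines that the non-www URLs have moved for good, so only the www versions should be indexed.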
Would you be so kind as to tell us how the Jagger update affected your site, and what you intend to do about it?
>>Thank you Reseller but you did not answer the question.
An internal page may be linked to by 30 other internal pages because it is important or contains basic information. Such a page may be a comment page for a news service.
One may want that page indexed by the search engine. But the search engine may decide that 30 links to it is excessive linking.<<
Shall do my best to answer your question this time :-)
Most pages of the site I talked about in my first post of this thread have a side menu containing 80+ internal links, without the site being penalized at present.
However, because the side menu has grown very long, I intend to group theme pages under "theme entry pages", each carrying a short description of and links to its theme pages. Then I'll add a link to each "theme entry page" on the side menu instead of links to all the theme pages. This way the number of internal links on the side menu will be much lower than now, and the side menu won't be as long as it is now.
I hope the above answers your question :-)
Just because people discuss something excitedly and at great length does not make it valid.
Incidentally, my take on the original question of the thread ("Your site dropped? Lost rankings? What to do now?") is to look at sites that are doing well and then look at your site(s) with an unbiased eye, rather than spending time with people who are also suffering from being affected (and often blaming their misfortune on external "errors").
On the internal linking (dup. content issue):
I have one site that has a link to the "fitting guides" on each product page. So many pages are "landing pages" that I feel the links need to be on them, or "many" customers (I take a lot of phone orders) obviously couldn't locate a BIG RED DOT if it were in the center of each page, and have to be talked through the process of looking up their product in the fitting guide and then selecting it. This site hasn't budged (except during "Florida"), and then it popped back to the top again, where it remains.
My site that got mowed down this time is another niche but:
I still use internal links on product pages and section pages to steer the customer to one of three informational pages that describe the ingredients of the product. Each page has a link to the ingredient page that it matches, and some link to two or three info pages. These pop into new windows and don't take you off the item page. I keep my sites "3 layers" deep: index page > section page > item & order button.
The only difference between the site that dropped and the one that didn't is that the sections have links to ingredient pages, which makes for about 20 extra internal links that are not necessary, as they will be on the item page if the customer clicks through. They should probably go, right?
Also, my side navigation is bloated with duplication (keywords), but so is that of the sites that weren't moved. I have "brand X widgets", "brand Y widgets", "brand Z widgets", and they are listed as such in the side nav bar. Should I put all the widgets in one section from the nav bar, then the "X, Y, Z" sub-sections? This will make all "widgets" 4 layers deep to the order button, but it will sure clean up a lot of duplication of keywords in my navigation bar, which runs globally through the site. Should I do that?
Apologies for the big post. I appreciate your help!
I know of a site with 150 pages that has always had exactly this problem (for the last 18 months or so). Within 4 days of changing all of the titles and meta descriptions to be different on every page, about 30 pages were listed, and after less than two weeks it is already up to about 60 pages listed before the "Repeat this search with similar pages included" message appears...
There have been opposing opinions about this on many threads for two months now.
I think we need to look at the individual circumstances of each site.
Here's a simple example: take a site with a 2000-page glossary. There's a left nav bar with 50 internal links on every page. Each page consists of a 20-word, one-sentence definition. The definition may be literally "drowning" in the 200+ duplicated keywords in the header, nav bar and footer of each page. This scenario may certainly trip a dup filter, which might be avoided if you used breadcrumb-style navigation and removed the left nav bar.
On the other hand, a site with deep content (200-400 word pages) might easily "tolerate" an extensive left nav bar.
So we cannot conclude that a left nav bar on the face of it is causing a problem. The context must be in play too.
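The breadcrumb-style navigation suggested above might look something like this on a glossary page (a hypothetical sketch; page names and paths are invented for illustration):

```html
<!-- Hypothetical breadcrumb trail replacing a 50-link left nav bar:
     only the short path back to the root is repeated on each page -->
<p class="breadcrumb">
  <a href="/">Home</a> &gt;
  <a href="/glossary/">Glossary</a> &gt;
  <a href="/glossary/a/">A</a> &gt;
  Anode
</p>
```

Each page then carries only a handful of repeated navigation words instead of 200+, so the unique definition text makes up a far larger share of the page.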
Let's not get carried away. Out of the 20 things one "fixes" when hit by Google, chances are that just one was the problem. I don't think internal linking is an issue. Virtually every site has a navigation bar, many with 20-30 links on the left side.
Look back at previous threads that discuss "302 redirect URL hijack" as well as the article by Claus - all very easy to find via Google.
I'm also changing all links and image URLs to absolute rather than relative links (a big chore) for the same reason. Again, I'm a bit blurry on the benefit here.
Does anyone know how to prevent this? Not only does it pose a duplicate penalty threat, it deprives me of ad revenue, since many people see the text only, printer friendly version of my work.
If I can get to your page as www.domain.com/the-page.html as well as domain.com/the-page.html as well as via a 302 redirect from competitor.com/redirect.php?your-site then which URL should be indexed?
If the page contains a <base href="http://www.real-name.com/real-page.html"> tag, then that must be what the page is really called; and if using that URL to access the page actually does work, then that is what you think it should really be called, and what most search engines would believe it really is called.
>> I am very concerned because Google is indexing the printable pages of my articles as well as the main content pages. <<
Get the script to test the URL of the page that is being served and to add that meta tag to all pages that you do not want to be indexed: <meta name="robots" content="noindex">
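The "test the URL and add the meta tag" idea above can be sketched in Python. This is a hypothetical helper, not the poster's actual script: the function name, and the assumption that printer-friendly duplicates live under a "/print/" path or end in "-printable.html", are invented for illustration.

```python
# Hypothetical sketch: decide whether a served page should carry a noindex
# meta tag, based on its URL path. Adjust the patterns to match however
# your own site marks its printer-friendly duplicate pages.

NOINDEX_META = '<meta name="robots" content="noindex">'

def robots_meta_for(path: str) -> str:
    """Return a noindex meta tag for printable duplicates, else an empty string."""
    if "/print/" in path or path.endswith("-printable.html"):
        return NOINDEX_META
    return ""

# The page template would then emit robots_meta_for(request_path)
# inside <head>, so only the printable copies ask to be left out
# of the index while the main article pages stay indexable.
```

The main content pages get no tag at all, so they remain indexable; only the duplicates opt out.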
And I feel that it's actually a penalty being applied - a sort of damping effect on the domain. For one site, I put the same content on another domain and immediately got better results.. until Google worked out that it was duplicate content and dropped out the newer one in favour of the older one based on PageRank.
Here's a simple experiment - type in your domain name. You should always be number one for this.. unless you are under a penalty.
At some point in Jagger 1 I noticed 3 of my keywords were gone, completely out of the index. If I typed in the keyword followed by my domain, I just got a whole load of scraper sites with my old content.
Then after Jagger 3 was rolled out, all my big keywords were taken off, with the PR of all those pages reduced to 0.
My non money keywords were left in place and even gained a higher PR on some pages.
Now I'm working more towards using PPC and other methods to get traffic and money back in. My income is now close to 0 from my site.
This was due to happen at some point, I felt; it was all going too well.
Also, sites pointing to my site are doing better than my actual site.
I do not know if you mean what I have noticed with a "lost" site: when searching for just the URL, other sites containing the URL of my site are indexed before my site.
P.S. Four (4) things to check could be:
1) Duplicate content (on page and off page)
2) Keyword stuffing
3) www non www issues
4) same owner of sites within same theme
Alas this is not true.
My main site has two ODP links, and since September 22 it has nearly disappeared from Google. It seems to be a dampening effect.
The reasons I can find are:
- too much of the same anchor text from the navigation to the main pages. Pages that survived have very little text on them and modest interlinking.
- a dup content penalty is possible. I found the same HTML advertisements at the end of many pages (the same sentences with only some words changed).
- This site has two parts: one educational, one more travel-themed. The educational part is old but is still getting fresh links. Google now seems to think that the site is educational, since those pages are not doing that badly in the SERPs.
- Links from two of my other sites (all same theme only different niches).
I am busy changing points 1 and 2 slowly (sandbox fear). Point 3, I don't know what to do about yet.
The links between my sites are necessary. Otherwise I would have to put the same content on one site (which I don't like).
I have a lot of original content, including several requests each month to publish my content in different magazines (small magazines) or textbooks (usually college-oriented). And I do get one-way links from a lot of these sources. However, the links almost never have PageRank, and they never seem to make any difference in our rankings. Now that we've been hit so hard by the last two updates, it seems the harder we try, the further we fall. Therefore I'm tempted to do nothing but keep collecting quality links and wait.
So are we best just to put a few words in the meta and alt tags?
No. Use them as they are meant to be used, and according to the Google guidelines: "Make sure that your TITLE and ALT tags are descriptive and accurate."
The rule is not to stuff them with your keywords!
1) Duplication in titles dilutes the importance of all the keywords used, so avoid using your site slogan or similar repetition. Focus clearly and concisely on the content of that particular page.
Well, if your site is about company X and its products, you must have the word X on all of your pages. This is repetition, yes, but I do not think it should or will harm your rankings. Themed pages rank well, and to create a theme you must repeat your main keywords.
3) Avoid too much cross-linking between "related pages." I'm pretty sure I tripped a filter here, and it works a heck of a lot better if I just point to the appropriate section index instead of cross-linking multiple related pages.
Don't you think this would be quite strange, when the basic idea of the web is to link to related pages and sites? This helps visitors and is user-friendly, and that is something Google supports.
Basically, I have been top 5 for 2 years for a keyword with 10m results. I also ranked heavily across 1000s of KWs for related results.
The day the update happened, I dropped to 170ish. My homepage, which had ranked so well, had been replaced in the SERPs by my highest-PR sub-page.
I also noticed my PR dropped from 5 to 4.
Over the two months before the Jagger update, two things had happened. I stopped paying for a link on a competitor site (PR6 and number 1 for the keyword for which I used to list top 5).
I had added 3 extra paragraphs of *slightly* keyworded text to the homepage.
My traffic dropped by about half. I was saved by Yahoo and MSN.
Here's what I did: I deleted the paragraphs I wrote.
Here's what happened: I am now on page three of the listings with my home page PR4.
Here's what I think happened: the loss of the high, on-topic, one-way link caused a drop in PR, and AT THE SAME TIME I somehow tripped a filter by adding the extra paragraphs.
I believe the de-optimisation restored my homepage, and that I am *rightfully* on page 3 according to my new PR / linking factors.
Next I intend to build more links again to increase inbounds to the homepage. I am also considering clearing out some outbound links - but I might leave that for a while.
One thing that pleased me was how I could still make money on the site without G; it was an interesting experience and one I intend to learn from.
I have now made that affiliate site noindex via robots.txt and waiting...
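A robots.txt block like the one described above could look like the following (a hypothetical sketch; "/affiliate/" stands in for whatever directory the affiliate pages actually live in). Strictly speaking, a standard robots.txt Disallow only stops crawling rather than carrying a "noindex" directive, but blocked pages generally drop out of the results over time.

```
User-agent: *
Disallow: /affiliate/
```

Note this applies to every crawler that honours robots.txt, not just Googlebot.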
For myself, I know there were several factors at play, including the one just mentioned... adding dynamically driven affiliate pages... something I won't do again in a hurry.
I deleted the whole thing, and used the Google Url Remove tool to delete the entire directory that it was in. But a "noindex" directive makes more sense. Will you let us know how that works please?
The affiliate site (the whole directory) is not in the index anymore, but my site is still not ranking. I guess I triggered some other filters too and have to make more adjustments to the site.
Maybe my site has been "re-sandboxed". I do not know if this can happen, but if there is something like it, I could be there (with some of you :-).