>>Thank you Reseller but you did not answer the question.
An internal page may be linked by 30 other internal pages because it is important or contains basic information. Such a page may be a comment page for a news service.
One may want that page indexed by the search engine. But the search engine may say that 30 links to it is excessive linking.<<
Shall do my best to answer your question this time :-)
Most pages of the site I talked about in my first post of this thread have a side-menu containing 80+ internal links, without the site being penalized at present.
However, because the side-menu has grown very long, I intend to group theme-pages on "theme-entry-pages" carrying a short description of, and links to, the theme-pages, then add a link to each "theme-entry-page" on the side-menu instead of links to all the theme-pages. This way the number of internal links on the side-menu will be much smaller, and the side-menu won't be as long as it is now.
I hope the above answers your question :-)
If you would like me to make the effort to put my tuppence in on your question (which I'm sure is a matter of complete indifference to you), perhaps you could be so polite as to make the effort to substantiate why you believe that there may be something like this happening.
Just because people discuss something excitedly and at great length does not make it valid.
Incidentally, my take on the original question of the thread ("Your site dropped? Lost rankings? What to do now?") is to look at sites that are doing well and then look at your site(s) with an unbiased eye, rather than spending time with people who are also suffering from being affected (and often blaming their misfortune on external "errors").
reseller - others,
On the internal linking (dup. content issue):
I have one site that has a link to the "fitting guides" on each product page. So many pages are "landing pages" that I feel the links need to be on them; otherwise "many" customers (I take a lot of phone orders) obviously couldn't locate a BIG RED DOT even if it were in the center of each page, and have to be talked through the process of looking up their product in the fitting guide and then selecting it. This site hasn't budged (except during "Florida"), and then it popped back to the top again, where it remains.
My site that got mowed down this time is another niche but:
I still use internal links on product pages and section pages to steer the customer to one of three informational pages that describe the ingredients of the product. Each page has a link to the ingredient page that it matches, and some link to two or three info pages. These pop into new windows and don't take you off the item page. I keep my sites "3 layers": index page>section page>item & order button.
The only difference between the site that dropped and the one that didn't is that the sections have links to ingredient pages, which makes for about an extra 20 internal links that are not necessary, as they will be on the item page if the customer clicks through. They should probably go, right?
Also, my side navigation is bloated with duplication (keywords), but so are the sites that weren't moved. I have "brand X widgets", "brand Y widgets", "brand Z widgets", and they are listed as such in the side nav bar. Should I put all the widgets in one section off the nav bar, then the "X, Y, Z" sub-sections? This will make all "widgets" four layers deep to the order button, but it will sure clean up a lot of keyword duplication in my navigation bar, which runs globally through the site. Should I do that?
Apologies for the big post. I appreciate your help!
>> When I do a site:mydomain.com search it only shows about two or three of my pages and omits the rest as similar pages. <<
I know of a site with 150 pages that has always had the exact same problem (the last 18 months or so). Within 4 days of changing all of the titles and meta descriptions to be different on every page, about 30 pages were listed, and after less than two weeks it is already up to about 60 pages listed before the "Repeat this search with similar pages included" message appears...
On the question of whether or not an elongated side nav with the same 50 keyword-rich internal links on every page of the site is causing a site to flounder in Jagger...
There have been opposing opinions about this on many threads for two months now.
I think we need to look at the individual circumstances of each site.
Here's a simple example: take a site with a 2000-page glossary. There's a left nav bar with 50 internal links on every page. Each page consists of a 20-word, one-sentence definition. The definition may be literally "drowning" in the 200+ duplicated keywords in the header, nav bar and footer of each page. This scenario may certainly trip a dup filter, which might be avoided if you used breadcrumb-style navigation and removed the left nav bar.
On the other hand, a site with deep content (200-400 word pages) might easily "tolerate" an extensive left nav bar.
So we cannot conclude that a left nav bar, on the face of it, is causing a problem. The context must be taken into account too.
>> The question is should one use the rel="nofollow" attribute in 29 of the links to prevent an excessive internal linking penalty. <<
Let's not get carried away. Out of the 20 things one "fixes" when hit by Google, chances are that just one was the problem. I don't think internal linking is an issue. Virtually every site has a navigation bar, many with 20-30 links on the left side.
The only examples I've seen of internal links being a possible cause was where the site was obviously being artificially inflated, minimum content but with maximum linkage to the money pages...
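For reference, the attribute under discussion is just a hint added to an individual link (the URL and anchor text here are placeholders):

```html
<!-- An internal link flagged so search engines do not follow or credit it -->
<a href="/comments.html" rel="nofollow">Comments</a>
```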
"Check your site for canonical problems and hijackers."
What is hijacking?
How can I check if my site is hijacked? Are there any tools available?
Is it significant that you ask that question when your profile says "posts: 302", or is it just a very scary coincidence?
Look back at previous threads that discuss "302 redirect URL hijack" as well as the article by Claus - all very easy to find via Google.
I look at my stats and it says 7.7 percent 302.
What does that really mean?
In the process of rewriting my site, I am including a base href tag in the head of each page: <base href="http://www.mydomain.com/particular-page.htm"> because it was recommended as a good anti-hijacking measure, although I'm not 100% sure why. Can someone explain this for me?
I'm also changing all links and image urls to absolute rather than relative links (a big chore) for the same reason. Again, I'm a bit blurry on the benefit here?
I have a relatively new site managed with PostNuke, and I am very concerned because Google is indexing the printable pages of my articles as well as the main content pages.
Does anyone know how to prevent this? Not only does it pose a duplicate penalty threat, it deprives me of ad revenue, since many people see the text only, printer friendly version of my work.
Put those print pages in a separate directory, exclude it in your robots.txt, and include:
<meta name="robots" content="noindex,nofollow" />
in the head of each of those pages to be doubly sure.
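For example, if the printer-friendly pages all live under one directory (the name /print/ here is only an illustration; use whatever directory you actually choose), the robots.txt entry would be:

```
User-agent: *
Disallow: /print/
```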
>> I am including a base href tag in the head of each page: <base href="http://www.mydomain.com/particular-page.htm"> because it was recommended as a good anti-hijacking measure, although I'm not 100% sure why. <<
If I can get to your page as www.domain.com/the-page.html as well as domain.com/the-page.html as well as via a 302 redirect from competitor.com/redirect.php?your-site then which URL should be indexed?
If the page contains a <base href="http://www.real-name.com/real-page.html"> tag, then that must be what the page is really called; and if using that URL to access the page actually works, then that is what you think it should really be called, and what most search engines will believe it is really called.
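Concretely (the domain here is just an example), the head of each page carries its own full URL, so relative references resolve against it no matter which address the page was fetched under:

```html
<head>
  <base href="http://www.example.com/widgets/page.html">
  <!-- "style.css" now resolves to http://www.example.com/widgets/style.css,
       even if the page was reached via a non-www or redirected URL -->
  <link rel="stylesheet" href="style.css">
</head>
```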
>> I am very concerned because Google is indexing the printable pages of my articles as well as the main content pages. <<
Get the script to test the URL of the page that is being served and to add that meta tag to all pages that you do not want to be indexed: <meta name="robots" content="noindex">
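The script logic is simple enough to sketch. Here it is in Python rather than PostNuke's PHP, and the assumption that print URLs contain "print" is mine; match whatever marker your CMS actually puts in those URLs:

```python
# Sketch (Python, not PostNuke's PHP) of emitting a robots meta tag
# only for printer-friendly URLs. The "print" marker is an assumption.
def robots_meta(request_path):
    """Return a noindex meta tag for print pages, an empty string otherwise."""
    if "print" in request_path.lower():
        return '<meta name="robots" content="noindex">'
    return ""

# The page template would drop the returned string into the <head>:
print(robots_meta("/modules.php?op=print&sid=12"))   # emits the noindex tag
print(robots_meta("/modules.php?name=News&sid=12"))  # emits nothing
```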
Thank you for those replies and explanation. I'll see if I can find how to implement the nofollow on PostNuke.
Now I understand base href :) Thanks!
I have been thinking about this "what do you do now" question, and for me it's back to basics. At first I had a clear vision of what my sites were to become and didn't worry too much about SEO; I just built them for visitors: easy to navigate, a hierarchical structure, and easy on the eye. Then I got side-tracked after reading ONE article, and from then until J2 all I did was tweak, tweak, tweak. I have tweaked so bloody much I can't remember what I tweaked (keyword stuffing, LOL). So I pulled up the original vision and I am back on track. I am not worried anymore about J1, J2 or J3; they're done. Back to basics, because the next update will definitely change the web from what it is today.
Honestly I think.. ODP-listed sites are doing well. Non-ODP-listed sites aren't. TrustRank is a big player here I feel.
And I feel that it's actually a penalty being applied - a sort of damping effect on the domain. For one site, I put the same content on another domain and immediately got better results.. until Google worked out that it was duplicate content and dropped out the newer one in favour of the older one based on PageRank.
Here's a simple experiment - type in your domain name. You should always be number one for this.. unless you are under a penalty.
Yes, ODP sites seem to fare well. Also content to link ratio may play a role. I also believe that the ratio of reciprocals is now dampening sites. The higher the ratio of two-way links, the more of a dampening.
My site has been hand-edited out of the index, I feel.
At some point in J1 I noticed 3 of my keywords were gone, completely out of the index. If I typed in the keyword followed by my domain, I just got a whole load of scraper sites with my old content.
Then after J3 was rolled out, all my big keywords were taken off, with the pages' PR reduced to 0.
My non money keywords were left in place and even gained a higher PR on some pages.
Now I'm working more towards using PPC and other methods to get traffic and money back in. My income is now close to 0 from my site.
This was due to happen at some point, I felt; it was all going too well.
Can someone sum up the top 5 things that will get you into trouble with Jagger? I am having trouble getting a handle on it. What are the top 5 things one needs to do to recover if you have lost position. Also, sites pointing to my site are doing better than my actual site.
We don't have any ideas yet; we can only surmise for the minute.
|Also, sites pointing to my site are doing better than my actual site. |
I do not know if you mean what I have noticed with a "lost" site: when searching for just the URL, other sites containing the URL of my site are indexed before my site.
P.S. Four (4) things to check could be:
1) Duplicate content (on page and off page)
2) Keyword stuffing
3) www / non-www issues
4) same owner of sites within same theme
<<ODP-listed sites are doing well. Non-ODP-listed sites aren't. TrustRank is a big player here I feel. >>
Alas this is not true.
My main site has two ODP links, and since September 22 it has nearly vanished from Google. It seems to be a dampening effect.
The reasons I can find are:
- too much of the same anchor text from the navigation to the main pages. Pages that survived have very little text on them and modest interlinking.
- a dup content penalty is possible. I found at the end of many pages the same HTML advertisements (the same sentences with only some words changed).
- This site has two parts, one educational, one more travel-themed. The educational part is old but is still getting fresh links. Google seems to think now that the site is educational, since those pages are not doing that badly in the SERPs.
- Links from two of my other sites (all same theme only different niches).
I am busy changing points 1 and 2 slowly (sandbox fear). Point 3: I don't know what to do about it yet.
The links between my sites are necessary. Otherwise I would have to put the same content on one site (which I don't like).
So are we best just to put a few words in the meta and alt tags?
I've seen a big drop after years of good rankings. I don't keyword stuff pages (intentionally) or optimize after each update, but I do exchange links with people to gradually help my sites. However, whenever I read these forums people say that if you have good content that people will link to you automatically. I have to disagree with this.
I have a lot of original content, including several requests each month to publish my content in different magazines (small magazines) or textbooks (usually college oriented). And I do get one-way links from a lot of these sources. However, the links almost never carry any PageRank, and they never seem to make any difference in our rankings. Now that we've been hit so hard by the last two updates, it seems that the harder we try, the further we fall. Therefore I'm tempted to do nothing but keep collecting quality links and waiting.
|So are we best just to put a few words in the meta and alt tags? |
No. Use them as they are meant to be used, and according to Google's guidelines: "Make sure that your TITLE and ALT tags are descriptive and accurate."
The rule is not to stuff them with your keywords!
|1) Duplication in titles dilutes the importance of all the keywords used, so avoid using your site slogan or <b>similar repetition</b>. Focus clearly and concisely on the content of that particular page. |
Well, if your site is about company X and its products, you must have that word X on all of your pages. This is repetition, yes, but I do not think this should or will harm your rankings. Themed pages rank well, and to create a theme you must repeat your main keywords.
|3) Avoid too much cross-linking between "related pages." I'm pretty sure I tripped a filter here, and it works a heck of a lot better if I just point to the appropriate section index instead of cross-linking multiple related pages. |
Don't you think this would be quite strange, when the basic idea of the web is to link to related pages and sites? This helps visitors and is user-friendly, and that is something Google supports.
I first posted my experience of update Jagger in the members forum.
Basically, I have been top 5 for 2 years for a keyword with 10m results. I also ranked heavily across 1000s of KWs for related results.
The day the update happened I dropped to 170ish. My homepage that had ranked so well had been replaced in the SERPs by my highest-PR sub page.
I also noticed my PR dropped from 5 to 4.
Over the previous 2 months before update Jagger, 2 things had happened. I stopped paying for a link on a competitor site (PR6 and the number 1 for the keyword in which I used to list top 5).
I had added 3 extra paragraphs of *slightly* keyworded text to the homepage.
My traffic dropped by about half. I was saved by Yahoo and MSN.
Here's what I did: I deleted the paragraphs I wrote.
Here's what happened: I am now on page three of the listings with my home page PR4.
Here's what I think happened: The loss of the high, on-topic, one-way link caused a drop in PR, and AT THE SAME TIME I tripped a filter somehow by adding the extra paragraphs.
I believe the de-optimisation restored my homepage, and that I am *rightfully* on page 3 according to my new PR / linking factors.
Next I intend to build more links again to increase inbounds to the homepage. I am also considering clearing out some outbound links - but I might leave that for a while.
One thing that pleased me was how I could still make money on the site without G; it was an interesting experience and one I intend to learn from.
My drop in rankings for one site probably occurred when I added a template-driven affiliate site as part of my web site -> too many new pages too fast, and many near-duplicate pages.
I have now blocked that affiliate site via robots.txt and am waiting...
It's obvious that this update covers a number of different issues, thus perhaps, the Jagger 1, 2, and 3, to deal with different aspects.
For myself, I know there were several factors at play, including the one just mentioned... adding affiliate dynamic-driven pages... something I won't do again in a hurry.
I deleted the whole thing, and used the Google Url Remove tool to delete the entire directory that it was in. But a "noindex" directive makes more sense. Will you let us know how that works please?
|Will you let us know how that works please? |
The affiliate site (the whole directory) is not in the index anymore, but my site is still not ranking. I guess I triggered some other filters too and have to make more adjustments to the site.
Maybe my site is "re-sandboxed". I do not know if this can happen, but if there is something like this, I could be there (with some of you :-).
Jagger 1, IMHO, was one phase of the update designed to tackle one aspect of the SERPs. Many people have differing views on exactly what it tackled, and whether it was successful. Many sites bounced back after dropping to 100+; others didn't.
Here is what I have seen from the sites I work with:
Removing or adding on-page content has made little difference. In fact, it would almost seem Google is very lenient in comparison to MSN and Yahoo in penalizing for on-page stuffing and spam unless a report is sent in, because their algo doesn't rely heavily on on-page factors for competitive phrases, except for the title. This is why they want people to report spamming so much...
|Search excessive linking penalty in Google and look at the questions about too much internal linking possibly triggering a spam penalty. |
Not likely, unless your entire page is spammed links. Some of the sites I work on have 50+ link left-side nav bars linking to every page. Many of those pages link further, explicitly in content, to their target. This is simply good web hierarchy. However, I have seen some improvement in SERPs for sites which were previously linked from many pages to home using their keyword. The jury is still out on this one.
non-www to www. This is IMHO huge in this update, and lots of great work has been summed up in Google News by Dayo and co. Lots of sites I see that had no explanation of loss in rank have had or are now experiencing issues with this and supplemental listings as a result.
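This sort of non-www to www problem is usually fixed with a server-side 301 redirect, so only one hostname can ever be indexed. On Apache with mod_rewrite enabled, the usual .htaccess recipe looks like this (example.com is a placeholder; swap in your own domain):

```apache
RewriteEngine On
# Permanently (301) redirect any request for example.com to www.example.com
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```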
As one or two members pointed out, the meta description does seem to be an important factor to look at. Make sure each page accurately describes what the content is, without adding keywords. Use your keyword only once here and use it so it makes sense. Limit keywords to only one-time version of your keys.
I don't see any evidence showing that too many links leaving one page causes problems, even though it's in Google's guidelines. Lots of #1 sites have resources pages with 300+ links leaving them, many unthemed.
I don't see any evidence showing that an ODP listing changes anything. I have a top-5 web hosting site that has been online less than one year...
However, I do see that what many sites in the ODP have in common is length of time online, as well as the age of their links.