Forum Moderators: Robert Charlton & goodroi
It makes no difference at all.
Keywords in URLs of course help somewhat, and it is a purely "white hat" way to optimize. There is nothing more natural than having a page about apples and calling it apples.html.
So I am moving them to a throwaway domain. I will set up doorways in my main site to go to the new pages on the throwaway domain. The doorways will have content and refer people for details to the throwaway domain. The throwaway domain will get penalized.
This will also be done for catalog pages. For example, a catalog of similar products has a standard format with only some changed details. So once again one gets hit with duplicate content. So they will be moved to a throwaway domain and doorways made in the main site.
We tripped over this accidentally when we noticed that a new site with no more than 20 pages of content, all of which referred back to a main site, started ranking well while the main site has stayed where it dropped to in June. To make the main site rank, we will now move anything in a standard format to a new throwaway domain.
Any comments on this? Are there any better solutions?
Still loads of movement going on, but it's slow. I've checked two DCs regularly - this one, 66.102.9.99, and .104 - and things are still being "crunched" as far as I can tell.
There are still problems: inbound links and anchor text, PR, and other variables have not yet been fully factored in.
Also saw the sponsored ads at the bottom.
How do you say? Crap!
What to do on a Sunday morning other than start your day on The Mother of All Forums, The Greatest Forum on the Planet... Only on WebmasterWorld... Forum 30.
And I see a happy tigger today :-)
>>Thanks GG the offending site has been completely removed, now if you could just do the opposite with my site I'd be a happy man<<
I told you folks that GG & Matt mean business this time regarding removing spam and spammers from the index.
Please allow me to post this little shameless anti-spam promotion spot. Cleaning the index of spam will bring benefits for all of us.
Folks! Don't wait to send Jagger-related spam feedback; I'd send that now. Using the keyword "Jagger3" at [google.com...] will get someone reading and checking it out.
I guess I've lost some ROI, as about 6 pages of the site I bought ads on were all knocked down to PR5! I advertised on the main page and some interior pages.
I had seen no movement after being paved under by Jagger until J3, which has brought maybe 4 terms (down from about a dozen) back to the first page, "above the crease".
It is not the site in my profile but another site and I'm waiting 'til the dust settles.
Do you think the site I advertised on will give me an extra month? ;) Darn, I wanted to get a PR5 on my site (it's PR4 now and still holding), but that can't happen without those PR6 page links.
reseller,
Some said (to me) that they were just politicians and full of ****, but when you asked to have that site looked at and they did it so fast, that proves they're not just BS'ers but are serious and will do what they say. That's good news!
There is still a considerable amount of flux on the new DBs. I'd rather call it consolidation, as the first pages seem to be condensing the good results.
In reference to title tags and URLs, which many of you have commented on this past day: the <title> tag is an initial reference for search engines, and as such it helps to include keywords that form a logical phrase - unless your site is famous and beyond all that, in which case it doesn't really matter what you write, to a certain extent!
In reference to URLs, I have seen many .asp and template sites suffering on internal pages because the URL is just a number rather than a product or service. So I believe that if the URL of internal pages also contains keywords, it helps.
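The keyword-URL idea above can be sketched with a small helper that turns a page title into a keyword slug instead of a bare ID number. This is an illustrative sketch only - the function name and slug rules are my own assumptions, not any particular platform's API:

```python
import re

def slugify(title):
    """Turn a product or page title into a keyword-bearing URL slug."""
    slug = title.lower()
    slug = re.sub(r"[^a-z0-9]+", "-", slug)  # collapse non-alphanumerics to hyphens
    return slug.strip("-")

# A URL like /products/item.asp?id=1742 carries no keywords,
# while /products/red-delicious-apples/ tells both users and
# search engines what the page is about.
print(slugify("Red Delicious Apples"))  # red-delicious-apples
```

The same idea applies whether the slug is generated at publish time or via a rewrite rule on the server.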
In reference to duplicate content, someone commented on internal page format, and once again, the layout of the page is irrelevant, IF the content is different on each page.
[edited by: Eazygoin at 10:03 am (utc) on Nov. 13, 2005]
Thanks to Jagger, I have returned with a vengeance. I'm scoring very high for all those competitive keywords, plus quite a few more, and I'm delighted. I don't have it down to an exact science, but here are my observations of what works for me and what doesn't:
1) Duplication in titles dilutes the importance of all the keywords used, so avoid using your site slogan or similar repetition. Focus clearly and concisely on the content of that particular page.
2) If you are subdividing your site into logical and important sections then use your home page to point to and describe each section. Have a universal menu on all pages that points to home page, plus each index page of the separate sections. This establishes the importance that you place on these pages and is echoed in Google both in page rank and index placement.
3) Avoid too much cross-linking between "related pages." I'm pretty sure I tripped a filter here, and it works a heck of a lot better if I just point to the appropriate section index instead of cross-linking multiple related pages. It's as if Googlebot goes merrily along, identifying important pages, then hits cross-links to not-so-important pages and has a hissy fit because it just doesn't understand you any more. Unfortunately, when Googlebot doesn't "get it," you don't get indexed!
4) Avoid keyword stuffing. I was guilty of overusing keywords, putting my site slogan on each page, plus a keyword-rich description of the page topic, before actually reaching the topic itself, which also included keywords. I'm a professional magazine writer, and this isn't a natural way for me to write, but it had become second nature when writing for my site. I was, in effect, leaving a breadcrumb trail all over for Googlebot to follow... and, in the end, Googlebot said screw the breadcrumbs - all I want is the meat and potatoes. (My theory is that scrapers use so many breadcrumbs that they might as well make stuffing - so Googlebot has lost its appetite for breadcrumbs in general.)
5) Tighten up meta descriptions to reflect the content of the page only - no slogans or overused keywords here either.
6) Check your site for canonical problems and hijackers. If you find www and non-www issues, throw up a 301 redirect immediately and check WebmasterWorld for progress in this area. If you find hijackers... either use the URL Removal tool as described elsewhere on WebmasterWorld, or report them as spammers, whichever applies. I had to do both (and special thanks to Bear for alerting me to canonical issues, and to Dayo and Reseller for keeping me up-to-date with the latest). I'm still wrestling with both of these, but they are, at least, under some kind of control now.
7) The only reciprocal linking you should do is genuine, heartfelt recommendations of one site for another. Forget the "you rub my back, I'll rub yours" links. If you are a worthwhile content site you will get natural links. Augment that by distributing worthwhile content articles with your bio and link included in the deal.
8) Be pro-active if you find spammers in your niche appearing in the Google index. As Reseller keeps reminding us, report them. If the spammers are using AdSense, report them to AdSense as well. On one occasion, a report that I made to both AdSense and Google user support resulted in the demise of about 10 different spam domains. These guys were appearing in the index spoofing my domain with page titles like "MyDomain specific page...url moved" then a description that said "url moved, please visit spamsite.homepage.htm" complete with fake 404s. Grrrrr....
9. Examine the successful competition with a microscope. Much as you dislike them - and I sure do dislike my top competitor - you can learn something from them! My competition site puts up hand selected links with brief descriptions as if they belong to the site itself, then describes Amazon books to fluff out the rest of the page and, from September 22 to Jagger 3, managed to beat me consistently in the index. It is still doing far too well, getting too many visitors but not much more than 1 page view per visitor (and what does that say, I ask you?). Regardless, its success in Google, I think, is because it never cross-links individual pages, (because there is nothing of substance to cross-link to) and therefore the site remains focused on section index links - where you find more of the same sh...stuff. I learned and copied her narrow navigation technique and it works. Point is...you can learn from whatever your competitors are doing, successful or otherwise.
10. Browse WebmasterWorld, and go a little further than you normally do. You'll be surprised what you will pick up. Next on my agenda is banning bad bots via htaccess, and WebmasterWorld is likely the source of the code that I will use.
11. Realize evolution on the net is inevitable and survival of the fittest applies. You can learn the skills you need on this forum, but you will also need to constantly hone those skills, and learn new ones, as Google continues its updates and the internet continues to grow.
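Points 6 and 10 above both come down to .htaccess on an Apache server. Here is a minimal sketch, assuming mod_rewrite is enabled; the domain and bot User-Agent names are placeholders you would replace with your own:

```apache
RewriteEngine On

# Point 6: canonical host - 301 non-www requests to the www version.
# Replace example.com with your own domain.
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

# Point 10: refuse a few bad bots by User-Agent.
# These bot names are made-up placeholders.
RewriteCond %{HTTP_USER_AGENT} ^BadBot [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^EvilScraper [NC]
RewriteRule .* - [F,L]
```

The first rule consolidates PR onto one hostname; the second returns 403 Forbidden to matching user agents. Bot lists go stale quickly, so treat any list you copy from a forum as a starting point, not gospel.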
Many thanks for your generous sharing and contribution. Much appreciated.
If you wish, you are most welcome to post the same message on the thread I have just started
Dealing With Consequences of Jagger Update
[webmasterworld.com...]
Where I hope to compile as many valuable tips as possible for the benefit of our kind fellow members.
Thanks a bunch.
It's all very well to say IF the content is different. The essential content is different, but as far as the search engine is concerned the content is duplicate, because there are so many duplicate lead-ins.
It's absolute nonsense to say that the layout of the page has to be different on all internal pages. Dynamic websites run on a set layout/structure, so that A goes to B, B goes to C, and so on.
I also think the comment on internal linking is wrong, as pages that link to one another from any or all parts of a website create a sitemap and make it easy for robots to follow.
If two hotel pages have a high percentage of identical content because of standard text, they are being penalized as duplicates.
The catalog sites on our Chamber of Commerce site have been hit particularly hard.
It is not ridiculous, it is fact. It has happened to us and we are doing everything we can to work around the algo.
I track about 15 keywords in one sector, and they've all gone back to exactly how they were at J2.
About eBay... I believe that when a site hits a certain threshold of links, PR, or whatever, it is exempt from some of the minor penalties that we receive as just ordinary small-to-medium websites on the internet.
The layout is similar, but the sections are themed so only content relevant to that section appears on the page. One of the decent top sites in my field is also built like this and rarely moves from the top. I also notice that eBay has keywords in its page URLs (can't remember if that was always the case). eBay is like a massive collection of shops, each with unique content. I've just checked two pages from the same section on eBay, and the pages are only 1% similar!
On my site, the templates and menus make up so much of each page that every page is at least 80% similar to the others.
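The similarity percentages being thrown around here can be approximated with a word-shingle comparison. This is only a rough sketch of the general idea (Jaccard overlap of 3-word shingles), not Google's actual duplicate-content measure, which is unpublished:

```python
def shingles(text, k=3):
    """Break text into the set of overlapping k-word shingles."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def similarity(page_a, page_b):
    """Jaccard similarity of two pages' shingle sets, as a percentage."""
    a, b = shingles(page_a), shingles(page_b)
    if not (a | b):
        return 0.0
    return 100.0 * len(a & b) / len(a | b)

# Two "different" pages that share a big block of template text
# still score high - which is the duplicate-content trap.
boilerplate = "welcome to our hotel site book standard rooms with full service and offers"
page1 = boilerplate + " seaside hotel with ocean views"
page2 = boilerplate + " mountain lodge with ski access"
print(similarity(page1, page2))  # high, despite different "essential content"
```

Shrinking the shared boilerplate, or varying the lead-ins per page, drives the score down far faster than tweaking a few words inside it.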
My previous handmade HTML site always ranked highly because I knew how to organise the content so that each section was hardly similar at all. If I were a half-decent web designer I could use the PHP content to build a site with unique content on most pages, but I'm not (which is why I bought an out-of-the-box solution - the wrong one, probably!).
I doubt it is the same for all templated sites - there are some at the top of my sector, but most use a particular site design which eliminates the top menus (mine can't do that without tweaking which I don't know about).
<It's absolute nonsense to say that the layout of the page has to be different on all internal pages. Dynamic websites run on a set layout/structure, so that A goes to B, B goes to C, and so on.
I also think the comment on internal linking is wrong, as pages that link to one another from any or all parts of a website create a sitemap, and make it easy for robots to follow. >
There's nothing wrong with that, but what would happen if for some reason you ended up with 5 links to one particular section (due to linking from articles or promotions in addition to the normal menus)?