
Slow indexing and ranking problems

         

allanp73

7:41 pm on Mar 25, 2004 (gmt 0)

10+ Year Member



I do not want this to be a thread bashing Google. I want to focus on the issues related to Google's SERPs. After the Florida update, Google has become a lost cause. I have managed to get around the filter on several occasions, but why must I make my site(s) look like directories when really they are informational in nature?

About 2 months ago I launched 20 sites, each with about 100 pages of unique content. Each targets very niche city-related terms, and they are the only major content sites for the cities they target. They have dozens of links from directories and related sites, and in the latest PR update many became PR4-4 (meaning a PR4 on the index page and PR4 on the pages one level deep). Before Florida I would have expected the richness of content, high PR, cleanness of code, and lack of competition to guarantee that these sites would be top ranked for their "city widget" terms. However, every term is filtered. The sites are nowhere to be found in the top 1000 SERPs. Even for a site's unique name, which happens to contain a keyword, they are nowhere to be found.

Because Google is filtering based on city-related terms, anything that people would use to find these sites does not work. Google, once my top referrer, now provides only a trickle of traffic, and usually for non-related searches.

Even more annoying is that recently I have found Google is slow to index sites even when they follow proper design to allow for easy indexing. I have new sites (with no PR yet but many links) that were launched a month ago with only 1 page out of several hundred indexed. When Google first started the continuous update I was very pleased. Results were fresh and changes were quickly indexed. Now, it's like Google has become lethargic and only slowly indexes even high-PR sites.

So have others noticed these types of problems? How have people managed to deal with the new Google? How does one target city related terms without compromising the integrity of their sites? How can we survive this new age of Google?

Dayo_UK

8:46 am on Mar 26, 2004 (gmt 0)



allanp73

I have similar problems with sites launched approx 2 months ago. Most of these sites have now got PageRank - including a PR4 like you - however the results are nowhere to be seen in the SERPs...

Googlebot has not indexed as many pages, or gone as deep, as I would have expected had these sites been launched around September or October.

Perhaps the age of a site, or the need for a site to become established, is becoming more important to Google.

We will see - perhaps (hopefully) the SERPs are due a shake-up soon.

Patrick Taylor

10:09 am on Mar 26, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



allanp73, I don't have lots of sites like you. I have three or four I watch - a couple of oldish ones and a couple of newish ones: one from Dec 03 and another from early Feb 04. As far as city-related search goes, one of my new ones has a homepage optimised for (eg) "widget designer in mycity keyword1 keyword2 keyword3" and it began by being #1 for "widget designer in mycity", then in January it disappeared for that phrase, though remains in fairly good positions for strange variants of the phrase.

My other recent site had an important page optimised for a certain phrase, and it also began at #1 or #2 in the SERPs, then in mid-March vanished for that phrase. Only when I throw terms from the body text into the key phrase does it show in those positions.

Without boring you with the details, and having experimented at length, I've concluded that there's a possibility that when a page is optimised for a phrase (eg that phrase is the page title, is used in headings, and is used in anchor text, as well as repeated a few times in the body text), Google picks that phrase out and somehow downgrades the page for it unless the search phrase is varied in some way - when the page is searched for - like throwing in an extra word or spelling one of the words wrongly. Quite honestly, it appears to me like some sort of malfunction rather than a logical process at work.

Based purely on my own limited experience, if I was to offer an opinion as to what is wrong with Google, I would say I don't know but (i) I suspect that the day of the PR concept (where PR has been the measure of the "importance" of a page, as determined by backlinks - which is clearly rubbish these days) is over and they're experimenting with a view to replacing it with something meaningful, and (ii) their anti-spam filters are too "intelligent" (if you will).

BallochBD

10:36 am on Mar 26, 2004 (gmt 0)

10+ Year Member



Without boring you with the details, and having experimented at length, I've concluded that there's a possibility that when a page is optimised for a phrase (eg that phrase is the page title, is used in headings, and is used in anchor text, as well as repeated a few times in the body text), Google picks that phrase out and somehow downgrades the page for it unless the search phrase is varied in some way - when the page is searched for - like throwing in an extra word or spelling one of the words wrongly. Quite honestly, it appears to me like some sort of malfunction rather than a logical process at work.

Patrick, I most certainly agree with you, but this frightens me. If Google IS penalising sites for this, then we are doomed.

Maybe I am missing something, so let's forget about SEO for a moment. Let's say that I have a site about cars and a page within this site about car engine repairs. What would I call the page? Let's see - how about carenginerepairs.htm? What would I use for the page heading? How about "Car Engine Repairs"? Nothing wrong so far, is there?

A few of the other pages on my site are linked to this page. What would I call the links? I think "Car Engine Repairs" would be good link text. It says exactly what the page is about and makes things easier for my site visitors.

Now, when I am writing the copy for the page, it is reasonable to assume that I will mention this text again, and probably more than once - after all, that is what the page is all about. How can this possibly be seen to be wrong? If I had a filter, I would conclude that this is overwhelming evidence that the subject of the page is car engine repairs and that it should NOT be filtered from any SERPs for this phrase.

Am I missing something?

Marval

10:50 am on Mar 26, 2004 (gmt 0)

10+ Year Member



I don't think it's limited to specific phrases or filters - sites just aren't being added in on freshness and haven't been for almost two months now - maybe a page or two on established sites but I haven't seen any new stuff we've put out showing up recently - in a wide variety of industries.

tantalus

11:50 am on Mar 26, 2004 (gmt 0)

10+ Year Member



<<Maybe I am missing something so let's forget about SEO for a moment. Let's say that I have a site about cars and a page within this site about car engine repairs. What would I call the page, let's see, how about carenginerepairs.htm? What would I use for the page heading? How about "Car Engine Repairs"? Nothing wrong so far is there?

A few of the other pages on my site are linked to this page. What would I call the links? I think "Car Engine Repairs" would be good link text.>>

I have been thinking about this, and as a web designer I would say that's not the naming convention most designers would use, at least not without any SEO nous.

carenginerepairs.htm is just too long (designers, or at least I, are lazy) - repairs.htm, or engine_repairs.htm if there were more pages about repairs (ie body_repairs), would suffice. At the end of the day, file naming is essentially just for the benefit of whoever maintains and builds the site.

Ditto for links: as a designer, your main concern is to use the fewest words that fit the space allocated. Again, "Repairs" is enough, and if the site is about cars you really would have to be dim to expect the link to take you to, say, housing repairs.

I do think Google is taking these things into consideration and trying to create a more level playing field.

Cheap & discounted car engine repairs for all makes & models inc (model, another model & yet another model ) in this location [macalester.edu]

Would insult the intelligence of a monkey, particularly from a user perspective.

Engine Repairs
Body Work Repairs
Windscreen Replacements
Vehicle Servicing
Auto Parts

Works. And if you look at the above semantically...

Sorry if this is off topic.

Patrick Taylor

12:33 pm on Mar 26, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



The actual words don't matter. I think the point is that matching anchor text (on or off site) to the title of the page linked to is the sort of thing that's been advocated for a while, but for some people it seems not to work any more, as if Google is now penalizing some of the conventional wisdom even when it's applied in a non-intensive way and still works with other search engines.

I believe the way for an average small-size webmastering operation to "survive the new Google" is not to design / optimize specifically for Google but to aim for an overall balance across Google, Yahoo, MSN, etc.

BallochBD

12:57 pm on Mar 26, 2004 (gmt 0)

10+ Year Member



Tantalus, you took me literally. I was just trying to provide an illustration. Obviously this was a bad one. :o(

My point was that it is not beyond the realms of possibility that the page title, headers and anchor text could all, quite innocently, contain common content, particularly when it is the subject of the page.

I think that it would be wrong to apply a blanket filter on this.

steveb

1:11 pm on Mar 26, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



"I think that it would be wrong to apply a blanket filter on this."

Which is why no one does.

tantalus

2:37 pm on Mar 26, 2004 (gmt 0)

10+ Year Member



I don't think common content is a problem; I do think an exact match is.

carenginerepairs.html
Car Engine Repairs (title)
Car Engine Repairs (hx)
Car Engine Repairs (anchor)

It adds nothing from a user perspective, and this fits very nicely with Google's policy of "build your sites for users".

Patrick's case of matching anchor and title is interesting.
The shorter your title, the less choice you have but to match; however, the longer your title...

Anyway just my penny's worth :)
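The exact-match pattern listed above - filename, title, heading and anchor text all reducing to the same phrase - can be checked mechanically. Here is a toy Python sketch of that idea; the normalisation rule and function names are my own invention for illustration, not anything Google is known to do:

```python
# Toy check for the "exact match" pattern discussed above: the same phrase
# used as the filename, <title>, heading, and anchor text of a page.
# Purely illustrative - not a reconstruction of any real filter.

import re

def normalize(text: str) -> str:
    """Lowercase and strip non-alphanumeric characters for comparison."""
    return re.sub(r"[^a-z0-9]", "", text.lower())

def is_exact_match(filename: str, title: str, heading: str, anchor: str) -> bool:
    """True when all four page signals collapse to the same phrase."""
    base = filename.rsplit(".", 1)[0]  # drop the .htm/.html extension
    forms = {normalize(s) for s in (base, title, heading, anchor)}
    return len(forms) == 1

# The example from this thread: every signal says "Car Engine Repairs".
print(is_exact_match("carenginerepairs.html",
                     "Car Engine Repairs",
                     "Car Engine Repairs",
                     "Car Engine Repairs"))   # True

# A branded title and shorter anchor break the exact match.
print(is_exact_match("carenginerepairs.html",
                     "MyAutomotiveSite.com: Car Engine Repairs",
                     "Car Engine Repairs",
                     "Engine Repairs"))        # False
```

The point of the sketch is only that such a check is cheap to compute, so it is at least plausible as a filter input.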

Midhurst

3:15 pm on Mar 26, 2004 (gmt 0)

10+ Year Member



Let's try a bit of analysis together and see where that leads us.
Let's assume what we are experiencing is not a Google glitch but a penalty.
So what might Google be penalising?
1. Keyword density of the text - over 3%, 5%, 10%? Have a look at your sites and estimate a figure.
2. Keyword repetition - ie keyword strings, just repeating the same old phrases, or reworks of the same KEYWORDS at the bottom of the page, perhaps in the same colour as the background.
3. Anchor text (internally) repeating the above keywords.
4. External incoming links with the anchor text as above.
5. Dodgy neighbourhoods - ie linking out to off-theme sites.
I don't think having headers with a keyphrase in them is a problem.
My favourite villains are keyword stuffing in strings, and a mismatch between the text content (and keywords) and the titles, page headers, etc - though I think this just gets you demoted, not banned, primarily on the grounds of relevance.
I had a site to optimise which had a large graphic on the front page. I inserted a few lines of HTML text with one keyword repeated alongside other words, like k1k2 words k1k3 words k1k4 words k1k5 words. And guess what? It refused to surface. I took the k1 out of 4 of the 5 phrases, added two properly constructed sentences of explanation, and it sailed into view almost overnight at #7; it has since gone to about #14 and stopped dropping. A coincidence? Perhaps, but I've also got 20 'bad' link exchanges to remove, so there may be two elements here for me to correct.
So what bad features might your sites be showing? None?
Then we're no further forward.
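Point 1 above, keyword density, is easy to estimate for yourself. A minimal Python sketch - the 3%/5%/10% thresholds in the post are guesses, and this word-window measure is just one plausible way to compute density, not a known Google metric:

```python
# Rough keyword-density estimate: what percentage of the words on a page
# are accounted for by occurrences of the target phrase?
# Illustrative only - the measure and any threshold are hypothetical.

def keyword_density(text: str, phrase: str) -> float:
    """Percent of words in `text` covered by exact occurrences of `phrase`."""
    words = text.lower().split()
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    hits = sum(1 for i in range(len(words) - n + 1)
               if words[i:i + n] == phrase_words)
    return 100.0 * hits * n / len(words) if words else 0.0

copy = ("car engine repairs for all models - our garage handles "
        "car engine repairs quickly and cheaply")
print(keyword_density(copy, "car engine repairs"))  # 37.5 - far above any guess
```

Running this over a page's body text gives a concrete figure to compare against whatever threshold you suspect.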

cabbie

4:10 pm on Mar 26, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



>>>About 2 months ago I launched 20 sites with each having about 100 pages of unique content.
Sounds like this could be the problem.
20 sites interlinked, suddenly appearing in Google, all with backlinks already established, all probably bought from the same registrar at the same time, probably on the same server with the same whois information.
Hmmm, I wonder if Google is thinking what I'm thinking. :)

metrostang

4:47 pm on Mar 26, 2004 (gmt 0)

10+ Year Member



From my experience "exact match" might not be the problem.

My site dropped with Austin and came back with Brandy after I added the exact keywords in two additional places: once in the heading and once in anchor text. These words were already in the title, keyword and description meta tags.

The first 6 pages indexed after Brandy came back immediately, and 36 others have made it to the top as they made it into the index. They are still there, except for two that for unknown reasons are gone from the index.

I've made additional changes since then, and the pages have all been crawled many times, but the fresh pages have been slow to show up. I've experimented with adding more of these same keywords on two pages, and they both moved to #1. In one case I added one keyword 12 more times and it didn't drop from #1.

This tells me that at least in the context that I used the keywords, there was no penalty.

Midhurst

5:25 pm on Mar 26, 2004 (gmt 0)

10+ Year Member



After my last post I reviewed some of my websites - just those which had opted to join a Small Town Local Network. The network of about 20 sites decided to try a bit of trip-wire marketing to get more passers-by and tourists to stop in the town rather than belt off to the cities north and south.
1. Two very small unoptimised sites started off OK in December when they were launched, but soon became difficult to find under their main keyphrase when the location was added. This must surely have resulted from being part of an off-theme network.
2. A medium-sized site, 5 years old, gently optimised on each page, with lots of quality inward links and good, rich, relevant content, remains top of the SERPs worldwide out of 6.5 million pages. Not affected at all.
3. A medium-sized site, smaller than (2), with good, rich, relevant content but arguably over-optimised with too many keyword strings, was top of the SERPs in Google, Yahoo and AltaVista in September and got hit in early Feb, when it could no longer be found for its principal 4-word keyphrase. Since then it has recovered somewhat, and is now easily found for a contracted version of the keyphrase - ie instead of widget1 2 3 4 it is widget 2 3 4.
Not being a mathematician or algo guru, I can only think in terms of bonus points and minus points.
Does the algo in effect give + or - points to certain traits: lots of bonus points for high-PR, on-theme incoming links; lots of minus points for linking to a PRO's site or joining a very bad neighbourhood?
Am I being fanciful?
The two small sites get hammered for joining the network, with no redeeming features: content thin, no good incoming links, etc, whereas the medium-sized, rich-content, lightly optimised site with a plethora of quality incoming links gets off scot-free because of its massive number of bonus points.
Fanciful?
So now I intend to first add a plethora of good incoming links to the two small sites and see if they recover, and then remove the new incoming links, eliminate the offending off-theme link exchange, and see what happens.
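The bonus-points/minus-points idea above can be written down as a simple additive score. Every trait name and weight in this Python sketch is invented for illustration; nothing is known about how, or whether, Google actually weighs these traits:

```python
# Toy additive "bonus/minus points" model of the theory above.
# All trait names and weights are hypothetical.

TRAIT_WEIGHTS = {
    "on_theme_high_pr_link": +10,   # quality inbound link, same theme
    "rich_relevant_content": +5,
    "off_theme_network_link": -8,   # the "bad neighbourhood" exchange
    "keyword_stuffing": -15,
}

def site_score(traits: dict) -> int:
    """Sum weight * count over the traits a site exhibits."""
    return sum(TRAIT_WEIGHTS[t] * n for t, n in traits.items())

# The thin small site in the network: no redeeming features.
small = {"off_theme_network_link": 1}
# The established medium site: many good links outweigh the same bad link.
medium = {"on_theme_high_pr_link": 6, "rich_relevant_content": 1,
          "off_theme_network_link": 1}
print(site_score(small), site_score(medium))  # -8 57
```

Under a model like this, the same off-theme link drags a thin site below zero while barely denting a well-linked one - which is exactly the pattern described in points 1-3 above.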

allanp73

7:17 pm on Mar 26, 2004 (gmt 0)

10+ Year Member



cabbie,

"20 sites interlinked suddenly appearing in google all with backlinks already established"

I never said that the sites interlinked. Actually, they don't have links to any others in the group. I got links from directories and other sites with the same theme who value the content. I strictly avoided any kind of cross-linking because I wanted to make clean, strong sites. The goal with these sites is to really test Google. Is spam the issue? Is it over-optimization? Or a lack of links?
The 20+ sites do not spam, and they have strong links and content.
The one thing that could be a problem is local rank. Most of the links coming in are from sites virtually hosted on one or two IPs. Possibly the sites lack local rank, as the other links (due to Google's laziness about crawling) haven't been picked up.
I wonder how many links it would require - not to get a PR boost, but to build a local rank which would prevent Google filtering (or extremely poor ranking, for those who do not believe in the filter). 20? 30? 50? How many, and is the PR of those links an issue with local rank? (I imagine yes, but who knows.)
Does anyone have experience with a site that has strong local rank (meaning many links from many different IPs)? Have these sites been affected?
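The IP-diversity side of the local-rank question can be made concrete: many links from hosts sharing one or two IPs count as very few distinct "voters". Grouping by class-C (/24) subnet is a common guess at how such a check might be done; this Python sketch is purely illustrative, not a known Google mechanism:

```python
# Toy illustration of link-source diversity: dozens of backlinks are few
# distinct "votes" if their hosts share a couple of IPs or subnets.
# Hypothetical sketch - not a reconstruction of local rank.

def distinct_subnets(link_ips: list) -> int:
    """Count unique /24 subnets among the IPv4 addresses of linking sites."""
    return len({ip.rsplit(".", 1)[0] for ip in link_ips})

# 30 backlinks, but the linking sites are virtually hosted on three IPs
# spread across only two /24 subnets.
links = ["203.0.113.7"] * 15 + ["203.0.113.9"] * 10 + ["198.51.100.4"] * 5
print(len(links), distinct_subnets(links))  # 30 2
```

By a measure like this, "how many links" matters much less than how many distinct subnets (or hosts) the links come from.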

BigDave

7:39 pm on Mar 26, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Uh, why isn't the <title> of your carenginerepair.html page branded?

Shouldn't it be "MyAutomotiveSite.com: Car Engine Repair"?

From a user perspective, I really prefer branded titles, and sites with favicons, because it makes it a lot easier when using tabbed browsers.

And from an SE standpoint, you aren't as likely to have all your links and header tags matching your title exactly.

HayMeadows

8:08 pm on Mar 26, 2004 (gmt 0)

10+ Year Member



sites just aren't being added in on freshness and haven't been for almost two months now

Spot on, Marval - thankfully they are elsewhere! Sorta.

cabbie

9:18 pm on Mar 26, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



allanp73

Sorry for making some wrong assumptions, but whether they are interlinked or not, 20 sites come to Google's notice because of links from sites "virtually hosted on one or two IPs".
Now if this isn't a flag for Google to suspect some manipulation, then they ought to offer me a job there to help them out with spam tactics. ;)
How could Google take the votes (links) to these sites seriously?

If these 20 sites are also on the same server as each other, have the same registration date, and contain similar whois info, then you are only compounding the issue.

allanp73

9:38 pm on Mar 26, 2004 (gmt 0)

10+ Year Member



cabbie,

Actually, these sites are subdomains, though I have other, older sites with a similar problem. The IPs of the links to these sites are from completely different servers, though one of the hosts happens to host dozens of themed sites virtually on one IP. I got links from some of these sites, though not all.
My thought is that this creates a problem with local rank, which would just ignore those links altogether.
So I have to get more links and watch what the IPs are.

cabbie

9:46 pm on Mar 26, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



>>So have to get more links and watch what the IPs are.

Also, maybe if you have sites already in the SERPs for that niche and Google connects that the new ones are related, it still may not show them.