About 2 months ago I launched 20 sites, each with about 100 pages of unique content. Each targets very niche city-related terms and is the only major content site for the city it covers. They have dozens of links from directories and related sites, and in the latest PR update many became PR4-4 (meaning a PR4 on the index page and PR4 on the pages one level deep). Before Florida I would have expected the richness of content, high PR, clean code, and lack of competition to guarantee that these sites would be top ranked for their "city widget" terms. However, every term is filtered. The sites are nowhere to be found in the top 1000 SERPs. Even for each site's unique name, which happens to contain a keyword, they are nowhere to be found.
Because Google is filtering on city-related terms, nothing that people would actually use to find these sites works. Google, once my top referrer, now provides only a trickle of traffic, and usually for unrelated searches.
Even more annoying, I have recently found that Google is slow to index sites even when they follow proper design for easy indexing. I have new sites (with no PR yet, but many links) that were launched a month ago with only 1 page out of several hundred indexed. When Google first started the continuous update I was very pleased: results were fresh and changes were quickly indexed. Now it's as if Google has become lethargic and only slowly indexes even high-PR sites.
So have others noticed these types of problems? How have people managed to deal with the new Google? How does one target city-related terms without compromising the integrity of one's sites? How can we survive this new age of Google?
I have similar problems with sites launched approx 2 months ago. Most of these sites have now got PageRank - including a PR4, like you - yet the results are nowhere to be seen in the SERPs...
Googlebot has not indexed as many pages, or gone as deep, as I would have expected for sites launched around September or October.
Perhaps the age of a site, or the need for a site to become established, is becoming more important to Google.
We will see - perhaps (hopefully) the SERPs are due a shake-up soon.
Another recent site of mine had an important page optimised for a certain phrase; it started out at #1 or #2 in the SERPs, then in mid-March vanished for that phrase. Only when I throw extra terms from the body text into the search phrase does it show in those positions.
Without boring you with the details, and having experimented at length, I've concluded there's a possibility that when a page is optimised for a phrase (e.g. the phrase is the page title, is used in headings and in anchor text, and is repeated a few times in the body text), Google picks that phrase out and somehow downgrades the page for it - unless the search phrase is varied in some way when the page is searched for, like throwing in an extra word or misspelling one of the words. Quite honestly, it looks to me like some sort of malfunction rather than a logical process at work.
Based purely on my own limited experience, if I were to offer an opinion as to what is wrong with Google, I would say I don't know, but (i) I suspect that the day of the PR concept (where PR has been the measure of the "importance" of a page, as determined by backlinks - which is clearly rubbish these days) is over and they're experimenting with a view to replacing it with something meaningful, and (ii) their anti-spam filters are too "intelligent" (if you will).
Patrick, I most certainly agree with you, but this frightens me. If Google IS penalising sites for this, then we are doomed.
Maybe I am missing something, so let's forget about SEO for a moment. Let's say I have a site about cars and a page within it about car engine repairs. What would I call the page? Let's see - how about carenginerepairs.htm? What would I use for the page heading? How about "Car Engine Repairs"? Nothing wrong so far, is there?
A few of the other pages on my site link to this page. What would I call the links? I think "Car Engine Repairs" would be good link text. It says exactly what the page is about and makes things easier for my site visitors.
Now, when I write the copy for the page, it is reasonable to assume I will mention this text again, probably more than once - after all, that is what the page is all about. How can this possibly be seen as wrong? If I were a filter, I would take it as overwhelming evidence that the subject of the page is car engine repairs and that it should NOT be filtered from any SERPs for this phrase.
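To make it concrete, the page I have in mind would look something like this (the filename and markup are just my hypothetical sketch):

<!-- carenginerepairs.htm: an innocent page whose subject phrase
     naturally ends up in the title, heading, and body copy -->
<html>
<head>
<title>Car Engine Repairs</title>
</head>
<body>
<h1>Car Engine Repairs</h1>
<p>Everything you need to know about car engine repairs:
diagnosing faults, sourcing parts, and typical costs.</p>
<!-- other pages on the site link here as:
     <a href="carenginerepairs.htm">Car Engine Repairs</a> -->
</body>
</html>

Nothing in there is done for a search engine; it's simply what the page is about.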
Am I missing something?
A few of the other pages on my site link to this page. What would I call the links? I think "Car Engine Repairs" would be good link text.>>
I have been thinking about this, and as a web designer I would say that's not the naming convention most designers would use - at least, not those without any SEO nous.
carenginerepairs.htm is just too long (designers - or at least I - are lazy). repairs.htm, or engine_repairs.htm if there were more pages about repairs (e.g. body_repairs.htm), would suffice. At the end of the day, file naming is essentially for the benefit of whoever builds and maintains the site.
Ditto for links: as a designer your main concern is to use the fewest words that will fit the space allocated. Again, "Repairs" is enough, and if the site is about cars you would really have to be dim to expect the link to take you to, say, housing repairs.
I do think Google is taking these things into consideration and trying to create a more level playing field.
"Cheap & discounted car engine repairs for all makes & models inc (model, another model & yet another model) in this location" [macalester.edu]
would insult the intelligence of a monkey, particularly from a user perspective.
Engine Repairs
Body Work Repairs
Windscreen Replacements
Vehicle Servicing
Auto Parts
Works. And if you look at the above semantically...
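Marked up, that navigation is nothing fancier than this (a hypothetical sketch - the filenames are mine):

<ul>
<li><a href="engine_repairs.htm">Engine Repairs</a></li>
<li><a href="body_repairs.htm">Body Work Repairs</a></li>
<li><a href="windscreens.htm">Windscreen Replacements</a></li>
<li><a href="servicing.htm">Vehicle Servicing</a></li>
<li><a href="parts.htm">Auto Parts</a></li>
</ul>

Each anchor says what the page is without stuffing "car" into every link, and the filenames only matter to whoever maintains the site.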
Sorry if this is off topic.
I believe the way for an average small-size webmastering operation to "survive the new Google" is not to design / optimize specifically for Google but to aim for an overall balance across Google, Yahoo, MSN, etc.
My point was that it is not beyond the realms of possibility that the page title, headers and anchor text could all, quite innocently, contain common content, particularly when it was the subject of the page.
I think that it would be wrong to apply a blanket filter on this.
carenginerepairs.html (filename)
Car Engine Repairs (title)
Car Engine Repairs (hx)
Car Engine Repairs (anchor)
The exact match adds nothing from a user perspective, and discounting it fits very nicely with Google's policy of "build your sites for users".
Patrick's case of matching anchor and title is interesting.
The shorter your title, the less choice you have but to match; however, the longer your title, the more room there is to vary...
Anyway, just my penny's worth :)
My site dropped with Austin and came back with Brandy after I added the exact keywords in two additional places: once in the heading and once in anchor text. These words were already in the title, keyword, and description meta tags.
The first six pages indexed after Brandy came back immediately, and 36 others have made it to the top as they entered the index. They are still there, except for two that have disappeared from the index for unknown reasons.
I've made additional changes since then, and the pages have all been crawled many times, but the fresh pages have been slow to show up. I've experimented with adding more of these same keywords on two pages, and they both moved to #1. In one case I added one keyword 12 more times and it didn't drop from #1.
This tells me that at least in the context that I used the keywords, there was no penalty.
"20 sites interlinked suddenly appearing in google all with backlinks already established"
I never said that the sites interlinked. Actually, they don't link to any others in the group. I got links from directories and from other sites with the same theme that value the content. I strictly avoided any kind of cross-linking because I wanted to build clean, strong sites. The goal with these sites is really to test Google: is spam the issue? Is it over-optimization? Or a lack of links?
The 20+ sites do not spam and have strong links and content.
The one thing that could be a problem is local rank. Most of the incoming links are from sites virtually hosted on one or two IPs. Possibly the sites lack local rank because the other links (due to Google's laziness about crawling) haven't been picked up yet.
I wonder how many links it would take - not for a PR boost, but to build enough local rank to prevent Google's filtering (or the extremely poor ranking, for those who do not believe in the filter). 20? 30? 50? How many, and is the PR of those links an issue with local rank? (I imagine yes, but who knows.)
Does anyone have experience with a site that has strong local rank (meaning many links from many different IPs)? Have such sites been affected?
Shouldn't it be "MyAutomotiveSite.com: Car Engine Repair"?
From a user perspective, I really prefer branded titles, and sites with favicons, because it makes it a lot easier when using tabbed browsers.
And from an SE standpoint, you aren't as likely to have all your links and header tags matching your title exactly.
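Something like this, as a rough sketch (the site name is made up):

<html>
<head>
<!-- branded title plus favicon: helps users on tabbed browsers -->
<title>MyAutomotiveSite.com: Car Engine Repair</title>
<link rel="icon" href="/favicon.ico">
</head>
<body>
<h1>Car Engine Repairs</h1>
<!-- heading and incoming anchors no longer match the title word for word -->
</body>
</html>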
Sorry for making some wrong assumptions, but whether they are interlinked or not, 20 sites came to Google's notice because of links from sites "virtually hosted on one or two IPs".
Now if this isn't a flag for Google to suspect some manipulation, then they ought to offer me a job there to help them out with spam tactics. ;)
How could Google take the votes (links) to these sites seriously?
If these 20 sites are also on the same server as each other, have the same registration date, and contain similar whois info, then you are only compounding the issue.
Actually, these sites are subdomains, though I have other, older sites with a similar problem. The IPs of the links to these sites are from completely different servers, though one of the hosts happens to host dozens of themed sites virtually on one IP. I got links from some of these sites, though not all.
My thought is that this does create a problem with local rank, which would simply ignore those links altogether.
So I have to get more links, and watch what the IPs are.