Forum Moderators: open


All people link to me with a two-word phrase - and G penalized me

What to do?


Jessica

8:17 am on Nov 30, 2003 (gmt 0)

10+ Year Member



Hi, I have a site all about "blue widgets". It's also the name of the site.
Everyone links to me with "blue widgets".
And it appears that after the Florida update my site is being penalized for this keyphrase :(

I can't just tell all those people to stop linking to me with the same keyphrase.

What to do?

steveb

8:27 am on Dec 1, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



"If the "most important" sites linking to you are off-topic, what does that say about you?"

That you have a great site?

Oh please Wall Street Journal, please don't link to me.

C'mon, high PR off-topic links are obviously great, and a sign of quality. Reams of low PR off-topic links, that's another story.

nippi

8:45 am on Dec 1, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Finished.

You can't control what others do, so Google will not penalise you for this.

Powdork

8:59 am on Dec 1, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



steveb is correct,
However, and I'm not sure if this is important, there is a different level in different industries as to what counts as 'important'. In my area, a link from Modern Bride Magazine's home page would be huge, but it's only PR 6. I just got a free PR 6 link from JoeAnt (not quite free, it took a lot of work, but $0). The result: site gone. And they linked with my hyphen-free URL. High PR off-topic links are a sign of quality. Google just doesn't recognize quality anymore.

caustic

9:11 am on Dec 1, 2003 (gmt 0)

10+ Year Member



Hrm, is the general consensus here that there is a decided difference between "off-topic" and "on-topic" links? I.e., the LocalScore patent?

Powdork

9:28 am on Dec 1, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I should mention that I know my site hasn't gone AWOL because of the JoeAnt link.

To answer your question, caustic: I don't think there is a consensus. Off-topic links still seem to matter at least as much as before, maybe even more. GoogleGuy even hinted at this back in Domerelda. It is possible the sites that are missing now may come back in as the calculators of local rank, since they seem to match all prior relevance criteria. This may be the update, or perhaps Galen, that gives us the answer to that. It seems more likely, however, that this will be the update that leaves us with more questions than answers.

Miop

10:13 am on Dec 1, 2003 (gmt 0)

10+ Year Member



I hope it isn't backlinks that are the problem - of the thousand or so links to my site, I only built around 300 of them. The rest just linked to me of their own volition.
I would have to spend a *long* time emailing them all and checking that they had removed my link...

Nicola

10:16 am on Dec 1, 2003 (gmt 0)

10+ Year Member



I only built around 300 of them

300? Doorway pages?

super_seo

11:20 am on Dec 1, 2003 (gmt 0)

10+ Year Member



There is no mystery here, people; it's all too obvious what is going on. Google is targeting commercial sites optimized for specific search phrases. If you add a -sadfasdf to your search, the results are as they were pre-Galen. I have 100s of sites, all targeting specific commercial products, that all rank in the top 2 in normal search results. Great sites, independent, fantastic original content, excellent service and satisfaction. All gone, gone, gone. Add -asffefaea to the search query and voila! Google is targeting only main index pages, as this filter is simple and effective. Good luck with the IPO, Google; I'm hoping I can recoup my losses with some quickly turned-over shares. I for one have contributed to your revenue, going from a budget of $1,200 per month to $38,000 per month. Joy joy, happy happy!

Nicola

11:32 am on Dec 1, 2003 (gmt 0)

10+ Year Member



Google is targeting commercial sites optimized for specific search phrases.
Maybe, but there's more to this than meets the eye. Maybe they've identified common affiliate codes and are down-ranking anyone who has them in their source code.

super_seo

11:36 am on Dec 1, 2003 (gmt 0)

10+ Year Member



No affiliate codes, nothing. The really weird thing is that one of our sites (out of over 200), actually one of our older sites (2.5 years), is still holding its position. I can't understand it; it can't be random. All sites have acquired links from similar sites and most have around the same number of links. And all are dominantly acquiring a single two-word anchor phrase. So why was I left with this little tidbit? Does Google actually have some sort of conscience?

Sunset_Jim

8:54 pm on Dec 1, 2003 (gmt 0)

10+ Year Member



I have a destination travel guide site for which I have optimized the index page: title, description, alt, header, body text and incoming anchor text for 6 keywords: a 3-word phrase which is the name of my city, e.g., "My Great City", followed by 3 lodging-type keywords, e.g., hotels, motels and condos. This is my exact title.

When I include all 6 keywords in allinanchor: i.e., my great city hotels motels condos, my index page comes up number 1.

If I include the city name followed by only 2 lodging keywords, e.g., my great city hotels motels, my index page still comes up number 1.

If, however, I include only the 3-word city name followed by 1 lodging keyword, my index page is not to be found.

Likewise, if I conduct a regular search using all 6 keywords, my index page comes up number 1; however, if I use less than the total 6 words, my index page is not found. Before the Florida update my index page would appear in the top 5 for a search on the city name followed by any one of the lodging keywords.

This all suggests to me that Google is requiring an exact match to my 6-keyword phrase and is penalizing my site for anything other than an exact match to those 6 keywords. I have observed that my two main competitor sites, which are optimized as mine is, retain their number 1 and 2 spots. The only difference I can see in their listings is that they show a Directory listing whereas mine does not, although I am listed in the DMOZ directory the same as they are.

My backlinks have anchor text with the same 6 keywords as my title. Since Florida my PageRank has increased from 4 to 6, but my home page is lost except for an exact match to my 6 keywords.

Any suggestions as to where I should begin to reoptimize my site?

kaled

9:15 pm on Dec 1, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



If, however, I only include the 3-keyword city name followed by 1 lodging keyword, my index page is not to be found.

To test a theory, what happens when you add a fifth word
1) common words such as a, an, and, the
2) words that don't appear, e.g. nosegay
3) words that appear but are not optimised e.g. price, car
4) -asdf

Kaled.

PS
When you say "is not to be found", try to locate it using [google.com...] and set results per page to 100.
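kaled's four test cases are easy to run systematically. As a sketch, a tiny script could generate the query variants to check by hand (the function and variant names here are purely hypothetical, just a way to organize the manual tests):

```python
# Hypothetical helper to systematize kaled's four diagnostic cases:
# take the 4-keyword base query and emit each fifth-word variant.

def query_variants(base):
    """Build kaled's four test queries from a base phrase."""
    return {
        "stopword":      f"{base} the",      # 1) common word such as a, an, and, the
        "absent_word":   f"{base} nosegay",  # 2) word that doesn't appear on the page
        "unoptimized":   f"{base} price",    # 3) word on the page but not optimised
        "negative_junk": f"{base} -asdf",    # 4) nonsense excluded term
    }

for name, query in query_variants("my great city hotels").items():
    print(f"{name}: {query}")
```

Each printed query would then be run in Google and the page's position recorded, as Sunset_Jim does below.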

Sunset_Jim

9:41 pm on Dec 1, 2003 (gmt 0)

10+ Year Member



kaled

>>To test a theory, what happens when you add a fifth word
1) common words such as a, an, and, the
Site not found.

2) words that don't appear, e.g. nosegay
Site not found.

3) words that appear but are not optimised e.g. price, car
Site found (#1) if two adjacent non-optimized words found on the page are added to the first 4 keywords, i.e., my great city hotels complete guide. One word, i.e., complete, won't do it.

4) -asdf

With a single -asdf the site is not found in the top 100.
If -asdf -jgtu are added to the first 4 keywords, the site appears at position #2.

kaled

10:01 pm on Dec 1, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Sunset_Jim,

Though not conclusive, this does tend to support the Bayesian-spam theory that was discussed in another thread. If correct, then Google have lost the plot. I posted a thread explaining why in the simplest and most obvious way, but WW declined it. This is not a theory that I could ever have come up with, because it would never have crossed my mind that a corporation the size of Google could be so stupid. Well, we live and learn!

If this theory is correct, eventually Google will drop it or the public will drop Google.

Kaled.

GregR

11:49 pm on Dec 1, 2003 (gmt 0)

10+ Year Member



"Hey people, I seriously doubt that Google is penalizing anyone based on words used to link to them. Just search for slash dot. The #1 listing is of course slashdot.org. Look at the cached page: "These terms only appear in links pointing to this page: slash dot""

"slash dot" is not a money keyword phrase. Inbound link text with money keywords killed my sites for those keywords.

mquarles

11:54 pm on Dec 1, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



And the same method can be used on your competition.

super_seo

6:26 am on Dec 2, 2003 (gmt 0)

10+ Year Member



I believe this is a filter simply targeting a set of commercial terms: cars, insurance, hotels, flights... It is then taking a bell-curve approach to filter your site from the results, the line of the graph being the inbound link anchor text and the peak of the bell curve being the most popular anchor text phrase you are using. If the curve's incline is too steep, the words at the top of the peak are filtered out. I'm thinking a 60%/30%/30%/10% variance on 2-keyword phrases and 70%/10%/10%/10% on 3-word phrases can keep you out of the chopper.
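If super_seo's bell-curve theory were right, the quantity being measured would be how concentrated a site's inbound anchor text is on a single phrase. A back-of-envelope sketch (the 60% cutoff is only super_seo's speculation, not a known Google value, and the function names are made up):

```python
from collections import Counter

def anchor_share(anchors):
    """Fraction of inbound links using each anchor phrase."""
    counts = Counter(a.lower().strip() for a in anchors)
    total = sum(counts.values())
    return {phrase: n / total for phrase, n in counts.items()}

def looks_over_optimized(anchors, top_share=0.6):
    """Flag a link profile whose most common phrase exceeds the threshold."""
    return max(anchor_share(anchors).values()) > top_share

# 8 of 10 inbound links use the identical phrase: 0.8 > 0.6, flagged.
links = ["blue widgets"] * 8 + ["widgets", "great widget shop"]
print(looks_over_optimized(links))  # True
```

On this theory, Jessica's problem in the opening post is exactly a profile like `links` above: one phrase dominating the distribution.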

mquarles

2:20 pm on Dec 2, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



super_seo,

Thoughts about internal v. external links? I seem to have some conflicting evidence there.

MQ

finer9

3:58 pm on Dec 2, 2003 (gmt 0)

10+ Year Member



I have a distinct feeling that the filter has to do with keywords being in the domain name (perhaps hyphen-separated in the case of multi-word terms) and also in the links to that domain. It may also have to do with the same term being in the actual names of the HTML pages.

In fact, I really think it is a 'limit', like SpamAssassin: if you have 2/5 factors you may be fine, but if you have 3/5 your site is dropped.

These are just my own personal theories from all the talk, observation, and the two of my own sites that were affected (the rest were not).
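finer9's SpamAssassin-style 'limit' idea boils down to counting boolean factors and tripping a filter once enough of them fire. A minimal sketch (the five factors and the 3-of-5 cutoff are finer9's guesses, not anything confirmed about Google):

```python
def trips_filter(signals, threshold=3):
    """Trip the hypothetical filter when at least `threshold` factors fire."""
    return sum(signals.values()) >= threshold

# Hypothetical factor checklist for one site.
site = {
    "keyword_in_domain":      True,
    "keyword_in_anchor_text": True,
    "keyword_in_filenames":   True,
    "hyphenated_domain":      False,
    "keyword_in_title":       False,
}
print(trips_filter(site))  # 3 of 5 factors fire, so True: dropped
```

The appeal of this model is that it explains why two of finer9's sites were hit and the rest were not: each site sits on its own side of the same fixed threshold.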

Josh

super_seo

6:30 pm on Dec 2, 2003 (gmt 0)

10+ Year Member



mquarles

I believe there is little or no difference between internal and external links; however, IMHO internal links should be more trustworthy and therefore should pull even more weight.

finer9

I have 10s of domains, both dashed-keyword and non-keyword-laden, and I have recorded no data that would suggest this is being penalized. I'm very confident it's incoming and internal link anchor text only. It should be simple to overcome the filter and come back, but what a slow pain in the butt with so many domains and links. I think Google is just keeping us busy while the real algo comes.

jim_w

7:41 pm on Dec 2, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



James_Dale
>>H1, H2, H3 - when was the last time you saw a modern, professional, worthy site using such old-fashioned tags?<<

I use them because they reformat to the proper size, in theory, with ALL browsers.

super_seo
>>Google is targeting commercial sites optimized for specific search phrases<<

You're right. Optimized for 1 or 2 keywords, regardless of how much relevant content.

Nicola
>>Maybe they've identified common affiliate codes<<

Yeah, and I think it is AdSense. Maybe if 1 or 2 high-volume keywords that advertisers pay higher dollars for are being clicked too often on the sites that are gone, then G is making sure that you cannot make as much money as they do. We need some way to know what percentage of the sites gone for 1 or 2 KWs had AdSense running. Still my best guess, because we didn't do anything. And this indeed seems to be the one constant: people do not know why their sites have disappeared for 1 or 2 KWs.

kaled
>>When you say "is not to be found"<<

I mean not found in the first 1000 places, using a tool somewhere on the net that tells you your position in G up to 1000. So I'm NOT in the first 1000, but only for the 1 or 2 KWs.

>>I posted a thread explaining why in the simplest and most obvious way but WW declined it.<<

They have been doing that a lot lately, more now than during other G updates; this in itself says something about this update and perhaps the sponsors of the forum. But it seems to me that every time something negative about G gets said, there are at least a couple of positive things said about G right afterwards. I'll be surprised if this doesn't get killed, so I have someone sitting beside me here to see me post it.

super_seo
>>It is then taking a bell curve approach to filter your site from the results. The line of the graph being the inbound link anchor text and the peak of the bell curve being the most popular link anchor text phrase that you are using.<<

I have a lot more relevant links than the no fewer than 900 sites in front of mine. I have inbounds from 3 major universities, NASA, and 2 trade publications.

A theory about older sites: our site is going on 4 years old and it is gone for its 1 or 2 KWs. If it is the AdSense thing, then G needs to be reported to the FTC (Federal Trade Commission). That should be done anyway, because if G hasn't done anything wrong, it will squelch any wrongdoing theories, and if they have, they need to be penalized, and that needs to be made public before the IPO so investors and Wall Street will know for sure.

[edited by: jim_w at 7:51 pm (utc) on Dec. 2, 2003]

Chndru

7:44 pm on Dec 2, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



> so I have someone sitting beside me here to see me post it.

Not meaning to dilute your post, but I couldn't help laughing when I read the above line.

jim_w

7:47 pm on Dec 2, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Well, as President Reagan once said, "Trust, but verify."

jim_w

8:19 pm on Dec 2, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Chndru

BTW, I didn't call someone over to watch; they just happened to be here and I was showing them messages about the update. But see what happens when you grew up in the '70s? ;-)

jim_w

9:33 pm on Dec 2, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I did just notice this: if the user searches in G's image search, I still come up in my old position, actually better. So lots-o-graphics, relevant of course, may be the answer until we figure out what we did.

helenp

9:46 pm on Dec 2, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I just don't think Google is doing anything; I think it's a bug, or they are doing updates. I don't understand too much about it.
But really, in my case the only page dropped is one in a language not used often, and checking possible commercial words I use on that page, in **** in that language, really no site has been "penalized" or pushed away from the top 100. Actually, if I remember correctly, that page was indexed several months later than the rest of the languages.
I do recommend you read this [google-watch.org...]
I do think everything is going to be as it was. I hope so.

Chndru

10:06 pm on Dec 2, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



helenp, that article was written on June 9, 2003. In those days, there were heavy rumour mills running about their database pointers and what-not.

helenp

10:17 pm on Dec 2, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I just don't know, but there is no logic in anything...
And as I understand it (with my English), the changes will take a year, if this is correct:
12:04 pm on June 7, 2003
Google has reached its data indexing capacity of 4,294,967,296 (2^32) URLs. Non-image URLs have an ID stored in 4 bytes, so Google is now running out of IDs for stored pages. Once no more URLs are returned "not found" and deleted from the index, the total number of non-image files indexed will soon reach 4,294,967,296, including 3,083,324,652 HTML pages. After that, Google will stop adding new URLs from indexed pages as well as new URLs submitted for indexing.

They are now considering a reconstruction of the data tables which involves expanding ID fields to 5 bytes. This will result in an additional 2 bytes per every word indexed, multiplying the total index size by 1.17. This procedure will require 1000 new page index servers and additional storage for temporary tables. They are hoping to make this change gradually, server by server. The completion of the process will take up to one year, after which the main URL index will be switched to use 5-byte IDs.
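The arithmetic in that quoted claim can at least be sanity-checked. A quick back-of-envelope (the ~11.8-byte figure is only what the quote's 1.17 multiplier would imply; nothing here is confirmed about Google's actual index format):

```python
# Sanity-check the numbers in the quoted google-watch claim.
ids_4_bytes = 2 ** (8 * 4)           # distinct IDs a 4-byte field can hold
assert ids_4_bytes == 4_294_967_296  # matches the 4,294,967,296 figure quoted

# The quote says +2 bytes per indexed word entry multiplies the total
# index size by 1.17, which would imply entries of roughly this size:
implied_entry_bytes = 2 / 0.17
print(round(implied_entry_bytes, 1))  # 11.8
```

So the capacity figure itself is just 2^32, and the 1.17 multiplier only holds if an average indexed entry is about 12 bytes, which is an unstated assumption in the quote.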

t2dman

11:08 pm on Dec 2, 2003 (gmt 0)

10+ Year Member



There are a number of posts talking about index pages being penalised. It is not index pages, but ANY page that is optimised for one of the dictionary phrases. I have had a number of pages within my site dropped, including "Search Engine Optimisation City" and "City Hotels". "City Hotels" mainly had internal links; the SEO City page had both internal and external links. I even had two "Name Restaurant" internal pages killed. Other websites' pages were killed for the same terms, and they had only internal links and virtually no traditional SEO.

espeed

12:05 am on Dec 3, 2003 (gmt 0)

10+ Year Member



Didn't GoogleGuy say that Google does not penalize sites based on external forces (i.e. incoming links)? Otherwise, your competitors could sabotage you.