In FI, CW and DC, my site is in the index. Most search terms are performing the same as usual, EXCEPT the top keywords.
Top keywords are the ones used heavily in anchor text and contained in the title of my page.
The reason I am posting this is not to blame Google or anything. I need to at least find out why this is happening.
Anyone here having the same experience? Maybe we can discuss it here and find out the reason. Maybe there is a new filter working against us.
Part of what has me concerned is that there are sites in the top ten now (for my term) that are garbage like a one-page server notice: "Your Apache Site is Installed". This shouldn't even be in the database and yet it's ranked in the top ten.
This really scares me!
There is no semi-penalty. It's called wacky Dominic. You changed the H1 tag and wacky Dominic did something at the same time, and that's why you came back, not because you changed an H1 tag.
i mean, ww is great and all, and this is a great group of people, but i dare say that we make up an incredibly small portion of the web public. i think that one of the things search technology has not been able to do is deliver not only the most relevant results, but also the most desirable.
how do you do this without paid inclusion?
i have forever complained, to myself, about all of the crappy looking sites that litter the top SERPs. there are a lot of great web designers doing awesome work for very cool companies that can't even get in the top 10 for their own brand, because they don't use web publishing 101 to design their sites; they push boundaries.
i think that if the web is ever going to live up to its own hype, then major changes in the way people search the web are in store for us. this could very well be the beginning of a whole new era in search technology.
then again, it could just be a silly dominic wobble, or whatever the latest popular speculation is.
i have news though, if you don't already know it: diversify your traffic generation, or you will probably end up incredibly bummed in the long run.
google is at war (in my opinion only) with SEOs, because google would like to deliver the most desired results, but SEOs have found ways to cheat the system (backlinks are a great example).
this is america. google is an american company. desire over relevancy.
the cure: build a strong brand, and make it sexy.
i might have gotten a little off topic, but i wanted to share these thoughts.
while everyone is theorizing over a semi-penalty for h1 and excessive anchor text, you HAVE to imagine that anything that gets abused will eventually be done away with.
Hagstrom and all,
Actually my site has reappeared in the TOP 10 after I saw Hagstrom's post about his site reappearing in 8/9 datacenters yesterday. :) Why? Because both of us did something that we believe is right. I didn't post immediately yesterday because I wanted to see if it would stay until today. Yes, my site, which appeared nowhere in the index after the update, is now back in the TOP 10 on 6/9 datacenters for my most important keywords.
What did I do?
1. I removed the H1, which I had only added before the last update. (Coincidence, Hagstrom?)
2. I changed half of my title.
3. I removed lots of SEO from my site - reduced the word occurrences in bold, in links ...
For those who are questioning the semi-penalty theory, please give a new theory that can explain things instead of attacking this one. The theory might be incorrect, as you wish, but it could be correct too, since you can't prove it either way! :)
needinfo and the many who have been standing in the middle,
Your site might reappear a few days later, next update, or a few updates from now ... depending on what you want to do. The reason Google has this *silly* filter is not silly at all. They are trying to catch those who try to manipulate rankings in Google. Those who disagree with this theory are usually the ones actively manipulating their rankings in Google, so they could never accept this theory, because the theory is against them.
For me, I think it is wiser to always adapt to the new google. That's why I always stick with the forum to know "what's new" and "what's happening". Play according to the rules of the game. Who set the rules? Google, because it's their game. :) my 1 + 1 cents.
[edited by: AthlonInside at 8:12 am (utc) on May 26, 2003]
Sure. The phrase is like "Big Red Slidgets". The index page and whole domain are about Slidgets. I couldn't care less about ranking well for "big", and I'm in the top ten for "red". However, what I've read here about seo penalties just isn't possible, as the sites above me all key the same word in essentially the same way. And those few sites that have more anchor text than me weren't hurt. It's not possible that the idea is simply to penalize fifth-ranked sites while not penalizing 1-4, and rewarding sites that become 6-15 for doing approximately the same things.
A three word penalty would make obvious sense in my situation, but if people are seeing something opposite for themselves, then there is nothing to this idea either way.
I'm just stunned I do so well for Word2. I can't believe my little site is ranked so well for such a term. It makes me think Webmasterworld is really cool......... until I look at my Word3 results and want to jump out a window because trite, zero content spam is able to beat me out.
"And the entry at #250 is some oddball page in the middle of my site. My home page is no where to be found!"
I have this for two sections of my site. Two deep pages are the "main" listing, down dozens of spots from where the real "main" page has been listed. The GOOD news I hope... is that yesterday for a little while -in showed much more sane results, with my true "main" pages listed as #4 (used to be #2, dropped to 37) and #5 (used to be #6 dropped to 60). While -in no longer shows this, hopefully there is another index running around in there somewhere with less ridiculous results.
There are always survivors. There are always guestbook spammers who survive. There are always sites with microtext or hidden links that survive! Furthermore, the filter is new, so the strictness shouldn't be set that high, to avoid catching innocent sites. There are X billion web pages in Google's index. Do you expect the filter to catch exactly 56,565 of the hidden-text spammers, 895,156 of the guestbook spammers and 15,564 of the semi-penalty spammers?
Big welcome to Webmasterworld, very interesting thoughts and I had never thought about that before. The key question is does the average surfer want quick information with fast load time (generally seo sites with no flash) or a rich visual surfing experience? At the moment people search for information and will not wait more than a few seconds for a site to load. Google generally lists these types of sites, but if connection speeds improve and more innovative sites load quicker then I think you are right and google will have to find a way of indexing these more advanced sites because people will want them.
Going back to the H1 changes that people have made and the consequent better rankings: I think it is impossible to say if these changes have had an effect. Back links have increased on many listings and other filters are still being applied, so older positions are being restored anyway. This has happened to a few of my sites.... except one where I have used a lot of seo tactics. On another site I did reduce the H1 so that it did not match the numerous anchor texts of links in.... it has risen significantly with the fresh bot. Coincidence? Maybe, but I still think too much seo, especially an H1 matching loads of anchor text, could be a factor.
I keep reading people saying something like 'Links in for your main key phrase are hurting you? What rubbish....' That is not what is being suggested; links in with relevant anchor text are good, even if it is all the same anchor text. The issue is that if that phrase is then over-optimised on your pages, the filter kicks in. There must be occurrences where someone has made a page in all innocence using h1, title, italics, etc. etc. with that phrase, but I bet this is rare. In the majority of cases where all these old tricks are being used, the page has been built by someone who knows about optimisation, and these are the pages that google has got to be suspicious of and is thus lowering in the rankings.
All conjecture... time will tell. I still think it is way too early to make any changes based on theories that are very half baked. But its all fascinating stuff!
No idea. But welcome back :)
BTW - back in post #184 I mentioned that my index page was doing great for "A-city", "widgets A-city" and "wodgets A-city" but had totally disappeared for "widgets wodgets".
Now it dawns upon me that A-city has a German vowel. That means I use "Ü" on the index page, "ü" in all the links and that surfers search for "%C3%BC". Maybe that's why my index page escaped the semi-penalty for "A-city"?
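For anyone wondering where "%C3%BC" comes from: in UTF-8 the character "ü" is the two-byte sequence 0xC3 0xBC, which percent-encodes to "%C3%BC" in a URL. A quick sketch in Python (any language's URL library shows the same thing):

```python
from urllib.parse import quote, unquote

# "ü" (U+00FC) is two bytes in UTF-8, so it becomes two
# percent-escaped octets in a URL.
print(quote("ü"))         # %C3%BC

# Uppercase "Ü" (U+00DC) encodes differently, which is why a page
# using "Ü" and links using "ü" can look like different strings
# to a naive byte-for-byte comparison.
print(quote("Ü"))         # %C3%9C

# Decoding round-trips back to the original character.
print(unquote("%C3%BC"))  # ü
```

So if (and this is speculation) the filter compares anchor text to on-page text at the raw string level, the three spellings "Ü", "ü" and "%C3%BC" might not match each other, which could explain why "A-city" escaped.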
I agree with your idea that there is an important new filter at work - possibly the "holy grail" filter that search engine developers have been looking for since they began. I would call it the "natural language" filter. The idea goes way beyond simple stuff like repetition of words, word counts, frequency etc.
A natural language filter has to determine the main theme or themes of any given page. Once that is determined, the words used on the page have to be examined to see whether they are being used naturally - hard, especially when the context of the theme also has to be determined. What I mean is that once the specific theme of the page has been determined to be widgets (hard enough in itself), the context could be scientific, commercial, user information etc. All these contexts lead to different forms of natural language. Then throw in geography (different countries/languages use language differently) and you need an algorithm that is actually very sophisticated AI.
Get the algorithm right and you catch those who are trying to fool the algorithm itself. Get it wrong and you end up with lots of false hits (penalties).
Perhaps Google has finally cracked this nut. The effect would almost be that the best-optimized pages are the ones that don't seem optimized at all!
Would this mean the end of SEO? I don't think so; it just means a different approach. If you have real products and services to offer users, then it should be easy to talk about them "naturally". The group I suspect will suffer are those sites whose content is made up entirely of affiliate programs - currently these are the sites most likely to be using artificial language to boost rankings.
I don't think Google has anything against SEO but I do think it dislikes sites which are merely affiliate programs dressed up as real content.
I suspect such an algorithm would give great weight to sites that pass this filter, considering them authoritative. This means that on-page ranking could be done by a Freshbot-style update, which would explain why you (and others) have regained places after removing stuff that looked artificial to the new algorithm.
Speculation is one thing, but there is no evidence that Google Guy has ever misled people in this forum. He stated clearly that some mild filters would be applied *after* the database migrated.
What this then means is that no filters were put in place for what came to be known as -sj. Various things happening to various people offer information, but some sort of hand-waving filter runs directly counter to what we were told. Any sort of filter only became a possibility a couple of days ago, after the basic -sj index was on all datacenters, not before.
What should be clear is this index is extremely unstable. The datacenters vary a lot, with pages moving in and out and up and down all the time. But pages using heavy seo are doing far better now than two months ago.
If only they would. Sites ranking well now are all keywords on the page, in some cases pure gibberish. I see a site that has rocketed up the rankings that is essentially 100 copies of itself, all interlinking, with only minimal changes in the text on each page, and no effort at all to make the text coherent. This site has (accidentally I suppose) cracked the temporary secret of -sj -- total seo, doorway pages, anchortext, keyword page gibberish, near-mirror domains all interlinking, and keyword in URL.
Google's deepcrawl failure has forced them to regress to old data, and it should be no surprise that they have also regressed, temporarily, to a much simpler, and easier to seo-exploit algo.
The next six weeks are no content, SEO heaven.
Hmmm. Perhaps the links in have yet to be factored in? It was only yesterday I had a site kick back in when its back links were found. Are the back links from various 'on theme' pages on different servers and different sites? Have the sites linking to you lost PR?
You could be right; the theory is unproven, but there must be lots of other new algos being introduced that may have affected you. Probably the biggest factor is that old data is still being used by google, so we still have a way to go yet. Hence we should all wait and do nothing yet.
The filter works perfectly in your case (if I were the one who designed the filter).
90 links are not a lot, but all 90 links for the same exact anchor text is a lot. That is enough to accuse you of manipulating Google's ranking for that specific keyword in the anchor text. The filter works great. But remember, they penalize only those specific keywords, since it is a semi-penalty. You will still do well with other keywords. Everything makes sense to me.
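To be clear, nobody outside Google knows what such a filter actually looks like; this is pure speculation. But the heuristic described above is easy to sketch. The function below is a toy illustration (the function name, thresholds and numbers are entirely made up): it flags a phrase only when a page has many inbound links and nearly all of them share one exact anchor text.

```python
from collections import Counter

def flag_overused_anchor(anchors, min_links=50, uniformity=0.8):
    """Hypothetical heuristic: if a page has many inbound links and
    nearly all of them use one identical anchor text, return that
    phrase for a keyword-specific ("semi") penalty.
    Thresholds are invented for illustration."""
    if len(anchors) < min_links:
        return None  # too few links to infer manipulation
    phrase, count = Counter(anchors).most_common(1)[0]
    if count / len(anchors) >= uniformity:
        return phrase  # demote rankings for this phrase only
    return None

# 90 identical anchors trips the check; a mixed link profile does not.
print(flag_overused_anchor(["big red slidgets"] * 90))   # big red slidgets
print(flag_overused_anchor(["big red slidgets"] * 40
                           + ["home"] * 50))             # None
```

Note how this matches the "semi" part of the theory: the return value is a single phrase, so only rankings for that phrase would be affected while the rest of the site's keywords ride free.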
Just removed H1 as you suggested, AthlonInside, and will see what will happen...
H1 doesn't help much nowadays anyway, since there is such a thing as stylesheets, which can make an H1 look no larger than ordinary text.
You drop your headers because of some half-baked, incomplete, temporary, outdated update results?
Bang on heini.
AthlonInside has come up with a good theory to explain what is possibly happening, even if I personally do not agree with it. Something to consider BEFORE changing your site:
In this thread, the majority of people who post are, in my opinion, people whose sites fall into the "semi penalty" theory, so just consider how many people whose sites do not fit into it simply do not post. Then consider how small a percentage of the whole web is represented here in WebmasterWorld. Don't change your site because you've read 50, 60, 70 posts from people who back up this theory, because for those 50, 60, 70 sites that do fall into the "semi penalty" theory, I bet there are 5000, 6000, 7000 that don't.
all 90 links for the same exact anchor text is a lot.
search=Coke (I'm making this up)
result#2=Do the Dew
result#3=Be a Pepper
result#5=Why Coca Cola is bad
Ok, maybe it's not such a good idea
Certainly would improve the whole relevancy issue. "What? 300 people linking to this page seem to think the page is about green widgets. The header tag on the page screams that the page is about green widgets. Obviously this green widget page was designed to be deceptive. We need to give this page a semi-penalty. That will teach those people to craft a page that so accurately reflects its content that 300 people linking to the page had absolutely no trouble determining what the page was about..."
I don't buy into that for a second. If it did happen to prove out, then those Ph.D.s at Google have managed to prove that Ph.D. is an acronym for "piled high and deep".