I work for a company as a web designer / IT support / online marketing person. Since our start (six months ago) we have launched around 15 sites. Each site deals with a specific product in its sector of business (the same sector for all sites). Since our launch, we have had no problems with Google or any other search engine. I have read a ton of tutorials and gone through a ton of material on this site researching proper marketing tactics to get good rankings. Needless to say, the effort has paid off quite well, and I have managed to list almost all the sites on page 1 for their targeted keywords. I have also had a surprising number of position 1 rankings, which was very good indeed, and I was quite chuffed with myself.
Anyway, having abandoned the periodic reviews of our logfiles and online marketing for a month or so, I have come back to a total apocalypse.
Almost 75% of the targeted keywords that ranked well (with their pages) are nowhere to be seen now. They haven't moved down a few places or pages, they have totally disappeared! If I include my company's name next to a keyword, the page obviously comes up, so it is in the index somewhere.
To clarify things further, I will try to recollect what I have done with the sites since this change happened.
The site structure is simple: it's basically a small webring, with one main corporate site which links to all the other sites.
This has changed since, as I have interlinked all the sites with each other, so now each site has a small menu with links to all the other company sites (see the sketch after the site list below). I'm thinking this might be the main culprit for Google dropping pages, as I cannot find any other change I made to the sites. But how? Isn't good ranking achieved by linking relevant sites together? I thought this would boost my PR! For the last month or so the only other things I did were to modify meta tags here and there, try to keep things tidy, add a few H1 tags to headings, etc. As I said, the only other major change was adding those links tying all the sites together. Could this be the culprit? If it is not, I am hopelessly lost, as I haven't a clue how to find what the problem is.
All of the site names consist of our company name plus a keyword specific to the product being sold on that particular site. So, taking [example] as our company name, our site names are as follows:
www.example-product1.com
www.example-product2.com
www.example-product3.com etc.
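To be concrete, the cross-site menu I added to every page is roughly this (a simplified sketch using the exemplified domains, not the exact markup):

<!-- menu block repeated on every page of every site -->
<div id="sister-sites">
<a href="http://www.example-product1.com/">Product 1</a>
<a href="http://www.example-product2.com/">Product 2</a>
<a href="http://www.example-product3.com/">Product 3</a>
<!-- ...and so on, one link per site, across all 15 sites -->
</div>

So every site links to every other site, which is exactly the pattern I now suspect.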
Why is Google penalising me all of a sudden?
I'm willing to post more info here if it helps, or if something is unclear.
Help!
[edited by: pageoneresults at 6:24 pm (utc) on Feb. 9, 2004]
[edit reason] Examplified References [/edit]
I thought optimizing a site meant reducing stylesheets and needless HTML/JavaScript and stuffing every nook and cranny with keywords. There were so few words on this site that I'm surprised it ranked on the first page of the SERPs for anything.
Actually, as bandied around these forums most of the time, "optimizing" usually refers to "SEO", a.k.a. search engine optimizing.
Gbot ignores menu JavaScript, cumbersome or not, and isn't good at following JavaScript links, so it's pretty much SOP to hardcode good ol' HTML links to be sure it crawls the path(s) you hope it will. It also (pretty much) ignores stylesheets, and it's necessary (if you care) to use two different stylesheets to support older/thinner browsers. But there is certainly plenty of evidence that Gbot has become smart enough to discern CSS tricks that disguise <H1> and similar tags down to normal fonts for viewing surfers, and other such spammy tricks.
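To illustrate the difference (a minimal sketch, not anyone's actual menu code):

<!-- a script-only link Gbot is unlikely to follow -->
<a href="javascript:void(0)" onclick="window.location='products.html'">Products</a>

<!-- the plain HTML equivalent it can always crawl -->
<a href="products.html">Products</a>

Hardcoding the second form, or duplicating the menu links in plain HTML elsewhere on the page, is the usual belt-and-braces approach.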
My (now "de-ranked") sites depended upon the many, many pages covering products all named in variations/model numbers of the 3-word keyphrases (2 of each), heavy alt tag optimization and the use of title tags.
I'm reading a lot, especially in recent searchenginewatch articles, that alt tagging, or at least the overuse thereof, may hurt. There is also a lot of buzz that unless you've declared the site to be Section 508 and W3C WCAG compliant, title tags are a no-no too. But I haven't heard anything solid to confirm or deny this from the few people at Google who speak publicly, and/or from GoogleGuy.
Actually, I couldn't agree more with marmalade..
..it's the blind leading the blind.
As to the great hue and cry that it's all a clever AdWords ploy, a prelude to the (now defunct) IPO, et al.. to wit:
Now, I have heard all the theories of OOP (the over-optimization penalty), conspiracies, and the like.
Just my 3 cents, and a HO.
:)
Please, no more. Everyone knows it's only a problem if it's used like this on a black background: <font face="comic sans ms" size="6" color="#00ff00"><b>Widget</b></font>. Apparently it trips some sort of hideous filter.
Nuff said, I'm off to find my white stick.
I have to frame that and hang it. Chelsea, in my view, is 100 percent correct with it.
We didn't even optimize our site, and would have no idea how to even if we wanted to. It just got blown away, except of course for terms which had nothing to do with the content.
We have banned Googlebot to stop that nonsense and won't allow it back until we see evidence that their problem is solved.
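(For anyone who wants to do the same, the ban is just a robots.txt rule; ours is essentially this:

# robots.txt at the site root - shut out Google's crawler only
User-agent: Googlebot
Disallow: /

Every other well-behaved spider still gets in, since the rule names Googlebot specifically.)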
>>> Did you use the "comic sans ms" font on your site?
>>> Please, no more. Everyone knows it's only a problem if it's used like this on a black background: <font face="comic sans ms" size="6" color="#00ff00"><b>Widget</b></font>. Apparently it trips some sort of hideous filter.
My site that had been #1 for my keyword for years on almost all the engines (not a common keyword) got bumped to #25. For my #2 (popular) keyword, where I had been #5, I'm no longer found. Most of my site is done in Comic Sans! I didn't know till now that that's the sign of a bad designer. Guess it's back to Verdana.
What I've read so far would also suggest that maybe adding H1 tags didn't help either.
My only other unscientific, non-SEO opinion of what Google is doing would be that they are simply messing things up enough to keep everyone off balance and "blind" to their real algorithms.
I think "the mad" Google still handles the links in reverse - when you link from page a to page b, now page b gets the credit. So I guess that's the main reason selflinking works now.
Combine this reverse-link behaviour with the power of example.com/keyword1/keyword2 URLs and you get an idea of why so many DMOZ clones are all over the SERPs. There are still queries ("keyword1 keyword2") that return 80 DMOZ variants in the first 100 results.
So much for the "duplicate content filter" and G's patent in that area...
Keep in mind that the problem with repeating URLs in the SERPs still exists, and I guess these days we will see some major changes in the SERPs.
P.S. There is strong evidence that Gbot reads the URLs in JavaScript code.
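That is, even a URL that only ever appears inside a script seems to get picked up, e.g. something like this (a hypothetical snippet, just to show the pattern):

<script type="text/javascript">
// the URL below never appears in an href, only in script source,
// yet pages like this reportedly end up crawled
var nextPage = "http://www.example.com/page2.html";
</script>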
On the other hand, there is no strong evidence of hidden-text penalties.
Google's cache header now shows things like:
These search terms have been highlighted: keyword1 keyword2
These terms only appear in links pointing to this page: keyword3
Often those pages are in top positions of the SERPs, above pages that contain all of the keywords and have bigger PR.
More confusingly, those backward links (containing keyword3) are apparently not placed on pages with PR greater than 4, as they do not show up in a link: search.
I think we are currently definitely in a state of "the blind leading the blind": some people have "some" idea of what's going on, but it seems to me it is still all one big guessing game. It is rather unfair of Google to drop sites out of the blue, like it did with our company's group of sites. I do not think we could have been hit for "over-optimising", as I simply didn't have enough time to do even normal optimisation on a daily/weekly/monthly basis. Very unfair, I say, considering that a big part of our business depends on the number of visitors coming to our sites.
It also makes me very worried, considering that the only safe alternative for drawing visitors to the sites is to use Overture or AdWords, and that COSTS MONEY. The sum we are currently spending per month is equal to twice my annual wage.
I guess the fat cats are just getting fatter (Overture and the like) while we normal people are kept blind. Maybe Google's dance and "random" dropping of sites is intentional? Maybe they just want to make sure we keep adding money and bidding on our AdWords and Overture terms. Call me paranoid, but the thought has crossed my mind.
All I can do right now is take off the links interlinking our corporate sites and hope that improves our rankings.
Did someone mention that adding a CSS style to an H1 tag is also considered spamming? (I've done this too.)
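To be specific, all I have is something along these lines (a simplified sketch of my stylesheet, not the exact rules):

/* restyle headings so they don't dominate the page visually */
h1 { font-size: 12px; font-weight: normal; }

I only meant to make the headings match the rest of the design, but given the talk above about CSS tricks that shrink <H1> tags down to normal fonts, I can see how a filter might not know the difference.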
What the hell? This whole thing is starting to p**s me off...
The two top sites in the two-keyword search where my sites vanished use both. One has only one instance of each keyword, in separate option lists called by JavaScript, and the other contains the two keywords only in a table hidden with CSS.
At least in this case, Gbot sees them. I know there are probably other cases where it appears these are being ignored. There are probably other unknown factors coming into play here. That's the problem.
Maybe you just have to compare your site with those above it and make adjustments.
The latest update has my site back on the first page for my big two-word phrases. Where I have dropped is in my three-word phrases. If I had to drop, I am glad it is here; but when I say dropped, I really mean dropped off the face of the earth. You would think that if the site ranks well for KW1 KW2, it should rank well (or at least the correct subpage should) for KW1 KW3 KW2.
>>> The two top sites in the two-keyword search where my sites vanished use both. One has only one instance of each keyword, in separate option lists called by JavaScript, and the other contains the two keywords only in a table hidden with CSS.
NOW I'm curious...
You're saying that the only keyword incidence is within a JS link, and within a table *completely* hidden by CSS..?
I know from experience that Gbot "ignores" menu JavaScript and its peripherals, and can follow links it finds if they're presented in HTML form. The blurb below is still up in the Info for Webmasters section:
Use a text browser such as Lynx to examine your site, because most search engine spiders see your site much as Lynx would. If fancy features such as Javascript, cookies, session ID's, frames, DHTML, or Flash keep you from seeing all of your site in a text browser, then search engine spiders may have trouble crawling your site.
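(Trying that yourself is a one-liner, assuming Lynx is installed:

lynx -dump http://www.example.com/

which prints the page as rendered text followed by a numbered list of the links a text-mode client actually sees.)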
..but as to Gbot parsing a table hidden from any user-agent identified as a browser (understanding that it arrives as a browser, not a crawler)..?
=+ :( =+
I said the top two spots; actually it was #2 and #3. The site in the #2 spot had each keyword only once (visible page and code), and that was in two different anchor-text spots within option lists with no HTML around them. They were being called by JavaScript.
The site in the #3 spot had both keywords in a table that was not visible on the page, and nowhere else. Not being familiar with CSS, I did searches on the code for the table and came up with a page instructing adult webmasters how to use this trick to fool search engines.
That site has since removed the table, or hidden it even better. Maybe he decided it wasn't worth the risk. I'm waiting to see how he ranks after his new page is indexed.
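For anyone curious, the trick as that page described it boils down to something like this (my reconstruction, not his exact code):

<!-- present in the HTML the crawler fetches, invisible in any CSS-aware browser -->
<table style="display:none">
<tr><td>keyword1 keyword2</td></tr>
</table>

A text-only client, or a spider that ignores stylesheets, still reads the keywords; a normal browser shows nothing.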
I remember that GoogleGuy replied at one point in this thread. Why was his reply deleted? Last night I was too tired, so I decided to read it this morning, but when I got up this morning I could not find GoogleGuy's replies in this thread.
Could somebody repeat what GoogleGuy said in this thread about the possible reasons for the dropped-site problem?