jj, GoogleGuy wrote:
|And then you have 'h1 align="center"' followed by over 60 keywords of hidden text with things like brand names of treadmills and elliptical trainers. |
Maybe I'm naive, but it seems to me that stuffing 60+ keywords into an h1 heading would raise eyebrows even if the text weren't hidden.
These are what make it fishy.
1. It is the first piece of code after the body tag. That is to make sure the spiders find it first.
2. It uses the h1 tag which suggests importance, although it is obvious the h1 tag is made to look like regular text.
3. The negative div is used to hide it.
It certainly looks as though it is done to fool search engines. For me, in IE and Firefox, it is visible but placed directly on top of your footer, so neither can be read.
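For readers without access to the page, the pattern being described would look roughly like this (a hypothetical reconstruction, not the site's actual markup):

```html
<body>
  <!-- First thing after <body>, so spiders encounter it immediately -->
  <div style="position: relative; top: -100px;">
    <h1 align="center" style="font-size: 11px; font-weight: normal;">
      treadmill-keyword treadmill-keyword elliptical-trainer-keyword ...
      <!-- ...60+ keywords, styled to look like ordinary text -->
    </h1>
  </div>
  <!-- real page content follows -->
</body>
```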
I also boggle at anyone having an <h1> at the bottom of a page. The main heading goes at the top of the page.
Run your pages through [validator.w3.org...] and make sure that you tick the box for "Show Outline". On the results page, scroll down to the bullet point list headed "Outline".
Inspect the result -- if it does not look like a summary of your document, then you are abusing the heading tags.
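For illustration, headings used semantically produce an outline that reads like a table of contents (the page content here is invented):

```html
<!-- Invented example: headings mirror the document's structure, so the
     validator's "Outline" section reads like a summary of the page -->
<h1>Exercise Equipment Reviews</h1>
<h2>Treadmills</h2>
<h3>Folding models</h3>
<h2>Elliptical Trainers</h2>
```

The resulting outline - Exercise Equipment Reviews, then Treadmills, Folding models, Elliptical Trainers - summarizes the page. Sixty keywords crammed into a single h1 obviously would not.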
Good point, europeforvisitors.
it was in sentence form :)
I have changed it for the client
Powdork, the -100 in the tag was not meant to hide it but to have it show at the bottom of the page.
This was the only way to dynamically get it to show at the bottom of the page, from what the programmers are telling me.
In my browser it is on a white background and very easy to see, so I never saw a problem.
either way we are removing it
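For what it's worth, content can be positioned at the bottom of a page without negative offsets that risk hiding it. A sketch (the class names are made up):

```css
/* Keep the block in normal flow at the end of the markup: */
.bottom-keywords {
  clear: both;        /* drops below any floated columns */
  text-align: center;
}

/* Or pin it to the bottom edge of a container that has
   position: relative -- no negative top offset needed: */
.pinned-bottom {
  position: absolute;
  bottom: 0;
  left: 0;
  width: 100%;
}
```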
Wow that's the first time I've seen a number of people band together and give quality advice in multiple forms on a single subject to solve a problem for a webmaster in need...
Most of the time we just argue about why we dropped 5 spots on our big keywords, and our definition of spam...
Makes me wanna shed a tear ;)
You do have to give Google some credit for actually reaching out to individuals, automated or not, on the potential status of their place in the index. What they are doing is not spamming, what they are doing is WARNING us of an impending drop in traffic. If that's not caring about your users then I don't know what is. To all those site owners who say they don't need Google or care about Google, I say poppycock!
I mean, sure, Google Support is slower than death crawling backwards, and I'm pretty positive there's something shady with AdWords, but it's not like they HAVE to try and help us. GG will probably come back in here and get riddled with questions, but he doesn't have time for that. We're lucky enough that he took the time to explain one case. And the qualified rest of us finished up (I'm talking about you guys I just barely qualify). That's what we should be doing.
Man, that was sappy. I just don't see teamwork too often here...
Question... if resizing h1's with CSS is a bozo no-no, can we just block G from crawling our CSS via robots? Or would they feel you have something to hide? Let's face it, h1's are nasty to look at. Any ideas?
I think the issue was a combination of:
- using 60 words within a <h1> (see msg #174)
- using inline CSS to both reduce the font, AND position the H1 tag off page (top:-100px)
I don't think that any search engine has any issue with using semantic layout correctly; or using external style sheets for presentation.
If you saw the site using IE - then turned off the stylesheets (see msg #163) - you'd see what the issue was.
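In other words, the difference is roughly this (a sketch, not the site's actual code):

```html
<!-- The problem pattern: inline CSS that both shrinks the heading
     and pulls it off the page -->
<h1 style="font-size: 10px; position: relative; top: -100px;">
  ...60 words of keywords...
</h1>

<!-- Ordinary presentational styling via an external sheet, which no
     search engine should have any issue with -->
<link rel="stylesheet" type="text/css" href="site.css">
<h1>Page Title</h1>
```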
I used the validator on Google.com home page. 40 errors!
No hidden text though, so I guess we'll let them stay in the index.
My point was just to show why it would look fishy to Google. Also keep in mind they can use some funky browsers (probably even something in-house) to look at things. Like I said, for me, on IE and FF it is visible, but it makes your site look broken. Besides, they are all there in your brand links which should be much more effective seo wise than your h1 tags anyway.
|Powdork the -100 in the tag was not meant to hide it but have it show at the bottom of the page |
jjdesigns4u, glad to hear it. I'll submit a reinclusion request for your site tomorrow AM. Just guesstimating, I'd say the site should be back in Google's index by Thursday afternoon or so Pacific time.
I'll also put in a request that when we send emails about hidden text in the future that we indicate some of the text we're talking about.
Subtle warnings about layout have been around for a while, both from G itself [google.com ] and Brett's 26 step guide [searchengineworld.com ] .
As an experiment, take one of your pages and strip out ALL the html code and styling then see if your page makes sense. Then add your styling back in. Remember that any styling which pulls text OFF the page is going to look suspicious.
One thing that I haven't seen mentioned yet but may well cause a G trigger to go off is unusually small text sizing being applied to any element, especially hx ones. It would be very easy to check for.
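For instance, a rule like the following would presumably be trivial for a crawler to flag (the specific size is my own example, not a threshold anyone has published):

```css
/* A top-level heading shrunk to near-invisibility -- easy to detect
   mechanically, and hard to justify as legitimate presentation */
h1 { font-size: 2px; }
```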
>>I'll also put in a request that when we send emails about hidden text in the future that we indicate some of the text we're talking about.<<
Thanks. That will save both time and much guessing on the site owners'/webmasters' side.
I'm curious as to whether any CSS styling that uses negative positioning will be flagged as potentially abusive, particularly as there are various CSS-driven approaches for generating rollover graphics/navigation currently promoted by large web developer sites (e.g. the "Sliding Doors" method).
|question...if resizing h1's with css is a bozo no-no, can we just block G from crawling our css via robots? Or would they feel you have something to hide? |
Robots might just ignore such an instruction. However, I think in this case, it was inline CSS that caused the problem but I didn't bother to search out the offending page.
I imagine people will start cloaking their CSS files in the not too distant future. If I were running a major search engine, I'd send half my robots out from anonymous IP addresses using fake user agents - of course, they'd have to be well-behaved (like users) and follow robots.txt exclusions too. I think it's a safe bet that if Google isn't doing this yet, they soon will be.
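For reference, blocking a stylesheet directory is just a standard robots.txt Disallow rule (the directory name here is made up) - though, as above, a crawler auditing for abuse might simply not honor it:

```
User-agent: *
Disallow: /css/
```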
So why is my site being penalized? No one, including Google, seems to know.
We have removed everything we spoke about.
I really appreciate everyone's help on this.
Powdork thanks for the help
SO - Google told you what the problem was, told you how to change it and hell, even put in the reinclusion request for you!
Who do you have to sleep with to get that kind of service?
theBear thanks for all your help also
Thank you for taking a look at the site and helping take care of what needed to be done
|SO - Google told you what the problem was, told you how to change it and hell, even put in the reinclusion request for you! |
Not the exact history of this case ;-)
- jjdesigns4u posted his case on the most powerful forum on this planet: Forum 30, WebmasterWorld
- kind fellow members (including GoogleGuy) discuss the matter, share, follow up and help.
And we have a happy ending.
|I'll also put in a request that when we send emails about hidden text in the future that we indicate some of the text we're talking about. |
This would be a fantastic service. There are many webmasters that have unintentional problems like hidden text. I think these emails are a great service to webmasters. Thanks Google.
If there was a paid service where google could tell a webmaster any factors that could be hurting their rankings (penalties they may have tripped, eg. duplicate content) then I would have used it a few times already.
could not have said it better myself!
The comment was meant as a joke; I assumed (never assume!) people would take it that way.
Well done jjdesigns4u, hope all goes well.
If it takes this much effort for what I guess is a major site then what about the rest of us?
Still, it's a start, Google.
Speaking of hidden text, some VERY popular blog software inserts RSS related code inside comment tags for each entry. Basically, it's an excerpt (about 30 words) from the first lines of each post/entry.
The specific RSS related tag is dc:description
I had always avoided putting keywords in comment tags, as this was a spammy trick that worked for a while years ago, but now it's being done automagically - would this be flagged as a no-no?
Wouldn't one expect Google to just ignore text between comment tags?
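For anyone who hasn't seen it, the pattern looks something like this (an illustrative sketch, not copied from any particular blog package):

```html
<!-- Emitted automatically per entry: an RDF block, inside an HTML
     comment, carrying an excerpt of the post in dc:description -->
<!--
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
         xmlns:dc="http://purl.org/dc/elements/1.1/">
  <rdf:Description
    dc:title="Post title"
    dc:description="The first thirty or so words of the post, excerpted automatically..." />
</rdf:RDF>
-->
```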
|google wants the web to adapt to it, when IT should adapt to the web. |
Absolutely. Google became THE search engine by returning the sites that best served the interests of those using the search engine. Now they've become about excluding sites which don't live up to Google's standards of form. Ridiculous. Could it possibly be that what I'm looking for is on one of the worst written, most w3c standards violating, most keyword stuffed webpages on the Internet? Yes, it very well could be. But I'll never find it if Google refuses to list it.
One thing that is annoying the heck out of me regarding my own current troubles (my site has completely disappeared from Google) is that I can no longer find things on my own site! I'll remember a specific phrase that I used on my site 3 years ago, and I'll want to find the whole post that I wrote, and now I can't because Google has dropped my site. What's more, I use this search technique with A LOT of websites. How many of my searches are now failing because Google has refused to continue indexing the sites?
They're on their way out. Are you using Google now just out of habit? I've noticed, lately, that MSN and Yahoo are giving every bit as good of results for the things that I search for as does Google. So why do I use Google? Answer: habit. That's all. And it's the toolbar that I've placed on my browser. I expect to be removing it in the next few days, depending upon how happy I am with the resolution of my site's current penalty (I guess that's what's going on) in Google.
I agree here; my site is constantly ripped off by copyright infringers, and they now rank better for my content, even when my company name appears on their site. This is just not right. If I type in some text from my site in quotes, PR 0 sites and supplemental pages, including 404s, rank better for my text that is copyrighted. These are blatant rip-offs.
A little help please G...
Reinclusion requests, htaccess rewrites, framebreaking code, w3c validation and HTML cleanup, posts to both GoogleGuy and Matt Cutts, making sure we are not in any link farms, adding quality content, making sure we have no hidden text or blackhat techniques being used, and on and on and on...
Yet we get ZERO feedback from Google on why our site is penalized. It's like it's just lost in the system, and Google doesn't really have an answer for the big WHY.
How many of the rest of you feel that's the case with your site? That it's just lost in the system, and Google has no answer as to why that occurred, so they just don't answer nor do anything about it. Do you think the same treatment would apply if your site was a MAJOR corporation?