Does anyone know how to find the issue they are talking about? We have not done any hidden text on our site, so we are at a loss. Our site is a huge dynamic e-commerce site, so I don't even know where to begin.
They mentioned they detected "Hidden text/links on <domain name>.com" and also said that some pages are being temporarily removed for 30 days.
Does this mean the whole site is being removed? How can I find out where they found this?
Obviously we are FREAKING OUT.
[edited by: jatar_k at 4:07 pm (utc) on Oct. 11, 2005]
[edit reason] no email content [/edit]
Most of the unsolicited mail I get is a copy of exactly the same message that the spammer sent to millions of people. The message that Google is sending out goes to that site for a particular reason, with a specific purpose: it is opening a line of communication, not force-feeding an unwanted marketing message.
The Google emails only become spam if you have thousands of websites causing a penalty, which causes Googlebot to email you 1,000 times. Even then, Google is only trying to help you keep your 1,000 websites listed so you can continue generating money from nothing.
It's simply not spam.
If it interests you, and you appreciate that kind of mail from Google, then it's a different story.
Why spam? Because it's telling me what to do on my page so their product can make them more profit. If I don't do it, then I defeat their algo (been doing that forever now).
They're not helping me make a profit; they're helping themselves by spamming me, and therefore making money from unsolicited email.
Money from unsolicited email = spam. What's there to argue?
I'm number 1 for most of my searches, big-money terms... you can keep defending Google, or you can realise they're not your buddies. I for one prefer a fat bank account and GoogleGuy very far from me.
When you catch a big one, shut up and pull it into the boat. Don't make noise, or you'll scare all the other fish.
What google is telling you is that if you find a secret spot where the fish is you should tell them. You should not tell them anything. Get your fish, shut up and keep google at bay.
The sooner you learn, the sooner the profits grow for you.
I would like to have a warning before Google yanks my fishing license.
If you make your living by fishing, then it's a good thing to have a fishing license.
If Google is going to revoke my license, then a warning is very nice of them. If you get your license revoked (and I pray that you never do), you will be singing a different tune.
It's a lonely, sad world when your license is revoked. Nobody will link to you because your site has no PageRank. Most webmasters shun you and think you are a scraper or spammer.
You eventually lose all respect for yourself as an individual. It's just sad...
I have a different perspective, though: Google is taking my fish! I guess it all depends on where you're coming from.
I know at least 3 or 4 very well-known companies in my area which are PR zero on Google, due to the amount of affiliate links to them (a wild guess as to the reason for the PR zero). They've been like that for years... and they don't give a D#$%!
That's the attitude, in my opinion. Catch your fish; it doesn't matter whether you're PR zero or otherwise.
It doesn't get lonely. And normally the loneliest fisherman catches the most! Anyway, good points; you're looking at it from a different perspective.
If so, I think I know your site. And if that is the case, please check your site again with CSS turned off. Specifically, the CSS part that looks like
position:absolute; left:163px; top:-100px; width:598px; height:46px; z-index:1
And then you have 'h1 align="center"' followed by over 60 keywords of hidden text with things like brand names of treadmills and elliptical trainers.
If I'm looking at the wrong site, I apologize. I'll ask about putting a tracking number or actually including the hidden text in future emails, so that people have a better idea about where to look for the hidden text on their pages.
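For anyone lurking who is trying to picture the pattern GoogleGuy just described, here is a minimal sketch of how that CSS reads when put together. The keyword text and the surrounding markup are invented for illustration; only the style values come from the post above:

```html
<!-- The negative top value pushes the whole block 100px above the
     viewport, so nothing inside it is ever visible to a reader. -->
<div style="position:absolute; left:163px; top:-100px; width:598px; height:46px; z-index:1">
  <h1 align="center">treadmill brand names, elliptical trainers, and dozens
  more keywords that only a crawler would ever see</h1>
</div>
```

Viewing the page with CSS disabled, as GoogleGuy suggests, makes this kind of block drop back into the normal document flow, which is why the hidden text suddenly becomes visible.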
I don't think I have any offending code. God, I hope not... over 2,000 pages and building. No H1 that I know of...No css, each page is copied from the last. I know I am ignorant and that css is preferable if a change is required but I can't handle css. I am one of the moms and pops, not a pro...
position:absolute; left:163px; top:-100px; width:598px; height:46px; z-index:1
I hope you are not implying that the "top:-100px" is necessarily hidden-text spam. I have used it in the past. It is a valid programming method of hiding things like a submit button when you have an onclick function call (a JS submit button) and you still want to be able to submit by pressing Enter as well.
Basically I'm wondering if you could let us know how innocent CSS might cause a penalty because of the way it's read by the bot.
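A minimal sketch of the legitimate pattern described above (the form name, action, and field names are made up for illustration): the real submit button is positioned off-screen so that pressing Enter in the text field still submits the form, while a styled link fires the same submit via JavaScript.

```html
<form name="searchForm" action="/search" method="get">
  <input type="text" name="q">
  <!-- Off-screen submit button: not visible, but Enter in the
       text field still triggers a normal form submission. -->
  <input type="submit" style="position:absolute; top:-100px;">
  <!-- The visible, styled stand-in that submits via onclick. -->
  <a href="#" onclick="document.searchForm.submit(); return false;">Search</a>
</form>
```

The CSS here is indistinguishable from the hidden-keyword case by the style rule alone; the difference is what content is being moved off-screen, which is presumably what a hand review has to judge.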
>> Hmm, what did I say about "creative" abuse / misuse of heading tags just a few posts back (#109)? <<
You called mine abuse when it is misuse, and that is only according to you and the W3C; I suppose the ODP would probably be offended too. All those that crave structure in the order of the world. I am not one of those.
The heading tag is an "h" and a number that, when combined with CSS, can alter the appearance of an element. You can say it is supposed to do this or that, but ultimately it is up to me to use it in the fashion I like on my sites. And ultimately I must lie in the bed that I make. If it hurts me in Google, that is my problem.
I personally don't feel it would hurt anyone in Google who is not doing it to gain an advantage in search engines. It is obvious when that is the purpose. These are hand reviews we are talking about, and I would be very comfortable with any Google employee looking over either of the sites. They've been there before (lord knows I pester them enough) and I am sure they'll be there again.
Very nice to see you posting again.
Any comments for my remarks posted previously:
GSQT = Google Search Quality Team
The only question to be solved in this connection, IMO, is how to make it clear to the recipients that the alert email they receive originates from the GSQT; there should be no doubt about that.
This might be done by adding a validation link to the alert email. For example: "Click here to validate that this email was sent by the GSQT", which leads you to a page on Google with text to that effect.
CSS isn't very difficult; it is quite easy to pick up the basics. You can't go far wrong if you make your pages from headings, paragraphs, lists, tables, and forms (block elements). Delete all of the <font> tags from the document and use a stylesheet instead. In there, you simply have one line of CSS to style each block (size, colour, typeface, etc.), and you tag any exceptions with a "class" (like <p class="footer">, for example).
Basic CSS is a replacement for the <font> tag, and can be mastered in a few days. Advanced CSS sees the stylesheet also position the content on the page. You can safely leave that until later.
Merely marking your content with semantic tags (defining the content as headings, paragraphs, lists, tables, and forms) and using an external stylesheet can cut the HTML bloat by 50% or more.
I recently converted an 81K HTML document that someone had produced in Microsoft Word, and reduced it to 14K of HTML code and Content, and just 8 lines of CSS styling. It took about 20 minutes to do it by hand.
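As a rough before-and-after of the <font>-to-stylesheet conversion described above (the class name and styles are just examples, not taken from the converted document):

```html
<!-- Before: presentational markup repeated on every element. -->
<font face="Arial" size="2" color="#333333">Shipping is free on orders over $50.</font>

<!-- After: a semantic block element, styled once from the stylesheet. -->
<p class="footer">Shipping is free on orders over $50.</p>

<!-- In the external stylesheet: one rule per block, plus the exception. -->
<style>
  p        { font-family: Arial, sans-serif; font-size: 0.85em; color: #333; }
  p.footer { font-size: 0.75em; }
</style>
```

Multiply that saving across every paragraph, heading, and table cell and the 81K-to-14K reduction mentioned above becomes easy to believe.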
Funny that you should bring up the ODP. No, the ODP doesn't care about valid code, but they do care about people submitting according to their guidelines.
>> You can say it is supposed to do this or that but ultimately it is up to me to use it in the fashion I like on my sites. And ultimately I must lie in the bed that I make. If it hurts me in Google that is my problem. <<
Fair enough; and the same applies to the ODP too.
>> You called mine abuse when it is misuse. and that is only according to you, the w3c.. <<
If anyone can write HTML the way they want to, then I have absolutely no idea why the W3C would devote so much effort to producing documents like these: [w3.org...] - those are supposed to help unite the way that all browsers work and how all authors construct their code. Aren't they? [w3.org...] Doesn't the term "best practice" mean anything?
In an earlier post, I made the point that many code examples on msdn are hidden until a link is clicked (presumably displaying a div but I've never looked at the mechanism).
I presume that you have no plans to remove these pages; certainly, to do so would be detrimental to the index. I presume that other legitimate sites use similar techniques, and that removal would also be detrimental to the index.
Given these presumptions, and Google's stated preference for algos rather than human intervention, would it not be logical to simply ignore text that is identified as hidden? This would also be consistent with Google's policy of secrecy with respect to its technology and algos.
If pages are banned automatically, mistakes will be made, and the index and its users will suffer. Pointed ears are not required to see that this policy is both illogical and inconsistent with Google philosophy.
One more thing. I use the exact CSS method you highlighted to trim off the edges of a <select> box. But, hey, if you ban that page, I think it might even plug a PR leak - now there's irony.
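Roughly the mechanism being guessed at in the msdn point above (the element ID and link text are invented for illustration): the code sample sits in a div that starts out hidden and is toggled visible on click.

```html
<a href="#" onclick="var d = document.getElementById('sample1');
    d.style.display = (d.style.display == 'none') ? '' : 'none';
    return false;">Show/hide code example</a>

<!-- Hidden until the link above is clicked. A policy that simply
     ignored or banned all display:none content would also drop
     legitimate collapsed sections like this one. -->
<div id="sample1" style="display:none">
  <pre>printf("hello, world\n");</pre>
</div>
```

Which is exactly the problem raised above: the same CSS hides both spam keywords and genuinely useful collapsed content, so the CSS alone cannot decide guilt.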
I thought I was using all the right gear, line, casting technique, minnow bait and rowboat - I want my fishing license back, or at least some clue as to why it was yanked, for how long it will stay that way, and if there is anything at all I can or should do, what that might be... :(
It's a trial, so right now it is not perfect; I guess that is why you would call it a trial. In the jjdesigns case, it appears to have required some heavy investigation. Thanks to GoogleGuy for shedding some light, because even if GG's example wasn't jjdesigns' site, it at least might be of benefit to someone lurking in this forum trying to figure out what Google's problem with their site might be.
For two years, I banged my head trying to figure out what Google's problem with my site was. I found the solution only by chance, so I would have highly appreciated an e-mail that at least narrowed my search.
I wish Google all the best in trying to help out webmasters who want to be good citizens but may have unintentionally caused themselves to get caught by Google's effort to thwart spam.
kaled, I absolutely see your point about ignoring hidden text in indexing. However, if a Google user gets a page and then discovers that it had hidden text, they typically are unhappy about it. An explanation like "we ignored the hidden text in indexing; that page would have been returned anyway" usually doesn't make the typical user much happier. We've certainly had this same discussion internally, but I think it's better for the original pages to remove the hidden text.
>> Funny that you should bring up the ODP. No, the ODP doesn't care about valid code, but they do care about people submitting according to their guidelines. <<
No, that was completely intentional. Your way of thinking is very much in line with theirs. You understand not everyone thinks like that, right?
>> Fair enough; and the same applies to the ODP too. <<
Yes, and I have many sites listed in the ODP. The only one that is not listed was removed when I called a meta an @$$#0!& (although I couldn't find anything to that effect in their GUIDELINES).
>> Doesn't the term "best practice" mean anything? <<
Yes, it means guideline or suggestion, which is exactly how I take it.
>> However, if a Google user gets a page and then discovers that it had hidden text, they typically are unhappy about it. <<
When does that ever happen if it's not a webmaster? Why would they be unhappy?
Added: never mind. It's not like I say I am a webmaster when I fill out the "Dissatisfied with results" form.
I don't run Windows or IE so I can't speak to that line of browsers.
The only one so far that has that heading as invisible text is Konqueror.
The versions of Mozilla, Netscape, Firefox, Epiphany, and Opera that I have render that heading as visible, usually at the bottom of the page. It depends on the rendering engine (and these puppies have been known to have gotchas: if you page back and then forward, some elements that weren't rendered on the initial retrieval will show properly).
As for what Google is attempting I hope they keep it up, it is much better to know than to guess.
GoogleGuy, what about those sites hiding text under pictures, and your system getting totally confused when dealing with name-based sites? Any timeline for fixes for those as well?