I've been asked by my peers to explain why you shouldn't add too much JavaScript to a page from a search engine's point of view, and I can't think of an explanation that won't confuse them totally. If anyone could direct me to a post, or post here, it would be of great help.
Thanks
Phil McRevis
[edited by: xcandyman at 1:47 pm (utc) on Aug. 23, 2002]
Although I'm not an expert on Google, I have found evidence that they only read the first 100 KB of any file. If you put lots of JavaScript in your page, their robot may never reach the later text.
File truncation is quite normal amongst robots. Google's appears to do it, ours does also, and in past experiments I've observed and confirmed this behaviour in others too.
There are probably other factors affecting how Google handles JavaScript, such as their file parser; however, I'll let other people here share their knowledge on the subject.
As for file truncation, files of 100+ KB should be avoided anyway. Fine for people who, like me, are currently sitting on an N meg link to the internet; possibly not so great for modem users. In reality, file truncation is not a major issue.
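The truncation point above can be sketched with a small script. This is only an illustration (the 100 KB limit and the page contents are assumptions from this thread, not Google's documented behaviour): it simulates a robot that indexes only the first 100 KB of a page, and shows how a large inline script before the body text can push the real content past the cutoff.

```python
# Hypothetical sketch: a robot that truncates pages at 100 KB, per the
# figure discussed in this thread (not an official crawler limit).

CRAWL_LIMIT = 100 * 1024  # assumed 100 KB truncation point

def indexed_portion(html: str) -> str:
    """Return only the part of the page a truncating robot would see."""
    return html.encode("utf-8")[:CRAWL_LIMIT].decode("utf-8", errors="ignore")

# Build a page with roughly 120 KB of inline JavaScript ahead of the content.
inline_js = "<script>" + ("var x=0;" * 15000) + "</script>"
page = "<html><head>" + inline_js + "</head><body>important keywords</body></html>"

visible = indexed_portion(page)
print("important keywords" in visible)  # → False: the body text never reaches the robot
```

The same page with that script moved into an external .js file would put the body text well inside the first 100 KB, which is the practical argument to make to designers.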
Best regards,
Derek J. Preston
Head of Technology
Mirago plc
You have helped me explain it better to the people I work for. I usually go kicking and screaming while the designers put HUGE amounts of scripting on top of the content.
I had a good understanding with our last designer, and I worked closely with him at the design stage, but there's no chance of that at this new place: with over 10 designers, I don't know who's doing what, LOL. And if I do find out who's doing the design, they are so fixed on what they want that I find it impossible to get them to think about what the search engines see when they look at the source of the page.
When will designers ever learn? LOL
Phil McRevis
[edited by: startup at 2:48 pm (utc) on Aug. 23, 2002]
I do what I'm good at, and that's SEO, not trying to explain every last detail of SEO to an old-fashioned Director. Hooo hmmm.
Phil McRevis