Forum Moderators: open
And, the big G has noticed. I've got fresh tags at least 5 times since the last update. For whatever reason Googlebot likes to take a nip at least at my index.htm page about every 2 days, and sometimes even more often. This site ain't nothing big on the scale of things (a less than whopping PR5 on the home page), and hardly seems worth the attention. Makes me feel the bot somehow doesn't trust me, and is keeping a close eye. To paraphrase George Orwell, "Big Googlebot is watching you." And my first name is Winston.
A day and a half ago I went really hog wild, and after adding another small section to the site, added a link to the home page. I'm wondering if alarm bells will go off at the Googleplex when the bot notices that, if it hasn't already. ;)
Is there any reason at all I should be worried, and make sure to wait a couple weeks after any changes before making another? Or is it just that Googlebot spots any minor little change, and just adds the fresh tag and adds the newest page to the index, and casually moves on to the next site? In theory this shouldn't mean anything. Ain't no law against tweaking the HTML on a page a lot. However, does Googlebot agree?
Although I do understand the big brother thing. I had a brand new site get deep crawled last week, and now Googlebot is hitting the home page daily. Only the home page has been added to the db.
The daily visits make you feel like Google hasn't decided if the deep crawled content is worthy, so they are keeping tabs on you while they make up their mind....
[edited by: WebGuerrilla at 4:54 pm (utc) on Sep. 9, 2002]
Only thing I can think of is that if you make significant changes you might trip some sort of filter, meaning you are dropped in G results until they look at the site more closely.
I'm just wondering that because of things like domains expiring, known URLs changing to po*n sites in a day, etc. ;)
I have seen this with a couple of sites. Google considers it over optimisation.
What does Google consider acceptable optimization?
That would seem to make sites with dynamic content at risk, and I haven't seen any evidence of that. In fact, it seems to me that the busier a site is the more the bot comes around and having the bot onsite is a "good thing."
Let's face it, the proportion of people changing websites who know no more than the very basic optimization which comes naturally (use descriptive titles, structure your document) must be a thousand times those who optimize deliberately.
Likewise, fresh tags and indexing are primarily there to index fresh content, not to search for bad guys.
Me thinks there is mucho paranoia around here - or guilt complexes maybe? :)
But this ISN'T being done at all with SEO in mind. I'm not changing keyword density or such on a daily basis. All this is just tweaking for aesthetic considerations and the like.
Oh, great. I just checked the logs and Googlebot has actually found the new content. I'm doomed. ;) However, I've noticed that Googlebot seems nosy no matter what. Even when I'm not changing things, it likes to take a nip at index.htm every 2 or 3 days. Worse yet, I just noticed I introduced a syntactical error on the home page for that link to the new content, and am tempted to correct it. Is adding a couple of parentheses enough to get one damned to Google hell? ;)
I do think there is a minimum change necessary to keep the fresh tag thing happening, changing a date or such doesn't cut it, nor would I tweak things just to attract attention. Many webloggers add a sentence or two per day, and it doesn't seem to annoy the bots.
Once I whacked my site by just adding about 50 links to it.
It dropped very much in rank; this was just last month.
After I removed the 50 extra links this month, the site got back up in the rankings.
Guess it had something to do with dividing the PR among the extra pages (links).
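That guess lines up with the classic published PageRank intuition: a page's rank is split evenly among its outbound links, so adding 50 links shrinks the share each existing link passes on. Here's a minimal sketch of that idea; the damping factor and the toy numbers are illustrative assumptions, not Google's actual internals.

```python
# Sketch of the classic PageRank link-share intuition (simplified model;
# Google's real formula and iteration details are more involved).

DAMPING = 0.85  # standard damping factor from the original PageRank paper

def link_share(page_rank, outbound_links):
    """PR passed to each linked page: damped PR divided evenly by link count."""
    return DAMPING * page_rank / outbound_links

before = link_share(5.0, 10)  # hypothetical home page with 10 outbound links
after = link_share(5.0, 60)   # same page after adding 50 more links

print(before)  # each link passes more...
print(after)   # ...than after the links are diluted
```

With 60 links instead of 10, each individual link passes only a sixth as much, which would explain the drop and the recovery after the links were removed.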
Tweaking does not affect Google much; in fact you could do this:
<hello google, how are u this crawl>
i love you very much mr google
</hello google, how are u this crawl>
and Google will ignore the unknown tags, but will index the text between the tags and the rest of the HTML, of course.
None.
One of the bigger anti-SEO FUDs [google.com] you will ever see from an SE:
Be very careful about allowing an individual consultant or company to 'optimize' your web site. Chances are they will engage in some of our "Don'ts" and end up hurting your site.
I know, I saw that before. DEFINITELY an attempt at FUD. The weird part about that is SEO is so fundamental to web page design there are actually meta tags like description and keywords which exist *solely* for use by SEs. Damned if you do, and damned with lousy SE visibility if you don't.
SEO is nothing but good traditional document design plus a few 'tricks' that work with automated robots but not humans.
Unlike many others here, I don't think I'm a spammer just because I use good titles, descriptions and headings. Nor am I a spammer because I'm continually changing pages for spellos, better structure and better readability.
But I digress, sorry.
No, Google will not dislike your site because you tweak it every hour even. I really think they have much more effective and efficient ways of finding spam than focusing on changes from day to day.
Now THAT is the sort of thing that can really get one paranoid. Basically, any change to the text of a page will almost always affect the keyword density. On those sites you saw get dropped to PR0 for changing keyword density, were they doing anything really outrageous? I'd have thought getting a PR0 generally required something truly bad, like creating new pages on the site that obviously were targeting specific keywords, and basically were doorway pages.
Also, why would anyone change keyword density on a site *daily* with SEO in mind? No search engine is updating daily. If optimizing for Google were the goal, then someone doing that would wait until after the dance was over, check the results, and then alter their keyword density shortly afterwards. And then wait a month, until the next update, before doing it again.