Forum Moderators: Robert Charlton & goodroi
reseller, currently a few data centers have some different data that should be everywhere in a few days. I'll keep people posted on the status of things, and collect feedback closer to the end after things settle down more. I'd expect things to be back to their normal level of everflux by New Orleans. We do have incremental indexing after all, so it's normal to expect a certain amount of change to the index every day or so (aka everflux).

In fact, everflux is a pretty good analogy. If you go back to summer 2003, update Fritz was the beginning of the transition from a monthly update to an incremental index. It caused a lot of comments, because plenty of people were happy with an index that only changed once a month. A lot of the thickness in my hide started with Fritz during summer two years ago.

Summer in the northern hemisphere is often a good time for search engines to work on revamping different parts of our system and improving our quality; typically search engine traffic is lower in the summer due to seasonality. So the summer is a good time to think about things like bringing in new signals of quality and new ways to rank pages, plus doing things like reorganizing our webmaster pages, etc.
It's true that the summer (northern hemisphere) is when traffic is lower and sometimes it's easier to roll new things into crawling/indexing/scoring. I wouldn't be at all surprised if we worked on revamping our webmaster pages, for example. The SEO and quality guidelines pages have aged pretty well, but other parts of the webmaster section need to be reorganized; there are a few places where the same info (e.g. about robots.txt) is repeated or scattered across different pages. It's not trivial to reorganize that much info, especially since it's translated into so many languages. But better to go ahead and start, and then if we want to tune those pages later, that's okay. I've been aching for a long time to mention somewhere official that sites shouldn't use "&id=" as a parameter if they want maximal Googlebot crawlage, for example. So many sites use "&id=" with session IDs that Googlebot usually avoids urls with that parameter, but we've only mentioned that here and in a few other places. Getting started on things like that will be nice. I appreciate the people who sat down and tried to tease out the info on our current webmaster pages and organize it more logically.
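To make the "&id=" point concrete, here's a minimal sketch in Python (the `has_risky_id_param` helper is hypothetical, just for illustration) that flags URLs whose query string uses an `id` parameter, which, per the post above, many crawlers associate with session IDs and may therefore skip:

```python
from urllib.parse import urlparse, parse_qs

def has_risky_id_param(url: str) -> bool:
    """Return True if the URL carries an 'id' query parameter,
    the pattern crawlers often treat as a session ID."""
    query = parse_qs(urlparse(url).query)
    return "id" in query

print(has_risky_id_param("http://example.com/page?id=abc123"))   # → True
print(has_risky_id_param("http://example.com/page?article=42"))  # → False
```

A parameter name like `article=` or a clean path like `/page/42` sidesteps the issue entirely.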
I have to admit that I'm so conditioned to our regular snippets that when we show one of the other types of descriptions, it's a little jarring for me. But I've been keeping an open mind and trying to figure out when I like it vs. when it just keeps me from getting the info I want faster. :)
Just as a guide for people who don't eat and breathe WebmasterWorld, we typically show new backlink sample sets every 3-5 weeks or so. We have a bank of machines that computes PageRank continuously (and continually, I suppose; I wasn't an English major), but we only export new visible PageRanks every 3-4 months or so.
I've never heard the suggestion that Google would penalize for iframes before reading it in the thread. Plenty of legit sites use iframes, so it wouldn't make sense to penalize for it. Powdork gave the right response in message 337 of the first Bourbon thread. Now I can easily believe that some search engine spiders would have trouble with iframes just like some spiders have trouble with frames. But I wouldn't expect iframes to cause any penalties.
So it's kind of like the urban myth that went around for a while: "if you use javascript to change the text in the status bar, a search engine may penalize you." Sometimes these things just get started, I dunno where. Most of the time there are level heads around to say "that just doesn't sound right; it doesn't hold together with common sense." So I'll debunk the iframe myth now for good measure.
So the high-order bits to bear in mind are:
- make it as easy as possible for search engines and spiders; save calculation by giving absolute instead of relative links.
- be consistent. Make a decision on www vs. non-www and follow the same convention consistently for all the links on your site. Use permanent redirects to keep spiders fetching the correct page.
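As a concrete sketch of the second rule, here's a small Python function (the host names are placeholders, and a real site would usually do this in its web server config rather than application code) that computes a permanent 301 redirect whenever a request arrives on the non-canonical host:

```python
CANONICAL_HOST = "www.example.com"  # placeholder: pick one form (www or non-www) and use it everywhere

def redirect_for(host: str, path: str):
    """Return (status, location) for a permanent redirect when the
    request's Host header isn't the canonical one, or None when the
    request should be served normally. A 301 tells spiders the move
    is permanent, so they keep fetching the canonical URL."""
    if host != CANONICAL_HOST:
        return 301, f"http://{CANONICAL_HOST}{path}"
    return None

print(redirect_for("example.com", "/page.html"))
# → (301, 'http://www.example.com/page.html')
print(redirect_for("www.example.com", "/page.html"))
# → None
```

The important bit is the status code: a temporary 302 wouldn't tell spiders to consolidate on one version, while a 301 does.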
Those rules of thumb will serve you well no matter what with every search engine, not just with Google. Of course, the vast majority of the time a search engine will handle a situation correctly, but anything that you can do to reduce the chance of a problem is a good idea. If you don't see any problems with your existing site, I wouldn't bother going back and changing or rewriting links. But it's something good to bear in mind when making new sites, for example.
Like I mentioned before, summer is a good time to work on pulling in new pieces of infrastructure and ways to rank/score pages. I'd look for us to keep finding new ways to get info out to webmasters (having my own roomy thread to post in helps! :), and hopefully provide more ways for webmasters to give info back to us as well. :)