It seems to me that the SERPs have been uncommonly quiet for the past few days. I guess with the US holiday and so on, that's to be expected. I'm not sure I could handle any heavy drama until after the New Year - which is when I think we'll see more significant churn again.
The lack of SERP activity (especially after such a hectic period) - do you think G has stopped twisting knobs, or have webmasters stopped tweaking sites? Or both?
3 days not in top 1000 for main site subject keywords.
Then 12 hours top 10 for main site subject keywords.
Followed by 12 hours not in top 1000 for main site subject keywords.
Then 12 hours top 10 for main site subject keywords.
Followed by 3 1/2 days not in the top 1000 for main site subject keywords.
14 sites though? Hmmm. Are they on similar topics? Share a backlink profile overlap? Same Whois? Interlinked?
A 12-hour period (in the sense of a repeating time block)? Are they business-focused, home-focused, or otherwise specialised so that time-of-day is important?
Secondly, I would seek strong independent backlinks for the sites so they stand on their own.
Then I would stop fretting for 3 months and see where I am after that time.
I'm targeting the Canadian market. The site is hosted in Canada, but I noticed it uses U.S. IP addresses. Should I consider moving the site, or wait for the December SERP changes to finish?
I'm targeting the Canadian market. The site is hosted in Canada, but I noticed it uses U.S. IP addresses. Should I consider moving the site, or wait for the December SERP changes to finish?
I've never done that, but what I've read suggests the longer a site is geotargeted to the wrong region, the harder it is to move (or to get G to realise it's wrong). There's no such thing as SERP changes finishing, and your site is young and only just breaking into the top of the SERPs.
In summary, I would move now.
Looks to me like they're playing with the strength of the bond between the index page and inner pages again.
Last night I was getting two different SERPs again for one query. One was the usual three sites, all with indented listings: #1 & 2, #3 & 4, #5 & 6. The other SERP had the same sites but with "broken" connections, so #1 was still at 1, but its second page was at #7.
So we had #1 & 7, #2 & 6, #3 & 5, with #4 stationary on both SERPs (actually the 4th site, but in 7th position on the first SERP because of the dropouts).
Does this make sense? Anyone else seeing it yet?
[edited by: tedster at 7:40 pm (utc) on Dec. 3, 2008]
disconnect between index page and inner pages
I'd love to hear more about what kind of disconnects you see. One thing I've seen in the past few days was not so much about ranking changes, but the second result from the same domain was not always indented. Now today, they seem to be indented twice as far! But those are just display changes, and I don't watch them very closely.
On the SERPs that I'm watching, I still see pretty strong connections between the Home Page and indented results showing up.
I've been pinned to #1 for my niche for a long time. Recently I yo-yo'ed, but then settled down to #2 on my main term while still at #1 on the secondary stuff. Now I'm seeing myself slipping to #2 on secondary terms as well.
I really don't test things, I'm more of a 'throw enough backlinks at it to fix the problem' type but I'm not so sure that this is even my problem...since I'm seeing sites outrank me that I'd like to moan about as having crappy backlinks :). I can't imagine I've tripped a filter since I don't tend to do much of anything specifically, and I think a penalty wouldn't leave me at #2 :). So I'm a bit inclined to think that there's some factor in my backlinks that has been devalued substantially in the last month or so.
(The only other alternative: one of my larger backlinks nofollowed my link recently. I called and they fixed it, but I'm wondering if effectively removing a link for a month or two and then putting it back could hurt me that much.)
The site meets sitelinks criteria =>
# Show the strongest-linked pages
- Retrieve the best candidates for sitelinks (pages most linked to with repetitive anchor text, ignoring certain files, like legalese)
- Remove certain frequently-occurring patterns (brand or repetitive keywords) from the best link text for each URL
- Display the sitelinks
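The steps above could be sketched roughly like this. To be clear, this is only my own illustration of the described process, not Google's actual algorithm: the function name, the legalese list, and the "most repeated anchor text" scoring heuristic are all assumptions.

```python
# Hypothetical sketch of the sitelink-selection steps described above.
# All names, filters, and scoring choices are illustrative assumptions.
from collections import Counter

LEGALESE = {"/privacy", "/terms", "/legal"}  # "certain files" to ignore (assumed)

def pick_sitelinks(pages, brand_words, max_links=4):
    """pages: dict mapping URL -> list of inbound anchor texts."""
    candidates = []
    for url, anchors in pages.items():
        if any(url.endswith(p) for p in LEGALESE):
            continue  # step 1: skip legalese-style pages
        counts = Counter(a.lower() for a in anchors)
        best_anchor, freq = counts.most_common(1)[0]
        # step 1 (cont.): score by how often the most repetitive anchor was used
        candidates.append((freq, url, best_anchor))
    candidates.sort(reverse=True)  # strongest-linked pages first
    sitelinks = []
    for freq, url, anchor in candidates[:max_links]:
        # step 2: strip brand / repetitive keywords from the display text
        label = " ".join(w for w in anchor.split() if w not in brand_words)
        sitelinks.append((url, label or anchor))
    return sitelinks  # step 3: display these
```

For example, a page linked twice as "Acme widgets" would beat a page linked once, the /privacy page would be filtered out entirely, and the brand word "acme" would be stripped from the displayed label.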
IMO, sitelinks are useful information, because they reflect qualified data from Google. But, I believe sitelinks are generated from ranking-data, and so are a symptom rather than a cause.
One of our sites today totally disappeared from G for all keywords. It's still found if I search for domain.com, it has PR, and it has pages indexed.
Should I panic, or is this part of the update that's going on? We have been going for years.
Did you check to see if you are in the top 1000? You could have been hit by a -950.
...our search algorithm saw a large area on the blog that was due to an IFRAME included from another site and that looked spammy to our automatic classifier. I believe that this bug has been fixed now. We also added additional safety checks to the relevant system that would escalate to an engineer if this site had the same issue in the future.
Google Groups discussion [groups.google.com]
Thanks to SE Roundtable [seroundtable.com] for spotting this little peek behind the curtain.
I thought I'd highlight some parts of the Matt Cutts comment that jumped out at me:
our search algorithm saw a large area
looked spammy to our automatic classifier
escalate to an engineer if this site had the same issue in the future
I've had the top spot for a specific vehicle model since I wrote an in-depth review of it over two years ago. The article is worthy of the top spot on its own, content-wise, but in general the official site will rank 1st for (one of) its own products.
I knew one of the engineers working on the model long before it was officially released and (with permission) I covered it eagerly. The article is a full 90 days older than ANY other mention of this model on the internet.
Does it pay to be first (with dictionary-like attention to detail, of course)?
edit: Tedster - they've been using an automatic trigger for human inspection on a PER-SITE basis for a very long time. They rely on the filters to tell them where to look, and they then use that data to improve the filters. Chicken-and-egg stuff.
[edited by: JS_Harris at 2:55 pm (utc) on Dec. 8, 2008]