| 9:49 am on Jun 1, 2010 (gmt 0)|
Hissingsid, interesting observation as my next move is to place links to pages which have suffered onto the home page (footer of every page).
Luckily the site I have a problem with is quite small, so this should not present an issue.
Will present a real problem for larger sites though!
| 9:56 am on Jun 1, 2010 (gmt 0)|
Not sure if footer is best place to put them.
| 10:13 am on Jun 1, 2010 (gmt 0)|
I always place links below changing content, especially if they are to be repeated across multiple pages.
| 10:28 am on Jun 1, 2010 (gmt 0)|
Yes but Google may filter out what it thinks are "part of the template". The footer's a bit obvious as "Part of the template".
| 10:31 am on Jun 1, 2010 (gmt 0)|
MayDay Update, well, the only thing I am pleased about is that I inspired the naming of the update as MayDay, with credit to pontifex's suggestion after my post and Tedster agreeing, see [webmasterworld.com ]. Secondly, it also shows WebmasterWorld's lead as the definitive forum when it comes to tracking SE activity.
From all the observations I have made since I last posted: ecommerce sites, sites with close to an even ratio of affiliate links to unique content, and sites aggregating a fair bit of their content are hit harder. In short, sites where a considerable portion of the content originated somewhere else first are hit harder, regardless of how much unique content is supporting the site.
Caffeine is not yet live; Gbot is still re-indexing larger sites from scratch, as I noted weeks ago. HENCE many sites are still showing under-ranking due to that fact: not enough indexed pages, and the pages linking to them are also being re-indexed = a lower rank factor until everything is re-indexed. What I called the Total-Recall is basically a recount, re-index, re-rank... which suggests the filters are only turned on again once all is done. So there are a lot of changes yet to happen!
This algo update in the middle of moving to the new infrastructure did not help in any way. We speculated and speculated and finally were sure it was an algo update, no thanks to anyone from G*, but thanks to all the regulars here we spotted it weeks before MC or VF confirmed it. If we can predict as accurately as we did, then we can definitely also predict how to adapt to the new algo (once everything is served from the caffeine IS and all sites are re-indexed and moved properly)!
| 11:08 am on Jun 1, 2010 (gmt 0)|
Hissingsid you are probably right, however I do not know of a way round this if the links not being present on every page is the issue in the first place :-)
Unless I move them randomly, which I could do but won't bother trying.
What I am noticing is that a number of sites I have created, on which I've done NO SEO work whatsoever, are outranking my main websites.
Which is really strange and brings me to the conclusion that there's not really very much I can do about this anyway.
I'm a little bit more than worried right now if I am honest!
| 11:34 am on Jun 1, 2010 (gmt 0)|
I'm not saying randomise just that the footer is a particularly easy target to filter out.
| 12:19 pm on Jun 1, 2010 (gmt 0)|
Let's summarize everything that has occurred at Google in the last few months:
- New google interface with google jazz
- Long term search traffic algorithm change confirmed by MC
and something that I agree with that Dusky just said, and that MC also "indirectly" confirmed in his video by saying that caffeine is "proceeding the pace..."
"Caffeine is not yet live, Gbot is still re-indexing larger sites from scratch as I noted weeks ago, HENCE many sites are still showing under-ranking due the latter fact, not enough indexed pages and pages linked to them are also being re-indexed = lower rank factor until all is re-indexed. When I pointed out what I called the Total-Recall, is basically a recount, re-index, re-rank...which shows the filters are only turned on again once all is done. So, there are a lot of changes yet to happen! "
The question is: when is the re-indexing going to be over, and when is Caffeine going to be live? Are we years, months, weeks, or days away from it?
| 1:03 pm on Jun 1, 2010 (gmt 0)|
|Caffeine is not yet live, Gbot is still re-indexing larger sites from scratch |
If the re-indexing is for Caffeine and Caffeine is not yet live, then those pages should still be in the index, no?
Lots of pages dropped on 17 May, so this could be re-indexing. But why would you drop a page in order to re-index it? Why not keep it until it has been re-indexed?
| 1:18 pm on Jun 1, 2010 (gmt 0)|
|my next move is to place links to pages which have suffered onto the home page (footer of every page). |
There seems to be some evidence that placing links within naturally written sentences is more effective for passing on juice (as opposed to footers or sidebars), so if you can do that in a way that works for your visitors, you may see better results.
| 1:22 pm on Jun 1, 2010 (gmt 0)|
|The question is: when is the re-indexing going to be over, and when is Caffeine going to be live? Are we years, months, weeks, or days away from it? |
I feel we may be days away, or weeks away at the most. When the Caffeine buzz started, engineers at G* probably faced a different challenge: they had to update their algo first, then roll out Caffeine slowly in chunks. What's happening now is data collection in chunks, hence the Gbot mega re-spidering activity many sites were, and still are, seeing right now.
IMO, the algo update is not the only reason why many large sites were sent to oblivion; rather, it is the re-indexing process which needs to be done to move to the new infrastructure. One clue: the majority of sites that were hit are large sites. Smaller sites were also hit, but their re-indexing does not take long and their backlink profile is usually thinner, so their re-ranking happens quicker and they are updated quicker...
As I also pointed out recently, two-page sites outranking large authority sites dating back to the last century in many niches is not G* messing up or getting it wrong; it's because the filters had to be undone until a total recall (a total re-index) has been made.
[edited by: tedster at 1:24 pm (utc) on Jun 1, 2010]
| 1:54 pm on Jun 1, 2010 (gmt 0)|
|There seems to be some evidence that placing links within naturally written sentences is more effective for passing on juice (as opposed to footers or sidebars), so if you can do that in a way that works for your visitors, you may see better results. |
This is how the links to these pages currently work.
However the links come from the next level of links rather than directly from the home page.
The results seem random at present so I think I'll hold fire for a bit longer before I do anything else.
| 2:11 pm on Jun 1, 2010 (gmt 0)|
|However the links come from the next level of links rather than directly from the home page. |
What I'm suggesting is rather than adding a series of links at the footer, if you can add a paragraph or 2 of natural language text at the bottom (keep in mind I have no idea what your site looks like), that would get them on the home page in a format that G seems to like.
But as you said, it's also not the worst idea to hold off on any changes right now, until we see how this all shakes out, and if Dusky's right, that may be sooner rather than later. If you do add the links and it seems to help, please report back, as that would confirm yet one more variable...
| 2:42 pm on Jun 1, 2010 (gmt 0)|
Dusky, thanks for the reply, but I still have a question, because there is one thing I don't understand:
You say below that filters had to be undone, which may be why one day I saw my website ranked first for one specific keyword, in front of my competitors (it lasted a few hours, and I happened to check at that time).
My worry: was I first because the filter was undone, or is that the potential ranking I am supposed to get once the re-indexing is done?
"recently, two pager sites outranking 20th century old large authority sites in many niches is not G* messing up or getting it wrong, it's because the filters had to be undone until a total-recall has been made (total re-index). "
| 2:53 pm on Jun 1, 2010 (gmt 0)|
|But as you said, it's also not the worst idea to hold off on any changes right now, until we see how this all shakes out, |
Someone earlier said they were worried. That made me think: just wait a while, and there'll soon be something new to worry about. For webmasters, Google is a bit like what they say about the weather in Scotland: if you don't like the weather, just wait a while and there'll be something different along shortly.
Having said that, there are things that can be done that should not do harm even if things do change. Personally, I'm definitely doing an audit of my internal linking structure, and I'm seriously considering some form of additional navigation links on every page to focus additional anchor text onto key pages. I'm certain this is the right move in my niche, and I can't see how it will do harm if there is a change.
| 3:30 pm on Jun 1, 2010 (gmt 0)|
|Someone earlier said they were worried. That made me think just wait a while there'll soon be something new to worry about. For webmasters Google is a bit like what they say about the weather in Scotland. |
I think that was me... but what's even stranger is I grew up in Scotland.
If and when I get round to changing links I will post my findings. I've noticed a few more 'improvements' I can make to content which is currently taking priority.
| 4:12 pm on Jun 1, 2010 (gmt 0)|
member22, make sure you are not logged into your G* account when searching. Also turn personalization off and delete your search history, and even your cookies (deleting cookies will log you out of other sites you were auto-logged into, such as here, so do this with care). When logged in, of course you'll see your site first...
However, if you were logged out and still saw a glimpse of your site ranking well, then not so well minutes later, that can be because the search results were served from either a fresher DC or an older DC.
What we have into play here are three things:
- The G* dance as we used to call it, SERPS rapidly changing and served from different DCs every few minutes
- New Algo change presenting searchers with different than normal SERPs
- Old and New DCs (New infrastructure DCs) both serving results with the new DCs being updated concurrently
Sites are being re-ranked as we go. Some are going further down due to:
a) Large sites with thousands of pages being re-assessed after the re-harvest; probably only 10-50%+ of their pages have been processed along with their backlink profile, which means 10-50%+ of their overall power is missing and not yet available to add to the overall TrustRank and authority
b) The new algo: sites are scored, assessed, and measured differently. Some of the power helping large sites came from backlinks; some of those backlinks are now slightly devalued or discounted, while others may be elevated, but it takes time for the total recall to restore a better (or worse) rank
Some are going higher up due to:
a) Other sites have to take the place of sites being re-indexed and re-ranked; in most cases these are smaller sites that can be re-indexed faster while the infrastructure update gathers pace
b) The new algo update seems to prefer original, unique onsite content even more than ever. Large ecommerce sites, for example, tend to have product titles similar to other ecommerce sites. Take best-selling books: only the author's own text is original. Thousands of other ecommerce stores are unique sites, but the book they are selling is available on thousands of other sites, so it rests on added original content to support the site.
Now you might say: I sell widgets, I provide a store and a shopping cart with good design and good customer service, and that's all that's needed. The new algo, it seems to me, is saying: differentiation, differentiation, differentiation, with plenty of innovative ways of keeping users onsite, keeping them coming back, and most importantly, keeping them bookmarking us and recommending us. This ties in nicely with what we thought about this new algo update being powered, either partly or wholly, by AI (Artificial Intelligence): titles and descriptions are now only a small reliable pointer, and G* decides which sites should come out on top mostly from user experience, and this is pure AI.
Long-tail SERPs fluctuating up and down, site: command numbers shrinking, re-spidering, a shrinking number of SERPs for well-known sites (WebmasterWorld, for example), funny SERPs when searching for an exact match: even for very long phrases (even with quotes), you get a different result at the top although that exact phrase only exists on your site... all this is temporary, IMO.
Although the new algo should deliver a better search experience and better results for users in the long run, much is left to be desired when half of the index is discarded, as it is at the moment, and this has hurt a lot of breadwinners as well as large corporations.
| 5:04 pm on Jun 1, 2010 (gmt 0)|
I posted my CTR data at the end of April from Wbmstr Tools, and I still think the introduction/weighting of CTR as a criterion has A LOT to do with Mayday, but I wanted to share a couple of recent observations:
1) Last week I purchased banner space on a high-traffic forum in a complementary industry, looking for some referral traffic. The banner ad's alt text was set to reflect the image ad, 'blue widget quotes', which is also a competitive keyword phrase. (The image actually reads 'new blue widget quotes'; the alt text is just 'blue widget quotes'.) The site was ranking #15 for the keyword phrase (#5-#20 consistently for years). Today, the site is not found for that term (cached today).
Coincidence, or filter/penalty? Could this be a 'penalty' for advertising on that site? A penalty for using slightly different alt text? Are images being 'read' and compared to alt tag values as a means to police the use of alt text and over optimization? Thoughts?
2) A site hit hard by Mayday had a lot of external backlinks from sites owned by the same company (same registry info, same host/nameservers) but with unique IPs. We experienced a severe drop for the keyword phrase that was used as the anchor text across many of these sites. It could be that link value/authority evaluation has changed (whether due to an incomplete re-indexing of the backlink profile, or stricter/smarter analysis of external linking sites).
Did anyone else that got hit by Mayday build external links from sites they owned or that were in the same registry/hosting account (despite unique ips)?
3) A site hit hard by Mayday has a large percentage of external backlinks pointing to the homepage, with not much distribution to other internal pages. Is this a commonality in the backlink profiles of other Mayday victims?
Just wanted to share, please let me know your thoughts or any similar observations.
| 6:11 pm on Jun 1, 2010 (gmt 0)|
|Did anyone else that got hit by Mayday build external links from sites they owned or that were in the same registry/hosting account (despite unique ips)? |
I have lots of sites that do this and some were hit, some weren't so I had discounted this as a factor for myself.
| 6:26 pm on Jun 1, 2010 (gmt 0)|
I think the value of on-topic external links has been massively down-weighted versus within-site links. So sites with good in-site and on-page optimisation rose in the SERPs, and those with lesser in-site and on-page optimisation fell back, regardless of backlinks.
If what others have said about a complete re-spidering and rebuilding of the index is correct, then backlink value may return as the web map is rebuilt.
| 6:33 pm on Jun 1, 2010 (gmt 0)|
The upside of all these changes will "hopefully" be better-quality results. The downside is the financial pothole we all have to ride through. Sales are bad, and not even AdWords seems to be helping. I'm thinking of disabling some ads to see how much organic traffic is left.
Has anyone noticed any odd traffic patterns? I am seeing an occasional window between 5 am and noon where traffic and sales suddenly pop back to "normal". Not very typical. I'm also getting a higher than normal number of sales from foreign locations. It seems like US traffic has been cut off.
| 7:22 pm on Jun 1, 2010 (gmt 0)|
backdraft7 - I'm here in the US, and I'm seeing quite a bit of odd traffic from foreign locations in the afternoons as well. Although we do ship internationally, we've had a higher than normal rate of sales that are outside of the country.
| 8:18 pm on Jun 1, 2010 (gmt 0)|
A telling pattern, perhaps? It seems the data centers are churning. I'm afraid to see what gets spit out!
| 8:39 pm on Jun 1, 2010 (gmt 0)|
I'm watching another massive increase in WMT in the number of sitemap pages listed, going up by around 20k per hour for the past 4 hours. Still no change in the actual results, though.
| 8:55 pm on Jun 1, 2010 (gmt 0)|
The easiest way to add links to important places on blogs or sites using a database/CMS is to add them via SQL: pick a keyword, search for a single instance of it in a particular table, and replace it with a link, for each different post/article.
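The idea above can be sketched as follows. This is a minimal, hypothetical example, not anyone's actual setup: the table and column names (`posts`, `body`) and the keyword/link values are made up, and I use Python with an in-memory SQLite database rather than a raw SQL `REPLACE()`, because SQL's `REPLACE()` swaps every occurrence while the suggestion is to link only a single instance per post. Back up the table before trying anything like this on live data.

```python
import sqlite3

# Hypothetical keyword and link target -- adapt to your own site.
KEYWORD = "blue widgets"
LINK = '<a href="/blue-widgets/">blue widgets</a>'

# Demo database standing in for a real CMS; table/column names are made up.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE posts (id INTEGER PRIMARY KEY, body TEXT)")
conn.executemany(
    "INSERT INTO posts (body) VALUES (?)",
    [("We sell blue widgets and more blue widgets.",),
     ("No keyword here.",)],
)

# Replace only the FIRST occurrence of the keyword in each post.
# Note: re-running this would nest a link inside the anchor text,
# so in practice you would also skip rows that already contain LINK.
for post_id, body in conn.execute("SELECT id, body FROM posts").fetchall():
    if KEYWORD in body and LINK not in body:
        conn.execute(
            "UPDATE posts SET body = ? WHERE id = ?",
            (body.replace(KEYWORD, LINK, 1), post_id),
        )
conn.commit()

for (body,) in conn.execute("SELECT body FROM posts ORDER BY id"):
    print(body)
```

Only the first instance in the first post becomes a link; the second post is untouched.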
If Google has snipped the pagerank passing value of any given link you'll know it soon enough.
I got snipped once for the site's primary keyword, many eons ago, and it's what the site was about, so I chalked it up to Google thinking the site wasn't worthy of ranking well for that term. I discovered it when the site also got sitelink status in the SERPs and I noticed that that one word was removed from the extra links and the title. I asked for re-inclusion; the word was restored (the site immediately ranked well again), but the sitelink was removed.
The site's been fine ever since.
| 9:02 pm on Jun 1, 2010 (gmt 0)|
Matt Cutts mentioned in Las Vegas, after the search engine smackdown, that this was going to happen; he said it was going to deal with websites with "spammy footers".
I assume there will be a threshold for the number of footer links that Google finds allowable.
| 9:19 pm on Jun 1, 2010 (gmt 0)|
Interesting supafresh, the planned timing being after the recent changes I mean.
Acceptable (expected?) links in the footer would be:
~ terms of service
~ site ownership link...
and that last one might be a biggie: if a site is owned by an established, reputable company, that last link might boost a page's authority factor. Of course, if that's true it could be abused; people could start adding "owned by Google" with a link to Google as the site's owner. (example)
Many sites just got owned by google so it wouldn't be entirely untrue, hehe.
| 9:40 pm on Jun 1, 2010 (gmt 0)|
Very interesting about the footer thing; however, we don't have any links in our footer at all. What we do have is a div that shows the left-side menu, but code-wise that div comes after the main body of the page. I wonder if they could be misinterpreting that?
| 9:52 pm on Jun 1, 2010 (gmt 0)|
Nice post Dusky at #4145002
Some of the theories are solid and make sense. I am not certain that links, or the re-evaluation of links, are necessarily part of the equation here.
The theory you present which refers to a rebuilding of the index, where larger websites with many deep pages are affected, is sound, and it matches what I believe is happening from the data and facts that I see for the websites I have watched. This was accompanied by a marked reduction in the reportable pages in the index for many large websites, a number I think will turn around for some of them.
Supporting links from those 10-15% of currently 'lost' pages could very well cause the type of ranking reduction some large websites have seen, in addition to the new way Google is scoring long-tail terms.
| This 329 message thread spans 11 pages |