The update now appears to be over. So let's discuss the algo.
What I think Google changed in the algo this update...
- Backlinks count for less. I see pages with only a handful of less-than-outstanding-quality links doing well.
- Keyword density counts for more. I see poorly linked pages stuffed with a set of keywords doing well.
- Semantics / LSI - no evidence. If anything, I notice a narrower focus works better. 'clothes for women' != 'clothes for her' != 'clothes for woman' in Google right now.
- Hilltop = no evidence. Seeing pages from small, obscure, never seen before sites ranking well.
- On-page factors count for more. It's the only way I can explain some pages' rankings.
- No evidence of penalising affiliate sites.
There's a start... my tuppence worth.
I look forward to reading your analysis / thoughts.
ps: mods, please help keep this thread clean by deleting rants / moving to other Allegra thread.
Posted this on another thread too, but didn't know this one was open:
Inbound links still very important while internal not so much (or even penalized with keyword repetition).
Subdomains/internal pages not featuring as they were pre update.
The SERPs have been settled for a day for my keyphrase.
This time it is a bit more balanced, but still ugly. The SERPs show about a 50% decline in relevance.
This was on a very focused, specific query. A site I am associated with ranks #9, and is one of only 4 good results in the top 20. I have another site at #6 on this term that really ought to be outranked by at least a few more relevant ones. (But who's to complain:))
LSI has a long way to go, IMHO.
Sounds like wishful thinking, but let's hope so.... When I use the &filter=0 parameter, I'm back at #1 for my main keywords.
allinurl:sitename.com shows sites redirecting to me; my URL no longer shows up. I'll try to flush these out, then see if that makes an improvement... Could be weeks before I know for sure.
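For anyone who wants to repeat those two checks, here's a minimal sketch of the query URLs involved. 'blue widgets' and sitename.com are placeholders; filter=0 and allinurl: are the actual Google parameters being discussed:

```python
import urllib.parse

def google_query_url(query, unfiltered=False):
    # Build a Google web-search URL; appending filter=0 turns off the
    # duplicate/"omitted results" filtering, which is the check above.
    params = {"q": query}
    if unfiltered:
        params["filter"] = "0"
    return "http://www.google.com/search?" + urllib.parse.urlencode(params)

# Compare the normal and unfiltered SERPs for the same phrase:
print(google_query_url("blue widgets"))
print(google_query_url("blue widgets", unfiltered=True))

# The allinurl: check for sites redirecting to you:
print(google_query_url("allinurl:sitename.com"))
```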
1. Both have roughly the same keyword density for "Acme" that my site does--roughly 4% to 6% (a quick way to measure this is sketched after the list).
2. Both have the same PR as my site, which is PR 3.
3. Neither has as many incoming links as my site. Mine has double the number that one of the sites does, and five times the number that the site I created for the other company does. (Makes me want to go, "Hmmm....")
4. My page title is "Acme Hammers. Acme Mallets. Acme Blue Mallets. Find Acme Mallets at MySite.com" My competitor's site's title is "Competitor.com--Acme Mallets including 2002 Special Edition Mallets." The site I created for another company has the page titled "Acme Hammer and Mallets." (Another difference that piques my interest).
6. The competitor's page uses many, many more table tags than mine does, while the page I created for the other company uses none.
7. The competitor's page doesn't use any H tags anywhere, the page I created for other company uses <h1>, and my page uses <h1> and <h2>.
8. The competitor's site, the site I created for the other company, and all of the other sites on the first page have been around for over two years. My site has only been around since last June.
Given that my site has now disappeared with the new algo, I hope the above is food for thought.
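On point 1, keyword density is easy to measure yourself. A rough sketch follows; the tag-stripping and tokenizing here are naive assumptions, and real density tools will differ:

```python
import re

def keyword_density(html_text, keyword):
    # Rough density: occurrences of the keyword divided by total word
    # count, as a percentage. Tags are stripped with a naive regex.
    text = re.sub(r"<[^>]+>", " ", html_text).lower()
    words = re.findall(r"[a-z0-9']+", text)
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return 100.0 * hits / len(words)

page = "<html><body><h1>Acme Mallets</h1><p>Buy Acme mallets and Acme hammers.</p></body></html>"
print(f"{keyword_density(page, 'acme'):.1f}%")  # 3 of 8 words -> 37.5%
```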
I just checked my other site, non-commercial: it is back to where it was before the 302-redirect bug. Close to 10,000 visitors today. My main commercial one has tanked.
"i just made 2 checks at mcdar within 2 hours (last was right know)the data moves from one to the other so i don't think its over yet. "
From what I can see, those sites that have disappeared (MIA'd, whatever...) in Allegra are not even showing for their unique domain names. Only sites that link to them with their domain name as anchor text show up for that domain name, generally in PR order.
Errr well... Let's all try to find significant new websites that have disappeared (I've seen a massive pseudo-gov.uk one here in the UK), write their domain names in plain text on a PR 7 page, and show up for their domain name! Simple idea. Easy way to steal traffic and show up Google's latest update mayhem...
I don't see that it would be that bad, IMO. They could even be tracking the SERPs to see which sites on which DC are being clicked on more often, etc.
There are two sets of DCs, and they have been stable like this for the past 3 days now. Why haven't they changed? Maybe Google doesn't want them to change?
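A crude way to check whether two DCs really are serving different results is to fetch the same query from each address and diff the URLs. The addresses below are placeholders, the scraping regex is a guess at the markup, and hammering Google like this is against their terms, so treat it strictly as a sketch:

```python
import re
import urllib.parse
import urllib.request

def top_results(host, query, n=10):
    # Fetch a SERP from one datacenter and pull out result URLs with a
    # naive regex; the href pattern is an assumption and may need tuning.
    url = f"http://{host}/search?q={urllib.parse.quote(query)}"
    req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
    html = urllib.request.urlopen(req).read().decode("utf-8", "replace")
    return re.findall(r'<a href="(http[^"]+)"', html)[:n]

# Placeholder datacenter addresses; substitute the DCs you are watching.
set_a = top_results("64.233.161.104", "blue widgets")
set_b = top_results("216.239.39.99", "blue widgets")
print("only on DC A:", set(set_a) - set(set_b))
print("only on DC B:", set(set_b) - set(set_a))
```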
Anyway, my take on the update: they have made duplicate content a major issue to stop spammers from scraping your site and taking a snippet, but in the process they haven't really pulled it off properly.
Also, they haven't implemented 301s correctly either. E.g.: if I do a 301 to your website, Google will replace the cached version of my site with your website and, well, give YOU the duplicate-content penalty instead of removing my 301'd website.
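To see exactly what a crawler sees when someone points a 301 at your site, fetch just the first hop without following redirects. A minimal sketch; the URL is made up for illustration:

```python
import http.client
from urllib.parse import urlparse

def first_hop(url):
    # Request a URL without following redirects, so we see the status
    # code and Location header exactly as a crawler would on hop one.
    parts = urlparse(url)
    conn = http.client.HTTPConnection(parts.netloc)
    conn.request("HEAD", parts.path or "/")
    resp = conn.getresponse()
    print(url, "->", resp.status, resp.getheader("Location"))
    conn.close()

# A 301 status here means the page is declaring a permanent move.
first_hop("http://example.com/old-page")
```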
The day before, there were three sets: the old set, the changed set, and a mixed set. I can see only the mixed and new sets now. The interesting thing is that in the mixed set, real new and fresh content is #1 in a category I keep checking.
My site is pure content, and as someone said here before, it seems to be doing much better in the new SERPs (which many are complaining about) and a wee little bit worse in the mixed, fresher SERPs.
So is it going to settle down, or are we going to have 2 separate sets of SERPs? It's already made life difficult when explaining some stuff to clients.
The SERP reports show both.
Other evidence suggests a stronger weighting to age of website than before.
Has anyone else noticed an increased prevalence of subpages?
I have, suggy, in a number of sites I monitor.
I have also noticed that a few sites appear to now be out of the sandbox with their deep-linked pages, while others are now in it.
The SERPs for my main site have returned to pre-Florida; I wonder if it has to do with how long the site has been up. It's like Google checks to make sure your site is valid over a period of time.
I also think there are serious flaws with Allegra. Sites are being dumped in a random fashion: I am seeing sites that are just standard HTML, no link programs, no ads, etc., vanilla squeaky clean, just disappearing, while others much the same remain unchanged.
We need to wait and see!
gives me the impression this is the update, done. I don't see why he would be asking for feedback in the middle of the process. Wouldn't he just say "whoa there nelly, chill, we ain't finished yet?" if it were still ongoing?
Don't hold your breath!
I added a few dozen pages in January, designed as a sort of "dictionary" on widgets. Title and page name comprise a combination of 'widget' plus one or two other words, connected by '-', plus '.php'. The body contains a random selection of ten pictures (= shop deep-links) on widgets, plus a few outbound links (!) to similar pages, even competitors, together with a few lines of relevant (!) text from those pages.
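To make that construction concrete, here is a sketch of the naming scheme as I read it; 'widget' and the modifier words are stand-ins for whatever the real phrases were:

```python
# Hypothetical reconstruction of the "dictionary" page scheme described
# above: keyword plus one or two extra words, joined by '-', plus '.php'.
extras = [("blue",), ("red",), ("heavy", "duty")]

def page_for(keyword, extra_words):
    slug = "-".join((keyword,) + extra_words) + ".php"
    title = " ".join((keyword,) + extra_words).title()
    return slug, title

for extra in extras:
    slug, title = page_for("widget", extra)
    print(slug, "|", title)
# widget-blue.php | Widget Blue
# widget-red.php | Widget Red
# widget-heavy-duty.php | Widget Heavy Duty
```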
If searching for one of the exact phrases these pages were designed for, my results are always among the top ten. To me this supports:
1) the relevance of title (and URL) plus keyword in body text
2) the relevance of outbound links
though both as neither a necessary nor a sufficient condition, just as support.
But amazingly, my whole site has also benefited on its broader terms. The main page went from #23 to #2 or #3 (still jumping). I assume it is now accepted as an authority because of its relevant outbound links.
Let me also add two more observations:
1) Google has considerably improved its search on pictures (note the new link on the start page), and I'm sure we might gain significant insights into the new algos from that.
2) Take a look at maps.google.com. There Google offers searches like 'townA to townB', 'pizza stores near abbey road', and the like. Note the use of those particles of localization, which are obviously added to the logical operators of ordinary site search. To me this indicates a considerable step forward in linguistic analysis of websites, not only LSI but other yet-undiscovered features. I guess a thorough analysis of this recent update will take a long, long time.
Also: the rel="nofollow" tag issue, but this doesn't affect me.
We definitely need a means to automate our analysis of new algos: a database to register the ups and downs of pages before and after such updates.
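As a starting point for that, a sketch using SQLite; the table layout and field names are just one possible design, not an existing tool:

```python
import sqlite3
from datetime import date

conn = sqlite3.connect("serp_history.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS rankings (
        day      TEXT,     -- date the position was observed
        query    TEXT,     -- search phrase checked
        url      TEXT,     -- page being tracked
        position INTEGER   -- rank in the SERPs that day
    )""")

def record(query, url, position):
    # One row per observation; rerun daily and diff across an update.
    conn.execute("INSERT INTO rankings VALUES (?, ?, ?, ?)",
                 (date.today().isoformat(), query, url, position))
    conn.commit()

record("acme mallets", "http://mysite.example/mallets.html", 9)
for row in conn.execute("SELECT day, position FROM rankings "
                        "WHERE url = ? ORDER BY day",
                        ("http://mysite.example/mallets.html",)):
    print(row)
```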