An excellent post, that my own research backs up.
I don't want this to be just a "me too" post, however, so I'd like to offer some practical suggestions, or at least things to try out, for those affected by Florida/Austin.
Disclaimer: These suggestions are essentially for trial purposes and I do not claim to hold the definitive answer.
Becoming an authority
LocalRank is essentially all about the interconnectivity of the pre-Florida SERPs for a given keyword. It is believed to be based upon the top 1000 pages of the old SERPs, although I suspect it is likely to be fewer. What this means is that the top results of the pre-Florida SERPs (old algorithm) get put through a new filter, which then forms the basis for the current SERPs. The LocalRank is calculated, in layman's terms, from the number of incoming links a website has from websites that were previously ranked well in the old SERPs. In other words: are websites on your theme/keyword that were previously well ranked linking to you?
There is then a calculation based upon the old algorithm SERPs (OldScore) + Localrank (LRscore) to determine the Florida/Austin SERPs (not clear whether they are multiplied or added).
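To make the idea concrete, here is a toy sketch of the OldScore + LocalRank combination described above. To be clear, every name and number in it (old_score, the alpha/beta weights, the +1 smoothing) is my own invention for illustration; the thread itself is unsure whether the two scores are multiplied or added, and this sketch multiplies them.

```python
# Toy sketch of the LocalRank idea: count links to a page from other pages
# that ranked well under the old algorithm, then combine with the old score.
# All names and weights here are invented for illustration.

def local_rank(page, old_top_results, links_from):
    """Number of old top-ranked pages (other than `page`) that link to `page`.

    `links_from` maps a page to the set of pages it links out to.
    """
    return sum(
        1
        for other in old_top_results
        if other != page and page in links_from.get(other, set())
    )

def new_score(old_score, lr, alpha=1.0, beta=1.0):
    """Combine the old-algorithm score with LocalRank.

    Multiplicative combination chosen arbitrarily; +1 so a zero
    LocalRank does not wipe out the old score entirely.
    """
    return (old_score ** alpha) * ((lr + 1) ** beta)
```

So, for example, a page with an old score of 10 and two inlinks from the old top results would come out at 10 x 3 = 30 under this guessed-at scheme, ahead of an identical page with no inlinks from the old SERPs.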
Evidence that theming/localrank/Hilltop is factored in to the current algorithm is pretty strong imo. There are the odd exceptions but they may well apply to ‘weak’ themes (weak meaning not commonly searched for terms).
Portals, large websites, sub domains of authority sites, High pr sites (7+), resource/info sites and professional spam sites are rising to the top.
On ‘weak’ themes there is a correlation between the number of internal pages indexed and the fact that the internal pages that are ranking well, have the exact phrase as the file name.
As mentioned, I'm seeing a great deal of websites at the top of the current SERPs that have lots of incoming links not only from similarly related sites but even from their competitors. For the most part the top sites are indeed authorities on their subject. Those that aren't authorities on their theme, yet still rank well, seem to have a large marketing budget or engage in:
1. Purchasing text links on similarly related sites.
2. Blog spam.
3. Running multiple domains (on varying C-block IP numbers).
Of the varying theories out there what I personally don’t buy is:
a. There is an over optimization filter. Many sites at the top of florida/austin keywords have high kw densities, h1 tags etc.
b. The Google Adwords trigger theory. That Google have specifically targeted keywords based on Adword price for example. I believe Google has enough search data to know what is a "strong" theme (many searches) and what is a "weak" theme (terms with few searches).
So how do you get to be one of the authority sites?
Well of course, the usual way is to create great content that webmasters will feel is worthy of linking to. Not any old link will suffice, but links from sites that are themselves regarded as authority sites. For "weak" themes I still see good results from high link popularity, even where the links are not thematic. Hence for weak themes you will likely see link-farm spam rising to the top.
Here are some suggestions for becoming an authority site:
1. Get a link from DMOZ. This supports the theming theory believed to be in place, as the category page your link appears on is highly likely to have many more occurrences of your keyword/theme and to be very thematic (unless it's a regional category). I am seeing a lot of Google Directory category links visible at the very top of Austin results.
2. There may well now be a stronger case for forking out the $299 for a Yahoo link, although I haven't tested this out yet, so the Yahoo link may not be weighted.
3. More effort should be put into finding thematic portals and quality directories where site submissions are accepted for review (in Google, try 'keyword directory' / 'keyword +submit site', etc.).
4. As links from websites of the same theme and links from authority sites appear to be important, you might consider trading links with a competitor. You might think Webby's lost his marbles with that suggestion :-) but trading a link with a competitor that is ranking well in the current SERPs may bring a large boost for your own site. Not easy to persuade them, but feasible with a well-formulated email.
5. Large sites have, by default, lots of content. More content is likely to mean more chance of getting one or more of your sub-pages linked to. It is also logical that Google sees a large site as evidence of more effort by the webmaster, and deems it more of an authority than a smaller site. This suggests increasing the size of your website to be comparable with the number of indexed pages of the top sites in the current SERPs ('site:www.domain.com' in Google). Think glossary/lexicon, FAQ, forum, detailed product descriptions, etc. as potential areas for more pages.
6. Create something unique, such as an on/offline tool or perhaps a research article; basically something of worth that none of your competitors have on their sites.
7. If applicable to your theme, create a forum. This has two advantages: if it is an SE-friendly forum you will get many more pages indexed (I have thirty-odd thousand pages indexed, mostly coming from my German-language SEM forum), and posts in a forum regularly get linked to from other forums or, if it's a particularly good post, perhaps directly from an authority site.
8. If you are knowledgeable in your field, write articles and place them on your website (don't syndicate immediately). If the article is of worth, authority sites may well link to it. A little public relations (the other PR) can help here to get an article published on a major authority site; be sure to get that backward link, though.
9. Google have a new patent for distinguishing duplicate and near-duplicate pages. The important bit is 'near' duplicate pages. If you have several domains and the content is not completely duplicated but very similar, consider a new layout and rewording paragraphs to make the sites more distinct.
10. Google can identify crosslinking on multiple domains within the same IP C-block. A lot of the crosslinking merchants have lost their rankings because they hosted their multiple domains on the same C-block and none of the domains were themselves authorities. Another reason why professional spammers are still in the SERPs is that their multiple domains are on completely different IP blocks and they have a budget to buy their links (no names, but I'm sure you know examples of this yourselves).
Having multiple domains in itself is not however spam. Don't kick the proverbial out of cross linking and review your hosting arrangements. Are they on the same ip c-block?
For mom & pop small online businesses, getting a high ranking just got much harder. Most cannot afford to purchase text links from similarly related websites, and they cannot compete with the professional spammers who have dozens of separate domains on separate IP blocks, as such hosting is unaffordable. They can only rank well for minor (weak) terms, as some posts here have already highlighted.
On-page optimization is still a factor, but much less so than it used to be imo. What smaller websites without link-purchase budgets can do, imo, is increase the size of their website, as mentioned in the point on large sites above. Perhaps also split 'scrolly' pages into 2-3 smaller pages.
On a side note, I've researched the number of backward links and the number of indexed pages for many Florida top-10 SERPs. There seems to be some correlation between the top results and:
a. Many backward links + many indexed pages.
b. Fewer backward links but many indexed pages.
c. Many backward links but few indexed pages.
I believe it is the combination of these two factors that determines an authority page, especially links from sites of the same theme. It is as if there is some kind of threshold where sites are filtered in or out. I don't know what the threshold is, but I imagine it is based on the strength of the theme and the competition.
So imo, new sites with low budgets, or those hit by Austin/Florida, need to increase the number of pages on their sites and get something unique on their site which makes it worth linking to. Really what Google, and I believe Brett, have been saying all along.
However, more and more 3-4-word search terms are no longer returning highly relevant 'smaller' websites, because the new algo places far too much emphasis on authority.
The results often show authority sites coming top for 3-4-keyword phrases with the keyword phrase, or even just half the phrase, occurring just once or twice. This hardly makes the page relevant, and it is not good. It means that if you ranked well before and have now lost your ranking, it doesn't matter how relevant your pages are or how good your on-page optimization is: if you aren't an authority, or at least becoming one, you have no chance of being found unless it really is a very niche keyword phrase. You might want to do some synonym/thesaurus checks to find relevant terms that do not include a 'strong' keyword. That way you probably have more chance of being found.
I can't tell you for sure whether LocalRank or Hilltop is in use. It is most likely a combination of the two, plus some more filtering we don't know about, which might explain some anomalies. Then again, I might also be completely wrong.
Anyway, I just thought I'd at least provide some practical ideas to try out. For those devastated by Austin/Florida, some of my suggestions might be worth trying. It certainly can't hurt your site.
I completely agree with what you are theorising and, in all honesty, is this not the way it should be, and always has been, since Google first launched?
Search Engine Optimisation by its nature is designed to manipulate the search engine into returning a higher placement for the optimised site. As Google develops, this is basically an ongoing battle: Google refines its algorithms, and as a result (in some cases) SEO'd sites are given a lower place in the SERPs because (in Google's mind) they don't deserve to be there.
It is a great theory and well done for wading through the hundreds of Google Update posts to find the ones which are actually worth reading and allow others to go on and do something about it.
See message 179 of this thread: [webmasterworld.com...]
As being an authority has been mentioned often in WW lately, it might be worth mentioning that there are also hubs.
If Google sees you as hub (directory...) it might be a better strategy to add additional links to other (authority) sites.
If you want to be seen as authority you better add new pages with more content about your subject.
IMHO, as a hub you get away with fewer incoming links, while as an authority you have to have as many links (from internal and external pages) from theme-related pages as possible.
Sites with loads of PR, all bought, thousands of pages thrown up, all pulling higher positions than the standard "authorities" (loads of links in and out, etc., about widgets) in a field.
Not in one area, but loads of areas; each one is effectively a directory.
Build one now, or be left behind :).
Your professional spammer gets himself an affiliate partnership with, say, Espotting, creates a 10,000-page 'directory' full of affiliate links, then buys him/herself a world of authority-site links and, bingo, comes up top for a whole bunch of terms, because Google sees the site's link pop and treats it as an authority :-/
There are several such portal sites in Germany with bought link pop from authority sites that are dominating the SERPs for literally thousands of keyword combinations. This is more of an encouragement to link spamming than anything else; I certainly see it working. And not only link spamming, but thousands of doorway pages as well. I understand what Google are trying to achieve, yet I feel they underestimate the manipulation that can be, and evidently is, possible.
I await the next phase, as so far they seem to have made it easier to spam, and so in modern search the algo is weaker (imho).
I cannot believe they are stopping here.
BTW, I realised why you spelt Pempernal wrong :).
Funny I read through your post, and the rest seemed so well put together...
Tickles me the way the detector works. Tried to post about Scnuthorpe once (a small place on the east coast of Britain).
I come to much the same conclusions as you do, though less through research, and more through trying to figure out what the end might be that would justify these means.
Also concluded that while the object of both Hilltop and The Patent are the same, they differ enough in "ways and means" to work best either in tandem or amalgamated.
IMO we're in the middle of a "Great Leap Forward" in search relevance, right now it's the chaos and confusion stage, and it may last for some time.
Answer: it cannot.
I see sites that are listed in the serps, because I have exchanged links with them, and that is the only reason they show up in the serps for that query. These sites only have the keywords in one place, in the text link back to my site. (I cover many keywords, not all have been hit, so I am better off than a lot of sites)
If I have the content, I must be the authority, not the site I link to, that is totally unrelated to the queries.
G is broken. If G does not recognize this, then they will soon be just another SE, and they will not be the dominant one.
Get back to work: write pages of good content, get links. The same basic rules apply: when Joe Public cannot find what they are looking for (unless they are looking for directories), they will search elsewhere. Always have, always will.
Just my 2 cents worth,
I posted a very abbreviated version of the same theory after Florida, so I agree mostly with what you say. The only thing I don't agree with is that the algo is applied across the entire spectrum of terms. If this were the case, many non-commercial site owners would join the ranks of those complaining about Google's new algo. The fact is the filter (or theming algo) is only applied in specific cases like "real estate" and "hotels". Even in these industries it was only applied when the area was big enough to target. For example, San Francisco real estate was filtered whereas St. John's real estate was not. (St. John's is a city in Newfoundland, Canada.) To believe there wasn't some correlation between where the filter was applied and the value of the term is naive. The fact is Google is selectively filtering, and their AdWords revenue seems to be the only thing benefiting.
NOTE: St. John's and secondary real estate terms were added to the list of filtered terms after Austin.
That directories are appearing all over shows me this is a work in progress, as neither HT nor LR envisages its "expert" pages being the ones ultimately returned for the search query.
On a practical note it might be a good idea to check out your google directory / Yahoo category and make some friends ;-)
Consider similarly related categories as well.
This week I've already had 2 requests for a link exchange from what could be classed as competitors, so it seems others are seeing this LocalRank / local PageRank in the algo as well.
(this is a serious question :)
This says that google have not completed their calculations yet, and that over time the results will improve. I agree, and I really do hope that this is the case. However, if this is so, how can we possibly predict where they are going? How many more steps?
Also, this has been going on since mid November now, 2 1/2 months. Seems a dangerous game they are playing, rolling out the results in this way.
I may sound here anti-Google, but actually I support what they are trying to do and still regard them as being the best search engine out there (personal opinion). Sure authority sites should be top, as long as they are TRULY an authority and it isn't solely based on bought links, networks and thousands of fluff / doorway pages. I really want Google to crack this as it is for myself still the best search engine out there. I fear if they dont get the balance right, and soon, MSN and Co. are going to be yapping at their heels sooner than many think.
IMO there are three steps, re-configuring the general index into loose topical areas using CIRCA broadmatching, the identification through particular parameters of "expert" pages, and mapping the inter-connectivity between the links returned by those "experts".
Where we are now, I have no idea.....;-)
(Added wild guess: this mapping may use Benoit Mandelbrot's work with fractals; see today's logo.)
This mapping may use Benoit Mandelbrot's work with fractals, see today's logo
Interesting point. They're subtle people. A bit of complex algebra at the top of the logo too. Lots of PhDs at the GooglePlex apparently, and the percentage of employees with beards is way above the national average ;) The potential problem is that, with IQs in the top 1%, they might not be basing their judgement of SERP quality on typical searches (has anyone noticed that sites of a scientific/mathematical nature have barely moved since Florida?)
BTW: the logo is actually a reference to the mathematician Gaston Julia - not Mandelbrot
I may sound here anti-Google, but actually I support what they are trying to do...
Assuming their motives are good then so do I. I've been a Google fan for years and my recent posts are definitely "more in sorrow than in anger".
If this is just a "breaking in" of a new algo, if it all turns out right in the end, great. However I still think they have made a serious mistake by live-testing an algo that is producing rubbish for weeks.
If live testing is inevitable then they should have told people. A simple statement of the expected disruption and the long-term benefits would have done so much to placate people not only here but in the real world.
Google are normally so good at PR; they've really slipped up this time.
We may be mixing up Authority and Expert. To my reading, "authority" sites are those returned after the inter-connectivity of the "expert" sites has been calculated, and they should be highly relevant.
BTW, I see a small but significant difference between the way HT and LR pick their experts, HT through the number of on-topic links, LR through on-page factors.
The returned set of "experts" would be quite different, and amalgamated, the results would have far better balance.
Time will tell; however, I have some more tests going on.
You may well be right, but IMO LS leaves the relevant on page factors deliberately vague, and these may well be extended to include Title etc.
Detailed description - (Fig 1)
"For example, documents may have their rank value based on the proximity of the search terms in the document (documents with the search terms close together are given higher rank values) or on the number of occurrences of the search term (e.g., a document that repeatedly uses a search term is given a higher rank value)."
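As a rough illustration of the two on-page signals in that quoted passage (occurrence count and term proximity), here is a small sketch. The way the two signals are combined below is my own invention for illustration; the patent text quoted above only says that each can contribute to the rank value.

```python
# Sketch of the two ranking signals quoted from the patent: how often the
# search terms occur, and how close together they appear on the page.
# The combination of the two signals is invented for illustration.

def min_window(tokens, terms):
    """Length of the smallest token window containing all `terms`, or None."""
    best = None
    for i, tok in enumerate(tokens):
        if tok not in terms:
            continue
        seen = set()
        for j in range(i, len(tokens)):
            if tokens[j] in terms:
                seen.add(tokens[j])
                if seen == terms:
                    span = j - i + 1
                    if best is None or span < best:
                        best = span
                    break
    return best

def page_score(text, query):
    tokens = text.lower().split()
    terms = set(query.lower().split())
    occurrences = sum(tokens.count(t) for t in terms)
    window = min_window(tokens, terms)
    if window is None:  # not all terms present on the page
        return float(occurrences)
    return occurrences + len(terms) / window  # closer together -> higher score
```

With this, two pages with the same number of occurrences still rank differently depending on how tightly the phrase appears, which matches the proximity behaviour the quote describes.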
In a couple of threads I've used "authority", when I meant "expert", HT uses the term "target" instead of LS's "authority", which is definitely clearer ;-)
Very interesting. I've noticed for a while that the algo seems less sophisticated, in that the number of keyword occurrences seems to be more important than density. My ABAKUS Topword tool measures just that (the number of occurrences for single-, two- and three-word phrases on a page) and I think I'll start testing some more with it.
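In the spirit of that measurement (not the actual Topword code, which I obviously don't have), counting 1-, 2- and 3-word phrase occurrences on a page can be done like this:

```python
# Count occurrences of every 1-, 2- and 3-word phrase on a page, in the
# spirit of the Topword measurement mentioned above (not its actual code).
from collections import Counter

def phrase_counts(text, max_words=3):
    tokens = text.lower().split()
    counts = Counter()
    for n in range(1, max_words + 1):
        for i in range(len(tokens) - n + 1):
            counts[" ".join(tokens[i:i + n])] += 1
    return counts
```

For example, `phrase_counts("seo forum seo forum")` counts "seo forum" twice but "forum seo" only once, so repeated exact phrases stand out immediately.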
Depends if we're talking Hilltop or LocalScore ;-)
Expert pages are the initial returns for a search query, HT would return pages with on-topic links, LS pages with on-page factors.
The pages these link to would be those seen as "authority" (LS) or "target" (HT)
Bear in mind I'm arguing this completely hypothetically, with no attempt at using present serps to support/disprove my case, it's much easier that way ;-)
What matters is what to do with the new algo.
1. Get more links.
2. Write more content.
3. Get links from authority sites.
4. Write articles for your site.
5. Link to a relevant authority website.
6. Get more links from sites in the old top 1000. Try Google.es to find these.