Backlinks have indeed been devalued, though selectively.
.gov backlinks are still strong, as are the Yahoo directory and other large authority-type sites.
The backlinks that are still showing up all seem to have one thing in common: they also include the URL in the text. This suggests that the text URL is now very strong, since it's outweighing the negative effect of the actual backlink on the same page.
It's no secret that Google has been going after paid linking, and I think this is part of their strategy to tackle it. Where it will lead is anybody's guess.
I think you are correct about the industries affected by search volume.
Taking it further: although the text URL is the obvious exploit of the moment, it's also possible the algorithm is correlating the numbers between what IT considers a legitimate link (from its list of trusted sources) and links from other sites, and at a certain point assumes paid linking or heavy SEO and imposes a backlink penalty.
I don't think it's about passing "juice" to good links. I think it's about penalizing bad ones, and about the ratio between good and bad links.
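A minimal sketch of the ratio idea speculated on above, purely illustrative: the threshold, the "good"/"bad" labels, and the domain names are all assumptions, not anything Google has published.

```python
# Hypothetical good/bad link ratio test. Labels and threshold are made up
# for illustration; this is not Google's actual algorithm.

def backlink_penalty(links, bad_ratio_threshold=0.5):
    """Return True if the share of 'bad' links suggests paid linking."""
    if not links:
        return False
    bad = sum(1 for source, quality in links if quality == "bad")
    return bad / len(links) > bad_ratio_threshold

profile = [
    ("example.gov", "good"),
    ("dir.yahoo.example", "good"),
    ("textlink-network.example", "bad"),
    ("casino-links.example", "bad"),
    ("more-paid-links.example", "bad"),
]
print(backlink_penalty(profile))  # 3 of 5 links are "bad" -> True
```

Under this kind of scheme a site could be penalized not for any single bad link, but for the overall shape of its link profile.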
[edited by: tedster at 7:38 pm (utc) on Oct. 1, 2007]
It would just make sense that if a link leading to a page disappears, and no other links leading to it have been crawled (yet), the page stops returning in the SERPs. This would affect blogs most, since they link to articles from their front page a lot and have "similar article" links that are always changing.
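The effect described above can be sketched with a toy link graph. The URLs are made up; the point is just that once the rotating front-page link vanishes, the post has zero known inbound links.

```python
# Toy crawled link graph: each key links out to the set of pages it points to.
# Illustrative only; URLs are invented.

link_graph = {
    "blog.example/": {"blog.example/post-42", "blog.example/similar-article"},
    "blog.example/post-42": set(),
}

def known_inbound(graph, url):
    """Count crawled links pointing at url."""
    return sum(url in targets for targets in graph.values())

print(known_inbound(link_graph, "blog.example/post-42"))  # 1

# The front page rotates its articles; the link to post-42 vanishes.
link_graph["blog.example/"].discard("blog.example/post-42")
print(known_inbound(link_graph, "blog.example/post-42"))  # 0 -> candidate to drop
```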
[edited by: kamikaze_Optimizer at 8:00 am (utc) on Oct. 6, 2007]
It is a blog with a comments function (allowing anyone to comment), so there is a chance for updates by the public.
I like the update box; maybe I'll use it to point to similar writings and on-topic news.
That'll do just fine without having to touch the copy and change the character of the presentation.
There are websites now ranking for mid-competitive keywords based on factors that definitely originate from a very dark place.
In a notable industry I have watched closely for a long time, backlinks from pharmacy-type sites (homepage links!) rank you among the authorities.
This is really asking for trouble, with spammers and text-link buyers all over the place.
The plural-keyword results in our industry are the worst ever, making sense maybe one time out of 5 or 10 - except, eventually, for the so-called authorities.
The geolocation amazes me. Regular national/international US sites replaced by foreign sites on Google.com, yet still ranking on some other Google TLDs...
I don't understand their geolocation procedure.
A website that sells "whatever country" widgets, with domain registration in "whatever country", is logically less likely to interest US users.
At the same time, text-link buyers live in paradise; they rank exactly where they have chosen to. I really feel like I've wasted my time reporting all types of scams and text-link-only rankings.
Blogs are still king of the hill. I don't get this one either. From where I stand, 99% of blogs are about as good as GeoCities personal pages used to be. Blogrolls and other link schemes, such as creating multiple domains or simply having a blog on a separate domain, bump all these sites into the top 20/30.
Any cr.p looks like good Google food as long as it is fresh, whatever the relevancy, whatever even the language...
Half of the web seems to have gone supplemental. No improvements whatsoever in text-link hunting, and certainly not in catching irrelevant backlink schemes.
Yet not all searches have been affected over the past 3-4 weeks. Quality has degraded only on the not-so-competitive searches.
To tell the exact truth: I find Google's SERPs look like Live.com's. Not sure if Google knows what that means, but to me Google has complicated their algorithm so much that it filters out everything but cheap promotion on usually unnoticeable websites.
Simultaneously, other pages (C) are spammed by the .cn sites to link to other people's pages (B), into which the link has been introduced with the same keyword. With this they obtain similar content, and (B) gains more authority. They camouflage the links with .edu sites and other sites with authority.
B --> A (our site)
C --> A,B
D --> A,B,C
E --> A,B,C,D
F --> A,B,C,D,E
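The pyramid above can be expressed as a small directed graph: each new spam domain links to our page (A) and to every earlier domain in the scheme, so A collects the most links and each earlier domain also accumulates "authority" from the later ones. A quick sketch:

```python
# The link pyramid from the post, as a directed graph.
# Keys link out to the set of pages they point to.

links = {
    "B": {"A"},
    "C": {"A", "B"},
    "D": {"A", "B", "C"},
    "E": {"A", "B", "C", "D"},
    "F": {"A", "B", "C", "D", "E"},
}

def in_degree(graph, node):
    """Count pages in the scheme linking to node."""
    return sum(node in targets for targets in graph.values())

for page in "ABCDEF":
    print(page, in_degree(links, page))
# A gets 5 links, B gets 4, C gets 3, D gets 2, E gets 1, F gets 0.
```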
[edited by: Errioxa at 3:01 pm (utc) on Oct. 6, 2007]
I am seeing dynamic pages with parameters in the URL get beaten by "static" pages.
For example, static pages like:
are beating out dynamic pages like:
Personally, my own dynamic pages have almost disappeared for the keyword "CountryX Widgets". The Google toolbar shows gray PR. Only my index page, which is static and links to all my dynamic pages, appears in the SERPs in any meaningful way for "CountryX Widgets", and it ranks very poorly.
Almost all of the top 30 CountryX Widgets SERPS are static pages.
The SERPs seem to be messed up only in searches where Google suggests search refinements.
Anyone else seeing this? Are these old phenomena?
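One common way to draw the static/dynamic line described above is by the query string: a URL with parameters after "?" counts as dynamic. The example URLs below are invented (the originals were removed from the post).

```python
# Classify URLs as "static" vs "dynamic" by the presence of query parameters.
# Example URLs are hypothetical stand-ins for the removed originals.

from urllib.parse import urlparse

def is_dynamic(url):
    """True if the URL carries a query string (e.g. ?cat=widgets)."""
    return bool(urlparse(url).query)

print(is_dynamic("http://example.com/countryx-widgets.html"))        # False (static)
print(is_dynamic("http://example.com/page.php?cat=widgets&ctry=x"))  # True (dynamic)
```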
I don't know if they've messed up their geolocation on purpose or not, but there are some rather ridiculous search results.
Example of site starting to rank on Google.com, according to Yahoo site explorer:
>site 1: 10 incoming links, including 5 from Poland, 2 from India, and 3 from totally unrelated sites in quite suspicious industries (casinos and the like...)
>site 2: 15 incoming links, including 4 scrapers from previous MSN rankings and 11 links from lowest-end general directories stuffed with AdSense and text-link networks.
I've noted a total of 7 sites similarly "relevant" for some searches. I am talking here about competitive (as far as paid clicks go, yet low search volume) 2-5 word queries.
I've seen that before, but only on MSN.
So if my understanding is correct, my choices are now: click on the authority site or a Google partner (eBay, About.com, ...), take a chance buying from the other side of the galaxy, or click on paid links?
I am very surprised that Google ends up with such a "thing".
It's doubtful that they did not notice. This has been going on for what, 3-4 weeks, and started well before that at a lower degree.
I am also disappointed at webmasters here and elsewhere who probably see their own short-term interest and do not raise any alarms to stop this debacle. Look around a little, people: search for medium-traffic keywords and see what you are served alongside your usual authority sites.
Then compare to MSN for the fun of it.
We also have sites that benefit from whatever Google is doing right now; I never thought they would rank for anything competitive, since they were never promoted, yet they do. But I think nobody will benefit from more foreign sites and cheaply SEO'd results alongside one-voice-only authorities (or so-called ones). What is the point, really? Does Google have technical issues, I mean real huge ones? Or is this all a giant test in which we are all guinea pigs for a month? Seriously!
Saw a nice news report on the BBC today explaining how Goog (and the others) had better get their "universal search"/Web 2.0 results together and more accurately display what searchers are really looking for.
And guess what, that's what they're trying to do.
Unfortunately, Goog has a nasty habit of being in perpetual "beta" and rolling out their testing to everyone.
Long story short - Goog knows Web 2.0 will eventually take Goog market share, so it's trying its darnedest to implement Universal Search ASAP to keep up. As always, those changes have interesting side effects.
Long story short - Goog knows Web 2.0 will eventually take Goog market share, so it's trying its darnedest to implement Universal Search ASAP to keep up.
Universal Search has nothing to do with "Web 2.0" and everything to do with user education and common sense. If you were expanding your site from, say, widgets to widgets plus whatsits and thingamabobs, wouldn't you want your widget users to know that your site now covers whatsits and thingamabobs as well? And wouldn't you make it as easy as possible for those existing widget users to find your content on whatsits and thingamabobs?
Universal Search has nothing to do with "Web 2.0" and everything to do with user education and common sense
To paraphrase the report ... Google is failing to understand what searchers are looking for.
(see 1-word search threads)
Web 2.0 properties, being user-generated, more intuitively target what a searcher is looking for because... ummm... err... uhh... oh yeah!, because they're generated by people -
not by a stale, outdated, education-based algo from 10 years ago.
Last time I checked, Universal Search incorporated local search, user-generated videos, and user-picked news topics, and gee, Goog sure seems to like user-generated Wiki content for its results.
The searchers are educating Goog, not the other-way around.
[edited by: tedster at 5:50 pm (utc) on Oct. 8, 2007]
Vertical search is important because it's one of the two major things I've long talked about as being how search will advance. First generation search analyzed words on a page to rank content. Second generation search tapped into link analysis. Third generation search to me is looking at both user input (what we visit; what we click on; personalized results) and making search go more vertical.
More educational reading for those who like to be informed.
I'm seeing some movement in sites that I'm very familiar with that I can only attribute to more weight being assigned to onpage factors. It may be temporary... I just noticed it this morning. Anyone else seeing this?
I have seen the same - More weight to on-page factors.
It seems that up to a few weeks ago, on-page factors were roughly...
Least Density for a KW with Highest Prominence
Now it seems that it has changed to:
Highest Density with Highest Prominence
(this is a pretty hard target to hit if your site is not spammy)
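The two on-page measures discussed above can be sketched roughly. These are common SEO-community definitions, not Google's; the formulas are assumptions for illustration.

```python
# Rough sketch of keyword density and prominence. Definitions are the
# informal SEO-community ones, assumed here for illustration only.

def density(words, keyword):
    """Share of the page's words that are the keyword."""
    return words.count(keyword) / len(words)

def prominence(words, keyword):
    """1.0 if the keyword appears first, approaching 0 near the end."""
    first = words.index(keyword)  # position of the first occurrence
    return 1 - first / len(words)

page = "widgets for sale cheap widgets best widgets".split()
print(round(density(page, "widgets"), 2))    # 0.43 (3 of 7 words)
print(round(prominence(page, "widgets"), 2)) # 1.0  (it is the first word)
```

"Highest density with highest prominence" would then mean pages scoring near the top on both measures at once, which natural copy rarely does.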
Naturally, there are other factors (G claims about 200) that get factored into the algo, but for the KWs I check... this is what the highest-ranking sites ALL have in common, except for Wiki and Answers, which seem to have received a helping hand from G.
[edited by: Arctrust at 8:35 pm (utc) on Oct. 8, 2007]
If this approach is what's kicking in - and it's certainly possible, given that the patent is associated with localization - then Google would be folding "on-site" factors into their algo. So in making sense of the current shifts, we might have on-page, on-site, and off-site factors to consider. Bill considers how this same technology could extend far beyond "location signals".
In fact, this could account for something I'm seeing. Some urls seem to be getting a rankings boost when the entire site is predominantly about the search term, compared to other urls where the entire site is not so precisely focused.
joined:Feb 12, 2005
Rather, much of the change in the SERPs would suggest that Google is weighting SERP CTRs and the bounce-back rates from those clicks more heavily, similar to what they've increasingly been doing with AdWords in determining a Quality Score.
Maybe their new garbage-identifying mechanism weights this "Quality Score" of your SERP listing more heavily?
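A hypothetical sketch of the kind of listing "Quality Score" speculated about above, built from SERP click-through rate and bounce-back rate. The formula is an assumption for illustration, not anything Google has published.

```python
# Hypothetical SERP-listing quality score: frequent clicks that don't
# bounce straight back score higher. Formula is an illustrative assumption.

def listing_quality(impressions, clicks, bounce_backs):
    """Higher is better: CTR multiplied by the share of clicks that stick."""
    if impressions == 0 or clicks == 0:
        return 0.0
    ctr = clicks / impressions
    stick_rate = 1 - bounce_backs / clicks
    return ctr * stick_rate

print(round(listing_quality(1000, 100, 20), 3))  # 0.08 (10% CTR, 80% stick)
print(round(listing_quality(1000, 100, 90), 3))  # 0.01 (same CTR, mostly bounces)
```

Two listings with identical CTR would diverge sharply under such a score if one sends searchers straight back to the results page.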
In that post I did bring up the earlier patent application, but that was because a newer one expands the process involved to areas outside localization:
Here's the abstract:
Web pages of a Website may be processed to improve search results. For example, information likely to pertain to more than just the Web page it is directly associated with may be identified. One or more other, related, Web pages that such information is likely to pertain to is also identified. The identified information is associated with the identified other Web page(s) and this association is saved in a way to affect a search result score of the Web page(s).
The two share an author, Daniel Egnor, who was the technical lead for Google Local Search when the first patent application was written and later moved over to the algorithmic search team. The documents also share some ideas on how a page could be seen as an authority page. The earlier one focused on which page might be the best one (an authoritative page) for a business determined to be at a specific location.
The second one discusses localization, but it also explores determining whether a specific page might be an authoritative one for a specific query term, or for a category, as well as a business at a specified location.
The kind of boosting that you describe is detailed in the newer patent filing - folding "on-site" factors into their algorithms.
I have one client who keeps getting Google traffic to odd pages that are one click away from an important page. Usually these are long-tail searches, with the "authority" phrase as only part of the complete query. It's hard to nail this down definitively as the result of this patent technology, but it is suggestive. Even landing on a "wrong" page, the traffic is doing OK, because as a rule visitors use the site navigation to check out the "main" page.
If this is that patent in action, I guess the fine tuning and "machine learning" will take a while to kick in effectively.
I do think that this kind of authority might show more clearly in results where there perhaps should be an "authority"-type site, as in their example of "Ramada Cincinnati."
It's probably a good result that the Cincinnati-specific page on the Ramada site shows up ahead of the TripAdvisor result, or Yahoo Travel, or other travel-related sites. Is that this patent application in action? I don't know for certain.
Adam said: "Link love to a page may or may not always be appropriate to spread to a whole domain."
...Google traffic to odd pages that are one click away from an important page...
...I remember also an earlier discussion with Marcia, where she'd mentioned observing jumps in ranking for a page which she attributed to inbound links to a page one click away. I thought I had observed the same thing. We didn't know what might explain it, though.
Regarding the greater weight for on-page factors that I reported in my post above, I'm seeing it on several other sites across a fairly wide range of terms. These all happen to be index pages, so I don't think, within these sites at any rate, that the cause is the authority boost the patent covers.
[edited by: Robert_Charlton at 7:58 pm (utc) on Oct. 9, 2007]
I thought this might clear up overnight but it's still the same today.
Can anyone explain this?
What is the meaning of "depending on which Class C block I look at"? Does it mean viewing from different PCs on different Class C blocks, like using a proxy?
This is how it has always worked, from day one. Any page "one click away" from an important page is, by definition, linked from that important page, so of course it benefits from that link.
"Get links from important pages" is hardly news.
(If this "wrong page" thing is now more prevalent than previously, okay, that's something quite notable.)