| 7:14 am on Sep 27, 2010 (gmt 0)|
|I would get rid of visible pagerank. |
I agree but then wouldn't someone come along and make a fortune with a PR add-in tool? :)
| 7:32 am on Sep 27, 2010 (gmt 0)|
Grandfather legitimate backlinks from sites that no longer exist.
I finally looked at all the sites that linked to mine and was pleasantly surprised to see that numerous independent, theme-related sites linked to mine. Hooray....
But those sites all died because of the recession, or they let their domain registration lapse and their domain was taken over by spammers.
So I have probably lost about 30 to 40 percent of my naturally obtained backlinks, just due to economic attrition.
| 7:44 am on Sep 27, 2010 (gmt 0)|
|wouldn't someone come along and make a fortune with a PR add-in tool? |
Using what for data?
| 9:35 am on Sep 27, 2010 (gmt 0)|
If I knew I would build it. ;)
| 2:41 pm on Sep 27, 2010 (gmt 0)|
Punish copy-pasters. G even allows them to have networks of MFAs.
Mmmm, perhaps it is not always about the "algo" but about G's attitude and interests.
| 6:04 pm on Sep 27, 2010 (gmt 0)|
|Mmmm, perhaps it is not always about the "algo" but about G's attitude and interests. |
I know that lately, I have been feeling that way too...
However, if you search the words "google slam" you will find that a LOT of people who do MLM /affiliate programs have had their adwords listings booted out by google quite simply because they were not relevant enough or too blatantly spammy.
So I can't just point a finger at G and say they are just trying to grind out as much money as possible.
| 7:29 pm on Sep 27, 2010 (gmt 0)|
|so I have probably lost about 30 to 40 percent of my naturally obtained backlinks to my site just due to economic attrition. |
But this applies to everyone else as well, so it evens out. Grandfathering links is a bad idea, because the linked-to site could have changed to spammy junk as well. Old links aren't getting monitored at all.
What I would like to see is dead link checking ramped up. If any webmaster can check for 404s automatically then so can a search engine. This can be used as a quality signal to score sites for freshness and reliability, and there's anecdotal evidence that this goes on to an extent. I'd like to see the dial turned up on that, just to see what would happen.
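Just to sketch what "turning up the dial" might look like: this is a toy scoring function, not anything Google has published. The URL statuses here are invented, and a real crawler would issue HTTP requests instead of looking statuses up in a dict; the fetcher is injectable so the scoring logic stands on its own.

```python
# Hypothetical sketch: score a site's freshness/reliability by the share of
# its outbound links that are still alive. fetch_status stands in for a real
# HTTP check (e.g. a HEAD request per URL).

def dead_link_score(outbound_urls, fetch_status):
    """Return the fraction of outbound links that resolve OK (1.0 = all alive)."""
    if not outbound_urls:
        return 1.0
    alive = sum(1 for url in outbound_urls if fetch_status(url) == 200)
    return alive / len(outbound_urls)

if __name__ == "__main__":
    # Invented status data standing in for live HTTP responses
    statuses = {"http://example.com/a": 200,
                "http://example.com/b": 404,
                "http://example.com/c": 200,
                "http://example.com/d": 410}
    print(dead_link_score(list(statuses), statuses.get))  # 0.5
```

A search engine could then fold a score like this into ranking, demoting sites full of rotted links.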
| 8:02 pm on Sep 27, 2010 (gmt 0)|
Don't count blog comment links. I've heard Google might already be doing this but that's not what I'm seeing.
| 9:58 pm on Sep 27, 2010 (gmt 0)|
? Backlink count and quality, just like GG...
| 11:10 pm on Sep 27, 2010 (gmt 0)|
You may have underestimated that job (spidering the entire web ain't easy) - and also how PR is calculated now, including different link values for different positions on the page. There's a reason why MozRank and AC Rank are a bit, well, lame.
| 1:58 am on Sep 28, 2010 (gmt 0)|
I know there is such a thing as a paid link used to "game" Google, but how do you tell the difference between that and legitimate paid advertising, which is all over the net?
Also, a penalty should not be a permanent exclusion. People can make an honest mistake trying to take a shortcut. I believe in something like a three-strikes-you're-out rule, but not going straight to the penalty box for a first offence. That's a little too drastic.
| 2:09 am on Sep 28, 2010 (gmt 0)|
A paid link that passes PageRank is what Google says "No" to. If you stop the PR from passing (use a scripted redirect with robots.txt Disallow, or add a nofollow attribute to the link), then you're fine.
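To make the nofollow part concrete, here's a small stand-alone sketch (my own illustration, not a Google tool) that walks a snippet of HTML and sorts links into those that pass PageRank and those marked rel="nofollow". It uses only Python's standard library.

```python
# Illustrative link audit: flag which <a> tags pass PageRank (no rel="nofollow")
# and which ones Google is told to ignore.
from html.parser import HTMLParser

class LinkAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.passing = []   # links that pass PageRank
        self.nofollow = []  # links marked rel="nofollow"

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href")
        if not href:
            return
        rels = (attrs.get("rel") or "").lower().split()
        (self.nofollow if "nofollow" in rels else self.passing).append(href)

audit = LinkAudit()
audit.feed('<a href="/editorial">story</a> '
           '<a href="http://sponsor.example" rel="nofollow">ad</a>')
print(audit.passing)   # ['/editorial']
print(audit.nofollow)  # ['http://sponsor.example']
```

The scripted-redirect-plus-robots.txt-Disallow route achieves the same end by a different mechanism: the bot never crawls the redirect target, so no PR flows.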
Many penalties - MOST penalties in fact - are currently not permanent. Most sites that get permanently banned do know what they did.
Google's algo "could" just ignore all such links if they wanted it to. But then there would not be any social effect among webmasters. Fear, uncertainty and doubt are great social control factors.
If I were in charge of the Google algorithm, I'm beginning to think I might communicate LESS about it, not more.
| 5:50 am on Sep 28, 2010 (gmt 0)|
I am with Tedster on this one.
Instead of handing out penalties, they should simply make those links stop working.
For me, the reason for penalties is the fact that they are unable to detect paid links properly.
So I'd simply fix that first ;-)
| 7:44 am on Oct 3, 2010 (gmt 0)|
1) more rigorous exclusion of duplicate content
2) better detection of networks of sites linking to each other
3) better ranking of new sites that quickly gain some good links, to make it easier for them to move up in the rankings against established sites.
| 3:29 pm on Oct 3, 2010 (gmt 0)|
Hack into Bing's algo
CTRL + A
CTRL + C
Open Google algo
CTRL + V
A bit simplistic, but the gist of it is there ;-)
| 9:39 pm on Oct 4, 2010 (gmt 0)|
I see so many people talking about links and their value who don't seem to realize that last year, with the removal of the PageRank tool from Webmaster Tools, Google declared linking a dead metric.
Google has been fighting a losing battle with links since the original algorithm.
First it was reciprocal linking without relevance.
Then link farms.
When Google introduced PageRank it was intended as an honest rating based on votes from pages linking to the target page.
It was hoped that this would proceed in an organic, (unpaid, non-commercial), mode.
This didn't happen. Everybody and their brother invented a plan to influence the Search Engine Results Positions, (SERPs) using linking schemes. Even today some of the most respected SEO firms offer linking plans which they say will influence the Search engine rankings.
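For anyone who hasn't seen the original "voting" idea spelled out, here is a toy version. The damping factor d = 0.85 comes from the original Brin/Page paper; the four-page link graph is invented for illustration, and real PageRank (then and now) layers much more on top of this.

```python
# Toy PageRank: each page "votes" for the pages it links to, splitting its
# current score evenly across its outbound links. Iterating converges to a
# stable rating for each page.

def pagerank(links, d=0.85, iterations=50):
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {}
        for p in pages:
            # sum of votes from every page q that links to p
            votes = sum(rank[q] / len(links[q]) for q in pages if p in links[q])
            new[p] = (1 - d) / n + d * votes
        rank = new
    return rank

links = {"A": ["B", "C"], "B": ["C"], "C": ["A"], "D": ["C"]}
ranks = pagerank(links)
print(max(ranks, key=ranks.get))  # "C" collects the most votes
```

Notice that D, which nobody links to, ends up with only the baseline (1 - d)/n score; that is exactly the "honest vote count" the scheme was meant to be, and exactly what link schemes set out to inflate.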
There have been almost constant changes to the PageRank system since it went public.
From inception to June 2002, small tweaks were continually made to the ranking process with a minimum of 19 days and a maximum of 54 days between updates.
In June, July, Aug, Sept, Oct 2002 major changes were made to the PR tool bar and Google directory, and another 12 updates were done which culminated in the Nov 16, 2003 "Florida" update.
Along with LocalRank*, Florida also saw the SEO filter (search engine optimization filter) to weed out over optimized sites.
LocalRank was reported by the web workshop in Dec 2003.
"Another idea that has taken hold is that Google have implemented LocalRank. LocalRank is a method of modifying the rankings based on the inter-connectivity between the pages that have been selected to be ranked."
After Florida came Austin which targeted factors like free for all links.
Brandy came right on the heels of Austin and it changed the basic concept of page rank.
A report on Brandy said:
Links and Anchor Text.
Google shifted its focus to the nature, quantity and quality of inbound as well as outbound link anchor text of a website. It decreased the importance of PageRank which had been Google’s unique ranking system.
Right here is the first official notice of the devaluing of links. As of Feb 11, 2004 links count for less and, if you investigate the news about the updates, LSI was introduced. The combination of LSI and devaluing of links made on-page work more important.
Brandy also saw Google talk about Link Neighborhoods.
About a year later, in May, Bourbon arrived, with updates to it continuing for a few months. It targeted:
Non-thematic Linking - Having links to pages which contain content irrelevant to the source page's subject matter.
Low Quality Reciprocal Links - Links from "bad neighborhoods".
Fraternal Linking - Creating a network of sites, which all link back to the same "master" site in an effort to boost the master site's rankings.
In Oct the same year Google released the first of 3 Jagger updates.
Jagger focused on the growing "Generic" SERP irrelevancy, Reciprocal Linking Abuse, and Ballooning BlogSpam
Their solutions were increased importance placed on IBL (Inbound Links) Relevancy.
With reciprocal linking abuse growing out of hand, even "organic" SERP were losing relevancy.
Increased importance placed on OBL (Outbound Links) Relevancy.
The "Bourbon" update delivered a marked hit on irrelevantly linked and broader based directories.
nofollow was brought in.
More weight thrown back to PR @ top domain.
Google departed from earlier values ascribed to PageRank from the beginning in quest of content, content freshness and other factors.
It has been suggested that more emphasis was being placed on the PR of the site, rather than the page.
After Jagger3 you could find PR0 pages highly placed in important Topic SERPs. This is another clear indicator that linking is becoming less important.
In regards to links, Jagger dealt with:
Value of incoming links
Value of anchor text in incoming links
Content on page of incoming links
Age of the incoming links
Nature of sites linking to you
Speed and volume of incoming links created
Value of reciprocal links
Impact of outbound links / links page on your website
From Dec '05 to March '06 "Big Daddy" rolled in.
A major infrastructure change, again partially to combat link spam. As the update spread across the data centers, people started to notice that many pages from their sites had disappeared from the regular index.
Matt Cutts, a senior software engineer at Google, put it down to "sites where our algorithms had very low trust in the inlinks or the outlinks of that site. Examples that might cause that include excessive reciprocal links, linking to spammy neighborhoods on the web, or link buying/selling."
The rest of 2006, all of 2007 and 2008 saw about 22 recorded changes to PR and back links, including some changes and reversals to changes.
At the beginning of '07 Google came out with an algo to try and defeat Google Bombing.
2009 was visibly about PR with 6 "Google dances" due to PR updates which culminated in the removal of the PR tool bar in Google's webmaster tools.
Linking was officially dead, PageRank was no longer to be trusted.
Developers were told "We’ve been telling people for a long time that they shouldn’t focus on PageRank so much; many site owners seem to think it's the most important metric for them to track, which is simply not true."
Although PR was declared a dead metric for SERPs, it is still employed, albeit in a considerably changed algo.
Largely unnoticed due to some misdirection on Google's part, the Mayday update rewrote the basis of PageRank from a purely mathematical process to one based on relevance between linking and linked pages.
The telling thing that Google said was reported by Vanessa Fox in an interview about Google: "I asked Google for more specifics and they told me that it was a rankings change, not a crawling or indexing change."
Following Google's past terminology,
"Rankings" refers to PageRank ratings.
"Crawling" refers to how often your pages are visited.
"Indexing" is all about SERPs.
Since Google has declared PR a dead metric, this can only be an attempt to revive it.
As it stands now links affect PageRank which does not affect SERPs.
Hands-on experience in developing my new site shows that my PR4 was achieved with only 115 incoming links, 113 of them on PR0 pages.
This alone negates the old method of calculations where the PR of the linking page was taken into consideration.
My older site has a PR3 and has over 500 links.
In the SERPs the site, with ONE inbound link, placed higher than an authority site with 277,000 IBLs for a phrase with 33,000,000 competing sites. (Links reported by Yahoo).
This means that the amount and quality of inbound links do not influence SERPs. You can see this on just about every SERP page.
Since Mayday Google has stated that PR is used to determine the frequency a site gets spidered.
Higher PR sites get visited more often and information is fresher.
That a new site can hold the top position in a competitive search without any "off page optimization" means that the controlling factors for positioning are all on-page.
SERPs and PageRank are now separate entities.
SERPs are decided by on page factors, relevance, presentation, silo and synonym densities.
Links build PageRank which is used to present authority without an influence on SERPs.
| 7:29 pm on Oct 5, 2010 (gmt 0)|
Links were NEVER a feasible way to bring web sites to the fore in SERPs.
Looking at it ethically:
B-to-B sites could get links from customers - lots of them in some cases - because their customers had web sites where links could be placed.
Small sites whose customers were mainly general public (millions of sites such as health, domestic shopping, travel...) could get almost no links because their customers did not have web sites.
Where is the fairness - or even feasibility - in that?
To make a further point: few of our web sites have more than a handful of links at most - some as few as two. We've never had problems getting good rankings, so why all this fuss about getting millions of (mostly fake) links to a site?
The real outcome of google's "get lots of links" policy has been to push up black hat link generation and promote the very scenario they claim is "google illegal".
| 9:14 pm on Oct 5, 2010 (gmt 0)|
@dstiles, at the beginning links counted highly towards positions.
That was the whole focus of Google core algos.
|PageRank reflects our view of the importance of web pages by considering more than 500 million variables and 2 billion terms. Pages that we believe are important pages receive a higher PageRank and are more likely to appear at the top of the search results. |
I have a small site that has gained over 5000 links from individuals using our graphic plugin filter software. Links could be built organically. It just took time.
I don't think it was Google's plan to have link building rise to the frenzy it has.
However, not speaking out is a great piece of misdirection.
Let the SEO "Experts" go off chasing link rainbows. It keeps them from actually performing any valid SEO.
| 3:53 am on Oct 6, 2010 (gmt 0)|
If you keep collecting and studying the data, you will see that backlinks ain't all that anymore. I am sure that Google's "200+ algo factors" include some we've barely dreamed of. If I could change anything about Google's algo, I'd turn up the knob a bit for on-page content. Sometimes the search results today look like they've lost the plot.
The other choice would be to drop the name "search engine" completely and call it something new, like "information engine".
| 9:14 pm on Oct 7, 2010 (gmt 0)|
Our site has great content all around; users come to the site and stay for a long time. I've done every white hat method since the start of the site, it's the oldest domain I have, and now it's out of the top 5 in the current Google dance. Google's algorithm is simple: screw the webmaster. I will keep my efforts for Yahoo and Bing, as they at least don't screw you over in one update. A 5-position drop I can take, but a kick to the bottom of the second page, that's just cruel.