
Google News Archive Forum

Speculation about August Google Changes
google update aug 2004
conradmiller123




msg:49014
 8:11 pm on Aug 10, 2004 (gmt 0)

From what I can tell, G is trying to weed out paid links, or off-topic links from site A to site B.

Especially sites with many links from site A to B using the same anchor text across the whole site.

As far as I can tell, the links are not worthless; they are just devalued.

Also, I think the amount of PR being passed has been reduced.

Any ideas?
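For what it's worth, a pattern like that would be cheap to detect from crawl data. A minimal sketch, assuming a table of (source page, target site, anchor text) rows; the site names and the 50% threshold are invented for illustration, not anything Google has confirmed:

```python
from collections import Counter

# Hypothetical crawl rows: (source_page, target_site, anchor_text).
# Site names and the 50% threshold are invented for illustration.
links = [
    ("a.example/page1", "b.example", "cheap widgets"),
    ("a.example/page2", "b.example", "cheap widgets"),
    ("a.example/page3", "b.example", "cheap widgets"),
    ("a.example/page4", "c.example", "news"),
]
total_pages_on_a = 4  # pages crawled on a.example

# Count how many pages of site A carry each (target, anchor) pair.
counts = Counter((target, anchor) for _, target, anchor in links)
for (target, anchor), n in counts.items():
    coverage = n / total_pages_on_a
    if coverage >= 0.5:  # same link, same anchor, on half the site or more
        print(f"suspicious: {target} via {anchor!r} on {coverage:.0%} of pages")
```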

 

IanTurner




msg:49015
 8:46 pm on Aug 10, 2004 (gmt 0)

The real difficulty I have with this theory is: how do you identify such links as being different from other links?

rudy0826




msg:49016
 9:00 pm on Aug 10, 2004 (gmt 0)

I had a strange thing happen. I was part of that nigritude ultramarine contest, and Google associated me with those sites. Now I'm unassociated with my original sites (for sites that had few or weak relationships) and have lost all those backlinks in Google...

robotsdobetter




msg:49017
 9:05 pm on Aug 10, 2004 (gmt 0)

The real difficulty I have with this theory is: how do you identify such links as being different from other links?

I have to agree with that. How could they find these sites without humans being involved?
conradmiller123




msg:49018
 9:09 pm on Aug 10, 2004 (gmt 0)

Well, if site A has a link to B on 50% or more of its pages, I think it would be easy to spot.

As for off-topic links, Google Sets proves that they can detect word relations by checking the frequency with which terms are found together - as in, A is found with B X% of the time, and A, B, and C are found together X% of the time.

[labs.google.com...]

Take a look.

That is obviously a Bayesian-network-based system.

Furthermore, they purchased that semantic discovery company.

I'm not sure, but this is the best I can come up with.
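To make the co-occurrence idea concrete, here is a toy sketch of counting how often terms appear together across pages; the corpus and the scoring are entirely made up, just to show the shape of the computation:

```python
from itertools import combinations
from collections import Counter

# Toy corpus: each "page" is the set of terms it contains.
pages = [
    {"flowers", "delivery", "roses"},
    {"flowers", "roses", "wedding"},
    {"cars", "leasing", "vans"},
    {"flowers", "delivery", "vans"},
]

term_counts = Counter()
pair_counts = Counter()
for page in pages:
    term_counts.update(page)
    pair_counts.update(combinations(sorted(page), 2))

def cooccurrence(a, b):
    """How often b appears alongside a, relative to a's total appearances."""
    pair = tuple(sorted((a, b)))
    return pair_counts[pair] / term_counts[a]

print(cooccurrence("flowers", "roses"))  # 2/3: fairly related
print(cooccurrence("flowers", "cars"))   # 0.0: unrelated
```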

trimmer80




msg:49019
 9:09 pm on Aug 10, 2004 (gmt 0)

They could make good guesses: size/position of text, trigger words on the page ("advertise with us", etc.), off-topic links and link text, non-reciprocal links.

Out of interest, I have only ever once been stung by a sandbox, and that was for a paid link. I think it is in Google's best interest to stop paid linking, so I would not be surprised if they were sandboxing these links and now devaluing them.
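Those guesses are easy to express as a crude score. A sketch, where the signals, weights, and trigger phrases are all invented, not known Google rules:

```python
# Toy paid-link likelihood score; everything here is an assumption.
TRIGGERS = ("advertise with us", "sponsored links", "buy a link")

def paid_link_score(page_text, anchor_on_topic, reciprocal):
    """Higher score = more likely the link is a paid placement."""
    score = 0.0
    if any(t in page_text.lower() for t in TRIGGERS):
        score += 0.5  # page openly sells placements
    if not anchor_on_topic:
        score += 0.3  # anchor text unrelated to the page's topic
    if not reciprocal:
        score += 0.2  # no link back, typical of one-way buys
    return score

# A page selling ads, with an off-topic anchor and no reciprocal link:
print(paid_link_score("Advertise with us today!", False, False))  # 1.0
```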

lars




msg:49020
 9:33 pm on Aug 10, 2004 (gmt 0)

I have to agree with that. How could they find these sites without humans being involved?

Who's to say that Google can't use a human touch? Why could they not hire a project team to look at URLs and make determinations to check automated systems against human judgement? Google's business plan calls for the best results, but doesn't specify that it will ONLY use automated methods.

I proposed that this was a human-affected update in the Traffic Dropped thread. I stand by that assertion. The results are too random (spotty, actually) to have come solely from an automated process.

trillianjedi




msg:49021
 9:40 pm on Aug 10, 2004 (gmt 0)

There are clearly humans involved in the process - sometimes a ban is obviously manual.

But 3+ billion pages?

My best guess is that the computers' best guesses go onto a manual list, which may then be further filtered by cross-referencing other data (spam reports?).

TJ

Ian_Cowley




msg:49022
 9:50 pm on Aug 10, 2004 (gmt 0)

You say they are weeding out paid links. But there's another word for a paid link - an advert. I can't believe that G is penalising sites that have adverts on other sites; it just wouldn't make sense. They may be discounting them as backlinks, but where would they draw the line?

conradmiller123




msg:49023
 9:53 pm on Aug 10, 2004 (gmt 0)

I have a specific example that lets me know this is what is going on. We spammed a three-word phrase really hard; the anchor text had those words in it, on a site that was totally unrelated. The site that linked to us provided 1,400 links. In less than 4 days we were number 1 on that phrase, and we don't use that anchor text anywhere else.

And we still are #1 on Yahoo on that phrase.

... Yahoo has not caught on yet.

Plus, if you look at the link-buying community, there is not much bidding going on, because I think everyone buying links is taking a wait-and-see attitude.

If I'm right and this is the case, the bottom is going to fall out of the link-buying market. Who is going to pay $800 a month if the benefit is no more than a Yahoo directory link you can get for $300 a year?

So links are going to have to be sold one at a time, and far cheaper.

hugo_guzman




msg:49024
 10:07 pm on Aug 10, 2004 (gmt 0)

"I think it is in googles best interest to stop paid linking and thus i would not be surprised if they were sandboxing on these links and now devaluing."

Google (or any other search engine for that matter) will never ban or devalue text link advertising as long as the links are relevant and not overabundant.

The reason: backlinks are the backbone of the internet. Google cannot differentiate between a link that you put on your site because you really like it and think your visitors will find it useful, and a link you put on your site because someone is paying you.

That is a webmaster to webmaster transaction. Just like if I ask someone to link to me in exchange for design services, etc... You cannot prevent webmaster from bartering for links (whether the payment is in cash, services, or a reciprocating link).

In fact, I would reason that google benefits from link advertising because it increases the number of indexed domains, which by nature, would increase the scope of their search engine (more sites=more resources for google's visitors).

I have often mentioned (though not in this forum) that the visual "PR" (the little green bar) is only useful as a gauge for link exchanges and link purchases. On functional level, that is what most of us use it for.

Funny how Yahoo developed their own PR type tool in response to the popularity of Google's original "PR" bar in webmaster/seo circles. Are searching really looking to curtail link buy/selling/trading?

I don't think so. 50 years from now link buying/selling/exchanging will be coined "the world's oldest profession" in seo/webmaster community.

conradmiller123




msg:49025
 10:12 pm on Aug 10, 2004 (gmt 0)

"There are clearly humans involved in the process - sometimes a ban is obviously manual."

Yea maybe for banning. But the whole point of google is automation. you dont hire PHD's to manualy look at pages. they are just trying to find the common thread between sites. That flags their program to devalue a link. Also they can't compair any indivdual link to the whole index because. Functions that do this shoot off into exponents. and we know they have limited resources. So G has to make a judgement based on simplistic values. Up until Aug 6 you could buy your way to the top with a few simple high pr links. Now we just have to do it from multiple IP addresses. One link at a time.

trillianjedi




msg:49026
 10:20 pm on Aug 10, 2004 (gmt 0)

Up until Aug 6 you could buy your way to the top with a few simple high-PR links.

You can still do it now.

In the markets I follow, there's a clear distinction being made between sites selling text adverts and sites selling PR, or selling a link for the advertised sole purpose of manipulating the SERPs.

I can't see how that can be achieved without manual review.

TJ

conradmiller123




msg:49027
 10:28 pm on Aug 10, 2004 (gmt 0)

Well, if you want to see how, it's pretty easy: just type in www.google.com.

beren




msg:49028
 10:41 pm on Aug 10, 2004 (gmt 0)

From what I can tell, G is trying to weed out ... off-topic links from site A to B.

Huh. They SHOULD be trying to weed out off-topic links, but I don't see any evidence that they are doing so. It appears to be getting worse from what I can see.

trillianjedi




msg:49029
 10:57 pm on Aug 10, 2004 (gmt 0)

Well, if you want to see how, it's pretty easy: just type in www.google.com.

OK, I did that, but it didn't show me how to use an automated process to detect whether a link is being purchased as a genuine advert or being purchased to manipulate the SERPs.

TJ

trimmer80




msg:49030
 11:11 pm on Aug 10, 2004 (gmt 0)

I don't think the possibility of devaluing "off-topic link advertising" should be ignored.

What is the difference between a natural link and a bought (or traded) link? In one, the link text is created by the site owner; in the other, the link text is usually chosen by the link recipient.

Thus Google may have a way to tell whether a link is bought or traded. Remember that for this type of change, collateral damage to innocent parties may be perfectly acceptable.

Also, Google may have changed its algo's tolerated keyword density for inbound anchor text, i.e. maybe now you should not have more than 30% of your anchor text the same?

At this stage it's all speculation.
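That last speculation is at least easy to test against your own backlink data. A quick sketch of the check, using the 30% figure floated above purely as a placeholder threshold:

```python
from collections import Counter

def over_optimized(anchors, max_share=0.30):
    """Flag profiles where one anchor text exceeds max_share of inbound links."""
    counts = Counter(a.lower().strip() for a in anchors)
    top_anchor, top_count = counts.most_common(1)[0]
    share = top_count / len(anchors)
    return share > max_share, top_anchor, share

# 7 of 10 inbound links use the same anchor text.
anchors = ["cheap widgets"] * 7 + ["example.com", "widgets blog", "home"]
flagged, anchor, share = over_optimized(anchors)
print(flagged, anchor, f"{share:.0%}")  # True cheap widgets 70%
```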

willybfriendly




msg:49031
 11:13 pm on Aug 10, 2004 (gmt 0)

They SHOULD be trying to weed out off-topic links

The problem is how do you define an off-topic link. There are far too many tangential relationships between things:

Aunt Tilly's Flowers gracing the table of Uncle Elmer's restaurant
Uncle Elmer's contributions to the neighborhood little league stadium
Grandpa Chester's windshield repair service fixing glass broken by baseballs at the little league stadium
Grandma Elsie's Car Lot, providing the vehicles used by Grandpa Chester's business, and also leasing Aunt Tilly delivery vans for her cut flowers.

Are these really "off-topic"? How can an algo tell?

WBF

trimmer80




msg:49032
 11:32 pm on Aug 10, 2004 (gmt 0)

The problem is how do you define an off-topic link

I think the "world is flat" mentality is alive and well.

Google has 3 billion pages of data to play with.

It is not impossible to conceive that, if Google can assign a topic to each page, it can then map which topics are related to each other and, from that, make assumptions about what is off/on topic... Do a search in these forums for Latent Semantic Indexing.
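For anyone who doesn't want to dig through old threads, here is a bare-bones toy of what latent semantic indexing does: factor a term-document count matrix with an SVD and compare documents in the reduced "topic" space. The vocabulary and counts are invented, and this says nothing about how (or whether) Google actually uses LSI:

```python
import numpy as np

# Rows = terms, columns = documents (toy counts).
terms = ["flower", "rose", "wedding", "car", "lease"]
A = np.array([
    [3, 2, 0],   # flower
    [2, 3, 0],   # rose
    [1, 2, 0],   # wedding
    [0, 0, 4],   # car
    [0, 0, 3],   # lease
], dtype=float)

U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 2                                # keep the top-k latent "topics"
docs = (np.diag(s[:k]) @ Vt[:k]).T   # documents in the latent space

def cos(u, v):
    return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

print(cos(docs[0], docs[1]))  # high: both flower-themed docs
print(cos(docs[0], docs[2]))  # near zero: flowers vs. cars
```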

robotsdobetter




msg:49033
 11:46 pm on Aug 10, 2004 (gmt 0)

Who's to say that Google can't use a human touch? Why could they not hire a project team to look at URLs and make determinations to check automated systems against human judgement? Google's business plan calls for the best results, but doesn't specify that it will ONLY use automated methods.

I am not saying they couldn't, but they are going to need a lot of manpower for that!

matt21811




msg:49034
 11:48 pm on Aug 10, 2004 (gmt 0)

The problem isn't "how do you define an off-topic link".

The problem is "how do you define an on-topic link", and that's easy: it's Google's core business.

Then they just weight on-topic incoming links more heavily in the algo.

If Aunt Tilly's Flowers doesn't have on-topic inbound links, then it will be displaced by a flower shop that does.

I don't see a problem for Google.
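As a sketch of what that weighting could look like - the boost factor and the idea of a [0, 1] topic-similarity score are my own assumptions, not anything Google has published:

```python
def weighted_link_value(base_value, topic_similarity, on_topic_boost=2.0):
    """Scale a link's value by source/target topic similarity.

    topic_similarity is assumed to be in [0, 1], e.g. a cosine score
    like the one in the LSI sketch earlier in this thread.
    """
    return base_value * (1.0 + (on_topic_boost - 1.0) * topic_similarity)

print(weighted_link_value(1.0, 0.9))  # strongly on-topic: ~1.9
print(weighted_link_value(1.0, 0.1))  # mostly off-topic:  ~1.1
```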

trimmer80




msg:49035
 11:56 pm on Aug 10, 2004 (gmt 0)

matt21811
Welcome to WebmasterWorld!

kaled




msg:49036
 12:01 am on Aug 11, 2004 (gmt 0)

Presumably Google have procedures that must be followed to verify that manual action is required (and select the appropriate action). However, as soon as you formalize those procedures, you may as well automate them.

So, the reality is that you need people to teach the system what actions to take, but you don't need people to take those actions (except in special cases such as a legal intervention).

The real issues of automated/manual intervention are:
1) What triggers an initial inspection?
2) What degree of manual validation is carried out to ensure the correct actions are taken?

Kaled.

trimmer80




msg:49037
 12:26 am on Aug 11, 2004 (gmt 0)

The real issues of automated/manual intervention are:
1) What triggers an initial inspection?
2) What degree of manual validation is carried out to ensure the correct actions are taken?

I agree. I would say the decision to take any manual action is based on an acceptable level of collateral damage.

I.e., if we see an 80% reduction in spam, is it worthwhile to disadvantage the 10% of webmasters who are innocent? I believe that Google will also choose the result that gives the customer the best SERPs.

diamondgrl




msg:49038
 1:53 am on Aug 11, 2004 (gmt 0)

As for the question of whether Google can use human editors to review sites: I think 50 people using some powerful semi-automated tools for pre-screening could probably identify 10,000 entire sites a day that clearly are intended for spam (200 per person per day). If you figure each of these major spam sites has 10,000 pages, that's 100 million pages of content potentially banned each day with this level of effort. Then add some other algorithms to detect which sites these are affiliated with, and you can ban numerous others as well.

So I don't think it's unrealistic in the least to use human power to get rid of some of the worst detritus on the web. That doesn't mean, of course, that small acts of spamming won't continue or that large acts can't get away with it for a short time, but it does reduce the probability that it will pay.

And it also doesn't mean that Google is doing any of this, just that it is possible.
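The arithmetic holds up, for what it's worth:

```python
reviewers = 50
sites_per_reviewer_per_day = 200
pages_per_site = 10_000

sites_per_day = reviewers * sites_per_reviewer_per_day  # 10,000 sites/day
pages_per_day = sites_per_day * pages_per_site          # 100,000,000 pages/day
print(f"{sites_per_day:,} sites/day, {pages_per_day:,} pages/day")
```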

walkman




msg:49039
 2:01 am on Aug 11, 2004 (gmt 0)

This off-topic link thing is a joke. I'm not going to link to a competitor, and a competitor is not going to link to me. However, a person with a personal site will put up a link saying "check out walkman's car site" or whatever it may be.

gomer




msg:49040
 2:06 am on Aug 11, 2004 (gmt 0)

could probably identify 10,000 entire sites a day that clearly are intended for spam

So I don't think it's unrealistic in the least to use human power to get rid of some of the worst detritus on the web

Why would they do something like that when they repeatedly ignore the spam reports sent to them... over and over again?

trimmer80




msg:49041
 3:09 am on Aug 11, 2004 (gmt 0)

So I don't think it's unrealistic in the least to use human power to get rid of some of the worst detritus on the web

I think from a business-model perspective this is risky. What if one of those employees starts a site at home and then spends a couple of days penalising all known competitors?

matt21811




msg:49042
 3:14 am on Aug 11, 2004 (gmt 0)

Walkman,

I don't think Google devaluing off-topic links is such a big problem.

The only people it will affect badly are those whose position comes from buying or trading for large numbers of links on otherwise unrelated pages.

If Google has really done this (and I think they have), then they have found a clever way to demote sites that have used "artificial" methods to promote themselves in the SERPs. I am constantly amazed at how clever those guys are.

I'll admit to all that I'm trying to make money doing SEO, with mild success. This off-topic link adjustment could have huge advantages for me: it clears away lots of competitors in front of me who have the time to do thousands of link trades or the cash to buy links. All I have to do now is get a few high-PR on-topic links, and I too can have success.

bears5122




msg:49043
 3:19 am on Aug 11, 2004 (gmt 0)

Google needs to find better things to do, such as improving their ever-decreasing SERP quality.

Links are advertisements. Right now, text links are hot, just like banner ads were 5 years ago. Does Google not realize how people make money on the Internet? How they pay for hosting? How they make a living? ADVERTISING is one of the key ways. Yahoo isn't building its network for the fun of it.

Google devaluing paid links is like TV Guide not listing stations that run commercials. If this is true, I can't wait till Google runs itself into the ground.
