Google News Archive Forum

-nonsense stopped working now
zacsg




msg:148854
 5:31 am on Dec 11, 2003 (gmt 0)

"kw -fwefer [webmasterworld.com]" now returns the same results as "kw", at least from here in Singapore. Anyone else seeing this?

 

espeed




msg:148884
 6:07 am on Dec 12, 2003 (gmt 0)

Any algorithm/filter can increase or decrease a page's score based on an infinite number of factors, or change how much weight it gives to certain aspects. Any such change could decrease Google's opinion of how important your page is, so that your page's score is relatively low compared to the 1000 pages that appear in the SERPs.

I_am_back




msg:148885
 7:55 am on Dec 12, 2003 (gmt 0)

Any algorithm/filter can increase or decrease a page's score based on an infinite number of factors, or change how much weight it gives to certain aspects.

Can we change that to read

Any filter can indirectly increase the ranking of a page that does not meet the filter criteria.

Just like when you search, e.g.

"Widgets -blue": you are telling Google you do not want the word blue, not that you want it ranked lower. Without the negative keyword you may have 10,000 pages. With the negative keyword you will get 5,000 pages, meaning some pages just increased in ranking because you have *filtered* out 5,000 pages.

Powdork




msg:148886
 8:21 am on Dec 12, 2003 (gmt 0)

Sorry Dave,
You can't change what someone has already written.

filter-2: alters the frequency spectrum of signals passing through it

This definition (provided from WordNet via Dictionary.com via Google) says nothing of removing anything, so it is possible for a filter to change rather than just eradicate. Additionally, it's possible for a filter to remove parts of your page's indexed data, e.g. if you trip the filter for a certain kw, then the filter removes the instances of that kw from the page's indexed data.

I am not suggesting any of the above is responsible for Google's relevance issues, which you continue to deny exist.

ticketleap




msg:148887
 8:42 am on Dec 12, 2003 (gmt 0)

This definition (provided from WordNet via Dictionary.com via Google) says nothing of removing anything.

A filter is a .....hey, wait a minute...Doesn't that depend upon what the meaning of the word is is?

Seriously...you can break the words apart forever...it doesn't really change anything. It's like debating what the difference is between a sound and a noise.

bye

I_am_back




msg:148888
 8:53 am on Dec 12, 2003 (gmt 0)

Sorry Dave

Who :o)

Google's relevance issues, which you continue to deny exist.

No, all I ever keep saying (sorry, I mean what Dave keeps saying) is that Google's SERPs are still better than any other SE's. They may or may not be *less* relevant than at some other point in time, but they (IMO) are still the best.

Have you ever used AutoFilters in Excel, or filtered down a database? It *removes/hides* data not matching the criteria. It does not keep that data and just push it down the list or table.
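
For illustration, a minimal Python sketch of the point made here and in the "Widgets -blue" example above, with made-up URLs and scores (nothing below is real Google data): excluding pages removes them from the result set, and the survivors move up without being re-scored.

```python
# Hypothetical result set: (url, score) pairs in ranked order.
results = [
    ("blue-widgets.example", 0.95),
    ("widget-world.example", 0.90),
    ("blue-widget-shop.example", 0.85),
    ("widget-reviews.example", 0.80),
]

# "widgets -blue": drop every page that mentions the excluded word.
filtered = [(url, score) for url, score in results if "blue" not in url]

for rank, (url, _) in enumerate(filtered, start=1):
    print(rank, url)
# widget-reviews.example was #4 of 4 before the exclusion; it is now #2 of 2.
```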

This *is* from Google

Google's SafeSearch screens for sites that contain pornography and explicit sexual content and eliminates them from search results. While no filter is 100% accurate, Google's filter uses advanced proprietary technology that checks keywords and phrases, URLs and Open Directory categories.

By default, moderate filtering is set to exclude most explicit images from Google Image Search results

[edited by: I_am_back at 8:59 am (utc) on Dec. 12, 2003]

allanp73




msg:148889
 8:56 am on Dec 12, 2003 (gmt 0)

I think I'm getting tired of complaining about Google's SERPs. We all realize something is happening which removes many sites for particular keywords. Now we have to figure out how to get around this problem.
I started out believing that Google was blocking sites for over-optimization. I checked sites in the SERPs (some had high keyword phrase density) and tested by modifying my own sites. So far all my tests have come back negative.

Then I looked at links. Marissa Mayer, the Director of Consumer Web Products at Google, said: "If you dropped in rankings, go back and look at who you linked to and who's linking to you. If any of these people are using spam techniques, they're the reason your site no longer appears on Google."
This makes me wonder if Google could penalize you for the sites that link to your site. I can't believe this to be true, because it would allow people to destroy their competition by linking to them. Instead, I looked at what links are leaving my sites. I ran the following three tests:
1) Site A links from its index page to 8 sites on the same IP
2) Site B links from its index page to 2 sites on the same IP and 4 on different IPs
3) Site C links from its index page to 6 sites on 6 different IPs that are not affiliated with Site C
All three sites have excellent content and PR 4 or 5, and none are over-optimized. The result so far is that none have returned to the SERPs.

So now what? I looked at on-page factors and off-page factors and nothing. I really don't know where to go from this point. Any ideas?

I_am_back




msg:148890
 9:02 am on Dec 12, 2003 (gmt 0)

This makes me wonder if Google could penalize for sites that link to your site

I would imagine that Google does not allow reverse engineering. If it did, it would be all TOO easy to eliminate the competition. You should not worry about who links to you, but rather who you link to.

jaffstar




msg:148891
 9:10 am on Dec 12, 2003 (gmt 0)

I have been closely monitoring Google's filter and have started to see them relaxing it. Here is some proof

I just feel that I need to explain what I meant by the above.

I discovered Scroogle a few days ago. This site shows how many sites have been excluded from the SERPs for certain kw searches (with and without the "filter"). Yesterday, when I ran searches through Scroogle, the results showed far fewer sites being excluded. Apparently the double minus search does not show filtered results anymore, thanks Google :)
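
For illustration, a hypothetical Python sketch of the comparison a Scroogle-style tool makes; fake_results() is a stand-in so the sketch runs, and no real Google API or scraper is assumed here.

```python
# Run the query normally, run it again with a nonsense exclusion word appended,
# and list the sites that only show up when the trick is applied.
def only_with_trick(query, get_results, top_n=3):
    normal = set(get_results(query)[:top_n])
    tricked = get_results(f"{query} -asdqwezxc")[:top_n]
    return [url for url in tricked if url not in normal]

# Placeholder data so the sketch runs; the real tools scraped the two SERPs.
def fake_results(query):
    if "-asdqwezxc" in query:
        return ["smallbiz.example", "niche.example", "directory.example"]
    return ["directory.example", "bigbrand.example", "portal.example"]

print(only_with_trick("blue widgets", fake_results))
# ['smallbiz.example', 'niche.example'] -- the pages people read as "filtered"
```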

Also, by saying "filter", yes it’s only a word, but it’s definitely either an OOP or a tweak in their Algo which excludes certain sites on certain KW searches.

All my sites have been unaffected by the new algo/tweak/filter, my main competitors are still in their respective places. Authority sites still seem to rule.

Marissa Mayer, the Director of Consumer Web Products at Google "If you dropped in rankings, go back and look at who you linked to and who’s linking to you. If any of these people are using spam techniques, they're the reason your site no longer appears on Google."

The way I understand the above is as follows:

If Site A uses spammy techniques and links to Site B, then Site A will in theory get a grayed-out bar (PR) and will be unable to cast a vote for Site B; therefore it's a micro penalty. We are always warned about bad neighborhoods, but what if I set up a site with spammy techniques and link to all my competitors? Could I take them out as well? Sometimes we cannot control who links to us.

Edited: added more to same post.

Hissingsid




msg:148892
 9:24 am on Dec 12, 2003 (gmt 0)

Whether you call it a filter or an algo change, the fact is that Google, or for that matter any other search engine, can only work with the data (pages, links, etc.) that it is presented with.

We control or influence that data directly to varying degrees (sometimes not in the case of backlinks). If we can figure out what Google needs to decide that your page should be #1 then we can feed Google what it wants. That has been the basis of everything that we do with regard to Google Optimisation since Google and SEO met.

The problem with the current algo/filter, or whatever you want to call it, is that in some, possibly many, cases it does not seem to be operating logically. There has been a step change and it is very difficult to figure out what data Google is looking for. Previously I used to analyse what the top two or three pages had in terms of on-page factors, backlinks etc. and try to go one better. I didn't want to use everything possible in case someone else was doing this kind of incremental SEO development and it worked, plus I could always get back if someone got one over on me. Personally I don't mind if Google keeps its new algo, new filter or any combination of these, as long as it maintains this in a rational way.

In the long term, sites/pages that were #1 previously, and which did not get there by accident, will get back to #1 whatever algo Google settles on, as long as we all find out what the right data is and feed that to the bots.

I've found a number of things that, through laziness and "why change them because they worked", I had spun into my web pages on one site in particular. I can now readily see why Google might interpret these as duplicates, and I've been through and removed them.

My advice to folks here is to go back to basics. That's a piece of advice that really bugged me when Brett Tabke gave it early on after Florida, but I think now that he was probably right.

What does that mean, though? Well, I would start by comparing two or more of your pages in terms of duplication/strong similarity in the text used in the <title>, <meta description>, <h> and anchor tags.

No matter how you cut it, I'm pretty sure that there is something different happening with how Google is dealing with duplicate issues, and if you eradicate duplication between pages in those important tags I can only see that doing you good and not harm. The obverse of this is that some repetition within a page in each of those tags is a good thing.
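
For anyone who wants to run that comparison, here is a rough Python sketch under stated assumptions: the pages are simple enough for regex extraction (a real check would use a proper HTML parser), and the two sample pages are invented.

```python
import re

def tag_texts(html):
    """Pull out the text of the tags mentioned above, lower-cased."""
    title = re.search(r"<title>(.*?)</title>", html, re.I | re.S)
    return {
        "title": [title.group(1).strip().lower()] if title else [],
        "meta description": [m.strip().lower() for m in re.findall(
            r'<meta\s+name=["\']description["\']\s+content=["\'](.*?)["\']',
            html, re.I | re.S)],
        "headings": [h.strip().lower() for h in re.findall(
            r"<h[1-6][^>]*>(.*?)</h[1-6]>", html, re.I | re.S)],
    }

def shared_between(page_a, page_b):
    """Return the tag values that are identical on both pages."""
    a, b = tag_texts(page_a), tag_texts(page_b)
    return {tag: sorted(set(a[tag]) & set(b[tag]))
            for tag in a if set(a[tag]) & set(b[tag])}

page1 = "<title>Blue Widgets</title><h1>Blue Widgets</h1>"
page2 = "<title>Blue Widgets</title><h1>Red Widgets</h1>"
print(shared_between(page1, page2))   # {'title': ['blue widgets']} -- worth rewriting
```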

I had a secondary information site that I built in a hurry using an application that allowed a master page to be used to provide "consistency". When I examined this I found that, through laziness, I had allowed certain key areas of the pages to be duplicated. I've gone back and eradicated these and reduced the connection with my main site by reducing the number of links to my main site and adding a few to other sites. This whole site got dropped from the SERPs. I will report back if it gets back in; however, I don't think it will prove anything because no doubt the algo/filter will change anyway.

Best wishes

Sid

Hissingsid




msg:148893
 9:29 am on Dec 12, 2003 (gmt 0)

Marissa Mayer, the Director of Consumer Web Products at Google "If you dropped in rankings, go back and look at who you linked to and who’s linking to you. If any of these people are using spam techniques, they're the reason your site no longer appears on Google."

Could you point me to where that came from please, sorry I missed it.

Thanks

Sid

PS How am I supposed to control who links to me?

allanp73




msg:148894
 9:38 am on Dec 12, 2003 (gmt 0)

I got the quote from webpronews. They interviewed several people from Google about the current dance.

steveb




msg:148895
 10:12 am on Dec 12, 2003 (gmt 0)

"An algorithm is a filter, and a filter is an algorithm"

No, not at all. There is no filter, and thinking that there is leads to all sorts of poppycock. A filter requires a "something" to be whole before passing through it. In this case, people think the old anchor text algo is completed and then passed through some "filter" (which they all describe in completely contradictory ways). Thinking that way is backwards and fully counterproductive.

The evidence is concrete. If a filter were removing sites, then the sites left would rank in the same order; that is, 17, 23, 27, 45 would become 4, 5, 6, 7, not 6, 7, 4, 5. The word filter is simply wrong. The old anchor text algo is NOT the base pool of data of the new algo.
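
A small sketch of that argument, with invented site names: a pure removal filter applied to the old results keeps the survivors in their old relative order, so seeing them come back as 6, 7, 4, 5 points to re-scoring rather than mere removal.

```python
# Old SERP order with some filler pages around the sites we are tracking.
old_serp = ["p1", "p2", "site17", "p3", "site23", "site27", "p4", "site45"]
survivors = {"site17", "site23", "site27", "site45"}

# A removal-only filter keeps whatever survives in its original order.
after_removal = [s for s in old_serp if s in survivors]
print(after_removal)   # ['site17', 'site23', 'site27', 'site45']
```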

The new algo is something unto itself where sites are being ranked independently of the old algo.

The sooner people get that into their heads and give up the fantasies, the sooner they can discern the various positive and negative factor weightings of the new algo being applied within their niche.

I_am_back




msg:148896
 10:32 am on Dec 12, 2003 (gmt 0)

SteveB, finally someone with common sense!

I wish the words "filter", "money words" and "Florida" were being filtered out of this forum!

and...

My advice to folks here is to go back to basics

That's it! Most of those hit by Florida, though, will not accept this. For this reason they will continue to be their own worst enemies.

jaffstar




msg:148897
 10:40 am on Dec 12, 2003 (gmt 0)

My advice to folks here is to go back to basics

What are the basics?

Develop a site for an end user and not for a search engine?

Content is king?

Ignore everything you know?

If we go by the above, we have more chance of winning the lottery!

In the past we pretty much knew what we had to achieve in order to get to the top, i.e. my competitor has a kw density of X, a PageRank of Y, and Z links.

Basics might be to design a good looking site for end users and to pray?

kaled




msg:148898
 11:32 am on Dec 12, 2003 (gmt 0)

filter-2: alters the frequency spectrum of signals passing through it

The generic definition of a filter above is correct. However, if we're talking analogue waveforms then a passive filter can only reduce parts of the frequency spectrum whilst an active filter can also amplify parts of the spectrum.

Within the context of Florida penalties, it is reasonable to compare with passive analogue filters. However, the whole search engine algo (PR and all) could be compared with an active filter.
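
A toy illustration of that analogy, with made-up gain values (not anything Google has published): a passive filter can only attenuate each component, while an active one can also amplify some of them.

```python
# Treat each ranking signal as a "band" with a gain applied to it.
signal = {"on-page": 1.0, "anchor-text": 1.0, "pagerank": 1.0}

passive_gains = {"on-page": 0.2, "anchor-text": 1.0, "pagerank": 1.0}   # every gain <= 1
active_gains  = {"on-page": 0.2, "anchor-text": 1.5, "pagerank": 1.0}   # gains may exceed 1

print({k: signal[k] * passive_gains[k] for k in signal})   # only attenuation
print({k: signal[k] * active_gains[k] for k in signal})    # attenuation plus boost
```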

I hope that closes this argument.

Marissa Mayer, the Director of Consumer Web Products at Google "If you dropped in rankings, go back and look at who you linked to and who’s linking to you. If any of these people are using spam techniques, they're the reason your site no longer appears on Google."

Ignore this. She probably knows diddly-squat about the inner workings of the beast and just made something up. It is far too vague to be meaningful and clearly suggests that it is possible to damage a site by linking to it. If this is true, war will break out as spammers try to kill each other on the web.

Kaled.

Hissingsid




msg:148899
 11:32 am on Dec 12, 2003 (gmt 0)

What are the basics?

FWIW this is my opinion.

1. Go to the FAQs of this forum and read Brett Tabke's excellent recipe for success on Google. Sorry I don't have the direct link.

2. Don't put too much salt in the recipe or you will spoil it. What I mean by this is: be very careful with duplication between pages, as I explained earlier, with particular emphasis on the <title>, <meta description> and <h> tags. Have duplication of keywords in these areas on the page, but not between pages.

3. Watch who you link to and who links to you. If you have more than one site on the same topic make sure they are not in the same IP range.

All that stuff about good content etc. is contained in Brett's instructions, but what I have said in #2 above is something that you can do quickly now.

Best wishes

Sid

steveb




msg:148900
 11:45 am on Dec 12, 2003 (gmt 0)

"It is far too vague to be meaningful and clearly suggests that it is possible to damage a site by linking to it."

It does not suggest that at all!

Why do people only read half a sentence?

The statement is clearly talking about sites YOU link to. It is basic "bad neighborhood" stuff.

Don't reciprocate links to sites that are spammy.

===

Her actual statement is much more intriguing. It implies that webmasters should use much better judgment in terms of quality before linking... which would imply a resulting Internet where "votes" were more accurate and genuine.

Hissingsid




msg:148901
 12:05 pm on Dec 12, 2003 (gmt 0)

Ignore this. She probably knows diddly-squat about the inner workings of the beast and just made something up. It is far too vague to be meaningful and clearly suggests that it is possible to damage a site by linking to it. If this is true, war will break out as spammers try to kill each other on the web.

I have a hunch that a site linking to you can't do you any harm, but if it was contributing to your PageRank previously and is now not being counted, then it will affect you. If it is not being counted now, that may be because it is deemed to be from a spam page/site/IP number. However, if you link to a site/page that is tagged as spam and it links to you, this may harm you more.
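
A hedged sketch of that hunch, using a simplified PageRank-style sum with made-up numbers (not Google's actual formula or data): a backlink flagged as spam simply stops contributing, which lowers the total without any explicit penalty being applied to you.

```python
DAMPING = 0.85

def toy_pagerank(backlinks):
    """backlinks: list of (linking_page_pr, outbound_link_count, flagged_as_spam)."""
    contribution = sum(pr / outlinks for pr, outlinks, flagged in backlinks if not flagged)
    return (1 - DAMPING) + DAMPING * contribution

links = [(4.0, 20, False), (3.0, 10, True), (2.0, 50, False)]
print(toy_pagerank(links))                                       # spam-flagged link ignored
print(toy_pagerank([(pr, out, False) for pr, out, _ in links]))  # what it added before
```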

I wonder if Google has redefined what it sees as spam, increasing the weight it gives to duplication of content between pages within the same IP range and between closely linked sites. Also, I wonder if what it is checking in terms of duplication is text within the main page tags but not within the body tags. The reason I say this is that syndicated content is not being adversely affected. In fact, for many of the sites that have risen close to the top in my own niche, the most relevant stuff on the page is my Espotting and/or Overture ads. The top pages, as an example, are 2 from the same site which have no commonality between the text contained in the <title>, <description> and <h> tags. My ad and other widget ads are on both pages, with 3 of the ads, including ours, actually appearing twice on the second page.

If duplication is a major part of the issue they have found a way to allow syndicated content when assessing it.

Best wishes

Sid

espeed




msg:148902
 12:27 pm on Dec 12, 2003 (gmt 0)

"An algorithm is a filter, and a filter is an algorithm"
No, not at all. There is no filter and thinking that leads to all sorts of poppycock thinking. A filter requires a "something" to be whole before passing through it.

steveb - Take a step back -- the "something whole" is all of the pages on the Internet. If Google didn't filter it, then any query would return every page in no particular order. Google's algorithm/filter determines the SERPs (the 1000 ordered results Google gives you for any given query are the pages that passed through its filter for that query). It's a matter of perspective.

The evidence is concrete. If a filter was removing sites, then the sites left would rank in the same order, that is 17,23,27,45 would become 4,5,6,7, not 6,7,4,5. The word filter is simply wrong. The old anchor text algo is NOT the base pool of data of the new algo.

Even if you believe that algos and filters are different, your reasoning is flawed. Who's to say that the change in results after the Florida update isn't a combination of both additional filter(s) and algorithm modifications? Furthermore, any number of factors could account for the reordering (4,5,6,7 ==> 6,7,4,5), including an increase in backlinks or page modifications. So even if the only thing Google did was add a filter, the reordering could have occurred due to changes external to Google's new filter.

Regardless, an algo and a filter are the same.

ciml




msg:148903
 12:39 pm on Dec 12, 2003 (gmt 0)

Steve, the way this one has come in (gradually, since long before the Florida update), it has behaved much more like a late-stage filter than a separate Google. Not a 'remove from results' filter, just a 'reduce the score' type filter.
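
A sketch of that 'reduce the score' reading, with hypothetical scores and an arbitrary trigger: pages that trip the condition are demoted rather than removed outright.

```python
def rescore(pages, dampening=0.3):
    """pages: list of (url, score, tripped_filter) tuples."""
    rescored = [(url, score * dampening if tripped else score)
                for url, score, tripped in pages]
    return sorted(rescored, key=lambda p: p[1], reverse=True)

pages = [("optimised.example", 0.9, True), ("plain.example", 0.5, False)]
print(rescore(pages))   # the tripped page still ranks, just far lower down
```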

jaffstar:
> relaxing the filter

Still seems as strong to me. I think I'll have to eat my words about Google relaxing the filter after a month like they normally do. This looks like it's going to stay with us.

kaled




msg:148904
 12:39 pm on Dec 12, 2003 (gmt 0)

SteveB said,
"It is far too vague to be meaningful and clearly suggests that it is possible to damage a site by linking to it."

It does not suggest that at all!

That's exactly what this statement SUGGESTS. I did not attribute anything more than this to the quote.

Read the quote again. This time pay particular attention to "and who's linking to you".

Kaled.

PS I read the whole interview last night, not just the snippet this morning.

espeed




msg:148905
 12:45 pm on Dec 12, 2003 (gmt 0)

Relax, GoogleGuy reaffirmed that your competitors can't sabotage you:
[webmasterworld.com...]

GoogleGuy: Nah. I'll reaffirm the statement that webmasters can't really sabotage other people's sites--that wouldn't be fair. People who believe that are barking up the wrong tree.

[edited by: espeed at 12:47 pm (utc) on Dec. 12, 2003]

steveb




msg:148906
 12:46 pm on Dec 12, 2003 (gmt 0)

"Regardless, an algo and a filter are the same."

That's ridiculous.

An algorithm is a computation.

A filter is a separation.

The concepts aren't even related.

Google has always filtered things. Google has also always calculated things. A filter in effect *ignores* prior calculation/measurement.

====

One example: Google appears to be aggressively filtering duplicate content. That filtering and removing of sites is not algorithmic. It would be fully irrelevant how those duplicate sites scored in the algorithm; they would be removed regardless.

You can wave your arms and say "it all comes out in the wash" at the end, but if you are trying to figure out what to do as a webmaster, the first thing you need to understand is that there is a new algorithm in town. Your site may be doing something to get filtered, but that is separate from how your site is being rated algorithmically.

In other words, there are two separate things you need to worry about.
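
A sketch of those two separate stages, with invented pages and scores: a duplicate filter that ignores scores entirely, followed by the ranking algorithm.

```python
def duplicate_filter(pages):
    """Keep the first page seen for each content hash, regardless of score."""
    seen, kept = set(), []
    for page in pages:
        if page["content_hash"] not in seen:
            seen.add(page["content_hash"])
            kept.append(page)
    return kept

def rank(pages):
    return sorted(pages, key=lambda p: p["score"], reverse=True)

pages = [
    {"url": "original.example", "content_hash": "x1", "score": 0.60},
    {"url": "mirror.example",   "content_hash": "x1", "score": 0.95},  # dropped despite the higher score
    {"url": "other.example",    "content_hash": "x2", "score": 0.40},
]
print(rank(duplicate_filter(pages)))
```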

[edited by: steveb at 1:06 pm (utc) on Dec. 12, 2003]

steveb




msg:148907
 12:47 pm on Dec 12, 2003 (gmt 0)

kaled, read the sentence. Don't pay particular attention to one phrase!

The statement clearly is talking about reciprocal linking with spammy sites.

steveb




msg:148908
 12:54 pm on Dec 12, 2003 (gmt 0)

"just a 'reduce the score' type"

ciml, I can understand the other guys, but you are just missing the boat here. There is no "reducing the score" here. There is giving less weight to some things. Also, you need to consider what rises when something drops, not just look at the dropping. The rising of the dmoz/yahoo/google pages for instance.

More to the point though, I hope at least you agree that the old anchor text algo is not being filtered. Insist if you want that a new data batch is being filtered, but the idea that the old algo is being filtered is simply completely impossible.

espeed




msg:148909
 1:08 pm on Dec 12, 2003 (gmt 0)

"Regardless, an algo and a filter are the same."
That's ridiculous. An algorithm is a computation. A filter is a separation. The concepts aren't even related.

So how would Google determine what gets filtered? It would compute something (maybe score the page to determine its level of SEO). An algorithm is a set of rules [cogsci.princeton.edu]. In this case, the set of rules determines what pages to return in the SERPs.

If one of the rules is "reduce the score of pages that have an optimization score greater than 90", then in a highly competitive area the pages could have a significant drop in the results and appear to be completely filtered out. Google could adjust the variables for these rules after the update has occurred (e.g. use a score of 95 instead of 90), but no matter what you call it, it's the same thing.
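
A toy version of that rule, with every number hypothetical: the "filter" is just one more rule inside the ranking algorithm, and its threshold can be loosened after the update without changing anything else.

```python
def final_score(base_score, optimisation_score, threshold=90):
    """Apply a heavy penalty only when the page looks too optimised."""
    penalty = 0.2 if optimisation_score > threshold else 1.0
    return base_score * penalty

print(final_score(80, 92))                  # 16.0 -- low enough to look "filtered out"
print(final_score(80, 92, threshold=95))    # 80   -- same page after relaxing the threshold
```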

steveb




msg:148910
 1:17 pm on Dec 12, 2003 (gmt 0)

"appear to be completely filtered out."

Um, if you think "appear to be completely filtered out" is exactly the same as "filtered out", I don't know what to say.

Appearances can be deceiving.

More to the point, I could drive from LA to New York via Seattle, and you could take a train from LA to New York via Dallas. We both get to New York, but it would be pretty foolish to think we did "the same thing". And that's where people are going wrong here. It matters how you got to where you are.

shaadi




msg:148911
 1:20 pm on Dec 12, 2003 (gmt 0)

Google finally decides to relax its filter?

My point is: says who?

Google is still full of irrelevant results...

espeed




msg:148912
 1:22 pm on Dec 12, 2003 (gmt 0)

Um, if you think "appear to be completely filtered out" is exactly the same as "filtered out", I don't know what to say.

Didn't say they were.

kaled




msg:148913
 2:21 pm on Dec 12, 2003 (gmt 0)

SteveB said
"Regardless, an algo and a filter are the same."
That's ridiculous.

An algorithm is a computation.

A filter is a separation.

The concepts aren't even related

How much program code have you written? A filter in this context is just a piece of program code and, yes, behind every piece of program code is an algorithm. To state that the two concepts are not even related is just twaddle.

Kaled.

Kackle




msg:148914
 2:43 pm on Dec 12, 2003 (gmt 0)

ciml says:
Still seems as strong to me. I think I'll have to eat my words about Google relaxing the filter after a month like they normally do. This looks like it's going to stay with us.

I agree. Moreover, if it looks like a filter, walks like a filter, and quacks like a filter, then the word "filter" is entirely appropriate as a description of what many small niche businesses and others have seen as their Google rankings went down the toilet. I have yet to come across any of them who are worried that some of us use the word "filter" instead of the word "algorithm" to describe what they've experienced. They have bigger problems -- like how to feed the kids.

The exclusion-word trick suggested that the new filter/algorithm was operating post-facto on the results. This trick provided a stark demonstration and has to be considered a glitch. Anyone who is devious enough to drop thousands of innocent sites is not going to wrap such deviousness in a convenient demonstration of just how evil they are. It's bad form.

What I've seen in the last 36 hours is that the exclusion word trick is showing lower numbers in almost all areas, especially real estate and travel. However, I have yet to see any evidence that the so-called filter is being relaxed. It's the technique used to test the filter that is getting fixed.

Of the sites I've tracked that used to be near the top pre-Florida, about 90 percent of them are buried just as deep as they ever were post-Florida. This is true across all data centers, and I'm talking about small business sites. The only thing that's changed is that Scroogle's little screen-scraping toy isn't working so well and will probably get taken down soon, before Google figures out the new IP address (not the one in your address bar) and blocks it for the second time. Fortunately, a record was kept of the results beginning December 6, before numbers started dipping, and these can be searched by substring.

The argument over semantics is silly. It's a bit like arguing whether the Emperor's tie matches his coat, as he's walking down the street naked, while occasionally picking out innocent people at random, and punching them in the face.
