Would someone be so kind as to define a "clean" backlink?
Now here is one for GoogleGuy to ponder upon.
There are a number of highly respected publications on the net that don't allow the bots in and that have links to other sites.
I would think that those links would be exactly what one would classify as "clean" backlinks (whatever those are; not that you can prevent someone from linking to a site anyway, since it is entirely legal).
Then there is the so-called SERP placement swing cycle, sometimes known as the scraper effect. It may be embodied in the following excerpt from one of Google's search engine patents:
"25. The method of claim 22, wherein the determining behavior of links associated with the document includes monitoring at least one of time-varying behavior of links associated with the document, how many links associated with the document appear or disappear during a time period, and whether there is a trend toward appearance of new links associated with the document versus disappearance of existing links associated with the document."
Couple that with modified automated placement checkers, with output based on SERPs from several different search engines, over varying periods of time, from sites with differing PRs and thus differing Google spider and index rates.
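The patent claim quoted above can be sketched as a simple link-churn monitor. This is purely illustrative, not Google's actual implementation; the snapshot format (one set of linking hosts per crawl period) and the trend rule are assumptions for the example.

```python
def link_trend(snapshots):
    """Given a time-ordered list of backlink sets for one document,
    count how many links appeared or disappeared across periods and
    report whether the overall trend is toward gaining or losing links.
    Illustrative sketch only -- not Google's actual algorithm."""
    appeared, disappeared = 0, 0
    for prev, curr in zip(snapshots, snapshots[1:]):
        appeared += len(curr - prev)      # links new in this period
        disappeared += len(prev - curr)   # links gone in this period
    if appeared > disappeared:
        trend = "gaining"
    elif disappeared > appeared:
        trend = "losing"
    else:
        trend = "stable"
    return {"appeared": appeared, "disappeared": disappeared, "trend": trend}

# Example: three monthly crawls of one document's backlinks (made-up hosts)
crawls = [
    {"a.example", "b.example"},
    {"a.example", "b.example", "c.example"},
    {"a.example", "c.example", "d.example", "e.example"},
]
print(link_trend(crawls))  # {'appeared': 3, 'disappeared': 1, 'trend': 'gaining'}
```

A sudden spike in `appeared` (the scraper effect the poster describes) would look very different here from steady organic growth, which is presumably the point of monitoring it over time.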
Now to add even more to the list of things to consider we have the relative respider and indexing rate of the pages of your own site and those other "normal" sites that link to you.
Then we have the ever-expanding number of URLs for the bot army to retrieve and the indexers to index.
Just some food for thought, enjoy or tear apart.
Any spam techniques should be ignored by the spider, and the ranking based on legitimate factors. So if a site is number 1, it will be there because of things like inbound links of sufficient quality. Suppose a company has employed a dim website design firm and they have put in hidden text. This is not the fault of the company, and if their content is fabulous then users should, and will, want to find it. The spider should therefore ignore all the rubbish SEO and rank on legitimate measures of quality. That is best for the user, and fair to the company, who will have no idea what might be hidden in their pages. Thus good sites with spam in them should still rank well.
I suspect too many people just look at the sites above them and jump on the bandwagon of calling them scum if they have hidden text. The reality is that those are still usually better sites than everyone else's, because the factors that really matter have put that site at number 1 and the perceived spam has been ignored anyway.
That makes this whole witch hunt a total farce. There are very few sites ranking highly because of spam techniques, if any. There may be a lot of spam in the code, but that does not mean the site is unworthy of a top position. But...
Ranking the most relevant site is another matter, and sending Google endless spam reports does not help them address this problem. The spam-reporting nonsense is a total red herring and Google should not even be thinking about it. They should be concentrating on positive elements of a site that cannot be spammed and ignoring the negative ones. Surely they are just looking at visible text, links and titles, plus h1 tags, bold text and so on; everything else is dumped. Then they analyse that data, using Hilltop, semantics and keyword density/emphasis, to establish a ranking order for a search term. So what if there is invisible text and keyword stuffing? It may not mean the site content is rubbish or irrelevant; all it means is that a misguided idiot did the code.
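The keyword density/emphasis analysis mentioned above can be shown with a toy calculation. This is a sketch of the naive metric only; real engines weight position, markup emphasis, phrase context and much more, and none of the thresholds here come from Google.

```python
import re

def keyword_density(text, keyword):
    """Percentage of words in `text` that equal `keyword`,
    case-insensitively. A toy metric for illustration only."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return 100.0 * hits / len(words)

# A stuffed page: 4 of its 11 words are the target keyword
page = "Cheap widgets! Buy widgets here. Our widgets beat all other widgets."
print(round(keyword_density(page, "widgets"), 1))  # 36.4
```

A density that high is exactly the "hundreds of the keyword all the way down the page" pattern complained about later in the thread; the poster's argument is that an engine can simply discount it rather than penalize the whole site.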
The trouble with inviting webmasters, especially from an SEO forum (for heaven's sake!), to judge their competitors is that there are too many hidden agendas and far too much hypocrisy.
Will there be any spam report not filed by a hypocrite? Every one of us is here to gain rankings by gaining an understanding of the algos and acting accordingly. So sure, report the opposition, that's part of the game, but let's not pretend we aren't all as bad as each other. As for Google wanting our help... give me a break!
I've got a really bad feeling about this. Am I going to be penalized?
I have 20 sites, all in different areas with different content; all are gone from the search results, but not banned. They all had reciprocal links on them, from 50 to 500 links exchanged.
Will they come back, or should I make 20 new sites on new domains, since those will rank higher in the results? What is the purpose of having a site that does not show up for any search? Even the ones with names on them do not show up.
If they were just devalued, the sites should still come up when you do a page-specific search, correct?
It just does not make any sense.
If reciprocal links are now considered spam, then one-way links should be considered spam too, since rankings are being manipulated with one-way links as well.
Well said. I see a couple of sites above mine that are keyword stuffing to the max. Hundreds of the keyword all the way down the page.
I suspect it is helping their ranking but I cannot know for sure.
Either way, I will not be reporting them. I would feel silly reporting a site because the reality is, I would be reporting them because they are AHEAD of my site in the results.
Are people reporting SPAM on page 10 of search results?
I mean a backlink that is there because the site author thought it beneficial to the reader. As opposed to a backlink that is there for PR reasons.
In the end, only time will tell what direction Google takes. IMHO that direction is greater human influence on serps.
This makes me think that SEO as we've known it could be a dying art.
And that will be a good thing.
Just my opinion.
Squeaky clean? When you write a page, do you consider word density and titles, and contrive the odd sentence to get a keyword in? You're just fooling yourself if you think you are innocent. There is no requirement to have a squeaky clean site before you report someone, and wake up... your site is SEO'd as well.
Think about it and follow the idea through.
You're only going to be worried about spam reports if you are guilty.
And, as it happens, I used to wear dark-coloured hats. However, after Bourbon I was penalised, so I realised early on what direction this was all going in.
So now my sites are squeaky clean. They are SEO'd only in that each page has a correct title, description, etc., and the sites have good overall structure.
What I don't have is keyword stuffing, hidden text, massive anchor link backlink spam, network interlinking...
This makes me think that SEO as we've known it could be a dying art.
I don't think it's dying, but I do think change is coming. Traditionally, most "SEO" webmasters spend more time concerned about linking than about the true user experience, and with good reason: it's the most effective way to get your site near the top.
My hope is Google is moving on from ranking sites predominantly based on linking. If that was de-emphasized and other aspects, like the amount of time a visitor actually spends clicking through the site, were given greater weight in the algo, things might improve.
Just think how good your sites could be if you didn't have to waste so much time on linking, proper anchor text and what kind of neighbors you have. Link popularity algorithms were born to be manipulated.
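The idea above, de-emphasizing link popularity in favour of user engagement, can be sketched as a simple weighted blend. The weights, scores and the whole formula are invented for illustration; nothing here reflects Google's actual ranking.

```python
def blended_rank_score(link_score, engagement_score,
                       w_links=0.3, w_engagement=0.7):
    """Combine a link-popularity score with a user-engagement score
    (e.g. normalized time spent clicking through the site), both in
    [0, 1]. Weights are hypothetical, chosen to favour engagement."""
    return w_links * link_score + w_engagement * engagement_score

# A heavily linked site with poor engagement vs. a modestly linked,
# highly engaging site:
print(blended_rank_score(0.9, 0.2))  # link-rich, sticky-poor
print(blended_rank_score(0.4, 0.8))  # link-poor, sticky-rich -> wins
```

Under these made-up weights the second site scores 0.68 against the first site's 0.41, which is the outcome the poster is hoping for: manipulated link graphs mattering less than what visitors actually do.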
If the system were a person, would that solve it?
And that person could very well be a competitor.
The important thing that has happened with Jagger2 is that Google has started listening to Webmasters.
That must be a good thing.
It certainly is for me.
Suppose a company has employed a dim website design firm and they have put in hidden text. This is not the fault of the company...
Yes, it is the fault of the company doing the hiring, legally, morally and from a common sense point of view. Hire idiots to do work in your name--your fault.
However, I do agree that Google seems to be doing this brain surgery with a hammer and chisel...
It looks cleaner so far.
Reporting spam is a good thing, just don't let yourself go and report all of your competitors because it is stupid, eats resources and keeps REAL cheaters on top for longer time.
Hidden text, cloaked pages, scrapers, and text-link buyers (when that is the only reason they got ranked) should be reported; leave the rest alone and accept being defeated.
Algos change, fashions change, the stock market is king; just don't call your competition spam when their site doesn't look like yours or expresses a different opinion from yours.
I just wanted to say that, because if everybody reported [only] spammers and cheaters, the SERPs would change dramatically within a few months.
Don't forget to send feedback about spammers to Y! and MSN too! Maybe one day all search engines will start listening to what webmasters have to say :) (just kidding... money rules them all)
I know of several to watch.
The main one I would hope gets acted upon hasn't been yet.
A (particular) person would not always be a good judge of a particular link. If the person doing the judging is not familiar with the topic, they may (and very likely will) make an invalid judgement.
I have seen that happen repeatedly on various forums for just one instance.
Oh well, thank goodness no one is actually bothering to look and see whether the site is quite good... we don't want to confuse the witch hunt by adding 'quality or relevance'.
So if you are able to judge these things, why employ an SEO? Companies that hire SEO firms do not know anything about SEO... that's why they need help.
Is it not equally immoral to ban an innocent business because of an innocent mistake? Far better that Google gets its act together and ignores spam rather than penalise it.
Are you really sure there is no blackish-hat SEO on your site that a competitor could find? Maybe the end result will be that only big companies with strong brands, who don't care about their position in Google, reach the top of the SERPs (as perhaps they should).
...and other text link buyers (when this is the only reason why they got ranked) should be reported, leave the rest alone and accept being defeated.
Please let us know when Google starts penalizing sites for the links pointing to them, and I'll go buy a few dozen site-wide links to my competitors right away.
Google CANNOT (or definitely should not) penalize a site on the basis of inbound links. If that is ever proven to occur, rest assured there will be "hired guns" out there willing to setup damaging link campaigns to your competitors for profit.
What about sites that get unsolicited inbound links from what can be considered off-topic sites? We just had one of our articles quoted and linked to by the Motley Fool, despite the fact that our site has absolutely nothing to do with business or personal finance. Should our site be penalized because of it?
The only solution to that would be to simply ignore any benefit the inbound links would bring.
I think if I were looking to build an innovative new search engine, I would seriously consider your approach.
Why can't we come together and, through "party politics," give G, Y, and M a clear picture of what we, the webmasters and content providers, want to happen? It's a twisted ideology and it's politics at its best, but the truth of the matter is that without our content, systems, and structures, G or any SE would cease to exist. Are we so uncontrollable that G sees us as a bunch of uncouth animals? Something just isn't right... I think it's time for a unified webmaster front across the entire Internet.
Ancient Ethiopian Proverb: As the lordth passeth, the wise peasant gracefully bows and silently farts.
Google CANNOT (or definitely should not) penalize a site on the basis of inbound links
Many think they do, creating ranking problems for sites that are scraped extensively. I think Jagger is part of their solution to this.
The witch hunt theme is interesting. Basically Google is now using the community to police itself/report violators/competitors.
I see good and bad in this, but most importantly I see Google acknowledging that human editors are needed.
I guess Google can't afford to hire editors since the company is having so much trouble making money these days.
For a site using www.domain.com for the content, it is normal for the single domain.com entry to appear as a URL-only listing, simply because it exists. I wouldn't worry about it - unless the entry also has a title and description too. If there is a title and description then you really do have a problem.