How much usage tracking do they use to determine SERPs? Do they use the Google Toolbar, click-through tracking, etc., and to what degree? Will that be a growing trend going forward?
Why has his site disappeared from the SERPs on a www.domain.com or domain.com check, and when will the issue that caused this removal (and the virtual removal of other sites throughout the year) be addressed?
Is Googlebot having a crawling problem?
Googlebot's visits to my sites have dropped to a tenth of their usual level in the last five days.
Are you GG?
Why is it so easy to get to the top of nearly any niche simply by submitting hundreds of bogus articles to dozens of "article" sites, and does Google see this as a problem to be dealth (a typo, but I'll stick with it: death + dealt = dealth) with at some point, since they already seem to be hammering on the "directory backlink acquisition" issue?
[edited by: ownerrim at 4:27 pm (utc) on Nov. 10, 2005]
Brett, please ask him whether Google still takes into consideration, SERP-wise, pages already "removed" via Google's remove function.
Example: Suppose I have a page, or a directory full of pages, that caused a penalty. If I remove them via Google's removal tool, will the penalty be lifted in the same timeframe as if I had changed the pages and Google had recrawled the fixed pages?
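As a point of reference: at the time, Google's URL removal tool would only act on a page that returned a 404, was blocked in robots.txt, or carried a robots noindex meta tag. A minimal sketch of the meta-tag route (markup is illustrative, not from the thread):

```html
<!-- In the <head> of each page to be removed: tells compliant
     crawlers not to keep this page in the index -->
<meta name="robots" content="noindex">
```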
How long does a 404 page remain indexed?
In my experience it can be months, others have said years. I would have thought an absolute maximum should be about one month.
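One thing worth ruling out before blaming Google: a deleted page that quietly answers 200 (a "soft 404") gives Google no reason to ever drop it. A quick sketch for checking what status code a URL really returns (the function name is mine, not from the thread):

```python
import urllib.error
import urllib.request


def http_status(url: str) -> int:
    """Return the HTTP status code a URL actually serves.

    urlopen raises HTTPError for 4xx/5xx responses, so we catch it
    and report the code instead of treating it as a failure.
    """
    try:
        with urllib.request.urlopen(url) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code
```

If a page you deleted comes back as 200 here, your server is masking the removal, and no amount of waiting on the index will fix it.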
These "issues" are compromising Google's SERPs more than the spammers are.
Difficult questions, rather you than me Brett.
My two questions were in Post #25 and #37 of Part 3 of the Jagger update thread (and clarified in Post #400).
Basically, when is the supplemental index going to be cleaned of duplicate, ancient, out-of-date data, so that a page ranks for its current content rather than also for words from a title and snippet that haven't existed for two years?
When are the SERPs going to be cleaned of pages that have redirected elsewhere for a year or more, especially for sites that have all the www pages indexed but still have many non-www pages randomly showing too?
Related question: why, after taking a page, folder, or site down (or even letting the domain expire) and then using Google Remove to remove the pages from the SERPs (they aren't removed, merely hidden from view), do they all come back 90 or 180 days later even though they still no longer exist?
How many out there were betting that I would have a different question?
How can we flush the Supplemental results away?
Was part of the Jagger update to address link spam? Have you (Google) put more importance on IBL & OBL relevancy with this update?
What advice can he give those who operate multiple sites about the best way to let our users navigate from one site to another, without risking a penalty?
Embed the links inside Flash?
Apply the No Follow tag to each inter-site link?
Send all inter-site links through a hub page that is protected by a Robots.txt file?
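For the hub-page option, a minimal sketch, assuming the inter-site links all sit on pages under a hypothetical /hub/ directory (the nofollow option is simply rel="nofollow" on each anchor):

```text
# robots.txt (hypothetical layout): keep crawlers out of the link hub
User-agent: *
Disallow: /hub/
```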
Everything that g1smd said!
I've got a number of pages that are gone and have been 410'd, yet they still show up as Supplemental results when doing a site: command. It's been this way for months. Will they ever go away? And could they still be affecting my site's ranking?
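For anyone setting this up, a sketch of how the 410s might be served from Apache via mod_alias (the paths are placeholders):

```apache
# Respond 410 Gone (rather than 404) for permanently removed pages
Redirect gone /old-section/defunct-page.html
Redirect gone /old-section/another-dead-page.html
```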
What's the best course of action to take if your Google referrals suddenly disappear and you have no idea why?
Can you define more clearly what is a penalty and what is a filter? And when a reinclusion request is and isn't appropriate and helpful? In your September 18 post 'Filing a reinclusion request' it seemed like you were saying that if your site's rankings fall, it could be because of a spam penalty and that you can file a reinclusion request. At other times you have said that if the drop is because of something algorithmic, a reinclusion request won't work. How can we tell when it is or isn't worth doing one?
May we PLEEEEEAAAAAASSSSEEE have our site back in the Google serps? LOL.
Trisha and of course g1smd pretty much asked the question I would like answered.
Oh, and bad neighborhoods. Are we penalized for incoming and outgoing links to bad neighborhoods or are such links just discounted? Since we do not have control over what links come into our site, is it fair to penalize for them? If we do not know the Google algo, how do we know for certain what Google considers a bad neighborhood? We cannot go by PageRank, because it is virtually invisible and delayed; is that fair? And since we have no control over the other sites we link to, if they make changes that would put them into a "bad neighborhood," are we given time to discover this, or are we left to race Google? Again, is this fair?
Does "Don't be evil" still apply? Since "evil" is subjective and intent can always be framed as "positive," what line would Google have to cross to be considered evil?
Two more: What is your yearly income? Care to share?
I wish to hear about the webspam team: members, nationalities, average age, functions, offices, and more ;-)
Of course he is. If you read the latest post about the J3 datacentre, it's a giveaway. He forgot to talk in the third person as usual, switched to first person, and mirrored a GG post.
Back on topic. If you had your time over, would you go straight to a portal with bells and whistles, or stick to your core search product as you did in the initial years before branching out into your many current products?
"Are we penalized for incoming and outgoing links to bad neighborhoods or are such links just discounted?"
A big concern. You can link to a white hat site and have no clue whether it'll be under the same ownership a year later, or whether it'll be white hat or black. And it would be ridiculous to penalize anyone for incoming links, since most of us probably get tons of unsolicited links from DMOZ knockoffs and AdSense spam cookie-cutter sites. Who knows if they're good or bad neighborhoods, and anyway, the average webmaster may not even be aware of who is linking to his sites.
Why are there scraper sites using our content sitting in the top spot on the first page for a search on our copyright statement, while our own site is found five pages deep?
Everything that landmark said. ;-)
>> "Are we penalized for incoming and outgoing links to bad neighborhoods or are such links just discounted?"
Totally agree, it is a huge concern. I made the point on another thread that link buying would be hurt just as much if Google didn't count what it sees as "bought" or fishy links. Penalizing is NOT fair, as some links are placed in good faith, and some friends will link to "do you a favor." By the time you find out, you're on page 20.
In addition, you may have a popular but not commercially successful site, or just an informational one, and link to your other site run-of-site (ROS). Now we know better, but many mom & pop sites don't, and still do that. Is it fair to them, or to us?
Discount, don't penalize, please.
"A big concern. You can link to a white hat site and have no clue whether it'll be under the same ownership a year later, or whether it'll be white hat or black. And it would be ridiculous to penalize anyone for incoming links, since most of us probably get tons of unsolicited links from DMOZ knockoffs and AdSense spam cookie-cutter sites. Who knows if they're good or bad neighborhoods, and anyway, the average webmaster may not even be aware of who is linking to his sites."
The problem is that incoming links from other sites may result in a Google "linking scheme" penalty, as the guidelines SPECIFICALLY state. They also state that outgoing links to spammers and "bad neighborhoods" should be avoided because rankings may be affected. So I would like to know how webmasters are supposed to know what GOOGLE considers a "bad neighborhood," since we don't have any specifics, nor do we know their algo. And since the sites we link to and have links from are out of our control, why can we not claim innocence through ignorance, the same way Google does when content thieves display AdSense ads on stolen material and Google claims it didn't know the content was stolen?
"In addition, you may have a popular but not commercially successful site, or just an informational one, and link to your other site run-of-site (ROS). Now we know better, but many mom & pop sites don't, and still do that. Is it fair to them, or to us?"
It is NATURAL for a business to seek out other traffic sources, online as well as offline. The only way to get traffic online from another site is through some sort of path to your site, some way to communicate: it could be advertising, trading links, or providing content with a link back to your site. We need to PROMOTE ourselves and our brand. Most sites and companies are not built on word of mouth alone; it has to start somewhere, and that somewhere is PR, getting your brand out there in the first place. You must tell someone about it first, whether online or offline, in a directory or a search engine, a newsletter or a news article.
What is Google's say in this?
Everything that Caveman said.
Did you know caveman has a butler! ;)
"The problem is that incoming links from other sites may result in a Google "linking scheme" penalty, as the guidelines SPECIFICALLY state."
I read something Matt Cutts posted: that buying links is frowned upon. But, what the heck, Yahoo sells directory admission, and so does nearly every other directory. If you advertise on a site with a text link, you're advertising, but you've bought a link.
The whole idea of penalizing for getting links is insane. The emphasis should be on determining which links are more important. As for getting too many links too fast, you could do a press release or get a mention on a high traffic site and pick up quite a few links pretty fast. Should you be penalized for that? Insane.
Basically, what walkman said:
"link buying would be hurt just as much if Google didn't count what it sees as "bought" or fishy links. Penalizing is NOT fair"
Have sites been penalized because of canonical issues (www vs non-www)? For example, can this cause duplicate content penalties?
Will these penalties be lifted when Google implements a canonical fix, or will the fix only prevent future canonical problems from occurring?
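For what it's worth, the usual self-help fix while waiting on Google is a site-wide 301 from one hostname to the other. A sketch for Apache with mod_rewrite, with example.com as a placeholder:

```apache
RewriteEngine On
# Send all non-www requests to the www hostname with a permanent redirect
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```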
Everything that g1smd, landmark, caveman, and tantalus said.
Plus, when are you guys going to get faster hamsters?
The current bunch appear tired and slow; surely that new Serpulator must be ready to leave the lab ;-).
The most pressing questions, and they have my vote.
I suspect Brett'll be told that these are not problems. "They may have been at one point but they are all resolved now."
Is keyword stuffing penalized at all? -LH