Forum Moderators: Robert Charlton & goodroi
I think the point that was being made is that URLs specifying examples aren't allowed in this forum [see the rules]. The examples you gave were excellent, but don't take it personally... it's just the rules of WebmasterWorld.
Regarding input, I guess the reality of it is that many comments have been of value, over the past weeks, and some maybe not so useful. However, it is good for the guys to express opinions and state their point. It also helps them to know they are not alone, whether they are doing well or badly.
To reiterate, this post is a personal opinion, which I am glad to be living in a free society that allows me to express.
Yes, you can learn more from a specific example than you can from reading 1000 posts. But the problem is that WebmasterWorld is simply too popular, and gets too much forum link spam, to allow any specific links.
However, it's much harder to think of a reason, especially in the context of an update thread, to not allow specific searches, especially when they show instantly what the problem is, for example the 7000 backlinks I found on both searches. That's highly relevant for a thread like this.
Usually the reason WebmasterWorld doesn't allow any specifics like this is that they want the pages to remain a resource after the specific occurrence is gone. But let's get real: nobody except the emotionally deranged or hopelessly insane will ever read any of these update threads after the updates cease to be relevant. I guess a tiny handful of SEOs might, but that's about it.
However, these are Brett's forums, so he can make whatever rules he wants, that's his right.
Why doesn't Google come up with a way for us to BLOCK these links from being associated with our sites?
I don't think the links from scrapers are counted against us. Scrapers link to everyone. I had a lot of links from scrapers when one of my sites was penalized during Bourbon. Now my sites are doing fine, and there are still a lot of scrapers linking to my sites. I do think a scraper site caused a 302 hijack that caused the Bourbon problem. It's not the scrapers linking to us that hurts us, but it may be something some scrapers do in the process of linking.
There is something causing these sudden drops of legit websites but it's not the links to us. Even though I'm doing OK in this update I know that either site could suddenly disappear any time. So this update is still far from reassuring for me.
>> Even though I'm doing OK in this update I know that either site could suddenly disappear any time. So this update is still far from reassuring for me. <<
Thanks AnneJ, I feel the same :)
Ohh, see, I was thinking that the scraper sites linking to us were devaluing our site.
As an example: let's say there are 1,000 sites linking in, but 600 of them were scraper/unwanted directory sites... I thought they averaged those into the equation somehow to calculate PR?
Anyways, right now a huge update in our sector for Jagger followers.
What happened? Everything was fine and rosy on Tuesday going into Wednesday, then everything faded, and nobody knows anything about anything.
Is Jagger over? Why doesn't GG comment on anything... even just drop in and say all is well? Not a very good feeling at all!
If that's all there is ...
>> As an example: let's say there are 1,000 sites linking in, but 600 of them were scraper/unwanted directory sites... I thought they averaged those into the equation somehow to calculate PR? <<
Several factors are involved in how much weight an inbound link gives to your site, or actually to each page. No one is sure what they are, but here is my thinking from following this forum.
The higher the PR of the page that links to you, the more positive weight it will have. Remember, it's the page that counts, not the whole site. Most links pages don't have nearly the PR that the homepage of a site has.
Also, the fewer links on the page that links to you, the better. For example, a link to you from a page that is a long list of links will have less weight than one from an article page with two or three links to related material. I think links from related pages carry more weight. In fact, I wonder if links from unrelated pages help at all.
I don't believe that links from low-ranking sites/pages hurt at all. In fact, I think they add up and can help, especially if they are related to your topic. I don't even believe that links from link-farm types hurt you, but linking back could.
Then there is the whole TrustRank thing that may or may not exist. But if it does, links from pages deemed to have higher trust may carry more weight. It's also been theorized that EDU links carry more weight.
Nothing is certain but this gives you an idea.
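The two factors above (higher PR of the linking page, fewer outbound links on it) match the classic published PageRank formula, where each page divides its score evenly among its outlinks. Here's a minimal toy sketch of that formula; the page names and graph are made up for illustration, and this is the original academic formula, not whatever Google actually runs internally:

```python
# Toy PageRank sketch (classic Brin/Page formula, NOT Google's actual
# production algorithm). Each page splits its damped score evenly among
# its outbound links, so a link from a page with 2 outlinks passes more
# weight than a link from a page with 10.

DAMPING = 0.85
ITERATIONS = 50

def pagerank(links):
    """links: dict mapping page -> list of pages it links to."""
    pages = set(links)
    for outs in links.values():
        pages.update(outs)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(ITERATIONS):
        # Every page gets a small base score; the rest flows along links.
        new = {p: (1 - DAMPING) / len(pages) for p in pages}
        for page, outs in links.items():
            share = DAMPING * rank[page] / len(outs)  # split evenly
            for target in outs:
                new[target] += share
        rank = new
    return rank

# Hypothetical graph: an "article" page with only 2 outlinks, and a
# "linklist" page that spreads its score across 10 outlinks.
graph = {
    "article": ["yoursite", "othersite"],
    "linklist": ["yoursite"] + ["site%d" % i for i in range(9)],
}
ranks = pagerank(graph)
```

In this toy graph, "yoursite" ends up ahead of "othersite" only because it also collects the small extra share from "linklist"; each individual link from the 10-link page is worth a fifth of a link from the 2-link page.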
>> The New York Times does not own the news it reports on, but it is nevertheless able to report on that news. <<
Yes, but the NYT is liable for how it reports that news, whereas Google (search) doesn't report and is not liable for anything. It just shows existing data, whereas media companies in general interpret data and are therefore bound by rules governing that interpretation. I believe search engines belong in their own category, which is as yet unregulated and cannot therefore be compared with the other media types you mentioned.
>> That's why all these posts about Google failing are so pointless, how many times can we be wrong before we finally realize we are wrong? <<
When did my comment about Google's responsibility during an update become a statement of their failure? By all means, air your views, but please don't misquote what I said -- this was not just another lame post predicting Google's demise, as you imply.
>> Having gotten to that point, maybe we can start working on doing the analysis that webfusion mentioned earlier. <<
By all means, but please understand that others are entitled to an opinion, even if that opinion does conflict with yours.
>> I understand your frustration. Thats human. <<
Thanks. I'm glad we sorted that one out.
>> So when Jagger update started, I had really not much to lose but everything to win. Therefore :-) <<
Thanks for explaining. Seems you're naturally Jaggerized and don't need a happy pill after all ;)
>> Nescafe Instant Coffee Gold Blend, usually followed by a cup of a danish brand Cappuccino :-) <<
Here's toasting your recovery with a cup of Italian cappuccino -- via an extremely noisy, neighbor-unfriendly coffee machine ;)
During this Jagger change, pages returned have fit no pattern. Today we continue to see the same thing. Addresses of pages deleted ages ago are up. The regular pattern of pages searched is nonexistent. Things look almost random.
It appears that the flux continues. And as such it would appear that we have no idea as to where we will ultimately end up. So wait we must.
Flux, of course, is a good term to use, especially in the sense that it relates to an uncontrolled emptying of the bowels.
If you haven't noticed, all update threads are filled with dire warnings of google failing, and of course google has not failed. That's what I'm talking about.
That's not an opinion; it's an empirical observation: I read the posts, and I look at Google's market share. Try to read what I said and not assume it's all about you. Or don't; it makes no difference to me.
You can think what you want, as you noted, anyone is entitled to an opinion. But don't mistake your opinion about how you would like things to be with how things are.
I'm not going to get into details about how far wrong you are about the New York Times being liable for misreporting of facts; it would violate the TOS. This is true only in very limited cases -- let's leave it at that -- where they misreport in a malicious way about an individual or corporate entity with access to a good attorney. Any larger misreporting is handled at best by an apology for getting the story wrong, and at worst... well, again, I can't go there due to the TOS.
If I remember right, Google has also been forced by legal action to remove certain things that fall into a similar area.
And besides, you really seem to be confused about this: why is any site deserving of any position in an absolute sense? Where do you get this idea from? What algorithm did you use to determine that a certain site had a right to a certain SERP position? Can you show it to me? Is it published anywhere? Why does no one have access to this superior method? Would you be kind enough to publish it so we can all see it?
How can moving site A from position X to position Y in any way, shape, or form be considered in the same league as libel? It is not a misreporting of fact; it's just deciding that site B is a better fit, or that site A is no longer a good fit, for whatever reason. And it's not even that specific: it's just site A satisfying some algo component, or not, and site B satisfying it.
If you don't like this situation, as I noted, feel free to actually work to change it by lobbying for a public search engine to replace these private, for-profit ones. That would be a much better use of your time if you want actual change in this arena.
Anyway, back to work; can't argue about stuff that has no effect on anything in the real world.
[edited by: 2by4 at 6:03 am (utc) on Nov. 11, 2005]
>> The higher the PR of the page that has linked to you the more positive weight it will have. Remember it's the page that counts, not the whole site. Most links type pages don't have nearly the PR that the homepage of a site has. <<
Not entirely true. It's the internal Google PageRank value of the linking page that yields results. Looking at the green toolbar bar to try to weigh what positive effect a given page has linking to your site is worthless, because:
Toolbar PR is passed on arbitrarily and is no longer in coordination with Google's internal PageRank algo. This means any site can appear to pass PR to almost any other site, regardless of how related the two are.
Reseller does this mean the flux is over now?
Does anyone see the flux still going on? I was still hoping that the spammy sites would get removed and my sites would come back :(. My traffic is zero now.
One of my sites has its cache updated to Nov 9th, but the rest still have a cache date of Oct 15th.
Kindly suggest.
>>Reseller does this mean the flux is over now?<<
Well.. I can see GG is here.
Good morning to you GG :-)
If Jagger3 follows the same pattern as Bourbon, there will still be some flux. GG prefers to call it "everflux" though ;-)
[edit] can see GG already answered the question. Thanks GG[/edit]
>> it will remain at *9* for a few more days at least while people here make sure that the datacenter is solid. But I believe it will spread starting next week <<
That will make Jagger3 on *9* the longest-stable results through this update. Will there be any tweaking when *9* spreads to other DCs? I mean, are there any issues that the ensuing flux (if there is any) will address before Jagger3 is finally done for good?
[edited by: McMohan at 7:02 am (utc) on Nov. 11, 2005]