This 237 message thread spans 8 pages: < < 237 ( 1 3 4 5 6 7 8 ) > >
|Will there ever be another monthly update?|
Is it going to be days, weeks, or months before the next update? At this point, I'm completely exasperated. I've had to explain the current situation ten different times to my clients, and I don't even have enough information to explain it with a modicum of confidence.
Regardless of what everyone else says, I know that this is affecting the quality of the results Google is providing to its users. How could it not? The last deepcrawl results are from months ago. And what percentage of the results come from the deep crawler? 80%? 90%?
Try searching for Today is April 6th 2003 [google.com]; this stuff hasn't been updated in months. How could this not affect the quality of user results?
All the work that I've done in the past two months is worthless right now, and it hurts. I'm just asking for information so I don't continue to look like a fool.
|I webmaster several sites - 2 of them experienced better rankings, one dipped slightly, and one stayed exactly the same. The one that stayed exactly the same has stayed exactly the same for the past 18 months. The two that got better are 6 months old. The one that dipped is about 3 years old. Go figure... |
Strangely enough my PR didn't change (at least not a full number), only my hits. I'm glad to see that at least some of your sites are doing better as a result of this update though.
My PR didn't change either, still a 7. But the problem is, so many incoming links were not considered. My incoming links fell from 600+ to only 82. That means lots and lots of important anchor text was not considered. These are important factors in the algorithm. Many others here are saying the same thing. The index is incomplete and the algorithm seems to have run using incomplete data. By most scientific standards, this would be considered "bad science". They should not have rushed out an incomplete index. Period.
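To see why an incomplete link set changes rankings even when the algorithm itself is unchanged, here is a toy sketch (nothing like Google's real implementation - the graph, page names, and numbers are invented) of a simplified PageRank-style score run once with all 600 inlinks counted and once with only 82 of them in the index:

```python
# Toy illustration (NOT Google's actual algorithm): a simplified
# PageRank-style score computed from an inlink graph, showing how the
# same algorithm on an incomplete link set lowers a page's score.

def simple_pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping page -> list of pages it links to."""
    pages = set(links)
    for targets in links.values():
        pages.update(targets)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / n for p in pages}
        for page, targets in links.items():
            if targets:
                share = damping * rank[page] / len(targets)
                for t in targets:
                    new_rank[t] += share
        rank = new_rank
    return rank

# Full picture: 600 pages link to "mysite".
full = {f"page{i}": ["mysite"] for i in range(600)}
# Incomplete index: same 600 pages exist, but only 82 links counted.
partial = {f"page{i}": (["mysite"] if i < 82 else []) for i in range(600)}

full_score = simple_pagerank(full)["mysite"]
partial_score = simple_pagerank(partial)["mysite"]
```

With fewer counted inlinks, the page's score drops even though nothing about the page or the algorithm changed - which is the "incomplete data" complaint in a nutshell.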
I agree crobb305. The incomplete index has given really hodgepodge results in some cases, and not in others - the whole thing is very unpredictable and inconsistent - which was the point of my comment regarding 4 of my sites:)
ALbino, suffering depends on which boat you are in.
My older sites are up as well. My newer sites (2 months old) are dead in the water.
For an SEO/SEM I agree with Seattle_SEM that this situation is a problem. The client asks "when?" and we can't give an answer. We used to be able to say that after 3 months results would start to flow, now it is anyone's guess. I've sent several people to WW in hope it calms them down and helps them understand that this situation is a Google change issue and out of our control.
In very competitive markets backlinks are very important. Freshbot may pick up the newer sites, but they can't compete without PR and backlink anchor text being added into the equation.
I appreciate Google doesn't care whether it damages the SEO industry; after all, we are taking money away from it.
I suggested to three clients this week that if they were desperate for quick results they should use Adwords/Overture. All three said they were not prepared to pay $3.50/$2.50 per click, their margins would be destroyed and possibly turn negative at those prices.
So I guess SEOs are temporarily going to have a difficult time with new clients. My advice is to concentrate on getting additional business from older clients: do more work on conversion-rate factors and less on SE positions and traffic. Making site mods to improve conversion rates can be measured the next day... much more enjoyable than all this wait-and-see lark! In addition, conversion-rate factors can't be influenced by SEs.
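The margin math behind those clients' reaction is simple to sketch. The figures below are hypothetical, not any real client's numbers:

```python
# Back-of-the-envelope check (hypothetical numbers) of why a $3.50
# CPC can turn margins negative: compare the cost of a click against
# the expected profit from a click (conversion rate x profit per sale).

def profit_per_click(conversion_rate, profit_per_sale, cpc):
    """Expected net profit from a single paid click."""
    return conversion_rate * profit_per_sale - cpc

# A site converting 2% of visitors at $100 profit per sale:
net = profit_per_click(0.02, 100.0, 3.50)   # 2.00 - 3.50 = -1.50 per click

# Break-even CPC: the most you can pay per click without losing money.
breakeven = 0.02 * 100.0                     # $2.00
```

At a $2.00 break-even, $2.50-$3.50 clicks lose money on every visitor - which also shows why conversion-rate work pays off twice: raising conversion to 4% doubles the break-even CPC.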
Maybe it's time to get the press involved - the only reason that Google is getting away with this is because the end users don't know that they're not actually searching the entire, current, Internet.
|the only reason that Google is getting away with this is because the end users don't know that they're not actually searching the entire, current, Internet. |
They are starting to realize, slowly. I have heard some coworkers making comments (out of frustration) that their searches are coming up weird on Google. As was said in another post, word-of-mouth is very powerful. The internet has been around for a while, and I doubt the general public is as search-ignorant as it used to be, or as Google seems to think (or would like to believe).
[edited by: crobb305 at 11:43 pm (utc) on June 6, 2003]
GoogleGuy stated that we should expect at least one more traditional update.
A traditional update includes a deepcrawl. After the deepcrawl is completed it is typically about 2 weeks for the update. Since there has been no sign of the deepcrawl yet, mid June seems out of the question (that's a week away) and the end of June seems highly unlikely.
Couple this with the huge problems GG seems to be having in this transition period and we may be looking at mid July at best, maybe August.
Also, please stop saying that the index is OK and not out of date because of Freshbot. Fresh only guesses and is usually wrong. There is no way new sites (without backlinks or PR) are being ranked accurately, and everyone here knows that.
I'm not so sure they have to crawl again to get more recent data. Nothing ever happened to the last deep crawl data. We could still see this, no?
Not to throw a wrench into people's hopes of a speedy recovery, BUT......
BRETT'S RESPONSE to Google settling any time soon:
There is no guarantee of that. In fact, history says it will only get worse....
...Geo location, language specifics, browser type, and possible stealth personalization features mean that Google could appear to be the most random of them all in the near future.
Don't bet the bank on anything. Get your clients to realize single keyword watching is history. Watch for total referrals and build content content content until you drop.
WebmasterWorld - Professional Webmaster Business Issues:
It's getting old and repetitious already and I'm tired of hearing about it, but I'll say it just this once myself, just for the record.
>>I've got this one bookmarked for the next time someone says the senior members are not complaining.;)
Count me in on that one.
I'm starting to understand how Financial Advisors have felt the last three years.
2. Knowing things would get better, eventually.
3. Still confident that the market would rebound.
4. Looking for other options.
5. Knowing things would get better.
6. Clients expecting us to know why the market has tanked.
7. Confident the market would still be the best place to be, eventually.
Financial Advisors will tell you, the problem is - even though the market has been better the last couple of months, people already expect to be even. And we know - after this is over, we will have some work to do....just wait until we have to explain that (especially to new clients).
Anyone planning to sell goods or services during the Xmas holidays should be all done with their websites by the 15th of this month and submit them to Google and DMOZ and others. Changes made after the 15th won't be picked up by the crawlers in time.
I am referring to 2004 Xmas holidays, of course.
Seattle_SEM mentioned getting the press involved. In other words, "let's put the heat on Google". This might be a good idea.
Who here has access to the press and can take steps to publicize this issue? Hmmm
Google owes webmasters and its customers an explanation.
GoogleGuy's babblings are insufficient to cool this heated issue. Hell, the guy doesn't even give his real name.
However, my problem isn't with GoogleGuy, he's alright. My prob is with Google corp.
How tough would it be to have a webmaster news area of Google where they post information about the index? I visualize a day when Google posts its latest complete update date on every SERP. What date would be displayed currently?
>Who here has access to the press and can take steps to publicize this issue? Hmmm
What's the point? Google News will suppress it. ;)
"Who here has access to the press and can take steps to publicize this issue? Hmmm"
I do, but what do I say? "look at this search for viagra, look at that a full page of ...err viagra...sites"
etc etc, I doubt the popular press would be the slightest bit interested.
May get an article in the SEO's gazette, but only 16 people read it ;-)
|GoogleGuy's babblings are insufficient to cool this heated issue. Hell, the guy doesn't even give his real name. |
Since when has Webmaster World encouraged members to use their real names?
|I'm not so sure they have to crawl again to get more recent data. Nothing ever happened to the last deep crawl data. We could still see this, no? |
Actually, GoogleGuy said he did not think they would need April data, implying that there would be another deepcrawl instead.
Actually, you're still wrong, and what I said was very different than what you are implying, thus not confirming your position.
GoogleGuy said less than months, meaning less than plural.
This means it has to be under 2 months, 2 being the point at which it becomes 'months'. (Thus whatever is under 2 months... which is why I suggested 1 month, 29 days, which falls just short of the plural "months".)
Your confusion lies in that you assume "3 months" is less than "10 months" according to the language. This is equivalent to saying "less than infinity", which is just ludicrous (1000 months is less than 20000 months, and this is a possibility in your evaluation).
Obviously, there would be no point in making the statement if there was no timeline defined.
You said: "And by definition, "months" is anything GREATER than one month."
Actually, to have "months", you'd need 2 months. While "1 month and 1 day" is greater than a month, it certainly is not "months", at least not in this language. This is why you can have something take over a month, but not months. I hope this clears this up for you.
I chalk this up to the little problems we sometimes run into on WebmasterWorld and other sites with an international community; most of the community is quite intelligent, but, working outside their native language, they might interpret something said incorrectly. Trust me, he didn't mean 'infinity'.
[edited by: deft_spyder at 12:46 am (utc) on June 7, 2003]
"That means lots and lots of important anchor text were not considered."
I'm not assuming this, if only because I'm not assuming anything logical at work these days.
One site I've previously ranted about is #1 for a highly valuable two-word keyword with a million results. Their ranking is 99.999999% likely due to their signing guestbooks with that exact two-word anchor text. Google only shows 130 or so backlinks for this site, of which maybe forty are non-guestbooks (mostly doorways and internal pages).
However, All the Web shows *2517* incoming links to this site. Hundreds and hundreds of guestbook and doorway pages with that two-word keyword as anchor text.
Other sites have far more anchor text for those two keywords, but rank behind this site for searches and for allinanchor for those keywords. Google must be giving this site benefit for all that anchor text that is not showing as backlinks (mostly on pages with very low PR).
GG is alright. Whoever he is, he's cool, but this Google mess is getting very depressing.
Google was such an excellent SE, and now... ancient data, and none of my new pages in since the March crawl (the main page is PR6 - I don't understand it, but Freshie isn't that interested any more, and when it finds the new pages, it doesn't add them). My preservation page, which had only been in the index for 2 months, disappeared with Dominic. That page was very important to preserve what I love. It's #2 in Ink, #2 in ATW, and is the only page up against commercial sites exploiting it; Google had it and lost it.
I've started using ATW and Ink more for my own searches and changed the home page on IE because I know Google is totally out of date. I miss the old Google.
(And I'm not whinging because I was hammered... my site is doing well for most kw's in Google and it's still bringing in people asking to give me money).
|Actually, to have "months", you'd need 2 months. While "1 month and 1 day" is greater than a month, is certainly is not "months", at least not in this language. This is why you can have something take over a month, but not months. I hope this clears this up for you. |
|If you consider a 45 day period, that is one and a half months (1.5 months). Notice the 's' implying plural. Therefore, anything more than one month is "months". Obviously you and I have different interpretations, and no one fully knows what Googleguy was saying because it is ambiguous, which is all I was pointing out in my post. We are beating a dead horse...so next:
|A traditional update includes a deepcrawl. After the deepcrawl is completed it is typically about 2 weeks for the update. Since there has been no sign of the deepcrawl yet, mid June seems out of the question (that's a week away) and the end of June seems highly unlikely |
I agree...getting a bit worried here that no update will occur in June - maybe not even in the first half of July, for that matter, since we haven't seen the deep crawler for quite some time.
One of my sites has just been deeply crawled during the last week, and another has just had the surface hits ( robots.txt & index.html ).
Also I have noticed that instead of getting a deep crawl over a period of three or four days, I get semi-deep crawls twice a month or so, lasting about a week each.
So now I can't "get ready for the googlebot", it seems to be unpredictable now.
I initially thought that Google just bought Applied Semantics because it knew that it would annoy the daylights out of All-Over-Fast - Applied Semantics being a top 10 Overture partner and all - but now we know the real truth...
Google bought Applied Semantics solely to help GoogleGuy better word his posts here at Webmaster World!
semantics - noun : the study of language
Welcome to WebmasterWorld, BigJay.
It sounds like freshbot hits, for what that's worth. Freshie was digging deeper than usual into some sites the last week or two.
The good old deepbot was different; it would find pretty much everything on the site and add it a few weeks later. Those were the days...
A little blurb on the Google search page would be good:
Insert corporate gibberish about wanting to make search results better...then add:
"However, we would like to apologize to anyone in the world who created or updated sites in February.
We decided to perform a back date before an update.
We will not inform you when your new sites and pages will be ranked properly because we have no idea at all.
Really...truth be told...for some reason all employees of Google have fallen into a coma and GooglePlex Personnel are under a strict quarantine.
I am an outside support consultant and the only person unaffected by this illness thus far.
But don't worry, I'll fix Google up. I'm an MCSE and run RedHat Linux at home!
Where's that Sybex Google Study Guide when you need it?
Lemme see...what happens if I click thi..."
[redirect to 'britney spears' SERPS after 60 seconds of inactivity]
Google lives on...sans employees.
"THE GOOGLE COMA"
|Web Guerilla is right, but it is even more than that Chris_R. Many new pages are indexed... so what? That is both not the point, and not a GOOD point. The point is the new pages are not ranked *properly*. Fresh pages are poorly ranked by definition. It's almost like "guessed" pagerank. Fresh pages are ranked by guessing. |
So what we have are old pages being ranked by very old data, and new pages being ranked by guessing.
Fresh pages have always been ranked by "guessing". This is nothing new.
While YOUR point may be the results are crappy - the person who posted this thread was complaining about old results - and used the date as an example. I simply showed how a much newer date gave more results which was EXACTLY THE POINT of the thread.
Every single complaint is from webmasters who haven't got stuff listed. I am not saying this isn't true in some cases - or even most of the cases with specific webmasters complaining [all my pages are in]. I am just saying that it is no big deal from the users perspective.
The claims that the SERPs are so different and trashy are just hogwash. They aren't. I have some printouts from the end of 2002. The first 9 results for one of the words I track but don't compete with are the same - in the same order. Searching for a different two-word phrase gives very similar results to 2002 (8 out of 10 the same - order slightly different).
Every search I have done on Google as a user seems normal. I haven't seen anything to suggest they are "broken", as some insist, from a user's perspective. Could they be better? I am sure they could be, but better doesn't mean "includes my sites". Users don't care about backlinks - users don't usually search for this. Half the complaints seem related to backlinks not being counted and stuff like that.
Thing is - the search still works. No matter how many ticked off webmasters whine - it still works for users. The Finding Nemo example was the best I could think of on short notice - and was far from perfect, but guess what - when you search for it - you find good pages. Same for every other search I do.
Just because some webmaster's page on widgets doesn't get included doesn't mean there aren't thousands of other just as good pages. Freshbot is out and hasn't had problems indexing my pages.
Google doesn't have to dig so called SEOs out of the graves they dug themselves by promising clients stuff they never should have in the first place.
THE QUALITY OF AN INDEX IS NOT MEASURED BY IF A SPECIFIC WEBMASTER GETS ALL HIS OR HER PAGES RANKED.
I don't think any more info is going to come from Google spelling out what everyone wants to know. For some reason they don't like to say when the updates are going to occur. I don't agree with that, but I don't run things either. Google is more open about their engine than any other search engine in history.
<<Chris_D wrote: Google bought Applied Semantics soley to help GoogleGuy better word his posts here at Webmaster World!>>
I consider GoogleGuy to be our own personal Alan Greenspan (for those who don't know, he is the Chairman of the U.S. Federal Reserve). One famous Alan Greenspan quote is: "If I seem unduly clear to you, you must have misunderstood what I said." Mr. Greenspan's cryptic remarks are widely anticipated, dissected ad nauseam, and greatly influence world financial markets. Could we not say the same for GoogleGuy's influence on the Webmaster World? You just can't help but like the guy! (well, I can't anyway) How someone can say so much while divulging so little is beyond me. I think GoogleGuy and Mr. Greenspan need to do lunch.
All the web has more than twice as many pages, but google blows away the others - it isn't like they don't have the info.
Well, in smaller niches and topics of long-term lament, you can bet that Google does NOT have the "info". Ya, the newcomers are getting a bad shake, no two ways about it.
I would have to strongly disagree with your post. There is a huge amount of spam in the results. Just look at all the sites that are listed highly that Google itself has in the past classified as spam. And didn't GoogleGuy even say that spam filters would be turned on gradually. In other words they aren't on!
I also have old domains that have been 301 permanent re-directed for two months still showing up.
What is my biggest complaint? All of the work that I have done for the last two months is nowhere to be found. This work is entirely based on Google's guidelines. Kinda sucks!
|semantics - noun : the study of language |
Chris_D, maybe they should add it to WebmasterWorld's Glossary!
And while they're at it, they could add "HUBRIS" too - the overbearing pride that comes before a fall. It applies to us, for thinking that we had the Google thing sorted, and that we knew how to do #1 sites, and to Google, for thinking that they'd get away with their present inadequate results...
The point of the thread is a guy complaining that his work of the past two months is worthless. Some people seem to confuse "listed" with "ranked well". I get all my pages in, that's no problem. I've always known how to get freshbot in, get pages in the index in a couple days.
But now we have Google serving up extraordinarily much worse search results (and please, I can't see a person seriously questioning these pitiful serps), because Google is combining old, outdated material with new, unproven material.
It's the worst of both worlds, and the serps plainly show it. Yes, many sites still rank well for relevant content, but it is simply ludicrous to assert that the serps are not terrible when a particular, long-established page might show at #6 one day and #104 the next -- while large chunks of the results stay exactly the same.
It doesn't do anybody any good to just complain, and it sure doesn't do any good to bury your head in the sand. Old, poorly rated stuff mixed with new guessed stuff *does* trivialize the value of making valuable content.
Some folks here try to talk about broader implications of Google's data failure, but too often the discussions get sidetracked by people who insist it is all about their site. I happen to be having my best days ever... while Google is inexplicably ranking my own subpages below my much superior (by any measure) index pages, yet at the same time ranking "fresh" pages I've made at #1. This is flat-out terrible search-engining on the part of Google.
Those whose own domains haven't been affected need to get their heads out of their own sites and look around... expired domains, guestbooks, spam linking, "fresh" drivel has risen to the top where it mixes still with 60% adequate results.
Hopefully Google will fix this horrible mess soon, because their results become more irrelevant each day -- as the rated results get older, and more "fresh" guesses get dropped in (and while the new sites flop in and out completely). But we still have precious little evidence of a fix. Instead of continuing to waste resources on Freshbot, why isn't the deepcrawler doing several weeks of a deepcrawl?
Hey Googleplex, a problem is you can't rank anything accurately now, not finding more junk to guess at!
Fix the deepcrawl and get back to work judging and ranking the web on its merits, not its date.