
Google SEO News and Discussion Forum

Completely bummed out since Bourbon update
Traffic down 60-80%, AS revenue down by 75%+
fearlessrick
msg:751403
11:32 pm on Jul 13, 2005 (gmt 0)

I have submitted a sitemap, made some changes to my site, and sent an email to the address GG gave; nothing has changed. Basically, my faith in Google AdSense has been rewarded with the ruin of my site and my plans.

Sure, this sounds like sour grapes, but since May 21, traffic and revenue have dropped off a cliff. It's no longer worthwhile to update my site. By way of explanation: I took what was once a subscription-based service on password-protected pages and converted it to publicly available pages with AdSense. It was going well, increasing income every month, until Bourbon devastated it.

I've tried what I reasonably could, but am at the point where I'm seriously considering chucking the entire site, as the revenue isn't even worth thinking about. Anybody who has any reasonable ideas, please post or sticky me.

Bummed out and depressed...

 

Atticus
msg:751583
7:48 pm on Jul 24, 2005 (gmt 0)

EFV,

Regarding your unwillingness "to accept the proposition that Google's search engineers and programmers are willingly accepting direction from beancounters," please consider that the sole purpose of a corporation is to provide profit to the shareholders, and that an eye towards short-term gains almost always outweighs concerns about the quality of the product the corporation produces.

See:

[en.wikipedia.org...]

"In general, management of publicly traded corporations are thought to have a fiduciary duty to always increase the amount of profit made."

To say that engineers would rebel if beancounters took over quality management of a search engine is to completely ignore our old dead friend, AltaVista. And to say that no corporation would ever risk long-term stability for short-term gains completely ignores the history of companies such as Enron, WorldCom, Arthur Andersen, etc.

europeforvisitors
msg:751584
7:53 pm on Jul 24, 2005 (gmt 0)

This belief can only be held if you do not begin from the facts, but rather from a flawed premise with exactly zero empirical foundation: in this case, the belief that Google isn't trying to maximize its income, a statement that hovers in the realm of religious faith.

I certainly don't share the belief that Google is foolish enough to sacrifice its core product for short-term gains.

Google.com isn't a disposable domain, and Google does have competitors.

2by4
msg:751585
7:59 pm on Jul 24, 2005 (gmt 0)

Who said anything about risking long-term success for short-term profits anyway? It's a balancing act; all major corporations play this game, some better than others. For example, by boosting pre-IPO income, they probably added significantly to their IPO proceeds, which in turn have given them a very strong war chest to operate with. That's critical if you're going up against the likes of Microsoft, who will not hesitate to use theirs to wipe out any company they can't buy.

Like I said, I have far too much respect for the intelligence of the Googlers, including their bean counters, to believe that they aren't capable of playing this type of balancing act quite successfully long term. But that means they MUST make money, now and in the future. And that means maximizing their current income streams.

To me, SEOs of all people should understand the concepts of risk management: making decisions that involve risk in order to boost income, while also working to ensure that failure isn't the outcome.

Swebbie
msg:751586
8:17 pm on Jul 24, 2005 (gmt 0)

Why in the world does G drop a site from page one into oblivion, or vice versa?

To keep webmasters guessing and to ensure a steady flow of AdWords money coming in. If the SERPs were mostly static, those at the top would be happy with their organic traffic and not pay to advertise. This way, everyone feels the need to advertise, because the next algo change may devastate your business if you rely on organic traffic. Makes good business sense, even if it screws up a lot of us.

mahoogle
msg:751587
8:39 pm on Jul 24, 2005 (gmt 0)

I have a simple testing methodology, and it's limited to one area. There is more to it than I'll write here, but basically I measure the top five positions, weight them by the % that have AdSense, the position of the ads, and a CPM based on my channels, and come up with revenue per top-5 results... My system certainly has holes, but it does give a sense of the direction AdSense sites are heading in my categories.
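
A toy sketch of that kind of weighting, in Python (every field name and number here is invented for illustration, not the actual system):

# Weight the top five results by whether they run AdSense, where the
# ads sit, and an assumed CPM, giving a rough "revenue per top 5".
POSITION_WEIGHT = {"above_fold": 1.0, "sidebar": 0.6, "below_fold": 0.3}

def revenue_per_top5(results, cpm):
    """results: list of dicts describing the top 5 SERP entries."""
    total = 0.0
    for r in results:
        if r["has_adsense"]:
            total += cpm * POSITION_WEIGHT[r["ad_position"]]
    return total

top5 = [
    {"has_adsense": True,  "ad_position": "above_fold"},
    {"has_adsense": False, "ad_position": "below_fold"},
    {"has_adsense": True,  "ad_position": "sidebar"},
    {"has_adsense": True,  "ad_position": "below_fold"},
    {"has_adsense": False, "ad_position": "sidebar"},
]
print(revenue_per_top5(top5, cpm=2.50))  # 2.5 * (1.0 + 0.6 + 0.3) = 4.75

Tracked per category over time, a falling number would suggest AdSense sites are being pushed down for those keywords.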

Google is a heavily metrics-driven company, with "make more money" and "make users happy" as major goals, and of course the magical combination of the two. For every search product I've ever worked on, a considerable amount of study went into monetization and the effects of changes on revenue. I'd be very surprised if Goog made changes to their core product without having a precise expectation of the effect on revs.

The implications of this update say to me that they rolled back to (or rolled out something similar to) a pre-Bourbon update, saw that they lost too much revenue, and are now reverting or applying a knob that helps AdSense sites... we'll see... it's still early. My advice to people who lost traffic with this update is to be patient; my bet is you'll come back in the next two weeks...

steveb
msg:751588
8:41 pm on Jul 24, 2005 (gmt 0)

"but this contradicts a few ideas"

Actually it doesn't, but if you insist on thinking of everything as black and white, then you'll never understand anything.

By the way, does time move differently for some people? After you fix canonical issues it might take four or five months to see the benefits. If it is a 302 issue it could be much longer.

Humpty Dumpty can be put back together again, but it may take months and months.

stu2
msg:751589
8:55 pm on Jul 24, 2005 (gmt 0)

I'd be very surprised if Goog made changes to their core product without having a precise expectation of the effect on revs.

Judging by the frequency with which they seem to be updating and rolling back, I'd infer that their expectation is rather imprecise and that they are experimenting on live data.

europeforvisitors
msg:751590
8:57 pm on Jul 24, 2005 (gmt 0)

To keep webmasters guessing and to ensure a steady flow of AdWords money coming in.

Why is it so difficult for people to understand that delivering the "most relevant" search results out of thousands, hundreds of thousands, or even millions of results for a given keyphrase is very, very hard? And that even a slight change to an algorithm can have a major effect on where a page shows up in the SERPs?

I just searched on a certain type of cheesecake recipe, and I got 239,999 results. With the four-word phrase in quotation marks, the number of results was still 2,970. That's a lot of recipes, and who's to say which one deserves to be #1, #10, #100, or #1,000?

Now let's say that, for example, the search engine or its "black box" software adjusts the algorithm to give slightly more emphasis to title and less emphasis to anchor text because "anchor-text abuse" is skewing the search results. All of a sudden, Sarah's [city name] cheesecake recipe drops from #1 to #15, and Julio's [city name] cheesecake recipe jumps into first place. Why would anyone assume that's intended to help the search engine's bottom line? Or that, if the search engine is Google, Sarah is more likely to buy AdWords than Julio is?
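
A toy illustration of exactly that kind of flip (scores and weights invented; a real engine uses far more signals than two):

def score(page, w_title, w_anchor):
    # Two signals only, for the sake of the example.
    return w_title * page["title_match"] + w_anchor * page["anchor_match"]

sarah = {"name": "Sarah", "title_match": 0.6, "anchor_match": 0.9}
julio = {"name": "Julio", "title_match": 0.9, "anchor_match": 0.5}

for w_title, w_anchor in [(0.4, 0.6), (0.6, 0.4)]:
    ranked = sorted([sarah, julio],
                    key=lambda p: score(p, w_title, w_anchor),
                    reverse=True)
    print((w_title, w_anchor), [p["name"] for p in ranked])
# (0.4, 0.6) ranks Sarah first; (0.6, 0.4) flips the order to Julio.

Neither page changed; only the relative weight of title versus anchor text did.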

Switch to a more commercial phrase (say, "Elbonia hotels"), and you've got another complicating factor: a constant flood of heavily SEOed template-based pages such as "user review" and scraper sites. Just the other day, a Webmaster World member spoke of publishing a 2,000,000-page site that would consist of user reviews but apparently was ready to launch without any "real content" (to use the member's phrase). This kind of influx obviously has an influence on search results and the methods used to deliver the SERPs. Is it any wonder that, in such an environment, Joe-Bobs-Widgets.com or Irma-Sues-Great-Content.com can't count on being #1 or even #10 for a given phrase all the time?

JuniorOptimizer
msg:751591
9:59 pm on Jul 24, 2005 (gmt 0)

The recipe that should be number one is the one I get paid on. Did I get it right?

berto
msg:751592
10:10 pm on Jul 24, 2005 (gmt 0)

Most troubling to me are the discontinuities--entire sites suddenly jumping from here to oblivion, then just as suddenly back again. In the Google universe, space and time warps are all too common. Too often, SERP movements defy common sense.

reseller
msg:751593
10:34 pm on Jul 24, 2005 (gmt 0)

europeforvisitors

>Why is it so difficult for people to understand that delivering the "most relevant" search results out of thousands, hundreds of thousands, or even millions of results for a given keyphrase is very, very hard? And that even a slight change to an algorithm can have a major effect on where a page shows up in the SERPs?<

Honestly, it isn't that difficult to understand your argument.

But what most publishers find very hard to understand is the necessity for all this continuous "updating", "shuffling", "tweaking", "everflux", and you name it, which doesn't seem to follow any logic.

What is the logic in writing algos that bring a site to the top of the SERPs today, send the same site to oblivion next week, then bring it back to the top again, and then send it to h*ll later, etc.. etc..?

This has been going on for 6 months!

Do you have any explanation for that?

SEOPTI
msg:751594
10:36 pm on Jul 24, 2005 (gmt 0)

reseller, it's just stupid, I'm happy there is Yahoo and MSN.

Swebbie
msg:751595
10:40 pm on Jul 24, 2005 (gmt 0)

Why is it so difficult for people to understand that delivering the "most relevant" search results out of thousands, hundreds of thousands, or even millions of results for a given keyphrase is very, very hard?

Non sequitur, anyone? You have a real penchant for puffery, but I've noticed that it tends to be built on straw men, such as the tidbit above. No one has even remotely suggested that it's easy to get the most relevant results out of so many indexed pages.

We're all *SPECULATING* on motives. Yours is clearly that Google has an altruistic motive - namely, to give searchers the most relevant results. Some of us happen to think that's naive, especially since they went public and have folks who are not Google employees to answer to. Those same people have a lot of their retirement hopes resting on Google's stock performance. And stock performance has a lot to do with profits.

Denying that constant shake-ups in the SERPs are related to that major priority is a pie-in-the-sky perspective. But hey, to each his own, EFV. I enjoy you most when you get up on your pedestal and preach to all of us inferiors. <snicker>

steveb
msg:751596
10:48 pm on Jul 24, 2005 (gmt 0)

"Do you have any explanation for that?"

It's not deliberate. Why is it so hard for some folks to grasp the obvious, and instead grasp at all these wild ideas?

This notion that Google is the only infallible entity in the world is just creepy.

OptiRex
msg:751597
10:53 pm on Jul 24, 2005 (gmt 0)

This has been going on for 6 months!

Don't think so... this has been going on since Google first appeared on the scene. I remember, in Google's early days, analysing the effects of a new update over a weekend and then implementing immediate changes!

The beauty of Google then was that I got pretty immediate results, since they would re-spider virtually on submission. Of course, so did lots of other SEOs, who were doing and learning the same things and comparing the results to those achieved with AltaVista :-)

It was those early days with G that taught us many of its preferences. Obviously it's not the same animal now; more of a hybrid! A very tweaky hybrid...

jd01
msg:751598
11:09 pm on Jul 24, 2005 (gmt 0)

15. The method of claim 1, wherein the one or more types of history data includes information relating to how often the document is selected when the document is included in a set of search results; and wherein the generating a score includes: determining an extent to which the document is selected over time when the document is included in a set of search results, and scoring the document based, at least in part, on the extent to which the document is selected over time when the document is included in the set of search results.

Has anybody considered that the above would take a period of time to gather information for?

Wouldn't the page(s) have to be in the results and compared to other results on the same page to make a determination?

Wouldn't the results have to change for a number of documents to be compared against each other to find meaningful information about how often they are selected compared to other results from the same niche?

Wouldn't there *have* to be fluctuation and change on an almost continual basis to implement this type of system and log information for comparison?

After an initial implementation wouldn't there have to be adjustments made, and other data samples taken?

Is this likely? Your guess is as good as mine...

My main point is that when we look at information with blinders on and are not open to where G is trying to go a year from now, we may be missing some important information along the way.

'How relevant and worthy are the results today?' should be only as much of the question as 'What are they setting up now to release tomorrow?'
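
A minimal sketch of what claim 15 might look like in code (the blending weight and names are my assumptions, not anything from the patent):

def click_history_score(base_score, impressions, clicks, weight=0.3):
    """Blend a base relevance score with the observed selection rate."""
    if impressions == 0:
        return base_score        # no history yet, so nothing to blend in
    ctr = clicks / impressions   # "extent to which the document is selected"
    return (1 - weight) * base_score + weight * ctr

# A page only accumulates impressions while it is actually being shown,
# which is the point above: the SERPs must fluctuate to gather the data.
print(click_history_score(base_score=0.5, impressions=1000, clicks=120))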

Justin

theBear
msg:751599
11:23 pm on Jul 24, 2005 (gmt 0)

Justin,

Interesting set of words from the patent.

Only one _minor_ _MAJOR_ problem with that.

See the human factors study done on SERP usage patterns.

Then we have the possible influence of automated button pushing.

From time to time I do some really far-out searches on chemical names. I did one after the booze binge had "ended"; there were some 38, plus or minus, results, but only 6 or so were shown.

Every result was marked supplemental.

That right there told me that something got turned too high.

jd01
msg:751600
11:56 pm on Jul 24, 2005 (gmt 0)

theBear,

I understand...

My point is that whether the button is turned too high, or they put $s=$Y+$z/$q*.0398751%$WA($XY-.4$i) instead of $s=$Y+$z/$q*.0398761%$WA($XY-.4$i)

We jump on a single simplistic explanation and run with it concerning a 5,000,000+ variable algo, with new conditions and adjustments being implemented almost daily.

Then we complain when our site in a 12,000,000-result search goes from result 1 to result 250... IOW, we whine when our site drops by roughly 0.002% of the result set (249 places out of 12,000,000) and claim G is broken...
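
Back-of-the-envelope, with invented figures:

n = 12_000_000            # results for the query
band = 0.01               # suppose all scores fall in a band this wide
docs_per_unit_score = n / band   # 1.2 billion documents per unit of score
nudge = 2e-6              # a sixth-decimal change in one page's score
print(int(nudge * docs_per_unit_score))  # ~2400 positions moved

With scores packed that tightly, a sixth-decimal tweak is more than enough to turn result 1 into result 250.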

I think we would be better off looking at what they may be *trying* to implement that we have been given some clues about, rather than exclaiming that the engine is broken when something that we don't understand happens.

IOW, the question I ask is not 'How could they do this to their results?'. I assume they are smarter than me and actually do have a plan, so I ask, 'Why would they be doing this to their results?'

Added: Looking at the patent, and knowing much of it may not be implemented, there are a number of things in it that require fluctuating SERPs to be usable.

Added: For instance, the seasonal implementation, which can relate to documents being present in the SERPs based on selection data from a month of the year, a week of the month, a day of the week, or a portion of the day.

There are not too many answers to the first, but when you start asking the second, there turn out to be a large number of answers, and some of them even make sense, beyond 'they are a company trying to make money.'

My guess (definitely not directed at anyone in particular) is that for most people it is either too much thought, or not as much drama and fun as the 'it's broken' cop-out for not ranking.

Justin

europeforvisitors
msg:751601
12:07 am on Jul 25, 2005 (gmt 0)

For what it's worth, I'm doing well in Google these days (after being whacked for about two months), but about half a dozen of my "money" affiliate-link pages for one city are gone from the SERPs. Comparable pages for other cities haven't been touched, and indeed one was mentioned in a leaked Google quality-evaluation document as an example of a non-spammy affiliate page. So what's going on? Do the powers-that-be at Google think they can pressure me into buying AdWords for page set A but not for page sets B, C, and D? Or is it possible--just possible--that those half-dozen pages went missing because of a (gasp) Google glitch?

To expand on what Steveb said, where do people get the idea that Google is infallible, or that search algorithms can't evolve along with the Web?

jd01
msg:751602
12:13 am on Jul 25, 2005 (gmt 0)

Apologies for the double post...

If I could sum up my thoughts, this is what I would say:

SEO is not dead, but its future will require continuing to incorporate new ideas which, looking at the information available, will include controlling user behavior:

How do I make users click on my site?

How do I make users stay on my site?

How do I make users view more pages of my site?

How do I make users behave in a way that dictates to the SEs that my site is the site they were looking for?

As user behavior becomes more trackable, it will become more important relative to on-page factors... why store and analyze multiple on-page factors when you can determine the subject of a page and then let users tell you, through their behavior, whether it is what they are looking for?
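
A sketch of where that leads (metric names, scales, and weights are all hypothetical):

def behavior_signal(ctr, dwell_seconds, pages_per_visit):
    """Fold the four questions above into one 'user satisfaction' score."""
    dwell = min(dwell_seconds / 180.0, 1.0)   # cap credit at 3 minutes
    depth = min(pages_per_visit / 5.0, 1.0)   # cap credit at 5 pages
    return 0.5 * ctr + 0.3 * dwell + 0.2 * depth

print(behavior_signal(ctr=0.12, dwell_seconds=95, pages_per_visit=3))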

Justin

theBear
msg:751603
12:21 am on Jul 25, 2005 (gmt 0)

Justin,

On this I agree "(some of us) jump on a single simplistic explanation and run with it concerning a 5,000,000+ variable algo, with new conditions and adjustments being implemented almost daily."

However, there are simple explanations for a lot of what is going on. If a site stands at the intersection of a number of these, then all hell will descend upon that site.

Feedback loops are prone to being unstable.
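
A toy correction loop makes the point (pure toy numbers, nothing to do with Google's actual systems):

def run_loop(gain, target=1.0, steps=8):
    value, history = 0.0, []
    for _ in range(steps):
        value += gain * (target - value)   # correct toward the target
        history.append(round(value, 2))
    return history

print(run_loop(gain=0.5))   # settles smoothly toward 1.0
print(run_loop(gain=2.2))   # each correction overshoots more: unstable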

The primary problem as I see it is:

1. It takes too long to roll out an update, whether full or a correction.

You can rest assured that in a previous line of work, if it took this long, the corporation would have been belly-up by the 4th day.

jd01
msg:751604
12:41 am on Jul 25, 2005 (gmt 0)

Completely Agree:

1. It takes too long to roll out an update, whether full or a correction.

Also, I agree there are some simple answers; I just hate the blinders that looking only at what seems simple from one perspective sometimes puts on =)

Justin

walkman
msg:751605
1:15 am on Jul 25, 2005 (gmt 0)

I'm still doing OK in this update, but wish me luck: I used rel=nofollow on a lot of (not all) external links, especially from my front page.

According to what I have been reading from GoogleGuy it shouldn't make a difference, but I'm still somewhat nervous :).

Atticus
msg:751606
1:41 am on Jul 25, 2005 (gmt 0)

I don't care whether Google's schizophrenic SERPs are by accident or design.

As a searcher, I cannot have faith in a search engine that offers a page as the most relevant result to a query one day, drops it from the index entirely the next day, and brings it back to page one a few months later.

Such dramatic turnarounds undercut any pretense at relevancy. It's better defined as 'hit or miss.'

pleh
msg:751607
3:07 am on Jul 25, 2005 (gmt 0)

I cannot understand some of the wide fluctuations. I went from a PR2 to a PR4, but my SERPs went from pages 1-3 of Google to non-existent.

2by4
msg:751608
3:57 am on Jul 25, 2005 (gmt 0)

"I can not have faith in a search engine that offeres a page as the most relevant result to a query one day"

That would be a component of the risk I mentioned, and it's a real risk. I remember a few years ago I didn't use bookmarks at all, since I could always find what I had looked for again. Now, of course, my bookmarks are nested folder in folder in folder, since I have to bookmark anything I find if I want to find it again. They do seem to have tightened their web/programming/computer-related stuff a bit since last year, though; it was really bad then.

But it's easy to forget that power-searcher needs are not the same as average-searcher needs. I think Google can get away with this to some degree, but it's a definite downside to their activities.

On the other hand, having faith in any commercial search engine probably isn't a very good idea now that I think of it. Unfortunately there aren't any practical alternatives at the moment.

chopin2256
msg:751609
5:42 am on Jul 25, 2005 (gmt 0)

"but this contradicts a few ideas"

Actually it doesn't, but if you insist on thinking of everything as black and white, then you'll never understand anything.

I explained why I feel it contradicts some theories. It's not like I just said "this theory sucks because I am not benefiting." In fact, I said the opposite. The site I bought IS doing well, regardless of the problems it had, and it beat Bourbon. To me, this contradicts a few ideas. I also mentioned that the site is 9 years old and PageRank 7. This could play a role, but I remember some people saying their old sites were affected as well. I am pretty open-minded, but I am baffled as to why the site that has content is doing worse than a site that was neglected for years.


By the way, does time move differently for some people? After you fix canonical issues it might take four or five months to see the benefits. If it is a 302 issue it could be much longer.

Fair enough, but again... I hate repeating myself, but what about the site I bought that is 9 years old? It never addressed the canonical issues, nor the absolute-linking problems. It was also stale, and it even has a hijacked page. It's not affected by Bourbon. Why not? Again, I am not complaining, but I just fail to see canonical issues playing a big role in this. I am a pretty open-minded guy, willing to hear various theories, etc... but I fail to see this canonical stuff playing a major role, if a role at all... maybe in some cases.

reseller
msg:751610
6:04 am on Jul 25, 2005 (gmt 0)

chopin2256
>I also mentioned that the site is 9 years old, and pagerank 7. This could play a role, but I remember some people saying, their old sites were affected as well.<

You are right. My site is from 1997 (i.e. around 8 years old). It was first hit by Allegra (3rd Feb 2005), where it lost 75% of its Google referrals, then took another hit on 22nd July 2005, ending up now with around 10% of its Google referrals. The homepage has PR5 and most other pages PR4.

So much for old sites and page rank ;-)

ramachandra
msg:751611
6:42 am on Jul 25, 2005 (gmt 0)

Hello,

It's the same case for my site too. My site is also 7 years old; it was hit by Allegra, and thereafter all its pages had 0 PageRank. I thought the recent Bourbon update would do well by it, but no luck. I think my site is penalised again and is nowhere in the SERPs.

Presently I am trying to find a way to overcome this.

steveb
msg:751612
8:11 am on Jul 25, 2005 (gmt 0)

"but what about the site I bought that is 9 years old?"

What about it? Why is it so hard to grasp that nothing is 100% consistent always and everywhere? Google is not infallible. Everything is not black and white. Pointing out one site that does or doesn't do anything really means nothing at all, and in fact is just missing the whole point.

reseller
msg:751613
8:16 am on Jul 25, 2005 (gmt 0)

It looks like Google engineers are running some experimental algos/filters in real time. I guess this is going to continue for the rest of July.

The DCs are still moving.

And the top sites on the serps for my testing keywords/keyphrases are changing accordingly.
