But then again, I've never worked harder to achieve them.
Does this mean that when the dust settles and every pi**ed off person here gets their results back, mine go for a burton?
By the way, Walkman, I'm actually listening to a band right this minute called the Walkman! Freaky.
Similar
- ASP based
Not similar
- Uses ID= tag in ASP string
No, we hand edit the titles & page names; some match the keyphrase, most are slightly different, and all pages are affected.
- Has some redirects from old .htm pages to new .asp pages
Not similar.
- Cascading javascript menus
Single layer menu, pages outside the menu structure also affected.
- mini-site map for navigation in the footer
Some pages with one, some hierarchical, some flat; all equally affected.
- changed rightnav.asp (global change for the site) just when Allegra started updating
We changed only a link and the background to match the season.
- Used to get about 2,000 Google visitors per day, now get about 25 a day.
- I found a duplicate site at www.excite.co.jp
Nothing special.
- I have a Commission Junction affiliate ad in the left nav on every page.
None
- Adsense on every page (Adsense $ dropped considerably post Allegra)
Some pages
This data center has what I recently saw and what I expect [216.239.39.104...]
This [216.239.37.99...] is showing some terrible stuff (for me).
The interesting thing (to me) is that both have updated the pages with my recent deep crawl on the 6th and 7th, and I am seeing my new pages on both.
Are there any conclusions to be drawn from that?
Is it safe to say that the update is an algo change only, not a change in the data being used?
I just had a call from my son. He told me that his college has changed the default search engine from Google to MSN.co.uk. Does that tell you something?
Maybe the college's IT manager is moonlighting with a string of scraper sites that got nuked by Google? :-)
What? You mean you don't send out any of those drug-related solicitations I get in my email every time I check my new messages?
I tell you, I'm shocked, totally shocked ;).
I see things are still churning at the data centers; time to work on more HTML file-size reductions and other stuff.
Analysis of the index update, filter tweaks, and algo changes (if any) will prove interesting. I tried to track a couple of pages yesterday, but I couldn't get a solid picture.
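For anyone else trying the same comparison, here's a rough sketch of the kind of check I mean (Python; the IPs are the data centers mentioned in this thread, the query and domain are placeholders, and scripted queries are against Google's terms of service, so treat it purely as an illustration):

import urllib.request

# Data-center IPs mentioned in this thread; in the circa-2005 setup they
# answered plain HTTP if you sent the right Host header.
DATACENTERS = ["216.239.39.104", "216.239.37.99"]
QUERY = "your+key+phrase"    # placeholder
DOMAIN = "www.example.com"   # placeholder

for ip in DATACENTERS:
    req = urllib.request.Request(
        "http://%s/search?q=%s" % (ip, QUERY),
        headers={"Host": "www.google.com", "User-Agent": "Mozilla/5.0"},
    )
    with urllib.request.urlopen(req, timeout=10) as resp:
        html = resp.read().decode("latin-1", "replace")
    # Crude check: does the domain appear anywhere on the first results page?
    print(ip, DOMAIN in html)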
============
"Is it safe to say that the update is an algo change only, not a change in the data being used?"
<notSEdependant> Maybe we'll call it "SED" (Search Engine Diversification) or "SOS" (Search Operability Strategy). Someone can start SearchEngineDiversificationWorld dotcom (Brett?) </notSEdependant>
Ultimately, I'd like to see more competition in the search game; after all, the economics of search (millions of websites/businesses distributed to the world through a few major players) have spawned enormous advertising dollars, driving some webmasters to corner traffic for advertising rather than for end users. At the end of the day it's all about traffic = money!
An interesting model would be "PPB" (Pay Per Bot): you get one-year express inclusion in our robots.txt to spider all our pages for only $299 (recurring annually in subsequent years to update your SERPs). Too hard for webmasters to manage? No problem; hosts and ISPs can be the intermediaries and do a revenue share. :-)
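Tongue in cheek, but the mechanics would be trivial: robots.txt already supports per-bot rules, so a "paid inclusion" file might look like this (the bot name is invented for the joke):

# Paying crawler gets the run of the site (an empty Disallow means "allow all")
User-agent: PaidBot-2005
Disallow:

# Everyone who hasn't paid stays out
User-agent: *
Disallow: /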
-------------------
I hope you've got this right, Walkman. If they do merge the 2 indices, Google will actually have things bang on track.
My site has been at the top of the SERPs for most of my major terms for years, and has ALWAYS been #1 for mycompanyname.
Perhaps the answer is lost in the 24 pages of this thread, but does anyone have ideas about what happened with the update to kill searches for mycompanyname? I see a lot of people mentioning this problem.
by Walkman: "First they left the 302 redirect issue unsolved for months"
Walkman, please explain what you mean by that.
I have several high-traffic sites where G refuses to count the traffic because it comes from redirects using .htaccess, i.e. 1,700 visits/day but fewer than 100 ad impressions are counted, according to my URL channels.
An explanation would be appreciated regarding redirects. I am not even sure what a 302 redirect is; does that mean traffic going to a non-existent webpage, which is my main issue?
P.S. G does not explain why my impressions are not counted, other than canned mechanical replies which are irrelevant or not applicable.
[edited by: trader at 6:22 pm (utc) on Feb. 8, 2005]
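For what it's worth: a 302 is an HTTP status code meaning "moved temporarily," as opposed to a 301 ("moved permanently"). Because a 302 claims the move is temporary, a search engine may keep the old URL in its index, which is the root of the hijacking complaints in this thread. In .htaccess terms, assuming Apache's mod_alias (the filenames are made up; you would use one directive or the other, not both):

# "Redirect" without a status code defaults to 302 (temporary)
Redirect /old-page.htm http://www.example.com/new-page.asp

# An explicit 301 (permanent) is usually what you want for pages
# that have genuinely moved
Redirect 301 /old-page.htm http://www.example.com/new-page.asp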
- The title in DMOZ is important
- The links in DMOZ and other BIG directories are important
- The age of the links is important
- The natural increase of link is important
Several sites increased their visitors despite having lower rankings for their targeted keywords.
I have never known where they come from, because I have never done anything to redirect a page; it's either there or deleted.
BUT I saw a comment in an earlier post that may just have shed some light on it, and I want to know if anyone can answer my question... I am using a custom 404 page so that if anyone comes to my site from an outdated search engine link, or a direct bookmark to a deleted page, they are still held within my site and not given some generic error message. Is THAT where the redirects are coming from? If that is the case, could I consider that value to be the number of times someone is trying to access pages that no longer exist?
BTW, trying to stay on topic: I have my Google filter set to show English pages only, so I know what I get is not my true position in the SERPs, but I went from averaging about #38, then down in the 80s for several weeks, and then last night I was up to #24. However, I have not yet seen any increase in traffic from Google; it is still a trickle.
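Re the custom 404 question above: quite possibly, depending on how the custom page is wired up. On Apache, at least, the ErrorDocument directive behaves very differently depending on whether you give it a local path or a full URL (hypothetical filenames):

# A local path: Apache serves the custom page itself and still returns
# a genuine 404 status; no redirect is involved.
ErrorDocument 404 /notfound.asp

# A full URL: Apache cannot serve it internally, so it sends the
# visitor a 302 redirect to that URL instead of a 404.
ErrorDocument 404 http://www.example.com/notfound.asp

If yours is the second form, every hit on a deleted page shows up as a 302, so that count would roughly equal the number of attempts to reach pages that no longer exist.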
And if they have problems with there (sic) capacity servers (sic) how come they're devoting processing power to updating the directory?
I hope this running update improves the SERPs a little, and there are some small signs of it, but...
They should simply spend more of their money on the SE as their core business, instead of opening a new shop along the way every week.
Their crawling has been falling short for over half a year now, at least for me. MSN is leading ahead of them, followed by Yahoo; Ask also seems to have decent capacity.
It looks as if Big G has been devoting too much of its computing power to AdWords/AdSense lately (those SERPs were sometimes described here as better).
I'd drop AdSense on my site to give G some resources back for the SE. :-) Believe it or not, this month they crawled my site earlier and more...
In terms of computing, G has to make a real commitment (invest money!) to breaking the 32-bit barrier. That shouldn't be such an issue for a technology company.
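For context, the "32-bit barrier" people keep mentioning is the theory (speculation, not anything Google has confirmed) that the index uses 32-bit document IDs, which would cap it at about 4.3 billion URLs. The arithmetic, in Python:

# An unsigned 32-bit integer can address 2**32 distinct document IDs.
max_docs = 2 ** 32
print(f"{max_docs:,}")  # 4,294,967,296 -- roughly 4.3 billion pages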
The problem I was referring to is detailed here: [webmasterworld.com...] (close to 400 messages :))
- The age of the links is important
I think Google has been factoring in the age of links for nearly a year now. It is why, IMO, new sites have been sandboxed since early last year: their links aren't "counted" until they have aged a certain amount of time. It would explain why sandboxed sites are indexed and can rank well for obscure terms (the links are recognized and followed, and the sites are indexed, yet don't need the power of links to rank for obscure terms), but don't rank for competitive terms (the power of links isn't applied because they are too "new").
The occasional exception would probably be explained by links from whitelisted sites, in which case those links would be applied immediately.
G did this to counter link buying (no bang for your buck), and it negates the short-term, large-scale link exchange campaigns meant to influence SERPs.
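To make the theory concrete, here is a toy model of the idea in Python. This is not Google's actual algorithm; the threshold is made up, and the whitelist flag stands in for the exception mentioned above:

from datetime import date

SANDBOX_DAYS = 270  # made-up aging threshold; the real value (if any) is unknown

def effective_link_weight(raw_weight, first_seen, today, whitelisted=False):
    """Toy model: a link contributes nothing until it has aged past the
    threshold, unless it comes from a whitelisted (trusted) site."""
    if whitelisted:
        return raw_weight
    age_days = (today - first_seen).days
    return raw_weight if age_days >= SANDBOX_DAYS else 0.0

# A freshly bought link counts for nothing toward competitive rankings...
print(effective_link_weight(1.0, date(2005, 1, 15), date(2005, 2, 8)))  # 0.0
# ...while the same link from a whitelisted site is applied immediately.
print(effective_link_weight(1.0, date(2005, 1, 15), date(2005, 2, 8), whitelisted=True))  # 1.0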