| 7:20 am on Jun 2, 2006 (gmt 0)|
On the grounds that I think G has major problems with BD, and I really think the various spin doctors they have posting in the blogs are only giving us half-truths, I'm just getting on with adding content and working on new sites with the same structure I've always had, and hoping that one day soon, whether it's in 1 month or 1 year, G sorts itself out.
| 8:47 am on Jun 2, 2006 (gmt 0)|
Yes, agree with tigger. You just have to keep on creating and adding.
If you (or your clients) have a site that has bombed in the G serps, keep working on it but scale down activity, imo, till it comes right (though it may never happen). Spend the rest of your time building new sites. If, like me, you also have one or more older sites that may have been unaffected by BD, time spent on improving/adding content will certainly pay off.
| 8:52 am on Jun 2, 2006 (gmt 0)|
That's what I'm doing, DM: building more sites and hoping the old pages come back in.
| 11:49 am on Jun 2, 2006 (gmt 0)|
"the dropped page issue"
Some good news about dropped pages: about 2 weeks ago I realized that 50% of my pages were dropped (a 6-year-old site, original content); yesterday, all the pages were back. :)
Creating new sites and the myth of the Sandbox:
Just 2 months ago I created about 4 sites for my clients, each with only 2-3 very related, on-topic links on its pages (PR3-4). Within the first 2 weeks after publishing, those sites ranked in the top 1-9 SERPs for their 2 targeted keywords. No bad SEO tricks, nothing but original content and a couple of links so Google could spider them. All of those sites are 24 pages at most. Does that tell you something? In my opinion, that is what Google wants: small sites on topic, not heavy SEO.
| 12:01 pm on Jun 2, 2006 (gmt 0)|
What I see after (and still amid) all this mess: the non-www issue is solved, but the sites hurt by the earlier hijacking / 302 Google bug are still without any ranking. They do have PR on the toolbar, but it has no effect.
I think this is so unprofessional for a company like Google. First they added all those fake sites/files to their search just to get over 8 billion "sites" in the index, and then the troubles started for real. You know, I don't even want to continue this message, it's just so unprofessional.
| 3:56 pm on Jun 2, 2006 (gmt 0)|
The situation with Google not adding pages to their index, or dropping previously indexed pages, is complex, and it is unlikely that one cure-all will resolve everyone's problems. I have a relatively new site (started in Feb. of this year) that experienced the typical problem of having 10,000+ pages indexed and then dropping to about 100. This is really just a hobby genealogy site, but for the site to have any value to others, the names, birth dates, and locations must be searchable.
After many weeks of continued loss of pages, regardless of the clean-up activities recommended so frequently on this board, another approach was taken. The site's PR is low (in the 2-3 range on the PR tool), but there are about 150 inbound links and the site is crawled continually by Google. The primary pages that contain the ancestor information were 3 clicks away (level 4) from the home page. Observation indicated that almost all of the upper three levels were indexed in Google, but my most important 4th level contained over 10,500 pages, almost none of which remained indexed.
The navigation structure of the site was changed to move the 4th-level pages to the 2nd level. This seriously exceeded the 100 links per page recommended in the Google guidelines, but quite frankly I had little to lose. Within a week my site moved from having about 150 pages indexed to having 10,600 real (searchable, not supplemental) pages indexed.
I know this approach flies in the face of some of the hard-liners' recommendations and is not easy or even possible for everyone to follow. However, it worked wonders for my site, turning it from a worthless, unfindable site into one that ranks high on Google for all the ancestor names on the site, and it doubled traffic from Yahoo and MSN in one week. I'm sure mileage will vary.
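The effect of that restructuring can be sanity-checked with a short script: model the site as a link graph and compute each page's click depth from the home page with a breadth-first search. The URLs and link graph below are hypothetical, purely for illustration, not the poster's actual site.

```python
from collections import deque

def click_depths(links, home="/"):
    """BFS from the home page; returns {url: number of clicks from home}."""
    depths = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

# Hypothetical deep structure: the ancestor page is buried 3 clicks down...
deep = {
    "/": ["/surnames/"],
    "/surnames/": ["/surnames/a/"],
    "/surnames/a/": ["/surnames/a/smith-1820"],
}
# ...versus the same page also linked straight from the home page.
flat = {
    "/": ["/surnames/", "/surnames/a/smith-1820"],
    "/surnames/": ["/surnames/a/"],
    "/surnames/a/": ["/surnames/a/smith-1820"],
}

print(click_depths(deep)["/surnames/a/smith-1820"])  # 3
print(click_depths(flat)["/surnames/a/smith-1820"])  # 1
```

Running something like this over a real crawl of your own site shows which pages sit at level 4 or deeper, the ones the poster found were being skipped.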
| 4:07 pm on Jun 2, 2006 (gmt 0)|
I'm seeing more and more sites with 100s of links on their index pages trying to get better crawling of 3rd/4th-level pages, and although it does look ugly, it does seem to work. It's a route I may well consider soon.
| 4:13 pm on Jun 2, 2006 (gmt 0)|
|I'm seeing more and more sites with 100s of links on their index pages trying to get better crawling of 3rd/4th-level pages, and although it does look ugly, it does seem to work. |
I would be careful in making any determinations based on that alone. Google's index is in major flux right now and I'd be willing to bet that those sites that have added more links to deeper content would have been fine even if they didn't add them. If the content was indexed before, it's there in the database. Google is doing all sorts of shuffling right now and it really pains me to see many chasing that damn algo on a daily basis.
Get away now! Do something more productive. You have absolutely no control over the outcome unless you've really messed something up technically and/or you've been involved in something that presented a level of risk to begin with.
| 4:22 pm on Jun 2, 2006 (gmt 0)|
I'm not making any knee-jerk reactions right now; I'm working on new sites till G looks at least settled - if ever!
| 4:50 pm on Jun 2, 2006 (gmt 0)|
|I'm not making any knee-jerk reactions right now... |
I cannot count the number of times I have had to explain to clients that making constant changes to the SEO is not going to get them better results.
As soon as there is a minor drop in the rankings, the phone rings and I hear, "What changes can we do to counter that?"
Ahhh...it feels nice to vent. ;)
But seriously, the knee-jerk reaction is what makes people crazy, trying to figure out how to 'fix' something that is not necessarily broken.
| 5:07 pm on Jun 2, 2006 (gmt 0)|
Google is always in flux, so you can't wait until Google "settles down" to do anything otherwise you'd never do anything.
However, overreacting to recent movements would be a mistake. Best to be patient for a while before doing anything drastic.
| 5:11 pm on Jun 2, 2006 (gmt 0)|
I know G is always in flux; what I was referring to was the dropping of pages that a lot of us are suffering from. Once that has stopped, maybe we can see a way forward, because this can't go on.
| 5:13 pm on Jun 2, 2006 (gmt 0)|
What's so absolutely frustrating about this is the evasiveness. For example:
Webmasters: Wow, I lost lots of pages from my site I wonder what's going on?
Google: We've got a completely new crawling method ...
Webmasters: That's really not answering my question, is your new crawling method fully operational or are you still working on it? Can I reasonably expect the pages that were dropped to come back into the index soon? Have you changed the "minimum quality" requirements and my site falls below the threshold?
Google: Our new crawling method is much more efficient ...
Webmasters: I don't care if it's more efficient or not I just want to make sure you're crawling my pages and they are getting into the index ...
Google: ... we've changed a significant amount of code ...
Webmasters: ... sigh
I'm personally of the opinion that we're starting to see a 'sandbox of crawling'.
| 5:19 pm on Jun 2, 2006 (gmt 0)|
Matt says crawl depth etc. is dependent on PR etc.
I can live with that and give you all new pages 2nd or 3rd level.
What needs clearing up is:
When, or indeed if at all, are they going to do deep crawls again?
I have tons of good quality pages missing, many of which can take a day to research and produce.
Others which have grown over years!
| 5:21 pm on Jun 2, 2006 (gmt 0)|
Back to basics. It will all get sorted out.
| 5:21 pm on Jun 2, 2006 (gmt 0)|
|Google is always in flux, so you can't wait until Google "settles down" to do anything otherwise you'd never do anything. |
Once I build a page, it's done. It may go through a few tweaks every now and then but for the most part, it is finished. Then I move on to the next page, and the next, and the next...
I equate the chasing of the algo to those who are typically operating with a level of risk and like to be on the edge.
Years ago I learned that chasing algos was way too stressful when targeting long term marketing campaigns. That is why you build a solid foundation first and then look for ways to market the legs of that foundation. Free search engine traffic is one of them. And, it's not what it used to be. You're competing against millions of other pages today (particularly with Google). If you land a top ten spot, be thankful and move on.
The only real thing you can do in instances such as this is sit back and wait it out. Continue to build and improve upon what you have. That is the Golden Rule. Without the content, you don't have legs to stand on.
Forget about what you see competitors doing that you know comes with a level of risk. Unless of course that is your business model, then it doesn't matter. If you're a local business trying to establish an online presence, stay away from the latest fads. 9 out of 10 of them will always come back to bite you in your arse. ;)
Look at expanding your presence on Yahoo!, MSN, Ask. Get out there and pound the pavement if you have to. If you're a local business, there are a plethora of opportunities that await you that will be far more fruitful than watching a freakin' datacenter change by the minute.
If you're a small company trying to compete in a big company space, then you better have an appropriate budget prepared because you will need it.
Are your tentacles spread out way too far? Start bringing them back in and focus on what's around you, not outside your normal realm of operation. Take your budget and spend it wisely. Go after the long tail of search while you're waiting for your site to age.
| 5:56 pm on Jun 2, 2006 (gmt 0)|
I only look at page visits and referrals, so as not to waste time trying to figure out Google search seizures.
Since we've lost at least half our traffic across 8 sites (and in one case about 90% of traffic), and since I have no idea what google is doing, or how to address it, we're simply moving off of the Internet business until such time as things get stable, and predictable.
For us, that means no more large-scale web content development. Basic maintenance, yes, but that's it. I'm lucky because I can pursue revenue outside of the net in ways that aren't dependent on what is ultimately a company that seems to be rather broken and unpredictable, doing completely incomprehensible things in search, AdWords, and AdSense.
| 5:57 pm on Jun 2, 2006 (gmt 0)|
|All of those sites are 24 pages at most. Does that tell you something? In my opinion, that is what Google wants: small sites on topic, not heavy SEO. |
Can anyone confirm or deny this? Is it better to have smaller, on-topic mini sites, instead of broad and big sites?
| 6:19 pm on Jun 2, 2006 (gmt 0)|
I agree that chasing algos is a waste of time, but in this case, people would give their left arm just to get indexed, or to get their lost pages back in.
I've written dozens of pages since Google dried up, and they still haven't gotten in. I've slowed down lately because I am starting to think - what's the point? I write and create and wait, and wait... and wait some more, write some more, wait some more...
...and hope they eventually get indexed by Yahoo, which is, as we all know, notoriously slow at indexing new pages. With MSN, they're in like Flynn: fast.
I think Google just goes from one disaster to the next. So when they fix this "non-snafu" - they will create another one.
These days I just focus on adding new content to the website I think has the best chance of getting it indexed. That site has good TrustRank thanks to all the one-way inbounds from other trusted websites. But even there, it's taking a while to get indexed.
It's very frustrating.
| 6:23 pm on Jun 2, 2006 (gmt 0)|
I haven't seen any evidence for or against that theory, and I have both types of sites.
| 6:26 pm on Jun 2, 2006 (gmt 0)|
I'm not going to change my content or structure. It's technically sound and visitors can get around. To try to alleviate the no-pages-beyond-level-one issue, I am in the process of seeking a few GOOD deep links to my Regions. Then, hopefully, the sub-pages will get picked up.
| 6:33 pm on Jun 2, 2006 (gmt 0)|
Looks to me like BD just raised the bar on what gets crawled: high PageRank, backlinks, and mostly original content.
And the niche-sites theory is very compatible with the above. For existing ones, certainly; not so sure about newly built ones ...
| 6:37 pm on Jun 2, 2006 (gmt 0)|
|The navigation structure of the site was changed to move the 4th level pages to the 2nd level. |
I am investigating this too. Unfortunately, it doesn't always make the most sense to do this - I don't mean for G, I mean for my visitors. I could put an alphabetical list of locations on my home page (all 57 varieties), but I don't think that would make it as easy for users to find what they want. Some of the locations wouldn't make any sense without the context of the parent region.
Some people don't know exactly what region they want. The website is about discovery - "Gee, maybe I'd like Asia, India sounds nice, wouldn't have thought of that one..." This possible visitor may not have been served by sticking India in a big list as a sterile menu item.
Just an example...
| 6:52 pm on Jun 2, 2006 (gmt 0)|
It's scary just how much people and portions of the web rely on Google.
| 7:00 pm on Jun 2, 2006 (gmt 0)|
|Since we've lost at least half our traffic across 8 sites (and in one case about 90% of traffic), and since I have no idea what google is doing, or how to address it, we're simply moving off of the Internet business until such time as things get stable, and predictable. |
All this based on 30-60 days of unpredictable Google behavior? That might be a bit of a knee-jerk reaction to something that is probably going to correct itself in time, hopefully. I know, it's that "hopefully" part that most are concerned about. But 8 out of 10 times, things clear themselves up "naturally".
| 7:08 pm on Jun 2, 2006 (gmt 0)|
Consider using a div to place a link to a sitemap, with those important pages listed first on the sitemap. By using a div, you can position your sitemap link at the bottom of the rendered page but have it near the top in the code.
Food for thought...
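A minimal sketch of that idea (the id and file names here are made up for illustration): the sitemap link comes first in the HTML source, where crawlers encounter it early, while CSS positions it at the foot of the rendered page.

```html
<html>
<head>
  <style>
    body { position: relative; }
    /* Rendered at the page bottom, though it comes first in the source */
    #sitemap-link { position: absolute; bottom: 0; }
  </style>
</head>
<body>
  <!-- First link in the source: crawlers reach it before the main content -->
  <div id="sitemap-link"><a href="/sitemap.html">Site map</a></div>
  <!-- ...main page content here... -->
</body>
</html>
```

Keep the link visible and clickable to users; hiding it outright is exactly the kind of trick that tends to come back to bite you.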
| 7:08 pm on Jun 2, 2006 (gmt 0)|
frederico> looks to me like BD just raised the bar of what gets crawled: high page rank, backlinks, and most original content
Original content doesn't seem to be a factor, in my experience. Hand-written HTML pages aren't getting spidered. I've written a few since the start of Bigdaddy, and they are simple: original content, with only tiny common headers and footers in the code.
It's not PR either (maybe): these new pages sit at the same level as established pages with PR3. Surely that's enough for at least a quick peep?
As for backlinks: it's true my new pages don't have external backlinks, but that's arguably because nobody knows the pages are there (they're not in Google etc.).
Someone posted in some other thread they thought this was to do with homepage PR (or equivalent) and the depth GG were willing to crawl as a result. That argument makes a lot of sense to me.
It seems there are "factors" currently stopping Google from crawling new pages of mine that are more than one or two links from the index page.
I'm sure that someone, somewhere on the web will have written something half useful that deep on their site over the last couple of months. I'm sure Google have realised this information could be useful to their visitors and might get around to spidering it eventually.
I'm not going to suggest Google is broken, but I suspect things aren't going quite the way they want and that they might hopefully be doing some adjustments soon.
| 7:30 pm on Jun 2, 2006 (gmt 0)|
I am still building pages and adding products. Google is in a lot of turmoil right now, and I am more than sure this will soon affect the number of searchers.
I am building my sites to rank well in MSN and Yahoo, and I am giving up on Google.
If Google wants to have sites evaluated on links and PageRank instead of content and site themes, then they are going to be the ones to lose in the long run.
It would help if Google updated links and PageRank more often, since Big Daddy depends on them. Too many pages are being dropped that the other search engines have indexed and deemed worthy.
| 7:54 pm on Jun 2, 2006 (gmt 0)|
2 more cents.
From 65K pages pre-Big Daddy to 1 page. But I still maintain 50K supplemental pages, all of them bottom-level pages. Mid-level pages (levels 2-4) are just missing.
Many sites, same style, same result.
Also, useful for checking supplemental:
Set preferences to English only.
Supplementals will disappear.
(G must code sups as a separate language.)