Forum Moderators: mack
I can see a site at position 2 with 23,000 cookie-cutter pages that just can't be detected because it's all on-page. And there is no chance that Google or any other search engine will be able to find a reason to boot it!
If you're one of those people saying "no such thing as a sandbox" then in my opinion you're lucky. I've always said that the likelihood of a domain being sandboxed was tied to the prevalence of AdWords and the competitive strength of the keywords. Or put another way, "does the keyword sound like it will get you lots of positions?" If so, don't expect results. If the domain is new, forget it without a link from a PR7.
The funny thing is that when Google decided to dampen its results, giving less predictability to the SEO masters who traditionally got good rankings from Google through on-page algorithmic SEO, what they basically said was "you will not get positions for strong keywords". To which the reply (if you check the volumes of pages in the indexes) was "fine, we'll just create 100,000 pages on all the secondary keywords and have Googlebot crawl those instead". Which it does!
I still use Google every day, but my partner tried to look for a hotel in Italy the other day. "All I found was popups and affiliates." That's lucky: most internet users will just book through an affiliate/white-label site, not knowing that the best deal can usually be found on the supplier's site directly. But if you go looking for the suppliers in the search engines you won't find them, because they don't start coming up until page 5.
Round and round we go...
I have launched many sites over the past couple of years and seen almost all of them do very well in search results (Google), all getting PR as expected. But one site I launched almost 5 months ago still has PR0 and is only showing for very irrelevant keywords, many pages down. There is certainly some kind of filter applied here, as the site has many unique pages, optimised like all my other sites, and has over 5K backlinks from PR5-PR8 pages (and has done for over 4 months). Now I know Google has indexed about 10K pages from this site, BUT because it has PR0 the results are very bad. I have to admit the site does have popular keywords in the domain (and a .org), and this has got to be the reason for the problems. But I still have faith it will do well sooner or later... sooner is best for me. BTW... it's number 1 in MSN beta for the most competitive keywords!
To me this is proof of some kind of filter being applied, but is it the sandbox everyone is talking about? Who knows.
If a site was worth being up, normally it was put up over a year ago
Where do you get this? Look in the phone book for any category and see how many businesses have web sites. I think we have a long way to go with new site building.
When searching for "<cityname> plumber" does it matter if you get 20 or 200 good options? From a user perspective, I think not.
CF
Of course it matters. I might not be as picky about a plumber as I am about a remodeler, but the reason I'm using the web instead of a phone book is that I want more info on who I'm hiring than just a lousy phone number.
I think the best thing the MSN beta has going for it is the ability to answer the question, "Tell me something I don't already know." With Google, if you do a search in a specific field, e.g. movies, it's always the same long-established sites that have been around forever showing up front and center in the results. As people continue to search the web for info on a specific film, and continue to get results from the same sites, there's always the risk that those results won't be what people are looking for, because they already know of these old sites and are looking for a new voice in the field, and Google just can't provide that because of its sandbox.
As the internet grows older, and people become more and more savvy at searching it, they will want to know of alternate places to get their information on a given subject, and since MSN's beta is able to do it, they can and will attract searchers.
I really get the impression that a portion of the pro-Google crowd don't like the MSN beta results simply because their site doesn't place well in Microsoft's search, yet places well in Google. The thing is, why would MSN want a search engine with results exactly the same as its competitors'? What MSN is doing with their new search is offering a very viable alternative to what Google has to offer, and in order to be an alternative it has to be different. Thankfully, it is also managing to be relevant thus far. There are obviously people trying to work the system, but from the searches I've done so far, it doesn't seem nearly as bad as some of the people here crying bloody murder are claiming.
I didn't rank until I signed up for Google's AdSense program. Now, instead of not showing up at all, I'm on page 50 for my product's keywords. First page of every other search engine, but not Google's. :-/
Several of my sites rank highly on Y and G and also on beta.search.msn.se, but since they are in English almost nobody searches for the keywords.
These otherwise high-ranking pages just cannot be found using beta.search.msn.co.uk or beta.search.msn.com.
I dearly hope that MSN beta search will tone down their weird concept of geographical targeting. I do believe in the future success of MSN search, but it will mean that I might have to get UK and US based mirrors.
I dearly hope that MSN beta search will tone down their weird concept of geographical targeting
I don't think that's likely - quite the opposite, in fact. They are going much further than that on the US side, and eventually the UK side too I would guess, gradually bringing the "near me" option more into play.
So just getting a US and a UK server won't in itself be enough. You'll potentially want mailing addresses in every city for some types of business! OK, probably not that necessary - but I'm quite keen on the idea of a user in Birmingham seeing different results from a user in London, not necessarily from an SEO point of view (although I can make money out of that) but from the user's perspective. I like the idea that bricks-and-mortar businesses have their "place" on the web, and that "place" relates to where they are geographically. With a bit of luck, I'll get to work for a major bank, optimising every branch's web presence as well as the main brand. (Or a pizza company or shoe shop chain - I'm not fussed.)
Anyroad - for me, the geographical "mood" seems a logical step. The Internet has become too big to be without some form of categorization, and since the death of directories as a business model, geographically dependent results seem a positive step forward.
Really they are just small paid-for directories, and the vast majority of sites using AdWords offer a good service, so they are very useful for finding companies that sell what you are looking for.
I think that while search engines are spending a lot of effort filtering spammy sites out, having one search for online shops and catalogue-type websites and a separate search for just information could perhaps improve the SERPs.
I either search for information or products.