But in my SERPs there are a few new spammy sites and pages really creeping towards top places, and they are from huge/massive affiliate sites.
Going back to the question of "The End of Large Sites?"
I am not seeing changes like this in my areas; the large sites still rule for most of the keywords, even if they are spammy, PR-generating directory product sites.
For example, do a search for information and all too often you get amateurish gibberish that is literally someone's first site, proudly stated on their homepage.
As you wade through the dross looking for information or suppliers you end up with little real choice but to click the adverts.
My concern is the sheer number of window-shoppers who wouldn't buy from ANY site on their first pass.
Before, speaking simply as a surfer, you had an interesting mix and could pick the one that looked relevant. Until recently when I got into this stuff, I was window shopping via the adverts. I don't now as I feel guilty costing someone money simply for me to browse.
Google seems to be the big lovable friend of the surfer but I'm starting to seriously dislike them now, both as an affiliate trying to earn an honest bob and as a surfer.
They're fast but seem to bring up a lot of drivel and out of date cached stuff.
To me they're becoming a threat, even though I use AdSense. They already have a specialized shopping section, complete with price comparison. What next, expert content on the main search queries? Big industry will happily pay for ads on it, but if you don't make it yourself, could you afford to play at that level?
Yes, it's the natural progression of any market, but frankly I preferred the internet as it was. Part of the fun was not knowing what you would find; soon we'll know: Walmart, Vodafone, British Airways...
If Google starts doing content we may as well go home. They state on their site that they do search and that's what they do... yeah, and browser toolbars and email and shopping comparison...
One word, starts with M, ends in Y, is this healthy for us?
Pibs
Yesterday, no significant changes. Last night a site of mine went from #3 to #2.
Today it is at #4.
That all seems pretty normal. It is the other sites showing in the top 10-20 that reveal the more significant changes. There I see sites that haven't ranked since Florida.
Two out of the top six for one keyword do not contain the word on the page.
#12 is a mirror of #2 (which came out of oblivion for this term). Closer examination reveals that it is a development page that was never taken off the server. I doubt that G sees it as duplicate content, because there is literally NO content: it is an image-based page with less than 25 words of text in total.
A .pdf from an .edu site at #7.
A brochure site with a bazillion purchased links at #3.
Many of these new arrivals were the very same sites/pages that ranked so well after the Florida debacle.
Looks to me like the knobs dealing with back links and with semantics (i.e. synonyms and stemming) got twisted pretty good in the past few days.
WBF
It's hard to imagine the philosophical mindset involved in deciding to try to benefit below average quality sites.
I don't believe this could be possible, but what would appear to be a perfectly logical explanation for this lightweight stuff is: Google is believing its own nonsense backlink info. High-quality, authoritative linking doesn't help, but get a link from your cousin's blog and, woohoo, you move up five slots.
====
At this point I'm guessing the keyword in URL and (the always useless) words on page knobs have been turned way up, particularly the latter. The latter generally is good at picking up relevance but is utterly worthless in recognizing quality.
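To make the "knobs" metaphor concrete, here is a toy sketch of how turning the keyword-in-URL and words-on-page weights way up would let thin exact-match pages outrank substantive ones. This is pure illustration; the signals, weights, and URLs are invented and have nothing to do with Google's actual algorithm.

```python
# Toy model of the "knobs" metaphor. The signals and weights below are
# invented for illustration; this is NOT Google's algorithm.

def toy_score(url, body, keyword, w_url=3.0, w_body=1.0):
    """Score a page from two 'knobs': keyword-in-URL and keyword density.
    Neither knob measures quality, only keyword matching."""
    url_hit = 1.0 if keyword in url.lower() else 0.0
    words = body.lower().split()
    density = words.count(keyword) / max(len(words), 1)
    return w_url * url_hit + w_body * density

# With w_url turned way up, a thin exact-match page beats a real guide.
thin = toy_score("http://example.com/widgets.html", "widgets " * 20, "widgets")
guide = toy_score("http://example.com/guide/",
                  "a long detailed guide that mentions widgets once "
                  + "more text " * 50, "widgets")
print(thin > guide)  # prints True
```

The point of the sketch: neither knob can tell a 20-word doorway page from a genuine guide, which matches what several posters are seeing in the SERPs.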
I agree. The words in the URL and on the page are carrying way too much weight. I am seeing top-ten results full of mod_rewrite keyword-generated spam.
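For anyone unfamiliar with the pattern being described: the usual trick is an Apache rewrite rule that funnels every requested path to one script, which then turns the URL itself into the "page". A minimal sketch in Python follows; the handler name, URL, and markup are invented for illustration.

```python
# Sketch of mod_rewrite-style doorway spam: e.g. an Apache rule like
#   RewriteRule ^(.*)$ /page.py?path=$1
# sends every request to one handler, which templates a "page" out of
# whatever keywords appear in the requested URL. Illustrative only.

def doorway_page(path):
    """Turn the requested URL into an instant keyword page, so the
    keyword shows up in the URL, title, and body with no real content."""
    keyword = path.strip("/").replace(".html", "").replace("-", " ")
    return (
        f"<title>{keyword} - best {keyword} deals</title>\n"
        f"<h1>{keyword}</h1>\n"
        f"<p>Looking for {keyword}? Compare {keyword} prices here.</p>"
    )

print(doorway_page("/cheap-blue-widgets.html"))
```

Every URL anyone links to or types in becomes a fresh "relevant" page, which is exactly why a words-in-URL knob turned too high floods the top ten with this stuff.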
Yes, I think I understand what you mean, but I don't understand why you have to go through such a complicated procedure.
Suddenly in August a bunch of new sites appeared from nowhere and the remaining sites from the original dozen were radically reshuffled.
It seems to me that the new sites are very variable in quality and one has only a single page on the keyword (and the content is, speaking as an expert on this subject, written by a crank and is highly misleading). Another has a spammy title with the keyword repeated five times. Definitely a fall in overall quality.
Sites with 50 to 150 pages mentioning the subject (even if the mention is just a link to a single page) are doing better, while sites with a comprehensive guide to the keyword (including mine, with over 400 pages) have moved down, although only a few places. So I'd suggest that in this niche, at least, the trend is towards medium-size sites and away from larger, more comprehensive ones.
Google traffic has more than halved from July to August, although my overall traffic has only dropped by around 5%.
I think something's going on. It may not have affected everyone, but in my field it's a real shakeup, and not one that I favor, not just for personal reasons but also thinking of ordinary people looking for quality information on my topic.
I think there is a misconception about the 'theme' aspect in relation to bigger sites. Not every page on a huge site is aimed at search engine rankings; most are for users of that site, with a much smaller percentage aimed at ranking highly on search engines.
The pages that I have aimed at search engines do very well, the pages for users of my site don't rank so well but I'm not expecting them to. So I'm sure that smaller sites DO rank much better but that is because they will all be on one topic and all optimized. The question of this thread is only relevant to the aim of the webmaster. My site is user orientated first and search engine optimized second.
Point of order, Amazon seem to be doing just fine ....
After August Shake-Up: Considering Smaller Sites?
I think you're looking in the wrong place. I don't think there's a question of scale.
A site's internal anchor text will always have an effect, and I'm sure that Google discounts links from one page to another on the same domain.
But beyond that, and something in the algo that has a method of identifying the root homepage of each domain, I can see no evidence that Google even considers "sites".
It's always been about pages in Google, and I can't see that that's changed.
More likely, imo, the answer is that there's been a tweak relating to the link structure of pages on the same domain. That could quite easily be tripped by an anchor-text filter, for example, in a way that Google haven't yet got a work-around for.
TJ
I have 20 small, focused websites, 15 to 50 pages each. For some of them, the domain names are the names of products in English: very focused on a product or type of product, super-specialized in products that I have sold very well over the last 3 years. It would be hard to think of better domain names for those products.
4 of my erased websites are non-commercial and have no products, ads or offerings. They are original-content websites with no repetition, copying or spam whatsoever: hard work and research interviewing local people. Google erased them!
No spam, although there may be a few small repetitions.
Google erased all my work of 2004: 12 websites, leaving only the home pages. In 3 cases the home pages also disappeared.
Google increases what doesn't exist
I also have sites that were totally erased several months ago, because six months ago I changed their domain names to better ones (more valuable and shorter).
Well ..... Google is still referencing those pages (404s, pages that don't exist) and, believe it or not, has been increasing the number of pages in the last few days. As if Google were digging cobwebbed old hard disks out of its attic.
Can someone here give me an indication of what to do or what to expect from this Super Whimsical Google?
In need of Friends :
I am searching now for friends anywhere. Perhaps if I befriend Super Intelligent Programmers then I can finally solve the riddle. I have the most wonderful domain names in English and Spanish, hard work, talent for writing and for funny inventions.
Friends can learn a lot with me and will become startled with some commercial or intellectual ideas for the future. We can share many things!
My email :
vicenteduque@hotmail.com
Vicente Duque
................... I really can't think what to say ................. anyone?
But beyond that, and something in the algo that has a method of identifying the root homepage of each domain, I can see no evidence that Google even considers "sites". It's always been about pages in Google, and I can't see that that's changed.
This is what I've always thought: Google doesn't rate sites, it rates pages.
There is no such thing as an informative, user-friendly, easily navigable, well-built site with 100k pages. It just doesn't happen, apart from the occasional forum, and I mean *occasional*; even then, most forums are full of drivel as well.
I'm sick to the back teeth of G indexing those sites; Y manages not to, so why can't G get it right? I have nothing against big sites, e.g. WW, the difference being that you can actually learn something from every one of the WW pages indexed in G.
I think that *if* G are favouring small to medium sized themed sites it's a good thing.