Forum Moderators: open
What you wrote was probably true before this last update. I have built many real estate sites which target not just city real estate terms but a broad range of terms. The sites attempt to be the most informative resources for everything real estate and everything about a particular market. These sites were filtered by Google's Florida algo. Content doesn't matter anymore. The sites which dominate lack content. I do not believe DMOZ is a content site; it is no better than an over-glorified link page. The SERPs are now dominated by these types of sites with no real content, just links to sites which do have content. We have to face the facts: Google is deliberately filtering based on money terms. Google determines what is a directory and what is not. The algo works simply by displaying directory listings for commercial terms. The user will not notice the difference; for the most part they will just either click on the AdWords links or make an extra click from the directory site's page. So Google keeps its sense of relevance and increases its AdWords revenue. It's a win-win situation for Google. Of course, those of us who strove to build excellent resources get penalized for our hard work.
In order for us to beat Google at its own game we have to learn how it determines what is a directory and become the best directory for a search term.
So what do we know about directories:
- Directories have many links out.
- Directories are categorized (there is an observable structure to these sites)
- Directories, like news agency, education, and government sites, have content on a broad range of topics and usually only a small section on the relevant topic.
- These types of directories target the city as their theme and real estate as a topic within that theme.
- Directories have strong PR, with anchor text based on the city term rather than on real estate or commercial terms.
- The sites a directory links out to are very relevant to a search term. Sites like DMOZ have a purpose because they provide good collections of links, usually to good sites with 99% relevance to a categorized topic, and they are human-edited to maintain this high level of relevance.
- Directories have links from other authority sites.
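The traits above can be folded into a rough heuristic. This is purely my own guesswork at the kind of signals involved; the thresholds, weights, and function are invented for illustration and are not Google's actual algorithm:

```python
# Toy "directory-likeness" score based on the traits listed above.
# All thresholds and weights are invented for illustration.
def directory_score(outbound_links, internal_links, body_words, has_category_structure):
    score = 0
    if outbound_links > 20:                # directories have many links out
        score += 1
    if has_category_structure:             # observable categorised structure
        score += 1
    if body_words < 200:                   # little unique content of its own
        score += 1
    if outbound_links > internal_links:    # links out dominate internal links
        score += 1
    return score                           # 0 = content site, 4 = very directory-like

# A DMOZ-style category page vs. a typical content article:
print(directory_score(80, 10, 50, True))    # scores as directory-like
print(directory_score(3, 40, 1500, False))  # scores as content-like
```

If something like this were in play, "becoming the best directory for a search term" would mean deliberately pushing a site toward the high end of such a score.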
2) city-hotels-travel-x-y.com/city-hotels.htm kind of sites, which literally have no content on the page except a few outbound links to the same webmaster's sites.
3) A page ... yeah, I mean a PAGE ... which has a few lines about the city, and that is crap too! And this page is hosted on free hosts like lyc*s, trav*l.to/ , etc.
Only 2-3 sites out of the first 20 deserve to be there from a normal searcher's point of view.
The Google SERPs remind me of my friend who, when asked in his exam to "write a sentence using the word 'satiate'",
wrote: "satiate is the only word in the world which I don't like".
LOL... and he actually used the word satiate to make a sentence!
IMO if you apply what Claus is saying to your post, then the points he made are very valid. My webmaster believes that a themed directory structure is more important than ever. If a web site is constructed like a directory, then Google should thrive on its relevance, as it will make it easier for the bot to categorise and associate content.
I hope!
I was suggesting the same thing: theming a directory is very important. But the majority of sites are not directories. One should not confuse a directory with content. Directories for the most part just link to sites that have content. The content sites generally have few links out and provide the user with a complete resource on a theme. These sites have been killed by Google, though they were once the dominant sites in the SERPs.
What I was suggesting is that content sites need to order their file structure to aid Google with its relevancy calculation, and that the directory structure seems to work well. One can do this by making sure that sub-pages in a topic are related in some way to the main topic and are named correctly to achieve the desired association. I'm no whizz with file structure, so I hope you can see what I am getting at!
Useful content sites, imo, should also allow users to visit other sites that contain useful information on related topics.
Topic a - subtopic a.1 - content blurb a.1.1
topic a - subtopic a.1 - content blurb a.1.2
etc.
Generally you don't cross-link from one topic or subtopic to another though perhaps at the content level you might when you need to make a cross-reference to another article.
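To make the naming idea concrete, here is a tiny sketch of the structure described above. It is my own illustration; the topic names and file names are made up, not taken from any real site:

```python
# Hypothetical themed site: topic -> subtopic -> content pages,
# with each file name echoing its topic and subtopic for association.
site = {
    "widgets": {
        "blue-widgets": ["blue-widget-care.html", "blue-widget-sizes.html"],
        "red-widgets": ["red-widget-history.html"],
    },
}

def themed_paths(tree):
    """Yield /topic/subtopic/page paths, keeping each branch on-theme."""
    for topic, subtopics in tree.items():
        for subtopic, pages in subtopics.items():
            for page in pages:
                yield f"/{topic}/{subtopic}/{page}"

for path in themed_paths(site):
    print(path)
```

Every URL then repeats the theme at each level (e.g. /widgets/blue-widgets/blue-widget-care.html), which is the kind of naming association the posts above are suggesting a bot could pick up on.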
This could be part of the solution to ranking well in Google for some, but my selling site cannot have this structure except perhaps if I make a section on a related topic. The rest of my site is just about one type of product so I lack the product diversity to give googlebot much deep content to look at.
Personally, I think this is a red herring and the real answer lies within Google Sets.
Agreed. One of my real estate sites fell from the top 10 to the mid 100's. I did nothing but remove some extensive crosslinking around November 18th. It's now back in the top 3 for my two main kw phrases. The sites I crosslinked were all top-5 pre-Florida real estate sites and haven't moved at all. (One item of note is that -gv now reflects the reduction in those crosslinked backlinks, but PR and SERPs haven't changed.)
My small (5k) main page is mostly text with 3 outgoing links. One of the 'city2 homes' SERPs has me back in the top 3. What's interesting here is that the term 'city2 real estate' ONLY appears in the title of the frameset and the nav bar. No stemming here!
Also, while this site is specific to a west coast area, I am also #9 for an east coast city real estate company because of one outgoing link.
Claus, excellent post. I will spend the day digesting it, but at the moment I still tend to agree with Allen that this is more of a skewed filter.
So regarding the thread topic, I have no clue how I got back, but I am not complaining, as the rest of the SERPs around me are exactly what powdork listed, so being one of the two 'city real estate' specific listings on page one has been fabulous. Busiest December ever! But if this is Google's version of stemming, I hope they don't bet the house on it. I won't be.
Here's some info for you guys to digest, make what you will of it:
In an earlier post I mentioned I would eliminate the tiny image links whose only function was to tell Google that the pages existed. Obviously those links could have been interpreted as spam, although they were all internal and pointed to valid web pages which were linked to through JavaScript.
The 30 images or so had file names like "mycity-1.jpg", "mycity-2.jpg" etc.
A pre-Florida search for "kw1 kw2 mycity kw3" had me at number 1.
A post-Florida search for "kw1 kw2 mycity kw3" had me at number 190 on www2 and www-in, but not on www.
A search now for "kw1 kw2 mycity kw3", after removing all the "mycity-30.jpg" images, has me at number 96 on www.
The amazing thing is I'm almost happy to be back in the top 100 and ready to "forgive" Google for 6 weeks of stress and lost business.
The operative word is "almost" however.
I'm not counting on Google anymore; I'm counting on us sharing knowledge here and rising again. The filter has fixed parameters that can and will be diagnosed; it's a matter of time.
I think there is a real difference between "real" rankings and the normal "freshbot" rankings. I can only assume that Google's new algo is very processor-intensive. I think they add new pages all the time according to their standard algorithm... then, during the big update, everything is sliced, diced, filtered, penalized, and spit back out...
Anyway, my new pages are all ranked high using the exact same formula as pre-Florida. I never bother to change old pages... it is a waste of time. I make new sites/pages. The new sites and pages are doing fine. They are all built according to the standard... title tag, h1, keyword backlink, etc. There is really no magic at all to it. But again, I do not know how they will fare when Google does a real update. It may be that I have to create new sites/pages as fast as Google takes them down.
For one of my keywords there is a site ranked 10th. They have 8 words of text and around 20 banners on their home page. They have 11 pages in the URL and 17 backlinks with a PR of 5. How is this possible - and ranked 10th?
COME ON GOOGLE WAKE UP!
No content no content!
Content is King?
It seems like every time I put more content on my site it goes backwards. I put another 150 words on my site and went from 20th to 65th yesterday.
I don't get it.
Brew
COME ON GOOGLE WAKE UP!
No content no content!
Content is King?
Hi,
I agree, and I tried to start a new thread on this but it seems to have failed, so here goes.
It looks very much, for the dominant two-keyword term used in my niche, that content is no longer King. Relevance is not a Prince, specialism is dead, and the small niche operator is gone forever.
My two word term top 10 on Google is dominated like this. The PageRanks and backlinks relate to the index page of the site. The backlinks count is to the domain root searched on Alltheweb.
#1 PR7, 9,000 backlinks. General big-player site, not a specialist; VERY limited but relevant content (about 20 words) on one page linked to from the page listed in the SERPs; the only relevant text on the listed page is the anchor text to this page. Not in the top 10 on any other SE.
#2 PR7, 680,000 backlinks. Directory; the content is my Espotting ad. This page is #3 on Inktomi.
#4 PR6, 125,000 backlinks. Directory; the content is an affiliate's ad and what looks like a link farm. Not in the top 10 on any other SE.
#6 PR7, 70,200 backlinks. Large motoring organisation site; relevant(ish) but very limited content on this topic. Not in the top 10 on any other SE.
#8 PR5, 890 backlinks. Members' association; the product is only offered to members. Relevant content, but this is only a brief summary of a service offered by a small specialist (like ourselves) whose own site has been completely dropped from the SERPs. Not in the top 10 on any other SE.
#10 PR6, 13,360 backlinks. Large Australian company offering the service only in Australia. Brief, on-topic, own content.
It looks like if I work really hard, get thousands of backlinks, and get my PR up to around 7, with some vague and limited content provided by an affiliate ad, I might just have a chance of getting back into the top 10.
Surely this is not what the "commercial" Web is all about, and surely Google didn't intend to create this result in many niche markets. I still keep coming back to the fact that for all of the associated three-word searches, very relevant, content-rich sites are still returned in the SERPs; it's just for the main two-word phrase that the results are heavily skewed to big sites with many backlinks. It looks like anchor text is switched off for this search and SiteRank is used.
Happy New Year everyone!
Best wishes
Sid
...it's just for the main two-word phrase that the results are heavily skewed to big sites with many backlinks...
I couldn't agree with you more. My 2-word phrase has been blasted out of the water, and all the SERPs show is a bunch of garbage that is kind of related. Most of the results are sites that may or may not have the 2-word phrase present, and a lot of those are just directories of other related sites.
I hope this gets corrected soon.
The Google algo is a computer program, it will only get corrected if the mighty G thinks it needs correcting.
Meanwhile look at why all these "hardly relevant" sites are actually ranking high for some search phrases right now.
This is a unique opportunity, Google may step back in a month or two, but in six or nine months they may try it again.
You need every base covered to be successful in all markets and algos :)
My wife has an outfit for every occasion; rain or shine she insists on having something suitable to drag out of the closet... A set of sites suitable for all Google storms would be a useful thing for us webmasters/SEOs to keep in our closets. :)
Will someone please start a thread discussing specifics on what worked for them in bringing their site back to pre-Florida status.
Time. Both Esmerelda and Florida knocked me completely out of the SERPs. After Es I did nothing, and after a while I reappeared with even better rankings.
After Florida I rewrote a few pages to reduce the keyword density. Nothing major, but more user oriented copy. I added one outbound link to my homepage.
After a while I reappeared, again with better rankings than pre-Florida.
I refused to make major changes that would negatively affect my rankings in other SEs.
This time factor is not something I have seen discussed. The SERPs take 2-4 weeks to settle down. Old sites fare better than newer ones during this period, i.e. they are more stable. But a quality, content-oriented site does seem to return after a while.
My $.02
WBF
If you search for your two word phrase with quotes around it, are the results more relevant? They are for the searches I'm concerned with. This is the problem with Google's new algo. Word proximity has very little value now, and is leading to a huge amount of non-relevant traffic to my sites. Wake up Google, smell the coffee.
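The quoted-search difference can be shown with a toy matcher (my own sketch, not Google's implementation): an unordered AND match ignores proximity and order entirely, while an exact-phrase match requires the words to be adjacent and in order.

```python
# Two documents: both contain "real" and "estate", but only the first
# contains them as the adjacent phrase "real estate".
docs = [
    "miami real estate listings and agents",
    "real news about an estate sale in miami",
]

def and_match(doc, words):
    """True if every query word appears somewhere in the document."""
    tokens = doc.split()
    return all(w in tokens for w in words)

def phrase_match(doc, phrase):
    """True only if the query words appear adjacent and in order."""
    return f" {phrase} " in f" {doc} "

query = "real estate"
for d in docs:
    print(and_match(d, query.split()), phrase_match(d, query))
```

The second document passes the AND match but fails the phrase match, which is exactly the kind of "kind of related" result people are reporting for unquoted two-word searches.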
Land? Rentals? Single family homes? Condos? Residential? Commercial? The amount of space on your screen?
Be more descriptive. Don't assume that the end-user knows what the term "Real Estate" means. DESCRIBE IT.
Forget what you know about SEO and go back to plain old-fashioned information. Provide more descriptive information about your services, do some old-fashioned PR (public relations), and use words that describe your keywords.
If you are talking about "Real Estate" in the sense of home sales, then say "home sales" or "home for sale", etc.
Think of the most basic questions that some n00b off the street would ask you about your business and answer them on your site.
Watch your duplicate content from other sites you manage.
This has worked for us, and hopefully will for some others. I am not a professional SEO, just a professional in my industry (which is real estate related, but not what most would expect from the term Real Estate).
Hope this helps. Happy New Year!
If you think you do this and it hasn't helped, then ask somebody here to take a look for obvious problems. I am sure somebody would be happy to assist. As a wise man once said, it is easier for somebody outside the box to point out possible problems.
The Google algo is a computer program, it will only get corrected if the mighty G thinks it needs correcting.
Meanwhile look at why all these "hardly relevant" sites are actually ranking high for some search phrases right now.
This is a unique opportunity, Google may step back in a month or two, but in six or nine months they may try it again.
Hi,
If I were to pick one thing that coincides with the change in rank, it is position in the DMOZ/ODP taxonomy. The sites that are doing well seem to be categorised in a bigger branch of the taxonomy for the keyword that appears to be judged as more important, i.e. nearer the root. The sites that have been dropped are in small, regional, subject-specific branches which are more closely associated with the second word in the search term.
I am totally convinced that the words in certain search terms are being interpreted and dealt with separately in some way. Previously both words had equal weight, and proximity and order were important. Now it seems to take each word separately and find sites that strongly fit the category of one of the words, with some relationship to the second word; then it looks the other way, for sites which strongly fit the category of the second word with an area on the first word. So if your site is about the combined topic of the two words, let's say "outstanding widgets", you have a problem: it probably does not fit strongly into the "outstanding" category, and it probably doesn't fit strongly into the "widgets" category, yet it would be perfect for the "outstanding widgets" category. This new categoriser doesn't have a category for the specialist combined topic.
We have literally "fallen between two stools". Where the system is failing is that it doesn't understand that sometimes two words when used together are worth far more than the sum of the two parts.
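The hypothesis above can be sketched as a toy scorer. This is entirely my own illustration with invented site names and numbers, not anything confirmed by Google: each query word is matched against broad categories separately, so a site strong only on the combined topic never scores well on either word alone.

```python
# category_strength[site][category] = how strongly the site fits that
# broad category (invented numbers for illustration only).
category_strength = {
    "big-general-site.com":    {"outstanding": 0.9, "widgets": 0.1},
    "widget-portal.com":       {"outstanding": 0.1, "widgets": 0.9},
    "outstanding-widgets.com": {"outstanding": 0.4, "widgets": 0.4},  # the specialist
}

def per_word_score(strengths, query_words):
    # Best single-category fit for any one query word -- the combined
    # topic "outstanding widgets" is never scored as a unit.
    return max(strengths.get(w, 0.0) for w in query_words)

query = ["outstanding", "widgets"]
ranked = sorted(category_strength,
                key=lambda s: per_word_score(category_strength[s], query),
                reverse=True)
print(ranked)  # the specialist lands at the bottom: "between two stools"
```

Under this scoring, the two broad sites win on a single word each, while the combined-topic specialist, which a human would call the most relevant result, ranks last.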
Please look at your own affected SERPs and see if this picture fits what has happened to you, and let's all let Google know specifically where the new system is making mistakes. I wish they would confirm which technology they are using. If it is, as I now believe, some form of categoriser, then there should be no harm in telling us this and providing some method of giving feedback.
Happy New Year
Sid
If you search for your two word phrase with quotes around it, are the results more relevant? They are for the searches I'm concerned with. This is the problem with Google's new algo. Word proximity has very little value now, and is leading to a huge amount of non-relevant traffic to my sites. Wake up Google, smell the coffee.
Exactly the same results, which is why I think the pages have been pre-categorised for certain search terms. If I add another word to make a secondary target term, my pages come in the top 10, mostly at #1.
How can this level of crudity be acceptable to Google?
I wrote to help at Google and the reply I got was a standard boilerplate topped with a personal note. But the person who wrote it didn't seem to have any grasp of the issues whatsoever. If this were the usual ups and downs of updates I wouldn't have bothered to write, but there's obviously something not right now that needs fixing. I don't blame the people at Google who have to reply to our emails; I'm sure they are kept in the dark.
Best wishes
Sid