Forum Moderators: Robert Charlton & goodroi
My question, if you get to it, is about how to determine bad neighborhoods when a link exchange is sought. FFA link sites and link farms are a no-brainer, and they are mentioned on Google's site. Many people here and on other boards have also claimed that sites with porn or adult content, gambling, or drugs, regardless of quality, are considered bad neighborhoods just like the farms and FFA sites.
What should we be considering beyond whether the site is an FFA or Farm when evaluating whether to exchange links with another site?
To GoogleGuy
Thanks for answering some questions here. I have a question regarding "TrustRank". Is it true that the weight of links from trusted sources will increase, or is it already high?
Regards
itLoc
My question is this: when I'm looking at the SERPs on various DCs, I see different numbers of results returned on some of them. I've often heard people speculate about whether this is because certain DCs are simply working with less data, or because various filters may be turned up too high or even too low. Any comments on this?
My site is going through a nasty phase with this update: we had good rankings before the update and now rank very low, but I am seeing the site again on "allinanchor" and "allintitle" where it was before the update started. Could this give me some hope that we will survive this update?
Thanks for any input.
Ellie
Could the fact that an established website (penalized for a year, ranking >100 for its unique name) suddenly moves to the number one spot for its unique name, begins to rank highly on obscure phrases (e.g. rank #1 or #2 out of about 40,000 results), and ranks #1 for snippets of content from the page, be an indication that penalties have been lifted and that it will eventually rank for less obscure phrases? Or is all of this just coincidental with canonical URL improvements?
All of this began shortly after the reinclusion request that I submitted in early April (when you suggested that I send one in another thread). Unfortunately, the response I got from you guys was simply that the url was already indexed. So that's why I wonder if all of these happenings are merely because of canonicalization improvements.
[edited by: crobb305 at 3:53 pm (utc) on June 2, 2005]
The symptoms are as follows:
- Sites no longer rank for company names.
- Sites enjoy great rankings in Yahoo, MSN, and Ask, but are nowhere to be found in Google.
- Pages that used to rank in top ten or twenty are now pushed down to page 70 or 80.
Can you comment on what may have happened to these sites? Is there anything a site owner can do? (Beyond following the 26-step method)
Thanks.
Since Bourbon, supplemental results from our own obsolete canonical URLs with old cache dates seem to be getting served up for searches, and are even ranking higher than the identical pages on our main existing domain; sometimes our main domain pages are not ranking at all.
ex: http://www.example.us/old_page.htm, which is obsolete and returns a 301 to http://www.example.com/new_page.htm, ranks better than or even replaces the latter!
Our other canonical-domain page URLs either perform a 301 redirect to our main domain URLs or return 404s (in either case, the old cache dates seem to signify that Googlebot understands these pages are obsolete).
A reinclusion/canonical-page message to Google support got a reply that there was no site penalty on the main domain. However, is there still a chance that the main domain pages are being penalized as duplicates of these old canonical URLs, and if so, do we need to do anything further than returning 301s or 404s for these obsolete URLs?
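For anyone auditing a setup like the one described above, it can help to confirm that every obsolete URL really does answer with the intended 301 or 404. The sketch below is a minimal, self-contained illustration: the paths and the example.com target follow the placeholder domains used in this thread, and the local test server merely simulates the old domain's behavior so the checker function has something to talk to.

```python
# Sketch: verify that obsolete URLs answer with the intended 301/404.
# Domains and paths are placeholders, per the example.us/example.com
# convention above; a local server stands in for the old domain.
import http.client
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

# Obsolete paths mapped to their new canonical locations (301),
# or to None for pages that should simply be gone (404).
REDIRECTS = {
    "/old_page.htm": "http://www.example.com/new_page.htm",
    "/retired_page.htm": None,
}

class ObsoleteUrlHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        target = REDIRECTS.get(self.path)
        if target:
            self.send_response(301)              # permanent redirect
            self.send_header("Location", target)
            self.end_headers()
        else:
            self.send_response(404)              # page is gone
            self.end_headers()

    def log_message(self, *args):                # keep output quiet
        pass

def check_status(host, port, path):
    """Return (status, Location header) for a single GET request."""
    conn = http.client.HTTPConnection(host, port)
    conn.request("GET", path)
    resp = conn.getresponse()
    location = resp.getheader("Location")
    conn.close()
    return resp.status, location

server = HTTPServer(("127.0.0.1", 0), ObsoleteUrlHandler)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

status, location = check_status("127.0.0.1", port, "/old_page.htm")
print(status, location)   # 301 http://www.example.com/new_page.htm
status2, _ = check_status("127.0.0.1", port, "/retired_page.htm")
print(status2)            # 404
server.shutdown()
```

Running the same `check_status` call against the real old domain (instead of the simulated one) would show at a glance whether any obsolete URL is still returning a 200 and therefore a live duplicate.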
Interested to know why some sites may rank higher than others in the directory but may in fact show a lower PageRank on their site, i.e. a PR5 site may sit with the PR6 sites in the directory, or a PR6 may fall in with the PR4s.
Also, does the algorithm take into account position in the directory? I find it amazing that a site ranking high in the directory under "Blue Widget" may not rank in the SERPs for a search on "Blue Widget", especially when that site was human-edited into DMOZ in the first place, so it's obviously a site relevant to that search term.
Many posters have discussed the challenge of scrapers grabbing "snippets, titles, keyword phrases and company names" directly from sites. Being one of those heavily "linked to by scrapers" sites:
Q: Were we demoted in ranking because of this factor, and will Google reinstate the original content-generating sites?
I second Wiseapple's questions, with an important addendum:
- You continue to mention the phrase "natural growth"; the corollary would suggest that too many incoming links in a short period of time equals a penalty.
If that's the case, for all of us with legit content sites who continue to be severely dinged in this update, it seems possible that a rapid series of incoming links from "scraper sites" has caused us to drop. Some of us have reported finding more than 3,000 inbound links from various scraper sites, most of which take the top ten results from various engines and spit them out on page after page.
My question would then be: is this a reasonable hypothesis, and if so, does that mean that in a few months we might work our way back to the top, as the time interval on the rapid-incoming-link "penalty" grows older?
Thanks,
Hunter
Is there a reason why an established page would suddenly display as a bare link rather than a title and description? Users here have explained that it is because those pages are part of the supplemental results. I am wondering because I have pages that used to display normally but are now shown as a URL with no title or description. Some of these pages are close to 2 years old and have had no changes made to them since they were put up. I would really like to fix this, because the URLs mean nothing.
In the past when I searched G for a term such as "red fuzzy widgets" (without quotes), most results that came up prominently mentioned red fuzzy widgets in the title, multiple times on the page and even in the url. While such methods may be used by spammers, they are also the core elements of basic white hat SEO for folks who have content related to red fuzzy widgets. Heck, even most no hat, never heard of SEO publishers tend to mention red fuzzy widgets several times on a page pertaining to red fuzzy widgets.
Increasingly over the past year or so, when I search for red fuzzy widgets, the results are much more likely to consist of pages about smooth blue widgets with an unrelated mention of "red" and a separate mention of "fuzzy" somewhere on the page.
It seems that G thinks that a page mentioning red fuzzy widgets is, for some reason, not a good match for a search on the term red fuzzy widgets.
As a searcher, this has left me looking harder and deeper for things I used to find quite easily on G. I often have to use Yahoo nowadays to find what I want.
As a publisher, I just can't figure out how to create a page about red fuzzy widgets, without ever actually mentioning the phrase.
I'm at loose ends -- any suggestions you can make regarding finding what I want as a searcher and being found as a publisher of red fuzzy widget pages would be much appreciated.
Why are you shutting out the reputable and honest mom-and-pop businesses that are trying to figure out if web marketing is worthwhile for them?
The Google Guy quote below my question should be EVERYONE'S answer to the thread about spam reporting and how Google feels about it. Now, onto my question.
If someone (theoretically, of course) was trying to get back into Google, should they be afraid that the following issue might be holding them out...
Let's say a site provides content for many other sites in the form of forum software using one database. This means duplicate content on the sites that use this same forum DB, because a post on one site shows up on all of them. It is certainly not for the purpose of spamming the search engines with content; rather, it provides the other sites with content so they gain value for their visitors, while they display the original site's ads in return for said content. I guess it would be sort of an interactive RSS, if you will...
Should the site in question serve noindex/nofollow on those other sites (taking away one benefit of their participating, namely search engine traffic), or is it okay to leave things as they are?
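The noindex/nofollow option mentioned above can be sketched very simply: when the shared forum content is rendered on a partner domain rather than on the originating site, emit a robots meta tag so that only the original copy is eligible for indexing. The host names below are hypothetical stand-ins, not anyone's actual domains.

```python
# Sketch of the noindex/nofollow option: partner domains carrying the
# duplicated forum database get noindex,nofollow; the canonical host
# keeps the default. Host names are hypothetical placeholders.
CANONICAL_HOST = "www.original-forum-site.example"  # assumed original site

def robots_meta(request_host: str) -> str:
    """Return the robots meta tag to emit for a page served on request_host."""
    if request_host.lower() == CANONICAL_HOST:
        return '<meta name="robots" content="index,follow">'
    return '<meta name="robots" content="noindex,nofollow">'

print(robots_meta("www.original-forum-site.example"))
print(robots_meta("www.partner-site.example"))
```

The template layer of the forum software would call something like `robots_meta()` per request; partner sites then keep the visitor-facing content (and the ad exchange) while bowing out of the index, which sidesteps the duplicate-content question entirely.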
to report spam. If you're a whitehat and don't like to see spam at all, it's good to report it. If you're a blackhat and want to report spam so that your site can do better, it's good to report it. If you're of the species of blackhat that will do absolutely anything to try to rank, from blogspam to referrer spam, yet you shun the idea of reporting spam because of your belief system, power to you then too--don't use the form. happy!
So here is the question,
What advice would you give to a website that has suddenly disappeared from the index after the Bourbon update, only to notice that the site now has thousands (in some cases over 100,000) of scraper sites linking to it?
It seems clear to me that Google is penalising innocent websites in the process, and with Yahoo providing RSS feeds of its search results, this will be a recurring problem for any site listed high in Yahoo, since the scraper sites seem to be using the feeds to get the links.
A website can't do anything about scraper sites doing this, especially in the numbers seen for sites that have dropped off the index. Hopefully these issues will resolve themselves, as the update is not finished yet, but if they don't... what advice do you give to the websites that have been caught in the crossfire through no fault of their own?
We have a 7-year-old, high-quality content site which ranked on the first page of the SERPs for a few hundred keywords. We did a 301 in Feb. 2004 and the site was dropped from the SERPs within 2 days (some pages appeared as the 500th result). Site structure and pages were unchanged; only the domain name was changed, via 301, to better reflect the things our site covers. Really surprised, then, that the 301 nuked us in Google (no problems with other search engines).
If a large corporate site wants to change URLs for a legitimate reason (the URL they always wanted, the one that reflects their company name, becomes available), can you recommend a method that will allow the corporation to move to the new URL without losing Google rankings for a year?
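The usual mechanics of the domain move described above are simple to state: every URL on the old domain answers with a single 301 pointing to the same path on the new domain, so that existing links have a chance to carry over. Here is a minimal sketch of that mapping; the domain names are placeholders, not the poster's actual sites, and this illustrates only the redirect target computation, not how long Google takes to honor it.

```python
# Sketch of a whole-domain move: each old-domain URL 301s to the
# identical path and query on the new domain. Hosts are placeholders.
from urllib.parse import urlsplit, urlunsplit

OLD_HOST = "www.old-company-name.example"
NEW_HOST = "www.new-company-name.example"

def moved_permanently(url: str) -> str:
    """Return the Location header value for a request to the old domain."""
    parts = urlsplit(url)
    if parts.netloc != OLD_HOST:
        raise ValueError("not a request for the old domain")
    # Keep path and query intact; only the host changes.
    return urlunsplit(("http", NEW_HOST, parts.path, parts.query, ""))

location = moved_permanently(
    "http://www.old-company-name.example/products/widgets.html?page=2"
)
print(location)  # http://www.new-company-name.example/products/widgets.html?page=2
```

In practice this mapping would live in the web server configuration (a rewrite rule) rather than application code; the point of the sketch is that the redirect is one-to-one per page, never a blanket redirect of everything to the new homepage, which throws away the per-page link equity the question is worried about.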
Thanks for your time! Much appreciated.
Cathy
I have asked this question of Google support on several occasions, yet fail to get an adequate response. I am certain many others using this forum are perplexed by this as well. Why is a company name, when searched for, being suppressed in the SERPs, even though that company name is unique, appears in the title, on the page itself, and as companyname.com in the address? According to engineers who have written us back, no penalties are being applied that would suppress such results. Yet over 20 sites appear above us in the SERPs when the company name is searched for. We had so hoped that Bourbon would clear up whatever was occurring, yet nothing of the sort has happened. Many users of this forum are not expert SEOs, nor do we understand every detail of what is going on with these algorithm changes; we just want something put in simple terms that can be used to make sense of why we are being suppressed and how we can go about fixing whatever needs to be fixed on our end.
302s, 301s, scraper sites...
Mom and Pop sites should not have to worry about this stuff just to sell their products online.