I've been reading through the June 27th/August 17th threads, and I was wondering if somebody could clarify what's actually going on?
Like many of you on the board, my site got trashed in the SERPs on June 27th, only to recover a month later. At the time, I thought I had incurred a penalty, and I went to painstaking lengths to remove even the most minute possible violations. I thought that correcting those problems was the reason I recovered.
So needless to say, I was pretty upset when I got trashed again around the 17th when I knew my site was in total compliance with Google's guidelines. After visiting this forum, I now see that I was not the only one who has been experiencing this type of problem.
Here are my questions. If any of you can shed some light on these, I would really appreciate it.
1. Why is this happening? It seems like some kind of update, but why are certain sites getting trashed when others are standing firm?
2. Can I expect a recovery similar to the one I had in July?
3. Is there anything I can do to fix this, or am I completely at the mercy of Google on this one?
Thanks for your time!
[edited by: tedster at 6:25 am (utc) on Aug. 22, 2006]
Not quite the same thing, but related to "link strength", a friend has a 30 page site, with some pages having only minimal text content. I warned for a year that having the same meta tag on multiple pages was going to cause trouble.
The pages had been listed normally for several years, but then just a few months ago I did a site:domain.com search one day and found that several changes had recently occurred.
Only two pages showed up before the "repeat the search with the omitted results included" message appeared.
On clicking that link, many of the pages now appeared but all apart from two had turned supplemental.
A few days later, just a few pages were listed after the message was clicked. Only two were shown as normal results, and there were a few more pages listed as supplemental. The rest had disappeared from the listings.
The two normally listed pages were the only two pages of the site that had any external incoming links: the homepage and one internal page.
Upon fixing the meta descriptions on all of the pages, it was only a matter of weeks before all the pages from the site were listed fully and normally again in a site: search.
[edited by: g1smd at 10:50 pm (utc) on Aug. 23, 2006]
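As a practical aside, the duplicate/missing meta description problem described in that post can be checked with a short script. Here's a minimal sketch using only Python's standard library (the sample pages are made-up placeholders, and this is purely a local audit of your own HTML, not anything Google runs):

```python
from html.parser import HTMLParser
from collections import defaultdict

class MetaDescriptionParser(HTMLParser):
    """Collects the content of a page's <meta name="description"> tag."""
    def __init__(self):
        super().__init__()
        self.description = None

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            d = dict(attrs)
            if d.get("name", "").lower() == "description":
                self.description = d.get("content")

def audit(pages):
    """pages: dict of url -> raw HTML.
    Returns only the problem cases: descriptions shared by more
    than one page, plus pages with no description at all (key None)."""
    groups = defaultdict(list)
    for url, html in pages.items():
        p = MetaDescriptionParser()
        p.feed(html)
        groups[p.description].append(url)
    return {desc: urls for desc, urls in groups.items()
            if len(urls) > 1 or desc is None}

# Hypothetical sample pages for illustration:
pages = {
    "/index.html":   '<html><head><meta name="description" content="Acme widgets"></head></html>',
    "/about.html":   '<html><head><meta name="description" content="Acme widgets"></head></html>',
    "/contact.html": '<html><head></head></html>',
}
print(audit(pages))
```

Run against a real site, you would feed it the fetched HTML of each page; any URL grouped under a shared description (or under None) is a candidate for the fix described above.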
It's only a factor in that most higher-PR sites have higher trust rank due to age and incoming links. You can get away with just about anything on an older site that has high TR. Try the same tactic with a lower-TR site and you'll get banished to the Google dungeon forever.
PageRank is not a factor, because we have pages ranking top 10 for their respective keywords that have a PR of 0, 1, or 2.
It always helps to know your neighbors on a shared server. We always use hosting providers that strictly prohibit any type of spamming. Does your webhost do the same?
An omitted meta description causes roughly the same problems as an identical meta description.
In the omitted case, Google uses the first words on the page: often from the nav bar, and therefore often identical too.
WordPress omits the meta keywords and meta description by default, doesn't it? Matt Cutts' blog also has no meta keywords or description.
This has nothing to do with it. I literally have thousands, some PR 4 and 5 as well.
That I would love to see. I presume you are writing unique content.
Please sticky me?!
To be or not to be supplemental.. is a question that hinges on more than one factor alone. I believe PageRank is a factor, but a PR of 8 may not save you if the page is an identical copy of a page with a PR of 10.
Google's algo is not a single IF/ELSE statement.
Exactly - it's at least 3 (lol).
I'd say anyone whose listings went supplemental as a result of copying content from another site shouldn't be moaning about supplemental results in the first place.
However this is unfortunate for companies writing quality, unique content and having little PR to aid retention of this content in the main index.
Surely if you take a close look at supplemental pages, they must be 90% a result of low PR / poor linking structure (which in turn leads to a lack of PR).
Wrong. We do not know what the causes of the "penalties" are; otherwise we'd fix them. My meta, title, and HTML are 100% fine. Also, let's not assume it's a valid penalty we are getting hit with; many of us have 100% legit sites that are caught up in a Google mess of some sort.
I've got PR 5 & 6 sites that have been hit... lost traffic on June 27th, got it back July 27th, lost it again on August 17th.
These are both original-content sites: they have unique meta keywords and descriptions on all pages, they validate, and they have plenty of organic links. In other words, they are exactly what Google tells us to build: quality sites.
This is certainly not about PR, and while the meta issue may be affecting some, it certainly isn't a factor for me.
Personally, I'm lost.
Until two weeks or so ago, when doing site:domain.com, the home page was listed either second or third. Now it is listed on top, but the rankings still suck. When I rank #1 in Google for "domain.com" I do extremely well, so I assume an automatic penalty has been assigned to my site. Anyone else having this problem?
When things are ranking normally, all my sites appear on the first page for the relevant keywords, with the main site at #2.
My main site has had the experience of a lot of people: lost traffic (about 95%) on June 27th, got it back July 27th (about 150% of previous), lost it again on August 17th (about 80% from baseline).
Four other sites, each with less traffic and different topics, experienced the exact opposite--so in a sense it balances out. All sites have AdSense, Google Analytics and Google Sitemaps.
Yes, I just noticed this today. I had selected the www.domain.com format, only to find that the function is no longer working.
What I (we) would give to get a glimpse inside Google's logic/theory/etc. It's scary to think how bad things may be, just seeing all the chaos from outside!
[edited by: AustrianOak at 3:34 am (utc) on Aug. 24, 2006]
How many of you flipped on the preferred domain in Sitemaps, only to have it not work? Google could be shuffling due to this.
I turned on the preferred domain about 4 days ago. Nothing seemed different in the total of non-supplemental pages. Until yesterday... I about fell out of my chair.
Then 90% of what was in their system went supplemental. I'm not saying it's a totally bad thing, since the cache dates were so old... well over a year old on those.
Google has been spidering our pages a lot since I turned on the preferred domain, but our site is so big it will probably take them a year at this rate to get most of them back in.
The strange thing is... the ones that went supplemental all had unique content, unique titles, and for the most part unique descriptions.
Some pages are climbing out of supplemental status today.
I will say this... since things have been slow I've taken advantage of the extra time to do some code cleanup, optimization and made more use of style sheets to get the page weight down.
[edited by: Bewenched at 3:58 am (utc) on Aug. 24, 2006]
What is interesting about this update is that it has impacted sites that had zero content changes over the past several months. This leads me to believe that whatever change is happening isn't related to on-page factors.
Not necessarily; if they change their "gusto", this could hit pages that haven't changed in years.
the webmaster backed off on those keyword links and saw upward movement within a few days. Not back to the first page, but a solid jump.
Something I've wanted to know for a long time, but didn't dare ask ;-)
We know penalties: you're out, and if you try, you might be allowed back in later.
And we know filters. You run into one - change something (like you mentioned) and - back you are.
Is there something in between - like they catch you, add a "4 weeks probation tag" and then let you back in - just to avoid too much tinkering?
Can you clarify? When you speak of identical metas, do you mean that any one meta tag matching another could cause a problem? I.e., two pages might have the same keywords while the other metas differ?
Is there a % threshold at which they count as different?
Is no meta better than identical metas?
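Nobody outside Google knows whether such a threshold exists, but as a rough way to put a number on how similar two descriptions are, you could use something like this (a hypothetical sketch using Python's difflib, not anything Google has confirmed):

```python
from difflib import SequenceMatcher

def meta_similarity(a: str, b: str) -> float:
    """Rough percentage similarity between two meta description strings,
    based on difflib's longest-matching-blocks ratio."""
    return round(SequenceMatcher(None, a.lower(), b.lower()).ratio() * 100, 1)

# Two hypothetical descriptions that differ by a single word
# score well above 80% similar:
print(meta_similarity(
    "Buy blue widgets online at Acme.",
    "Buy red widgets online at Acme.",
))
```

If near-identical descriptions do trip a filter, a score like this would at least flag the pages worth rewriting first; but again, the threshold (if any) is pure speculation.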