| 8:32 pm on Sep 20, 2006 (gmt 0)|
I am with the people suggesting giving up the constant checking of Google rankings... too stressful! During the last update, my site, which over 2 years had worked its way up to #11 for its main keyword, disappeared completely! PageRank (5) stayed the same. Today it's back at #11 and I am currently reading a PageRank of 0. #*$!?
| 9:41 pm on Sep 20, 2006 (gmt 0)|
I'm praying that this is some sort of small step to a bigger picture. Lots of junk lately. I was rolling along pretty good, then got dropped like a rock.
| 10:11 pm on Sep 20, 2006 (gmt 0)|
webdude - I'd still be suspicious that not all of your duplicate content issues have been resolved. Have a look at this thread: [webmasterworld.com...]
There's something going on with your internal linking and where you are linking to, I suspect - just a hunch.
Recovery from this could take several months.
| 10:22 pm on Sep 20, 2006 (gmt 0)|
As of 9/16 our site doesn't even get a hit with a Google site search (site:example.com), yet Google Webmaster Tools says we're indexed.
We even run Google blog ads, and the site had been indexed by Google for the 18 months before 9/16/06 - it's a perfectly legit WordPress blog, very active, with millions of hits and tens of thousands of unique visitors every month.
For whoever was asking: I ran the Google search "www.mydomain.com */" and got 30,000 hits (none to our site), mostly from what looked like links from blogs discussing our content.
[edited by: tedster at 4:04 pm (utc) on Sep. 21, 2006]
[edit reason] use example.com [/edit]
| 12:36 pm on Sep 21, 2006 (gmt 0)|
I found something that I think caused a problem... my bad :-/ -- On most of the pages, I forgot to close the H1 headers. All the pages that were disappearing had this problem. That, and the fact that all the H1 tags were exactly the same, made me suspect something was wrong (good old W3C validation tool). So I fixed that and changed all the H1s to be unique on each page. The company name at the top is now just bold and big. This is the same type of stuff I've been doing on my other sites and they have always ranked relatively well. We'll see what happens.
| 1:11 pm on Sep 21, 2006 (gmt 0)|
I wonder the same thing. I have ONE page that I KNOW should have come up better in Google, and it comes up only on the 3rd page of the SERPs when it should be on the first... I re-checked the code and found that I had 3 h2 tags in the page by mistake. I fixed it yesterday (how I screwed it up like that, I have no clue), so we'll see...
| 1:29 pm on Sep 21, 2006 (gmt 0)|
I've used multiple h2 tags before. I thought that this wasn't a problem.
Example: My site sells two entirely different products that are both directly related to the same industry. So my H1 tag has both keywords, then an h2 tag for one item (with much text after that), and an h2 tag for the other item (with a bunch of content after that).
| 2:00 pm on Sep 21, 2006 (gmt 0)|
I'd be surprised if the lack of closing tag affected your site that drastically. Google is used to dealing with missing tags all over the place.
| 2:09 pm on Sep 21, 2006 (gmt 0)|
I was just reading about hn tags yesterday, and it's my understanding that using multiple h2 tags is fine, so long as you're using them logically. I do plan to make more use of h3 tags as well.
I also did some reading about ALT text, and am now pretty sure I've been guilty of ALT attribute spamming. In my Amazon store, I include larger images of many of the products, and didn't really worry about adding ALT text to these until a few short months ago. When I did finally add ALT attributes, it was just so easy to insert the keyword-rich titles of products, as provided on Amazon's pages.
I'm now deleting all my ALT text for store products to see what happens.
Also, many of my Amazon products that are similar and from the same seller also have repetitive descriptions, and I plan to do something about that as well. I have been using the descriptions as provided on Amazon too, and up till now that really seemed to be working for me.
| 3:03 pm on Sep 21, 2006 (gmt 0)|
My site was hit badly in August, with only my index page listed where previously there had been 150 pages. I redesigned the site during the lull and early in September Google started listing about 50 pages on my new site. 16th/17th September saw that drop to 17 and all the pages are old - content is from posts I made in early August.
I too am going to stop even thinking about this. Google is tying itself in knots and I don't think the first thing a web designer thinks about should be how the Almighty Google is going to view your site.
Seriously, these guys are like some sort of wrathful God. Should we start sacrificing chickens to Google?
| 3:11 pm on Sep 21, 2006 (gmt 0)|
> a web designer thinks about should be how the Almighty Google is going to view your site.
Think of it this way... If you want to participate in 'their' index, get some of 'their' traffic and benefit from 'their' referrals, it might not be a bad idea to familiarize yourself with their preferences.
| 3:18 pm on Sep 21, 2006 (gmt 0)|
Here's what Google should do. Google should start selling (?) Google templates and/or some kind of CMS, and all we'd have to do is fill in our text or whatever other content we'd like to have. Then we'd be following Google's rules to the point where they control everything we do on the Internet.
| 3:53 pm on Sep 21, 2006 (gmt 0)|
Obono, perhaps you can enlighten all the people here as to which preferences you think are the ones that should be reviewed. I submit that most of the people here have reviewed them ad nauseam and are still stabbing in the dark.
Your kind help would be most appreciated.
| 5:14 pm on Sep 21, 2006 (gmt 0)|
>Think of it this way... If you want to participate in 'their' index, get some of 'their' traffic and benefit from 'their' referrals, it might not be a bad idea to familiarize yourself with their preferences.
Perhaps if they wouldn't keep changing the way they build 'their' index that directs 'their' traffic and lets me benefit from 'their' referrals I would have some sympathy with this view.
As it is, my relationship with Google is constantly being changed on a unilateral basis with no apparent logic behind it.
| 5:38 pm on Sep 21, 2006 (gmt 0)|
Gimp, I am afraid I can't help, as I am trying to figure this out myself. Like other people on this thread, I too lost all traffic. I just meant that you can only control what is within your power: your site. You can't control what others do... well, someone might, but not me.
| 5:46 pm on Sep 21, 2006 (gmt 0)|
|As it is, my relationship with Google is constantly being changed on a unilateral basis with no apparent logic behind it. |
What relationship? Google just indexes what's on the Web and delivers search results to users.
| 6:17 pm on Sep 21, 2006 (gmt 0)|
But that's the problem. They have 2 indexes. And you have to be so very good to stay out of the bad one.
| 7:13 pm on Sep 21, 2006 (gmt 0)|
>> ... found that I had 3 H2 tags in page by mistake <<
That is NOT a mistake.
Your main heading should be an H1. In general there will be one per page.
All of the sub-headings should be H2. The number is the importance level of that heading.
If you have sub-sub-headings, then all of those will be H3.
Run the page through [validator.w3.org...] and tick the box for "Show Outline". On the results page scroll down to the section marked "Outline" and take a look at the bullet-point list generated from the text of just the heading tags. If that list does not look like a Summary of your document, then you are abusing the heading tags.
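For anyone who wants to do that outline check locally, here is a minimal sketch of the same idea using only Python's standard html.parser - the sample markup and heading text are made up for illustration, and this is only an approximation of what the W3C validator's "Show Outline" option produces:

```python
from html.parser import HTMLParser

class HeadingOutline(HTMLParser):
    """Collects h1-h6 text in document order, so the resulting list
    can be eyeballed as a summary of the page, as described above."""
    def __init__(self):
        super().__init__()
        self.outline = []      # (level, text) pairs in document order
        self._level = None     # heading level currently open, if any
        self._text = []

    def handle_starttag(self, tag, attrs):
        if len(tag) == 2 and tag[0] == "h" and tag[1].isdigit():
            self._level = int(tag[1])
            self._text = []

    def handle_endtag(self, tag):
        if self._level is not None and tag == f"h{self._level}":
            self.outline.append((self._level, "".join(self._text).strip()))
            self._level = None

    def handle_data(self, data):
        if self._level is not None:
            self._text.append(data)

# Hypothetical page fragment: one H1, two H2 sub-headings, one H3.
page = """
<h1>Blue Widgets</h1>
<h2>Small Blue Widgets</h2>
<h2>Large Blue Widgets</h2>
<h3>Shipping</h3>
"""

parser = HeadingOutline()
parser.feed(page)
for level, text in parser.outline:
    print("  " * (level - 1) + "* " + text)
```

If the indented list this prints doesn't read like a table of contents for the page, the heading tags are probably being used for styling rather than structure.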
| 7:16 pm on Sep 21, 2006 (gmt 0)|
I hope this is some kind of a weird glitch.
I have two different connections here, and usually I get different DCs on each one.
Right now on 220.127.116.11, I've tried ten different sites, and they all go supplemental on the FOURTH result. Everything. The only variance is the website for a major 24 hour news network, that manages to get to the FIFTH result before going supplemental.
One site has around 70 pages; a site: command claims 6 results out of around 71 pages - but there's no filter link at the bottom to see the rest of them. It's just six pages.
On my cable connection, I'm seeing normal results on 18.104.22.168.
Hmm, that would be an interesting strategy, wouldn't it? Everyone gets THREE PAGES in the directory, and no more, no matter how big the site.
| 7:23 pm on Sep 21, 2006 (gmt 0)|
You are correct sir. I have been basing my assumptions on using the site: command. It appears that my pages are there, just not showing up. Reference below...
Kind of funny how when things go awry, people panic. I was kind of grasping at straws and not seeing the trees because of the forest.
All the SERPs are normal. All the pages are there. They are just not displaying when I use site:
As an aside...
Each page has 1 H1
I have multiple H2, H3 and H4 tags. I have never had a problem with this...
| 7:35 pm on Sep 21, 2006 (gmt 0)|
+++What relationship? Google just indexes what's on the Web and delivers search results to users. +++
The only thing that seems obvious is a huge decline in stability and quality. We offer exactly what a good search engine should give its users: high-quality content with huge amounts of videos, photos and info, built up over years.
Until the 2005 Bourbon update we had solid Google traffic for a long, long period, with various page 1 spots. It all stopped in June, came back in August, and has been gone again since September.
We made zero modifications to the site except updating it with new content. Code, navigation, tags, metas etc. had never ever been changed.
My conclusion is:
If you are a solid publisher aiming for a good publication... forget about Google traffic! You may get it if you're lucky, or not, because these engineers are obviously in it for whatever.
So shelve your Internet plans, unless you see it as a game played against the current controller of Internet traffic.
If you want to make money... go blackhat, deliver irrelevant content, or just copy the sites that occupy all those many No. 1 spots that Google engineering seems to love.
It's easy to imagine they get more AdSense clicks whenever the page 1 results offer spam, scrapers, empty frames or outdated stuff.
If Tedster would allow it, I could provide many links and search results as examples; these have also been forwarded to AdSense and search support, from where only a bunch of meaningless standard phrases came back as a sad response.
Until competing engines strive for the best content, it will be risky for any publisher to count on traffic while depending on the G monopoly.
Google loves a few, but certainly not all.
| 9:03 pm on Sep 21, 2006 (gmt 0)|
For those disappearing this time around...if you do a search for your site in a non-conventional fashion, such as
"domain unique phrase"
where domain is without the tld and the unique phrase is essentially that, unique, are you seeing a lot of numeric subdomain .info results?
I'm now seeing some new sites as of this morning lose all their rankings on the DCs that list the .info domains for those searches. Many of the .infos I just tried to get rid of had cache dates of 9/19, so I think as more get their caches updated, we'll be seeing more of this.
A numeric subdomain on .info should be just about the easiest spam signal I can think of; seriously Google...
[edited by: JoeSinkwitz at 9:03 pm (utc) on Sep. 21, 2006]
| 10:24 pm on Sep 21, 2006 (gmt 0)|
"I'm now deleting all my ALT text"
Just wanted to chip in on the point about ALT text: you can't get your page W3C-validated unless you label your images with the ALT attribute.
If anything, you would think Google would give extra weight to validated pages, but with so many non-conforming pages on the net, obviously not - I wouldn't delete the ALT text on any of the images used.
On the spam issue: sure, if you have loads of images on a page and they all say "keyword", you're asking for trouble, but if it's one or two pics, all labeled correctly, I don't see any issue.
| 10:39 pm on Sep 21, 2006 (gmt 0)|
I have many more than one or two pics on many of my store pages, and yes, they all had "keyword" in their alt attributes. I'm not sure what to put in an ALT for an image of a black widget, other than "black widget." When you have a whole page of black widgets, you see the problem.
But I've been informed now that if each image already has a text caption or description with it, you definitely don't need to repeat that in an ALT attribute. Yes, you do need alt attributes in image tags for W3C validation, but I was informed that you can simply use an empty ALT attribute in each image: ""
| 10:40 pm on Sep 21, 2006 (gmt 0)|
A few words of text in the alt attribute is a GoodThing.
Don't delete it all. That looks bad.
[edited by: g1smd at 10:42 pm (utc) on Sep. 21, 2006]
| 10:42 pm on Sep 21, 2006 (gmt 0)|
I'm sure you knew what I meant, but I probably should have typed: alt=""
| 10:44 pm on Sep 21, 2006 (gmt 0)|
Well, I'll let you know what happens, if anything. I can always add ALT text BACK, but right now I don't have much to lose. :-)
| 10:45 pm on Sep 21, 2006 (gmt 0)|
Add a few words in there. Don't overdo it.
Empty attributes are not a help to your SEO efforts.
| 10:46 pm on Sep 21, 2006 (gmt 0)|
"I'm not sure what to put in an ALT for an image of a black widget, other than 'black widget.' When you have a whole page of black widgets, you see the problem."
No, I don't. What is the problem? A black widget picture should have the alt text of black widget.
"But I've been informed now that if each image already has a text caption or description with it, you definitely don't need to repeat that in an ALT tag."
Maybe you shouldn't listen to people with no idea of what they are talking about.
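Whichever side of that argument you land on, it's easy to audit a page for the two things discussed above - images with no alt at all (which fail validation) and identical alt text repeated across many images. Here is a minimal sketch with Python's standard html.parser; the widget markup and filenames are hypothetical:

```python
from collections import Counter
from html.parser import HTMLParser

class AltCollector(HTMLParser):
    """Gathers the alt attribute of every <img> on a page
    (None when the attribute is missing entirely)."""
    def __init__(self):
        super().__init__()
        self.alts = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            self.alts.append(dict(attrs).get("alt"))

# Hypothetical store page: three identical product alts, one missing alt.
page = """
<img src="w1.jpg" alt="black widget">
<img src="w2.jpg" alt="black widget">
<img src="w3.jpg" alt="black widget">
<img src="logo.gif">
"""

collector = AltCollector()
collector.feed(page)

# Images with no alt attribute at all - these fail W3C validation.
missing = collector.alts.count(None)

# Alt text that appears on more than one image - the repetition
# pattern the posters above are worried about.
repeats = {text: n
           for text, n in Counter(a for a in collector.alts if a).items()
           if n > 1}

print("images without alt:", missing)
print("repeated alt text:", repeats)
```

A page flagged here isn't necessarily spamming - a page of black widgets legitimately repeats "black widget" - but the counts make it easy to see at a glance which pages lean on the same phrase dozens of times.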