Of course I check the backlinks to my home page, but it is impossible to keep track of the backlinks to my internal pages, which are most of my incoming links. The link count is up about 120%, but who knows what criteria they are using to come up with that number. The numbers jumped around before all the changes; why should they settle down now?
As for PR, I am more interested in the PR of the sites that I would like to have link to me than in the PR of my own site. I'm at PR6, and will probably make it to PR7 someday, but PR6 to PR7 is a long haul, and I doubt I will get there till next year at the earliest. I'll actually start up IE next week and go looking around a little to see how the PR is looking. It would be nice if a couple of second-level files were up to PR6.
What I am far more concerned with is the number of internal pages that are in the index, and getting the deep links.
More pages = more content = more traffic.
Deep links can easily boost a content page from PR2 to PR4. And a PR4 content page can bring in a lot of traffic on lots of keyword combinations.
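The mechanics behind that claim are just the public PageRank formula: a page's rank is roughly a damped sum of each linking page's rank divided by that page's outbound link count, so a few deep links from outside pages lift an internal page directly. A minimal sketch (the toy graph, the 0.85 damping factor, and the page names are illustrative assumptions, not Google's actual data or algorithm):

```python
# Toy PageRank iteration, illustrating how deep links from external pages
# can lift an internal content page's score relative to the rest of a site.
def pagerank(links, damping=0.85, iters=50):
    """links maps each page to the list of pages it links to.

    Every page in this toy graph has at least one outlink; dangling
    pages would simply leak rank in this simplified version.
    """
    pages = set(links) | {p for outs in links.values() for p in outs}
    pr = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iters):
        # Base rank every page gets regardless of links.
        new = {p: (1 - damping) / len(pages) for p in pages}
        for src, outs in links.items():
            if outs:
                share = damping * pr[src] / len(outs)
                for dst in outs:
                    new[dst] += share
        pr = new
    return pr

# "content" picks up two deep links from hypothetical external pages:
graph = {
    "home": ["content"],
    "ext1": ["content"],   # deep link from an external site
    "ext2": ["content"],   # deep link from an external site
    "content": ["home"],
}
scores = pagerank(graph)
```

With the deep links in place, "content" ends up with a higher score than the home page, which is the effect being described: the deep-linked internal page, not just the home page, accumulates rank.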
As there are only 20 more of my pages in Fritz than there are pre-Fritz, I don't expect to see any real change in traffic.
Freshdeepbot is making the updates a non-issue for me. And judging from how slowly this thread is growing, I would guess it is the same for most other people.
"Freshdeepbot is making the updates a non-issue for me"
There was a time when I used to stay up late to see if there was an update. I don't anymore. I just roll with the changes and adapt.
In fact, I'm more concerned about my conversion rate than I am about any update.
More content, more pages, BETTER pages and better conversions is where it's at now for me.
At the same time, I can look at other sites, for example the one at #1 above me for one of the terms, and see their backlinks. They show 21 where I'm only showing 12. AlltheWeb shows them with 45 backlinks, and my page with about 200.
So, there is no work for me to do here regarding this search term, except maybe adding a couple links and deleting a couple just to change it around some. Next time hopefully google will not miss this page and I'll move up to #1 for that term.
(edit... backlinks are anchor text, and anchor text is king)
"SERPs look pretty much unchanged to me..." I've been watching this off and on all day long, and they're all over the place. It's kind of exciting, and disappointing, when you see you're at the top of the heap and an hour later fall back to #30 or so. This is the first time I actually paid much attention during the creation of a new character: Fritz. Who names these things anyway, or did I miss something?
Google is now showing changed backlinks on some datacenters.
Pages showing increased backlinks are easy to understand.
What I'm not sure how to interpret is those pages showing reduced backlinks.
Surely that can only happen if:
a. Google hasn't found the link.
b. Google has found it but isn't showing it.
For say, PR5 links that showed last week, but don't show this week, Google has clearly found them in the past.
So if you can still see the links, and Google isn't showing them, and they are > PR4, does that mean Google is working from an older dataset?
My head hurts ...
What kind of farm do you have? I might be interested in betting. ;)
After dropping to almost no backlinks for my home page after Esmeralda, I'm seeing a return of many of those backlinks when I use "link:www.mysite.com" (on -cw; haven't checked others). Yippee!
I'm also noticing that some of the pages that dropped way down in the SERPs after Dominic and Esmeralda are rising again. Maybe coincidence?
I did read googleguy's post about the SERPs looking pretty much unchanged. Must be that he's looking at different SERPs than I am. :) Or maybe I haven't been following the SERPs as closely as others have, and this is just the "normal" fluxing.
I agree it's a problem with the display of backlinks, but the actual searches are taking into account the actual backlink count.
Also, for the record, the site I mentioned had over 100 backlinks and has been declining steadily all this month: 120, 100, 80, 60, 20, and now 6. :) Before long I'll have a negative number and rank #1 for everything (hey, I can hope).
However, when searching I find the Google category and description displayed for my site. What does that mean? Which PR is used to rank the results: the one that takes the DMOZ and Google entries into account, or the one without?
Now all we need is Spell Check For Webmaster World!
In fact I'm not that worried about reported page rank either these days - just get the right mix of anchor text from 'decent' sites, and let Google worry about the page rank.
Couldn't agree more. But the update is significant for new sites, so that they at least get some visible PR and others start linking to them. (It clears them of suspicious-looking PR0 penalties. ;) ) As for well-established sites, this links-and-PR update should not matter that much, IMHO.
Actually, it's a worm farm, bether2, but they work 24/7 for table scraps.
<big chuckle> Guess the bet is off then.
If we're in a continuous update, then you continuously improve if you're doing the right thing--sounds like you are.
After taking a big hit from Dominic and Esmeralda on both my educational pages and my product pages, that's an encouraging note. Especially since googleguy seems to concur. Back to working on content. Fortunately my site is on a topic that I truly love, so working on the content is very satisfying.
For a fairly competitive search term, the number one spot goes to a site that is spamming big time with repeated lines of keywords.
The number two spot ends up on a company TOS page, which has no meaning or relevance to the search term.
The number ten prize goes to a simple "Lost Password" page, which, again, has no meaning for people looking for this search term.
These sorts of results are enough to throw doubt on Google's efficacy as the top search engine on the net. This is just too bad, because Google is a great company, enormously responsive to end users and publishers alike.
What I don't understand is how poor results like these can slip through unheeded...any ideas?
Because it is all automated. There is no human interference in the SERPs, apart from certain sites being banned, apparently. Google uses a particular algorithm that is tweaked for each update, and that ranks the pages. If the SERPs aren't good, then the algorithm is at fault; there is no "heeding" involved ;-)
It's funny. Everyone draws a line in the sand and says, "I am on this side of it, and the other side is spam." But no one knows for sure where Google's line in the sand is anymore ;-). Also, the closer anyone is to #1, the better they seem to feel Google's search results are!
Google's SERPs seem pretty good at this time (not only could I find my sites early on, but also other stuff I was looking for), but do hidden text, log spamming, cloaking, gb spamming, etc. still work? Sure they do, if you know what you are doing! So I wonder what that says about the people who feel there is no spam :-)
A happy (Sun)day to you all from a sunny beach in Phuket, Thailand :-).
BTW If you want to rank really highly in Google, add this new meta tag <META NAME="Googlerank" CONTENT="Hormel"> especially if you are one of my competitors ;-)
I was basing spam's definition on "irrelevant" content in the index for search terms.
..... which the webmaster of that site got there through means that are against Google's TOS.
If there are irrelevant results in the SERPs from sites built in accordance with Google's TOS, then it's just Google not doing its job properly, not spam.