Forum Moderators: Robert Charlton & goodroi
I set up a website about a year ago and linked to it from another site; the new site got a PR of 3/4.
The links from the source page have since been removed, with only the image alt text remaining on a different image (a slight booboo).
There is 1 inbound link from a "linkedin" page (with PR of 0).
However, following the recent PR update, the site increased from PR 4 to PR 5, even though the source site no longer links to it. The LinkedIn page has no PR, and there are no other links apart from internal links showing in Google or Yahoo.
So can a site stand alone without inbound links if it's already got an established PR? And why did it recently increase?
Many thanks
A
My assumption is that this unusual PR boost is one of the ways that Google helps "mom and pop" sites compete - something that Matt Cutts made a side comment about on his blog a few years ago. He never said WHAT Google does specifically, only that they do a few things.
Now if we only knew how they identify a "mom and pop", eh?
About four years ago I created a site for a local sports club. It has about a hundred pages, two listings in the ODP and a dozen or so naturally acquired (but low-PR) backlinks.
About six years ago I started an informational site that has, over the years, become somewhat of an authority in its niche, and has since then acquired hundreds of backlinks, many from good, related pages.
Now, here's the kicker: after the recent PR update, the local sports club homepage now has a PR5(!), whereas the homepage of the older, larger, more popular informational site went from 5 to 4.
What's going on here? How does a simple local site go from a PR3 to a PR5 without gaining any new backlinks? It seems to have been given some type of boost, but how, and why, I have no clue. Could this be indicative of changes in how small sites are ranked, too, or is this simply a toolbar PR thing?
It's also possible toolbar PR is just wrong due to a hash collision or whatnot.
I have a big site. 4 years ago, PR6. 3 years ago, PR5. 2 years ago, PR4. Now PR3.
I have more backlinks than before (grateful users, natural linking) and haven't engaged in any linking scheme... it just... happens...
I set up a page just to test a theory of mine about Google indexing. It had a robots noindex meta tag but was not blocked in robots.txt. It wasn't linked to from anywhere, and only Google's toolbar knew the page existed. My stats are password protected, so the URL couldn't have been found there. I set up an alert for links to that page. Contrary to some experiences here, the page didn't make it into the SERPs but, when I removed the noindex meta tag - bingo! - the page got PR in the update a few days later. Still no IBLs listed in any of the SEs or WMT.
Low value pages with even a few links from (even lower value) pages on social bookmarking and networking pages (and nofollow blog comments!) seem to be gaining PR over similar pages with a few new, solid PR4-6 links and a DMOZ entry. Again, contrary to how toolbar PR has behaved in the early years.
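For anyone wanting to repeat the noindex test described above, the distinction matters: robots.txt blocks crawling entirely, while a robots meta tag lets the page be crawled but asks engines not to index it. Here is a minimal sketch, using only Python's standard library, of detecting such a tag in a page. The example HTML and directive values are illustrative assumptions, not the original test page:

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the content of any <meta name="robots"> tags in a page."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            self.directives.append(attrs.get("content", ""))

# Hypothetical test page: crawlable (no robots.txt block) but asking
# not to be indexed, as in the experiment described above.
html = '<html><head><meta name="robots" content="noindex"></head><body></body></html>'
parser = RobotsMetaParser()
parser.feed(html)
# parser.directives now holds the robots directives found, e.g. ["noindex"]
```

Removing the tag (or changing it to "index,follow") is the moment, per the post above, when the page became eligible to show PR.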
What about from email services?
Do you generate RSS feeds?
What about services that autogenerate rss feeds?
Directory links that may be marked as NOFOLLOW?
I am pretty much convinced that links in Gmail pass some juice.
Ok, this deserves its own new thread on stealth links [webmasterworld.com]!
What about from email services?
This discussion touches on how Google may be crawling Gmail:
[webmasterworld.com...]
The argument is that they have to spider the content of the messages to serve AdSense, so maybe they are also crawling links and passing equity.
Perhaps this is the next big change that Google is testing, i.e. giving higher PR to sites within geo-specific niches.
Doesn't seem to be so.
I second karkadan's experience. We have a big, country-specific site, top 3 in its niche. The exact same PR drop year after year over the last 3-4 years, starting at PR7 and down to PR3 now, even as we increase the number of pages, improve usability, get a lot more visitors, etc. We don't do link schemes either (in fact we do zero link building), just normal stuff, like launching new sections on subdomains.
We take a drop in PR as Google's way of saying, "no we don't like you being a large site that doesn't fall into our pattern of large sites".
G$$gle giveth and G$$gle taketh away.
The site is 3 years old and used to run an old spam cloaking script in the gambling industry (read: obvious and blatant spam), but was never really used. For the past two years it has had zero content; the index page is an image and the word 'text'. No sub-pages.
It has 2 inbound spammy links that are also 3 years old.
It just went from a TBPR2 to a TBPR3. This is the last domain in the world that deserves to be a PR3, and I don't believe for a second that it truly is.
There is a TBPR phantom in our midst. Guess some spammers will be earning bank from link-sales this PR cycle.
I've actually stopped watching it...too much of a brain drain. First time I've thought of it in at least a year and checked our PR...lower than before with no change in search rankings. You're right...it just doesn't make sense.
We know that Google is a registrar and can pass trust through the registered site owner information. If the other site has high trust rank and/or authority, then I can see why PR may flow through. However, I think the most likely reason for this would be that the site with no backlinks probably shares the same IP as the other site.
PR bar means ABSOLUTELY NOTHING.
Absolutely nothing is too far IMO. It's data from Google and so can be interesting, given the right evaluation.
But the toolbar doesn't seem to be any indication of whether or not a URL has external links or references of value. I don't think it's even based on the same calculation as PageRank, and certainly undergoes lots of changes even if it does.
To me, PageRank (toolbar or otherwise) only matters inasmuch as it relates to whether a URL has received value and the URL's ability to pass on value.
It's possible to get a high PR without the need for a page to either receive a great deal of value, or be able to pass on any value. Whether you care likely depends on what your results are ;)
The "200" factors and constantly changing rules of the toolbar PageRank system have made it laughable. A site should not have success in the SERP listings and only be a PageRank "0" or "1" or "2" or whatever. A site's pages should not be grayed out because the page name isn't a popular keyword. The pages of a site should be given rank based on their level (home page 5, product pages 4, marketing pages and about/contact pages 3, etc.). The system made sense until about 2007. Now it's a mess and doesn't make any sense.
Google got carried away with spam fighting, and the PageRank system became a broken, inconsistent, and illogical tool that doesn't work.
SIMPLIFY, SIMPLIFY, SIMPLIFY the scoring of that tool and stop trying to play god with it.
Google got carried away with spam fighting, and the PageRank system became a broken, inconsistent, and illogical tool that doesn't work.
There is no alternative to using web graph metrics (PageRank or otherwise) for the problem of ranking content on the web in competitive categories. It's like voting in democratic elections: not perfect, and it doesn't always produce the best results, but there is no viable alternative that avoids a dictatorship. In this case that would mean only trusted, manually reviewed sites would rank. Would you like that to happen?
No, because manual review is no match for G's creepy crawlers. I agree, we don't want a G-MOZ.
I think some type of reset of the system would be nice, where sites are "one" with themselves and there are no partial gray-outs. Each page should have a logical weight in rank, working down evenly. And a site should get at least a 3 or 4 if it's a good site with a real business or purpose. The system now seems like Swiss cheese: too many loops, holes, and factors. As it is, a decent site can have a rank of gray or "0" or "1" with most of its pages grayed out.
That's not cool.
Come on, G, that's no way to run the show. If a site has supplemental pages you don't feel are important, give those pages a "1", not a gray bar or 0.
Gray bars should be for pages that have lots of broken graphics, pages whose code breaks and doesn't work, or obvious spam.