| 5:13 am on Feb 7, 2014 (gmt 0)|
|I thought a 404 presents a "page" but it's not really a page on your site - just an error page |
This is one area where you can and should view humans and robots as entirely different animals.*
A robot receives a 404 response. It may or may not look at the content that accompanies the 404. (Your server always sends something. What a robot chooses to do with it is another matter, just like choosing whether to follow a redirect vs. pack up and go home.) Your job is to send the right numerical response.
A human sees the page that is triggered by the 404. This might be anything from a one-size-fits-all error document --"missing.html" or similar-- to a custom-built page designed to address any expectations raised by the request. Your job is to show the most useful information.
* Yes, I really do have a page whose query string says "animal=robot".
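To make the robot/human split concrete: on Apache, a single directive serves the custom error document while the status line stays a true 404. A minimal sketch; the path is just an example:

```apache
# Serve the human-friendly page, but keep the 404 status code
# that robots act on. "/missing.html" is an illustrative path.
ErrorDocument 404 /missing.html
```

One caveat: keep the ErrorDocument path local. Pointing it at a full URL (http://...) makes Apache issue a redirect instead, so the client sees a 302 rather than the 404 you intended.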
| 8:29 am on Feb 7, 2014 (gmt 0)|
Welcome to WebmasterWorld!
1) 'link:example.com' now shows a deliberately randomised selection of pages, and has done for a long time.
2) surveys of noteworthy/celebrity SEOs show that the majority still think that the 'title' tag is among the most important on-page ranking factors.
Perhaps the confusion was caused by the fact that Google does rewrite them in the SERPs.
Note: it's really not something to play around with, though. In 2012 people began posting that even slightly over-tweaking it will mess up your rankings in the short term, and Google does have a patent that detects tweaks and randomises rankings - to confuse anyone observing what changes to the title do, and to make 'tweak and observe' SEO a thing of the past. That also says to me that they do still rely on the title tag at least to some extent. My own feeling is that it's definitely not weighted as much as it used to be.
| 8:40 am on Feb 7, 2014 (gmt 0)|
|'link:example.com' now shows a deliberately randomised selection of pages, and has done for a long time. |
Partially true. The part I disagree with is "randomised". If it were random, then I would get different results each time I looked.
| 9:04 am on Feb 7, 2014 (gmt 0)|
Sometimes SEO gets in the way of common sense: the title shows up in the browser title bar, which is useful to a user with many tabs open, etc. And it is good HTML practice. Use it for THAT purpose.
Who wants broken links no matter how juicy? Fix them.
Any entity that claims they know what Google picks up is just plain.. insert your own choice of disbelief statements... I like "pulling your leg"...
| 9:58 am on Feb 7, 2014 (gmt 0)|
|If it were random, then I would get different results each time I looked |
... and that would be more useful, because if you ran the query enough times via proxies and scraped the results you'd still have an effective link diagnosis tool via Google.
I should have been clearer; what I meant is that there's no rhyme or reason to what they show, not that it changes every time.
| 10:13 am on Feb 7, 2014 (gmt 0)|
|Even a link to a broken page is passing juice, making broken link building more of a user experience issue than a linkbuilding tactic |
That statement confuses me too. If it did pass "juice" that would be to the custom 404 page not to a useful page (see posts above about bots and 404s). My understanding is that "link juice" flows to the page linked not to the site as a whole.
Maybe there is some confusion there with outbound links. Again my understanding is that every outbound link on a page absorbs a portion of the "link juice" that that page has to pass on so broken links and nofollow links are just pouring some of the available "link juice" down the drain.
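The even-split arithmetic behind that "down the drain" point can be sketched quickly. A rough Python illustration, assuming the classic PageRank model where a page's passable score is divided evenly across all outbound links (the 0.85 damping factor is from the original formula; the function name is just illustrative, and real ranking is far more involved):

```python
def outbound_share(page_rank, total_links, wasted_links, damping=0.85):
    """Classic PageRank splits a page's passable score evenly across
    ALL outbound links, so links to broken or nofollowed URLs still
    consume a share that working links could have received.
    Hypothetical simplification for illustration only."""
    per_link = damping * page_rank / total_links
    useful = per_link * (total_links - wasted_links)
    wasted = per_link * wasted_links
    return per_link, useful, wasted

# A page with rank 1.0 and 10 outbound links, 2 of them broken:
share, useful, wasted = outbound_share(1.0, 10, 2)
# share = 0.085 per link; 0.17 of the passable 0.85 goes down the drain
```

Under this model, removing the two broken links raises each remaining link's share from 0.085 to 0.10625 - which is the whole argument for fixing them.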
Of course G changes the rules on a regular basis as people try to game them, so any advice on SEO written a couple of years ago may be out of date.
| 12:27 pm on Feb 7, 2014 (gmt 0)|
Welcome to WebmasterWorld, RankNFile and thank you for starting this interesting thread!
|2. Title tag is useless/doesn't influence rank (I thought this was indeed useful if for no reason other than it's a clear aid in classifying a page and is the first thing you see in SERPs) |
This probably refers to the title attribute and not the title element (i.e. not the page title). And it would be correct for the title attribute (not really advantageous). This example shows how important it is to use the correct terminology. Here is the difference:
- title element: <title>My page title that appears in SERPs</title>
- title attribute: <a href="/best-widgets" title="See the best widgets">My best widgets</a>
| 1:19 pm on Feb 7, 2014 (gmt 0)|
Thanks everybody for replying - I figured these would be such ridiculous statements that nobody would reply lol
So what I'm gathering is that Title tags ARE indeed not the holy grail, but also are not "useless" - which is what I had concluded.
That the link operator is limited in its utility
And that 404s should just be fixed as a rule anyway - but that the notion of links to broken pages "counting" is specious at best. That about right?
| 1:27 pm on Feb 7, 2014 (gmt 0)|
|So what I'm gathering is that Title tags ARE indeed not the holy grail, but also are not "useless" |
It depends on what you mean by "title tag":
This: <title> or this: title="link title" ?
They are not the same and they have a very different role and a very different influence on ranking.
| 2:35 pm on Feb 7, 2014 (gmt 0)|
|And that 404s should just be fixed as a rule anyway |
Depends on what you're serving up. People have a tendency to freak out a little over 404s; I'm pretty sure the search engines *expect* 404s in some situations, like expired events or products that are no longer available. 404s that happen because something is broken absolutely should be fixed. 404s that happen because the page is naturally of no further use are supposed to 404.
But the answer to all these things is, use your common sense.
| 3:32 pm on Feb 7, 2014 (gmt 0)|
Even if the title tag didn't influence rankings at all, it's still the first thing someone sees when viewing results, and so it's still something important to consider.
True, Google may re-write it, but a lot of times they don't, so it still works for that purpose.
| 7:54 pm on Feb 7, 2014 (gmt 0)|
|I'm pretty sure the search engines *expect* 404s in some situations |
In some situations they even require 404s. Dump a lot of redirects onto a site within a short time, and you'll see the googlebot requesting a garbage URL to make sure you haven't started issuing "soft 404s". (This is originally google's term, isn't it? Humans use it because there's no real synonym.)
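If you want to run the same check yourself, here is a rough Python sketch of the garbage-URL probe (function names are illustrative, the probe needs network access, and this is only an approximation of what the googlebot does):

```python
import urllib.request
import urllib.error
import uuid

def probe_for_soft_404(base_url):
    """Request a URL that almost certainly doesn't exist and report
    the status code the server returns. Illustrative only."""
    garbage = f"{base_url.rstrip('/')}/{uuid.uuid4().hex}"
    try:
        with urllib.request.urlopen(garbage) as resp:
            return resp.status          # success here is suspicious
    except urllib.error.HTTPError as e:
        return e.code                   # 404/410 is the honest answer

def is_soft_404(status):
    """A missing page answered with a success or redirect status is a
    soft 404; 404 and 410 are the correct responses."""
    return status < 400
```

Usage: `is_soft_404(probe_for_soft_404("http://example.com"))` - a result of True suggests the site serves error pages with a success status, which is exactly what Google's garbage-URL requests are designed to catch.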
A few days ago (really) in a different thread I investigated different browsers' handling of the <title> element when you've got vast numbers of tabs open. It was educational.
| 10:10 pm on Feb 7, 2014 (gmt 0)|
According to what I've read in the past, the link: operator will show the number of "quality backlinks" the page has, but not the actual "quality backlinks" themselves. In other words, if the link: operator shows 12 backlinks, regardless of where they come from, then Google's algorithm credits the page with 12 quality backlinks. So what's important is the NUMBER of backlinks shown.
| 10:38 pm on Feb 7, 2014 (gmt 0)|
|So what I'm gathering is that Title tags ARE indeed not the holy grail, but also are not "useless" - which is what I had concluded. |
There is no "Holy Grail" when it comes to SEO, just a whole collection of factors whose relative importance changes from time to time as G tweaks the algo.
| 2:38 pm on Feb 8, 2014 (gmt 0)|
Broken link building
Your confusion re broken link building is well founded because you are misunderstanding it. BLB is the practice of identifying sites that have a broken link then approaching the webmaster to let them know about it and asking them to replace the non-existent URL with your URL.
|The part I disagree with is "randomised". If it were random, then I would get different results each time I looked. |
I agree. Randomised is not an accurate description. The backlink search has always shown a sampling. Perhaps "sampling" is a more accurate word than randomised.
Here is the history: Google used to show just links from pages PR 4 and up. But that led to the belief there was something wrong with pages less than PR 4, causing people to not link to them. This mistaken assumption continues, although most SEOs don't know why PR 4 is the threshold. Legendary Internet marketer, Dave Naylor, (http://www.davidnaylor.co.uk/) approached Matt Cutts during PubCon London (2005) with the problem and suggested a more useful "sampling" of backlinks that included sites from a broader range of PageRank levels. Matt thought it was a good idea and brought it back to the engineers at Google.
As far as Title Tags playing a role, Google has consistently encouraged using title tags that accurately described the topic of the page. I know from experience that an accurate title tag will help rank a page better than a title tag that is optimized for a high traffic keyword phrase. There's a difference. A high traffic keyword phrase is not always what the page is about.
For example, I was on a site review panel at PubCon and we were reviewing a site about example advice. The site owner's complaint was that he couldn't rank for higher traffic phrase. So I asked him, is your site about higher traffic phrase or is it about example advice. His answer was that it was about example advice. Having spidered his entire site and reviewed his title tags and the top level content of his site, I agreed and pointed out that that is what he should be trying to rank for, not the phrase with more keyword query traffic. His site was not about higher traffic phrase. His site was suffering from poor title tag choices and well, he knew enough SEO to shoot himself in the foot.
That was a classic case of SEO as a way to trick the search engines. SEO is about optimizing what you have, making your site topic crystal clear by making the right choices, such as page speed, good topic organization and site architecture. Trick SEO is about trying to convince the search engines that the topic is about something that it really isn't about. Trick SEO is a symptom of a failure to understand what optimizing is really about.
| 6:48 pm on Feb 8, 2014 (gmt 0)|
1) The link operator is not very useful for practical purposes, because it is "randomized" - Google does not want to let people freely view competitors' backlinks. So it actually shows a small subset of incoming links, changing it in some non-linear, unpredictable way.
2) The title tag is very important for giving meaning and importance to a webpage; try changing it, and in some cases your ranking should improve (under certain conditions, of course).
3) I can't really understand this point. There is a well-known link building strategy of finding 404 pages (i.e. exploiting specific footprints) and requesting a link to your good, free resource on your own website. But I don't think that is your focus now :)
| 1:52 pm on Feb 11, 2014 (gmt 0)|
So I'm still not sold on link operator.
But at least now I know there remains some real discussion about it. I thought this issue was 100% settled lmao
| 3:54 pm on Feb 11, 2014 (gmt 0)|
If you're concerned about what links Google is seeing, why not just download the list from Webmaster Tools? It isn't perfect, either, but at least you won't be relying on the link: operator.
Or, if you want valuable links from another site, use something like Open Site Explorer or Majestic.
| 4:26 pm on Feb 11, 2014 (gmt 0)|
1. Backlinks displayed in Google's WMT aren't necessarily what is helping the site rank. They could be passing zero or partial PageRank. This is why it's called a sampling or random sampling. The data is not meant to be useful for deconstructing how sites are ranked.
2. Just because a link is absent from Google's WMT, do not assume that the link is not important to your ranking. The links shown are a sample and do not represent what is or is not helping your site to rank.
3. It has been reported [webmasterworld.com] that third party backlink tools show only about 30-50% of total backlinks.
4. Third party tools can never tell you the value of individual links. They can guess but those are just guesses.
5. Scores assigned by third party organizations should be viewed with skepticism because they do not have the ability to calculate what caused a site to rank. Scores reflect characteristics of sites that typically rank. Those characteristics are the qualities held in common with other sites that rank. But those qualities do not represent what caused a site to rank.
| 12:13 am on Feb 12, 2014 (gmt 0)|
1, 2, 3. Fair enough. OSE definitely doesn't show all the links, just the highest-ranking ones by its measurements. Moz has been clear about that.
4, 5. True, it's just a guess, but isn't toolbar PageRank itself just a simplified version of the real algorithm? Yet people in the community seem to continue to hold it in high esteem, even though it's only updated every six months at best. At least some of the others are updated regularly, even if they're just guesses.
In any case, SEOs can and should use other processes like visual checking of link profiles and the websites themselves, to spot bad practices, because even Google misses some obvious stuff on sites it otherwise rewards with high PR. So nothing is truly fool-proof, even if it comes from Google.