| 4:48 pm on Dec 17, 2013 (gmt 0)|
A few months ago, I began using schema.org article markup and link rel="prev"/link rel="next" pagination for multi-page articles. I've been working my way through our "legacy content," adding the markup and pagination where it's most useful (e.g., on in-depth articles that can be many pages long).
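For anyone who hasn't set this up before, here's a minimal sketch of what that pagination looks like, using the fictitious "berries" article from below (the URLs are hypothetical):

```html
<!-- In the <head> of page 2 of a hypothetical four-page article -->
<link rel="prev" href="http://www.example.com/berries/page1.html">
<link rel="next" href="http://www.example.com/berries/page3.html">
```

Page 1 gets only a rel="next" link, the last page gets only a rel="prev" link, and every page in between gets both, so the crawler can walk the whole sequence.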
Google seems to like the changes, and that makes sense: Thanks to pagination, Google can understand that (to use a fictitious example) pages about blackberries, raspberries, lingonberries, and blueberries are part of a single, in-depth article on the topic of "berries" in addition to being about those berry-related subtopics.
|Some of you are probably thinking that using schema or HTML 5 simply makes it easier for the all-evil Google to steal your content. |
What Google "steals" is public-domain information that isn't protected by copyright. Complaining that Google "stole" Dover, Delaware from your American-state-capitals-dot-com site for the Google Knowledge Graph box is a waste of breath, because you don't own that information.
Also, there are times when a site owner will want Google and other search engines to "steal their content"--for example, when a business wants its address and hours to show up in Google's local listings, or when the organizers of an event (say, the Widgetberg Liverwurst Festival) want searchers to see the event's time and place on Google's SERPs. When you use schema business or event markup, there's an implicit assumption that you're offering your information to anyone who wants to use it, in the same way that you'd be launching that information into the public domain if you disseminated it in a press release.
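As a sketch, event markup of that kind might look like this in schema.org microdata (the festival details here are hypothetical, borrowed from the example above):

```html
<!-- A hypothetical event marked up with schema.org Event microdata -->
<div itemscope itemtype="http://schema.org/Event">
  <span itemprop="name">Widgetberg Liverwurst Festival</span>
  <meta itemprop="startDate" content="2014-06-21T10:00">
  <span itemprop="location" itemscope itemtype="http://schema.org/Place">
    <span itemprop="name">Widgetberg Town Square</span>
  </span>
</div>
```

Once you've published that, any search engine that parses microdata can lift the name, date, and place--which is exactly the point.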
| 5:32 am on Dec 18, 2013 (gmt 0)|
I think that HTML5 gives webmasters a good way to segment a web page, with tags such as <article>, <nav>, <header>, <footer>, <section>, <hgroup>, <aside>, <video>, and <audio>, plus link relations such as <link rel="prev">, <link rel="next">, and <link rel="search">.
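As a sketch, a page segmented with those elements might look something like this (the content placeholders are hypothetical):

```html
<body>
  <header>Site banner and logo</header>
  <nav>Main navigation links</nav>
  <article>
    <h1>Article title</h1>
    <section>
      <h2>First subtopic</h2>
      <p>Subtopic content...</p>
    </section>
    <aside>Related links</aside>
  </article>
  <footer>Copyright and contact details</footer>
</body>
```

Each element tells the parser what role that block plays, instead of leaving it to guess from a pile of anonymous <div>s.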
Search engines will find it easier to interpret what's on a web page, which may (or may not) result in some benefits for those who adopt these tags early. I don't know. I'm going to add them thinking that if they won't help, they won't hurt.
I would add OG (Open Graph) tags to the mix as well.
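For reference, a basic set of Open Graph tags goes in the <head> and looks something like this (the title, URLs, and image here are placeholders):

```html
<meta property="og:title" content="All About Berries">
<meta property="og:type" content="article">
<meta property="og:url" content="http://www.example.com/berries/">
<meta property="og:image" content="http://www.example.com/images/berries.jpg">
```

Those four properties are the core of the protocol; Facebook and other consumers use them to build the preview shown when a page is shared.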
| 6:08 am on Dec 18, 2013 (gmt 0)|
|Search engines will find it easy to interpret what is given in a web page... |
Good point! -- When used correctly, this markup lets a search engine interpret a page accurately where before it would have to guess at some things.
There's a recent thread here: [webmasterworld.com...] where the use of <section> is discussed -- Near the end of the thread, there are some examples of how different the interpretation of "what goes where" can be, depending on the proper use [or non-use] of sections.
I know there's quite a bit of "angst" against the "big sites Google loves", but one thing those sites do very well is markup their pages correctly for search engine interpretation of what's being presented.
Some of the markup now available that wasn't in the past can really impact the way a page is "understood" by search engines, and the "big boys" communicate very well with bots, which is something I think many sites are really dropping the ball on.
BTW: If people are concerned about Google "stealing" their information, they're probably in the wrong forum, because this one is for Google SEO [AKA Ranking in Google] and for a site to rank Google's bot needs to access the information presented and the algo needs to be able to interpret it correctly.
Withholding "as easy to interpret as possible" information from bots and doing effective SEO are becoming more and more mutually exclusive.
If someone really doesn't want to take a chance on Google "stealing" anything from them, and they refuse to use the tools we have to make sites easier to understand algorithmically, then this is the wrong forum and Google SEO is really not for them. To rank well [which is what SEO is about], people have to put their information out there for Google to access in a machine-understandable way.
[edited by: phranque at 5:09 pm (utc) on Dec 18, 2013]
[edit reason] requested by JD_Toims [/edit]
| 6:35 am on Dec 18, 2013 (gmt 0)|
The only issue I see right now is cross browser support, especially older versions of IE which literally hold back the web.
| 7:28 am on Dec 18, 2013 (gmt 0)|
I wouldn't worry about it -- Older browsers ignore tags they don't understand, and some major sites, e.g. Google [and I think Bing], have already dropped their support for them, so those who can't let go of their old browser aren't going to be able to surf the web for very much longer.
Personally, I've definitely quit worrying about the relatively small number of people who haven't updated their browser in the last, what, 7 years? IE7 was released in 2006, it's 2014 in less than 3 weeks.
IMO it's time for people to quit "not doing things" in an effort to cater to those who refuse to install a browser that's less than 5 years old.
Added: The biggest issue with older browsers [most of the time] is the styling/style sheet, not the actual HTML -- Where a video is played via HTML5 there might need to be an alternative for some legacy browsers, but for most sites the actual HTML really isn't an issue.
| 12:07 pm on Dec 18, 2013 (gmt 0)|
We did a rebuild on a not too old but big site recently. Moved it all to html5. Old browsers be damned!
Google traffic is up but that could be because we played with the on page SEO while we were at it.
The thing that I got out of it is how much I love html5. Even if Google didn't like it I think I am still going to make an effort to move my personal sites over.
It is just so easy and neat! I got a thing for tidiness!
| 3:02 pm on Dec 18, 2013 (gmt 0)|
What we're talking about here--making it easier for search engines to understand what's on our pages--isn't all that different from what we were told to do back in the heyday of meta tags (sic) for keywords and descriptions. Even before Google came along, the standard advice was to provide easily-digestible and recognizable "spider food" for search crawlers.
Thinking of digestion leads to food for thought:
Will schema markup, HTML5 markup, etc. be abused in the same way that keywords meta tags and alt text were back in the day?
Already, Google Authorship markup is being misused--sometimes by big companies that know better or should know better. (A few months ago, one of the big hotel sites was slapping authorship markup on its boilerplate booking pages until Google wised up and removed the author byline and photos from its SERPs.)
Example: What's to keep a thin affiliate from using Schema.org "article" markup on product pages if it thinks (rightly or wrongly) Google will look upon those pages more favorably? And even if such tricks aren't successful, mightn't they reduce the value of the markup for Google and other search engines?
| 3:21 pm on Dec 18, 2013 (gmt 0)|
|Example: What's to keep a thin affiliate from using Schema.org "article" markup on product pages if it thinks (rightly or wrongly) Google will look upon those pages more favorably? And even if such tricks aren't successful, mightn't they reduce the value of the markup for Google and other search engines? |
IMO the use of structured data improves "understandability", but it's not the "be all, end all" of rankings. Search engines aren't going to "throw all other factors out" just because they understand a page, so thin affiliates will still be thin based on content -- possibly even more easily identifiable as such when there's no "guessing" needed about what's what on a page and how pages are related to one another.
IOW: Calling something an "article" won't suddenly change the content from "thin" to "quality and informative" by any means, and could really back-fire by making thin content more easily identifiable as thin.
I think one of the biggest advantages of structured data is *accuracy* wrt what something should rank for: which page of the site should rank for a phrase, what the main topic of a site is, what subtopics are included, what a page actually says about a topic, and things along those lines. That may not [it may, but it may not] directly impact ranking position, but it could easily impact the phrase(s) ranked for, which would impact conversion rate, visitor satisfaction, etc.
So, imo, the idea of "I'll outrank someone with my thin content if I mark it up as an article." is likely false.
But, the idea of "Search engines will be able to more accurately rank my pages for visitor's queries; which will increase conversion rates and visitor satisfaction; which will likely increase sharing of my site by word-of-mouth, social media, natural topical links, etc; which will increase 'positive signals' for search engines; which will have a positive impact on ranking position." is much more likely to be true.
Basically, for structured data to work in someone's favor, they have to "have something" to start with.
| 5:30 pm on Dec 18, 2013 (gmt 0)|
|I wouldn't worry about it -- Older browsers ignore tags they don't understand and some major sites .. |
Hey, I am talking about IE9 (15% of global users) and IE8 (8% of global users).
Many HTML5 (and CSS3) properties aren't supported by these browsers without a JS workaround.
| 8:13 pm on Dec 18, 2013 (gmt 0)|
|IMO it's time for people to quit "not doing things" in an effort to cater to those who refuse to install a browser that's less than 5 years old. |
The users don't exist for our benefit. We exist for theirs.
| 11:07 pm on Dec 18, 2013 (gmt 0)|
Which ones exactly, of those that would impact the "average website", aren't supported -- and how many of those affect anything besides *style*? I've been using HTML5 since before it was the recommendation, and with the exception of some "advanced" elements [not used on the average site], the only issue I've run into is style, which isn't that difficult to overcome.
|The users don't exist for our benefit. We exist for theirs. |
Well, imo, it would be to their benefit to update their browser so the web works the way it should -- and not everyone seems to think they're important enough to keep maintaining 2 [or more] sets of code for. See below.
For those not wanting to break anything in older browsers, here's a short list of sites/services not quite as concerned about older browsers:
|jQuery 2.0 Drops Support for IE6, 7 and 8 |
|Google will drop support for Microsoft's Internet Explorer 7 (IE7) and Mozilla's Firefox 3.5 browsers for its online apps, including Gmail and Docs. |
"Beginning August 1, we'll support the current and prior major release of Chrome, Firefox, Internet Explorer and Safari on a rolling basis,"
Google will drop support for Microsoft's Internet Explorer 8 (IE8) for its online apps and services in mid-November, effectively ending support for many users of Windows XP.
|[Dec. 2011] |
Facebook is starting to phase out support for Microsoft Internet Explorer 7 (IE7). First to go is the service's new Timeline profile; when IE7 users visit Facebook profile pages, they don't see the Timeline version, and are instead presented with the old profile design.
|Internet Explorer 6, 7 and 8 no longer fully supported |
|When the Tweet button was introduced in 2010, Microsoft’s Internet Explorer 6 was our baseline. Now, most users have upgraded and we see IE6 usage dwindled. On May 13th 2013, we’re going to prune our support for that browser and its successor, IE7. |
[edited by: JD_Toims at 11:16 pm (utc) on Dec 18, 2013]
| 11:15 pm on Dec 18, 2013 (gmt 0)|
According to this, Google does not recognize the article schema markup. https://support.google.com/webmasters/answer/99170?hl=en&topic=1088472&ctx=topic
Am I wrong?
| 11:20 pm on Dec 18, 2013 (gmt 0)|
Schema.org is a collaborative effort between Google, Microsoft and Yahoo! The other alternatives are available and documented in the help section, but they're no longer the "recommendation" and who knows if/when they'll drop support for those?
|Historically, we’ve supported three different standards for structured data markup: microdata, microformats, and RDFa. Instead of having webmasters decide between competing formats, we’ve decided to focus on just one format for schema.org. In addition, a single format will improve consistency across search engines relying on the data. There are arguments to be made for preferring any of the existing standards, but we’ve found that microdata strikes a balance between the extensibility of RDFa and the simplicity of microformats, so this is the format that we’ve gone with. |
| 11:27 pm on Dec 18, 2013 (gmt 0)|
|According to this, Google does not recognize the article schema markup. |
I didn't see a date on the page that you cited (and I didn't watch the video), but this page suggests otherwise:
For what it's worth, I've run hundreds of pages with schema.org "Article" markup through Google's Structured Data Testing Tool, and the tool hasn't had any trouble finding and displaying the marked-up elements.
| 11:33 pm on Dec 18, 2013 (gmt 0)|
The list of what they recognize is listed below the video.
|Google supports rich snippets for these content types: |
Businesses and organizations
| 11:38 pm on Dec 18, 2013 (gmt 0)|
Schema.org markup === microdata (recommended)
Edited: Just reread the page for the 4th time -- Above the list you're referring to it says:
|Google supports rich snippets for these content types: |
Not that they're the only ones recognized.
| 11:57 pm on Dec 18, 2013 (gmt 0)|
Is there much of a difference between recognize and support? The thread is about what helps rankings.
| 12:01 am on Dec 19, 2013 (gmt 0)|
There's a big difference between "supports rich snippets for" and "only recognizes markup for", imo.
[edited by: JD_Toims at 12:05 am (utc) on Dec 19, 2013]
| 12:04 am on Dec 19, 2013 (gmt 0)|
Well to me, support would indicate that these are the ones that they like vs. recognize would be the ones that they can see.
| 12:05 am on Dec 19, 2013 (gmt 0)|
The ones they like For Rich Snippets.
Added: From the same source I cited above.
|Having a single vocabulary and markup syntax that is supported by the major search engines means that webmasters don’t have to make tradeoffs based on which markup type is supported by which search engine. schema.org supports a wide collection of item types, although not all of these are yet used to create rich snippets. |
| 12:33 am on Dec 19, 2013 (gmt 0)|
How have the rich captions worked on Bing and Yahoo? I tried some products and did the feed, etc., but haven't seen any results. Have any of you seen positive results there?
| 1:23 am on Dec 19, 2013 (gmt 0)|
I haven't noticed any rich snippets on Bing or Yahoo! from using HTML5 or microdata, but I don't remember ever seeing many rich snippets on Bing for searches and Bing is my default SE for personal searches -- As far as Yahoo! goes, what is Yahoo! besides an e-mail service again? lol
To me though, using correct semantics is about way more than "do I or don't I get rich snippets" -- They can be a nice "bonus", but they're not the main reason I use the structure and markup I do.
| 1:35 am on Dec 19, 2013 (gmt 0)|
Regarding article markup:
When Google announced "In-depth articles" results last summer, one of its recommendations (for publishers who wanted their articles listed) was to use Schema.org article markup.
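As a sketch, that Article markup looks something like this in microdata form (the headline, author, and body here are placeholders):

```html
<!-- A hypothetical page marked up with schema.org Article microdata -->
<div itemscope itemtype="http://schema.org/Article">
  <h1 itemprop="headline">All About Berries</h1>
  <span itemprop="author">Jane Doe</span>
  <time itemprop="datePublished" datetime="2013-12-17">December 17, 2013</time>
  <div itemprop="articleBody">
    <p>The article text goes here...</p>
  </div>
</div>
```

Google's Structured Data Testing Tool will show you exactly which of those properties it extracts from a given page, which is a quick way to verify the markup before relying on it.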
| 2:51 am on Dec 19, 2013 (gmt 0)|
Here's some material that I've posted elsewhere, which I hope will clarify some basics and lead to helpful resources about schema.org markup (vs other formats) moving forward....
|Ways of marking up a page for a Product |
|Going forward, schema.org is the microdata format that will be supported by the big three search engines, but you can keep what you've already done in the other formats mentioned.... |
The thread briefly compares microdata and schema.org with RDFa and microformats, and differentiates these from the Facebook Open Graph Protocol.
| 3:54 am on Dec 19, 2013 (gmt 0)|
PS: I should add to the above that I believe that Google is now using schema markup as a guide only, and is looking for confirmation on multiple other levels. Here's a thread where I relate the loss of rich snippets to other signals....
Disappearance of rich snippets
|Over the years, I've described Google's index as a multi-dimensional model of user behavior and the web. I think that multiple indications of trust, authority, and user popularity are increasingly being considered to confirm all markup, all links, all content, etc. As Google gains confidence in its ability to make accurate discriminations, it will do so... and low quality sites which don't merit rich snippets are not going to get or retain them. |
At this point, I believe that schema markup will make it easier for Google to understand your site, though I doubt it will fool Google into thinking the site is "better" than it is. While the above discusses rich snippets, the principle applies to all the purposes semantic markup might serve. Chances are that some sites right on the fuzzy line may get a temporary boost in a particular area from schema markup, and they might then drop back down.
Certain kinds of signals are likely going to be very hard for Google to parse without the markup. Ranking is probably the area where Google is most particular about requiring multiple signals. That said, I think that schema can be extremely helpful in ranking some types of content if the other signals are in place.
| 1:36 pm on Dec 19, 2013 (gmt 0)|
I am not sure how they are using it. I have about 10K different products. After implementing it, I have seen some strange patterns. It is hard to describe: some items sell like normal, and then some do unusual things. One item never sold, but now I sell a few a day, which is strange. Others go in streaks where it is a mad rush, with people buying 25 per order (which is strange). We have noticed that at times it can wipe out your inventory of select items. It is hard to say whether it is the structured data, or Google's machine learning, or a combination of those with AdWords. I can't say I saw any ranking difference.
| 2:38 pm on Dec 19, 2013 (gmt 0)|
|Some of you are probably thinking that using schema or HTML 5 simply makes it easier for the all-evil Google to steal your content. |
WARNING: I am seeing new Google SERP layouts that contain NO traditional results at all. They're pushing the envelope even further, and now our content IS their results page. The entire page (below the ads) is just content from various sites, with a footer credit link containing only your homepage URL as anchor text (no link to the page the content was taken from, no title, no words in the anchor text).
Don't be in such a hurry to give Google your content, you'll know why when you see one of these new results pages where the "google knowledge" entirely replaces the search results.
| 3:13 pm on Dec 19, 2013 (gmt 0)|
If Google can display a sentence of your content or even a paragraph and it fully satisfies the consumer causing you to go broke, then your business strategy needs rethinking.
|You need to figure out a new strategy for your site to deal with the current situation so you can incorporate these best practices and still profit. |
I wish Google showed only links and provided zero text so we would get even more traffic but that is not going to happen. Getting frustrated will not help my business survive. Getting creative and dealing with the evolving serps will help my business survive.
I agree that people should not rush into semantic searching. They should realize there are pros & cons that need to be addressed in a well thought out strategy.
This new type of serp that Sgt. Kickaxe ran across seems like it might be another test. It does link to the page that Google gathered the data from. The link is displayed after the description instead of before; it is grey instead of blue and has the text "read full answer at ...", which feels like Google is vouching for the quality of information at that site. It is different but not necessarily worse. It seems like these websites got pushed to the top of the serps because of their semantic strategy. You can ignore semantic searching, but your competition probably won't. I suggest we all figure out how to live with these new and constantly evolving serps.
| 3:57 pm on Dec 19, 2013 (gmt 0)|
|If Google can display a sentence of your content or even a paragraph and it fully satisfies the consumer causing you to go broke, then your business strategy needs rethinking. |
Ditto for sites that just serve up facts that are in the public domain. If you can look up the world's largest cities, the height of the Empire State Building, the presidents of the U.S., the current time and temperature in Hoboken, or the new VW Jetta's specifications and display that information on your pages, so can Google or Bing.
| This 59 message thread spans 2 pages |