
Content, Writing and Copyright Forum

    
have content!! maximising PR by syndication
how to gain PR by syndicating my content
topr8
msg:930308
10:14 pm on Jun 19, 2002 (gmt 0)

We have a content-rich site with new articles/news added continually; the information is industry-specific.

I'm thinking of offering content to other sites, hoping to gain some traffic but mostly a boost in PR from the links back to the site.

This is virgin territory for me. Any ideas on what I should be looking into, or how to go about it?

 

WebGuerrilla
msg:930309
10:25 pm on Jun 19, 2002 (gmt 0)

The main thing you need to be concerned with is a duplicate content penalty. I know of a couple sites that use syndication as their primary marketing tool, and they ended up with a PR0 a few months back.

I also talked to a Google rep a while back who said that in the future, when duplicate content is found on two domains, the one with the lower PR would be penalized.

That could lead to a situation where having your article republished on a site with a higher PR could cause your site to get penalized.

Grumpus
msg:930310
12:16 pm on Jun 20, 2002 (gmt 0)

Penalizing sites for news feeds? What a sham.

Here's how to nip it in the bud...

Provide the content, but have them display it in an IFRAME. That way their page will be linking to you, but the content itself is still only at one single URL.

Another solution is to serve it as XML and display the content via JavaScript, with a required hard-coded link back to your site. The bots won't index the content, but you'll still get the link credit.
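
For anyone who wants to see what those two options look like on the recipient's page, here is a bare-bones sketch; example.com and the file names are placeholders, not anything from a real implementation:

    <!-- Option 1: the partner page embeds your article in an iframe, so the
         article text itself still lives only at your URL -->
    <iframe src="http://www.example.com/articles/widget-news.html"
            width="100%" height="400"></iframe>
    <p>Article hosted by <a href="http://www.example.com/">Example.com</a></p>

    <!-- Option 2: the partner page pulls the article with a script include;
         the spiders of the day ignore the scripted output, but the required
         hard-coded link below is plain HTML and still passes link credit -->
    <script type="text/javascript" src="http://www.example.com/syndicate/article.js"></script>
    <a href="http://www.example.com/">Content courtesy of Example.com</a>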

Good Luck!

G.

paynt
msg:930311
12:17 pm on Jun 20, 2002 (gmt 0)

Hi topr8,

The ideal situation, when I can get it, is to have the other site list only a portion of the article, with a keyword-rich text link back to my site where the reader can find the rest of the article. I use a different variation of a bio each time, with another link back to my main page. This benefits both sites and the visitor, plus it gives me the link and the authority.

PR is a side benefit from this type of self-promotion. I believe the main benefit is drawing in traffic from the content and creating authority for your site.

[edited by: paynt at 12:20 pm (utc) on June 20, 2002]

topr8
msg:930312
12:17 pm on Jun 20, 2002 (gmt 0)

OK, thanks for the tips. It's a new area for me.

ukgimp
msg:930313
1:08 pm on Jun 20, 2002 (gmt 0)

Grumpus

I am not sure WG was referring to RSS news feeds - more likely the practice of allowing articles to be recreated on other sites provided there is a link back. I have seen this on numerous occasions. I may be wrong; that's just the way I read it.

This must be potentially dangerous?

Richard

Jill
msg:930314
1:35 pm on Jun 20, 2002 (gmt 0)

It's a shame that we are penalized for sharing information and for linking to the sites that offer it. One of our sites is informational only. We do not sell anything but strive to give information to our readers. We have not yet been penalized, nor have any of our contributors at this point, but I'd imagine it may happen in the future.

We do provide links back to sites for the purpose of informing people of where they can find further information. I understand Google's need to control linking for the sole purpose of raising PR, but to penalize those of us who are only providing information in one of the best and simplest ways we can is not the answer.

WebGuerrilla
msg:930315
6:37 pm on Jun 20, 2002 (gmt 0)

I am not sure WG was refering to RSS news feeds, more likely the practice of allowing articles to be recreated on other sites providing there is a link back

You are correct. :) If you are allowing people to download and publish articles that are on your site, you are creating a situation where the original article on your site (and possibly your entire site) could end up with no PR.

Now, that doesn't mean you shouldn't do it. As paynt pointed out, any potential PR boost should be viewed as just some extra icing on the cake. The quality of traffic that comes from people clicking on your bio link can be even better than a new visitor from a search engine.

Grumpus
msg:930316
7:55 pm on Jun 20, 2002 (gmt 0)

Was on my way out the door this morning when I posted, so I tried to answer the question and put my rant off until later. It's later now...

I have several professional movie critics who e-mail me their reviews. I post them on my site, with their permission, blessing, and encouragement. (It gives them some traffic for their archives, and gives me a foundation of current reviews for my visitors to check out and hopefully encourage them to post their own). These reviewers also submit their reviews to other movie sites, multiple newspapers (which in turn, print them on their sites), and newsgroups.

Now, as far as I know, duplicate content is just that - duplicate content. I believe they're looking for a VERY HIGH percentage of identical text between the BODY tags. Therefore, under normal circumstances, your navigation controls, site header, etc. SHOULD be enough to prevent you from having what anyone would consider "duplicate content".
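
Nobody outside Google knows the actual test, of course, but the kind of body-text comparison I'm describing could be sketched roughly like this (pure speculation - the function names and the threshold are made up):

    // Speculative sketch only: strip everything outside <body>, drop the tags,
    // and see what fraction of one page's words also appears on the other page.
    function bodyWords(html) {
      var body = html.replace(/^[\s\S]*<body[^>]*>/i, "")
                     .replace(/<\/body>[\s\S]*$/i, "");
      return body.replace(/<[^>]+>/g, " ")
                 .toLowerCase()
                 .split(/\s+/)
                 .filter(function (w) { return w.length > 0; });
    }

    function overlapPercent(htmlA, htmlB) {
      var a = bodyWords(htmlA), b = bodyWords(htmlB);
      var seen = {};
      for (var i = 0; i < b.length; i++) { seen[b[i]] = true; }
      var shared = 0;
      for (var j = 0; j < a.length; j++) { if (seen[a[j]]) { shared++; } }
      return 100 * shared / Math.max(a.length, b.length, 1);
    }

    // If a check like this came back above, say, 90 for two URLs they might be
    // flagged as duplicates; navigation, headers, and the rest of the template
    // keep the number well below that for ordinary pages.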

You're not telling me that CNN and Fox News are going to get penalized for posting the same Bin Laden article that just came off the AP wire, are you? You're not telling me that all the newspapers that post Roger Ebert's reviews are going to get penalized, are you?

Come on, that's like saying that people in Boston are not going to be able to watch NBC News tonight because it originates in New York and has identical content. Bah.

IF Google is actually punishing syndicators and/or the people who receive the syndicated content, then EVERY news site, every portal site, every web directory, and virtually every other site with information (by nature, information HAS to come from somewhere, right?) would have a PR0. In fact, even Google itself would have a PR0, because EVERY bit of data on the site is a duplication of content elsewhere on the web (well, except for that snazzy logo).

There's something wrong up near the beginning of this thread, I think. I find it hard to believe that you'd get PR0'd for providing information. Ugh.

Okay, rant is over. <breathing>

G.

matthias
msg:930317
1:25 am on Jun 21, 2002 (gmt 0)

Right Grumpus

What would be considered duplicate content? If I provide an XML file with news, and anyone can parse this file and integrate the content into their site (I plan to do this) with their own design, will Google recognize that it is duplicate content?

vitaplease
msg:930318
5:23 am on Jun 21, 2002 (gmt 0)

WebGuerrilla
when finding dupe content on two domains, the one with the lowest PR would get penalized.

Interesting. If this penalty did happen, it would mean that most news agencies picking up an interesting story from an average site would PR0 the originating page of that story, because most news sites and their related pages have higher PRs.

Grumpus

Now, as far as I know, duplicate content is just that - duplicate content. I believe that their looking for a VERY HIGH percentage of identical text between the BODY tags.

I would agree with you, Grumpus. I think it takes more explicit copycatting to be zeroed out by Google, possibly over several pages of a site. More importantly, it would take predominant interlinking between these "identical content" sites/pages, without individually earned incoming external links, for the lower-PR page to be dumped into PR0-ness by Google.

If you look at the winners of the recent Google programming contest:

[google.com...]

also partially discussed in this thread:

[webmasterworld.com...]

you could see possibilities of Google taking into account the age of links (Laird Breyer) and unique identifiers in content (Thomas Phelps and Robert Wilensky) to establish "who was first". The problem here is that some (copycat) news sites are spidered by Google every 15 minutes, whereas the "originator" might be spidered (if at all) only every month - so whose link/content is older?

nell
msg:930319
9:21 am on Jun 21, 2002 (gmt 0)

>Provide the content, but have them display it in an IFRAME<

Not Netscape friendly.

chiyo
msg:930320
9:48 am on Jun 21, 2002 (gmt 0)

Just to correct a misunderstanding about RSS: RSS does not "re-create" content or whole articles on other sites.

RSS provides a way for webmasters to publish a short headline and summary of the latest news items or other info on their site. Each item links to a separate page which holds the full article for that headline and summary.

The RSS file resides on the host's server and is available to RSS readers and software, so other sites can use it (usually amalgamated with other RSS feeds). There seem to be almost 10,000 known publicly available RSS feeds on many subjects - mainly news.

RSS parsing software or simple perl/php scripts can indeed include the headline and small summary on other sites by accessing the RSS file and interpreting it for instant display on a second site. The idea is that the recipient site gets good, relevant content which changes quickly. When somebody on that site finds the headline interesting, they still have to click on it to go to the appropriate page on the original site. The only content in common is the headline and the small abstract (usually 0 to 10 words).
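
For anyone who hasn't seen one, a single item in such a feed looks roughly like this (the headline, link, and summary are invented for illustration):

    <item>
      <title>Widget prices fall for the third straight month</title>
      <link>http://www.example.com/news/widget-prices.html</link>
      <description>A one- or two-sentence summary shown on the recipient site; the reader clicks through to the original site for the full story.</description>
    </item>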

External RSS feeds are typically delivered by js, meaning there is no PR advantage for the source site. Some perl and php solutions mean the info IS hard-coded, so they may help with PR, but given the nature of RSS feeds (news items changing very fast) it is not in any way an effective new technique for PR, though it does increase hits in the normal way and raises awareness of your site if people find the summary useful.
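
To make that last point concrete, here is roughly what the two delivery styles leave in the recipient page's HTML (hypothetical markup, not the output of any particular script):

    <!-- js delivery: the spider only ever sees the script tag, so nothing here
         links back to the source in a way that passes PR -->
    <script type="text/javascript" src="http://www.example.com/rss2js.js"></script>

    <!-- perl/php (server-side) delivery: the headline and link are written into
         the page itself, so they are ordinary crawlable HTML -->
    <a href="http://www.example.com/news/widget-prices.html">Widget prices fall for the third straight month</a>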

Brett, how effective are you finding the WMW RSS feed? Any comments?

Brett_Tabke
msg:930321
3:52 pm on Jun 21, 2002 (gmt 0)

It's fairly significant overhead, actually. There's a lot of processing involved. If it weren't automated, I wouldn't be doing it.

> effective are you finding

It's hard to determine. I've not set up any follow-up tracking tools other than log watching. I'll keep doing it because it's all set up. I don't know if I would do it again, though.

Learning Curve
msg:930322
6:00 pm on Jun 21, 2002 (gmt 0)

What about this?

I buy a weekly syndicated "tips" column for my e-mail newsletters. The column also appears in several large newspapers and perhaps their websites. Recently, I put the columns up on my website, trying to boost my content for Google and to help show potential subscribers what they will receive.

Each column (about 40 in total) on my website has my logo, etc. at the top and bottom of each page. The purchased content, of course, is in the middle. Page size is about 6KB.

Do you think this may cause a penalty from Google?

paynt
msg:930323
12:15 pm on Jul 1, 2002 (gmt 0)

Hey Learning Curve,

I like your question. Instead of using a canned piece of content that dozens of other sites have access to, I would look into establishing a relationship with a copywriter. Think of the content I could then produce compared with canned content that's lost its impact through the simple process of dilution.

If I were to guess what Googlebot wants, in an odd twist on how we've traditionally analyzed the engines, I'd suggest a focus on making our sites richer, more exciting, and therefore, I believe, more successful. Google likes fresh content; I believe we all accept that. And of course Google loves a site that is well linked. My theory is that Googlebot just eats it up when those links weave through to pages that feed it fresh content.

Returning visitors appreciate fresh content, and they are the ones we're trying to convert, so it makes sense to appeal to them. This is where I push myself to think and link outside the box [webmasterworld.com]. Instead of being dependent on a source of content for my site that others are also using and that offers nothing new, by connecting with a copywriter I am able to become a producer of content. Imagine the possibilities.

So, your question was...

Do you think this may cause a penalty from Google?

Although this is the question you asked, I thought it deserved a broader response. I seriously doubt Google will filter out tip lines and news feeds.

What could be really cool is to connect with another site that has the kind of authority you're getting through the tip line you're presently using, and build a relationship with them. Have an independent site provide you with a tip column in exchange for the link and a bio. You both win: it doesn't cost anything, and you're providing fresh content to feed both the visitor and the Googlebot. That's another example of linking outside the box.

Once we start looking at the production of content, consider how we can tie it to linking opportunities, and embrace these tasks as challenges, we can produce better sites, because the motivation that sparks us has changed.

A bit of philosophy so early here for me on Monday morning. I sometimes go on and on and on in my head so pardon if it just has to blow out sometimes.

Peace to you all!

[edited by: paynt at 11:54 am (utc) on July 16, 2002]

Marcia
msg:930324
3:52 am on Jul 14, 2002 (gmt 0)

There are people who develop information pages for their own sites, and those pages are then used, in full or in part, as articles published on other sites and/or in newsletters that are later archived on web sites.

So the material does end up being duplicate content, but from the perspective of the original author rather than someone using others' content, are they then jeopardizing their own site by making it available to others?

I know someone who wrote quality articles for his site, and not only were the individual pages linked to considerably, but they were re-printed (with permission) by some high profile sites with links back to his homepage - not the articles themselves. His site was PR6 and the others were PR7 or PR8 (maybe higher). He's now PR5 with the interior pages having PR0.

I don't know if that's the sole reason for the penalty, but I wondered and worried when I first saw the articles reprinted. It gives some food for thought for those who make their own material available.

How much risk is there, and how much would need to be modified to play it safe?
