Forum Moderators: Robert Charlton & goodroi


Is it time to overwrite content until it sticks?


JS_Harris

8:30 pm on Aug 20, 2016 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



This has probably been discussed before, but given today's SEO environment, where the content on one page can affect the rankings of another, do you think it might be time to take a different approach to content creation? My thought is that a domain with an index page and 4 content URLs (articles) could be the starting point of a site, and each time a new article is posted it replaces the previous content on an existing URL. If you post once a week, then any given article would exist on the site for a month, maximum. You could adjust how many URLs are on the site based on how often you update, but the goal would be to have content exist for only a month before being rewritten.

Why? Because a month is long enough to know whether a page is generating any interest (from search or otherwise). If it's not generating, say, 500 views from search daily, it would be rewritten with a new title and content when its turn comes. If, however, it does meet the traffic minimum, it would be archived and a new URL created for the rotation.
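
A minimal sketch of that rotation in Python, under stated assumptions: the 500-views threshold above, a four-slot site, and hypothetical stubs (daily_search_views, mint_new_url) standing in for whatever analytics and CMS calls a real site would use.

```python
# Sketch of the "perform or get rewritten" rotation described above.
# daily_search_views() and mint_new_url() are hypothetical stubs for
# your analytics and CMS; the threshold matches the example above.

from dataclasses import dataclass

VIEW_THRESHOLD = 500   # daily search views an article needs to survive
ROTATION_SLOTS = 4     # content URLs kept in the rotation

@dataclass
class Slot:
    url: str
    article: str       # the article currently living at this URL

def daily_search_views(url: str) -> int:
    """Stub: average daily search traffic for a URL, from analytics."""
    raise NotImplementedError

def mint_new_url() -> str:
    """Stub: create a fresh URL to replace an archived one."""
    raise NotImplementedError

def rotate(slots: list[Slot], slot_index: int, new_article: str) -> int:
    """Publish new_article into the next slot; return the next index.

    A performer keeps its URL (it is archived out of the rotation and
    a fresh URL takes its place); a non-performer is overwritten.
    """
    slot = slots[slot_index]
    if daily_search_views(slot.url) >= VIEW_THRESHOLD:
        slot.url = mint_new_url()   # old article stays put, archived
    slot.article = new_article      # non-performer: overwritten in place
    return (slot_index + 1) % ROTATION_SLOTS
```

Each week you'd call rotate() with the new article: performers keep their URLs and leave the rotation, non-performers get overwritten in place.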

Pros: Any backlinks to a page would remain valid even as the page changed over time. Non-performers would be purged efficiently. The site would only grow if content proved worthy, thus maintaining a high average value.

Cons: The occasional backlink or bookmark would no longer point to the content that earned it.

What other issues or benefits do you think such a "perform or get rewritten" minimal-URL approach would have?

bwnbwn

12:14 am on Aug 21, 2016 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



First, I would never rewrite a page; you never know when the topic might become hot, or vice versa. I would write another article on the same subject and link from the first to the second as a "more info" read. Better to have more content on a subject than to subject yourself to rewriting it. Just my opinion.

JS_Harris

5:14 am on Aug 21, 2016 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Oh, I understand that train of thought well; I, like most every webmaster, have followed it forever. But how many non-performing pages do you have? How many articles drive the vast majority of your traffic? I bet your top 5% drives over half your traffic (general statement, not *at* you).
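
That 5% figure is easy to check against your own logs. A quick sketch, assuming a hypothetical page_views mapping of URL to monthly views pulled from analytics:

```python
# Sketch: what share of total traffic do the top 5% of pages drive?
# page_views is a hypothetical {url: monthly_views} mapping from
# your own analytics.

def top_share(page_views: dict[str, int], fraction: float = 0.05) -> float:
    """Fraction of all views captured by the top `fraction` of pages."""
    views = sorted(page_views.values(), reverse=True)
    top_n = max(1, round(len(views) * fraction))
    total = sum(views)
    return sum(views[:top_n]) / total if total else 0.0

# e.g. top_share(page_views) > 0.5 means the top 5% drive over half.
```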

Now if you could turn every one of the non-performers into $5 by selling it, or toss them onto another domain and sell that, you would be left with a top-notch site and some pocket change. Is "more" really better, given how search engines weigh page A in the rankings of page B? It's also much easier to create an efficient internal link structure to fewer pages, thus keeping their internal value high.

If an article gets posted in the forest and Google ignores it, does anyone ever read it? Perhaps the content doesn't have to vanish for good, only from that site. You can always "bring it back" if it does well elsewhere, provided you choose to store it on another domain.

These are just ideas... because not every article can be a home run, and it's easy for a winner to get lost in the crowd. An alternative to completely removing the content from the site and re-using its URL would be to add noindex tags to it. My general question is about recycling URLs to keep their numbers down; the exact method matters less.
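
The noindex variant is simple to wire up. A sketch, reusing the same hypothetical analytics stub as in the rotation example:

```python
# Sketch: flag non-performers with noindex rather than deleting them.
# daily_search_views() is a hypothetical analytics stub, as in the
# rotation sketch above.

def daily_search_views(url: str) -> int:
    """Stub: average daily search traffic for a URL, from analytics."""
    raise NotImplementedError

def robots_meta(url: str, threshold: int = 500) -> str:
    """Robots meta tag to embed in the page's <head>."""
    if daily_search_views(url) < threshold:
        # Underperformer: keep the URL live but ask engines to drop it,
        # leaving the option of re-indexing a rewrite later.
        return '<meta name="robots" content="noindex, follow">'
    return '<meta name="robots" content="index, follow">'
```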

keyplyr

9:30 am on Aug 21, 2016 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I don't have my pages compete against each other. Each page is specific content within the overall theme of the site. Each page targets specific search terms.

Years ago I had a few similar pages and they were eventually dropped from Google's SERPs, with only the page with the most juice remaining... I learned.

As for a couple of pages lowering the overall ranking of the entire site: while that may or may not be true, I haven't seen it myself. I have several pages with only a few sentences of content, and they are not adversely affecting the other 200 pages, AFAIK.

bwnbwn

4:24 pm on Aug 21, 2016 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I have never had an issue with writing several pages for a specific topic that can easily support targeting different keywords. Been doing it for as long as anybody here. If it is a very specific niche, yes, but I rarely see a niche so specific that it can't support several articles covering the topic. It keeps each read to a length where the user doesn't just do the scan thing, and IMO provides more useful content on the subject.

iamlost

8:06 pm on Aug 21, 2016 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I believe that overwriting content on specific URLs is counterproductive (it potentially invalidates the rationale behind backlinks, the purpose of bookmarks, etc.) and runs counter to the underlying architecture of the web.

However, I have been experimenting (variously, for three years) with changing how a page's content is delivered depending on context. For the past decade and more my pages have loosely followed the news-style inverted pyramid, where the most important details come first with the supporting information following - designed to satisfy both skimmers and readers.
Note: almost all my content pages are long form (1000+ words) to very long form.

As web publishers we've already generally separated structure (HTML), presentation (CSS), and behaviour (JavaScript); now I'm working at how best to separate the content from the semantics. This is an immensely difficult problem because context is not static; it is a dynamic process with a history. Much of what is commonly termed 'context' is really a snapshot of a specific moment, a context state; context itself is a process, a flow, a series of context states over time, and that history of previous states influences future ones.

Instead of thinking structure/semantics -> content -> presentation/behaviour for each target, also remove structure/semantics from the 'page', such that a given URL is totally amorphous. Think instead: context -> content -> structure/semantics -> presentation/behaviour.

One needs to collect specified user context information or metadata; build, within a framework, a (semi-)unique awareness of user context; match it against the user request (i.e. the URL), considering referer(s); then exclude the extraneous, rank the remainder, and output a customised contextual result, i.e. the page.

It is what I know (or think I know :)) about the visitor, aka the context, that determines which information (aka content), in what form(s), in which order, with what structure/semantics and associated behaviours/presentation... on-the-fly, semi-custom, segment-targeted or individually personalised pages made totally from scratch.

Rather than overwriting the page as per JS_Harris's suggestion, I am reordering page content and/or rewriting it (e.g. to age/education level) to serve the page's content in the most contextually relevant, aka personalised, way possible.
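
For concreteness, a loose sketch of that context -> content -> structure/semantics -> presentation/behaviour ordering. Every name and the toy scoring rule here is hypothetical: one possible reading of the approach, not iamlost's actual implementation.

```python
# Loose sketch of context -> content -> structure/semantics ->
# presentation/behaviour. All names and the toy scoring rule are
# hypothetical illustrations, not a real implementation.

def update_context(history: list[dict], url: str, referer: str) -> dict:
    """Fold this request into the visitor's context history:
    context as a flow of states, not a single snapshot."""
    state = {"url": url, "referer": referer, "depth": len(history)}
    history.append(state)
    return state

def rank_blocks(blocks: list[dict], context: dict) -> list[dict]:
    """Exclude extraneous content blocks, then rank the remainder."""
    relevant = [b for b in blocks if b.get("weight", 0) > 0]
    # Toy rule: returning visitors (deeper history) get detail first.
    def score(b: dict) -> int:
        return b["weight"] + context["depth"] * b.get("detail", 0)
    return sorted(relevant, key=score, reverse=True)

def build_page(blocks: list[dict], history: list[dict],
               url: str, referer: str) -> str:
    # 1. Context first: the URL request is interpreted through it.
    context = update_context(history, url, referer)
    # 2. Content: select and order blocks for this context.
    chosen = rank_blocks(blocks, context)
    # 3. Structure/semantics are decided only now, block by block.
    body = "\n".join(f"<section>{b['text']}</section>" for b in chosen)
    # 4. Presentation/behaviour (CSS/JS) would be attached last.
    return f"<main>\n{body}\n</main>"
```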

It is extremely difficult to get right. :)
And the SEs are rather, ummm... ambivalent; they prefer a static URL/page. Like a blind person, they have problems with frequent furniture rearranging. The room and the things in the room are simply assumed to have fixed positions...

Well, they will have to adjust (as we have had to adjust to their personalisation efforts) and accept the 'default', knowing it's not unique (so far they have), or become increasingly irrelevant to my sites and business. They are no longer the only, or even the main, supplier...

bwnbwn

10:07 pm on Aug 21, 2016 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



#*$! Iamlost, good post.

blend27

12:19 am on Aug 22, 2016 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



"The room and the things in the room are simply assumed to have fixed positions..."

Nicely said, nicely done!

Nutterum

7:54 am on Aug 22, 2016 (gmt 0)

10+ Year Member Top Contributors Of The Month



Great post, iamlost. I saw an enterprise business I was consulting for last year do just that. They personalized all the heavy-hitting landing/product pages using services like Demandbase, Optimizely and DoubleClick to present content and "personal ads" to more than 30 different audiences. They were very, very granular, and it worked quite well for them, with the leading audience groups reaching more than a 15% conversion rate (which for their business is A LOT) without losing other potential clients. Now, granted, that involves quite a bit of additional development work, but for me it is the marketing game of the future (read: in 2-3 years everyone will be doing it).

aristotle

5:03 pm on Aug 22, 2016 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



If you're talking about an informational site, to be successful in this day and age you need super high-quality articles filled with lots of unique and useful information. You also need to build the site around one central subject or theme, with an organization of the articles that covers the subject in a logical way. And you should plan everything before you start researching and writing. You can't just throw something together and then try to keep revising it.

JS_Harris

12:41 am on Aug 23, 2016 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Great post iamlost.

In this day and age Google is still using human raters, whose job seems to be to look at new articles that Google feels may be worthy of a top 3-5 ranking. That being the case, who knows how many of your pages can't rank in the top 3 because they didn't receive a high enough rating from a human evaluator. By rewriting content, and presumably having it re-evaluated, every article has the potential to be a winner and, overall, that's good for everyone.

No need for guesswork: if Google won't send traffic to a page after a month (or more on competitive subjects), that's a sign the page missed its mark. We've tried everything else to make weak pages rank, including deleting pages; my question is more about recycling URLs in a "try again" fashion. You could cover the same topic again and again until you get a hit with Google, so the URLs wouldn't necessarily change in topic, just content.

The new Google rater handbook shows that the E-A-T factor has been strengthened on a per-page basis, which is highly subjective. We have no way of knowing whether a human rating is holding an article back, except when it's a good rating and the page is receiving traffic. Instead of churning out new URLs we could churn out new articles and not keep any that don't stick. Twenty pages with a medium rating are likely holding back 1-2 pages with a good rating under Google's current algo.