

Can a constantly changing page negatively affect rankings?

     
1:53 am on May 17, 2019 (gmt 0)

Senior Member

WebmasterWorld Senior Member Top Contributors Of The Month

joined:Apr 1, 2016
posts:2662
votes: 793


On my site I have a few summary pages which display aggregated data from all the other pages on the site. These pages have thus far been the most popular in the eyes of Google, but this hasn't always been the case.

The issue is that the data on the pages is dynamic: each time I update one of the specific pages with new data, it updates the aggregate page. The changes are marginal; in fact, as I add data the marginal impact decreases. As an example, a displayed time may change from 30:08 to 30:07. But as small as the change may be, it remains a change. Data is updated on an ongoing basis, as I have time to update and add to it, sometimes a few times a day, sometimes every few days.

The nature of the page is such that this data is expected to be more evergreen than real-time. Given that the changes are marginal, they have no impact on the typical user, and there is a note in the write-up explaining that the data is subject to change.

My concern is really from Google's perspective. When I first launched the site I had no traction with these pages, or any pages for that matter. But I continued to feed the system; the site was new and it needed data. After a few months I still couldn't move the needle, and other opportunities arose (the kind that pay the bills), so I stopped updating the site. Then last October, I started to get traffic to these summary pages. So I put some effort into cleaning up the pages, and traffic continued to grow. Over the past few months I started updating the data again. Now I have noticed a drop in traffic.

Could my updates be causing this drop? Should I decouple the page from the live data? I would prefer not to, as it adds a layer of maintenance, and there is already too much of that with the constant need to update data.
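If I did decide to decouple, the lightest-weight version I can think of would be a periodic snapshot rather than a live query: a scheduled job rewrites a snapshot file at most, say, once a week, and the page renders from that file. A rough sketch in Python; the paths and the compute_aggregates() helper are made up for illustration:

import json
import time
from pathlib import Path

SNAPSHOT = Path("/var/www/data/summary_snapshot.json")  # hypothetical path
MAX_AGE = 7 * 24 * 3600  # refresh the public numbers at most weekly

def compute_aggregates():
    # placeholder for whatever currently feeds the summary page live
    return {"avg_interval": "36:47", "avg_duration_min": 93}

def refresh_snapshot():
    # skip the rewrite while the snapshot is still fresh, so the
    # rendered page stays byte-identical between refreshes
    if SNAPSHOT.exists() and time.time() - SNAPSHOT.stat().st_mtime < MAX_AGE:
        return
    SNAPSHOT.write_text(json.dumps(compute_aggregates()))

if __name__ == "__main__":
    refresh_snapshot()  # run from cron; the page template reads SNAPSHOT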
11:19 am on May 17, 2019 (gmt 0)

Preferred Member

10+ Year Member Top Contributors Of The Month

joined:Feb 5, 2004
posts: 593
votes: 79


Since they are marginal I doubt the changes are causing the drop. More likely one of Google's many algo tweaks has caused it.

In my niche, a pile of my competitors change just the titles of their articles every month (they add a variation of "Updated May 2019" to the end). If Google cannot catch that grey-hat SEO technique, I am sure your legitimate changes will be fine (I realize their frequency of change is not as high as yours).

I have a few pages that get updated on a weekly or more frequent basis. I add or change a sentence or so based on new information I see, and I have never had a problem.
5:53 pm on May 17, 2019 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member tangor is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Nov 29, 2005
posts:10110
votes: 1002


I'd also keep in mind that g is not WATCHING the changes ... it comes back to see what HAS changed. What is incremental to you might be something different to the search engines. They don't live on our sites 24/7 ... at least I hope not!
7:40 pm on May 17, 2019 (gmt 0)

Senior Member

WebmasterWorld Senior Member Top Contributors Of The Month

joined:Apr 1, 2016
posts:2662
votes: 793


I'd also keep in mind that g is not WATCHING the changes ...

Google may well be watching. A site that provides data that should legitimately be changing in real time is likely being "watched" by Google, due to their own need to remain current. On the other hand, there are sites that display data where change does not occur frequently, and so Google is likely not "watching". Google has said that they crawl pages that change frequently more often than those that don't.

My concern is that my site is changing when it is expected not to be changing. Each time Googlebot goes to the page it sees a change, but given the nature of the content it expects not to see a change.

A good analogy is stock price vs. quarterly earnings. During trading the stock price changes continuously, so Google follows it continuously; but quarterly earnings are typically published every three months, except for an occasional addendum or correction. Now, if Google begins to see that the quarterly earnings are constantly changing like a stock quote, would that not be an issue?
8:44 pm on May 17, 2019 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member tangor is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Nov 29, 2005
posts:10110
votes: 1002


No. All that means is the page data changed. Overthinking this.

Keep pages current/updated and don't worry about g.
9:16 pm on May 17, 2019 (gmt 0)

Senior Member

WebmasterWorld Senior Member Top Contributors Of The Month

joined:Apr 1, 2016
posts:2662
votes: 793


You're probably right.
9:40 pm on May 18, 2019 (gmt 0)

Senior Member

WebmasterWorld Senior Member aristotle is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Aug 4, 2008
posts:3630
votes: 365


Google has said that they crawl pages that change frequently more often than those that don't.

That's true. But it isn't the only factor. For example, pages that get a lot of traffic from google search will be crawled more frequently than pages that don't get much traffic. Also, on a very large site, the individual pages are crawled less frequently, other factors being equal.

You can check your server logs to see when googlebot crawled those pages. Also, Search Console shows the last crawl date, except that sometimes it's slow to update the information, even appearing to lag several weeks behind. Another way is to check google's caches of those pages, although I'm not sure how reliable that is either. The best information is in your server logs.
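If it helps, even a crude pass over the raw access log will show the crawl pattern for specific pages. A rough sketch in Python, assuming a combined-format Apache/nginx log; the log path and page paths are made up, and since the user-agent string can be spoofed, verify genuine Googlebot via reverse DNS if it matters:

import re
from collections import Counter

LOG = "/var/log/nginx/access.log"  # adjust to your server
PAGES = ("/summary", "/stats")     # hypothetical summary-page paths

# combined log format: IP ident user [day/Mon/year:time zone] "GET /path ..."
line_re = re.compile(r'\S+ \S+ \S+ \[(\d+/\w+/\d+)[^\]]*\] "GET (\S+)')

hits = Counter()
with open(LOG) as f:
    for line in f:
        if "Googlebot" not in line:
            continue
        m = line_re.search(line)
        if m and m.group(2).startswith(PAGES):
            hits[(m.group(1), m.group(2))] += 1  # (day, path) -> count

for (day, path), n in sorted(hits.items()):
    print(day, path, n)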
12:57 am on May 21, 2019 (gmt 0)

Senior Member from GB 

WebmasterWorld Senior Member redbar is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:Oct 14, 2013
posts:3229
votes: 496


Keep pages current/updated and don't worry about g.


Yep. Basically most of my sites have evergreen information and therefore are not changed/updated very often, maybe once a month at most, and G understands that. Then again, I have a hotel/pub site which is updated on an almost daily basis; G visits seemingly daily and keeps things relevant for search enquiries.

How does this affect rankings?

Certainly this site ranks on the first page all the time for typical local/regional hotel search queries, just as its competitors do, and they rarely update their sites. However, this site also gets a lot of long-tail queries and other local/regional searches, which is how I designed it.

Therefore, IME, updating/changing pages is not negative so long as the changes are relevant to the business, and this applies to all search engines.
1:22 am on May 21, 2019 (gmt 0)

Senior Member

WebmasterWorld Senior Member Top Contributors Of The Month

joined:Apr 1, 2016
posts:2662
votes: 793


updating/changing pages is not negative so long as the changes are relevant to the business

Agreed, but the key to this is:
so long as the changes are relevant to the business

The worry in my particular case is that the changes will be perceived as not being relevant to the business.
8:19 am on May 21, 2019 (gmt 0)

Moderator This Forum from US 

WebmasterWorld Administrator robert_charlton is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Nov 11, 2000
posts:12338
votes: 400


Another way is to check google's caches of those pages, although I'm not sure how reliable that is either
aristotle.... Google has been emphasizing for quite some time now that cache data does not necessarily reflect the most recent page data. Even the cache date is not an accurate indicator of when the page was last crawled, or of anything to be depended upon.

-----

Nick... regarding constant changes on some pages... you clearly have to pay some attention to what elements of the pages are changing. Eg, you wouldn't want these changes to be affecting your page titles or main headings/headlines, or prominent content at the top of your page, if Google isn't used to such changes.

I've had several clients over the years who, in essentially static sites, intermittently used their home pages for announcements, news of changes of hours, new products, etc. It's often been a major struggle to get them to drop the practice... and the question I've asked myself is how engines decide what to respond to and what not.

I believe that Google does adjust itself to certain patterns of changes on a site, but not to others. It has, eg, "learned" how to look at blogs like WordPress, or news sites, or at forums, like WebmasterWorld, so it's not thrown for a major loop when front page content changes. Permalinks then ultimately link to full articles on the topic.

But on sites with traditionally static pages, putting a large vacation notice on a previously optimized front page, eg, can destroy rankings.

I usually try to get such intermittent news put in the top right sidebar, with the eye drawn to excerpts by a contrasting color... and to link to more detail on internal pages.

It's also possible to iframe the data, though that's not an approach that Google likes. If there's a lot of it that's changing all the time, I suggest in some cases adding some static content up at the top that serves as a summary of, say, the numbers that are changing. If hotels... with multiple hotels listed on a summary page, it's helpful to have what might be perceived as "core content" remain constant and establish a location, and then have price fluctuations shown more as attributes, not as core info. Again, link to detailed pages from your summary pages.

Nick, I still haven't gotten my head around what kind of data you're showing. Weather or financial data, eg, might be more volatile... so I feel there needs to be a context that serves to frame and describe what it is that's changing (if that makes sense).

I think the most important thing is that you keep enough context sufficiently constant so the particulars don't matter that much. Fivethirtyeight is a major news analysis site with constantly changing survey, political, and sports data, and might be a helpful example.

The pages don't rank for generic [XYZ playoff projections 2019] type searches, but if you add a word that refers to a mathematical model they use, like "CARMELO", then they're up at the top. I don't know how this might apply to what you're offering. Fivethirtyeight is worth studying for lots of reasons. I take the liberty of mentioning it because it's as well known in its field as Amazon is in shopping. Lots to be learned from studying either.

2:00 pm on May 21, 2019 (gmt 0)

Senior Member

WebmasterWorld Senior Member Top Contributors Of The Month

joined:Apr 1, 2016
posts:2662
votes: 793


you clearly have to pay some attention to what elements of the pages are changing. Eg, you wouldn't want these changes to be affecting .... prominent content at the top of your page


Unfortunately this is exactly what is being impacted. The top of my pages lists 8 bullet points that are appearing as rich results in search.

Nick, I still haven't gotten my head around what kind of data you're showing

So to better explain: the website takes data from events and displays statistics about each event. It also aggregates the data to create a baseline of like events, and the stats for each specific event are then contrasted against the baseline. I then have a page where I describe the aggregate, a summary page. Basically I display something to the effect of:
- type X events occur every 36 hours and 47 minutes.
- on average, type X events last 93 minutes.

Now these events are taking place on a continuing basis; each time a new event occurs, the data is updated. That update then automatically changes the aggregate, and the 36 hours and 47 minutes becomes 36 hours and 46 minutes. The change is marginal and it is not relevant: whether it is 47, 46 or 48 really makes no difference in the real world.

As I mentioned already, as I add events to the aggregate the weight of each event is reduced, so the impact of each change is also reduced. So going forward this should smooth itself out. But in the meantime I need to do as aristotle suggested and check my logs for the crawl frequency, and check what the actual variations are that Googlebot and others are seeing.
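To put a number on that smoothing: with a simple running average, the (n+1)th event can only move the mean by (x - mean) / (n + 1), so each new data point matters less as n grows. A toy illustration in Python, with made-up numbers:

def update_mean(mean, n, x):
    """Fold a new value x into a mean currently computed over n values."""
    return mean + (x - mean) / (n + 1)

mean, n = 2207.0, 50            # 36:47 expressed in minutes, over 50 events
for x in (2150.0, 2300.0):      # two hypothetical new events
    mean = update_mean(mean, n, x)
    n += 1
    print(f"n={n}  mean={mean:.1f} min  (~{int(mean // 60)}:{int(mean % 60):02d})")

At n=50 a new event shifts the displayed figure by barely a minute; at n=500 the same event would shift it by mere seconds.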
2:29 am on May 22, 2019 (gmt 0)

Moderator This Forum from US 

WebmasterWorld Administrator robert_charlton is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Nov 11, 2000
posts:12338
votes: 400


Nick, how often do your table displays change, and what's the point of showing a live progression that self-updates and is Google-readable?

Why not, say (and this is pulling hats out of the air just to make up an example), show a video of a sample group of changes, and then have, say, an Ajax display show a current table... like, say, a stock market chart, where users could choose what to see?
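Eg, the server side of that could be as small as a JSON endpoint that client-side script polls to fill in the table, while the crawlable HTML keeps a stable written summary above it. A rough sketch using Flask, purely for illustration; the endpoint name and the current_stats() helper are made up:

from flask import Flask, jsonify

app = Flask(__name__)

def current_stats():
    # placeholder for the live aggregate query
    return [{"event_type": "X", "avg_interval_min": 2207, "avg_duration_min": 93}]

@app.route("/api/summary-stats")
def summary_stats():
    # the page's script fetches this and fills in the table client-side,
    # so the indexed HTML stays stable between crawls
    return jsonify(current_stats())

if __name__ == "__main__":
    app.run()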