This 201 message thread spans 7 pages.
|Google's 950 Penalty|
What do we know about it, and how do we get out of it?
| 2:05 pm on Jan 11, 2007 (gmt 0)|
I've read a lot about Google's -30 Penalty, where pages on a site drop 30 positions, but most of the comments about the 950 Penalty, where pages on a site drop to the very bottom or last page of the search results, have been comments made in other topic threads.
What do we really know about this penalty, what causes it, and most important of all, how do we fix our sites to restore normal rankings?
| 10:24 pm on Jan 17, 2007 (gmt 0)|
Pages without the remotest SEO are hit just like SEO'ed pages, so that idea goes nowhere.
Google has long had, well, call it "this domain can't rank well for this search term" for some stemmed word combinations. That could be the pre-history of the phenomenon, but it is definitely a different thing.
Now, since this has gone on so long, it's possible to have targeted a term with content on a topic that has been placed in unique, rewritten form on a series of URLs, all of which rank great immediately, and tank at the next data refresh.
Stemming, poison pill keywords especially in URL, density of poison words (a completely different thing than standard keyword density)... I don't know what is being inappropriately recognized, but I do know that all four pages on a topic I've put on a particular domain over the past many months have been penalized, while only 1% of the other pages on this domain are commonly penalized (1000+ are not).
I think there are many somewhat similar (but also extremely different) phenomena occurring, so it is a big mistake to try and categorize them the same. The bottom line is that in some cases Google interprets its data badly. You can switch the data around, but this is primarily Google doing a bad job in this area.
| 11:31 pm on Jan 17, 2007 (gmt 0)|
|SEOJoe, when did you get penalty, and how long did it take to recover? My penalty seems to have kicked in starting Sunday the 15th. |
I first noticed it on Jan 10th but it depends on the datacenter you are hitting. It came back across datacenters yesterday, Jan 16th. So, it wasn't really a penalty but some sort of testing on those datacenters as I suspected since the url in the SERPs didn't make any sense.
| 7:45 am on Jan 18, 2007 (gmt 0)|
|Pages without the remotest seo are hit just like seo'ed pages so that idea goes nowhere. |
Bingo. Part of what happens is webmasters tend to look at the last known issue or task they performed on the website and assume that was what made the change, even though 99% of the others made the same changes and didn't tank.
| 8:08 am on Jan 18, 2007 (gmt 0)|
I will say though that the phenomenon seems far more common among niche authority sites than anything else, and of course these normally have decent site construction which means some level of seo.
| 12:50 pm on Jan 18, 2007 (gmt 0)|
That would certainly fit my site, although as I mentioned the SEO on it is hit and miss. It doesn't have CSS, so there are no H1, H2 tags for the most part. The page titles are all unique, and descriptive of the page, and most of the meta descriptions are now unique as well, although they were not up until the last 90 days or so. I still have about 250 pages that need unique meta descriptions, and most of them are in one directory.
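For anyone in the same position of hunting down remaining duplicate meta descriptions, a small audit script can speed things up. This is only a sketch: the page names and HTML snippets are invented, and the regex is a rough match rather than a full HTML parser.

```python
import re
from collections import defaultdict

def meta_description(html):
    """Extract the meta description from an HTML document, if any."""
    m = re.search(
        r'<meta\s+name=["\']description["\']\s+content=["\']([^"\']*)["\']',
        html, re.IGNORECASE)
    return m.group(1).strip() if m else None

def find_duplicate_descriptions(pages):
    """Group page names by identical meta description (pages: name -> html)."""
    groups = defaultdict(list)
    for name, html in pages.items():
        desc = meta_description(html)
        if desc is not None:
            groups[desc].append(name)
    # keep only descriptions shared by more than one page
    return {d: names for d, names in groups.items() if len(names) > 1}

# Hypothetical example pages:
pages = {
    "a.html": '<meta name="description" content="Acme widgets overview">',
    "b.html": '<meta name="description" content="Acme widgets overview">',
    "c.html": '<meta name="description" content="Widget history">',
}
dupes = find_duplicate_descriptions(pages)
```

Run over a local copy of the site, this flags exactly the directories where descriptions still need to be made unique.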
One has to wonder why Google would want authority sites listed at the very bottom of the search results. It would seem those are the exact sites with unique content that Google would want at the top of the SERPs.
This site still has #1 search results, and the pages at #1 are very similar in overall construction as the ones at the bottom. Different search terms, yes, but overall not different enough that there is a logical reason for the disparity.
Meanwhile, on another mailing list I subscribe to, someone commented that if you find a good site searching for it on Google, make sure you save it before you leave, because it may never appear on Google again! They referred to Google as the "one shot search engine." Now that's a nice attitude for people to have about a search engine, isn't it? - it works once, but not twice!
| 2:10 pm on Jan 18, 2007 (gmt 0)|
AndyA and SteveB - I would also describe my site as a niche authority site. In fact it sounds a bit like your site AndyA. One possible problem is that on many pages my meta titles are the same as my meta descriptions. I have changed this in the last week, but I am not hopeful this will fix the problem.
Like you I have posted very little to the site in the last month. The main reason for this is this whole thing dominates my thoughts. I am either searching the web or analysing my site looking for answers. It is very difficult to keep the site going as well.
My site has been online since 1999. This is the first major problem, but it has been going on for five months now. I must also admit that for the first time in 8 years my motivation levels are very low. There seems little point in adding new content if it is so hard for people to find.
| 2:42 pm on Jan 18, 2007 (gmt 0)|
I've been trying to figure this one out too. I once asked Matt about this. For example, I have an article that defines the search term, provides a great deal of information about the history of the term, how you'd apply this to real world data and even several examples.
It's a well thought out article, just what a searcher would be looking for - the ideal answer to their question. Modest SEO - the same SEO we'd apply to any page. But in the SERPS this page turns up around 500 - with a lot of krap ahead of it. I'm not asking for #1 - but 500!? No way it deserves to rank that poorly. His response to me was to send in a reinclusion request - which I did. Still, that page is in no man's land. I don't get it - makes no sense.
| 3:59 pm on Jan 18, 2007 (gmt 0)|
I've often wondered how Google returns search results. When someone searches for "widget" it seems you get a lot of "widget for sale" results. What about other topics? Widget construction, widget colors, widget models, who makes them, the changes to Widgets over the years, etc. It would seem to me someone wanting to BUY a Widget would enter "buy widget," "widget for sale," "widget deals," "widget coupons," or something of that nature.
When I search for a topic on my site, I find totally unrelated things. For instance, if I want to know about "Acme Widget Model 27," I don't want results showing Standard Widget Model 27, because it's a different company that made it and it has nothing in common with the Acme Widget.
To clarify, if you were searching for 1973 Rolls-Royce information, chances are a Volkswagen of the same year is not going to be relevant to your search. Those aren't my search terms, but that's what I'm getting anyway.
I keep looking at my site, I've checked it with Xenu, no broken links, I've got the www to non-www 301 in place, I've disallowed my forum due to duplicate URL issues, and nothing seems to help at all. I'm really running out of ideas, other than to update all the pages to CSS. I've checked my main pages at W3C, and they do validate, so I can't imagine there's an error so bad that little Googlebot is getting stuck.
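Since the www to non-www 301 keeps coming up in these checks, the response for the www host can be classified mechanically. A hedged sketch only: the hostnames are placeholders, and this just interprets a status/Location pair you have already fetched with whatever tool you use.

```python
from urllib.parse import urlsplit

def check_canonical_redirect(status, location, canonical_host):
    """Classify the redirect observed when fetching the www variant of a site.

    status: HTTP status code returned for the www URL
    location: value of the Location header (or None)
    canonical_host: the host all traffic should consolidate on, e.g. "example.com"
    """
    if status == 200:
        return "no redirect: www and non-www both serve content (duplicate risk)"
    if status in (301, 308):
        host = urlsplit(location).hostname if location else None
        if host == canonical_host:
            return "ok: permanent redirect to canonical host"
        return "redirects permanently, but not to the canonical host"
    if status in (302, 303, 307):
        return "temporary redirect: use 301 so engines consolidate the URLs"
    return "unexpected status %d" % status

# Hypothetical check of http://www.example.com/ that returned a 301:
verdict = check_canonical_redirect(301, "http://example.com/", "example.com")
```

The useful distinction is the 301/308 versus 302/307 one: only the permanent codes tell a search engine to consolidate the two hostnames.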
I really don't know what else to do. I look at the sites that are ranked 600 positions higher than mine, and I ask myself, "How is this site more relevant than my site for this query?" And the answer is: it's not. The PR is lower, incoming links to those pages are non-existent, and it's a 1973 Volkswagen page, when I want 1973 Rolls-Royce pages! They are not comparable, they are not the same thing, yet that's what I get.
Edit to add: I do see lots of pages that have, for instance, "Widget" and "Acme" on them, but the search term "Acme Widget Model 27" doesn't appear on the page anywhere. There may be an Acme Whoosit Model 119, or a Custom Widget Model 27, but again, in the world of Widgets, there is no comparison between these things, they are completely different, as different as a 1973 Rolls-Royce is to a 1973 Volkswagen. Yes, they are both cars. Yes, they were both made in 1973, have engines, tires, etc., but that's about it. Most Rolls-Royce aficionados aren't likely interested in VWs.
[edited by: AndyA at 4:04 pm (utc) on Jan. 18, 2007]
| 5:03 pm on Jan 18, 2007 (gmt 0)|
Hi guys, I have been reading your posts regularly since the 20th Dec, and I am sorry to say I am in exactly the same boat. I have even had to let two of my staff go today because quite frankly I can't afford to keep paying them due to a huge reduction in traffic from G.
In response to AndyA: I have noticed exactly the same problem as you for many three-word search terms that we were once top ten for. Now I get irrelevant information for the query I type in. For example if I type "Blue Widget Reviews" in, I get lots of Green Widget Reviews before I see any relevant results for the actual search I made.
The results on Y are perfect you get what you asked for.
I think G has gone well overboard with the whole "if you have affiliate links on your site it must be a spam site" idea. Come on, G, some of the best websites out there have affiliate links; this is how they manage to survive and go on creating great content. After all, we are not all backed by billionaire venture capitalist companies.
| 5:17 pm on Jan 18, 2007 (gmt 0)|
Add content, that always does the trick for us. If the content is stale, update it. Google loves fresh content from sites that are very crawlable!
You might want to also check your keywords on google trends to see what people are looking for. We have made that a practice now to see if we are using the right keywords on a page to target the right audience. We have found some huge differences in search for variations of keywords such as:
The one thing we all tend to forget is that google builds its search engine around the popular terms that are searched on it. So if widget is searched 99% of the time and the rest of the terms are only 1% then you would be shooting yourself in the foot for targeting widgeting....
| 5:58 pm on Jan 18, 2007 (gmt 0)|
I think everyone has to remember that there can be more than one item that is kicking off this penalty. To pinpoint it to one particular cause or even a set of different causes will take some time, which obviously a lot of us do not have.
You also have to realize there was a PR update going on when a lot of people lost listings this past two weeks. This update could have caused a temporary drop. Then all of a sudden you see your pages right back up at the top.
| 6:07 pm on Jan 18, 2007 (gmt 0)|
I think that the "PR Update" is a red herring here - although the PR shifts did add to people's upset. Historically, a PR update is just a data export, and it's not involved in the ranking calculations. The real PR that Google does use for ranking is updated continually and has been for years.
Matt Cutts blogged about this very point last October during the previous toolbar PR export.
| 6:31 pm on Jan 18, 2007 (gmt 0)|
I agree tedster. I have seen sites that have been banned by google gain PR.
Obsessing over that little green bar too much might get your site into a penalty.
| 6:32 pm on Jan 18, 2007 (gmt 0)|
trinorthlighting is right in both respects. Although in particular, fresh content will always be king, especially if it's unique and relevant content that is of high value for the search terms you target.
Keep in mind, top performing blogs also perform well on SEs. And there are a bunch of reports about user-generated content being favorable to SEs because it's fresh, unique, and follows a pattern of steady growth (not a massive amount of new content at once).
| 6:50 pm on Jan 18, 2007 (gmt 0)|
Unique is a big key and I have a good example.
We have one ecommerce site that sells products from two different manufacturers that are similar. Neither of the manufacturers sells direct on their site, but both have individual descriptions of the products. One manufacturer demands we use the description that is on their website due to legal concerns. The other manufacturer does not care if we rewrite the content, and we do that.
Now, comparing the serps for the two. The one where we have to copy word for word is listed on the third page of the results. The one where we rewrite and make unique is in the top 10 and in some cases we outrank the manufacturer because we add relevant information to it that a user might need to know.
| 7:15 pm on Jan 18, 2007 (gmt 0)|
|I think that the "PR Update" is a red herring here - although the PR shifts did add to people's upset. Historically, a PR update is just a data export, and it's not involved in the ranking calculations. The real PR that Google does use for ranking is updated continually and has been for years. |
|Matt Cutts blogged about this very point last October during the previous toolbar PR export. |
You are right Tedster, that part slipped my mind. But I still believe there is more than one thing that can cause you to get slapped.
| 7:51 pm on Jan 18, 2007 (gmt 0)|
|Add content, that always does the trick for us. If the content is stale, update it. Google loves fresh content from sites that are very crawlable! |
Yes, that is an excellent idea. It shouldn't be too hard to add something new, or expand on current content as I update the HTML code on the pages of my site. And I'll make sure I'm using the most relevant keywords for my site, although since it's such a niche site, I'm not sure I can change much about the keywords, as they are already pretty targeted. To my visitors, my page titles are very relevant and indicate exactly what the content is, and those tend to blend well with the keywords as well. To an outsider, they might not understand, but that's only because they aren't familiar with the niche.
I do believe this penalty or filter is the result of multiple issues, perhaps on their own not a big thing at all, but put together it has major impact. And fixing just one of them at this point might not be enough to set the site on the path to recovery.
I did notice during the recent PR update that my home page went from PR5 to PR3, but only stayed there for a day or so before returning to PR5. I have no PR3 pages on my site at all now, except for one page, which according to Google Webmaster Tools is my highest PR page. With the linking hierarchy on my site, I should have a lot of PR3 pages. My PR5 pages link to other main pages, which are all PR5 or PR4, then PR3 is skipped completely except for that one strange page, and a bunch of pages are PR2. The home page on my site has been PR5 and PR6 for years now, so having that one page show a PR3 for months, and also having it show as my highest ranking page in Webmaster Tools for the last 120+ days is confusing. It doesn't make sense.
Things at Google just don't seem right.
[edited by: AndyA at 7:54 pm (utc) on Jan. 18, 2007]
| 8:11 pm on Jan 18, 2007 (gmt 0)|
|Google loves fresh content |
That's not what I'm seeing.
| 8:48 pm on Jan 18, 2007 (gmt 0)|
As this penalty has been applied to every page I have added since August, fresh content is not the answer for me. I'm not saying it won't work for everyone, but it has not worked for me. At one stage in this I was adding ten pages per day. My site has thousands of pages, so ten was not over the top.
| 10:15 pm on Jan 18, 2007 (gmt 0)|
This takes the cake: I am doing searches to get an idea of the types of sites affected by the last page penalty. I look for "servicio de traduccion" (which means "translation service" in Spanish). On the last page, what do I find among the other downgraded market leaders? A Google directory listing. I mean, #*$!? What is with the upside down search results? It's almost as if first is last. I can't tell you how many times I have found a page of mine at the bottom of the 10th page, when it was at the top of page 1. G, come on, just press the "revert" button and go back to the drawing board with this index.
| 11:46 pm on Jan 18, 2007 (gmt 0)|
One thing it seems a lot of people need to do is forget their freaking sites. Look at what is down there at 950. One thing is plain:
These are not pages that scored in at 950. There are some niche authority sites, and then there is a pile of rotting garbage that, well, scores in the top ten on Yahoo or MSN. Google places a lot of stuff down there by design that deserves to be penalized. The problem is the OBVIOUS mistakes. While some people complaining here have spammy, low content sites, many have genuine niche authority sites. More importantly, when I see a page of mine down there, I also see some of my top competitors who I also know are not spammers (to any large degree anyway, for sure). What my competitors' pages all have, though, is: they would score highly for SOME search term(s). In other words... I don't see any pages placed down at 950 that normally would rank, say, #72.
| 12:10 am on Jan 19, 2007 (gmt 0)|
PhattusCattus, I agree.
This is the classic case of Google cutting off its nose to spite its face. Now I'm clearly seeing barely in the ballpark results in the SERPs at the top in some areas, with more relevant stuff at least 5-10 pages down.
Check this out: today I wanted to buy an electronics item as a birthday present for my live-in nanny from a company I did business with previously and had a top-notch shopping experience with. So I type in companyurl manufacturer, and I see only shopping comparison engine pages at the top where the google blurb indicates the company used to advertise its products, but when I click, I realize those listings expired. I also see one more-than-year-old news story mentioning the company url and the manufacturer.
I then go to Yahoo and type the same thing in, and guess what, the main company url is the 1st result, and a product sold by that company from the manufacturer I seek is result #2. Clicked on the 2nd result, called them to make sure they had the product I want in stock and bought it online.
Now if I wanted to comparison shop I would go to comparison shopping sites (as spammy as they are these days), and if I wanted news stories I would go to the news search. But I think my query in no uncertain terms indicated what I was looking for, and not only was it not at the top of the SERPs (like it should have been in this case), but it was nowhere to be found.
All this over-intellectualizing and statistical modeling I read in that PDF paper reminds me of Myron Scholes and LTCM (en.wikipedia.org).
Now overall google is still probably the best search, but it's really beginning to suck in some areas. It's like they think they know what is best for the user.
| 1:47 pm on Jan 19, 2007 (gmt 0)|
I spent some more time looking at the URLs adjoining my Google listings. I'm not at all happy about the bad neighborhood Google has placed me in. Right below one of my pages is one with the title "women (performing a normal body waste function)"!
Nice. Very nice. I have no idea how that site could possibly rank for the same search term as mine, nor do I have any idea how my page which has nothing to do with women, normal bodily functions, or anything even remotely of that nature, could be just slightly more relevant to the search term than this other page.
I clicked on the link to see what was on the page that could place it in the same neighborhood as mine, and I got a warning from Google that visiting that page could harm my computer!
GREAT! THANKS GOOGLE! You've ranked my original, unique content site right above a disgusting site that downloads spyware, and has who knows what kind of disgusting content on it!
This is downright insulting. Google should be ashamed of themselves for returning such unrelated sites for a search term. What does this say about my site to people who see it listed next to such garbage?
My site is appropriate for children, there is nothing even remotely similar between my site and a site such as the one I mentioned above, yet there they are right next to each other. I didn't put myself in a bad neighborhood, GOOGLE did.
| 4:36 pm on Jan 19, 2007 (gmt 0)|
I don't think google can be taking title tags into account in some cases; in fact it prefers not to see the keywords in the title on a lot of search requests. So in the example you gave, we are seeing more and more of this issue in the SERPs because the filters are taking out the quality and leaving junk in.
I think Steve B is bang on the money here ref stemming. In the majority of cases I think google's infrastructure is poor and it relates far too many words under its stemming control, and then hits sites that are deep in the subject matter due to duplicate content, or perhaps too much mention of the keyword?
The chickens will come home to roost in the end, because a lot of those sites kicked out that are buying AdWords will shortly either go out of business or just won't be able to subsidize the AdWords cost as they no longer get free traffic, so it will be interesting to see the longer term effects of this. Also the SERPs are less relevant, so again users may start looking away from google, which was once the authority on search.
To prove the point, I've now seen loads of cases where the stemming has caused issues. For example: you have a widget section with detailed information of, say, 12 pages, and you carry an index on each page to the content so users can move to the next page or skip a page or so:
1. Widgets Item Page
2. How I found a blue Widgetier item
3. The widget item that came by
4. The night of six widgetors items
5. Johns experience with widget items
6. Uk research into widget items online
etc etc you get the drift
In the above example google reads every page, assumes they are all about "widget items", and treats the lot as duplicate content, yet the text and content is different on each page. Previously I always thought google would deliver for a search string the page it thought the most relevant to the string. Now it doesn't, IMO; it delivers the weakest page of the site, and often you are not landing on the specific page you want.
In the above example the surfer wants detailed information on "widget items", yet the SERPs deliver loads of single page junk, eBay pages, Amazon pages about buying widgets, and the site that's an authority with a detailed section dedicated to the subject matter is no longer showing.
That's my take on it. This problem does relate to the rolling out of stemming in some shape or form, and it doesn't work; you get some results miles off target.
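To illustrate the theory above (and only the theory: nobody outside Google knows what its stemmer actually does), here is a toy sketch in which a crude, made-up suffix-stripper collapses those distinct page titles onto the same pair of stemmed terms, which is exactly the kind of overlap that could be misread as duplication.

```python
def naive_stem(word):
    """Very crude suffix stripping, a stand-in for Google's (unknown) stemmer."""
    w = word.lower()
    for suffix in ("iers", "ier", "ors", "or", "ies", "s"):
        if w.endswith(suffix) and len(w) - len(suffix) >= 4:
            return w[:-len(suffix)]
    return w

STOPWORDS = frozenset({"a", "an", "the", "i", "how", "found",
                       "with", "by", "that", "came", "of"})

def stemmed_terms(title):
    """Reduce a page title to its set of stemmed content words."""
    return {naive_stem(w) for w in title.lower().split() if w not in STOPWORDS}

titles = [
    "Widgets Item Page",
    "How i found a blue Widgetier item",
    "The widget item that came by",
    "Johns experience with widget items",
]
# Every distinct title collapses onto the same core stemmed terms:
common = set.intersection(*(stemmed_terms(t) for t in titles))
```

If a ranking system keyed on something like `common` rather than the full text, a well-organized 12-page section would look like 12 copies of one page, which is the failure mode described above.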
| 5:03 pm on Jan 19, 2007 (gmt 0)|
Just wanted to mention that we are probably just starting to experience a similar penalty which might scale up to be the end-of-results penalty. Since a few days ago a term that used to rank in the top 10 now ranks at 90+ and most of our other keywords that used to rank 1st for about a year fell to the end of page 1 and then to page 2 and they will probably fall all the way down to 90+ pretty soon. It seems that the effect of this penalty is gradual in our case.
We do have affiliate links, since we are an online travel agency that displays hotel offerings that are handled directly by us but also complements our offering with hotels from affiliates. In addition to that, most of our pages being penalized have duplicate content with only a few keywords changing for each page.
This is a textbook case of how to get yourself the penalty, I suppose, so I will post here what happens with our attempt to fix these problems and how soon one can get out of the penalty.
| 7:09 pm on Jan 19, 2007 (gmt 0)|
I think we need to wait for Matt, as he is department head for webspam, and he will be able to throw some light on it: why we have been kicked down there and how we can come back out.
| 4:13 am on Jan 20, 2007 (gmt 0)|
RichTC, I think you are getting pretty warm. I should know upon the next cache or two. I'm in classic 950 mode, having fallen from a long-standing top 20 (sometimes top 10, others 20, depending on which way the google wind is blowing). I was the one who earlier was spending a lot of time studying my old neighbors who joined me in my new neighborhood in the 900 block of G-street.
In virtually every case, I can see that google is having a hard time determining which page on the affected sites should be ranking high, and throws the historic good one to the bottom. Often another page from the same site starts ranking, and sometimes nearly replacing the old page.
I think my problem is a result of thinking about my visitors first instead of google first (shame on me). Only one content page on my site is about widgets. I have always had a sitewide navigation text link to the widgets page. Google could easily determine which of my pages should rank well for widgets because it is the only one with widget backlinks with widget anchor text, the only one with widget in the title, and the only one with widget anchor text for internal links sitewide.
About 1 cache prior to moving to the 900 block, without thinking, I added a second sitewide link in the navigation with "widget" anchor text, pointing to a new page. This page was not meant to be a real content page, but was under the Navigation heading of something similar to "News". But I think I confused google by essentially voting on every page on the site for two separate pages about widgets... although only 1 has any outside backlinks. It simply made more sense to create a heading for "News" and make the anchor text "widget", rather than listing "widget news", "next product news" etc.
After studying the other sites with the same problem, I'm getting pretty confident that something along these lines is the culprit. We should see in the next new cache or two.
| 2:15 pm on Jan 20, 2007 (gmt 0)|
|In virtually every case, I can see that google is having a hard time determining which page on the affected sites should be ranking high, and throws the historic good one to the bottom. |
I'm just wondering if everyone with the 950 Penalty is showing the correct page with highest PR in their Webmaster Tools account? My WT account shows a page that's 3 clicks away from the home page as my page with highest PR. It only has internal links from other pages in my site, and one other incoming link from another site. This page has been listed as my top PR page for 4 months, replacing my home page which was previously listed as the top page.
I've checked, and even with the most recent PR update, this page is showing a PR3, while my home page is PR5. So it's not the high PR page we once thought it was.
At any rate, I think there is a problem with Webmaster Tools reporting this page as my highest PR page, as I just don't see how that could possibly be. If anyone else is experiencing this, along with the 950 Penalty, there could be a connection.
| 2:26 pm on Jan 20, 2007 (gmt 0)|
I just checked another search term I monitor, and I've noticed another site that is normally in the top 20 is now down at the bottom of the results with mine. It also has two pages listed for the search term.
Is there a pattern to this? Are sites hit with the 950 Penalty ones that would normally return two results listed for the same search term? Can anyone verify if they have only one page affected for a particular term?
| 3:21 pm on Jan 20, 2007 (gmt 0)|
I don't think that every site having 2 pages rank for the same search term are affected this way, but I do think that something along these lines is causing the issue.
In my case, the person with the historic position 1 still holds the number 1 result, but with a different page than historically been there. The old one is down at 950.
I don't think this is across the board.... I just think it has something to do with google having a hard time IN CERTAIN INSTANCES determining which to choose, and for some reason tosses the old one down to the 900's.
I think the question we should be asking is simply, "Is there anything about my site that would make it appear that another page deserves to rank for the same query". I'm not sure that this has to be something we recently did to our sites, rather than simply a problem of recent vintage in google's ability to determine the proper page to rank.
In my case, I think google's logic in ranking my pages would sound like this: "For this query, the site has two pages returned in the result set (1,000 results) to be ranked. Historically, page A should be ranked number 5, but it looks like the site is saying that page B is more relevant to the query, thus we will push page A down to the bottom, and let page B assume its normal algo-calculated rank (result 100)."
Could be way off base on this whole thing, but that's the best I can come up with at the moment.
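That hypothetical logic can be written down, purely as a thought experiment. The page names, vote counts, and the rule that sitewide internal anchor text outweighs external backlinks are all invented here to match the theory in the posts above, not anything Google has confirmed.

```python
def pick_site_page(candidates):
    """Hypothetical 'one page per site per query' step.

    candidates: list of dicts for one site's pages that all match the same
    query. The page with the most sitewide internal anchor-text votes is
    chosen; every other candidate is demoted, mirroring the theory that the
    historically ranking page gets thrown to the end of the result set.
    """
    ranked = sorted(candidates, key=lambda c: c["internal_votes"], reverse=True)
    return ranked[0]["page"], [c["page"] for c in ranked[1:]]

candidates = [
    # old content page: real external backlinks, modest internal links
    {"page": "page_a", "internal_votes": 40, "external_votes": 25},
    # new page behind a sitewide "widget" nav link: huge internal vote count
    {"page": "page_b", "internal_votes": 1200, "external_votes": 0},
]
chosen, demoted = pick_site_page(candidates)
```

Under this sketch the new nav-linked page wins despite having no external links, and the old page is demoted, which is the pattern reported in this thread.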
| 8:18 pm on Jan 20, 2007 (gmt 0)|
Last night after trying to find why some of my pages are gone I suddenly realized it was probably related to those 950 penalty messages on WW so here I go. I have just a few pages missing so far but the fear that more will go motivates me to find the answer.
An individual page all but disappears for the most usual search term like 'ancient widgets' but if I pick something else from the page like 'brass widgets' even though brass widgets are a minor mention on the page it will rank in the first 2 or 3 pages. So it is as if the word 'ancient' is blocked but from that page only. In fact if you search 'ancient widgets' another page from my site will come up even though ancient widgets are only mentioned on that page. Also pages from other sites that have linked to my ancient widget page will come up in the first few pages of the serps.
So I think it might be a one word penalty on that page only. Since this wasn't happening earlier it could be an algo change. I'm wondering if it might be word density as in my topic it is often necessary to repeat the word frequently when writing an article. I have tried decreasing the word density on one page which made for some awkward writing. It is too early to see if it will solve the problem.
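Since on-page density keeps coming up as a suspect: no one outside Google knows what threshold (if any) trips a filter, but you can at least measure your own pages consistently and compare a penalized page against one that still ranks. A rough sketch, with a made-up sample text:

```python
import re

def keyword_density(text, phrase):
    """Occurrences of a phrase per 100 words of body text (rough measure)."""
    words = re.findall(r"[a-z']+", text.lower())
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    # count every position where the phrase's words appear consecutively
    hits = sum(1 for i in range(len(words) - n + 1)
               if words[i:i + n] == phrase_words)
    return 100.0 * hits / max(len(words), 1)

sample = ("Ancient widgets were prized. Collectors of ancient widgets "
          "often restore ancient widgets by hand.")
density = keyword_density(sample, "ancient widgets")
```

Comparing the same phrase across several of your own pages is the only sensible use; the absolute number means little on its own.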
I don't think affiliate links are the answer as I have exactly the same affiliates on the pages that haven't plunged in the serps. Also I don't think it has anything to do with high-value key words as the words that are hurting my pages would only be of interest in my niche topic. My sites are not retail either. I have links from gov. and edu. sites as well so that is no protection. Also this penalty seems to have nothing to do with PR. The pages still have the same PR3 or 4 as they had before.
|far more common among niche authority sites |
If not more common on this kind of site, the penalty is certainly hitting them.
|seems to be repetition of keywords on page title and in headlines on page. |
Hmm, there may be something to this. I added headings throughout my articles a while back so it would be easier for people who skim to see what the page is about or to find what especially interests them. The missing page I worked on a couple of days ago had the keyword in every one of these. I took it out of all but one, so we will see if it helped.
I need to go back and see how often I use the two words together, like 'ancient widgets'. Maybe it is the words in combination.
I agree with Andy. It's insane that people with informational sites that have been around for years now have to pick through individual pages to try to change them so they won't have this penalty. I've spent 3 days on this instead of writing an article that I've spent weeks researching. Does Google want unique and informative content added to the web or not? I'm just spouting off here. I think we are getting caught in an algo meant to catch scrapers, but it's catching a lot more pages than that.