All 3 pages had received approximately 50 FB likes, 1 tweet from an account that tweets on this subject, and 1 G+ from an account that shares on this subject.
I then posted 7 blog comments on articles on the exact same subject, with the same keywords (or close synonyms) in the title, each with a link pointing to keyword1 and keyword2. I left keyword3 alone. None of the links used keyword anchor text: 6 were links under my name and 1 was a bare URL. All of them were nofollow.
Keyword1 dropped from the SERPs the next day. It reappeared the day after at the top of page 2, bounced around page 2 for a week, then settled at position 13.
Keyword2 remained unchanged for a week, dropped 1 position for a day, then moved up 2 positions before settling back at its original position, where it sits at the moment.
Keyword3 remained unchanged during this whole process.
I've sat and watched rankings bounce right after changes, then stabilize over time.
Me too. I mentioned that observation right after the patent came to light. I'm thinking it may actually be applied, but only in very specific cases. That's my guesswork, not Google's official statement at all.
[After redirecting an EMD] We've not experienced a rise quite so meteoric, but we are definitely bobbing back to the surface: 40s last week, 36 over the weekend, and 20 this morning.
Fell to the mid 40s over the weekend. Decided not to touch a thing and hold fast, just in case this was some sort of Google A/B test. Then we popped up at no. 9 this morning.
[Less than 1 hour later] Errrr... no.5 now
I have gone from page 1 for years and years to nowhere to be found, then to page 15, page 2, page 7, and now whatever page position 250-ish falls on.
We're hovering around positions 5 - 8 at the moment. We used to be #1, but there are still many pages waiting to get picked up and redirected.
[edited by: TheOptimizationIdiot at 10:00 pm (utc) on May 4, 2013]
Cutts doesn't flat-out say, "We're not using that patent." I think his point is more that we should NOT read a patent and decide, "Well, I give up." And fair enough.
I still don't think this is the type of signal that a good algorithm should be using. Some novice could be playing around with different titles as new ideas occur to him. What Google should be focusing on is creating an algorithm that is inherently immune to SEO practices and attempts at manipulation. That would be a much better approach.
In fact it is theoretically possible to create an algorithm that is immune to SEO practices and attempts at manipulation. Absolute 100% immunity might not be attainable in the real world, but that doesn't mean Google shouldn't try.
...so I made a few changes, and it went back to #1...
I immediately made changes (before I even knew about Penguin, actually - I hadn't been to WebmasterWorld in some time), and my rankings just got worse.
What did you change then that did not cause it to bounce?
Did you change the same things here as before or something different?
Why would "doing what everyone says" seem to have no effect?
What would be very difficult to reverse engineer?
Why would more "uniqueness" not have a positive impact?
How did diluting your internal PR distribution by adding outbound links help initially?
Why did increasing your internal PR distribution by removing outbound links not help or have any noticeable impact?
Why would Penguin not need to be run very often?
BTW: The best guess I have right now that makes any sense to me is that it was Professor Cutts in the kitchen, but to confuse everyone, rather than the knife he used the candlestick.
How many positions are the bounces?
I've seen from "small" (1 to 5 places) to "all over the place" (gone from the SERPs, then back and forth between pages, then different positions on the same page).
Is the amount of movement related to the amount of over-optimization on the page and/or site?