Forum Moderators: Robert Charlton & goodroi


Business Strategy In Light of Panda - NOT catering to Google SERPs


coachm

8:19 pm on Jun 5, 2011 (gmt 0)

10+ Year Member Top Contributors Of The Month



Been pondering this for a while, and since one hat I wear is as a business strategist, I thought I would share this: what I suggest business owners who use the Internet as a customer channel should do, and not do, in light of Panda.

This does NOT apply to professional webmasters, per se, but to businesses who use the Internet as a tool, but not as THE major part of their business models.

DO strive to create the best content you can FOR VISITORS. Ignore the SERPs. Please your customers, and while it makes sense to do basic SERP stuff, don't focus on it. Why? Because by focusing on how to "fix" sites that got penalized by Panda, you WILL, for the most part, lose.

It's clear that Google is going to be constantly fixing things, because, to be blunt, their business model is being threatened, and right now their results are not very good.

If you own a business and you are not a professional webmaster, you can't keep up. If you are someone who spends 8 hrs a day building sites, that's a different deal. For those of us who aren't web specialists, playing a game that requires constant updating isn't going to work.

DO remind yourself of the focus of your business. Did you start a business to spend all your time trying to learn how google is doing things? Or to do something else that includes search engines as a tool to get customers? It's different.

DO explore alternative ways to find customers that are not Google dependent. As an example, in our redesigns we're rss'ing our pages so we can auto-generate feeds from our static pages, auto update them, and then leverage them into the social networking world to Twitter, Facebook, etc.
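To make the RSS idea above concrete, here is a minimal sketch of auto-generating an RSS 2.0 feed from a list of static pages, which could then be syndicated to Twitter, Facebook, etc. All site names, URLs, and page data here are hypothetical placeholders, and this is one simple way to do it, not a prescription:

```python
# Sketch: auto-generate an RSS feed from static pages for syndication.
# Page titles/summaries would be extracted from the static HTML in practice;
# here they are supplied directly as hypothetical examples.
from email.utils import formatdate
from xml.sax.saxutils import escape

def build_rss(site_title, site_url, pages):
    """pages: list of dicts with 'url', 'title', and 'summary' keys."""
    items = []
    for p in pages:
        items.append(
            "<item>"
            f"<title>{escape(p['title'])}</title>"
            f"<link>{escape(p['url'])}</link>"
            f"<description>{escape(p['summary'])}</description>"
            "</item>"
        )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>'
        '<rss version="2.0"><channel>'
        f"<title>{escape(site_title)}</title>"
        f"<link>{escape(site_url)}</link>"
        f"<lastBuildDate>{formatdate()}</lastBuildDate>"
        + "".join(items) +
        "</channel></rss>"
    )

pages = [
    {"url": "https://example.com/a.html", "title": "Article A",
     "summary": "First paragraph of article A."},
]
print(build_rss("Example Site", "https://example.com", pages))
```

Once the feed exists, the social side is mostly plumbing: most platforms and third-party tools of the era could poll a feed URL and repost new items automatically.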

DO NOT follow the "thin" Google advice. Cutts et al. have mentioned suggestions like spinning off parts of your site, etc. I'm not doing it, because we don't have enough information. Can subdomains work? Should spun-off site domains have a different owner of record? Who knows.

I'm doing the opposite. Closing down sites or bringing them back into the umbrella of my main site. Why? Because spun-off sites increase maintenance costs and complexity, at least for us. I can't handle our existing sites (about 10-15) as it is. I need to be able to modify things fast, and I can't do that across a network of sites.

Besides, having various inter-linked domains is problematic for visitors anyway. I don't want my visitors to be going back and forth from one site to another, all with different looks.

DO integrate with other marketing methods, so that you push people to your website through promotions, social media stuff, etc. Reduce dependence on Google.

DO have faith in "content is king". Right now it feels like it isn't, but it's in Google's interest to reflect that. It's the only reason Google exists: to help people find good stuff. So even if they aren't doing it right now, they will have to, or become bit players in a world of social media.

DO create content of varying types and lengths. This is smart because it meets the needs of various types of human visitors -- some looking for longer stuff, some for much less. We run articles from 400 words up to more than 2000 words, always have. This also prevents having to figure out Google and whether "size" matters.

DON'T use ezine type directories. Bring your content home, so you don't compete with yourselves, and recognize that a lot of these "directories" have been hit by Google Panda, anyway. We have about 20 of our articles on one such directory. Removing them. Besides, I find the majority of people reprinting from these sites are cheating and violating TOS.

DO repackage and reuse your content. We're loosening our reprint licence terms to allow people to legally share material from our sites, free, on anything other than public Internet sites. Letting them share with colleagues, employees, and on Intranets, and in printed formats. With proper attribution, of course.

Also consider repackaging your best material to produce free or paid e-books, and even print books. We're looking at this for later in the year and already doing some of it. Leverage Kindle formats. Get into the Apple shops. Into the Amazon shops.

That's just a little bit of the stuff we're looking at, as a business that exists on the Internet only to meet the needs of our customers in non-Internet ways.

I'd be interested in comments, or what others are doing. I imagine, from what I've read, that there will be some disagreement.

And, if my strategies don't work, I'll at least be able to wake up in the morning knowing I'm doing it my way and comfortable I'm not wasting my time trying to out-guess an algo I suspect nobody, including google, understands!

I was going to publish this on my own site, after polishing it up, but figure this is a better place.

tedster

3:03 am on Jun 6, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Wow, that's an intensive piece of work. Well done.

It reminds me of an earlier thread, Depending on Google has hurt businesses since at least 2002 [webmasterworld.com]. So many web businesses started up over the past ten years with a model that assumed they could leverage Google rankings indefinitely. Some of them even filed for patents on their methods for ranking at Google!

I do have one nit to pick, and it's with the phrase: "DO NOT follow the 'thin' Google advice." Google only ever talked about "shallow" content. Webmasters ran wild with the idea that this meant "thin", but it's not what their engineers ever said.

danimalSK

9:09 am on Jun 6, 2011 (gmt 0)

10+ Year Member




Awesome post dude.

I've been meaning to write something similar for a while, although you are clearly less lazy than I am.

I think "SEO" focused websites are going the way of Blockbuster; i.e. they will soon be dead as a dodo. Focusing on Google because it worked for you in 2005 is like focusing on DVD rentals because in 2002 it was an awesome high margin business.

If Google doesn't need you, and for 99% of the people on this forum it doesn't, then you're toast. Google used to need small websites and startups because there wasn't anyone else doing a good job on the web. Now not only does Google have all the data it needs to compete itself (Places, Travel, Boutiques.com, etc.) but it also has a load of web-savvy billion-$ companies stuffing its pockets with ad dollars.

Most of the people on this forum need to start innovating, and FAST. Unless you drop Google and start working on the other distribution mechanisms / business models (apps, iPhone, Android, Facebook, Twitter) you've got a pretty limited future. A top 25 free app in the app store gets 50,000+ downloads a day! I'd wager that's more traffic than most of the people here get in a week.

Shaddows

9:23 am on Jun 6, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



@Tedster
I do have one nit to pick, and it's with the phrase: "DO NOT follow the 'thin' Google advice." Google only ever talked about "shallow" content. Webmasters ran wild with the idea that this meant "thin", but it's not what their engineers ever said.

I think the OP was talking about Google advice in general. It's 'thin' (or indeed, shallow) in the sense it doesn't MEAN anything. It's noises intended to fob off the SEO community; it has no depth, no substance, no specificity.

While the recent thin/shallow comment belongs in the set of "shallow comments made by Googlers" it was not specifically referenced by the OP.

HuskyPup

10:37 am on Jun 6, 2011 (gmt 0)



Excellent, coachm. A lot of businesses could do with circulating this around their offices/cubicles/sales force/shop floor/everyone.

I imagine, from what I've read that there will be some disagreement.


Not much. It's what a lot of people/companies should have kept doing even when everyone was saying the High Street is dead, long live The Net. In a way all this technology-driven stuff has been a giant kiddie's game, and now reality is seriously settling in... some people are actually going to have to get real jobs and work, instead of outsourcing to the Far East and merely being consumers... but that's for another thread :-)

I have to agree with the thin/shallow part; I have seen no evidence to prove this is a major issue. In fact I've seen the opposite since Panda 2.1. And let's face it, for some widgets only partial information is available. I know, since some of my pages are like this and there's absolutely nothing I can do about it.

walkman

4:00 pm on Jun 6, 2011 (gmt 0)



+1 for danimalSK.

Google is now even killing the Chrome URL bar, I heard. They don't like it when you go directly to a site; they want to read your mind first and $end you there.

Unfortunately I need them for a year; then it will be the icing on the cake. Google does what Google wants. They rule search and can change the rules whenever they want, to whatever they want. A year from now they might decide not to index certain sites at all. Whatcha gonna do?

Bottom line: the search team might use an algo, but the direction is set by humans up the Google food chain. Their search pages are already full of ads: all kinds of ads, from products to those with 1-800 numbers, checkout buttons, reviews, addresses, prices, images, whistles and bells to make you click. How many people see the ads as the #1, #2 and #3 results? They have also maxed out earnings by tweaking different shades of blue, but it isn't enough to justify their stock price. So they have to try different things.

MrFewkes

6:22 pm on Jun 6, 2011 (gmt 0)



Tedster - can you clarify what the difference in real terms is between thin and shallow content?

Staffa

6:39 pm on Jun 6, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I heard, they don't like when you go directly to site, they want to read your mind first and $end you there.

They do indeed. Lately, I noticed on one of my sites (with a particular setup) that when a user right clicks on a link in the SERPs they land on a different page than when the same user clicks on the link directly, though the referrer string in each case shows the same in the log file. I tried it myself and that's indeed how it works. So between the right click and landing the user is redirected "somewhere" first and the referrer string is slapped on after landing.

Shaddows

7:20 pm on Jun 6, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



'Thin' is generally light on text: no body. It's lazy in the classic sense.

'Shallow' can be full of words, but of little substance. eHow et al epitomise it. It's how you would describe a Content Farm. Which is where this all started. It's lazy in the automated sense.

wheel

7:30 pm on Jun 6, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I'm a webmaster/SEO - that is my job. But I look like an independent practitioner in a niche industry because that's the industry I'm in and I don't venture out of it. If my industry went away, I'd still be online, just doing some other niche. Nevertheless, my income derives from internet generated business in one niche.

I'm still a solid believer in backlinks. I see little real evidence in my serps of other factors causing an increase in rankings. I think that may change a bit in the future, but I'm OK beating on the drums of link building.

Only problem: decent link building methods are just about non-existent these days. Most of the traditional stuff either doesn't work, or I've already done it. So for the last couple of years I've had to come up with ever more complex and intricate marketing methods (perhaps the word I'm looking for is devious) to get backlinks. Methods that don't look like I'm asking for a backlink. Methods that I no longer post about :).

And the funny thing is, some of those link building methods are now providing me non-Google income that competes with my base business model. I'm working on another one right now that may provide some income, but I don't care if it does or not - I'm simply very hopeful I'm going to get links from authorities I've never tapped before (and my competitors will never be able to tap). I've got another project that brings in decent revenue - I could reasonably dump my entire base business tomorrow and concentrate on that sideline and have as good a chance at success and income (and probably less work). I won't because it's not where I'm an expert, but it's sitting there if Google ever stops luvin' me.

So I'm getting Google proof. Not by design, more by accident.

walkman

7:43 pm on Jun 6, 2011 (gmt 0)



thin and shallow content?


Thin content, for a "How old is Al Pacino" query:

Al Pacino is XX years old. He was born in Gtgfg in year 19xx, to a family of [] heritage.

Gives you the exact answer you were looking for, but maybe it's too straight to the point.


Shallow:

Al Pacino is an American actor. He was born in a hospital. He started acting because he loved movies....He has done many movies in his lifetime...click on Google ad that answers your question ;)...He also did a movie with Robert De Niro...


The problem is Google has no way to determine that directly; not at grand scale and across all niches, anyway.

Sgt_Kickaxe

7:48 pm on Jun 6, 2011 (gmt 0)



In my corner of the net I'm seeing 2-4 results about the keyword being searched for and 6-8 about related keywords making up the top 10. Before Panda I saw only sites about the keyword, not banks trying to offer insurance for it or auction sites trying to sell it.

More top 10 space was handed to mega sites with Panda, deserving or not.

suggy

8:55 pm on Jun 6, 2011 (gmt 0)

10+ Year Member



Walkman - It's subverting the topic of the thread, but I think Google can tell shallow content using word n-grams.

Basically, compare the n-grams of a shallow page to those of more worthwhile peers and it will be found lacking in key (within-population) terms; it will be a bad match, and will have more in common with n-gram sets for auto-generated spam, or for the population as a whole. In other words, it isn't specific!

I believe Google's done some research into n-grams (Seobythesea wrote it up)....
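The comparison I'm describing can be sketched in a few lines. This is purely an illustration of the concept (not Google's actual method), and all the sample texts and thresholds are made up:

```python
# Sketch of the n-gram idea: build word n-gram profiles and compare a page
# against a reference of in-depth content on the same topic. Illustration
# only -- not a claim about how any search engine actually implements this.
from collections import Counter

def ngrams(text, n=2):
    """Count word n-grams (default bigrams) in a text."""
    words = text.lower().split()
    return Counter(tuple(words[i:i + n]) for i in range(len(words) - n + 1))

def overlap(page, reference):
    """Fraction of the reference's distinct n-grams that the page shares."""
    page_grams = ngrams(page)
    ref_grams = ngrams(reference)
    if not ref_grams:
        return 0.0
    shared = sum(1 for g in ref_grams if g in page_grams)
    return shared / len(ref_grams)

# Hypothetical sample texts echoing the thin/shallow examples upthread.
reference = "al pacino trained at the actors studio before his breakthrough"
deep = "al pacino trained at the actors studio and worked in new york theatre"
shallow = "al pacino is an actor he was born in a hospital he loved movies"

# A substantive page shares more topic-specific n-grams with the reference.
print(overlap(deep, reference) > overlap(shallow, reference))  # True
```

A real system would obviously need a much larger reference population and weighting (e.g. down-weighting common phrases), but the shape of the idea is the same: shallow pages don't match the specific vocabulary of good pages on the topic.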

[edited by: suggy at 8:56 pm (utc) on Jun 6, 2011]

tedster

8:56 pm on Jun 6, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Let's re-visit Google's original description of the challenge they are trying to meet. A lot of the time that can be more valuable (like a patent) than trying to reverse-engineer this particular moment.

Singhal: So we did Caffeine [a major update that improved Google's indexing process] in late 2009. Our index grew so quickly, and we were just crawling at a much faster speed. When that happened, we basically got a lot of good fresh content, and some not so good. The problem had shifted from random gibberish, which the spam team had nicely taken care of, into somewhat more like written prose. But the content was shallow.

Matt Cutts: It was like, "What's the bare minimum that I can do that's not spam?" It sort of fell between our respective groups. And then we decided, okay, we've got to come together and figure out how to address this.

[wired.com...]
A lot of "bare minimum" sites really did get nailed, too.

suggy

8:57 pm on Jun 6, 2011 (gmt 0)

10+ Year Member



Tedster -- see my n-grams post above. I think it's the answer you seek!

walkman

9:06 pm on Jun 6, 2011 (gmt 0)



suggy,
if the web were articles only, then maybe. But, case in point, eHow escaped Panda. Google had to use data from users' blocked sites to find a reason to penalize them. Other content farms were hit mostly because of ads, duplicate content, and manual penalties (Mahalo).

What works in a lab is one thing; it cannot always be applied across all fields. Some have 2-sentence descriptions for shoes, others have 2200-word articles about antimatter, others write 200-word crap about poker or acai berry. Eric Schmidt himself admitted Google can't tell quality or intent.

suggy

9:27 pm on Jun 6, 2011 (gmt 0)

10+ Year Member



Walkman

Where there's content, there are n-grams: in titles, meta descriptions, alt text, title attributes, links/link anchors, image file names, folder names, etc.

I think an n-gram profile of a two-line product description would show it to be exactly what it is: not very useful!

If I am searching for a specific pair of shoes, why wouldn't I want the page with the better n-gram profile, all other ranking things being equal (e.g. TrustRank, authority, etc.)?

On a side note, I've noticed a lot of folks here at WWW these days like their "facts". I suspect I am a little contrarian, but I am not convinced that quite a lot of the things I am told are 'facts', or are presented as unquestionable truths, actually are... just conjecture built upon conjecture, mostly.

Anyway, we are digressing from the purpose of the thread. My two penneth would be to build a sticky/repeat-visitor site, so you're only really relying on Google for new introductions, not your daily bread!

tedster

9:53 pm on Jun 6, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Google's patents for "phrase-based indexing" lean heavily on n-grams. They have prior art here going back at least to 2005 or 2006, and they even released a huge corpus of their early results to the public.

The thing is, companies generating shallow content also built applications to use those n-grams along with term co-occurrence to spin their page titles, headlines and copy. That still didn't mean that the content had any real depth.

walkman

10:46 pm on Jun 6, 2011 (gmt 0)



coach,
my only advice would be to still leave a few sites separate, just out there for Google. You never know, they might rank. I'm not suggesting spam, interlinking or anything, just not-cared-for-daily type sites.