Forum Moderators: Robert Charlton & goodroi

Message Too Old, No Replies

Josh Bachynski's Commentary/Rant about Google


goodoldweb

10:42 am on Jun 25, 2014 (gmt 0)

10+ Year Member Top Contributors Of The Month



[themoralconcept.net...]

When a webmaster argues to Google that he/she has their own limitations, that they simply cannot afford the $5k per month ridiculous marketing budget to buy advertising and magazine level content for their plumbing business, and instead had to go the $500 per month buying links route to compete in their chosen niche because that’s the money they had, and that’s what everyone else had done, these limitations do not morally indemnify the “black hatter” in Google’s eyes...

goodroi

8:51 pm on Jun 27, 2014 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



How about a user submitted search engine? ... the users through their upvotes or downvotes


That was tried back in the 90's and we had a great time abusing those serps. I have a huge smile on my face remembering how easy it was to abuse all of those old engines.

Technically speaking, I don't think they used downvotes in the 90's, but that would have invented negative SEO that much sooner.

nomis5

9:28 pm on Jun 27, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Sorry to contradict but, no, they don't. The shareholders are pretty well disenfranchised


Why do you say that? Maybe my mistake, but I thought G was owned by the shareholders. Or does G have the majority interest in their own shares - honest question?

jmccormac

9:48 pm on Jun 27, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



How about a user submitted search engine? Based on an imgur type script whereby it is the users through their upvotes or downvotes that decide which sites make it into the directory/search.
Most recently tried by Jimbo Wales and his Searchwikia project. Basically the people running it were good on the mechanics of search (it was built on Nutch) but the project hadn't a clue about Search quality and the importance of keeping the index clean. (I was on that open mailing list and it appeared that the plan was to "Open Source" the expertise to make the project's backers rich. The problem for the project is that real search engine expertise (not the Wikipedia scraper/blind crawling/infinite monkeys kind) is quite rare.) It actually worked to some extent but the index quality was complete rubbish. Google even tried to implement SERPs voting after seeing the project in action but quietly dropped it.

User submitted engines don't work because they never get to the critical mass required to become a player. People quickly tire of submitting sites, and meatbots and spammers will hammer such search engines the same way that they hammered web directories.
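The abuse problem described above is partly why naive vote counting fails: a site with three fresh upvotes would outrank one with a long, mostly positive voting history. A toy sketch of one common mitigation (all site names and vote counts here are hypothetical, and this is only an illustration of the idea, not how any real engine works) is to rank by a confidence lower bound on the upvote fraction rather than by raw votes:

```python
import math

def wilson_lower_bound(upvotes, downvotes, z=1.96):
    """Lower bound of the ~95% confidence interval on the true upvote
    fraction. A site with 90/100 upvotes outranks one with 3/3, which
    blunts (but does not stop) small-scale ballot stuffing."""
    n = upvotes + downvotes
    if n == 0:
        return 0.0
    p = upvotes / n
    return ((p + z * z / (2 * n)
             - z * math.sqrt((p * (1 - p) + z * z / (4 * n)) / n))
            / (1 + z * z / (2 * n)))

# Hypothetical submitted sites with community votes
sites = [
    ("example-a.com", 90, 10),    # many votes, mostly positive
    ("example-b.com", 3, 0),      # few votes, all positive
    ("example-c.com", 500, 480),  # heavily contested
]
ranked = sorted(sites, key=lambda s: wilson_lower_bound(s[1], s[2]),
                reverse=True)
for domain, up, down in ranked:
    print(domain, round(wilson_lower_bound(up, down), 3))
```

Even so, this only raises the cost of vote spam; it does nothing against large, coordinated voting rings, which is the scale of abuse meatbots bring.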

Regards...jmcc

webcentric

9:49 pm on Jun 27, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Problem with any kind of engine is how to assess quality without opening the floodgates to abuse and criticism. Let's see. What quality indicator can we use to power a truly unbiased, authoritative search engine?

Keywords will get you keyword spam.
Counting inbound links will get you link spam.
A voting system gets you voter fraud and ballot stuffing.
Personal analysis always seems to lead to some bias, favoritism, etc. (see DMOZ).
Paid systems favor deep pockets over quality but can foster a system where the more you pay the better your ranking will be. As long as you let people know rankings are based on ability to pay, at least you're being honest with your users.

Take one topic, any topic, and ask yourself how you would build a set of search results for that topic. We'll presume that you are the foremost expert in that subject so are qualified to make such assessments. Break on that point...segue to...

Can you build a set of links that pretty much anyone would recognize as unbiased, where anyone could see that the most pertinent information was at the top of the list? Never mind completeness; if something comes along it can always be added... back to the... question...

What criteria determine which listing goes in the number 8 slot vs. the number 9 slot? And when you've applied those criteria, whatever they are, will your audience appreciate your assessment and agree?

It really comes down to how much your users credit your authority as a search results provider. Most people see Google Search as

1. Comprehensive (representing all the content on the Internet in some fashion)
2. Suitable for finding what they need at a moment's notice.

Most don't consider algorithmic biases when they get the information they're looking for.

It's a good question...how do you rank your results? It's like trying to rank everything in the Library of Congress. If I type in satire, am I going to get Mark Twain first or Oscar Wilde?

MrSavage

3:11 am on Jun 28, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I know this is more of a "solution based" environment. Here's my go at it. Enjoy.

My proposal is to essentially submit my content to Google and the knowledge graph. If it gets published, I get a kickback. The issue, of course, is deciding what my fair share is. So essentially I become a writer for Google. If my websites get in the way of "convenience", then let me suggest away! If it's good enough for them to use and post the entire content on their site, then I'm given stats. I guess the issue is what it's really worth. It's like a radio station playing songs. How do they come up with the value for each play?

The way I see it right now, it's a free lunch. Wikipedia largely provides the info. Whelp, I'm here and I'm for hire. Pay me! If people search "how to ride a unicycle", then choose my article! Post it, but pay me for it.

That's my futuristic solution. Slightly "out of this world", but heck, paying for content? Isn't that what we all have to do? Our time, whatever. I think we're almost there frankly where links to our site simply get in the way. It's a very 90's way of thinking to suggest people actually use a search engine in 2014 to find sites. As webmasters and creators of content, perhaps we need to strike a deal. Let Google and Bing compete for our content and not our links. Pay us for it of course. Government departments that provide weather need to get onboard with this too. Somewhere along the line that weather forecast was created by somebody who has training and gets paid. Like I said, no free lunch. I think Cutts was essentially agreeing that "free lunch" is not exactly courteous.

micklearn

5:30 am on Jun 28, 2014 (gmt 0)

10+ Year Member



I think Cutts was essentially agreeing that "free lunch" is not exactly courteous.


I really hope that is what he believes, and that he resigns some day based on his own beliefs. He seems like a good guy.

MrSavage

6:25 am on Jun 28, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Well, since I referred to it, this is the part of the article that I found to be the most profound:

On point 2 he personally agreed that their taking content for the knowledge panel might seem unfair and they had already internally debated ways to ex post facto compensate site owners, possibly even monetarily.


My interpretation says no more internet: just submit your "stuff" and get it published on Google and Bing. Competition says it's only your content that's required, not that website and hosting nonsense. I interpret this as a rapidly shrinking peephole in which webmasters exist. Instant answers is leading to a hostile takeover, isn't it? Too gradual to have any meaning for some people, of course.

So let's talk solutions. It all starts by asking for, or creating, a per-word published royalty. Need stats, of course, like views, so that monetary solutions can be had. Who wants to create the first proposal?

Martin Ice Web

7:09 am on Jun 28, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month




On point 2 he personally agreed that their taking content for the knowledge panel might seem unfair and they had already internally debated ways to ex post facto compensate site owners, possibly even monetarily.


Big laugh on this one. In Germany this week a new antitrust case was brought against Google. Newspapers say they want money from Google because Google uses their content in the SERPs. Google says: if we have to pay for it, we will not show it in the SERPs anymore!
And now they are thinking of rewarding maybe millions of webmasters to compensate them for stealing their content?

superclown2

10:20 am on Jun 28, 2014 (gmt 0)



I thought G was owned by the shareholders. Or does G have the majority interest in their own shares - honest question?


The way Google is constituted means that shareholders (a) have no say in the way the company is run, and (b) receive no dividends. This means the only way they can make money is if Google's share price constantly rises. This also means that the founders can do pretty much whatever they want with the company's earnings.

It will be very interesting to see what happens when that share price eventually falls.

nomis5

9:55 pm on Jun 28, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



The way Google is constituted means that shareholders (a) have no say in the way the company is run, and (b) receive no dividends


The majority of companies don't issue dividends so that's the norm.

But how is G constituted so that the shareholders have no say in the way the company is run? Just asking in case my company takes off, I would rather like to mimic such a constitution. Wouldn't want to end up like Mr Selfridge did many years ago!

EditorialGuy

2:13 am on Jun 29, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



And now they are thinking of rewarding maybe millions of webmasters to compensate them for stealing their content?


Or maybe they could bill millions of Webmasters for all the referrals they've delivered over the years. That would make as much sense as paying publishers for the privilege of sending them free traffic.

webcentric

3:28 am on Jun 29, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Or maybe they could bill millions of Webmasters for all the referrals they've delivered over the years.


There are no contracts in place to support such a financial claim. What was given was given without any request, demand or reasonable agreement for compensation. On the other hand, there are laws regarding copyright infringement and intellectual property protection. IMHO, Google is tap dancing over the line between fair use and syndicating others' content without permission or license to do so. Cutts's statement about compensation hints at the licensing of content. Call it reimbursement if you want to. Google wants a license to publish your content (after it's taken fair use as far as it can).

A license can be defined as permission to do something that is otherwise illegal or unlawful. Requesting to be included in an index isn't the same as requesting that your content be syndicated commercially. An index by definition is a reference to other material, not the material itself. Google is bending the definition of itself from index to knowledge base and while I have no issue with anyone compiling facts, compiling and displaying original content from copyrighted sources without explicit permission seems over the line and it's going to get challenged again, and again and again. Hence the concept of just getting (and paying for if necessary), clear permission to step over the line.

iammeiamfree

2:19 pm on Jun 29, 2014 (gmt 0)

10+ Year Member



There was a TED talk by someone from Google about the surprising history of copyright, publishers and so on. Although there was a disclaimer at the end saying that these views were not necessarily the views of Google, the talk was pretty much about how copyright originated with censorship legislation in the UK after the invention of the printing press, and was in the interests of publishers and government rather than artists.

Personally, I am not really that much in favour of copyright, on the basis that copying is not theft. As a publisher, however, I suppose it wouldn't do me much good if my site was scraped, but this has basically already happened. Other sites have copied my ideas and I do the same in response. Take a peek, see which innovations are worth using. A lot of the sites in my niche are like that; it is obvious we are all copying each other. Not much can really be said against it. All my work is basically dependent on other people's work. To say it is mine is a big stretch when I didn't invent practically anything and millions of people before me played their part.

[edited by: iammeiamfree at 2:59 pm (utc) on Jun 29, 2014]

EditorialGuy

2:20 pm on Jun 29, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



There are no contracts in place to support such a financial claim.


Yep, and there are no contracts in place to support a demand for payment by Webmasters, either.

Now, can we get back to the real world?

iammeiamfree

2:30 pm on Jun 29, 2014 (gmt 0)

10+ Year Member



Yep, and there are no contracts in place to support a demand for payment by Webmasters, either.


Maybe not, but I suppose the obvious contract to amend would be the AdSense one.

webcentric

2:34 pm on Jun 29, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Now, can we get back to the real world?


Well, Matt is the one who introduced the idea. I never mentioned a demand by webmasters. Sounds more to me like preemptive thinking on G's part. Like a kid with her hand in the cookie jar, making up a plausible story before getting caught red-handed.

EditorialGuy

3:01 pm on Jun 29, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



OK, if you want to talk about "what if" scenarios, here's one to ponder:

Google could require "opt in" (via an agreement) for sites that wanted to be listed in results for commercial/transactional queries. Participating sites would agree to pay a small flat fee per organic click.

This would improve the quality of commercial results by discouraging low-quality e-commerce and affiliate sites from participating (sites with low conversion rates would find participation too expensive).

Site owners who didn't want to pay the fees would have an obvious alternative: Create pages with content of intrinsic value that would show up in informational results. In other words, the guy selling water skis who didn't want to pay a nickel or a dime per organic click could publish a guide to water skiing, advice on buying water skis, etc. instead of opting in to the commercial/transactional agreement.

What, that doesn't sound appealing? Well, we could always ask Google to stick with the status quo: Free traffic in return for crawler access. That's how the Web has worked since the years before Google existed, and site owners who don't want to be part of the Web can always use "noindex" or robots.txt to make their sites harder to find.

EditorialGuy

4:02 pm on Jun 29, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Here's another idea (thanks to Martin Ice Web for suggesting that fees should be part of the relationship between Webmasters and Google):

Google could charge fees for reconsideration requests or for re-inclusion in the search results after a manual penalty. The fee could be based on the number of pages indexed, the amount of traffic the site was receiving from Google before the penalty, or a combination of both.

This policy would help to discourage unnatural link-building and other questionable SEO tactics (even among megasites, since they'd be paying huge fees for reconsideration or re-inclusion), and it should please WebmasterWorld members who think searchers ought to use Bing, since it would let them publicize Bing as the search engine that doesn't charge fees.

supercyberbob

4:10 pm on Jun 29, 2014 (gmt 0)

10+ Year Member



Here's another idea.

How about I link-spam your site into the toilet, and you pay Google for the re-inclusion?

And while you're at it, pay me for my super SEO services as well.

webcentric

4:43 pm on Jun 29, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



@EditorialGuy --

1. I'm not sure why you are eager to defend monopolistic practices. I could play devil's advocate (and do on many occasions) and cite various achievements by various moguls that did benefit society as a whole, but I'd also have to mention the people and companies that were crushed in the process.

2. You raise a pertinent distinction when you mention commerce vs. information. Back in the day, webmasters published information and then discovered that the information platform could be monetized. Nowadays, webmasters are building monetization platforms and using information to draw people to them. It's crazy to think that the two can ever be separated, but in some strange way that distinction is at the heart of the problem here.

One webmaster complains about the results because their widget (catalog id xyz123) isn't ranking as well as some other site's page with the same widget on it (catalog id abc 666) and claims the quality of their site is better so why the disparity? "My page is better than theirs. Why doesn't my widget page get the best position?" That's the commercial side of this.

And as you pointed out, then there's the information side. The side where creative writing is involved? Original photography, artwork, graphics, etc. Ideas are one thing and I firmly believe there are very few if no original ideas floating around today that didn't percolate up from the collective consciousness. How we express those ideas though can be and often is "original."

My gripe here is when something I created is put to commercial purpose by someone else without my permission (and I mean explicit permission--not implied permission based on the fact that I didn't block the content from every crawler on planet earth). Crawling a website in order to understand its content in the process of creating an index of web-based content is far different in my mind than crawling the same website, extracting its content and then republishing it in your own commercial environment. Pretty much anyone here would take issue if I created a bot, crawled your site and republished your information on my site (next to my ads and affiliate programs), so why are the monopolistic entities given a pass for doing the same thing? You can't walk into a store and take someone's apples without paying for them and then walk across the street and sell them at your own stand. Why is this any different?

IMHO, an information "index" and an advertising platform are two different things but Google is trying to make them somehow live together and that is the source of the problem in a nutshell. Product information cluttering up the information landscape and/or information competing with everyone who's trying to market something.

Like I said, I think separating the two is an impossibility at this stage but what you suggest would be a lot more tolerable and at least put the marketing critiques in one thread and the information critiques into another where we could all keep them straight.

EditorialGuy

5:02 pm on Jun 29, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



One webmaster complains about the results because their widget (catalog id xyz123) isn't ranking as well as some other site's page with the same widget on it (catalog id abc 666) and claims the quality of their site is better so why the disparity? "My page is better than theirs. Why doesn't my widget page get the best position?" That's the commercial side of this.


Yes, and part of the problem is that so many pages are essentially the same: whether they're commercial pages with boilerplate descriptions, prices, etc. or ten different information pages that give the current time and temperature, flight status, or whatever. In some cases there is no "best" or "most relevant," so the rankings are going to seem arbitrary even if Google is looking at things like user metrics behind the scenes.

My gripe here is when something I created is put to commercial purpose by someone else without my permission (and I mean explicit permission--not implied permission based on the fact that I didn't block the content from every crawler on planet earth).


That train left the station in the 1990s. Besides, search engines need to cover their huge costs and make a profit, too. They get to put ads on their search results, and site owners get free traffic from those search results. The fact that just about everyone here wants more traffic from Google suggests that the quid pro quo is acceptable to most site owners.

iammeiamfree

5:11 pm on Jun 29, 2014 (gmt 0)

10+ Year Member



Pretty much anyone here would take issue if I created a bot, crawled your site and republished your information on my site (next to my ads and affiliate programs), so why are the monopolistic entities given a pass for doing the same thing? You can't walk into a store and take someone's apples without paying for them and then walk across the street and sell them at your own stand. Why is this any different?


It is different because, in the case of copying some web content, the original content remains on the source site, whereas if you steal an apple the apple is no longer in the store. It would be more akin to taking an apple seed from an apple you had got hold of, growing an apple tree, and then getting free apples. This is why there are efforts to genetically modify apples so that they can be patented and people then prohibited (or prevented) from growing apples, even though apples have existed as long as people, or longer perhaps.

Your content getting copied could be a loss, but it could also be a gain. It depends how it happens. If Google copies your meta tag, that is good, because then you get a visitor following the link. If they copy more content and try to reduce the number of people leaving the SERPs, then that might not be so good. In principle, though, copying is good for a webmaster if you are able to get content out there and visitors coming to your site as a result. A simple thing like a URL link is a copy of your URL, after all, and where would we be if nobody copied those? I mean, at least Google is copying URLs, but what would it be like if we all copied more URLs? Too many scrooge webmasters means generous Google gets all the traffic.

[edited by: iammeiamfree at 5:36 pm (utc) on Jun 29, 2014]

webcentric

5:19 pm on Jun 29, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Again, there's a difference (to me) in marketing a product or service and publishing information (even if the lines between them are completely non-existent in this day and age).

Want advertising? I say pay for it! When did Google ever announce that it is a free advertising service? Complaining about the quality of free advertising is a bit like looking a gift horse in the mouth.

Mixing advertising (and by that I mean someone's unpaid widget listings) in with links to information resources is the source of this debacle in many ways. I never expected that G would not some day monetize its search pages (e.g. with its own ad services). What's appalling is the fact that it gave away free advertising to those who would go on to abuse it and allowed purely commercial concerns to ruin a perfectly functional information index.

Brett_Tabke

5:58 pm on Jun 29, 2014 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



> Anybody else remember what it was like doing SEO
> BEFORE google became the prominent search engine?

Yes, it was totally freakin awesome.

The top listing for "mp3" on altavista in 1999 produced 7 million referrals in 4 days (and generated $39k in ad income, and $12k in affiliate income for me).

The top listing on Yahoo/Inktomi in 2002 generated 131,000 page views in 2 days.

The top listing on Google for "mp3" in 2001 for 12 days, generated 21,000 referrals.

Samizdata

6:19 pm on Jun 29, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



The top listing for "mp3" on altavista in 1999 produced 7 million referrals in 4 days

And then everyone started to use Napster instead.

...

Brett_Tabke

9:19 pm on Jun 29, 2014 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



That was in the heyday of Napster, and why 'mp3' was such a fire-hot keyword.

That's just one example of hundreds.

FranticFish

10:22 am on Jul 2, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Anybody else remember what it was like doing SEO BEFORE google became the prominent search engine?

Yes, it was great.

Search referrals came from a wide range of different engines and if you didn't rank well in one then you could still get decent traffic from the others.

It was also far, far easier to rank :)

webcentric

3:38 pm on Jul 2, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



However, the internet is not just a stream of endless adverts, but if G wants to turn itself into the shopping channel then great.


Again, blurring the line between advertising and information (as Google and those who use Google as an advertising platform have done) is at the root of much of the discussion in this forum. There really should be two forums on the topic of Google SEO on this site: one for people who have information websites and one for people who are marketing their wares. Better yet, two versions of Google. Of course nothing is so black and white as that (well, except black and white), so implementing such an idea would be a joke. But just because something can't be fixed doesn't mean it isn't still the source of the problem.

EditorialGuy

5:26 pm on Jul 2, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



There really should be two forums on the topic of Google SEO on this site. One for people who have information websites and one for people who are marketing their wares. Better yet, two versions of Google.


Yahoo used to have a "slider" control that let searchers shift the emphasis of its search results from commercial to informational or vice versa. I'd love to see Google offer something similar--or maybe a simple "I'm shopping for:" or "I want to know about:" toggle.

As it is, searchers can choose from "news," "images," "videos," "maps," and "web," so why not break "web" into separate results for informational and commercial searches?

Of course, spammers and content marketers would try to cram the "I want to know about:" results with commercial pages disguised as information pages, but aggressive filtering could address that problem.

webcentric

5:37 pm on Jul 2, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Of course, spammers and content marketers would try to cram the "I want to know about:" results with commercial pages disguised as information pages


Hence my previous comment about things not being so black and white. Looks good on the drawing board though. ;)
This 65 message thread spans 3 pages: 65