
General Search Engine Marketing Issues Forum

Is this the right direction?
Technique vs. results
yoda0482
3:47 pm on Jan 12, 2003 (gmt 0)

Having designed over 100 sites since 1995, I have been involved with SEO from the get go. I read all there is and try to keep up to date.

Yet, it appears to me that we are all going in the wrong direction. Search Engines seem to be focusing on technique, instead of concerning themselves with results. We, as developers, spend a great deal of time refining techniques, arguing over which technique is out of favor and which is the new one, and so on. This comes at a cost to our customers.

Should not the focus be, is the result deceptive or true?

I am having difficulty with the concept that META tags, which are hidden to the viewer, are OK, yet, a hidden div after the body tag is not. They are both invisible.
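To make the comparison concrete, here is a rough illustration (placeholder markup, not taken from any site in this thread) of the two techniques being contrasted; neither one is ever rendered for a human visitor:

<head>
<!-- meta keywords: never displayed, historically read by some engines -->
<meta name="keywords" content="widgets, widget design, widget repair">
</head>
<body>
<!-- hidden div: also never displayed, but treated by Google as hidden text -->
<div style="display: none">widgets, widget design, widget repair</div>
<p>Visible page content goes here.</p>
</body>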

Isn't any form of modification, to increase one's ranking, a form of Spamming?

Is it wrong to work a page or site to improve a search engine's ability to rank that page properly? No.

Are we not helping search engines return better results? I think so.

An example: a Google search for <keyword> returns an ISP as the first non-paid result and a newspaper as the third. Nothing to do with <keyword> at all! In fact, 5 of the top 20 were not <keyword> firms at all. Others were not from <location>, though I admit they may do business in <location>. Since <keyword> firms are aware of optimization, this result is better than many returns.

Should a domain be banned by a SE, because they optimized their site to return a better ranking for keyphrases that actually pertain to their business and the query? Does technique matter or does the result matter?

If I am a <snip> and do a lot of business in <snip>, is it wrong to optimize my site so that people looking for that see my site in the query results? Or is it better that they see a newspaper?

Well, that is the case. Yes, I used a hidden div with only pertinent keywords, and only a few. So now my company, which provides exceptional value for businesses in <snip>, is off Google. How does that help the businesses in <snip>?

It is interesting to note that we were reported as part of a direct attack on a company I design for. My company and the company that owns the studio, which is not involved in site design at all and did not appear in the searches where we were found, were collateral damage. Others who are "aggressively" optimizing their sites are still in the query returns.

I propose that Google, and others, should not be concerned with what technique was used, but with whether the result is deceitful or misleading. We should not go down the road where one form of invisible text (META tags, for instance) is OK and hidden divs are not. We can always find a way around these things, and as history suggests, new loopholes are born every time you try to plug one up. Look at the tax laws. This is unfair to businesses that have to bear the costs of optimization, the possible risk of being banned and the loss of revenue - all so that their product or service can be found properly on a search engine and they rank better than non-relevant results caused by search engine deficiencies. In fact, our techniques can be used to improve a search engine's results.

Abuse reporting should be aimed at those who are deceitful: those using techniques to mislead and drive traffic to a site that has no bearing on the query.

If we want search engines to work better, I feel it is time for new thinking.

[edited by: NFFC at 11:51 pm (utc) on Jan. 12, 2003]
[edit reason] Please, no specifics [/edit]

 

europeforvisitors
5:03 pm on Jan 12, 2003 (gmt 0)

I am having difficulty with the concept that META tags, which are hidden to the viewer, are OK, yet, a hidden div after the body tag is not.

Google ignores meta tags because they're hidden from the viewer and are so easily manipulated.

Isn't any form of modification, to increase one's ranking, a form of Spamming?

Spam is in the eye of the beholder. Any of us is welcome to define spam in whatever way we want. But Google is also welcome to define spam in whatever way it wants. And I suspect that convincing Google to approve of "spider food" with artificial ingredients will be about as easy as convincing an organic-food enthusiast to eat a genetically modified fruit-with-pesticide cocktail. :-)

john316
5:27 pm on Jan 12, 2003 (gmt 0)

Until SE's apply some "real world" sensibilities to their algos, it's a case of what you see is what you get.

There is no logical reason to build a web site, no matter how "content rich" and well designed, only to land on the 4th page of the SERPs; in fact, doing so on behalf of a client might qualify you as incompetent, or at the very least ineffective.

Unless you are including:

User-agent: *
Disallow: /
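# i.e. every compliant crawler is told to stay out of the entire site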

in your robots.txt, you are at some level "spamming".

Actually, the definition of "spam" is continually changing; at some point in the future, it may mean "your site competes with our advertisers".

MHes
5:54 pm on Jan 12, 2003 (gmt 0)

"Isn't any form of modification, to increase one's ranking, a form of Spamming?"

No. If you have a good content site and modify it so the spider can see all the pages, that's not spamming... it's good business.

BigDave
6:14 pm on Jan 12, 2003 (gmt 0)

Well, that is the case. Yes, I used a hidden div, with only pertinent keywords. And, only a few. So, now, my company, who provides an exceptional value for businesses in NH and MA, is off Google. How does that help the businesses in MA and NH?

Yes. It keeps them from hiring a web designer who will use tricks that will get them kicked off Google. If they are using Google for their searches, I bet they will consider it important that they show up in the results.

Instead of using hidden text, which has been on Google's list of don'ts for quite a while, why not produce visible text which would meet your goal, and also let your user know what you are trying to tell the search engine?
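As a minimal sketch of that idea (the company name and copy here are placeholders, not taken from the thread), the same information can simply live in the rendered page, where visitors and spiders both see it:

<body>
<h1>Example Design Studio</h1>
<!-- visible to visitors and to the crawler alike -->
<p>We design web sites for businesses in New Hampshire and Massachusetts.</p>
<!-- ...rest of the visible page... -->
</body>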

If you are a web designer and SEO, you should know how to do this. You should also keep up with all the changes in Google's policies. It is one of the things your clients are paying you for.

martinibuster
6:35 pm on Jan 12, 2003 (gmt 0)

why not produce visible text which would meet your goal,

I think we all know the answer to that one: It will spoil the look.

Sometimes that is the client's fault. Sometimes it is the designer's fault.

In case 1, it is our job to educate the client. In case 2, it is our job to work within the constraints put upon us by SE considerations and produce something that works for spiders and humans.

There's an article in one of the recent Communication Arts magazines in which the writer discusses two types of creatives: those who work without restraints and create brilliant work that never sees the light of day, and those who work within their constraints and still manage to produce great work that is seen by many.

Just found it online. The article is called "In space, no one can hear you being creative." Found it here [commarts.com].

There are many, many instances in different media - classical and jazz come to mind - where a particular form of expression was thought to be played out, until along came some old-timer who knocked the socks off those who claimed the life had gone out of the music. It wasn't the mode of expression that was at fault; the fault lay with those who couldn't cope with creating for it.

The desired result dictates the technique. There is no versus.

[edited by: martinibuster at 7:00 pm (utc) on Jan. 12, 2003]

Unversed
6:50 pm on Jan 12, 2003 (gmt 0)

What is so hard about putting a visible sentence in the page text that says "We are based in NH and have a lot of business with customers in MA", or whatever?
Surely that helps human visitors and robots?

I cannot see the merit for anybody in using hidden text, meta or otherwise, to spam. If the user arrives at your page and it doesn't have what they want they'll just click the back button and you'll have achieved nothing except wasting their time and your bandwidth.

Lots0
7:11 pm on Jan 12, 2003 (gmt 0)

Yoda0482,
you make some VERY good points.

Yet, it appears to me that we are all going in the wrong direction. Search Engines seem to be focusing on technique, instead of concerning themselves with results.

This is absolutely true, IMO. The SE's, in this case Google, have led the charge against technique and have lost sight of the real goal: providing relevant search results.

I am having difficulty with the concept that META tags, which are hidden to the viewer, are OK, yet, a hidden div after the body tag is not. They are both invisible.

I have never thought of this in quite that way - but I believe you are correct as far as meta description tags go. As long as the description tags were used "correctly" and did not have misleading words in them, they were OK. Say your site is about widgets and you put the word "gidgets" in the description tag; then you were spamming, because you were trying to deceive the user about what was on your site.

Shouldn't that same logic be applied to div tags? I think so; it makes sense.

Isn't any form of modification, to increase one's ranking, a form of Spamming?

Absolutely, positively correct.

When you are writing copy and trying to figure out how to put that keyword in one more place while keeping the copy readable - guess what, you are spamming. When you are trying to get that anchor link with just the right keywords - guess what, you are spamming. And the list goes on and on.

Google has made this issue of "spam" very arbitrary - there is good spam (what Google likes) and then there is bad spam (what Google does not like). And some of what Google says it does not like, and will ban a site for, Google uses itself - like IP detection and redirection (cloaking).

Should a domain be banned by a SE, because they optimized their site to return a better ranking for keyphrases that actually pertain to their business and the query? Does technique matter or does the result matter?

You would think so, wouldn't you? But Google is more determined to control what Webmasters do than to provide relevant search results.

If you don't believe me, just check out GoogleGuy's posts in this forum for the last several months - very scary - all about control and showing what happens if you make Google angry. Hardly a word about anything of substance.

Fortunately, a lot of Webmasters and SEO/SEMs have started to see and focus on this arbitrary behavior Google displays about spamming. And a lot of us are getting very tired of it.

yoda0482
7:30 pm on Jan 12, 2003 (gmt 0)

I am loving these replies; they are helpful.

Remember, this entire site, this forum and ones like it are about spamming.

As GoogleGod says on their site:
"Would I do this if search engines did not exist?"

And remember, writing for people and writing for SEs are two different things.

Top design teams are not hired because they have the first line of text read, "We are web site designers for New Hampshire and Massachusetts companies seeking..."

Really...

Unversed
7:40 pm on Jan 12, 2003 (gmt 0)

Top design teams are not hired because they have the first line of text read, "We are web site designers for New Hampshire and Massachusetts companies seeking..."

I'd think not - that's terrible copy. Try "Successful companies in New Hampshire and Massachusetts come to Widgetarama for their web design because..."

heini
7:44 pm on Jan 12, 2003 (gmt 0)

yoda0482 - welcome to WebmasterWorld. I think you are absolutely right. Usefulness of results to Joe User should be the only factor to determine the quality of results. And in a very abstract way that's what's happening.

All engines claim to have a ranking methodology which ensures exactly this. Their algos are set up to measure usefulness of results for the users.

However, the pressure is immense - it's a multi-billion-dollar game. All players - engines, advertisers, publishers, site owners - want as large a piece of the pie as possible.
What happens is that this tool to measure usefulness gets fetishized and protected like a sacred ritual device, while Joe all of a sudden is nothing but an operational quantity in this game.
As has been stated again and again: with billions of pages out there, it doesn't make any difference to Joe if a couple of thousand sites are indexed in an engine or not. Nor does it matter to Joe that a page with hidden text might get deindexed.

It's nothing but a big game. If you got burned, shake it off and start anew. No sense in appealing to justice or mercy. And not much sense either in repeated claims of ethics etc. on either side. Ethics usually follow interests. Rarely has this truth been more obvious than in the discussions on spam.

Lots0
7:51 pm on Jan 12, 2003 (gmt 0)

It made me LOL and then made me think a bit, but heini, you hit the nail on the head.
Ethics usually follow interests. Rarely has this truth been more obvious than in the discussions on spam.

yoda0482
7:53 pm on Jan 12, 2003 (gmt 0)

True, and it has better relevancy - but that is spamming.

Seriously, go through the Comm Arts archive and look at the companies who designed those sites. Pretty tough to include content-rich stuff, especially on the home page. And then look at Fortune 500 companies; they don't want a home page or pages that are like that either. How many times have you been told, "I want my site on coolhomepages.com"? I don't blame them. Yet, there is nothing that can be done for those pages without spamming or redesigning for a more text based approach, which, as we all know, most people do not want to read.

True, copy on inner pages may carry more weight and do better. Yet, it should be written for the viewer and not the SEs, so rank will suffer.

So, my new SEO is basically no SEO at all.

Let my title (Company Name and Page Name), whatever text is on my inner pages (which is usually kept short and sweet with no SEO in mind) and of course whatever links from outside there may be, be it.

Because anything else is spam.

martinibuster
8:22 pm on Jan 12, 2003 (gmt 0)

Communication Arts is the poster child for promotion of bad web design. They don't understand the concept of web design being different from print design. Their archive is composed overwhelmingly of flash sites.

Flash site? That's not web design. That's print and film collateral repurposed for the web through the medium of flash. Please don't call it web design. It's not.

They say things like "breaking the rules" or "pushing the web to its limits," but that's pretentious nonsense. You have to know the rules before you break them. You have to understand the limitations before you can push them. Something they don't do, because their backgrounds are print.

Web sites designed by graphic designers very often end up being round pegs squeezed into square holes. And Communication Arts is the biggest cheerleader of that kind of junk. That is the problem.

The answer, of course, is to seek a different route by trying to understand the limits and rules of Design for the Web, then working within those constraints to come up with the innovative solution. It can be done. It's done every day.

yoda0482
8:35 pm on Jan 12, 2003 (gmt 0)

martinibuster...

True, to some degree. Yet the consumer is used to TV and print. Companies are interested in the look and feel. They want their site to be like a salesperson - one who is dressed well. Reading on the net is tough and most people don't.

If you go to a site that is plain and boring, most people will not have the confidence to do business with that company.

The web should not be limited to text only, but should include visually rich and interactive displays - yes, even entertaining ones.

That is why SEO is tough (while staying ethical).

(BTW love the handle)

HEINI

Not trying to vent anger; a couple of martinis helped that. But I am trying to decide what to do for the several agencies I design for, as well as the 16 in-house jobs, the past jobs, ongoing jobs and future jobs. I have spent 4 long days trying to come up with what is best. Not easy, as we all know. Companies hire us because of our look and feel. They want to do well in the SEs. You are right on with what you say: ethics usually follow interests. Yet Joe User does suffer, whether they realise it or not.

Thanks.

[edited by: NFFC at 12:20 am (utc) on Jan. 13, 2003]

europeforvisitors
9:01 pm on Jan 12, 2003 (gmt 0)

Ethics usually follow interests.

Not every corporation is an Enron, and not every Webmaster or SEO believes that "if you think you can get away with it and profit from it, do it." But that's really a topic for another thread, IMHO. I don't propose to get bogged down in a debate about ethics vs. "situation ethics" in a thread about Google.

Yoda0482's original question was:

"Should not the focus be, is the result deceptive or true?"

I'd argue that:

1) Google is focusing on "the result" (i.e., what's on the page that the reader sees); and...

2) The process of identifying spam techniques (as defined by Google, not by us) is a legitimate tool for (a) determining whether content is likely to be relevant, and (b) encouraging Webmasters and SEOs to abide by Google's rules if they want to be indexed by Google.

But in the final analysis, Google gets to decide how to rank pages and, where necessary, penalize the pages or domains in its index. And ultimately, it's search users who'll decide whether Google has chosen the right approach.

Lots0
9:33 pm on Jan 12, 2003 (gmt 0)

Not every corporation is an Enron

No, but quite a few are. Or else all those CEO’s wouldn’t be going to jail.

Some people on this board like to argue that Google can do no wrong and has never made a mistake of any kind - regardless of logical arguments and proof to the contrary. Makes me wonder what their real motives are.

BigDave
9:54 pm on Jan 12, 2003 (gmt 0)

Yet, there is nothing that can be done for those pages without spamming or redesigning for a more text based approach, which, as we all know, most people do not want to read.

and

Reading on the net is tough and most people don't.

I think you are mistaken, especially on the second quote.

Most people now spend a good portion of their day reading online. That is exactly what e-mail is.

Even the people who tend to scan things are likely to put more trust in a site where they know they can get additional content when they need it.

When people go shopping for cars, they go to the flashy, content-poor websites of the manufacturers to see what options are available, and they get to play with the "build your own car" gimmicks. But when they are shopping for cars online, they go to Edmunds, Autoweek, Car & Driver and Consumer Reports to READ the articles about something they are about to spend $30k on.

A good website will provide both summary and in-depth information. Learn to connect the two pieces.

jk3210
9:58 pm on Jan 12, 2003 (gmt 0)

>>Makes me wonder what their real motives are<<

What do you feel their real motives are?

Lots0
10:59 pm on Jan 12, 2003 (gmt 0)

Is not good SEO just trying to make your pages more relevant for the search terms on your page?

I think the days when a site that sells widgets uses porn keywords to get more traffic are over. This is deceptive (and I guess some would call it spam), but more to the point, it does not work if you are trying to make sales - it just makes searchers angry and unwilling to buy anything. They know they have been deceived.

Making your site more relevant to the search engines for the terms you are targeting is what it is all about. If you're selling widgets, you want your site to be the MOST relevant to the SE's for the term "widgets".

And if your site truly does sell widgets, why should the SE's care how you go about making your site more relevant for the term "widgets"?

<added> Sorry jk3210 but if I answer your question this thread will get closed by one of the mods</added>

percentages
2:10 am on Jan 13, 2003 (gmt 0)

yoda0482,

I agree with you. If the site is relevant to the search, then it does seem to be hurting Google's own search quality goal to penalize it for using techniques it considers spam and/or against their guidelines. They are deliberately reducing the relevance of their own results by removing such sites.

I don't see where Google gains by doing this to relevant sites, other than hoping they will switch to Adwords afterwards. The use of "off page" considerations in their algo has a similar effect as the "on page / hidden text" techniques you used.

Some webmasters simply say if I can't use a hidden layer, I'll use other web sites I control/influence to artificially boost backlink popularity and anchor link text. Google will never be able to stop that happening, so maybe it should rewrite the rules on what it considers to be "fair play"?

I've heard the argument that they could research the domain owners... think it is difficult to make these all different? Often they already are. There is no real way to track who owns what on the Internet without a bunch of court orders. How about throwing out all sites that link to someone who links to a possible spammer? That has some serious consequences too.

Currently, I believe that by these types of actions Google is encouraging webmasters to set up multiple sites and establish link partners to artificially boost PR and anchor text advantages. That is partly what leads to half the sites in a category being owned by the same root source.

People who intend to win are not going to give up easily; they will always find ways around these problems, and it is a game of chess Google cannot ultimately win, IMHO. Google is just encouraging an arms race against webmasters. Surely it would be better to let webmasters fight that war between themselves?

For a search engine to ultimately combat all "unfair" tactics, it is going to have to hand-review every site before it is included in the index, and then review it again at regular intervals. Totally impractical! Or it could base its search results on human-edited descriptions... that didn't work for Yahoo!; think anyone wants to go there again? The concept of developing an algo to spot "unfair" tactics is a faulty one: those implementing the "unfair" tactics will keep developing new ones. I see it as a never-ending tail-chasing exercise for SE's.

I tried yoda0482's suggested search and the results aren't that great. It is a bit better if you reverse the search and put the region in front of the industry. Results are good if you put quotes around the term; maybe Google expects users to be more intelligent? I would have thought a simple algo fix would combat that issue, as we all know quoted searches are only a small percentage: more algo weight on the exact phrase typed, followed by additional weight on the collection of terms entered being close together. I have noticed over the last couple of months that Google has moved a tad in this direction; maybe ramping that up would help the algo produce much better results and in turn reduce the need for webmasters to try dubious tactics.

In my experience more people use the entire state name than the abbreviation, so I tried that with yoda0482's search also. 4 of the first 5 sites were totally irrelevant, and take a look at the source code for the other one! I guess that site will also be reported and banned soon (even though it is relevant), leaving no relevant sites at the top at all... Is banning these sites helping anyone? Maybe only the guys who get paid for selling sponsorship and Adwords.

People will end up simply having to do more searches or use another engine to find relevant sites. The former means more Google bandwidth, the latter fewer Google users. Some may say Google isn't perfect but is better than the rest; try yoda0482's suggested search at MSN, or even better the Pure Ink search, and compare the results for relevancy!

Until those at the Googleplex take a new perspective on this subject, I guess the best thing to do is create "ugly" pages with all the relevant keywords you want written into "ugly" content at the bottom of the page. Google doesn't ban for "ugly" visible content (at least as far as I know), and most people won't bother reading it anyway as long as it is separated from the obvious quality stuff but still visible. Users who scroll down may think you are crazy, but at least you won't be poor. :)
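A rough sketch of the kind of block being described (all names and copy here are placeholders), visible to anyone who scrolls down but kept apart from the main presentation:

<!-- placed at the very bottom of the page, after the main content -->
<div id="service-areas">
<h4>Areas served</h4>
<p>Widget design, widget repair and widget consulting for businesses in
Exampletown, Sampleville and the surrounding counties.</p>
</div>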

It simply isn't possible to get all relevant words into user-friendly, readable sentences for many sites. There are probably at least 2,000 relevant keywords making up phrases suitable for yoda0482's site, and I'm sure they would like to be found for them all. I deal with many regional sites; you have to get the state, county, cities, towns and all of the industry-specific words in there to maximize traffic - very tricky to do in readable sentences. Split it up into separate pages and you run the risk of getting banned for creating doorways!

I have a couple of competitors that state on their sites that the following text is only relevant to search engines, then write keyword-stuffed sentences to improve rankings. The sentences are repetitive and almost gibberish, but they don't seem to get penalized by the folks at the Plex, so I guess this is allowed? They have several thousand sites designed like this; the Google spam department is going to be very busy if it considers this against the guidelines, which in theory it should.

GoogleGuy, feel free to confirm this tactic isn't allowed and I'll happily have several thousand spam reports sent to the Plex ;)

Personally, I would like to see Google change its guidelines to say that if you attempt to manipulate search results to deliberately produce non-relevant results purely to increase traffic, then you will be penalized. I have no time for these types of sites either, but remaining competitive based upon having to fill out spam reports is not the way to go, IMHO.

It would be nice if we could all get on with producing more quality sites that appear closer to the top of the relevant SERPs, and spend a lot less time worrying about SE spam. ;)

Ultimately making a site the most relevant for a relevant search term should be left to SEO's to battle out between themselves. I don't want to see a massive or inadequate Google Spam Squad policing web sites for possible infringements of the Google Bible. Especially when the Bible isn't adequately published.

If you search for "widgets" and get porn, then a small number of manual editors/analysts should be able to handle it. But if you get a widget site with "widget" text hidden so that it looks more presentable to the public, I personally don't see the problem. The searcher is happy, the site owner is happy - what more does Google want?

Yes, all site owners will need an SES/SEO if they want to rank high, but they also need a good web designer, a good host, a good copywriter... SEO is just another advertising cost of doing business on the web. If they don't want to pay for an SEO they can always buy Adwords/sponsorship, and Google wins again :)

martinibuster
3:16 am on Jan 13, 2003 (gmt 0)

It simply isn't possible to get all relevant words into user-friendly, readable sentences for many sites.

It is possible. If you have keywords, you crank out some more pages. It doesn't take an Einstein to figure out the value of a FAQ page, or the value of a product description page.

There is no need to fit hidden divs into a web page, unless your web page is a repurposed magazine ad. And that's the equivalent of a Flintstones car where you stick your feet through the floor and push.

How is anybody going to find you if all you have on a page is pictures and a witty tag line?

The web is about information. People go online to find things out. If all you have on your slow loading web site is pictures of vacuous skinny people with latte moustaches, guess what I'm doing?

You have to sit down and work within your constraints, and if you put your noodle to work you will come up with brilliant ideas for a spiffy web site and be visible on the web.

How far you get depends on the noodle-power.

:) Y

percentages
8:47 am on Jan 13, 2003 (gmt 0)

10 hours and 22 posts later this thread gets moved to the "end of the universe".....I already suffer from severe paranoia.....please don't encourage it ;)

Penalties imposed by Google seem relevant to Google news to me.

MartiniBuster, if you think you can cram 2,000 keywords into any one page or numerous pages without being accused of creating doorways, please sticky me an example.

We are not talking about graphic-rich content; that item was never introduced and is irrelevant to this thread, as no one here seems to be implementing it.

We are talking about penalizing sites for being relevant but breaking the rules laid out in the secretive Google Bible.

Lots0
5:44 pm on Jan 13, 2003 (gmt 0)

10 hours and 22 posts later this thread gets moved to the "end of the universe".....I already suffer from severe paranoia.....please don't encourage it.

I don't think you're paranoid - it happens to me all the time here.

Back to the subject,

I have yet to see a good argument against the following:
We are talking about penalizing sites for being relevant but breaking the rules laid out in the secretive Google Bible.

I would like to hear the Google supporters and/or employees comment on this.
Why does Google care about how you try to make your site more relevant - as long as what you do is not trying to deceive the users about what is on your site?

noodlehead
8:16 pm on Jan 13, 2003 (gmt 0)

Martinibuster said: "The web is about information. People go online to find things out."

I kind of disagree. Though you're correct, the internet is a lot of different things to a lot of people. To some, sure, it is a giant library at their fingertips. To others it's convenient shopping, to some it's entertainment, and to me it's a job... so on and so forth.

Now to my point about the overall thread...

Obviously Google does have a CONTROL issue. If they did not, it wouldn't matter what technique you use to optimize with. Google obviously wants to sell more Adwords. Just another engine down the old Yahoo greed path. I think whatever the algorithm is, it equals $$$.

Adding new content, even though it's visible, specifically to improve search engine results is spam. And it's aesthetically displeasing.

By the way martinibuster, In my opinion: "A picture is worth a thousand words"

Marcia
8:27 pm on Jan 13, 2003 (gmt 0)

>>"A picture is worth a thousand words"

That's most certainly true in some cases, as martinibuster and all of us are aware. Great idea, maybe there's a market for a graphically intense directory as an option for those who want pictorial representations. As always, market demand would be the deciding factor in its success.

But in spite of the internet being many things to many people, how many have knowledge of how crawlers work? There has to be a compatibility between the medium of expression and the technology used to present it. Crawlers don't do graphics.

To draw a parallel: people love to follow links. Can magazines and other print media make use of hyperlinks?

noodlehead
8:45 pm on Jan 13, 2003 (gmt 0)

My comments about what the net is to different people are completely independent of search engines and how they work. Pretend as if search engines don't even exist. Forget crawling... that's not relevant to what the net is. Now re-read.

As far as technology being capable of presenting our expressions, browsers do a fairly good job of allowing most things. And paying visitors use browsers - usually newer versions of browsers that render images and other objects just fine.

Yes, magazines offer links to URLs all the time? Oh, you mean for navigating pages... Based on that logic, I'll ask you this: should we not use the full potential of the internet because you can't browse the web with a magazine?

homegirl
8:57 pm on Jan 13, 2003 (gmt 0)

I stand by Martinibuster. The web is all about information. Information isn't always equivalent to categorization as a library. Pricing is info. News is info. Pictures or graphics can convey info. BTW, this idea that the web's all about info isn't new; it's central to usability theory (of which Jakob Nielsen is Guru).

Accusations of "control" seem pretty pointless when all I can see Google doing is attempting to prevent its livelihood from being manipulated. What I'm not seeing is specifics on how Google penalizes sites for relevancy. Make sure it's relevancy and not ranking that people are critiquing. (There is a difference.) Example: SiteX only has links from sites that themselves turn out to be link farms (lots of links, poorly organized, very little info or data other than a list of links). If SiteX gets penalized in its rankings for its keywords, is Google to be blamed for SiteX's poor rankings for its keywords? (Even if SiteX is very relevant for its keywords?) I don't think so. However, if SiteX has links from many sites which discuss the content of SiteX or why SiteX's keywords are relevant for SiteX - and Google fails to accord SiteX a higher ranking - then I can understand how Google has a problem.

Truthfully, I don't think the issue is that Google cares about how you try to make your site more relevant for your keywords; I think Google cares that you not attempt to manipulate its algorithm (which of course has its shortcomings) to increase your ranking. Making your site more relevant for its keywords involves content (yes, there are shortcoming with regard to visual data; but much of this can also be expressed through written data), involves a community or the recognition of others that your content is useful, valuable, valid, etc., and some basic organization from both a spider's perspective and your user's (e.g. it's not just spiders that can't read flash... a lot of users still can't unless your audience is very specific). The lines do become fuzzier for those keywords that are highly competitive- but so does creating an algorithm that understands what would be relevant for a particular user when they input a specific query.

The basic questions tend to remain the same among many SEOs- and it's not "Why is Google screwing me?". It's more "how do I build for the market? the users?" first and the rest follows. Focus isn't Google. Google is just one means; albeit a pretty important one. And SEO isn't the entirety of any marketing strategy (or it shouldn't be).

bird
9:10 pm on Jan 13, 2003 (gmt 0)

if you think you can cram 2,000 keywords into any one page or numerous pages without being accused of creating doorways, please sticky me an example.

That's the most trivial thing to do, if you have content about those keywords. And if you don't have content about those words, then your site is not relevant for them.

Remember that Google ranks web pages; it doesn't rank businesses. Your business may be relevant for a certain keyword, but if your site doesn't provide information about it, there's no reason for Google to rank it high on that search.

Lots0
10:44 pm on Jan 13, 2003 (gmt 0)

...I don't think the issue is that Google cares about how you try to make your site more relevant for your keywords; I think Google cares that you not attempt to manipulate its algorithm

This is the crux of the problem as I see it... How can you make your site more relevant without manipulating the algo?

Let me put it another way - why should Google (or any SE) care if a page uses hidden text, as long as the hidden text is relevant to the visible text, graphics and functions on the page?
This is a question that has been asked before - but never answered...
