joined:Sept 1, 2000
I know, we’ve all said it dozens of times and there’s nothing new here, but maybe we should take another listen, because seriously, I don’t think most of our esteemed membership gets this. You all want to make it some big deal, a conspiracy theory or a huge change in the algorithms or penalties (that’s a popular theory), to get around the idea of what Google wants. What feeds Google? Isn’t that what we all want to know? Isn’t that the key to our success these days?
My advice, you want my advice on Google? Take a great big step back and look at the big picture.
worry less about PR on the toolbar and more about rankings. – GoogleGuy
And less about rankings for high-profile phrases and more about overall rankings. – GoogleGuy
And less about rankings and more about traffic. – GoogleGuy
And less about traffic and more about conversions – GoogleGuy
Can I use that, GoogleGuy, on my website?
Did everyone note this comment?
Sorry to once again draw attention to your comments, GoogleGuy. I really try to cut you slack, but when something as profound as this comes through, I can only hope that drawing a bit of attention to it will in turn help us all.
Two clients, knowing in advance that I hadn’t been following the update, asked me what I felt about it and whether I’d peeked in and followed what was going on. For me it’s not the updates that get my fire burning, but I certainly understand why it burns in the core of many of our members. I think the core of what I found to work with Google is as successful today as it was when I first discovered it 3 ½ years ago. The problem is that what I discovered isn’t very exciting.
I find success with Google when I diversify.
I noticed that bit too, and have noticed comments by many, including myself, that suggest that there was a lot of information there.
I think there is now the possibility to over-optimize your site for a specific keyword or phrase.
Those of us that are not very dependent on a *main keyword* seem to be doing at least as well as normal if we are in this index, though we may not be doing as well on our previous top performers.
There are several "advances" that could describe what I am seeing, but there just isn't enough data there to support anything yet. I think some of what is going on should (but probably won't) make your forum a lot more popular.
No matter, everyone should keep on with developing good content. In the long run it is the best way to CYA when big changes come along.
>Those of us that are not very dependent on a *main keyword* seem to be doing at least as well as normal if we are in this index, though we may not be doing as well on our previous top performers.
Problem is, for those of us running narrow amateur sites this is often the case, particularly when there aren't very many "competing" sites. The reasons are #1) there aren't that many searchers a day to begin with, and #2) because of the lack of competition, people find simple search phrases work well enough. Only when simple searches fail ("widgets") do people tend to start refining to get what they need ("blue widgets Topeka Kansas"). For competitive commercial searches this sort of keyword phrase targeting is the key. I do very well, even in the new index, on more complex search terms. However, my logs show most searchers aren't using those.
To distribute (investments) among different companies or securities in order to limit losses in the event of a fall in a particular market or industry. – dictionary.com [dictionary.reference.com]
I think that means spread things around. So we all know Google likes links in and links out, and Google does not appear to benefit sites that provide duplicated content (dull, overused, boring, mirrored – ok, I am showing my bias). Google does appear to like (what does "like" mean? Ugh) content that not only stands out but that other sites link to because they too think the information is interesting, unique, shall we say original? (Aside: I don’t really think Google has actually figured this out. I think we’ve just figured out ways to work with what Google’s algo seems to do.)
And could it be that we just don’t want to put all our eggs in one basket, and that includes the Google basket? I actually think that Google likes sites that show up in multiple peripheral markets and as ‘resources’ for multiple lines of promotion of information. What might that mean? Well, maybe showing up in the ‘news’ about the product or service. Or for resources or images, whatever it is that is being promoted, I see Google seeking sites to place near the top that offer more than one channel of interest. There are certain sites that come up consistently no matter what topic I’m searching for. I know my searches might be more refined or directed than others, but still, I consistently see the same sites or ‘type’ of site coming up in searches no matter what.
What do these sites have to offer? Diversity. They have news and information, fresh and relevant content and are referenced from a variety of peripheral markets.
I think that many of us are having problems because we don’t actually understand the true value or meaning of themes. We’re boxing ourselves in. I see themes as a core base to build from but then grow from there. We need to stretch ourselves more. These update threads are boring because they are all caught up in the details of the moment. How are you all applying the details of what you see?
Content and quantity is king.
So, let’s take it apart… Or better yet, let’s go to the core of that statement…
GoogleGuy refers to overall rankings, which I take to mean more than a phrase or two. Why not think about what the totality of the site is about? Why limit it to a few keywords that a tracker tells us is what people are searching for?
How are you all applying the details of what you see?
I've always kept outbound links on any single page to less than 15. Why? Well, mainly due to the way I design my resources pages. If I had 39 outbound links, then I would have 39 titles, 39 descriptions and 39 logos on that one page. My goal is to keep page load time to a minimum so I break my resources pages into categories. -pageoneresults
I do think about these things and I don’t mean to be cryptic but darn it gets dull otherwise ;)
You’re right about the traffic though, vdlddd8379, and that’s what brought me back from my almost-sleep (I had actually turned off the computer but had to get up and come back; I just had to post my ’thought’).
Would I rather have 4000 visitors a day with 4 that convert, or 4 visitors a day with 4 that convert? Where are my priorities here? I admire GoogleGuy for bringing this up, and it shows the integrity of their enterprise. You’ve got to admire that.
I must admit that even 6 years ago it seemed empty, back in my beloved spamming days, when content failed to rule, only rank did, and who cared if it converted? We didn’t get paid to convert, nor do I now, but man do I love to hear from clients who are turning those clicks into sales.
Major one-word keywords exist. They represent HUGE amounts of traffic, and SOMEBODY ranks well for them. Now that those of us who haven't been asleep for the past several years dominate diverse search phrases, the main thing to focus on is the big enchiladas. It is patently absurd to suggest that those should just be left to guestbook-signing, linksmanager-joining, keyword-in-domain-naming, hidden-text-writing spammers.
Diversifying is webmastering 101. Some folks are a wee bit past that and are concerned when hugely competitive terms/areas are given to weak/bad content domains while those with good sites and clean webmastering are deposited in the rubbish bin.
1. Google is doing some substantial changes to the way they rank, index, and classify results/keywords.
2. This will take some time to do. What we are seeing is a significant change in some significant areas, but it's only the start. It's obvious that a lot of factors have yet to be factored into the present "sj" results, regardless of when it goes "live".
3. 90% of the most vocal on the update threads are, I think, people with new sites or who have done a lot of "new" work, optimization, or reorganization of content in the last month or two, probably following "advice" on WebmasterWorld or using "models" and "systems". It seems obvious that newer sites or changed sites are not getting the fast results they expected based on experiences from previous updates, and the process of WebmasterWorld as a discussion board DOES seem to raise expectations unrealistically. And how many times do we see posts basically saying "I've followed Brett's classic guide, so why am I not there!"? Basically, Brett's stuff and WebmasterWorld are the best on the web re Google, but there's an upper limit to how much you CAN influence Google, while many people don't seem to realise this and want definitive statements of what works and what does not. They are in cloud cuckoo land. Google does change fast. No "model", except for the most general and smart like Brett's, will survive more than a few months.
4. I feel this major shift at Google, which is only at the first stages of implementation (hence the lack of much practical discussion on the update threads), is aimed at making the Google SERPs more 1) topical, 2) targeted, and 3) informative. As part of this, some of the optimization shortcuts that are threatening this relevance, such as a) reciprocal links, b) hypertext anchors, and c) keyword priorities in titles, H1s, etc., are being "revisited" by Google. I expect a lot of these latter holy grails of optimization are being focused on by Google.
5. That is why older sites seem less affected than newer sites, where newer optimizers are going overboard with recips, H1s, anchor text, etc.
6. In the end these changes will be GOOD for us in terms of delivering more highly targeted traffic
7. This is hard to understand for (without being patronizing) newer webmasters, who tend to look, like I did at the start, at only raw referrals, rankings on keywords, and "backward links". In the past few years I spend less and less time on these (in fact I don't think I've looked at backlinks for over a year) and more on how referrals are converting - things like newsletter signups, enquiries, orders, etc.
There are many intervening factors between these "input" indicators and "outcome" indicators, and even SEOs when they start out encourage their clients to assess them only on the inputs (raw referrals, rankings, etc.). They do this because it is easy and easy to understand, but it is a bad oversimplification of the value of their work and of how it must be measured with any degree of reliability or validity.
The fact is that there is a big difference between attracting anyone, someone who has a motivation to look/buy, and someone who actually converts.
Somehow I think Google in the long run will help us - at least those who go beyond looking at input indicators and look at the REAL bottom lines of conversion and ROI.
8. That also means intelligent diversification, as you say. To me personally this is operationalised by realising that some pages and sites return the best ROI in PPC like OV and AdWords; others in Google proper. Knowing the difference is important. Assuming that PPC is only a "fallback" for not being "listed for free" goes against what I have learned - for our commercial sites, AdWords returns a MUCH better ROI than "optimizing", as AdWords clickers, in our experience, are much more likely, ready, and motivated to buy.
9. Most importantly, like you and many others, in the meantime I understand and sympathise that it is a bumpy road for new players. (But sometimes, just quietly, I wish some would do more listening than talking ;))
[edited by: chiyo at 8:10 am (utc) on May 20, 2003]
I don't think anyone missed that point. The question I have is how much have you concentrated on optimizing just for that major keyword? Is your content real content on the subject or heavily modified for SEO reasons content?
Let's assume that google will turn on the spam filters and all those big time spammers above you disappear. How is your site going to be sitting then?
I'm working with a very small sample of sites that I have checked, so it would be interesting to hear the details from someone that lost out on one of those big keywords.
It hurts, a lot
Google is more relevant
We get more targeted visits
Everyone is better off
I don't think the factors you mentioned hurt the relevancy that much yet in my categories, which admittedly aren't that competitive. But they were definitely on the horizon and had to be dealt with.
So what we are hearing now is a lot of loud screaming (myself included) rather than months of dull moaning later on.
I can't imagine how you come to that conclusion as nothing in these changes support such an assertion.
In fact the exact opposite is true. Google is moving to "fresher". There is simply no way at all to judge a fresh site's informative content or the extent of its targetedness. The same goes for topicality, except that these "fresh" results will be more keyword-y. But keyword-y isn't topical. Keywords are just gibberish. Keyword-fixated is spam heaven, non-quality heaven, non-informative heaven.
If Google was moving to be more topical/targeted/informative it would be *emphasizing* the deepcrawl instead of abandoning or ignoring it as it is now.
Topical is a matter of context. Links, link text.
Targeted is a matter of themes. Authority sites, relevant pagerank.
Informative is a matter of quality, which is discerned via authority linking, on page words, and general popularity (overall link voting).
"Fresh" not only fails to come into play for these; it deliberately works against most of them.
If Google wanted to move more in these three directions (they were the best before), they would emphasize the deepcrawl even more. They would emphasize the making of thoughtful, valued, deliberate decisions.
Freshbot was fun for awhile, but it is now the bane of *quality* content.
Sorry BigDave, those aren't the questions. That's again just trite Webmastering 101. Sure, some people are there and don't understand that many drops of water make a raging river, but that is very basic and not that hard a part of webmastering. Much more difficult is to take the best (or tied for the best) content available and go toe to toe with other good content *and* with spammers for the key keyword in each interest area.
Saying work on your site, get quality content, go for multiple keywords, is like saying "breathe". That is the easy stuff. It's ongoing work and valuable, but once you get past these basics the work gets much harder, and the payoffs much greater.
You mention all the linking and off-page methods that seem to be the sorts of things everyone talks about when it comes to theming. So it seems natural to me for Google to concentrate on some areas that SEOs have not been planning ahead on.
This sort of thing could have a major impact on your ranking for single word searches.
For example, let's take the word "mountain". Google could associate a lot of words with mountain by looking at how common they are on other pages with the word mountain on them. Ski, altitude, pass, timberline, snow, trail, resort, dew, biking, hiking, climbing. Many of these will make up some of those secondary searches.
So you have a page that talks about the soda, Mountain Dew. It has the keyword mountain, and one of the words associated with mountain, but it is unlikely to have many of the other words associated with it.
Now you search on the word "mountain" and it will no longer necessarily find the site that just stuffs "mountain" in the title, H1, domain name, file name, and 28% of the words on the page. It now matters that there is a good selection of other mountain related words. The Mountain Dew page will have some advantage over the keyword stuffed page, but it is not likely to compete with a year round mountain resort town.
All the old stuff would still count. It would just add a new ability that has not been accounted for by the SEOed sites.
Again, I am not claiming that this IS being done, but I suggest that you try not to limit the possibilities, because I am sure that Google is not going to limit themselves to only the things that are easy for everyone else to imagine.
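BigDave's speculation above can be sketched in a few lines of Python. Everything here is my own illustration, not anything Google has confirmed: the toy corpus, the `associated_terms` helper, and the scoring function are made up purely to show how simple co-occurrence counting could let a genuinely on-topic resort page beat a keyword-matching Mountain Dew page.

```python
from collections import Counter

# Toy corpus standing in for crawled pages (purely hypothetical data).
corpus = [
    "mountain ski resort snow trail hiking altitude",
    "mountain pass altitude climbing trail timberline",
    "mountain dew soda drink citrus",
    "mountain biking trail resort snow",
]

def associated_terms(pages, keyword, top_n=5):
    """Words that most often co-occur with `keyword` across pages."""
    counts = Counter()
    for page in pages:
        words = set(page.split())
        if keyword in words:
            counts.update(words - {keyword})
    return [word for word, _ in counts.most_common(top_n)]

def topical_score(page, associations):
    """How many of the keyword-associated words the page also contains."""
    words = set(page.split())
    return sum(1 for word in associations if word in words)

assoc = associated_terms(corpus, "mountain")
resort_page = "mountain resort ski trail snow hiking all year round"
dew_page = "mountain dew is a citrus soda"

# The soda page matches the keyword but few of its associated terms;
# the resort page matches many of them.
print(topical_score(resort_page, assoc), topical_score(dew_page, assoc))
```

The point of the sketch is that the old on-page factors could all still count; this would just be one extra signal layered on top, which existing single-keyword SEO would not have accounted for.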
Sorry BigDave, those aren't the questions.
Not only are they questions, they are very valid questions. Some standard SEO recommendations are actually very restrictive in how well they apply to broad content. I would really like to know how much your site has been optimized for those big keywords. If you do the page per phrase system, it could easily backfire for you at some point.
If you would never do something like that, then that answer is also valuable. But a non-answer is useless.
that's an interesting thread.
When I read through it this morning I found some similarities with one of my assumptions: theming.
Many of GoogleGuy's posts over the last days point in that direction, especially when you look at his feedback to other posts. E.g.:
annej, I really wish every webmaster would do the log analysis that you just did. :)
Maybe after Dominic, Google will focus less on main keywords/phrases and more on semantic denotation. It can be done as BigDave pointed out in msg #19. Analyzing the semantic fields of pages (and sites) may bring up more relevant search results. You might prefer to speak about topic fields instead of semantic fields (because English is not my first language, I'm not sure I used the correct terms).
It's too early to prove a hypothesis like that, but there are some hints, anyway.
Imagine no more ;)
1. GoogleGuy has already stated that a lot of data and algos will not be factored in until the sj/fi-like indexes transfer across all data centres. So far they have not done so; only one or two to go. I'm betting that the present sj/fi indexes you see coming on all databases now will not be the same as what you see when this older data and the new filters are factored in later. I really don't think they mean anything now of any practical use to us. They are parts of a jigsaw puzzle that don't make sense until they are put together with the others. People are analysing the pieces and of course coming to premature conclusions.
2. GG is on record as saying sj is meant to be a "topical" oriented index, hence my reference to topical.
3. I'm looking at the bigger strategic picture: how Google has developed in the past and how that trend may continue in the future. "Big-picture", if you like.
4. Admittedly a small sample, but our sites and the competitive sites we track are coming up for a broader range of phrases than before. As I said, I don't think the sj/fi indexes mean much by themselves, but I'm just putting that in my "interesting" box and putting 2 and 2 together and getting 4, rounded to the nearest whole number!
5. Even you say the index looks "fresher". Certainly that would suggest it is more "topical", yes, given my understanding that topical means it relates to issues of the present. (Didn't have time to get out my dictionary! I'm sure someone will if I'm wrong!)
I admit this looks like plain bar bar baboor (craziness) to many here. But hey, there have to be some crazies in a place like this!
I was curious to learn about themes and vortals some time back. When I put the pieces together (the time for Google to calculate, the data to be processed, and themes taken into account), latent semantic indexing seemed to me to be the answer.
It's like when you search for "mountain" you will also get sites on skiing, etc. This happens on a pure common-sense (probability) basis: "skiing" would be used with high frequency in content along with "mountain", so if you pick out the highest-frequency co-occurring words, there you go, you have the best possible search results, quality-wise.
I don't have a clue if Google is implementing such an algo, but if it does then I am very happy, because it will give quality results, by which the web starts looking like a beautiful and nourishing place.
Perhaps then the focus shifts to quality and good business practices.
As paynt said, diversify... there might lie an answer.
To my mind this is where this is going
Google is trying to READ the page for its content. But no, it is not only trying to read the page, it is also trying to read the whole web site (to ensure that the page is not an anomaly but fits in contextually). This means that henceforth it will not depend on just the on-page elements and backlinks to sort out a page's relevance to the user query. It will instead depend on the meaning of those pages.
Welcome to the Brave new world of semantic searching.
Two implications of this for the SEO community would be:
Spamming would become extremely difficult, for it will not work unless you spam the whole site, which is then as good (or bad) as making an altogether different site.
Secondly, theme optimization will become more relevant. Stay on theme, and use as broad a set of linguistic terms (keywords), images, or audio/video as possible to support that theme.
The web site will be relevant to the SE as well as the end user.
In this light, GG's comments on end-user conversion being the focal point of optimization make sense.
He is trying to tell us to focus on the searcher. They, with their semantic tools, want to make searching relevant. So if we focus on the searcher too, our objectives and Google's will coincide.
Hence this will be the best possible Google optimization technique.
We and Google will end up on the same side.
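If the site-level reading above is roughly right, the idea can be sketched as comparing each page's vocabulary against the vocabulary of the site as a whole. This is only my illustration of the concept, using made-up widget pages and plain term-frequency vectors; whether Google does anything like it is pure speculation.

```python
import math
from collections import Counter

def term_vector(text):
    """Simple bag-of-words term-frequency vector."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two term-frequency vectors."""
    dot = sum(a[t] * b[t] for t in set(a) & set(b))
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

# Hypothetical site: every page talks about widgets in its own words.
site_pages = [
    "blue widgets and red widgets for every budget",
    "how to choose widget sizes and widget materials",
    "widget care cleaning and widget storage tips",
]
site_vec = term_vector(" ".join(site_pages))

on_theme = "a guide to widget colors sizes and materials"
off_theme = "discount pills casino loans click here now"

# A page that fits the site's theme scores higher than one that doesn't.
print(cosine(term_vector(on_theme), site_vec))
print(cosine(term_vector(off_theme), site_vec))
```

Under this toy model, spamming a single off-theme page would tank that page's similarity to the rest of the site, which is exactly the "spam the whole site or don't bother" implication described above.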
Say you sell widgets. If your home page is nothing but lists then it's sunk, but the underlying small pages will become more important because they will actually talk about the widget itself in a way that means something to someone.
Perhaps I should not be so mystified that my sub pages are doing a whole load better in this release than my main pages....
Why limit it to a few keywords that a tracker tells us is what people are searching for?
I've noticed this for some time. I thought it was my secret. The totality of the hits on product-specific pages overwhelms the hits on index or category pages. Hits on "blue aluminum widgets" are the key, and Wordtracker and other tools don't show you this. What worries me is that suddenly all of these pages are PR0. Still getting hits though. I was one of the conspiracy theorists that someone mentioned, but now I just think Google is seriously broken and their PhDs can't seem to figure it out.