| This 49 message thread spans 2 pages |
|All I have learned from this Forum...|
my google knowledge
| 9:40 pm on Apr 15, 2003 (gmt 0)|
Thought I'd post a note about all I have learned since I joined this forum. Some points I may be unclear on, and I'd appreciate input from others.
I wanted to post this because all the posts I've read over the last few months have changed the way I look at websites.
Feel free to add/modify as you see fit. :)
A solid SEO foundation for websites:
1) Plenty of good separate pages of content.
2) a Site Map that links to all pages
3) A site map that is friendly to people as well as bots
4) Optimized Page Title + H1 text can go a long way in unoptimized SERP's. [my experience]
6) The anchor text in backlinks is important though often hard to control.
7) The Google Toolbar is a helpful tool.
8) Page rank is important, but like all other aspects there is no 'magic bullet' where one thing will rocket you to the top.
9) Page Rank should be a SECONDARY consideration when requesting links or allowing links. e.g. A client's #3 referral is a PR0.
10) Backlinks with a PR4 and above will show in Google's backlinks on the toolbar or via 'link:www.domain.com', but links from lower PR sites still count.
11) Use AlltheWeb dot com to check backlinks and see all links regardless of PR. [personal choice]
12) Useful queries: "link:www.domain.com", "allinurl:www.domain.com", "allinanchor:word or phrase"
13) A bigger site with more pages that has proper site map MAY benefit from the internal linking, hence the bigger is better theory. More unique pages, more possibilities to optimize for individual phrases. Each page should be looked at as an opportunity.
14) Buying a text link or any sort of link just for PR is bad and Google is whacking those domains with PR0's.
15) While PageRank in the Google Toolbar shows as round numbers, Google itself calculates it to fractional numbers. Meaning, even if you show as a PR5, the websites above you who are also PR5 may actually be a PR5.9.
16) Each SERP seems to have its own unique parameters as far as what is more of a weighing factor in optimization. Perhaps not due to Google, but to how people design pages.
17) Google doesn't always 'take' the page you optimize for, but instead will utilize the index page because that's usually the page with the highest PR.
18) PR is not the end all be all of SERP's and a lower PR page can beat a higher PR page. I would guess this goes for less competitive/unoptimized SERP's, but I have not followed threads on this closely.
19) Google has two types of robots out spidering. One is Freshbot and the other is Deepbot. Fresh- IP 64... Deep IP 216....
Though, people have noticed what we consider Freshbot acting more like Deepbot lately. Freshbot can grab new pages and update your SERP's. Deepbot will usually hammer a site over a period of a few days then leave. It is the results of those crawls that end up in the monthly Google update. Though, I am speculating Google may use Freshbot results in a more aggressive form to feed SERP's.
20) Google updates their index on a loose average of once per month. GoogleGuy will specifically not push the button until he has seen enough people whimper/moan/worry/cry/beg/plead/ and generally have a nervous breakdown.
21) The progress of the update can be seen before the database goes live on www2 dot google dot com and www3 dot google dot com. There are numerous Google Dance tools on the web.
22) The Google Dance is when all the SERP's mix around, PR is handed out...upgraded, downgraded, penalties are seen, reinclusions requests are seen and basically your life is on the line. ;)
23) For a while it took a couple days for the index to filter through the different data centers (collections of Google servers) before it hit the main index, but this past update zipped through to the live server(s) quickly.
24) I'm sure one day GoogleGuy will change the IP's of the servers and serve up a 3 month old index on www2 and www3 just to play with us. Woulda been a cool April Fools Day joke. :)
25) Still, it takes 2-3 days or thereabouts for PR and SERP's to settle.
26) You can see what your new PR is by adding an entry in your hosts file to point to a specific datacenter, but that is too GEEK for me. It is what it is... :)
27) Yahoo's Google index used to lag a bit behind Google itself, but on the SERP's I watch Yahoo seemed to update its Google index VERY quickly this time around. AOL's Google took a bit longer to catch up.
28) You need a _HUGE_ amount of patience before you or your clients see results from SEO work. While one index may help, we analyze, learn and change...
29) Google doesn't care if your website ranks highly regardless of whether you feel you have the best offerings for that search phrase. As long as Google serves relevant results normal people have no idea that they are 'missing you'.
30) For immediate visibility check out Overture/AdWords
31) Utilize ROBOTS.TXT files and META TAGS to save bandwidth on your image directory or to 'noindex' a page, such as one that exclusively announces a weekly promotion.
32) Stuffing all the keywords you can in the META TAG won't help you.
33) It can take a few indexes for Google to catch up with all your backlinks.
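To make point 31 concrete, here's a minimal sketch (the directory and page names are made up, not from any real site):

```
# robots.txt -- keep all bots out of a bandwidth-heavy image directory
User-agent: *
Disallow: /images/

# And on the weekly-promo page itself, a meta tag in the <head>:
# <meta name="robots" content="noindex,follow">
```

The "follow" keeps bots passing through the page's links even though the page itself stays out of the index.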
That's about it. Naturally, it's presumed the page you want to optimize IS on topic for the phrase you want to rank highly in.
In terms of mindset, it's best to think in quarters of the year instead of "How did we do this month?"
Please add or modify as you see fit. Brett, saw your pet peeve so...
Hope this Helps. :) :) :)
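For anyone curious about the hosts-file trick in point 26, it's just one line (the IP below is a placeholder, not a real datacenter address):

```
# hosts file (/etc/hosts on Unix, C:\WINDOWS\system32\drivers\etc\hosts on Windows)
# Point www.google.com at a specific datacenter to preview the new index there:
216.0.0.1   www.google.com
```

Remove the line afterwards or every Google query you make goes to that one datacenter.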
| 12:41 pm on Apr 16, 2003 (gmt 0)|
Thanks for the thread and the post which started it, Alphawolf. It's an excellent summary.
RE: Keyword density. The best description of optimum density I've heard is 'a bit more than your English teacher would have been happy with'. So what I tend to do is write the content in the best English I can, then add a couple (depending on the length of the copy on the page) more instances of the keyword. That way it still reads well and keeps the punters happy (hopefully) but also flags the keyword up to the SEs. I'm certain that there is an optimum; too few and it won't register as a keyword, too many and it looks like spam and will be treated accordingly.
| 12:53 pm on Apr 16, 2003 (gmt 0)|
|Buying a text link or any sort of link just for PR is bad and Google is whacking those domains with PR0's. |
Yes, buying a text link for PR is bad
No, Google is not whacking those domains with PR0 - AS LONG AS IT IS A VISIBLE LINK.
| 1:09 pm on Apr 16, 2003 (gmt 0)|
|No, Google is not whacking those domains with PR0 - AS LONG AS IT IS A VISIBLE LINK. |
Good point to clarify. I was thinking of the SearchKing-owned site that sells PR.
It's funny to see a web site selling something to boost your PR when its own site has a white bar. :)
| 1:23 pm on Apr 16, 2003 (gmt 0)|
Kennyh and all the others who thanked me for starting the thread:
|Thanks for the thread and the post which started it, Alphawolf. It's an excellent summary. |
I debated posting that because I had to read through bazillions of posts to get to this point...and I am still learning for sure!
I decided it was OK to share all the kernels I picked up along the way because:
Having the knowledge (and really- much is educated guess) is one thing. Implementation of that knowledge is quite another deal.
Bringing it all together (SEO/SEM/USABILITY/PROGRAMMING/CONTENT/LINKS) in good style to a fully functional website that actually serves its purpose is a _whole_ other deal entirely.
| 1:25 pm on Apr 16, 2003 (gmt 0)|
Nice thread, alphawolf and others.
I would add - somewhere at the beginning of the list:
Analyze your keyword(s) environment, including your competitors' sites.
Plan your site structure around your keywords (but don't violate usability). - Sometimes deep linking can help.
Keep analyzing the keyword environment and improve your site accordingly.
| 9:12 pm on Apr 16, 2003 (gmt 0)|
Good job AlphaWolf!
Just to add the two things I didn't know that have helped my sites the most:
37. Don't use "id=" as a URL query string variable name
38. Don't use more than 2 URL query string variables
| 9:39 pm on Apr 16, 2003 (gmt 0)|
I'd like to add my own contribution.
What I have learned over the last 3 years, and what I think could make the internet better for many, is regarding URLs.
Please read the article "Cool URIs don't change" of the W3C here: [w3.org...]
Please start to learn your tools. Learn how to use mod_rewrite. Learn how to use PATH_INFO. Learn what a QUERY_STRING REALLY is for (it's NOT for telling your script what page to fetch!)
Start using sensible URLs without hints of technologies or methods. Remove those extensions, those script names, those variables and values. Follow the theme pyramid in your URLs. Use your keywords. Make them easy to remember, write down and tell. Google isn't the only one telling others about your pages. Never underestimate word of mouth.
I hope one day we will all live in a world of short, concise, clean and meaningful URLs that both our visitors and our favourite search engines will love.
If you want to see samples of the 10s of 1000s of pages my sites deliver through "pretty" URLs via a colourful variety of scripting technologies, just sticky me.
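A minimal mod_rewrite sketch of the kind of URL cleanup described above (the paths and script name are hypothetical, not from the poster's sites):

```
# .htaccess -- map a clean, keyword-rich URL onto the real script,
# so visitors and bots only ever see /widgets/blue-widget
RewriteEngine On
RewriteRule ^widgets/([a-z0-9-]+)$ /display.php?item=$1 [L]
```

The query string still exists internally, but it never appears in the URL anyone links to or remembers.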
| 9:53 am on Apr 17, 2003 (gmt 0)|
Alphawolf, that's a great summary of Google optimisation in 2003. I agree that it's best to think in terms of 1/4's of the year for link building, but once you have enough PageRank the Freshbot can give good results within a couple of days.
> 8) Page rank is important, but like all other aspects there is no 'magic bullet' where one thing will rocket you to the top.
I guess it's once you have that part worked out that everything else starts to make sense.
I think I agree with almost everything you wrote, with doc_z's point (not too many links on a page) being a crucial addition. I'll add one more: Google gives some extra weight to link text from outside the domain.
| 11:58 am on Apr 17, 2003 (gmt 0)|
Regarding URL Redirects.
When I found WebmasterWorld, I had already created a site using Java with redirects for session tracking.
After playing around a bit, we seem to have found that it's the single quotes within the redirect statement that stymies the Googlebot. Double quotes within double quotes seems to work fine for Java and the pages get indexed on a single pass.
| 12:31 pm on Apr 17, 2003 (gmt 0)|
Google only indexes URLs with 2 or less variables.
This one has been killing me. Only half of my site is indexed. It's off to the design board this morning.
| 2:26 am on May 21, 2003 (gmt 0)|
38: Google can't read frames. Just the noframes, title, and meta tag parts.
39. Google loves text and hates complicated sites. Obey the KISS rule.
40. You never know what the next dance will be like, or how long it will last, 5 days or two weeks, er three weeks.
| 3:12 am on May 21, 2003 (gmt 0)|
|38: Google can't read frames. Just the noframes, title, and meta tag parts. |
Partially true, because google will actually read the content. They just won't serve up the frameset, but rather the pages referred to by the frameset. Not advisable either way :)
|Please start to learn your tools. Learn how to use mod_rewrite. Learn how to use PATH_INFO. Learn what a QUERY_STRING REALLY is for (it's NOT for telling your script what page to fetch!) |
Not really sure why to bother with mod_rewrite, considering google is picking up on the pages fine as is. And would you mind explaining what query strings ARE really for then?
|20) Google updates their index on a loose average of once per month. GoogleGuy will specifically not push the button until he has seen enough people wimper/moan/worry/cry/beg/plead/ and generally have a nervous breakdown. |
Hehe, sure does seem that way sometimes!
| 4:48 am on May 21, 2003 (gmt 0)|
|24) I'm sure one day GoogleGuy will change the IP's of the servers and serve up a 3 month old index on www2 and www3 just to play with us. Woulda been a cool April Fools Day joke. :) |
OMG AlphaWolf what a prediction .. :)
and what lottery number you think is gonna hit the jackpot tomorrow ;)
| 4:59 am on May 21, 2003 (gmt 0)|
Wow, what a great post, thanks for putting this together!
Don't show this to my clients though or I'll be out of a job!
| 5:26 am on May 21, 2003 (gmt 0)|
Wow. I searched for that post of mine several times, but could never find it.
Now I've bookmarked my own post...maybe I'll read it again and think it over a bit.
| 8:08 am on May 21, 2003 (gmt 0)|
In your control panel you can look up your threads.
| 8:14 am on May 21, 2003 (gmt 0)|
"It will track up to the 25 most recent threads."
That was posted a month ago. Non issue now though. :)
| 9:59 am on May 21, 2003 (gmt 0)|
If you read the HTTP specs, the W3C recommendations, and the original CGI standard specs, you will learn that QUERY_STRING, strangely enough, is for queries. And saying pages.cgi?load=page1 is only a query in the most semantically loose sense, not in a strict IT sense, if you know what I mean. search.cgi?q=some+key+word THAT is a query. A query is a question to which you DON'T know the answer. Everything else is a request and should be done in the standard URL section before the query, like: pages.cgi/page1
Now of course you can do whatever works, just like cracking nuts by driving a truck over them, but I just wanted to point out that the original designers did some thinking, and what they came up with might be so for a reason.
Just my 2 cents.
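The PATH_INFO vs. QUERY_STRING distinction above can be sketched in a few lines of Python (the environment values are hypothetical, standing in for what a web server would pass a CGI script for a request like /search.cgi/widgets?q=blue+widget):

```python
from urllib.parse import parse_qs

# What the server might hand a CGI script for the request above
environ = {
    "PATH_INFO": "/widgets",          # WHICH resource to serve (a request)
    "QUERY_STRING": "q=blue+widget",  # the actual question (a query)
}

page = environ["PATH_INFO"].strip("/")     # the page to fetch: "widgets"
query = parse_qs(environ["QUERY_STRING"])  # the search terms: {"q": ["blue widget"]}

print(page)
print(query["q"][0])
```

Using PATH_INFO for the page keeps the query string free for genuine questions, which is exactly the separation the original CGI design intended.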
| 10:44 am on May 21, 2003 (gmt 0)|
here is my 100th post :)