1- ask one question
2- be brief.
3- no commenting on other posts.
4- no specifics please
5- violators will go posting off ;-)
6- 1 q - 1 q only.
7- thread will be scrubbed of junk/offtopic/etc
After we get 10-20 q's we will submit them to the plex...
I think the book Google Hacks also covers it toward the end.
It's very deep stuff. If you just re-read those 26 steps once a month, you won't need 90% of the SEO questions/threads here. :)
I will add, though, that I run four retail/etail sites, and all get very good results from Google using standard white-hat tactics and dynamic meta tags. As the saying goes, content is king, so good images and good descriptions...
...but it's always good to brush up and keep reading articles, threads and posts.
I was hoping you had a super-secret method of contacting the elite members of a department that cares for idiots like myself.
The photo gallery and reviews sections are my most popular with visitors, but Google hates them.
It's a white-hat content site, and I'm thinking the best thing is to use robots.txt to block just Google from the forum, photo gallery and reviews sections, and then use the URL removal tool.
I removed session IDs for guests on the forum, but the old pages are still in Google's Supplemental index. I can only write so many words to describe each image in the gallery to make the page unique before it looks silly to my visitors, and my reviews section is slowly building content... but I rely on my visitors for that.
Is Google working on better methods of determining what is duplicate content and what isn't?
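For reference, a minimal robots.txt sketch of the per-section blocking described above; the directory names /forum/, /gallery/ and /reviews/ are hypothetical stand-ins for the real paths:

  # Applies only to Google's crawler; other engines keep crawling these sections
  User-agent: Googlebot
  Disallow: /forum/
  Disallow: /gallery/
  Disallow: /reviews/

Once the Disallow rules are live, the URL removal tool can be pointed at the same sections to drop already-indexed pages faster than waiting for a recrawl.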
It's fine for someone to say "the user wanted to book a hotel room, and I let him do that." But if 50 other sites are trying to do the same thing, and those 50 sites end up crowding out good results like the hotel's real home page with an actual phone number, then those 50 affiliate sites are clearly hurting diversity, esp. if all the sites are cookie-cutter/templates or nothing but repackaged feeds.
So my answer would be to think hard about your value-add compared to other affiliate sites in whatever niche you're targeting.
I think that this is only one question :)
We got hit during Allegra (Google traffic down 75%), and with Bourbon we lost the remaining traffic from Google.
We still get some visitors; Google now sends us about the same number of visitors as AltaVista. MSN and Yahoo are way above.
We really think that we are completely white-hat, and I think it's impossible to lose 30,000 visitors from Google without some kind of penalty.
From what I can tell, there ARE different considerations to make when trying to achieve rankings for my Korean sites in Korean portals and Google Korea.
As you probably know, the whole market and how sites are presented in portals is completely different out here, and much of what works on my "western" sites just doesn't seem to "take" out here (as well).
Note: I'm very white-hat about it all... but still... even the basic rules do seem to be a bit different.
A question from my 17-year-old daughter.
She and her school friends find it difficult to run effective Google searches that return relevant results without running several queries, because they don't know the basic elements of how to search. So I showed her the pages below, and she asked whether it's possible to add a direct link to that page on the Google search box, in addition to the standard links: Images Groups News Froogle Local more »
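For anyone in the same boat, a few of the basic operators she was missing; all of these were standard Google query syntax at the time:

  "to be or not to be"      (quotes match the exact phrase)
  jaguar -car               (a leading minus excludes a word)
  site:nasa.gov shuttle     (restricts results to one site)
  define:photosynthesis     (looks up a definition)

Combining a quoted phrase with an excluded word or a site: restriction usually gets relevant results in one query instead of several.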
I am seeing spammers getting hit a lot faster than in the past, which would suggest this is the case - can you confirm it?
If you go back to say Feb 2004, I think spam had gotten to be an issue because we hadn't allocated enough resources to it. Google isn't a monolith; within the company there's always room for different opinions about what the priorities should be. But I'm happy that I think Google is paying more attention to this issue now. My opinion is that spam is getting hit faster, and that for the rest of this year, it will continue to get harder to spam.
It's funny because a while ago, people would post on WebmasterWorld and say "When is Google going to pay attention to spam?" Recently I saw a post where someone said "Why is Google so harsh on spam lately--why don't they work on other things besides rooting out spam?" I guess you can't please all of the people all of the time. :)
Thanks for your time and effort.
Question about banned sites (we might have just gotten one banned for the first time in ten years...). Both specific and general.
1. Assume the site command (site:example.com) returns NO pages.
2. When one types the URL as a query (e.g., www.example.com), the standard Google response is:
- If the URL is valid, try visiting that web page by clicking on the following link: www.example.com
- Find web pages from the site www.example.com
- Find web pages that contain the term "www.example.com"
But our site returns ONLY the last two.
Checking many domains we could fish from the net, there seem to be some sites that return all three and some that return only the last two.
- Can you clarify how to detect if a site is banned or not?
- Can you clarify Google’s policy as to when NOT to return any URL when a URL is entered as search query?
- Are banned sites being crawled?
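For anyone else running this diagnosis, a quick sketch using operators Google supported at the time (the interpretation at the end is an assumption, not an official rule):

  site:example.com     (lists pages Google has indexed from the domain)
  info:example.com     (shows what Google knows about that URL)
  cache:example.com    (shows Google's stored copy, if any)

If site: returns nothing and info: and cache: are also empty long after the site became crawlable, a ban or penalty is a more likely explanation than a simple crawl delay.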
Thanks for doing this Brett--this was good. I don't know whether I'll circle back around to this thread--I'm planning on dropping any Bourbon-related index "weather reports" in the thread from the previous paragraph. I might put more effort into comments/advice in that other thread.
Apologies if I had typos or anything, I was just typing really fast. It's more fun if you don't proofread it before you submit it, anyway. Maybe I ought to get me one of these blog things; I hear they're really popular with the kids these days. :)
Talk to folks later.
In this update, I seem to have attracted a 'rank minus 70 spots' penalty, even for pages that contain unique, useful content. These pages simply do not rank for anything anymore unless I add a unique string to the query. I still have the same number of pages indexed, get spidered frequently, have no supplementals in the SERPs, and generally can find no obvious sign (ranking aside!) of any penalty, so I am naturally a little puzzled.
Would blocking gbot's access to the affiliate content enable my otherwise healthy (unique) pages to resurface? Or would I need to email a human at the plex and request that the 'offensive' filter be lifted?
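If blocking is the route, a minimal per-page sketch (assuming the affiliate pages are scattered templates rather than one directory that a robots.txt Disallow could cover) is the standard robots meta tag in each affiliate page's head:

  <meta name="robots" content="noindex, follow">

noindex keeps the page out of the index while follow still lets the crawler follow the page's links; a robots.txt Disallow on a whole affiliate directory achieves a similar end at the crawl level.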
Just kidding. It sounds like the problem was that the search <snip> showed a different URL a while ago. I tried it just now, and it looks like we have only pages from the Supplemental Index now. So we're not really crawling/indexing/serving your site, except for the Supplemental Index. Since you were recently crawled and had good results, it could have been the actions of the previous owner. I would send in a reinclusion request (go to google.com/support and click until you can specify that as a subject line) and put a summary of the info into the report.
Is it okay to have a domain alias, or more specifically, to point two domains (e.g., domain1.com and domain2.ccTLD) at the same content, or would this incur a duplicate content penalty? Would using 301 redirects be a much better method (see the sketch at the end of this post)?
A few tests showed that we would receive a penalty on both pages when the only difference was $/£, which I understand, but is there a workaround?
I have many problems with US customers not knowing the difference between $ and £.
Customers see £500, phone to ask about a product, and when the price is converted you often lose that customer.
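A minimal sketch of the 301 approach mentioned above, assuming Apache with mod_rewrite and the hypothetical hostnames domain2.co.uk (the alias) and www.domain1.com (the domain you want ranked), placed in the alias's .htaccess:

  RewriteEngine On
  # Send every request on the alias to the same path on the main domain
  RewriteCond %{HTTP_HOST} ^(www\.)?domain2\.co\.uk$ [NC]
  RewriteRule ^(.*)$ http://www.domain1.com/$1 [R=301,L]

A 301 tells the engines the two hosts are one site, so they consolidate them instead of treating the shared content as two competing duplicate pages.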