Any thoughts on canonical URL improvements?
And especially when sites show up URL-only on the homepage (both non-www and www).
I know you have partly covered this, but is there any fix for sites that have the problem, and what's the timescale after a 301 is introduced? I assume it triggers a duplicate content penalty. Same timescale, 180 days!?
Taken from update thread - if gg already covers this then no worries ;)
[edited by: Dayo_UK at 8:35 am (utc) on June 2, 2005]
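(As an aside for anyone setting up the fix being asked about: a minimal sketch of host canonicalization logic, in Python for illustration. Every request to the non-canonical host gets one 301 hop to the canonical one. `www.example.com` is a placeholder, not a site from this thread.)

```python
# Placeholder canonical host; in practice you'd issue this redirect at the
# web server or edge, not in application code.
CANONICAL_HOST = "www.example.com"

def canonicalize(host: str, path: str):
    """Return (status, location): a single 301 hop when the request host
    is not the canonical one, or (200, None) when no redirect is needed."""
    if host != CANONICAL_HOST:
        return 301, f"https://{CANONICAL_HOST}{path}"
    return 200, None
```

The key point is that both hosts serve the same content only via this one permanent redirect, so search engines see exactly one indexable version of each page.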
What should I wear to the Webmaster Conference boxers or briefs?
Many people believe that Google uses everything to determine the ranking, for example, toolbar data, adsense data, domain record and a lot others. Can you confirm whether this is true or not?
As long as you wear something over top of them, I'd recommend either one. ;)
How can I (or we) help Google and ourselves by reporting websites that use tricks to rank highly? e.g. purchasing many links?
When will Google start using Trustrank and will it have any effect on the adsense earnings?
Will offering first-born children work as a bribe to guarantee an invite to the "Meet the Google Engineers" session at Pubcon?
Brett, are these questions for the "Meet the Google Engineers" session?
Edit: Directed 2nd question to Brett. (I know I broke the rules but it is worth clarifying. At least one person is confused, me.) :)
[edited by: whoisgregg at 8:43 am (utc) on June 2, 2005]
There are rumors that if a site (not only a new site) gets too many links in a short time, it could be penalized.
The assumption is that the site is running a link popularity campaign just to trick Google...
Is that right? Should we avoid, for example, a press release on the Internet? Such press releases usually bring several identical text links and descriptions.
thank you for clarifying that!
What was in the bourbon update that caused so many sites to disappear?
guoqi, my rule of thumb is not to confirm whether we use any given piece of data in ranking. It's really interesting to see all the possible ways that people have tried to take the public info about PageRank and exploit that info, for example. A whitehat SEO will tell you just to make a normal site and not worry about it. A blackhat SEO will tell you make a site and try to make it look normal. Either way, if you mimic the way that a real site is built and gains reputation, that's a better way to think of it than "Oh, I have to get a ring of friends and we all click the happy face on our toolbars 3 times a day, because GoogleGuy said that Google might use toolbar votes as a spam or quality signal."
How can we keep our original articles ranking higher than stolen copies of our content? (which are sometimes impossible to get taken down)
FillDeCube, definitely use [google.com...]
to report spam. If you're a whitehat and don't like to see spam at all, it's good to report it. If you're a blackhat and want to report spam so that your site can do better, it's good to report it. If you're of the species of blackhat that will do absolutely anything to try to rank, from blogspam to referrer spam, yet you shun the idea of reporting spam because of your belief system, power to you then too--don't use the form. :)
We've seen a lot more interest in using the data from [google.com...] as we have been taking spam more seriously for the last several months, so I'd give it a try.
Can anyone else at Google convince the Adsense department to tighten up their quality control standards on the type of websites they accept and/or keep in the program? Junk sites and scrapers are being mass-produced just for Adsense, alienating advertisers looking for conversions from quality sites and casting a bad image on real content publishers. Besides the URL mess, I think it's the second biggest problem Google has.
Whatever the motivation of Supplemental listings was in the beginning, now they are often entirely worthless, false results. Supplementals for pages deleted from the Internet more than a year ago are common, even when high-PR, active links point at the now-404 locations.
Clearly there is some problem involved here, as there is no quality justification for listing these things, particularly when the URL is repeatedly seen as 404. Many supplementals disappeared with Bourbon, but many still exist. So...
can we expect a supplemental fix soon, meaning that 404 Supplementals will no longer be listed in the results?
Are there any plans to improve the quality of the content network for Adwords, specifically to apply the TOS to at least the worst of the scraper sites and others that are obviously created just for Adsense and have nothing of any substance to offer?
roycerus, I never try to make predictions about the future or date when the future will arrive. I especially don't try to predict the future based on an ambiguous trademark registration, domain name, or some patent filing being disclosed. It makes me want to register gsecretlabs.com just to make people speculate when our secret lab will open, and whether the mad scientist gear will be electricity-based or chemical. :)
whoisgregg, if you read Danny's article, I believe the Google spokesperson said that you can register at [services.google.com...]
without any first-born at all. In fact, technically you might not even have to register for PubCon. I'm sure Brett would say that as long as you're in town anyway, why wouldn't you want to register? ;) Sometimes there's good schwag at various booths, for example, and the pub conversation is rumored to be interesting. Although ThomasB and another German SEO firm now have a lifetime moratorium against claiming AdWords coupons. ;)
(But if you really want to sacrifice the naming of a first-born, I suggest Orville or Hortense. If they can get past that obstacle, they'll grow up strong.)
(continuing a theme from a parallel thread), why is the Google Directory the most useful tool for finding information on the web?
Does that make the Google SERPs only the 2nd most useful?
You have on many previous occasions suggested that sites that are removed, i.e. banned, from the Google index can make a re-inclusion request.
There have been many comments and posts from members here who have totally cleaned up and removed even the vaguest questionable techniques from their sites but have not been re-listed even after a lengthy period. There are precious few here who say they have been re-included. Does Google seriously consider all of these re-inclusion requests, and if so, what is the average timescale from request to re-inclusion? Can you give any tips on how to make a successful request, and what are the considerations for re-inclusion?
Will Google ever consider manual site quality reviews? By this I don't mean a boost to rankings, I mean paying a recurring fee to have your site verified as genuine, this could then be used to negate any penalties that may happen due to Google trying to get rid of scraper sites etc.
P.S. I'd be happy to set up such a facility ;)
angiolo, organic growth is often best. That may include a public relations strategy, but many times I've seen posts like "My site is nearly three months old. I've got 5,000 backlinks but I'm not doing as well as I'd hoped. What am I doing wrong?"
I'd recommend people go back to Brett's 26-step plan when I hear questions like that.
GG - Having problems with Google News taking an age to respond to questions - still waiting over a month for a technical issue to be resolved. Is there anything else I can do for quicker assistance in that department?
There have been a lot of 302 redirect problems with Google, where a 302 redirect causes Google to replace the redirect's destination URL with the redirecting URL. Do you guys plan on fixing this, or are you already doing something about it?
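(For readers unfamiliar with the hijack being described: it follows from HTTP redirect semantics. A toy model in Python, purely illustrative of the 301/302 distinction, not of Google's actual pipeline.)

```python
def url_to_index(source: str, status: int, destination: str) -> str:
    """Pick which URL stays associated with the content for one redirect hop.
    A 301 means "moved permanently": the destination replaces the source.
    A 302 means "moved temporarily": the source URL is kept, even though it
    now serves the destination's content. This is how a 302 from site A to
    site B can leave A's URL showing B's content in an index."""
    if status == 301:
        return destination
    if status == 302:
        return source
    raise ValueError(f"not a redirect status: {status}")
```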
GoogleGuy, let me know when you register gsecretlabs.com, so I can learn SEO from Google first hand. :)
Not sure what you mean.
Hortense O Smith
xyzzyx, Bourbon isn't done brewing. It would be premature for folks to analyze right now. Some people are seeing different things depending on what data center they're hitting, for example.
I have a client who has a company listings site (contact details and information about companies in a certain sector).
My client started off in March with a generic list of companies (about 300 web pages in total, one company per web page). He used this as a demo to his potential paid listing clients.
Now that he has a handful of paid clients with full listings, he doesn't want the old free 'demo' listings on the site any more, and he's asked me to delete them.
The site ranks really well for all his main keyphrases on both Yahoo and MSN, but as yet he's only listed for obscure phrases on Google. I'm putting this down to the "sandbox" and assume it will pick up by the end of the year.
Will removing these demo pages (the vast majority of the site) do any harm to the future Google rankings? I'm currently considering the following options:
1. Just delete them (the server will return 404 errors).
2. Delete them and return a 410 error.
3. 302 redirect visitors to the home page.
4. 301 redirect them to the home page.
5. Meta refresh (0 seconds) back to the home page.
6. Replace the text with a message saying "This listing is no longer available. Click here to return to the homepage."
Which would you recommend?
[edited by: mrMister at 9:03 am (utc) on June 2, 2005]
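(Options 2 and 4 from the list above can be sketched in a few lines of WSGI, for anyone weighing them up. The paths and app shape are hypothetical, not from mrMister's actual site.)

```python
# Hypothetical set of deleted demo listing paths.
RETIRED_PATHS = {"/demo/acme-ltd", "/demo/foo-corp"}

def app(environ, start_response):
    """Minimal WSGI app: retired pages get a 410 Gone (option 2);
    everything else is served normally."""
    path = environ.get("PATH_INFO", "/")
    if path in RETIRED_PATHS:
        start_response("410 Gone", [("Content-Type", "text/plain")])
        return [b"This listing is no longer available."]
        # Option 4 would instead be:
        #   start_response("301 Moved Permanently", [("Location", "/")])
        #   return [b""]
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"OK"]
```

The difference in practice: 410 tells crawlers the page is gone for good and should be dropped, while a 301 to the homepage tries to pass the old pages' value along, at the risk of looking like a soft redirect of many pages to one.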
GG - Will getting listed in too many online directories too quickly increase the possibility of being penalised?
[edited by: Langers at 9:03 am (utc) on June 2, 2005]
buckworks, report 'em via the spam report form if the sites are doing wholesale duplication in a spammy way. If it's your content, a properly-formed DMCA complaint is the correct way to assert your copyright: [google.com...]
But whenever possible, I recommend going to the site that's copying your content first and trying to solve it at the source. Even if you file a DMCA complaint to Google, other search engines may find the copies as well.
Finally, if the site seems scuzzy/scraperly to you, use google.com/support/ to report it in addition to using the spam report form.