Welcome to WebmasterWorld
P.S. I won't be posting as often (gotta work, ya know :), but I will be checking this post and chiming in when there's something I can add.
While that is funny and all, it pretty clearly means a couple of things: what we see on -fi now is poor, and objectively poor search engine-ing. But then if pagerank has not been included in the equation, that could explain a lot. Google is showing 251,000 backlinks on -fi. Altavista (for example) shows 21,600 backlinks. This is a case where we can all see that Altavista's backlinks are not magically far superior to Google's.
These results need a major influx of 1) the display of new pagerank and 2) the addition of this new pagerank to the calculation of the ranking algorithm. Quite a lot of the obviously ludicrous errors in the rankings can be explained, at least partly, by the lack of pagerank in the calculation.
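For anyone wondering what "adding pagerank to the calculation" would actually involve: here's a toy power-iteration sketch of the textbook PageRank formula. The tiny link graph and the 0.85 damping factor are the standard classroom setup, not anything Google has confirmed about this index.

```python
# Toy PageRank via power iteration (textbook formulation, NOT Google's actual code).
def pagerank(links, damping=0.85, iters=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        new = {p: (1.0 - damping) / n for p in pages}
        for p, outs in links.items():
            if not outs:  # dangling page: spread its rank evenly everywhere
                for q in pages:
                    new[q] += damping * rank[p] / n
            else:  # otherwise split this page's rank among its outlinks
                for q in outs:
                    new[q] += damping * rank[p] / len(outs)
        rank = new
    return rank

# Hypothetical four-page web: "c" has the most inbound links...
graph = {"a": ["b"], "b": ["c"], "c": ["a"], "d": ["c"]}
ranks = pagerank(graph)
print(max(ranks, key=ranks.get))  # ...so "c" ends up with the top rank
```

The point of the sketch: rankings computed *with* this kind of link-weighting look very different from rankings computed without it, which is consistent with the weirdness people are seeing if pagerank really hasn't been folded in yet.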
So, I'm optimistic GoogleGuy, but... if Google doesn't sit in the top two for "search engine", no quotes, a week from today, that would be evidence that this straightjacket should be put on some of the engineers. (And no hand ranking to put Google first, that would be cheating.:)
Means nothing, GoogleGuy already said he wouldn't be surprised to see (translation "very likely will happen") that pagerank is applied a couple *days* after the index moves onto all the datacenters.
Maybe that won't happen, but it does make some sense in the context of what we are seeing now. Of course, if that doesn't happen, these are bad results. Google at #21 is absurd.
The only thing that is a little weird is still the cut in backlinks, but mostly from internal sites.
My reported backlinks have nearly doubled in Esmeralda, from 569 to 1040. Most of the new backlinks appear to be internal links, so I assume that one of two things happened: (1) The PR threshold for reporting backlinks went down, or (2) The PR of those "inside pages" on my site went up just enough for the pages to be counted.
(I mention this only to show that backlinks aren't being universally cut or underreported.)
Also, GG's willingness to comment on the new index rather than the usual "wait till the dust settles" mantra is perplexing.
Maybe 'his willingness' is just because what we've seen lately is causing such a big wave of excitement?
In other words:
Wait 'til the dust ...
[all:] SETTLES ;)
Well said SteveB.
It's one thing to put less emphasis on PR or anchor text, but I think this case would be a bit extreme!
What are you guys doing? As far as my sites are concerned and as far using Google everyday for searching, it really isn't broke!
I seem to recall GG saying something about the end result of all this being something that made it all worthwhile. Anybody spotted anything majorly positive about the SERPS compared to pre-Dominic?
GG was quite open about the expired domain filter a few updates back, I wonder if he'd enlighten us about the benefits of this and the Dominic update (hint).
I would also be interested to know if the expired domain filter is now fully implemented. It was supposed to take a few updates, which should by my reckoning make it fully operational now.
There have been a few posts in here and in another thread about it - [webmasterworld.com...]
At the last check, -dc.google seemed to be plucking the correct www version for one of my sites, but the other data centres were not.
I think my best bet would be to just forget about it for a few days and wait for everything to pan out.........but it's too hard!
Nothing to do with feeding kids or mortgage........but your site ranking well in the SERPs in Google is everything at the mo, as they have 80% of the searches!
added <BTW Google is 5th for search engine in -dc!>
My one and only site is about the same, #1 in its kw's for a very obscure niche, except for having had a lot of new pages added, at #1. I had 98 pages in Dominic, 158 in Esmeralda. Backlinks are back. My main page might still be PR6, the toolbar shows that but who knows...
LOL, uh yeah, you might want to use one of the twenty sites listed above Google for a "search engine" search.
It's an old mantra by now, but you guys gotta get your heads out of your own sites. Google's own ranking of itself is ridiculous. Hopefully once the data is on all the datacenters and pagerank or whatever else is factored in, maybe then Google will be back on its feet. But as of now the prima facie evidence of it being broken is its own #21 ranking. (Or maybe they are just being honest....:)
My main site is sitting pretty and I see more backlinks, so no complaints. My question about this new index is with hidden links and spam filters.
One site appeared out of nowhere in Dominic with 12-15 hidden links. I thought it would drop in the SERPs with this index, but it is still around on page one. The site in the SERPs is [domain.com...] and the hidden links are to www.domain.com.
Are these types of links considered spam? If they could be considered spam, does this put one or both of the URLs at risk for a penalty? Is there any evidence that the spam filters GG said would kick in have done so?
How does Google know (algorithmically (if that's a word)) that it is a search engine? You and I know it is, but nowhere on the page does it say "search engine." The word engine is not even on the page. Also, by now, Google is such a household name, people link to it with anchor text not of "search engine," but of "Google."
So if the words Search Engine are nowhere on the page, and people link to it as Google, why should it be in top spot for the search term Search Engine?
Search engines such as hotbot, dogpile etc., probably are linked to with anchor text such as "other search engine," or something of that nature...
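To make the anchor-text point concrete, here's a toy sketch of what an allinanchor-style count measures: how many inbound links contain every query term in their anchor text. The link data is completely made up for illustration; this is not Google's real scoring, just the general idea.

```python
from collections import Counter

# Hypothetical inbound links as (anchor text, target page) pairs --
# invented data, roughly the signal allinanchor: surfaces.
inbound_links = [
    ("search engine", "google.com"),
    ("the best search engine", "google.com"),
    ("Google", "google.com"),           # doesn't contain the query terms
    ("search engine", "hotbot.com"),
    ("other search engine", "dogpile.com"),
]

def allinanchor(query, links):
    """Count links whose anchor text contains every query term."""
    terms = query.lower().split()
    score = Counter()
    for anchor, target in links:
        words = anchor.lower().split()
        if all(t in words for t in terms):
            score[target] += 1
    return score.most_common()

print(allinanchor("search engine", inbound_links))
```

Note that in this toy data the "Google"-anchored link contributes nothing to a "search engine" query, which is exactly the argument above: people linking to Google as "Google" wouldn't help it for that term. The counter-argument in the posts that follow is that Google nonetheless has by far the most "search engine" anchors in absolute terms.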
Anyway, I just don't think it's indicative of a problem at Google for them to not show up #1.
Uhh...Cause they have the most anchor text links using search engine?
Man, people will say almost anything in defense of Google! SteveB's right, maybe it will work itself out before the update ends. But, if you don't think that GG should rank high on search engine with the most anchor text links/200,000+ overall links/PR11...geesh
True, but I think people are just using that to gauge the changes that are happening. Google used to rank #1 for "search engine" for a long time.
#21 IS more relevant than #22 & #23 though..
But again, I have to say I don't think they're done. I think they're still playing. I think they're not factoring in anchor text and pagerank properly yet..
Well... I'm praying..
Cause they have the most anchor text links using search engine?-mfishy
I stand corrected. Had never used allinanchor before, and it shot a hole in my theory :)
Man, people will say almost anything in defense of Google!-mfishy
Had nothing to do with defending Google, just a simple opinion like everyone else, that happened to be wrong...geesh!
It would probably wipe out SEO in about six months.
I don't think Google would do this though. They are too heavily invested in their own hype, which includes PageRank, the toolbar, etc. Too many "Google is broken" rumors would be flying. The mainstream pundits would stop opening their stupid columns on every conceivable topic with a lead sentence that reads, "A quick search on Google for blah, blah produces blah, blah hits..." (and more often than not, the numbers these pundits report are overblown because they should have put their phrase inside quotation marks in the search box).
Google might wipe out SEO by introducing randomization, but they might go down themselves at the same time.
I'll have to wait for this update to settle down, but I believe that Google in fact did add in a random element previously from some evidence I was shown. However, it is far more subtle than completely randomizing the first 30 results.
I doubt it. Those who post on this forum haven't been able to agree for the last 6 weeks whether or not Google is -broken-.
We can't see the code, but what do programmers think? This example of Google not being #1 for "search engine" is probably something that they could fix in about 2 seconds.
However, it is far more subtle than completely randomizing the first 30 results.
Agreed, complete randomization would show their hand. How about a random algo that is 100 percent dependent on some hash of the top 30 hits? That way today's search comes up the same as yesterday's search. You couldn't have a situation where the top 30 move around drastically from one search to the next; that would turn Google into a parlor joke.
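That idea, if I'm reading it right, is easy to sketch: seed the shuffle with a hash of the result set itself, so the same top 30 always comes back in the same "random" order. Purely hypothetical, obviously; this is just what "random but stable from one search to the next" could look like:

```python
import hashlib
import random

def stable_shuffle(results):
    """Shuffle seeded by a hash of the results themselves, so the same
    input list always produces the same 'random' ordering -- hypothetical
    illustration of a deterministic 'randomized' ranking, not Google's algo."""
    digest = hashlib.sha256("|".join(results).encode()).hexdigest()
    rng = random.Random(int(digest, 16))  # same results -> same seed -> same order
    shuffled = list(results)
    rng.shuffle(shuffled)
    return shuffled

top30 = [f"result-{i}" for i in range(30)]
# Identical result sets shuffle identically, so today's search matches yesterday's.
assert stable_shuffle(top30) == stable_shuffle(top30)
```

The design point: to an SEO probing the rankings it looks arbitrary, but a repeat searcher sees a consistent page, so Google doesn't turn into the "parlor joke" described above. Any change to the underlying top 30 would reseed the shuffle, though.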
I have seen a one page web site with no heading tags, no meta, no external backlinks whatsoever (and with no secondary pages, it has no internal back links), and having been untouched for over a year, ranking #1 for a tough search term. I have seen old domains coming back in for no apparent reason. And I have seen spanking new fresh sites jump in.
I also have a good, solid theory about link pages (also guest books, etc.) and Google's new algo, which I will not divulge here and now. Only because I understand last time I put a theory about this up, it was not taken up, apart from some people behind the scenes. So I got no benefit from that input, but others may. So, if anyone wants to sticky me with a starter on this subject....