Forum Moderators: lawman
We had the opportunity to discuss individual site issues and observations. I was involved in the 302 discussions, and it appears that many webmasters/sites suffer the same or similar problems.
It's been a great opportunity for those attending the WebmasterWorld conference and I think it's proved that Google is listening.
Hats off to Allegra for such incredible planning and preparation. Also to the team of engineers for such a great job - and also to their mentor ;-)
The food was pretty awesome, too.
My only comment was that it was generally too loud to hear conversations. I'm sure the Googleplex could foot the bill for some smaller rooms.
Did I mention the free food, booze, and live jazz...
I also got the impression that the folks from Google have built a system so big that they really cannot give you a clear answer to a question, but they do make a professional guess :-)
Still, there is much value in such guesses.
And of course, the most important thing: they were listening and truly interested, taking notes and digging for feedback. That's positive, and we can hope that we will be heard.
What a fun week, I'm already looking forward to the next one!
I was most impressed with the interest the engineers took in the questions. They could have gone out there, been stubborn and quick with people, but they weren't. The ones I heard were very articulate, interested, and generally excited to be talking nerd with some webmasters. Someone should have gotten some hurricanes in them though to give up some dirt, although I'm sure a few tried at the bar later on.
The food was good, and not cheap. And an open bar will win me over anytime. Good time, and a great event.
Yeah I would guess that the algo is so large, different people work on different parts of it and nobody knows the entire secret.
The days of easy money and disposable domains are near the end. I think in the future affiliates may be able to make more money selling advertising than selling other people's products.
That was not only great information, but what exquisite people! I had a great time with my new buddies... AND I remember their names even after the weird jello things... : )
Hi Kevin, Anna, Chad and Tony! House of Blues was GREAT.
I have some reading and learning to do but it will be fun. Hope the rest of you had a blast too!
Brett, you're a great host. Hope to see you all next time.
I thought the session halls were great, but the hotel in general was horrible. Like the poster above said, I won't hold that against WebmasterWorld either. :)
Thanks everyone for such a great experience. See you next time, -g
In any event, I got some important questions answered, enjoyed the food (and drinks) and was extremely pleased with the forum.
Hope this becomes a regular thing for them :)
What?! Some small number of webfolks were taking advantage and trying to hog the best spots all for themselves?!?!?
Why, who ever heard of such a thang? :)
I gotta make one of these soon, even if I don't do this for a living; sounds like there is definitely the potential to do some living.
Pubcon was awesome as well. I met just the right people. Just, WOW!
I have one word to describe the entire experience: overwhelming. My "to do" list will take me 6 months. Better hire some help soon.
It's been about 10 years since I was last in London. That would be a hoot, though I'm not sure what my spousal unit would think about hopping the pond just to sit in a pub, as she's not fond of beer.
Got a lot of good info on Google Sitemaps, duplicate content, and SEO tools. Here's a brief summary of what I got from the Wednesday session:
The Google people use the term “Sandbox” but it means specific kinds of behavior; they don’t subscribe to the idea that all new sites fall into a sandbox.
Google confirmed that they probably do use CTR as a ranking factor with qualifications – they have to determine when people are gaming the system and clicking on their own urls, etc.
The engineer I spoke with did not agree with my definition of duplicate content on websites. I did not get the feeling, though, that they actually knew for sure how the duplicate-content penalty was invoked. If content looked identical but it was clear the data was sourced from somewhere else (therefore, not a copy), the engineer said this should not be a penalty.
There was a lot of talk about LSI at the conference; however, I don’t know if any of the engineers actually addressed it in detail.
None of the engineers at Google use the tools we do; they are more like statisticians and speak a different language (i.e., signal/noise, etc.). That's the one gap I think we need to bridge. I suggested that engineers at Google look at the SEO tools in use and express an opinion on the ones they think are useful (or not).
I also got some very direct info on Sitemaps for a large site of a client of mine with 11 million pages; the Google engineer knew of my client's sitemap! I was very impressed.
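For context on why a site that size is a Sitemaps challenge: the published sitemaps.org protocol caps each sitemap file at 50,000 URLs, with a sitemap index file pointing at the individual sitemaps. This wasn't something the engineer said, just back-of-the-envelope math from the protocol limits; a quick sketch in Python:

```python
import math

SITEMAP_URL_LIMIT = 50_000  # max URLs per sitemap file, per the sitemaps.org protocol


def sitemaps_needed(total_urls: int, limit: int = SITEMAP_URL_LIMIT) -> int:
    """How many sitemap files a site needs, each holding at most `limit` URLs.

    All of them would then be listed in a single sitemap index file.
    """
    return math.ceil(total_urls / limit)


# The 11-million-page client site mentioned above:
print(sitemaps_needed(11_000_000))  # -> 220 sitemap files behind one index
```

So even a site that large fits comfortably under the protocol's separate 50,000-entry cap on the index file itself.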
Great Show Brett!
I'm sure any further nuggets from Google would be welcomed too.
As for signal/noise, that's probably a reference to the famous "signal-to-noise ratio"
used in all sorts of electronics, radio, communications, and engineering.
Everyone wants to improve their S/N ratio, i.e. more signal and/or less noise.
The same formulas might apply to content over junk if you catch my drift;
but that's only speculation on my part. -Larry
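For anyone curious, the textbook formula is SNR = 10 * log10(P_signal / P_noise), expressed in decibels. A quick sketch (the content-versus-junk analogy is, as noted above, only speculation):

```python
import math


def snr_db(signal_power: float, noise_power: float) -> float:
    """Classic signal-to-noise ratio in decibels: 10 * log10(Ps / Pn)."""
    return 10 * math.log10(signal_power / noise_power)


print(snr_db(100.0, 1.0))  # 20.0 dB -- signal well above the noise floor
print(snr_db(2.0, 1.0))    # ~3.01 dB -- signal only twice the noise power
```

Doubling the signal power (or halving the noise) buys you about 3 dB either way, which is why engineers talk in those terms.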
... specific kinds of behavior ...
Hehe. No one who was paying close attention over the past year subscribed to the nonsense that all new sites get hit. But for the still-non-believers, hats off to MetricsGuru for being in the thick of things and accurately restating this critically important tidbit.
And some people wonder if it's worth it to attend the PubCons. Go figure.