We had the opportunity to discuss individual site issues and observations. I was involved in the 302 discussions, and it appears that many webmasters/sites suffer the same or similar problems.
It's been a great opportunity for those attending the WebmasterWorld conference and I think it's proved that Google is listening.
That was not only great information but what exquisite people! I had a great time with my new buddies... AND I remember their names even after the weird jello things... : )
Hi Kevin, Anna, Chad and Tony! House of Blues was GREAT.
I have some reading and learning to do but it will be fun. Hope the rest of you had a blast too!
Brett, you're a great host. Hope to see you all next time.
I thought the session halls were great, but the hotel in general was horrible. Like the poster above said, I won't hold that against WebmasterWorld either. :)
Thanks everyone for such a great experience. See you next time, -g
In any event, I got some important questions answered, enjoyed the food (and drinks) and was extremely pleased with the forum.
Hope this becomes a regular thing for them :)
What?! Some small number of webfolks were taking advantage and trying to hog the best spots all for themselves?!?!?
Why, who ever heard of such a thang? :)
I gotta make one of these soon. Even if I don't do this for a living, it sounds like there's definitely the potential to do some living.
Pubcon was awesome as well. I met just the right people. Just, WOW!
I have one word to describe the entire experience: overwhelming. My "to do" list will take me 6 months. Better hire some help soon.
Got a lot of good info on Google Sitemaps, Duplicate Content and SEO Tools. Here's a brief summary of what I got out of the Wednesday session:
The Google people use the term “Sandbox” but it means specific kinds of behavior; they don’t subscribe to the idea that all new sites fall into a sandbox.
Google confirmed that they probably do use CTR as a ranking factor, with qualifications – they have to determine when people are gaming the system and clicking on their own URLs, etc.
The engineer I spoke with did not agree with my definition of duplicate content on websites. I did not get the feeling, though, that they actually knew for sure how the duplicate content penalty was invoked. If content looked identical but it was clear the data was sourced from somewhere else (therefore, not a copy), the engineer said this should not be a penalty.
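For those wondering what "looked identical" could even mean to a machine: Google's actual method wasn't shared, but the textbook approach to near-duplicate detection is word shingling plus Jaccard similarity (Broder's technique). A purely illustrative Python sketch:

def shingles(text, w=4):
    # Break text into overlapping w-word "shingles".
    words = text.lower().split()
    return {" ".join(words[i:i + w]) for i in range(len(words) - w + 1)}

def jaccard(a, b):
    # Overlap of two shingle sets: |A & B| / |A | B|.
    return len(a & b) / len(a | b)

doc1 = "the quick brown fox jumps over the lazy dog"
doc2 = "the quick brown fox leaps over the lazy dog"
print(jaccard(shingles(doc1), shingles(doc2)))  # 0.2 -- one changed word breaks 4 shingles

Two pages scoring near 1.0 would look like copies; data that was merely sourced from the same place and rewritten would score much lower.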
There was a lot of talk about LSI at the conference; however, I don’t know if any of the engineers actually addressed it in detail.
None of the engineers at Google use the tools we do; they are more like statisticians and speak a different language (i.e., signal/noise, etc.). That's the one thing I think we need to bridge. I suggested that the engineers at Google look at the SEO tools in use and express an opinion on the ones they think are useful (or not).
I also got some very direct info on Sitemaps for a large site of a client of mine with 11 million pages; the Google engineer knew of my client's sitemap! I was very impressed.
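For anyone curious how an 11-million-page site even submits a sitemap: the Sitemaps protocol caps each sitemap file at 50,000 URLs, so large sites submit a sitemap index that points at hundreds of child files. A rough Python sketch (all file names and URLs here are hypothetical; the namespace shown is the sitemaps.org one):

MAX_URLS = 50_000  # per-file cap in the Sitemaps protocol

def chunks(urls, size=MAX_URLS):
    # Yield successive batches of at most `size` URLs.
    for i in range(0, len(urls), size):
        yield urls[i:i + size]

def write_sitemaps(urls, base="http://www.example.com"):
    # Write one <urlset> file per batch, then an index pointing at them all.
    names = []
    for n, batch in enumerate(chunks(urls)):
        name = "sitemap-%04d.xml" % n
        with open(name, "w") as f:
            f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
            f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
            for u in batch:
                f.write("  <url><loc>%s</loc></url>\n" % u)
            f.write("</urlset>\n")
        names.append(name)
    with open("sitemap_index.xml", "w") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write('<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for name in names:
            f.write("  <sitemap><loc>%s/%s</loc></sitemap>\n" % (base, name))
        f.write("</sitemapindex>\n")

# 11 million URLs works out to roughly 220 child sitemap files.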
Great Show Brett!
I'm sure any further nuggets from Google would be welcomed too.
As for Signal/noise, that's probably in reference to the famous "signal to noise ratio"
used in all sorts of electronics, radio, communications and engineering.
Everyone wants to improve their S/N ratio, i.e. more signal and/or less noise.
The same formulas might apply to content over junk if you catch my drift;
but that's only speculation on my part. -Larry
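To put a number on Larry's point: SNR is usually quoted in decibels, computed as 10 * log10(signal power / noise power). A quick illustrative snippet:

import math

def snr_db(p_signal, p_noise):
    # Signal-to-noise ratio in decibels: 10 * log10(Ps / Pn).
    return 10 * math.log10(p_signal / p_noise)

print(snr_db(100.0, 1.0))  # 20.0 dB: signal dominates
print(snr_db(2.0, 1.0))    # ~3.0 dB: signal barely twice the noise

By that analogy, a page that's mostly real content over a little boilerplate has a high S/N, and a page of junk with a sprinkle of content has a low one.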
... specific kinds of behavior ...
Hehe. No one who was paying close attention over the past year subscribed to the nonsense that all new sites get hit. But for the still-non-believers, hats off to MetricsGuru for being in the thick of things and accurately restating this critically important tidbit.
And some people wonder if it's worth it to attend the PubCons. Go figure.