Home / Forums Index / Google / Google SEO News and Discussion
Forum Library, Charter, Moderators: Robert Charlton & aakk9999 & brotherhood of lan & goodroi

Google SEO News and Discussion Forum

This 100-message thread spans 4 pages; page 4 is shown.
Google Rewrites Quality Guidelines
netmeg
msg:4686383
3:06 pm on Jul 9, 2014 (gmt 0)

I'm going to post this link here, because it's from a credible source (WebmasterWorld user jensense) and because it touches on some of the things we've been discussing lately - particularly the Knowledge Graph. Interesting, and worth a read.

[thesempost.com...]

Here's the analysis on "Supplementary Content"

[thesempost.com...]

[edited by: brotherhood_of_LAN at 2:35 pm (utc) on Jul 11, 2014]
[edit reason] Added extra link [/edit]

 

Planet13
msg:4687447
2:21 am on Jul 14, 2014 (gmt 0)

"I guess when a website automatically gets enough traffic, then the webmaster no longer has to worry about the details."


I would say that if it gets enough links, citations, reviews, traffic, and other social signals, you probably don't have to worry about having all the correct HTML tags.

superclown2
msg:4687484
9:12 am on Jul 14, 2014 (gmt 0)

I would say that if it gets enough links, citations, reviews, traffic, and other social signals, you probably don't have to worry about having all the correct HTML tags.


I was looking at some major key phrases recently. The level of key-phrase stuffing on the pages of a big brand site was eye-watering, and obviously deliberate. The pages are firmly anchored at the top of the SERPs.

This underlines what many of us have known for years: yes, quality is important, but reputation and relevance can easily trump it, and in this case the rules can be stretched right up to the line. Deciding where that line is, though, is where the real SEO skill comes in.

jmccormac
msg:4687499
10:18 am on Jul 14, 2014 (gmt 0)

I would say that if it gets enough links, citations, reviews, traffic, and other social signals, you probably don't have to worry about having all the correct HTML tags.

The problem is that Google sucks at Social Media. It is good at spidering the Dead Web (fire-and-forget websites and brochureware that are rarely updated), but the Live Web (continually changing and based on user-driven content) seems to give it problems.

Regards...jmcc

netmeg
msg:4687526
12:47 pm on Jul 14, 2014 (gmt 0)

What baffles me is how many top websites have overly long titles and URLs, no descriptions, no H1 tags, and no ALT attributes, yet they still rank high in the SERPs.


These things have minimal impact. Nowadays it's more about quality, uniqueness, authority, usefulness, relevance, and trust. Technical issues are important insofar as they let the search engines and your users determine those things.
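As an aside, the on-page basics in that quote (title, meta description, H1, ALT attributes) are easy to check mechanically, whatever weight Google actually gives them. Here is a minimal audit sketch using Python's standard-library `html.parser`; the 60-character title threshold is an arbitrary illustration, not any official limit, and the function names are my own.

```python
from html.parser import HTMLParser

class OnPageAudit(HTMLParser):
    """Collects basic on-page signals: <title>, meta description,
    <h1> count, and <img> tags with no alt attribute at all."""

    def __init__(self):
        super().__init__()
        self.title = None
        self.has_meta_description = False
        self.h1_count = 0
        self.images_missing_alt = 0
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and (attrs.get("name") or "").lower() == "description":
            self.has_meta_description = True
        elif tag == "h1":
            self.h1_count += 1
        elif tag == "img" and "alt" not in attrs:
            # Note: alt="" on decorative images is valid HTML; only a
            # completely absent attribute is counted here.
            self.images_missing_alt += 1

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title = (self.title or "") + data

def audit(html):
    """Return a small report dict for one page of HTML."""
    parser = OnPageAudit()
    parser.feed(html)
    return {
        "title": parser.title,
        "title_too_long": bool(parser.title) and len(parser.title) > 60,
        "meta_description": parser.has_meta_description,
        "h1_count": parser.h1_count,
        "images_missing_alt": parser.images_missing_alt,
    }

page = ("<html><head><title>Blue Widget Guide</title>"
        "<meta name='description' content='All about blue widgets.'>"
        "</head><body><h1>Blue Widgets</h1>"
        "<img src='a.png'><img src='b.png' alt='a blue widget'>"
        "</body></html>")
report = audit(page)
```

Running it over a crawl of a site would surface exactly the gaps described above (missing descriptions, missing H1s, images without ALT), which is useful for a site audit even if fixing them moves rankings very little.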

Jenstar
msg:4687592
3:21 pm on Jul 14, 2014 (gmt 0)

The problem is that Google sucks at Social Media.


There had been a lot of talk about Google possibly using Twitter for URL discovery, but Eric Enge just did a study which shows that you need to have more than 5M followers for Google to even make a dent in indexing what you tweet. Google used to be better at indexing Twitter when it had the firehose of data, but now Bing aims to be the indexer of all things Twitter.

What baffles me is how many top websites have overly long titles and URLs, no descriptions, no H1 tags, and no ALT attributes, yet they still rank high in the SERPs.


As soon as spammers start using a technique, Google reduces its effectiveness to such an extent that NOT doing those things won't have much of an impact either.

And some sites have great content and might be the authority in their field (especially niche hobby sites) but have an atrocious website from an SEO perspective. Google doesn't necessarily want to discount those sites and rank them poorly simply because they don't use ALT attributes or don't know what an H1 tag is. Most of the time, it is only when they make a major misstep ("oh, no one told me selling links on my site was bad?") and investigate why traffic has gone down that they look at SEO at all.

My look at reputation in the guidelines is now up... [thesempost.com...]

jmccormac
msg:4687681
8:08 pm on Jul 14, 2014 (gmt 0)

There had been a lot of talk about Google possibly using Twitter for URL discovery, but Eric Enge just did a study which shows that you need to have more than 5M followers for Google to even make a dent in indexing what you tweet. Google used to be better at indexing Twitter when it had the firehose of data, but now Bing aims to be the indexer of all things Twitter.

Bing does seem to be very active in indexing Tweeted links. But Bing has become far more active on indexing of late. It still has some major problems to sort out when dealing with large websites and their sitemaps, problems that Google has already solved.

The big problem for Google with Social Media sites is that they are developing into walled gardens where people exchange links and Google becomes unnecessary. Social Media is taking the original link-authority model and imposing it on a human authority/trust model. Unlike the link model, where links are relatively long-lived, the links in a Social Media model are often highly transitory and appear in a datastream that Google might not be able to readily spider. Using Twitter for URL discovery is not a good way of doing things, and neither is the link-crawling discovery model.

One effect of Google's FUD about unnatural links has been a collapse in links to and from trustworthy sites and a massive increase in noise from out-of-context and low-quality links. Google's success has "convinced" a lot of webmasters and website owners that they don't need links to be found by search engines. In some respects, Google has become the ultimate victim of its own success and is heavily tied into the blind-crawling model of site discovery.

Regards...jmcc

Jenstar
msg:4687688
8:15 pm on Jul 14, 2014 (gmt 0)

Bing does seem to be very active in indexing Tweeted links. But Bing has become far more active on indexing of late.


Bing has an exclusive agreement with Twitter covering how it indexes tweets, and a few weeks ago Bing announced new features in its search engine that let users find tweets and Twitter handles while searching in Bing.

There is more on it here: [blogs.bing.com...]

jmccormac
msg:4687693
8:48 pm on Jul 14, 2014 (gmt 0)

Smart move by Bing - bet on Social Media but let others develop the Social Media sites. Google, with its Orkut and Google Plus, tried to be both the Social Media site and the search engine. Orkut is gone, and so is Vic Gundotra. Between Google and Bing, there are the makings of one good search engine.

Regards...jmcc

Martin Ice Web
msg:4687866
11:33 am on Jul 15, 2014 (gmt 0)

I just recalled this supplementary content thing and the example with AllRecipes. I thought of the domain we discussed lately, I think it was whatscookingamerica, which lost a major part of its traffic. I wonder if that had something to do with this supplementary content thing, as that site does not have any of those things mentioned for AllRecipes. On the other hand, AllRecipes' cooking instructions are very plain and very click-intensive.
But if supplementary content is now more important than the main content, then this might be the reason.
I even wonder why some ecommerce sites are at top positions while they have lots of similar items on the detail page. Those might be seen as good supplementary content.
Google also kills landing pages and content farms with this SC, because in most cases there will be no SC.

MikeNoLastName
msg:4688631
12:27 am on Jul 18, 2014 (gmt 0)

Maybe I'm misunderstanding it, but I found the "supplemental backlinks" portion something of an about-face from what MC said in response to a question just last year (search YouTube for "How does Google consider site-wide backlinks?"): that lots of backlinks within a site make it appear very spammy.
So if every recipe page on the cooking site points to the same on-site 'metric to non-metric measurements converter', will that be good or bad? Does it make a difference if it is on another site? What if you have three cooking sites, each specializing in a different cuisine with entirely different audiences, and want to use the same converter rather than re-invent the wheel? Do you copy it to each and look like you're scraping your own content or plagiarizing, or backlink it from each?
I simply don't think most manual reviewers are smart enough, or will take the time, to analyze it properly. In fact, based on some past AdSense/AdWords messages I've gotten, I KNOW they aren't. At least on the latter services you can request a re-review.

All trademarks and copyrights held by respective owners. Member comments are owned by the poster.
WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved