| 11:13 am on Nov 22, 2013 (gmt 0)|
Thanks for spotting this, Whitey.
To get things started, Matt offered some sample suggestions from his "personal brainstorming". Here are several that jumped out at me...
|Make it easier/faster to claim authorship or do authorship markup. |
Checklists or help for new businesses that are just starting out.
Periodic reports with advice on improving areas like mobile or page speed.
Send Google "fat pings" of content before publishing it on the web, to make it easier for Google to tell where content appeared first on the web.
Better or faster bulk url removal (maybe pages that match a specific phrase?).
Note, btw, that authorship and original publication are not exactly the same thing... they work in tandem... and Matt apparently doesn't think that Google has given up on them. These aren't necessarily areas Google will work on next year, but they're a good start to a list.
I smile, btw, at his bulk url removal suggestion... because he rightly assumes that search by keyword phrase might be the easiest way for webmasters to find artificial links that need removing.
| 11:37 am on Nov 22, 2013 (gmt 0)|
|Send Google "fat pings" of content before publishing it on the web, to make it easier for Google to tell where content appeared first on the web |
This would be a good one if Google then attributes the content to the correct website. I think it should work for future content (not yet published). If they make it work retrospectively, could someone "steal" your content by copying your old content and sending Google the "fat ping"?
| 11:49 am on Nov 22, 2013 (gmt 0)|
Create a new meta tag, "no-title-change", because some webmasters really do not like Google dynamically rewriting their title tags.
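For illustration only, such an opt-out could follow the pattern of existing robots meta directives. The "no-title-change" name and its behavior are purely hypothetical here; no such directive exists, and this sketch just shows what the suggestion might look like in a page's head:

```html
<!-- Hypothetical directive: ask search engines not to rewrite this page's title -->
<meta name="robots" content="no-title-change">
<title>Exact title the webmaster wants shown in the SERPs</title>
```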
Give clearer clues about what the problem is: content, links, technical errors, or something else. I understand Google can't expose their algo because people like me would abuse that information, but clearer messages would help less educated webmasters.
Fix toolbar PageRank. Many people do not want to link to a website showing no PageRank, even when the only reason it shows none is that the site launched 8 months ago and toolbar PageRank broke 9 months ago. This makes things much harder for new sites trying to develop partnerships and traffic sources.
| 9:48 pm on Nov 22, 2013 (gmt 0)|
I'd like to see some sort of report to help webmasters "bridge the gap" from old to new SEO, with a couple of site health reports showing up as quality indicators to assist folks out of the key Panda and Penguin penalties. Something similar to PageSpeed Insights [developers.google.com...], which hones in on key elements for improvement.
Clearly there are a lot of good folks hurting real bad who deserve some relief, and Google would appear to have the tools to tell instantly whether a site is Pandalized or not. Bringing some of that data into WMT would give many of them back the hope they need to respond positively.
| 10:21 pm on Nov 22, 2013 (gmt 0)|
Toolbar PageRank should die, imo. I've never decided against linking to a site based on its PageRank. The fact that PageRank seems well out of kilter says it all for me. (Dodo)
| 12:01 am on Nov 23, 2013 (gmt 0)|
|Send Google "fat pings" of content before publishing it on the web, to make it easier for Google to tell where content appeared first on the web. |
Like it! Not quite as easy as it may sound to implement, but I'm sure they can and hope they do.
|If they make it work retrospectively - could someone "steal" your content then by copying your old content and sending Google the "fat ping"? |
They would actually have to build that type of "filter" in to get it even close to right. If you pinged, but I was constantly scraping you, and the "pingbot" reached your site later than the "regularbot" reached mine, I could steal your content even though you pinged. So attribution of originality in that type of system would really have to be based on "ping time vs. discovered time", where ping time overrides discovered time only when the ping time is earlier. And if someone's going to code that far to get future publications attributed correctly, then it's relatively easy to not attribute originality to a "pinged page" when the content has been in the system since before the ping was received.
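The "ping time vs. discovered time" rule described above can be sketched in a few lines. This is a hypothetical illustration of the commenter's proposed logic, not anything Google has published; the function name and timestamps are made up for the example:

```python
from datetime import datetime

def ping_wins_attribution(ping_time, discovered_time):
    """Decide whether a 'fat ping' should win originality attribution.

    ping_time: when the search engine received the fat ping (None if never pinged).
    discovered_time: when a crawler first saw the same content anywhere (None if never).
    Returns True if the pinged page should be credited as the original.
    """
    if ping_time is None:
        return False  # no ping: fall back to ordinary discovery order
    if discovered_time is None:
        return True   # content was never seen anywhere before the ping
    # A ping only wins when it predates first discovery, so pinging
    # old, scraped content cannot retroactively steal attribution.
    return ping_time <= discovered_time

# Scraper pings content the crawler already saw elsewhere: the ping loses.
first_seen = datetime(2013, 11, 1)
late_ping = datetime(2013, 11, 22)
print(ping_wins_attribution(late_ping, first_seen))   # False

# Publisher pings before publishing; crawler discovers it later: the ping wins.
early_ping = datetime(2013, 11, 22)
later_crawl = datetime(2013, 11, 23)
print(ping_wins_attribution(early_ping, later_crawl)) # True
```

The key design point, per the post, is that the ping never overrides an earlier discovery; it only establishes priority for content the crawler has not yet seen.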
|Create new meta tag "no-title-change" |
Like it! I think even cooler would be something like "TitleSuggest": if the algo determined that a different title should get more clicks, then rather than auto-overriding the title in the SERPs, a notice with the algo-determined suggestion(s) would appear in WMT, letting the webmaster decide which title(s) to test and use.
Added: This one could get super cool if they went "all the way" with it. For instance, suppose the algo came up with 5 different titles determined to work better for 5 different queries a page is returned for. The current page title and the suggested titles could be displayed in WMT with check boxes for the webmaster to indicate use/don't-use. The title of the page on the site wouldn't need to change, but things could really be "dialed in" by the webmaster, especially if "projected displays to clicks" vs. "actual displays to clicks" were presented for the current title and each suggestion in use. This would also eliminate situations like the recent thread here where a wrong city appeared in a title algorithmically, because the webmaster could simply turn that title off and fix the issue. That's a complicated system to code, but it is Google, so I'll just venture to guess that if they want to, they can write the code to make it happen ;)
|I'd like to see some sort of report to help webmasters "bridge the gap" from old to new SEO... |
I like this one too! I think even a "quality score" published in WMT would give people some help, or at least an indication of current standing. Yes, it makes things "more reverse-engineerable", but if the score was really based on quality, then who cares? To "game the system" people would have to create higher-quality sites, which really isn't gaming at all.
|Page rank should die imo. |
FYP: TBPR display should have died long, long, long ago -- lol
[edited by: JD_Toims at 12:38 am (utc) on Nov 23, 2013]
| 12:30 am on Nov 23, 2013 (gmt 0)|
1) More up-to-date data in reporting, e.g. stop showing 404s for pages that vanished years ago.
2) Introduce reporting that flags aspects of a site in need of improvement. I'm sure that can be done without giving away algo secrets. If the current algos are as good at trapping spam as Google says, then what's the harm in letting sites know where their weaknesses are? Bad spam remains bad spam and gets blocked, borderline spam gets a chance to reform, and genuine sites are given a chance to improve... win-win-win.
[Added] Similar ideas expressed in recent posts above... we were obviously commenting at about the same time[/Added]
| 8:50 am on Nov 23, 2013 (gmt 0)|
|More up to date data in reporting. eg.. stop showing 404's for pages that vanished years ago. |
That's a BIG one IMO. Why do they do that? Do they NEVER forget? If a page returns a 404 and the webmaster confirms it's gone and not a problem, then drop it once and for all.
| 9:24 am on Nov 23, 2013 (gmt 0)|
I think one of the biggest, yet easiest, changes they could make is to be more understandable with respect to how they "label things". They're technically accurate, but sometimes simplicity and understandability are much better.
A great example is "crawl errors" relating to 404s.
The way it's presented is usually interpreted as: "Houston, we have a problem..."
If they simply changed the wording/presentation to:
Crawl Notes >
- Did You Know >
-- There are 4,000 URLs from this site in our system for which your server cannot find the information.
-- Status Code Received: 404 Not Found
--- To review the URLs [link]Click Here[/link]
--- If you know the URLs requested are invalid, check this box [check box] and click [submit button] to clear this message.
| 9:34 am on Nov 23, 2013 (gmt 0)|
/\ There we go, simple!
| 1:47 pm on Nov 23, 2013 (gmt 0)|
Personally, I would like to see the same thing I have been suggesting to AdSense and AdWords lately: an "I meant to do that" button. Once I click it, it means I'm aware of the issue and don't need to be reminded again (with the option to review and reverse, of course, like you can do with removed URLs). And if I make a boo boo, then it's my tough luck. But I'm tired of getting endless notifications about stuff I meant to do.