Google SEO News and Discussion Forum

    
Reconsideration Requests - are some responses automated?
Andylew - msg:4325427 - 11:56 am on Jun 13, 2011 (gmt 0)

I'm busy pulling some information together from reconsideration requests over a few years, including several recent ones on the same site, and I'm convinced the responses are automated (I have posted this in another thread).

I believe penalties are imposed manually but the lifting is automated - perhaps keyword matching in the request triggers a specific penalty-related algo to check for resolution. Does anyone have an opinion on this?

I'm specifically interested in anyone who has recently submitted a reconsideration request for a 'known' manual penalty - I appreciate this is not always easy to tell.

Can you set up email forwarding of WMT messages and log the time the reply comes in? I ask because the past three requests I have put in (for the same site) have always been replied to on the fifth night after the request, and the reply has always arrived at an almost identical time (to the second!) - 1:18 GMT. (This could obviously just be batch sending.)
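
(If you want to automate the time logging, here is a minimal sketch, assuming the WMT messages are forwarded to an ordinary IMAP mailbox - the server, credentials and subject filter below are placeholders, not anything Google-specific:)

# Minimal sketch: print the exact arrival time of forwarded WMT replies.
# Assumes Python 3.3+ and that the messages land in an IMAP mailbox.
import imaplib
import email
from email.utils import parsedate_to_datetime

IMAP_HOST = "imap.example.com"   # placeholder
USER = "me@example.com"          # placeholder
PASSWORD = "app-password"        # placeholder
SUBJECT = "reconsideration"      # placeholder subject filter

conn = imaplib.IMAP4_SSL(IMAP_HOST)
conn.login(USER, PASSWORD)
conn.select("INBOX", readonly=True)

# Find messages whose subject contains the filter text.
status, data = conn.search(None, 'SUBJECT', '"%s"' % SUBJECT)
for num in data[0].split():
    status, msg_data = conn.fetch(num, "(RFC822.HEADER)")
    msg = email.message_from_bytes(msg_data[0][1])
    # The Date header records when the reply was sent, to the second.
    print(msg["Subject"], parsedate_to_datetime(msg["Date"]))

conn.logout()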

Post your reply times to reconsideration requests, and any experience that indicates this is not a manual process.

 

walkman - msg:4325536 - 4:46 pm on Jun 13, 2011 (gmt 0)

I asked just in case for mine, and it was automated as far as I can tell. For what it's worth, it came after 5 pm California time and after about 5 days. (I didn't have a manual penalty.)

indyank - msg:4325546 - 4:55 pm on Jun 13, 2011 (gmt 0)

Oh yes, I had the same experience recently - after 5 days and during the night in the US. The subject line read "No manual spam actions found" against the domain name.

Shatner - msg:4325615 - 7:04 pm on Jun 13, 2011 (gmt 0)

They all are. Google isn't listening.

Robert Charlton - msg:4325636 - 8:16 pm on Jun 13, 2011 (gmt 0)

This forum thread from 2009 discusses a video featuring two members of Google's search quality team....

Reconsideration Request Tips - from Google Search Quality Team
http://www.webmasterworld.com/google/3903282.htm [webmasterworld.com]

The video states that reconsideration requests are in fact read by members of the team. It emphasizes the need for the clearest possible statement of the situation in the request, and also the need to be concise and to the point. It also suggests that many requests received assume penalties when the problems are elsewhere.

Some months back, in the Supporters section here, we looked at what one member had done (quite a bit of work, in fact) to enhance the quality of content on his site. We critiqued his request to help him make it clearer and quite a bit shorter. For those who are Supporters here, I suggest that the thread is well worth reading...

My Google Reconsideration Request: need a sharp review
http://www.webmasterworld.com/supporters/4269251.htm [webmasterworld.com]

tedster - msg:4325651 - 8:45 pm on Jun 13, 2011 (gmt 0)

They all are. Google isn't listening.

Just two weeks ago someone shared their reconsideration response with me - it was most definitely personal and not automated. It didn't even seem to be boilerplate to any significant degree.

I do think the RR team now has a number of quick reply buttons they can click for standard actions, and it's not impossible that the "no manual spam actions" look-up could be automated. But it is frustrating that algorithmic spam actions might still catch an occasional false positive, and now it feels like you can't get a human look to overrule that - to see if your site might be worth putting on the "exception list".

I think there's still a human look somewhere along the stream, and the key is that your request needs to immediately look like it's not "just another one of those."

Andylew - msg:4325664 - 9:23 pm on Jun 13, 2011 (gmt 0)

Tedster, that's interesting - clearly there is some human interaction somewhere...

The site I'm playing with (mainly for fun/personal curiosity) is just met with the usual 'your site violates Google's quality guidelines'. I have started a new test today: dropping the page index rate down to a minimum and completely removing the site, so all indexed pages are 404s and the index page is a simple 'site closed'. A reconsideration request has gone in to this effect. My hypothesis is that in 5 days' time I will get a reply saying the site still violates Google's quality guidelines. The reasoning is that Google will only recrawl a fraction of the indexed pages and find they are 404s; the automated system for reconsideration uses cached data from previous crawls and will not notice that the site has now gone. A human review would.

My reasoning, which may help others: people who get penalised change something and then submit a reconsideration request. If Google is using cached data, or only the data collected during those 5 days, then large sites which haven't had all their pages recrawled in that time will still be classed as in violation. Not until every page has been crawled and checked, and a reconsideration then completed, will the site recover... I will of course post the outcome of this.


Rob, I've seen that video a few times and they clearly say that they read what is written. I guess the question is whether this happens before the reconsideration, or afterwards to help develop new algos?

walkman - msg:4325666 - 9:24 pm on Jun 13, 2011 (gmt 0)

If you have no penalty it makes no sense to waste their staff's time - automate it, unless they want to peek just in case something is wrong. These past few months Google has told people to 'look at outgoing links', for example, or doorway pages. The generic replies I've seen on Google support forums are when there's no hope for the site - totally MFA, with no other purpose.

I think the non-manual-penalty cases are automated, but they wait a few days for some reason. Maybe to discourage people from submitting every month to see whether Google has caught their latest trick?

mhansen - msg:4325687 - 10:12 pm on Jun 13, 2011 (gmt 0)

Worth noting... In 2008, I had what I felt was a unique problem, thought I was more deserving than others, and requested a manual review with special considerations. In addition to the online request, I spent a few hours Googling, and got both the work and personal email addresses of Matt Cutts, to send it to him also. (I know, probably not a great decision)

Not only was my request answered promptly, I also got a somewhat personal email from a gent named Mur V., a Product Manager who worked directly with Matt at the time. He mentioned enough personal info about both myself and the site in question that I knew it was not canned, but he also sent the same type of canned reply with links to Google guidelines, the reconsideration request guide, etc.

The point... even when you get personal replies, the answers are the same and always under the guise of obfuscation. [en.wikipedia.org...] Incidentally, I never used the word much before reading Google guidelines and having to understand its meaning very clearly! :-)

olmcdonald - msg:4326699 - 12:51 am on Jun 16, 2011 (gmt 0)

So we sent in a reconsideration request for the second time.

A few years back we were penalized 100% and we made some changes, filed a reconsideration request and about 2 months later saw an increase in traffic.

But... we never received an email back saying anything, i.e. that your request had been processed - we just saw an increase in natural search traffic.

After losing 60 to 70 percent of our traffic in Panda, we realized that maybe our penalty was not completely lifted after all, and once again we filed a reconsideration request. 10 days later we got the same form letter saying the webspam team found nothing wrong with our site... basically this is all on us... not them.

My question now is: is there something like a refresh, so that where we were penalized and they have since taken a look and realized the problem is long gone, we have gained our trust back?

Have any of you seen any changes since getting this form letter?

Andylew - msg:4331458 - 9:19 am on Jun 27, 2011 (gmt 0)

A follow-up:

The site was 404'd and a reconsideration request put in. The hypothesis was that after the usual 5 days Google would reply with 'site still violates' if it uses pre-reconsideration-request data; however, no message was received. After seven days and no message, the site was put live again. Five days later the message came in, at exactly the same time of night and after exactly the same period of time (with the site live).

Conclusions:
This would suggest that if a site is 404'd (or possibly fixed), either the timeout before Google will consider it clean and reply is longer than 5 days, or, if it is 404'd, they assume foul play or website problems.

The best thing this indicates is that 'something' is happening when a reconsideration request is put in; this seems to point to Google not using pre-reconsideration-request data as a factor.

On this reconsideration, a simple 'please reconsider this site' was entered into the text box. On previous attempts the text has varied from a single paragraph to full A4 pages. This seems to have no effect on the time periods involved in reconsideration or on the outcome. I can only draw the conclusion that the text is read after the reconsideration process, to help develop future algos. The length of the text (or lack of it) does not seem to influence the process.

All this is pointing more and more to the process being automated.

What's next?
Because 'something' seems to be happening when a reconsideration request is put in, there should be a record of it in the Apache logs. This site is large enough, and the crawl rate high enough, that it should be possible to find a specific page which is crawled significantly more than others between the times when the reconsideration requests are put in. ********

Further to this, because I believe this is automated, would Google be daft enough to use the same IP?...

Further to this, with the site 404'd, would the 'normal' spiders have gone, leaving only a reconsideration spider?...

I think I will now have enough data points after several reconsideration requests to achieve answers to the above.

To further develop the ideas around 404'ing a site, I have taken another site (part of the same group but a different country, which received a penalty on the same day) and put in a reconsideration request against the completely 404'd site. I will leave this site 404'd until I receive a reply.


********
To do this I will be racking up a new server, taking the Apache logs and splitting each line - by replacing delimiter characters with pipes - into date, IP and request.

Importing it into MySQL.

Taking each line, then comparing it to every other line in the database and doing a count.

There are several hundred thousand lines to do this on, so it is going to take several days to complete - anyone with any bright ideas on speeding up this process? (One possibility is sketched just below.)
********
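
(A minimal sketch of one way to do that count in a single pass, assuming a standard combined-format Apache log - the path is a placeholder and the regex may need adjusting for your log format:)

# Count how often each (IP, request) pair appears in an Apache
# combined-format access log, without any pairwise comparison.
import re
from collections import Counter

LOG_PATH = "access.log"  # placeholder

# combined format: ip ident user [date] "request" status bytes "referer" "agent"
line_re = re.compile(r'^(\S+) \S+ \S+ \[([^\]]+)\] "([^"]*)"')

counts = Counter()
with open(LOG_PATH) as log:
    for line in log:
        m = line_re.match(line)
        if m:
            ip, date, request = m.groups()
            counts[(ip, request)] += 1

# Most frequently repeated (IP, request) pairs first.
for (ip, request), hits in counts.most_common(20):
    print(hits, ip, request)

A single streaming pass like this should get through tens of millions of lines in minutes rather than days.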

pierrefar - msg:4333295 - 9:02 pm on Jun 30, 2011 (gmt 0)

Hello all

I thought I would clarify some of the questions above. Firstly, we do have a team that reads and responds to reconsideration requests, but in some cases we may not read all reconsideration requests; for example, if a webmaster is bombarding us with a new request every day, we may choose to read only the most recent submission.

It's also important to note a few things about reconsideration requests:

1. Before you submit the request make sure that you read our guidelines and that the site is in compliance to the best of your knowledge.

2. We've seen some webmasters misdiagnose the problem and think a manual spam action is affecting their site when in reality it's something else, like a crawling issue. Be sure to look through Webmaster Tools for such problems, and ask for help from other webmasters if you're not sure.

3. A reconsideration request does not apply to malware. In those cases follow our clean up tips and submit a malware review through Webmaster Tools.

4. When you are writing the reconsideration request, be as detailed and as clear as possible - help us help you. Give a timeline as accurately as possible when you're describing actions or changes that may be relevant. If you did find and fix a violation, explain what you did. We do appreciate that some fixes simply cannot be completed 100% (e.g. cleaning up spammy back links), and in those cases explain the effort you took to fix the problem as much as possible.

5. If you get new relevant information after you submit the request, it's OK to follow up.

Hope this helps,
Pierre

dazzlindonna - msg:4333317 - 10:02 pm on Jun 30, 2011 (gmt 0)

We do appreciate that some fixes simply cannot be completed 100% (e.g. cleaning up spammy back links)


Mumble, grumble....it shouldn't be a responsibility of the webmaster to clean up spammy backlinks. Unless the webmaster has control over something, he shouldn't have to "fix" it. Just discount them, already! Mumble, grumble...

johnhh - msg:4333336 - 11:07 pm on Jun 30, 2011 (gmt 0)

Firstly, we do have a team
Um, is @pierrefar the new GoogleGuy? If so, welcome.

tedster - msg:4333401 - 2:29 am on Jul 1, 2011 (gmt 0)

Yes, I can confirm that Pierre Far is employed at Google where (according to his Twitter profile) he is a Webmaster Trends analyst.

dazzlindonna - msg:4333427 - 4:43 am on Jul 1, 2011 (gmt 0)

Yep, Pierre definitely works for Google. John Mueller pulled him over to the dark side, that sneak! :) I'll always be friends with both of them, but I'll still yank their chains every now and then when they say something Googly that I don't like. I don't think they'd want it any other way. :)

Andylew - msg:4333508 - 9:35 am on Jul 1, 2011 (gmt 0)

Thanks for the reply, Pierre - as always with Google there are a lot of words but very little content.

To answer my earlier post and follow up with new findings:

Once more my hypothesis is incorrect: although something is happening when a reconsideration request is put in, it is not a result of new activity after the reconsideration. The raw logs have been downloaded - I said this was a few hundred thousand lines, but it actually turned out to be over 50 million lines spanning 4 reconsideration requests - and I can say categorically there is no significant change in crawling: no new IPs, no page looked at more than normal during that time, nor any other factor that could suggest different activity, or a specific area, date, IP or request being looked at after a reconsideration request has been put in.

So what can be concluded from that? Because no new or specific data (collected after a reconsideration request) is being used to determine whether a site is now fit to have a penalty lifted, and reconsideration requests are unlikely to be read before reconsideration takes place, there must be continuous monitoring of the site and a probability score for resolution. Google has already confirmed penalties are time-limited. I wonder whether, through what I now believe 100% (in this situation) is an automated system, standard Googlebot data is used to determine a score, and this score is then constantly updated. Once the score reaches a certain level - a 'penalty resolved' level - the site is determined to be 'fixed' and it goes into the timeout phase: 30 days, 60 days, whatever, where if there are no further problems the penalty is removed (or it goes to a human evaluator for a final check). Once it is in this timeout phase, a reconsideration request would just speed up the return to standard listings (or human evaluation), putting the site at the front of the queue.

The 404 test could be the exception to the rule: if Googlebot detects a significant change in the site (like it disappearing!), it would make sense for a complete re-evaluation of the site to take place before it is re-scored. This could be a good indicator of new ownership etc. Tinkering around the edges, as many do, is unlikely to trigger this complete re-evaluation.

What's next?

With no clues as to an area of the site to concentrate on, and no change in Google activity after a reconsideration is entered, the only current area to focus on is this notion that a 'significant change' influences the reconsideration request in some way.

With one site now 404'd and awaiting a response, I'm going to do a complete redesign of the other site - new graphics, CSS, layout etc. The content and URL structure will stay the same. I wonder whether this will trigger the 'significant change' response, characterised by an increased wait for a response to reconsideration.

*************
For all those coming across this thread in future and wanting to evaluate their raw logs, the following provides a starting point.

Download your raw logs, then separate out the different elements by replacing characters with pipes or similar, then import into MySQL (a sketch of this splitting step follows after this note).

Create a new column - adder in my case - with a default value of 1 (a plain COUNT(*) would give the same number).

Analyse the data using queries similar to:

CREATE TABLE haystack_needle AS
SELECT `ip`, `date`, `source`, `request`, SUM(`adder`) AS `hits`
FROM `haystack`
GROUP BY `request`
ORDER BY `hits` DESC;

This will give you a new table where the hits column equals the number of times each GROUP BY element appears - in the above example 'request', but this could be changed to 'ip' or something else. The result can then be used to drill down further.

**************
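
(And a minimal sketch of the splitting step described above - turning the raw log into pipe-separated date|ip|request lines ready for import into MySQL. The paths and field layout are assumptions; adjust to your own log format:)

# Split an Apache combined-format log into pipe-separated fields
# (date|ip|request) for loading into the `haystack` table.
import re

IN_PATH = "access.log"       # placeholder
OUT_PATH = "haystack.psv"    # placeholder

line_re = re.compile(r'^(\S+) \S+ \S+ \[([^\]]+)\] "([^"]*)"')

with open(IN_PATH) as src, open(OUT_PATH, "w") as dst:
    for line in src:
        m = line_re.match(line)
        if m:
            ip, date, request = m.groups()
            dst.write("|".join((date, ip, request)) + "\n")

# Then in MySQL, something along the lines of:
#   LOAD DATA LOCAL INFILE 'haystack.psv' INTO TABLE haystack
#   FIELDS TERMINATED BY '|' (`date`, `ip`, `request`);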

Andylew - msg:4336225 - 8:24 am on Jul 7, 2011 (gmt 0)

Latest update,

The idea that a 'significant change' might prompt a different response to a reconsideration request has proven wrong. A site which has been completely removed and 404'd has still been met with 'site still violates quality guidelines'. This adds further weight to the idea that reconsiderations are automated and rely on cached Google data.

With a site 404'd, Googlebot disappears, so pages are still in the index and cached, and Google doesn't realise they have gone.

So the problem is that Googlebot is needed to notice that the old violating pages have gone, and 404'ing a site loses the bot.

The next step is to completely change the URL structure so there is still an active site - it would look like a new one, with post-violation content changes. This would tick the box of 'remove the pages which violate Google's quality guidelines'.

The idea in this case is that a new site would be recognised and its pages would not have a black mark against them; the black-marked pages would be 404'd. The other possible conclusion is that once a site has been penalised, a site-wide penalty-specific check algo is run on all newly crawled pages. Time will tell.
