
Google SEO News and Discussion Forum

My site hit 27 November - Google Update?
Mentat
msg:4627130 - 8:04 am on Dec 2, 2013 (gmt 0)

Here is the drop for one of my subdomains since 27 November.

[i.imgur.com...]

Site up since 2002, no link building, but a lot of bots are scraping my site :(
A lot of sites are using my RSS feed to create content.

Other subdomains got hit on the 4th of September.
One is back - the only one I left alone (no content cleaning, only links disavowed).

So the question still remains... what is this?
Panda, Penguin or a nasty bird?

 

itsjustme2
msg:4636478 - 2:49 pm on Jan 10, 2014 (gmt 0)

I had a manual penalty on one product page.
I searched and searched and found a lot of .ru domains linking to it.
The problem is that the site is huge and has tens of thousands of pages! Finding the bad links is a needle in a haystack, so I removed the page and now serve a 410 Gone error for it.

I give up! :@
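For anyone who wants to do the same, a minimal sketch of returning a 410 from Apache .htaccess with mod_alias (the page path here is made up):

# .htaccess - tell crawlers the page is gone for good
Redirect gone /old-product-page.html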


btw, what do you do with *.blogspot.com/.ru/.fr/.br/etc domains?
The same blog automatically exists under seemingly infinite TLDs! :@


I don't think you can disavow a whole TLD with

domain:.ru

in the disavow file,

but you can filter for .ru in the spreadsheet, then copy and paste the results into a separate sheet to eyeball them.
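If you'd rather script that filter, here's a minimal sketch in Python; it assumes a one-URL-per-row CSV export from WMT, and the file names are made up:

import csv
from urllib.parse import urlparse

# links.csv: one backlink URL per row, as exported from WMT
with open("links.csv", newline="") as f:
    urls = [row[0] for row in csv.reader(f) if row]

# keep only links whose host ends in .ru
ru_links = [u for u in urls
            if urlparse(u).hostname and urlparse(u).hostname.endswith(".ru")]

with open("ru_links.csv", "w", newline="") as f:
    csv.writer(f).writerows([u] for u in ru_links)

print(f"{len(ru_links)} of {len(urls)} links are on .ru hosts")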

I've been going through the same thing...

it's easier to combine the spreadsheets from WMT and the other backlink checkers, then filter on the domain column to eliminate duplicates

i.e., using different backlink checkers will automatically produce some domain duplicates.
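A rough sketch of that merge-and-dedupe step in Python, again assuming each export is a one-URL-per-row CSV (the file names are hypothetical):

import csv
from urllib.parse import urlparse

# exports from WMT plus whatever other backlink checkers you use
files = ["wmt_links.csv", "checker1_links.csv", "checker2_links.csv"]

domains = set()
for name in files:
    with open(name, newline="") as f:
        for row in csv.reader(f):
            host = urlparse(row[0]).hostname if row else None
            if host:
                # treat www.example.com and example.com as one domain
                domains.add(host[4:] if host.startswith("www.") else host)

# one row per unique linking domain, ready to eyeball in a spreadsheet
with open("unique_domains.csv", "w", newline="") as f:
    csv.writer(f).writerows([d] for d in sorted(domains))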

The problem is that on a site with thousands, tens of thousands, or hundreds of thousands of links, sifting through both domains and URLs is a very difficult technical task.

here's how the Rap Genius recovery story was reported.

Following the detailed mea culpa the founders included a list of four steps the search engine required they take in order to receive forgiveness:

Download a list of links to your site from Webmaster Tools.
Check this list for any links that violate our guidelines on linking.
For any links that violate our guidelines, contact the webmaster of that site and ask that they either remove the links or prevent them from passing PageRank, such as by adding a rel="nofollow" attribute.
Use the Disavow links tool in Webmaster Tools to disavow any links you were unable to get removed.
It turns out there were 178,000 suspect URLs, but with 100 workers scraping URLs the task was completed in 15 minutes, according to the post.


If you have 100 technical employees you might be ok...otherwise, try to go as systematically as possible through all the links and domains to determine which ones, or which groups of links, can be disavowed.
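For reference, the disavow file itself is just a plain-text list - one domain: rule or full URL per line, with # for comments. A minimal sketch with made-up domains:

# disavow.txt - upload via the Disavow links tool in WMT
# disavow every link from a domain:
domain:spammy-example.ru
domain:scraper-example.net
# or disavow individual URLs:
http://example.org/spam/page.html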

DansSitesRScrewed
msg:4636480 - 2:59 pm on Jan 10, 2014 (gmt 0)

The tool on WebmasterWorld where you can upload your links sorts and organizes them, removes duplicates, and makes it very easy to manually check the links and create a disavow file. Watch the videos on how it works; it's really nice.

itsjustme2
msg:4636503 - 4:35 pm on Jan 10, 2014 (gmt 0)

The tool on WebmasterWorld where you can upload your links sorts and organizes them, removes duplicates, and makes it very easy to manually check the links and create a disavow file. Watch the videos on how it works; it's really nice.


Agreed - Boykin does a good job with the tool, automatically filtering, whereas I manually filtered, copied, pasted, and eyeballed the different domains...

He also mentions that the real sweat equity comes from eyeballing the links and interpreting their "value" once the tool has sorted them.

It literally takes hours.

Also note, he mentions the need to look for backlinks beyond the WMT export, because WMT only provides a subset of your total links.

dethfire
msg:4636508 - 5:17 pm on Jan 10, 2014 (gmt 0)

Some of us don't have 100 workers working 8 hours a day to review 178k links. I've been seriously hurt by spammers. I have disavowed about 500 domains, but I'm sure that isn't all of them.

itsjustme2
msg:4636539 - 6:33 pm on Jan 10, 2014 (gmt 0)

Some of us don't have 100 workers working 8 hours a day to review 178k links. I've been seriously hurt by spammers. I have disavowed about 500 domains, but I'm sure that isn't all of them.


it's called irony

if you buy links and get caught, it takes about ten days to recover

if you don't buy links and get caught up in the scrapers and negative SEO, it takes about ten years to recover

I'm up to 1,500 disavowed scraper domains at present...if I had known what to do, I would have done it two years ago...

it's the antinomies of big data

Mentat
msg:4636555 - 7:51 pm on Jan 10, 2014 (gmt 0)

I've just found a ton of subdomains on sourceforge.net with pr0n content and links to me :@

This is crazy!

Planet13
msg:4636559 - 8:11 pm on Jan 10, 2014 (gmt 0)

@ Mentat

I am sorry if this is a stupid question, but I have to ask:

when you had that big drop and gain in traffic, did you confirm it by looking at your server logs as well?

Or did you get all your data solely from Google WMT / GA?

I ask because I remember a period a few years back when Google traffic dropped about 90% for around a month before reappearing, according to Google Analytics. That contradicted what my server logs said.

Anyway, I apologize if this seems like a stupid question.

Mentat
msg:4636599 - 10:07 pm on Jan 10, 2014 (gmt 0)

I wish it were all a mistake, but Google Analytics and WMT are in sync, and AdSense revenue tells the same story...

There is no spoon!

Sand
msg:4637224 - 4:15 am on Jan 14, 2014 (gmt 0)

I'm going to guess that it was Panda. I had a site that was hit by Panda and recovered. Now, with each Panda update, I get more traffic. I saw a boost starting right around this time, though the Thanksgiving holiday makes it hard for me to pinpoint the exact day.

DansSitesRScrewed
msg:4637225 - 4:29 am on Jan 14, 2014 (gmt 0)

Sand, did you do anything special to help your site recover?

Sand
msg:4637340 - 1:22 pm on Jan 14, 2014 (gmt 0)

I really just took inventory of everything on the site and identified pages that could have been much better. Then, depending on their importance to the site, I either made them much stronger or deleted them altogether. That's it, though it took me 6+ months of dedicated work to actually do it.
