this is more of a preventive measure to avoid future issues; i doubt it will fix any of your current ones.
i receive my content from Unique Article Wizard (it's not my own content)
if it wasn't made clear to you, this is the biggest problem you have to solve. everything else is like rearranging deck chairs on the Titanic.
i did what you said on one of my pages and found 5 errors: 5 of the links from that page are blocked by robots.txt
yes, i did "Fetch as Googlebot" like you said. i checked 2 pages and found the same 5 errors on each, and the errors say "blocked by robots.txt"
no matter how many times you write this, i can't understand what you are saying. how can you check one page and get 5 errors saying "blocked by robots.txt"?
can you find any urls on your site (i.e. on your hostname, not external links from your content) which are being blocked by robots.txt? can you find any urls on your site which you expected to be indexed but aren't?
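if it helps, here's a quick way to test urls against your robots.txt rules without waiting on "Fetch as Googlebot" each time. this is just a sketch using Python's standard `urllib.robotparser`; the robots.txt contents and the example.com urls are placeholders, so swap in your own hostname and your actual robots.txt:

```python
from urllib.robotparser import RobotFileParser

# hypothetical robots.txt contents -- replace with your site's actual file
# (normally fetched from https://yourdomain.com/robots.txt)
robots_txt = """\
User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# test a few of your own urls against the rules Googlebot would see
for url in [
    "https://example.com/page.html",
    "https://example.com/private/page.html",
]:
    status = "allowed" if rp.can_fetch("Googlebot", url) else "blocked by robots.txt"
    print(url, "->", status)
```

any url that prints "blocked by robots.txt" here is one Googlebot won't crawl, which is exactly the error you're seeing in "Fetch as Googlebot".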