I have a problem with my website: Google is reporting duplicate content on it, even though those pages (static pages) don't exist on the server. For example, if I have a page at this location (this is a static page, not a dynamic one)
I did check the .htaccess file, and it is not causing this problem.
It may not be causing the problem, but it may be failing to prevent the problem.
In ordinary static html, directories have slashes at the end and files don't. If your site is made up of php files built on the fly, then you need to transfer the slash-checking job to the php itself. Whenever there is a request for anything that doesn't exist, including pages with names ending in ".php/", your php has to return a 404.
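If you'd rather keep that check in Apache than in the php itself, the same idea can be sketched as a mod_rewrite rule that answers 404 directly. This is just a sketch, assuming mod_rewrite is enabled and you're on Apache 2.2 or later (where R= accepts non-redirect status codes):

```apache
# Sketch: answer 404 for any request whose path still has a slash
# after a .php name, instead of letting PATH_INFO serve the same
# page under a made-up URL. The "-" means no substitution.
RewriteEngine On
RewriteRule \.php/ - [R=404,L]
```

Alternatively, `AcceptPathInfo Off` makes Apache itself reject trailing-pathinfo requests with a 404, without any rewrite rule at all.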
If your filenames end in php but are essentially static, you can handle them the same way you'd handle html. Does this type of URL http://www.example.com/folder/page.php/another-page-under-folder.php really occur? If not, make a single conditionless rule, something like RewriteRule ^(([^./]+/)*[^./]+\.(html|php))/ http://www.example.com/$1 [R=301,L]
There's no closing anchor, so there might be extra stuff after the superfluous slash. I have this rule on my test site, though so far I haven't needed it in real life.
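For anyone copying it, that rule dropped into an .htaccess file looks like this (a sketch; www.example.com stands in for your own host name):

```apache
RewriteEngine On
# Capture everything up to and including the .html/.php name,
# drop whatever follows the superfluous slash, and redirect
# permanently to the real file. No closing anchor on purpose.
RewriteRule ^(([^./]+/)*[^./]+\.(html|php))/ http://www.example.com/$1 [R=301,L]
```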
I guess lucy24's way of doing it is more comprehensive, so stick with that. In Google WMT, just select all of the pages that appear and mark them as fixed. Don't be surprised if many new ones keep appearing every day.