| 3:20 am on Jul 14, 2008 (gmt 0)|
Been using FileZilla for a while now and haven't had any trouble with it. Also haven't used it to view over 2000 files within one directory, either, but I'm sure it wouldn't have a problem with it. Just try it out.
| 5:55 am on Jul 14, 2008 (gmt 0)|
FileZilla is the FTP client I was using when the server's tech support person told me it was the problem.
So far every other one I have tried has the same problem. Some do not even get to the 2,000 files before they have a problem.
| 6:12 am on Jul 14, 2008 (gmt 0)|
2000+ files in a directory does sound like quite a few! In such a case I'd put all the files into a TAR archive, GZIP it, upload it as one file and ask the administrator to untar it for you. For him it's no more than typing:
tar -xzf filename.tar.gz
| 6:44 am on Jul 14, 2008 (gmt 0)|
Just tried FileZilla on a directory with 3000+ files and it seems to be OK?
| 7:31 am on Jul 14, 2008 (gmt 0)|
The problem is not with uploading files. It is viewing over 2,000 files with an FTP client after they are uploaded - for example, when you want to change a file name and the file isn't displayed.
You know it is there because the http directory list when viewed with a browser shows the file.
Yes, I know it looks all right. But if you look closely at all the file names you will see some are missing. In my case it was files in the IO and IR area through the Qs of the displayed files.
It is kind of deceiving, because the first 2,000 show up OK, then the last part of the files show up in the x or y or z range, which makes it appear OK - but you are missing a bunch of files in the middle.
I did some research and found that a LIST -a command might help. I thought I would have to do it from the command line, but when I had FileZilla open the files directory it issued the LIST -a itself. But I did notice something else. Below the list command was "226 Output truncated to 2000 matches". FileZilla itself tells you you won't be able to list all the files.
| 9:07 am on Jul 14, 2008 (gmt 0)|
jake60: try telnet directly from the command prompt - that should give you everything, and it has the advantage that you don't have to list files to change them...
e.g. you can type rename, delete, put and get commands immediately after login, without having to list files as FTP clients tend to insist on doing.
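The same idea can be sketched with Python's ftplib. Everything here (host, credentials, file names) is made up for illustration; the point is simply that rename and delete take explicit names, so no LIST is ever issued and a server-side "2000 matches" cap never gets in the way:

```python
from ftplib import FTP


def rename_remote(ftp, old_name, new_name):
    # RNFR/RNTO take explicit names; no directory listing is needed,
    # so a truncated LIST response never matters.
    return ftp.rename(old_name, new_name)


def delete_remote(ftp, name):
    # DELE likewise works on a known name without a LIST.
    return ftp.delete(name)


# Hypothetical usage -- host and login are made up:
# ftp = FTP("ftp.example.com")
# ftp.login("user", "password")
# ftp.cwd("/public_html/files")
# rename_remote(ftp, "old-name.pdf", "new-name.pdf")
# delete_remote(ftp, "obsolete.mp3")
# ftp.quit()
```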
| 1:21 pm on Jul 14, 2008 (gmt 0)|
The problem is solved, on my end at least! I have found out it is not a problem with the ftp software, but the server itself.
| 3:36 pm on Jul 14, 2008 (gmt 0)|
If you know, do you know what on the server was the issue?
| 3:43 pm on Jul 14, 2008 (gmt 0)|
|Below the list command was "226 Output truncated to 2000 matches". Filezilla itself tells you you won't be able to list all files. |
That's a server FTP warning - not FileZilla. Sounds like you came to that realization though.
| 3:51 pm on Jul 14, 2008 (gmt 0)|
I have been using WS_FTP Pro by Ipswitch successfully for many years and never had limitation issues with the number of files. I also believe it is the most popular FTP program.
| 2:44 pm on Jul 16, 2008 (gmt 0)|
Depending on what OS (and filesystem) your server uses, you may experience a significant drop in performance once you start putting [tens of] thousands of files in a single directory.
| 6:42 pm on Jul 16, 2008 (gmt 0)|
My two cents here:
IMO, 2000 files in a directory is insane. As you've noticed, any time a device or program needs to read the entire directory, it takes an unnecessarily long time to do so.
Sure it can be done, and it works, and you may see no performance problems . . . today. Over time, this may come back to present a real problem for you, in locating files if not in performance.
In this case, it's taking even longer because the program is reading that directory remotely.
On one of the carts I built, I dumped all the images willy-nilly into a common directory. Bad idea. I boosted the performance a hundredfold by scripting a method to organize them into directories, then added code to the script so that if the target directory doesn't exist, it gets created.
If this is an issue of SEO where you need all the files to appear to be in one directory, write an .htaccess rule so the URLs stay the same but refer to the actual directories.
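A sketch of such a rule, assuming mod_rewrite is available and that files were physically moved into per-letter subdirectories (e.g. /files/a/, /files/b/); the path names and bucketing scheme are hypothetical:

```apache
RewriteEngine On
# Serve /files/apple.pdf from /files/a/apple.pdf without changing the URL.
# The first letter of the file name picks the subdirectory.
# Skip the rewrite if the requested file really exists at the old path:
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^files/([a-zA-Z])([^/]+)$ files/$1/$1$2 [L]
```

Note this simple pattern is case-sensitive: "Apple.pdf" would map to an uppercase "A" directory, so either normalize file names first or extend the rule accordingly.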
|I also believe [WS_FTP] is the most popular FTP program. |
I love this program: small footprint, simple, no overfluffed point-and-drool GUIs that require more training to use than time spent using them. I simply hate programs like SmartFTP that make such a big flippin' deal over a simple task: move my files from HERE to THERE.
However, it's worth mentioning that WS_FTP has gotten the smackdown here many times because regular FTP is not a secure method of transferring files. If you use it, change your password often and be on the lookout for trouble.
| 7:02 pm on Jul 16, 2008 (gmt 0)|
I was putting all my files in one directory so when people did the directory list they could do simple searches with their browser to choose files to download. With 4,000 files it only took 3 extra seconds to get the entire list generated.
If I have to break up my files into smaller directories, that means people have to do searches in each directory for the files on a subject.
There is no way to order the files other than by type: mp3, video, pdf, etc. So people will just have to search each directory for files on a subject no matter what the extension. A subject may have files with more than one extension.
File names are not standardized or consistent enough to order by name across different directories. Even if it were possible, the webpage directory descriptions could get extremely wild. "Subjects from A to K" just doesn't quite cut it.
Additionally, the problem comes up when there are 1,999 pdf files, so there will have to be two pdf directories when more are added.
It is just easier all the way around to have all the files in one directory.
What will happen by listing files by extension is that people will search only in their chosen media type. If they prefer videos, they will probably miss important info in pdf files.