Windows directory performance question

JOhnH
I use a commercial program for our veterinary hospital. Since we went to all Electronic Medical Records in 2007, we do not retain any paper. We scan everything in and attach it to a client/patient record.

I remember from my old tech support days (VAX/VMS) that we always had performance problems when we created a directory with too many files in it. So I arranged to store all of our attached files in a directory tree like this:

C:\top directory\scanned files\yyyy\mmm\long-descriptive-filename
The long descriptive filename is formatted as:
client# - patient name - description
(the patient name is the animal's name).

This way, the largest single directory I have has about 900 entries. It happens to be C:\top directory\scanned files\2013\may\

But my vendor insists that I convert to their scheme before I upgrade to the next version of their software. They want:

C:\application\client#\patient name\filename.
But this way, the \application directory would hold about 15,000 client# subdirectories now, and it would continue to grow as I add clients.

They say my way is "too deep" (five levels vs. their four).
I say theirs is "too fat" (15,000 entries in one directory vs. my 1,000).
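
Just to make the two layouts concrete, here's a rough sketch of where the same scan lands under each scheme (the client number, patient name, and .pdf extension are made up):

Code:
from pathlib import PureWindowsPath

# Hypothetical record: client 4821, patient "Rex", scanned May 2013.
client, patient, desc = "4821", "Rex", "rabies-certificate"

# My scheme: fan out by date, so no single directory gets huge.
mine = PureWindowsPath(r"C:\top directory\scanned files", "2013", "may",
                       f"{client} - {patient} - {desc}.pdf")

# Vendor scheme: fan out by client, so \application holds one
# subdirectory per client (~15,000 of them and growing).
theirs = PureWindowsPath(r"C:\application", client, patient, f"{desc}.pdf")

print(mine)    # C:\top directory\scanned files\2013\may\4821 - Rex - rabies-certificate.pdf
print(theirs)  # C:\application\4821\Rex\rabies-certificate.pdf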

The application runs on 32-bit XP Pro workstations, but the data is stored in "flat files" on a 64-bit server.

Does this make sense to anyone?
Would I gain or lose performance if I change?
They say I can choose either way, but they don't support my way. And once you do anything that is "not supported", it ain't fun to get past that with the script-reading phone support when you need help with other problems.
 
While I can't speak to Windows performance with 1,000 files in a dir vs. 15,000, I would say that trying to predict Windows file system performance based on VAX/VMS file system performance is...well, kinda silly.

I can't even think of a good analogy to illustrate the silliness. Maybe, "I learned I had to blow on my Nintendo game cartridges to ensure a good connection, so I still do the same thing with my Sony Playstation 3 discs today."

VMS and the Windows file system both exist to provide an interface to data stored on mass media devices. But the technology used to accomplish the task is utterly different between the two, and using the performance of one to predict the performance of the other is...well, silly.

Are you the largest customer your vendor supports? If they've got a large installed base, and your usage is not way outside the norm for their customer base, and their other customers have no issues, then why are you worried?
 
I think the only way you could know for sure is to do some benchmarking and find out.
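
Something like this rough Python sketch would do it (run it on the actual NTFS volume the server uses; the temp directory here is just a stand-in, and the file names are made up):

Code:
import os, random, tempfile, time

def build(n):
    """Create a directory holding n empty files."""
    d = tempfile.mkdtemp()
    for i in range(n):
        open(os.path.join(d, f"file{i:05d}.txt"), "w").close()
    return d

def time_lookups(d, n, samples=10000):
    """Time stat() calls on randomly chosen known file names."""
    names = [os.path.join(d, f"file{random.randrange(n):05d}.txt")
             for _ in range(samples)]
    start = time.perf_counter()
    for name in names:
        os.stat(name)
    return time.perf_counter() - start

for n in (1000, 15000):
    print(f"{n:>6} entries: {time_lookups(build(n), n):.3f}s per 10,000 lookups")

If the 15,000-entry directory isn't meaningfully slower, the vendor layout costs you nothing.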
 
Depends a bit on how you access the files.

If you use Explorer to look, then a directory with 10,000 files will be a PITA to look through.

If the application handles the file management and just needs to grab files it knows the path/name of, then presuming you're using a decent file system (like NTFS), I don't think there would be much impact. At an internal level the directory is highly indexed (NTFS stores directory entries in a B-tree); it's just that the Explorer GUI has to pull a lot of info in order to display the files, hence the hit there.
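
A quick way to see that difference (the 15,000 empty files below just stand in for the client# level):

Code:
import os, tempfile, time

d = tempfile.mkdtemp()
for i in range(15000):
    open(os.path.join(d, f"client{i:05d}"), "w").close()

start = time.perf_counter()
os.stat(os.path.join(d, "client07777"))  # one known name: hits the index
direct = time.perf_counter() - start

start = time.perf_counter()
count = len(os.listdir(d))               # what Explorer does: read every entry
scan = time.perf_counter() - start

print(f"known-name lookup: {direct:.6f}s   enumerate {count} entries: {scan:.6f}s")

The known-name lookup stays cheap no matter how fat the directory gets; the enumeration is what scales with the entry count.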
 