How to disable the file cache

Jul 11, 2008 at 5:41 AM
Hi,
every time I do an operation with TortoiseSVN and SvnBridge (V2),
a lot of directories are created in the E:\Program Files\SVNBridge\MetaDataCache\@hashed folder.
On some operations the number of directories exceeds 50,000 (!), so deleting them via Explorer is no longer possible.

I have read about disabling the FileCache,
but could not find documentation on where to set the CacheEnabled parameter.
Simply in the config.txt file in the program directory?
Or in the environment variables?

Karl
Jul 11, 2008 at 7:53 AM
Hi Karl,

open the file "SvnBridge.exe.config" and change the setting "cacheEnabled" to "false".
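
For reference, a setting like this would go in the appSettings section of SvnBridge.exe.config. The exact key name and casing here are assumptions based on this thread, not verified against the SvnBridge source:

```xml
<configuration>
  <appSettings>
    <!-- assumed key name; disables the on-disk MetaDataCache under @hashed -->
    <add key="cacheEnabled" value="false" />
  </appSettings>
</configuration>
```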

André
Jul 11, 2008 at 9:11 AM


MagicAndre1981 wrote:
Hi Karl,

open the file "SvnBridge.exe.config" and change the setting "cacheEnabled" to "false".

André



Thanks, André, for the tip!
I downloaded the source and compiled it,
and now I have access to the SvnBridge.exe.config file.
Great!

Karl
Jul 31, 2008 at 1:59 PM
So I have to compile it myself to remove this cache?? Crap, I had almost 400,000 folders there and thought my machine was infected with some stupid virus generating files and folders to prevent me from *ever* doing a complete virus scan. I had to boot from a Vista rescue CD to clean up the mess because neither Explorer, Total Commander, nor the DOS prompt would let me delete the files and folders... I even got a BSOD while trying...
I see two options:
1) download source and remove the annoyance myself
2) don't use svnbridge at all
Aug 4, 2008 at 6:21 PM

skaue,

I'm not familiar with SvnBridge, but .exe.config files are read by the .NET CLR at runtime, so there is no need to recompile. Just copy app.config from the latest source code (such as here), rename it to SvnBridge.exe.config, and place it in the same directory as SvnBridge.exe.

Aug 5, 2008 at 7:49 AM
Thanks Tobiasly! That's great stuff... :-)
Aug 15, 2008 at 9:12 PM
I had over 550,000 files... it took half an hour to delete them...
I guess the MetaDataCache is the main reason why SvnBridge is SO slow...

@tobiasly: This doesn't work for me. After an "svn up" the folders are recreated.
Aug 17, 2008 at 12:31 PM
I have had the same problem with SvnBridge.  It created over 900,000 files on my machine when I checked out dasBlog from CodePlex.  I have a quad-core, 64-bit machine with 10,000 RPM Raptor drives, and I have been waiting for the last 4 hours for these files to be deleted.  The Vista delete box says I still have another 7 hours to go (sometimes it says days)!

I would suggest that this file caching be off by default in the next release, or that the caching be re-engineered.  The current behaviour means that doing anything with that folder is like launching a DoS attack on yourself.  It also makes virus scanning an issue.


Aug 18, 2008 at 9:24 AM
Edited Aug 18, 2008 at 9:27 AM
This caching wouldn't be an issue if it were implemented better. All versions of NT have problems when there are more than about 4,000 entries in a directory.

I have noticed that the cache entries have 128-bit hashes (32 hex chars), and these are broken into a 12-digit top-level directory, a 12-digit second-level directory, and an 8-digit filename.

In my case there are about 400,000 top level directories, each containing one subdirectory, that contains a single file.

If instead the hashes were broken down as something like 4:12:16, then the files would be distributed better between the different levels, and the cache would perform significantly better. This would also make virus scans less of an issue, as most of the time spent performing the scans at the moment is actually spent scanning the directory structure.

It might even be worthwhile adding an additional level to the hierarchy and making it something like 3:6:9:14.
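
To illustrate the splits being discussed, here is a small Python sketch of slicing a 32-hex-char hash into path components. The hash function and key format are just placeholders, not SvnBridge's actual implementation:

```python
import hashlib
import os

def cache_path(key: str, split=(12, 12, 8)) -> str:
    """Map a cache key to a nested path by slicing its 32-hex-char hash."""
    digest = hashlib.md5(key.encode("utf-8")).hexdigest()  # 128 bits = 32 hex chars
    parts, pos = [], 0
    for width in split:
        parts.append(digest[pos:pos + width])
        pos += width
    return os.path.join(*parts)

# Current layout: 12:12:8 allows up to 16**12 distinct top-level names,
# so nearly every entry lands in its own top-level directory.
current = cache_path("/svn/project/trunk/file.txt")

# A 4:12:16 split caps the top level at 16**4 = 65536 directories,
# spreading entries across far fewer (and fuller) directories.
flatter = cache_path("/svn/project/trunk/file.txt", split=(4, 12, 16))
```

The same 32 hex characters are consumed either way; only the directory fan-out changes.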
Aug 19, 2008 at 9:41 AM
@dt309: I strongly disagree! This would still result in the same number of files, which I am not going to accept. Did you have a look at the content of those files? They mainly contain version information about the serialized classes, not actual data (although I'm happy that they at least didn't use an XML serializer). If the data in this cache is really needed, which I strongly doubt for most of it, then the file system should not be abused as a big hash table; the hash table should be stored in one file! This would surely improve the performance of SvnBridge very much! The "cache" as it is now only makes things slower.
Aug 19, 2008 at 2:07 PM
@mkroll: While I agree that the number of files is a problem, it is not as big a problem as the sheer number of entries in the topmost directory of the cache. It is that which is most likely to cause the significant slowdown of your computer.

I have experienced issues in the past with large directories. A program that checks to see if a directory contains a certain file (or directory) before actually creating it will use hardly any CPU time when the directory is nearly empty. However, once the directory gets to about 4000 files (or subdirectories) the performance of the program (and the entire system) slows down tremendously. And that is exactly what is happening here.

If there were actually several files in each of the subdirectories, and several subdirectories in each of the toplevel directories, then the performance of the cache would be significantly improved, as there wouldn't be anywhere near as many directory entries to scan through at the toplevel.
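
A minimal sketch of the check-then-create pattern described above (illustrative only; SvnBridge's cache code will differ). The existence check is cheap in a small directory but degrades badly once a single directory holds thousands of entries:

```python
import os

def cache_store(cache_root: str, name: str, data: bytes) -> str:
    """Write a cache entry only if it does not already exist."""
    os.makedirs(cache_root, exist_ok=True)
    path = os.path.join(cache_root, name)
    # This existence check forces a directory lookup; with roughly
    # 4000+ entries in one directory, each lookup (and with it the
    # whole system) slows down noticeably on NT-era filesystems.
    if not os.path.exists(path):
        with open(path, "wb") as f:
            f.write(data)
    return path
```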
Aug 19, 2008 at 4:36 PM
OK, I understand your point. As a quick and dirty fix this would probably be a big help for both using SvnBridge as well as deleting it ;)
But I still just don't want to allow any program to create such an enormous number of files on my computer. It's also a question of hard disk space, as each file takes up at least 4 kB. Hmm... I have to correct myself. I didn't have over 550,000 files, but items, i.e. files and folders. Otherwise I would have had -1,200 MB left on my hard disk ;)
Aug 19, 2008 at 5:43 PM
I'm with mkroll on this one.  For me, creating that many items, no matter the structure, does not *feel* right.  And I think it would still cause perf issues for my virus scanner.  They may be lessened, but it would still slow things considerably.  Anyway, this may all be moot, as I'm sure we'll get server-side SVN support soon enough.
Aug 26, 2008 at 1:56 PM
If you don't want all the cache files, then may I suggest that you use the latest development version. Revision 22341 builds without any problems for me.

If you want to delete all the files in the cache, cd into the @hashed directory and then run: for /D %i in (*) do rmdir /s /q %i

Once you have cleaned up your SvnBridge cache and downloaded the development version, you should no longer have any of these issues, as the cache is no longer used.
Aug 28, 2008 at 9:30 AM
@dt309
Thanks for the info about the cache change and the working revision.  I will give it a go.

I had already managed to find my way to rmdir, thankfully!
Sep 10, 2008 at 1:14 AM
I wanted to check out 1 file; SvnBridge created over 170,000 directories before I killed it. I think this thing should be considered malware.
Sep 10, 2008 at 3:12 AM
@cartershanklin
 The source is freely available. Perhaps have a go at fixing the problem and submitting a patch?
Mar 5, 2009 at 10:40 AM
Edited Mar 5, 2009 at 10:54 AM
I stopped using SvnBridge about 6 months ago. But I used it for a few months.
I haven't been able to delete the infamous "@hashed" folder.

I ran del /s /f /q * for about 8 hours last night.
It finished, but there are still so many files in there that it takes longer than 30 minutes just to count them.
So I ran it again - a bit strange, but it ran for at least 1 hour and then I fell asleep.

After waking up, there are still so many files in there that it takes longer than 30 minutes to count them (hence I'm not going to count them).

I've just run rmdir /s
It's still running, only 32 minutes so far...
But I'm actually starting to wonder whether I'll ever be able to delete the @hashed directory.

Any more tips for those who've been "SvnBridged" would be useful.
Mar 5, 2009 at 12:12 PM
I got rid of them by using rmdir and letting it run for a while. I don't remember now if it ran for several hours or days :p

Run it with /s and /q and let it work that disk... ;-)