I want to scan a set of files, say 20,000 of them, and store their details in a SQLite database.
Actually doing this is straightforward enough, but I am wondering what the most efficient way of doing it is.
I have a class to model the files, which has about 20 attributes.
Having used ADO with VB a few years ago and having read up on ADO.NET, it looks like parameterized queries are still the way to go. The question, though, is whether I should process the files individually or store them in some sort of collection and then process the whole lot in one go.
I'm just wondering what the hit on memory usage and speed would be from storing this many objects in a collection.
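For reference, this is roughly the shape of the insert loop I have in mind. It's only a sketch: it assumes the System.Data.SQLite provider, a Files table, and a cut-down FileRecord class standing in for my real 20-attribute class, with the scan itself omitted.

    using System;
    using System.Collections.Generic;
    using System.Data;
    using System.Data.SQLite;

    class FileRecord
    {
        // Hypothetical subset of the ~20 attributes on the real class
        public string Path { get; set; }
        public long Size { get; set; }
        public DateTime Modified { get; set; }
    }

    class Program
    {
        static void Main()
        {
            var files = new List<FileRecord>(); // populated by the scan (omitted)

            using (var conn = new SQLiteConnection("Data Source=files.db"))
            {
                conn.Open();

                // One transaction around all the inserts; otherwise SQLite
                // commits (and syncs to disk) after every single row.
                using (var tx = conn.BeginTransaction())
                using (var cmd = conn.CreateCommand())
                {
                    cmd.Transaction = tx;
                    cmd.CommandText =
                        "INSERT INTO Files (Path, Size, Modified) " +
                        "VALUES (@path, @size, @modified)";

                    // Create the parameters once and reuse them for every row.
                    cmd.Parameters.Add("@path", DbType.String);
                    cmd.Parameters.Add("@size", DbType.Int64);
                    cmd.Parameters.Add("@modified", DbType.DateTime);

                    foreach (var f in files)
                    {
                        cmd.Parameters["@path"].Value = f.Path;
                        cmd.Parameters["@size"].Value = f.Size;
                        cmd.Parameters["@modified"].Value = f.Modified;
                        cmd.ExecuteNonQuery();
                    }

                    tx.Commit();
                }
            }
        }
    }

The loop itself would look the same whether I call it once per file as I scan, or build up the full collection first and run it at the end, which is really what I'm asking about.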