efficient database code

jameson_uk

Retired Team Member
  • Premium Supporter
  • January 27, 2005
    7,258
    2,528
    Birmingham
Home Country
    United Kingdom
    I want to scan a set of files, say 20,000, and store them in a SQLite database.
    Actually doing this is straightforward enough, but I am wondering what the most efficient way of doing it is.
    I have a class to model the files which has about 20 attributes.
    Having used ADO with VB a few years ago and read up on ADO.NET, it looks like parameterized queries are still the way to go. The question, though, is whether I process the files individually or store them in some sort of collection and then process the whole lot in one go.

    Just wondering what the hit on memory usage / speed would be from storing this many objects in a collection.
     

    bpell

    New Member
    August 4, 2010
    3
    0
    Home Country
    United States of America
    This is a little late, but maybe it will help for posterity. A lot of this depends on memory and machine speed. It's going to be more efficient to read the information in and insert it into (or update) the database within that same loop, using the parameterized query, and skip the collection of objects. If you fill a collection or generic list, you'll have to iterate through the files and then iterate through that collection to update the DB (all the while, the entire collection sits in memory). If you have plenty of memory and a fast machine, it's not an issue. I'm running MP on a 1.2 GHz machine with 512 MB of RAM. :) It works well, but I prefer the smallest footprint. I have a catalog of over 43,000 files, and when I've stored the metadata in an Access database I've skipped the object collection and written the data straight to the table.
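    For what it's worth, here's a rough sketch of that streaming approach using System.Data.SQLite: prepare the parameterized INSERT once, reuse it for every file, and wrap the whole loop in a single transaction so SQLite doesn't sync to disk on every row. The table name, columns, connection string and scan path below are just placeholders for illustration; swap in your own ~20 attributes.

        using System;
        using System.Data;
        using System.Data.SQLite;
        using System.IO;

        class FileScanner
        {
            static void Main()
            {
                using (var conn = new SQLiteConnection("Data Source=files.db"))
                {
                    conn.Open();

                    // Hypothetical schema -- replace with your real file attributes.
                    using (var create = new SQLiteCommand(
                        "CREATE TABLE IF NOT EXISTS files (path TEXT, size INTEGER, modified TEXT)",
                        conn))
                    {
                        create.ExecuteNonQuery();
                    }

                    using (var tx = conn.BeginTransaction())
                    using (var cmd = new SQLiteCommand(
                        "INSERT INTO files (path, size, modified) VALUES (@path, @size, @modified)",
                        conn, tx))
                    {
                        // Declare the parameters once; only their values change per file.
                        cmd.Parameters.Add("@path", DbType.String);
                        cmd.Parameters.Add("@size", DbType.Int64);
                        cmd.Parameters.Add("@modified", DbType.DateTime);

                        // Stream each file straight into the DB -- no intermediate
                        // collection, so memory use stays flat however many files you scan.
                        foreach (var path in Directory.GetFiles(@"C:\scan", "*.*", SearchOption.AllDirectories))
                        {
                            var info = new FileInfo(path);
                            cmd.Parameters["@path"].Value = info.FullName;
                            cmd.Parameters["@size"].Value = info.Length;
                            cmd.Parameters["@modified"].Value = info.LastWriteTime;
                            cmd.ExecuteNonQuery();
                        }

                        // One commit for the whole batch: far faster than the implicit
                        // transaction (and fsync) SQLite would do for each insert.
                        tx.Commit();
                    }
                }
            }
        }

    It's only a sketch, but the shape is the important part: a prepared command plus a single transaction is the usual recipe for bulk SQLite inserts from .NET.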
     
