[wplug] Large Database

Drew from Zhrodague drew at zhrodague.net
Fri Mar 6 14:19:01 EST 2009


> Each record would include a datestamp field and at least one value
> field.  More than likely I would have 20 or more value fields, all
> with a common datestamp.

 	Use a binary format! Compress the records in chunks, maybe hourly.
Make sure you don't drop any records when rolling over to the next file.
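
 	Something like this is what I mean. A rough Python sketch, not
production code; the record layout (one 8-byte timestamp plus 20 float
value fields) is just lifted from the question above, so adjust the
format string to match your real fields:

#!/usr/bin/env python
# Rough sketch of a fixed-size binary record format. Assumes one
# timestamp (seconds since the epoch, packed as a double) plus 20
# float value fields per record.
import struct
import time

RECORD_FMT = "<d20f"                        # little-endian: 1 double + 20 floats
RECORD_SIZE = struct.calcsize(RECORD_FMT)   # 88 bytes per record

def pack_record(timestamp, values):
    """Pack one sample (timestamp + 20 values) into fixed-size bytes."""
    return struct.pack(RECORD_FMT, timestamp, *values)

def unpack_record(raw):
    """Inverse of pack_record; returns (timestamp, [values])."""
    fields = struct.unpack(RECORD_FMT, raw)
    return fields[0], list(fields[1:])

if __name__ == "__main__":
    rec = pack_record(time.time(), [0.0] * 20)
    print(len(rec), "bytes per record")     # fixed size, so seeking is trivial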


> The system would sample data from process devices every 10ms, 100ms,
> 500ms, 1s or 5s, depending on the source of the data.  Sampling at
> these rates for a year or more yields millions of records.

 	Is BerkeleyDB fast enough for this? The stuff I was doing used an
old Lotus 1-2-3 format, and it was pretty fast (shapefiles).
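
 	If you want to sanity-check the database route before committing
to it, a quick timing loop will tell you. Here's a minimal sketch using
Python's stdlib dbm module as a stand-in (the actual Berkeley DB bindings
expose a similar bytes-in/bytes-out put/get interface); the file name,
record count, and payload layout are all made up:

# Minimal write-throughput sanity check. A sketch, not a benchmark.
import dbm
import struct
import time

N = 100000                              # arbitrary number of test records
payload = struct.pack("<20f", *([0.0] * 20))

db = dbm.open("samples_test", "n")      # "n": always create a fresh db
start = time.time()
for i in range(N):
    # Key on the packed timestamp. Note a hash-backed dbm will not keep
    # keys sorted; a Berkeley DB btree would.
    key = struct.pack("<d", start + i * 0.01)   # pretend 10ms sampling
    db[key] = payload
db.close()

elapsed = time.time() - start
print("%d records in %.2fs (%.0f rec/s)" % (N, elapsed, N / elapsed))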

 	There should be a simple method of streaming these records to
disk. A company I worked for did stuff like this, and it seemed best
to use an efficient binary format. Compress the samples in chunks to save
disk space.
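
 	Along these lines, say. A sketch of a writer that gzips each
hour's worth of samples into its own file and rolls to the next file
without losing the record that triggered the roll; the class name and
file-naming scheme are just for illustration:

# Sketch of an hourly-chunked, compressed record stream. Assumes the
# same fixed-size record layout as the earlier sketch.
import gzip
import struct
import time

RECORD_FMT = "<d20f"    # 1 double timestamp + 20 float values

class ChunkedWriter:
    """Writes packed records into one gzip file per hour."""

    def __init__(self, prefix):
        self.prefix = prefix
        self.current_hour = None
        self.fh = None

    def write(self, timestamp, values):
        hour = int(timestamp // 3600)
        if hour != self.current_hour:
            # Roll to the next file. The record that triggered the roll
            # is still written below, so nothing gets dropped.
            if self.fh:
                self.fh.close()
            self.current_hour = hour
            self.fh = gzip.open("%s-%d.gz" % (self.prefix, hour), "wb")
        self.fh.write(struct.pack(RECORD_FMT, timestamp, *values))

    def close(self):
        if self.fh:
            self.fh.close()

if __name__ == "__main__":
    w = ChunkedWriter("samples")
    now = time.time()
    for i in range(10):
        w.write(now + i * 0.01, [0.0] * 20)   # pretend 10ms samples
    w.close()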



-- 

Drew from Zhrodague		http://www.WiFiMaps.com
drew at zhrodague.net		http://www.pghwireless.net
 				http://dorkbot.org/dorkbotpgh


