[wplug] Large Database

terry mcintyre terrymcintyre at yahoo.com
Fri Mar 6 14:22:04 EST 2009


> From: DK <wplug at curlynoodle.com>
> 
> Each record would include a datestamp field and at least one value
> field.  More than likely I would have 20 or more value fields, all
> with a common datestamp.
> 
> The system would sample data from process devices every 10ms, 100ms,
> 500ms, 1s or 5s, depending on the source of the data.  Sampling at
> these rates for a year or more yields millions of records.

A 10ms sample period => 100 samples per second. Multiply by roughly 31.5 million seconds per year, and you have about 3.15 billion samples for one year, multiplied again by the number of process devices. That's a lot of data, and a lot of IOPS.
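
For the curious, here's a quick back-of-envelope in Python covering each of the sample rates you mentioned. The one-year window and the rates are the only inputs; everything else is just illustration.

    # Samples per device per year at each of the sample periods mentioned.
    SECONDS_PER_YEAR = 365 * 24 * 60 * 60  # about 31.5 million

    for period_s in (0.010, 0.100, 0.500, 1.0, 5.0):
        samples = SECONDS_PER_YEAR / period_s
        print(f"{period_s * 1000:>6.0f} ms period -> {samples:>14,.0f} samples/year per device")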

Do you need _every_ data point, or would some form of data reduction be appropriate?
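
If reduction is acceptable, one common approach is to keep full resolution only for a recent window and roll older data up into coarser buckets. Here's a minimal Python sketch of bucket averaging, assuming records are plain (timestamp, value) pairs; the downsample name and interface are mine, not anything from your system.

    from collections import defaultdict

    def downsample(records, bucket_seconds=60):
        """Average (unix_timestamp, value) samples into fixed-width buckets.

        Illustrative only -- the record layout and function name are
        assumptions, not part of the original poster's system.
        """
        sums = defaultdict(lambda: [0.0, 0])
        for ts, value in records:
            bucket = int(ts // bucket_seconds) * bucket_seconds
            entry = sums[bucket]
            entry[0] += value
            entry[1] += 1
        return [(b, total / count) for b, (total, count) in sorted(sums.items())]

    # 60 seconds of 10 ms samples collapse to a single averaged point
    raw = [(t / 100.0, float(t % 7)) for t in range(6000)]
    print(len(raw), "->", len(downsample(raw, 60)))   # 6000 -> 1

Storing min and max alongside the average in each bucket preserves spikes that plain averaging would hide, at the cost of a couple of extra fields per rolled-up record.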


      

