Is a Real-Time Database Possible?


Thread Starter

Girish Patel

Hi, there is an application for process automation in which data is logged to a relational SQL database. The data logging frequency is high, up to almost 1000 records/second, and the DBMS server is not effective at this transfer rate. Does anyone have any idea how to do bulk data transfer in an easy way? Thanks, Girish
 

Greg & Roxanne

> There is one application for process automation. In this data is logged in the Relational Database SQL. The Data Logging frequency is high. Almost up to 1000 records/second. And DBMS server is not effective for fast data transfer. Have any one any idea regarding Bulk data transfer in easy way??

From what you say, I believe that a relational database is not the right tool for this task. You need something faster, leaner, with less overhead. Perhaps a b-tree, or some other form of indexed flat file of records.

However, you don't provide enough information to help suggest a solution:

- what software you're using, or what platform it's running on
- what sort of program is providing the data to the database
- what sort of data is being stored, and what "relations" it implements that justify or require a relational database
- whether you have the right or the ability to modify or replace that program
- what other programs need to access the data in the archive, and what constraints they place on the type of archive or the means to access it
- why you're using an RDB or SQL in the first place, and whether you have the option to change it

It is also not clear whether you can keep your existing system and solve your problem by reducing your data flow. Are your logged records subject to any sort of deadbanding or statistical compression? That is, must you save every record, or just values which changed from the last stored sample, or only enough values necessary to recreate the curve described by each discrete point?

Regards,
Greg Goodman
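Greg's deadbanding question can be made concrete with a small sketch. This is a minimal, hypothetical illustration (the struct, the function name, and the 0.5 threshold are invented for the example, and it tracks a single channel only), not any particular product's API:

```cpp
#include <cmath>
#include <vector>

// One logged sample: channel number, timestamp, value
// (the simple record structure described in this thread).
struct Sample {
    int    channel;
    long   timestamp;
    double value;
};

// Keep a sample only if its value has moved more than `deadband`
// away from the last value actually stored; everything else is
// dropped, since the stored points still recreate the curve.
std::vector<Sample> deadbandFilter(const std::vector<Sample>& in,
                                   double deadband) {
    std::vector<Sample> out;
    double lastStored = 0.0;
    bool   haveLast   = false;
    for (const Sample& s : in) {
        if (!haveLast || std::fabs(s.value - lastStored) > deadband) {
            out.push_back(s);
            lastStored = s.value;
            haveLast   = true;
        }
    }
    return out;
}
```

At 1000 samples/second, even a modest deadband on slowly-moving analog values can cut the stored volume dramatically, since only samples that actually move get written.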
 
What is a relational database? In clear words, what does it do in reality? 1000 records/s does not sound realistic in process control, unless you are monitoring only critical pressure points relevant to sudden and unexpected surges related to safety, or hydro lines. More detail will help. Analog, discrete? Good luck. Do not hesitate. [email protected]
 
I am not exactly sure of your application, but logging process data with an RDBMS is a common application. I suggest selection of intelligent snapshot periods and deadbanding (as Greg suggests).

You will want to check out http://www.tpc.org/ for the performance of various combinations of database software and hardware. These systems are designed to handle a large amount of loading. If you want a lightweight, cheap (as in free) database, you might try MySQL. It is feature-limited, but very quick.

The size of the database can get very large, so you will want a method of archiving off old data. We keep minute snapshots of one customer's data for process modeling. The database now contains over 4 years' worth of data!

Dustin Beebe, P.E.
[email protected]
ProSys, Inc.
Process Systems Consultants
http://www.prosysinc.com
 

Dittrich, Rodolfo

Hi, I have had the opportunity to work with InSQL, the real-time database from Wonderware (FactorySuite 2000), and it was really very fast at transferring data. I am not sure whether it reaches 1000 records/second, but you can ask a dealer. I had a very good experience with it. Regards
 

Ralph Mackiewicz

Your performance bottleneck using an RDBMS is probably because you are not using the right product. It sounds like you need a solution designed to handle large amounts of real-time data. Try the PI software product from OSI Software: http://www.osisoft.com

Best Regards,
Ralph Mackiewicz
SISCO, Inc.
 

Girish Patel

Hello sir, very many thanks for your reply regarding my topic. Actually, I am developing a program to log data coming from the field into the database. The program reads the data from shared memory and logs it into the database continuously every second. Reading speed is no problem; reading is very fast.

- I am using the Windows NT operating system and Microsoft Visual C++ 6.0 as the development tool.
- SQL Server 7.0 as the database. The objective of my application is logging data at a rate of 1000 to 2000 records per second.
- I am using ADO for the database interface.
- But I am adding every record manually with the "AddNew" method to add records to the recordset.
- Currently 200-300 records are logged every second, but it is not efficient enough to log more than that.
- Record structure: nothing complex to log; just a simple record of Channel No., Time Stamp, and Value is stored.
- Sorry about "RDBMS": there is actually no relation between this table and any other table.
- Other report-generating programs and an offline trend use this data.
- Why SQL? I want to make the data available through ODBC connectivity to other programs, so I am using SQL.
- Management disagrees with my suggestion to reduce the data flow.

If possible, please provide more guidance. Many thanks for the response,
Girish

> From what you say, I believe that a relational database is not the right tool for this task. You need something faster, leaner, with less overhead. Perhaps a b-tree, or some other form of indexed flat file of records.
>
> However, you don't provide enough information to help suggest a solution:
>
> - what software you're using, or what platform it's running on
> - what sort of program is providing the data to the database
> - what sort of data is being stored, and what "relations" it implements that justify or require a relational database
> - whether you have the right or the ability to modify or replace that program
> - what other programs need to access the data in the archive, and what constraints they place on the type of archive or the means to access it
> - why you're using an RDB or SQL in the first place, and whether you have the option to change it
>
> It is also not clear whether you can keep your existing system and solve your problem by reducing your data flow. Are your logged records subject to any sort of deadbanding or statistical compression? That is, must you save every record, or just values which changed from the last stored sample, or only enough values necessary to recreate the curve described by each discrete point?
>
> Regards,
> Greg Goodman
 

Girish Patel

Hello sir, very many thanks for your reply regarding my topic. Sir, actually I am developing a program to log data coming from the field into the database. The program reads the data from shared memory and logs it into the database continuously every second. Reading speed is no problem; reading is very fast.

- I am using the Windows NT operating system and Microsoft Visual C++ 6.0 as the development tool.
- SQL Server 7.0 as the database. The objective of my application is logging data at a rate of 1000 to 2000 records per second.
- I am using ADO for the database interface.
- But I am adding every record manually with the "AddNew" method to add records to the recordset.
- Currently 200-300 records are logged every second, but it is not efficient enough to log more than that.
- Nothing complex to log; just a simple record of Channel No., Time Stamp, and Value is stored.
- Sorry about "RDBMS": there is no relation between this table and any other table.
- How can I speed up the data logging?

If possible, please provide more guidance. Many thanks for the response,
Girish
 

Girish Patel

Hello sir, very many thanks for your reply regarding my topic. Sir, actually I am developing a program to log data coming from the field into the database. The program reads the data from shared memory and logs it into the database continuously every second. Reading speed is no problem; reading is very fast.

- I am using the Windows NT operating system and Microsoft Visual C++ 6.0 as the development tool.
- SQL Server 7.0 as the database. The objective of my application is logging data at a rate of 1000 to 2000 records per second.
- I am using ADO for the database interface.
- But I am adding every record manually with the "AddNew" method to add records to the recordset.
- Currently 200-300 records are logged every second, but it is not efficient enough to log more than that.
- What to log? Nothing complex; just a simple record of Channel No., Time Stamp, and Value is stored.
- Sorry about "RDBMS": there is no relation between this table and any other table.
- I cannot reduce the data flow for logging.
- So is there any efficient way to log this data?

If possible, please provide more guidance. Thanks again for the response,
Girish
 
Hello sir, very many thanks for your response. Actually I am developing a program to log the data coming from shared memory into the database every second. Reading the data is very fast.

- I am using the Windows NT operating system and Microsoft Visual C++ 6.0 as the development tool.
- SQL Server 7.0 as the database. The objective of my application is logging data at a rate of 1000 to 2000 records per second.
- I am using ADO for the database interface.
- But I am adding every record manually with the "AddNew" method to add records to the recordset.
- Currently 200-300 records are logged every second, but it is not efficient enough to log more than that.
- Nothing complex to log; just a simple record of Channel No., Time Stamp, and Value is stored.
- No relation between this table and any other table.

Can I use any method in ADO to log data in bulk to boost performance? If possible, please provide more guidance. Thanks again for the response,
Girish

> I have had the opportunity to work with InSQL, the real-time database from Wonderware (FactorySuite 2000), and it was really very fast at transferring data. I am not sure whether it reaches 1000 records/second, but you can ask a dealer. I had a very good experience with it.
 
Hello sir, very many thanks for your response regarding my topic. Actually I am developing a program to log data coming from the field into the database. The program reads the data from shared memory and logs it into the database continuously every second. Reading speed is no problem; reading is very fast.

- I am using the Windows NT operating system and Microsoft Visual C++ 6.0 as the development tool.
- SQL Server 7.0 as the database. The objective of my application is logging data at a rate of 1000 to 2000 records per second.
- I am using ADO for the database interface.
- But I am adding every record manually with the "AddNew" method to add records to the recordset.
- Currently 200-300 records are logged every second, but it is not efficient enough to log more than that.
- Nothing complex to log; just a simple record of Channel No., Time Stamp, and Value is stored.
- No relation between this table and any other table.

**** Programming Related ****
Can I add data to the recordset in bulk, with any method other than "AddNew"? If possible, please provide more guidance. Many thanks for the response,
Girish
 
Hi Girish, you can try Industrial SQL Server from Wonderware.

Regards,
C. Murali
Manager (Technical), Real Time Group
Ramco Systems
3, Sardar Patel Road, Chennai-600 113, INDIA
Phone: 091-44-2354510 Extn: 4219, Fax: 091-44-2355405, Home: 091-44-2363132
Web: www.ramco.com, e-mail: [email protected]
 

Surendra Mehta

A few suggestions: you might want to remove duplicate values and record only on change, if that is within your control. Probably a more suitable product for such large data storage would be Uniformance from Honeywell Hi-Spec Solutions (www.hispec.com). It's an excellent product with lots of features for handling such large data volumes. It will also allow you to store data into an RDBMS, analyse, and trend, and further opportunities are limitless. If you provide more info on your requirement, I am sure the list/myself will give you more accurate info. Surendra Mehta [email protected]
 
Girish, If SQL Server isn't a requirement, I would use the Pervasive SQL 2000 Software Developer Kit and use the Pervasive database engine. Instead of populating the database using ODBC, you can populate it using a transactional method that is extremely fast. When you want to fetch the data, you have ODBC access to it. It is much cheaper than SQL Server, much more robust in my opinion, and is based on the "ancient" Btrieve database technology which has been used for years. Complete details are at http://www.pervasive.com. Sincerely, Mark Wells President Runfactory Systems Inc. http://www.runfactory.com 1235 Bay Street, Suite 400 Toronto, Ontario, Canada M5R 3K4 Cell Ph. 416-984-0774 Fax. 416-352-5206
 

Richard Dewees

Girish, one way to improve your performance: instead of opening your ADO connection, then opening a recordset and updating it with .AddNew, create a stored procedure containing an INSERT statement, and call that stored procedure through your ADO connection, passing your data as parameter values for the stored procedure.

Rick Dewees
Computer Guy
Ocean Kayak
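Rick's approach might look roughly like this on the client side. The stored procedure and table names here are hypothetical, and the real call would go through an ADO Connection's Execute(); the sketch only shows building the command text:

```cpp
#include <sstream>
#include <string>

// Hypothetical stored procedure on the SQL Server side:
//   CREATE PROCEDURE sp_LogSample @chan int, @ts int, @val float AS
//       INSERT INTO SampleLog (ChannelNo, TimeStamp, Value)
//       VALUES (@chan, @ts, @val)

// Build the command text an ADO Connection would Execute().
// All fields are numeric here, so no string quoting is needed.
std::string buildExec(int channel, long timestamp, double value) {
    std::ostringstream cmd;
    cmd << "EXEC sp_LogSample @chan=" << channel
        << ", @ts=" << timestamp
        << ", @val=" << value;
    return cmd.str();
}
```

This avoids the client-side cursor bookkeeping that AddNew drags along, since each record becomes one direct statement executed on the server.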
 

Dharani Singh Raja

From the description of your application, it looks like SQL Server (for that matter, any database) is overkill. If you're only interested in logging Channel No., Timestamp, and Value, why can't you store them in a simple text file? If you're pressed to store them in a database for future analysis using the features of SQL, I'm sure there are plenty of ways in every available database to load textual information from text files. So, I would suggest you store the data into text files while online (in real time), and at a later stage port the data to the database. Raja D. Singh
 

Ralph Mackiewicz

I'm afraid I can't help you with the programming; several other people much smarter than me have posted suggestions for you. My point was that the fact that you are having a hard time writing this data is not surprising: RDBMSs were not designed to log real-time data like this. And even though you don't *currently* have a problem reading the records, wait until you've logged 4 weeks' worth of data: you'll have 4,838,400,000 records (at 2000/second). When you try to find the data from 11:00 through 11:30 on the 15th day of that run, your RDBMS will probably take an hour to return the query, and won't be able to log any new data while it processes the query. Both problems (writing data and accessing data) come from not using the right tool. Logging massive amounts of high-speed data requires a specialized solution, not a general-purpose RDBMS. If you haven't done so, you should at least check out http://www.osisoft.com before you spend the next months trying to make an RDBMS do something it was not designed to do. You may be able to achieve a better write rate than your current limit, but you won't even approach what is possible with a specialized solution like PI from OSI Software.

Regards,
Ralph Mackiewicz
SISCO, Inc.
 
> **** Programming Related ****
> Can I add data to the recordset in bulk, with any method other than "AddNew"?

How about executing the SQL query "INSERT INTO ..." using pass-through SQL execution? That should be faster than calling the AddNew method. The fastest way that people suggest is executing a server-side SQL procedure. I don't know whether it is possible to insert/add records using a server-side SQL procedure or not.

Regards,
usd 2500 engineer
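The pass-through idea can be pushed a bit further by concatenating many INSERT statements and sending the whole batch in a single Execute() call, so the round-trip overhead is paid once per batch rather than once per record. A minimal sketch (the table and column names are invented for the example):

```cpp
#include <sstream>
#include <string>
#include <vector>

// The simple record described in the thread.
struct Sample {
    int    channel;
    long   timestamp;
    double value;
};

// Concatenate one INSERT per sample into a single batch string;
// the whole batch then goes to the server in one round trip.
std::string buildInsertBatch(const std::vector<Sample>& samples) {
    std::ostringstream sql;
    for (const Sample& s : samples) {
        sql << "INSERT INTO SampleLog (ChannelNo, TimeStamp, Value) VALUES ("
            << s.channel << ", " << s.timestamp << ", " << s.value << ");\n";
    }
    return sql.str();
}
```

Wrapping the batch in a single transaction on the server should amortize the transaction-log flushes as well, though the right batch size is something to measure rather than guess.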
 

Crucius, Wesley

Also, when doing an INSERT there is a way to turn off the default behavior of returning a count of rows affected (SET NOCOUNT ON in SQL Server). Since this count is probably of no value to you, turn it off and decrease the workload on the server and the comms connection.
 

Nathan Sykes

We are using InSQL (from Wonderware). It is touted to run at better than 1000 records/sec. It also does some data compression (actually filtering), in that it can be set to record only when the data changes by more than a given deadband; by default it records on a given time interval. The way it accomplishes this is by running a front end that packs multiple records together before sending them to MS SQL. It has different record structures for analog, discrete, and string data.

This could be one solution for you: pack a thousand records together and send them to the RDBMS only once a second. This is easy if all your data is collected at the same frequency. If you have various time intervals, I'd look at InSQL, at $5000-15000. I agree with one of the other comments: you're going to run out of space fast at that rate.
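The pack-and-send scheme Nathan describes can be sketched as a small buffering layer: accumulate records in memory and flush the whole batch once per interval. This is a toy simulation with a caller-supplied logical clock and an invented flush callback; in a real logger the callback would perform the actual database write:

```cpp
#include <functional>
#include <utility>
#include <vector>

// The simple record described in the thread.
struct Sample {
    int    channel;
    long   timestamp;
    double value;
};

// Accumulate samples in memory; once `intervalMs` of logical time
// has elapsed, hand the whole batch to `flush` in one call instead
// of paying one database round trip per record.
class BatchLogger {
public:
    BatchLogger(long intervalMs,
                std::function<void(const std::vector<Sample>&)> flush)
        : intervalMs_(intervalMs), flush_(std::move(flush)) {}

    // `nowMs` is a logical clock supplied by the caller.
    void log(const Sample& s, long nowMs) {
        buffer_.push_back(s);
        if (nowMs - lastFlushMs_ >= intervalMs_) {
            flush_(buffer_);      // one bulk write for the whole batch
            buffer_.clear();
            lastFlushMs_ = nowMs;
        }
    }

private:
    long intervalMs_;
    long lastFlushMs_ = 0;
    std::vector<Sample> buffer_;
    std::function<void(const std::vector<Sample>&)> flush_;
};
```

With a one-second interval, 1000-2000 records arrive per flush, which matches the "pack a thousand records together and send once a second" suggestion above.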
 