Industrial process data validation and reconciliation, or more briefly, data validation and reconciliation (DVR), is a technology that uses process information and mathematical methods to automatically correct measurements in industrial processes. DVR extracts accurate and reliable information about the state of industrial processes from raw measurement data and produces a single consistent set of data representing the most likely process operation.
Necessity of removing measurement errors
ISA-95 is the international standard for the integration of enterprise and control systems. It asserts that:
Data reconciliation is a serious issue for enterprise-control integration. The data have to be valid to be useful for the enterprise system. The data must often be determined from physical measurements that have associated error factors. These must usually be converted into exact values for the enterprise system. This conversion may require manual or intelligent reconciliation of the converted values […]. Systems must be set up to ensure that accurate data are sent to production and from production. Inadvertent operator or clerical errors may result in too much production, too little production, the wrong production, incorrect inventory, or missing inventory.
DVR has become increasingly important as industrial processes have grown more complex. DVR started in the early 1960s with applications aimed at closing material balances in production processes where raw measurements were available for all variables. At the same time, the problem of gross error identification and elimination was first presented. In the late 1960s and 1970s, unmeasured variables were taken into account in the data reconciliation process, and DVR matured further by considering general nonlinear equation systems coming from thermodynamic models. Quasi-steady-state dynamics for filtering and simultaneous parameter estimation over time were introduced in 1977 by Stanley and Mah. Dynamic DVR was formulated as a nonlinear optimization problem by Liebman et al. in 1992.
Data validation denotes all validation and verification actions before and after the reconciliation step.
Data filtering denotes the process of treating measured data so that the values become meaningful and lie within the range of expected values. Data filtering is necessary before the reconciliation process in order to increase the robustness of the reconciliation step. There are several data filtering methods, for example taking the average of several measured values over a well-defined time period.
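As a sketch, range-based filtering combined with averaging over a time window might look like the following (the function name, the expected range, and the example readings are illustrative, not part of any standard DVR library):

```python
import statistics

def filter_measurements(raw, expected_range):
    """Discard values outside the physically plausible range, then
    average the remaining measurements of the time window.

    raw            -- list of repeated measurements over a time window
    expected_range -- (low, high) tuple of expected values
    """
    low, high = expected_range
    in_range = [v for v in raw if low <= v <= high]
    if not in_range:
        raise ValueError("no measurement within the expected range")
    return statistics.mean(in_range)

# A flow meter sampled five times; 999.0 is an obvious outlier.
readings = [10.2, 9.8, 10.1, 999.0, 9.9]
print(filter_measurements(readings, (0.0, 50.0)))  # averages the four in-range readings
```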
Gross error remediation
Gross errors are systematic measurement errors that may bias the reconciliation results. Therefore, it is important to identify and eliminate these gross errors from the reconciliation process. After the reconciliation, statistical tests can be applied that indicate whether or not a gross error exists somewhere in the set of measurements. These techniques of gross error remediation are based on two concepts:
- gross error elimination
- gross error relaxation.
Gross error elimination determines one measurement that is biased by a systematic error and discards this measurement from the data set. The measurement to be discarded is determined using different kinds of penalty terms that express how much the measured values deviate from the reconciled values. Once the gross errors are detected, they are discarded from the measurements and the reconciliation is done without these faulty measurements, which would otherwise spoil the reconciliation process. If needed, the elimination is repeated until no gross error remains in the set of measurements.
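For a single mass balance around one node, weighted least-squares reconciliation has a closed form, and a global test on the raw imbalance can flag a gross error. The sketch below assumes one linear constraint and independent measurement errors; the function and variable names are illustrative:

```python
import math

def reconcile_node(y, sigma):
    """Weighted least-squares reconciliation of one mass balance
    y[0] = y[1] + y[2], i.e. constraint A·x = 0 with A = [1, -1, -1].

    Returns (x, t): the reconciled values and the global test
    statistic t = |A·y| / sqrt(A V Aᵀ); t > 1.96 flags a gross error
    at the 95% confidence level.
    """
    a = [1.0, -1.0, -1.0]
    v = [s * s for s in sigma]                      # measurement variances
    r = sum(ai * yi for ai, yi in zip(a, y))        # raw imbalance A·y
    s2 = sum(ai * ai * vi for ai, vi in zip(a, v))  # scalar A V Aᵀ
    x = [yi - vi * ai * r / s2 for yi, vi, ai in zip(y, v, a)]
    return x, abs(r) / math.sqrt(s2)

# A biased inlet reading produces a large imbalance and trips the test.
x, t = reconcile_node([120.0, 50.0, 48.0], [2.0, 2.0, 2.0])
print(t)  # imbalance 22 over sqrt(12), about 6.35 -> gross error
```

The reconciled values always satisfy the balance exactly; the test statistic tells us whether the correction needed to get there is too large to be explained by random measurement noise alone.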
Gross error relaxation relaxes the estimate of the uncertainty of suspicious measurements so that the reconciled value lies within the 95% confidence interval. Relaxation typically finds application when it is not possible to determine which measurement around one unit is responsible for the gross error (equivalence of gross errors). The measurement uncertainties of the measurements involved are then increased.
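Relaxation can be sketched as inflating the standard deviations of the suspect measurements until the global test passes; the threshold 1.96 corresponds to the 95% confidence level, while the function names and the inflation factor are illustrative assumptions:

```python
import math

def global_test(y, sigma, a=(1.0, -1.0, -1.0)):
    """Global test statistic |A·y| / sqrt(A V Aᵀ) for one balance A·x = 0."""
    r = sum(ai * yi for ai, yi in zip(a, y))
    s2 = sum(ai * ai * s * s for ai, s in zip(a, sigma))
    return abs(r) / math.sqrt(s2)

def relax(y, sigma, suspects, factor=1.5, limit=1.96):
    """Inflate the uncertainties of the suspect measurements until
    the global test statistic drops below the 95% threshold."""
    sigma = list(sigma)
    while global_test(y, sigma) > limit:
        for i in suspects:
            sigma[i] *= factor
    return sigma

# Two equivalent candidates around the node: relax both uncertainties,
# leaving the third measurement's uncertainty untouched.
sigma = relax([120.0, 50.0, 48.0], [2.0, 2.0, 2.0], suspects=[0, 1])
```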
It is important to note that the remediation of gross errors reduces the quality of the reconciliation: either the redundancy decreases (elimination) or the uncertainty of the measured data increases (relaxation). Therefore, it can only be applied when the initial level of redundancy is high enough to ensure that the data reconciliation can still be done (see Section 2).
Advanced DVR solutions offer an integration of the techniques mentioned above:
1. data acquisition from a data historian, database or manual inputs
2. data validation and filtering of raw measurements
3. data reconciliation of filtered measurements
4. result verification
5. range check
6. gross error remediation (and return to step 3)
7. result storage (raw measurements together with reconciled values)
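The loop structure of such a procedure can be sketched as follows; the step callables are placeholders for whatever filtering, reconciliation and remediation methods a given solution implements:

```python
def run_dvr(raw, filter_step, reconcile_step, check_range, remove_gross, max_passes=5):
    """Sketch of the advanced DVR loop: filter the raw measurements once,
    then reconcile; if result verification fails, remediate a gross error
    and reconcile again (back to step 3)."""
    data = filter_step(raw)                  # steps 1-2: acquire, validate/filter
    for _ in range(max_passes):
        result = reconcile_step(data)        # step 3: data reconciliation
        if check_range(result):              # steps 4-5: verification, range check
            return raw, result               # step 7: store raw + reconciled
        data = remove_gross(data)            # step 6: remediation, back to step 3
    raise RuntimeError("gross errors persist after max_passes iterations")

# Toy instantiation: filtering drops out-of-range values, "reconciliation"
# is a plain average, and the range check accepts values up to 100.
raw = [10.2, 9.8, 250.0, 10.1]
stored = run_dvr(
    raw,
    filter_step=lambda d: [v for v in d if 0.0 <= v <= 100.0],
    reconcile_step=lambda d: sum(d) / len(d),
    check_range=lambda r: 0.0 <= r <= 100.0,
    remove_gross=lambda d: d[:-1],
)
```

Storing the raw measurements alongside the reconciled result (step 7) preserves the audit trail: the correction applied by reconciliation remains visible afterwards.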
The result of an advanced DVR procedure is a coherent set of validated and reconciled process data.
DVR finds application mainly in industry sectors where measurements are either inaccurate or non-existent, for example in the upstream sector, where flow meters are difficult or expensive to position; or where accurate data is of high importance, for example for safety reasons in nuclear power plants. Another field of application is performance and process monitoring in oil refining or in the chemical industry.
As DVR makes it possible to calculate reliable estimates even for unmeasured variables, the German Engineering Society (VDI Gesellschaft Energie und Umwelt) has accepted DVR as a means to replace expensive sensors in the nuclear power industry (see VDI norm 2048).
- “ISA-95: the international standard for the integration of enterprise and control systems”. isa-95.com.
- R. Kuehn, H. Davidson, Computer Control II. Mathematics of Control, Chem. Eng. Process 57: 44–47, 1961.
- V. Vaclavek, Studies on System Engineering I. On the Application of the Calculus of Observations in Calculations of Chemical Engineering Balances, Coll. Czech Chem. Commun. 34: 3653, 1968.
- V. Vaclavek, M. Loucka, Selection of Measurements Necessary to Achieve Multicomponent Mass Balances in Chemical Plant, Chem. Eng. Sci. 31: 1199–1205, 1976.
- R.S.H. Mah, G.M. Stanley, D.W. Downing, Reconciliation and Rectification of Process Flow and Inventory Data, Ind. & Eng. Chem. Proc. Des. Dev. 15: 175–183, 1976.
- J.C. Knepper, J.W. Gorman, Statistical Analysis of Constrained Data Sets, AIChE Journal 26: 260–264, 1980.
- G.M. Stanley, R.S.H. Mah, Estimation of Flows and Temperatures in Process Networks, AIChE Journal 23: 642–650, 1977.
- Joris, B. Kalitventzeff, Process measurements analysis and validation, Proc. CEF’87: Use Comput. Chem. Eng., Italy, 41–46, 1987.
- M.J. Liebman, T.F. Edgar, L.S. Lasdon, Efficient Data Reconciliation and Estimation for Dynamic Processes Using Nonlinear Programming Techniques, Computers Chem. Eng. 16: 963–986, 1992.
- G.M. Stanley, R.S.H. Mah, Observability and Redundancy in Process Data Estimation, Chem. Eng. Sci. 36: 259, 1981.
- VDI-Gesellschaft Energie und Umwelt, “Guidelines – VDI 2048 Blatt 1 – Uncertainties of measurements at acceptance tests for energy conversion and power plants – Fundamentals”, Association of German Engineers, 2000.
- G.M. Stanley, R.S.H. Mah, Observability and Redundancy Classification in Process Networks, Chem. Eng. Sci. 36: 1941, 1981.
- Delava, E. Maréchal, B. Vrielynck, B. Kalitventzeff, Modelling of a Crude Oil Distillation Unit in Terms of Data Reconciliation with ASTM or TBP Curves as Direct Input – Application: Crude Oil Preheating Train, Proceedings of ESCAPE-9, Budapest, May 31 – June 2, 1999, supplementary volume, p. 17–20.
- Langenstein, J. Jansky, B. Laipple, Finding Megawatts in Nuclear Power Plants with Process Data Validation, Proceedings of ICONE12, Arlington, USA, April 25–29, 2004.
- Amand, G. Heyen, B. Kalitventzeff, Plant Monitoring and Fault Detection: Synergy between Data Reconciliation and Principal Component Analysis, Comp. and Chem. Eng. 25, p. 501–507, 2001.