Data Acquisition (DAQ) and Control from Microstar Laboratories

Automatic Controls: White Paper




Implementing Automatic Controls Using Data Acquisition Processor Boards

Though Data Acquisition Processor (DAP) boards are primarily data acquisition devices, the onboard processing power that enables rapid service to data conversion and buffering devices can also be applied to rapid service of external control processes. There are certain limitations, but for systems that can stay within them, Data Acquisition Processors provide a reliable platform for implementing control methods that would be very difficult or very expensive to deliver any other way. Such systems might implement advanced control methods, require special data analysis, or operate on a scale beyond the capabilities of ordinary control devices.

This white paper surveys the challenges and strategies for implementing automatic control systems on Data Acquisition Processors.


Introduction - When to use DAP technology
The Role of Architecture - How hardware affects system performance
The Role of Processing - How software affects system performance
Timing and Regularity - Assuring response in real-time
Time Delay Benchmarks - Measuring actual performance limits
Multitasking Control - Time-critical and preparatory processing
Representing State - Managing complex event-driven systems
Control Strategies - Relationship to common control strategies
Tuning and Monitoring - Extensions to facilitate process control
Special Applications - Examples illustrating technical benefits


A workhorse engine for high-precision controls is the DAP 5216a, featuring 16-bit conversions and a composite 500k sample-per-second processing rate.

Data Acquisition Processors provide device-level controls. They can operate in stand-alone workstation nodes, or as low-level remote processing nodes within a multi-level control hierarchy. The fact that they can capture a large number of measurements accurately means that they are useful as data server nodes in SCADA networks, with the host supporting the SCADA protocols and network connections. They can apply local process knowledge to implement a control policy downloaded from a higher level node, receive and respond to feedback, and intelligently analyze data so that higher level controls receive the best information.

Perhaps it is easier to say when Data Acquisition Processors are not appropriate. Applications with relatively low measurement rates, moderate accuracy requirements, one or very few measurement channels, and minimal processing needs are probably better served by alternative low-cost devices. Most feedback regulator controls have pre-packaged solutions available at reasonable cost. The great majority of measurement and control applications therefore do not need the high-end capabilities of Data Acquisition Processors, but that still leaves many difficult measurement and control applications for which Data Acquisition Processors are well-suited and few alternatives are available.

In addition to basic requirements for signal capture, control applications must process those measurements, apply a control policy, and generate control responses, meeting real-time constraints while doing these things.
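As a concrete illustration of that measure, apply-policy, respond cycle, the sketch below shows a minimal discrete PID step in C, the kind of feedback regulator that pre-packaged controllers implement. The structure, gains, and function names here are illustrative assumptions, not part of any DAP product or of this paper.

```c
#include <assert.h>

/* Hypothetical sketch: one step of a discrete PID regulator, the kind
   of control policy a pre-packaged feedback controller implements.
   All names and gains are illustrative assumptions. */
typedef struct {
    double kp, ki, kd;   /* proportional, integral, derivative gains */
    double integral;     /* accumulated error                        */
    double prev_error;   /* error from the previous sample           */
} pid_state;

/* One measure -> policy -> respond cycle: compare the new measurement
   with the setpoint and return the control output. */
double pid_step(pid_state *s, double setpoint, double measurement, double dt)
{
    double error = setpoint - measurement;
    s->integral += error * dt;
    double derivative = (error - s->prev_error) / dt;
    s->prev_error = error;
    return s->kp * error + s->ki * s->integral + s->kd * derivative;
}
```

A real controller would add details such as integral anti-windup and output limiting, but the cycle itself, and the real-time constraint on completing it, stays the same.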

The simplicity of pre-packaged control solutions offers very little in the way of extensibility or flexibility. For those cases in which the most basic control solutions are not sufficient, the options are few.

  1. For optimal performance, but with the highest cost, greatest risk, and least flexibility, hardware-software solutions can be designed from the "ground up." The difficulties of this are great enough to make this option out of the question in most cases, but the benefits can be great if it is possible to apply the results widely and amortize the development costs.
  2. To accelerate the development cycle and reduce costs, it is often worth compromising slightly on performance and using modular hardware and software sub-systems rather than starting everything from scratch. Data acquisition components, interconnected systems, and processor modules are available. Finding components that can inter-operate, can meet performance specifications adequately, and are supported by sufficient and affordable development tools can remain a challenge. Long-term maintenance is a concern.
  3. Data Acquisition Processors are the next step in modularization. They provide data conversions, processing power, signal interconnects, and a straightforward development platform. The hardware is completely pre-configured, and so is the software necessary for device operation, but this doesn't leave you much hardware-level flexibility. If you can work within the architecture, it is relatively easy to program specialized control features using a high level programming language. Pre-existing software provides for routine configuration of timing rates, data channels, data routing, pre-processing, and so forth. When you have completed the software configuration and customized control software, the hardware is an off-the-shelf commercial component ready to install.
  4. Finally, you can stay with the simple control devices that are available, cope with their limitations, and bear the economic cost of operating far below optimal capacity.

The Role of Architecture

Data Acquisition Processors are designed for measurement processing, where speed means lots of channels and bulk data movement. Devices that best facilitate bulk data movement are not the ones that best facilitate responses to individual items. Consequently, there are limitations on the speed and regularity of process timing to be considered.

For developing and exercising new methods, the Developer's Toolkit for DAPL is your entry to embedded development without the embedded costs.

When processing captured data streams at very high rates, a delay of a millisecond can backlog thousands of samples in the hardware devices, and both conversion inaccuracy and loss of data are unacceptable. This suggests the following criterion for evaluating real-time performance: response times must be in fractions of a millisecond. In contrast, workstation systems operate on a human time scale, where delays much longer than 50 milliseconds make system response seem sluggish; that establishes the comparable criterion for workstation real-time performance.
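The backlog arithmetic behind this criterion is easy to check directly: a stalled consumer accumulates backlog = sample rate * delay. The sketch below uses illustrative rates, not the specification of any particular board.

```c
#include <assert.h>

/* Back-of-envelope check: how many samples pile up in device buffers
   while the consumer is delayed.  Rates here are illustrative only. */
long backlog_samples(long samples_per_second, double delay_seconds)
{
    return (long)(samples_per_second * delay_seconds);
}
```

At 500k samples per second, a 1 millisecond stall backlogs 500 samples; at multi-megahertz aggregate rates it backlogs thousands, while sub-millisecond responses keep the backlog within a small hardware FIFO.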

Delays for processing and moving data make the problems of converter device operation more difficult. Processing is interleaved, with some processing time applied to device operation, some to data management, some to data communication, some to higher-level processing. It takes time for data to propagate through these stages. As a general rule of thumb, you cannot depend on a Data Acquisition Processor to deliver a completed result in response to an individual sample in less than 100 microseconds (and you must verify by testing).

Contrast this to the architecture of a typical workstation host. It is optimized for moving bulk data to support graphical data presentations and multimedia. After operations are set up, data movement is fast, but the chip sets for memory management, bus control, and peripheral control require setup times in preparation for that movement; there is a setup time even for an operation as simple as reading the current system time from an external timer chip. Workstation systems typically host an array of diverse devices that compete for service, with no assurance about how much time the devices will take, or how much time device driver code will take to operate them. Add the heavy computational loading of a typical workstation operating system, and perhaps a GUI environment specialized for data presentation, and in the end you cannot depend on delivery of any data in less than about 10 milliseconds. It might be 100 milliseconds or longer, depending on a variety of uncontrolled conditions.

Even if you replace the operating system with a finely tuned embedded microkernel specialized for real-time work, disable all unused devices, eliminate user interaction, and devise a way to load your control software as an embedded application, you will still have a very difficult time getting best-case responses in less than a millisecond or two in this highly customized PC environment, and even then you cannot trust it. The PC simply isn't designed for this purpose.

The Role of Processing

On a Data Acquisition Processor, the hardware processes for measurement, and the low-level software processes that capture and move the data, are pre-programmed. Control algorithms reside in higher-level processing organized as independent tasks, which are scheduled for execution by a system task scheduler. There isn't much overhead in any one of these operations, but there is some in each, and the effects on delay are additive.

Data movement mechanisms called data pipes transfer data between tasks and to external devices. The schedulability of a task depends on the availability of data.
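The pipe-and-task pattern can be sketched in portable C. The code below is a hypothetical model of the idea, a fixed-size FIFO plus a task that runs only while input data is available; it is not the DAPL programming interface, and all names are invented for illustration.

```c
#include <assert.h>
#include <stddef.h>

/* Hypothetical model of a data pipe: a fixed-size FIFO connecting two
   tasks.  An empty input pipe leaves the consuming task unschedulable;
   a full output pipe blocks the producer.  Not the DAPL API. */
#define PIPE_CAPACITY 64

typedef struct {
    short buf[PIPE_CAPACITY];
    size_t head, tail, count;
} data_pipe;

int pipe_put(data_pipe *p, short sample)
{
    if (p->count == PIPE_CAPACITY) return 0;   /* full: producer must wait */
    p->buf[p->head] = sample;
    p->head = (p->head + 1) % PIPE_CAPACITY;
    p->count++;
    return 1;
}

int pipe_get(data_pipe *p, short *sample)
{
    if (p->count == 0) return 0;               /* empty: task not schedulable */
    *sample = p->buf[p->tail];
    p->tail = (p->tail + 1) % PIPE_CAPACITY;
    p->count--;
    return 1;
}

/* A control task runs only while input data is available, consuming one
   sample and emitting one response per step. */
void control_task(data_pipe *in, data_pipe *out)
{
    short x;
    while (pipe_get(in, &x))
        pipe_put(out, (short)(-x));   /* placeholder policy: negate the input */
}
```

The placeholder policy would be replaced by real control computation; the point of the sketch is the scheduling rule, that the task's eligibility to run follows the data in its pipes.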

