A device such as a digital oscilloscope uses a high-speed A/D converter to acquire the desired signal for a very short duration at a very high sampling rate. The acquired data is then passed through a signal processing engine that may perform some analysis at that point, such as an RMS calculation, and then handed to the main display engine for more math (such as calculating a cursor value) before finally being drawn on the screen. So the flow is simple: acquire the data, analyze it to make the measurement, and present the result to the user. For 20 years, NI has used the phrase "acquire, analyze, present" to describe virtual instrumentation. A virtual instrumentation system let you create that oscilloscope yourself out of a PC, a digitizer, and LabVIEW, and write your own measurement.
However, the phrase "acquire, analyze, present" may misleadingly imply that the phases are discrete when in fact they can be continuous. Why might this be continuous? One example is the processing required to perform a trigger. If a scope is going to trigger when the signal exceeds a voltage level, acquisition and analysis must run continuously: the instrument digitizes the signal and performs some very simple math (a comparison) to look for the threshold to be met.
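In software, that "very simple math" amounts to scanning the sample stream for the first point where the signal crosses the level from below. Here is a minimal sketch of a rising-edge level trigger in Python with NumPy; the function name and the simulated signal are my own illustration, not anything from a real instrument driver:

```python
import numpy as np

def find_trigger(samples, level):
    """Return the index of the first rising-edge crossing of `level`,
    or None if the threshold is never met in this record."""
    below = samples[:-1] < level   # sample was under the level...
    above = samples[1:] >= level   # ...and the next sample reached it
    crossings = np.nonzero(below & above)[0]
    return int(crossings[0]) + 1 if crossings.size else None

# Simulated acquisition: a 5 Hz sine, triggered at 0.5 V
t = np.linspace(0, 1, 1000)
signal = np.sin(2 * np.pi * 5 * t)
idx = find_trigger(signal, 0.5)
```

A hardware trigger does exactly this comparison on every sample, which is why it lives next to the A/D converter: at a gigasample per second there is no time to move the data anywhere else first.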
As with all engineering problems, the devil is in taking the simple concept and applying it to a need that our customers have. Creating that trigger is subject to two fundamental limitations: the speed at which the trigger criteria can be evaluated and the rate at which the data can be sent from the A/D converter to the trigger computation circuitry. Triggering is typically a hardware operation because it needs to run at a megasample or gigasample rate, and it is performed "close" to the A/D converter to handle the hundreds of megabytes or multiple gigabytes per second of data being generated.
The march of technology provides some interesting opportunities on the horizon that have been alluded to in past posts. The architectures of a software defined radio (SDR) and an oscilloscope are not that different, and the FPGA technology and wideband A/D technology driving SDR is beneficial in T&M applications. The biggest difference is the audience. Whereas an SDR design team consists of C and VHDL programmers working for 12 months on a highly specified design in a high-process development environment, a T&M system design team is a few people who know LabVIEW, trying to keep up with changing specs thrown over the wall and to keep the end product from being late even when the design is late.
The next generation of test equipment will combine the technological power available in the SDR architecture with the ease of use you expect from a T&M instrument, so those difficult measurements get made. Imagine the possibilities:
- Write your own math computation for that trigger, compile it, and have it run at the full rate of the hardware.
- Perform calculations on the data at full speed. Need a hundred-kilohertz RMS calculation to decode an LVDT? You've got it. Filter or demodulate a 20 MHz communication signal? No problem.
- Take that test that you wrote but have part of it run on an FPGA to speed it up. When that's not good enough, take the whole thing and move it down to the FPGA.
All of these tasks are possible today, but some aren't as easy as they could be. The technology is there to accomplish the task; the trick is to expose that power to you. NIWeek is coming up in just over two weeks. I look forward to discussing the possibilities with you there.