The latest analysers make speedy measurements, occupy less space, offer multiple measurement capabilities and provide software-defined functionality powered by FPGAs. These modular instruments now match the performance and other capabilities of bench-top instruments.
Abhishek A. Mutha
Analysers now aim to meet all the test needs of an engineer, from the design stage to the testing stage. Available in a range from low-cost to high-performance, analysers are used for a variety of applications, such as 5G, smart antenna systems (like multiple-input, multiple-output) and smartphones.
Traditional analysers are box-type instruments with the analysis capability built into them. With the newer modular PCI eXtensions for Instrumentation (PXI) based analysers, channels can be added as required. For instance, by adding a card to an 8-channel logic analyser, it can have 16 channels; by adding another card, 32 channels; and so on.
The functionality of traditional analysers is fixed: logic analysers do digital pattern analysis, while protocol analysers do protocol-level analysis on a specific, limited set of protocols. The newer analysers, however, are completely programmable in different environments, whether LabVIEW or any other text-based programming language.
A programmable analyser can take on a ‘fixed’ personality wherever required. Whether it is radio frequency (RF), digital logic analysis or spectrum analysis, it is possible to use it as a traditional analyser, just like a box-type instrument. If you are building a new protocol and want to analyse it, you have the flexibility to incorporate that capability into the instrument. “Today, people talk about monitoring the spectrum instead of just performing a simple spectrum analysis, or probably even look at it as an intelligent demodulator. These are the software-defined personalities that you can bring to analysers,” says Satish Mohanram, technical marketing manager, National Instruments.
“Testing instruments are supporting high bit rate testing. This enables faster data rate development of future telecom technologies,” notes Madhukar Tripathi, regional manager, Anritsu India Pvt Ltd. Earlier, programming used to happen mostly on a computing platform, where you would put the logic into a processor and try to implement it. But today there are high-speed protocols, running at gigabit and higher data rates, where analysis must be performed at high speed to cater to these high-speed digital buses and high-speed RF signal data. “Analysers come with a field-programmable gate array (FPGA) on board which can be programmed by the users. You can do real-time analysis, for immediate processing of incoming data, to exactly understand the kind of signal coming through, unlike in a traditional case where you just take the data and do off-line analysis of the protocol,” says Mohanram.
What’s shaping the analysers
One is FPGAs. High-speed analysers have in-built FPGAs. For instance, if the need arises for protocol analysis, the on-board FPGA takes the signal and processes it in-line. If there is communication that has to go back, it is sent and, at the same time, the analysed information is available for the user to look at. “The FPGAs are also capable of breaking down the signal for analysis to learn the kind of communication happening, if at all this necessity arises,” informs Mohanram. He adds, “Hardware in-loop is a very common way of testing new-generation protocols because, if you have to test a protocol, you have to be able to emulate the receiver. It becomes easy with the FPGA based in-line processing capabilities and complete programmability that you get off these instruments.”
The FPGAs can be chosen based on signal being handled and the amount of in-line processing to be done. If the end user has a very high-end protocol that has to be implemented, and it needs a lot of logic and digital signal processing (DSP) blocks, a high-end FPGA could be used. Even the size of the FPGA is pretty much configurable based on the customer’s need.
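The in-line processing model described above can be sketched in ordinary Python. This is purely illustrative: on a real analyser such logic would be compiled into FPGA fabric, and the sync-word detector here is a hypothetical stand-in for whatever protocol logic the user programs.

```python
# Illustrative sketch of in-line (streaming) analysis: each incoming sample
# is processed the moment it arrives, so results are available in real time
# instead of after a full capture. The sync word is an arbitrary example.

def inline_sync_detect(samples, sync_word=(1, 0, 1, 1)):
    """Flag each position where the sync word completes, as the bits stream in."""
    window = []
    hits = []
    for i, bit in enumerate(samples):
        window.append(bit)
        if len(window) > len(sync_word):
            window.pop(0)                # keep only the most recent bits
        if tuple(window) == sync_word:
            hits.append(i)               # result available immediately
    return hits

stream = [0, 1, 0, 1, 1, 0, 0, 1, 0, 1, 1]
print(inline_sync_detect(stream))        # → [4, 10]
```

An off-line analyser would instead record the whole stream and search it afterwards; the streaming form matters when the instrument must react to what it sees, as in hardware-in-loop setups.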
Traditional analysers have been box type, but analysers now come as a single-slot card. Their size is shrinking tremendously, thanks to semiconductor and multi-core technologies. “FPGA and shrinking size are the key elements bringing down the cost and increasing the capabilities of these instruments,” believes Mohanram.
Besides, a single instrument now includes, for instance, spectrum analyser, logic analyser, oscilloscope, digital multimeter and multiple-slot PXI chassis capabilities, which reduces investment in instruments. It also minimises the effort spent on measurement, automation and learning how to use the instruments, besides occupying much less space in labs. Adesh Kumar Jain, applications engineer, Keysight Technologies India Pvt Ltd, says, “Software tools control different instruments from a single software on a single screen. Even mobile versions of software tools are launched by companies to control/monitor measurements from smartphones. These are supplemented with tools to make automation easier.”
Also, many power applications have fast-changing asynchronous current pulses that are not suited to fixed-data-length FFT analysis. Such gapped analysis leaves intervals between computation windows, so events falling in those gaps are missed, resulting in inaccurate measurements. “The Newtons4th Ltd (N4L) power analysers use real-time discrete Fourier transform (DFT) technique with variable-window no-gap analysis to ensure the optimum speed and accuracy at all times,” says Motiwale.
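The difference between gapped and gap-free analysis can be illustrated with a toy RMS computation. This is not N4L's actual algorithm; the block size, dead time and simulated pulse are arbitrary assumptions made only to show how a gap swallows a short event:

```python
# Illustrative sketch: a fixed-window analyser with dead time between blocks
# never sees samples that fall in the gap, so a brief current pulse vanishes.

def gapped_rms(signal, block=8, dead_time=8):
    """Analyse one block, then skip `dead_time` samples, as a gapped
    fixed-window analyser effectively does while it computes."""
    out, i = [], 0
    while i + block <= len(signal):
        blk = signal[i:i + block]
        out.append((sum(x * x for x in blk) / block) ** 0.5)
        i += block + dead_time           # samples in the gap are never seen
    return out

def gapless_rms(signal, block=8):
    """Contiguous windows: every sample contributes to some result."""
    return [
        (sum(x * x for x in signal[i:i + block]) / block) ** 0.5
        for i in range(0, len(signal) - block + 1, block)
    ]

sig = [0.0] * 32
sig[12] = 10.0                            # a single fast current pulse
print(max(gapped_rms(sig)))               # → 0.0 (pulse fell in the gap)
print(max(gapless_rms(sig)))              # non-zero: pulse captured
```

A variable-window, no-gap scheme takes this further by resizing the window to track the signal, but even this fixed-window comparison shows why gaps translate directly into missed events.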
Why FPGA based analysers
Apart from the flexibility, FPGA based analysers are better for hardware-in-loop testing and test coverage in time. Today, systems are all interconnected. There is a lot of security and logic built into every communication protocol to ensure that no signal or information is lost when it goes from one end to another.
Consider people who are building multiple-input and multiple-output (MIMO) systems to define the next generation of mobile communication called 5G. One of the biggest challenges that engineers face is having to define and implement the base station or customer premises equipment (CPE) even before the standard is defined. “In this case, you need logic analysers, spectrum or RF signal analysers, which can really behave as though it’s a physical phone, and at the same time analyse the complete protocol that goes between the base station and the CPE. So the protocol is being defined and at the same time you are trying to emulate a particular device. This is not possible with traditional spectrum, logic and RF analysers,” informs Mohanram. With FPGA technology, it becomes possible.
Testing is becoming a lot more thorough. According to Mohanram, “Earlier perhaps people used to test four different points, which is a small example, on a protocol to say it is fine. But today, instead of testing limited number of points, they could test pretty much all different points on the curve. So very thorough testing is possible because of FPGA.” An FPGA is basically concurrent logic; everything happens in parallel and, because of that, the test time also drastically comes down.
Selecting the right analyser
Flexibility is one factor that is often missed while selecting analysers. Scientists and engineers are comfortable using traditional instruments and have stuck to them; it is a mindset, but one that is slowly changing. People realise that if they can define exactly what the analyser does, they can make optimal use of it and get the best throughput out of the system or instrument they are building.
Mohanram says, “What we generally ask them is, ‘Tell us what you exactly want to do with an analyser?’ and we make them do it right in front of us by guiding them on how they could go about bending the instrument according to their requirements rather than adjusting with what the vendor typically provides.” He adds, “Because of the buzz surrounding defining functionality of these instruments, that mindset is slowly changing, and I think that’s one point I would like to emphasise which people should keep in mind while selecting these products.”
Another question people generally ask while selecting analysers is, “Can this analyser analyse this particular protocol?” If the answer is yes, they immediately go for it, informs Mohanram. But the real question should be, “Can it do so in real time?” This is rarely considered while picking analysers. Real-time analysis lets users do two-way communication analysis, be it logic or communication from an RF standpoint. People forget to ask about real-time capabilities and get into trouble after buying the instrument.
Talking particularly about power analysers, Manisha Motiwale, technical manager, Scientific Mes-Technik Pvt Ltd, informs, “Manufacturers normally specify basic accuracy, but the user should consider the accuracy specification for the complete range in which measurement is done. Also, one should check the standard and optional features, and accessories, as per requirement.”
Why real-time processing
In most cases, test engineers are told, “You have to test this particular protocol coming from your system. Get an analyser for that,” informs Mohanram. He says, “That’s the mandate an engineer gets. Sure there are a lot of analysers which can analyse that particular protocol, but what they don’t share is the information on how they do that analysis.” These analysers capture all the information, and software on the instrument takes that data and processes it off-line to give the analysed information.
In the above case, if a particular protocol needs handshaking, it becomes impossible to use such off-line analysers. Handshaking means that, based on an incoming signal, the analyser has to generate some other signal to establish communication and, once that happens, the device starts pumping out data. For such protocols, engineers should look for online analysers.
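A toy model makes the problem concrete. The device, challenge string and reply format below are invented for illustration; the point is only that a listen-only tool never unlocks the data stream, while an in-line analyser that answers the handshake does:

```python
# Hypothetical sketch of why handshaking defeats off-line capture: the
# device under test (DUT) streams data only after receiving the correct
# response to its challenge, so a listen-only analyser records nothing useful.

class Dut:
    """Toy device: repeats a challenge until acknowledged, then streams data."""
    def __init__(self):
        self.ready = False

    def poll(self, response=None):
        if not self.ready:
            if response == "ACK-42":     # expected handshake reply (invented)
                self.ready = True
            return "CHALLENGE-42"        # keeps asking until acknowledged
        return "DATA"

dut = Dut()

# Off-line analyser: records traffic but never replies.
passive = [dut.poll() for _ in range(3)]
print(passive)                           # only challenges, never data

# In-line analyser: computes the reply in real time, unlocking the stream.
challenge = dut.poll()
dut.poll(response="ACK-" + challenge.split("-")[1])
print(dut.poll())                        # → "DATA"
```

On a real instrument the reply would have to be generated within the protocol's timing budget, which is exactly what on-board FPGA processing provides.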
An off-line setup typically sits between a transmitter and receiver, and can only split packets and see what is happening. Mohanram notes, “That is not a complete test that you are doing, because if you are trying to analyse a packet or trying to analyse communication protocol, you would want to do it as a different layer like physical or application layer, for instance. But if you just do splitting then you are not able to completely test the capabilities at every different layer. That drawback can be overcome by taking an analyser that can do in-line processing.”
For almost the same cost that people used to pay for box type analysers, the high-end FPGA-enabled analysers come in really handy. “I would say, at the same cost there is more and more functionality getting packed into these analysers and they are shrinking in size,” says Mohanram.
Using a PC to do the processing work for all manner of test and measurement equipment may not be feasible for the larger manufacturers, but it does serve the user very well. “PC-based processing for test equipment is here to stay,” says Bruce Devine, CEO, Test Equipment Plus, Inc. It puts pricing pressure on the traditional companies that build test equipment with the processor inside their equipment. He adds, “It is now difficult for many companies to justify the expense of buying anything but PC-based test equipment because of the cost savings, upgradability and extensibility of PC-based systems.”
Today’s analysers are a boon for engineers
Design engineers should spend their time designing products rather than learning new instruments and debugging instrument-related issues. Today’s instruments address the issues these engineers generally face: they are reliable, can be easily automated and can be reused across platforms. Various software tools support instruments covering different technologies. Thanks to these capabilities, engineers spend the least possible time on measurement, can be confident about the measurements and have more time to design efficient products.
The author is a senior correspondent at EFY