FPGAs have come a long way since they first became visible in the field of test and measurement, but not everyone we talked to was keen on adopting them. Why? This article helps you decide whether your test equipment is better off without an FPGA in it.
It has been quite some time since we first began seeing FPGAs in test instruments. They were brought in to target areas where production volumes were far too low to benefit from economies of scale. This gave FPGAs an edge over ASICs, with shorter time to market and greater flexibility for product upgrades. But is that all?
As recommended by Navaneethan Sundaramoorthy, a technology innovation and education consultant based in Bengaluru, we will begin this story by differentiating two kinds of FPGA-based test equipment—home-grown FPGA-based test equipment and custom instrumentation designed from off-the-shelf FPGA modules.
Commercial off-the-shelf (COTS) FPGA modules allow an engineer to quickly devise solutions to instrumentation challenges without spending too much time on development. Apart from harnessing the faster prototyping capabilities of an FPGA, this approach also makes economic sense where the scale of deployment would make professional test equipment unreasonable. On the other hand, where precision and accuracy are paramount, traditional test equipment still holds steady.
In many parts of the telecom industry, time to market is critical. Product designers need test tools to be available at the same time as the product that they are developing, and that requires significant investment by test equipment manufacturers. Using FPGA-based equipment here allows the vendors to bring out their solutions quickly before silicon-based solutions are available.
Pros aplenty
We may classify T&M equipment broadly into two categories: general-purpose equipment, as used by educational and training institutes, and industrial equipment. “In both cases, basic requirements like compact design, performance and cost matter a lot in all T&M equipment. User-defined flexible design and virtual instrumentation, where the functionality of an instrument is determined by programming/software, can add extra value for end users as well as for designers,” shares Lavesh Kumath, assistant head R&D (VLSI) at Scientech Technologies Pvt Ltd.
Computation power. This is something that benefits both kinds of test equipment we mentioned earlier.
True parallelism, a defining feature of FPGAs, lets them exceed the computing power of digital signal processors (DSPs) by breaking the paradigm of sequential execution, and allows multiple clock domains to be incorporated. “A single device is now much more powerful and capable of performing simultaneous operation of multiple instruments to directly benefit the user—enabling him to have a flexible, compact, lightweight, precise and multi-featured instrument at an overall low cost,” adds Lavesh Kumath.
Thanks to this parallelism, explains Chinmay Misra, technical marketing engineer at National Instruments, FPGAs can perform complex mathematical calculations simultaneously without involving a host processor. This enables them to outperform standalone DSPs and even graphics processing units (GPUs).
Vinod Mathews, founder and CEO, Captronic Systems, says that with FPGA-based equipment, the biggest speed benefit is noticed especially when you need to do hardware-in-the-loop kinds of tests.
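As a rough software analogy of the parallelism described above, the sketch below processes several acquisition channels first one after another (DSP-style sequential execution) and then concurrently. The channel names and the toy moving-average filter are purely illustrative; on a real FPGA the parallelism is physical, with a dedicated datapath per channel, which threads on a CPU can only approximate.

```python
# Software analogy only (not real FPGA code): sequential vs parallel
# processing of multiple measurement channels.
from concurrent.futures import ThreadPoolExecutor

def moving_average(samples, window=4):
    """Toy per-channel computation: simple moving average."""
    return [sum(samples[max(0, i - window + 1):i + 1]) /
            len(samples[max(0, i - window + 1):i + 1])
            for i in range(len(samples))]

# Four hypothetical input channels of sample data.
channels = {f"ch{n}": [float(x % 7) for x in range(n, n + 16)]
            for n in range(4)}

# DSP-style: one channel after another on a single execution unit.
sequential = {name: moving_average(data) for name, data in channels.items()}

# FPGA-style: every channel advances at once. Here a thread pool stands
# in for the FPGA's per-channel hardware paths.
with ThreadPoolExecutor(max_workers=len(channels)) as pool:
    parallel = dict(zip(channels, pool.map(moving_average, channels.values())))

assert sequential == parallel  # same results, different execution model
```

The point of the analogy is that the results are identical; only the execution model changes, and on an FPGA the concurrent version costs no extra time per added channel.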
Customisability. Initially, FPGAs were very difficult to program due to a lack of simplifying tools, and this affected engineers working with COTS FPGA boards a lot. For one, the HDL programming required was unintuitive and demanded expertise that most users did not have. For another, there was no hardware platform or architecture, no starting point that users could leverage to quickly build their systems. The only option was to start from scratch with the FPGA chip and figure out the rest of the pieces, like IO connectivity, power requirements and interaction with other ASICs and processors (if present), by yourself. This required a lot of time, effort and money, and was not at all viable, especially with the technology advancing ever so rapidly and changing test and measurement requirements with it.
NI LabVIEW is one solution to this problem of customisability. “National Instruments, realising the benefits of a user-programmable FPGA, came up with the LabVIEW FPGA module, which abstracted the complexities of FPGA programming and enabled the user to use the same graphical LabVIEW environment to program the FPGAs. Simultaneously, NI also came up with a revolutionary RIO architecture that had an FPGA, a real-time processor and modular IO at its core. It abstracted all hardware design complexities from the user and could be used just as well for test and measurement instrumentation, as for embedded systems,” explains Chinmay.
Reduction in both test time and physical footprint. “By embedding FPGAs into T&M equipment, test routines can be run at a much faster rate, thereby reducing the testing costs. Since FPGAs are highly configurable, newer test cases and functionality can be embedded in a short period of time to allow the equipment manufacturers to reach market on a timely basis,” explains Nate Srinath, founder–director, Inxee Technologies.
NI’s Chinmay also shares an example of the reduction in test time achieved by FPGA-based instruments. By synchronising the timing of digital control on the NI vector signal transceiver’s on-board FPGA with the RF front end of the instrument, Qualcomm Atheros was able to reduce test time by 20x over their previous PXI-based solution, and shrank the physical footprint of their test system from several different boxes to a single 8-slot PXIe chassis with slots to spare.
Lower time to market for equipment vendors. In many parts of the wireless industry, time to market is critical. The designer of a real mobile has to worry about factors such as achieving very low manufacturing costs, long battery life and small size. Cost is, of course, important for test equipment manufacturers too, but size and power consumption are less so, and the use of FPGAs allows the vendor to deliver functionality in line with when base station manufacturers need it—Aeroflex’s first LTE test mobile was delivered in November 2007, some four years before the first networks launched.
Stephen Hire, general manager, Aeroflex Systems Pvt Ltd, explains, “Product designers need test tools to be available at the same time as the products that they are developing, and that requires significant investment by test equipment manufacturers. Taking our FPGA-based TM500 test mobile as an example, it is used by virtually every single base station manufacturer worldwide as a test peer because it was available long before commercial mobiles were.”
In-field firmware upgrades. Flexibility and protection of investment are the two big advantages of in-field firmware upgrades enabled by FPGA-based equipment. “Technology, especially in the wireless industry, is constantly evolving, and different markets often have slightly varying standards or requirements, for example different spectrum allocations. When designers and manufacturers invest in new test equipment, they need tools that can not only cater to the variations across markets but also have the ability to be upgraded via software as standards and technology evolve. FPGAs are a major help in achieving these goals,” shares Stephen Hire.
In-field firmware upgrades are a must for most modern-day embedded systems with communication capabilities. Such systems are usually deployed remotely (like a microwave repeater) and need firmware updates to be deployed over the air or the provided network.
“It is unacceptable to expect the end customer to bring back the remote system to the factory for firmware upgrades. Customers can face substantial losses when firmware upgrades are not performed remotely. The system has to be brought down and back to the factory. During this time, the system is not operational and the cost of removal, transport and reinstallation adds to the losses due to the system downtime. It is strongly recommended that all embedded systems (wired or wireless) with communication capabilities should have in-field upgrades. This would add tremendous flexibility to the end customer to not only save costs, but provide the ability to change the system configuration remotely,” adds Nate Srinath.
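A remote firmware upgrade of the kind described above typically verifies the downloaded image before touching the hardware, writes it to a standby memory bank, and only then switches over, so a corrupt download never bricks the deployed system. The sketch below is a hypothetical illustration of that flow; the bank layout and function names are illustrative assumptions, not any vendor's actual API, and real systems add cryptographic signatures, retries and watchdog-protected rollback.

```python
# Hypothetical in-field firmware update flow: verify, stage to the
# standby bank, then activate. All names here are illustrative.
import hashlib

BANKS = {"A": None, "B": None}   # two flash banks: active + standby
state = {"active": "A"}

def verify_and_stage(image: bytes, expected_sha256: str) -> str:
    """Check integrity, write the image to the standby bank, return it."""
    if hashlib.sha256(image).hexdigest() != expected_sha256:
        raise ValueError("corrupt download, keeping current firmware")
    standby = "B" if state["active"] == "A" else "A"
    BANKS[standby] = image       # stand-in for an actual flash write
    return standby

def activate(bank: str) -> None:
    """Switch execution to the freshly written bank (on next boot)."""
    state["active"] = bank

new_image = b"\x00firmware v2\x00"
digest = hashlib.sha256(new_image).hexdigest()
bank = verify_and_stage(new_image, digest)
activate(bank)
assert state["active"] == "B" and BANKS["B"] == new_image
```

Because the old image stays untouched in the other bank, the device can fall back to it if the new firmware fails to boot, which is what makes remote upgrades safe enough to avoid the factory round trip Nate Srinath describes.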
At the end of the day, is it worth it?
Whether to use FPGAs in test depends on the feature and functionality requirements of the end user’s test and measurement equipment.
Lavesh explains, “On one hand, FPGAs in test are good for a system with multichannel parallel functionality and narrow bandwidth, and FPGA-based test can also keep up with future modifications that might be necessary. On the other hand, for a system with just one or two channels and higher bandwidth (GHz), the cost of the FPGA, and thus the cost of the system for the end user, can quickly escalate, and it may not be worthwhile compared to an ASIC.”
The learning curve might in itself be a dampening factor for anyone looking to get into designing FPGA-based tools. However, Hanneke Krekels, test, measurement and emulation segment owner at Xilinx (an FPGA supplier for National Instruments), says, “Like with any new technology implementation, there can be a learning curve getting familiar with FPGA design flows. Xilinx has been actively working on the UltraFast methodology to shorten this potential initial learning curve by providing an implementation framework and best-in-class examples.”
“Moreover, the security (related to code modification, corruption, etc.) that is inherently supported by ASICs or MCUs is not easily achievable with FPGAs; thereby adding another question mark in their abundant use,” explains T. Anand, managing director, Knewron.
You also have to learn to make good use of the space available on the chip. “Hence, the programming should be tight, and the designer should know how the FPGA works and how to save space using techniques specific to FPGAs. Secondly, certain standard things that we take for granted on Windows, like floating point or direct memory access (DMA) transfer, will not work directly on the FPGA,” adds Vinod Mathews.
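Because native floating point is costly in FPGA fabric, designers commonly fall back on fixed-point arithmetic, representing fractional values as scaled integers. The short sketch below illustrates the general idea with a Q4.12 format (4 integer bits, 12 fractional bits in a 16-bit word); the format choice and helper names are illustrative assumptions, not any toolchain's API.

```python
# Illustrative fixed-point ("Q-format") arithmetic of the kind used on
# FPGAs in place of floating point. Q4.12: 12 fractional bits.
FRAC_BITS = 12
SCALE = 1 << FRAC_BITS          # 4096

def to_q(x: float) -> int:
    """Encode a real number as a scaled integer."""
    return round(x * SCALE)

def q_mul(a: int, b: int) -> int:
    # Multiplying two Q4.12 numbers doubles the fractional bits,
    # so shift right to renormalise (what an FPGA DSP slice does).
    return (a * b) >> FRAC_BITS

def to_float(q: int) -> float:
    """Decode a scaled integer back to a real number."""
    return q / SCALE

a, b = to_q(1.5), to_q(-0.25)
product = to_float(q_mul(a, b))
assert abs(product - (1.5 * -0.25)) < 1 / SCALE  # accurate to ~1 LSB
```

Everything reduces to integer multiplies and shifts, which map directly onto FPGA logic, at the cost of bounded precision that the designer must budget for.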
Sanchit Bhatia, application engineer, Agilent Technologies India Pvt Ltd, says that most T&M equipment, like oscilloscopes, logic analyzers and network analyzers, need not move to FPGA-based systems, as they need to be easy to use and do not require real-time processing. Instruments like digitizers and modular instruments can go the FPGA way, since real-time processing is required there.
“FPGA-based T&M equipment is in general complex to use and requires users to learn special tools/toolkits to program the on-board FPGAs. Thus FPGA-based T&M equipment is mainly for advanced users who are already power users of T&M equipment. The learning curve associated with FPGA-based equipment is also steeper,” he adds.
For test and measurement vendors, an emerging trend that Stephen says Aeroflex is a keen proponent of is building common test equipment platforms that can be used for multiple applications. “This not only saves R&D cost by avoiding duplication of design effort but, more importantly, shortens the design cycle—critical for staying abreast of market needs. For example, our Common Platform Architecture has enabled delivering multiple products to different customers addressing different test needs, all based on the same core hardware and software. These have included LTE mobile phone testers, WLAN and mobile phone production test equipment, military radio testers and avionics test equipment.”

Apart from giving customers access to test equipment early, another major customer advantage is the protection of investment as technology matures. In the past, a new technology standard usually meant designers and manufacturers would need to buy completely new test equipment. Architectures such as the Common Platform, using FPGAs and DSPs, mean that certain standard updates can be delivered via software alone, and more significant changes are usually now just an upgrade rather than a complete new purchase.
On the other hand, user-programmable COTS FPGAs are also emerging as a likely platform for next-generation embedded test and measurement solutions, which can be employed during design, development, manufacturing and in the field following product launch.
Chinmay explains that as systems grow even more complex, tools enabling a platform-based approach to system design and software-defined instrumentation are becoming more popular. This is fuelling the growth of an ecosystem around the platform, where many domain experts are creating domain-specific IPs and sharing them through channels like the LabVIEW Tools Network. With FPGA personalities easily available because of this ecosystem, it is becoming possible for an average test engineer to build a generic test system that tests multiple products with the same hardware setup. This is driving down costs and increasing productivity.
The author is a senior correspondent at EFY Bengaluru