Test & Measurement: Fuelling Technological Advances


Imbibing progressive practices ranging from multicore (parallel) processors and software-defined instrumentation to wireless standards, the field of test and measurement continues to enable new technologies

JANANI GOPALAKRISHNAN VIKRAM


Software-defined instrumentation (Courtesy: National Instruments)

October 2011: “The use and development of technology requires test and measurement instruments. From the research and development of new technologies to ensuring quality in the manufacturing of various products, these instruments are essential. For example, when a space shuttle is launched, many different instruments are used to conduct thousands of electrical, temperature, chemical and other measurements to monitor the status of the craft and its crew. These instruments are essential to our high-tech economies,” says Dr Anita Agrawal, assistant professor in the department of electrical and electronics engineering, BITS, Pilani-K.K. Birla Goa Campus, opening our discussion on developments and trends in the test and measurement (T&M) space.

The goal of T&M has often been misunderstood as mere fault-finding. Of course, the detection of faults during manufacture has been a key task handled by T&M instruments, and has been instrumental in achieving high quality-control levels in all technological arenas. However, T&M plays a larger role in monitoring and maintaining critical technical systems, and in the research, validation and improvement of new technologies.

Further, the relationship between T&M and other technological areas has always been intertwined and symbiotic. While T&M helps develop new technologies, it also imbibes the best of technologies to advance its own capabilities further. Progress in networking and telecommunications, for example, has been possible only because of capable T&M equipment that can accurately measure and monitor various critical technical and quality-of-service (QoS) parameters on a real-time basis. At the same time, fast wireless and Ethernet technologies have helped make T&M instruments more communicative and capable of connecting to larger enterprise systems via fast networks. It is this symbiotic relationship that drives a lot of improvements and innovations in the field of test and measurement.


Inspiration to improve—from technology and technologists
The innovations in T&M cater to both the needs of new technologies as well as the demands of technologists and manufacturers.

At one time, strict quality regulations and the need to find even the smallest of faults were the key innovation drivers in the T&M industry. But high standards have now been set and achieved on this front, so the focus is shifting to lower cost, shorter test times and ease of testing.

“The drive to reduce the cost of test is forcing changes in the way instruments communicate and the way they are programmed. As examples, look at the rise in the number of LAN eXtended for instrumentation (LXI) instruments (over 1500), graphical programming and the acceptance of modular tests such as PCI eXtensions for Instrumentation (PXI),” says Bob Stasonis, Pickering Interfaces, USA.

 


A lot of effort is also focused on making the test engineer’s job of routing stimulus or measurement instruments to and from the unit-under-test easier. “To that end, we have focused on switching density and bandwidth. As products become more complex, more inputs/outputs (I/Os) need to be managed. Also, between new high-speed serial buses and wireless products, the need for an even higher number of switching points and getting a few more GHz where possible is the priority,” remarks Stasonis.

On the technology front, the growth of wireless technologies has been driving a lot of innovations in the T&M world. Higher signal bandwidths, increasingly complex modulation schemes, strict QoS requirements, the need for real-time monitoring across networks and so on are driving the need for faster and more capable instruments.

Automatic test equipment for avionics (Courtesy: ST Aerospace)

The recent multicore mania, surge in digital broadcasting, more stringent quality norms across industries, convergence of design and development activities across the product lifecycle, need to shorten the time-to-market like never before, and a lot of other new technological and market trends are also triggering various developments in the T&M world. In the rest of this article, let us delve into some of these recent developments and trends in test and measurement.

Great improvements in vision test systems
When we think of test systems, machine vision is probably the first thing that comes to our mind. Machine vision involves the capture and analysis of images to monitor and control an activity. Generally, it is used for automatic inspection and process control in assembly lines, guidance of robots, etc. The light source used is a very important factor in vision test systems because it affects how accurately the image is captured—and hence how accurately defects can be spotted.

Of late, there have been significant improvements in machine vision systems with the use of light-emitting diodes (LEDs) for the light source. LEDs offer many benefits like high efficiencies, reproduction of varied colours and longer lifetimes. Moreover, the last few years have seen significant improvements in LED technology in terms of brightness, efficiency, thermal stability and optics—all of which are now being harnessed by machine vision companies to improve their test systems.


We also see test systems with increased strobe speeds emerging to meet new machine vision applications. Cameras capable of 1000 frames per second (fps) are becoming the standard. These systems perform hundreds or thousands of inspections on a part while looking for a pass condition, as against the earlier norm of capturing just one or a few images. Software is also becoming smart enough to process and analyse these images swiftly and accurately.

Stasonis confirms, “Improved lighting and, most importantly, software algorithms have improved vision test systems. Software development coupled with improvements in areas like LED lighting allows vision inspection systems to improve test speeds and reduce marginal errors.”
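
As a rough illustration of the software side of such a system, the following sketch runs a simple pass/fail blob check over a burst of frames with OpenCV and NumPy. It assumes OpenCV 4.x; the frames, threshold and minimum-area limit are placeholders for a real camera feed and tuned inspection criteria.

```python
# A minimal pass/fail inspection sketch using OpenCV and NumPy.
# The frames are simulated; a real system would pull them from a camera.
import cv2
import numpy as np

def inspect_frame(frame: np.ndarray, min_area: float = 500.0) -> bool:
    """Return True (pass) if the largest bright blob meets a minimum area."""
    # Binarise the greyscale image; the fixed threshold stands in for tuned lighting
    _, binary = cv2.threshold(frame, 128, 255, cv2.THRESH_BINARY)
    # Find the outer contours of bright regions (OpenCV 4.x return signature)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return False
    largest = max(cv2.contourArea(c) for c in contours)
    return largest >= min_area

# Run the check over a burst of frames, as a high-frame-rate system would
frames = [np.random.randint(0, 256, (480, 640), dtype=np.uint8) for _ in range(10)]
results = [inspect_frame(f) for f in frames]
print(f"{sum(results)} of {len(results)} frames passed")
```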

Emergence of multicore systems
Multicore processors are fast becoming popular as these help achieve performance gains without increased clock rates. Of course, the world of T&M has to keep pace with developments across the technological spectrum. Hence test engineers are now beginning to use multicore processors to develop automated test applications capable of achieving the highest-possible throughput through parallel processing.
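
A minimal sketch of the idea, using Python’s standard library, is shown below: independent, CPU-bound test steps are fanned out across the available cores. The measure_channel() routine is a hypothetical stand-in for a real per-channel acquisition and analysis step.

```python
# Parallelising independent test steps across CPU cores with the standard library.
# measure_channel() simulates a per-channel acquisition plus heavy analysis.
from concurrent.futures import ProcessPoolExecutor
import math

def measure_channel(channel: int):
    value = sum(math.sin(i * (channel + 1)) for i in range(200_000))
    return channel, value

if __name__ == "__main__":
    channels = range(8)
    # Each channel's analysis runs in its own process, so cores work in parallel
    with ProcessPoolExecutor() as pool:
        for channel, value in pool.map(measure_channel, channels):
            print(f"channel {channel}: {value:.3f}")
```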

 

Major applications and emerging trends

The major application areas for T&M equipment are:
1. Communications: Cellular tests, wireless and wire-line tests, and optical tests
2. Aerospace and defence: Electronic warfare, radar, satellite test and signal monitoring/intelligence
3. Research and development: Digital, radio frequency (RF), microwave, etc
4. Manufacturing: In-circuit tests, functional/automated manufacturing tests
5. Security: Surveillance, military communications
6. Core electronics: Data acquisition, control, automation, semiconductor/component test, nanotechnology, RF and microwave

One of the most challenging applications is in the strategic electronics sector comprising military and aerospace electronics. The reliability and repeatability needs of these markets are unmatched as the margin for error is minimal.

Technological advancements that have a far-reaching impact on the way T&M is being utilised today include:

1. Faster and smaller semiconductor chips
2. Better and brighter displays
3. Convergence or consolidation

Not only have these advancements helped bring down the test-times and cost of T&M instruments—making them faster and affordable—these have also helped in making them easy to use and more efficient. These advancements have triggered T&M to drive the following key customer needs:

1. Take complexity out of test and save time to focus on where it matters the most
2. Provide reliability and repeatability—the key expectations from any test platform today
3. Offer new alternatives for testing, like instruments with multiple capabilities and automated test set-ups

—Arun Dogra, country manager & VP (sales, marketing & support), Agilent Technologies India

Demand for more flexible test systems
With recent improvements in embedded systems technology and software engineering, it is now possible to greatly influence a device’s functionality by just changing the software embedded in it. Software-defined functionality is generally more flexible and scalable, and makes it possible to change the characteristics of devices quite rapidly.

This puts test engineers in a fix because traditional test instruments cannot keep pace with the changes in the device-under-test (DUT) because of their fixed user interfaces and hard-coded functions, which are difficult to change. Hence test engineers are also turning to techniques like software-defined and field-programmable gate array (FPGA)-enabled instrumentation, so they can quickly customise their equipment to meet specific application needs and integrate testing directly into the design process.

Software-based instrumentation comes at two levels—virtual and synthetic. Virtual instrumentation is a combination of software and modular hardware instruments that can be customised by the users for their specific needs. It has been used for over two decades but became very popular in the last few years.

“Almost all of the testing industry has accepted the concept of virtual instrumentation, which equals or exceeds traditional instruments in terms of data rate, flexibility and scalability, with reduced system cost,” says Dr Agrawal.

In virtual instrumentation, the software is very powerful—to the extent that it can transform common hardware components into test instruments. For example, an analogue-to-digital converter can be made to function as a virtual oscilloscope. So a combination of common hardware and modular test instruments can be strung together with powerful software to make a test system with completely user-defined functionality. And this can be changed any number of times to cater to the changes in the DUT.
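
The sketch below illustrates the principle with NumPy, assuming a block of raw ADC samples is already in memory (simulated here): plain software computes the peak-to-peak voltage, RMS value and dominant frequency that a bench oscilloscope would otherwise display. The sample rate and test signal are illustrative assumptions.

```python
# Turning raw ADC samples into oscilloscope-style readings in software.
import numpy as np

sample_rate = 1_000_000                        # assumed ADC rate: 1 MS/s
t = np.arange(4096) / sample_rate
# Simulated capture: a 10 kHz sine wave plus a little noise
samples = 1.2 * np.sin(2 * np.pi * 10_000 * t) + 0.05 * np.random.randn(t.size)

# Time-domain measurements a scope front panel would normally provide
vpp = samples.max() - samples.min()
vrms = np.sqrt(np.mean(samples ** 2))

# Frequency estimate from the FFT peak, as a virtual spectrum display might do
spectrum = np.abs(np.fft.rfft(samples * np.hanning(samples.size)))
freqs = np.fft.rfftfreq(samples.size, d=1 / sample_rate)
dominant = freqs[np.argmax(spectrum[1:]) + 1]  # skip the DC bin

print(f"Vpp = {vpp:.3f} V, Vrms = {vrms:.3f} V, f = {dominant:.0f} Hz")
```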

A synthetic instrument, on the other hand, is purely software-defined. It performs a specific synthesis, analysis or measurement function on completely generic, measurement-agnostic hardware. So the measurement intelligence can run entirely on a PC, without the need for dedicated or modular T&M instruments. Another recent trend is the use of highly modular and reusable software code, which increases the flexibility of software-based instrumentation solutions even more.

The need for flexibility has also resulted in the greater use of FPGAs in test devices. There are lots of system-level tools for FPGAs these days, so modular instruments with FPGAs can easily be reprogrammed by engineers according to their needs. In fact, with modern software tools engineers can rapidly configure FPGAs without even writing low-level VHDL code!


Meeting the needs of the wireless world
The emergence and popularity of new, high-speed wireless technologies is greatly influencing changes in the test industry.

The high-speed requirements of wireless devices and applications have resulted in wireless networks with more spectrum, greater data throughput, technologies such as orthogonal frequency-division multiple access, and advanced antenna techniques like multiple-input multiple-output and beam-forming. Testing plays a huge role in making all these new technologies work together.

Today’s wireless test equipment has to cover higher bandwidths and many parts of the spectrum. The solutions need to look beyond the channel frequencies specified for the DUT to help detect and minimise RF interference. They also need to take into consideration the security requirements of wireless devices, not to forget their physical robustness and even weather-resistance.

What is more, since the high-speed wireless industry is growing rapidly, with new devices and protocols emerging every day, the test solutions need to be very flexible and customisable. Hence these solutions often comprise standards-based software that can be downloaded to hardware instrumentation and modular platforms, so a test system that meets the specific requirements of the project can be put together quickly. The solutions must also be simple and foolproof because they might be used by design and field-repair engineers with relatively little experience. Plus, since many wireless technologies are meant for low-cost devices, the test solutions must enable manufacturers to verify performance without increasing the end-system cost.
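
One way to picture this flexibility is a data-driven test plan: in the hedged sketch below, the channel list and limits live in configuration rather than code, so the same routine can be re-targeted as new devices and protocols appear. The field names, limits and placeholder measurement functions are purely illustrative and not taken from any standard.

```python
# A configuration-driven wireless test step: limits and channels live in data.
test_plan = {
    "channels_mhz": [2412, 2437, 2462],            # channels to sweep (illustrative)
    "tx_power_dbm": {"min": 10.0, "max": 20.0},    # illustrative limits
    "evm_percent":  {"max": 5.6},
}

def check(name, value, limits):
    ok = limits.get("min", float("-inf")) <= value <= limits.get("max", float("inf"))
    print(f"  {name} = {value:.2f} -> {'PASS' if ok else 'FAIL'}")
    return ok

def measure_tx_power(channel_mhz):
    return 15.0                                    # placeholder for an analyser reading

def measure_evm(channel_mhz):
    return 3.2                                     # placeholder for a demodulation result

for ch in test_plan["channels_mhz"]:
    print(f"Channel {ch} MHz")
    check("TX power (dBm)", measure_tx_power(ch), test_plan["tx_power_dbm"])
    check("EVM (%)", measure_evm(ch), test_plan["evm_percent"])
```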

Reducing time and costs
“The primary focus of test instrument manufacturers is to accurately capture a signal or create a stimulus. However, the demand for lower costs equally drives these specifications, in order to shorten tests and hence lower their cost,” says Stasonis.

Indeed, in today’s dynamic marketplace there is great pressure on manufacturers to introduce new models at short intervals. Consequently, there is pressure on test engineers to come up with solutions that reduce testing time and cost while improving the effectiveness of the testing process. This is often achieved by test systems that feature faster signal capture, faster data transfer to the PC for analysis, and more intelligent drivers that shorten development times and minimise test steps.

Cost is as important as speed, because the cost of the test process influences the cost of the device as well. The emergence of flexible test systems that use smart software, modular hardware, FPGAs, etc has had a significant impact on the cost of the test process, as the same solution can be used to test multiple devices.

Another technology that has gained popularity in this context is built-in self-test (BIST), in which a device or machine is given the ability to test itself.

“Test departments are constantly under pressure to reduce costs, and with less and less test-point access, BIST can be critical for efficient test,” says Stasonis. Despite the reduced cost and time for testing, BIST is very reliable and is often used in critical spaces such as avionics and defence.
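
The sketch below shows roughly what a BIST routine might look like inside a device, written in Python for illustration; the register and ADC-reference accessors are hypothetical stand-ins for real firmware calls, and the patterns and limits are illustrative only.

```python
# A toy built-in self-test: register read/write patterns plus an ADC sanity check.
SCRATCH_REG = 0x10
_regs = {}                                   # simulated register file

def write_reg(addr, value):
    _regs[addr] = value & 0xFF               # hypothetical hardware write

def read_reg(addr):
    return _regs.get(addr, 0)                # hypothetical hardware read

def read_reference_mv():
    return 1249.0                            # hypothetical ADC read of a 1.25 V reference

def built_in_self_test():
    failures = []
    # 1. Exercise a scratch register with alternating bit patterns
    for pattern in (0x55, 0xAA, 0xFF, 0x00):
        write_reg(SCRATCH_REG, pattern)
        if read_reg(SCRATCH_REG) != pattern:
            failures.append(f"register test failed for pattern 0x{pattern:02X}")
    # 2. Check the ADC against an on-board voltage reference (assumed +/-2% window)
    ref = read_reference_mv()
    if not 1225.0 <= ref <= 1275.0:
        failures.append(f"reference read {ref:.1f} mV out of range")
    return failures

faults = built_in_self_test()
print("BIST PASS" if not faults else f"BIST FAIL: {faults}")
```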

Sagar Patankar, senior program manager, KPIT Cummins, says that test accelerators used in software testing also help meet these time and cost requirements. “Test accelerators are built with three goals: high quality, reduced time-to-market and cost savings,” he says. Some of KPIT Cummins’ innovations in this space include test-case optimisation techniques, testing-process redesign for software testing and script-less automation testing.


Internet enhances capabilities
Internet capabilities have enhanced instrumentation capabilities in several areas. “LXI devices make remote data acquisition applications relatively simple, synchronisation between tests can be accomplished over the Ethernet using features like IEEE 1588, and remote diagnosis of problems helps minimise downtime,” lists Stasonis.

Dr Agrawal explains further that LXI is a recent technological standard that allows the user to plug-and-play test devices from different manufacturers and route the data on a local-area Ethernet network without worrying about the individual system unit’s bus structure, operating system or firmware. LXI-compliant equipment also allows a user to place test instrumentation close to the unit under test, thereby saving time and expense by not needing to lay out long wires to connect distant measurement instruments.

In short, a networking technology like LXI makes it possible to logically string together physically separated T&M instruments for remote data acquisition, processing, etc. But for remote test equipment to work together, time synchronisation is very important. That’s because in critical spaces like T&M, even a microsecond’s lapse in recording or reacting to a measurement can prove to be too costly. This is where IEEE 1588’s Precision Time Protocol (PTP) comes in. The PTP makes it possible to synchronise distributed clocks with an accuracy of less than one microsecond via Ethernet networks for the very first time. This makes it possible to apply LXI even for critical automation and testing tasks.
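
A minimal sketch of driving a LAN-connected instrument from a PC is shown below, assuming PyVISA and a working VISA backend are installed; the IP address is a placeholder, and while "*IDN?" and "MEAS:VOLT:DC?" are common SCPI queries, the exact command set depends on the instrument.

```python
# Querying a LAN/LXI instrument over Ethernet with PyVISA.
import pyvisa

rm = pyvisa.ResourceManager()
# LAN instruments are typically addressed with a TCPIP VISA resource string
inst = rm.open_resource("TCPIP0::192.168.1.50::INSTR")
inst.timeout = 5000                      # milliseconds

print(inst.query("*IDN?"))               # ask the instrument to identify itself
reading = float(inst.query("MEAS:VOLT:DC?"))
print(f"DC voltage: {reading:.6f} V")

inst.close()
rm.close()
```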


 


Rise of heterogeneous computing in test
Typically, an automated test system uses many types of instruments for measuring different parameters, because every instrument has unique capabilities best suited for specific measurements. The complex computations involved in the T&M process also require various specialised systems. For example, applications like RF spectrum monitoring require inline, custom signal processing and analysis that are not possible using a general-purpose central processing unit. Similarly, there might be other processes that require specialised computing units like graphical processing systems, FPGAs or even cloud computing to handle the heavy computations. This has resulted in the use of heterogeneous computing or multiple computing architectures in a single test system.

This year’s Automated Test Outlook report by National Instruments identifies this as a key trend: “The advent of modular test standards such as PXImc and AXIe that support multi-processing and the increased use of FPGAs as processing elements in modular test demonstrate the extent to which heterogeneous computing has penetrated test-system design.”

The heterogeneous computing trend has injected more power as well as more complexity into test systems—because using varied computing units means varied paradigms, platforms, architectures, programming languages, etc to deal with. However, greater interoperability and standardisation, as well as abstraction and high-level programming options are making heterogeneous computing easier.
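
A small example of such abstraction is array code that runs on a GPU when one is available and falls back to the CPU otherwise; the sketch below assumes CuPy may or may not be installed on the test machine, and the captured signal is simulated.

```python
# The same spectral analysis on GPU (CuPy) or CPU (NumPy), chosen at run time.
import numpy as np

try:
    import cupy as cp
    xp = cp                                  # GPU backend available
except ImportError:
    cp = None
    xp = np                                  # CPU fallback

def power_spectrum(samples):
    data = xp.asarray(samples)
    spectrum = xp.abs(xp.fft.rfft(data)) ** 2
    # Copy the result back to host memory if it was computed on the GPU
    return cp.asnumpy(spectrum) if xp is not np else spectrum

samples = np.random.randn(1 << 20).astype(np.float32)   # simulated capture
print(power_spectrum(samples)[:5])
```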

To an extent, the heterogeneous computing trend is being adopted on the software side too. Test engineers are beginning to mix and match modules from different software environments like LabVIEW, MATLAB and Visual Studio to form tool-chains that meet their specific requirements.

Testing on the cloud
Another trend in some ways linked to heterogeneity is the rise of cloud-based testing. In cases where testing requires heavy computations, testing teams are beginning to rent and use processing power and tools on the ‘cloud.’ However, this option is available only for test processes that do not require a real-time response. That’s because reliably and securely sending data back and forth across a network introduces some latency.

Use of design IPs in both devices and testers
For quite some time now, intellectual property (IP) modules have been used in the design and development of devices and software in order to speed up development and ensure reuse of component designs. Now there is a similar trend in test software and I/O development as well. In many cases, IP components used in the design of devices are beginning to be included in the corresponding test equipment also.

Consider, for example, a wireless communications device that uses various IP units for data encoding and decoding, signal modulation and demodulation, data encryption and decryption, and so on. A tester for the device would also need the same functions in order to validate the device. Instead of reinventing the wheel, the same design IP components used in the device can be reused by the test team to reduce the time and cost involved in stages like design verification and validation, production test, fault detection and so on. The sharing of IP components also facilitates concurrent testing during design, production and other stages, to further reduce time-to-market.
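
As a hedged illustration, the sketch below shares a single QPSK mapping function between stimulus generation and the tester’s golden reference, then grades a simulated DUT output by error vector magnitude (EVM); the mapping, noise level and pass limit are illustrative rather than taken from any particular device.

```python
# Reusing one "design IP" block (a QPSK mapper) for both stimulus and checking.
import numpy as np

# Shared IP block: QPSK symbol mapper used by the device design AND the tester
QPSK_MAP = {(0, 0): 1 + 1j, (0, 1): -1 + 1j, (1, 1): -1 - 1j, (1, 0): 1 - 1j}

def qpsk_modulate(bits):
    pairs = bits.reshape(-1, 2)
    return np.array([QPSK_MAP[tuple(p)] for p in pairs]) / np.sqrt(2)

# Tester side: generate stimulus, capture the (simulated) DUT output, compare
bits = np.random.randint(0, 2, 128)
expected = qpsk_modulate(bits)                       # golden reference from shared IP
dut_output = expected + 0.02 * (np.random.randn(expected.size)
                                + 1j * np.random.randn(expected.size))

error = dut_output - expected
evm_percent = 100 * np.sqrt(np.mean(np.abs(error) ** 2) / np.mean(np.abs(expected) ** 2))
print(f"EVM = {evm_percent:.2f}%  ->  {'PASS' if evm_percent < 5.0 else 'FAIL'}")
```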

According to industry reports, the availability of FPGAs on test modules is also facilitating this trend. Modules such as the Geotest GX3500, for example, have an FPGA for implementing custom logic and can support a mezzanine card that holds a custom interface, providing the flexibility needed to implement the design IP. Such modules, along with the use of common high-level design software, greatly speed up the implementation of design IP in test.

This whole concept of using design IP in test has the potential to integrate testing throughout the lifecycle, right from design to production. However, experts feel that in areas like semiconductor fabrication, which rely largely on contract manufacturing, many challenges remain before validation and production test can be merged, because a lot of manufacturers still consider testing an ‘evil’ process and do not cooperate or invest resources in advanced techniques.

That said, it is obvious that this is a ‘happening’ time in the world of T&M. The explosion of wireless standards and the meteoric developments in other technological arenas are putting a lot of pressure on the T&M industry to deliver its best. There is a lot of lateral thinking, experimentation, research and development leading to interesting products and techniques.


The author is a technically-qualified freelance writer, editor and hands-on mom based in Chennai
