A programmable power supply is a remotely controlled power supply with an analogue input or a digital interface. It is used in a wide variety of applications, including automated equipment testing, crystal growth monitoring, semiconductor fabrication and X-ray generators. It typically employs an integral microcomputer to control and monitor operation. A supply equipped with a computer interface may use proprietary communication protocols, or standard protocols and device control languages such as Standard Commands for Programmable Instruments (SCPI).
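As a sketch of what SCPI control looks like, the snippet below builds a generic command sequence for programming a DC output. The commands shown (`*RST`, `VOLT`, `CURR`, `OUTP`) are the common short-form SCPI syntax; a real instrument's programming manual may use slightly different subsystem names.

```python
# Sketch: composing generic SCPI commands for a programmable DC supply.
# No hardware is assumed here; the strings would be sent over GPIB, USB
# or LAN by an I/O library in a real setup.

def scpi_setup(voltage_v, current_limit_a):
    """Return the SCPI command sequence to programme a DC output."""
    return [
        "*RST",                         # reset to a known state
        f"VOLT {voltage_v:.3f}",        # set the output voltage
        f"CURR {current_limit_a:.3f}",  # set the current limit
        "OUTP ON",                      # enable the output
    ]

for cmd in scpi_setup(5.0, 1.5):
    print(cmd)
```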

Programmable DC power supplies
Power supply manufacturers have their devices certified by a nationally-recognised primary standards laboratory, such as the National Institute of Standards and Technology (NIST), or by an internationally-recognised agency such as CSA, UL or VDE.

There are two basic types of programmable DC power supplies that are commonly used: linear and switch-mode.

Linear power supplies operate by rectifying AC line power to create DC, then filtering and regulating it to produce user-selectable voltage or current levels. They are heavier than switch-mode designs because the 50Hz or 60Hz transformer and the associated filters are physically large.

Switch-mode power supplies start out the same way, rectifying and filtering the AC line input; however, they then chop the DC into high-frequency AC, which can be transformed and re-rectified with much smaller magnetics. They are significantly smaller, lighter and more efficient than linear supplies, and have replaced them in many applications.

Fig. 1: Simplified block diagram of a programmable linear DC power supply

Linear power supplies continue to be a popular choice for test equipment. These are generally durable, accurate and deliver power with little noise. Their simple, direct feedback mechanisms deliver excellent load regulation and overall stability. Fig. 1 shows a simplified block diagram of a programmable linear power supply.

The microprocessor receives input from the front-panel user interface or the remote interface. A digital-to-analogue converter (DAC) takes the digital setting and translates it into an analogue value, which serves as the reference for the analogue regulator. Setting resolution and accuracy are determined by the quality of this conversion and regulation process. Some common parameters for selecting a programmable DC power supply are given below:
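The setting path described above can be sketched as an ideal DAC transfer function: a digital code becomes an analogue reference voltage for the regulator. The 16-bit width and 10V reference below are illustrative assumptions, not values from any particular supply.

```python
# Minimal sketch of the setting path: a digital setting code is
# translated by an ideal DAC into the analogue reference voltage
# that the regulator's control loop tracks.

def dac_reference(code, n_bits=16, v_ref=10.0):
    """Ideal DAC: map a digital code to an analogue reference voltage."""
    full_scale = (1 << n_bits) - 1   # largest representable code
    return v_ref * code / full_scale

# A mid-scale code produces roughly half the reference voltage
print(round(dac_reference(32767), 4))
```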

Resolution and accuracy. Programmable voltage and current settings have resolution and accuracy specifications associated with them. Resolution determines the minimum increment by which the output may be adjusted, and accuracy describes how closely the actual output matches the programmed value, traceable to international standards. A DAC with more bits offers finer resolution and can deliver more distinct values for the control loop to use as its reference. Setting accuracy in a power supply is largely determined by error terms in the DAC, including quantisation error.

Read-back accuracy is sometimes called meter accuracy. It determines how close the internally-measured values are to the actual values of the output voltage (after setting accuracy is applied).

Read-back resolution is the smallest change in internally-measured output voltage or current that the instrument can discern. It is usually expressed as an absolute value, but may also be given as a percentage of full scale.

Load regulation. This is a measure of the ability of the power supply to keep its output voltage or output current constant as the load changes.
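Load regulation is commonly quoted as the change in output between no-load and full-load conditions, expressed as a percentage. The formula below is the common convention; the 12V figures are made-up illustrative readings.

```python
# Hedged sketch of a common way load regulation is computed: the output
# change from no load to full load, as a percentage of the loaded value.

def load_regulation_pct(v_no_load, v_full_load):
    """Percent change in output voltage between no-load and full-load."""
    return (v_no_load - v_full_load) / v_full_load * 100.0

# Example: a 12V output that sags 10mV under full load
print(round(load_regulation_pct(12.010, 12.000), 3))
```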

Line regulation. It is a measure of the ability of the power supply to maintain its output voltage or output current while its AC line input voltage and frequency vary over the full allowable range.
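Line regulation can be quantified the same way: the output change observed as the AC input swings over its allowed range, as a percentage of the nominal output. The formula below is a common convention, and the voltages are illustrative readings.

```python
# Sketch of a line regulation figure: output change over the full AC
# input range (e.g. low line to high line), as % of nominal output.

def line_regulation_pct(v_out_low_line, v_out_high_line, v_nominal):
    """Percent output change between low-line and high-line AC input."""
    return abs(v_out_high_line - v_out_low_line) / v_nominal * 100.0

# Example: a 12V output that moves 4mV across the AC input range
print(round(line_regulation_pct(11.998, 12.002, 12.0), 3))
```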

Remote sensing. A programmable power supply is equipped with remote-sensing capability. Remote sensing is required in applications where the load is located at some distance, typically more than 3m (10 feet), from the power supply output terminals. Remote sensing is also useful if the voltage measured at the load's input terminals is significantly lower than the voltage measured at the power supply output terminals.

The difference in voltage depends on the load current and on the size and length of the load leads. Remote sensing uses a four-wire connection (Fig. 2) so that the voltage you set on the supply is the voltage you get at the device under test (DUT), in spite of the voltage drops in the cables that carry current between the power supply and the DUT.
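A back-of-envelope estimate shows why this matters. The 0.01 ohm/m resistance below is an illustrative figure for thin hook-up wire, not a datasheet value; heavier gauge wire would drop proportionally less.

```python
# Estimate of the lead drop that remote sensing compensates for:
# the drop depends on current, wire resistance and run length,
# counted twice because current flows out and back.

def lead_drop_v(current_a, length_m, ohms_per_m=0.01):
    """Voltage lost in a two-wire run (out and back) to the load."""
    return current_a * (2 * length_m * ohms_per_m)

# 5A through 3m of thin leads: the DUT sees ~0.3V less than the terminals
print(lead_drop_v(5.0, 3.0))
```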

Analogue interface. DC programmable power supplies typically provide a standard and isolated analogue interface, through which a supply’s DC output voltage, current and over-voltage protection can be set. These values are controlled by supplying a voltage signal, a current signal or by connecting a resistor to the analogue input. For example, you can use the analogue output of a PLC to control the output voltage of a power supply.
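Analogue programming of this kind is usually a linear mapping from the control signal to the output span. The 0-10V control range and 0-30V output range below are assumptions for illustration; the actual scaling is given in each supply's manual.

```python
# Sketch of voltage-signal analogue programming: a PLC's 0-10V control
# output maps linearly onto the supply's 0-30V output range.

def programmed_output(v_control, v_control_max=10.0, v_out_max=30.0):
    """Map an analogue control voltage to the programmed output voltage."""
    v_control = min(max(v_control, 0.0), v_control_max)  # clamp to range
    return v_out_max * v_control / v_control_max

# A 5V control signal programmes half of the 30V span
print(programmed_output(5.0))
```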

Residual AC. The output of these DC power supplies is not perfect DC; some AC is to be expected on the output. In some applications excessive AC on the output can produce unexpected circuit behaviour, so it helps to know the amplitude of the residual AC.
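One common way to quantify residual AC is as the RMS of the ripple riding on the DC level, often expressed as a percentage of that level. The sketch below uses a handful of made-up samples purely for illustration.

```python
# Rough illustration of residual AC: the RMS of the AC component
# of a sampled output, expressed as a percentage of the DC level.

import math

def ripple_rms_pct(samples, v_dc):
    """RMS of the AC component of the sampled output, as % of DC."""
    ac = [s - v_dc for s in samples]
    rms = math.sqrt(sum(x * x for x in ac) / len(ac))
    return rms / v_dc * 100.0

# A 12V output with about +/-10mV of ripple riding on it
samples = [12.01, 12.00, 11.99, 12.00]
print(round(ripple_rms_pct(samples, 12.0), 4))
```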
