I wonder who, if anyone, will bother to read this page all the way through. I'm sure a large majority will find it boring. The one or two people who are turned on by metrology will read it through and then send me emails telling me everything I did wrong. So why am I posting it? Damned if I know. I guess it's because I did a lot of work to improve my 8 ohm dummy load and I wanted to show the world how much trouble I went to for an extra 1.5% accuracy.

Verifying My Test Equipment

It seems I should give some verification of the test equipment I am using to make measurements presented in these pages. I don't have anything that could be called an electrical standards lab. The fancy term for that is a metrology laboratory. I don't have one of those either.

One very good instrument I have is a Hewlett Packard 4261A LCR Meter I bought through eBay, but it has been replaced by this little wonder from Japan, which gives one more digit than the 4261A and performs a wider variety of tests. The 4261A was used in the measurements on the 8 ohm dummy load below; I'm not going to repeat the work using the new LCR meter.

My most precise instrument is this Mastech MS8050 DMM. It was an anniversary present from my wife.

It is more than a 4-1/2 digit instrument: it will display 52,000 counts.

Power measurements are based on a known resistance and a known voltage. Total harmonic distortion (THD) is measured by an HP 334A that is basically self-calibrating. At this writing I am still wrestling with the calibration of an intermodulation distortion (IMD) analyzer: I have a home-brew version that is breadboarded and a Heathkit IM 5248 I found on eBay, and they disagree by almost a factor of 2.

HP 3570A Network Analyzer.

I also have an HP 3570A network analyzer. It has two inputs and reads only relative values between the two channels. There are 5 screwdriver-adjustable pots on the front panel that calibrate the comparison of amplitude (dB) and phase angle (degrees) between channels A and B. If you want one of these, they turn up on eBay occasionally; that's where I got mine, but I had to buy two to get enough parts to make one working unit. Also, the 3570A by itself is useless: it needs a 3320B frequency synthesizer to work. If you have deeper pockets, a 3330B will give you more features such as frequency sweeps.

I want to use the 3570A to evaluate my 8 ohm dummy load across the audio band and a little beyond. But first I need to know how accurate the analyzer is, starting with the dB readouts. All I have to evaluate this is an old HP 350D 600 ohm attenuator. It works all the way down to DC, and as I watch the DC input resistance while switching the attenuator knobs, 10 dB steps and 1 dB steps, the values vary from 593 to 605 ohms. Tests with an audio oscillator and the voltmeter function of the 334A indicate a tolerance of plus or minus 0.1 dB.

Testing the 3570A is a matter of connecting one channel output to the same channel input and connecting the attenuator between the other input and output. The initial level setting is determined by the output setting on the synthesizer. Lights on the front panel of the analyzer warn if the maximum input amplitude is exceeded. The attenuator is connected as shown in the figure below, and the channel selector is set to either A or B depending on where the attenuator is connected. The analyzer instructions state that it is permissible for the output not to be terminated in 50 ohms as long as the frequency is not too high.

The amplitude readout selector can be set to A, B-A, or B. The measurements given in the chart below are single-channel only, not compared between the two channels. The frequency used was 100 Hz.

Table 1, Log Converter Tracking Test.

Setting (dB)   A Display Reading (dB)   B Display Reading (dB)
0 -0.01 0.02
1 -1.00 -0.98
2 -2.00 -1.98
3 -2.96 -2.95
4 -3.98 -3.98
5 -5.00 -5.00
6 -6.00 -6.00
7 -7.10 -7.08
8 -8.12 -8.11
9 -9.16 -9.13
10 -10.16 -10.12
0 -0.01 0.02
10 -10.26 -10.18
20 -20.29 -20.18
30 -30.48 -30.30
40 -40.68 -40.43
50 -51.07 -50.46
60 -60.58 -60.22
70 -70.04 -69.50
80 -78.40 -77.74
90 -84.2 -83.4
100 -87.4 -85.2
0 -0.02 -0.01

You are probably wondering why the zero point is different between A and B. There are separate zero adjustments for channels A and B, so I couldn't just leave them alone, and backlash in the zeroing pots makes exact zeroing difficult to impossible. I did the best I could.

I have started taking data using a small digital voice recorder so I don't have to take the time to write it down. There is a certain amount of zero drift so speed in taking data is important.

The instrument is specified to read down to -110 dB, but I don't have the HP-approved cables to connect the synthesizer to the analyzer, nor the approved test leads. Leakage is evident when the attenuation is above 70 dB. This is apparently leakage from the cable carrying the signal from the synthesizer to the analyzer, which is at a level of +10 dBm; that signal leaks into the test cable, which is at a level of less than -70 dBm.

Verifying the analyzer over its frequency range.

For this test each output was terminated with a 50 ohm coaxial pass-through termination, and each output was connected to its input using equal-length test leads (4 BNC-to-alligator clips, clipped together).

Table 2, Testing Analyzer Over Frequency Range.

Frequency (Hz) B-A Amplitude (dB) B-A Phase (Degrees)
100 -0.03 +0.00
200 -0.02 -0.00
500 -0.00 +0.00
1.00 k -0.03 -0.01
2.00 k -0.01 -0.00
5.00 k -0.01 +0.00
10.0 k -0.03 +0.00
20.0 k -0.00 -0.00
50.0 k -0.01 -0.00
100 k -0.03 +0.00
200 k -0.00 -0.03
500 k -0.01 -0.08
1.00 M -0.03 -0.23
2.00 M -0.02 -0.58
5.00 M -0.03 -1.31
10.0 M -0.06 -2.36

Some may question my use of +0.00 and -0.00. This is how the digital readouts display the numbers. A reading of +0.00 means the value is positive but less than 0.005. Similarly, a reading of -0.00 means the value is negative but its magnitude is less than 0.005.

The largest error of 0.03 dB translates into a voltage error of 0.345%. These random jumps seem to be caused by noise in the instrument: the display jumps a lot in the last digit, and I try to record the value it spends the most time on.

My Dual 8 Ohm Load For Testing Stereo Amplifiers.

Another vital link in the amplifier testing system is the dummy load.

Note: This is not meant to be a how-to article, mostly because those resistors aren't plated with gold; they are solid unobtainium. The section on making precision resistors by successive approximation may be useful, so I will walk you through an example.

I made my load out of 0.909 ohm 1% 30 watt resistors, nine in series on each side. That gives a calculated value of 8.181 ohms, which is 2.2625% high. The calculated power rating is 270 watts. With them snuggled together like that I don't for a minute believe that I can expect them to dissipate that much power; these resistors are designed to be mounted on a chassis, which provides a considerable heat sink. I estimate I can safely dissipate 50 watts on each side, and I have dissipated 200 watts for short periods. Resistors of this type don't burn out instantly when overloaded. They get hotter and hotter until the high temperature finally destroys them.

I can measure the resistance using my HP LCR meter. One side comes out 8.17 ohms and the other 8.19 ohms. But the 4261A only measures at two frequencies, 1 kHz and 120 Hz. What I really want to know is how the impedance behaves over the entire audio band. After that I will look into adjusting the resistance to bring it to within 1% of 8 ohms.

In order to test the dummy load using the 3570A I need three precise low-power resistors: an 8.00 ohm and two 72.0 ohm. While I was at it I decided to make two 8.00 ohm resistors, and it's a good thing I did because I found that I needed both.

Making an 8.00 ohm resistor.

The picture below shows the end result of the procedure described. If you look carefully at the photo of the dummy load above you will see that there are two such assemblies on each side of the load chassis. In the photo below the 72.0 ohm resistor is on the left out of the picture. The two resistors are permanently connected in series.

1. Start with an 8.2 ohm resistor; it measures 8.25 ohms.

2. Calculated parallel resistor needed: 264 ohms.

3. Go one standard value above what is needed, so the combination comes out a little high and can be trimmed further: use 300 ohms, which measures 298 ohms.

4. Measured parallel value: 8.03 ohms.

5. Calculated value to parallel next: 2.141 k ohms.

6. Use 2.4 k ohms, which measures 2.38 k ohms.

7. Measured parallel value: 8.00 ohms.

The digital display occasionally read 8.01. This means that the value is somewhere between 8.00 and 8.01. It is possible to get a feel for where it is by how long it spends indicating 8.00 versus 8.01. I estimated a value of 8.0025 and calculated a required parallel value of 25.6 k ohms. I temporarily connected a 27 k ohm and the display settled down to a steady 8.00 ohms. I soldered the resistor to the wires. The photo above shows this combination of resistors.

Resistor values are affected by the heat of soldering, but they return to their original room temperature values upon cooling.
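The successive-approximation steps above can be sketched in a few lines of Python. The resistor values are the measured ones from the walk-through; the helper names are my own:

```python
def parallel(r1, r2):
    """Resistance of r1 and r2 in parallel."""
    return r1 * r2 / (r1 + r2)

def needed_parallel(target, measured):
    """Resistor to place in parallel with `measured` to reach `target`
    (requires target < measured)."""
    return 1 / (1 / target - 1 / measured)

# The walk-through, using the measured values from the text:
step1 = needed_parallel(8.00, 8.25)   # calculated need: 264 ohms
step2 = parallel(8.25, 298)           # 300 ohm nominal measures 298 -> about 8.03 ohms
step3 = needed_parallel(8.00, 8.03)   # about 2141 ohms; use 2.4 k (measures 2.38 k)
step4 = parallel(8.03, 2380)          # about 8.00 ohms
```

Each pass uses the measured value of the previous combination, so component tolerances are absorbed as you go; that is the whole trick.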

Verifying the Resistor Networks.

The purpose of this test is to examine the behavior of the resistor network over the frequency range. Channel A of the analyzer has its output terminated by the 80 ohms of the full divider (tap not used), and the channel A input is connected directly to the A output. Channel B has the divider fed from the output while the input is taken from the tap. Equal lengths of cable were used on both channels. If the analyzer could be easily zero adjusted, the network would read -20.00 dB and 0.00 degrees at 100 Hz.

Table 3, Testing Voltage Divider Over Frequency Range.

Frequency (Hz) B-A Amplitude (dB) B-A Phase (Degrees)
100 -20.12 +0.22
200 -20.13 +0.23
500 -20.13 +0.23
1.00 k -20.12 +0.22
2.00 k -20.13 +0.23
5.00 k -20.13 +0.23
10.0 k -20.13 +0.24
20.0 k -20.14 +0.31
50.0 k -20.14 +0.39
100 k -20.12 +0.59
200 k -20.13 +0.59
500 k -20.12 +1.96
1.00 M -20.09 +3.56
2.00 M -20.05 +6.74
5.00 M -19.76 +16.13
10.0 M -18.84 +31.37

Note: The 3570A is an analog instrument with digital readouts; the log conversion for dB is done in the analog domain. It is surprising how accurate HP engineers were able to get the log conversion. Looking back at Table 1, with the step attenuator at -20 dB the readings were channel A = -20.29 dB and channel B = -20.18 dB. The readings at 0 attenuation were A = -0.01 and B = +0.02 dB. Subtracting the zero offsets from the readings gives A = -20.28 and B = -20.16, so the B - A offset is -0.12 dB. Subtracting this offset from the readings in Table 3 gives an error of 0.01 dB, which is well within system error. The attenuation of the 72.0 ohm and 8.00 ohm voltage divider may be considered to be -20.00 dB.

Comparing the 8.00 ohm Resistor to the 8 ohm Dummy Load.

At last the snapping alligators have been dealt with and we are ready to start draining the swamp. This table has a different look because I entered the data in a spreadsheet and converted it to HTML, which lets me make repeated calculations the easy way.

Table 4, 8 ohm Dummy Compared to 8.00 ohm Resistor.

Frequency (Hz)   Amplitude (dB)   Phase (Degrees)   Impedance (Ohms)   R (ohms)   +jX (ohms)   A-B Amplitude (dB) from Table 3

[Table data not recoverable; the rows ran from 1 kHz through 10 MHz.]

The Frequency (Hz) column is the frequency set on the 3320B synthesizer. The Amplitude (dB) column is the difference between the 8 ohm dummy and 72.0 ohm divider and the 8.00 ohm and 72.0 ohm divider, and the Phase (Degrees) column is the phase difference between the two dividers. Both amplitude and phase were read directly from the network analyzer. Impedance (ohms) uses a formula to convert the dB comparison to the actual impedance of the 8 ohm dummy load. R (ohms) is obtained from R = Z cos(Phase), and jX (ohms) = Z sin(Phase), where Z is the impedance listed in column 4. The last column is needed to remove the effect of the 8.00 ohm resistor and the small error in zero adjusting the analyzer; it is used in the calculation of the Impedance (ohms) column. Adding it to the calculation brings the result within 0.5% of the nominal value of 8.18 ohms. Note that at low frequencies there is no difference between the impedance and the resistance.
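The formula behind the Impedance (ohms) column is not spelled out above. Here is one plausible reconstruction, assuming each divider is a 72.0 ohm resistor in series with the load (or the 8.00 ohm reference) with the tap taken across the low side, and treating the dB/phase reading as a complex ratio; the function name and this treatment are my own:

```python
import cmath
import math

def impedance_from_reading(delta_db, phase_deg, r_series=72.0, r_ref=8.0):
    """Recover the load impedance from the B-A comparison of the two dividers.

    The reference divider attenuates by r_ref/(r_ref + r_series) = 0.1 (-20 dB).
    The measured delta_db and phase give the complex ratio of the unknown
    divider's output to the reference divider's output.
    """
    ref = r_ref / (r_ref + r_series)  # 0.1 for 8.00 and 72.0 ohms
    ratio = ref * 10 ** (delta_db / 20) * cmath.exp(1j * math.radians(phase_deg))
    # ratio = Z/(Z + r_series)  =>  Z = r_series * ratio / (1 - ratio)
    return r_series * ratio / (1 - ratio)

z = impedance_from_reading(0.174, 0.0)
# z.real is R = Z cos(Phase); z.imag is X, the jX term
```

A zero-dB, zero-degree reading recovers exactly 8.00 ohms, and a reading of about +0.174 dB recovers the nominal 8.18 ohms, consistent with the dummy load being about 2.25% high.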

So, what happened at 10 MHz? The largest amplitude value possible for pure resistances is 20 dB. Because the reading is 21.19 dB, a negative resistance would be required to give this value. In actuality there is a resonance effect that gives more output than input. The 72.0 ohm resistor is inductive, as confirmed by the RX meter test below, and evidently the dummy load becomes capacitive somewhere between 5 and 10 MHz; note that the phase angle changes sign between these two frequencies. One inductive resistor and one capacitive resistor form a series resonant circuit. I can't test the dummy load on the RX meter because its resistance scale only goes down to 15 ohms.

Table 5, RX meter data for the 72.0 ohm resistor.

Frequency (MHz)   RX Meter Rp (ohms)   RX Meter Cp (pF)   Lp (H)   Magnitude Z (ohms)   Angle Theta (degrees)

[Table data not recoverable.]

The 8.18 ohm resistance of the dummy load is only 2.25% high, but I am going to correct it just because I can. But how? Making it an 8.00 ohm resistance calls for a 363.55 ohm resistor connected in parallel. In case you are wondering how I arrive at these values, here is how it's done.

1/Rp = 1/R1 + 1/R2.

Where Rp is the parallel combination of the two resistors, R1 is one of the resistors, and R2 is the other. If we know Rp and R1 and want to calculate R2, we write:

1/R2 = 1/Rp - 1/R1

This calculation is very easy on an HP calculator.
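For those without an HP calculator, the 363.55 ohm value quoted above falls straight out of this rearrangement; a quick check in Python (the function name is mine):

```python
def r2_for_target(rp, r1):
    """Solve 1/Rp = 1/R1 + 1/R2 for R2, given the desired parallel value Rp."""
    return 1 / (1 / rp - 1 / r1)

# Bringing the 8.18 ohm bank down to 8.00 ohms:
r2 = r2_for_target(8.00, 8.18)   # about 363.6 ohms
```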

Suppose we connect a 360 ohm resistor in parallel with the bank of 0.909 ohm resistors. If we connect the load to a 50 watt amplifier, the applied voltage is 20 volts. The bank of gold resistors has 20 volts across it, as does the single 360 ohm resistor, so its power dissipation is 20^2/360 = 1.11 watts. If I had a 360 ohm 2 watt 5% resistor in my stock I would probably use it, although that would limit me to testing 50 watt amplifiers. I would like to be able to test 100 watt amplifiers, and maybe the occasional 200 watt amp for short periods as mentioned above. 40 volts gives 200 watts in an 8 ohm load, and the 360 ohm resistor would then be dissipating 4.44 watts, so I would need a 360 ohm 5 watt resistor. Remember, the gold resistors only have to have power applied for a short time, so the parallel resistor wouldn't be in danger. I did find a 500 ohm 10 watt resistor that measured 400 ohms, but it is wire wound and I didn't want to use it. I decided instead to parallel individual gold resistors with small resistors. The way the gold resistors are laid out (picture above) makes it easiest to connect 1 watt carbon film resistors across pairs of the gold resistors. I arrived at the schematic diagram shown below.
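The wattage bookkeeping in that paragraph is just V = sqrt(P * R) for the load voltage and P = V^2/R for the trimming resistor; a quick sketch reproducing the numbers (the function names are mine):

```python
import math

def load_voltage(power_w, r_load=8.0):
    """Voltage across an r_load ohm load at a given amplifier power."""
    return math.sqrt(power_w * r_load)

def trim_power(volts, r_parallel=360.0):
    """Dissipation in a parallel trimming resistor at that voltage."""
    return volts ** 2 / r_parallel

v50 = load_voltage(50)     # 20 V for a 50 watt amplifier
p50 = trim_power(v50)      # about 1.11 W in the 360 ohm resistor
v200 = load_voltage(200)   # 40 V for 200 watts
p200 = trim_power(v200)    # about 4.44 W
```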

Modified Dummy Load.

Because I am lazy, I did a SPICE simulation of this network to find the correct value of the small resistors, shown as 150 ohms. I set it up with a 5 amp current source driving the circuit and adjusted the value of the vertical resistors until I got as close as possible to 8.00 ohms. While I stock every 5% value in 1/4 watt resistors, I only stock every 4th value in 1/2 and 1 watt. It turned out that 150 ohms gives a value of 8.0086 ohms, which is an error of 0.1075%. That's luck, but if my boss asks I will say it was good planning. Oh, I forgot, I'm retired.

Power Dissipation.

With the 5 amp current source driving the entire network the voltage across it was 40.043 volts. This gives a resistance of 8.0086 ohms. This is an error of 0.1075%. The voltage across each 0.909 ohm resistor is 4.491 volts for a dissipation of 22.2 watts. The voltage across each 150 ohm resistor is 8.928 volts for a dissipation of 0.531 watts.

Allow me to point out that 200 watts is the power of my silicon amplifier and I have used the load at that power for short periods of time with a fan blowing across it. The highest power I am ever likely to apply from a tube amplifier is 100 watts which should be safe for all resistors.

After installation of the 150 ohm resistors, resistor bank L read 8.02 ohms and R read 8.00 ohms. Connecting a 3.3 k ohm resistor in parallel with bank L brought it down to 8.00 ohms. The 3.3 k ohm 1 watt resistor will dissipate approximately 1/2 watt at an applied voltage of 40 volts.

Now let's see how the new load behaves at high frequencies. For all of its life the aluminum chassis, and therefore the bodies of the 0.909 ohm resistors, have not been connected to either side of the load. I will run a frequency test in this condition and then try connecting the chassis to the cold side of the load, the ground of the analyzer.

Table 6, Testing Load Over Frequency Range.

Frequency (Hz) Chassis Not Grounded Chassis Grounded
B-A Amplitude (dB) B-A Phase (Degrees) B-A Amplitude (dB) B-A Phase (Degrees)
10.0 k 0.08 0.81 0.05 0.76
20.0 k 0.08 1.36 0.05 1.32
50.0 k 0.08 2.91 0.07 2.83
100 k 0.14 5.42 0.14 5.35
200 k 0.28 10.28 0.29 10.03
500 k 1.15 22.86 1.12 22.36
1.00 M 3.22 36.70 3.15 35.97
2.00 M 7.07 46.76 6.97 46.17
5.00 M 14.25 38.28 14.21 37.35
10.0 M 19.80 -11.41 19.36 -18.87

It looks like things are improved ever so slightly, except at 10 MHz. I'd like to see a transformer-coupled tube amplifier that has any output at that frequency, and OTLs are too inefficient for my tastes. It seems pointless to take data above 1 MHz. So why did I do it? Just ignore it.

Table 7, Tuning Out Inductive Effect of Load.

Frequency (Hz) No Capacitor 0.0047 uf 0.01 uf 0.022 uf
B-A Amplitude (dB) B-A Phase (Degrees) B-A Amplitude (dB) B-A Phase (Degrees) B-A Amplitude (dB) B-A Phase (Degrees) B-A Amplitude (dB) B-A Phase (Degrees)
1.00 k 0.07 0.12 0.05 -0.00 0.08 0.05 0.07 -0.03
10.0 k 0.07 0.54 0.06 0.35 0.07 0.28 0.08 -0.08
100 k 0.15 4.96 0.14 3.79 0.11 2.47 0.06 -0.68
1.00 M 3.18 34.77 2.88 24.99 1.80 12.25 -3.46 -4.51

If I haven't already done so, and I don't think I have, I should state what amplitude (dB) and phase angle correspond to a 1% error. A variation in amplitude of 1% is 0.086 dB, which is getting close to the accuracy limit of the analyzer. Some may question my earlier results of sub-1% accuracy; I do too. A 2% error is 0.17 dB, and 5% is 0.42 dB. A phase shift of 8.08 degrees gives an error of 1% in the impedance as compared to the R value in the R + jX representation.
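These thresholds are easy to check. The dB figures come from 20 log10(1 + error), and the phase limit follows from the R = Z cos(Phase) relation used earlier; the function names and the cos() criterion for the phase limit are my reading of that relation:

```python
import math

def db_for_percent(pct):
    """dB corresponding to a voltage ratio error of pct percent."""
    return 20 * math.log10(1 + pct / 100)

def phase_for_percent(pct):
    """Phase angle (degrees) at which R = Z cos(phase) falls pct percent below Z."""
    return math.degrees(math.acos(1 - pct / 100))

db_for_percent(1)      # about 0.086 dB
db_for_percent(2)      # about 0.17 dB
db_for_percent(5)      # about 0.42 dB
phase_for_percent(1)   # about 8.1 degrees
```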

I need to examine the range between 0.01 uf and 0.022 uf in more detail, so let's do it.

Table 8, Tuning Out Inductive Effect, Finer Resolution.

This data was taken the next day and may not match the data above exactly.

Frequency (Hz) 0.010 uf 0.012 uf 0.015 uf 0.018 uf 0.022 uf
B-A Amplitude (dB) B-A Phase (Degrees) B-A Amplitude (dB) B-A Phase (Degrees) B-A Amplitude (dB) B-A Phase (Degrees) B-A Amplitude (dB) B-A Phase (Degrees) B-A Amplitude (dB) B-A Phase (Degrees)
10.0 k -0.20 0.61 -0.75 0.80 -0.84 0.77 -0.95 0.72 -1.16 0.73
20.0 k -1.11 1.32 -0.76 0.99 -0.83 0.84 -1.05 0.84 -1.17 0.71
50.0 k -0.58 1.77 -0.70 1.51 -0.83 1.24 -1.13 1.08 -1.18 0.51
100 k -0.18 2.79 -0.73 2.13 -0.83 1.80 -0.90 1.23 -1.18 0.20
200 k -0.87 5.33 -0.63 3.99 -0.75 2.84 -0.98 1.77 -1.24 -0.34
500 k 0.01 10.31 -0.23 7.51 -0.57 4.75 -1.05 2.11 -1.60 -2.85
1.00 M 1.35 11.85 0.17 7.25 -0.84 2.57 -1.91 -1.07 -4.30 -4.67

I saw no reason to take data above 1 MHz, as mentioned in the paragraph just above Table 7. There are several anomalies in the data, but the 0.015 uf and 0.018 uf columns show consistent trends, and 0.015 uf seems to be the best. The amplitude changes represent an error of 3%, while the phase error stays well within the 1% limit of 8.08 degrees. The change in sign of the derivative of the phase shift is probably due to a resonance somewhere in the system: the resistors have a bit of inductance, there is capacitance across each resistor and from each to ground, not to mention the impedance of the zip cord that connects to the load resistors. I made the measurements at the end of the wires where the amplifier connects. At the risk of having beaten the load to death, I have improved it. Here is what it looks like now.

Low Distortion Audio Signal Generator.

For many years this modified Heathkit IG-18 has been my low distortion standard.

The only way to know if a low distortion audio signal generator is really low distortion is to connect it directly to the input of a harmonic distortion analyzer.

Readings with Generator Connected Directly to THD Meter.

Frequency (Hz) THD (%)
20 0.054
30 0.04
40 0.036
50 0.034
100 0.028
1,000 0.03
1,000 * 0.026 *
2,000 0.026
5,000 0.026
7,500 0.052
10,000 0.056
15,000 0.058
20,000 0.058

* Generator on X100 range, dial on 10; this produces a quieter meter on the analyzer.

This year's anniversary brought me a new DMM and a new function generator. I had not bought a DMM since the 1970s and my last function generator was a child of the 1980s.

After I learned how to work the thing, one of the first things I did was check the total harmonic distortion at 1 kHz. It was 0.032%, as compared to 0.026% for the Heathkit. So let's ring it out to see how it performs at various output levels, load conditions, and frequencies.

Output levels.

All at 1 kc.

0.4 volts; 0.032%. Meter very steady.
1.0 volts; 0.038%.
3.0 volts; 0.033%.
6.0 volts; 0.047%. Maximum RMS output is 7.07 volts.

Optimum conditions for minimum distortion appear to be 0.70 volts output with a 50 ohm termination; the actual output voltage is then 0.35 volts. The display still shows "Load: Hi-Z". As the output is turned up and down with the knob, the click of a relay is heard periodically. The best distortion is at the level just below a relay click.

Frequency (Hz) THD (%)
20 0.038
30 0.038
40 0.037
50 0.037
100 0.035
200 0.036
500 0.037
1,000 0.035
2,000 0.036
5,000 0.036
7,500 0.056*
10,000 0.060*
15,000 0.058*
20,000 0.060*

* Meter very jumpy.

What do I mean by a jumpy meter? The meter will settle down to a value which can be read and may stay there for as long as 2 or 3 seconds or as short as half a second. It will then swing up wildly, bounce up and down a few times, and settle back down to the reading. This is how I was able to get numbers out of distortion tests. The difference is that the Heathkit made the meter do this at all frequencies when distortion was being read on the 0.1% range, while with the new function generator the meter is very quiet until 7500 Hz is reached; then it begins jumping just as with the Heathkit. A range change on the analyzer is required between 5000 and 7500 Hz. This begins to point to a defect in the HP 334A; note the very similar values at and above 7500 Hz. I'm going to start using the SIGLENT for THD measurements, but I'm going to hang on to the IG-18 in case problems turn up.

Intermodulation Distortion Analyzer.

I haven't yet solved the calibration problem. I bought another Heathkit IM 5248 IM analyzer on eBay. It came with a manual, so I should be able to calibrate the two against each other. A saying often quoted on the Time Nuts list is "A man with one clock knows what time it is. A man with two clocks has no idea." The newest analyzer reads about 0.03% when its signal output is connected directly to its input; the other one reads about 1%. But when I cross-wire the signal source of one to the test input of the other, the results are totally inconsistent, and when my breadboarded IM analyzer gets into the act things get even worse. I have rechecked how I calibrated my analyzer design and I still think I am right. I'll keep working on this and post the results if and when I figure it out. Stay tuned.


This page last updated June 23, 2014.