Initial Set-Up and Calibration for the MSA
 
Updated Feb. 17 2014,  Software Version 117 Rev B. No hardware changes.
Updated Nov. 2 2013,  Software Version 117 Rev 0. No hardware changes.
Special Notice for XP users:  Possible XP Problems (at end of this page)

    This page describes the Initial Set-Up for a new MSA and the procedures to Calibrate it.  The MSA Software can be downloaded from this page.  The procedures apply to both the Original MSA and the SLIM MSA.  Separate pages will describe the Operation of the MSA for its different Functions.  Before you begin this Initial Set-Up and Calibration, I suggest you read and become familiar with  MSA Control and Operation.

I.  Initial Set-Up Procedure for the MSA
    Before the MSA can become operational, these procedures must be followed:
    A.  MSA Software Download and Installation
            Downloading MSA Software for use with Liberty Basic
            Downloading MSA Software as an Executable
    B.  Hardware Configuration
            MSA Hardware Configuration
            MSA Hardware Adjustments
            Computer Configuration
    C.  Software Configuration using the Configuration Manager
            Files created by the Software
    D.  Coarse Calibration using the Calibration File Manager

II.  Calibration Procedures for the MSA
    For the MSA to be accurate, calibrations must be performed, in this order:
    A.  Coaxial Cavity Filter, Tuning Procedure
            Optional Test:  Coaxial Cavity Filter Sweep
    B.  Master Oscillator Calibration
    C.  Resolving Filters in the MSA
            Resolving the Final Crystal Filters for each Path
            Resolving the DDS Crystal Filters (in-work)
    D.  Phase Detector Module Calibration (VNA only)
    E.  Path Calibration for Magnitude (and Phase for VNA)
    F.  Frequency Calibration for Magnitude
        F1.  Manual Frequency Calibration for the Basic MSA, or
        F2.  Semi-Automatic Frequency Calibration for the MSA/TG


I.  Initial Set-Up for MSA
A.  MSA Software Download and Installation
    The MSA software is written in Liberty Basic and, as the name implies, is Basic code.  I won't get into the full capabilities of Liberty Basic; you can get the application software and information at their website, http://www.libertybasic.com.  It is free, but it will continually "Nag" you to buy it.
    Before MSA Software version 113, the MSA software was released as a ".bas" file to be opened, manipulated, and run using the Liberty Basic application software.  Updates will continue to be released as spectrumanalyzer.bas, which requires Liberty Basic version 4.03 or newer to run.
    Beginning with version 113, the MSA software is also released as an Executable Program; that is, it can be run on any Microsoft Windows computer without downloading any version of Liberty Basic.  The MSA software is released as an ".exe" file.
    For either version of MSA Software, read the instructions in the ReadMe117.txt file before downloading the software.  It explains the full software installation procedure.
Future Updates and Releases
    Once the Support Files are downloaded and installed, they will not need downloading again.  All future MSA software updates will be released as revisions to "spectrumanalyzer.bas" and "spectrumanalyzer.tkn".  Simply download either and replace your previous version with the latest revision.  The name of any revision will always remain "spectrumanalyzer"; it is important that this name never be changed.  All software change releases will include a "ReadMe" text file with an appropriate revision number.
Reverting to Original Software
    It is possible, although unlikely, that you could manipulate the software configurations into a condition where your MSA does not operate.  You can always revert to the original software and start over.  Do this by finding and opening the "MSA_Software" folder.  Click and highlight the folder named "MSA_Info".  Either delete this folder (send it to the computer's "trash can") or change its name to "errorMSA_Info".  Then run spectrumanalyzer.exe (or spectrumanalyzer.bas for Liberty users).  The folder named "MSA_Info" will be re-created.  It is as though you are starting with a fresh MSA with default topology.
Previous versions of MSA Software are available on the MSA Archives Page (in-work).

Special Note:  MSA Software may have some bugs. As they are found and fixed, the software will be re-released as a later version or with a Rev update. If you ever have a problem with your MSA software, return to this page and see if a more recent "Rev" might cure your problem.  If it doesn't, email me a "Bug Report" at:   wsprowls@yahoo.com.

MSA Software Downloads for Liberty Basic

    If you wish to use the Liberty Basic application for the MSA Software, the following files are required. To download any of the following files, "Right Click" your Mouse over each link.
spectrumanalyzer.bas   (Version 117 Rev B, 2-17-14)  This is the MSA source code, written in Liberty Basic, version 4.03.  It can be viewed with any text program on your computer, such as Word or Notepad.  It will "Run" with the Liberty Basic application, version 4.03, or higher.
Redist.zip     Zip archive containing the new Ntport.dll and Zntport.sys, required when using the Parallel LPT port to control the MSA.  It is not needed if only the USB Converter is used to control the MSA.  Also, if you will use the optional USB, click to go to the USB Converter page and download its software from there.
    If you are updating Version 116 to Version 117, you only need to download and replace "spectrumanalyzer.bas".

MSA Software Downloads as an Executable
    If you wish to use the MSA Software as an "Executable" (no Liberty Basic), these files are required.  They must be installed into a common folder, named "MSA_Software".  All the files within the MSA Software folder must remain together, with the exception of "Redist.zip".
spectrumanalyzer.tkn   (Version 117 Rev B, 2-17-14)  The MSA program code, "tokenized".
spectrumanalyzer.exe   The Liberty Executable that "Runs" the spectrumanalyzer.tkn file.
vbas31w.sll                  Support File required by Microsoft Windows to run Liberty executables.
vgui31w.sll                   Support File required by Microsoft Windows to run Liberty executables.
voflr31w.sll                  Support File required by Microsoft Windows to run Liberty executables.
vthk31w.dll                  Support File required by Microsoft Windows to run Liberty executables.
vtk1631w.dll                Support File required by Microsoft Windows to run Liberty executables.
vtk3231w.dll                Support File required by Microsoft Windows to run Liberty executables.
vvm31w.dll                  Support File required by Microsoft Windows to run Liberty executables.
vvmt31w.dll                 Support File required by Microsoft Windows to run Liberty executables.
Redist.zip                     Zip archive containing the new Ntport.dll and Zntport.sys, required when using the Parallel LPT port to control the MSA.  It is not needed if only the USB Converter is used to control the MSA.  Also, if you will use the optional USB, click to go to the USB Converter page and download its software from there.
    If you are updating Version 116 to Version 117, you only need to download and replace "spectrumanalyzer.tkn".

Other "Executable" Software
    The file called "spectrumanalyzer.exe" is a Runtime Engine created by Liberty Basic.  When it is "Run", it looks for a file called "spectrumanalyzer.tkn" and runs it.  This Runtime Engine can run any Liberty Basic tokenized file, but only if the tokenized file is "directed" to run with "spectrumanalyzer.exe".  For example, if you try to open and run "George.tkn", Windoze will ask you what program you want to "open with".  Tell Windoze to use "spectrumanalyzer.exe" for all files that use the .tkn suffix.

B.  Hardware Configuration
MSA Hardware Configuration
The SLIM MSA is initially configured for 1G Band Operation.
  Verify that Mixer 1 output is connected to the Coaxial Cavity Filter.
  Verify that Mixer 2 output is connected to the I.F. Amplifier.
  If installed, verify that Mixer 3 output is the TG output (Tracking Generator Option).
MSA Hardware Adjustments
The SLIM MSA design has only one mechanical adjustment, the tuning of the Cavity Filter.  The initial positions of the tuning screws are not important.
SLIM Modules, used in the MSA, have no mechanical adjustments.
The Original MSA has modules which need preliminary adjustments before a first time run:
    For the original 8 Bit parallel A/D, adjust both pots to the centers.
    For the original 12 Bit parallel A/D, adjust both pots to the centers.
    For the original 16 Bit serial A/D, adjust both pots to the centers (2.5 volts).
    For the original Master Oscillator, adjust pot for oscillator Vcc = 5.0 volts.
       I will update this section more extensively for the Original MSA.
Computer Configuration
    The MSA can be controlled by either the Parallel LPT Port or by USB, using the Cypress FX2.  See the USB Conversion web page.  You may not have to make any configuration changes to your computer for proper operation of the MSA.  However, if your computer has an extremely high speed processor or its PCI bus speed is very fast, you may have to re-configure your LPT port.  This is how you do it:
    Enter your computer's BIOS.  This is normally done from a cold start; I have to press the "DEL" key while the computer is booting.  Find your configuration for the LPT Port.  Depending on your computer, your choices will be Normal, SPP, ECP, EPP, Bi-Directional, EPP+ECP, etc.  For a moderate to slow computer, any of these modes should work for the MSA.  For high speed computers, select EPP.  Save, and allow the computer to continue its boot.
    All Microsoft Windows Platforms support the MSA, but for Win XP (and newer) users, see the XP Problem at the end of this page.

C.  Software Configuration using the Configuration Manager
    Ensure there is no input signal to the MSA; just leave the input unterminated.  Select the Magnitude Video switch to Wide (if a manual switch is used).  Apply power to the MSA.  Open and Run spectrumanalyzer.exe (for Liberty users, spectrumanalyzer.bas).  When either is run for the first time, the Configuration Manager Window will open automatically.  The variables that are initially in place are defaults for a SLIM MSA with Tracking Generator and VNA.  Subsequently, it is opened from the Graph Menu Item, "Setup".
    The MSA software is written using defaults for the MSA hardware as a SLIM MSA/TG/VNA.  The user is able to change the variable values in the software, to match the topology of the user's MSA.  These are always accessed by the Configuration Manager Window. They can be modified at any time.

Configuration Manager Window
[Screen capture: msascreens/confgmgrdefault.gif]

   
    The variables shown in this Configuration Manager Window are for the Verification SLIM MSA with Tracking Generator and VNA.  You will now change the values in this Window to match the topology of your particular MSA.
    If you have followed a standard build for the SLIM MSA using SLIM modules, you need only to:
(1) click "Delete TG" if you did not install the tracking generator feature, or
(2) click "Delete VNA" if you installed the TG but did not install the Phase Detector Module
and,
(3) select your ADC if it is not the 16-bit module, and
(4) enter the information for nominal frequency and bandwidth of your final IF filter(s).
(5) click the "Save Configuration" button.
    In later calibration procedures you may determine more precise values for the items, and can change them by returning to the Configuration Manager.

The Buttons in the upper right quadrant of the Configuration Manager Window have these functions:
Set to SLIM Defaults - This will insert all SLIM defaults into the Configuration Manager Window.  Upon Initial Set-Up, the default values are already inserted.  But, if the Configuration Manager Window is opened after the Initial Set-Up, it is a quick way to change all the values back to SLIM defaults.
Re-Load File - This will read the hidden config.txt file and enter its values into the Configuration Manager Window.  Upon initial set-up, that file does not exist.  Therefore, this button is only used after the initial set-up and first run.
Delete TG - This will delete all options in the Configuration Manager Window that pertain to a Tracking Generator.  Click this if you do not have the Tracking Generator installed.
Delete VNA - This will delete all options in the Configuration Manager Window that pertain to the Vector Network Analyzer.  Click this if you do not have the VNA installed.
Help - This will open a window for more explanations.  I will add more to this window.
Save Configuration - If this is the Initial Set-Up for the first time, this button will save the entries into the config.txt file, the Configuration Manager Window will close, and control will return to the main program.  When the Configuration Manager Window is opened in subsequent runs, this button will save the entries into the config.txt file.  After giving Notice, it will then close the MSA session.  The session must be re-started by running "spectrumanalyzer.xxx".
Return to MSA Without Saving - This button does not appear on the Initial Set-Up.  Subsequently, this will just close the Configuration Manager Window and return to the main program, without saving the entries.

    Each of the following variables has an explanation for its value.  SLIM defaults are underlined.

PLL 1 Type - Select for LMX 2325, LMX 2326, LMX 2350, LMX 2353, ADF 4112.  "0" is an option reserved for future use.  Do not select it.
PLL 1 Polarity - For a non-inverting loop filter, use (non-inv).  For an inverting op amp, use (invert).
PLL 1 Reference - The value will determine the approximate Reference phase detector frequency of PLL 1.  If the DDS 1 Center Freq is 10.7 and if DDS 1 Bandwidth is greater than .010 (MHz), then enter .974.  For other topologies, this number can get rather involved.  There are several factors that determine the value to be used.  It can be determined by using the following formula: PLL 1 Reference = (VCO 1 minimum frequency) x (DDS 1 Bandwidth)/(DDS 1 Center Freq).  But, it cannot be greater than 1.02 (MHz).  (A small worked sketch of this formula follows this list.)
PLL 1 Mode - (Integer) or (Fract)ional Mode.  Use Fractional Mode only when using Fractional N type PLL's. (LMX 2350, 2353).  Even if using a Fractional N PLL, I recommend using the Integer Mode for the MSA.  It is less noisy.
PLL 2 Type - Support for LMX2325, LMX 2326, LMX 2350, LMX 2353, ADF 4112.  The selection of "0" is reserved for topologies that use a frequency multiplier scheme to replace PLL 2.
PLL 2 Polarity - For a non-inverting loop filter, use (non-inv).  For an inverting op amp, use (invert).
PLL 2 Reference - The value will determine the approximate Reference phase detector frequency of PLL 2.  Use 4 for most MSA topologies.  If PLL 2 is used in the Original MSA with Original Tracking Generator (PLL 3 is a fixed frequency), I suggest you contact me for more information.  This value can get extremely involved.
PLL 3 Type - Support for LMX2325, LMX 2326, LMX 2350, LMX 2353, ADF 4112. Select "0" (zero) for no Tracking Generator.  Better yet, for no Tracking Generator, click the "Delete TG" button.
PLL 3 Polarity - For a non-inverting loop filter, use (non-inv).  For an inverting op amp, use (invert).
PLL 3 Reference - The value will determine the approximate Reference phase detector frequency of PLL 3.  The SLIM MSA uses .974.  If PLL 3 is being steered by DDS 3, then use the same rules as for PLL 1 Reference.  If PLL 3 is a fixed frequency, as used in the Original MSA with Original Tracking Generator, I suggest you contact me for more information.  This value can get extremely involved.
PLL 3 Mode - (Integer) or (Fract)ional Mode.  Use Fractional Mode only when using Fractional N type PLL's. (LMX 2350, 2353).
DDS 1 Center Freq - The value (in MHz) is the center frequency of the DDS 1 crystal filter.  10.7
DDS 1 Bandwidth - The value (in MHz) is the bandwidth of the DDS 1 crystal filter.  .015
DDS 1 Parser - Select the command mode for DDS 1, (serial) or (parallel)
DDS 3 Center Freq - The value (in MHz) is the center frequency of the DDS 3 crystal filter.  10.7
DDS 3 Bandwidth - The value (in MHz) is the bandwidth of the DDS 3 crystal filter.  .015
LO 2 (MHz) - This is the fixed frequency of Local Oscillator 2.  Generally, it is 1024.  If PLL 2 is used in the Original MSA with Original Tracking Generator (PLL 3 is a fixed frequency), I suggest you contact me for more information.  This value can get extremely involved.  If PLL 2 is replaced with a multiplier scheme (PLL 2 Type = 0), this value needs to be a whole number multiple of the Master Clock nominal value.
Mast Clock (MHz) - Enter the exact frequency of the Master Oscillator (in MHz).  If the Master Oscillator Module is adjustable, enter the oscillator's nominal value (64.0 in this case).  If it is not adjustable, enter the actual frequency that the clock is creating (in MHz).  If you are not sure, enter the nominal value of the Master Oscillator; you can change this value during calibration.  My Mast Clock is 63.9995093 (a .3 Hz resolution).
Max PDM out - For VNA extension only.  This is the Bit Count output of the Phase Analog to Digital Converter when the Phase Detector Module is reading 360 degrees.  For the SLIM Phase Detector Module and SLIM AtoD Module, this value is fixed (either 65535 for the 16 Bit or 4095 for the 12 Bit). For the Original MSA using the Original AtoD this value is adjustable, and is determined during calibration (use 65535, 4095, or 255 for the 8 Bit).  If the VNA is not installed, click the "Delete VNA" button.
Inv Deg - VNA only.  Inversion in Degrees.  This is the actual amount of phase change when the Phase Detector Module has its state changed from Normal to Inverted.  A perfect PDM would have a 180 degree phase change.  The actual value is determined during the PDM calibration.
ADC type - Select 8(orig 8-bit), 12(ladder), 16(serial 16-bit), or 22(serial 12-bit).  It is assumed that the same type ADC is for both the Magnitude and Phase.
TG Topology - Select (orig)inal TG or (DDS3/PLL3).  Select "0" (zero) for no Tracking Generator.
Control Board - Select (Old) for Original Control Board, or (SLIM original) for the SLIM Control Board.  There is another option (Old, new harness).  This is a place holder for an in-work design of a wiring harness to convert the Original Control Board and original modules to take advantage of new software changes.  At this time, I see no advantage, so I may delete this option.
LPT Port Address - If your computer does not command the MSA, this may be the problem and needs to be changed.  The standard home computer LPT address is Hex 378.  Plug-in parallel cards will likely be different.  If this value is changed, highlight only the hex value and modify it.
List your final filters:  This is a table listing of Resolution filters that are installed in the MSA.  At this time a maximum of 4 filters (Paths) are allowed.  The default is a single filter (Path 1) with frequency 10.7  (MHz) and  bandwidth 15 (KHz).  Up to three more lines can be added, Paths 2 through 4.
  The top line of data is Path 1.  To change this default to match your final filter, highlight the data by clicking it with the Mouse.  Four Buttons will appear: AddPrior, AddAfter, Delete, and Replace.
Enter the correct data for your Path 1 filter in the Freq(MHz) box and BW(KHz) box.  Then click the Replace button.  If you use more than 1 Resolution filter, then enter the correct data for your next filter in the Freq(MHz) box and BW(KHz) box.  Then click the AddAfter button.  This will become Path 2.
If you use a 3rd Resolution filter, then enter the correct data for the filter in the Freq(MHz) box and BW(KHz) box.  Then click the AddAfter button.  This will become Path 3.
If you use a 4th Resolution filter, then enter the correct data for the filter in the Freq(MHz) box and BW(KHz) box.  Then click the AddAfter button.  This will become Path 4.
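A note on the PLL 1 Reference formula above: it is easy to get wrong by hand.  The following is a minimal Python sketch of that arithmetic (my own illustration, not part of the MSA software; the input values are hypothetical):

    # Sketch of the PLL 1 Reference calculation described above.
    # All frequencies are in MHz.  Variable names are illustrative only.
    def pll1_reference(vco1_min_freq, dds1_bandwidth, dds1_center_freq):
        ref = vco1_min_freq * dds1_bandwidth / dds1_center_freq
        return min(ref, 1.02)   # the value cannot be greater than 1.02 MHz

    # Hypothetical topology: VCO 1 minimum 640 MHz, DDS 1 filter .015 MHz
    # wide, centered at 10.7 MHz:
    print(pll1_reference(640.0, 0.015, 10.7))   # about 0.897, under the 1.02 cap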

Save the Software Configuration.
    After the configuration values are entered, click the Button called "Save Configuration".  The Configuration Manager Window will close, and the Working Window and Graph Window will open.  The MSA will begin sweeping in the Spectrum Analyzer Mode.
    The following graph of the Magnitude response is what you would get if your MSA were Calibrated.
[Screen capture: msascreens/graphrun1.gif]
The Magnitude response is the actual response of the Final Crystal Filter (Resolution Filter) centered at Zero MHz. The actual peak Magnitude will vary among different MSA's due to the variation of losses of mixers and filters. Expect this "Zero Response" Magnitude to be about -30 dBm, +/- 10 dBm.
    NOTE: If your graph initially shows a trace response and then "goes away", you probably have the notorious XP Problem.  Prove this by Clicking "Halt" and then click "Restart".  The trace response will return, and then "go away" again.  Halt here and fix your computer by following the advice in the paragraph, XP Problems (at end of this page).

Below is a more realistic display of a first-time sweep.
[Screen capture: msascreens/graphwinuncal.gif]

    This graph of the Magnitude response is more likely what you will see during the Initial Set-Up.  The cavity filter is not tuned (low amplitude), the Master Oscillator is not calibrated, and the Final Crystal Filter (Resolution Filter) is not exactly as planned (the center frequency of the response is not at Zero MHz).  The power measurements are certain to be incorrect, since no magnitude calibrations have been performed.  We will perform those critical calibrations in the Calibration section.  After the next paragraph, you will perform a Coarse Calibration by installing values in the Calibration File Manager Window.  This will assure a Magnitude response for a "freshly built" MSA.
    As I stated earlier, this page is written with the assumption that the MSA is in working condition.  If your Graph response is "not even close" to either of these shown, you may have a problem.  But, do not be dismayed until after the Coarse Calibration.

Files created by the Software
When "spectrumanalyzer.bas" (or .tkn) is run for the first time, several background operations will be performed by the software.  A new folder called "MSA_Info" will be automatically created and placed in the same folder that "spectrumanalyzer.xxx" is located.  This is usually the folder, "MSA_Software".  Within the "MSA_Info" folder, three folders and one text file will be automatically created:
1.  "MSA_Cal" folder.  Within this folder will be a minimum of two and a maximum of 5 text files:
    a. MSA_CalFreq.txt  This is the Magnitude vs. Frequency Calibration factors. Defaults will be 0.
    b. MSA_CalPath1.txt  This is the Magnitude vs. AtoD Calibration factors. Defaults will be 0.
    c. MSA_CalPath2.txt, if a Path 2 is specified in the Configuration Manager
    d. MSA_CalPath3.txt, if a Path 3 is specified in the Configuration Manager
    e. MSA_CalPath4.txt, if a Path 4 is specified in the Configuration Manager
2.  "OperatingCal" folder.  It will be empty until the first time a line calibration is performed.
3.  config.txt  This is the information about the MSA hardware that was collected in the Hardware Configuration Manager Window.

4.  "MSA_Prefs" folder. This contains files with information about sweep and appearance settings. After the initial software run, there will be a Prefs.txt file with default settings, but the user can replace that file by the File->Save Prefs menu item, or he can save the preferences under a new name so he can have multiple preference files in this folder. The Prefs.txt file is loaded on startup, and that file or any other preferences file can be loaded at any time with the File->Load Prefs menu item.
    During MSA operation, there may be more folders and files that will be created and installed into the "MSA_Software" folder.

D.  Coarse Calibration using the Calibration File Manager
    The Calibration File Manager creates and controls all of the Calibration Files that are used by the main MSA Program.  During the Initial Set-Up and Running of the MSA Program, Calibration Files are created and filled with SLIM MSA default values.  The user has the option to change any, or all of these values.  To access the Calibration File Manager Window, "Halt" the sweep, then select from the Graph Menu, Setup, Initial Cal Manager.

[Screen capture: msascreens/calmgrdefault.gif]

*  The right hand "Available Files" menu will display all of the MSA's Calibration files.
*  The Calibration File Manager will control these files:
        * The Frequency Calibration File.
        * The Path Calibration Files.  There may be up to 4 Path Calibration Files, depending upon the number of Paths, as entered during the Initial Configuration Management Procedure.
*  Within the "Available Files" menu, the 0(Frequency) is highlighted.  This is an opening default.
    The left text box is named "Frequency Calibration Table", and will display the most recently saved Frequency Calibration File.  Initially, it displays the SLIM default values of 0 MHz and 1000 MHz.  All Magnitude measurements at frequencies between 0 and 1000 MHz will be given a calibration factor of 0.00 dB.

Buttons in the Calibration File Manager Window:
Clean Up - This will sort the displayed Table's Calibration values.
Display Defaults - This will change the displayed Calibration values to the nominal SLIM defaults.
Re-Load File - This will load the last saved MSA Calibration Table values into the displayed table.
Save File - This will replace the MSA Calibration Table with the displayed Table's Calibration values.
Return to MSA - This will close the Calibration File Manager Window and give the option to Save.
Start Data Entry - This will allow the user to Calibrate the MSA, semi-automatically.  When clicked, more boxes and buttons will appear, depending on the type of calibration requested.  These will be described during the Calibration Procedures.

*Within "Available Files" menu, select and highlight 1(xxx yy).  The Calibration File Manager Window will change and display the Path Calibration Table for Path 1.
msascreens/calmgrpath1.gif
    The left text box is named "Path Calibration Table", and will display the most recently saved Path 1 Calibration File.  Initially there are only two lines of values.  These are for the minimum and maximum magnitude dynamic range points for the MSA.  The default values are coarse and the final values will be characterized during the Path Calibrations.
    If your Analog to Digital Converter is a 16 bit version, the ADC value at 0.000 dBm should be 32767.  If not, change it.
    If your Analog to Digital Converter is a 12 bit version, the ADC value at 0.000 dBm should be 4095.  If not, change it.
    If your Analog to Digital Converter is an 8 bit version, the ADC value at 0.000 dBm should be 255.  If not, change it.
    For any version, the ADC value at -120.000 dBm should be 0 (zero).
    If changes are made in either the Frequency Calibration Table or Path Calibration Table, click the "Save File" button.  Then click the "Return to MSA" button.
    "Restart" the sweep.  If the Magnitude response is traced on the bottom -100 dBm scale line, "Halt" the sweep.  Open the Axis Y2 Window and change the Bot Ref to -120.  Click "OK".  The Graph will change its Magnitude scale and the trace should be seen. If you still have absolutely, no Magnitude response, you may assume that your MSA has a "hard failure".
    If you have any value of Magnitude response, perform a coarse tuning of the Coaxial Cavity Filter. While sweeping, adjust each the four tuning screws on the Cavity Filter for maximum amplitude of the Magnitude response. Do not expect the response to change in frequency, just maximum amplitude. The resulting Magnitude magnitude should be at least -50 dBm. If it is lower, it is likely that there is high insertion loss in either the Coaxial Cavity Filter, or the Final Crystal Filter. I term this condition, "soft failure".
     Again, this page is written with the assumption that the MSA is in working condition. If you have no Graph response, you have what I term, "hard failure". Since your MSA is fully integrated, stop here and fix the problem. Go to the page, Testing the Integrated MSA, and follow the Troubleshooting Guide.

II.  Calibration Procedures for the MSA

    The MSA can be constructed with a variety of topologies.  No two MSA's have identical characteristics.  The purpose of calibration is to measure, characterize, and quantify the effects of those characteristics.  Once calibrated, an MSA can perform with the accuracy of an expensive, commercial unit.  These are the main factors that affect MSA measurement accuracy:
*  The coaxial cavity filter affects the gain/loss characteristic of the MSA.  It is sensitive to its source and load impedance, and will need tuning, even if pre-tuned independently from the MSA.  This is accomplished in the Coaxial Cavity Filter, Tuning Procedure.
*  The Master Oscillator determines the frequency accuracy of the MSA.  It is usually quite stable, once it reaches its operating temperature, but may not be absolutely accurate.  The software can be compensated for this inaccuracy.  This is determined in Master Oscillator Calibration.
*  The Resolution Bandpass Filter(s) may not be exactly at the expected center frequency.  However, it can be characterized, and the software can be compensated.  This is accomplished in Resolution Bandpass Filter Calibration.
*  MSA Magnitude Measurement is, basically, a linear function of input power.  MSA linearity is quite good in most of its dynamic range, but deviates significantly when its input power is close to its upper and lower dynamic range limits.  The full range of magnitude response can be characterized and compensated by software.  Magnitude nonlinearity is characterized in Path Calibration for Magnitude and Phase.
*  MSA Magnitude Measurement is affected when using different Resolution Bandpass Filters.  This is due to different insertion losses and filter bandwidths.  Therefore, the MSA gain can be characterized for each Resolution Bandpass Filter that is used.  Gain is characterized in Path Calibration for Magnitude and Phase.
*  MSA Magnitude Measurements are affected by frequency changes within the MSA.  There are multiple components in the MSA whose gain/loss characteristics change when frequency changes.  MSA gain can change by more than 2 dB over the frequency range of 0 to 1000 MHz.  These gain vs. frequency changes can be characterized and compensated by software.  Magnitude Accuracy versus Frequency is characterized in the Frequency Calibration for Magnitude.
*  MSA/VNA Phase Measurement accuracy is affected by the power of the input signal and the frequency of operation.  The Phase Accuracy versus input signal power level is characterized in Path Calibration for Magnitude and Phase.
*  MSA/VNA Phase Measurement accuracy is affected by the frequency at which the MSA is operating.  The Phase Accuracy versus Frequency is characterized during normal VNA operation each time a Line Reference Calibration is performed.  But, a one-time VNA Baseline Calibration is performed to provide a coarse calibration for "uncalibrated" VNA operation.
    The measurement accuracy of the MSA can be optimized by adding a permanent, 50 ohm attenuator on the input of the MSA.  A 3 dB to 10 dB attenuator is best.  If you plan to use a permanent attenuator, it must be attached during the Calibration procedures.  Padding the MSA does not change its dynamic range, but it does shift it in the positive direction.  Example: Range without: -20 dBm to -110 dBm, Range with: -10 dBm to -100 dBm.  The same consideration can be made for the output of the Tracking Generator.  Its output will decrease by the amount of padding placed on its output connector.  I have determined that 8 dB of padding on both the TG output and the MSA input is optimum.

A.  Coaxial Cavity Filter, Tuning Procedure:
    This is not really a "calibration".  It is a tuning procedure.  If the coaxial cavity filter has been pre-adjusted with the mechanical information given in the construction procedures, it will be fairly close.  For correct adjustment, perform the following steps.  No other test equipment is required.  If the MSA has only a single Final Filter (Path 1) or if all Paths use the same center frequency (or within .5 MHz) the following procedure will give good results.  If the Path center frequencies are more widely spaced, perform the procedure and then continue with the Optional Test: Coaxial Cavity Filter Sweep.
    * Open and Run the MSA Program (spectrumanalyzer.exe).
    * Halt the sweep
    * Open the Sweep Parameters Window
    * Select the Video Filter BW to Wide
    * For an MSA with a single Final Filter, select Path 1
    * For an MSA with multiple Path Filters, select the Path with a center frequency that is the average center of all paths.
    * Verify, "0" as the Center Frequency, "Cent" box.
    * In the Span Box, enter 10 times the bandwidth of the Final Crystal Filter (in MHz).
    * Click "OK", then "Restart".  The Graph should show a response curve, even if the cavity filter is
        badly mistuned.  It is also possible that the response is below the Bottom Reference Line.  If so,
        Halt sweep and change "Bot Ref" box to -120.  Click "Restart".

    * It is very likely that the center of the response curve will not be in the center of the Graph.  To
        center it, Halt the sweep.  Place the Mouse pointer over the center of the response curve and Left
        double click the mouse.  Click the "Mark->Cent" button.  Click "Restart".  The response will now
        be in the center of the Graph.

    * Adjust the tuning of the Cavity Filter for maximum amplitude response (maximum dBm) at the center frequency.  The response should look more like the Graph Window near the top of this page.
    * For more critical tuning, perform the following extra steps:
        * Halt sweeping
        * Open the Sweep Parameters Window
        * Change Span to "0"
        * Enter 300 into "Wait" box
        * Select the Video Bandwidth to Wide.
        * Click "OK", then "Restart".
        * Halt sweep.
        * Select from Menu, Options, "Show Variables". The "Variables" Window will open.
        * Click "Continue".  The sweep will be very slow and the variable, Magdata = xxxxx will update and display the Bit Count value of the Magnitude.  Tune the Cavity Filter for maximum Bit Count.
    * Tuning is complete.  Halt the sweep.  Close the Variables Window, if open.

B.  Master Oscillator Calibration:
    If your system is the Basic MSA, and has no Tracking Generator, use Method A.  If your MSA has the Tracking Generator addition, with or without VNA extension, you can use Method A or Method B.

Method A.  For the Basic MSA (no Tracking Generator). Beat Frequency Method.
    This method requires an external AM radio receiver (and appropriate antenna) that will receive WWV at 2.5 MHz, 5 MHz, 10 MHz, or 20 MHz.  This is for North America.  For Europe or other countries, you can use a Frequency Standard radio station operating below 32 MHz.  The DDS 1 spare signal is used as a "beat" frequency oscillator.
    1.  Tune the external receiver to WWV, 10 MHz.  Use an antenna, if necessary.  I will use 10 MHz during this procedure, but others may be used.
    2.  Open and Run the MSA Program (spectrumanalyzer.exe). Halt the sweep.
    3.  Connect a length of hook-up wire to the DDS 1 spare output, and position the wire close to the radio receiver or antenna input.  If the MSA's DDS 1 spare output is brought out to the front panel, the center conductor of the hook-up wire should fit snugly in the center pin of the connector.  If the DDS 1 spare output is not brought out, it is available on the bottom of the SLIM-DDS-107 and is J3.  Use a hook-up wire size so that its center conductor will fit snugly in the pwb hole.
    4.  Allow the MSA to warm up for 30 minutes.  Select from Menu, Setup, "Special Tests".  In the Special Tests Window, enter 10 (MHz, the frequency of WWV) into the "Command DDS 1" box.  The "with DDS Clock at" box will display the value of the default global variable, "masterclock" (64.xxxyyyz).  Click the "Command DDS 1" button.  DDS 1 will immediately command to approximately "10" MHz.  The program software used the value in the "with DDS Clock at" box as "masterclock" for its calculation.  Leave the Special Tests Window open.
     5.  Couple the DDS 1 spare output wire close to the receiver to obtain an audio beat signal.  If the DDS 1 and WWV frequencies are more than a few hundred Hz apart, this "beat" may sound like a tone.  For best results, the WWV input power to the radio receiver and the DDS 1 signal input power to the radio receiver should be equal.  Move the DDS 1 signal wire to a location near the radio to obtain best results.
    6.  To adjust the Master Oscillator for zero beat, use a. or b.
        a.  If you have a mechanical adjustment for the master oscillator, the nominal Master Oscillator frequency value should be in the "with DDS Clock at" box.  If not, Halt the sweep and enter it, then click the "Command DDS 1" button.  Manually adjust the Master Oscillator for zero beat.  A final zero beat is less than 1 noticeable cycle per second.  When found, you are finished.  Skip b.
        b.  If you don't have a mechanical adjustment for the master oscillator, zero beat is found by changing the value in the "with DDS Clock at" box and clicking the "Command DDS 1" button.  The goal is to find the lowest frequency zero beat.  If the beat frequency increases when changing values, you are changing in the wrong direction.  When the final value in the "with DDS Clock at" box is determined, you are finished.
    7.  Exit the Special Tests Window.
    8.  From the Graph Menu, select Setup, Hardware Configuration Manager.
    9.  In the Configuration Manager Window, change the "Mast Clock" to the final value that was last entered in the "with DDS Clock at" box.
    10.  Click the "Save Configuration" Button.  The MSA program will close.
If a zero beat to within 1 cycle per second can be obtained, the Master Oscillator is calibrated to within 1 part in 10 million, (using WWV, 10 MHz).  If WWV, 5 MHz is used, the calibration is within 1 part in 5 million, etc.  This is a one-time calibration.
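For those who want the arithmetic behind step 6b: the DDS tuning word is computed from the assumed clock value in the "with DDS Clock at" box, while the chip actually runs from the real master clock, so the output error mirrors the clock error, and the box value at zero beat equals the true clock frequency.  A minimal Python sketch of this idea (my own illustration, assuming a 32-bit DDS phase accumulator; not the MSA code):

    # Why adjusting "with DDS Clock at" finds the true master clock.
    # The tuning word uses the ASSUMED clock; the output uses the ACTUAL clock.
    ACCUM_BITS = 32   # 32-bit phase accumulator (assumption)

    def dds_output(f_desired, clk_assumed, clk_actual):      # all in MHz
        n = round(f_desired * 2**ACCUM_BITS / clk_assumed)   # tuning word
        return n * clk_actual / 2**ACCUM_BITS                # real output

    # Hypothetical example: actual clock 63.9995 MHz, box still says 64.0.
    beat_hz = abs(dds_output(10.0, 64.0, 63.9995) - 10.0) * 1e6
    print(f"beat = {beat_hz:.0f} Hz")   # about 78 Hz against WWV at 10 MHz
    # Setting the box to 63.9995 makes the output exactly 10 MHz: zero beat.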

Method B.  For the MSA with Tracking Generator or VNA. This is a Beat Frequency Method, but no external receiver is required.  The MSA acts as a receiver.  This method requires that the cavity filter be adjusted first.  Otherwise, there may not be enough signal to perform the calibration.
    This method uses the MSA as a radio receiver for WWV at 2.5 MHz, 5 MHz, 10 MHz, or 20 MHz.  This is for North America.  For Europe or other countries, you can use a Frequency Standard radio station operating below 32 MHz.  DDS 3 is used as the "beat" frequency oscillator.
    1.  Connect an antenna or long wire into the input of the MSA.  This injects WWV into the MSA.
    2.  If the MSA program is not running, RUN the program.
    3.  Halt the sweep.
    4.  Open the Magnitude Axis Window
    5.  Enter "-20" into the "Top Ref" box.  Enter "-120" into the Bot Ref" box.
    6.  Click "OK", "Restart", then "Halt"
    7.  Open the Sweep Parameters Window
    8.  Command the MSA Center Frequency to WWV, 10 MHz., "Cent" box = 10.0
         I will use 10 MHz during this procedure, but other WWV's may be used.
    9.  Uncheck the "Refresh Screen Each Scan"
    10.  Click "OK", then "Restart".
    11.  Verify the signal response is in the center of the Graph.  If not, Halt and center the signal.  Click Restart.  Allow the MSA to warm up for at least 30 minutes.  The Master Oscillator should stabilize in this time period.
    12.  Verify the input signal level has at least 10 dB of signal to noise ratio.  Take note of this input power level, as WWV power. Example, -90 dBm.
    13.  Halt the sweep.
    14.  Open the Sweep Parameters Window and enter "0" into the "Span" box.
    15.  Click "OK", "Restart", then "Halt"
    16.  Open the Magnitude Axis Window and enter "-70" into the "Top Ref" box.  Enter "-110" into the "Bot Ref" box (use +20 dB above and -20 dB below the noted WWV power).
    17.  Click "OK" then "Restart".
    18.  A uniform, horizontal trace will be displayed, along with some noise ripple.  Some magnitude change will occur if the WWV signal is fading or modulating.
    19.  Halt the sweep.
    20.  Select from Menu, Setup, "Special Tests".  In the Special Tests Window, enter 10 (MHz, the frequency of WWV) into the "Command DDS 3" box.  The "with DDS Clock at" box will display the value of the default global variable, "masterclock" (64.xxxyyyz).  Click the "Command DDS 3" button.  DDS 3 will immediately command to approximately "10" MHz.  The program software used the value in the "with DDS Clock at" box as "masterclock" for its calculation.  Leave the Special Tests Window open.
    21.  Combine both the DDS 3 spare output signal, and the antenna input, using a "T" connection on the input to the MSA.  For best results, the WWV input power to the MSA and the DDS 3 signal input power to the MSA should be equal.  See a. and b. next.
        a.  If the MSA's DDS 3 spare output is brought out to a front panel coaxial connector, its power level is very high, about -8 dBm.  Add an appropriate attenuator so that the DDS 3 power into the MSA is approximately equal to the level of the WWV signal.
        b.  If the DDS 3 spare output is not connectorized, it is available on the bottom of the SLIM-DDS-107 and is J3.  Use a hook-up wire with a center conductor that will fit snugly in the pwb hole.  The end of the wire can be loosely coupled to the WWV antenna input to the MSA, so that its power level is somewhat equal to the WWV power level.
    22.  Click "Continue".  The previous uniform magnitude trace will look like waves on water.  These waves are a result of the beat frequency between WWV and DDS 3.  There could be many "waves" per sweep, meaning the Master Oscillator is far off frequency.  You can "grab" and move the Special Tests Window out of the way to see the Graph display.
    23.  Adjust the Master Oscillator for zero beat, using a. or b.
        a.  If you have a mechanical adjustment for the master oscillator, the nominal Master Oscillator frequency value should be in the "with DDS Clock at" box.  Example, "64.00".  If not, Halt.  Enter the correct Master Oscillator value, then click the "Command DDS 3" button, then click "Continue".  Manually adjust the Master Oscillator for zero beat.  Zero beat occurs when the "waves" occur very slowly (less than one per second).  The sweep can be slowed for better display of the very slow waves: Halt the sweep, enter "20" into the "Wait" box, "Continue".  When this adjustment is found, you are finished.  Halt the sweep and skip the next step b.  I have a mechanical adjustment in my Original MSA.  It is very easy to adjust to 1 wave (1 beat) every 5 seconds.
        b.  If you don't have a mechanical adjustment for the master oscillator, zero beat is found by changing the value in the "with DDS Clock at" box.  The goal is to find the lowest frequency zero beat.  If the beat frequency increases when changing values, you are changing in the wrong direction.  The procedure is: Halt the sweep, change the value in the "with DDS Clock at" box, then click the "Command DDS 3" button, then click "Continue".  Repeat, until the value of the "with DDS Clock at" box creates the slowest waves (less than one per second).  Halt the sweep.  You are finished.  In the SLIM MSA, I was able to command the value of the Master Oscillator until I got 1 wave (1 beat) every 3 seconds.
    24.  Exit the Special Tests Window.
    25.  From the Graph Menu, select Setup, then Hardware Configuration Manager.
    26.  In the Configuration Manager Window, change the "Mast Clock" value to the final value that was entered in the "with DDS Clock at" box.
    27.  Click the "Save Configuration" Button.  The MSA program will close.
If a zero beat to within 1 cycle per second can be obtained, the Master Oscillator is calibrated to within 1 part in 10 million, (using WWV, 10 MHz).  If WWV, 5 MHz is used, the calibration is within 1 part in 5 million, etc.  This is a one-time calibration.

C.  Resolving Filters in the MSA:
Resolving the Final Crystal Filters for each Path
    The center frequency of a Resolution Bandpass Filter (Final Crystal Filter) may not be exactly as the manufacturer states.  For wide-band filters of 20 KHz or greater, this is not much of a concern.  But for narrow filters, this error will be indicated when the swept response is not in the center of the graph, when it should be.  To determine the real center frequency of the Final Xtal Filter, follow these steps.
    1.  The Master Oscillator must have been calibrated.
    2.  Run the MSA Program (spectrumanalyzer.exe).
    3.  Select Magnitude Video Bandwidth to Wide.
    4.  Halt sweep.
    5.  Open the Sweep Parameters Window
    6.  Verify or change the MSA Center Frequency to "0", and the Filter Path to P1
    7.  In the Span box, enter 5 times the bandwidth of the Resolution Filter in Path 1 (Final Xtal Filter)
    8.  Enter "20" into the "Wait" box
    9.  Click "OK" then "Restart".
    10.  The trace on the Graph is the actual frequency response curve of the Resolution Filter.  A perfectly tuned Resolution Filter will have low ripple within the 3 dB bandwidth.
    11.  Verify the response is centered.  Centered means that the 3 dB points are equally distanced from the center of the Graph, and the maximum power indication is in the center of the Graph.
    12.  If centered, verification is complete.
    13.  If the response is not centered, Halt the sweep.  Position the Mouse pointer over the center of the response curve.  Double Left click or single Right click the Mouse.  The "L" marker frequency in the Marker Box will indicate the MSA tuning frequency.  A negative value is valid.  We will call this value, "L Mark Freq".
    14.  To determine the true center frequency of the Resolution Filter:
        a.  true center frequency = value in "Select Final Filter Path:" box - "L Mark Freq"
        b.  Example: if the "L" marker is at 0.0011, then true center frequency = 10.7 - 0.0011 = 10.6989 (MHz)
        c.  Or, if the "L Mark Freq" was at -0.0015, then true center frequency = 10.7 - (-0.0015) = 10.7015
        d.  This "true center frequency" will be entered into the Configuration Manager Window for Path 1
        e.  or, for subsequent Paths, "finalfreq2", "finalfreq3", and "finalfreq4"
    15.  To determine the unknown Bandwidth of the Resolution Filter:
        a.  With the sweep Halted, Select Marker "L".  Position the Mouse cursor directly on the trace that indicates the lower -3dB point of the filter response curve.  Double Left click the Mouse.  The frequency will be displayed in the Marker Box as the "L" frequency.
        b.  Select Marker "R".  Position the Mouse cursor directly on the trace that indicates the upper -3dB point of the filter response curve.  Double Left click the Mouse.  The frequency will be displayed in the Marker Box as the "R" frequency.
        c.  The actual Bandwidth of the Resolution Filter is "R" frequency - "L" frequency.  (A small sketch of this marker arithmetic follows this procedure.)
        d.  This actual Bandwidth must be entered into the Configuration Manager Window for Path 1
    16.  If you have more than one Resolution Path, repeat steps 4 through 15 for each Path (2, 3, 4).
    17.  After the center frequencies and bandwidths of the Resolution filters have been determined, change the Path Variables in the Configuration Manager Window:
       a.  If sweeping, Halt the sweep
       b.  Select Menu item, "Setup", Configuration Manager.
       c.  In the Configuration Manager Window, change the appropriate Filter Paths for correct frequency and bandwidth.
       d.  Click the "Save Configuration" Button.  The MSA program will close.
       e.  You have completed the characterization of the Resolution Filter Paths
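As a cross-check of steps 14 and 15, here is the marker arithmetic in a minimal Python sketch (my own illustration; the marker values are hypothetical):

    # Marker arithmetic from steps 14 and 15 (illustration only).
    path_nominal_freq = 10.7   # MHz, value in "Select Final Filter Path:" box
    l_mark_freq = -0.0015      # MHz, "L" marker at response center (negative is valid)
    print(f"true center = {path_nominal_freq - l_mark_freq:.4f} MHz")   # 10.7015 MHz

    # Bandwidth from the two -3 dB markers (hypothetical readings):
    l_freq, r_freq = -0.0076, 0.0078   # MHz
    print(f"bandwidth = {(r_freq - l_freq) * 1000:.1f} KHz")   # 15.4 KHz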

Resolving the DDS Crystal Filters (in-work)
    The frequency response of the Crystal Filter used in DDS 1 and DDS 3 may not be centered exactly at 10.7 MHz.  It is not necessary that it be.  The bandwidth of the Crystal Filter should be 15 KHz and the MSA only utilizes 10 KHz of this bandwidth.  So, there is plenty of "head-room".  This is an optional procedure to characterize this filter.  It is written so that no other special test equipment is required.  In this test we will find two equidistant frequency points on the crystal filter's response curve and calculate the true center frequency.  This will be used to update the Configuration Manager.  (This is in-work, more to follow.)

D.  Phase Detector Module Calibration (VNA only):
    The Phase Detector Module (PDM) is very accurate when the differential phase of its two input signals is between +72 degrees and +288 degrees.  If the differential phase is outside these boundaries, the PDM will automatically be inverted, and the phase measurement is repeated.  The inversion of a "perfect" phase detector creates a 180 degree phase shift, and that inversion could be compensated by factoring out 180 degrees.  But, the MSA's PDM is not "perfect".  We must calibrate the PDM to find out what the real phase shift is when the PDM is inverted.  This real phase shift is found by using the following procedure:
    *  Open and Run the MSA Program (spectrumanalyzer.exe).
    *  Halt sweep.
    *  Enter the VNA Transmission Mode
    *  Halt the sweep
    *  Allow at least a 30 minute warm-up to ensure valid measurements
    *  Select Menu, Operating Cal, Reference To, and check "No Reference", exit References window
    *  Connect Tracking Generator output to MSA input with 1-2 foot cable.
    *  Set the Phase Video Filter Switch to WIDE bandwidth.

    *  Open the Sweep Parameters Window
    *  Select Video Filter BW box to Wide
    *  Select Final Filter Path 1, if not already displayed.
    *  Enter 200 into "Cen" box, center frequency = 200 MHz
    *  Enter 200 into the "Span" box, sweep width will be 200 MHz
    *  Click "OK" then "Restart".
    *  Verify a sawtooth response.
    *  Halt the sweep. The Magnitude power level will display the power level of the Tracking Generator and is not important, unless it indicates an unusual power level.
    *  Select the "L" marker and position the Mouse cursor on the slope that is near +90 degrees (left phase scale).  Double Left click the Mouse.  The "L" marker phase is displayed in the Marker Box.  Reposition the "L" marker to obtain 90 degrees, if necessary.
    *  Click the"Mark->Cent" box.
    *  Click "Restart"
    *  The sawtooth will shift with the "L" marker in the center of the Graph
    *  Halt the sweep
    *  Set the Video Filter Switch to NARROW bandwidth.
    *  Open the Sweep Parameters Window
    *  Change the Span to 0 (MHz)
    *  Select Video Filter BW box to Narrow
    *  Click "OK", then "Restart"
    *  Both the Magnitude and Phase traces will be horizontal lines.
    *  Halt the sweep.  Phase at the "L" marker should be approximately 90 degrees.  The Magnitude is not important.
    *  Select Menu, Setup, PDM Calibration.  The "PDM Calibration" window will open
       *  Click the "PDM Inversion Cal" button.  The following take place:

            *  the button will change to "Be Patient".  (This calibration takes about 10 seconds)
            *  the "Current Inversion =" will go blank. It was displaying the current PDM calibrated value.
            *  the computer will command the PDM to its uninverted state
            *  the computer will beep and take the first phase measurement.
            *  in about 5 seconds the computer will command the PDM to its inverted state
            *  the computer will take the second phase measurement.
            *  when the measurements are finished, the computer will beep again
            *  the button will revert back to "PDM Inversion Cal"
            *  the software will calculate the PDM Inversion Phase Shift, using the two measurements.
            *  the "Current Inversion =" will display the newly calculated PDM Inversion Phase Shift
                * you should expect a value of 180 degrees, plus or minus 5 degrees.
            *  The Message area will display the two phase values taken during the measurement.
        *  You may repeat the measurements by clicking the "PDM Inversion Cal" button as many times
             as you wish.  I suggest 4 or 5 times to verify repeatability.

        *  You may choose to save this new value as the permanent PDM Calibration value, or
            click the "Cancel" button to exit without saving.

            *  For a permanent Calibration, click the "Save New Value and Quit" button.  This will install
             the value into the Configuration Manager file, automatically.  The new value will be valid for
             this and all future MSA sessions.  If you wish to change the value manually, open the
             Hardware Configuration Manager and change the value in the "Inv Deg" box.

    *  PDM Calibration is complete.  This is a one-time calibration and should never have to be repeated,
         unless there is some future modification to the PDM.
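The calculation behind the "Current Inversion =" number is just the difference of the two phase measurements, and during later sweeps the stored "Inv Deg" value is backed out of any reading taken with the PDM inverted.  A rough Python sketch of that bookkeeping (my own illustration of the description above, not the actual MSA code):

    # Illustration of the PDM inversion bookkeeping described above.
    def inversion_cal(phase_normal, phase_inverted):
        # Real phase shift caused by inverting the PDM; a perfect PDM
        # would return exactly 180 degrees.
        return (phase_inverted - phase_normal) % 360

    def corrected_phase(measured, pdm_inverted, inv_deg):
        # Back the inversion out of a reading taken with the PDM inverted.
        return (measured - inv_deg) % 360 if pdm_inverted else measured

    inv = inversion_cal(90.2, 268.7)            # hypothetical cal measurements
    print(round(inv, 1))                        # 178.5 -> within 180 +/- 5 degrees
    print(round(corrected_phase(101.3, True, inv), 1))   # 282.8 after correction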


E.  Path Calibration for Magnitude (and Phase for VNA):
    MSA Magnitude measurement is a correlation between MSA input power and a binary (digital) representation of the converted input signal.  Under ideal conditions, this correlation would be absolute, and dependent only on the total gain/loss of the MSA.  However, variation can be expected, due to these factors:
    *  Mixer 1 is non linear in its compression range, but we can use this range if we quantify it.
    *  The Log Detector, used to convert RF power to voltage, is non linear.  Close, but not perfect.
    *  The A to D Converter, used to convert Log Detector voltage to binary bits, is not exact.
    *  MSA gain/loss is dependent on the characteristics of each Resolution Filter Path.  Path losses can
         deviate as much as 10 dB.
    Even with these contributing factors, "Correlated Input Power" can be characterized.  Path Calibration will measure the correlations at different power levels, and record them in a "Path Calibration File".  This will be done for each Resolution Filter Path.  There can be up to 4 Path Calibrations, each creating its own "Path Calibration File".
    MSA/VNA Phase measurement is the phase difference between the MSA's converted input signal and an internal reference signal.  In an ideal VNA, the power level of the input signal would not affect phase measurement accuracy.  However, the MSA is not ideal, and has two main components that change phase when power level changes.  They are Mixer 1 and the Logarithmic Detector.  This "Phase versus Magnitude" change can be compensated if the change is known.  Path Calibration will measure this phase change versus power level change, and record it as the "Phase Error vs. Magnitude" Correction Factor.  When the VNA is measuring input signal Phase, it will access the Path Calibration File, take the Correction Factor and subtract it from the Phase Measurement.  The result is the True Phase of the input signal.
    During a Path Calibration, a signal with a known power level is injected into the MSA and the digitized Magnitude output is recorded. This input power level and digitized Magnitude output is installed in its own "Path Calibration Table".  If the MSA has the VNA capability, a digitized phase measurement is also installed in the same "Path Calibration Table".  The input power is then changed to another known power level (at the same frequency) and the digitized output is also installed in the "Path Calibration Table".  This process is repeated for multiple input power levels.  The final accuracy of the MSA depends on the accuracy of the known input signal level and the number of calibration points taken.  The more calibration points taken, the more accurate the MSA becomes.
    For an MSA with a dynamic range of 100 dB, and to be accurate to within .1 dB, each Path Calibration would require 1000 different input power levels.  The MSA can have up to four Resolution Bandwidth Filter Paths.  This would not be practical, so each Path Calibration will calibrate about 30 input power levels, or less.  During normal MSA Magnitude measurement, the software will use these calibration points and use interpolation to find the closest digitized magnitude.
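The interpolation mentioned above is easy to picture: given a Path Calibration Table of (ADC count, power) pairs, a raw magnitude reading is converted to dBm by interpolating linearly between the two nearest calibration points.  A minimal Python sketch (my own illustration; the end points are the coarse defaults from section I.D, and the middle point is made up):

    # Linear interpolation over a Path Calibration Table (illustration only).
    # Entries are (adc_count, power_dbm), sorted by count; more points
    # between the end points means better accuracy.
    cal_table = [(0, -120.0), (16000, -62.5), (32767, 0.0)]

    def adc_to_dbm(adc, table):
        for (a0, p0), (a1, p1) in zip(table, table[1:]):
            if a0 <= adc <= a1:
                return p0 + (p1 - p0) * (adc - a0) / (a1 - a0)
        raise ValueError("ADC reading outside calibrated range")

    print(round(adc_to_dbm(24000, cal_table), 1))   # about -32.7 dBm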

Step by Step Procedure for Path Calibration:
     Note:  These calibration procedures must have been performed before a Path Calibration.
        Tune coaxial cavity filter
        Master Oscillator Calibration
        Resolve the Center Frequency of Path 1, and others, if installed.
        Phase Detector Module Calibration, if the MSA has the VNA installed.

    1.  Configure MSA for Path Calibration:
        a.  Open and Run the MSA Program (spectrumanalyzer.exe).
            *  Verify MSA is in Spectrum Analyzer Mode
            *  Halt the sweep.
        b.  Select Magnitude and Phase Video Bandwidth Switches to Narrow.
        c.  Open Sweep Parameters Window and configure:
            *  Verify the Select Final Filter Path box is Path 1
            *  Select "Narrow" in Video Filter BW pull-down box
            *  Enter into the "Cent" box, the correct frequency you are calibrating at. A Calibration
                Frequency of 1 MHz is preferred, but, somewhere between 1 MHz and 2 MHz.

            *  Enter 0 into "Span" box.
            *  If your MSA has the Tracking Generator Option, you may use it as the Calibration
                source (internal Signal Generator).
            *  If your MSA has the VNA Option, and you want to calibrate phase error, you must use
                the internal Signal Generator.
                *  For internal Sig Gen use, open the Sweep Parameters Window and enter the Calibration
                     Frequency into the "Sig Gen Freq" Box.
            *  Click "OK", then "Restart"
            *  This will automatically command the TG (Signal Generator) to the Calibration Frequency

            *  Halt the Sweep
    2.  Configure the Calibrated Signal Source for the MSA
        a.  Configure Signal Source
            *  For a Basic MSA (no Tracking Generator), you must use an external CW Signal Source.
                *  Adjust the external CW Signal Source to the Calibration Frequency.
                *  Adjust the power level of the external Source to approximately -10 dBm.
                *  Most MSA's will be in saturation with an input of -10 dBm.  Therefore, higher
                    calibration levels are usually unnecessary.

            *  If you use the Tracking Generator (internal Signal Generator) as the Signal Source:
                *  The output power level of the TG (Sig Gen) is approximately -10 dBm.
                *  Most MSA's will be in saturation with an input of -10 dBm.  Therefore, higher
                    calibration levels are usually unnecessary.

                *  If you need an input power higher than -10 dBm, you will require an amplifier.
            *  For the very best Path Calibration results, connect the output of the Signal Source to a
                low pass or band pass filter that will pass only the fundamental calibration frequency.
                Harmonic effects are rather minor, but a filter will help.  This is a user option.

        b.  Configure Step Attenuator
            *  Connect the Signal Source to the input of a precision selectable attenuator.  An ideal
                attenuator would have 120 dB of range, with 1 dB resolution, and an accuracy of .01 dB.
                It would also not deviate in Phase for any step change.  For a Basic MSA (no VNA),
                Phase change is not important.
            *  No matter what Signal Source you are using, the Calibration Power level injected on the
                input connector of the MSA must be known, as accurately as possible.  The final MSA
                operating accuracy depends on the accuracy of the Signal Source and attenuator.

            *  Calibrate the Step Attenuator's Output with a precision power measurement instrument.
                *  With the attenuator at 0 dB, measure the Signal Power and record it. __________dBm.
                *  This "Signal Power" value will be used for each Path Calibration. Example: -10.75 dBm.
                    *  Precision power meters are not available to most people, but a fair substitute is an
                        Oscilloscope with at least 5 MHz bandwidth and a 50 ohm termination on its input.
                        For an o'scope reading, dBm = 20 x log(peak to peak volts / .6324555 volts).
                        (A worked sketch of this arithmetic follows Step 2.)

            *  Connect the Step Attenuator's Output to the input of the MSA.
        c.  Adjust the step attenuator for an output level of approximately -30 dBm, +/- 5 dBm.
            *  It does not have to be exactly -30 dBm, but it does have to be accurately known.
                *  Whatever the power level is, it must be known to within .1 dBm (.01 dBm is preferred).
                *  This power level will be referred to as "True Power Level", the level entering the MSA.
                *  True Power Level = Signal Power - Attenuator setting. Example: -10.75 - 20 = -30.75
            *  If the VNA is installed, it is important to use a power level of about -30 dBm for the first
                Calibration Point.  The first point is used as a Phase Reference for the subsequent calibration
                points.  A very high input power level may saturate the Log Detector.  A very low input
                power level will result in very noisy digital conversion.  For either extreme, the digitized
                phase would be very much in error, and, we would not want to use it as a reference for
                the subsequent calibration points.
            *  If the VNA is not installed, the power level of the first Data entry does not matter, even if
                the MSA is in saturation.
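    The arithmetic in this step (the oscilloscope conversion and the True Power Level bookkeeping) can be checked with a short Python sketch, using the sample values from the text:

import math

def vpp_to_dbm(vpp):
    """Oscilloscope reading into a 50 ohm load:
    dBm = 20 * log10(peak-to-peak volts / 0.6324555 V),
    since 0.6324555 Vpp across 50 ohms is 1 mW (0 dBm)."""
    return 20 * math.log10(vpp / 0.6324555)

signal_power = -10.75             # measured with the attenuator at 0 dB, in dBm
attenuator   = 20                 # attenuator setting, in dB
true_power   = signal_power - attenuator
print(true_power)                 # -30.75 dBm, as in the example above
print(round(vpp_to_dbm(0.6324555), 2))   # 0.0 dBm sanity check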

    3.  Configure the Calibration File Manager Window
         a.  In Graph Window, select Menu, Setup, Initial Cal Manager
         b.  In Calibration File Manager window, "Available Files" box, select and highlight Path 1
         c.  Click "Start Data Entry" box. 
msascreens/calmgrpath1cal.gif
         d.  The "Path Calibration Table" will display the latest Path 1 Calibration File.
              There are three columns of data with these headings:
                ADC  The bit value of the digitized Magnitude value of the Log Detector
                dbm  The actual MSA input power to generate the ADC value
                Phase  The "Phase Error vs Input Power" Correction Factor, in +/- degrees

         e.  The Calibration Boxes will be displayed and are available for data entry.
                Input (dBm) box - The True Power Level injected into the MSA, in dBm
                ADC value box - The ADC Bit value, correlated to the True Power Level
                Phase (degrees) box - The phase measurement, correlated to the True Power Level
                Ref Freq (MHz) box - The Frequency at which the calibration is performed.
         f.  Delete the data, leaving the Header information.
            *  The table that is displayed on initial set-up will have only two rows of data with ADC
                values of 0 and the maximum bit count for the AtoD Converter for your MSA.

            *  The data shown in the above Path Calibration Table are approximate values for a
                SLIM MSA.
            *  These are old calibration numbers that we do not want in a new Path Calibration.
            *  Move the Mouse cursor into the displayed Path Calibration Table, Left Click, and Highlight
               all the rows of data under the Header information.
            *  Delete the data by pressing the "Delete" key on the keyboard.
    4.  Calibrate this MSA Path, using multiple steps and True Input Power levels.
         a.  Enter into the "Input (dBm)" box, the True Power Level injected into the MSA.
            *  True Power Level = Signal Power - Attenuator setting
            *  Example: -10.75 - 20 = -30.75 (type the value -30.75, without the suffix "dBm")
         b.  Click the "Measure" button.
         c.  Automatic measurement and data entry will occur:
            *  The software will test for an error condition for Center Frequency and Sweep Width.
            *  The software will read the Center Frequency and install it in the "Ref Freq (MHz)" box.
            *  The software will read the Magnitude Analog to Digital Converter 10 times, and average
                 the Bit values.  It will then enter the average Bit value in the "ADC value" box.
            *  If VNA is installed, the software will read the Phase Analog to Digital Converter 10 times,
                 convert Bits to Phase, and average the Phase values.  It will then enter the Phase value
                 in the "Phase (degrees)" box, in degrees.
            *  The software will not enter any value into the "Input (dBm)" box. You must manually
                enter the True Power Level injected into the MSA (in dBm).  You may do this before or
                after clicking the "Measure" button, but certainly before clicking the "Enter" button.
            *  Note: You may make repeated measurements at a Calibration Point without clicking
                 the "Enter" button.  Simply re-click the "Measure" button.
         d.  Click the "Enter" button to transfer the box data.
            *  If this is the first Calibration Point in the Path Calibration (for this Path),
                *  A new box will be created, called Ref Phase (deg).  For VNA, this becomes
                     the Reference Phase for all subsequent Calibration Points.

                *  The Phase values are only used for VNA operation.  If the VNA is not installed,
                    the values will be meaningless, or the "Phase" boxes may not even be shown.
            *  For all Calibration Points,
                *  The Input Power and its correlated ADC bit value is entered into the Path Calibration
                    Table under the headings "dbm" and "ADC".
                *  The Reference Phase is subtracted from the Measured Phase, and the result is entered
                    into the Path Calibration Table under the heading "Phase".  This is the "Phase Error
                    vs Input Power" Correction Factor.  It is used to compensate for the phase error created
                    at different input power levels to the VNA.  Used only for VNA.

                *  The boxes will clear and be ready for the next Calibration Point.
         e.  Change the attenuator setting for a new Input Power Level (for the next Calibration Point)
             *  You may use higher or lower power; it makes no difference.  But it is important
                 that no two Calibration Points have the same input power level.

          f.  Return to 4-a. and follow the steps for each Calibration Point. Basically the steps are:
              Apply True Power Level, Enter Input Power, Measure button, Enter button. Repeat
              for all Calibration Points.
              *  You would like to take as many Calibration Points as possible.  An MSA may have a
                  dynamic range of 100 dB +/- 20 dB, but saturation limits can be as high as 0 dBm and
                  noise floor inputs as low as -130 dBm, depending on the MSA's topology.
              *  When calibrating points in the input level range of 0 dBm to -35 dBm, attenuation
                  steps of 2 dB or 3 dB are advised.
              *  When calibrating points in the input level range of -35 dBm to -75 dBm, attenuation
                  steps of 5 dB or 10 dB are advised.
              *  When calibrating points in the input level range of -75 dBm to the noise floor
                  (approximately -110 dBm), attenuation steps of 2 dB or 3 dB are advised.
                  You will know you have reached the noise floor when changing the attenuator no
                  longer changes the average Bit count.  (A sketch of this schedule follows.)
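    The attenuation schedule advised above can be planned ahead of time.  A minimal sketch in Python, assuming one particular choice of step sizes (3 dB, 5 dB, then 3 dB) within the advised ranges:

def attenuation_plan(start=-30.0, noise_floor=-110.0):
    """Suggested calibration input levels (dBm), following the advice
    above: 2-3 dB steps down to -35 dBm, 5-10 dB steps down to -75 dBm,
    then 2-3 dB steps toward the noise floor.  The choices of 3, 5, and
    3 dB here are one reasonable selection, not a requirement."""
    levels, level = [], start
    while level >= noise_floor:
        levels.append(round(level, 1))
        level -= 3 if level > -35 else (5 if level > -75 else 3)
    return levels

print(attenuation_plan())
# [-30.0, -33.0, -36.0, -41.0, ..., -71.0, -76.0, -79.0, ..., -109.0]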
    5.  After the last Calibration Point is taken, you will manipulate the Path Calibration Table
         a.  Click the "Clean Up" button.  This will sort the data points.
         b.  The first row of data in the displayed Path Calibration Table, is the lowest input power
                 Point, taken during Path Calibration.  However, this may not be the ultimate noise floor
                 of the MSA for this Path.
              *  Remove the signal connection from the MSA input connector.  Install a 50 ohm load on
                  the MSA input connector.
             *  Click the "Measure" button
             *  The "ADC value" box will display the bit value for the ultimate noise floor for this Path.
                 Take this value and subtract 1% .  Highlight the value in the
"ADC value" box and replace
                 it with the resulting value. Example: If the displayed value was 4900 (bits), 4900-49 = 4851
             *  Read the "dbm" column value of the first row, and subtract 10 (dB).  Type this value into
                 the "Input (dbm)" box.  However, if this value is greater than -120, use the value, -120.0
                 Examples:
If it was -125.33, use -125.33 : If it was -106.88, use -120.0
             *  Click the "Enter" button, to install this new data into the Path Calibration Table
             *  Click the "Clean Up" button.  This will sort the data points.
             *  Now, the first row will contain the "ADC" Bit value and "dbm" value of the noise floor.
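              The two small calculations above can be written out in a few lines of Python, using the example numbers from the text:

def noise_floor_row(measured_adc, first_row_dbm):
    """Noise-floor entry: the measured ADC value reduced by 1%, and a
    "dbm" value 10 dB below the table's lowest point, but never higher
    (less negative) than -120.0."""
    adc = measured_adc - int(measured_adc * 0.01)        # 4900 -> 4851
    dbm = round(min(first_row_dbm - 10.0, -120.0), 2)
    return adc, dbm

print(noise_floor_row(4900, -115.33))   # (4851, -125.33)
print(noise_floor_row(4900, -96.88))    # (4851, -120.0)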
         c.  Highlight the "Phase" value of this first row.
             *  Change this value to the same "Phase" value, as displayed in the second row. (The Phase                     values of first row and second row will be the same).  Highlight the value in the Table with
                the Mouse cursor and type in the new value.
          d.  Verify the data in the Path Calibration Table is acceptable before saving.
              *  Click the "Clean Up" button.  This will sort the data points.
              *  Make sure all data points are monotonic, that is, an increase in Bit count shows a resulting
                  increase of input power.
              *  Do not allow any two data points to have the same ADC bit value.  If this has occurred,
                  delete one of the rows.
              *  Do not allow any two data points to have the same "dbm" value.  If this has occurred,
                  delete one of the rows.  (These checks are sketched after this step.)

             *  Click the "Save File" button.  This will replace the MSA Path Calibration File with
                 the Table that is displayed.
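          The acceptance rules in step d. amount to a monotonicity check.  A minimal sketch of the same rules in Python; the rows here are invented (adc_bits, dbm) pairs:

def check_table(rows):
    """Pre-save checks: after "Clean Up" sorts the rows, every ADC value
    and every dbm value must be strictly increasing (monotonic, with no
    duplicates).  rows is a list of (adc_bits, dbm) pairs."""
    rows = sorted(rows)
    for (a0, p0), (a1, p1) in zip(rows, rows[1:]):
        assert a1 > a0, "two rows share an ADC bit value - delete one"
        assert p1 > p0, "non-monotonic or duplicate dbm value - delete one"
    return rows

check_table([(4851, -125.33), (12000, -75.0), (30000, -30.0)])   # passes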
    6.  The Path Calibration is complete.
         a.  Exit the Calibration Manager Window by clicking the "Return to MSA" button.
          b.  If you wish to calibrate another Path,
             *  Open the Sweep Parameters window
             *  Change the Path number in the "Select Final Filter Path" box
             *  Click "OK", "Restart", then "Halt"
              *  Return to Step 2-c. and follow the procedure, replacing any reference to Path 1 with Path X.

F.  Frequency Calibration for Magnitude:
    The frequency at which the MSA operates will affect Magnitude measurements in both the Spectrum Analyzer Mode and VNA Mode.  This effect is called "Magnitude Error vs. Frequency".  Frequency Calibration will characterize this effect at multiple frequencies, and will create a "Frequency Calibration File".  The main MSA software will use this file to compensate its Magnitude measurements.
    Basically, the Frequency Calibration is accomplished by injecting a signal of Known Power Level and known frequency into the MSA input connector.  The Magnitude Analog to Digital Converter is read and converted to "Measured Power" in dBm.  The "Measured Power" is subtracted from the Known Power Level, and the difference is called the "Magnitude Error vs. Frequency" Correction Factor.  This Correction Factor will be used in the main MSA software when determining the true input power of the MSA.
    The value of Frequency, and the value of "Magnitude Error vs. Frequency" Correction Factor are both installed in a "Frequency Calibration Table".  The input signal is changed to another frequency, also at a Known Power Level, and the measurements are repeated.  They are installed in the same Frequency Calibration Table.  This process is repeated for multiple input signal frequencies.  The completed Frequency Calibration Table is then saved as the MSA Frequency Calibration File, and placed into the MSA Software Folder.
    The final accuracy of the MSA depends on the accuracy of the Known Power Level at each frequency, and the number of frequency calibration points taken.  The more points taken, the more accurate the MSA becomes.  Of course, we do not calibrate at "every" frequency; this would require millions of calibration points.  Instead, we calibrate at several frequencies and allow the software to interpolate between them (sketched below).  Frequency Calibration is performed in Path 1, only.
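    How the software later uses this file can be sketched as linear interpolation between calibrated frequencies.  An illustrative Python sketch; the table values and the reading are invented:

freq_cal = [
    # (MHz, "Magnitude Error vs. Frequency" in dB) - invented example values
    (0.00,     0.12),
    (2.00,     0.12),
    (500.00,  -1.23),
    (1000.00, -2.80),
]

def correction(freq_mhz):
    """Interpolate the Correction Factor between the two calibrated
    frequencies that bracket freq_mhz."""
    for (f0, e0), (f1, e1) in zip(freq_cal, freq_cal[1:]):
        if f0 <= freq_mhz <= f1:
            return e0 + (freq_mhz - f0) / (f1 - f0) * (e1 - e0)
    return freq_cal[-1][1]       # above the table: hold the last value

measured_dbm = -30.50                        # invented reading at 250 MHz
true_dbm = measured_dbm + correction(250.0)  # Error = True - Measured
print(round(true_dbm, 2))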
    Frequency Calibration can be either a manual procedure or a semi-automatic procedure.
        * The Basic MSA uses the Manual Frequency Calibration for the Basic MSA.  The MSA will be manually commanded, and each Calibration Point will be manually entered.
        * The MSA/TG uses the Semi-Automatic Frequency Calibration for the MSA/TG.  The MSA will be swept, and each Calibration Point will be manually entered.
    Note:  The following calibration procedures must be performed before a Frequency Calibration:
        Tune coaxial cavity filter
        Master Oscillator Calibration
        Resolve the Center Frequency of Path 1.
        Phase Detector Module Calibration, if the MSA has the VNA installed.
        Path 1 Calibration.  Other Path Calibrations are not necessary.

F1.  Manual Frequency Calibration for the Basic MSA:
       The MSA will be manually commanded, and each Calibration Point will be manually entered.  An external, calibrated Signal Source is used.  This procedure is normally used for the Basic MSA, since it has no Tracking Generator option.  But it can be used for the full MSA/TG/VNA, if preferred.

Step by Step Procedure for Manual Frequency Calibration:
    1.  Start with a "Fresh" Frequency Calibration File.
         a.  Open and Run the MSA Program (spectrumanalyzer.exe).
         b.  Halt sweep.
         c.  In the Graph Window menu, Setup, select Initial Cal Manager
         d.  In Calibration File Manager Window's "Available Files" box, select and highlight (Frequency)
         e.  The "Frequency Calibration Table" will display the latest Frequency Calibration File.
              *  If this is the initial Frequency Calibration, the Table will display only two rows of entries,
                0.00 MHz and 1000.00 MHz, with a corresponding value of 0.00 under the "Error"
                column.  This is what we want.
              *  If a previous Frequency Calibration has been performed, multiple entries will be
                displayed. We want a "fresh table" for a Frequency Calibration.  Therefore, click the
                "Display Defaults" button.  The Frequency Calibration Table will be replaced with
                the SLIM default Table, showing only the two rows.  Click "Save File".

         f.  Click the "Return to MSA" button.
    2.  Configure the MSA to sweep a Calibration Point.
        a.  If sweeping, Halt the sweep.
        b.  Verify that the MSA is in the Spectrum Analyzer Mode
        c.  If not in SA Mode, halt the sweep and select it in Graph Window Mode menu.
        d.  Select the Magnitude Video Bandwidth Switch to Narrow.
        e.  Open Magnitude Axis Window
            *  Enter 0 into the "Top Ref" box and -100 into the "Bot Ref" box.
            *  Select Magnitude (dBm) in "Graph Data" pull-down box.
        f.  Open Sweep Parameters Window
            *  Verify the Select Final Filter Path box is Path 1.  If not, select it.
            *  In the "Span" box, enter 5 times the bandwidth of Path 1.
            *  Enter 100 into the "Steps/Sweep" box
            *  Enter 50 into the "Wait" box
            *  Change the "Cent" box to the frequency of your first Frequency Calibration Point.  This
                 should be the same frequency that was used for Path 1 Calibration.
  If you don't know
                 what it was, use the following procedure:

                *  Graph Window menu, Setup, select Initial Cal Manager
                *  In Calibration File Manager Window's "Available Files" box, select and highlight Path 1
                *  The Calibration Table for Path 1 will be displayed in the Path Calibration Table.
                *  The top of the Calibration Table will be, *Calibrated (date) at (xx.xx) MHz.
                *  The (xx.xx) value is the frequency, in MHz, used in Path 1 Calibration
                *  Click "Return to MSA" button
            *  Click "OK"
    3.  Configure the external, Calibrated Signal Source.  There are several methods of obtaining a
        calibrated Signal Source, but I will not explain them here.  I will only discuss the
        requirements of the calibration signal, which is injected into the input of the MSA.
        a.  The frequency should be adjustable from 100 KHz to 1000 MHz, or greater.  A narrower
             frequency range is usable, but the final MSA will be "uncertain" at any frequency that is
             not within the calibration range.
        b.  The frequency must be stable to within 1 KHz.
        c.  The output power level must be between -20 dBm and -40 dBm.
        d.  Whatever the power level is, it must be known to within .1 dBm (.01 dBm is preferred).
             This power level will be referred to as "True Power Level".  The Magnitude Measurement
             accuracy of the MSA is dependent on the accuracy of this "True Power Level".
        e.  Connect the Calibrated Signal Source output to the Input of the MSA.  The coaxial
             interconnections should be low loss and as short as possible.

    4.  Create a Working Calibration Chart for the Frequencies you will use.
        a.  Create a Table with these Row and Column Headings:
                           Frequency             Measured Power Level     True Power Level     Error (T-M)
            Point 1    ____2_____ MHz     ______________ dBm       ___________ dBm    ________ dB
            Point 2    __________ MHz     ______________ dBm       ___________ dBm    ________ dB
            Point 3    __________ MHz     ______________ dBm       ___________ dBm    ________ dB
            etc., to
            Point X    __________ MHz     ______________ dBm       ___________ dBm    ________ dB
        b.  In the "Frequency" column, fill in the frequency values you plan to use for each Point.
            *  Point 1 should be the same frequency that was used for Path 1 Calibration.
            *  Points 2 through Point X can be at any frequency that is within the range of the MSA.
                Keep in mind that all SLIM MSA's will operate higher than 1000 MHz.  Use Calibration
                Points up to the frequency limit of your MSA.
            *  We would like to have as many Calibration Points as possible.  More points taken will
                result in better accuracy of the MSA, but more than 50 points is probably unnecessary.
                I used 20 points for Calibration and obtained good results for the MSA.

        c.  "Measured Power Level" will be the power of the input signal, as measured by the MSA.
        d.  "True Power Level" is the actual input power to the MSA, as provided by the Calibrated
               Signal Source.
            *  Fill this column with the True Power Level at each Frequency Point to be taken
        e.  "Error (T-M)"  will be the "Magnitude Error vs. Frequency" Correction Factor, found by
             subtacting the Measured Power Level from the
True Power Level. The result can be
             a positive or negative number. Take to two decimal places. i.e., -1.23 dB
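        Filled-in chart rows then reduce to one subtraction per Point.  A minimal Python sketch; the power values are invented, except the -1.23 dB result, which matches the example above:

points = [
    # (freq MHz, Measured Power dBm, True Power dBm) - invented values
    (2.0,   -29.52, -30.75),
    (100.0, -30.10, -30.75),
]
for freq, measured, true in points:
    error = round(true - measured, 2)        # Error (T-M), two decimal places
    print(freq, "MHz:", error, "dB")         # first row prints -1.23 dB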
    5.  Command the Calibrated Signal Source and the MSA for a Calibration Point Sweep
        a.  Open the Sweep Parameters Window and change the "Center Frequency" to the same
             frequency as the Calibration Point.  Click "OK".
        b.  Change the Calibrated Signal Source Frequency to the same frequency.
        c.  Click "Restart"
            *  Magnitude will be measured at each step in the sweep.
            *  However, we are only interested in the measurement at the center of the sweep.
            *  Verify the peak of the response curve is somewhere in the sweep.  If it is not in the
                center of the sweep, the frequency of the source is not the same frequency that was
                entered as the Center Frequency for the MSA.  This is ok.
        d.  Allow at least one full sweep, then Halt.
        e.  Position the Mouse Cursor over the center of the response and double Left Click the Mouse.
        f.  The "L" marker will be displayed, with the measured Magnitude data in the Marker Box
        g.  Enter the "L" marker Magnitude measurement into your Working Calibration Chart, under
             the header "Measured Power Level", for this Frequency Point.

        h.  Return to Step 5 and repeat steps a. through g. until you have completed your Working
             Calibration Chart for all Frequency Calibration Points.
    6.  Open the Calibration Manager and access the Frequency Calibration Table
         a.  In Graph Window menu, Setup, select Initial Cal Manager
         b.  In Calibration File Manager Window's "Available Files" box, select and highlight (Frequency)

          [Screen image: Calibration File Manager, default Frequency Calibration Table (msascreens/calmgrdefault.gif)]

         c.  The "Frequency Calibration Table" will display the default Frequency Calibration File.
              The Table will display only two rows of entries, 0.00 MHz and 1000.00 MHz, with
              corresponding values of 0.00 under the "db" column.  This is the "Error" column.
         d.  Manually enter the new Calibration values from your Working Calibration Chart into the
              Frequency Calibration Table.  We will use the "text editor" process.
            *  Place the Mouse Cursor under the "1000.000" row entry, and left click to place the cursor.
            *  Enter the Frequency of Point 1 (in MHz)
            *  Press the space bar on your computer to move the cursor to the right
            *  Enter the Error for Point 1 (in dB), using correct sign
            *  Press the "Enter" key on your computer to move the cursor to a new row
            *  Enter the Frequency of Point 2
            *  Press the space bar on your computer to move the cursor to the right
            *  Enter the Error for Point 2
            *  Press the "Enter" key on your computer
            *  Repeat this process for all Calibration Points, Point 3 through Point X
            *  They can be entered into the Frequency Calibration Table in any order
            *  When finished, click the "Clean Up" button. This will sort the rows by frequency
    7.  Save the Frequency Calibration Table
          a.  The row containing the default data at 1000.00 MHz must be changed, or deleted.
             *  If you have selected a Calibration point that is higher in frequency than 1000 MHz,
                 delete the default 1000.00 row.  Highlight the "1000.00", and its column values,
                 and delete them.
             *  If your highest frequency Calibration point is lower than 1000 MHz, change the
                 error value in the 1000.00 row.  Do this by highlighting the error value in the
                 1000.00 row, and replacing it with the same value as your highest
                 Frequency Calibration Point's error value.
          b.  The row containing the default data at 0.00 MHz may be changed, but must not be deleted.
             *  It is advisable to make its data values the same as the values of the next closest
                 Calibration Point.  Highlight its error value and replace it with the error value of the
                 next closest Calibration Point.

         c.  Click the "Clean Up" button.  This will sort the data points.
            *  Look through the Frequency Calibration Table and verify that no two rows contain
                the same frequency.
         d.  Click the "Save File" button.  This will replace the MSA Frequency Calibration File
                with the displayed Frequency Calibration Table.
         e.  The Frequency Calibration is complete.
         f.   Exit the Calibration Manager Window by clicking the "Return to MSA" button.

F2.  Semi-Automatic Frequency Calibration for the MSA/TG
    The MSA will be automatically swept, but each Calibration Point will be manually entered.

Step by Step Procedure for Semi-Automatic Frequency Calibration:
    1.  Start with a "Fresh" Frequency Calibration File.
         a.  Open and Run the MSA Program (spectrumanalyzer.exe).
         b.  Halt sweep.
         c.  In the Graph Window menu, Setup, select Initial Cal Manager
         d.  In Calibration File Manager Window's "Available Files" box, select and highlight (Frequency)
         e.  The "Frequency Calibration Table" will display the latest Frequency Calibration File.
              *  If this is the initial Frequency Calibration, the Table will display only two rows of entries,
                0.00 MHz and 1000.00 MHz, with a corresponding value of 0.00 under the "Error"
                column.  This is what we want.
              *  If a previous Frequency Calibration has been performed, multiple entries will be
                displayed. We want a "fresh table" for a Frequency Calibration.  Therefore, click the
                "Display Defaults" button.  The Frequency Calibration Table will be replaced with
                the SLIM default Table, showing only the two rows.  Click "Save File".

         f.  Click the "Return to MSA" button.
    2.  Configure the MSA and Tracking Generator
        a.  Open and Run the MSA Program (spectrumanalyzer.exe).
        b.  Verify that the MSA is in the Spectrum Analyzer Mode
              *  If not, halt the sweep and select it in Graph Window Mode menu.
        c.  Halt the sweep.
        d.  Select the Magnitude Video Bandwidth Switch to Narrow.
        e.  Open Magnitude Axis Window
            *  Enter 0 into the "Top Ref" box and -100 into the "Bot Ref" box.
            *  Select Magnitude (dBm) in "Graph Data" pull-down box.
        f.  Open Sweep Parameters Window
            *  Verify the Select Final Filter Path box is Path 1.  If not, select it.
            *  Enter 1000 into the "Span" box.
            *  Enter into the "Steps/Sweep" box, the number of Frequency Calibration Steps you wish
                to make.  The more Steps, the better the resolution of the final calibration.  I entered the
                value, 20.  This actually creates 21 Calibration Steps, since step number 0 is included.
            *  Enter 200 into the "Wait" box
            *  Change the "Cent" box to 500 plus the same frequency that was used for Path 1
                Calibration.
  Example: if the Path 1 Calibration Frequency was 2 MHz, then enter
                the value "502".  If you don't know what it was, use the following procedure:

                *  Graph Window menu, Setup, select Initial Cal Manager
                *  In Calibration File Manager Window's "Available Files" box, select and highlight Path 1
                *  The Calibration Table for Path 1 will be displayed in the Path Calibration Table.
                *  The top of the Calibration Table will be, *Calibrated (date) at (xx.xx) MHz.
                *  The (xx.xx) value is the frequency, in MHz, used in Path 1 Calibration
                *  Click "Return to MSA" button
            *  Click the "Signal Generator" button to change to "Tracking Generator.  This will configure
                the MSA to use the Tracking Generator output as the Calibration Source.
            *  Click "OK"
    3.  Configure the Tracking Generator as a Calibrated Signal Source.
        a.  Requirements of a Calibrated Signal Source, which is injected into the input of the MSA:
          * The frequency should be adjustable from 100 KHz to 1000 MHz, or greater.
          * The frequency must be stable to within 1 KHz.
          * The power level must be between -20 dBm and -40 dBm.
          * The power level must be characterized over a frequency range of .1 MHz to 1000 MHz
              * That is, whatever the power level is, it must be a known value to within .1 dBm
                 (.01 dBm is preferred).  This power level will be referred to as "True Power Level".
                 The Magnitude Measurement accuracy of the MSA is dependent on the accuracy of
                 this "True Power Level".
        b.  The specifications of the Tracking Generator:
          * Frequency range is from 1 KHz to greater than 1050 MHz.
          * Frequency stability to within 3 Hz.
          * The RF output level is approximately -10 dBm, however,
          * Level is not uniform across its entire range of .1 MHz to 1000 MHz (and above),
          * Expected ripple is about 2 dB.
          * The output level of the Tracking Generator can be characterized (calibrated) over frequency.
             There are several methods of characterizing the Tracking Generator, and I will not explain
             them here.  But if the Tracking Generator is used directly as the Calibrated Signal Source,
             it must be characterized.
        c.  If the Tracking Generator is used to drive a leveling circuit, such as a Limiter or Leveler, the
             Tracking Generator does not need to be characterized.  Only the leveling circuit needs to be
             characterized.  It is considered the Calibrated Signal Source.
        d.  For either method, connect the Calibrated Signal Source output to an attenuator.
          * The attenuator should attenuate the Signal Source output power to approximately -30 dBm.
        e.  Connect the attenuator output to the Input of the MSA.
          * The coaxial interconnections should be low loss and as short as possible.
    4.  This step is not used.
    5.  Sweep, to read the Calibration Points.
        a.  Click "Restart".  Magnitude will be measured at each step in the sweep.
        b.  Verify that the graphed Magnitude response is a horizontal line, although there may
              be a large amount of ripple.  The measured levels should be close to the True Power Level
              of the Signal Source output, +/- 2 dB.
        c.  Halt after at least two full sweeps.  The MSA has now recorded Magnitude data for each
            Frequency Point step in the sweep.

    6.  Calibrate for each Frequency Point of the sweep.
         a.  In Graph Window menu, Setup, select Initial Cal Manager
         b.  In Calibration File Manager window, "Available Files" box, select and highlight (Frequency)
         c.  The "Frequency Calibration Table" will display the default Frequency Calibration File.
            *  The Table will display only two rows of entries, 0.00 MHz and 1000.00 MHz, with
                corresponding values of 0.00 under the "Error" column.
         d.  Click "Start Data Entry" box
msascreens/calmgrfreq1.gif
         e.  The following Buttons will be displayed
            *  "Next Point".  This will increment through each of the recorded Magnitude steps
            *  "Prev Point".  This will decrement through each of the recorded Magnitude steps
            *  "Enter".  This will calculate and enter this step's Calibration into the Calibration Table.
            *  "Enter All". This will calculate and enter all steps into the Calibration Table.
         f.  The following Boxes will be displayed with data already entered.
            *  "Point Number" box will display 0. This is the first data Point and will become the reference.
            *  "Freq (MHz)" box will display the Frequency of this Point
            *  "Measured Power (dB)" box will display the Magnitude reading of this Point in dBm.  It
                 is important that this measured value, in dBm, be the same as the
True Power Level of
                 the Calibrated Signal Source injected into the MSA, within 0.1 dB or better, if possible.
                 If it is not, the Path 1 Calibration may be in error.
         g.  Click the "Enter" button, this occurs:
            *  Software reads the "Freq (MHz)" box and install its value into the Frequency
                Calibration Table, under the column, "MHz"
            *  A new box appears with the label "True Power (dBm)".  It contains the same
                value as the
"Measured Power (dB)" box.  This becomes the Reference value for
                subsequent calibration points.

            *  Software reads the "Measured Power (dB)" box and subtracts its value from the value
                in the
"True Power (dBm)" box.  For the first Frequency Calibration Point, the result is
                0.00
.  It installs this result of 0.00 into the Frequency Calibration Table, under the
                column, "db", in the same row as the just entered, Frequency.  This is the "Magnitude
                Error vs. Frequency" Correction Factor.  It will be used in the main MSA software when
                determining the true input power to the MSA, during normal Magnitude Measurements.
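             The bookkeeping behind the "Enter" button can be sketched as follows.  This is illustrative Python only, not the Liberty Basic code, and the power values are invented:

ref_power = None     # becomes the "True Power (dBm)" reference

def enter_point(freq_mhz, measured_dbm, true_dbm=None):
    """Mimic the "Enter" button: for Point 0 the Measured Power becomes
    the True Power reference, so its error is 0.00; for later Points the
    operator types in the Calibrated Signal Source's True Power Level."""
    global ref_power
    if ref_power is None:            # first Frequency Calibration Point
        true_dbm = measured_dbm      # "True Power (dBm)" box defaults
        ref_power = true_dbm         # to the Measured Power
    error = round(true_dbm - measured_dbm, 2)   # the "db" column entry
    return freq_mhz, error

print(enter_point(2.0, -30.75))             # (2.0, 0.0)   first row
print(enter_point(52.0, -31.90, -30.75))    # (52.0, 1.15) later row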

    7.  Select the next Frequency Calibration Point.
         a.  In the Calibration File Manager Window, click the "Next Point" button.
            *  If this is not the correct Frequency Calibration Point, continue clicking the "Next Point"
                button until you reach the correct Point.

            *  The "Point Number" box will increment by 1.
            *  The "Freq (MHz)" box will display the frequency of the new Frequency Calibration Point
            *  The "Measured Power (dB)" box will display the Magnitude reading of the Point, in dBm.
                This value will probably not be the True Power Level on the input of the MSA.

            *  The "True Power (dBm)" box will display the True Power of the previous point.
          b.  Enter the Calibrated Signal Source's True Power Level at this Calibration Point into
               the "True Power (dBm)" box.
             *  Highlight the value in the "True Power (dBm)" box and replace it with the value of the
                 Calibrated Signal Source's True Power Level.
         c.  Click the "Enter" button
            *  Software reads the "Freq (MHz)" box and install its value into the Frequency
                Calibration Table, under the column, "MHz"

            *  Software reads the "Measured Power (dB)" box and subtracts its value from the value
                in the
"True Power (dBm)" box.  It installs the result into the Frequency Calibration
                Table, in the same row, under the column, "db".  It can be a positive or negative value.
                This is the "Magnitude Error vs. Frequency" Correction Factor for this step.

         d.  Return to 7. and repeat the step procedures for the next Frequency Calibration Point.
            *  Basically, the steps are:  Select each Frequency Point, insert True Power, "Enter" into
                Calibration Table.  Repeat for all Calibration points.
            *  We would like to have as many Calibration Points as possible.  More points taken will
                result in better accuracy for the MSA, but more than 50 points is probably unnecessary.
    8.  Save the Frequency Calibration Table
         a.  The row containing the default data at 1000.00 MHz must be changed, or deleted.
            *  If you have selected a Calibration point that is higher in frequency than 1000 MHz,
                delete the default 1000.00 row.  Highlight the "1000.00", and its column values,
                and delete them.
             *  If your highest frequency Calibration point is lower than 1000 MHz, change the
                 error value in the 1000.00 row.  Do this by highlighting the error value in the
                 1000.00 row, and replacing it with the same value as your highest
                 Frequency Calibration Point's error value.
          b.  The row containing the default data at 0.00 MHz may be changed, but must not be deleted.
             *  It is advisable to make its data values the same as the values of the next closest
                 Calibration Point.  Highlight its error value and replace it with the error value of the
                 next closest Calibration Point.

         c.  Click the "Clean Up" button.  This will sort the data points.
            *  Look through the Frequency Calibration Table and verify that no two rows contain
                the same frequency.
         d.  Click the "Save File" button.  This will replace the MSA Frequency Calibration File
                with the displayed Frequency Calibration Table.
         e.  The Frequency Calibration is complete.
         f.   Exit the Calibration Manager Window by clicking the "Return to MSA" button.

XP Problem
    Here is a strange problem that I had when I first started using my new WinXP Pro computer and the MSA.  After opening and running the MSA software, the display would graph the correct results for the Spectrum Analyzer for a few seconds.  Then the Magnitude trace would begin to disappear.  I could "Halt" the sweep and click "Restart".  The Graph would be normal for a few more seconds and go away again.  I would repeat this process for about a minute and the MSA would be normal for the rest of the time I had the MSA session open.  I found this Question and Answer on the internet:
    Question: If a logic 1 is written to the Control Port, bit 0, (Strobe), my PC clears all of the port bits once every five seconds for about a minute.

    Answer: Some versions of Windows XP look for devices by periodically writing to the port. A registry key can disable this behavior. 
You can make these changes in Windows' regedit utility.
    The following registry setting disables the port writes:
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\Parport\Parameters]
"DisableWarmPoll"=dword:00000001
    The following registry setting enables the port writes:
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\Parport\Parameters]
"DisableWarmPoll"=dword:00000000

    This must be for a version of XP which I don't have.  I could not find the "Parameters" key.  However, I did find this:
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\Parport]
"Start" = 3
    I double clicked the "Start", and an Edit window opened to allowed me to change the value.  I changed the value from 3 to 4.  I saw this somewhere else on the internet.  This fixed my problem.
    As always, use caution when working with the registry, which contains critical values for configuring and running the PC.
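    For XP versions that do have the "Parameters" key, the DisableWarmPoll setting above can also be applied by saving the following text as a .reg file (the file name in the comment is my own choice) and double-clicking it; importing a .reg file will create the key if it does not already exist:

; DisableWarmPoll.reg
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\Parport\Parameters]
"DisableWarmPoll"=dword:00000001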

    Windows XP has another feature that previous Windows versions do not have, and it should be modified to prevent "weird" displays when resizing the MSA Graph Window.  Disable it as follows (menu names translated from a German interface, so yours may differ slightly):
Start => Control Panel => Display and Design => Display => Appearance => Effects
=> Show Window Content while dragging.    Uncheck this box.


End of Page