HST consists of two main programs: hst_master and hst_slave. As their names suggest, the ``master'' program exists to command the ``slave'' program. The task is split between two programs because the reconstruction process is very demanding in terms of time and computer resources, and so is best run in ``batch'' mode. (Indeed, at present a computer system is deliberately set up so that only one reconstruction process at a time uses almost all of the relevant system resources.)
hst_master is an interactive program which allows the user to select the files containing the raw projection data images from a tomography experiment, together with the associated detector calibration data, and to choose interactively the volume to reconstruct. This program outputs a parameter file containing the selected information.
hst_slave takes the parameter file as its command line argument and then runs automatically, producing the reconstructed volume as previously defined. A typical reconstruction takes about an hour on a fast, high-performance workstation (Compaq Alpha systems are presently used), and may take about 8 hours on a modern PC (provided it has a reasonably large amount of RAM installed).
The parameter file may also be produced by other means, or edited from an existing file, provided great care is taken to produce sensible input.
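As a purely illustrative sketch of producing parameter files ``by other means'', the Python script below generates a file per scan and runs hst_slave on each. The keyword names shown are invented placeholders, not the real HST syntax: in practice the keywords should be copied from a parameter file written by hst_master itself.

    import subprocess

    # Illustrative only: these keyword names are hypothetical, NOT the
    # real HST parameter-file syntax.
    template = (
        "FILE_PREFIX = {prefix}\n"   # hypothetical keyword
        "NUM_IMAGES = {num}\n"       # hypothetical keyword
    )

    for scan in ("scan_a", "scan_b"):
        par_name = scan + ".par"
        with open(par_name, "w") as f:
            f.write(template.format(prefix=scan, num=900))
        # hst_slave takes the parameter file as its single command-line
        # argument and then runs unattended (``batch'' mode).
        subprocess.run(["hst_slave", par_name], check=True)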
The reconstruction is performed using a direct filtered back-projection algorithm which has been adapted to require a minimum of floating-point operations whilst maintaining high data quality in the reconstruction. The implementation is designed for maximum speed on dedicated high-performance workstations (tomorrow's slightly up-market PCs!). This is achieved by performing the reconstruction of a slice (or two slices) within the cache memory of the workstation. This allows the processor to work at maximum speed; if a slice is significantly larger than the cache, the reconstruction proceeds much more slowly, as the processor spends most of its time waiting for data to be transferred to and from random access memory (RAM).
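The optimised HST code itself is not reproduced here; the following is a minimal numpy sketch of generic filtered back-projection for a single slice, assuming parallel-beam geometry, to illustrate the filter-then-back-project structure. Note that every pixel of the slice is updated once per projection angle, which is why keeping the whole slice in cache matters so much for speed.

    import numpy as np

    def fbp_slice(sinogram, angles_deg):
        """Minimal filtered back-projection of one slice.
        sinogram   : (n_angles, n_det) array of parallel projections
        angles_deg : projection angles in degrees
        Returns an (n_det, n_det) reconstructed slice."""
        n_angles, n_det = sinogram.shape

        # Ramp-filter each projection in Fourier space.
        ramp = np.abs(np.fft.fftfreq(n_det))
        filtered = np.real(np.fft.ifft(np.fft.fft(sinogram, axis=1) * ramp,
                                       axis=1))

        # Pixel coordinates, centred on the rotation axis.
        c = (n_det - 1) / 2.0
        x, y = np.meshgrid(np.arange(n_det) - c, np.arange(n_det) - c)

        recon = np.zeros((n_det, n_det))
        for a, theta in enumerate(np.deg2rad(angles_deg)):
            # Detector position of every pixel at this angle; the whole
            # slice array is revisited at every angle.
            t = np.clip(x * np.cos(theta) + y * np.sin(theta) + c,
                        0, n_det - 1)
            t0 = np.minimum(t.astype(int), n_det - 2)
            w = t - t0
            # Linear interpolation along the filtered projection.
            recon += (1 - w) * filtered[a, t0] + w * filtered[a, t0 + 1]

        return recon * np.pi / (2 * n_angles)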
Whilst a relatively large RAM is recommended for the storage of the ``sinograms'', the performance of the reconstruction is not critically affected by this.
Options exist for the normal pre-treatment of the data for detector distortions (dark-current subtraction and correction for the non-uniformity of the intensity response), for output of the re-ordered ``sinogram'' data, and for reconstruction directly from ``sinograms''.
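Again as an illustrative sketch rather than the HST code itself: the standard pre-treatment subtracts the dark current and divides by the (dark-corrected) flat field, and the re-ordering step simply regroups the corrected projection stack so that all angles for one detector row sit together as a sinogram. The numpy sketch below assumes the projections are held as a (n_angles, n_rows, n_cols) array.

    import numpy as np

    def pretreat(projections, dark, flat):
        """Standard detector pre-treatment: subtract the dark current,
        then divide by the dark-corrected flat field to remove the
        non-uniformity of the intensity response."""
        denom = np.maximum(flat - dark, 1e-6)  # guard against dead pixels
        return (projections - dark) / denom

    def to_sinograms(corrected):
        """Re-order a (n_angles, n_rows, n_cols) stack of corrected
        projections into (n_rows, n_angles, n_cols): one contiguous
        sinogram per detector row, i.e. per reconstructed slice."""
        return np.ascontiguousarray(corrected.transpose(1, 0, 2))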
Andy Hammersley