Release Notes for STScI_Python 2.2
PyFITS
======
These are the changes that have been implemented in PyFITS since v0.9.6:
- PyFITS version updated to 1.0 (from 0.9.6 delivered with the patch release)
- PyFITS now uses the standard setup.py installation script
- Added interactive convenience functions that allow getting data
  and headers from a FITS file in one step. These include getdata(),
  getheader(), getval(), writeto(), append(), and update(); see the
  User's Manual for more information and the short example at the end
  of this list.
- Now uses the Python boolean values True/False. The older TRUE/FALSE values
continue to work.
- Added support for the HIERARCH convention
- Improved formatting when printing cards or card lists.
- Support for iteration and slicing for HDU lists
- EXTNAMEs automatically converted to upper case
- PCOUNT and GCOUNT keywords removed from PrimaryHDUs
- Added optional keyword argument "clobber" to all writeto() functions and
methods
- Header argument is now optional in writeto(), update() and append()
functions. When not supplied, a minimal header is constructed that is
consistent with the data
- An ImageHDU, if supplied to writeto(), is converted to a PrimaryHDU.
- The EXTEND keyword is added to the Primary header if not present and
extensions are added.
- New keywords are appended after the last non-commentary card instead
  of at the end of all keywords
- Improved error message when indexing HDU lists past the end.
- Fixed errors when creating empty headers using the Header() constructor
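A short example of the new convenience functions (a minimal sketch; the
file name "image.fits" and the extension numbers are only illustrative):
    import pyfits
    # Read the data and header of extension 1 in single calls.
    data = pyfits.getdata('image.fits', 1)
    header = pyfits.getheader('image.fits', 1)
    # Read one keyword value from the primary header.
    exptime = pyfits.getval('image.fits', 'EXPTIME')
    # Write the data out; the header argument is optional, and a minimal
    # header consistent with the data is constructed when it is omitted.
    # The new 'clobber' argument overwrites an existing output file.
    pyfits.writeto('copy.fits', data, header, clobber=True)
    # Append the data as a new extension, then update extension 1 in place.
    pyfits.append('copy.fits', data, header)
    pyfits.update('copy.fits', data, header, 1)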
PyRAF
=====
These are the changes that have been implemented in PyRAF since v1.1.2:
- Version updated to 1.2.0 (from 1.1.2 delivered with the patch release)
- Added limited support for GOTO statements in CL scripts. GOTOs
  that jump forward and do not jump to labels within code blocks are now
  supported. (It should be possible to restructure CL code in almost all
  cases to use only forward GOTOs.)
- Multidimensional arrays are now permitted as parameters and local variables
  in CL scripts. Other array-related bugs were fixed (e.g., arrays can now
  be used in epar).
- Python 2.4 introduced changes that caused problems with the use of INDEF
values in places where Python functions expected a float or int value.
This has been fixed so such uses can continue. All the built-in CL
functions have been modified to handle INDEF values correctly. Comparisons
of INDEF values now behave the same as IRAF CL comparisons of INDEFs.
- Fixed problem with certain tasks' terminal interactions that caused the
task to hang (e.g., with phot)
- Fixed bug that caused CL scripts with many consecutive hash characters
to hang the Python session.
- If Verbose is set, PyRAF now prints a traceback when syntax errors are
caught in parameter file parsing.
- int parameters now accept (and truncate) float values
- Improved error message for CL syntax errors.
- "set" has been added to the list of keywords that PyRAF treats differently
when used in a way that is different from how the new (as of Python 2.4)
Python "set" is used.
- Performance of epar has been improved by caching the Tk default root window.
This results in much faster startup times on platforms that have slow Tk
window creation times.
- Changes made to support the simultaneous use of multiple image displays
using named pipes (FIFOs). This now works identically to IRAF.
- Added changes to support installation on 64-bit Linux (for X11 linking)
- Improved handling of Python boolean values.
- help works on functions again (broken by Python's addition of function
attributes)
- Many of the above changes were made to support the Gemini data reduction
packages.
- Added spaces after the prompts in CL compatibility mode.
- The access function (which tests for file existence) now returns True
  for the special file names "STDIN", "STDOUT", and "STDERR", matching
  IRAF's behavior.
- Various improvements to the CL script cache handling, including the
  compileall script (which now renames the old cache directory to clcache.old).
- Added some more useful scripts to tools:
- checkcompileall.py: Read output of compileallcl.py and pull out info on
just the scripts that had errors or warnings.
- cachesearch.py: Search the CL cache for entries with Python code that
matches a particular regular expression.
- cachecompare.py: Compare the contents of two CL caches, listing the
names of scripts where the Python code differs.
numarray
=========
These are the primary changes since the last patch release of STScI_Python:
- Version updated to 1.4.1
- Speed improvement for numarray operators. The Python level hook
  mapping numarray operators onto universal functions has been moved down
  to C. This makes operator notation (a+b) as fast as ufuncs (add(a,b))
  for small arrays (see the short illustration after this list).
- Speed improvement for string-array comparisons, any(), all(). String
correlation is ~10x faster.
- Better operation with py2exe to help it automatically detect the core
numarray extensions to include in an installer.
- scipy newcore 'dtype' keyword added to many array creation functions.
- scipy newcore .dtypechar attribute added to NumArray class.
- ~10 minor bugfixes
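A small illustration of the operator/ufunc equivalence noted above (the
array values here are arbitrary):
    import numarray as na
    a = na.arange(10000.)
    b = na.arange(10000.)
    c1 = a + b           # operator notation, now dispatched in C
    c2 = na.add(a, b)    # the equivalent universal-function call
    assert na.alltrue(c1 == c2)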
Full release notes are here:
http://sourceforge.net/project/shownotes.php?release_id=366528
iterfile
=========
A new IterFitsFile class was added to this module to represent each input.
This class manages the file I/O internally for the input FITS image,
opening and closing the file upon each request for data. This ensures that
only one file handle remains open at any time, regardless of the number of
objects in memory, which allows an arbitrarily large number of FITS images
to be processed at once (subject to memory constraints) when used with the
new FileIter iterator class.
In addition, an IterFitsFile object can be treated like a PyFITS object,
complete with all the same attributes and methods, but with the extra
overhead of a file open/close for each call.
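A conceptual sketch of this open-on-demand pattern (this is not the actual
IterFitsFile implementation; the class and method names are hypothetical):
    import pyfits
    class OnDemandFits:
        def __init__(self, filename, ext=0):
            self.filename = filename
            self.ext = ext
        def get_section(self, rows):
            # Open the file, read only the requested rows, and close it
            # again, so at most one file handle is open at any time.
            handle = pyfits.open(self.filename)
            try:
                section = handle[self.ext].data[rows]
            finally:
                handle.close()
            return section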
nimageiter
==========
A new iterator, FileIter, was added to iterate over IterFitsFile objects. It
manages a buffer which gets used to extract 'sections' from FITS objects
managed by the IterFitsFile class. The default buffer size used by MultiDrizzle
is 1 MB per input image.
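A hypothetical sketch of this kind of buffered 'section' iteration (the
real FileIter interface may differ):
    def iter_sections(shape, bufsize=1024*1024, itemsize=4):
        # Yield row slices whose in-memory size stays under 'bufsize'
        # bytes for a 2-D image of the given shape.
        nrows = max(1, bufsize / (shape[1] * itemsize))
        start = 0
        while start < shape[0]:
            yield slice(start, min(start + nrows, shape[0]))
            start = start + nrows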
Numcombine
===========
Support was added for computing a clipped minimum array. This is based
upon the minimum function in numarray.images.combine that was implemented
for numarray 1.3. In the case that numarray 1.3 is not installed, an
alternative implementation that does not support clipping will compute a
minimum array. Regardless of numarray version, any masks provided will
be used.
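A hedged sketch of the unclipped fallback described above (the names are
illustrative, not the actual numcombine internals):
    import numarray as na
    def plain_minimum(images, masks=None, fill=1.0e30):
        # Element-wise minimum across the input images; masked (nonzero)
        # pixels are replaced by a large fill value so they never win.
        result = None
        for i in range(len(images)):
            img = images[i]
            if masks is not None:
                img = na.where(na.not_equal(masks[i], 0), fill, img)
            if result is None:
                result = img
            else:
                result = na.minimum(result, img)
        return result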
Imagestats
===========
Modified imagestats to throw an exception if the npix returned by the
computeMean function is ever less than or equal to zero.
===========================================
====
==== Applications
====
===========================================
SAACLEAN
========
The new task, 'saaclean', measures and removes an estimate of the SAA
persistence signal in a NICMOS image. It was developed as a stand-alone
Python task and an IRAF interface has been provided to allow it to be
run as an IRAF task under the HST_CALIB.NICMOS package in STSDAS.
The input image to the task must be at least partially calibrated,
including zero-read subtraction, dark subtraction, and linearization. The
image may also be flatfielded. The task works by determining an
appropriate scale factor by which to multiply the persistence image
before subtracting it from the science image. It does this iteratively,
measuring the noise in the image after each subtraction and then fitting a
parabola to the results to determine the minimum (a conceptual sketch follows
the keyword list below). Before performing
any corrections, the task flags hot pixels, pixels at the edges, and
pixels in the central row and column, to exclude them from the estimate
and correction process. A summary of the results is written to standard
output, and is encoded in the header of the output file. A new section
of header keywords is inserted following the "POST-SAA DARK KEYWORDS"
section, labelled "SAA_CLEAN output keywords". They include:
  SAAPERS   SAA persistence image
  SCNPSCL   Scale factor used to construct persistence image
  SCNPMDN   Median used in flatfielding persistence image
  SCNTHRSH  Threshold dividing high & low signal domains
  SCNHNPIX  Number of pixels in high signal domain (HSD)
  SCNLNPIX  Number of pixels in low signal domain (LSD)
  SCNGAIN   Gain used for effective noise calculations
  SCNHSCL   HSD scale factor for minimum noise
  SCNHEFFN  HSD effective noise at SCNGAIN
  SCNHNRED  HSD noise reduction (percent)
  SCNLSCL   LSD scale factor for minimum noise
  SCNLEFFN  LSD effective noise at SCNGAIN
  SCNLNRED  LSD noise reduction (percent)
  SCNAPPLD  Domains to which SAA cleaning was applied
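A conceptual sketch of the scale-factor search described above (this is not
the saaclean implementation; 'sci' and 'pers' are assumed to be numarray
image arrays and the candidate scales are assumed to be evenly spaced):
    def best_scale(sci, pers, scales):
        # Subtract the scaled persistence image, measure the residual
        # noise, and refine the minimum with a parabola through the three
        # points bracketing the lowest noise value.
        noise = []
        for s in scales:
            residual = sci - s * pers
            noise.append(residual.stddev())
        i = noise.index(min(noise))
        if 0 < i < len(scales) - 1:
            h = scales[1] - scales[0]
            denom = noise[i-1] - 2.0*noise[i] + noise[i+1]
            if denom != 0:
                return scales[i] + 0.5*h*(noise[i-1] - noise[i+1])/denom
        return scales[i]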
MultiDrizzle
============
MultiDrizzle has been updated to version 2.7.0. The primary changes include:
- MultiDrizzle now runs on 64-bit Linux using the version of 'f2c' included
in this release.
- Documentation derived from epydoc was updated, and put into a new
'multidrizzle_api' directory. This update also included new PDF and
PS versions.
- New parameters 'driz_cr_grow' and 'driz_cr_ctegrow' for flagging cosmic
  ray CTE tails have been added to the interface. These parameters control
  only the size of the regions around already-identified cosmic rays that
  receive more stringent tests to identify and mask CTE-affected tails.
- Changes were made to 'buildmask' to always start with a copy
of the DQ array to support changes in numarray 1.2a and later.
- Changes were made to the IRAF parameter interface to allow for the use
of the new 'minimum' combination function in the CREATE MEDIAN step.
- A new IterFitsFile class was added to support working with arbitrary numbers
  of input images at once while keeping only one file open at any given time.
  The ImageManager class was modified to use this class when creating the
  median image. The only remaining limitation is the size of the buffer
  used for the slice from each input image, as only the slice gets
  processed in memory.
- Output products from MultiDrizzle now contain HISTORY keywords reporting
what versions of MultiDrizzle, PyDrizzle, PyFITS, and other pertinent
libraries were used to generate the product.
- Input images with an EXPTIME of 0 (zero) seconds are now ignored. They
  are not included when running MultiDrizzle, and a warning message
  reports which files were ignored (see the sketch after this list).
- The 'tophat' kernel now can be successfully selected for use.
- The creation of the median image now more closely replicates the
  IRAF IMCOMBINE behavior of nkeep = 1 and nhigh = 1. This prevents
  the over-rejection of pixels in non-overlapping science images.
- The minmed algorithm was modified such that the first median image
created would ignore the use of nhigh = 1 clipping if using it would
result in no image data for that pixel location.
- The 'wt_scl' values used for IVM and ERR weight type specifications
now get computed as the exposure time scaled by the pixel scale
rather than just the exptime.
- Modified the mdrizpars _handleMdriztab method to support the new version of
  PyFITS (0.9.8.1). The new version of PyFITS returns FITS standard
  types for column definitions, in contrast with previous
  versions, which returned numarray types for columns.
- Changes were made to when the output frame gets defined so that MultiDrizzle
can be successfully restarted at the 'blot' step.
- A bug in the driz_cr step that created invalid simple FITS format
files for the *_cor.fits and *_crmask.fits products has been corrected.
- Turned off use of memory mapping.
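A hedged sketch of the zero-exposure-time filtering mentioned above (not the
actual MultiDrizzle code; it assumes EXPTIME is in each file's primary header):
    import pyfits
    def drop_zero_exptime(filenames):
        # Keep only inputs with a positive exposure time, warning about
        # any files that are skipped.
        keep = []
        for name in filenames:
            if pyfits.getval(name, 'EXPTIME') > 0:
                keep.append(name)
            else:
                print 'Warning: ignoring %s (EXPTIME = 0)' % name
        return keep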
PyDrizzle
=========
PyDrizzle has been updated to Version 5.6.0. This new version includes
the following set of changes/bug fixes/improvements.
- PyDrizzle now runs on 64-bit Linux using the version of 'f2c' included
in this release.
- This new version implements the proper conversion of the output product
to 'counts' based on 'units' parameter. This required adding the
exposure time as an attribute of the Exposure class in order to allow
for proper conversion of the product from 'counts' to 'cps' and back.
- Images without distortion coefficients had a 1 pixel shift in both
  X and Y imposed on them by PyDrizzle. With this correction, the
  predicted output size matches the actual drizzle product, resulting
  in almost no dropped pixels for most cases, and images with no
  distortion correction applied come out nearly unchanged.
- Output of the translated coeffs file was moved to the Exposure class
method 'writeCoeffs'.
- The 'XYtoSky' function was moved from the 'wcsutil' module to its own module,
  so that 'wcsutil' can be imported separately from the rest of PyDrizzle.
- The 'read_archive' method in 'wcsutil' was modified to use
  any existing prefix found in the header for archived keywords,
  especially for computing the pscale.
- The output FITS products from PyDrizzle no longer have extension
keywords in the primary headers when 'build=no' to conform to the
FITS standard.
- Velocity aberration correction now gets turned off when no distortion
coeffs have been specified.
- A new method, 'runDriz', has been added to the Exposure class to allow
a single chip to be drizzled to an output image as if cut as a
section from the final product.
- The description of the 'bits' parameter in the PyDrizzle docstring was
updated to explain both 'bits_single' and 'bits_final'.
- The shifts applied to 'blot' are now properly scaled to the 'input'
frame to match the STSDAS interface to 'blot'. This corrects
the problem of the blotted images not matching the original input
chips when scale != 1, which was the case for all WFPC2 images.
This allows WFPC2 images to finally be masked properly by MultiDrizzle.
- Distortion correction images now have the correct subarray section applied
to subarray input images.
- Subarray images with the reference point specified outside the image
area have the WCS values properly interpreted for computing shifts
between images.
- Arbitrary numbers of input images can be processed now, due to changes
in file handling so that only 1 file is ever open at a time.
- Combining images of different input sizes now works as expected. Previously,
a memory buffer the size of the first image was used for all input images
when 'drizzling' the images. This was modified to create a new buffer for
each input and delete it when finished, so that only 1 buffer ever remained
in memory at the same time.
- Turned off use of memory mapping by default.