The first thing anyone with WFPC2 data should do is make a quick check of the initial data quality to find out whether there were any problems with the observations. A few "quick looks" can help you verify which calibration files were used, determine whether they are the best ones available, and see whether there are any obvious problems inherent in the data (such as saturation).
This document is set up to take you through each of the following steps:
1.) Look at the image headers:
    a.) Look at the CALIBRATION REFERENCE FILES and SWITCHES
    b.) Look at the EXPTIME
    c.) Look at the RA_TARG and DEC_TARG
2.) Display the images
3.) Look at the .ocx, .pdq and .trl files
4.) Check the data quality files
Let's take an example. For EACH WFPC2 EXPOSURE there is a set of files similar to the following. The extension tells you what type of information is contained in each file:
FILE            APPROX. SIZE  EXPLANATION
----------------------------------------------------------------------------
u22u0101t.d0d   5M            (Raw data - pixel file)
u22u0101t.d0h   24K           (Raw data - header file)
u22u0101t.q0d   5M            (Raw data quality - pixel file)
u22u0101t.q0h   24K           (Raw data quality - header file)
u22u0101t.x0d   90K           (Engineering and CCD overscan - pixel file)
u22u0101t.x0h   5K            (Engineering and CCD overscan - header file)
u22u0101t.q1d   90K           (Eng. and CCD overscan data quality - pixel)
u22u0101t.q1h   5K            (Eng. and CCD overscan data quality - header)
u22u0101t.c0d   10M           (Pipeline calibrated data - pixel file)
u22u0101t.c0h   24K           (Pipeline calibrated data - header file)
u22u0101t.c1d   5M            (Calibrated data quality - pixel file)
u22u0101t.c1h   24K           (Calibrated data quality - header file)
u22u0101t.c3t   21K           (Photometry table made by pipeline - STSDAS table)
u22u0101t.shd   2K            (Standard header packet (telemetry) - pixel)
u22u0101t.shh   25K           (Standard header packet (telemetry) - header)
u22u0101x.ocx   2K            (OSS quality report made when data recd. - text)
u22u0101t.trl   14K           (Trailer file produced by pipeline - text)
u22u0101t.pdq   2K            (PODPS calibration comments - text)
If there is more than one exposure in a given set then each will be numbered as 101, 102, 103, 104, etc., and if there is more than one set of exposures in a given program then each set will have a number as in 101, 201, 301, etc.
Start IRAF and load the STSDAS package.
First take a look at the headers of the images using the 'imheader' task:
st> imheader u22u0101t.c0h l+ | page
The "l+ | page" tells 'imheader' to print the full header and to scroll only one screen at a time, which makes it easier to read. Always refer to an STSDAS image by its header file (the one with an "h" at the end of the extension). A number in brackets after the file name (e.g. u22u0101t.c0h[2]) specifies which group (i.e. chip) you are interested in; some header keywords are chip-dependent.
A: - Take a look at the RSDP CONTROL KEYWORDS. As an example, look at the DARKCORR keyword. A value of 'complete' means that the step was performed; in the header of the raw data (.d0h) the same keyword is set to 'perform.' For exposures shorter than 10 seconds DARKCORR is not performed, but a shutter shading correction is.
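This check can also be scripted. Below is a minimal sketch (in Python) of comparing the calibration switch keywords between the raw and calibrated headers; the dicts and keyword values here are invented stand-ins for values that would actually be read from the .d0h and .c0h files:

```python
# Sketch: verify that every calibration step requested in the raw header
# was completed in the calibrated header.  The dicts stand in for header
# keywords read from the .d0h and .c0h files; the values are invented.
raw = {"MASKCORR": "perform", "DARKCORR": "perform", "FLATCORR": "perform"}
cal = {"MASKCORR": "complete", "DARKCORR": "complete", "FLATCORR": "complete"}

# Any switch still set to 'perform' in the raw header that did not come
# back 'complete' in the calibrated header signals a skipped step.
not_done = [key for key, val in raw.items()
            if val == "perform" and cal.get(key) != "complete"]
print(not_done or "all requested calibration steps completed")
```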
Look at the CALIBRATION REFERENCE FILES. These are the names of the files used to calibrate the data. You can look in the directory instrument_news/wfpc2 at the anonymous ftp site stsci.edu (STEIS) to find the latest list of WFPC2 reference files. Check the keyword values against this list. Pay particular attention to the FLATFILE and DARKFILE.
The most commonly used filter configurations will have appropriate flatfield reference files. The majority of these are thermal vac (made at JPL) flats which have been corrected for the OTA illumination pattern. On-orbit flats are being taken and will be put in the database as they are processed.
The hot pixels in WFPC2 images change with time, so you need to use a dark frame that is closely matched to the time of the observation. Darks are now made on a bi-weekly basis, but there is a lag between delivery of the completed darks and when the data are calibrated in the pipeline. If hot pixels pose a threat to the scientific return of your data, then you should consider recalibrating with a better dark. The new dark reference files will show up in the list on STEIS with a "useafter" date; each file applies to observations taken on or after that date, so use the dark whose useafter date is the latest one that does not fall after the date of your observation.
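As a sketch of that selection rule, assuming the usual convention that a reference file applies to data taken on or after its useafter date (the file names and dates below are invented, not real reference files):

```python
from datetime import date

def pick_dark(obs_date, darks):
    """Select the dark reference file for an observation.

    `darks` is a list of (useafter_date, filename) pairs, e.g. parsed
    from the STEIS reference-file list.  The applicable dark is the one
    with the latest useafter date not after the observation date.
    """
    eligible = [(ua, name) for ua, name in darks if ua <= obs_date]
    if not eligible:
        raise ValueError("no dark with a useafter date on or before the observation")
    return max(eligible)[1]   # tuples sort by date first

# Invented example list of bi-weekly darks:
darks = [
    (date(1994, 3, 1),  "dark_mar01.r3h"),
    (date(1994, 3, 15), "dark_mar15.r3h"),
    (date(1994, 4, 1),  "dark_apr01.r3h"),
]
print(pick_dark(date(1994, 3, 20), darks))   # -> dark_mar15.r3h
```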
Usually the last file in the list for a given configuration is best, but look at the comments in the file to decide. If you think you want to recalibrate the data you can use the calibration tools provided with STSDAS like 'calwp2.' Calibration files can be obtained from the data archive using the (x)starview software. Instructions on how to use StarView and STEIS can be found on pages 455 and 479 of the HST Data Handbook. For more information on flats, darks and other calibration issues, see the Instrument Definition Team's (IDT) Calibration Status report.
B: - Check the exposure time (EXPTIME) to be sure it is what you expected. The EXPTIME keyword reflects the true exposure time and is calculated from engineering data, so if it is something other than what you expect then there may have been a problem. Also, if the EXPFLAG is something other than 'normal' then there may have been some problem during the exposure. Look in the .ocx file to see if there is some explanation from the OSS person on duty (if there was one) when the observation was taken.
C: - Look at the RA_TARG and DEC_TARG keywords. These are the coordinates that were specified in the proposal. If everything executed correctly, then these are the correct coordinates for the image, and the CD matrix in the header correctly describes the transformation between pixel and celestial coordinates. Use the 'xy2rd' task to get coordinates from the image. There is also a new version of the 'metric' task that will work on WFPC2 images, but it is only available on STScI systems at the present time. If you have reason to believe that something is wrong with the pointing (i.e. off by more than 1-2"), then it is possible to get the actual pointing from the FGS guide star data. This information is now being archived for all observations, but is not being sent out on GO data tapes yet.
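For reference, the pixel-to-sky transformation that 'xy2rd' performs can be sketched as follows, assuming a simple tangent-plane (TAN) projection; the CRPIX, CRVAL and CD values below are invented for illustration, not taken from a real header:

```python
import numpy as np

# Invented group parameters (normally read from the .c0h header):
crpix = np.array([420.0, 424.5])          # reference pixel (x, y)
crval = np.array([182.6354, 39.4057])     # RA, Dec at the reference pixel (deg)
cd = np.array([[-1.25e-5,  2.0e-7],       # CD matrix (deg/pixel)
               [ 2.0e-7,   1.25e-5]])

def xy2rd(x, y):
    """Pixel (x, y) -> (RA, Dec) in degrees via an inverse TAN projection."""
    # Offsets from the reference pixel, mapped to standard coordinates:
    xi, eta = np.deg2rad(cd @ np.array([x - crpix[0], y - crpix[1]]))
    ra0, dec0 = np.deg2rad(crval)
    # Inverse gnomonic (tangent-plane) projection:
    den = np.cos(dec0) - eta * np.sin(dec0)
    ra = ra0 + np.arctan2(xi, den)
    dec = np.arctan((np.sin(dec0) + eta * np.cos(dec0)) / np.hypot(xi, den))
    return np.rad2deg(ra), np.rad2deg(dec)

ra, dec = xy2rd(420.0, 424.5)   # at the reference pixel this recovers CRVAL
```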
You might find the 'keyselect' task useful for looking at header keywords. This task will produce an SDAS table with the keywords you select from each image in the set. For example:
tt> keyselect *.c0h keys.tab "rootname,time-obs,filtnam1,exptime" tt> tprint keys.tab
This will produce an STSDAS table called "keys.tab" with the values of the keywords ROOTNAME, TIME-OBS, FILTNAM1 and EXPTIME. 'Tprint' prints the table on the screen.
Second, display the image using the 'display' task. You will need to open an image display tool, such as (x)imtool or SAOimage, and make sure that the IRAF variable stdimage is set to imt800 so that the entire image can be viewed:
st> set stdimage = imt800
st> display u22u0101t.c0h 1
The number after the image name specifies which group you want to look at: "1" is the PC, and "2"-"4" are the WF chips. Look at all four groups. Adjust the color stretch and contrast on the image display tool you are using. Look for anything obvious that might indicate an anomaly. Look for 'blooming' along the Y-axis, which is caused by saturation of bright sources. Also look for any cosmic rays that might interfere with data reduction (there are a number of IRAF/STSDAS tasks available to remove cosmic rays, such as 'gcombine' in the stsdas.toolbox.imgtools package, 'crrej' in the stsdas.hst_calib.wfpc package, and 'cosmicrays' in the noao.imred.ccdred package). To be absolutely sure an area of the image is not saturated, check the data quality files (see below).
If there are very faint sources in the data then you may have to set the min and max display ranges by hand in order to see the objects. ('display' sets the range automatically unless you tell it otherwise; cosmic rays can cause 'display' to set the max unreasonably high for images with low count rates.) For example:
st> display u22u0101t.c0h 1 zr- zs- z1=0 z2=200.
z1 is the min and z2 the max. 'zr- zs-' turns off the autoscaling.
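To see why hand-set limits help, here is a small simulation (not real WFPC2 data) of how a few cosmic-ray pixels inflate the autoscaled max; percentile-based limits are one robust way to choose z1 and z2:

```python
import numpy as np

# Simulated frame: faint sky plus read noise, with a handful of
# cosmic-ray hits at the saturation value.  All values are invented.
rng = np.random.default_rng(0)
image = rng.normal(50.0, 5.0, size=(800, 800))
image[rng.integers(0, 800, 50), rng.integers(0, 800, 50)] = 4095.0

z1_auto, z2_auto = image.min(), image.max()   # what naive autoscaling sees
z1, z2 = np.percentile(image, [1.0, 99.0])    # robust display range

# The cosmic rays push the autoscaled max to ~4095, so the faint sky is
# crushed into black; the 99th percentile stays near the real signal.
print(z2_auto, z2)
```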
You might try using the 'imexamine' task to get some additional image quality information. This task can be used to plot stellar radial profiles, centroid on objects, and do some crude aperture photometry. If you have a set of images you could use the 'imcentroid' task to see if a given star's position has changed from frame to frame or to note any systematic drift in the telescope pointing.
Third, look at the .ocx, .pdq and .trl files. These are ordinary text files and can be viewed with any text editor.
When the data are transmitted down from the spacecraft, they are sometimes looked at by the on-duty Observation Support System (OSS) personnel, who make a quick check for anomalies and estimate the "quality" of the data. OSS personnel are not always on duty, so there may not be any comments. If there are, the .ocx file will contain them along with some statistics of the raw data frames; any obvious saturation is usually noted as well. The .trl file is produced by the pipeline calibration and contains a log of the calibration steps. The .pdq file is also produced during calibration and contains the actual and predicted observational parameters, as well as any obvious features noted by the Post Observation Data Processing System (PODPS) operations astronomer.
Sometimes the OSS comments will say "blank image" even though there might really be faint objects there, so don't give up until you've actually looked at the data with an appropriate scaling to see the faint things. You know what to look for in your data.
Finally, you might want to check the data quality files to see if any of the pixels have been flagged for any reason.
The .c1h file contains an image of the same dimensions as the actual data. Each pixel in this image has a value that tells you the quality of the corresponding data pixel. A value of "0" is nominal. A value of "8" flags a saturated pixel (a raw data value >= 4095). A value of "2" indicates a reference file defect for that pixel. There are a number of other flags as well; a pixel's value is the sum of all the flags that apply, and each combination is unique. This file is best used to look for saturation. There are a number of STSDAS tasks to help you examine it. You can make a mask of values equal to a given flag using 'imcalc,' and it's sometimes useful to just display the .c1h file with the min set to 0 and the max set to 1: everything good will appear black and anything flagged as bad will appear white. Then use 'listpix' to list the pixels in a given area to pinpoint the pixels in question. You might also use the 'imexamine' task.
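As a sketch of how the flag values combine, here is a small example with an invented data quality array; the flag values 2 and 8 are the ones described above:

```python
import numpy as np

# Flags from the text: each is a distinct bit, and a pixel's DQ value
# is the sum of all flags that apply.
REF_DEFECT = 2
SATURATED = 8

# Tiny invented stand-in for the .c1h data quality image:
dq = np.array([[0,  8, 0],
               [2, 10, 0],      # 10 = 8 + 2: saturated AND a defect
               [0,  0, 8]])

saturated = (dq & SATURATED) != 0   # True wherever the saturation bit is set
good = dq == 0                      # nominal pixels
print(saturated.sum(), good.sum())  # counts of saturated and good pixels
```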
A listing of what each flag represents is available on page 87 of the HST Data Handbook. There is also an explanation given in the 'calwp2' help file (type "help calwp2").
A NOTE ABOUT HOPRs: Any OBVIOUS problems with your data are identified by STScI staff, and the observations are automatically re-scheduled if the data are clearly compromised (e.g. failure to acquire guide stars). Less obvious problems (e.g. a shortened integration time) must be identified by the observer, who then has the option of asking that the data be re-taken by filing a HOPR (HST Observation Problem Report). Generally, if the HOPR is approved, the previous data are archived as non-proprietary and the observation is re-scheduled. Contact email@example.com if you think you should file a HOPR, and we will help you research the problem. Note that HOPRs *must* be filed within 90 days of the receipt of your data.