
LABOCA Photometry with CRUSH

Updated for 2.15 release


Introduction

The woodoo observing mode

After some less successful earlier attempts, the photometry mode was finally commissioned for LABOCA (and SABOCA) during the March 2010 (03/11–03/15) technical run (under T-084.F-0001-2009). The observing mode is basic chopped photometry with a symmetric nod: the wobbler is used for fast switching (chopping) between the source of interest and an off position on a selected bolometer. After spending some time with the source in the left beam, the telescope nods to the other side, where it continues chopping with the source now in the right beam. The cycle is repeated until the desired depth is reached.

This symmetry is required because the left and right beams illuminate the telescope primary differently (due to their off-axis positions), resulting in a large and arbitrary power imbalance between them. By combining the symmetric nod phases, the source flux can be separated from the imbalance, as long as the atmosphere is sufficiently stable that systematics can be ignored.
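To illustrate the arithmetic behind this separation, here is a minimal schematic sketch (with made-up numbers, and not the actual CRUSH pipeline): in one nod position the chopped signal contains the beam imbalance plus the source, in the other the imbalance minus the source, so half the difference of the two nod phases recovers the source flux while the imbalance cancels.

    import numpy as np

    # Schematic chop-nod demodulation (illustration only, not the CRUSH pipeline).
    # Hypothetical values: source flux S, left/right power imbalance D, white noise.
    rng = np.random.default_rng(0)
    S, D, noise = 0.3, 5.0, 0.05                     # Jy (made-up numbers)
    n = 60                                           # ~1 Hz chop over a ~1-minute nod phase

    nod_A = D + S + noise * rng.standard_normal(n)   # source in the left beam
    nod_B = D - S + noise * rng.standard_normal(n)   # source in the right beam

    # The imbalance D cancels in the symmetric combination; the source flux remains.
    source = 0.5 * (nod_A.mean() - nod_B.mean())
    print(f"recovered source flux ~ {source:.3f} Jy (true value: {S} Jy)")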

Reducing photometry data with CRUSH

As of version 2.02, CRUSH provides the capability to reduce photometric data obtained in this symmetric chop mode (a.k.a. woodoo). Simply specify the scans on the command-line, just like you would do for mapping. CRUSH will automatically recognize that these are photometry scans and will reduce data accordingly. E.g.,

crush laboca [...] 9561-9566

Modifiers, such as faint and deep, are encouraged when appropriate. Not using these on a weak source will result in sub-optimal discrimination of bad data, while using them incorrectly on a bright source will result in over-flagging. The results from the commissioning run in March 2010 are summarized in Table 1. Fluxes are shown for default, faint and deep reductions, while fluxes obtained from imaging are shown in the last column for comparison (when available). The results were obtained using version 2.15-b2, and should be representative of the 2.15 (and later) releases.
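For example, a weak source would typically be reduced with the faint option added as a command-line flag (the scan range below is just the placeholder example from above):

crush laboca -faint [...] 9561-9566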

Table 1. Photometry results from the March 2010 commissioning.

Object     Time        (default)          faint              deep               Imaging
Vesta      1.0 min     2.10 ± 0.044 Jy    2.10 ± 0.044 Jy    2.21 ± 0.046 Jy    2.20 ± 0.046 Jy
Metis      6.9 min     293 ± 18 mJy       294 ± 18 mJy       310 ± 19 mJy       314 ± 33 mJy
Angelina   42.3 min    22.4 ± 5.7 mJy     22.3 ± 5.6 mJy     23.5 ± 5.9 mJy     --
SMM14009   113.5 min   11.2 ± 2.9 mJy     11.4 ± 2.8 mJy     12.9 ± 3.0 mJy     --
BR1202     141.4 min   25.0 ± 2.2 mJy     23.9 ± 2.2 mJy     25.4 ± 2.3 mJy     --

As you can see, the photometry reductions recover the same flux as the imaging, within the indicated errors. There is good agreement between results obtained with the default pipeline and with the faint and deep options. This is good news, since the latter two options mainly differ in the way they treat large-scale (extended) emission, which is of no consequence for single-beam photometry. Below, you can find more details about the reproducibility and accuracy of the photometric reductions, if you are interested.

What's new?

The photometry reduction of CRUSH is improved from time to time, as the pipeline incorporates new ideas or fixes various bugs. Here is an overview of what has changed since 2.02-1, the first release to support photometric reductions.

Release 2.15-1

Photometry in the 2.15-1 release has much improved reliability and reproducibility. Ultimately, this means better sensitivity and results that you can trust. These improvements are the result of a lot of work under the hood:

The release also provides additional new outputs for photometry reductions:

Release 2.03-1

The 2.03-1 release (see the corresponding earlier version of this document) brings a number of improvements in the photometry reductions. The main changes are:

The net effect is fluxes that are about 10–15% higher than previously (i.e. these were about 10–15% too low before), and improved systematics mean that for >1Hz chops, the scatter of measurements is more in line with the estimated errors (around 8% excess vs. around 35% before). As such, the reported flux errors are confirmed to be reasonably precise.

Analysis

The close agreement between fluxes obtained in photometry and in imaging modes ought to convince you that the values are reliable. However, of equally great concern is whether the indicated uncertainties provide a fair estimate of the measurement errors. Ideally, repeated measurements of the same source should yield values that are scattered around the true flux of the source with a spread equal to the estimated uncertainty.

How errors are calculated

CRUSH estimates the error of the photometry directly from the data. The estimate has three components:

The first component provides a very accurate measure of the detector sensitivities, and thus a lower limit to the measurement uncertainty. However, the presence of 1/f noise in the chopping phases means that the actual noise of the photometry is higher than the white-noise limit calculated from the high-frequency time-stream.

This is where the second method of estimation provides some relief. Given a 1 Hz chop (typical for LABOCA) and a minute-long nod phase (also typical, although shorter would probably be better; see further below), there are around 60 repeated measurements of the relevant fluxes (power imbalance plus/minus source) in the subscan. The actual scatter of these can be used to estimate the systematic errors coming from the chopping motion itself. This correction for 1/f-type detector noise is performed for each pixel, and also globally for each nod cycle.
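As a rough illustration of such a scatter-based estimate (a minimal sketch with made-up numbers, not the actual CRUSH implementation), the uncertainty of a nod phase can be taken from the RMS scatter of its ~60 chop-cycle measurements, which automatically absorbs excess 1/f noise that a pure white-noise calculation would underestimate:

    import numpy as np

    # Scatter-based error estimate for one nod phase (illustration only).
    # Hypothetical inputs: ~60 chop-cycle measurements with white noise plus
    # a slow drift standing in for 1/f-type noise.
    rng = np.random.default_rng(1)
    n = 60
    white_sigma = 0.05                                    # per-chop white-noise level (made up)
    drift = 0.03 * np.sin(np.linspace(0, 3 * np.pi, n))   # slow drift (mock 1/f noise)
    chops = 1.0 + white_sigma * rng.standard_normal(n) + drift

    white_error = white_sigma / np.sqrt(n)                # optimistic, white-noise-only error
    scatter_error = chops.std(ddof=1) / np.sqrt(n)        # error from the actual scatter

    print(f"white-noise error: {white_error:.4f}, scatter-based error: {scatter_error:.4f}")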

Consistency and Reliability

There are also long-timescale systematic effects that degrade the photometric precision. Chief among these are (a) pointing drifts, (b) changes in focus quality, and (c) the transient presence of emission in the 'OFF' beam as the sky rotates relative to the chopping direction. In order to accurately capture the full point-source flux in the photometry, the focus (especially the z-focus!) has to be maintained at its optimum, the telescope pointing has to be controlled to a small fraction of the beam size, and the observer has to choose the wobbler throw carefully to avoid other sources of emission in the 'OFF' beam. A pointing drift of just 1/5th of a beam (around 3.6" for LABOCA, and 1.5" for SABOCA) will degrade the detected source flux by 10%, while a drift of 1/3rd of a beam (6" for LABOCA, and 2.5" for SABOCA) will result in a 25% degradation.

To catch such long-timescale systematic variation, CRUSH calculates a reduced-χ value from the scan-to-scan scatter when multiple scans are reduced together. This reduced-χ value is also reported at the end of the reduction. When the reduced-χ exceeds 1.0, one can multiply the reported uncertainty by it to arrive at a more robust estimate of the total systematic uncertainty of the measurement.
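The quoted flux losses follow from a Gaussian beam profile, for which a pointing offset Δ attenuates the peak response by exp(-4 ln 2 (Δ/FWHM)²). A quick check of the numbers above (a sketch assuming a LABOCA beam FWHM of about 18.5"):

    import numpy as np

    # Peak-flux loss of a point source due to a pointing offset, assuming a
    # Gaussian beam. The FWHM below is an assumed value (~18.5" for LABOCA).
    def peak_loss(offset, fwhm):
        """Fraction of peak flux lost for a pointing offset (same units as fwhm)."""
        return 1.0 - np.exp(-4.0 * np.log(2.0) * (offset / fwhm) ** 2)

    fwhm = 18.5  # arcsec (assumed)
    for frac in (1 / 5, 1 / 3):
        print(f"offset of {frac:.2f} beam ({frac * fwhm:.1f} arcsec): "
              f"{100 * peak_loss(frac * fwhm, fwhm):.0f}% flux loss")

The result is close to the 10% and 25% figures quoted above.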

The plots below illustrate this by showing the scan-by-scan breakdown for the three sources (see above) observed with multiple scans during the March 2010 photometry commissioning.

Figure 1. Scan-by-scan photometric scatter for Angelina, SMM14009, and BR1202.

In general, the photometry results of the scans are in good agreement with one another. However, there is clearly the occasional outlier (though not beyond 3σ from the mean!). It is important to remember that such discrepancies are bound to happen (because of pointing, focus, nearby sources, some transient pickup in the detectors, or other reasons). To mitigate their effect, it is best to split long integrations into several shorter scans (with regular pointing and focus checks in between), to choose the wobbler throw such that the 'OFF' beam stays clear of other emission, and to reduce the scans together, so that discrepant measurements show up in the scan-to-scan scatter (i.e. in the reported reduced-χ).

Provided that one follows these basic guidelines, the resulting photometry should be rock solid...
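As a concrete illustration of how the scan-to-scan scatter feeds into the final uncertainty, here is a minimal sketch (not CRUSH code, with made-up fluxes) of a weighted combination of per-scan results, the reduced-χ described above, and the corresponding error inflation:

    import numpy as np

    # Combine per-scan fluxes, compute the scan-to-scan reduced chi, and inflate
    # the formal error when the scatter exceeds the estimated noise.
    # (Illustration only; the fluxes and errors below are made up, in mJy.)
    flux = np.array([22.0, 27.5, 18.9, 24.1])      # per-scan fluxes
    sigma = np.array([5.5, 5.8, 5.6, 5.7])         # per-scan uncertainties

    w = 1.0 / sigma**2
    mean = np.sum(w * flux) / np.sum(w)            # inverse-variance weighted mean
    mean_sigma = 1.0 / np.sqrt(np.sum(w))          # formal error of the mean

    chi_red = np.sqrt(np.sum(w * (flux - mean)**2) / (len(flux) - 1))

    # Multiply the reported uncertainty by reduced-chi when it exceeds 1.0.
    total_sigma = mean_sigma * max(1.0, chi_red)
    print(f"flux = {mean:.1f} +/- {total_sigma:.1f} mJy  (reduced-chi = {chi_red:.2f})")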

Sensitivity and Radiometric Performance

To test the radiometric performance of the data reduction, the same set of March 2010 commissioning data is used as in the table above, but reduced with the tau=0.0 option to remove the variations in atmospheric transmission and yield unattenuated sensitivities. The data show just about perfect radiometric behaviour, with the expected t^-0.5 down-integration. The corresponding unattenuated photometric NEFD is about 150 mJy s^0.5. Of course, with a line-of-sight opacity τ, the effective NEFD will be exp(τ) higher.
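In practice, this translates into an expected point-source RMS of σ = NEFD · exp(τ) / √t for a net integration time t. A back-of-the-envelope sketch (the opacity and times below are assumed example values):

    import numpy as np

    # Back-of-the-envelope sensitivity estimate from the NEFD (illustration only).
    # sigma = NEFD * exp(tau) / sqrt(t), with t the net integration time in seconds.
    nefd = 150.0            # mJy s^0.5, unattenuated photometric NEFD from above
    tau = 0.3               # assumed line-of-sight opacity (example value)
    t = 30 * 60.0           # 30 minutes of net integration time

    sigma = nefd * np.exp(tau) / np.sqrt(t)
    print(f"expected 1-sigma point-source noise: {sigma:.1f} mJy")

    # Or inverted: the time needed to reach a target RMS of, say, 3 mJy.
    target = 3.0
    t_needed = (nefd * np.exp(tau) / target) ** 2
    print(f"time to reach {target} mJy rms: {t_needed / 3600:.1f} hours")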

Figure 2. Radiometric performance.

It is worth mentioning that this NEFD is calculated per net integration time, of which only 50% is spent looking at the source. Thus, the photometric NEFD of LABOCA would correspond to a mapping NEFD of ~100 mJy s^0.5, which is still noticeably above the nominal performance. The reason for the discrepancy is that for slow chops (~1 Hz) the photometry signal tends to be affected by both detector 1/f noise and 1/f² sky noise. Indeed, looking at the figure below, it is clear that the poorest-sensitivity commissioning scans are the ones obtained at a 0.5 Hz chop. The dotted curve shows an assumed 1/f² behaviour fitted to the observations (a power index of 2 is expected for sky-noise-dominated 1/f-type noise). Accordingly, the photometric noise floor, achievable only with high-frequency chopping, is expected to be around 110–120 mJy s^0.5 (an equivalent mapping sensitivity of around 80 mJy s^0.5), consistent with expectations.

Observers should avoid slow chopping below 1 Hz, as it results in a significant noise penalty. At 1.0–2.0 Hz, the effective photometric NEFD is expected to be around 120–130 mJy s^0.5 (not including the wobbler efficiency overheads!).
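One simple way to picture this frequency dependence (a sketch with illustrative parameters, not the fitted curve of Figure 3) is an effective NEFD that combines a white-noise floor with a 1/f² excess in quadrature:

    import numpy as np

    # Illustrative NEFD vs. chop-frequency model: a white-noise floor plus a 1/f^2
    # excess added in quadrature. The floor and knee values below are assumptions
    # chosen for illustration, not the fit shown in Figure 3.
    def nefd(f_chop, floor=115.0, knee=0.5):
        """Effective photometric NEFD in mJy s^0.5 for a chop frequency in Hz."""
        return floor * np.sqrt(1.0 + (knee / f_chop) ** 2)

    for f in (0.5, 1.0, 1.5, 2.0):
        print(f"{f:.1f} Hz chop: ~{nefd(f):.0f} mJy s^0.5")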

Figure 3. Sensitivity vs. chop frequency: the importance of a fast chop.

(As a side note, the original SHARC 350µm camera on the CSO used a ~4.1 Hz chop with a 15-second nod; anything slower than this showed a clear excess in the residual noise. The 870µm sky at APEX may be somewhat more stable, but the evidence strongly suggests that the 1/f-type photometric noise would go down with faster chopping and/or nodding.)

Conclusions

CRUSH provides a viable approach for reducing chopped (woodoo) photometry scans for LABOCA (and SABOCA), yielding reliable and reproducible results. The reported flux values include a best estimate of the systematic errors, while long-timescale variations, spanning multiple scans, are reported as a reduced-χ value, which the user can apply to further refine the total systematic error estimate. The excess noise of photometry vs. mapping is most likely due to the very slow nodding (1-minute cycles) of the telescope, as well as the relatively slow chop (~1 Hz) used during the commissioning observations. The systematic errors will likely decrease with a faster chop/nod. Observers are strongly encouraged to chop fast and nod more frequently (e.g. every 15 seconds) to obtain more precise photometry.

 
Copyright © 2013 Attila Kovács (attila[AT]submm.caltech.edu)