...

The solid angle Omega subtended within an angle A of the celestial pole is Omega = 2*pi*(1 - cos(A)).

A (deg)   declination   Omega (sr)/2pi   sq deg   N fields
30        -60           0.13             2681     280
45        -45           0.29             5981     623
60        -30           0.5              10313    1074
70        -20           0.66             13607    1417
90        0             1                20626    2148
120       +30           1.5              30939    3222

So within 2 airmasses we can reach 3/4 of the entire sky. A six-band, annual 3pi survey would require 6*3222 = ~19K visits. At 50 seconds per visit, this is essentially 16K minutes ~ 270 hours = 30 nights per year. But of course it's only the marginal investment that should count. 
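
A minimal sketch that reproduces the table above, assuming a 9.6 square degree field of view and 41,253 square degrees over the whole sky (small differences in the sq-deg column come from the table rounding Omega/2pi to two digits before converting):

import math

FIELD_AREA = 9.6      # sq deg per field
WHOLE_SKY = 41253.0   # sq deg over 4pi sr

for A in (30, 45, 60, 70, 90, 120):
    omega_over_2pi = 1.0 - math.cos(math.radians(A))   # Omega = 2pi(1 - cos A)
    sq_deg = omega_over_2pi / 2.0 * WHOLE_SKY          # fraction of 4pi, times whole sky
    print(f"A={A:3d}  Omega/2pi={omega_over_2pi:.2f}  "
          f"{sq_deg:6.0f} sq deg  {sq_deg / FIELD_AREA:5.0f} fields")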

...

Number of fields vs. declination

-90 0
-87 7
-84 13
-81 19
-78 25
-75 31
-72 36
-69 42
-66 48
-63 53
-60 59
-57 64
-54 69
-51 74
-48 78
-45 83
-42 87
-39 91
-36 94
-33 98
-30 101
-27 104
-24 107
-21 109
-18 111
-15 113
-12 114
-9 115
-6 116
-3 116
0 117
3 116
6 116
9 115
12 114
15 113
18 111
21 109
24 107
27 104
30 101
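
The listing above is consistent with tiling each 3-degree declination ring with field centers spaced 3.1 degrees apart in RA; a minimal sketch (the ceil() rounding is an assumption, but it matches every row):

import math

FIELD_WIDTH = 3.1   # deg between field centers along a ring

for dec in range(-90, 33, 3):
    ra_extent = 360.0 * math.cos(math.radians(dec))   # degrees of RA at this declination
    print(dec, math.ceil(ra_extent / FIELD_WIDTH))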

 


Cadence, coverage, and passband trades.

Considering only the time between astronomical twilights, using skycalc:

month   date     duration (hr)
0.5     Jan 11   6.9
1       Jan 26   7.4
1.5     Feb 09   7.8
2       Feb 25   8.4
2.5     Mar 11   8.9
3       Mar 26   9.4
3.5     Apr 09   9.8
4       Apr 25   10.2
5       May 09   10.5
5.5     May 24   10.8
6       Jun 07   10.9
6.5     Jun 22   10.9
7       Jul 07   10.9
7.5     Jul 22   10.7
8       Aug 06   10.4
8.5     Aug 20   10.1
9       Sep 04   9.7
9.5     Sep 18   9.3
10      Oct 04   8.8
10.5    Oct 18   8.3
11      Nov 02   7.8
11.5    Nov 17   7.3
12      Dec 02   6.9
12.5    Dec 16   6.7
13      Dec 31   6.7

Average duration is 9 hours. A rough sinusoidal fit: Obstime = 9.2 + 2*cos(2*pi*t/yr), with t measured from the June maximum.
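
A quick evaluation of this fit (assuming t in years from the Jun 07 maximum):

import math

def obstime_hours(t_years):
    # Sinusoidal fit to night length; t in years from the June maximum
    return 9.2 + 2.0 * math.cos(2.0 * math.pi * t_years)

print(obstime_hours(0.0))   # 11.2 hr vs. 10.9 from skycalc (fit overshoots slightly)
print(obstime_hours(0.5))   # 7.2 hr vs. 6.7 in mid-December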

 


...


Sky Brightness from LSST ETC

...

Footprint is 26.82 pixels, Gaussian weighted. Units are electrons in a 15 sec exposure. Moon is 90 deg from boresight.

lunar phase \ filter      u      g      r      i      z      y4
0                         42     89     111    159    218    238
3                         53     107    113    159    218    238
7                         81     160    140    174    224    238
11                        143    280    205    215    242    242
14                        240    474    305    269    263    254
(bright/dark) ratio       5.7    5.3    2.7    1.7    1.2    1.07
SNR impact ~ sqrt(sky)    2.4    2.3    1.6    1.3    1.1    1.03
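
The last two rows follow from the lunar phase 0 and 14 rows; sky-limited SNR degrades as sqrt(sky), so:

import math

bands     = ["u", "g", "r", "i", "z", "y4"]
new_moon  = [42, 89, 111, 159, 218, 238]     # e-/pixel in 15 s at lunar phase 0
full_moon = [240, 474, 305, 269, 263, 254]   # lunar phase 14

for band, dark, bright in zip(bands, new_moon, full_moon):
    ratio = bright / dark
    print(f"{band:2s}  bright/dark = {ratio:4.2f}  SNR impact = {math.sqrt(ratio):4.2f}")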

SDSS cumulative DR1 sky brightness distribution (no bright time imaging...)

...

1) additional optical attenuation due to increased atmospheric path length

2) degraded "seeing". 

band   central wavelength   extinction (mag, airmass a)   seeing degradation vs. r at zenith
u      350 nm               0.40*a                        1.10 * a^0.6
g      450 nm               0.18*a                        1.07 * a^0.6
r      650 nm               0.10*a                        1.00 * a^0.6
i      750 nm               0.08*a                        0.97 * a^0.6
z      850 nm               0.05*a                        0.94 * a^0.6
y      1000 nm              0.04*a                        0.91 * a^0.6

...

Plot of seeing degradation vs. airmass, and polynomial fit:

...

SNR scales as source flux in the numerator and (for unresolved objects) as seeing in the denominator. Flux at airmass a is reduced by a factor f(a) = 10^(-x*(a-1)/2.5), where x is the extinction coefficient listed in the table above. So SNR vs. airmass at fixed exposure time for unresolved point sources scales as SNR(a) ~ 10^(-x*(a-1)/2.5)/(0.35 + 0.72*a - 0.07*a^2).
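
A minimal sketch that reproduces the table below from the extinction coefficients and the polynomial seeing fit (agrees with the tabulated values to about a unit in the last digit):

extinction = {"u": 0.40, "g": 0.18, "r": 0.10, "i": 0.08, "z": 0.05, "y": 0.04}

def seeing_degradation(a):
    # Polynomial fit to the a^0.6 seeing growth, normalized to 1 at a = 1
    return 0.35 + 0.72 * a - 0.07 * a**2

def snr_ratio(band, a):
    # Point-source SNR at airmass a relative to a = 1, fixed exposure time
    flux = 10.0 ** (-extinction[band] * (a - 1.0) / 2.5)   # extinction loss
    return flux / seeing_degradation(a)

for i in range(11):
    a = 1.0 + 0.1 * i
    row = "  ".join(f"{snr_ratio(b, a):.2f}" for b in "ugrizy")
    print(f"{a:.1f}  {seeing_degradation(a):.2f}  {row}")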

airmass   seeing degradation   SNR_u   SNR_g   SNR_r   SNR_i   SNR_z   SNR_y
1.0       1.0                  1.0     1.0     1.0     1.0     1.0     1.0
1.1       1.06                 0.91    0.93    0.94    0.94    0.94    0.94
1.2       1.11                 0.83    0.87    0.88    0.88    0.89    0.89
1.3       1.17                 0.77    0.81    0.83    0.84    0.84    0.85
1.4       1.22                 0.71    0.77    0.79    0.79    0.80    0.81
1.5       1.27                 0.65    0.72    0.75    0.76    0.77    0.77
1.6       1.32                 0.61    0.68    0.71    0.72    0.73    0.74
1.7       1.37                 0.56    0.65    0.68    0.69    0.71    0.71
1.8       1.42                 0.53    0.62    0.65    0.66    0.68    0.68
1.9       1.46                 0.49    0.59    0.62    0.64    0.65    0.66
2.0       1.52                 0.46    0.56    0.60    0.61    0.63    0.64

fits

Seeing = 0.35 + 0.72*a - 0.07*a^2

SNR_u = 2.1 - 1.4*a + 0.30*a^2

SNR_g = 1.9 - 1.1*a + 0.23*a^2

SNR_r = 1.8 - a + 0.21*a^2

SNR_i = 1.8 - 0.98*a + 0.20*a^2

SNR_z = 1.7 - 0.94*a + 0.19*a^2

SNR_y = 1.7 - 0.93*a + 0.19*a^2

Slew times. 

Oct 19 2013, CWS. 

...

See http://www.gb.nrao.edu/~rcreager/GBTMetrology/140ft/l0058/gbtmemo52/memo52.html for az rates vs. zenith angle. 

 

 



Coverage Rate

At 35 seconds per visit and 9.6 square degrees per field, we cover the sky at a rate of about 7900 square degrees in an 8-hour night. That means the average revisit interval (ignoring weather) is about 3 days for 18,000 square degrees.

...

The position of objects on the sky changes in the right ascension direction at an angular rate of 15 degrees per hour times cos(declination). How long does it take the sky to rotate by one field width, as a function of declination? It takes 3.1/15 = 0.2 hours = 12 minutes on the equator, and t(dec) = cos(dec)*12 minutes elsewhere. So if we were scanning along the meridian we would have to return to a given declination at an interval of cos(dec)*12 minutes, to get full coverage at minimum airmass for each declination band. At 50 seconds per field (average), in 12 minutes we would cover 3.1*12*(60/50) = 45 degrees of declination.
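
A minimal sketch of this meridian-scan arithmetic (3.1 deg field width and 50 s per visit, as assumed above; the unrounded 12.4 min drift time gives ~46 deg):

import math

FIELD_WIDTH = 3.1    # deg
VISIT_TIME  = 50.0   # s per field, including overhead

def drift_time_minutes(dec_deg):
    # Time for the sky to rotate by one field width at this declination
    return (FIELD_WIDTH / 15.0) * 60.0 * math.cos(math.radians(dec_deg))

for dec in (0, -30, -60):
    t = drift_time_minutes(dec)
    strip = (t * 60.0 / VISIT_TIME) * FIELD_WIDTH   # deg of declination per drift time
    print(f"dec {dec:4d}: drift {t:4.1f} min, covers {strip:4.1f} deg of declination")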


One potential approach:

  1. determine the rank-ordered priority of all fields on the meridian, or at that night's minimum airmass if they don't transit, in each passband, for different potential values of seeing. 

  2. reject the fields that never appear in the top ~1000. These have such low priority we'd never get to them in a single night. 

  3. For each parametric value of seeing, compute the sequence of observations that maximizes the merit function, including the slew overhead contribution. 

...

Took Gautham's data set, corrected for airmass, and made a cumulative plot of the sorted zero-point deltas.

percentile   extinction from clouds (mag, mean-subtracted)
10           -0.146
25           -0.111
50           -0.076
75           -0.033
80           -0.014
90            0.166
95            0.486
99            1.71
99.9          2.52
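
A minimal sketch of the percentile computation, assuming zp_delta holds the airmass-corrected zero points (the file name is a hypothetical stand-in for Gautham's data set):

import numpy as np

zp_delta = np.loadtxt("zeropoint_residuals.txt")   # hypothetical file of residuals (mag)
zp_delta -= zp_delta.mean()                        # subtract the mean, as above

for p in (10, 25, 50, 75, 80, 90, 95, 99, 99.9):
    print(f"{p:5.1f}%  {np.percentile(zp_delta, p):+.3f} mag")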
  


...

Stubbs Notes, May 25 2014. 

Jaimal and I have agreed that a weighted-sum and a product figure of merit amount to the same thing. So we'll stick with the weighted sum and compute a Figure of Merit accordingly: 

Latex formatting
$FOM = \sum_{programs=1}^{M} \sum_{fields=1}^{N} w_{program} w_{field} M_{field}$,

where the weights w and merits M are drawn from multiple considerations. We'll tune values of M to range from 0 to 1, where they saturate. Some candidate elements for the merit by field:

 

 



Latex formatting
$M_{temporal,1} = 1 - e^{-t/\tau_f}$, where $t$ is the (partial-credit) time elapsed since the last observation and $\tau_f$ is the field-dependent maximum unobserved gap.
Latex formatting
$M_{temporal,2} = 1/2 + (1/\pi)\arctan((t - \tau_1)/\tau_2)$
Latex formatting
$M_{seeing,WL} = 1/2 - (1/\pi)\arctan((FWHM - FWHM_1)/FWHM_2)$
Latex formatting
$M_{uniformity} = 1/2 - (1/\pi)\arctan((depth - mean(depth))/d_2)$
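
A minimal sketch of these merit functions and the weighted-sum FOM in Python (parameter names follow the formulas above; the example values are illustrative):

import math

def m_temporal_1(t, tau_f):
    # Exponential saturation: ~0 right after an observation, -> 1 for t >> tau_f
    return 1.0 - math.exp(-t / tau_f)

def m_temporal_2(t, tau1, tau2):
    # Arctan step: equals 0.5 at t = tau1; tau2 sets the slope there
    return 0.5 + math.atan((t - tau1) / tau2) / math.pi

def m_seeing_wl(fwhm, fwhm1, fwhm2):
    # Rewards good seeing: merit rises as FWHM drops below fwhm1
    return 0.5 - math.atan((fwhm - fwhm1) / fwhm2) / math.pi

def m_uniformity(depth, mean_depth, d2):
    # Favors fields whose co-added depth lags the mean
    return 0.5 - math.atan((depth - mean_depth) / d2) / math.pi

def figure_of_merit(terms):
    # FOM = sum over (program weight, field weight, field merit) triples
    return sum(w_prog * w_field * merit for w_prog, w_field, merit in terms)

# Example: atan temporal merit for tau1 = 45 d, tau2 = 5 d, evaluated at t = 30 d
print(m_temporal_2(30.0, 45.0, 5.0))   # ~0.10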


For the atan() function, tau_1 determines the 50% point and tau_2 the slope of the merit function at that point. 

Atan merit function for tau1=45 and tau2 values of 1 (red), 5 (black) and 10 (blue) days. The same basic shape applies to the FWHM and depth merit functions.


And here is a plot of exponential weight evolution for taus of 5 (red), 10 (black) and 30 (blue) days. 


Here is an example of FWHM-based merit, driving a field higher if seeing is really excellent. This is for FWHM_1= 0.5 and FWHM_2=0.1. Depth uniformity would look the same as this. 




This FOM is computed per field, per passband, for each potential observation. We can also introduce a couple of penalties:

  • penalize an observation if a better opportunity will come up within the characteristic time tau
  • penalize observations as a function of hour angle, favoring observations towards the East, since we can follow transients for a longer time for those fields. 
  • We also need to compute a penalty that connects pairs of fields, namely the slew time between them.

We can adopt the 5 or 10 sigma point source limiting magnitude as a good indicator of quality of an observation.  

Apart from atmospheric variation in cloud transparency and seeing, the observing conditions as a function of time are deterministic. The zenith angle and sky brightness can be computed, and so the ten sigma point source magnitude depends upon

  1. sky brightness, which is a function of moon phase, distance from the moon, lunar elevation, solar cycle. 
  2. zenith angle, which affects both atmospheric attenuation and seeing degradation. 

We can compute all of this in advance, for each field. Jaimal found what seems to be a good Python package for this

http://pythonhosted.org/Astropysics/coremods/obstools.html#astropysics.obstools.Site.apparentCoordinates

So we want to compute, for each field and for each observing opportunity, a zenith-angle and sky-brightness adjusted 5 sigma point source magnitude, m5. The signal to noise ratio for a point source scales as 

Latex formatting
$SNR = \frac{\Phi}{\sqrt{FWHM^2 \cdot sky}}$

for a fixed integration time and in the sky-dominated regime. Taking the log of both sides, and incorporating the zenith-dependence of FWHM and also passband-dependent extinction A(zenith), and incorporating the attenuation due to clouds AC (in magnitudes), we compute a change in m5 relative to observing at the zenith and under a sky background of m_o magnitudes per square arc sec, 

Latex formatting
$dm5 = AC + A\,\sec(z) + 0.5\,\log(FWHM/0.7) + 0.5\,(m_o - m_{sky}) + 1.5\,\log(\sec(z))$

This includes the zenith-dependence of FWHM, which scales as airmass^0.6, and extinction in the various bands. The coefficient of the final term comes from the airmass dependence of seeing, a^0.6, and the 2.5 factor for magnitudes, so that 2.5*0.6 = 1.5.
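
A minimal sketch of the dm5 adjustment, with coefficients taken verbatim from the formula above (the 21.0 fiducial dark-sky brightness is a placeholder assumption):

import math

def dm5(AC, A, zenith_deg, fwhm, m_sky, m_o=21.0):
    # AC: cloud attenuation (mag); A: per-band extinction coefficient;
    # fwhm: delivered seeing (arcsec); m_sky: sky brightness (mag/arcsec^2)
    sec_z = 1.0 / math.cos(math.radians(zenith_deg))
    return (AC
            + A * sec_z
            + 0.5 * math.log10(fwhm / 0.7)
            + 0.5 * (m_o - m_sky)
            + 1.5 * math.log10(sec_z))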

Have each science program fill out this table, for each field center. Constrain field weights so they sum to one for each program. Examples from SN, weak lensing, and static sky are illustrated:

program      program weight   field ID   filter   field weight   tau 1 (days)   tau 2 (days)   FWHM 1   FWHM 2   depth 1
WL           0.3              100        r        epsilon        365            100            0.5      0.1      27
WL           0.3              101        r        epsilon        365            100            0.5      0.1      27
SN           0.2              205        g        epsilon        25             2              0.1      0.1      25
static sky   0.2              205        g        epsilon        365            100            0.8      0.2      27


A high value for tau1,2 de-emphasizes that aspect. A low value for FWHM1,2 de-emphasizes seeing. 

A prescription for a (single-band, for now) optimization strategy would be

  1. allocate weights to different science programs, based on fashion and merit
  2. have those science programs determine merit attributes for all fields
  3. pre-calculate zenith angle and sky background dependent m5 values for all fields, for all potential observations. 
  4. Construct a nominal m5 value for zero clouds and median FWHM, for all fields for all observation slots. 

Then, before the start of each night

  1. trim list of candidate fields to the ones that are above some cutoff airmass
  2. estimate the co-added depth for each one, compute their depth merit functions 
  3. determine the (partial-credit) time since last observed for each field in each band, and compute the temporal merit function for each field/passband combination
  4. compute merit function for each field and passband, and calculate nominal sky merit function vs. t by looking forward until temporal merit hits 0.9. Compute penalty for observing it now if better chance later, for each field and passband. 
  5. Pick the top 150 fields for each passband and add up the merit function for each band. Select the highest total as the starting filter for the night; that establishes the first hour's worth of observations.

Adopt a block-wise approach. While taking data during a block of 150 pointings (which will take about an hour), remove them from the list of candidate fields by setting weights to zero, or whatever. 

Decide whether we should change filters: Compute sum-of-merit for next-best 150 observations in current filter, as opposed to most important ~100 observations in other bands. This imposes the filter change penalty properly. 

Then decide on the best order for observing the most important fields. This is now just a question of optimizing the sequence of 150 pointings. A schematic sketch of the whole loop follows.
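
A highly schematic sketch of this block-wise loop (BLOCK_SIZE, the filter-change penalty, and the data structures are illustrative assumptions, not actual scheduler parameters):

from dataclasses import dataclass

@dataclass
class Field:
    field_id: int
    merit: float

BLOCK_SIZE = 150
FILTER_CHANGE_PENALTY = 0.9   # assumed multiplicative cost of a filter swap

def pick_block(candidates, band):
    # Top-N candidate fields by merit in one passband
    return sorted(candidates[band], key=lambda f: f.merit, reverse=True)[:BLOCK_SIZE]

def order_by_slew(block):
    # Placeholder for optimizing the ~150-pointing sequence for slew time
    return block

def block_merit(candidates, band):
    return sum(f.merit for f in pick_block(candidates, band))

def schedule_night(candidates):
    # candidates: {band: [Field, ...]} trimmed to fields above the airmass cutoff
    bands = list(candidates)
    band = max(bands, key=lambda b: block_merit(candidates, b))   # starting filter
    plan = []
    while any(candidates[b] for b in bands):
        if not candidates[band]:
            band = max((b for b in bands if candidates[b]),
                       key=lambda b: block_merit(candidates, b))
        block = pick_block(candidates, band)
        plan.extend(order_by_slew(block))
        for f in block:                       # observed fields leave the pool
            candidates[band].remove(f)
        stay = block_merit(candidates, band)  # next-best block in the current filter
        others = [b for b in bands if b != band and candidates[b]]
        if others:
            best = max(others, key=lambda b: block_merit(candidates, b))
            if FILTER_CHANGE_PENALTY * block_merit(candidates, best) > stay:
                band = best
    return plan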

We also want to impose a look-ahead penalty, in case there is an upcoming opportunity (based on sky brightness or zenith angle) to do better on a given field. But we want to guard against the scheduler always driving toward later observations when the urgency is higher. So we should distinguish between observation merit and an urgency factor, and, paradoxically, we should observe now if the urgency factor is much higher later on. What seems to matter is the gradient in urgency, traded against getting a better value of m5 later on.

It would seem we need to compute a better-if-observed-later merit penalty that depends on the gradient of the urgency factor. So how about a penalty factor of the form

Latex formatting
$penalty = \max_{t' \in [t, t+\tau]} \left[ \frac{M(t')}{M(t)} \cdot \frac{U(t)}{U(t')} \right]$,
where $M$ is the observation merit and $U$ the urgency factor.

This favors subsequent observations of higher merit, unless urgency is much higher at the later time.
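
A minimal sketch, with merit and urgency supplied as precomputed per-slot forecasts over the look-ahead window (names are illustrative):

def lookahead_penalty(merits, urgencies):
    # Index 0 is "now"; later indices are forecast slots within tau.
    # A value > 1 argues for deferring this field to a better slot.
    m_now, u_now = merits[0], urgencies[0]
    return max(
        ((m / m_now) * (u_now / u) for m, u in zip(merits[1:], urgencies[1:])),
        default=1.0,
    )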

 




...

Some references

LSST science book

...