For Kewei to do

  • Read the rest of the book! (done)
  • Get swipe card access to the lab (done)
  • Start to learn about scripting and shell scripts (bash) in Unix on the Mac (done)
  • Get up to NumPy and SciPy in Python
  • Bug Chris about Pan-STARRS data file locations on the Odyssey machine (done)
  • Extract multicolor light curves for ~10,000 objects, in g, r, i, z, y.
  • Make plots of colors, (g-r) vs. (z-y), etc., and reject outliers from the stellar locus (see the High et al. SLR paper); a rough sketch of the rejection step follows this list.
  • Do this for different pairs of data, taken at different times.
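
The outlier-rejection step in the last two items could look something like the sketch below: a minimal first pass assuming the g, r, z, y magnitudes (for one pair of observations) are already loaded as NumPy arrays. A real SLR fit, as in High et al., is more careful than a single straight-line fit.

    import numpy as np

    def reject_locus_outliers(g, r, z, y, n_sigma=3.0):
        """Flag points more than n_sigma residual-sigmas away from a straight-line
        fit to the (g-r) vs. (z-y) stellar locus; returns a boolean keep mask."""
        gr = g - r
        zy = z - y
        slope, intercept = np.polyfit(gr, zy, 1)   # crude first-pass locus fit
        resid = zy - (slope * gr + intercept)
        return np.abs(resid) < n_sigma * resid.std()

    # Usage, once the magnitudes are loaded:
    # keep = reject_locus_outliers(g, r, z, y)
    # g, r, z, y = g[keep], r[keep], z[keep], y[keep]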

 

For Chris to do:

  1. Tell Peter to move the computer that's in the office (tomorrow, Thursday)
  2. Approve RC account (done!)

 

Li Kewei's Lab log for the week of Jun 10-16

  • Learnt about Python and Unix. I have set up a Python development environment on my local machine, and learnt my way around the Unix system on the computing server.
  • Wrote a crawler to find the data needed to build a light curve from what's on the server. The star to be used is given as an input. The crawler finds all objects within a matching radius of 1 pixel, or the sum of the variances of the two PSFs involved, and writes all of their data to a file. I used the pickle module to save the file, but that doesn't seem to preserve all aspects of the data, so I'm looking for another way to save it (one possible alternative is sketched after this list). Other than that I think the code is working correctly. I am testing the code on data collected in MD04 in the month of May 2013.
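
Since pickle doesn't round-trip the pyfits table objects cleanly, one possible workaround (not what the crawler does yet) is to copy the matched columns into plain NumPy arrays and save those with numpy instead. This is only a sketch: the file name and column names below are placeholders, not the real catalog layout.

    import numpy as np
    import pyfits

    # Copy the columns of interest out of the FITS table into plain numpy
    # arrays, then save those with numpy instead of pickling pyfits objects.
    hdulist = pyfits.open('skycell.cmf')        # placeholder catalog file
    table = hdulist[1].data
    matched = {
        'ra':     np.asarray(table.field('PSF_RA'), dtype=float),
        'dec':    np.asarray(table.field('PSF_DEC'), dtype=float),
        'ap_mag': np.asarray(table.field('AP_MAG'), dtype=float),
    }
    hdulist.close()

    # np.savez writes plain arrays, avoiding pickle's trouble with pyfits
    # objects; reload later with np.load('matched_objects.npz').
    np.savez('matched_objects.npz', **matched)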

Monday, Jun 17

  • I can process data and produce plots now. Here are some light curves:
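
As a rough reference for how a plot like these is put together, here is a minimal matplotlib sketch; the epochs and magnitudes below are placeholder values, not real measurements.

    import numpy as np
    import matplotlib.pyplot as plt

    # Placeholder epochs and magnitudes standing in for one object's extracted data.
    mjd     = np.array([56420.3, 56424.3, 56428.2, 56432.4])
    mag     = np.array([19.82, 19.75, 19.91, 19.80])
    mag_err = np.array([0.03, 0.02, 0.04, 0.03])

    plt.errorbar(mjd, mag, yerr=mag_err, fmt='o')
    plt.gca().invert_yaxis()               # brighter objects plot higher
    plt.xlabel('MJD')
    plt.ylabel('magnitude')
    plt.savefig('lightcurve_example.png')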


Thursday, 20 June

  • Finished reading "Learning Unix for OS X Mountain Lion"
  • Gautham informed me that there's a hard disk error on the server. As a result, file operations (such as cp) are not completing, which is causing me considerable trouble in processing the data. I'm waiting for the disk to come back online.
  • Meanwhile, I'm reading High's paper on Stellar Locus Regression and learning a bit more about numpy and scipy.

Saturday, 29th June

  • Plotted the color-color diagram for all the objects in the stacked gpc1v3 data. The diagram gets too cluttered if I plot every object, so I limited it to points whose error bars are small (< 0.002 mag). A sketch of the plotting step is below.
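
The plotting step, roughly, as a hedged sketch: it assumes the magnitudes and their errors are already loaded as NumPy arrays, and uses the same 0.002 mag cut described above.

    import numpy as np
    import matplotlib.pyplot as plt

    def plot_color_color(g, r, z, y, g_err, r_err, z_err, y_err, max_err=0.002):
        """Scatter (g-r) vs. (z-y), keeping only points whose errors are all below max_err."""
        good = (g_err < max_err) & (r_err < max_err) & (z_err < max_err) & (y_err < max_err)
        plt.scatter((g - r)[good], (z - y)[good], s=2)
        plt.xlabel('g - r (mag)')
        plt.ylabel('z - y (mag)')
        plt.savefig('color_color.png')

    # Usage, once the stacked-catalog magnitudes are loaded:
    # plot_color_color(g, r, z, y, g_err, r_err, z_err, y_err)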

Data access on Odyssey:

  1. Run JAuth.jar to get login key
  2. ssh -Y into odyssey.fas.harvard.edu, or herophysics.fas.harvard.edu, using the electronic key. 
  3. run tcsh
  4. source .myrcstubbs
  5. data are at /n/panlfs/data/MIRROR/ps1-md/gpc1/
  6. nightly science uses individually warped images, nightly stacks run on stacked frames
  7. image types: wrp is warped. 
  8. see available modules with "module avail"
  9. load a module with "module load hpc/ds9-5.6"
  10. photometry is in .cmf files, as FITS tables. 
  11. in Python (a fuller example follows this list): 
    1. import pyfits as p
    2. a = p.open('filename')
    3. print a[0].header
  12. or, imhead on command line
  13. a[1].data.AP_MAG for aperture magnitudes
  14. PSF_RA and PSF_DEC are in the skycell files. 
  15. make a scratch directory for data in /n/panlfs
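
A slightly fuller version of the pyfits snippet from item 11, as a sketch: the file name is a placeholder, AP_MAG is the aperture-magnitude column named above, and PSF_RA/PSF_DEC would be read the same way from the skycell files.

    import pyfits as p

    a = p.open('skycell.cmf')             # placeholder .cmf catalog file
    print a[0].header                     # primary header
    print a[1].columns.names              # list the available table columns

    ap_mag = a[1].data.field('AP_MAG')    # aperture magnitudes (FITS table column)

    a.close()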

Photpipe photometry is also available as text files.

Run gpc1v3 to invoke the scripts for photpipe.

Files are then organized by UT date, and split into subsets a-j as 10 spatial subsets,

for example

/n/panlfs/data/v10.0/GPC1v3/workspace/ut130525f/41

Use the dcmp files, four per stacked image.

Take the catalog photometry entries and add ZPTMAG to all entries to get corrected photometry (a small sketch of this step is below).

Photpipe magnitudes are DoPhot with an aperture correction.
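
The correction step, as a sketch: only "add ZPTMAG to each entry" comes from the notes above; the example zero point and magnitudes are placeholders, and the dcmp text layout is not reproduced here.

    import numpy as np

    def corrected_mags(instrumental_mags, zptmag):
        """Corrected photometry: catalog (instrumental) magnitude + ZPTMAG."""
        return np.asarray(instrumental_mags, dtype=float) + zptmag

    # Placeholder example: a zero point of 31.2 mag applied to three entries.
    print corrected_mags([-11.45, -10.98, -12.03], 31.2)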

 

 
