Summary:

HL is sending 25,000 items in three shipments to Google for scanning (Phase 1: two shipments in November and one in December). Iron Mountain is managing the pulling and shipping of materials from the Harvard Depository to Google Ann Arbor. At the time of HD pull, items are permanently withdrawn from HD.

...

  1. Items are pulled from HD by Iron Mountain and permanently withdrawn (PW'ed).
  2. Pat puts these files into a directory off root on sherlock. All of us should have access by entering cd /google from the home directory. Pat says he has been sending files with barcode only, but the files he copied into /google all contain the bookcart number as well. I cat'd the files together and edited them down to barcode only. That file is HD_bookcarts_combined_200_242_barcodeonly.txt and it contains 12,430 barcodes. I will use the file as input to an itemized set and to an analysis, and will send files of metadata today (11/30).
  3. Iron Mountain batches items from weekly pulls into three shipments to Google (Ann Arbor) according to the schedule indicated.
  4. Item manifests (lists of barcodes) should be given to LTS in advance of shipments so that LTS can properly process items.
  5. LTS will process items as follows:
    1. Run a report to break shipments down by location so that they can be properly mapped to the correct ReCAP destination location
    2. Metadata extracted as per Google spec and sent to Google
      1. Copy the list of barcodes and paste it into the Barcode prompt for the report on the Alma Analytics menu called RES-Physical items details with core holdings, bib data - Flexible. Compare the number of items input to the number of rows in the report to confirm that they match.
      2. Post the report to this page (See below)
      3. Create an Excel file for barcodes only. Use file to create an itemized set in Alma. See Google metadata - bookcarts 200-242 for an example.
      4. Open publishing profile called: Google@item level. Publishing profiles are under Resources menu in Alma.
      5. Select Edit on the profile. Update the parameter for Set name with the name of the itemized set. Update File name prefix with a name like: harvard_bookcart_200-242. Save profile.
      6. Select Run on the profile to run the export. Output files go to kant:/dropbox/alma/alma/google
      7. Copy the files to your local machine and then upload them to the shared Google Drive. Google ingests the files automatically; no notification is needed. Contact bbunnell@google.com for access (or Allison might be able to send a sharing invitation).
      8. Google contact for metadata is Kurt Groetsch (kgroetsch@google.com).
    3. Create itemized sets by location and run a process to update sets to permanent RD location
      1. Create separate itemized sets for each library/location (e.g., WIDHD, WIDHDJUD, etc.).
        1. Sort the spreadsheet list into separate barcode lists for each library/location, and save the associated barcodes into text files with the word "barcode" as the first line of each file.
        2. From Manage Sets, choose Add Set > Itemized and create each set by uploading the corresponding list you created in step 1.
      2. Run a job > change physical items on the set
      3. Change type: Permanent; New library: same as current owning library (no library ownership change); New location: mapped to corresponding RD collection for new/updated holdings
    4. Place items into transit awaiting post-processing ReCAP accession 
      1. A revised EOD script will be used (see documentation on /wiki/spaces/LibraryTechServices/pages/59098446). Items will appear to have been put into transit from the owning library's main circ desk (e.g., WID_CIRC); the goal is simply to put the items in transit from a circulation desk that does not have a reshelving relationship with the items.
    5. The regular end-of-day process/file from ReCAP returns items to "on shelf" status (removing the Transit status). Any waiting request is then sent to RD for pulling in the next transfer.
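The file-combining work in step 2 of the list above (cat'ing the weekly pull files together and reducing them to barcode only) can be sketched in Python. This is a minimal sketch, not the script actually used: it assumes each input line is tab-delimited with the barcode first and the bookcart number second, which may not match the real file layout, and the file-name pattern is hypothetical.

```python
# Sketch: combine weekly pull files and strip them to barcode-only.
# Assumes lines of the form "barcode<TAB>bookcart"; the real delimiter may differ.
import glob

def combine_barcodes(pattern, out_path):
    barcodes = []
    for path in sorted(glob.glob(pattern)):
        with open(path) as f:
            for line in f:
                line = line.strip()
                if line:
                    # Keep only the barcode; drop the bookcart number column.
                    barcodes.append(line.split("\t")[0])
    with open(out_path, "w") as out:
        out.write("\n".join(barcodes) + "\n")
    # Returned count can be checked against the expected total (e.g., 12,430).
    return len(barcodes)
```

The returned count gives a quick sanity check before the file is used as input to an itemized set.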
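The verification in step 5.2.1 (comparing the number of barcodes submitted to the Analytics prompt against the number of rows in the report) could be done with a small helper like the one below. This is an illustrative sketch only; the function name and inputs are hypothetical.

```python
# Sketch: confirm the barcode list and the Analytics report agree in size.
def counts_match(barcode_list_path, report_row_count):
    with open(barcode_list_path) as f:
        # Count non-blank lines, one barcode per line.
        submitted = sum(1 for line in f if line.strip())
    return submitted == report_row_count
```

A mismatch would indicate barcodes the report did not find, which should be investigated before publishing metadata.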
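The sorting work in step 5.3.1 (splitting the report into per-location barcode lists, each with "barcode" as the first line, for upload as itemized sets) can be sketched as follows. This assumes the report is exported as a CSV with "Barcode" and "Location" columns; the actual column names in the Analytics export may differ, and the output file-naming scheme is hypothetical.

```python
# Sketch: split a report CSV into one barcode file per library/location.
import csv
import os
from collections import defaultdict

def split_by_location(report_csv, out_dir):
    groups = defaultdict(list)
    with open(report_csv, newline="") as f:
        for row in csv.DictReader(f):
            groups[row["Location"]].append(row["Barcode"])
    paths = []
    for loc, barcodes in groups.items():
        path = os.path.join(out_dir, f"{loc}_barcodes.txt")
        with open(path, "w") as out:
            # The upload expects the word "barcode" as the first line of the file.
            out.write("barcode\n" + "\n".join(barcodes) + "\n")
        paths.append(path)
    return paths
```

Each resulting file (e.g., WIDHD_barcodes.txt) can then be uploaded when creating the itemized set for that location.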

...