
Overview

Harvard Library is sending a total of 50,000 items to Google for scanning in FY2019, in two major phases of 25,000 items each.

Workflow Summary

  1. Items are pulled and permanently withdrawn from Harvard Depository (HD) and shipped to Google Ann Arbor for scanning. Iron Mountain is managing the pulling and shipping of materials.
  2. LTS uploads metadata files to Google. Note: Google requires receipt of metadata prior to receiving physical materials for scanning.
  3. After scanning, items are shipped to and permanently reaccessioned at ReCAP. 

Project Timeline

Phase 1 (November-December 2018)

25,000 items over three shipments. Iron Mountain is managing the pulling and shipping of materials from the Harvard Depository to Google Ann Arbor. At the time of the HD pull, items are permanently withdrawn from HD.

  • Delivery 1: November 12, 2018
  • Delivery 2: December 3, 2018
  • Delivery 3: December 21, 2018 (TBD)

Phase 2 (January-March 2019)

25,000 additional items. Shipment details TBD.

Detailed Workflow

Notes:

  • The deaccession of items from HD and the shipment to Google are not simultaneous. Items are deaccessioned weekly from HD by Iron Mountain and then batched together into larger shipments.
  • Google requires item metadata in advance of the items' arrival for scanning.
  • Following Google processing, items are shipped to ReCAP for accession and final storage.

In January 2019 (continuing through March 2019), an additional 25,000 items will be pulled and withdrawn from HD for the project (Phase II).

Project Workflows:

Phase I: Target is to pull and transfer 25,000 items in three shipments to Google.

Delivery 1 - November 12
Delivery 2 - December 3
Delivery 3 - December 21 (TBD)

Phase II: Second pull and transfer of 25,000 items in three shipments to Google.

Dates TBD

Procedure:

...

Who is responsible? (color coded)

  • Harvard Depository staff
  • Iron Mountain
  • LTS staff
  • Google Ann Arbor staff
  • ReCAP staff

 

  1. Iron Mountain pulls items from HD weekly and permanently withdraws them from HD.
  2. HD (Pat O'Brien) places weekly item manifest files in /google directory on sherlock. Example: Example_weekly_HD_bookcarts210_214.txt
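As a sanity check before LTS processing, a weekly manifest file can be validated programmatically. The sketch below is illustrative only and not part of the project's actual tooling; the barcode values are made up, and the "one barcode per line, no blanks or duplicates" rule is taken from the batch-manifest requirements described later on this page:

```python
# Hypothetical sketch: validate a weekly HD item manifest before handing it
# to LTS. Assumes a plain-text file with one barcode per line; the sample
# barcodes below are made up for illustration.

def validate_manifest(lines):
    """Return (barcodes, problems) for an iterable of manifest lines."""
    barcodes, problems = [], []
    seen = set()
    for n, raw in enumerate(lines, start=1):
        barcode = raw.strip()
        if not barcode:
            problems.append(f"line {n}: blank line")
        elif " " in barcode:
            problems.append(f"line {n}: embedded whitespace in {barcode!r}")
        elif barcode in seen:
            problems.append(f"line {n}: duplicate barcode {barcode}")
        else:
            seen.add(barcode)
            barcodes.append(barcode)
    return barcodes, problems

if __name__ == "__main__":
    sample = ["32044001234567\n", "32044007654321\n", "\n", "32044001234567\n"]
    barcodes, problems = validate_manifest(sample)
    print(len(barcodes), len(problems))  # 2 valid barcodes, 2 problems
```

A non-empty problem list would be resolved with HD before the file moves downstream.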
  3. Iron Mountain batches items from weekly HD pulls into three shipments to Google (Ann Arbor) according to the schedule indicated.
  4. Item manifests (lists of barcodes) should be given to LTS in advance of shipments so that LTS can properly process items.
  5. LTS processes item manifest files as follows:
    1. Combine the weekly item manifest files into a single batch manifest and confirm/correct the file to ensure that it contains only barcodes, one per line, with no extra text or spaces. Example: Example_batch_HD_bookcarts_combined_200_242_barcodeonly.txt
    2. Extract metadata per Google spec and send to Google:
      1. Create a single-column Excel file: header row "barcode" followed by the list of barcodes from the item manifest file. Example: Example_Excel_for_Alma_set_input.xlsx
      2. In Alma, create a public itemized set of Physical Items using the Excel list above: Alma > Admin > Manage Sets > Add Set+ > Itemized > From File. Example set in Alma: Google metadata - bookcarts 200-242
      3. Compare the number of items in the Alma set with the number of barcode lines in the input file to make sure they match.
      4. In Alma, open the publishing profile "Google@item level" for editing: Alma > Resources > Publishing > Publishing profiles, then select the ellipsis button next to the "Google@item level" profile and select "Edit".
      5. Update the "Google@item level" profile parameters as follows:
        1. Set name: Select the itemized set you just created.
        2. File name prefix: Update with current batch information, following the same convention, e.g. "harvard_bookcart_200-242".
        3. Save the profile.
      6. Run the "Google@item level" profile by selecting "Run" from the profile's ellipsis button. Output files are deposited in *.tar.gz format on almadrop in /dropbox/alma/alma/google.
      7. SFTP to almadrop as the alma user to download the metadata output files to the local machine, then upload them to the shared Google Drive. Google ingests the files automatically; no notification is needed. Work with Allison P. to contact bbunnell@google.com for access to the Google Drive. The Google contact for metadata is Kurt Groetsch (kgroetsch@google.com).
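The manifest-combining step above lends itself to a small script. This is a hedged sketch, not the project's actual tooling; the header-row and blank-line handling are assumptions, and file paths are supplied by the caller:

```python
# Hypothetical sketch of the "combine weekly item manifest files into a single
# batch manifest" step: merge the weekly barcode lists into one file with one
# barcode per line and no extra text or spaces.

from pathlib import Path

def combine_manifests(weekly_paths, batch_path):
    """Merge weekly barcode lists into one batch manifest; return barcode count."""
    barcodes = []
    for path in weekly_paths:
        for line in Path(path).read_text().splitlines():
            barcode = line.strip()
            # Skip blank lines and any stray "barcode" header rows (assumption).
            if barcode and barcode.lower() != "barcode":
                barcodes.append(barcode)
    Path(batch_path).write_text("\n".join(barcodes) + "\n")
    return len(barcodes)
```

The returned count can then be compared against the Alma set size and the Analytics report row count, as the steps above require.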
    3. Update the Alma permanent physical location to RD and set item status to Transit to RD:
      1. Run the Alma Analytics physical items report on batch barcodes from the item manifest:
        1. Open the Alma Analytics report "RES-Physical items details with core holdings, bib data - Flexible": Alma > Analytics > Reports > RES-Physical items...
        2. Paste barcodes from the item manifest file into the "Barcodes" select window and click OK to run the report.
        3. Compare the number of report rows to the number of barcodes in the manifest to confirm all items are represented.
        4. Export the report as Excel, and post the Excel report to this page (see below).
      2. Create itemized Alma sets for each physical location:
        1. In Excel, filter the RES-Physical items report by physical location (column P) and copy barcodes from that location into a new single-column Excel file, with header row "barcode".
        2. In Alma, create a public itemized set of Physical Items using the Excel barcode file: Alma > Admin > Manage Sets > Add Set+ > Itemized > From File.
        3. Repeat for each physical location found in the RES-Physical items report.
      3. Run the Alma job "Change Physical Items" on each location set: Alma > Admin > Run a Job > (Select set) > Parameters:
        1. Change type: Permanent
        2. New library: Same as current owning library (no change)
        3. New location: Corresponding RD collection for new/updated holdings
      4. Place items into transit awaiting post-processing ReCAP accession:
        1. Log onto almadrop as the ltsadmin user and run the remotetransit.py script on the barcode list. (See /wiki/spaces/LibraryTechServices/pages/59098446.)
        2. Items will appear to have been put into transit from the owning library's main circ desk (e.g., WID_CIRC); the goal is simply to put the items in transit from a circulation desk which does not have a reshelving relationship with the items.
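The per-location split described above can be sketched as follows. This is an illustrative outline, not the actual LTS procedure: the real export is an Excel file with the physical location in column P, whereas this sketch assumes rows already read into dictionaries with hypothetical "Barcode" and "Location" keys:

```python
# Hypothetical sketch: group barcodes from the Analytics export by physical
# location and write one single-column barcode file per location (header row
# "barcode"), ready for itemized-set creation in Alma.

import os
from collections import defaultdict

def split_by_location(report_rows):
    """Group rows into {location: [barcodes]}."""
    by_location = defaultdict(list)
    for row in report_rows:
        by_location[row["Location"]].append(row["Barcode"])
    return dict(by_location)

def write_location_files(by_location, out_dir):
    """Write one barcode file per location, with a 'barcode' header row."""
    paths = []
    for location, barcodes in sorted(by_location.items()):
        path = os.path.join(out_dir, f"{location}_barcodes.txt")
        with open(path, "w") as fh:
            fh.write("barcode\n" + "\n".join(barcodes) + "\n")
        paths.append(path)
    return paths
```

Each output file then feeds one Alma itemized set, and each set gets the "Change Physical Items" job run against it.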
  6. Iron Mountain physically ships batched materials to Google.
  7. Google retrieves metadata from the shared Google Drive and scans batched materials.
  8. Iron Mountain physically ships batched materials to ReCAP for permanent accession there.
  9. After batched materials are accessioned by ReCAP, the regularly scheduled End of Day (EOD) process will return items to Item In Place status. Any waiting requests will be sent to RD for transfer.
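Several steps in the workflow above hinge on the same count check (manifest vs. Analytics report rows, input file vs. Alma set members). A minimal sketch of that reconciliation, with made-up barcode values:

```python
# Hypothetical sketch of the recurring count check: confirm that every barcode
# in the batch manifest is represented in a downstream list (Analytics report
# rows, Alma set members, etc.) before moving on to the next step.

def reconcile(manifest_barcodes, downstream_barcodes):
    """Return barcodes present in the manifest but missing downstream."""
    return sorted(set(manifest_barcodes) - set(downstream_barcodes))

manifest = ["111", "222", "333"]  # illustrative values
report = ["111", "333"]
missing = reconcile(manifest, report)
# A non-empty result means the report or set is incomplete and should be re-run.
```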

Google Scanning Batch Reports 

HD_bookcarts200_242_combined_input to Physical items detailed report with core holdings, bib data - Flexible input.xlsx

...