Based on DCC SCARP [disclosure: I'm at the DCC] & CASPAR projects. Need to do a preliminary analysis of data holdings, then a stakeholder and archive analysis. E.g. a project begun in the 1920s that moved from radio, through radar, to later ionosphere studies. Then define a preservation objective, which should be well-defined, actionable, measurable, realistic. Assess this against a particular designated community (DC).
From this design, preservation information flows; there are always elements beyond the actual data that are important, e.g. software, documentation, database technologies, etc. Then do a cost/benefit/risk analysis. Interesting issue about the nature of the relationship between the archivist and the science community (producing and consuming).
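The point that preservation information flows from the objective can be sketched as a simple check: given what a designated community is assumed to know, which kinds of representation information (software, documentation, format specs) must travel with the data? A minimal illustration only; the class and function names here are mine, not from SCARP, CASPAR, or any OAIS tooling.

```python
from dataclasses import dataclass, field

@dataclass
class RepresentationInfo:
    kind: str         # e.g. "software", "documentation", "format spec"
    description: str

@dataclass
class AIP:
    """Toy stand-in for an OAIS Archival Information Package:
    the data object plus the representation information packaged with it."""
    data_object: str
    rep_info: list = field(default_factory=list)

def gaps(aip: AIP, required_kinds: set) -> list:
    """Kinds of representation information the designated community needs
    but the package does not carry."""
    packaged = {r.kind for r in aip.rep_info}
    return sorted(k for k in required_kinds if k not in packaged)

# Usage: a NetCDF dataset packaged with reader software and parameter docs,
# checked against a community that also needs the format specification.
aip = AIP(
    data_object="ionosphere_1920s.nc",
    rep_info=[
        RepresentationInfo("software", "NetCDF reader"),
        RepresentationInfo("documentation", "definitions of the measured parameters"),
    ],
)
print(gaps(aip, {"software", "documentation", "format spec"}))  # ['format spec']
```

The design choice is deliberate: the cost/benefit/risk analysis then becomes a question of which gaps are worth closing for which community.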
They seem not to want to define objectives in science-discovery terms (e.g. gravity wave research from wind profile data) but much more narrowly, in terms of 11 specific parameters. Describes a rather over-the-top AIP including FORTRAN manuals, needed to read the NetCDF files (maybe I misunderstood this bit).
They then find that this homework makes it easier to interface with DRAMBORA & TRAC for audit & certification, and the PLATTER tool from PLANETS. Work may also help to build business cases for preservation of these data.
Question: How well does this archivist/community relationship scale? The approach does not require that relationship, but exploits it where it exists. The point is to use all the assets you have.
Question: Different types of infrastructure, eg computer centres; have any taken initiatives themselves? Mostly at present it’s a “found” situation rather than a designed one.
Comment: worth looking at the DRIVER project, with its concept of enhanced publication, i.e. data plus supporting documentation.