KEEPING DATA ALIVE: Supporting Reuse and Repurposing of 3D Data in the Humanities

A NEH Preservation and Access Tier I Research & Development grant; Grant # PR-253389-17

Table of Contents

  1. Project Goals
  2. Project Activities and Accomplishments
  3. Evaluation: Research Outcomes for Data Publishing Workflows
  4. Evaluation: Research Outcomes for Infrastructure Design
  5. Collaborations with Libraries
  6. Audiences
  7. Long Term Impact & Future Decisions
  8. Award Products
  9. Citations

Project Goals

Digital data preservation and access is complex and multi-layered, involving standards, best practices, open-source versus proprietary software, migrations, versioning, and much more. 3D data adds further complexity because rapidly emerging technologies bring with them new software and file formats. While many types of 3D models are used in humanities scholarship, this Tier I project focuses on 3D models of ancient architecture generated using procedural modeling—the rapid prototyping of 3D models from a set of rules—in order to contribute to preservation efforts for “intrinsically born” 3D data that stem from multiple sources such as Geographic Information Systems (GIS), architectural drawings, excavation reports, photogrammetry, and airborne LiDAR.

Procedurally generated models are ideally suited to the development of 3D modeling standards and best practices that promote data interoperability, dissemination, and reuse, because they can carry with them the underlying metadata, paradata (information about modeling choices), and descriptive data (e.g., data sources, textures, building type).

The goal of this NEH-funded Tier I Research and Development (Division of Preservation and Access) project was to develop workflows to:

  1. generate, store, and make accessible 3D procedurally-generated models of architecture in an open source database that scholars can (re)use and repurpose to create their own multi-scalar reconstructions ranging from individual buildings to entire cityscapes, and
  2. host, deliver, and visualize 3D models, linked to metadata, paradata, and descriptive data, in an open source 3D visualization environment.

In Tier I, we had two objectives. Objective One was to develop and test workflows for exporting procedurally generated 3D models, along with their metadata and paradata (i.e., modeling choices), for use on multiple platforms. In developing these workflows, we researched standards compatible with WebGL, which is becoming the standard for online visualization because it renders 2D and 3D graphics in the browser without plug-ins and is based on OpenGL, the most widely used open graphics standard.
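The export workflow described above can be sketched in miniature. The snippet below is an illustrative outline, not the project's actual script: it writes a model file alongside a JSON "sidecar" that bundles metadata and paradata so both travel with the geometry. The function name, sidecar structure, and field names are assumptions for demonstration; in the real workflow the geometry itself would also be converted to a WebGL-friendly format such as glTF or OBJ.

```python
import json
from pathlib import Path

def export_model_with_sidecar(model_path, metadata, paradata, out_dir):
    """Copy a model file into out_dir and write a JSON sidecar
    next to it, bundling metadata (e.g., data sources, building
    type) and paradata (modeling choices) with the geometry.
    Illustrative sketch only; names and schema are hypothetical."""
    out_dir = Path(out_dir)
    out_dir.mkdir(parents=True, exist_ok=True)
    model_file = out_dir / Path(model_path).name
    # In the real workflow the geometry would be converted to a
    # WebGL-compatible format here; this sketch just copies bytes.
    model_file.write_bytes(Path(model_path).read_bytes())
    sidecar = model_file.with_suffix(".json")
    sidecar.write_text(json.dumps({
        "model": model_file.name,
        "metadata": metadata,
        "paradata": paradata,
    }, indent=2))
    return sidecar
```

Keeping metadata and paradata in a machine-readable sidecar, rather than in separate documentation, is one way to ensure they survive format migrations alongside the model itself.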

Objective Two was to design a beta infrastructure to store and provide access to procedurally generated 3D models using an open-source database framework that makes the models and their associated metadata and paradata readily accessible and citable. Citability is critical to encouraging scholars to take full advantage of the richness of the 3D medium in academic publications, and to encouraging data reuse.

These objectives are part of a long-term vision to contribute to innovative methods of materials analysis and new modes of discourse using interactive 3D web visualizations. Achieving this vision requires data compatibility and accessibility: scholars must be able to combine and recombine their data for reuse and repurposing in multiple ways. Crucially, the workflows that make data compatible and accessible must be efficient and must not demand a significant investment from the end user. Researchers, the primary community concerned with these analyses and modes of discourse, are often focused first on producing research; the added value of compatibility and accessibility, while increasingly recognized, cannot be realized if adoption remains low because the required work is perceived as too great. In this Tier I project we therefore developed workflows (1) to export 3D models, using a Python script, to a repository built on a PostgreSQL database that can assign a DOI, and (2) to import those models into an online 3D viewer for reuse, with real-time changes to geometry, metadata, and paradata tracked and stored in the repository. In scripting the workflows, the aim was to minimize the number of required steps; scripting also goes some way toward ensuring consistency in the outputs.
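The repository side of this workflow—versioned deposits with an identifier per version—can be sketched as follows. This is a minimal stand-in, not the project's actual schema: it uses SQLite so the example runs anywhere, whereas the project's repository is built on PostgreSQL, and the table, column names, and DOI string are all hypothetical (a real DOI would be minted through a registration service such as DataCite).

```python
import json
import sqlite3

def init_repo(conn):
    """Create a minimal, hypothetical models table. Each row is one
    version of a model, so changes made in the viewer can be
    deposited as new versions rather than overwriting history."""
    conn.execute("""
        CREATE TABLE IF NOT EXISTS models (
            id       INTEGER PRIMARY KEY,
            name     TEXT NOT NULL,
            version  INTEGER NOT NULL,
            doi      TEXT,      -- assigned by a DOI service in a real workflow
            metadata TEXT,      -- JSON: data sources, building type, etc.
            paradata TEXT       -- JSON: modeling choices
        )
    """)

def deposit(conn, name, metadata, paradata):
    """Insert a new version of a model; each deposit bumps the
    version number and records a (placeholder) DOI string."""
    cur = conn.execute(
        "SELECT COALESCE(MAX(version), 0) FROM models WHERE name = ?",
        (name,))
    version = cur.fetchone()[0] + 1
    doi = f"10.99999/kda.{name}.v{version}"  # placeholder prefix, not a real DOI
    conn.execute(
        "INSERT INTO models (name, version, doi, metadata, paradata) "
        "VALUES (?, ?, ?, ?, ?)",
        (name, version, doi, json.dumps(metadata), json.dumps(paradata)))
    return doi
```

Assigning the identifier at deposit time, one per version, is what makes a specific state of a model citable in a publication even after the model is later revised.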

The project’s deliverables therefore contribute to data preservation and access in the humanities. The workflows, along with scripts for automating processes and generating procedural models, are hosted both on a project website, which provides contextual information including a white paper and supplementary resources, and in the keeping-data-alive GitHub repository, which facilitates uptake by the growing community of archaeologists and modelers engaged on that platform.