Remarks on the Digitization of Cultural Materials

Fig. 1: Moving down the assembly line, each station adds yet another misrepresentation or glitch

The purpose of this short essay is to describe the advent of a strange property belonging to digital data collection, storage, and query: its sudden architectural scale.

Here and Now
An individual point of data differs from a point of information only in that a data point may not necessarily be informative to a given individual in a given context. Data, then, is an observation, while information is a useful observation.

The process of locating a point of information within a body of data can be defined as querying. Querying becomes more difficult as the disparity between large amounts of data and small amounts of information grows larger and larger. Different forms of automation can drastically increase querying capacity.

In order to compile, and therefore to query, a given datapool, data points must share a common characteristic that allows them to be arranged according to the structure of an organizational system. The book, for example, fits on a shelf and typically includes identifying information printed on the spine. Many forms of information structuring have skeuomorphically emulated the book in this respect; films, rendered onto VHS tapes and eventually DVDs, were and are manufactured likewise—to sit on a shelf with identifying information displayed across a thin section of real estate deliberately reminiscent of the book’s spine. How else can we explain the plastic wasted on each rectangular DVD box, which needs only to house a smaller circular disc? A similar relationship exists between the vinyl record and its square packaging—making a case for the ergonomic efficiency of manufacturing orthogonal objects, and the ease of storing them.

Modes of data formatting are changing and turning over at an increasing rate. In the span of this generation’s lifetime, information appeared on paper by necessity, was collected onto discs and recorded across magnetic tape, and was eventually codified into computer files—under dozens of ever-improving file extensions—to be stored first on floppy disks, then on spinning-platter hard drives and solid-state drives. Now information is organized under new, emergent file extensions—often the property of the corporate interests that developed them—across an ocean of privately held cloud storage services.

While the internet has existed in a marketable form for more than two decades, the wide-scale use of online fora to conduct social interactions is a relatively recent commonplace.

The rise of the internet in this capacity—not only as a social platform, but as the primary social platform—depends now upon cloud storage and automation as its essential infrastructure. The various technological devices that, in turn, allow us to access that online repository of social material inevitably become cultural fetishes themselves.

Due to the fetishization of technology, we now produce unparalleled quantities of data points, stored irrespective of their value as information. Issues arise from the speed of iteration, with respect both to storage and querying strategies and to navigating the staggering size of the most popular datapools currently in use.

Editors are anachronistic, but that’s unimportant.

It can be difficult to organize information that exists in non-compatible formats, both digitally and physically. Limitations on backward compatibility, even restricted to digital information alone, are exacerbated by commercial incentive: any business that allows so-called “outdated” products to remain feasible interfaces with the digital ecosystem beyond the launch of a newer product is losing revenue that would otherwise be gained by selling newer and newer devices; hence oddments like planned obsolescence.

As merely a system of storage, the library is worthless at all times—save for the moment when one finds the book one is looking for. Any storage system is only as good as it is able to produce a point of information on demand.

Mindful of these considerations, we understand several limiting factors that are relevant to explicating the swelling scale of big data infrastructure. Housing the hardware for cloud storage systems is now a matter of architecture.

In 2014, the existence of an NSA program called MonsterMind was revealed through extensive examination of a cache of documents leaked to the public by Edward Snowden.i In a subsequent interview, Snowden detailed his time spent employed in an NSA-owned and -operated subterranean bunker, termed by workers the “Tunnel,” where domestic data farmers housed some indeterminable portion of the communications data and metadata that the NSA collects.ii The Tunnel itself is a repurposed, 250,000-square-foot torpedo storage area.iii

In 2016, an extensive investigative report identified the location of another NSA storage and operations facility—codenamed TITANPOINTE in the Snowden cache: 33 Thomas Street, in downtown Manhattan.iv 33 Thomas Street is famous (or infamous) for being the world’s only true Brutalist skyscraper. Windowless and clad in precast concrete panels, the building’s only ornamental features are its gigantic exhaust vents that speak to its role as a “safe zone” in the event of a Cold War nuclear catastrophe. 33 Thomas Street originally served as AT&T’s “long lines” building, a critical global telecommunications center. When the property fell out of use as telecommunications infrastructure evolved, it was purchased by an unknown buyer. For years prior to the 2016 report, the goings-on inside 33 Thomas Street were not known to the general public.v Anyone standing too long near the gated entrance to the compound would be ushered along by security personnel.vi

Fig. 2: Floor upon floor of infrastructure in service to information regeneration

More recently, the CIA has entered into a multi-billion-dollar partnership with Amazon to construct yet another cloud storage facility—for reasons that remain opaque.vii

In both of these instances of internationally scaled information storage, the only viable option was the adaptive reuse of an extant megastructure. While this is surely not a representative sample, it is understood that big data requires vast physical space—a principle that runs counter to the conventional assumption that data-storage technology is getting smaller and smaller. Perhaps the largest and most relevant issue with big-data harvesting is that far too many data are being collected and stored, while increasingly complex automated protocols are necessary to extract any relevant information. To account for the problem of backward compatibility, many industries are undergoing local pushes to digitize analog information. The disparity between different modes of organization and query appears to be expanding.

The architecture of the cloud is bereft of ornament, uninterested in human scale, optimized for the maximization of server units, and inhabited by artificially intelligent algorithms that scrape endlessly through bottomless servers in search of information adrift in a sea of social noise. The cautious and complete preservation of that noise, while certainly contentious, is the only way to ensure that information is not lost.

The consolidation of anthropological material is the final industry. It shares a midwife with the international banking apparatus and will likely be monopolized faster than any other industry in history. We would be remiss not to note the similarity between both the Tunnel and TITANPOINTE and Ivan Leonidov’s 1927 proposal for the Lenin Institute’s “Mechanized Library.”viii Leonidov conceived of a colossal library of five million books, housed in primitive geometries and axially extended by a straight suspended roadway leading into the center of Moscow.

There and Then (Speculation)
In Rome’s Piazza Navona there stands a monumental fountain, Fontana dei Quattro Fiumi, designed by the architect Gian Lorenzo Bernini. In the wake of large-scale environmental catastrophe, Bernini’s Quattro Fiumi fell beneath the purview of an international preservation initiative that focused on the full and total stewardship of cultural heritage. Bernini’s fountain, one artifact among countless artifacts, was meticulously scanned by a state coalition of drones specially designed and outfitted for the tasks of imaging and documentation.

The resultant digital scans were collected into massive cloud storage repositories and used to create 3D models of each and every artifact of interest, accurate to one part in a million—well beyond the feeble resolution of the human eye. Arguably, this process was more ethical and less invasive than any other mode of preservation to date. Eventually, entire buildings were taken apart brick by brick, reminiscent of the deconstruction and relocation of the Temple of Ares from the Greek countryside into the Athenian Agora during the reign of Augustus, in about the year 2 C.E.

A shift in perception occurred over time, as the housing and curation of the original materials became cause for concern. It was quite simply impossible to justify the incredible amount of resources necessary to care for it all, when a perfect copy of any valued object could be reproduced on demand. As the global initiative came to focus on architecturally scaled subject matter, the importance of protecting the thing physically gave way to the drive to catalogue it and immortalize it within the registry alone. Any semblance of originality was reduced to novelty and paled before the conquest of resolution.

In place of an original was erected a facsimile so exact as to defy the very notion of facsimile, created from a massive and ever-growing registry of representational digitized information. In this way, the cultural heritage of Rome was preserved from environmental destruction, and now remains hermetically sealed from all forms of temporal decay.

As a result of the existence of this infrastructure, the museological profession experienced something of a small renaissance. Bernini’s Quattro Fiumi no longer had to remain in the Piazza Navona by necessity. Rather, it could be in two places at once—or three: on exhibition wherever there was demand to see it.

Algorithms were guided by an incredibly complex analysis of social trends and demand, while concurrently focused on maintaining the industry of human cultural heritage. In every major city an identical reservoir of data—linked to the internet ecosystem—was constructed to provide civilians constant and unrestricted access to the most iconic and influential effects of our history.

Combined with virtual reality technology, each redundant repository can be queried by designers, scholars, authors, artists, professionals, and independent interests alike at the touch of a screen. Acting as individual servers, hundreds of these repositories can divide the load during times of high-capacity inquiry.

The only thing required to create unfettered access to endless data and information, on such a massive scale, was the inconspicuous deconstruction of reverence for the master, and the quiet destruction of the masterwork itself.


i Kim Zetter, “Meet MonsterMind, the NSA Bot That Could Wage Cyberwar Autonomously,” Wired, August 13, 2014, https://www.wired.com/2014/08/nsa-monstermind-cyberwarfare/ (accessed November 10, 2019).

ii “NSA/CSS Hawaii,” NSA, https://www.nsa.gov/about/cryptologic-centers/hawaii/ (accessed November 8, 2019).

iii Andy Greenberg, “An NSA Coworker Remembers The Real Edward Snowden: ‘A Genius Among Geniuses,’” Forbes, December 17, 2013, https://www.forbes.com/sites/andygreenberg/2013/12/16/an-nsa-coworker-remembers-the-real-edward-snowden-a-genius-among-geniuses/#5b22e577784e (accessed November 8, 2019).

iv Jonah Engel Bromwich, “Snowden Leaks Illegal but Were ‘a Public Service,’ Eric Holder Says,” New York Times, May 31, 2016, https://www.nytimes.com/2016/06/01/us/holder-says-snowden-performed-a-public-service.html (accessed November 10, 2019).

v Sam Hodgson, “Edward Snowden: ‘Do I Think Things Are Fixed? No,’” Turning Points, New York Times, December 7, 2016, https://www.nytimes.com/2016/12/07/opinion/edward-snowden-do-i-think-things-are-fixed-no.html (accessed November 9, 2019).

vi Mark Mazzetti and Michael S. Schmidt, “Ex-Worker at C.I.A. Says He Leaked Data on Surveillance,” New York Times, June 9, 2013, https://www.nytimes.com/2013/06/10/us/former-cia-worker-says-he-leaked-surveillance-data.html (accessed November 10, 2019).

vii Frank Konkel, “The Details About the CIA's Deal With Amazon,” The Atlantic, July 17, 2014, https://www.theatlantic.com/technology/archive/2014/07/the-details-about-the-cias-deal-with-amazon/374632/ (accessed November 9, 2019).

viii Fosco Lucarelli, “The Lenin Institute for Librarianship by Ivan Leonidov (1927),” Socks, October 30, 2018, http://socks-studio.com/2018/10/30/the-lenin-institute-for-librarianship-by-ivan-leonidov-1927/ (accessed November 7, 2019).


Alexander Ford's architectural design work has appeared in the Princeton Architectural Press's 2019 volume, "Single-Handedly: Contemporary Architects Draw By Hand" (Apologue), and has been exhibited across the country, most recently at Art Omi. His writing has appeared in Int|AR, Inflection, Fresh Meat, and [TRANS-]. Alexander previously lectured at the University of Arizona's College of Architecture and served as the Assistant Field Director for Architecture at the Mt. Lykaion Excavation and Survey Project in Arcadia, Greece. He earned an M.S. in Historic Preservation at the Columbia University Graduate School of Architecture, Planning & Preservation and a B.S. in Architecture from the University of Arizona. Currently, he works for a small bakery.

Nicholas Gervasi, AIA is a Design Director at Terreform ONE and Adjunct Assistant Professor at New York City College of Technology. He previously worked for Gensler & AYON Studio in New York, NY, Cleveland Urban Design Collaborative in Cleveland, OH, and Ammar Eloueini Digit-all Studio in New Orleans, LA. He investigated microalgae as a source for wastewater treatment and biofuel production as a Visiting Climate Researcher at the Climate Impacts Group under Dr. Cynthia Rosenzweig at the NASA Goddard Institute for Space Studies. His writings have been published in the Charrette, Clog, de-arq: Revista de Architectura, Int|AR, Inflection, Journal des Rêves, and TRANS-Media. He earned an M.S. in Historic Preservation at the Columbia University Graduate School of Architecture, Planning & Preservation, and an M.Arch and a B.Arch at Tulane University. He is licensed in the State of Ohio, LEED AP BD+C, WELL AP, and LFA.

www.nicholasgervasi.com
www.theapologue.com