


Eugene Thacker on Jan 20 2000 issue 13

Bioinformatics, Biopolitics, and Totally Connected Media Systems

Database and Culture

In his essay "Database as a Symbolic Form," Lev Manovich discusses the increasing ubiquity and standardizing potential of the database in late-20th century cultures:

"Indeed, if after the death of God (Nietzsche), the end of grand Narratives of Enlightenment (Lyotard) and the arrival of the Web (Tim Berners-Lee) the world appears to us as an endless and unstructured collection of images, texts, and other data records, it is only appropriate that we will be moved to model it as a database. But it is also appropriate that we would want to develop a poetics, aesthetics, and ethics of this database."[1]

As Manovich points out, what is at issue here is not necessarily a deterministic and exhaustive encoding of culture; what is needed are modes of analysis, critique, and practice which embody the database, both as a technical apparatus and as an organizational logic. In relation to the biological technosciences, this means focusing on the current intersections between the biomedical body and the computer database, and their resultant tensions and aporias. In fields such as genomics, bioinformatics, telemedicine, and computational molecular biology, the database forms a central part of the kinds of research being done, as well as structuring the kinds of propositions and questions which may be brought up.

Media, Storage, Cadaver

In "Gramophone, Film, Typewriter" Friedrich Kittler brings up the ways in which the media revolution of the late 19th century created a linkage between new technical modes, power and history, and the realm of the dead and the phantasmatic.[2] Transforming sound into acoustics, visual movement into moving images, and writing into text, these media each effected, in their own way, the arresting and storing of sensory phenomena. They were able to do this through a process of standardization (an "alphabetism"), which made possible new modes of communication (new meanings for hearing, seeing, writing).

Such technologies were, of course, historical and cultural developments, and were and are thus connected to a range of institutional sites, political uses, and technical regimes. It is in this sense that Kittler suggests that history is a mode of technical inscription: the incorporation of "writing" technologies into the privileged role of mediating culture, and the simultaneous falling away of those other modes situated outside the technical requirements of hegemonic media. Thus Western history can be regarded as a technically inscribing process, a "discourse network," from the "monopoly" of print technology to that of computer code. What does not inscribe in this way - oral traditions, modes of communicative graphism, hybrid models, primitive technologies - falls through the technical filter of history-as-media.

Inscribing technologies also form a unique relationship to bodies and to death. Whereas Kittler describes literature as evoking a zone of fantasy which extends from it, mechanical media are themselves constituted by the phantasmic, by the tension between realism or simulation, and the understood logic of representation. Thus there are ghost voices on the airwaves, ghost images in the shadowplay of film, and a psychic "automatic writing." As Kittler states, "media always already provide the appearance of specters."

Between power and phantasms, between history and the dead, there are the storage media of the gramophone, film, and typewriter. Power relationships configure the way in which these histories are mediated; the spectral and the phantasmatic constitute the very logic of storage media's capacity to endlessly archive and preserve the dead and memories. Power relationships built into storage media construct history; spectral logics built into storage media re-present memory. This mediation of history and mediation of the dead results in the archive and the document, both presented in machine memory. The archive gathers, filters, and organizes multiplicities - peoples, cultures, events, cartographies and so forth. The document is a representational tautology; it always only authenticates and verifies itself as a fact.

Now, all of this is made possible by the ability of these media to arrest/capture, translate, and store/archive information. Within this are not only encoding operations, but also the potential translation of these body-technology relationships into an archive, or, to use a more current term, the database. In the same way that the mechanical storage media of the gramophone, film, and typewriter form "partially connected media systems," so do they also form the content for a future of totally connected media, based on computer code. As Kittler comments, "[r]ight now only the transmission quality of the storage media, which in the connected media systems represents the content, is being computed."

Thus the archive (the form of mechanical storage) becomes the database (the form of digital storage and computer-based memory). While the archive often cannot be modified - as with printed records, gramophone discs, filmic and photographic plates, typeset pages - the database is defined by its flexibility in the handling of information (and thus the supplementary need for "backup storage," data encryption, and security). This proliferative character of digital media means that computer media can not only write themselves (automated processes, expert systems, and agents), but that through this recombinative logic, information comes to be seen in new, morphological terms. The implication here is that information is no longer, as in the mechanical archive, simply a concrete materiality of fact; rather, with the database, information becomes the highly technical and automatized reservoir for the proliferation and production of other media and other mediated sensory experiences. The database is not only the foundation from which totally connected media systems emerge; it is also the very interstices and linkages which constitute the possibility of connected media.

For Kittler, the transition from the archive to the database, from mechanical storage media to digital storage media, from partially connected systems to totally connected systems, implies a complex operation upon the bodies of subjects as they intersect with and are integrated into media technologies. Mechanical storage media began by arresting/capturing and then translating sensory phenomena into information. This information could be stored through its technical-physical inscription in variously constituted storage media (the inscription of sound waves onto a gramophone disc, the inscription of light and dark shadows onto film, the inscription of letters impressed upon a page).

The development of a digital storage language and computer technology, taking mechanical storage media as its content, carried this a step further, with some new twists. Instead of fragmentation, the development of a universal language of digital/binary code makes possible the re-integration of different storage media. Media not only become integrated, but they also become interchangeable with regard to their information. This means that "media can reconstruct bodies beyond the systems of words, colors, or sound intervals."

Such a situation applies directly to the various genomic mapping endeavors and the digitization of the biological-genetic domains. In cultures where storage media play a central role in the relationships between power, history, and the dead, storage media fulfill a function of accountability. Individual identities and related information, health and medical data, demographic and criminological records, all these and more inscribe the subject upon death. The cadaver of the genetic body does not slip away, to be replaced here by pure simulation or abstract information; rather, the genetic body is accounted for on all of its (molecular) surfaces, through the technical practices of a biological-informational science, and the links between a body and a genetic profile. Thus "the realm of the dead has the same dimensions as the storage and emission capacities of its culture."

As a databasing endeavor, the Human Genome Project is just one isolated example of a much more pervasive technocultural trajectory. This is not the familiar transcendentalism of moving away from the materiality of the body and towards the purity of cyberspace. Rather, this is the concern with accounting for the materiality of the subject, of culture, of the body. Kittler again: "In the media landscape immortals have come to exist again." Once the biological-genetic subject, recognized as having had life (information) extracted from it, recognized as no longer being an un-predetermined genetic subject, is thus encoded and stored in a database, then the dead are no longer simply the not-living. Informational accountability also means informational integrity, data transmission, reproducibility, and a unique type of lateral immortality specific to digital storage media. The database makes possible a series of potential extensions which exceed the mere recording and preservation of information. With the database, information becomes productive, proliferative, morphological. With the database, information also becomes organized, classified, and taxonomized according to its range of flexible uses.

When, in the early 1990s, the U.S. government-funded Human Genome Diversity Project (HGDP) drafted plans for a genetic database of some 4,000 to 8,000 distinct ethnic populations, it was met with a great deal of controversy and criticism. The stakes were raised further when it was discovered that the HGDP had proposals for patenting cell lines taken from several members of indigenous populations, all without those members' or their communities' informed consent. Due to interventions by groups such as the Rural Advancement Foundation International (RAFI), the HGDP was forced to drop three of its patents; in 1996 it testified before the National Research Council, and it has since drafted a document of "Model Ethical Protocols" for research which emphasizes informed consent and cultural-ethical negotiation. Since that time, however, the HGDP has been conspicuously silent, and, despite the flurry of news items and press releases relating to the various genome mapping endeavors around the Western world, there has been virtually no news of the HGDP's progress.

Much of this curious disappearing act certainly has to do with the bioethical tangles in which the HGDP has been involved, as well as with the combination of vocal critics such as RAFI and the HGDP's having been "marked" by the media and dubbed by its critics "the vampire project." However, while the HGDP as an organization may have slipped from the science headlines, the issues and problems associated with it have not. Another, parallel development within biotech and genetics has emerged, one which has more or less taken up the "diversity problem" the HGDP was unprepared to deal with: bioinformatics. Technically speaking, "bioinformatics" involves the use of computer and networking technologies in the organization of updated, networked, and interactive genomic databases used by research institutions, the biotech industry, medical genetics, and the pharmaceutical industry [3]. These are present uses of such biological-genetic databases; future uses include individual identification (in financial, legal, and/or health insurance contexts) as well as mass biological regulation (biological warfare and statistical disease control). Last year, major biotech networks reported significant growth in the number of bioinformatics jobs being created, most of them at biotech corporations specializing in what Incyte has called "point-and-click biology," and the market for biotech investment has in the past few years seen a rejuvenation, in large part due to the hype surrounding the so-called computerization of biology.
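The database operation at the heart of bioinformatics can be made concrete with a deliberately minimal, hypothetical sketch: storing sequence records and searching them for a motif. All record names and sequences here are invented for illustration; real genomic search tools use far more sophisticated, approximate algorithms over vastly larger databases.

```python
# A hypothetical, in-memory "genomic database": record IDs mapped to
# (invented) DNA sequences.
records = {
    "sample_001": "ATGGCGTACCTGAAGTAG",
    "sample_002": "ATGTTTGCGTACAAATAG",
}

def find_motif(db, motif):
    """Return the IDs of all records containing the exact motif."""
    return [rid for rid, seq in db.items() if motif in seq]

# Both invented samples happen to share this six-base motif.
print(find_motif(records, "GCGTAC"))  # ['sample_001', 'sample_002']
```

Even this toy example shows the logic the essay describes: the biological sample enters the system only insofar as it has been translated into a searchable string.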

In addition, with the sheer quantity of material being generated by such global (read: Western) science projects as the Human Genome Project, the need for a biotech-infotech science has become all the more prominent, if only for organizational and managerial reasons. Biotech corporations such as Affymetrix, Incyte, and Perkin-Elmer not only specialize in research and development, but also emphasize product development (e.g., Affymetrix's "GeneChip" technology, Incyte's "LifeSeq" databases, Perkin-Elmer's new line of automated genetic sequencing machines). Bioinformatics promises to deliver the tools which will make genomic science an information science, and to propel the Human Genome Project into its next phase of "post-genomic science." With the aid of bioinformatics technologies, the Human Genome Project, originally cast by the NIH as a fifteen-year endeavor, has been shortened to a three-year one, with a "working draft" to be presented next year. In addition, biotech corporations such as Celera and Incyte have each initiated their own corporate-framed and privately-run human genome mapping projects, claiming the ability to outdo the HGP by providing a more "medical" approach (i.e., mergers providing direct biotech-to-pharmaceutical access to information). The link between biotech and infotech is no longer a supplementary one; it is, for contemporary genomics, a necessity.

With the advancement and implementation of computer-based technologies into the laboratory, the sciences of biotechnology and genomics are instantiating a new paradigm of scientific praxis in which the linkage between information technology and biological technology is not only intimate, but is also being configured in a range of biotech-infotech hybrids, such as the DNA chip. In the early 1950s, when James Watson and Francis Crick published their famous papers on the structure of DNA, the language they used was consciously infused with terminologies borrowed from information theory and cybernetics [4]. It was here, in part, that the notion of DNA as a "code" and as genetic information began to become an accepted and dominant discourse for discussion and future research in molecular genetics.

Bioinformatics still, for the most part, assumes the traditional centrality of DNA, though, due to more recent research in genetics and embryology, DNA is now regarded as a relatively inert set of directions or a recipe book, whose primary goal might be nothing other than to reproduce itself. However the implicit logic behind bioinformatics (and again, this is a field very much in development) moves beyond the metaphorical borrowing of information and cybernetics discourse to describe the molecular interactions of the genetic material in a cell. What bioinformatics suggests is not that DNA acts like information; rather bioinformatics suggests that, at its basis, and fundamentally, DNA is information. This is not an exact equivalence however. Bioinformatics approaches the organism at the molecular-genetic level, but through the lens of contemporary developments and applications in computer science, programming, and network technologies. The logic behind bioinformatics is that the biological body, as a genetically-grounded organism, is essentially about information - not the information composed of silicon, fiber optics, or pulses of light, but an information composed of carbon, biochemical pathways, and the complex interplay between biological, cytological, and physiological systems. So why "information"? Rather than a more traditional cyborgic relationship, a literalized fusion between flesh and circuitboard, bioinformatics suggests that the genetic organism, like the computer network, is about data transmission, the integrity of information, the process of encoding and decoding, and the operability of a systemic network. Post-war information theory (Shannon) and cybernetics (Wiener) thus provide, in modified forms, the basis for an approach to the biological body of humans and animals as systems based on optimization and operationality. 
Thus, the information of genetics is not exactly the same as the information of computer science; however, on the level of systemic operation, on the level of epistemology, both constitute cybernetic-like networks.
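The claim that DNA "is" information, in this operational sense, can be illustrated with a minimal sketch: a symbolic encoding (codons) decoded by a fixed lookup table, exactly as a protocol decodes a stream of symbols. The codon-to-amino-acid assignments below are a small fragment of the standard genetic code; the input sequence is invented for illustration.

```python
# A fragment of the standard genetic code: three-letter DNA codons
# mapped to amino acids (or a stop signal).
CODON_TABLE = {
    "ATG": "Met", "GCG": "Ala", "TAC": "Tyr",
    "CTG": "Leu", "AAG": "Lys", "TAG": "STOP",
}

def translate(dna):
    """Decode a DNA string three letters (one codon) at a time."""
    protein = []
    for i in range(0, len(dna) - 2, 3):
        amino = CODON_TABLE[dna[i:i + 3]]
        if amino == "STOP":          # the stop codon terminates decoding
            break
        protein.append(amino)
    return protein

print(translate("ATGGCGTACCTGAAGTAG"))  # ['Met', 'Ala', 'Tyr', 'Leu', 'Lys']
```

What matters for the argument is not the biology but the form: encoding, decoding, transmission, and the integrity of a message - the vocabulary of Shannon and Wiener operating directly on the genetic material.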

The questions which arise from this are many: What exactly is "biological" or "bodily" about the biological database? Is there any reason to refer to the biological database as "material"? Does the online database challenge in any way the premises of the individual and/or species subject in genetic science? What are the implications of the particular instantiations of online biological data - uploaded, encoded, downloaded, file-transferred, archived, and so on? Does the particular mode of organization - that is, the epistemology - of the online database constitute a significantly new ordering and knowledge-production of the biological and species body?


As a way of approaching such questions, it might be helpful to consider Michel Foucault's later work dealing with "governmentality" and "biopolitics." Though both of these terms begin to appear with some regularity around the time of Discipline & Punish, Foucault later clarified their relationship to his other concepts of "bio-power" and "disciplines."

Roughly speaking, Foucault calls "biopolitics" that mode of organizing, managing, and above all regulating "the population," considered as a biological, species entity [5]. This is differentiated from, but not opposed to, "bio-power," in which a range of techniques, gestures, habits, and movements, most often situated within social institutions such as the prison, the hospital, the military, and the school, collectively act upon the individual and individualized body of the subject [6].

The fundamental difference here is not between the individual and society, but rather between an individuating and a collectivising logic, similar to what Foucault earlier called "dividing practices." Bio-power, in acting upon the body of the subject, did so through a system of normalizing techniques which applied to larger-scale groupings of "docile bodies." A technique may thus be very detailed and individualized, but also apply across a broad range of subjects. The emphasis on the biological population is also an emphasis on the multiple, intimate points of contact between a regulatory technology and a heterogeneous body of subjects. That is, in order for a biopolitical power to be able to regulate and impel subjects in a particular way, it must organize the bodies (both individual and collective) of subjects in such a way that a manageable and functional body of knowledge may be produced from those subjects.

Foucault mentions the regulation of birth and death rates, disease control, and patient monitoring in hospitals, as well as more contemporary examples of consumer data, individual identification forms, health insurance, health data related to sexuality and psychology, and institutional surveillance of subjects. Principal among these and other examples is the accumulation and ordering of different types of information. The development of the fields of statistics and information science is thus a key moment in the modern forms of biopolitics. The population is articulated in such a way that its non-totalization may be re-routed via methods or techniques of statistical quantification. This is not only a mode of efficiency; it is also a political economy of subjects and bodies.

Biopolitics is, first, an organizational technology articulating something called the biological and species population - a collectivity of bio-subjects. Secondly, biopolitics, through a range of techniques and practices, produces and collects knowledge of the population in the form of a manageable quantum, or information. And finally, biopolitics reproduces its continual and changing regulation of the population through another set of techniques and practices which insert this informatic knowledge back through the social-biological body of the population, culminating in a quantifiable, organizational entity which may be "touched" at a variety of points through a range of technologies.

What has become of the original issue put forth by the critics of the HGDP? Part of the problem is that the issues dealt with in the criticism of the HGDP have been handled in the way that criticism of genomic mapping and human embryonic cloning have been handled: they have been filed under the worrisome category of "bioethics" (which commands a mere 3% of the HGP's budget). As postcolonial critiques have pointed out, the HGDP came to a relative standstill because it could not reconcile Western-scientific assumptions and intentions with non-Western perspectives towards "agriculture," "population," "medicine," "culture," and so forth. The sheer gap between the HGDP's bioinformatic colonialism and those predominantly non-Western cultures who were to be the source of biomaterial for the HGDP database illustrates the degree to which "global" once again means "Western" (and, increasingly, "economic"). *One of the meanings of the decrease in the presence of the HGDP and the rise in bioinformatics developments and applications is that the issues of ethnicity and cultural heterogeneity have been sublimated into an informatic paradigm in which they simply do not appear as issues.* That paradigm is, of course, one based on the predominance of information in understanding the genetic makeup of an individual, population, or disease. When, as geneticists repeatedly state, genetic information is taken as one of the keys to a greater understanding of the individual and the species (along with protein structure and biochemical pathways), the issue is not ethnicity but rather how to translate ethnicity into information. In such propositions, ethnicity becomes split between its direct translation into genetic information (a specific gene linked to a specific disease) and its marginalization into the category of "environmental influence" (updated modifications of the nature vs. nurture debate).

The biopolitics of genomic science is that of an informatics of the population in which cultural issues (ethnicity, cultural diversity) are translated into informational issues (either via a universal, generalized map of the human genome, or via individualized maps of genetic diversity). As with numerous other science-technology disciplines (from Artificial Intelligence to Nanotechnology), the apparent neutrality of abstract systems and purified information needs to be questioned. As Evelyn Fox Keller, Donna Haraway, and others have pointed out, information is not an innocent concept with regards to issues of gender and race [7]. The questions which need to be asked of bioinformatics, online genomic databases, and genome mapping projects, is not just "where is culture" but rather, "how, by what tactics, and by what logics is bioinformatics re-interpreting and incorporating cultural difference?"

As Manovich states:

"What we encounter here is an example of the general principle of new media: the projection of the ontology of a computer onto culture itself. If in physics the world is made of atoms and in genetics it is made of genes, computer programming encapsulates the world according to its own logic. The world is reduced to two kinds of software objects which are complementary to each other: data structures and algorithms...any object in the world - be it the population of a city, or the weather over the course of a century, a chair, a human brain - is modeled as a data structure, i.e. data organized in a particular way for efficient search and retrieval"[8].

Thus the bioinformatic database and its Database Management System (DBMS) is more than a technical tool; it is at once an epistemology, a technical apparatus, a mode of regulation, standardization, and "normalization," and a type of interface between the virtual and the real, between information and materiality, between computers and humans. The bioinformatic database not only manages these boundaries, it also constitutes them; in the very design, creation, and implementation of the database such boundaries are formed. The degree to which the DBMS allows or disallows mobility, morphology, and hybridity will dictate to what degree the "posthuman" or forms of "postorganic life" will be possible.
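Manovich's point - that any object, once modeled as a data structure, is organized for efficient search and retrieval - can be made concrete in a hedged sketch. All field names and values below are hypothetical; the point is that the choice of index decides which questions the database can even be asked.

```python
from collections import defaultdict

# Hypothetical genetic "profiles": once a subject is modeled as a record,
# its fields are the only surfaces through which it can be addressed.
profiles = [
    {"id": "p1", "marker": "BRCA1", "population": "A"},
    {"id": "p2", "marker": "BRCA1", "population": "B"},
    {"id": "p3", "marker": "CFTR",  "population": "A"},
]

# Building an index on "marker" makes marker-based queries efficient --
# and makes "marker" the privileged category through which the subjects
# are seen. Indexing on "population" would constitute a different ordering.
by_marker = defaultdict(list)
for p in profiles:
    by_marker[p["marker"]].append(p["id"])

print(by_marker["BRCA1"])  # ['p1', 'p2']
```

The design choice is the epistemological point: the index is not neutral storage but a decision, built into the database, about which differences count.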



[1] Manovich, Lev. "Database as a Symbolic Form." Posted to the nettime mailing list, 1998.

[2] Kittler, Friedrich. "Gramophone, Film, Typewriter," in Literature, Media, Information Systems, Ed. John Johnston (Amsterdam: G&B Arts, 1997).

[3] For a good shorter article about bioinformatics see Aris Persidis, "Bioinformatics," Nature Biotechnology 17 (August 1999): 828-830.

[4] For a classic account of the discovery of the structure of DNA see James Watson's The Double Helix (New York: Norton, 1981).

[5] See Foucault's course descriptions on governmentality and population in Ethics: Subjectivity and Truth. The Essential Works of Michel Foucault 1954-1984, Vol. I, Ed. Paul Rabinow (New York: The New Press, 1994). For critical perspectives on Foucault's notion of governmentality see Graham Burchell, Colin Gordon, and Peter Miller, eds., The Foucault Effect: Studies in Governmentality (Chicago: U of Chicago P, 1993).

[6] See Discipline & Punish (New York: Vintage, 1979).

[7] See Donna Haraway, Simians, Cyborgs, and Women: The Reinvention of Nature (New York: Routledge, 1991), and Evelyn Fox Keller, Refiguring Life: Metaphors of Twentieth-Century Biology (New York: Columbia, 1995).

[8] Manovich, ibid.





