
Between the Digital and the Analogue in the 21st Century


History, Technology & the Liberal Arts
by Rachel Leow
Faculty of History, Cambridge University
version 2.1

A talk presented at Rochester University


February 2015

I. The heuristic
I want to start with a heuristic and run with it into unwarranted terrains.
Hands up if you are wearing a wristwatch today.
Leave your hand up if you are wearing a wristwatch which has digits on it, not hands.

It's interesting that when I did a search for "clocks" and "wristwatches", the images Google displayed to me were all analogue clock faces, even though we've had digital watches since 1970, and certainly since 1985, which is the year this particular digital watch is from. For those of you who don't recognize it, this is the great movie Back to the Future, 30 years old this year. Here is Marty McFly, immortalised for all time in this iconic movie poster, staring earnestly at his digital watch.

The original division between the digital and the analogue has to do with a category distinction in electronics and computing. Of course, we've had analogue computers for arguably 2000 years.

(Slide: The Antikythera Mechanism, c. 200-100 BCE)

But the electronic revolution made it necessary to distinguish between analogue computers, which rely on continuous values, like hydraulics or scale measures, and digital computers, which use discrete values (1s and 0s) for their computational processes. In information technology, analogue information takes the continuous form of waves of sound and light, while digital information consists of discontinuous data or events. So in one sense, the analogue world of light and sound is the real world we live in, while the digital world is abstract, artificial.

And so an analogue clock is one in which time is represented by hands that spin and point to an approximate location on a continuous spectrum, while a digital clock displays exact, abstract integers.
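To make the distinction concrete, here is a minimal sketch in Python of what an analogue-to-digital conversion does: it samples a continuous signal at fixed intervals and quantizes each sample to one of a small set of integers. (The function names, the 440 Hz tone and the 16-level resolution are my own illustrative choices, not anything from the talk.)

```python
# Sampling a continuous signal (a sine wave standing in for sound) into
# discrete integer values, the way an analogue-to-digital converter does.
import math

def sample(signal, duration_s, rate_hz, levels=16):
    """Quantize a continuous signal into `levels` discrete steps."""
    samples = []
    n = int(duration_s * rate_hz)
    for i in range(n):
        t = i / rate_hz
        value = signal(t)                     # continuous: any real in [-1, 1]
        step = round((value + 1) / 2 * (levels - 1))
        samples.append(step)                  # discrete: an integer from 0 to 15
    return samples

analogue = lambda t: math.sin(2 * math.pi * 440 * t)   # a 440 Hz tone
digital = sample(analogue, duration_s=0.001, rate_hz=8000)
print(digital)   # e.g. [8, 10, 13, ...]: discontinuous data, 1s and 0s underneath
```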

Which one is more accurate, better? Like any tool, it depends on what you want to do with it. We seem, as a society, to prefer analogue clockfaces, though. But I would guess pretty much everyone in this room is literate in both technologies. We can glance down at our wristwatches and estimate that we are about a semicircle away from lunch, or look to the corner of our screens and read that it's 12:02pm. We are versed in both the digital and the analogue.

I want to suggest that this sort of multiple literacy is the key to the future. To do that, I want to talk about cyborgs, flip-flopping, and a whole lot of other things. And I invite you to come and run along with me and my audacious heuristic.

II. Cyborg world


Much of my thinking on cyborgs was stimulated by a wonderful web project by Tim Maly called 50 Posts About Cyborgs, which ran back in 2010 on the 50th anniversary of the coining of the term "cyborg".1 Tim's introduction to that series literally makes you see cyborgs everywhere. You think a cyborg is just the Terminator? Robocop? The Borg? Brains jacked into computers, bodies twisted into metal until they're barely recognisable as human? No way, says Tim. Cyborgs are products of any non-hereditary technological intervention that changes the course of your biological existence.

Drink caffeinated drinks? Wear glasses? Use a cellphone? You're a cyborg.

1. Tim Maly, 50 Posts About Cyborgs <http://50cyborgs.tumblr.com/post/1051775931/2-the-project>

Used an axe to cut something you would otherwise not have been able to kill and eat? Congratulations, says tech writer Kevin Kelly: your ten-million-year-old ancestor and his artificial claw was a cyborg.2

I appreciate this unintuitive definition, because I think it's the right way to think about the world we live in today: both unexceptionally and exceptionally revolutionary at the same time. It's important not to underestimate continuities from earlier times: the ways in which we have perceived ourselves to suffer from information overload for much longer than our present moment of information explosion; the way new technologies have always fundamentally transformed our social practices and our knowledge systems.3

But I do also think that we live in a uniquely cyborgian age.

2. Kevin Kelly, What Technology Wants.

3. Books in this vein: Carla Hesse, 'Books in Time', in Geoffrey Nunberg ed., The Future of the Book (Berkeley: University of California Press, 1997); Ann Blair, Too Much to Know (New Haven: Yale University Press, 2010).

William Gibson, a science-fiction novelist who has been called the noir prophet of cyberpunk, coined the term "cyberspace" back in the 1980s to denote a space that was distinctively elsewhere: one we visited periodically, he wrote, into which we peered from the familiar physical world.4 Now, he said, cyberspace has everted: turned itself inside out, colonised the physical.5

(Here, somewhat fittingly, is a cigar-box miniature printer made by some enterprising netizen. It prints only William Gibson's tweets. He must be so proud.)6

4. William Gibson, Neuromancer (1984).

5. William Gibson, 'Google's Earth', New York Times, 31 August 2010 <http://www.nytimes.com/2010/09/01/opinion/01gibson.html>

6. 'Cigar-box thermal printer that prints William Gibson's tweets @GreatDismal' <https://www.adafruit.com/blog/2012/03/08/cigar-box-thermal-printer-that-prints-william-gibsons-tweets-greatdismal/>

We inhabit a world that is profoundly everted. We still talk about the web like it's not everywhere and everywhen that we are, like it's not the real world. But the reality of it is that the web is everywhere we are. The advent of Google has allowed us to create an enormous external brain: the latest iteration of human neocortex evolution, if you take Ray Kurzweil at his word.7 Smartphones have allowed us to take the web out of the computer room (who among you still has a computer room?). New media forms now allow us to project ourselves outward into the world, in a huge fuzzy cloud of blog posts, tweets, Spotify playlists, Whatsapp messages, digital footprints.8 New technologies allow us to extend ourselves inward, into new containers of the self: our disk drives, devices, our wearable tech, our arrays of networked things.

7. Ray Kurzweil, 'The Hierarchy in your Brain', TED Talk, March 2014 <http://blog.ted.com/2014/03/20/the-hierarchy-in-your-brain-ray-kurzweil-at-ted2014/>

8. Adrienne LaFrance, 'The Web is the Real World', The Atlantic, 22 December 2014 <http://www.theatlantic.com/technology/archive/2014/12/the-web-is-the-real-world/383985/>

Press a virtual button on your phone and the real world around you changes, with astonishing speed: real food appears on your doorstep, or an Uber comes to your very location to pick you up. Astronauts 3D-printed a socket wrench in outer space this year. In the Harvard Bookstore, you can print any of over 5 million e-books on demand for around $8 apiece.

We're creating a world that seamlessly shapes itself to human desire.9 And it is less and less easy to distinguish between the machine and the human, when we plant pacemakers in our hearts and nano-bots in our bloodstreams, remake our sick and broken with aluminium and titanium, and send our most intimate souls in bits and bytes across the criss-cross of fibre-optics to touch a friend, perhaps even a lover, whom we have never met. We live in a world strung wonderfully, joyfully, terrifyingly, between the digital and the analogue. We live in a cyborg world.

9. Alexis Madrigal, 'Smart Things in a Not-Smart World', The Atlantic, 23 July 2014.

III. Suggestive etymologies: The Two Cultures


Who coined the term "cyborg"? It was this man, Manfred Clynes, in conversation with his interlocutor Nathan Kline. Clynes was a rare breed, a renaissance man of both the arts and the sciences, a neuroscientist and musician. His work systematically combined the two into a career that spanned from performance recordings of Bach's Goldberg Variations all the way to patented inventions in the fields of neurophysiology, ultrasound, data recording and wind energy. Clynes was obsessed with the science of the emotions, and founded a whole field about it, called sentics: the scientific study of waveforms of touch, emotion and music.


I think it's no coincidence that the inventor of the term "cyborg" was in a sense a hybrid being himself. Hybrid, that is to say, along the axis of the sciences and the humanities. These have frequently been seen as two houses divided since C. P. Snow gave his "two cultures" lecture at my university, Cambridge, in 1959, just one year before Manfred Clynes coined the word "cyborg". Snow said, then, speaking in an age and society still reeling from the devastating moral weight of Hiroshima and Nagasaki:

I believe the intellectual life of the whole of western society is increasingly being split into two polar groups... Literary intellectuals at one pole, at the other scientists... Between the two a gulf of mutual incomprehension, sometimes (particularly among the young) hostility and dislike, but most of all lack of understanding.10

10. C. P. Snow, Rede Lecture, 1959.


Snow's concerns were directly linked to the nuclear moment, the recognition of the immense power of science to make and destroy, and the urgent need for everyone, not only scientists, to apprehend and argue with that power, to temper the cans of technology with the shoulds of informed reason.11 He wrote:

I believe the industrial society of electronics, atomic energy, automation, is in cardinal respects different in kind from any that has gone before, and will change the world much more... This is the material basis for our lives: or more exactly, the social plasma of which we are a part. And we know almost nothing about it.

In a way, Snow's call has not really yet been met. We now find ourselves fifty years after Snow launched this appeal for an integrated arts-science culture and curriculum, a matter on which, in his view, the very future of Britain and the world depended, and, it seems, not very much closer to it.

I am completely unqualified to offer any comprehensive proposal for what that curriculum should look like, though a few of us got together some time ago and thought about it in fun, speculative ways.12 And I think the way forward will eventually be to encourage more of the same.

11. Lisa Jardine, 'C. P. Snow's Two Cultures Revisited', C. P. Snow Lecture 2009, reprinted in Christ's College Magazine 2010 <http://www3.christs.cam.ac.uk/cms_misc/media/Publications_Christs_Magazine_2010_web.pdf>

12. New Liberal Arts (2009); see <http://www.snarkmarket.com/nla>


What I would like to suggest, though, is that continuing to educate students into this dualistic STEM/humanities disciplinary model is not going to be enough. But I also think, speaking realistically, that a mass revamp of curricula is only one way, and perhaps the most institutionally difficult way, to go about effecting these changes. So I think it's most immediately useful to ask, of you the student: why do you want education? And to ask, of you the educator: why do you educate?

IV. Suggestive etymologies: The Flip-flop


The "flip-flop" was originally articulated by a self-styled media inventor, novelist and long-time internet compadre of mine, Robin Sloan, who has all his fingers on the pulse of the directions of new media and journalism today. Here's his definition:

the flip-flop (n.): the process of pushing a work of art or craft from the physical world to the digital world and back again, maybe more than once.

What does that look like? Here's an example.


1. Sculpt eight different vases. (PHYSICAL)
2. Take photos of those vases. (DIGITAL)
3. Find those photos and combine them somehow into a single vase. (DIGITAL)
4. Print that new vase in plaster with a 3D printer. (PHYSICAL)
5. Take photos of that new vase. (DIGITAL)
6. Make an animated GIF! (DIGITAL)13

13. Robin Sloan, 'Dancing the Flip-Flop' <http://www.robinsloan.com/note/flip-flop/>


Let me tell you what I take from this idea. First of all, what is produced out of the flip-flop is a hybrid product, something which is impossible without the input of both. That is to say, the thing which is produced in the end lies not solely within the realm of possibility of either the digital or the analogue. By playing on the frontiers, the combination of the two has expanded the realm of possibility of action and creation.

Secondly, the act of production and creation could have been carried out by two people, one of whom was a sculptor and the other a computer artist, entering into conversation and joint creation together. But maybe what we really need, what we are moving towards and perhaps must move towards, is the amalgam of the analogue sculptor and the digital creator in one being.

What I take from this, therefore, is a call for flexibility. An injunction to play at the borders. And to become someone who can play at the borders.


If you want the utilitarian rationale, it's just good business sense. As Paul Saffo at the Institute for the Future said, "If you want to innovate, look for the edges. The fastest way to find an innovation is to make a connection across disciplines that everybody else has missed."14 In this talk, I'm suggesting that one of those edges, the gigantic, elephant-sized opportunity running along the edge of most fields, is technological in nature.

But if you want the real argument, and luckily for me the two are fundamentally aligned, it's this, and I will get to it at the end: given the world we are living in and building around us, I don't think we can afford not to be flip-floppers.

14. Paul Saffo, quoted in Mark Stefik, 'We Digital Sensemakers', in Bartscherer & Coover, Switching Codes (2011), p. 41.


V. Frontiers and Edges in the Humanitas Digitas


I'm going to talk mostly about technological advances within the realm of the humanities, since that's what I'm most familiar with, but my argument may well be scalable to other fields.

The digital humanities today encompass an enormous range of disciplinary and methodological approaches, as Stephen Ramsay has said: from media studies to electronic art, data mining to edutech, scholarly editing to anarchic blogging, while inviting code junkies, game theorists, free culture advocates, archivists, librarians and edupunks under its capacious canvas. Its proponents call for a radical, revolutionary expansion of the field of humanistic inquiry to leverage the power of data sets, computational analysis, design-centered thinking, transmedia argumentation and interpretative work that exceeds the cognitive or analytical abilities of the single or traditional humanist.15

I think it's no coincidence that the digital humanities have arisen at this time, at a kind of global conjuncture.

15. Anne Burdick, Johanna Drucker, Peter Lunenfeld, Todd Presner, and Jeffrey Schnapp, Digital_Humanities (Cambridge, MA: MIT Press, 2012).


The past few decades have seen the exponential expansion of technology out of the hands of the select few into the realm of mass consumption. They have seen an exponential increase in the world's proportion of educated, literate and urbanised people, and with this, increases in demand for media, for information, for new products, new communities, new habits and opportunities of material and immaterial consumption.

The humanities and social sciences have experienced a profound historiographical shift in the last 20 years, towards modes of study which pay new attention to the connectivities of our age. We have shifted our scale to the transnational, and to the global, moving away from the silos of national history which characterized our 19th and 20th century scholarship.


These paradigm-shifting pressures are pressing scholars towards large-scale comparative histories that require mastery of regions, languages, archives and theoretical and scholarly literatures simply beyond the capacity or scope of one individual. Or towards large-scale, longue-durée, macroscopic histories that are perfectly aligned with our new capabilities for processing big data.

And the final prong of the conjuncture: since the 1980s we have slowly moved into an age of free-market orthodoxy and its handmaiden, neoliberal austerity. Higher education has become a buyer's market whose rationale for existence is to act as a guarantor of employment, prestige and wages. College and welfare budgets alike are under sustained financial, social and moral attack. The humanities, especially, have felt increasingly embattled and imperilled.

In this way, and in agreement with prominent DH practitioners like Matt Kirschenbaum and Ted Underwood, I've thus found it most useful to think of the Digital Humanities as a kind of tactical coalition rather than a single intellectual project. This captures the sense of the DH as arising at the heady, opportunity-filled frontier born out of this highly convergent moment.


So let me explore some of the frontiers we can find in the humanitas digitas, in the following areas.

1. Digital communications: as a transformative condition of the way scholars communicate with the public
2. Little DH: as part of a transition toward new research practices, an upgrading of the humanist's craft
3. Big DH: as a means of analysis and interpretation in itself, a radical upgrading, if you like, of the humanist's methodological toolbox
4. Digital futures: where we are headed

V.I. Digital communications


Perhaps the most visible impact that technology has had on the humanities is that it has created enormous possibilities for communicating scholarship to the public. We now have more possible formats than ever in which to create and disseminate scholarship, and we can reach wider potential audiences than ever.


In this respect, academia has much to learn from watching innovations in journalism. Journalists have had to adapt incredibly quickly to the reshuffling of faultlines in our content landscape between print and digital platforms, and they are equally invested in the project of reaching audiences. In journalism and academia alike, the "message", the "argument", the "story", can now be designed and repackaged for different content platforms, which interlace text and hypertext, sonic and graphic, animated and static.

As the authors of Digital_Humanities suggest, "To design new structures of argumentation is an entirely different activity than to form argumentation within existing structures that have been codified and variously naturalized." How we design scholarship for a multimedia age is one of the key frontiers in what we call the digital humanities.16

16. Burdick et al., Digital_Humanities, p. 12.


The flipside of this communications revolution is the rapid reconstitution of the place of the print book, in academia as well as in society at large. I don't really want to get bogged down in the question of whether or not the book is dying, which is an endless and passionate debate, represented by Clay Shirky, batting for the 'death to the book' team, and Nick Carr, for the 'death to the bookless philistines' team.17

What I do want to do is use the future of the book to suggest a distinction between a frontier and an edge.

The frontier in the reimagination of the book is a rapidly shifting, exciting one. People have already begun to think and dream on the uncanny frontier between electronic and print. Challenges to the traditional publishing landscape have precipitated a flowering of new formats and web platforms. In particular, we are rethinking how books get made in an age of multimedia, multimodal and multivocal publishing.

17. Recommended: Clay Shirky's blog <http://www.shirky.com/weblog/> and Nick Carr's blog <http://www.roughtype.com/>


The first edition of the printed text of Debates in the Digital Humanities was published in 2012 by the University of Minnesota Press.18 From the start the printed book was intertwined with an online platform, open to contributors of the volume and outside peer reviewers to comment on and annotate the text before it was published as a print book, which we can take, in this sense, to be a kind of snapshot of a conversation in progress. Indeed, that conversation continues: the book has now been flip-flopped back onto an open access platform that will enable social reading experiences through a fine-grained commenting system and markup tools. Books are no longer static, unchanging objects.

Interacting with Print was conceived as an exercise in collaborative authorship.19 It brings together 22 writers in on- and offline contexts to produce a multigraph, as opposed to a monograph, over the course of a year on the history of print. Seeking to historicize the idea of interactivity as a concept that spans both old and new media, they are also, in the process, exploring the act of writing and publishing as a networked, multi-authored activity.

18. Debates in the Digital Humanities <http://dhdebates.gc.cuny.edu/>

19. Interacting with Print <http://www.interactingwithprint.org/project/multigraph>


What is a book, when it could look and function more like a database? Robert Darnton, the great historian of eighteenth-century France, imagines an e-book composed of many layers arranged like a pyramid, the topmost layer of which represents the ordinary monograph, which may be printed and bound, but the subsequent digital layers of which can be clicked through to dive into its embedded supplements: media-rich primary sources, bibliographies, appendices, historiographies, bodies of documents. Eventually, he says, readers will make the subject their own, trace their own reading pathways through it, and may even, in a sense, read an entirely different book each time.

And just the other day, James Patterson released his latest novel, Private Vegas, a thriller e-book which actually self-destructs 24 hours after the point of first reading.20 A book which vanishes in your very hands, so that if you don't read fast enough, racing against the clock alongside the protagonist of the book, you will be left forever in the dark about the ending.
That is the frontier: a heady, exciting place. May it be full of exploding books.

20. James Patterson, Private Vegas <http://www.selfdestructingbook.com/>


What is the edge?

Here's an edge, though not the only one: I see the digital as being differently bodied from the analogue.

Human beings are creatures of embodied cognition. As George Lakoff and Mark Johnson have argued, we are not just disembodied minds floating around, and we don't actually take well to environments which assume this.21 So much of our cognition arises from profoundly embodied, pre-linguistic interactions with the world. The spatialities and embodiments of our human experience show up constantly in our metaphors of even the most abstract thinking.

The replacement of the page by the screen creates a certain sense of dislocation in us, and just look at those words! Replacement, dislocation. At a level more fundamental than we might recognize, the screen uproots our deeply-evolved sense of where we are, what spaces we occupy.

21. George Lakoff and Mark Johnson, Philosophy in the Flesh: The Embodied Mind and the Challenge to Western Thought (New York: Basic Books, 1999).


Craig Mod, one of my favourite thinkers about the future of the book, talked recently about how we are increasingly wrapped in digital thinness. We have folders in our physical filing cabinets, and folders on our virtual desktops; look how the language of our computers mimics that of the real world. But a folder in a filing cabinet has edges; it has weight and body. A folder on the screen looks exactly the same whether it contains one item or a billion items.

The cloud model to which internet copy culture is moving, in which we have a single master copy held in a corporation's data centre and effectively projected temporarily onto our screens, creates an even greater sense of dislocation.

The desire for weight and embodiment of the digital generates artistic creations like these, James Bridle's book cubes, which in his terms are touchstones, souvenirs, counterweights to the thinness and invisibility of our web readings.22

22. James Bridle's bookcubes <http://booktwo.org/notebook/bookcubes/>


It's the spirit behind the Library of the Printed Web, which consists, as its proprietor Paul Soulellis puts it, entirely of stuff pulled off the Web and bound into paper books.23

There is an exhaustion with the endless scroll of the ever-burgeoning stream, an embodied, human unease with infinity.24 There is a sense of too much, too fast, giving rise to the counterpoint of slow news, slow food, slow journalism, long reads, and yes, even the resurrection of festina lente and a renewed faith in the codex. To Craig Mod, this dislocation has precipitated a small countermovement, a desire to do real stuff with our hands, a renewed momentum around craft and physicality.25

23. Library of the Printed Web <http://libraryoftheprintedweb.tumblr.com/>

24. Alexis Madrigal, 'The Year the Stream Crested', The Atlantic <http://www.theatlantic.com/technology/archive/2013/12/2013-the-year-the-stream-crested/282202/>

25. Craig Mod, 'The Digital-Physical: On Building Flipboard for iPhone and Finding Edges for Our Digital Narratives' <http://journalofdigitalhumanities.org/1-3/the-digital-physical-by-craig-mod/>


I see this sentiment among a loose cohort of early adopters of the web, among whom I would count myself, who in our pioneer enthusiasms arrived at a kind of saturation point with screens, who have caught glimpses of the limits of what the digital can, can't and won't do. And I believe that while some of those frontiers represent the limits of our technological capacities at the moment, at least some of them also run along the blurry edges of what it means to be human itself.

Yet this is not a call to retrogression. Again, I think it's an error to think in terms of zero-sum. Rather, it's precisely those in whom this unease is evoked who should be, in a way, most obligated to pack their bags and head to the frontiers of the uncanny. We cyborgs and media flip-floppers are the ones who should be running our fingers along those edges, probing and pressing to see what gives and what needs to stay. A dance, a flip-flop, if you will, between the abstraction of the digital and the physicality of the analogue. Or perhaps, to repurpose Francis Bacon for the 21st century, between the analogue hand and the digital intellect, for as he said so long ago: Nec manus nuda, nec intellectus sibi permissus, multum valet. Neither hand nor intellect left each to itself is worth very much.

We need hybrid, cyborg, flip-flop imaginings to imagine the frontier world of digital print which will suit us as pioneers, as scholars, as readers, as writers, as teachers and students, but also to know where our edges are, as humans.


V.II. Little DH
I owe the term "little DH" to a dear friend, cyborg and colleague of mine, Konrad Lawson, with whom I ran a digital humanities workshop at Harvard a couple of years ago, along with another cyborg compadre, Javier Cha.26 We were talking about how to bring the fearful and the self-consciously untech-savvy on board with our scheme to have a wide conversation about the digital humanities. And Konrad suggested that we make a distinction between "big DH", which I think at the time he characterised as "big scary code and mathematical graphs and stuff", and "little DH", the daily practices of scholars, researchers, students and teachers whose creative practices were being enriched by the advent of new technologies of writing, reading, archiving and note-taking. (I should say that Konrad, a historian, is also an accomplished coder himself, and no fearful rabbit when it comes to big DH.)

At any rate, I find the distinction a useful one, because I think it is a good indicator of the scale and scope of the digital humanities revolution.

26. The Historian's Craft in the 21st Century <http://www.tinyurl.com/thc21>


Technology has utterly transformed the way historians work. We now have an array of tools which are altering the thought environments in which historians write, take notes, conduct research, collaborate and communicate with each other. The frontier in little DH is tool culture, squarely in the field of praxis: the proliferation of apps, software, methodologies and hacks for writing, reading, productivity, project and time management, archiving, and teaching. These are being innovated and developed every day, although not reliably in conversation with humanists.

The way we approach our sources has changed. Digitisation lets us root through archives and primary and secondary literature from the comfort of our homes (sofas). Or we perform blitzkrieg-like raids on archives, armed with personal cameras; I own huge swathes of British Foreign Office files in a device the size of my thumb.

The way we read is under decisive transformation: thanks to the contributions of Google Books and other mass repository-scanning endeavours, it is becoming entirely possible to conduct research without setting foot in an actual library, something that was simply unthinkable even just 10 years ago.

The web is encrusted with writing, collaborative editing and citation software tools that have revolutionised the way we write and cite: Scrivener, Ommwriter, Google Docs, Mendeley, Papers, Zotero, Refworks, Sente and so on.


The way we communicate and create academic communities has changed. Twitter, for example, has helped to democratise the hierarchies between professor and student, between the eminent scholar and the breathless fanboy and fangirl. It has become a conference backchannel, a space for generating sparks of contact, collaboration and encounter that were simply impossible before.

The way we take notes is different. Keith Thomas, the great historian of early modern England, wrote in the LRB some time ago about what he charmingly refers to as his "dinosaur practices". His notetaking habits were refined in an era of handwriting, typewriters, carbon-copies, and yes, scissors. He would write out quotes and facts on slips of paper, or literally gut sections of books with his scissors, and stuff them into topical envelopes. Filing, he wrote mournfully, was for him a tedious activity: "bundles of unsorted notes accumulate. Some of them get loose and blow around the house, turning up months later under a carpet or a cushion." It would be unthinkable, he said, to transfer his lifetime of slips of paper, his cellar "stuffed with cardboard boxes and dog-eared folders, and littered with loose slips which have broken free from overstuffed envelopes", to "wiki software that can be used to develop a personal research database".27

27. Keith Thomas, 'Diary', LRB <http://www.lrb.co.uk/v32/n11/keith-thomas/diary>


I think this raises an interesting question about the extent to which technological changes are not only making obsolete certain entrenched practices of a discipline or field, but perhaps also devaluing skills and abilities which society once honoured.

And this is not just happening in history, or even the humanities, of course. As Isaac Kohane at Harvard Medical School put it: "Who today is the better doctor: the one who can remember more diagnostic tests, or the one who is the quickest and most savvy at online searching for the relevant tests?" Does bedside manner really matter more than the ability to interact competently with a database? "We are going to be uncomfortable," he said, "with some of the answers to these questions for many years to come."28

No coincidence, by the way, that Kohane is a paediatric endocrinologist, a PhD in computer science, and a librarian. Another cyborg.

28. Isaac Kohane, quoted in Jonathan Shaw, 'Gutenberg 2.0', Harvard Magazine <http://harvardmagazine.com/2010/05/gutenberg-2-0>


Cyborg experimenters know where the edges are. Take the case of the creation of online scholarly communities, which is where little DH has also had a huge impact. Kathleen Fitzpatrick, the author of a wonderful book on publishing and the future of the academy, has been a huge advocate of the opportunities of the web to create and build scholarly communities.29 She has worked tirelessly to enunciate these views both in theory and in practice. And that is why it is people like her who are most aware of the challenges and limitations, the edges, of technology.

She wrote about the example of Twitter, which developed a huge academic user base during its early years, roughly 2007-2012, but which has recently seen a palpable falling-off of scholars and writers who were once very active on Twitter and have now begun withdrawing. In most cases, they have been impelled to silence by the ways in which technologies have not adjusted to correctly accommodate the scales at which scholars want to communicate with each other and with their audiences. Here's Frank Chimero, a self-confessed "early adopter", writer and designer:

Here's the frustration: if you've been on Twitter a while, it's changed out from under you. Christopher Alexander made a great diagram, a spectrum of privacy: street to sidewalk to porch to living room to bedroom. I think for many of us Twitter started as the porch: our space, our friends, with the occasional neighborhood passer-by. As the service grew and we gained followers, we slid across the spectrum of privacy into the street.

29. Kathleen Fitzpatrick, Planned Obsolescence (blog and book).


And became what Alan Jacobs, a humanities professor at Baylor, called "Big Twitter". "I just want to sit here on the porch and have a nice chat with my friends and neighbors," he said. "But wait. I'm not on the porch anymore. I'm in the middle of Broadway. So I'm doing what, it seems to me, many people are doing: I'm getting out of the street."

This is a problem, as Kathleen Fitzpatrick correctly says, not with technology itself, but with its human edges: with the social systems and interactions that have developed around it. "If we are going to take full advantage of the affordances that digital networks provide, facilitating forms of scholarly communication from those as seemingly simple as the tweet to those as complex as the journal article, the monograph, and their born-digital descendants, we must focus as much on the social challenges that these networks raise as we do on the technical or financial challenges."30 And I think this holds true for technology as a whole.

30. Kathleen Fitzpatrick <http://www.plannedobsolescence.net/communities/>


A last thought for little DH. I've also been thinking here of Andrew Abbott's recent book Digital Paper, which is an incredible primer, manual and practitioner's meditation on doing research in a digital-analogue world.31 The biggest takeaway from that book, though, is that however much you master the digital, there is no machine shortcut. Writing and research are still slow, iterative, circular, artisanal practices, full of cul-de-sacs and unavoidable flip-flops between the digital, the analogue, and the limitations of your own mental and physical faculties. Intellectually, research will continue to be a somewhat mysterious process of piecing together insights from immersion in your sources. Logistically, it will continue to be a fundamental battle against the ever-present entropy and erosion of biological memory and the integrity of your physical and virtual filing systems. The digital, it turns out, does not "cure" this state of affairs, and it isn't even clear that it is something that needs curing. Apparently, even at the edge between digital and paper, this is just still how humans create researched things.

31. Andrew Abbott, Digital Paper (2014).


V.III. Big DH
What Konrad and I think of as "Big DH" has been the more visible, exciting and well-funded realm of the digital humanities, characterised above all by the shift to the macro-scale: a function of the explosion of data and of the tools with which to analyse it. Big DH projects are frontier celebrations of new scales in humanities projects: new abilities to derive sensible patterns out of the data of the past through practices of "distant reading".

Big DH appears to be nearly all frontier. With new methodologies like geographic information systems, text mining, topic modelling, social network analysis and data visualisation tools, we are now able to read patterns across enormous datasets of land, language, people and social groupings, letters, shipping routes, climate trends and more, which have until now been simply beyond the scope of a single person. We also have new curation and archiving capabilities, new scanning and digitization practices.
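As a flavour of what "distant reading" means in practice, here is a toy sketch, entirely my own and not drawn from any project the talk mentions: instead of reading each document closely, we count term frequencies across a whole (here, tiny and invented) corpus and inspect the aggregate pattern.

```python
# A toy "distant reading": corpus-wide word counts instead of close reading.
from collections import Counter
import re

corpus = {
    "letter_1801.txt": "The empire writes to the colony of its trade in tea and silver.",
    "letter_1802.txt": "Trade in tea and silver binds the colony to the empire.",
    # ...in a real project, thousands of digitized documents would go here
}

def tokenize(text):
    return re.findall(r"[a-z']+", text.lower())

counts = Counter()
for document in corpus.values():
    counts.update(tokenize(document))

print(counts.most_common(5))   # the corpus-wide pattern, visible without reading a page
```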
These tools are being developed in the wake of an undeniable information explosion
of our digital age. We live in a world today where physical space is no longer a
relevant or even possible constraint on the amount of information that is being
produced and stored.


No one knows how much data there is out there, but some estimates suggest that 90% of all the data in the world today was created in just the past few years. If you were to print out Wikipedia alone, you would have around 2000 Encyclopedia Britannica-sized volumes. Someone printed out just the edits to the entry on the Iraq War from 2004 to 2009, and that came to 12 volumes on its own. In the 15 minutes between now and the end of my talk, 720 hours of YouTube video will be uploaded, Google will receive 30 million search queries, Twitter will grow by 1.5 million tweets, and we will send an additional 3 billion emails. We can't even begin to print this, let alone comprehend it.
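Just to make those rates easier to grasp, a back-of-envelope normalization of the figures above to per-minute terms (the numbers are simply the talk's, divided by 15):

```python
# Normalizing the 15-minute figures quoted above to per-minute rates.
per_15_minutes = {
    "hours of YouTube video uploaded": 720,
    "Google search queries": 30_000_000,
    "tweets": 1_500_000,
    "emails": 3_000_000_000,
}
for name, count in per_15_minutes.items():
    print(f"{name}: {count / 15:,.0f} per minute")
# 48 hours of video, 2 million searches, 100,000 tweets, 200 million emails: every minute
```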
More and more archives are being digitized, and they are enabling scholarship and
lines of inquiry that we would never have thought feasible. We now have an
astonishing range of Big DH projects.


One of the most established and richest is the Republic of Letters project at Stanford, which traces the networks of correspondence, scholarly community and intellectual exchange in the world of Enlightenment France.32 We have network representations of the Dutch Republic of Letters,33 data visualizations and interactive web animations of the Atlantic slave trade,34 intimate social network visualizations of life in an eighteenth-century French village,35 GIS maps of the spatial patterns of modern urban cultural change in Republican China,36 incredible soundmaps of 1920s New York,37 and more.

32. Republic of Letters <http://republicofletters.stanford.edu/>

33. Scott Weingart, 'The Circulation of Knowledge and Learned Practices in the 17th Century Dutch Republic' <http://www.scottbot.net/blog/wp-content/uploads/2011/06/WeingartHagueFebruary2010.pdf>

34. The Atlantic Networks Project <https://sites.google.com/site/atlanticnetworksproject/home/a-webmap-of-atlantic-networks/thematic-databases/dutch-and-british-atlantic-slave-trade-voyages-1751-1795>

35. Angouleme in 1764 <http://www.fas.harvard.edu/~histecon/visualizing/angouleme/index.html>

36. A Historical GIS Dataset of Urban Cultures in Republican Beijing <http://www.iseis.cuhk.edu.hk/history/beijing/intro.htm>

37. The Roaring Twenties <http://vectorsdev.usc.edu/NYCsound/777b.html>


I also love the Old Bailey trial archive, where the visualization of large-scale patterns in nearly 200,000 pages of criminal court proceedings from early 19th-century London has enabled historians to pursue lines of inquiry fundamentally different from those they had been pursuing earlier, because historians had until now focused on the sparser and therefore more humanly manageable archive of the 18th century.38

As more and more of our data is born digital, these tools and early frontier explorations will and must become part of the landscape of the historian's craft. For quite apart from opening up new ways of examining old historical material, it will be impossible for historians of the future to curate, preserve and understand the historical material we are producing at astronomical rates in the late 20th and early 21st centuries without them. Historians must retool.

That's the frontier. What is the edge?

38. Old Bailey Trial Archives: With Criminal Intent <http://www.oldbaileyonline.org/>


At least one edge is in the flattened, deceptively inclusive "we".

There are over 7 billion people on the planet, and of that 7 billion, 4.4 billion, or 62%, are not online. 84% of the US has internet access, but that still leaves 50 million Americans who are not actually part of the "we" that data scientists invoke when they say "we live in a hyperconnected, networked, all-access world". An overwhelming percentage of those 50 million offline Americans are, just like the rest of the global offline population, rural, low-income, illiterate, female.39 Internet penetration is a mere 15% in places like Bangladesh, Pakistan and Nigeria, maybe 20% in Egypt, the Philippines and Thailand, and 15-18% in Indonesia, the largest Muslim country in the world, with a population of 250 million.

39. Pew Research Internet Project; McKinsey & Co., 'Offline and falling behind: Barriers to Internet adoption' <http://www.mckinsey.com/insights/high_tech_telecoms_internet/offline_and_falling_behind_barriers_to_internet_adoption>, accessed September 2014.


The "we" embodied in that iconic global phenomenon, Wikipedia, masks the fact that it is a phenomenally gendered enterprise: its content creators and knowledge producers are 80% male. It is also disproportionately Anglo-American. Here is a chart of Wikipedia articles correlated with world population, and as you can see, the situation for Asia versus Europe is starkly reversed: Asia represents 60% of the world's population and less than 10% of Wikipedia's articles, while Europe represents about 11% of the world's population and nearly 60% of Wikipedia's articles. North America, too, punches far above its weight in the ratio of articles to population.
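A quick back-of-envelope calculation (mine, using only the shares just quoted) makes the skew vivid: divide each region's share of articles by its share of population, so that 1.0 would mean exact proportionality.

```python
# Representation index = share of Wikipedia articles / share of world population.
shares = {
    # region: (share of world population, share of Wikipedia articles)
    "Asia": (0.60, 0.10),
    "Europe": (0.11, 0.60),
}
for region, (population, articles) in shares.items():
    print(f"{region}: representation index = {articles / population:.2f}")
# Asia: 0.17 (heavily under-represented); Europe: 5.45 (heavily over-represented)
```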
And as studies by Mark Graham at the Oxford Internet Institute have shown, the Middle East and North Africa (MENA) region is astonishingly underrepresented: with only about 260,000 articles in Arabic out of 9 million total Wikipedia articles, few languages have more speakers and fewer Wikipedia articles than Arabic. As Mark Graham says, "This relative lack of MENA voice and representation means that the tone and content of this globally useful resource, in many cases, is being determined by outsiders with a potential misunderstanding of the significance of local events, sites of interest and historical figures."40

40. Mark Graham, 'How Well Represented is the MENA Region in Wikipedia?' <http://blogs.oii.ox.ac.uk/policy/how-well-represented-is-the-mena-region-in-wikipedia/>


In other words, a kind of Orientalism for the digital age. Edward Said, we are not done with you.41

It turns out that despite the promises of the openness and egalitarian spirit of the web, despite the supposed ubiquity of data and the "everything is available" rhetoric of the internet, the online and offline worlds are not separate. As Astra Taylor said, the digital is not distinct from "real life"; it is not a realm where analogue prejudices are somehow magically abandoned.42

The massive digital archives of our recent past will require as much critical reading as our non-digital archives. Maybe even more so, given the pernicious promise of big data methodologies to let us effortlessly read everything: but really, only the everything that is digitally available. Historians, probably especially historians of the non-Western, non-English-speaking world, know that many archives of our non-digital distant past will never see the light of digitization. And, in an age of big digital data, this may well scale up into histories we will never write, attention we will never pay, and empathy we will never cultivate, to the dark, analogue edges of available human knowledge.

41. Edward Said, Orientalism. "The Orient and Islam have a kind of extra-real, phenomenologically reduced status that puts them out of reach of everyone except the Western expert. From the beginning of Western speculation about the Orient, the one thing the Orient could not do was to represent itself."

42. Astra Taylor, The People's Platform, p. 108.


I am reminded of what Samuel Arbesman said about dinosaurs.43 In 1858 the first known American dinosaur was found in New Jersey: the hadrosaurus. This sparked what is hilariously called the Bone Wars, the great dinosaur rush: a period of intense speculation, exploration and discovery, roughly 1877-1892, during the American Gilded Age. Huge amounts of work were undertaken primarily at two dig sites, the Hell Creek formation in Montana and Como Bluff in Wyoming, chosen mostly because there were just so many fossils there, and they were really easy to work. And the dinosaurs that were easiest to find in these two places were the stegosaurus, the triceratops, and the tyrannosaurus rex.

Today, these dinosaurs have a disproportionately large presence in our collective knowledge, not because they're the most "important", but because they were the most available: they are the "convenience sample". Convenience samples proliferate in a data economy that aspires to infinite convenience: information wherever and whenever you need it. But will we be in increasing danger of making too much out of the low-hanging fruit? As you hear the latest claim about big data and its promise, it's OK to be excited; just keep in mind that we may be finding the triceratops, and thinking we understand everything there is to know about dinosaurs.

43. Samuel Arbesman, The Half-Life of Facts. See also Samuel Arbesman, 'Big Data: Mind the Gaps', Boston Globe <http://www.bostonglobe.com/ideas/2012/09/29/big-data-mind-gaps/QClupxdwdPWHtRrZO0259O/story.html>


What do we know about humanity? At the heady frontiers of the age of informational convenience, do we wish to care about its inconvenient and less available edges? Can we? Should we?

V.IV. Digital Futures


This brings me to our digital futures, and to the can and the should. The frontier, and the edge.

When I watch videos of tech conferences these days, I'm sometimes reminded of what Linus Torvalds, the creator of Linux, supposedly used to say: "I only do kernel space. I don't do user space."44 I watch talks from computer scientists showcasing their inventions and amazing feats of data interpretation, and I see how widespread this is. We can recognise your face from this bunch of blurred pixels using new techniques of compressed sensing and machine-learning classification; we can predict your gender, your race, your political leanings, your sexuality, based on basic linguistic and network analytics; we can spread happiness and sadness through micro-nudging Facebook user feeds.

44. Quoted in Tim Carmody, 'Memory to Myth' <http://www.theverge.com/2013/1/22/3898584/aaron-swartz-profile-memory-to-myth>


Yes we can. Should we? Shoulds arise out of politics, of ethics, of philosophy: out of sustained conversation, of reflection on and investigation of ourselves as a society. As the eminent tech journalist Doc Searls put it, politics is in user space. Justice is in user space.

Thesis #1. User space is the domain of the humanities.


When he said that, Doc Searls was speaking from a place of profound sadness and disquiet: a memorial service for a remarkable man called Aaron Swartz, who hanged himself on 11 January 2013. Aaron was a cyborg: a computer programmer, a political activist, a moral philosopher, a writer, a business entrepreneur, an architect of social and technological standards for sharing information, a reader and writer of extraordinary range. All his life, he fought battles which spanned both kernel and user space. He played, for example, a critical role in the 2012 defeat of the Stop Online Piracy Act and the Protect Intellectual Property Act, which posed a fundamental threat to internet freedoms; he led an unprecedented wave of citizen activism that, as he put it,

killed the bill dead, so dead that when members of Congress propose something now that even touches the Internet, they have to give a long speech beforehand about how it is definitely not like SOPA; so dead that when you ask congressional staffers about it, they groan and shake their heads like it's all a bad dream they're trying really hard to forget; so dead that it's kind of hard to believe this story, hard to remember how close it all came to actually passing, hard to remember how this could have gone any other way.45

45. Aaron Swartz <http://isoc-ny.org/misc/aaron_swartz.txt>


Finally, he fought the battle of why only certain people at certain well-funded institutions got to read online scholarship which was in theory funded by everybody. He wrote scripts to download something like 4.8 million journal articles from behind JSTOR's paywalls: not exactly to distribute them freely, as his prosecutors would later argue based on a legally ungenerous reading of Aaron's infamous Guerrilla Open Access Manifesto, but simply to make a point.46 And he lost this battle, though others have taken up his war. Faced with 13 felony charges, the prospect of $1 million in fines and 35 years in prison, he was hounded by the FBI for two years, literally, to his death. He was 26.

That is why Doc Searls said, at that memorial service: justice is in user space. And that is why Aaron's long-term partner, Taren Stinebrickner-Kauffman, asked the audience at that memorial service the profoundly uncomfortable hard question lurking at the centre of Aaron's story: "If you're in the tech sector," she said, "why are you there? What do you really believe in? If you believe that technology is making the world a better place, why do you believe that? Do you really understand what makes the world a bad place to begin with?"

Thesis #2. Uncomfortable hard questions are the domain of the humanities.

46. Aaron Swartz, 'Guerrilla Open Access Manifesto' <https://archive.org/stream/GuerillaOpenAccessManifesto/Goamjuly2008_djvu.txt>. For a nuanced account of the prosecution, watch the excellent posthumous documentary on Aaron, The Internet's Own Boy <https://www.youtube.com/watch?v=vXr-2hwTk58>


All technology has its flip-sides, its dystopian edges. In all places and times, technologies have been created and put to uses never intended by their originators: their Janus-like quality is written deeply into their nature as tools.

Yes, Twitter enabled the Arab Spring, but Twitter also enabled #gamergate and the victimisation of women online.

Yes, the web has allowed small businesses like Etsy to achieve an international reach previously unthinkable, but it confers the same advantages on identity thieves in a world of poorly-managed corporate security.

Yes, social media has enabled photo sharing, instantaneous and always-on communication, and other connectivities we have never known before. But our very connectedness has enabled a surveillance state more thorough than any government could have dreamed of during the Cold War, and surveillance has become the business model on which the internet rests.47

47. Mike Loukides, 'The Computing of Distrust' <http://radar.oreilly.com/2015/01/the-computing-of-distrust.html>


The end-to-end design of the networks that make up the internet in a way produces its neutrality. The network is value-neutral, or content-agnostic, in its treatment of the packets of data that move across it, because it in effect leaves it to the applications located at the network's ends to determine how the data should be interpreted. The trouble, of course, is that the content of what passes over the networks is not value-neutral. It's not the bits and bytes, not the digital, but what comes out and gets interpreted, judged and valued at an analogue interface that really matters.

Thesis #3. Value, judgment and interpretation are the domain of the humanities.
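As a tiny illustration of that end-to-end neutrality, a sketch of my own (not from the talk): the "network" below just moves opaque bytes, and only the applications at the two ends decide what those bytes mean.

```python
# The network is content-agnostic: it neither reads nor judges the payload.
def network_transfer(packet: bytes) -> bytes:
    return packet

payload = network_transfer("48 ships departed Nantes, 1763".encode("utf-8"))

# Two different "ends" interpreting the identical bytes:
as_text = payload.decode("utf-8")      # one application reads a human record
as_numbers = list(payload[:4])         # another sees only integers to process
print(as_text)
print(as_numbers)
```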
Yet the amount of data passing over the network has meant that increasingly we are relying, have to rely, on algorithms to make those judgments and interpretations for us. Google tells me these search results are the most relevant; Facebook tells me that these are the most important posts in my feed today.

But what is relevance? What is importance?


We are, as Zeynep Tufekci said, entering the era of judging machines: machines that calculate "not just how to quickly sort a database, or perform a mathematical calculation, but to decide what is best, relevant, appropriate, or harmful".

Such algorithms will improve our lives infinitely for the better. We want better algorithmic and statistical interpretation of metadata in clinical studies; it will help us to predict who, for example, will respond badly or well to new cancer drugs. With algorithmic pattern-matching, Google can predict with astonishing accuracy an impending flu outbreak simply by analysing the frequency of searches conducted around common flu symptoms.
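The idea behind that kind of prediction is simple enough to sketch. What follows is a toy version of my own, with invented numbers, not Google's actual model: fit a straight line between weekly symptom-search volume and reported flu cases, then extrapolate from this week's searches.

```python
# A toy search-volume-to-flu-cases predictor via simple linear regression.
search_volume = [120, 150, 210, 300, 420]   # weekly symptom-term searches (invented)
reported_cases = [30, 38, 52, 76, 105]      # corresponding weekly case counts (invented)

n = len(search_volume)
mean_x = sum(search_volume) / n
mean_y = sum(reported_cases) / n
slope = sum((x - mean_x) * (y - mean_y)
            for x, y in zip(search_volume, reported_cases)) \
        / sum((x - mean_x) ** 2 for x in search_volume)
intercept = mean_y - slope * mean_x

this_week_searches = 510
print(f"predicted cases this week: {intercept + slope * this_week_searches:.0f}")
```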


But these algorithms also confer enormous powers. They permit depersonalized personalization: micro-targeted advertising based on macro-scale trend aggregations. Target knowing you're pregnant, and sending you vouchers for discount diapers. Who is in charge when algorithms are in charge? "Algorithmic judgment," Tufekci observes, "is the uncanny valley of computing."48 Google's auto-completion algorithms, in their earnest attempts to be helpful, auto-complete search queries that begin "women should..." with "stay at home", "be slaves" or "be in the kitchen". Through the work of algorithmic judgment, Google ends up aggregating and perpetuating awful stereotypes.49 And it is the fault of everyone and no one at the same time.

And the dark side of personalization is discrimination. The algorithms which will tell you which cancer drugs to take can also help credit companies, for example, predict your racial profile simply from the things which someone like you tends to buy, and deny you loans or adjust your credit on the basis of this information.50 In this way, the algorithmic, depersonalised use of data can become a highly personal civil rights issue.

48. Zeynep Tufekci, 'The Year We Get Creeped Out By Algorithms' <http://www.niemanlab.org/2014/12/the-year-we-get-creeped-out-by-algorithms/>

49. 'Google's Autocompletion: Algorithms, Stereotypes and Accountability' <http://dhpoco.org/blog/2013/11/19/googles-autocompletion-algorithms-stereotypes-and-accountability/>

50. Alistair Croll, 'Big Data is our Generation's Civil Rights Issue, And We Don't Know It' <http://radar.oreilly.com/2012/08/big-data-is-our-generations-civil-rights-issue-and-we-dont-know-it.html>


Thesis #4. Civil rights issues are the domain of the humanities.

There is something deeply abstractive about the digital world, something differently bodied. Humans don't do well with infinity. Distance and scale create new ways of reading patterns in our lives beyond our individual comprehension. Over time, they divulge patterns of cultural migration, of economic exchange, of the spread of epidemics and other meaningful aggregations of society's flows and movements. But they also mask, even distort, the textures and scales at which we live our actual lives.

I love the way Scott Weingart put it in his talk on the networked society at the DH Forum in Kansas last year:

When you zoom out far enough, everything looks the same. Occupy Wall Street; Ferguson riots; the ALS Ice Bucket Challenge; the Iranian Revolution. They're all just grassroots contagion effects across a social network. Rhetorically, presenting everything as a massive network is the same as photographing the earth from four billion miles: beautiful, sobering, and homogenizing... I challenge you to compare network visualizations of Ferguson tweets with the ALS Ice Bucket Challenge, and see if you can make out any differences.

I couldn't. We need to zoom in to make meaning.51

Thesis #5. Meaning is the domain of the humanities.

51. Scott Weingart, 'The Moral Role of DH in a Data-Driven World' <http://www.scottbot.net/HIAL/?p=40944>


At this point, I am about to run the risk of making an egregious and unwarranted leap. But we have already run my starting heuristic so far into the realm of indulgence that I feel I need only ask for an iota more of your patience.

Thesis #6. If the sciences are at the digital frontier of the humanities, maybe the humanities are at the analogue edge of the sciences.

And I would say that this, here, is a qualitatively different meaning of the digital humanities: which is to say, the question of how to be human, and how to conceptualize (and ensure) humanity's human-ness, in a digital world. Being, in other words, digital humans. It's not what we traditionally think of as DH; it's certainly not what DH departments are putting in their job notices. But let us extend our tactical umbrella just a little further, because I believe there are few things more critical to a liberal arts education, and to being a citizen in our networked world.

So now we are back to C. P. Snow, and his unease with a bifurcated world. And we are back to the question I began with. Students, why do you want education; teachers, why do you educate?


VI. Heuristic provocations


In our world today, we live strung out between the digital and the analogue, the online and the offline, the fast and the slow, the bodied and the differently-bodied, the machine and the human. The lines between these are less clear today than they have ever been. We are past the dominion of what Bruce Sterling called the steam-snorting wonders of the Victorian age. We are past the age of the massive infrastructural projects and atomic energy programs of the mid twentieth century. In our world, our technologies "stick to the skin".52 They serve as proxies for, and may eventually become indistinguishable from, our minds and our bodies.

Marx composed these words in precisely that age of steam-snorting wonders: technology, he said, "discloses man's mode of dealing with Nature, the process of production by which he sustains his life, and thereby also lays bare the mode of formation of his social relations, and of the mental conceptions that flow from them."53

How much truer, in this, our great age of unbelievably intimate technologies.
How much truer, in this, our great age of unbelievably intimate technologies.

52.

Bruce Sterling, Mirrorshades.

53.

Karl Marx, Das Kapital, Vol. I, Chapter 15.


We can't afford to draw lines between ourselves as scientists versus humanists, between kernel space and user space, the online and the offline, the can and the should. We live in a blurred and blurring realm between the digital and the analogue, and we must make our way through it, understand it, live it, and eventually determine what it can, should and should not become. And to do so, we all need to become cyborgs.

So I say to you, the student, that the purpose of an education is to become a cyborg, a digital human; and I say to you, the educator, among whom I now, very recently, count myself, that the purpose of an education, in fact its duty, is to produce cyborgs.

Let me go back to what Craig Mod said about technology.54

I believe technological change is like a freight train of a certain unstoppable momentum, and we have two broad choices:

1. Stubbornly stand in front of the train and try to push it back.
2. Accept the train and be a force laying railroad ties which place it on a nourishing course.

The right course of action is #2. But in order to be those engineers of our future, we need to be humanists and scientists, flip-floppers, cyborgs.

54. Craig Mod, 'About' <http://craigmod.com/about/>


To see the frontiers, to make the unexpected connections, to drive innovation. We cannot know the landscape of the future. And it's probably a mistake to think that a college education should guarantee that future in any way. Nothing intrinsic to a marketing degree from 1990 could have prepared its graduate for the world of social media. Nothing in Keith Thomas's historical training prepared him for the wiki. There was no such thing as a data journalist or data scientist 10 years ago. We do not know what dreams may come. Being flexible, and living on the frontier, gives you an edge. It's good business sense.

But we also need to patrol the edges of our humanity. To guide the direction of the unstoppable train of big data to the right, ethical places. To guide the development of the book, the new architectures of our scholarship, the modes of our audience engagement: to place these on nourishing courses which ensure that the qualities we value in them translate into new environments. To innovate and develop technologies which are in line with the values society wants to hold.

Perhaps these, then, are the 'new liberal arts': the arts that all men and women can and should know, if they are to be free humans in a digital world. Let the generation of Manfred Clyneses, flip-floppers and cyborgs be raised now, in this incredible space between the digital and the analogue. Let the generation of the in-betweeners begin our dominion, for we are inheriting the earth.

