
Converting Monks into Friars

Scholars in the Digital Age

[keynote @ Craft Critique Culture Conference]

Thank you for inviting me here tonight; in particular I want to thank Katie and

Sonia for all their hard work organizing this conference—I know the sort of tedious,

detailed and exacting tasks that requires, and I'm really glad someone else had to do them.

Thanks to Dana for tossing my name into the hat and for having the chutzpah at thirteen

to join the Horror in Film and Literature list and hold his own despite our attempts to beat

on the brat.

I want to thank all the organizations that sponsored this event: I figured if I needed to pad

the talk, I could read through them all very slowly, but this slide will have to do.

I want to talk about the weird intersections between the digital age and the

medieval one, but a digression first.

If you've read Beowulf, you'll know that digressions are a key feature of the poem. While

initially an episode might seem unrelated to the main narrative, each sheds light on an

important aspect of the overall story, such as how Hildeburh's mourning over her son and

husband prefigures the woman singing at Beowulf's funeral or how the burning of Heorot

tips us off to fractures in the Danish court. So trust me; I'm a doctor.

When I first received the invitation to speak here, after almost deleting it as spam,

I immediately thought of Jim Dixon in Kingsley Amis' comic classic, Lucky Jim.

If you don't know the book, it's one that captures academic life with a merciless and

hilariously observant eye, rather like David Lodge's Small World, which skewers the

academic conference circuit. I loved the book long before I decided to pursue a PhD and

my immersion in academic life has only made me love it more. The novel embodies all

my ambivalence with academia: all that potential for knowledge, thought and meaningful

exchanges too often undone by pettiness, obfuscation and the horrible weight of

bureaucracy. It also has the very finest description of a hangover in English literature.

I mention it now because the climax comes when Jim delivers a speech on

"Merrie England," one that he'd been drafted to give in place of the senior professor in his

department. Of course by the time he reaches the podium, he's hopelessly drunk,

floundering and eventually rude, abusive and giving up any pretence of talking

about medieval England at all. I use this as a "what not to do" example, by the way,

and have assiduously avoided scotch before my presentation.

The second thing that came to mind was the high bar set by Kurt Vonnegut for

speeches at colleges.

His good humor and loving humanism became such a trademark image that a piece not

written by him but by the far-less-well-known journalist Mary Schmich became an

internet sensation as "Kurt Vonnegut's Commencement Address at MIT." It started—

according to Andrea Wesselenyi, who went to the trouble of sifting through the myriad

emails and posts that tangled the story—with one person (for reasons unknown)

forwarding Schmich's piece with Vonnegut's attribution.

The advice "Wear sunscreen" sounded enough like Vonnegut's wry humor that the tag

was accepted and inevitably forwarded by a million billion people afterward. I may be

exaggerating the number, but not by much. That's the thing with a digital meme. It's

instantly shareable.

And easy to manipulate. So Vonnegut's name gets attached and few people

question it. His name—unlike Schmich's—confers authority. As Mayor Vaughn says,

"it's all psychological. You yell barracuda, everybody says, 'Huh? What?' You yell shark,

we've got a panic on our hands on the Fourth of July."

In other words, you forward a piece that says Mary Schmich, everybody says, 'who?' but

you forward a piece that says Kurt Vonnegut and fifty or a hundred of your friends do the

same thing. It's about authority and authenticity—just like in the Middle Ages.

There exist many medieval manuscripts attributed in this manner: the Pseudo-

Alcuin, the Pseudo-Turpin, the Pseudo-ben Sira. These works were attributed to well-

known authors or to fake authors whose names sounded authoritative. I always impress

upon my students that the Middle Ages were all about claims to authority. Latin has more

authority than the vernacular for centuries, the Bible has more authority than mere human

opinion, men have more authority than women and kings more authority than their

subjects. Chaucer's Wife of Bath claims experience as her only authority, but she peppers

her lengthy prologue with knowing references to biblical stories and clerical authorities

as well as her own tales of marriage.

We've got the same anxiety about authority and authenticity now. I want to try to

make an argument about the digital milieu of our time being roughly equivalent in some

key ways to the oral milieu of the distant past. Of course the digital one moves more

quickly and can be more comprehensive—a whisper on Twitter can be heard across the

world—but both have that anxiety about authority and authenticity.

How does that manifest? Phishing scams for instance: though we probably want to

believe that even the most novice web cruiser can recognize the typical Nigerian banking

scam for what it is, they keep showing up.

The sophistication of these flim-flammers grows as they ape the style of authority and

authenticity. Our campus has been plagued lately with emails alerting users that their

inboxes are filled and they must log in remotely to sort the problem out. The complexity

of these missives, which try to mimic campus emails from the IT department, is a wonder

next to the random verbiage of the average penis enlargement or lotto winning spam.

Worse, we've now got fake warnings about non-existent scams that fill up your inbox just

the same, which only deepens the anxiety. Who can you trust?

That's the underlying fear of the digital age. The famous New Yorker cartoon,

"On the internet, nobody knows you're a dog," points to the genuine discrepancy between

presentation and reality that is a wellspring of anxiety for so many people.

At the heart of this fear stands performance. Increasingly in the digital age it's true, as

Vonnegut wrote in Mother Night, we are what we pretend to be, so we must be careful

about what we pretend to be. I can tell you that I'm a sober, serious medieval

scholar who writes about obscure topics in dead languages,

but I could just as easily be a romance writer who pens scandalously steamy novels in a

variety of genres for the burgeoning erotic romance market.

My title "Converting Monks into Friars" is about the need for us as scholars to step

up and claim authority in the public realm by actively performing our work in the open.

Monks were supposed to keep to their cells in quiet contemplation offering a mostly

unseen model of piety, while friars were sent out into the community to beg for alms for

the church and to preach the word as a very visible sign of piety. We need to come out of

our ivory towers.

The long history of elitism connected with that phrase goes all the way back to the Song

of Solomon, where the lover's neck is compared to an ivory tower, but it was also a

popular motif in medieval depictions of the Virgin Mary, an oddly phallic image of purity like

the unicorn, a fierce beast who could only be tamed by a gentle virgin.

Look around us: the ivory tower has become a death trap—if only for the fact that

American anti-intellectuals are taking an axe to it. The undeclared war on education

has been devastating and continues to broaden. "No Child Left Behind"—a deeply flawed

concept based on falsified data—has ensured that all children are being left behind. The

mistrust of education, erudition and intellect grew from our Puritan heritage but has had a

boost in recent years thanks to the virulent right wing campaign to make us ashamed of

our smarts and suspicious of people who can think. [gratuitous Tea Party signage]

One of the reasons I love being in Britain is that they honor and respect their

boffins: what a great word. Stephen Fry, the gay, manic-depressive, gadget-happy Oxford

graduate who remains restlessly curious about things like printing presses and poetry, may

be the most famous man in Britain. Feminist (that dirty word here) Germaine Greer pops

up on chat shows and game shows. Professor Brian Cox is rightly reckoned a rock star,

with a level of fame that has him appearing with Alan Moore and receiving an

OBE from the Queen last year.

And speaking of Queen, the guitarist of that rock band went back after years of massive

fame to get his PhD in astrophysics.

I don't think we have to take up guitars or chat shows to change the climate for

intellectuals in this country (although I am more than willing to do so),

but we do need to get out there and show people what we do and why it's important. And

we need to do it in ways that people understand and in the places that they go. Too much

of academia has been a tree falling in a forest with no one to hear. Guy LeCharles

Gonzales offers a great example of this in his discussion of the way the conversation on

education has been re-framed by the people behind the controversial documentary

Waiting for Superman not only because of their film, but also because of their active and elaborate

transmedia platform which inspires action and participation. They're building a

community whose members connect to one another and encourage further action.

Conversely, Diane Ravitch’s truly excellent book, The Death and Life of the

Great American School System: How Testing and Choice Are Undermining

Education, takes the traditional approach, an expert voice armed with data, strong

opinions and a call to action, all buried between the covers of a hardcover book

that will be read by far too few people, and offers those who might be inspired to

act… nothing.

We have to offer something. We have to reach out. We need to let people know

what we're doing and that it's important. Academic integrity doesn't have to mean

academic frigidity. But we've got a massive PR problem that goes far beyond the ivory

tower and its ivy-covered walls.

We face a growing tide of corporate control over every aspect of our lives, one that

directly benefits from an ignorant and ill-informed populace. To resist it we need to reach

the hearts as well as the minds of people who may not be in our classrooms (yet). We

need to show that our work is essential not just to other academics, but to the very fabric

of this nation which seems to be unraveling as we watch. Social media offers one of the

best ways to demonstrate what we do as we do it.

How did a medievalist get so entwined in New Media? I know what you think: we study

dead languages in dark libraries where we thumb through dusty leatherbound volumes of

vellum translating obscure sermons. Okay—well, sometimes we do. I did have to master

a number of obscure languages for my PhD. I have gone to the British Library to handle a

tenth century manuscript, so I have touched a part of that distant past, felt the skin of it

with the holes where the page was stretched too far and the hair that didn't get scraped

from the hide, while I traced words written by a long dead hand a thousand years before me.

And yes, I cried the first time I saw the Beowulf manuscript, its singed edges reminding

me that we very nearly lost it in the Cotton Fire. And yes, I did suggest my university

start a medieval theme park where people would have to make their own book from


And it's true that some medievalists (like my dissertation director) are quite happy to

have nothing more modern than a fountain pen—if only for the reason that they don't

want to keep a flock of geese in their office for fresh quills—and shrink from using

computers as much as possible.

However, by and large, medievalists are geeks. This, perhaps, does not surprise

you. I seem to be one of the few who did not decide to become a medievalist after

reading The Lord of the Rings.

Tolkien was a medievalist before he redefined the genre of epic fantasy; he drew on

medieval works like Beowulf and The Volsunga Saga. But the people who read him are

often the kind likely to watch Star Wars and read speculative fiction and yeah, own

gadgets and be online. Geeks.

Apart from simply being geeks, the primary reason medievalists were early

adopters of the net: scarce resources. Remember how I said there was only one Beowulf

manuscript? That's the case for a lot of medieval works. When we say there are "a lot" of

manuscripts of The Canterbury Tales we're nonetheless talking about fewer than one

hundred. The "best seller" of the Middle Ages, a work known as The Pricke of

Conscience, boasts a mere 117 copies extant today.

If you wanted to study a manuscript located far from you, you had to travel to go

take a look at it. The previous solution for examining manuscripts you could not actually

travel to see was to get microfilm of the volume. For fragile manuscripts, this was also a

way to preserve their integrity and avoid excessive handling. If you've ever used

microfilm, however, you know what a poor medium it is for detail. If you're trying to

closely examine a manuscript to check a translation or a possible confusion, you'll go

cross-eyed trying to peer through the murk.

Digital copies are miles ahead in quality. In 1993, Kevin Kiernan launched the

Electronic Beowulf project and inspired a whole slew of similar projects.

Digital copies really are the next best thing to being there when it comes to

paleographical studies, and the way most academic budgets are going, it's the only option

many of us can afford. Scarce resources mean a lot of networking, too. In addition to rare

manuscripts, there were reference books almost as rare that you couldn't get through

interlibrary loan as well as questions about indexing, scholarship in progress and

cataloguing. Before I started graduate school, I was already networking with scholars in

the field in discussion groups for Old Norse and Old English via Gopher [not that kind].

Back in the days before your mother was on the internet (mine still isn't), there was a

text-only version where information came in hierarchical lists that you could go up or

down, but not sideways. If Gopher was an escalator compared to the stairs of the library,

the world wide web is a Wonkavator. Medievalists were all over it from the beginning.

But it's not easy. Trust me: as a medievalist I know how hard it can be to convey

the worth of what you do. I have been known to use the obscurity of my studies to baffle

and intimidate colleagues. I made sure that my first opportunity to speak in front of my

new colleagues at my current job included a chance to recite some Anglo-Saxon poetry

(without translation, naturally).

My students always get the opening lines of Beowulf recited to them (of course) as well

as the first few lines of The Canterbury Tales.

Mastering nine or ten languages for your dissertation does give a sense of

accomplishment which can at times lead to arrogance, but it also offers the genuine

problem of putting a great distance between what's in your head and what's in the head of

the person to whom you may be speaking.

So, I spend most of my time translating the past to the present. Not just the words:

the majority of what my students read, they read in translation. What I need to translate is

the entire context. I have to bring alive the Middle Ages so that students can understand

not just the sense of the words they're reading (challenging enough that) but the culture

from which they spring. I start every semester with the words, "You have been lied to"

because it's true.

Most of what they think they know about the Middle Ages is lies: people did not think

the world was flat, Vikings did not have horns on their helmets, Roger Bacon was

employing the scientific method in the thirteenth century and those "medieval" witch hunts?

Those were from the early MODERN era. And yes, people bathed—in fact the public

bathhouse was likely to be one of the best places for conversation and gossip. These are

the ideas I have to dislodge and replace with the true picture, translating the silly notions

they've picked up from popular culture about the period into better informed and more

developed pictures of the reality.

We need to do the same "translating" of our work with the general public. We

have to unpack not just the importance of what we do, but the entire process of how we

do it, to demystify the procedure of scholarship and lay bare our methods. We have to

make academia a spectator sport. There's a great Monty Python sketch that presents novel

writing in this way.

You hear the roar of the crowd as "local boy Thomas Hardy" sits down to write his novel

The Return of the Native and the announcers giving the play by play sound just like the

commentators at any football match: "…he dips the pen in the ink and he’s off! It’s the

first word, but it’s not a word, oh no, it’s a doodle way up on the left-hand margin, it’s a

piece of meaningless scribble and he’s signed his name underneath. Oh dear, what a

disappointing start!" They play the scene for laughs, but I know for a lot of scholars the

very idea of exposing their developing work is terrifying. Yet that's precisely what we

need to face if we want to get more people to comprehend what we do and its importance.

While I don't think it's necessary to literally write in public—though that is also a

useful act—we do need to be much more forthcoming about the process. Fortunately

there are a lot of scholars already engaged in this process in a wide variety of ways. One

great site to find out more about the folks who are doing this is Enabling Open

Scholarship, but there are many more and probably at least one or two in the area in

which you do your work.

Consider academic publishing: there's a lot of work done researching and then

writing. There's also a huge amount of work done to peer review and then edit those

pieces. Then there's a further step of publishing and printing, and finally there's the

distribution step, which may or may not include publicity, garnering published reviews of

the final product or advertising the work at conferences or in other publications. It's a

work-intensive process; it's also a money-intensive process. Academic publishing could

be a lot less expensive if instead of being packed into small print runs on library-quality

paper, it were openly available online to anyone who's interested in the topic.

Open Scholarship does have some of the same costs in time investment, but it has

none of the printing costs and more importantly it doesn't have as much of the lag time.

The instantaneous access of the web—the satisfaction of writing something and posting it

at once—makes it more and more difficult to withstand the slow pace of traditional

publishing. I realise that some of our impatience marks us as creatures of the 21st century

who expect instant gratification, but academia is a conversation, an on-going

conversation. How well does a conversation go if there are gaps as long as years between

the speakers?

Picture this: you're working on your dissertation and you see a notice for a new

book on a related topic. You've got something to add to that conversation. However, even

if you've defended already, you've still got to get your dissertation out to publishers who

might consider it. Should a publisher take on your tome, they will next ask you to edit

it—primarily but not solely to remove all the things your committee asked you to include

to show your knowledge of the work in your field. Add on the usual time for the various

stages of publishing and printing, and your "response" finally comes out—but the

conversation has moved on. Digital scholarship can speed up the process immensely and

keep those conversations going, while maintaining the academic rigour that is so

important to our work.

It's not as if most of our work generates royalties anyway.

Most academic publishers are struggling at best; some exist only because they have

university support underwriting them. People will still want printed editions of many

works, but academic publishers are going to go through the same upheaval that

mainstream publishing has been experiencing in the last couple of years.

Things are not going to change; they have changed. Think also about the prohibitive costs

of textbooks. With the advent of tablets and ereaders it's much easier for students to read

digital texts—certainly a lot more comfortable than reading PDFs on a computer in a

campus lab with buzzing fluorescent tubes, constant chatter and the whirr of other

machines and printers.

This is how we should be writing, too—for ereaders and online reading. I realise that my

weekly column for BitchBuzz has retrained the way I write. Instead of adding footnotes,

my habit is now to add links. I've already incorporated that into my blog writing. Instead

of explaining bizarre words or complicated concepts—or frequently in my case, obscure

British comedy references—I just slap a link onto the phrase and let people investigate

further if they're inclined to do so. I know that when my students read, a large number of

them never look at those footnotes at the bottom of the page. Would any more of them

click on live links while reading an ebook, particularly if a dialogue box popped up with

the requisite information, which they could close as soon as they read it and continue

reading exactly where they were? A lot, I suspect.

We can't rely on institutions to transform themselves. We have to do it.

We need to begin living as the kind of scholars our times require and we have to do it

now. We have to get out of the ivory tower and into the daylight of the public streets—

even if it's only the digital information highway as we so quaintly called it back in the

90s. I know it can be difficult to step out of the comfort of the cloister and into the bright

glare of the public gaze, but you can do it. As Kurt Vonnegut said, "wear sunscreen"—it

will help. But get out, be heard and be seen. Show the value of what you do to your

students and to your community and to your fellow scholars.

Too many of my colleagues try to hide from the digital world. "You can't use

websites as sources," some will say. "You can use websites that end in .edu but you

cannot use other websites," or the always reliable, "Do not use Wikipedia under any circumstances."


Never mind that studies have demonstrated Wikipedia to be reasonably reliable—at least

as much as print encyclopedias. And I think it's a very interesting assignment to have

students monitor a topic on Wikipedia and examine how it changes, because all robust

websites change. Information has never been static; now we have the technology to keep

up with the changes. Yes, it is true that any idiot with a computer can set themselves up

as an "expert."

That's why an essential part of teaching critical thinking should be teaching students how

to critically read the web. Sturgeon's Law was made for the web and we can see it so

clearly there.

One of the very first things I do in my Writing for New Media course is have the

students take a look at the Dihydrogen Monoxide site.

It's a great way to introduce them to the rhetoric of web design. I find that my students

have a sophisticated ability to read visual cues, but they lack the ability to articulate how

images create impressions. Is a website friendly or unfriendly, beautiful or ugly,

convincing or fake? We talk about elements of design including colour—usually the first

thing they mention—and layout as well as links, buttons and images. I'm kind of amazed

at how much more knowledgeable students are about fonts now. They recognize fonts!

They recognize bad fonts, overused ones.

Even five years ago I still found it necessary to explain what I meant by a "font" and they

really only connected it with word processors (and padding paper length). While they

don't quite assimilate the concept of branding, they can spot fonts that ape well known

logos like Coke or Starbucks.

Back to Dihydrogen Monoxide: they parse the site carefully, commenting on the

color choice (unexciting), the fonts (serviceable), images (too few, small, most look like

standard issue clip art), links (Paypal for donations, similar organizations) and they

conclude, not unreasonably, that the site is probably a few years old by the look of it, run

by people who are more concerned about the message than how it's delivered and overall

seems reasonably reliable—after all it has a copyright notice so we know who's

responsible and it's very informative. I suppose you can tell they're mostly liberal arts

majors in the class, because they seldom stop to consider what "dihydrogen monoxide"

might be.

It might also be slightly cruel and devious on my part,

but I want the lessons to stick: Don't be fooled by the rhetoric of the page: dig into the

content. Find out who's responsible—if you can't find out, there's going to be a reason,

whether it's mere incompetence or some attempt at subterfuge. Doubt and examine. After

this exercise, we turn our now more informed gaze upon websites for corporations, public

services, social justice organizations and even our own college. My students offer sharp

critiques and suggestions about how to improve our website's appearance and usefulness

for current students.

Of course along the way we deal with a lot of the hot button issues that consume

us all as we try to negotiate teaching in the digital age, including those big problems,

copyright, plagiarism and piracy.

I find it interesting that when I mention my New Media course to colleagues their first

question is usually, "What do you do about plagiarism?" It drives me crazy that the

people most concerned about plagiarism of words have no compunction about copying

and using art without attribution of any kind, peppering their syllabus or PowerPoint

presentations with copyrighted images.

You know what I do about plagiarism? Nothing. Well, almost nothing. There's a brief

note on the syllabus after the standard verbiage about academic integrity that says if any

student identifies plagiarism in another student's work, they can receive that student's

points for the assignment.

It hasn't happened so far; I'm not sure it will ever happen. I don't think I want it to

happen. Mostly I think it's unlikely to happen because of the nature of the writing we do.

My students are looking at websites, copying and pasting information back and forth—

and then commenting on it, explaining, analyzing and interpreting that information. They

seldom know in advance which models we'll be using in class as I try to pluck sites and

stories that are in the news at the time I'm teaching. The best way to avoid plagiarism

remains creating assignments that cannot be plagiarized.

Not that I discount the effects of plagiarism and piracy: but I think that policing

via Turnitin and similar paid services is the wrong approach. We have to face a very

basic and irrefutable fact: once it is possible to make an exact digital copy easily, people

are going to do it.

I'm more accustomed to dealing with this issue from the author side, where ebooks mean

epiracy. I know far too many writers who spend inordinate amounts of time hunting down

piracy sites, railing against pirates on social media and begging people to help fight

piracy. Sometimes I think they spend more time worrying about piracy than they do

writing. I guess I don't worry as much about it because I think a lot of digital pirates are

just digital hoarders. Not every pirated book is a lost sale. Some thieves just steal because

they can. And I'm with Cory Doctorow on the difficulty of monetizing obscurity, which

is a bigger problem for most small authors than piracy is.

iTunes proved an important fact about music piracy. If you make it easy and affordable

for people to get the music they want, they'd rather pay than pirate, especially when they

know that piracy sites leave them vulnerable to malicious downloads of malware,

spyware and so forth.

Rather than focus on plagiarism, I think we have to bring more attention to the

students' rhetoric. In my New Media course we focus on the ways that we use language to

create an impression for different audiences. Yes, I talk about textspeak, which often

seems to be used as prima facie evidence of illiteracy. I certainly heard much hand-

wringing amongst my colleagues on Facebook about LOL making it into the OED.

It made me immediately think about this book: medieval scribes used a whole slew of

abbreviations in manuscripts to save time. When you're copying a giant book by hand,

stopping frequently to sharpen your quill or squinting in the afternoon light, any time you

can save with an abbreviation is a good thing.

We need a handbook to decipher them now, since fashions change, but the scribes understood one

another because theirs was a shared discourse community. LOL may not be acceptable in all

situations, but I think we could give it a pass sometimes.

We need to work on our own rhetoric, too, and think about addressing a broader public

than just each other and our students. We need to be friars out in the community, not only

gathering alms, but preaching the word: the word is that academia is good, that academia

is not a luxury but a necessity. Convincing some of my colleagues to be public scholars

can be an uphill climb as it requires more time than they have to reconsider everything

they do, but I figure most of you are graduate students and so you can start out on the

right path, rather than have to adapt what you're already doing. So how do we make an

effort to perform as public scholars?

Blog: Whether on your own or as a part of a group, blogs offer an informal site for

showing the intersections between daily life and academia. You can talk about your work

as you do it; you can also talk about the process of working through an academic project.

The sympathy you will generate as well as the understanding achieved pays off

immensely. Blogs also form the backbone of scholarly networking. Where once we had

anonymous bloggers who felt unable to speak truthfully about academic life and its

difficulties and isolation (like Rate Your Students), we now have group blogs that offer

sophisticated knowledge and personal insight to guide those struggling through the

complicated process of academic programs and professional life.

A great example in my field: In the Middle, which features Jeffrey J. Cohen, Karl Steel,

Mary Kate Hurley and Eileen Joy. They present work in progress, highlight new

publications (their own and others') and talk about academic life and conferences. They

even have a mascot.

Because it's a blog, it's public and it's an on-going conversation that people both inside

and outside the academy take part in.

Another avenue that's ripe for scholarly discourse: Twitter. I wish I had a dollar for

every time someone told me they just "don't get" Twitter. They're intrigued by the use of

it in the Iranian elections and the Egyptian revolution, but they're always telling me they

don't want to hear about what people had for breakfast. Well, neither do I, so I don't

follow people who talk about their breakfast.

I follow Margaret Atwood and a host of other well-known writers. I follow publishers

and trade publications. I follow other medievalists and other scholars. I follow people I

know. I follow fake personages who are amusing. And about 1300 people follow me—

why? I could be modest and say, gosh! I just don't know why, but the truth is some are

spambots who follow anyone and the rest are people who like being part of the network

to which I belong—the information I share has relevance for them.

For those of you who get the retro reference, Twitter is "The Relay" and for the rest of

you, it's micro-blogging: passing on essential information to your friends.

Of course, you know I'm also going to mention Facebook. It often seems that a lot of

folks have a very adversarial relationship with Facebook, whether they refer to it as a

time-suck or as an invader of their privacy. But the plain fact is that the social network

has more than 500 million active users, half of whom log on every day. You have the

world at your fingertips on Facebook. You never know how far a message will spread.

You may want to keep your personal identity just for friends and families, but you can

create pages for your class, your program or department, your projects and invite people

to join the conversation, offer advice, participate. Throw your dreams into the air and see

what they bring back to you. I have had new friendships, places to visit, publications and

more because of Facebook connections. I've also had a hell of a lot of fun. Our work

should be fun. Of course it should be rigorous, challenging and thoughtful, too, but this

is our life. Let's enjoy every minute of it.

Make no mistake: things have changed. These are the things we can say at present

about the digital age:

1) Digital media is pervasive

2) Digital interactions are here to stay

3) We don't know what all this means

When Homeland Security announces that they will be sending alerts via Facebook and

Twitter, things have changed. When every business, organization and yes, college or

university has the little collection of "like", "tweet" or "buzz" buttons on their homepage,

things have changed. Not everyone may be on Facebook, but my dad is.

I have had people sniff at online interactions, saying "I talk to people in real life"

with verbal italics on the latter words, as if the people on the other end of the computer

link were not real. I tell them I have dead friends online.

Yes, I revel in the shock value of saying that, but I do want to bring them up short [and I

miss my friends a lot]. While it's possible to be false online, to be a dog or whatever, it's

also a way to reach across continents and oceans and find kindred spirits. I have met so

many people who have brightened my days, and perhaps more importantly, my nights—

those long dark nights of the soul where isolation seems certain and comfort far; a friend

who's there in cyberspace can be a friend indeed. You can tell the digital optimists from
the digital pessimists: those who believe the internet is made of cats and those who
believe it is made of trolls.

I suppose, like all things, it's a mixture of the good and the bad [Kipper as Naboo]. The
internet is only a tool, no better or worse than the people who use it.

No, I am not a complete lover of technology: I love writing with my fountain pens

and I close my letters to friends with a wax seal.

Technology for the sake of technology is not what this is about: just because we have the
ability to do something doesn't mean we need to do it, and there are a lot of risks involved.

The development of technology without ethics or morality will not bring us good things.

But we need to use every means possible. There's a concerted war against what we do:

one to turn universities into businesses, to denigrate intellectualism, to cast us as

frivolous dilettantes. We need to be visible, we need to be active, we need to be useful.

We need to employ interdisciplinary scholarship and work across traditional boundaries.

We need to be friars out in the community, preaching the word. The word is thinking and

the word is good.

Be proud of what you do. Remind people that the humanities teach us how to

understand ourselves, what makes life worth living. This is not trivial or frivolous. We

must perform as digital scholars. It can be a good thing to be on stage. Remember, the

stage can be digital; open your mind to the possibilities—all the possibilities, what

Russell Hoban calls "being friends with your head."

Be friends with your head; if you find a Lady Gaga song illuminates what you're saying

about Restoration drama, use it. If you can make a comparison between The Pardoner's

Tale and Treasure of the Sierra Madre, do it. If you can find a way to explain the current

political situation with World of Warcraft, do it. The difficulties of being an

interdisciplinary scholar get compounded by choosing to study popular culture, too, but

it's what academia desperately needs now. "Demonstrating relevance" seems to have been

interpreted as "you can get a job with this"; true relevance means bringing out our

essential humanity whether you are studying medieval literature or The Wire. Tell us who

we were, tell us who we are.

I wanted this to be a speech that changed the world, that brought children of every

nation together to join hands and sing songs, for all of you to rise up as one and declare

universal peace, to fill your heads with impossibly adorable images of puppies and kittens

and leave a golden glow of beneficence that no political wrangling or natural disaster

could dispel. I guess I'll have to settle for hoping that you got something useful out of it,

even if there isn't a slice of cantaloupe at the end. Mostly I hope I've encouraged those of

you who—like me—can't seem to fit yourself into existing categories. Be friars for the

church that is learning. Go among the people and preach the word. Go on being friends

with your head; have confidence in your passions; and wear sunscreen. It couldn't hurt.