ADHO 2015 Paul Fortier Prize for Early-Career Scholars

I am touched and honored to be awarded the Paul Fortier Prize for best paper by an early-career scholar at DH2015, this year's Global Digital Humanities Conference in Parramatta, Australia. I would like to begin by acknowledging the Darug people – the traditional custodians of the land on which I have received this honor. I would also like to pay my respects to the elders, both past and present, and extend that respect to other Indigenous people.

In his guest speech during the closing ceremonies, Dr. John Burrows spoke of some of the "wonderful things" Dr. Paul Fortier contributed to the practice of DH. One of the most central was his commitment to developing our ability to assemble and examine ever-growing data sets of evidence – in some cases, "all of it" – in order to understand collections of previously impenetrable size, scope or scale. In so doing, he and his colleagues inspired the work of later scholars who continue to pioneer new approaches in humanities scholarship. I am proud to count myself among those who have been beneficiaries of Dr. Fortier's "wonderful" vision.

To those who knew him well, Dr. Fortier was more than just a passionate and dedicated digital humanities scholar. His was a generous and inclusive personality, his presence and mentorship a “wonderful thing” to those many scholars with whom he made a special effort to engage. Quite a few people have told me how pleased he would have been to come to Australia from Manitoba to celebrate this conference, surrounded by the diverse and vibrant community he helped to found. I am deeply gratified to receive this award in his name, and in his honor. It is indeed another “wonderful thing.”

I would like to thank the Alliance of Digital Humanities Organizations (ADHO) and its constituent organizations, the Association for Computers and the Humanities (ACH), the DH2015 Awards Committee, our Australasian hosts and my fellow nominees.

About the ADHO’s Paul Fortier Prize

Posted in Uncategorized | Leave a comment

ACH 2015 Lisa Lena Opas-Hänninen Young Scholar Prize

Thank you to the ACH (Association for Computers and the Humanities), the ADHO (Alliance of Digital Humanities Organizations) Awards committee, and the CSDH (Canadian Society for Digital Humanities)/ACH 2015 Joint Conference Awards committee and organizers for this great honor.

I am so touched by this unexpected and very meaningful award. Having asked a few of those lucky enough to have known Dr. Opas-Hänninen, I have learned about her vanguard scholarship in English, Linguistics and Nordic Philology, and have likewise heard about her exceptional gift for being able to bring people together, fostering community among those at all levels of academia. Whether in moments of conversation (in any of a surprising array of languages), deft committee-wrangling, or unforgettable conference retreats, Dr. Opas-Hänninen helped forge the Digital Humanities community through moments of shared humanity. She continues to have an influence across our evolving interdisciplinary community of digital humanities scholars, expressed in part by the profoundly humane ways in which we all come together for conferences like this – and thus come to know, respect, challenge and inspire one another.

I am deeply honored to have been awarded this prize in Dr. Lisa Lena Opas-Hänninen’s name. I will work to be worthy of the enthusiasm and support for my work that the Prize represents, and to live up to the example Dr. Opas-Hänninen set for creativity, public outreach and engagement. Most of all, I will work to further her deeply-felt vision of a shared, intellectual common good among those who do Digital Humanities.

About the ADHO’s Lisa Lena Opas-Hänninen Prize


When ‘Retro’ was ‘Nowtro:’ Restoring the Apple II Plus

I have been into computers since I was young. A prescient gift from my parents to a restless pre-teen, my first computer was (I am convinced) instrumental in both productively occupying my time and absorbing my boundless hyperactive energy, but even more, it served to stimulate my imagination. The logical systems of a personal computer (to say nothing of the modular components) were like LEGO in another dimension. They too could be played with, built with. I had years of incredible experiences with my Apple computers and, unsurprisingly, even now those early days loom large in my technological and childhood memories, nostalgia blending seamlessly with the sense that I was somehow 'on the cutting edge.' Logic, Art, Imagination and 'Maker' culture all combined in those early days of personal computing, and the impact on me continues to inform my scholarship and my mindset.

One specific manner in which those halcyon days of personal computing persist in my twenty-first-century technological work (and scholarship) is my interest in 'retro-computing' – the restoration and use of old, arguably obsolete software and hardware in a modern computing context. After a few years of retro-computing I can honestly say I'm completely hooked. And from Space Shuttles to computer software, 'retro' is decidedly 'nowtro.'

Part 1: Restoring the Apple II+


In the late '70s, my parents invested in an Apple computer for me. It was an Apple II+, with 48K of RAM, a monitor and that most brilliant of Apple's many contributions – the floppy disk drive. I moved aside the toys and books and games, and made a real workspace for it – a desk chair, a shelf for the diskette cases and manuals, cables tied neatly. Something about the experience meant 'productivity' – I knew 'work' was being done – but it also meant 'creativity' – I knew I was building, making, achieving something in this new environment. The future would be defined by a loud whirring fan, blinking green characters, and primitive beeps and buzzes.


The history of Apple and of the Apple II Plus computer has been detailed very well, but among the many huge leaps forward the machine offered was the presence of eight expansion slots. This 'open door' to hardware enabled a series of innovations on the Apple platform that deepened my computing and educational experience tremendously. And as much as each of these expansions and upgrades was designed to be 'user-installable', incorporating each new peripheral into my little Apple both required and taught a delicate, agile hand, and fostered increasing confidence in myself and in computers.

First came the thermal printer (the Silentype), which I had been determined to save up for.


Never one for penmanship, and frustrated by the limitations of the typewriter, I found the printer a practical choice. Once incorporated into my computer system (plugged via a circuit board into one of the machine's 'expansion slots'), it helped me understand and master the promise of word processing. I could reassemble text and print it for use in the 'mundane' world, then change the text and re-print it… putting me light years beyond the everyday world of Prestige Elite and white-out.

Another innovation was the joystick:


This hand-held little beauty meant more than just arcade madness (which it most definitely provided) – I could control images on-screen with my hand directly, the motion of the stick being converted into instructions to the software. This led to all kinds of experiments as I quickly began designing games and little routines that demonstrated various properties of this human/computer interface. I began voraciously consuming the various programming languages that were becoming available to me (e.g., BASIC, Pascal, and LOGO) as I built games. My friends would come over and we would play each other's games together… and it was in this way that another profound change came to my rich new computing world.

The need to have friends come over in order to use the computer with me grew as the sociality of the computer deepened – along with the challenges of securing parental approval for the bike rides across town to their houses, to say nothing of the rare and difficult-to-arrange parent-facilitated car ride. Eventually, the inconvenience of what would later be called the 'play date' became an effective cost/time justification for the addition of a modem (modulator-demodulator) to my system.


The modem connected to the Apple via an expansion slot – but it also connected at the 'other' end. Plugging a telephone handset into its acoustic coupler, I could use the phone line to connect my computer to my friends' computers remotely, achieved in a fascinating orchestra of tone signals (explained well here)! Even more fascinating, computer Bulletin Board Systems had begun to appear (there was even one in my town)! The game playing, socializing and collaborative creativity began to really take off as I found myself in a computer network, a community of computer enthusiasts of all ages and abilities interacting through (and sometimes about) the personal computer. I joined 'CompuServe' and 'The Well', and got my first-ever email address at age 12.

As the technology moved ahead, my Apple II+ would become obsolete, its perch usurped a mere 7 years later by a whole host of Apple Macintosh computers (more about my first Macintosh later). However, I couldn’t bring myself to sell that original Apple II or that original Mac, or throw them away, even as they yellowed with age. Given what they had already done for me, surely each had an ongoing function to fulfill even as they slowly receded into obsolescence?

As if to answer my question, in the last few years a wave of programs and websites able to emulate and run Apple II software has appeared, and many of those nostalgic sparks were rekindled. Yet over 30 years later, after a restoration much like this one, I still find functions that only my beloved, venerable old Apple machine itself can serve. Yes, it is a relic of an older time – underpowered, and barely able to accomplish what modern computers achieve effortlessly – but it reflects concepts and practices that have since become hidden behind interfaces and icons. It is of its time, yet it remains capable of relevance in the modern world – through the use of emulation hardware.


Emulation hardware allows older computers to interface with modern ones – to send and receive information on the Internet, to access file systems and to function in a modern context. Obtaining a 'CFFA' card for my Apple II from a community of retrocomputing enthusiasts, I was delighted to see that my old Apple no longer had to rely on the floppy drives – mechanical elements of the system that (while they had been at the heart of the Apple II's contribution to personal computing) were among the most vulnerable to the degradations of time. Using the CFFA card, I was able to plug a USB thumb drive or CompactFlash card into my Apple II+ and have any of the disk images on the card function as if they had been inserted into the floppy drive as physical diskettes!

Using my modern PC, I can assemble 'disk images' (.DSK and other formats) on a removable drive and boot the Apple from those images automatically. And by connecting a cable between the Apple II+'s serial card's 'modem' port and my modern Mac's USB port, I can link the two machines and use the modern one as a 'bridge' of sorts, giving my Apple a window into the world of telnet and text-based Internet browsing.
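For the curious, the arithmetic behind those disk images is simple enough to sketch. A standard DOS-order .DSK image is just the 35 tracks × 16 sectors × 256 bytes of a 5.25″ diskette laid end to end; the helper names below are mine, a minimal sketch rather than any particular tool's API:

```python
# Layout of a DOS-order Apple II .DSK image (standard 5.25" geometry).
TRACKS = 35
SECTORS_PER_TRACK = 16
SECTOR_SIZE = 256
IMAGE_SIZE = TRACKS * SECTORS_PER_TRACK * SECTOR_SIZE  # 143,360 bytes ("140K")

# DOS 3.3 keeps its VTOC (volume table of contents) at track 17, sector 0.
VTOC_TRACK, VTOC_SECTOR = 17, 0

def sector_offset(track: int, sector: int) -> int:
    """Byte offset of logical (track, sector) inside a DOS-order image."""
    if not (0 <= track < TRACKS and 0 <= sector < SECTORS_PER_TRACK):
        raise ValueError("track/sector out of range")
    return (track * SECTORS_PER_TRACK + sector) * SECTOR_SIZE

def read_sector(image: bytes, track: int, sector: int) -> bytes:
    """Return one 256-byte sector from an in-memory image."""
    off = sector_offset(track, sector)
    return image[off:off + SECTOR_SIZE]
```

Emulators and transfer tools do essentially this arithmetic (plus DOS's logical-to-physical sector interleaving, omitted here) to pull files out of an image.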


This new relationship between the 'retro' and the 'nowtro' is not just about new hardware – it is also about the development of new software. Developers have begun to code new programs to run on the Apple II, bringing it new features and connectivity even while confronting the severe memory and storage limitations of these older platforms. This does more than (merely) increase the utility of these beloved, venerable computers – it is an exceptional learning exercise for programmers and technologists, who are often 'spoiled' by the comparative abundance of storage and processing capacity today. Working within 16K of RAM and the other restrictions of '70s-era computing forces one to explore clever and economical ways of writing code and handling user input – an entire tradition of computing that has largely gone dormant. More than point-and-click user interfaces or dazzling graphics, in that era such skills of economy and logic characterized the very best developers, those who could 'do more' within the personal computer than mere users (or even their fellow developers) thought possible. And in a fascinating flip, there are now many exciting examples of developers working on combinations of retro-computing elements in hybrid configurations that blur the line between emulation and native, hardware and software, modern and vintage.
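To give a tiny, concrete flavor of that economy: on a machine with a few kilobytes of RAM, a programmer might pack eight boolean flags into a single byte rather than spend a byte (or more) on each. A purely illustrative sketch, in Python rather than period BASIC or assembly:

```python
# Packing eight boolean flags into one byte -- the kind of economy that
# small-memory machines forced on their programmers.
def pack_flags(flags):
    """flags: a sequence of up to 8 booleans -> a single integer 0..255."""
    byte = 0
    for i, on in enumerate(flags):
        if on:
            byte |= 1 << i      # set bit i instead of storing a whole value
    return byte

def flag_is_set(byte, i):
    """Read flag i back out of the packed byte."""
    return bool((byte >> i) & 1)
```

On the Apple II the same trick was done with bitwise ORs and shifts in 6502 assembly; the principle, one bit per fact, is identical.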

My Apple II+ remains an active member of the modern computing community – besides merely being able to run a vast Internet trove of Apple II software on my original Apple, I can surf the web, chat on IRC, check my email and (next up) even tweet from this lovable old computer. It is a reassuringly and refreshingly single-tasking environment, unlike my vintage 1984 Macintosh – which ushered in a whole new world of graphical interfaces, symbolic thinking and multi-tasking. More on my restoration of that machine in the next installment.



Speaking of ‘Speaking in Code’… Part 2


As noon approached, lunch beckoned and the discussion became atomic, situational, one-on-one. Folks began following each other on Twitter, gathering their thoughts, etc., and just then I heard fellow historian Lincoln Mullen mention casually that as the users who comprised the '#codespeak' hashtag's 'network' on Twitter began following each other, the network itself actually became _less_ diverse, as what were once distinct branches of 'follows' were fused. An idea well worth unpacking, in the ether with so many others at this one-of-a-kind event. Perhaps good DH was, in a sense, all about mixing it up – just as we were doing here at the conference – keeping the approach fresh, diverse, and dynamic while surfacing the deep humanities questions we were asking.
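Lincoln's point about fusing branches can be made concrete with a toy follow-network: count the connected components before and after a single cross-cluster follow. A small sketch (the user names are invented), using only the Python standard library:

```python
from collections import defaultdict

def components(users, follows):
    """Connected components of an (undirected view of a) follow graph."""
    adj = defaultdict(set)
    for a, b in follows:
        adj[a].add(b)
        adj[b].add(a)
    seen, comps = set(), []
    for u in users:
        if u in seen:
            continue
        stack, comp = [u], set()      # depth-first walk from u
        while stack:
            v = stack.pop()
            if v in comp:
                continue
            comp.add(v)
            stack.extend(adj[v] - comp)
        seen |= comp
        comps.append(comp)
    return comps

users = ["ana", "ben", "cai", "dee"]
before = [("ana", "ben"), ("cai", "dee")]   # two distinct clusters
after = before + [("ben", "cai")]           # one cross-follow fuses them
```

Here `components(users, before)` yields two clusters and `components(users, after)` only one: a single new edge collapses what were distinct branches.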

After this intellectually athletic morning session I was ready for a burger with a few of my new friends, each one fascinating. After the discussion over lunch (which was refreshing to both mind and body) we ambled back happily to the Lab in Alderman Library and settled down to work – and play. Next up was classicist Hugh Cayless, whose own work on TEI and text organizational systems was close to my own (historical text analysis-besotted) heart. Hugh walked us through the many layers of meaning and function in text – and the complicated nature of managing these computationally. Organizing systems for managing and researching text must confront the daunting reality that (thanks, John Laudun) 'texts' themselves as an object of research can possess context, subtext, and paratext, can be related to one another through allusion and reference, and can serve any number of purposes, whether narrative and poetical, linguistic or material.

As Hugh plainly stated, “Texts are complicated and the tools are stupid.” True enough. And maybe some of us can help change that (indeed, many of those in attendance have already started).

Many humanists have text-based sources about which they (we) want to learn and understand a lot more – so a vibrant discussion ensued about what it means to research and work with (text and other) data in a DH framework, a central theme being the nature of 'mastery' and what it means for the research and pedagogical relationship. In many developer communities, walls of expertise are erected around those with core knowledge – often the very people best equipped to provide the kind of education in a software tool or code base that a DH scholar might need. The 'newbie' has enough trouble just trying to learn to swim in a fast-moving code repository, and having a 'mentor' or 'master' can be instrumental in providing that 'tacit' knowledge.

The last seminar of this whirlwind day was led by Mia Ridge, whose own work and presentation on managing ‘messiness’ in data was incredibly apropos to some of my own work. I was fascinated by the demonstration of her approaches to working with cultural heritage metadata and excited to learn more from the group.

The resultant conversation about decision-making in data management – specifically regarding the uniformity (or incongruity) of data, including error, 'edge cases' and ambiguous or contradictory information – had me at once both rapt and raring to go. A meta-theme was, essentially, how to manage the balance between crafting one's data to fit the tools used and, more importantly, crafting the tools to best work with the data. We talked about some of the error scenarios and strategies many of us have faced in text analysis of library archives (including OCR typos, fuzzy dates, metadata errors and the choices behind when to use 'stopword lists' and 'stemming'). The code and data we assemble are certainly a discussion between technologists and academics, as Bill Turkel mentioned, but it is even more true that, as Jean Bauer commented, the data is as much an argument itself as it is a subject of argument. Even more fascinating was the conversation about 'visualizing absence', a blog post about which was quickly disseminated via Twitter as the post-conference conversation continued. From Mia's seminar (as with the others), I came away with a trove of conceptual and practical 'nuggets' I have only just begun to unpack and examine in greater detail.
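For readers unfamiliar with the jargon, 'stopword lists' and 'stemming' are easy to sketch. The word list and suffix rules below are illustrative placeholders only – a real project would use a curated stopword list and a proper stemmer such as Porter's:

```python
import re

# Illustrative stopword list and suffix rules -- placeholders, not any library's.
STOPWORDS = {"the", "a", "an", "and", "of", "to", "in", "is", "it"}
SUFFIXES = ("ing", "ed", "es", "s")   # crude stemmer, not Porter's algorithm

def stem(word):
    """Strip the first matching suffix, keeping at least a 3-letter stem."""
    for suf in SUFFIXES:
        if word.endswith(suf) and len(word) - len(suf) >= 3:
            return word[:-len(suf)]
    return word

def tokens(text):
    """Lowercase, drop stopwords, stem what remains."""
    words = re.findall(r"[a-z]+", text.lower())
    return [stem(w) for w in words if w not in STOPWORDS]
```

The interesting decisions lie exactly where the seminar discussion did: which words to drop, how aggressively to stem, and what to do with OCR typos that no suffix rule will rescue.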

Last was a building exercise – we separated into groups and defined a project based on the discussion we'd had and the themes, platforms, approaches and goals that had most interested us. I sat down with a few of my colleagues and we started hatching our plan. The topic of engaging people (audiences and scholars) from 'marginalized' communities had been a fascinating and recurring theme (i.e., the varying technological and academic knowledge successfully made available to – and the varying participation of – LGBT individuals, those of other ethnic and class backgrounds, the disabled, and others). So we set about exploring (in addition to the many ideas I detailed above, and the many more that I have not) possible solutions to those barriers. Whatever obstacles those who are 'marginalized' may face, it is clear there are – in DH as elsewhere – many novel means to overcome and upend them. Given its ongoing revolutionary aspect, Digital Humanities promises to remain a progressive and dynamic locus of study for scholars of all fields and backgrounds. I am excited to be a part of the field and of the community that has emerged from the Lab.

PS: During dinner that night, we had a poetry slam and my limerick won!

“The secret to DH as racket
Is to download some source code
and hack it
Do whatever it takes
to make sure nothing breaks…
…and never forget that [close bracket]!”

I was given a lovely 3d-printed bracelet courtesy of Bethany, Jeremy and the Scholars’ Lab MakerBot! Awesome ending to the conference.


Speaking of ‘Speaking in Code’… Part 1


Earlier this month I had the opportunity to participate in the Speaking in Code conference at the Scholars’ Lab at the University of Virginia in Charlottesville, a summit on the use of technology alongside humanities research run by the Lab’s director Bethany Nowviskie and her unbelievably talented, incredibly helpful and congenial team.

The conference was a good 7-hour drive from NYC, but it was the vivid height of autumn and the gorgeous Virginia countryside helped clear my mind. I was excited to see my friends among the attendees and meet other attendees for the first time, some well-known 'visionaries' in the field, and others talented practitioners across the DH (Digital Humanities) world whose work I did not know. For many of the attendees, I too would fall into the latter category. The conference was slated to be a singular experience, a handful of diverse DH practitioners in an environment expressly defined as one of professional and personal respect for diversity. A rich and varied community of coders, each exploring the issues and opportunities presented by software in DH scholarship.

This was new.

Inspired by the desire to surface deep issues in DH into discourse, the first 'question' of the conference was a central one – the seeming disconnect between 'tacit' and 'explicit' knowledge. I understood this to be like the experiential lessons learned when handed a lump of play-doh as a child, versus having its elasticity, strength, etc. described and demonstrated in vocabulary and numbers. A similar disconnect can be seen between technology knowledge (that of computer scientists and/or coders) and scholarship knowledge (that of humanists) across a wide range of DH activity, from pedagogy and scholarly communication to collaboration, discourse and 'building'. Historian/Maker Bill Turkel presented on his work, detailing many ways this leap is possible – interfacing with the (literally) tangible through the use of robotics, circuitry, new software and interfaces, and even 3d 'made' objects. The logical and software skills that these physical computing approaches foster helped initiate a vigorous discussion.

Play, it was clear, was central. Popping the hood, getting in there and making art out of spaghetti. DH education could benefit greatly from an effective adaptation to the fact that in some cases the discourse, learning and collaborating that underlie the scholarship/software relation could use a little less university and a little more kindergarten.

At its core, however large or small the group and whatever process and tools they employ, DH also requires the application of novel ways of thinking and new 'processing' software (or hardware) to sometimes-familiar source data. Humanists often do not even think of their sources as data, but in fact nearly every imaginable body of material on which research can be done can be seen as data – ripe for a panoply of potential DH analyses (sociological, historical, anthropological, archaeological, literary or otherwise). That having been said, the concepts and consequences of DH are likewise dazzlingly complicated and ever-dynamic, at any moment as varied as the teams and practitioners themselves. The manifold considerations that shape DH compel all manner of solo, pair and group collaboration.

We who do DH are working towards and preparing for this goal of knowledge exchange and collaboration. The question arose: given that many computer science degree requirements include some form of humanities coursework, what about the reverse? Should effective, 'tacit'-knowledge-oriented computer science classes be part of a standard educational curriculum? In addition to courses, shouldn't more humanities departments permit the use of a programming language to fulfill a DH practitioner's language requirements? Coding helps instill sound habits of logical thinking and problem-solving in the 'coder', even as computers and the production of software fade into the background. It helps students to understand that which is increasingly hidden, for the benefit of the 'user', behind hand-held devices with slick, simple, tactile interfaces.

This is not to say all slick, simple, tactile interfaces are bad – far from it! Stefan Sinclair led the next presentation and set the stage for our ongoing discussion, showing us some engaging and intriguing 'procedural' interface design. He also led us in an exercise of team design, in which we learned that the unexplained but implicit forms of 'tacit' knowledge in this new culture of learning are not physically embedded (indeed, they are occluded) in a discipline. We do learn by doing, but not in so formulaic a pattern – the formulas and 'muscle memory' of design and coding are stepping stones to the ideas that drive us. Great design accomplishes these goals, as evidenced in a number of Stefan's projects to date.

More in Part 2.


Interface to Face: Touching with Technology

I have always been a musician. From banging pots and pans on the kitchen floor as a toddler to mauling my first guitar at age 9, making music has been one of my primary creative outlets. While music truly has a timeless and primordial quality, perhaps even more than language itself, it is also inherently technological.

Even during my increasingly focused academic career (I am a diplomatic historian), I have simultaneously pursued my musical career as both a pastime and as a profession. Most recently, I scored a film called “Elliot King is Third” with my partner Barb Morrison – herself a full-time professional composer and record producer. The film, which premiered at the Tribeca Film Festival this past Sunday, is set ten to twenty years in the future – utilizing the power of science fiction to address the emotional and controversial issue of transgender rights.

You can view the entire 20-minute film online:

Like any successful artistic undertaking, the elements of the piece (in this case the plot, dialogue, acting, direction, cinematography and score) all combine into a deeply evocative whole, one which connects on a fundamentally emotional level with the viewer. As an assistant composer and score engineer on the film, my focus was not only on the music but on the use of technology to compose and engineer the score. However, my job would not be a success were the viewer of the film to notice the technology per se… my goal was not to impress with technology but to use it invisibly in order to convey emotion. In film scoring, technology is a tool to be wielded to greater purpose – communicating ideas about the story's characters, evoking the moods of the scenery, and facilitating the viewer's emotional journey through the plot. If technology is to be used successfully it must remain in the background, an indispensable but invisible mechanism for communication.

The same general principle applies to all forms of art, whatever the medium or mode – including (I believe) good academic research and writing. Whether one uses word processors like Microsoft Word, citation managers like Zotero, presentation software like PowerPoint, visualization tools like Gephi and d3.js, etc. the technology is not the centerpiece – it is there to facilitate understanding and provide a scaffold for the story one intends to tell. Indeed, even as Digital Humanities races into more and more academic disciplines with the youthful vigor of its proponents, it remains the research, the narrative, the –story–, that must be told, and told well.

As special effects and technology have raced ahead, how many films have flopped, all ‘eye candy’ and no emotion? Likewise, how many papers, so loaded down with technology that the thought process of the writer is eclipsed (or even ignored), will suffer a similar fate as Digital Humanities further penetrates academia? As DH evolves and matures, the impressiveness of visualizations, sophistication of methodology and other technologically-enhanced facets of research will increasingly assume a supportive role, just like the samplers and sequencers used to facilitate the director’s vision of a film through its score. Such tools will be similarly indispensable, facilitating the academic’s efforts, but it will remain the scholar’s greatest achievement to take the reader on a journey, to communicate effectively with both confidence and nuance, and to deliver a satisfying experience that – while perhaps not primarily emotional – evokes a more profound understanding.

Posted in CUNY GC Digital Fellows | Leave a comment

Downtime is the Best Cure for Downtime


During nearly every software project or coding effort I have ever undertaken there has been at least one moment of 'downtime' – an interminable period during which the machine acts in what appears to be complete defiance of the behavior desired. No matter what I tried in order to accomplish my goal, I would be met only with a maddening error message. As my conscious mind ran through a host of possible causes and solutions, I felt as if the deus ex machina that had so often inspired me had been replaced by a diabolus ex machina, cackling invisibly behind the beveled interface, its cursor aptly named.

Getting the error message to change – somehow – would often be my first sign of salvation… after all, if I could identify the cause of the error I could potentially solve it. But it often seemed nothing would work to even change the error at all in a productive way. The descent into madness (or at least, frustrated anger) seemed to accelerate. Depending on the severity of the situation and the urgency with which I needed to accomplish the original goal, I might employ a wide array of both secular and spiritual solutions to somehow lift the curse. These would often begin with a quiet moment in which I would try to think through the problem logically, and sometimes that worked – but not often. An appeal for divine intervention via Google search might serve to validate my quandary… surely I was not alone, and someone else had encountered the same error message or condition?

My despair, if not lifted, could only deepen. If a ready answer had not presented itself by this time, frustration would likely be mounting. As emotion and cold reasoning fought for floor space in my brain, print and/or online manuals were soon consulted amidst alternating deep breathing and exasperated sighs. Even a desperate ‘laying on of hands’ on the machine itself would be attempted, sometimes providing the necessary tribute, but just as often being of no avail. Here at the precipice – this dreaded point of hopelessness when an earnest IT staff member would kindly (but blindly) recommend thrice rebooting – is where one finds the true measure of their character. Most often, looking inward would reveal the source of the perceived error, the fault lying not within the program but within the programmer. Ninety-nine times out of one hundred, it is not the machine whose behavior is undesirable to the programmer but the programmer whose keystrokes are undesirable to the machine.

Getting some distance and giving my mind some downtime (whether through a walk around the block, some quiet moments with a loved one, or a blessed nap) would often be my surest way to gain the altitude necessary to overcome the procedural hurdle confounding my logical mind. Certainly, machines fail, unforeseen compatibility errors present themselves, corruptions occur, and hard drives crash miserably. However, these events are rare in comparison to the self-made brick walls we all erect as we craft software. In these moments of greatest frustration, a programmer often makes the greatest cognitive leaps by acknowledging that the machine conditions the programmer's behavior (for good or ill) as much as the programmer creates conditions for the machine.

Isn’t it telling that refreshing one’s mind by distraction, fun, rest and companionship is often the best route towards better problem-solving and higher quality work? Just as with any athlete, diligent work and training is essential, but occasionally it is the intentional avoidance of work and training that helps us achieve the best results.

As often as not, downtime is the best cure for downtime.

Posted in CUNY GC Digital Fellows | Leave a comment

The Lost Art of Spaghetti


When I was five years old I took apart an old rotary telephone. Actually I didn't just take it apart – I dismantled it entirely. My youthful fingers, newly skilled with pliers and screwdrivers swiped from my father's toolbox, utterly demolished it. I was unstoppable. Every curly, colored wire was traced and pulled from its anchor. Every piece of metal was unscrewed from every other, from the easiest outermost elements to the trickiest embedded ones. My mind consumed the logic of assembly first through the destructive glee of disassembly, fueled by a native and innate desire to experiment. The phone was no longer a phone – it was a wild, chaotic inventory of spaghetti, electronic parts that, once disassembled, only tangentially seemed to suggest telephony. A world of possibilities lay before me – while I knew I could never reassemble them into a working phone, perhaps I might build my own robot…

A few years later these mechanical and analytic tendencies were already well-developed and put to good use restoring the ebbing functionality of our sometimes-flaky stereo, television and Atari video game unit. But I had never successfully built that robot – the parts were eventually discarded as my mother was forced to buy another rotary phone from the phone company. Despite the vigor of my efforts, they were occasionally ill-directed – understandably, my mother is unlikely ever to forget the time she came into my infant sister's bedroom to find me hovering over her crib with a screwdriver, promising to use my mechanical skills to fix her slight case of lazy-eye. Thankfully, she intervened before my attempts at repair further damaged the object in question (my sister), but that was not always the case. My ongoing technological experimentation led to ever more spaghetti.

As I grew, computer technology grew around me. At age 8, the Apple II arrived in my life. Using the computer required a degree of mechanical skill and I was eagerly up to the challenge, soldering iron at the ready for any number of switches, relays or other elements needed to enable or disable some component or functionality. Occasionally triumphant and occasionally dashed by irretrievable errors of incompetence, I was still unstoppable in my effort to understand what was happening ‘inside the box.’ As the Apple II gave way to the Apple Macintosh, my confidence with the technology inside the device quickly extended to the exciting new field of software, enabled by the hypnotizing Graphical User Interface. It was there, in that world of pixels and movement, that the experimental fervor I had first experienced as a very young child found the greatest purchase. In software, one could copy code and modify the copy, leaving the original pristine – one could tweak and destroy and revert and destroy again. We were free to destroy, in order to learn to create. Reverse-engineering became productive as one could strip away all else but the element to be understood, and examine it in detail to better apprehend its syntax, structure, format and function.

In the twenty-first century, hardware has been sufficiently micro-miniaturized to dissuade all but the most advanced technologists from attempting physical repair and engineering tasks – the voiding of the warranty an unassailable barrier to such invasive play. It is now almost exclusively in software that the end user's creativity takes place. And while the sophistication of software has grown tremendously, allowing great strides in learning, the lack of connection to the box, the obscuring of the machine, alienates us and decontextualizes us from the nature of the technology with which we are grappling. It makes it difficult to differentiate between waves of innovation, coming ever faster, and leaves us breathless. Can that same glee of destruction apply when we are dealing not with actual (limited) elements, but virtual (unlimited) instantiations? What does it mean to the development of one's ability to grapple with chaos when the spaghetti is symbolic, easily reverted back to order?

Posted in CUNY GC Digital Fellows | Leave a comment