[iDC] Interview with Warren Sack on New-Media Art Education

Trebor Scholz trebor at thing.net
Mon Jan 24 11:49:12 EST 2005


Interview with Warren Sack on New-Media Art Education
by Trebor Scholz

TS: In a recent interview members of kuda (new media center, Novi Sad)
addressed the lack of non-proprietary software in the corporate world.
Nevertheless, kuda strongly opts for open source / free software in
education, arguing:

"The cadre of designers and programmers that relies on proprietary software
to find a job, is no different than the Fordist proletarian subject but
without proletarian consciousness. We can link the ideas around software to
Marx's notions of the necessity for the proletariat to own the tools it uses.
As of now, software and hardware tools are not in our hands."

There are examples of universities in the U.S. that are in the process of
entirely switching to open source software. How do you see possibilities for
open source in an American academic context?

WS: As implied by Kuda, this is both a question of consciousness-raising
and of functionality.  There are specific marketing and litigation
strategies of disinformation that are actively undermining the necessary
consciousness raising. These strategies of disinformation are similar to the
ones big media and big industry have been using for at least a century: they
are strategies of "seamlessness."  By this I mean that powerful interests
want you, the consumer and citizen, to ignore the seams that articulate the
parts of computers and networks together. A perfect example of this, right
now (December 2004), is AOL's current marketing campaign.  AOL assures us,
in television ads, that they can create "a better Internet."  This is
willful obfuscation.  The Internet -- as a net of nets -- is, by definition,
outside of the control of a single entity: AOL can't change the Internet
even if it wants to. But, what AOL wants people to believe is that AOL is
the Internet. And, from personal experience, I can tell you that many lay
people think this is the case.  When, for example, I've demonstrated to
novice users who have AOL accounts that they can "see the Internet" from a
standard browser that is not the AOL technology, they have been rather
shocked.  To them it is seamless: there is no difference between AOL and the
Internet.  This serves AOL's interests because people are then led to
believe that there are no other alternatives. Another good example of this
was Microsoft's legal claim of a few years ago that its Windows
operating system and the Internet Explorer web browser were inseparable:
that one could not be shipped without the other.  (Or, Microsoft's current
run-in with the EC courts contending that its Windows Media Player is
integral to the Windows operating system.)  This turned out to be
technically trivial to prove false -- the application and the operating
system can be separated -- but the U.S. Justice Department must have spent a
pretty penny to convince the judge in charge of the case.  So, my point is
this: to propose open source as an alternative within any given work context
requires some amount of consciousness raising that is being actively worked
against by large concerns that would like the public to believe -- not just
that their products are "better" -- but that no alternatives exist. But,
then there is also the issue of functionality: open source software is
frequently designed and implemented by experts who have little or no insight
into what non-programmers might need or want. Setting up and maintaining a
Linux server, installing an open source database system like MySQL, using
open source alternatives to commercial software (e.g., OpenOffice), and so on
can be a hassle even for those of us who are experts.  In fact, I do not have
anything against non-open source software by companies that build solid
tools and do not engage in disinformation campaigns. Unfortunately, it is
usually the companies engaged in disinformation that also build lousy
software.  There is a crafty business rationale for doing this, for making
your customers your alpha testers: the company saves on quality control
personnel and also gets customers to check in with them frequently.
"Staying in touch" with your customers by having them check in with you
every week to patch the lousy software is unethical, but effective for
fostering a relation of dependence. Any strategy to adopt open source
software should take into account the fact that some commercial software is
a nice complement to open source software.  For example, working with Apple,
Macromedia and Adobe software is usually a pleasure: they write solid,
easy-to-use software that doesn't need to be patched every second day.
These are good complements because (1) they do something better than the
open source equivalents.  For example, one could use GIMP to edit digital
photos, but GIMP is ultimately a good but imperfect attempt to mimic Adobe
Photoshop.

(2) Such software comes from companies that build on top of open source
software, work in coalitions to establish common, non-proprietary standards,
and work hard to provide alternatives -- rather than fighting for
absolute dominance and the elimination of alternatives. One must also keep
in mind that open source is not anti-corporate. When Richard Stallman's
notion of free software gained wider interest, the corporation-friendly
principles and moniker of "open source" were established to differentiate
that movement from Stallman's more radical idea of "free software."  IBM
and other large
companies are now heavily invested in, develop and critically depend upon
open source software. So, my answer is yes, universities have a lot to gain
by moving some of their business to open source software.  But, I don't
think there are good open source alternatives for all categories of
software. Conversely, it is good to remember that for several crucial
categories of software there are no real commercial alternatives to the
open source implementations, categories that are the foundations, the very
"backbone" of the software layers of network technologies (e.g., DNS-BIND,
OpenSSL, sendmail, and, arguably, the Apache web server). So, the commercial
vs. open source
distinction is a false dichotomy, and the more important criterion to
remember when one does choose to work with commercial software is to ask
whether or not the company producing the software is an ethical company. An
"ethical company" might be an oxymoron in a conventional Marxist's lexicon,
but I think this is a crucial problematic to address if one hopes to
understand our current circumstances of post-industrialization.

TS: How does your writing of media philosophy enter into your teaching?
Which books or essays do you find most helpful in your teaching?

WS: I believe it's important to understand that technologies incorporate
frozen -- i.e., reified -- social, economic and political relations.  For
example, if you have DSL in your home, you almost certainly have more
bandwidth coming into your house than you have going out of your house.  In
other words, structured into the network wiring is the assumption that you
are a consumer, not a producer of information because the engineering has
been done to make it easier for you to download information from the
Internet rather than to upload information.  Information technologies
contain many forms of catachresis (frozen metaphor) that more often than not
started life as quirky philosophy projects and are now "frozen", but working
as silicon and gold components.  For example, the 19th-century philosopher
George Boole had a project (An Investigation of the Laws of Thought) to
deduce truths algebraically, and that project is now literally printed into
the very foundations of computers: we know these foundations in contemporary
technology as "Boolean circuits."  I try to teach my students that each of
these frozen decisions could in fact be undone and replaced with something
else. What would result might be an entirely different technology. This sort
of investigation/thought experiment is also the basis for my own research
and scholarship: I am interested in challenging and finding alternatives to
the foundations of computer science and network architectures by locating
the presuppositions built into contemporary, new media technologies.  An
example of this kind of work is the "Translation Map" that Sawad Brooks and
I did (translationmap.walkerart.org) in which we re-read the founding essay
of the field of machine translation, a text written by Warren Weaver in
1949.  Weaver proposes to understand translation as a problem of coding and
decoding.  We show the absurdity of Weaver's proposal -- and the 50 years of
work in machine translation that has been done based on Weaver's proposal --
and we illustrate a possible alternative by prototyping a network technology
for collaborative editing in which translation is understood to be a form of
collaborative work between people, rather than as a de/coding problem to be
handled exclusively by a machine.  To impart this perspective to my
students, I like to have them read original documents from the history of
technology (e.g., the texts included in Noah Wardrip-Fruin and Nick
Montfort's "New Media Reader" (MIT Press)) and also to read work from science
studies and critical theory that describes technologies as assemblages of
socio-technical relations.  Bruno Latour's book "Science in Action" is one
of the texts students in my "Introduction to Digital Media" course are asked
to read.
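
A small numerical sketch of the bandwidth asymmetry described above. The
line speeds below are assumed, illustrative consumer ADSL figures of the
era, not measurements of any particular service; the point is only how much
the wiring favors receiving over sending.

    # Illustrative only: assumed consumer ADSL line speeds (circa 2004).
    DOWNSTREAM_KBPS = 1536   # bandwidth coming into the house, kilobits/second
    UPSTREAM_KBPS = 256      # bandwidth going out of the house, kilobits/second

    def transfer_seconds(megabytes, kilobits_per_second):
        """Seconds needed to move a file of the given size over the link."""
        kilobits = megabytes * 8 * 1024   # megabytes -> kilobits
        return kilobits / kilobits_per_second

    video_mb = 100  # a hypothetical 100 MB video file
    download_min = transfer_seconds(video_mb, DOWNSTREAM_KBPS) / 60
    upload_min = transfer_seconds(video_mb, UPSTREAM_KBPS) / 60

    print(f"download: {download_min:.1f} minutes")   # roughly 9 minutes
    print(f"upload:   {upload_min:.1f} minutes")     # roughly 53 minutes

Same file, same wire: the six-to-one ratio is the frozen assumption that the
household consumes information rather than produces it.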
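
And a sketch of the Boole example: the fragment below, in Python rather
than silicon, composes Boole's algebra of truth values into a half adder,
one of the basic circuits from which binary arithmetic is built. The
function names and the choice of circuit are mine, for illustration only.

    # Boole's operations on truth values, written as functions rather than gates.
    def AND(a, b): return a & b
    def OR(a, b):  return a | b
    def NOT(a):    return 1 - a
    def XOR(a, b): return OR(AND(a, NOT(b)), AND(NOT(a), b))

    # A half adder: two one-bit inputs yield a sum bit and a carry bit.
    # Chained together, circuits of this kind do the binary arithmetic
    # inside every contemporary computer.
    def half_adder(a, b):
        return XOR(a, b), AND(a, b)   # (sum, carry)

    for a in (0, 1):
        for b in (0, 1):
            s, c = half_adder(a, b)
            print(f"{a} + {b} -> sum={s}, carry={c}")

Nothing in the algebra forces this particular freezing; as the answer above
suggests, different foundations would yield a different machine.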

TS: In a recent interview Ralf Homann, faculty at Bauhaus University, told
me that Walter Gropius demanded an educational practice in the arts that
focused students on economics from very early on -- Gropius thought of the
artist as a polished, perfected craftsman. He claimed that academies
separate art from life, from the "industry." Today, there is no such thing
as "the industry" for which students could be prepared. It's not like in
other areas where a predictable skill set secures a job. In new media the
skill sets are drastically changing and what was justifiable and useful
yesterday may be irrelevant and dated tomorrow. How do you address this
dilemma?

WS:  On the one hand I disagree: I think there are very specific "craft"
skills that are relatively stable and that can be taught to students of
digital media.  For example, programming is a general skill that is
essential to the construction of all digital media. Even if one does not
know a particular programming language, if one knows how to program it is
really not a big challenge to learn another language. On the other hand, I
agree: there is no one industry for which students are being prepared.
Digital media today is like what writing was to Plato's Athens: it is a
"solvent" being incorporated everywhere, and it threatens to dissolve and
rearrange disciplinary boundaries as well as industry differences.  Every
department in the university must today wrangle with the questions of new
media. Some of the oldest departments, e.g., departments of classics, have
been the most innovative in addressing the possibilities and problems of new
media. A lot of what computers and networks do in industry and government is
to automate processes that had previously been done by hand: forms of
production, like bureaucratic procedures, are being automated. Bureaucracy --
which means literally "rule by the bureau, or the office" -- is being
replaced by "computercracy" -- rule by computational methods.  Larry Lessig
and other legal scholars have been very articulate in pointing out the legal
ramifications of this kind of transformation.  But, if people don't think
too deeply, computercracy ends up looking a lot like bureaucracy. For
instance, the so-called "desktop metaphor" that structures the interface
most of us use when we operate a computer, is a relatively direct borrowing
from the technology of the office -- files, folders, trashcans, desks, etc.
So, the crucial challenge is to teach fundamentals -- which may in fact be
"crafts" -- so that graduates can rethink computerization wherever they
find themselves.
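
A minimal sketch of what "computercracy" can look like at the level of
code: a hypothetical office rule, the kind a clerk once applied by hand,
rewritten as executable logic. The rule, the thresholds, and the field
names are invented for illustration.

    # A hypothetical reimbursement rule: a bureaucratic procedure frozen
    # into a fixed computational one.
    def approve_reimbursement(form):
        if not form.get("receipt_attached"):
            return "rejected: no receipt"
        if form.get("amount", 0) > 500:
            return "escalated: needs a manager's signature"
        if form.get("category") not in ("travel", "supplies", "software"):
            return "rejected: unrecognized category"
        return "approved"

    print(approve_reimbursement(
        {"receipt_attached": True, "amount": 120, "category": "software"}))
    # -> approved
    print(approve_reimbursement(
        {"receipt_attached": True, "amount": 900, "category": "travel"}))
    # -> escalated: needs a manager's signature

Once the rule is code, any exception a human bureau might have entertained
has to be programmed in explicitly, which is close to the point Lessig and
other legal scholars make about code functioning as regulation.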

about Warren Sack
http://hybrid.ucsc.edu/SocialComputingLab/




