Tuesday, December 2, 2008

All Networks are Created Equal?




I want to use this space to elaborate on a point I made during our discussion of Inventing the Internet. Not only did military funding (i.e. DARPA) contribute to the creation and development of the Internet, it continues to play a central role in its advancement (e.g. Carnegie Mellon's $66 million of DoD funding in 2007). The "networked information economy" that Benkler describes applies not only to the media, or the academy, or social interaction, but to warfare, and it has for a long time. RAND researchers John Arquilla and David F. Ronfeldt coined the buzzword "netwar" to describe the new type of modern warfare. No longer do we see guerrillas against central states; we see networked guerrillas versus networked states. In one of the innumerable scary passages from their 1996 project The Advent of Netwar they write, "Netwar is blurring the line between peace and war, offense and defense, and combatant and non-combatant. As a result, the United States will face a new generation of nettlesome challenges that, in our view, will require new doctrines and strategies to combat them" (2). Well, yeah. Good call, RAND.

The point here is that "the wealth of networks" that Benkler argues for is not only the "wealth of democracy" but the war-all-the-time ideology of something we might define as the Network-Industrial-Complex. In fact, the democracy Benkler calls for may be part of the need to implant democracy in intransigent nation-states, or even the need to continue manufacturing nation-states at all. Arquilla and Ronfeldt characterize netwar as: "about Hamas more than the PLO, Mexico's Zapatistas more than Cuba's Fidelistas, the Christian Identity Movement more than the Ku Klux Klan, the Asian Triads more than the Sicilian Mafia, and Chicago's Gangsta disciples more than the Al Capone gang" (5). They don't mention Al Qaida, but one cannot help but read it into this list. But the scariest thing about their proposition is that it describes not just "sleeper cells" and "rogue agents" but western democracies as well. We can create our own list: EU more than France, League of Democracies more than UN, Coalition more than US, etc. Networks are not inherently resistant any more than they are inherently imprisoning. The scary reality is that they are hegemonic.

Read more about the dystopian future for free at RAND.org! Get RANDY! http://www.rand.org/pubs/monograph_reports/MR789/index.html

When I was a girl, we sent our emails on long sheets, made from trees - calld it "paper"!

While I had my issues with certain aspects of Benkler's argument (frequent use of vagaries like "anyone can publish" or "attractive public sphere"), it nevertheless made me nostalgic (in advance) for disappearing forms of mass media.

In sections 13 and 14, he describes blogs and wikis as "weighted" - meaning they vary in the degree of moderation exercised by the person in charge. I wished at this point that I could remind Benkler that the idea of a moderated forum for comments on a media source is not entirely new. Letters to the editor, in fact, offered a similar model, although as Benkler points out later, the criteria for endowment with this kind of editorial power are different on the internet. In a sense, though, I think our perception of newspapers as invariably heavily moderated is skewed by a general scholarly overemphasis on metropolitan, bourgeois media. Small town and working class media emphasize different degrees of control and offer an outlet for different kinds of commentary, for a variety of small or large audiences, much like blogs and wikis. Granted, you didn't have the totally unmoderated end of the spectrum in, say, newspapers, but that's pretty rare on blogs and wikis anyway. All I'm really trying to say is that this kind of claim often perpetuates the myth that modes of communication on the internet are wildly different from their predecessors, and we might do well to pay attention to residual practices once in a while too.

This led me to wonder how long the residual form will stick around - who writes letters to newspaper editors now? Newspapers and letter-writing alike seem to me like nearly extinct phenomena. But I'm not fully convinced that the technology was the driving force behind this shift. It seems that generations X and Y had been accused of carrying an overwhelming sense of apathy even before we lost interest in writing angry letters. Perhaps the technology arose out of our desire for instant gratification, as well as our desire to dispel these suspicions of apathy without having to do a lot of work.

Monday, December 1, 2008

the revolution will not be televised, but it might be blogged

In his valiant attempt to gauge the democratizing effects of the Internet, Yochai Benkler places a lot of emphasis on user-created and regulated content. Indeed, this is a hot topic in the Internet world, a phenomenon commonly known as "Web 2.0." Personally, and I think Benkler would agree, I find the whole "Web 2.0" concept, inaugurated in 2004 at O'Reilly Media's eponymous convention, a little too optimistic. It is certainly not the "revolution" that founder Tim O'Reilly claims it is. Rather, "Web 2.0" is just a fancy name for recent trends in Internet usage, in which collaborative work, open source material, and user control are common goals. So I would like to take this opportunity to use one of the "Web 2.0" crown jewels, the blog, to examine the blog. Initiate meta-blog.

In my mind, this is the big difference between the blog as a source of information and the mass media as a source of information. A blog can be boring and irrelevant to 99% of potential readers -- meaning anyone with an Internet connection -- yet continue to exist. It is important as a medium because it allows into broader circulation (though not necessarily demographically) a whole set of discourses that would never be considered "air worthy" in the traditional mass media sense. Television, radio, newspaper and magazine companies only disseminate information that they consider to be worth the capital necessary to do so. As such, they generally feel an obligation to air or publish things that will yield returns. Even entities like the Washington Post, known for printing daring and revelatory material, stake their claim for validity on the fact that they produce hard-hitting stories. If a subject is not particularly arresting, mass media entities are often obliged to "punch it up" a bit. I'm thinking here of the weeks upon weeks of repackaged Natalee Holloway coverage on FOXNews, long after all the other networks had moved on. The issue I am talking around here is sensationalism, and though it may not always take a form as obvious as purple prose or yellow journalism, it is always, to some degree, present in the mass media.

Blogs need not be sensational and most are not hard-hitting. They are valuable because they allow into circulation a whole set of ideas that would never be considered worthy of airwaves or newsprint. Blogger.com does not depend on this post to receive a certain amount of hits, create advertising revenue, and pay for itself (at least partially) the way other media do. So I argue, blogs are not innovative merely because they can be posted by anyone and read by anyone. The forte of the blog is its ability to reach a targeted number of readers, while at the same time being generally ignored by everyone else. A blog need not be interesting to the public at large (I dare say ours is probably not). Thus freed from the obligation to be entertaining or arresting to a wide set of readers, a blog (like ours) can actually get down to the business of addressing issues.

Benkler's first two case studies are exemplary. No mass media outlet would dare publish thousands of corporate emails. Why? The answers, ordered from most to least relevant in terms of media concern: (1) it would not sell; (2) it's illegal. Until they had been made sufficiently salient issues through the investigative efforts of concerned citizens, no media outlet would touch the Sinclair and Diebold scandals. However, those small groups of concerned citizens who coalesced into full-fledged movements would never have had access to the information had it not been made available online, where it passed unnoticed by the public-at-large.

In sum, I'm happy that most people will not read this blog, nor care to. It is a forum for our group to discuss issues that we find relevant to our course, our readings, and our papers, without having to strike a hard-hitting or sensational pose. If some net stumbler happens across it and finds it relevant and intriguing, great. But, if not, our little blog is still a success if only for this reason: I would rather speak thoughtfully and productively in a small group than recklessly shout to hold the attention of a large one.

The internet's not all that


Yochai Benkler has a good strategy in this chapter. He addresses the criticisms that are most frequently levied at the claim that the internet democratizes by suggesting that we compare our contemporary networked public sphere with the previous, mass-media dominated one, not some idyllic world in which everyone can speak and everyone will be heard. Benkler argues, “There has never been a complex, modern democracy in which everyone could speak and be heard by everyone else. The correct baseline is the one-way structure of the commercial mass media” (¶ 61). This move allows Benkler to assert that the networked public sphere does democratize; Benkler takes a position best characterized as “It’s not perfect, but it’s better than what we had.” I like this claim and even agree with it, but I want to look at it in closer detail. I’m just not sure about the dichotomy of “networked public sphere” vs. “mass media.”

Buttressing Benkler’s “better than what we had” claim is his suggestion that more people have the ability to produce and publish in the networked public sphere than in the old, mass-mediated model: “Computer literacy and skills, while far from universal, are much more widely distributed than the skills and instruments of mass-media production” (¶ 42). I think this is an important point. Certainly it’s much easier for average people to voice their opinions online than to get them in television, radio, or a newspaper. But, as Benkler acknowledges, the ability to voice your opinions online is “far from universal.” This whole accessibility issue is a big concern of mine, and it’s related to my questioning of the dichotomy.

Benkler tries to be fair here, but he’s too hasty. The "networked public sphere" leaves many people out: those who can’t afford computer technology or internet access and those who aren’t computer savvy. In addition, one problem I have with Benkler’s approach is that he’s so focused on people creating information that he forgets about people receiving it—the community of readers. Anybody with grade-school reading skills and thirty-five cents can read a newspaper. Even less, perhaps, is necessary to watch television or listen to the radio. The community for these media seems large and heterogeneous. The networked community, on the other hand, might be smaller and more exclusive. Some people don’t access the internet at all. Some are only casual users. It just seems inevitable that, at least for now, the networked community is not as large and inclusive as what we typically imagine as the public sphere at large. This is part of why I question Benkler’s suggestion that the “networked public sphere” is an altogether distinct—and superior—public sphere in itself.

I want to respond to Benkler’s “better than what we had” reorientation with a reorientation of my own. Benkler always assumes that the internet has its own public sphere—what he calls the networked public sphere—but it seems to me that instead the internet is simply another medium alongside radio, television, newspapers, and so on. Why would its community be a necessarily distinct “public sphere”? Benkler’s dichotomy of new, networked public sphere vs. old, mass-media model suggests that the internet isn’t a part of mass media. But that’s hardly true. For instance, look at the extent to which radio, film, television, and news overlap with the internet: Youtube, movies online, news websites, television shows online, radio shows online, commercials archived, etc. If anything, the internet seems like an especially indistinct medium considering how much it borrows from and overlaps with other media. I think the internet is a medium rather than a public sphere in itself, and the characteristics that Benkler discusses could simply be conventions of a medium or genre rather than indications of a distinct public sphere.

Saturday, November 29, 2008

Commenting on Benkler

I commented on the "whole page" of Benkler's Wealth of Networks chapter 7 and it hasn't shown up, so here's hoping it will be approved by the author and won't just be deleted. Just FYI for everyone else, if you haven't already submitted. It's not an instantaneous addition to the website.

In short, my post to the chapter was: the internet is all great now that it's relatively new, but not so new that no one knows how to use it. But it's becoming more regulated the more it's understood and able to fit into legislative schema (I'm thinking particularly of copyright laws). This isn't inherently bad, but it creates more constraints. China is becoming looser while the US is becoming more restricted. Will we meet in the middle, or will everything eventually become constrained as governments and corporations figure out how to get a firm grip on it? I think it's possible, though I imagine something new would take its place; we would start with a little more history than before, it would take less time to get the public sphere back to full strength, and less time to tighten up again. I do see it cycling through development and freedom to restraint and outdatedness faster and faster.

Wednesday, November 19, 2008

No, really ... who invented the Internet?


Remember Al Gore's famously distorted, misquoted, and media-warped charge, circa the 2000 presidential campaign, that he "invented the Internet"? [The original wording, as liberal pundits have since been vigilant about pointing out, was that he, while acting in the capacity of a United States congressman, consistently backed, championed, and initiated legislation which "helped to create the Internet," in his own words ... thanks, Al Franken, for clearing that one up for me.] Well, as it turns out, and as we probably all have since come to understand, Al Gore did not, in fact, "invent" the Internet ... but then again, it seems that no one else really did, either.

Janet Abbate's book labors hard on the side of an "invention" narrative, the kind we saw with Briggs and Burke, whose "solitary genius" determinist logic led us through history in terms of names, dates, and respective "inventions". This is not the "invention" of Paul Starr, who wanted us to believe that our media are a group of democratically inspired "creations" stemming from public sphere debate and demand. But while Abbate, in her introduction, mentions "the cast of characters involved in creating the Internet" (2) -- Vinton Cerf and Robert Kahn, Lawrence Roberts, Tim Berners-Lee and, apparently, the entire National Science Foundation acting as a composite entity, are some highlighted members of this "cast" -- we don't catch much of the legislative/political story from Abbate. Chapter 6, "Popularizing the Internet," begins to sketch this medium's ascendance in popular culture and American life, stating how the Internet "would be transferred from military to civilian control," but it never really gets around to actually showing us this transformation in a socially intelligible manner. Likewise, she explains the Internet's eventual worldwide status and ubiquity in terms of "the convergence of many streams of network developments" (209), as though to say that the world was ready and waiting to accept this new medium without legislative intervention, political struggle, or social consequence. And I'm pretty sure that just isn't the case.

At any rate, I am interested, and annoyed, by the varying centrality of this kind of invention narrative in some of our readings this semester. It is, in many ways, a very traditional way of doing history: names, dates, and places usually amount to "facts" in the high school history textbook sense of the term. And Abbate isn't exactly deviating from this tradition; she is, instead, trying to force it where, perhaps, it just doesn't fit. Inventing the Internet leaves me now wondering if the idea of invention -- a notion of "solitary genius" that now ties into concerns such as intellectual property, patent laws, etc. -- even exists anymore. People, it seems, don't invent things anymore, but corporations and military research teams do.

And, I checked, and Al Gore isn't cited once in Abbate's book. For shame! Read all about how Al Gore kinda sorta invented the internet in this 2000 Washington Monthly article.

Tuesday, November 18, 2008

Don't bogart that NET, my friend

(How cool would it be to have a computer with a steering wheel? This photo shows how a "home computer" could look in the year 2004.)

Like Dave, I particularly liked how Abbate emphasizes the role that users have played in the development of the Internet over the years, but I share his questioning of this narrative. Here’s how I see it.

Internet users, she argues, aren’t just consumers: they also help to define and develop the Internet, and those definitions and developments, Abbate explains, aren’t merely of the technical sort. She closes her Introduction with a great comment that almost seems like a nod to those of us in the social sciences: “[. . .] the meaning of the Internet had to be invented—and constantly reinvented—at the same time as the technology itself” (6, emphasis added).

In chapter three, “The Most Neglected Element: Users Transform the ARPANET,” Abbate begins to suggest exactly how early Internet users contributed to its development. She explains that new developments like TIP, ANTS, ELF, and USING came about as users at various ARPANET stations defined new needs and new applications for ARPANET. She discusses how users’ enjoyment of email defined a new “major” function of the net and explains its popularity by saying that email provided access to people rather than to computers (109). Abbate even jokes about shady drug deals that were orchestrated by “Inventive students participating in the early 1970s counterculture” (107).

But frankly, I’m suspicious. None of these people sound like “users” to me; they sound like wunderkind computer science graduate students (yes, even the stoner ones). Now, granted, in the “Dream Weaver” era (I mean the song, not the website-making software) computer technology wasn’t available or understandable to the average person. I totally get that. On the other hand, today—when computer technology and internet connectivity are available to most Americans—I still don’t think that users define the direction of the Internet. While I’d like to believe in Abbate’s narrative about users, I don’t think I do.

How much power does the average user really have over the Internet? You can post a video of yourself on Youtube, but then again, you can only do that because the website and its developers made that possible. Youtube probably existed before your ability or desire to post videos of yourself did. Then there are blogs. You can start a blog in about five minutes and use it for political rants or to make copyrighted material available and in doing so stick it to the man. But will your blog change anything? Will anyone even look at it or be able to find it? I keep trying to think of ways that users can change or develop the internet, but I can’t. All I can think of is Youtube, Blogger, Ebay, Craigslist, and MySpace. That suggests to me that users don’t control the Internet; corporations do.


These kinds of sites don’t allow us to do anything but what their templates provide. The whole idea of “My” in Myspace is a total misnomer: the ads everywhere show that it’s not really “your space” at all. It’s hired space. And the fact that you’re given these fairly standard templates into which you can insert your pictures and text again suggests that there are serious limits to the “My”-ness.

In other words, I don’t believe the average user can play a role in the development or transformation of the internet. I believe all we can do is what corporations—through popular websites and applications—allow us to do.

The Cyclic Future




Even the most Philip K. Dickian futurist could not have predicted our current technotopia. We spend more of our lives digitized than we would probably like to admit. It is impressive, though, that Janet Abbate demonstrates considerable foresight in her reading of user innovation of the ARPAnet; you could argue that her analysis foreshadows the current Web 2.0 explosion of user-generated (and edited) content: YouTube, Facebook, blogs, Flickr, etc. One significant aspect of her prediction does not line up, however. She explicates the most neglected element of the evolution of the ARPAnet: "During [its] first decade of operation, fundamental changes in hardware, software, configuration, and applications were initiated by users or were made in response to user's complaints" (Abbate, 83). This seems plausible enough, until we remember that the vast majority of users who even had access to the network at this point happened to be the very developers of the infrastructure: programmers, computer scientists, and graduate students who could only afford access from the large institutions that paid the bill. Maybe her explanation is signified differently within the age of "peer production" and DIY technology, but I can't help but question the amount of "user" innovation when the user probably has, or is working toward, a PhD in CompSci, as compared to Joe the Blogger sitting in his underwear in Iowa City. When the user is an expert, her claim seems less radical.

I must commend Abbate on her approach to historicizing an immensely technical narrative, and, from the limited reading I've done on the subject, her casting of a cyclic history of this phase of technological developments seems to align with other historians. My most salient example of this is a great essay by Thomas Streeter from Critical Inquiry called "The Moment of Wired," where he illuminates a dialectical relationship between the romanticized Wild West version of the Internet stemming from bored, pimply middle-class slobs and grad students (those who had the technical proficiency to use the web in the early 90s) and the commercialized version of cyberspace that emerges with the release of the first good web browser, Mosaic. Wired Magazine catalyzed this convergence by using the new glossy interface of a picture-capable web browser to sell the rebellious dream of the web back to a larger market than just the technogrunts who created the dream, and the infrastructure, in the first place. Streeter more realistically characterizes intellectual labor and consumption in his history of the Internet, which, admittedly, is not of the same time period. Abbate resists the temptation of heroicizing the creators of digital technology, but I think she is prematurely seduced by the cult of the amateur, which, by putting the power solely in the hands of the users, obscures the relations of production that produce new technology and technological practice.

She contextualizes the ARPAnet's "consumers" as "researchers who were to use it in their work" because she wants to privilege the innovation that occurs during the development of the technology, not only in the end product, but her division between user and researcher still seems like an imagined one (83). Even beyond her own oversight, we, as readers ten years later, cannot pigeonhole her brand of user innovation as user-generated content, or even open-source software, lest we misrepresent the economic conditions that continue to nurture our phenomenally exciting and profoundly alienating reliance on technological media.

IM IN UR HI SCHOOLZ, RECRUITIN UR STUDENTZ

I'd somehow picked up somewhere along the line the knowledge that the military had something to do with the internet, but it wasn't until reading Abbate's account of the r&d leading up to packet switching and the ARPANET that it sunk in just how tied in this wonderful magical invisible thing that I use to interact with my friends and family and to stalk people from high school and to do like almost all non-grocery-related shopping is to saber-rattling nationalism and Cold War paranoia. It kind of blows my mind a little to think that something so integral to my day-to-day life grew in part out of research designed to keep the higher-ups of military command in contact with one another just in case people somewhere went batshit insane enough to try to blow up the world and we needed to, you know, finish the job.

And staff at colleges and universities helped finish the job--Abbate at one point remarks that "the ARPA approach exhibited the weakness and the advantages of an 'old boy' network" (54), explaining how "graduates of the IPTO-funded programs...became a major source of computer science faculty at American universities, thereby extending ARPA's social network into the next generation of researchers" (55) and further promulgating the idea that any research agenda in CS needs to be justified in terms of benefit to the military--that's where the big bucks are, after all.

Hence the picture up top/title of this post--so often we think of high schools as the place where the military does their recruitment. And while that's most likely true as far as straight-up enlistment is concerned, doesn't Abbate's account just bring it home that maybe aspects of the university system are just as much a playground for disseminating militaristic ideologies, or at least ideologies that get us used to the military as a benevolent supporter of oftentimes incredibly beneficial or useful r&d (present case included) while simultaneously obscuring the fact that they're only interested in that r&d insofar as it might somehow give them more efficient ways to wage war?

Monday, November 17, 2008

LOL!!!!!1


C: I like to verb words
H: What?
C: I take nouns and adjectives and use them as verbs. Remember when access was a thing? Now it's something you do. It got verbed. Verbing weirds language.
H: Maybe we can eventually make language a complete impediment to understanding.

One of the most interesting consequences of computers, and later the internet, is the phenomenon of "verbing" (a term that I find specifically lovely because it is itself "verbed"). The most obvious instance of this is the term "to Google," which is practically synonymous with performing an online search. Even now, I am blogging, a verbing of the word blog, which is a contraction of the words "web log." The Calvin and Hobbes comic featured above is taped to my dad's computer monitor (and has been for as long as I can remember) because he insists that when Bill Watterson drew the comic he was talking about computer language, and I think Pops is on to something. When used as a transitive verb, access seems to naturally precede either the word file or Internet. You can "access a room," but it just sounds odd, and even the "access" in "access road" is a noun.

As a trend in language, "verbing" seems pretty recent. But let us for a minute consider gerunds and past participles. A gerund is a present participle used as a noun; in other words, it is a verb's action noun. When I say, "I am jogging," jogging is just a participle completing the verb, but when I say, "Jogging is the greatest thing ever," I am not only lying, I am essentially nouning the verb jog and using the resulting gerund as the subject of the sentence. Past participles get even more complicated because they can be passive or active and can also modify nouns, verbs, and entire sentences, so I'll spare you a full exegesis of their many powers here. However, just for the sake of example, in "the painted chair," painted is a passive modifier of a noun. It is an adjective made from the verb "to paint." So if verbs can become nouns, or even adjectives, I see no problem with nouns becoming verbs.

What does bother me is L337, which, as far as the Internet is concerned, is the real impediment to understanding. For those fortunate enough not to know, L337 is a means of communicating online that takes its name from an abbreviation of the word elite to "leet," which is then rendered into a perplexing array of ASCII characters, the basic units of digital text communications. It is an intentionally difficult-to-read "language" used by savvy Internet users as a way of perplexing n0085 (aka "noobs," aka "newbies," aka novices to the scene). What makes L337 particularly difficult is its tendency to incorporate common keyboard misspellings. So "the," often mistyped as "teh," is incorporated into L337 as "t3h" or "73h." In online first-person shooters (FPS, or FP5), the term "owned" (which refers to someone who is defeated/killed, specifically in a really neat way) also appears as "pwned," reflecting the common typing error of accidentally striking the "P" key instead of the "O" key. The sum of L337 is a dense and often indecipherable string of characters, the appropriation of which often varies greatly from user to user, i.e., the informal epithet "dawg" can be rendered "d@wg," "|)awg," "daw9," or any number of other ways. For a medium intended to promote communication, collaboration, and democratic access to information, L337 is an ironic reactionary movement predicated on its exclusivity and opacity.
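The encoding half of this scheme is simple enough to sketch in code. Here is a toy translator - the character map is my own illustrative guess at a few common substitutions, not a canonical L337 dictionary, since as noted above the renderings vary wildly from user to user:

```python
# Toy L337 "translator": apply a small, illustrative substitution map.
# This map is an assumption for demonstration; there is no standard L337.
LEET_MAP = {
    "l": "1",
    "e": "3",
    "t": "7",
    "o": "0",
    "a": "@",
    "s": "5",
    "b": "8",
}

def to_leet(text):
    """Replace each mapped character; leave everything else untouched."""
    return "".join(LEET_MAP.get(ch.lower(), ch) for ch in text)

print(to_leet("leet"))   # 1337
print(to_leet("noobs"))  # n0085
```

Decoding is the hard part, of course: because the substitutions are one-to-many and user-dependent, there is no single inverse mapping, which is precisely what keeps L337 opaque to the uninitiated.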

The L337 phenomenon seems to have already run its course. It enjoyed a brief vogue about five years ago and has since been largely relegated to the ash heap of history as another in a constant procession of discarded fads. However, one of its spin-offs, the LOLcat, has proved to have exactly the opposite effect of L337. An LOLcat is a photo with humorous text, which need not necessarily have anything to do with cats. The text often includes some of the more transparent L337 renderings, featuring abbreviations and awkward or incorrect grammar and spellings. However, to quote Anil Dash in a 2007 article in the Wall Street Journal, "This is a very large in-joke [that blurs the line] between Net geeks and normals." The LOLcat featured below applies the standard "IN UR" trope (IM IN UR FRIDGE...EATIN UR FOODZ or IM IN UR WIKIPEDIA...EDITING UR ARTICLES) to Schrödinger's proverbial cat. To me this seems to evidence the fundamentally inclusive aspect of the Internet. Janet Abbate's Inventing the Internet includes an analogous anecdote, noting that even at the birth of computer networking there was a sense of exclusion, elitism, and stubborn individualism. She quotes Lawrence Roberts, who wrote of pioneering Principal Investigators, "they knew in the back of their minds that it was a good idea and were supportive on a philosophical front, [but] from a practical point of view, they--Minsky, and McCarthy, and everybody with their own machine--wanted [to continue having] their own machine. It was only a couple years after they had gotten on [the ARPANET] that they started raving about how they could share research, and jointly publish papers, and do other things that they could never do before" (50). Perhaps there is something American about practicing a rugged individualism on the digital frontier, but it seems to be trumped by a sense of community and sharing (which is made easy when what UR sharing is infinitely reproducible).
And one more just because.
Frame 1: "I Has a Buckit" Frame 2: "Noooo they be stealin' mah Buckit"

Copy this

[I'd insert a clever image here, but it would likely be an infringement.]

One of the recurring themes in Inventing the Internet seems to be the question of ownership online. Abbate describes the instance in which ARPA authorities declared "that BBN had no legal right to withhold the source code [of the IMP programs] and had to make it freely available" (citing others, 71). While the average internet user does not care about IMP source code or any other source code, it's a relevant issue. First, web designers care about webpage source code, which is accessible from (I think) any browser for (I think) any page. Once opened, it shows the entire code, which may or may not make sense to the viewer. This accessibility helps to reinforce the idea that the 'net was made to be open and informative. And, importantly, alterable. What it means for designers is that they have to be wary of how replicable their designs are. While webpage designers have as much of a copyright on their work as a painter, there is no way to copy and paste a painting (unless it's a digital file, but that's the next part of this) the way there is to copy source code.

For aspiring web designers, there are plenty of copyright-free templates from which to work. And open source webpage codes remain an informative way for young users to learn the tricks of veteran users, in a free-flowing apprenticeship without walls. But I do wonder if it stymies webpage design. In order to get around the issue of people copying their work, many site designers dumb down sites to a series of patchwork images that look like they have more discrete function than they do. Others might hide their work within convoluted and misleading source code that doesn't help a new designer. Or they might use new programs that make it impossible for anyone without commercial grade software to even begin to replicate it. Where's the fun in that? I'm not a web designer, but I do think open source codes have fascinating potential to teach.

What this flows into is copyright for more traditional things, like books and 2D art and music. All of these spring up online in digitized form. Sure, they aren't the original. They're a series of characters and code that lead back to even more zeros and ones, but in the end, it's the same thing (especially for digitized music like BritBrit and books that only come out in eBook form). The internet is an opportunity for sharing information, and ultimately books, art, and music are information, of a sort. And copyright law is complex enough to make Joe QWERTY unsure whether he's downloading legally. Aren't all books educational in some way? So they should all fall under the clause that allows material to be legally downloaded without cost to the end-user. Or shouldn't they? Do you need permission? What if the author/artist doesn't give it? If you buy it, are you buying their permission? What do you own, the digital file, or the one instance of the copy of the artwork?

Artists and authors work hard and deserve money for their work in the same way that anyone else deserves money for the work they do. But perhaps we're coming into a new era of collaborative digital arts, where works are shared and improved (like Wikipedia). I often think that academics should go further in this direction. Some scholars collaborate and some do not. What if we all did? Would it be feasible? Are there some things that cannot be learned by sharing of digital documents? What might happen to those things (e.g. painting, live music, autobiographies, etc.)? Are they rarefied or relegated to the dustbin?

Thursday, November 13, 2008

Advertising Civil Rights

Kathy's article on the power of the black consumer in 1950s radio brought me back to some thoughts I had on an article by Leigh Raiford about SNCC posters during the civil rights movement. In "'Come Let Us Build a New World Together': SNCC and Photography of the Civil Rights Movement," Raiford argues for the importance of photography as a visual tool for community organization and empowerment in the 1960s. I can't revisit her whole argument here, but one of the key points I took away from it was that, in terms of the mass distribution of the images and text that defined these moments, organizations like SNCC picked up where mass media left off. She notes:

Reporter Paul Good opened ABC television's first news bureau in the South, in Atlanta, in the fall of 1963 and recounts that national news reporters were dispatched from location to location to produce reports of major "events," especially those involving violent confrontations, as they happened, or shortly thereafter. Good notes that, by 1963, he had "received the impression that they [editors and producers] were weary of civil rights stories . . . and they did not want or need any analyses of current white-black attitudes or projections of how these attitudes could affect the course of the civil rights story in the days ahead."

SNCC saw their opening, and produced posters like the one above.

For Raiford, the poster emphasizes (much like black radio) "the 'beloved community,' a circle of redeeming kinship in and through which individuals aimed for their greatest personal and collective potential. Doing so forces each member of this trinity to recognize themselves and each other as free from fear, internal doubt, and external degradation" (1136). But I also found it significant that these posters were commodities of a kind, allowing supporters to purchase and assert their affiliation with the movement, using a very different kind of buying power to affirm identification with a wide, racially and geographically diverse group of activists. Raiford argues that this image, above all, targeted an audience of financial supporters: "That SNCC chose this photographic poster as representative and appealing suggests that it was the image of themselves the young activists most wanted to promote. That the poster sold out suggests that, at least in 1963, this was the image of SNCC that audiences found most compelling" (1137).

This argument sends me in a number of different directions, but in the context of the formation of the black radio market that precedes it, it suggests to me that there is another level to the idea of harnessing the powers of advertising and consumerism. Perhaps dissent is as available in the harnessing of the formal and industrial features of mass advertising as it is in exercising the power of the consumer.

You're traveling through another dimension -- a dimension not only of sight and sound but of mind...Next Stop: The Suburbs!




In all of her discussion of the utopian/dystopian tension in the early years of TV, Lynn Spigel never mentions one of my favorite TV shows of all time, The Twilight Zone. The connection seems particularly apt when we view her book as an argument about many meta aspects of television: TV shows depicting TV consumption. She describes the space-making aspect of TV: "Given its ability to bring 'another world', it is not surprising that television was often figured as the ultimate expression of progress in utopian statements concerning 'man's' ability to conquer and domesticate space" (102). This makes me think of one Twilight Zone episode in particular, "People Are Alike All Over," which aired in the first season, in 1960. The basic plot is that a rocket ship crashes on Mars, and the pilots are concerned about whether the Martians will be friendly or monstrous. When they eventually work up the nerve to leave the ship, they find that the Martians appear humanlike in almost every way, except that they can read minds. With this ability, the hospitable Martians decipher the exact ideal domestic space of humanity and create a model home for the astronauts to live in, including a big ol' TV. This seems too good to be true, and it is; before long the men realize that the windows and doors of their home are sealed shut. They look outside to see the Martians gawking at them, in a perfectly replicated human zoo.

I think this expression of domestic paranoia is a parody of both the "big brother syndrome" and the fear of isolation within the home theater that drives Spigel's chapter. TV seems too good to be true in a way; there is always the possibility of an invisible feedback loop, which Spigel characterizes as the "new TV eye [threatening] to turn back on itself, to penetrate the private window and to monitor the eroticized fantasy life of the citizen in his or her home" (118). Most episodes of The Twilight Zone do exactly this: turning the idealized mirror of reality back at the viewer, exposing the neuroses and desires hidden in the communal subconscious. So it seems strange to me that she doesn't discuss the show (although I understand we have just read an excerpt, and the show starts in the 1960s while her archive is focused on the 50s). I can't help thinking that The Twilight Zone wouldn't fit her argument, despite its topical relevance, because of its dystopian world outlook, whereas Spigel seems, for better or worse, hellbent on overturning conventional wisdom about the boob tube and the culture of mass deception that broadcasting heralds. Community in the world of Rod Serling is fraught with greed, paranoia, and nearly universal distrust, while Spigel tries to paint the opposite picture.

Marketing and Group Identity

In Kathy's article about Black Radio, she concludes by drawing a parallel between the construction of the African-American market and the rise of boycotts in the Civil Rights movement. This got me thinking about the ways in which being marketed to as part of a group makes it easier for us to participate in that group, and the extent to which group identity precedes group (any group, not just racial) action/activism.

For a more personal example, I'm thinking of the ways in which I got involved in the punk subculture in late middle school. I didn't have a cool older brother or particularly cool friends to introduce me to new music, so my first exposure to anything even remotely influenced by punk was through mainstream commercial radio and magazines, and I branched out from there. It's not too much of a stretch for me to say that my participation in the punk scene back in Wilkes-Barre (the picture to the left is a crowd at the primary venue there) began as a result of my being marketed to as someone who would consume a particular type of music and all the matching accessories and fashions to go with it. This music was a pretty big deal for me, too--not only did going to shows and listening to music give me something to do that kept me out of trouble growing up, but I can also trace a direct line from most of my current political and social ideologies back through other media to certain albums I bought as a teenager. So in a big way, how I engage in political activity and activism now is the result of years of gestation of seeds planted in me after being marketed to in a particular way.

I guess it's a little un-punk of me to not be pissed about this, but I'm not really--along with Kathy's article, it gives me a bit of hope for some positive effects that being part of the audience commodity can potentially have. In closing, though, I'd like to point to what I see as a pretty unique counter-example in the LGBT community--a community that has been able to engage in widespread political action and activism without ever (to my knowledge, at least) being significantly marketed to. I don't know that this illustrates anything beyond the fact that there are other avenues to the group construction that's a prerequisite for group activism, but it's interesting nonetheless.

Wednesday, November 12, 2008

A part of the family ... and the furniture.


Lynn Spigel's history of the television's "installation" in American domesticity points, interestingly, to a time when machines were compelled to look less like machines and, instead, more like furniture. This, in part, probably helped to assuage some of the fears of Americans about these early techno-monsters taking over their living rooms (formerly parlors). Spigel also applies these efforts at furniturization to anxieties about the television's watchful influence over the domestic realm when she demonstrates how "... the attempt to camouflage technology as a piece of interior decor went hand in hand with the more specific attempt to 'screen out' television's visual field, to manage vision in the home so that people could see without being seen" (118).

There is, however, an entire tradition of this kind of "camouflage" of mechanical aids within the home, beginning with the Singer sewing machine (referred to briefly by Spigel on pg. 24). Anyone who's come across an old Singer in their local second-hand store is probably familiar with the sewing machine/table combo design that required the machine to be lifted out of the center of an otherwise innocuous-looking "end table." For a while, this was also popular with record players: a host of forties- and fifties-era turntables lurk similarly within tables or other kinds of furniture. The television pictured above is a 1958 German design that was marketed as an effort to combine television's functionality with aesthetic grooviness. It's kind of wacko looking, if you ask me: lip chairs everywhere, beware of the flying-V TV!

At any rate, I am impressed by the extent to which one might be able to chart the social acceptance of these sorts of in-home technologies over the years by their forms: the more accepted they become in American culture, apparently, the less they have to resemble other household items, or appear to fulfill other functions. The stove/television contraption mentioned by Spigel is another example of manufacturers' desires to assuage their consumers: the setup almost suggests that watching a television is like watching a chicken cook, or a pie bake -- second nature to women's habits for decades previous.

Tuesday, November 11, 2008

That's Not Funny!!! (anymore)

Amos and Andy are, in several ways, like the Marx Brothers. They began on the vaudeville circuit before moving to radio and television. The characters portrayed specific racial and ethnic stereotypes. Groucho was Jewish, Chico Italian, and Harpo Irish (an association considerably more evident in the vaudeville manifestation of the Marx Brothers than in the later forms, where he is more of a mime/clown figure). And most importantly, both comedic teams portrayed a sense of racial and ethnic anxiety within the milieu of Northern, white, mainstream culture. Amos and Andy migrated to Chicago in the Great Migration, and their comedy was as much about North and South as it was about black and white. The Marx Brothers repeatedly found themselves confounded by the mores of high society, be it in A Night at the Opera or A Day at the Races. Why, then, can we still see the Marx Brothers on television, while Amos and Andy are capable of inducing a visceral reaction powerful enough to keep them under lock and key? While we cringe at a white man in blackface, is it any less troubling to see a German Jewish man portray the libidinous Italian Catholic Chico (originally named "Chick-o" for his love of "chicks")?

Because in several salient ways, Amos and Andy are also very different from the Marx Brothers, and it is these differences that have determined their place in posterity. The Marx Brothers' humor did not vary dramatically from traditional vaudeville. It was largely linguistic and local: a series of set-ups and punch-lines that did not attempt any large story arc between episodes. Ely points out that Correll and Gosden intentionally tried to distance themselves from traditional stage humor, instead crafting jokes based on consistent character traits and developing plot lines. While this allowed for the humanization of the more admirable characters, it insisted on a kind of realism which by default also included the not-so-admirable characters.

The Marx Brothers played their jokes against straight characters who were almost always Anglo-Americans and so beholden to polite society and bourgeois propriety that they were deaf to the absurdity of the conversation before them. They were interlocutors detached from society as a whole, so intensely alien that their words seemed to have no effect on the world around them. So anarchic is the Marx Brothers style, and so broadly satirical is their humor, that it is not constrained to any specific place and time. Duck Soup, for instance, is a critique of politics and war, while Animal Crackers is a send up of the mystery genre. Amos and Andy, however, were firmly grounded in the historical reality of the Great Migration. As such, they were not only deferential in the face of whites, they were often fearful. Instead of taking the historical moment as an object of scrutiny, the pair engaged relatively realistic surroundings in a way that was relevant to the times. As was too often the case in early 20th century America, Amos and Andy ended up being the butt of the joke. Their interactions with whites, rather than subverting conventional social forms, reinforced them.

There is no doubt that the legacy of racism is more present in the American consciousness than that of ethnic discrimination. Jews, Italians, and Irish were not subject to Jim Crow, and they were relatively established members of society by the time of the bus boycotts and sit-ins in the South. In answering why Amos and Andy are still controversial today, while the Marx Brothers are generally considered to be innocuous, one must give a richer answer than simply "racism." One must also take into consideration the fact that the Marx Brothers thrived on fantastical mayhem, surreal absurdity, and whip-fast dialogue which sailed over the heads of their interlocutors. Amos and Andy, however, were inherently bound up in history. They were born in the moment of the Great Migration and doomed to die in the Civil Rights Movement.

Monday, November 10, 2008

Representation, Visibility, and Their Detractors

A topic that keeps coming up in Ely’s book about A&A is the problems that people—e.g., the NAACP—had with the representations of blackness that came across in the program. I’d like to consider this issue in closer detail.

I feel like most of the complaints regarding the representations in A&A are founded on the assumption that white viewers would listen to/watch the programs and either have their preexisting beliefs about African Americans confirmed or else “learn” about blackness via A&A. That’s not an unreasonable assumption for two reasons. First, American culture then was (and often still is) so segregated that whites and blacks might not have the opportunity to interact on an everyday basis. A show like A&A might genuinely represent the biggest “contact” with black culture that some white Americans experienced (in other words, interracial contact was mediated). Second, studies have shown that media is most powerful in shaping the perceptions of its audience when the audience hasn’t had firsthand experience with the subject or issue being depicted; clearly this dovetails with the segregation point.

Still, these complaints about A&A's depictions rest on the belief that white listeners/viewers will see fictional characters and believe that they are "representative" or "typical" of African Americans. Ely quotes an entertainment columnist named Billy Rowe, for instance, as saying that the A&A TV program was worrisome because "to many children across the nation this show will be their idea of how Negroes behave" (217). This sounds almost ridiculous, doesn't it? It assumes a great deal of naïveté on the part of audiences, and one of my biggest pet peeves is when people underestimate media audiences. Then again, I'm reminded of hip-hop and rap studies suggesting that these musics "can be a way for Whites to vicariously learn about African Americans" and "may allow White adolescents to satisfy their curiosities without ever having face-to-face contact or interpersonal relationships with any African Americans" (Sullivan, "Rap and Race," 247). How do we get past this dilemma?

The way that Ely gets past this dilemma is to privilege the value of visibility over the responsibility of representation. Ely asserts that Gosden and Correll "kept their characters from becoming absolute stereotypes" and suggests that the characters eluded predictability and "possessed both virtue and intelligence" (86, 85). But he also emphasizes that A&A did have black fans, that the TV show was the first with an all-black cast, and that—at its best—the show presented black professionals and depicted black success. Ely implicitly suggests that, despite some troublesome stereotypes, the visibility of African American culture that A&A provided to the public was a productive thing—much in the same way that Kathy's "Forgotten Fifteen Million" responds to criticisms that black radio was a "form of segregation" (128) by pointing out the ways in which black radio "helped create a sense of black community identification" (126). Ely also takes on the idea that the show's black characters could be taken as "typical" African Americans by pointing out that the show figured a "fairly complex black society" (73). Thus, even if some audiences take A&A as a "realistic" portrait of black culture, it might not be such a bad thing, since they're (he suggests) getting a fairly diverse impression.

In a way this reminds me of the old debate about the TV show Will and Grace. Is it terrible because its gay characters reinforce the stereotypes of fussiness, effeminacy, style obsession, and so on, or is it wonderful because finally we get to see some gay characters on primetime TV? Both points carry weight, I think we’ll agree.

In closing, I want to mention something Ely doesn't seem to consider: the representations of whiteness and white Americans that come through in A&A. Black Americans could understandably worry that the audience was seeing a picture of blacks as lazy, unintelligent, and poor. In retrospect, though, shouldn't white Americans—in laughing about black laziness, unintelligence, and poverty—have been worried about how they were coming across? Compare the above photo with the one below.


Friday, November 7, 2008

More like Amos 'n' Candy

The pink-haired diva pictured here is the controversial Shirley Q. Liquor, a caricature of a poor black woman played by a gay white man raking in the dough with his racist shtick. Shirley Q. is a figure not so different from those of Amos 'n' Andy. And yet she is. Not only does Shirley's blackface still smile, but underneath her made-up skin, she is a man, a gay man.

Jasmyne Cannick writes on "Ban Shirley Q. Liquor" that the difference between a black man (such as Tyler Perry) donning the character of a black woman and a white man donning that mask is the difference between ignorance and racism. But is it? Newman's and Ely's articles both seem to point to empowerment even within a potentially limited, and stereotypically racist, sphere. The radio shows of the 1940s and 50s might have been limited in scope and have reinforced stereotypes, but they made way for public power. Amos 'n' Andy, the TV show, offered roles to black actors and showed the world black characters, albeit minor characters, with positive characteristics.

I'm not a proponent of Shirley Q. Liquor or any other blackface (e.g. Tropic Thunder). And I'm sometimes even squeamish about movies like Madea, which Cannick considers less troubling than Shirley Q. because it's a black man mocking black women instead of a white man mocking them. I do wonder, though, why she is popular among gay men (I first learned of Shirley through my gay friends). And I wonder whether Charles Knipp (Shirley's daytime alter ego) is allowed his racism because he's a gay man and is already disenfranchised. Does it become a form of "in-language" that's tolerated because it's one minority speaking to another? Or are there other social issues at play? Black Americans and gay Americans have certainly had their issues, following the path of many instances of minority against minority. Down-low culture is looked on with both criticism and a sort of exotic Otherness by white gays. Black Christianity often embraces the homophobia of the most extreme corners of the religion.

Like the hostility between Irish immigrants and black citizens a hundred years ago (and perhaps still), there is usually only room for one minority at the top. To gain majority support, one minority joins with the majority in demoting the other minority. Pro-minority support is like energy: it cannot be created, only transferred. To some extent, the progressive election results of November 4 were retrograde: black Americans won and gay Americans lost. As Kathy suggested in class on Thursday (and I would agree), this was likely no coincidence. And while the "Yes" vote on Prop 8 (the only really feasibly fought of the similar measures on 11/4) was not won solely by the black vote, it probably received more "yes"s than "no"s from that sector.

So what do we do with that? I'm guessing Shirley Q. is a way for some gay men to "win," if only by making black women "lose." Is there anything to be gained by black women? Within the awful stereotypes of working the welfare system, promiscuity, and ignorance and poor English, I certainly can't find it. So what makes Amos 'n' Andy different? Regardless of the groups, can there ever be a win when there's such a loss?

Thursday, November 6, 2008

Reading the Election through Football


Michael Oriard spends a lot of time discussing the ways in which various newspapers presented football games to their intended readers in the last decades of the nineteenth century. His meticulously researched history of football in the media is interesting, but I repeatedly found myself wanting to find modern-day instances of sports writing targeted to a specific demographic. Curiosity drove me to this Tuesday's National Edition of the New York Times where, unsurprisingly, I found a story on the Steelers-Redskins game played the night before. What proved arresting about the story, however, was its utter lack of football. There isn't even a mention of the final score. The whole of the article is dedicated to the three-minute interviews of Barack Obama and John McCain featured in ESPN's "Monday Night" coverage of the game, and it says a lot about the implied readership of the Times.

Oriard, I think, would have had a field day with this article, because it fits harmoniously with his emphasis on audiences in media coverage of sporting events. In discussing the role of football in the explosion of newspapers in the late 1800s, he bases his argument largely on the assertion that football had a broad appeal, from the New England gentility of Harvard, Yale, and Princeton to the hordes of working-class fans crowded along the fences. Since football was tremendously popular, including it in the newspaper was a way of marketing to a larger audience. The candidates' appearances on ESPN are striking because, on the eve of one of the most important elections in history, when campaigning and media coverage were at a frenzied pitch, the largest available television audience was watching football. Both candidates agreed to do the interviews because it gave them an opportunity to piggyback on the 12 million viewers tuned in to ESPN. I would offer as a comparison the second presidential debate, in which the network with the highest ratings, ABC, managed to wrangle 13.2 million viewers.

This unprecedented (never before has a candidate for president appeared on "Monday Night," let alone two) and somewhat uneasy blend of politics and sport was conceived in a spirit of pragmatism. The success of the program would suggest that, though the combination of the two is unusual, more unusual is the fact that "Monday Night" had not done it before, considering it has been on the air for 10 previous elections. If one measures the success of these interviews by the size of the audience they reached, then it is not hard to assume this will mark the beginning of a trend in eleventh-hour campaigning in the elections to come. And if Oriard can read the Civil War into newspaper coverage of a Harvard and Princeton game, it is no stretch of the imagination to see the epic struggle for the White House as analogous to the clash of the Steelers and the Redskins. But if Princeton is the South, unable to break the adamant defensive line of the North, it seems moot to assign a candidate to a particular team (and hopefully one would cringe at the possibility of the Redskins as representing Obama). Neither Washington nor Pennsylvania went McCain, so no matter who wins the game, Obama's team comes out on top.

The Savagery of Sport


"Football, during years when relatively few readers of the daily newspapers actually saw any games, was invested with the meanings and resonance of heroic myth" (109). Or so goes Michael Oriard's general argument in Reading Football. Certainly his account of the popular press's role in shaping a national fascination -- if not obsession -- lends logic to these observations, but I think what is often even more interesting in his work is the way football, and sport in general, becomes a site of cultural assimilation and national allegory.

Here's an example: 190 miles or so east of our fair city of Pittsburgh lies the city of Carlisle, PA, currently home of the United States Army War College, but formerly the site of the Carlisle Indian Industrial School, an all-native boarding school in operation from 1879 to 1918. Carlisle I.I.S. was the United States government's (infamously failed) attempt to provide an institution that might take charge of assimilating and "Americanizing" American Indians during this period, by means of its mantra, "Kill the Indian and save the man." Its students were, of course, not brought to Carlisle willingly, but forcibly removed from homes on reservations. The results at Carlisle were disastrous: hundreds of native students died, whether from malnutrition, diseases formerly unknown to native populations, or extreme physical and sexual abuse. Many were killed for trying to escape.

But, in addition to a record of death and disaster, Carlisle is famous for another reason: its football team. Glenn Scobey "Pop" Warner, a legendary coach, headed up the Carlisle Indians (aptly named) and their rise to glory: to this day, they have the best winning percentage (.647) of any defunct college football team. Lending even more credit to football at Carlisle, however, was Jim Thorpe, now a major name in the sport. The Thorpe-Warner relationship led Sally Jenkins, writing in 2007 for Sports Illustrated, to name Carlisle "The team that invented football." What's interesting, and very troubling, here is, of course, that football was for Carlisle likewise the sport that invented America in the bodies of these young American Indians. Lars Anderson's 2007 book Carlisle vs. Army: Jim Thorpe, Dwight D. Eisenhower, Pop Warner, and the Forgotten Story of Football's Greatest Battle casts the three characters mentioned in its title into a narrative of nationhood, popular culture, and the development of American media. The historic 1912 scuffle, in which Eisenhower was playing for Army, ended with Carlisle trouncing Army 27-6, thanks to Eisenhower's hurt knee (the result of wrangling with Thorpe on the field) and Thorpe's famous 97-yard touchdown.

I thought this was an interesting valence to add to the narrative constructed by Oriard, and an apparently oft-"forgotten" one. The relationship between the popular press and football has likewise always been a relationship between individual and nation: sport is too often an allegory for the "battles" of nationhood, and too often an excuse for similarly constructed false loyalties at home.

Healthy Body, Healthy Mind


While searching for a show-and-tell object, I came across some material that makes for an interesting addition to the strains of discourse outlined by Oriard. Apparently, as college football was becoming mass entertainment in the 1890s, the phenomenon was met with less enthusiasm by some. In the Atlantic in 189?, a long article laments the signs that "the standard of sport has fallen...professionalism has crept in" (64). College athletics, formerly an elite tradition of mind/body refinement brought over from England, was in danger of moving away from what Bourdieu calls "disinterestedness" - the privilege of forming one's habits and practices independently of any need for financial or cultural capital. Even more, an article in the Christian Observer in 1900 laments the decline in the intellectual quality of subsidized football players: "Let us have athletics in colleges, by all means. But tuition in football, which turns out blockheads, comes quite too high for the cost" (22). These lines of argument seem worth paying attention to, seeing as they have a significant place in contemporary discourse. In the aughts of this century and the last alike, there has been outcry against overpaid and underintellectualized college athletes, whose labor on the field, diamond, court, rink, etc. has become one of the more significant entertainment commodities out there, but whose wages are paid in cultural capital. That outcry is worth thinking about - not only because it points to one of the more bizarre relations of labor power and compensation out there, but also because of its power as a discourse on race, class, and gender.

Tuesday, November 4, 2008

From Patent Medicines to Branded Pharmaceuticals


When reading the Ohmann, I got distracted by a comment that he makes when he’s discussing the rise of magazines. He explains that, increasingly, magazines like Ladies’ Home Journal refused to run ads for the old “patent medicines” that used to be a regular part of advertisers’ bread and butter. Part of the reason, Ohmann suggests, is that the magazines were trying to be respectable and thus wanted to distance themselves from products whose validity was questionable. He also says that ads for “[. . .] medicines fell out of favor mainly because they did not fit into the new domesticity that was emerging”(93). Basically, Ohmann says that advertising for patent medicines experienced a huge drop just as the magazine industry was growing.

Today, on the other hand, advertising for prescription medications is absolutely colossal. In other words, there’s been a big switch. Think about all the clever, sexy prescription brands you know. I came up with a whole list in just a few minutes: Zoloft, Viagra, Celebrex, Mirapex, Prilosec, Nexium, Detrol, Lipitor. Think about the sad little Zoloft eggs that mope around as a result of their depression. Consider the people made out of copper pipes who leak because of their overactive bladders. Then there are the cheerful middle-aged men dancing around because they (apparently no longer) suffer from impotence.

What I’m getting at is this: these branded prescriptions are marketed so aggressively that not only are we aware of their existence, they have actually become a part of our culture and mass consciousness. I mean, think about how many awful Viagra jokes we’ve all heard over the years. Half of the spam email I get has the word “Viagra” in the subject line. This is a particularly vivid illustration of what Ohmann calls the “infusion of muted commercial purposes into our repertory of meanings” (102). He mentions famous old ad slogans like “The Beer that made Milwaukee famous” and explains that “[. . .] it means something for a culture when so many of the formulaic epithets and verses that stand ready for use in any conversation were put on the tips of our tongues by ad men” (102).

But prescription drug marketing is different from beer marketing in a crucial way. Beer ads, thankfully, make no pretensions of educating the public. Prescription drug marketing, on the other hand, contributes—or at least purports to contribute—to public information regarding medical conditions and diseases. For instance, had you ever heard of restless legs syndrome before the makers of Mirapex started advertising? Before Prilosec (“the purple pill”) came along, did you know there was a difference between common heartburn and a condition known as acid reflux disease?

Now, there’s good and there’s bad here. I do believe that it’s good for the public to have access to medical and health information. But we need to acknowledge that drug manufacturers are not in the information business; they’re in the selling-stuff business, and drugs are their stuff. Are you depressed or impotent? Do your legs keep you awake? The drug companies want you to know, but they want you to know only because they want to treat it for you. So drug ads are sort of like those Dove “real beauty” ads: the information you get from them sounds nice, but the source of the information is problematic. One of the quotations about advertising that Ohmann uses is “Advertising aims to teach people that they have wants, which they did not realize before [. . .]” In the mode of drug marketing, we can alter this slightly to say that the drug marketing aims to teach people that they have medical conditions that they did not realize before.

Don’t get me wrong: I am not one of those people who tries to argue that people’s medical conditions aren’t real or are just “in their heads,” and I do believe that we should pursue medication and other treatment for our health issues. But it’s important to acknowledge that, in the medical information game, the stakes are greatly unequal when it’s you and me on one side and a big drug corporation on the other.


In closing, take Pfizer. Wikipedia tells me that its marketing budget is three billion dollars, making it the fourth-largest company in the United States in terms of marketing spending. Since it’s Election Day and politics are in the air, I’ll mention that Pfizer is also one of the 53 entities that contributed the maximum $250,000 to President Bush’s second inauguration in 2005. And this is the same company that tries to provide medical information to us, ostensibly for our benefit.

Brand recognition

Ohmann's chapters provide the sort of case-specific analyses that I've really been wanting to see done with Smythe's model, and by looking at particular advertisements from the dawn of the "audience commodity" I think he does a pretty good job of providing us with some nuances we can add to the "Smythian" model, if we're so inclined.

In particular, I think he offers an interesting analysis of what the consumer "gets" from buying branded merchandise, even if the comfort gained from brand recognition is a need that is "historically specific and generated by the new system" (150). I like this because it comes at the audience commodity from a bit of a different angle: it implies that we are not only suckered into working for mass media sources by joining audiences lured strictly by the editorial content, but that we gain something from absorbing the advertising as well.

I see this as directly tied to Chapter 10's discussion of the link between branding's repetition of logos and our mnemonic associations with them (hence the pictures). Advertising, according to Ohmann, is a "discourse of repetition whose instances are unimportant as against their cumulative weight" (153), creating such a network that "image overrides verbal text claiming a residual place in memory even if the reader skips the text entirely" (153). I think the ubiquity of Coca-Cola merchandise in different languages proves this point (even an army/navy surplus store with definite Zionist sympathies near where I lived in Philadelphia stocked t-shirts with the Hebrew version), as does the artwork Courtney pointed out a few posts ago: we're so tuned into this indexical network of colors and shapes that we don't even need text to evoke the familiar feelings of our favorite soft drink or candy bar or oatmeal.

But as a closing aside--man, can you guys believe people used to eat stuff called "fried mush?"

Thursday, October 30, 2008

Exploitation Films and Poetic Justice

I hope you'll excuse the shameless self-promotion here, and I know I've already blogged once this week, but reading Neve's chapter on the censorship and politics of noir reminded me of a conference essay that I wrote in the spring about poetic justice in exploitation movies of the 1960s (I know, more 60s stuff...).

Here's the thrust of my argument: I talk about the use of poetic justice to "punish" "transgressions" of sex, drugs, and violence in several late 60s movies and suggest that "In the late 60s [. . .] youth culture was no longer incipient. Indeed, there’s much evidence to suggest that while exploitation cinema prior to the 1960s was primarily aimed at parents, late 60s exploitica was aimed at young and old alike. In the late 60s youth culture had itself reached adolescence, and for the first time, youth culture was truly mass culture, not just some underground or alternative. For this reason, the moral stakes were higher and were important not just to the youth and not just to their parents, but to the culture at large."

I thought of this because I feel like it dovetails with Neve's comment that "The censors [. . .] always saw to it that evil was punished in these pictures and that sin or corruption was depicted with a degree of restraint" (98).

If Neve's essay or my preview has piqued your interest, you can check out my essay by following this link.

Again, I apologize if this feels tacky or isn't in the spirit of this blog. My intention here is simply to add to the discussion, so I hope it doesn't seem like I'm instead trying to direct it toward myself. Just give me dirty looks in class today if you feel it's warranted.

Tuesday, October 28, 2008

The Ritual in the Age of Sonic Reproduction

The last few pages of Jonathan Sterne's chapter, "The Resonant Tomb," brought me back to thinking about Walter Benjamin and his "The Work of Art in the Age of Mechanical Reproduction." In the Epilogue, Benjamin warns of the aestheticization of politics, the de-humanization of the masses, and how these trends lead to war. Fascism, he argues, "sees its salvation in giving these masses not their right [to change property relations], but instead a chance to express themselves." I believe Sterne's discussion of the Omaha tribe and the desperate, sympathetic attempt of anthropologists to preserve their rituals and songs resonates with this argument. Those anthropologists who recognized the genocidal policies of the United States government toward Native Americans certainly knew they could do nothing to change the property relations (i.e., land seized from the Native Americans), so they offered the Omaha, at the very least, "a chance to express themselves." Thus, they launched a project to "preserve music, ritual, and languages that federal policy at the time of their recording had intended to drive into the ground within a generation" (qtd. in Sterne 331).

Benjamin claims that "the earliest art works originated in the service of a ritual," and though these works of art are alienated from their "aura," they are never fully divorced from it. The idols, talismans, or other objects used in ritual are, more or less, constant from one moment to the next. Early recorded sound, on the other hand, was subject to a "triple temporality." The medium itself can degrade. The recorded sound is just the audible aspect of a fragment of a ritual. The recorded event is separate, modular, broken from the past. These temporalities are symptoms of translating the "interiority" of the "live" event into the "exteriority" of the recorded form. Furthermore, there "was no 'unified whole' or idealized performance from which the sound in the recording was then alienated" (Sterne 332). The song or ritual exists within time. It can be present one moment and absent the next. This seems to explain why many members of the tribe were less interested in preserving rituals than the anthropologists were: the rituals were not seen as something material or durable. Putting a song in cylinder form and playing it later would not reproduce the ritual.

However, the same techniques of preservation which "exteriorized" the "interiority" of these rituals are also responsible for bringing the audible aspects of the recorded events back to the Omaha tribe nearly a century later. It seems to me that these recordings, in their materiality, can serve as a kind of talisman or spiritual object. An item used in a ritual (an idol, rattle, or drum) is given meaning by the person using it, and these recordings work in the same way. This is achieved through a "re-interiorization" of the "exteriorized" event by the individual. This may seem like a sleight of hand, substituting the recreated or reconceived interiority of someone today for the original interiority at the time of the recording. I would counter this by saying that culture is mutable. The interiority of a song is not static from invocation to invocation, or singer to singer. Each instantiation of a song allows for change; its interiority is recreated each time it is performed.

Sterne stops short of claiming this, saying only that "If the past is, indeed, audible, if sounds can haunt us, we are left to find their durability and their meaning in their exteriority" (333). A return to Benjamin can explain why, even if a recorded ritual can be "re-interiorized," this does not guarantee the revivification of a destroyed culture. Benjamin sees the masses under Fascism, subjugated to the will of a dictator, as analogous to art in mechanical reproduction, violated by being "forced into the production of ritual values." We only have fragments, both of the remaining Omaha and of their rituals. Reuniting the two is "no doubt a good thing," but it is not enough to reverse the damage of this fragmentation and is powerless to change property relations or allow for any real exercise of rights (331).

Our Commodities, Ourselves



Gitelman and Haltman both argue that technologies in their respective periods invoke femininity to ease the user's transition from the old to the new. If the manufacturers of phonographs and telephones had to position their products as feminine, domestic, and personal (or as having a kind of personality), in what forms has this kind of personal identification survived? How do we now cope with new technologies, in a world of niche products and targeted marketing?

I don't really have an answer - this blog is ultimately a shameful excuse to post the above video. I think it's safe to say, however, that personal identification with technology is more important than ever. I think we tend to see ourselves as fully transitioned into our postmodern, posthuman era, but our advertising belies this assumption. Apple's TV spots come to mind - the ones with the hip young guy in jeans and the old stuffed shirt. If you'd rather have a beer with the Mac guy, well, you'd rather have a beer with a Mac. Even without Apple's ads, they've built this idea of "personality" into the way their products work - as in the iPod's request that you give it a name on your computer.

In our cultural characterization, we have slotted contemporary gadgets into the role of domestic assistants - sometimes by invoking gender or race, other times by simply positioning them as human. I also wonder whether this positioning relies on the global structure of technological production and support. If my iPod was made in China, and its support network is in India, isn't it a tad disturbing that it's positioned as my servant?