MULTIMEDIA


Multimedia works are texts that combine multiple forms of media (text, audio, images, video) to produce a richer end-product. "Multimedia" tends to be the crossover term between Rhetoric/Composition and people working with digital texts outside academic contexts. Because of its widespread use, it is defined in a number of ways.


Definitions

To start, Tech Terms defines multimedia as "the integration of multiple forms of media," including text, graphics, audio, video, and animation. Likewise, educational software that combines multiple media is known as multimedia software. What this definition offers is basic yet fundamental: multimedia works must combine different media components to create a unified end-product.

Blattner and Dannenberg, editors of Multimedia Interface Design, add explicit references to computer technologies to their definition: "A multimedia computer system is one that is capable of input or output of more than one medium. Typically, the term is applied to systems that support more than one physical output medium, such as computer display, video, and audio" (xxiii). The key idea here is that multimedia must include the manipulation of more than one medium. Blattner and Dannenberg frame their definition specifically in terms of computer systems; to see how multimedia relates to the field of Rhetoric and Composition, however, we need a definition that speaks to composing processes.

In A Companion to Digital Humanities, Geoffrey Rockwell and Andrew Mactavish provide a useful definition for thinking about digital composition: "A multimedia work is a computer-based rhetorical artifact in which multiple media are integrated into an interactive artistic whole." What makes this definition so useful are the steps that Rockwell and Mactavish take to flesh it out, breaking down each key term and developing the concepts:
  • Computer-based: "[A] multimedia work is a digital work that is accessed through the computer even if parts were created in analogue form and then digitized for integration on the computer."
  • Rhetorical artifact: "A multimedia work is one designed to convince, delight, or instruct in the classical sense of rhetoric." It is "designed by humans to communicate with humans."
  • Multiple media: "Central to all definitions of multimedia is the idea that multimedia combines types of information that traditionally have been considered different media and have therefore had different traditions of production and distribution."
  • Integrated...artistic whole: "[T]he integration of media is the result of deliberate artistic imagination aimed at producing a work that has artistic unity, which is another way of saying that we treat multimedia as unified works that are intended by their creator to be experienced as a whole."
  • Interactive: "Some level of interactivity is assumed in any computer-based work, but by this definition interactivity becomes a defining feature that helps weave the multiplicity into a whole."

With this definition, multimedia works start to take real shape: they are accessed through computers, created for classical rhetorical purposes, produced using multiple media to form a unified product, and interactive to some degree. For digital composition, the ideas of creation, rhetoric, and interactivity are key. Multimedia works, like traditional writing projects, become texts created to argue a certain position for a particular audience in a new way.

Other noteworthy definitions:
  • 1991: In "Getting Started with Multimedia: What Is It? Can You Use It? What Will It Cost?" Jim Heid defines multimedia as the "integration of two or more communications media. It's the use of text and sounds, plus stills and moving pictures, to convey ideas, sell products, educate, and/or entertain" (225).
  • 2000: In Multimedia Literacy, Fred T. Hofstetter defines multimedia as "the use of a computer to present and combine text, graphics, audio, and video with links and tools that let the user navigate, interact, create, and communicate" (2).


Related Terms

“New media” and “new media studies” often refer to the “newness” of these multimedia works compared to previous types of media, entertainment, and instruction. According to Rockwell and Mactavish, "‘New media’ emphasizes the experience of these works as ‘new’ and different from existing forms of entertainment and instruction, but new media can also refer to media new to the twentieth century." That is, "new media" can often refer to multimedia works, but it also extends to many media advances within mass communication, such as TV and radio.

"Hypermedia" refers to hypertext and the creation of non-linear texts. "Hypermedia" evolved out of "hypertext" and emphasizes the way these works are multi-linear labyrinths of information that the user navigates (Rockwell and Mactavish). In his 1965 article "Complex Information Processing: A File Structure for the Complex, the Changing, and the Indeterminate," Ted Nelson described hypertext as "a body of written or pictorial material interconnected in such a complex way that it could not conveniently be presented or represented on paper. It may contain summaries, or maps of its contents and their interrelations; it may contain annotations, additions and footnotes from scholars who have examined it" (96). This definition translates well to today's notions of hypertext and hypermedia, particularly when thinking about the design of a Wikipedia page: it is a digital space that could not be truly represented on paper; it summarizes information and embeds links to external content; and the footnotes lead viewers to additional sources.


History

This list is largely adapted from Rockwell and Mactavish and covers a wide range of technological developments that have influenced multimedia production, e.g. word processors, desktop publishing, sound and video standards, and software that has allowed users to create and share images, videos, and web pages.

  • 1964: IBM released the first commercial word processor, the IBM MT/ST, allowing people to create, manipulate, and disseminate (via printing) rhetorical documents.
    ST.jpg
    IBM MT/ST
  • 1977: The first successful personal/home computer, the Commodore PET, was released.
  • 1983: Computer platforms began to incorporate MIDI (Musical Instrument Digital Interface), allowing users to record, edit, and play back musical performances. Today, personal computers typically ship with built-in sound and microphone capabilities.
  • 1984: Apple released MacPaint, a program that allowed users to create, edit, and print simple images. Images could also be incorporated into other documents to create basic multimedia artifacts.
  • 1984: Desktop publishing became accessible to anyone with a personal computer with the release of MacPublisher, the first WYSIWYG desktop publishing program. Desktop publishing was a milestone in multimedia creation that allowed computer users to create documents combining text with graphic art.
  • 1985: Apple quickly followed with the LaserWriter printer, and Aldus released its PageMaker software, further establishing desktop publishing.
  • 1987: HyperCard was released as an application that combined database capabilities with a user-friendly graphical interface. HyperCard worked like a virtual stack of cards, each card holding different information, e.g. written text, visual graphics, and buttons/hypertext that linked users to other cards.
  • 1988: MPEG, the Moving Picture Experts Group, was founded to create standards for audio/video compression.
  • 1991: Apple released QuickTime, an application that can play multiple formats of digital video, pictures, and sound; it is still frequently used for sharing videos.
  • 1996: GoLive (later acquired by Adobe) was released as a WYSIWYG HTML editor that allowed people to create and manage web spaces. GoLive was eventually surpassed in popularity by Dreamweaver, which remains one of the most popular WYSIWYG applications. Today, a number of HTML editors are available for multiple platforms, whether purchased (e.g. Dreamweaver) or downloaded free of charge (e.g. Bluefish, KompoZer, OpenOffice.org, SeaMonkey).
  • 1996: Macromedia bought FutureSplash Animator and renamed it Macromedia Flash; it became Adobe Flash when Adobe acquired Macromedia in 2005. Flash is a multimedia application known for its ability to combine text, drawings, and still images into interactive animations that enliven web pages.
  • 2001: Microsoft introduced the Microsoft Tablet PC. Today, popular tablets, like the HP TouchPad and Apple's iPad, boast platforms that manage text, images, sound/music, and video on easy-to-use, interactive interfaces.
  • 2003: Adobe first released its Creative Suite package, a collection that gave buyers access to a number of programs for graphic design, image manipulation, video editing, and web development and management; its components have included Photoshop, Illustrator, and InDesign, with Dreamweaver joining after Adobe acquired Macromedia.
    Adobe_Creative_Suite_Master_Collection.jpg
    Adobe Creative Suite Master Collection
  • 2003: As a cheaper alternative to Adobe's Creative Suite, Apple released iLife, which includes iPhoto, iMovie, iDVD, GarageBand, and iWeb, applications that allow Mac users to organize, edit, and publish various multimedia projects.


Types

Because multimedia works combine many different media that can be designed to serve different purposes, the works themselves take varying forms. Rockwell and Mactavish outline four types: web hypermedia, computer games, digital art, and multimedia encyclopedias.

Web Hypermedia

These multimedia works include hypertexts that link to internal and external information, encompassing thousands of educational, research, and entertainment hypertexts. The ultimate example is the World Wide Web itself, an online system composed of hypertexts that take users to different sites of information for different purposes. Other examples include weblogs that connect and react to articles, photographs, and videos; Prezi and PowerPoint presentations that link users to outside information; and particularly long websites that create links within the text so users can navigate the space more easily (see the sketch below).
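The within-text links mentioned above are typically built with HTML fragment identifiers: an id attribute marks a spot in the document, and any link whose href begins with # jumps readers to that spot. A minimal sketch, with section names invented for illustration:

    <!-- A small table of contents that jumps readers down a long page -->
    <ul>
      <li><a href="#definitions">Definitions</a></li>
      <li><a href="#history">History</a></li>
    </ul>

    <h2 id="definitions">Definitions</h2>
    <p>Choosing "Definitions" above scrolls directly to this section.</p>

    <h2 id="history">History</h2>
    <p>Choosing "History" jumps here, so readers can move through a long text without scrolling.</p>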

Computer Games

Advances in personal computers in the 1980s and 1990s increased the popularity of computer games, also known as PC games, that combined images, animations, and sound. Here, users could play in a "fictional world characterized by navigation and puzzle-solving" (Rockwell and Mactavish). More recent advances in both software and hardware create new opportunities for users interested in computer games for both educational and entertainment purposes.

Today, game studies encompasses both computer games (e.g. Second Life, World of Warcraft, MOOs, and MUDs) and video games. Studying games involves analyzing them as sources of meaning and as social artifacts. Scholars who explore games do so to learn more about literacy, public writing, design, and the pedagogy of play. In 2008, Computers and Composition published a special issue, "Reading Games: Composition, Literacy, and Video Gaming," edited by Matthew S.S. Johnson and Pilar Lacasa, that explored the intersections between critical media literacy and composition. Similarly, Computers and Composition Online's Fall 2008 issue collected a number of relevant articles about design theory, the intersections of games and race, and reflective gaming practices.

Narratology scholars have also taken up game studies, exploring the various narratives and narrative structures of both computer and video games and analyzing how the stories of games are created and manipulated by users.

Gee-Good_Video_Games.jpg
Gee-What_Video_Games.jpg
McGonigal-Reality_Is_Broken.jpg
Bogost-How_to_Do_Things.jpg

Digital Art

Digital art includes "interactive installations that are controlled by computers and use multiple media" (Rockwell and Mactavish). Artists are able to use multimedia to create installations that make viewers a part of the experience, rather than observers of the art.
  • Laurie Anderson, a musician and artist, uses multimedia to create interactive installations that focus on the intersections of technology and human communication. Examples can be found in her online gallery, including her Talking Pillow.
  • Pipilotti Rist is another artist celebrated for her use of audio/visual technologies to create interactive installations. On her gallery site, she writes that these multimedia projects interest her because "there is room in them for everything (painting, technology, language, music, movement, lousy, flowing pictures, poetry, commotion, premonition of death, sex and friendliness)." That is, digital art allows her to explore different media and themes in order to "encourage the mind," engaging viewers on levels that a single medium could not.

Can art be scholarship?
Yes, as demonstrated by Marcel O'Gorman, Professor of English at the University of Waterloo and Director of the Critical Media Lab. O'Gorman has created and performed a number of multimedia installations that explore the impact of digital technologies on the human body. His projects include the screening coffin, cycle of dread, and the dreadmill. According to the DREADMILL website, this scholarly project is a critique of increasing human immobility in a virtual culture, requiring that participants "walk and run on a treadmill that powers a multimedia display of video, still images, text, and graphics." O'Gorman has delivered academic performances about posthumanism while using the treadmill device.

Multimedia Encyclopedias

Multimedia online encyclopedias, like Wikipedia, educational wikis, Encyclopedia.com, and Encyclopedia Britannica Online, mirror their traditional counterparts by cataloguing information, but they are "taking advantage of the computer's capability to play time-dependent media like audio, animation, and video to enhance the accessibility of information" (Rockwell and Mactavish).
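One common way such encyclopedias deliver time-dependent media today is through HTML5's audio and video elements, which give readers playback controls inside the entry itself. The sketch below is hypothetical; the file names are invented for illustration:

    <!-- Hypothetical clips embedded alongside an encyclopedia entry -->
    <video src="symphony5-performance-clip.mp4" controls width="320">
      Your browser does not support embedded video.
    </video>
    <audio src="symphony5-excerpt.ogg" controls>
      Your browser does not support embedded audio.
    </audio>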


Multimedia Scholarship and Journals

Wysocki-Writing_New_Media.jpgRice-The_Rhetoric_of_Cool.jpgBrooke-Lingua_Fracta.gifReid-The_Two_Virtuals.jpg

Though multimedia is a term often applied to non-academic and industry work, multimedia scholarship has also been done within the humanities. Though we often think of the "boom" in multimedia work as a product of increased access to computers and the Internet in the 1990s and early 2000s, multimedia composition scholarship has deeper roots. In 1970, Paul Briand wrote "Turned On: Multi-Media and Advanced Composition" for CCC. This short piece describes Briand's own use of media projectors, recorders, and other tools to teach composition, rather than media that students could use to write; still, he offers a valuable conclusion about using multimedia in the composition classroom: "[T]he skill of writing can be taught—and with great success—by means of a multi-media approach" (269).

As technology intersected more with our reading and writing practices, scholars began producing work about technology's role in the composition classroom, not only for instructors but also for students. A quick look through the CCC archives shows that rhet/comp scholars weren't researching the impacts of multimedia on composition classrooms until the mid-1990s, and the journal itself didn't publish many articles about multimedia and composition until the mid-2000s. In 1994, Cynthia and Richard Selfe published "The Politics of the Interface: Power and Its Exercise in Electronic Contact Zones," an article that implores writing instructors to think critically about computer use. They argue that the computers used within composition classrooms maintain borders that contribute "to a larger cultural system of differential power that has resulted in the systematic domination and marginalization of certain groups of students," particularly in terms of gender, race, and nationality (Selfe & Selfe 481). Piece by piece, articles were published about email (Spooner & Yancey, 1996), technology and voice (Eldred, 1997), the politics of cyberspace (Reynolds, 1998), and literacy (Selfe, 1999). Looking at CCC today, more articles can be found regarding the role of media in our composition classrooms.

Though CCC was slower to publish articles relating to multimedia and composition, Computers and Composition: An International Journal released its first newsletter in November 1983. According to the journal's website, Computers and Composition publishes articles related to "the use of computers in composition classes, programs, and scholarly projects" and "offers information about integrating digital composing environments into writing programs on the basis of sound theoretical and pedagogical decisions and empirical evidence."

Similarly, Kairos: A Journal of Rhetoric, Technology, and Pedagogy was founded in 1996. According to its "about" page, the journal's mission is "to publish scholarship that examines digital and multimodal composing practices, promoting work that enacts its scholarly argument through rhetorical and innovative uses of new media." Though not specifically focused on composition, Kairos publishes "webtexts" related to the many intersections of technology and writing studies.



MULTIMEDIA vs. MULTIMODALITY


Multimodal_Texts.png
"Multimodal Text" word cloud from Insight into English.

"Multimodal Text" word cloud from Insight Into English

To bridge the gap between multimedia and multimodal, it may be helpful to see how the two concepts overlap and how they differ. Claire Lauer does this thoroughly in her 2009 Computers and Composition article "Contending with Terms: 'Multimodal' and 'Multimedia' in the Academic and Public Spheres." In this article, Lauer argues that term preference depends on both context and audience; that is, academic and non-academic contexts play a key role in how each term is valued. Lauer writes, "There is a greater emphasis on design and process in the classroom, which makes the term multimodal more suitable in that context, and a greater emphasis on production and distribution in non-academic or industry contexts which explain the use of the term multimedia in that context" (226). And even though multimodal may be more appropriate for the work done within the composition classroom, Lauer argues that composition teachers should continue using both terms to better prepare students for how they may encounter multimedia outside academia (226).

Another way to think about the difference between the terms is to define "modes" and "media." Lauer defines modes as the "ways of representing information," such as words, sounds, and images. Media, then, are the "'tools and material resources' used to produce and disseminate texts," such as books, computers, and voices (Lauer 227). Though this distinguishes modes from media, these definitions also allude to the terms' interdependence: "the media we use affect the ways in which we can realize meaning through various modes" (227). Lauer draws on Gunther Kress here, arguing that our mode of writing is affected by the medium we choose to write in; thus, media and mode are inextricably tied together.

To find out more about multimodality, visit the multimodality page!