The digitization of information, and the growing range of technologies used to manipulate and analyze it, are rapidly changing the context of the classroom. A couple of weeks ago Ian Milligan reported on the growing debate over the use of laptops and other technology (like cell phones) during class time. Milligan makes a compelling argument for the importance of allowing students to use their computers in the lecture hall. Although I agree with much of what he has written on the subject, the use of technology in history courses poses a more complicated problem than simply addressing whether it should or should not be used: Where does digital literacy fit in the university curriculum and how should it be taught?
The use of technology in the classroom – and often in research – is somewhat idiosyncratic. It is often based more on the experience and comfort of the instructor, usually developed outside of the education system, than on the skills required by the students. Wikipedia serves as a good example. Some instructors tell their students never to use it, while others sing its praises as a useful and open way to learn general background information. In some classes, students are taught to use internet resources such as Google Scholar and Internet Archive; in others, they are sent to carry out library searches and told to limit their internet use. All of this can result in students who are equipped with differing research skills and experiences. More importantly, though, it can create a different set of student expectations about the goals of their university education.
This is made all the more complicated by the variety of resources available to students. There is a wide array of computer programs like Zotero that help students with their citations. Websites like Internet Archive, Worldcat, and WordReference.com have search windows that you can embed in your web browser, making internet searching that much easier. A whole host of Google products, particularly Google Scholar and Google Books, have made searches for both primary and secondary source material considerably easier – so long as students know how to appropriately frame their search query. And finally, a growing number of open-source database and mapping programs make it possible to manipulate data quickly and efficiently. Jim Clifford has put together a concise introduction to some of these resources on his website.
These tools are increasingly becoming a part of how historians go about their work. Unlike the ambitious counting projects of the 1960s, 70s and 80s, which relied on large and costly equipment, historians now have access to powerful software that can manipulate our source material in ways that were unfathomable even a decade ago. Commercial and open-source database, mapping, and network analysis programs no longer require users to build their own software; they are significantly easier to use and can often be more or less self-taught (or learned in a one- or two-day workshop).
Despite the diversity of ways that historians have put these technologies to work in their research, much of the discussion on technology and teaching has emphasized how it should be used to facilitate students’ learning of course content rather than how it develops the skills they require to research the past. The Chronicle of Higher Education’s series, College 2.0, does an excellent job of summarizing the discussion so far, laying out the parameters of this debate between those who ban technology from the classroom, those who embrace it, and those in-between. The central message throughout these pieces, though, remains the same: the use of technology in itself does not make a good learning environment; rather, the critical use of technology in the hands of a good teacher can have positive implications for student learning.
What this discussion on teaching and technology leaves out, however, is a broader reflection on when and where students should develop the technical skills that they will require to critically engage in future research projects. Given the rising importance of these technologies to historical research, these skills need to be taught to our students in a much more organized fashion. Rather than only being discussed at the level of pedagogy, technological change in history-related fields needs to be addressed as part of a university’s and history department’s curriculum.
It is becoming increasingly important for future historians to know and understand the digital resources available to them. Other disciplines have taken positive steps. In psychology, for example, students are trained in both statistics and some of its software, like SPSS, a well-known and very powerful statistical package. Honours students’ research is also often co-supervised by graduate students, allowing undergraduates to conduct research using this software without taxing faculty resources, while giving graduate students experience with one-on-one instruction and an opportunity to explore better methods and research design.
Training students to use the technologies employed in their disciplines is increasingly being recognized as an important component of the education system. The 2010 National Education Technology Plan in the United States, which is focused broadly on all levels of education, emphasized the important role of digital literacy in preparing students for future work environments:
“How we need to learn includes using the technology that professionals in various disciplines use. Professionals routinely use the Web and tools, such as wikis, blogs, and digital content for the research, collaboration, and communication demanded in their jobs. They gather data and analyze the data using inquiry and visualization tools. They use graphical and 3D modeling tools for design. For students, using these real-world tools creates learning opportunities that allow them to grapple with real-world problems…”
Unlike psychology, however, the wide variety of software and historiographical fields available to historians makes a singular type of history program difficult to implement. Some history students may find GIS software helpful, others may benefit from a more advanced knowledge of spreadsheet and database programs, while others would prefer software that facilitates qualitative text analysis and optical character recognition.
The solution, I think, is to build the teaching of technical skills into the university curriculum by aligning the skills that are taught in individual courses. Part of this involves developing expertise in, and an emphasis on, the digital humanities (a goal which many universities have embraced), but it also requires an emphasis on more traditional and effective forms of teaching, such as challenging students to apply their learning in a variety of different settings.
Here’s one way that this could play out: First, along with basic library skills, a more advanced session on digital literacy (for academic purposes) could be implemented in first year history courses. Second, assignments could introduce students to major online collections of primary documents and how to put them to best use. Third, alongside more traditional assignments, courses could encourage students to develop a familiarity with digital publishing through tools like wikis and blogs. Fourth, online academic standards could be agreed upon at the departmental level, ensuring that students receive a consistent message about the validity of various types of online resources. What is the status of Wikipedia, or even of this blog post, in the classroom? Finally, students should be encouraged to use technology to facilitate their research. This could be as simple as suggesting that they use Google Earth (which would help build a foundation for future use of more complicated GIS software) or some of the more advanced functions of Microsoft Excel (which could get them thinking about databases).
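To make that last suggestion concrete, here is a minimal sketch, in Python, of the kind of first step a student might take toward computational text analysis: counting word frequencies in a digitized primary source. The excerpt and all names here are hypothetical stand-ins; the same basic idea underlies the more advanced text-analysis tools mentioned above.

```python
from collections import Counter
import re

# A short excerpt standing in for a digitized primary source;
# in practice this would be read from a transcription file.
document = """
The harvest this year was poor, and the harvest last year
was poorer still. The township voted relief for the poor.
"""

# Lowercase the text and split it into words, ignoring punctuation.
words = re.findall(r"[a-z']+", document.lower())

# Count how often each word appears.
frequencies = Counter(words)

# The most common words give a crude first impression of the source.
print(frequencies.most_common(5))
```

An exercise this small can be run in any introductory workshop, and it gets students asking the right questions: what counts as a word, what the counts actually show, and what a larger corpus might reveal.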
These are just a handful of ways that digital literacy could be taught through an undergraduate history curriculum. With what tool kit do you think history undergraduates should leave their history programs? Should digital literacy be seen as similar to other forms of academic literacy, such as developing reading, research and citation skills?
Like the university’s more traditional focus on developing these skills, digital literacy is quickly becoming a critical component of studying the liberal arts and humanities. As technological change creates new ways of accessing and analyzing primary and secondary sources, new sets of skills and expectations are required for both students and teachers. The printing press revolutionized learning by making literacy a critical component of one’s education. The microchip has made a similar impact, requiring that students develop a new type of literacy which equips them with the necessary skills to benefit from these new technologies.
This piece has been cross-posted on the new blog Teaching the Past, jointly run by The History Education Network and ActiveHistory.ca. Check the blog out for posts on topics similar to this one. Special thanks to Ian Milligan, Jim Clifford and Amanda Williams for their insight on this topic.