Historical Thinking Matters (Practicum Reflection)

Historical Thinking Matters was great fun for me, not least because I’ve never had a course on American History (OK, one survey-level course in college that we can’t actually count: a horrible story about a lazy professor… involving pirates…)! I tried the module on the Spanish-American War. I didn’t have a textbook handy, so I took my cue from Mills Kelly and read Wikipedia to prepare.

The document investigations were more guided than I expected, leading the student to the inevitable conclusion that the Spanish-American War did not break out as a direct result of the destruction of the Maine. The module was also simpler than I expected, pleasantly so, providing enough guidance for the student to discover that the story of the Maine was tangled in a series of causes and motivations, but not offering or forcing judgement on what those might be. In this, the module answers Sam Wineburg’s injunction to provide history training as part of education broadly, helping students to look through heavily mediated texts like those provided by journalism and past simple causes. It also follows an assumption in the readings that I found intriguing: young western students seemed to the VKP researcher-instructors to be infused with simple causal and teleological thinking (is it in the water?). The west is of course famous for this mode, but I want to know more about how it works.

The assignment/set of assignments I came up with is, unsurprisingly, modeled directly on the HTM site, because that’s what I was thinking about at the time. My assignment focuses on Roman texts and the skills a student needs to assess them as historical documents. I would run a module set around Livy’s account of the Second Punic War because it offers some interesting opportunities to learn historical thinking (and because I’m a little in love with it).

Three phases would offer the student some valuable lessons. A piece dealing with sourcing and contextualising would drive home for the student the secondary-ness of Livy’s account (written, you know, some 200 years after the battles it describes). Juxtaposing other source material and adding guided questions would draw out the primary-ness of Livy as contemporary literature (how does Livy’s text fit the Augustan/early imperial era? how does it show contemporary Roman values?). Using HTM’s tactics, we could expose contemporary historical thinking to students by comparison with Roman historical thinking. This might involve juxtaposing other accounts of the war with Livy’s, lining up parallel events where possible. We have a contemporary Greek source and at least one fragmentary annal that was a source for Livy, both of which would help students to see what kind of history Livy is “doing.” I like the third part best. It has the least in the way of “agenda” and lets students discover what contemporary historical thinking is by contrast with that of other places and times. It also invites them to question our own modes of thinking as forms of bias, which is never a bad thing.


Teaching and Learning (Week 13 RR)

I very much enjoyed this week’s readings on a number of levels. So far I haven’t had an opportunity to teach history, but I am occasionally handed an undergraduate Latin lecture, and I have been tutoring a set of Greek and Latin students privately for the last several years. Teaching ancient languages has some notorious challenges. One major example is understanding its high attrition rate: students tend to learn very well or not at all (and the why is not always simple). Although I prefer tutoring to teaching for all the usual reasons, the problem seems (so far) to persist in the same ratio even in a close one-on-one setting. My working hypothesis is that this has a great deal to do with the mechanisms by which we transmit and receive learning, with what students come equipped with and what we scaffold for them (cf. David Jaffee’s essay in our Academic Commons collection of VKP essays), and with how the pieces of a method get integrated into a framework.

Sam Wineburg’s description of teaching as making the practitioner’s processes visible resonated with me. This is the challenge of language pedagogy as much as of history (at least where we’re training thinkers and not temporary repositories for facts): to make the steps and pieces of a whole system visible in coherent, digestible packets. A parallel challenge for the instructor is that becoming sufficiently prepared to teach means internalising specialised knowledge to a degree at which it becomes difficult to teach (i.e. to recall the smaller structural components of one’s own learning), although the review of knowledge and methods we get from teaching something is one of its primary rewards.

Michael Coventry et al., in the JAH article featuring the Visible Knowledge Project, ask broadly what the student brings, in terms of assumptions, past training, literacy, and education (writ broadly), that makes a difference to a) how well the modes of teaching selected by the participant instructors work and b) how those modes must be modified in order to work. Sharona Levy’s piece in the Academic Commons set describes her exploration of this problem as she attempts to devise and operate a metric for how students read. Far from revealing a formula, her process is iterative, the metric serving to expose the gaps in her methods and pedagogical problems generally; she then observes the effects of her adjustments.

A corresponding piece by David Jaffee tracks student progress with a set of activities designed to teach visual literacy, or, more accurately, historical thinking with images. He likewise was less interested in the “success” of his formula, which he revised several times to promote different results, than in somehow meeting his objective of teaching ‘historical thinking’, especially the finding of problems. In this he responds to Wineburg’s call to arms to teach historical thinking as a set of critical skills any educated person should have in daily life. Jaffee found great success in the student project that paired the image of an assimilated Native American leader with an archive of primary texts students could select to work with, not because the students came up with a satisfactory answer, but because they found problems. He also took pleasure in having taught them to ponder “the relationship between choice and constraint” as they problematised the whys of voluntary assimilation to an invading culture. As much as I agree with him on the lesson, this leaves me a little unsettled. In his conclusion, Jaffee does complicate what it means to think historically and reminds us that this is not a neutral question. The categories used on the HTM site are much cleaner (how to source, how to read closely), tactics rather than highly contextualised questions, but as always, our tactics have a context too, of which we must be aware.

This is all very obvious stuff, but I found it quite stimulating both for thinking about the ‘value’ of history and about my own experiences, problems, and aspirations for teaching.

Open Access/Open Source (Week 12 RR)

I am noticing that just about every week I blog about my computer friends and family, and I’m sorry for this, but I have to do it again… Lawrence Lessig’s Free Culture was a very interesting read for me because I was already acutely aware of his work as it had filtered out into conversations online and in casual settings. In other words, I had already internalised its conclusions to some degree, but not its logic, so I gave it a pretty careful read.

Mostly, I’m still in. Lessig makes a sensible argument about the resistance shown again and again to architectural innovators by the old guard that had marketised previous architectural forms, the regular old circle of life. But this time he finds something new and menacing in the resistance. The main thrust of his book (found on p. 255) is that the guardians are going a little above and beyond this time, that they are trying to take hold of culture itself. If you agree with him, that’s terrifying. I’m not entirely certain that I do. He may be confusing the players with the game, but I might be wrong to think there could ever be a difference between those two categories. Food for thought…

Lessig set the stage for us to understand the nitty-gritty how-to guidelines for open source projects presented in Karl Fogel’s book Producing Open Source Software, especially Fogel’s last section (p. 154 on), “Licenses, Copyrights, and Patents,” informed by his introduction to GNU/Linux ideology in the opening section (p. 3). The pressure exerted by the GNU license isn’t a direct counter to the deep proprietary system that might scare us after reading anything by Lessig, in that it too exerts a kind of control over use. In fact, most of the license agreements and compliance standards Fogel surveys represent attempts to bridge the divide between proprietary and free software, so that real people might work and do business in both worlds (Lessig would probably approve).

Control over making information/culture/code public features in Elena Giglia’s D-Lib article on the proceedings of a conference on Open Access, in a discussion of sharing data. Publishing raw data alongside a publication, or as a product in itself, is an interesting idea and has benefits (more than harms) for the sciences as a whole, though it carries potential costs for the academics and other researchers performing the experiments. (I would also ask methodological and theoretical questions about the degree to which data gathering and interpretation are really so separable even in our beloved empirical sciences, but never mind; and the idea of standardising presentation for streamlining and searchability is also interesting, with pros and cons.) But how, or whether, this can be relevant to humanities scholars, I am not certain.

Peter Suber (in the available online chapter of his book Open Access) introduces us to ways in which academics in the humanities might benefit from Open Access publications, with rhetoric almost as catchy as Lessig’s (I’m a fan): ‘wouldn’t it be great if people who weren’t paid directly for their writing anyway could post it for free?’ We were introduced to the idea of free online publishing in previous weeks, and here, juxtaposed with the rest of the week’s discussion, it takes on another cast (or at least shows up better): we must think about protecting our content from becoming proprietary (and commercial). It seems a little ironic, and feels a little like looking over our shoulders for Lessig’s Orwellian smoke giant on the horizon. Aren’t the concerns of academics, as Suber says, not about money? And did anyone else think that the Creative Commons licenses some of us added might be improved for use by academics? A new open copyright with more targeted (yet still sufficiently broad) goals?

Archival/”Archival” (Week 11 RR)

The first reading (the CLIR report on digital forensics) was especially interesting to me because I have a very close programmer friend who worked for many years in e-litigation support, doing very interesting jobs like crushing white-collar crime under the black high-heeled boot of awesome computer savvy (think Enron). The one thing I recall best from years of hearing stories is that e-litigation is incredibly lucrative. Software, service, and support are incredibly expensive (up to at least 2010 they usually worked as contractors billing high-powered law firms at rates proportional to what the firms themselves bill, and there was actually some fuss in the industry around that time as richer corporations ate up the smaller companies and overtook the industry). The field is also commercially driven and technologically competitive (read: proprietary). My immediate thought, which the report certainly addressed, was to wonder how small and (relatively) poorly funded institutions could acquire the tools for this necessary work. The Council supplied cost charts that were more or less illegible to me, but also summarised that the work is indeed quite cost-prohibitive for most institutions (including service and expert personnel). Amongst the objectives listed at the end, the Council called for speciality tools (i.e. not those built for the legal industry) and promoted sharing orientated more towards technological and methodological kinships.

In addition to costs, the other primary concern I noted when working with archives in digital format was ethics, with specific respect to personal privacy. The issue is an older one, but it is expanding in the digital age in terms of both the preservation and the presentation of information. Kenneth Price centres his discussion of terminology for what we do, and in turn our procedures and practices, around his own Walt Whitman “archive” (or collection, if you like, per his concern over terms and the subsequent discussions in the reading). He raises a key question belonging to archival science about the ethics of collecting and sharing artists’ … well, junk, junk and private objects.

It is a funny sort of question, bound up with our old privileging and near-deification of the author, the making of sacred relics from their miscellany. In archiving as collecting and the structured allowing of access, it is a concern, but the ante is upped in the Digital Age. Would Walt Whitman want his private notes available online? Well, according to a quick glance at an unassigned section of the NINCH Guidelines, the dead have no protection.

Back to the CLIR discussion: there is a rising concern that matters of privacy must be decided clearly and permanently between an institution and the donor of a digital archive before an agreement can be struck, because of the additional capacity of digital media to store every trace and track of a person’s life and the ability of digital forensics to get it out. In the Digital Age, standard concerns about privacy, especially financial and medical, but also limits on social scope, have become increasingly anxiety-producing (FYI, I have access to a hard-drive shredder this week if anyone needs it after reading that article!). I think it will be interesting to see how this actually plays out legally.

An additional layer of the public-and-private question comes up in our readings by Sheila Brennan and T. Mills Kelly, and that by Dan Cohen. Both articles deal with the “crowdsourcing” and presentation of disaster “archives” (Katrina and 9/11 respectively). I think it was Cohen who brought up the matter of collecting and storing memories privately. Apparently, donors of the memory artefacts could in some cases choose to have them made public or simply included. I would actually like more detail on how this works before commenting. Are they private for a set period of time? Or forever, able to be used generally but not directly published?

Even with permission to share, memorialising disaster and trauma comes with a fair amount of ethical baggage. The literature on this subject is endless and rather daunting (and makes me appreciate the fact that I study very, very long-gone people); it deals with the ethics of collating, curating, and presenting the traces of trauma even with participant permissions. This week’s readings had many interesting topics, but I really focused on this question of ethics in historical and archival/”archival” practice and its relationship to our ability to store more, examine at once more deeply and more broadly, and share so very rapidly.

Data Management (Week 11 PR)

(Note to Class: some of the LoC materials are back up as of this morning)

This week’s practicum led me to think a little more carefully about what kinds of data are part of my proposed project for our class, and about what I will actually try to build in the next four weeks. In terms of central data, my project consists at first of ‘other people’s data’ only (geospatial data and metadata about objects), which is brought into my tool to be worked on. There are, however, other kinds of data to deal with: there’s an application to store, the code that drives the site where my tool is accessed. In the greater project I dreamed up, there’s also a forum and other discussion spaces such as news and a FAQ wiki, plus tool development and sharing. The ultimate form of the project includes data management, storage, and resharing of the data for processing through my GIS tool. Drawing on our chapters in the NINCH Guide and on Digital Preservation in a Box, I made a little list of concerns and possible solutions for my data.

- Use advisory boards (people) and systems (tools) for planning data workflow and backup (draw pictures of the data and use calendars to manage regular processes such as backing up information).

- Regular backups need to be made of the living website: forum discussions, blog posts, tools developed and shared, etc. An automated procedure (checked on by a human at scheduled intervals) would be ideal, with the data migrated straight onto backup media according to rules for naming and date-and-time-stamping files. Is it possible to have data archived in an off-site location automatically if both machines are online?
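To answer my own question: yes, so long as both machines are online and reachable, a scheduled job can do the off-site copy unattended. A minimal sketch of the timestamped-archive step (all directory names here are hypothetical placeholders; the off-site leg is shown only as a commented cron/rsync line, with an invented host):

```python
import shutil
import tempfile
import time
from pathlib import Path

# Hypothetical stand-ins for the live site directory and the backup volume.
root = Path(tempfile.mkdtemp())
site = root / "site_data"
backups = root / "backups"
site.mkdir()
backups.mkdir()
(site / "post1.txt").write_text("example forum post")

# Archive the site with a date-and-time stamp in the filename,
# following the naming-and-stamping rule above.
stamp = time.strftime("%Y%m%d-%H%M%S")
archive = shutil.make_archive(str(backups / f"site-{stamp}"), "gztar", site)
print(archive)  # e.g. .../backups/site-20140501-030000.tar.gz

# Off-site replication, when both machines are online, could then be a cron
# entry running rsync over ssh (hypothetical host and paths):
#   0 3 * * * rsync -az /var/backups/site/ backup@offsite.example.org:/srv/backups/
```

The human check at scheduled intervals would then amount to confirming that new stamped archives keep appearing on the off-site machine.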

GIS point data and object metadata:
- Meet metadata standards. This shouldn’t be challenging, since my project consists first of stealing other people’s data and borrowing standards developed by one of my target repositories for additional database items; but we see in the readings how critical standards are for storage and preservation, as well as for working functionality in processing tools. This includes having subject thesauri and authority files; some level of data-input control is also necessary (for this, though, I’ve envisioned nearly complete control through drop-downs).
- Storage of data. Considerations include allowing partial privacy for users of proprietary or protected archaeological data alongside partial open access. A hard drive on which to back up all of the data is advisable, but my main concerns for storage and retrieval in an emergency are a) getting data to users quickly from anywhere (with varying degrees of access to protected data), and b) giving back to the distributed web by reduplicating and resupplying the data to other types of users elsewhere. What I’m thinking about is a second or backup server. Cloud storage also sounds interesting, and a pay-for-storage model on an otherwise free tool would provide a means of purchasing and maintaining one. From the link on the Preservation in a Box page that compares commercial platforms, SpiderOak sounds best so far for cost, storage, and privacy, although they all have some serious downsides, namely being consumer products and lacking support for certain OSes and platforms.
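The drop-down control mentioned in the metadata item above really just means validating every incoming record field against an authority list before it enters the database. A tiny sketch of that idea (the field names and vocabularies are invented for illustration, not taken from any real repository standard):

```python
# Hypothetical controlled vocabularies (authority files) for two metadata fields.
VOCAB = {
    "period": {"Republican", "Augustan", "Imperial"},
    "object_type": {"coin", "inscription", "ceramic"},
}

def validate(record: dict) -> list:
    """Return a list of field errors; an empty list means the record passes."""
    errors = []
    for field, allowed in VOCAB.items():
        value = record.get(field)
        if value not in allowed:
            errors.append(f"{field}: {value!r} not in controlled vocabulary")
    return errors

print(validate({"period": "Augustan", "object_type": "coin"}))  # passes: []
print(validate({"period": "augustan", "object_type": "coin"}))  # case mismatch: one error
```

A web form whose inputs are drop-downs populated from `VOCAB` enforces the same rule at entry time; the server-side check is the backstop for anything arriving by other routes (bulk import, API).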