Graduate Work

  •  

    New Directions in the VMW

    Alison Langmead has embarked on outreach efforts to connect the VMW with other digital humanities spaces, beginning with our colleagues in the US and, we hope, soon extending internationally. We are looking forward to all of the opportunities this will provide, and we envision a future where the question is less "what can computers do for the study of material culture?" and more "what shall we do today?"

    One of these outreach efforts is a new connection with Tracey Berg-Fulton, creative technologist and webmaster for the Registrars Committee of the American Alliance of Museums. Berg-Fulton donated to the VMW a 26,000-entry dataset of Algernon Graves' records of 18th- to 20th-century art sales, digitized from his vast published ledger Art Sales from Early in the Eighteenth Century to Early in the Twentieth Century. In the short run, S. E. Hackney and Lily Brewer are working to render this data as visual patterns and to provide historical contextualization, respectively, for the Sotheby’s Institute of Art Research Award through the Art Libraries Society of North America. Undergraduate research assistant Vee McGyver, under Hackney’s supervision, is exploring how to visualize relationships in the art-sales data using a force-directed graph built with the JavaScript library D3. Frick Fine Arts Library director Kate Joranson is sponsoring these efforts.
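
    For readers curious what that force-directed approach involves, here is a minimal sketch of the idea. The team's actual visualization is being built in the browser with D3; this Python version uses networkx's spring layout (the same Fruchterman-Reingold family of algorithm) purely as an illustration, and the column names in the sample CSV ("seller", "buyer") are hypothetical stand-ins for the Graves fields.

        # Illustrative only: a force-directed view of buyer/seller relationships,
        # prototyped with networkx/matplotlib rather than the d3 library the team
        # is actually using. The CSV column names are hypothetical.
        import csv
        import networkx as nx
        import matplotlib.pyplot as plt

        G = nx.Graph()
        with open("graves_sales.csv", newline="", encoding="utf-8") as f:
            for row in csv.DictReader(f):
                s, b = row["seller"], row["buyer"]
                if G.has_edge(s, b):
                    G[s][b]["weight"] += 1      # repeat dealings strengthen the tie
                else:
                    G.add_edge(s, b, weight=1)

        # spring_layout is networkx's force-directed (Fruchterman-Reingold) layout,
        # the same family of algorithm as d3's forceSimulation.
        pos = nx.spring_layout(G, k=0.15, seed=42)
        nx.draw_networkx(G, pos, node_size=30, font_size=6, width=0.5)
        plt.savefig("graves_network.png", dpi=300)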

    As Graves’ data becomes available and is conceptualized in visually informative ways, we are investigating how its entries can become objects that we can track through Itinera (itinera.pitt.edu). By refining these entities with more geographically specific locations and tracking works of art through Graves' sales, the VMW cohort, under Brewer’s guidance, is working toward diversifying Itinera by mapping the European and non-Western routes of less frequently tracked populations, such as influential women and people of color moving through 18th-century Eastern Europe and Turkey. Attending to multiple scales and modalities of historical vision, we focus on the questions: how can we visualize and generate new insights into the journeys of 18th-century travelers through contemporary identity politics and digital mapping methods? And how can mapping diverse populations across this time and space create meaning through historical place-making?
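
    One small, concrete piece of that work is resolving the place names that appear in the sales records into coordinates that a mapping tool can use. The sketch below is not Itinera's actual ingest pipeline; it simply shows one way this step might be scripted, using the geopy library's Nominatim geocoder, with hypothetical place names.

        # A sketch (not Itinera's actual pipeline) of turning place names from the
        # sales records into coordinates. The place names here are hypothetical.
        from time import sleep

        from geopy.geocoders import Nominatim

        geolocator = Nominatim(user_agent="vmw-itinera-sketch")

        places = ["Warsaw, Poland", "Lviv, Ukraine", "Istanbul, Turkey"]
        for name in places:
            loc = geolocator.geocode(name)
            if loc is not None:
                print(f"{name}: {loc.latitude:.4f}, {loc.longitude:.4f}")
            sleep(1)  # Nominatim's usage policy asks for at most one request per second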

    As the end of the term approaches, the Sustaining MedArt team led by Aisling Quigley continues to unearth and reconstruct the socio-technical history of the website Images of Medieval Art and Architecture (www.medart.pitt.edu). While the digital forensics research has provided helpful insights into the foundations of the site, this work has been arduous: the digital forensics tools are complex and uncooperative, and the dissection of the site itself has revealed a tangle of messy innards. Despite numerous obstacles, however, our team perseveres undaunted! Indeed, the complexities are revelatory in and of themselves, and the data is slowly but surely bringing to light important moments in the website's creation. Following from this work, the team, comprising Quigley, Lindsay Decker (read Decker's reflections on the subject here), and Jedd Hakimi, is discussing and establishing a firm infrastructure for developing a socio-technical digital preservation roadmap.

    Undergraduate researcher Dheeraj K. Jalluri is working, under Brewer's guidance, on a neuroaesthetic research project investigating the neural basis of the aesthetic experience of Abstract Expressionist art. This semester, he is focusing on formulating a method to quantitatively analyze artwork qualities implicated in neuroaesthetic theories, such as symmetry, contrast, and value, using Photoshop. Going forward, he plans to bring the crowdsourcing platform Mechanical Turk and Fourier analysis into the development of a larger research question that best suits these methods.
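
    To give a sense of what "quantitative" might mean here, the sketch below computes rough proxies for value, contrast, and symmetry from a digitized painting. These particular metric definitions (mean luminance, luminance standard deviation, and mirror-image difference) are illustrative assumptions, not the project's established method, which currently runs through Photoshop.

        # Rough, illustrative proxies for "value", "contrast", and "symmetry";
        # these definitions are assumptions, not the project's settled method.
        import numpy as np
        from PIL import Image

        img = np.asarray(Image.open("painting.jpg").convert("L"), dtype=float) / 255.0

        value = img.mean()        # overall lightness, 0 (black) to 1 (white)
        contrast = img.std()      # spread of tones around that mean
        symmetry = 1.0 - np.abs(img - np.fliplr(img)).mean()  # 1.0 = mirror-symmetric

        print(f"value={value:.3f}  contrast={contrast:.3f}  horizontal symmetry={symmetry:.3f}")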

    Decomposing Bodies’ focus for the coming year will be building a unified online collection and corresponding data set for the thousands of Bertillon cards in the collection, and making that data accessible. The historical, physiological, and contextual data contained on these cards is a rich vein for researchers across many fields, and our goal with DB is to begin to make our digitized collection more visible to research communities and to begin building the relationships that will result in future projects and collaborations. These goals manifest in continuing the work of classifying and transcribing the cards, managing their metadata, and creating more robust public-facing representations of the project, under the guidance of project manager S. E. Hackney, and with contributions from the entire VMW cohort. (Read more of Hackney's reflections on the subject here.)

    As an invitation to inter-institutional connection and networking, those interested in our efforts toward constructing bridges to other digital humanities spaces can follow #arthistory on our Digital Humanities Slack (https://t.co/BI1cizC4de) and through our new listserv at https://list.pitt.edu/mailman/listinfo/ddarth.

    Categories: 
    • Current Projects
    • Decomposing Bodies
    • Itinera
    • Sustaining DH
    • Sustaining MedArt
    • Populations
    • Undergraduate Work
    • Graduate Work
    • Spaces
    • VMW
  •  

    Almost There with BitCurator

    For the last couple of weeks on the Sustaining MedArt project, I have focused on compiling what I have learned from comparing MedArt with MedArt-2014, in addition to continuing to work with BitCurator.

    In my last post I excitedly, and perhaps a bit prematurely, announced that I had conquered BitCurator. I have since found that creating a disk image of the hard drive holding MedArt and MedArt-2014 may not be possible: the 2 TB hard drive is too large to be successfully imaged on a computer with only 700 GB of storage left. To work around this problem I tried making a copy of MedArt (which is only 1.8 GB and much easier to work with than the 7.7 GB MedArt-2014), but I have not been able to get BitCurator to read any of the copies I made.
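
    As a rule of thumb, a raw disk image is as large as the source device, not just the space used on it, so the target machine needs at least that much free room. A quick check along the following lines, with hypothetical paths, is worth running before starting an imaging job.

        # Sanity check before imaging: a raw image is as large as the source disk.
        # The target path here is hypothetical.
        import shutil

        SOURCE_SIZE_BYTES = 2 * 1000**4                 # the 2 TB drive holding MedArt
        free = shutil.disk_usage("/home/bcadmin").free  # free space on the workstation

        if free < SOURCE_SIZE_BYTES:
            print(f"Not enough room: need ~{SOURCE_SIZE_BYTES/1e9:.0f} GB, "
                  f"have {free/1e9:.0f} GB free")
        else:
            print("Enough space to attempt a full disk image")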

    However, I was able to simply copy and paste MedArt onto a small 16 GB flash drive and use it to successfully experiment with some of the forensic tools in BitCurator. Using tools like Bulk Extractor to create reports, I was able to see everything that had ever been on the flash drive I was using, even though everything on the drive had been deleted before the copy of MedArt was added. I did find a few interesting things, like email addresses attached to certain files that showed me when the files had been emailed and to whom. But the reports really seemed to focus on the flash drive itself more than on its present contents. So I think I will start moving away from the BitCurator tools that require a disk image, though, granted, there are not very many of those.

    The BitCurator tools that might provide more useful results are FITS and sdhash, which I learned about by talking to Matt Burton in DSS. FITS can extract metadata from MedArt and MedArt-2014, and sdhash can compare data. I'm also going to look into macOS's FileMerge utility, which can compare folders. Hopefully, using these tools, I will be able to get results more relevant to the Sustaining MedArt project.
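
    In the meantime, a folder-to-folder comparison along the lines of FileMerge is easy to script. The sketch below walks the two MedArt snapshots with Python's standard filecmp module and reports files unique to each side and files that exist in both but differ; the directory paths are hypothetical.

        # A lightweight stand-in for a FileMerge-style folder comparison, using
        # Python's filecmp. Directory paths are hypothetical.
        import filecmp

        def report(dc, prefix=""):
            for name in dc.left_only:
                print(f"only in MedArt:      {prefix}{name}")
            for name in dc.right_only:
                print(f"only in MedArt-2014: {prefix}{name}")
            for name in dc.diff_files:
                print(f"changed:             {prefix}{name}")
            for sub, sub_dc in dc.subdirs.items():
                report(sub_dc, prefix + sub + "/")

        report(filecmp.dircmp("/mnt/medart", "/mnt/medart-2014"))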

     

    Categories: 
    • Sustaining DH
    • Sustaining MedArt
    • Graduate Work
    • VMW
  •  

    Structuring Decomposing Bodies

    Working on Decomposing Bodies over the last month and a half has been an exercise in process. Shortly after the start of the semester, the Data (after)Lives show went up, featuring data from DB and some of the physical Bertillon cards, and exploring many of the same ideas that we confront in DB every day. Data (after)Lives was a great way for me to see what Decomposing Bodies is as a concept, but since then, most of my work has been examining and manipulating it as a structure.

    Since DB has changed hands several times over the past few years, a lot of what I have been doing is following the threads of my predecessors, trying to understand their processes, the choices they’ve made, and their relationships to the thousands of image files that truly compose the heart of this project. For every task that needs to happen to construct the dataset around these images, and to make that data available to researchers, there are dozens of tiny tasks that have to take place, from “mark which files have been uploaded to Omeka” and “transcribe the handwriting on the cards into metadata fields” all the way to “defrag the hard drive” and “back everything up.”

    Let me be honest: visual media isn’t actually my area of expertise, or even my research interest. But! The way people collect, label, and organize things is. In case you couldn’t guess, I am a PhD student at the iSchool, rather than in Art History. For me, Decomposing Bodies is an interesting blurring of observing and contributing to how resources get organized and disseminated. I am finding gaps in documentation (what does the tag “pass1c” mean?) and creating my own protocols for the project going forward (it means the “Age”, “Apparent Age”, “Born in”, and “Complexion” fields have been transcribed).
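
    Part of that protocol work is simply writing the conventions down in a machine-readable way, so a tag like “pass1c” never becomes a mystery again. A minimal sketch of what that might look like follows; only “pass1c” and its four fields come from the project, while the second tag and the sample card record are hypothetical illustrations.

        # Record explicitly which metadata fields each transcription-pass tag covers.
        # Only "pass1c" is real; "pass2m" and the sample card are hypothetical.
        PASS_FIELDS = {
            "pass1c": ["Age", "Apparent Age", "Born in", "Complexion"],
            "pass2m": ["Height", "Head Length", "Head Width"],
        }

        def untranscribed_fields(card_tags, all_fields):
            """Return the fields on a card that no completed pass has covered yet."""
            done = {f for tag in card_tags for f in PASS_FIELDS.get(tag, [])}
            return [f for f in all_fields if f not in done]

        card = {"id": "OH-12345", "tags": ["pass1c"]}
        ALL_FIELDS = ["Age", "Apparent Age", "Born in", "Complexion", "Height", "Crime"]
        print(untranscribed_fields(card["tags"], ALL_FIELDS))   # -> ['Height', 'Crime']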

    Everything that the VMW does with DB is in preparation for other people to do something else with it later. We have to try and answer questions about how imaginary potential future researchers will want our data to be formatted, and what kinds of questions they might want to ask. The Data (after)Lives exhibit is the beginning of presenting those questions, and inviting conversation around what it means that these cards exist in the first place. My work, for now, is about making sure that those conversations can continue, and that all the pieces of this project are speaking the same language.

    Categories: 
    • Decomposing Bodies
    • Graduate Work
    • VMW
  • Vitruvian Man

    What does this mean? Let's check the data on that.

    Sustaining MedArt and the Fear of Interdisciplinarity

    As a collaboration between Information Sciences and the History of Art and Architecture, “Sustaining MedArt” is a project conceived to be interdisciplinary—a term which many in my home department of English are happy to invoke when convenient, but one that few want to deal with practically when it means venturing outside the humanities. This is, at least, how I feel as someone working on video games, a cultural artefact caught between quite a few disciplines at the moment, including sociology, computer science, literature, film studies, communication, media studies, and so on. And, while I am happy to tout my own scholarship as interdisciplinary, it is perhaps more accurate to admit that it is mostly only my primary object of study that has a clear interdisciplinary appeal. Coming from a humanities background, I largely treat video games as meaningful “texts,” reading them primarily from the inside out, following a hermeneutic tradition found in literary, art, and film scholarship. While I can appreciate and respect more scientifically oriented scholarly approaches—which I am also happy to incorporate anecdotally into my own work and pedagogy when applicable—I am also aware that, as a humanist, I conceive of scholarship’s purpose quite differently.

    I can imagine that my resistance appears narrow-minded to those who conceive of interdisciplinarity as a benevolent practice for opening lines of communication between scholarly discourses seemingly haphazardly sealed off from one another. (It is true that these disciplinary lines felt particularly inane on those occasions when I believed I had stumbled onto some novel idea, only to discover that I had merely been rehearsing a tired conversation occurring in the department down the hall.) At the same time, though, there are a number of institutional and intellectual reasons for being suspicious of interdisciplinarity when it connotes crossing lines between the humanities and the sciences (social or hard). For starters, humanists are already on guard against what seems to be the increasing corporatization of the university—a system we fear is overly concerned with measurable “outcomes” and determines success through quantitative and data-based correlations. The humanist’s understanding is that much of what we provide students cannot and should not be calculated. Merely consenting to these statistical schemas often runs counter to our ideological skepticism of assessing value in terms of capital productivity.

    This belief is perhaps a function of a broader skepticism of applying scientific empiricism to the abstract categories of human culture, knowledge, and progress. Even as we humanists might maintain methodological rigor in our formal analyses, we do not present that methodology as “scientific.” We resolve to subsist in an era after post-structuralism, post-modernism, and historiography (among other meta-critiques of critiques of critiques) have wreaked havoc on teleological searches for “the truth,” and so the conclusions we reach in our scholarship are self-consciously rhetorical and may even contain traces of the creative abstraction of those primary sources we ostensibly interpret. As I see it, our general goal is to engender an unceasing conversation about what it means to be human by continuously revealing new ways of seeing things. We question any ideological structure that allows us to take any part of our experience for granted as a given. (The central paradox at the heart of this humanist scholarship may be the essential creed that all truth claims can be undermined.)

    So what happens when an avowed humanist like me joins an interdisciplinary project faced with the very practical responsibility of determining a “Socio-Technical Digital Preservation Roadmap” for actual cultural artefacts? Can I allow my research and analysis to actually suggest an entirely practical solution to a very concrete set of concerns about archiving material in the digital age? To be perfectly honest, I don’t know yet, but I am eager to find out. To be fair, maybe this isn’t as big a deal as I seem to be making it—it might merely be another instance of the tension between theory and practice that a humanist faces every day when we have to do things like teach, write, and administrate. However lofty our scholarship may be, our very participation in a functioning world is premised on the fact that we can act as if we are oriented toward something aligned with a notion of progress.

    Anyway, as an opening post, I hope I have expressed some of my initial thoughts about my participation in this project. In my next post, I’ll provide a more specific look into the various ways we can define the MedArt website as the central object of our concern, and the ramifications stemming from those respective definitions.

    Categories: 
    • Sustaining DH
    • Sustaining MedArt
    • Graduate Work
  • A screen grab of MedArt ca. December 1996, c/o the Internet Archive.

    Update: Phase III of Sustaining MedArt

    It seems that autumn is finally here, more or less. It is a splendid season, but also perhaps the most hectic in academia-land. New students arrive, conference abstracts and grant proposals are due, and time seems to accelerate and contract alarmingly (or so it feels, as I get older).

    The Visual Media Workshop has expanded to include eight student researchers (from undergraduates to doctoral candidates), each arriving with their own skills and experiences and their own unique roles in our various projects.

    This post will focus specifically on Sustaining MedArt, our lab project funded by a Research & Development Grant from the division of Preservation and Access at the NEH. This project takes Images of Medieval Art & Architecture, a valuable scholarly resource and early instantiation of a digital humanities project, as a case study for exploring the correlation between usability and sustainability of digital content. Our research will culminate in the creation of a Socio-Technical Digital Preservation Roadmap with broad applicability to digital humanities projects.

    As of September, we have entered the third phase of our research, having successfully completed and analyzed initial user studies. For those who are curious about how a work plan might evolve for a year-long project, I’ve described our phases below:

    May 2016
    • conducted over 100 on-site, face-to-face interviews at the 51st International Congress of Medieval Studies in Kalamazoo, Michigan
    research team: Sarah Conell, Kiana Gonzalez, Alison Langmead, Jackie Lombard, and Aisling Quigley
    Summer 2016
    • transcribed and analyzed interviews from Kalamazoo
    • used grounded theory to extract phenomena from these interviews
    • began digital forensics work on the site (extracting file trees, etc.)
    research team: Kiana Gonzalez, Chelsea Gunn, Alison Langmead, and Aisling Quigley
    Fall 2016
    • develop the theoretical foundations for our project
    • continue digital forensics work with BitCurator
    • interview Dr. Alison Stones and early contributors to the site
    • submit paper abstracts for conferences
    research team: Lindsay Decker, Jedd Hakimi, Alison Langmead, and Aisling Quigley

    In this third phase, we are benefitting greatly from the significant contributions of Jedd Hakimi, doctoral candidate in Film Studies, who is creating an in-depth bibliography and developing the foundational underpinnings for our work. Lindsay Decker, MLIS student at the iSchool, is valiantly diving into BitCurator, battling with its many quirks, and becoming our in-house expert on digital forensics.

    Initial findings from our research thus far include the following discoveries:

    1. many scholars implicitly trust the authenticity and reliability of the content on the Images of Medieval Art & Architecture site because of its obvious association with an academic institution (expressed through the ".edu" in the site's URL) and the presence of Dr. Alison Stones's name on the homepage (a known and respected figure in the history of medieval art and architecture)
    2. many scholars express concern, embarrassment, shame, or guilt about the fact that they resort to Google for image searches, because they generally distrust the authenticity of the information they discover there, or cannot find attributions for this content
    3. many assume that a search bar would improve the website. We've found evidence that the initial site creators and contributors experimented with a search feature in the early days of the website, but no evidence that it was ever implemented; it is absent from the December 1996 screenshot.

    I will post further updates in the not-too-distant future, and my fellow teammates are also contributing to the site with their own blog entries. Stay tuned!

    Categories: 
    • Sustaining MedArt
    • Graduate Work
    • VMW
  •  

    Getting to know everything about MedArt

    It has been a busy two weeks on the MedArt project. I am thrilled to announce that I have conquered BitCurator, and it is up and running in the VMW. However, that is a fairly recent development, so I have not had the opportunity to use it much yet. I did manage to get it working last week on my personal laptop. My initial sense of victory was quickly dashed when I began imaging the MedArt files and saw the time remaining slowly counting down from 68 hours. My five-year-old PC was clearly running BitCurator under protest. Hopefully, the Macs in the VMW will be better equipped to handle this kind of workload.

    When not grappling with BitCurator, I have continued to familiarize myself with everything we have on MedArt. I have been going through the MedArt folders we have saved on a hard drive in search of any anomalies, and any differences between the pre-2014 folder and the folder that contains the current version of MedArt. One of the more interesting features is an early version of a search box. As far as we know, the search box was never implemented. It wasn't really one box but two: in the first, the user was to select the physical location of the architectural site they were looking for, and in the second, the type of building, such as "church" or "cathedral." There was also evidence of MedArt being used as a classroom tool in a folder titled "grant." This folder contained a series of online quizzes on architectural features dated to 1997. I am interested to find out to what extent MedArt was originally intended to be used in the classroom.

    To supplement my understanding of the ways in which MedArt has changed over the years, I also used the Internet Archive. The earliest snapshot of MedArt on the Wayback Machine was from Dec. 22, 1996, and the latest was from this past June. By looking at these snapshots and the many in between, I was able to get a clear view of how much the site has changed over the past 20 years. I isolated five dates that demonstrate the biggest changes to the MedArt site; these five images will illustrate the evolution of MedArt.
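
    For anyone who wants to do the same survey programmatically rather than paging through the Wayback Machine's calendar view, the Internet Archive exposes its capture index through a public CDX API. The sketch below lists MedArt's captures, collapsed to those whose content actually changed; the parameter names follow the API's documentation, but treat this as a starting point rather than a finished tool.

        # List MedArt captures via the Internet Archive's CDX API, keeping only
        # captures whose content changed. A sketch, not a finished tool.
        import requests

        resp = requests.get(
            "http://web.archive.org/cdx/search/cdx",
            params={
                "url": "www.medart.pitt.edu",
                "output": "json",
                "fl": "timestamp,digest",
                "collapse": "digest",
            },
            timeout=30,
        )
        rows = resp.json()
        for timestamp, digest in rows[1:]:   # first row is the header
            print(timestamp)                 # e.g. 19961222... for the Dec. 22, 1996 snapshot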

    I also did some reading to further my understanding of digital forensics. In fact, the entire Sustaining MedArt team read chapter three of Mechanisms: New Media and the Forensic Imagination by Matthew Kirschenbaum, called "'An Old House with Many Rooms': The Textual Forensics of Mystery_House.dsk." In this chapter, Kirschenbaum describes his experience analyzing a computer game from the 1980s called Mystery House. He used a disk image of the game, which acts as a representation of all the information on the disk at a certain time (Kirschenbaum 2008, 114). This is similar to what I will be doing with MedArt in BitCurator. Kirschenbaum also highlighted the importance of placing the game in the context of its time. On the Mystery House disk he found evidence that it had once housed two other computer games from earlier in the '80s, and he concluded that the user of the disk must have overwritten those games with Mystery House because they had nowhere else to store it and had finished the other two (Kirschenbaum 2008, 127). The exercise of placing files in the context of their time will be an important strategy for the MedArt project, which began in 1995. The internet then was very different from what we use today, and it is important that we do not forget that going forward.

    One way I have become better acquainted with the internet of the mid-90s has been by investigating databases similar to MedArt that were established around the same time. Some of the sites I found were the Index of Christian Art, the Early Modern Women Database, the Emily Dickinson Archive, and the Valley of the Shadow. Each site has stood the test of time differently. The Index of Christian Art recently put out a survey and is currently implementing the changes suggested in the responses. The Valley of the Shadow and the Emily Dickinson Archive have recently undergone updates. However, the Early Modern Women Database is no longer being maintained by the University of Maryland, and hasn't been since 2012. I also used the Internet Archive to look at these databases before they underwent updates, and found sites very similar to MedArt. I was not able to find anything on the Internet Archive regarding the Early Modern Women Database other than snapshots of a screen that reads "URL not found." I'm excited to see where we take MedArt as we proceed with this project.

    Kirschenbaum, Matthew G. 2008. Mechanisms: New Media and the Forensic Imagination. Cambridge, MA: MIT Press.

    Categories: 
    • Sustaining MedArt
    • Graduate Work
  •  

    A Whole New World: My Introduction to MedArt

    This week marks the end of my second official week working in the Visual Media Workshop on the Sustaining MedArt project. It's been an interesting transition from the physical forensics I performed as an archaeology major in undergrad to learning about digital forensics as an MLIS student. I've found that the excitement of spotting discrepancies in file trees, extracting themes from transcribed interviews, and scoring small successes with unfamiliar software is comparable to that of finding potsherds on a dig site. I'm eager to delve into the realm of digital forensics on the MedArt site to see if I can't find any old foundations or skeletons hidden in its data.

    I began my assistantship with the Sustaining MedArt project by familiarizing myself with the work done over the summer and the requirements imposed by the grant itself. I'm thrilled to participate in the exploration and preservation of a pioneering medieval art & architecture website that is very nearly as old as I am. I also majored in history and minored in medieval studies as an undergrad, and I never came across a site like MedArt. It's been interesting to compare its functions to the sites and databases I've come across in the past, primarily ArtStor. Once I had brought myself up to speed on the goals of Sustaining MedArt, I started analyzing the transcriptions of the interviews from Kalamazoo in search of any themes that hadn't yet been found. One thing I found particularly interesting was the contradictions between interviewees: some complained that one can't use MedArt without knowing exactly what one is looking for, because of the lack of a search feature, while a similar number complained that one can't use Google without knowing exactly what one is looking for, as it provides too many images per search. This is an intriguing contradiction of viewpoints that I hope to look into more later.

    When I first started working here I was also tasked with downloading and familiarizing myself with BitCurator, a digital forensics environment. I began by trying to run it as a virtual machine: I downloaded and installed VirtualBox on my PC, downloaded and unpacked BitCurator using 7-Zip, and was devastated when it failed to run. Keep in mind that, because my journey into computer literacy began very recently, each of these steps took a couple of attempts, multiple explanations garnered from the internet, and the occasional YouTube video. After successfully enabling my computer's virtualization capabilities, which proved unproductive, I ran out of ideas. After a failed attempt with Aisling to install BitCurator as a virtual machine on one of the Macs in the VMW, following another unsuccessful attempt to install it from an ISO, it's been moved to the back burner while we wait for advice from the Digital Scholarship Lab. After two weeks of trying to make BitCurator work, I'm starting to take it personally. The saga of Lindsay vs. BitCurator will continue.

    In light of the free time I've gained by temporarily ending my battle with BitCurator, I've started doing some reading on digital forensics. I'm excited to explore the literature on the topic further and to begin applying my newfound knowledge to MedArt. I have also been comparing the file trees of the 2014 update of MedArt to the preceding version, and have found some interesting changes. The 2014 version made some important corrections, like eliminating two folders that for some reason repeated indefinitely in the older version. It also added significant contributions to the glossary, bibliography, and citations folders. Within the 2014 version I've also found three different lists containing folders for English and French cities. The first list seems to contain all the folders in the second and third lists, though each of those lists is missing some of the folders from the first. I will be very interested to find out why some of these changes were made once we start applying digital forensics.
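
    Checking that relationship among the three city lists is the kind of small task that is easy to script. The sketch below, with hypothetical directory paths, uses set operations to confirm which folders each of the shorter lists is missing and whether either contains anything the first does not.

        # Compare the three city-folder lists with set operations.
        # Directory paths are hypothetical.
        import os

        def folder_names(path):
            return {name for name in os.listdir(path)
                    if os.path.isdir(os.path.join(path, name))}

        list1 = folder_names("/mnt/medart-2014/cities-list1")
        list2 = folder_names("/mnt/medart-2014/cities-list2")
        list3 = folder_names("/mnt/medart-2014/cities-list3")

        print("in list2 but not list1:", sorted(list2 - list1))
        print("in list3 but not list1:", sorted(list3 - list1))
        print("missing from list2:", sorted(list1 - list2))
        print("missing from list3:", sorted(list1 - list3))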

    Next week I intend to do some more reading and start exploring the hard drive where MedArt is stored. Wish me luck!

    Categories: 
    • Sustaining MedArt
    • Graduate Work
  • Exhibition poster, designed by Aisling Quigley

    Data (after)Lives opens tomorrow!!

    Opening Event: Thursday, September 8th, 4-6pm

    This exhibition incorporates the work and research of Rich Pell (Curator at the Center for PostNatural History), Paul Vanouse, Steve Rowell, Aaron Henderson, and Heather Dewey-Hagborg. Paulina Pardo Gaviria also reinterprets the work of Letícia Parente (1930-1991). The exhibition is co-curated by Dr. Alison Langmead, Dr. Josh Ellenbogen, and Isabelle Chartier, with design associates Aisling Quigley and Jennifer Donnelly.

    Categories: 
    • Decomposing Bodies
    • Graduate Work
    • Faculty Work
    • UAG
  •  

    Is a Search Engine Necessary [to MedArt]?

    A major trend that the Sustaining MedArt research team found in the interviews conducted in Kalamazoo on the usability of the site was the assumption that a search engine would be useful or is necessary; many of the interviewees noted that it would be easier to navigate MedArt if there were a search bar or some kind of search function on the site. This trend prompted an interesting inquiry into users' experiences with search engines.

    One of the articles that I read, “Private Power, Public Interest: An examination of search engine accountability,” discusses information as a “critical commodity” of our modern society, and search engines as shaping the internet user’s experience by emerging as “managers of information, organizing and categorizing content in a coherent, accessible manner” (Laidlaw 2008, 113). This was an interesting point, and I started to think about our trend in relation to this claim. Many of the interviewees would try to navigate through the site, and almost every single one of them was able to find “Canterbury Cathedral” when we asked them to. Most participants marked this task as “extremely easy,” even though some of them said it would be easier if there were a search function on the site. This phenomenon may itself support Laidlaw’s argument that search engines have shaped how internet users experience the web and how they expect to find the information they need. If they have to find information in an unexpected way, they yearn for the ease of that search bar or search engine. In MedArt’s case, the unexpected way to find information was to follow a path by clicking the correct alphabetical symbols and finding information by manually “searching” a list of possible choices with your eyes and your own stored background knowledge about the information being sought.

    Laidlaw argues that search engines owe a public interest duty because they control our informational experience (Laidlaw 2008, 123). She relays that “[b]y controlling the structure of how information is accessed, search engines control the information flow. Without more, this might not be as consequential, however, search engines are now the portals through which the information on the Internet is experienced. They are seen as authoritative and reliable, and shape public opinion and meaning.” Much as the subject headings that the Library of Congress creates to categorize world knowledge shape our ideas of how that knowledge should be structured, Laidlaw argues that search engines do much the same thing. She does discuss, however, the difficulties surrounding imposing a public interest duty on search engines, one of the most pressing being whether search engines can stay neutral and organize information in an unbiased fashion (Laidlaw 2008, 126-128).

    Overall, this trend of assuming that a search engine or search bar would be useful or is necessary is one that should, and will, be explored in more depth. One of the starting questions for the Sustaining MedArt team could be: "Is a search engine necessary on the MedArt website?"

    I know that I have barely even scratched the surface here. All that I mean this to be is food for thought until the Sustaining MedArt team takes the research further with some of the trends found in the interviews.  

    Citation:

    Laidlaw, Emily B. 2008. "Private Power, Public Interest: An examination of search engine accountability." International Journal of Law and Information Technology 17 (1): 113-45.

    __________________________________________________________________________________

    On a more personal note:

    As my time at the VMW working on the bibliography for Sustaining MedArt and transcribing/coding interviews comes to an end, it is important that I reflect on the work I’ve done this semester and how it has shaped my experience and notion of research. I’ve learned more specifically about some big concepts, like grounded theory, and a little bit about coding interviews. I’ve learned that it is entirely possible to construct a bibliography of resources on a project and on topics that I was not very familiar with when starting out. As someone who has just taken on a position as Visiting Fine Arts Librarian in the Frick Fine Arts Library, I find this gives me hope for my future “search-and-seize” efforts (as one of my library school instructors called the practice of retrieving information). My work here has exposed me to ways of approaching research, such as grounded theory, that make a lot of sense to me and that can be rewarding in unexpected ways precisely because of the nature of that approach. Overall, being exposed through this work to approaches I was unfamiliar with before has been very rewarding.

    Categories: 
    • Sustaining MedArt
    • Graduate Work
    • VMW
  •  

    Learning About Grounded Theory

    While reading up on grounded theory I was having trouble coming up with a “grounded theory for dummies” definition that would help me with my own understanding of it. I started by reading articles by others who had tested grounded theory, used it in their work, and had come up with their own methods for applying grounded theory to their work. Everyone I was reading was starting with grounded theory but many were using different methods and achieving different outcomes. I ended up facing frustration and decided that I needed to begin again, but this time at the source. So I started reading Glaser and Strauss’s 1967 explanation of the theory that they had “discovered,” and it’s actually easy enough to understand right from their first chapter. They describe grounded theory as the “discovery of theory from data” (Glaser and Strauss 1973, 1).

    Basically:

    Analyze the data you’re working with and extract a theory, or multiple theories, from it.

    Don’t try to prove some theory with the data you have; analyze and compare that data in order to generate theories (Glaser and Strauss 1973, 2).  

    I also really enjoyed Roy Suddaby’s article “What Grounded Theory is Not” (Suddaby 2006). He explains that what Glaser and Strauss were trying to do was fight against the presumption that the subject matter dealt with by the social sciences and the natural sciences is all the same, a distinction we now draw between the qualitative and the quantitative (Suddaby 2006, 633). By focusing on the importance of interpretive work, Glaser and Strauss revolutionized the way that researchers could deal with and react to the qualitative data they are working with (Suddaby 2006, 633). Suddaby’s article walks through the big misconceptions that most people have about grounded theory, which I found quite helpful to read:

    - “Grounded theory is not an excuse to ignore the literature”: It’s important to have some background knowledge about the data and the field you’re doing research in (Suddaby 2006, 634).

    - “Grounded theory is not presentation of raw data”: The data should be digested and theories developed—interviews are a good example; you shouldn’t just present the transcripts, you should analyze them, find common themes, and abstract the experiences into theoretical statements (Suddaby 2006, 635). 

    - “Grounded theory is not theory testing, content analysis, or word counts”: “[t]he purpose of grounded theory is not to make truth statements about reality, but, rather, to elicit fresh understandings about patterned relationships between social actors and how these relationships and interactions actively construct reality” (Suddaby 2006, 636). This doesn’t mean you can’t use mixed methods; it’s just that you’re not starting your inquiry with a hypothesis or a theory, you’re looking for that theory in the data (Suddaby 2006, 636-637).

    - “Grounded theory is not simply routine application of formulaic technique to data”: For example, if you code your interviews but don’t apply any subjective interpretation to them, or if you simply let software analyze your data without interpreting it, or if you present your data or a routine “mechanical” analysis of it without “creative insight,” then this is not grounded theory (Suddaby 2006, 637-638).

    - “Grounded theory is not perfect”: This one is self-explanatory. Some people try to make their grounded theory methods perfect by creating rigid rules and guidelines, but social processes are very complicated, and claiming, for example, that you recorded just the right number of interviews for a study, or collected exactly the right amount of data for it, can be faulty because that is a very subjective assumption to make (Suddaby 2006, 638-639).

    - “Grounded theory is not easy”: A great grounded theory study is “the product of considerable experience, hard work, creativity and, occasionally, a healthy dose of good luck” (Suddaby 2006, 639).

    - “Grounded theory is not an excuse for the absence of a methodology”: Your methods for collecting, analyzing, and drawing theories from that data should be clear in the presentation of your research (Suddaby 2006, 640).

    After completing some of this reading, and analyzing some of the interviews we recorded for the Sustaining MedArt project, I definitely agree with Suddaby that grounded theory is not easy, but I feel like the definition of what it is and how it works is now much clearer and easier to understand.

    Works Cited

    Glaser, Barney G., and Anselm L. Strauss. 1973. The Discovery of Grounded Theory: Strategies for Qualitative Research. Chicago: Aldine Publishing Company.

    Suddaby, Roy. 2006. "From the Editors: What Grounded Theory Is Not." Academy of Management Journal 49 (4): 633-42.

    Categories: 
    • Current Projects
    • Sustaining MedArt
    • Graduate Work
    • VMW
