Live-blogging the 2009 Vancouver PKP Conference

Questioning “Accessibility”, Conceptualizing Diversity, and Practising Inclusion: The Session Blog

Date: July 9, 2009


Presenter: Michael Felczak, PhD student, School of Communication, Simon Fraser University. Online editor for the Canadian Journal of Communication, a researcher at the Centre for Policy Research on Science and Technology and at the Applied Communication Technology Lab at SFU, as well as researcher and developer for PKP.

Session Overview

Abstract

Presentation

Michael set the stage for his presentation with the following quote, framing it in terms of today's much-expanded use of the internet:

“The power of the web is in its universality. Access by everyone regardless of disability is an essential aspect.” – Tim Berners-Lee, inventor of the World Wide Web.

The Web Content Accessibility Guidelines were reviewed. The main idea is to provide text alternatives for non-text content, which enables people using assistive devices to access that content.

He focused on video in particular because it is so widely used today. Under the guidelines, for example, a text transcript accompanying a video would be satisfactory. But restricting our understanding of accessibility to disability alone doesn't really address the needs of other users online.

What about users with an internet connection one day, but not the next? Being able to download the video would still allow access when there is no connection.

What about users with expensive internet connections? Paying for each megabyte means every visit to your site costs them money.

What about users with slow or unstable connections? If they could download and save it offline, they could view it later.

What about users relying on mobile devices, whether in developing countries or students and faculty using the latest gadgets? Flash support is starting to appear on some devices but not others, and Apple isn't interested in supporting it.

These examples show we need to broaden our conception of accessibility to be more socially inclusive.

We can improve if we:
1. provide direct download options

2. offer multiple file formats for Windows, Apple and Linux

3. offer high- and low-resolution files for download. Content shouldn't start downloading until the user clicks play. [In some cases, it begins downloading right away, tying up bandwidth.]

Linux
– a third of all Dell notebooks run Linux
– developing countries are using Linux
– non-profits provide free PCs running Linux

We should also license audio/video under Creative Commons, allowing subtitles or translations into other languages and local redistribution on other media.

Displaying the CC symbol and using CC licenses are encouraged.

Supporting all of this costs the online publisher some time and money. Windows, Apple and Linux each require their own formats and tools, but each time you publish audio or video you can convert the files to other formats quite easily. SUPER, by eRightSoft, is a popular tool for converting files. See the slides online for the various tools, formats and details.
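As a rough sketch of the workflow above, the script below builds conversion commands for multiple formats and resolutions, always starting from the original master file. It assumes the free ffmpeg command-line tool rather than SUPER (which was the tool actually recommended); the codec choices and file names are illustrative, not from the presentation.

```python
# Sketch: build conversion commands for multiple formats and resolutions,
# always starting from the original master file to avoid quality loss.
# Assumes the free ffmpeg command-line tool; codec names are illustrative.

FORMATS = {
    "mp4": ["-c:v", "libx264", "-c:a", "aac"],          # plays on Windows/Apple
    "ogv": ["-c:v", "libtheora", "-c:a", "libvorbis"],  # open format for Linux
}
RESOLUTIONS = {"high": "1280x720", "low": "640x360"}

def conversion_commands(master, basename):
    """Return one ffmpeg command per format/resolution pair."""
    commands = []
    for ext, codecs in FORMATS.items():
        for label, size in RESOLUTIONS.items():
            out = f"{basename}-{label}.{ext}"
            commands.append(["ffmpeg", "-i", master, "-s", size, *codecs, out])
    return commands

# Each command reads the original master, so repeated conversions never
# re-compress an already-compressed copy.
for cmd in conversion_commands("lecture-master.avi", "lecture"):
    print(" ".join(cmd))
```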

Final notes:
– start with the original each time to convert files and avoid loss of quality
– the higher the resolution, the higher the file size
– similar problems and principles apply to audio
– many free resources and guides are available online

Audience Discussion and Questions:

[Please note that questions and answers have been paraphrased]

1. Are there no standards that apply for video and audio? In theory, yes, but in practice, Windows wants their standards, Apple wants theirs, etc. It comes down to this: as a publisher, can you do a little extra work, or do you want to force your readers to do the work of finding software that can play your files? I think it makes sense for the publisher to do a little extra work and save readers the time.

2. <Question not heard> In North America, we assume everyone connects the same way and we design tools to interact with content in that context. We need to rethink this.

3. Comment: As someone who came from performing arts, I did quite a bit of research on voice description, for example. A low-vision patron viewing the video can still access the content if someone is describing movement, etc. This could be included.

4. Comment: I think it's very healthy to revisit everything that is done so that it improves. Accessibility issues are very important, and growing mobile access and access in indigenous areas make them more so. It's more than providing tools; it's showing people how to use them, and how to use them safely, and that's a daunting task that needs a large team. I'm glad to be here and to see this room full of people discussing an important aspect: allowing people of different cognitive abilities and others to access and understand content. Thank you.

5. Do you have any resources for simple language symbols, in terms of translating a text so that it’s more readable for a more visual learner? The blogger suggested that resources requested may be available from Dr. Rose at CAST, the Universal Design for Learning website listed below.

Presentation link: to be added by conference organizers.

Related Links:

Web Content Accessibility Guidelines

Converter software from SUPER

Universal Design for Learning, CAST

July 9, 2009   Comments Off on Questioning “Accessibility”, Conceptualizing Diversity, and Practising Inclusion: The Session Blog

PKP Open Archives Harvester for the Veterinarian Academic Community: The Session Blog

Date: July 9, 2009

Presenters: Astrid van Wesenbeeck and Martin van Luijt – Utrecht University

PKP 2009

Photo taken at PKP 2009, with permission

Astrid van Wesenbeeck is Publishing Advisor for Igitur, Utrecht University Library
Martin van Luijt is the Head of Innovation and Development, Utrecht University Library

Abstract

Presentation:

Powerpoint presentation used with permission of Martin van Luijt

Quote: “We always want to work with our clients. The contributions from our users are very important to us.”

Session Overview


The University Library is 425 years old this year. Though librarians are not themselves scientists or students, they have a mission to provide services that meet the needs of their clients. Omega, their integrated search, brings in metadata from publishers and open access sources and indexes it.

Features discussed included the institutional repository, digitization and journals [mostly open and digital, totalling about 10,000 digitized archives].

Virtual Knowledge Centers [see related link below]

– this is the area of their most recent work
– shifts knowledge sharing from library to centers
– see slides of this presentation for more detail

The Problem They Saw:

We all have open access repositories now. How do you find what you need? There are too many repositories for a researcher to find information.

The Scenario

They chose to address this problem by targeting the needs of a specific group of users. The motivation – a one-stop shop for users and increased visibility for scientists.

The Solution:

Build an open-access subject repository, targeted at veterinarians,  containing the content of at least 5 high-profile veterinarian institutions and meeting other selected standards.

It was organized by cooperating to create a project board and a project team consisting of knowledge specialists and other essential people. The user interface was shaped by the users.

Their Findings:

Searching alone was not sufficient; the repository content was, to use his word, "Ouch!" Metadata quality varied wildly, relevant material was hard to discern, some content was not accessible, and repositories held low quantities of material.

Ingredients Needed:

A harvester to fetch content from open archives.

Ingredients Needed 2:

Fetch more content from many more archives through a harvester, normalize each archive, filter the records through a 2,000+ keyword filter, and store the results as records and entries. This resulted in 700,000+ objects.
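The harvest-normalize-filter pipeline described above can be sketched as follows. The record fields and the (tiny) keyword list here are illustrative assumptions, not the actual Utrecht implementation, which used a 2,000+ term list:

```python
# Sketch of the harvest-and-filter pipeline: normalize each harvested
# record's metadata, then keep only records matching a subject keyword list.
# Fields and keywords are illustrative; the real filter held 2,000+ terms.

VET_KEYWORDS = {"veterinary", "equine", "bovine", "zoonosis"}

def normalize(record):
    """Lower-case and strip free-text fields so matching is uniform."""
    return {k: v.strip().lower() if isinstance(v, str) else v
            for k, v in record.items()}

def keyword_filter(records, keywords=VET_KEYWORDS):
    """Yield normalized records whose title or abstract hits any keyword."""
    for rec in map(normalize, records):
        text = f"{rec.get('title', '')} {rec.get('abstract', '')}"
        if any(kw in text for kw in keywords):
            yield rec

harvested = [
    {"title": "Equine influenza outbreaks ", "abstract": "A field survey."},
    {"title": "Medieval poetry", "abstract": "Unrelated humanities work."},
]
kept = list(keyword_filter(harvested))
print(len(kept))  # only the veterinary record survives
```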

Ingredients 3:

Use the harvester and the filter, then develop a search engine and, finally, a user interface.

Problem: The users wanted a search history, which pushed the designers into dreaming up a way of doing that without a login. As designers, they did not want or need a login, but at first saw no way around one in order to connect the history to the user. Further discussion revealed that the users did not have a problem with a system where the history did not follow them from computer to computer. A surprise to the designers, but it allowed for a login-free system.
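A login-free history of this kind can be sketched by keying the history to an anonymous per-machine token (in a web application, a browser cookie) rather than to a user account. This is a minimal illustration of the idea, not the Utrecht system's actual code:

```python
# Sketch: a login-free search history keyed by an anonymous per-machine
# token (in practice a browser cookie) instead of a user account. The
# history therefore does not follow a user between computers, which the
# users said was acceptable.

import secrets

class SearchHistory:
    def __init__(self):
        self._store = {}  # token -> list of queries

    def new_token(self):
        """Issue an anonymous token; a web app would set this as a cookie."""
        token = secrets.token_hex(8)
        self._store[token] = []
        return token

    def record(self, token, query):
        self._store.setdefault(token, []).append(query)

    def history(self, token):
        return list(self._store.get(token, []))

hist = SearchHistory()
machine_a = hist.new_token()
hist.record(machine_a, "bovine tuberculosis")
print(hist.history(machine_a))
```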

Results: Much better research. Connected repositories: Cornell, DOAJ, Glasgow, Igitur, etc.

Workshop Discussion and Questions:

1. How do you design an intelligent filter for searches? [from a gentleman also working on a similar search engine] Re-harvesting occurs every night, with the PKP harvester rerunning objects through the filter. Incremental harvests are quick. Full harvests take a long time, a couple of weeks, so they try not to do them.

2. Do you use the PKP harvester and normalization tools in PKP? We started, but found that we needed to do more and produced a tool outside the harvester.

3. <Question not heard> It was the goal to find more partners to build the tool and its features. We failed. In the evaluation phase, we will decide if this is the right moment to roll out this tool. From a technical viewpoint, it is too early. We may need 1 to 2 years to fill the repositories. If you are interested in starting your own, we would be delighted to talk to you.

4. I’m interested in developing a journal. Of all your repositories, do you use persistent identifiers? How do I know that years down the road I will still find these things? Is anyone interested in developing image repositories? There is a Netherlands initiative to build a repository with persistent identifiers. What about image repositories? No. There are image platforms.

5. Attendee comment: I’m from the UK. If valuable, we’ll have to fight to protect these systems because of budget cuts and the publishers fighting. So, to keep value, we’ll have to convince government about it.

Related Links:

OAI6 talk on Virtual Knowledge Centers

University Library at Utrecht

Online Journal

Open access interview

http://www.igitur.nl

http://www.darenet.nl

http://www.surf.nl

http://www.openarchives.org

NARCIS, a Dutch repository of theses

First Monday article

Posted by Jim Batchelor, time, date


On Open Humanities Press: A Panel Presentation by Members of the OHP Steering Group: The Session Blog

July 9, 9:30AM – Fletcher Challenge Room 1900

Presenters

Barbara Cohen, Director of Humanitech, University of California, Irvine.  Steering Group, The Open Humanities Press.

Gary Hall, Professor, Media and Performing Arts, Coventry University, UK.  Co-founder of The Open Humanities Press.

Session Abstract

Archived video stream of session

Background

Launched in May, 2008, The Open Humanities Press (OHP) is a scholar-led open access publishing initiative that currently publishes 10 journals.  Central to OHP’s vision are goals articulated by the Budapest Open Access declaration (2002) to remove barriers to scholarly literature, accelerate research, enrich education and share the learning of the rich world with the poor.

Session Overview


Photo: J. Miller - PKP Conference

Barbara Cohen started the session with what she called “Open Access 101”, a quick survey of some basic principles and recent initiatives focused on ideas of giving free and open access to peer-reviewed scholarly literature on the Internet.  This background is important context to consider in relation to the principles and goals driving The Open Humanities Press (OHP), an open access publishing house that launched in 2008 with 7 journals (now 10).  Central to OHP’s vision are goals articulated by the Budapest Open Access declaration (2002) to remove barriers to scholarly literature, accelerate research, enrich education and share the learning of the rich world with the poor.  Cohen went on to note that despite the fact that most scholars prefer to read electronic copies of articles, the Internet is still perceived by many Humanities scholars as being an unsuitable publishing medium for serious humanities research.  In a 2008 talk at Irvine, Sigi Jöttkandt, one of the co-founders of the OHP, characterized this perception where the Internet was seen as “a sort of open free-for-all of publishing” medium in stark contrast to trusted, peer-reviewed paper-based scholarly journals.  The OHP was envisioned as a means to overcome this perception by bringing high-quality editorial standards and design processes to the field of Humanities scholarly publishing on the Internet.  It was essential for the founders of the OHP that scholars felt that its journals would be good places to publish.  The founders of the OHP feel that their strategy of developing an open access publishing house has been a good way to gain the trust of the scholarly community in the humanities.

Cohen described OHP’s key goals as: advocating Open Access in the Humanities; fostering a community of prestigious Humanities scholars; promoting intellectual diversity, and exploring new forms of scholarly collaboration.  A strong peer review model was seen to be key to the success of the OHP in developing a level of credibility and trust amongst Humanist scholars, and to that end, the OHP has gathered a prestigious, rotating editorial board, as well as a strong steering group, all without any operating budget.  The OHP also has worked to bring open access content to its readers in journals that share high production values and effective leveraging of new technologies such as PKP’s Open Journal Systems and, in the future, PKP’s Open Monograph Press software.

Gary Hall


Photo: J. Miller - PKP Conference

The second speaker, Gary Hall, picked up on the importance of open access initiatives with books and monographs.  Such initiatives were particularly significant in the Humanities because scholars in these disciplines place such emphasis on books over journal articles.   Hall discussed several book projects that are underway with The OHP, describing these efforts as focused upon a new cultural studies project: liquid books with a fluid structure, up to the challenge of exploring the potential shape of the book to come.  The first of these books has been published as New Cultural Studies: The Liquid Theory Reader.  Hall was particularly interested in the potential of experimental projects that would allow scholars to challenge traditional concepts of the codex by expanding to include the range of materials/media found within printed books: excerpts, snippets of media, clips from multimodal texts.  Such an exploration is an important response to the emerging landscape for digital texts, a landscape influenced by the proliferation of books scanned by Google and reading devices from iPods to Kindles.

Hall also indicated his interest in creative ways to employ open access and open editing strategies in liquid books that were free for anyone to read, write, remix, and reinvent to produce alternate parallel versions of books.  Such acts of distributed writing and editing within liquid texts would, Hall hoped, raise critical challenges to traditional notions of authorship, intellectual property, authority, etc.  This potential dismantling of the authority of the text was a particular challenge for open access initiatives, as they ran the risk of reinscribing and reproducing traditional approaches and limits of current knowledge production, this time in an electronic space. Drawing upon Derrida, Foucault and Barthes, Hall offered that open access could bring interrogations of academic authorship so as to loosen up these notions, making them less fixed and rigid (more jello-like).  By recognizing some wobbles in the smooth surface of academic publishing, scholars would be in a good position to delineate and respond to shifts in power and authority increasingly evident in decentralized forms of writing such as MyTimes (a cross between the Associated Press and an RSS reader), Wikipedia (a networked, distributed and very liquid work), and other such fluid sites for knowledge production.   Such a redistribution and decentring of traditional authority could help scholars to avoid replicating the current centre/periphery dynamics of knowledge production and dissemination, an imbalance that sees 90% of the world’s scientific research being published by just 15 countries.

Question Period

During the question period, one member of the audience identified a contradictory tension that seemed evident in the presentations offered in the session.  On the one hand, the speakers stressed a need to establish the credibility of academic publishing.  On the other hand, it was clear that there was also a keen interest in exploring the boundaries of new media (and traditional academic practice) so as to destabilize the model of academic publishing alongside the decentring of other concepts like authority (authorship), and the very form of scholarly writing be it journal articles or monographs.  Keeping these things in balance is quite a challenge, especially when many scholars are as intent on building credibility in these new forms of academic scholarship at the same time as others are intent on destabilizing the very units and processes that have long characterized academic discourse.  This somewhat anxious tension seems to describe aptly the stance of Humanist scholars exploring new cultural studies.

Related Links

The Open Humanities Press – Website for The Open Humanities Press, with links to their current journals: Cosmos and History, Culture Machine, Fast Capitalism, Fibreculture, Film-Philosophy, Image and Narrative, International Journal of Žižek Studies, Parrhesia, Postcolonial Text, Vectors.

Hall, G. (2008). Digitize This Book!: The Politics of New Media, or Why We Need Open Access Now (Minneapolis and London: University of Minnesota Press).

Jöttkandt, S. (2008). Free Libre Scholarship: The Open Humanities Press. Irvine, 3 April, 2008.

Jöttkandt, S. and Hall, G. (2007).  Beyond Impact: OA in the Humanities.  Brussels, 13 February, 2007.

King, J., Lynch, C, Willinsky, J. (2009) Open Access in the Humanities. Podcast.  University of California, Irvine.


Customizing OJS for Magazine Publishers: The Session Blog

Presenter: John Maxwell

July 9, 2009 at 12:00 noon


John Maxwell

Background

John Maxwell is currently with the Canadian Center for Studies in Publishing at Simon Fraser University in Vancouver, British Columbia.  John’s work focuses on using the Open Journal Systems (OJS) software model as a framework for the Online Magazine Management Models (OMMM) project, which applies it to publishing small cultural magazines (Maxwell, 2008).

Session Overview

John Maxwell reported on the very recent initiatives of a new model for small magazine editorial publishing.  The OMMM project based its creation on Open Journal Systems (OJS), with what Maxwell describes as a more collaborative process that still keeps within the intent of the overall OJS concepts.

Small cultural magazines do not share the exact purpose of journal publishing, but there are similarities in moving from text-based publishing into online management systems.  The OMMM project is not yet at the stage of online publishing; as a starting point, it takes the OJS concepts and applies them to the electronic management of submissions.

Maxwell’s work centers on taking some of the same ideas from OJS and adapting them into a version streamlined for magazine publishing.

What the OMMM project essentially produced was a translation of the OJS user interface into friendlier language for magazine publishing.  As the project evolved, two models of editorial governance were designed:

  • Discrete delegation of responsibilities
  • Collaborative, task-based, process management

One of the key concerns of the OMMM project was to design a workflow model that captures the essence of a wide variety of small cultural magazines.  Currently the project uses PLONE as the Web platform, designed as a “miniature OJS.”  PLONE houses the Editorial Submission Management (EMS) system, which includes the following aspects:

  • System of buckets for organizing content
  • 3 stage workflow (red, yellow, green)
  • Basic notification
  • Word.doc – Web based content
  • Relies on collaboration and trust among the editorial group
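The three-stage (red, yellow, green) workflow above can be pictured as a simple state machine. The sketch below is an illustration of the idea only; the stage meanings and transition rules are assumptions, not the actual PLONE/EMS implementation:

```python
# Sketch of a three-stage (red, yellow, green) editorial workflow as a
# simple state machine. Transition rules here are illustrative assumptions,
# not the actual PLONE/EMS implementation.

STAGES = ["red", "yellow", "green"]  # e.g. submitted -> in review -> accepted

class Submission:
    def __init__(self, title):
        self.title = title
        self.stage = "red"  # every new submission starts red

    def advance(self):
        """Move one stage forward; green is terminal."""
        i = STAGES.index(self.stage)
        if i + 1 < len(STAGES):
            self.stage = STAGES[i + 1]
        return self.stage

    def send_back(self):
        """Return the piece to the start of the workflow for rework."""
        self.stage = "red"
        return self.stage

piece = Submission("Spring essay")
piece.advance()   # red -> yellow
piece.advance()   # yellow -> green
print(piece.stage)
```

Because the real workflow relies on collaboration and trust among the editorial group, the model stays deliberately loose: any editor can advance or send back a piece.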

Although this new concept is in its infancy, using the OJS framework has moved the publishing of small-scale magazines into the cyber world and started ground-level development toward future forms of online publishing.

Audience Input

The overall sense from those audience members who spoke was complimentary, toward this project and any project that uses online open access; however, access to the technology remains the major issue for users anywhere in the world, not just in remote areas. Key issues remain around access to the technology and infrastructure needed to reach these online programs for publishing and reading open source knowledge.

References

Maxwell, J.W. (2008). OMMM Project: Toward a collaborative editorial workflow. Blog. Retrieved July 7, 2009, from http://thinkubator.ccsp.sfu.ca/wikis/ommm/OMMMProjectTowardACollaborativeEditorialWorkflow

Related Links

BC Association of Magazine Publishers

Workshop: Web Content Management for Publishers – August 4,5 and 6


Open access journals copyright policies: an analysis of the information available to prospective authors: The Session Blog

Thursday, July 9, 2009 @ 11:30
SFU Harbour Centre (Earl & Jennie Lohn Rm 7000)

Presenter:


(Source)

Marc Couture (Science & Technology professor at Télé-université, the Université du Québec à Montréal’s distance education component)

Session Overview

Session Abstract

Marc Couture presents his research findings about the availability of copyright policies on open access journals. He addresses the assumptions about copyright, the statistics related to his study and recommends a framework for publishers to use with respect to making copyright decisions that take into account the best interests of both the author and publisher.

Commentary

Couture urges authors to become aware of the copyright policies of the journals they are interested in publishing in. He establishes the basic assumptions he operated on prior to his research: copyright is important to authors; the deal between author and publisher in publishing an article must be legally and ethically fair; and the interests of the journal, the author and the end-user (the reader of the article) must be taken into account equally.

Research

The guiding question for Couture’s research was “where can information on copyright be found on open access journal websites?” Specifically, Couture was looking to see if a prospective author can infer from the website who will keep copyright, what rights the author will retain and what permissions will be given to end-users. 300 journals (representing 251 publishers) from the DOAJ list were randomly selected and scoured for any form of copyright information, including statements, Creative Commons (“CC”) licenses, transfer/license forms, etc. Key results indicate that copyright information was not easy to find: 9% of journals did not have copyright information and 63% of journals had copyright information buried on an “other page” (i.e. not a home page or a dedicated copyright page). Additionally, copyright policy was not consistent across journals; something that prospective authors need to be acutely aware of.

Couture points to the relevant issue of semantics in relation to copyright statements. He identifies key words found in copyright statements ranging from ambiguous terms, such as “make available” and “copy” to more precise terms, such as “photocopy” and “display publicly”. “Use” is the umbrella term that envelops all terms and copyright statements that rely on “use” to direct the reader are clearly poorly defined. An example from a copyright statement is given:

“the full text of articles can only be used for personal or educational purposes”

The uncertainty that lies within the statement is demonstrated in attempting to answer two questions:

–    Can a teacher post the article on his website?
–    Can an engineer working in a company distribute printed copies of the article to her team members?

In addition to the ambiguity of specific words, Couture points out that too many words are no better than too few.  Another factor requiring clarification is whether everything that is not explicitly forbidden is permitted. Couture poses this question to publishers as an example: if their exact intentions are not stated, prospective authors and end users may derive incorrect assumptions about copyright.

Proposal

As a result of his research, Couture wanted to create a proposal defining the outline of a software tool that could help a journal by generating, through a series of inputs, a clear and unambiguous statement of copyright policy that could be added to its website.  The key, he says, is generating simple text aimed at authors and end users. This is a work in progress, and Couture would like to see the publisher approach the grid from the viewpoint of “what do I want as a publisher?” rather than “what do I want to forbid the author from doing?”.
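To make the idea concrete, a statement generator of the kind Couture proposes might look something like the sketch below. The inputs and the generated wording are illustrative assumptions, not Couture's actual grid or output:

```python
# Sketch of the proposed tool: a few explicit inputs produce a
# plain-language copyright notice for a journal website. Inputs and
# wording are illustrative, not Couture's actual grid.

def copyright_statement(journal, author_keeps_copyright, cc_license=None,
                        end_user_may=("read", "download", "print")):
    parts = []
    if author_keeps_copyright:
        parts.append(f"Authors retain copyright of articles published in {journal}.")
    else:
        parts.append(f"Copyright of published articles is transferred to {journal}.")
    if cc_license:
        parts.append(f"Articles are distributed under a Creative Commons "
                     f"{cc_license} license.")
    # Name the permitted uses explicitly rather than relying on a vague
    # umbrella term like "use".
    parts.append("End users may " + ", ".join(end_user_may) +
                 " articles without asking permission.")
    return " ".join(parts)

print(copyright_statement("Example Journal", True, cc_license="BY"))
```

The design choice mirrors Couture's advice: every clause states a positive grant in precise terms, so nothing is left to the reader's inference.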

The exact content of copyright policies was also investigated, and Couture notes that about half of the journals require a transfer of ownership from the author to the publisher. This leads to Couture’s secondary motive: establishing the divide that exists between authors’ desires with regard to copyright and the reality of publishing. Couture would like to see what he refers to as “fair practices”, whereby there is no transfer of copyright, no more rights than required are granted to the publisher, and broad end-user permissions are in place (in the form of CC licenses).

Couture’s presentation makes it clear that the copyright policies of open access journals lack a common sense of purpose or consistency and that publishers should make copyright clarification a priority.

Related Links

Article – “The facts about Open Access”

Directory of open access journals

Related Reading

Hoorn, E., & van der Graaf, M. (2005). Towards good practices of copyright in Open Access Journals. A study among authors of articles in Open Access journals. Pleiade Management & Consultancy.


UK Institutional Repository Search: a collaborative project to showcase UK research output through advanced discovery and retrieval facilities: The Session Blog

July 9th 2009 at 10am

Abstract

Presenters: Sophia Jones and Vic Lyte (apologies for absence – please email any comments and queries)

Background

(Source)

Sophia Jones, SHERPA, European Development Officer, University of Nottingham.

sophia.jones@nottingham.ac.uk +44 (0)115 84 67235

Sophia Jones joined the SHERPA team as European Development Officer for the DRIVER project at the end of November 2006. Since December 2007, she has also been working on the JISC funded Intute Repository Search project.

Jones has a BA in Public Administration and Management (Kent), an MA in Organisation Studies (Warwick) a Certificate in Humanities (Open University) and is currently studying part time for a BA in History (Open University). She is also fluent in Greek. Prior to joining the University of Nottingham, Jones worked as International Student Advisor at the University of Warwick, Nottingham Trent University and the University of Leicester.

Sophia’s interests include international travel, music and cinema, and she enjoys reading the news of the day.


(Source)

Vic Lyte, Mimas, Director of Intute, University of Manchester

Mimas, University of Manchester, +44 (0)161 275 8330 vic.lyte@manchester.ac.uk
Vic Lyte is the Director of the Intute Repository Search project. Lyte is also Development Manager at Mimas and Technical Services Manager at the Mimas National Data Centre. His specialty areas are the design and development of Autonomy IDOL technology within the academic and research vertical, and advanced search and discovery systems, architectures and interfaces in research and teaching contexts. His work is led by repository and search technologies.

Session Overview
The UK Institutional Repository Search (IRS) was initiated after a perceived gap was noticed in the knowledge access search process, where there appeared to be unconnected islands of knowledge materials.  The IRS followed on from the Intute research project. The high-end aims included making it easier for researchers to move from discovery to innovation by linking repositories to exchange knowledge materials. The IRS uses two main search methods: a conceptual search and a text-mining search. A member of the audience asked how the searches differ. Jones responded that the conceptual search searches documents, whilst the text-mining search searches within documents.

Jones then demonstrated the two types of searches using an example query, “ethical research”.  Searches can be broadened with suggested related terms, or narrowed by filtering on repository or document type. The conceptual search results can also be viewed as a 3D interactive visualisation, and the text-mining results as an interactive cluster map.
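Jones's distinction between the two searches (documents versus within documents) can be illustrated with a toy contrast. Neither function below resembles the actual Autonomy IDOL or text-mining implementations; the corpus and matching logic are purely illustrative:

```python
# Toy contrast between the two IRS search types: the "conceptual" search
# returns whole matching documents, while the "text-mining" search returns
# the matching passages found inside documents. A real system uses far
# richer methods; this sketch only illustrates the document/within-document
# distinction.

DOCS = {
    "paper1": "Ethical research requires informed consent. Consent must be documented.",
    "paper2": "A survey of repository software platforms.",
}

def conceptual_search(query):
    """Document-level: return the ids of documents mentioning the query."""
    return [doc_id for doc_id, text in DOCS.items()
            if query.lower() in text.lower()]

def text_mining_search(query):
    """Within-document: return the sentences in which the query occurs."""
    hits = []
    for doc_id, text in DOCS.items():
        for sentence in text.split(". "):
            if query.lower() in sentence.lower():
                hits.append((doc_id, sentence.strip().rstrip(".")))
    return hits

print(conceptual_search("consent"))
print(text_mining_search("consent"))
```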

In summary, Jones stated that IRS had met all its high-end aims. A focus group with the research community showed encouraging support. The only suggested improvement was a personalisation aspect, which IRS has the potential to add, with a projected roll-out in the next phase.

(Source)

Questions and comments

1. How is the indexing managed? Answer: Please email Vic Lyte.

2. Does IRS have data-mining tools in the set? Answer: Yes, this was developed by NaCTeM.

3. How much does IRS cost? Please email Vic Lyte.

4. Is IRS available now? Yes, there is free access. However at this stage IRS is more of a tool than a service.

5. Comment from Fred Friend (JISC): I’m from the UK. If this is valuable, we’ll have to fight to protect these systems because of budget cuts and the publishers fighting. So, to keep its value, we’ll have to convince government about it.

References

Intute Repository Search (IRS) project

IRS Demonstrator Tool

Mimas Press Release February 26 2009


Legal Deposit at Library and Archives Canada and development of a Trusted Digital Repository: The Session Blog

Thursday, July 9, 2009 @9:30
SFU Harbour Centre (Sauder Industries Rm 2270)

Presenters:

Pam Armstrong (Manager of the Digital Preservation Office at Library and Archives Canada and Business Lead of the LAC Trusted Digital Repository Project)

Susan Haigh (Manager of the Digital Office of Published Heritage at Library and Archives Canada)

Session Overview

Session Abstract

Pam Armstrong and Susan Haigh succinctly present the aims and current progress of Library and Archives Canada’s Trusted Digital Repository and Legal Deposit.

Commentary

Pam Armstrong reviewed the Library and Archives Canada (“LAC”) Trusted Digital Repository (“TDR”) project currently underway. Two key aspects of the project were identified: the building component (the necessary infrastructure) and the business component (the workflow and efficiencies associated with it). Current projects within the TDR include creating a digital format registry, establishing threat risk assessment, establishing effective communications strategies and establishing a storage policy for data. Armstrong points to collaboration as the key to facing the challenges of preserving the digital heritage of Canadians which, given the number of collaborators involved, is crucial in ensuring the integrity of preservation. Additionally, Armstrong suggests that the TDR brings a corporate approach to digital preservation due to the requirement of having content providers register online. The logistics of the TDR are displayed in a framework illustrating the structured flow of data. Data goes through a quarantine zone, then to a virtual loading dock (where it is, among other things, scanned for viruses, decrypted and validated), then to staff processing (where metadata is enhanced), then into a metadata management system and finally to an access zone. Ultimately, it appears that the TDR will not only organize and preserve data but will also make it more accessible to Canadians through the World Wide Web.
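The structured flow Armstrong described (quarantine, virtual loading dock, staff processing, metadata management, access) can be sketched as a staged pipeline. Each stage function below is an illustrative stand-in for the real LAC processing, not actual TDR code:

```python
# Sketch of the TDR ingest flow as a staged pipeline: quarantine ->
# virtual loading dock (scanning/validation) -> staff processing
# (metadata enhancement) -> metadata management -> access zone.
# Every stage body is an illustrative stand-in for the real processing.

def quarantine(item):
    item["zone"] = "quarantine"
    return item

def virtual_loading_dock(item):
    item["virus_scanned"] = True   # stand-ins for virus scanning,
    item["validated"] = True       # decryption and format validation
    item["zone"] = "loading_dock"
    return item

def staff_processing(item):
    item.setdefault("metadata", {})["enhanced"] = True
    item["zone"] = "processing"
    return item

def metadata_management(item):
    item["zone"] = "metadata_management"
    return item

def access_zone(item):
    item["zone"] = "access"
    return item

PIPELINE = [quarantine, virtual_loading_dock, staff_processing,
            metadata_management, access_zone]

def ingest(item):
    """Run a deposited item through every zone in order."""
    for stage in PIPELINE:
        item = stage(item)
    return item

record = ingest({"title": "Sample publication"})
print(record["zone"])
```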

Susan Haigh reviewed the mandates of the Library and Archives Canada Act and broke down the logistics of the Virtual Loading Dock (“VLD”) that operates within the Legal Deposit side of the electronic collections. Haigh highlighted the LAC mandate as the preservation of the documentary heritage of Canadians through the acquisition and preservation of publications at any cost. She identifies the VLD as an effective tool for doing so: publishers are legally obliged to deposit two copies of a publication, with its contents in decrypted form, along with any metadata, in the VLD. This puts the onus of responsibility on the publisher, though Haigh qualifies this with “theoretically,” indicating that the process is not yet as streamlined as it could be. Simply put, the VLD is part of a drive to build an accessible and comprehensible electronic collection. As a result of this collaborative effort, since 1994 over 60,000 titles (monographs, serials, and websites) and over 150,000 journal issues have been housed in the Legal Deposit. Open source software is also used in LAC projects: Heritrix, an open-source web archiving tool, is used to harvest domains associated with the federal, provincial, and territorial governments, as well as Olympic Games domains.

Front page of LAC's Electronic Collection (public access)

(Source)

The VLD was conceived as a pilot project in an effort to make the electronic collections at LAC more efficient. This software-based approach had several aims, including testing methodologies and learning the technical and operational details of the workflow related to the intake of electronic materials of various types. As is the nature of a pilot project, LAC learned many lessons from the VLD that will drive adjustments to make the archiving of materials even more efficient. Lessons learned include the need for distinct ISBNs for digital editions, the need to work the bugs out of JHOVE with respect to large PDF files, and the need to tweak the functionality for publisher registration. Haigh indicated that this first release of the VLD was very useful in raising an understanding of the importance of metadata capture functionality that is efficient and clear to publishers.

Moving to the future, LAC will begin to test serials as well as all transfer methods associated with the Legal Deposit. Additionally, LAC will explore two important questions related to preservation:

1. How do we capture Canadian OJS journals for legal deposit and preservation purposes?
2. How can LAC and the Canadian OJS community collaborate?

As for access, it is clear that the goal of the TDR is to provide access, but I would be interested to see statistics regarding who is actually accessing the materials from the LAC (i.e., mostly the academic community, or individuals for personal purposes). In keeping within the confines of copyright, Armstrong notes that the rights associated with content, as dictated by the publisher, will be reflected in the terms of access.

Related Links

Library and Archives Canada – Official Site

Library and Archives Canada Act

Library and Archives Canada – Trusted Digital Repository Project

Library and Archives Canada – Legal Deposit

Library and Archives Canada – Electronic Collection

Article – “Attributes of a Trusted Digital Repository: Meeting the Needs of Research Resources”

July 9, 2009

Website for CONICET´s Academic Publications: the Session Blog

Presenter: Alberto Apollaro

July 9, 2009 at 4:00 pm

Session Abstract

Background

Alberto Apollaro is a member of the Scientific Electronic Library Online (SciELO) Argentina group and a specialist in web development and applications. Mr. Apollaro joined CAICYT-CONICET in 1998 as a systems and website administrator, and serves as the webmaster for CONICET’s academic publications website.

Session Overview

The SciELO project’s directive is based on “the development of a model methodology for the preparation, storage, sharing and evaluation of scientific publications as an electronic support.”(1) As an alternative to print, the library facilitates international distribution of Latin American scholarship with regional impact in an organized, accessible format. This regional project stems from National Council for Scientific and Technical Research (CONICET) policy, which in turn is administered by the independent Argentinian Centre for Scientific and Technological Information (CAICyT).

Comisión Asesora de Investigación Científica y Técnica	/ Consejo Nacional de Investigaciones Científicas Y Técnicas

CAICyT has been charged with producing a website that will allow Argentina to systematically catalogue digitized content that is edited and peer-reviewed as per the academic standards previously described by CONICET. As part of CAICyT’s directive to push Argentinean scholarly publication into the open-access era, CONICET draws upon 15 university repositories to fuel 32 open-access e-journals. This initiative is facilitated by the open-source Open Journal Systems (OJS) software, which allows online management of the process from submission through to publication.

The process began with journal selection from Latindex, which contains over 2,800 titles. Editors were then invited as the website was constructed. CAICyT would provide the editors’ platform for discussion and consultation, while also providing publishers with guidelines for quality improvement. OJS allows the website to self-archive authors’ submissions and to facilitate peer review and copy-editing quickly and efficiently.

The expectation is that the CONICET website will provide a repository and portal for local Argentinean scholarship, and allow publication not limited to text but including multimedia as well. The streamlined editorial model enabled by OJS will hopefully encourage submission of regional scholarship and see it through to immediate publication, while operating under a more economically attractive model than traditional publication.

During the following discussion, Mr. Apollaro described OJS as a very attractive mechanism for facilitating publication in Latin America (for the aforementioned reasons), but noted that the problem lies in Argentinean scholars’ current unfamiliarity with OJS. This is essentially holding back the website’s growth, and the progress of the Latin American open-access movement in general. Certainly, one of CONICET’s future efforts should be to focus on increasing open-access awareness in the Latin American scholarly community.

References

1) http://www.scielo.org.ar/scielo.php?script=sci_home&lng=en&nrm=iso

July 7, 2009

Blogging the PKP Conference

This is a blog for the PKP Scholarly Publishing Conference 2009 in Vancouver, Canada. A group of graduate students in LLED565D: Developments in Access to Knowledge and Scholarly Communication are live-blogging the conference.  Postings on the conference sessions will be up on this blog right after each session ends. At a later time, we will be editing our posts as necessary.


July 2, 2009