Open University Workshop Videos

On Friday the 27th of November we held a Clipper project meeting at the OU, followed by two workshops that were videoed and webcast live over the internet by the OU. It was a long day but a very productive one. The workshops were held at the Knowledge Media Institute, Open University, Milton Keynes.

IIIF Workshop

The first workshop was delivered by Tom Crane of Digirati, with whom we have been discussing which technical standards to include in the Clipper project. The subject of the workshop was the International Image Interoperability Framework (IIIF), and we have been discussing how this might be extended to cover annotating audio and video resources. You can find the webcast at this link: http://stadium.open.ac.uk/2620

Clipper Workshop

The second workshop was a short overview of the Clipper project, based on our previous community engagement workshops, followed by a question and answer session. You can find the webcast at this link: http://stadium.open.ac.uk/2624

 

Clipper – Legal Issues – the long version

At the last Jisc sandpit workshop the judges asked us to expand on the legal issues surrounding the use of Clipper. This is quite a long post, but we thought it best to have it all in one place. Here we analyse the situation from two points of view (this is a working draft):

  1. The owners / managers of the audio-visual content that is being clipped and annotated
  2. The users who are generating the clips and annotations


Technical Standards / System Design Part 2: Looking Forwards to Phase 3

The current prototype Clipper application is built using these open Web standards

Moving forwards in phase 3 we envisage using / investigating these standards

Our aim from the beginning has been to create a toolkit that has little or no dependency on any proprietary or ‘closed’ technology or standards. Choosing the above standards was a good start. Moving forwards, we shall need to create a more detailed data model. We have been aware of the W3C Web Annotation Data Model (http://www.w3.org/TR/annotation-model/) and the W3C Web Annotation Working Group (http://www.w3.org/annotation/).
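
To make this a little more concrete, here is a minimal sketch of how a single Clipper annotation on a time segment of a video might look if expressed with the W3C model (still a working draft at the time of writing). This is not a committed Clipper design: all the URLs, identifiers and text below are invented for illustration.

```typescript
// Illustrative sketch only: a text annotation on the 30s–45s segment of a video,
// shaped after the W3C Web Annotation Data Model. All IDs and URLs are invented.
const exampleAnnotation = {
  "@context": "http://www.w3.org/ns/anno.jsonld",
  "id": "http://example.org/clipper/annotations/1",
  "type": "Annotation",
  "body": {
    "type": "TextualBody",
    "value": "Interviewee describes the field trial setup.",
    "format": "text/plain"
  },
  "target": {
    "source": "http://example.org/media/interview.mp4",
    "selector": {
      // Media Fragments syntax: t=start,end in seconds
      "type": "FragmentSelector",
      "conformsTo": "http://www.w3.org/TR/media-frags/",
      "value": "t=30,45"
    }
  }
};

console.log(JSON.stringify(exampleAnnotation, null, 2));
```

The useful point for Clipper is that the clip boundaries (via a Media Fragments selector) and the annotation text travel together in one portable structure.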

From a research point of view, the following three standards could provide the vital ‘glue’ to bind a Clipper installation or service into the global digital research ecosystem; a rough sketch of how they might fit together follows the list:

  1. DOI (Digital Object Identifier System): in our discussions at the Roslin Institute we have identified the possible use of DOIs to identify Cliplists, clips and annotations, as well as the audio-visual resources they are linked to.
  2. ORCID: provides a way of linking annotations etc. to individual researchers.
  3. OAI-PMH: provides a useful way of sharing Cliplist information between repositories.
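
As a rough sketch of how these identifiers might hang together in a single clip record (the field names, DOIs and ORCID iD below are hypothetical, not part of any agreed Clipper schema):

```typescript
// Hypothetical sketch: persistent identifiers attached to a single clip record.
// The field names, DOIs and ORCID iD are illustrative, not an agreed schema.
interface ClipRecord {
  clipDoi?: string;      // DOI that could be minted for the clip itself
  sourceDoi: string;     // DOI of the audio-visual dataset being clipped
  creatorOrcid: string;  // ORCID iD of the researcher who made the clip
  start: number;         // start time in seconds
  end: number;           // end time in seconds
  note: string;          // annotation text
}

const exampleClip: ClipRecord = {
  clipDoi: "10.9999/clipper.example.clip.1",              // invented DOI
  sourceDoi: "10.9999/roslin.example.dataset",            // invented DOI
  creatorOrcid: "https://orcid.org/0000-0002-1825-0097",  // ORCID's published example iD
  start: 30,
  end: 45,
  note: "Behavioural observation discussed in the Roslin meeting."
};
```

A Cliplist built from records like this could then be exposed for harvesting over OAI-PMH in much the same way as any other repository item.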

As a result of our community engagement activities we have been fortunate to encounter Tom Crane and the Digirati company. In the ensuing discussions Tom has suggested that the following existing and emerging standards will be well worth exploring in Phase 3, and we think they look really promising:

Tom has pointed out that the IIIF Presentation API (http://iiif.io/api/presentation/2.1/), with its concept of an IIIF manifest, is close to our idea of the project being the container for Cliplists etc. He has also suggested that the IIIF Shared Canvas concept (http://iiif.io/model/shared-canvas/1.0/index.html) can be extended to time-based media. With some time-based media vocabulary, the IIIF work might be just what we need in Clipper. Tom is coming to the OU this Friday (27/11/15) to present the work of the IIIF community, and we hope to discuss this further with him then and make plans for Phase 3.
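
Purely as a thought experiment ahead of that discussion – IIIF Presentation 2.1 defines canvases for images, not time-based media – the kind of extension we have in mind might look something like the sketch below. The "duration" property and the time-based target are our speculation, not part of the spec, and all URIs are invented.

```typescript
// Speculative sketch only: an IIIF-style manifest whose canvas carries a duration
// so that a video can be 'painted' onto it. The "duration" property and the
// time-based "on" target are NOT part of IIIF Presentation 2.1; all URIs are invented.
const speculativeManifest = {
  "@context": "http://iiif.io/api/presentation/2/context.json",
  "@id": "http://example.org/clipper/project/1/manifest",
  "@type": "sc:Manifest",
  "label": "Clipper project: field interviews",
  "sequences": [{
    "@type": "sc:Sequence",
    "canvases": [{
      "@id": "http://example.org/clipper/canvas/1",
      "@type": "sc:Canvas",
      "label": "Interview 1",
      "duration": 1800,  // hypothetical temporal extent, in seconds
      "content": [{
        "@type": "oa:Annotation",
        "motivation": "sc:painting",
        "resource": {
          "@id": "http://example.org/media/interview1.mp4",
          "@type": "dctypes:MovingImage",
          "format": "video/mp4"
        },
        "on": "http://example.org/clipper/canvas/1#t=0,1800"
      }]
    }]
  }]
};
```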

Technical Standards / System Design Part 1: Reflections

We have been discussing the Clipper toolkit with people recently as part of our community consultation process. One interesting question we have been asked by the digital library / information community is: what ‘data model’ are we using? To be honest we have not thought too much about this until now, as we had done a fair bit of work on it previously, around 2009. So a bit of explanation here might help us to clarify our position going forwards.

In the earliest phase of Clipper (around 2009) we created it in Adobe Flash and ActionScript, using the Adobe AIR ‘rich internet application’ runtime to create a cross-platform app (PC and Mac, that is). This was a little before the HTML5 take-off and the rise of tablets and smartphones. In that earlier project we did a lot of thinking about the data flows involved when a user interacts with audio-visual resources, and about what data the system would need to gather to deliver the functionality the user needed. You can find a set of graphic flowcharts representing the data flow at this link. At the time we were fortunate to be working with a colleague at Manchester University (Gayle Calverley) who had just completed a study for Jisc on the types of metadata needed for the storage and management of time-based media in repositories. The report Gayle created was thorough and really useful; it was called the “Time Based Media Application Profile” (TBMAP), and it is still online:

http://wiki.manchester.ac.uk/tbmap/index.php/Main_Page

In the end we did not implement a detailed data model based on that study; instead we developed our own ‘slimline’ version based on user ‘walkthroughs’ of the system and ‘reverse engineering’ approaches to see what data would be required to deliver the functionality we needed. The metadata schema we came up with was based on Dublin Core. We produced our own report detailing our approach to metadata and, with Gayle’s help, mapped it to the Jisc TBMAP report. This approach certainly made our life a lot easier then, and to an extent it still does today. It is useful to reflect on this as we go forwards, and I think we shall certainly be using this work and Gayle’s report in Phase 3.
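
For illustration only, a ‘slimline’ Dublin Core-flavoured description of a single clip might look something like the sketch below; it does not reproduce the actual 2009 schema, so treat the field choices and values as indicative.

```typescript
// Indicative sketch only: a slimline, Dublin Core-flavoured record for one clip.
// This does not reproduce the 2009 Clipper schema; all values and URLs are invented.
const clipDescription = {
  "dc:title": "Rehearsal extract – second movement",
  "dc:creator": "A. Student",
  "dc:description": "Tempo change discussed with the tutor at 02:10.",
  "dc:date": "2015-11-13",
  "dc:format": "video/mp4",
  "dc:identifier": "http://example.org/clipper/clips/42",
  "dc:source": "http://example.org/media/rehearsal.mp4",
  "dc:rights": "CC BY-NC 4.0"
};
```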

Royal Conservatoire of Scotland and Roslin Institute Feedback Online

We have been very fortunate to visit these two institutions to demonstrate the toolkit and get feedback, and both have agreed to participate in pilot projects in Phase 3, which is great. What is striking is the similarity in their research data management needs, despite the very different research environments.

Royal Conservatoire of Scotland Feedback http://blog.clippertube.com/index.php/the-royal-conservatoire-of-scotland-51115/

Roslin Institute Feedback – http://blog.clippertube.com/index.php/the-roslin-institute-university-of-edinburgh-131115/

London Workshop Summary

We held our third community consultation and co-design workshop at the British Library Labs on Monday the 26th of October. Thanks to Mahendra Mahey, manager of British Library Labs, for organising the hosting of the event (we really do hope to return in Phase 3!), and thanks to all the people who attended and gave so generously of their time and insights.

This was another busy and productive workshop with lots of ideas, suggestions and collaborative opportunities to follow up. For us, a particularly useful encounter was with Tom Crane from Digirati, and this has led to a continuing and really useful dialogue about technical standards that we are working into our Phase 3 plans. It is really encouraging to see people so keen on adopting the toolkit. Below I have recorded the main points in bullet format.

11:00 Demonstration – prototype system, initial feedback and discussion

  • Good for language teaching – immediate use (Elina +)
  • UCL central teaching tech support – many uses including lecture videos
  • BUFVC – yes we could use it to extend our service
  • Audio annotation – i.e. to be able to attach / pin an audio recording file as an annotation to a resource in a Cliplist (the resource could be an audio or video). Useful for many purposes, including accessibility and language teaching (e.g. translation)
    • cf. the visual annotations for audio suggested in Manchester
  • Using with audio / video submissions for assessments
    • Teacher clips / annotates a student submission
    • Students clip / annotate their resources for submission
  • What about enabling student annotations to be put through Turnitin? – would need to use the Turnitin API and save the annotation as a PDF? Or somehow upload the annotation text for a Turnitin report
  • Visual drawn annotations on top of the video, linked to the timeline, to complement the text annotations (use the Canvas element to overlay the video)
  • Manual entry of times in the clip creator
  • Are the URLs persistent? (Mahendra / Daniela / Ollie). Yes as long as you follow good practice or control the resources – even if the link breaks the UGC will still be useful
  • UCL – can I test with my own files? Yes, using the paste-URL option (Will uses the Roslin example)
  • Mahendra – does HTML5 work on mobile?
  • BUFVC and UCL – aware of videojs and the author Brian Cove
  • Ollie – can we add web links to the text? Yes
  • Mahendra – any limit to annotation text? No. Can we use Unicode? Yes
  • UCL have a sound cloud collection – could they use it with that? Need to investigate APIs etc. – this is a theme emerging the need to integrate with social media services etc.
  • Annotations – how are they presented – can they pop up? (suggests that customizable edit and consumer views might be useful)
  • Can we have annotations on the video? And just on still images?
  • What format are the annotations in? HTML. This suggests a possible need for a rich text editor; the annotations etc. can also be output in different formats (CSV, Excel, XML)
  • UCL Can the thumbnails be changed? Yes if the resources are on your server (what about UCL as development partners?)
  • BUFVC – citations? By which we mean: is it capable of taking the catalogue information / metadata from a resource collection and using it to pre-populate fields? Yes, as long as Clipper is integrated with the system. A good way to visualise this is to look at the existing Clipper integration with YouTube, where Clipper uses the open APIs to harvest data and populate the title, description and thumbnail fields (a sketch of this kind of lookup follows this list)
  • UCL / Pete Collins Jisc – could Clipper be integrated into Facebook? Yes
  • What about licences for the resource content? This is really about user awareness, but we also need to be able to represent licence conditions clearly in Clipper. It also needs to be borne in mind that Clipper only gives users access to resources they have the rights to view. If a Cliplist is shared with people who do not have the rights to view / listen, then they see the annotations, descriptions and titles of the clips but not the a/v content.
  • Ollie – the sharing experiences of services like soundcloud would be useful. The ability to share with selected users, groups, social media and to use tags and categories to manage your own content and help other people find it
  • Cf. Facebook closed groups, open groups and notifications. So some content could be open (metadata and UGC) to encourage people to find and identify material of interest to them, so that they would then go further and log in
  • Tree Browser and Daily Motion – Trevor's note
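
As a sketch of the kind of lookup mentioned in the citations point above: the public YouTube Data API (v3) exposes a video's title, description and thumbnails through its videos.list endpoint, which is broadly how that sort of pre-population can work. The video id and API key below are placeholders, and error handling is left out.

```typescript
// Sketch: fetch title, description and a thumbnail for a YouTube video via the
// YouTube Data API v3 (videos.list, part=snippet). Placeholder values, no error handling.
async function fetchYouTubeSnippet(videoId: string, apiKey: string) {
  const url = "https://www.googleapis.com/youtube/v3/videos"
    + `?part=snippet&id=${encodeURIComponent(videoId)}&key=${apiKey}`;
  const response = await fetch(url);
  const data = await response.json();
  const snippet = data.items?.[0]?.snippet;  // undefined if the video is not found
  if (!snippet) return null;
  return {
    title: snippet.title,
    description: snippet.description,
    thumbnail: snippet.thumbnails?.default?.url
  };
}

// Usage with placeholder values:
// fetchYouTubeSnippet("VIDEO_ID", "YOUTUBE_API_KEY").then(console.log);
```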

11:45 Practical hands-on – try it and feedback

  • UCL – mobile-first, responsive design policy, especially for students. Discussion about how most students will consume content on mobile devices (?), so Clipper output needs to be responsive. Discussion about how the edit view will probably need to be on a laptop
  • There are problems with mobile video on Apple devices – explore and answer
  • UCL – would be good for creating a synopsis of the content via annotations and clips and also for deployment in Moodle as a plugin (?)
  • Does it / can it work offline?
  • Tom Crane – difference between Creation view / interface and consumer view interface
  • Will there be short codes for WordPress? https://codex.wordpress.org/Shortcode
  • Ollie, UAL – the ability to handle groups would be good (permissions); also the ability to follow groups, get updates and see the latest activity on a dashboard (admin / teacher views of the dashboard and user activity)
  • Following on from the above – discussion between Ollie, Trevor and others – it is best to use existing tools for collaboration, group work etc., such as WordPress, VLEs and Moodle, so best to look at integration with them? So more user management and permissions are the key to that kind of thing
  • Linda from BUFVC – more metadata is needed – to be pulled in from existing catalogue data to pre-populate fields in Cliplists etc. so that it could be part of the workflow
  • Possible answers for collaboration tools: Bootcamp for comments; also look at the P2 theme from WordPress
  • Our use of WordPress is good for rapid prototyping – from the Southampton colleague who did Synote
  • Note for Phase 3 bid examples of early adopters with screen shots if possible

13:30 Discussion – implications for data management, service development and policy

Data management

  • Useful for teaching and using an institutional archive
  • Issues about anonymity, ethics, sharing
  • RDF storage of comments?
  • What happens at the end of the life of time-limited content?
  • Tom Crane – need licences for annotations and access conditions by default, as options
  • Share via zip and email
  • How might the licence conditions of an annotation relate to those of the audio / video? Need to make the distinction clear to users (both editors and consumers)
  • How can you search through the different resources? Needs to be connected into the collection concerned – the current connection to YouTube is a good example
  • HTML5 is fine for the tool and storing the data but what happens when the standards change?
    • Good question – well, HTML is probably the most documented electronic communications standard in history, so the prospects for access and reuse into new formats are very good
    • In Clipper there is a three-way split in how data is stored, which provides good forwards compatibility and preservation (sketched below):
      • Database
      • JSON
      • HTML (presentation)
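
A small sketch of that three-way split, using invented field names rather than the actual Clipper code: the same clip as a structured record, as a portable JSON export, and as plain HTML that stays readable without any Clipper software.

```typescript
// Illustrative only: one clip held in three forms – a database-style record,
// a portable JSON export, and a plain HTML rendering for long-term readability.
interface Clip {
  title: string;
  source: string;  // URL of the audio/video resource (invented here)
  start: number;   // seconds
  end: number;     // seconds
  note: string;
}

// 1. Database layer: the structured record as stored.
const clip: Clip = {
  title: "Key finding",
  source: "http://example.org/media/interview.mp4",
  start: 120,
  end: 150,
  note: "Summary of the main result."
};

// 2. JSON layer: a portable export of the same record.
const jsonExport = JSON.stringify(clip);

// 3. HTML layer: human-readable, playable via a Media Fragments URL (#t=start,end).
const htmlExport = `<article>
  <h2>${clip.title}</h2>
  <p><a href="${clip.source}#t=${clip.start},${clip.end}">Play clip (${clip.start}s to ${clip.end}s)</a></p>
  <p>${clip.note}</p>
</article>`;
```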

Service Development

  • UCL – audio-visual resources are in general difficult to manage
  • Clipper could be used as an institutional service and as an individual personal service
  • The copy-and-paste URL would be a very useful function – it could potentially cover a lot of independent researcher scenarios
  • Use Clipper as an ‘API’ tool by hooking it up to existing services such as:
    • Opencast / Matterhorn
    • Vimeo
    • Soundscape
    • Kaltura
    • BUFVC
    • Jisc Media Hub
  • Tom Crane Digirati. The IIIF standards community is very interested in this toolkit and could be of great help.
  • Running a service means dealing with standards and exceptions (BUFVC)
  • Integration with Turnitin for annotations used in assessment
  • What about using Clipper to create citations from audio-visual datasets?

Policy

  • Ideally Clipper would be bundled with a policy development pack highlighting some of the issues and questions that need to be addressed when considering an institutional deployment. Especially relevant for learning and teaching where policy is underdeveloped. Would be good to have some example policies and a policy editor to create / paste the text. Have a policy agreement tick box facility
  • Be good to have the ability for policy to cover projects and groups of users on projects
  • Data protection issues
  • Levels of sharing inside an institution should be possible (onion skin metaphor)
  • Openness should be encouraged in policy and made possible technically
  • External search of Clipper content? How would that be managed?
  • Have a licence picker (all the Creative Commons options by default plus straight copyright)
  • Would be good to be able to save directly to an institutional repository (with plug-ins to do so) – and to shared services like Figshare
  • For sustainability it might be a good idea to apply to the Apache Incubator? Needs three independent people
  • The open annotation standard and the IIIF standard might be the answer?

Tom Crane (Digirati) Notes

  • Provenance of annotations (who, when)
  • Search API – see the IIIF search spec
  • JSON => JSON-LD – this would allow the project JSON to stand alone (see the sketch after this list)
  • Look at alignment to common vocabularies (to increase take up)
  • Dealing with 3rd party resources – how would the authentication flow work
  • Common Annotation format
    • Annotation server
    • Tagging
    • Searching
  • Encourage video providers to enable CORS – to allow snapshots of the canvas
  • Indirection – allow annotations to be pinned to non-existent segments of video (cf. film restoration)
  • Indirection – allows annotation targets to be an abstract ‘canvas’, which allows different formats
  • Ability to annotate regions of the image (x,y,w,h)
  • The Clipper app is both an authoring and a viewing environment. What it produces needs to be consumable in simpler read-only viewers.
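
On the JSON => JSON-LD point (and on annotating x,y,w,h regions), a minimal illustration: adding an @context lets the project's own JSON be read as linked data without disturbing the rest of the structure. The field names and values are invented; the @context URL and the xywh fragment follow the Web Annotation and Media Fragments conventions.

```typescript
// Minimal illustration of the JSON => JSON-LD step. Values are invented;
// the xywh region in the target uses Media Fragments syntax (x,y,w,h in pixels).
const plainProjectJson = {
  id: "http://example.org/clipper/annotations/7",
  text: "Note the marking on the left-hand side.",
  target: "http://example.org/media/frame.jpg#xywh=10,20,300,200"
};

// The same record with an @context, so it can stand alone as linked data.
const asJsonLd = {
  "@context": "http://www.w3.org/ns/anno.jsonld",
  "id": plainProjectJson.id,
  "type": "Annotation",
  "bodyValue": plainProjectJson.text,  // the model's shorthand for a plain text body
  "target": plainProjectJson.target
};
```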

14:30 Technical discussion, ideas and requirements for institutional deployment / national service

Write-up from the audio recording in progress.