Considerations For DAM And Collections Management Convergence Projects
This feature article has been written by DAM News contributor, Ralph Windsor
One of the key differences between preservation and other DAM use cases is the complexity of the metadata schemas (or ‘metadata design’, to use a less technical term). In particular, it’s the distinction between assets and other ‘objects’. In the example given above, Theresa is referring to the actual artefact, but it can get more intricate than that.
In preservation or heritage subject databases, ‘object’ often refers to further entities, linked to, but autonomous from, a core asset record – which doesn’t have the same pre-eminence that it does in most DAM systems. So you might have a digital asset which is a photo of four artefacts; each of these requires information such as country of origin, provenance, physical location and so on, and those artefact records may in turn be linked to additional assets or records. In a hybrid CMS/DAM, assets are of equal status to any other object (and might even be considered less important by many users). There could also be some further external entities, for example buildings, exhibitions or other subject domain-specific elements. An object might be placed in many exhibitions, and users will want to carry out tasks like cross-referencing back from the exhibition to find all the artefacts that were used in it, and then all the digital assets, such as photos from a specific exhibition featuring a given piece. That’s a fairly simplistic example, and often the research tasks these systems have been partly purchased to assist with will need more versatile criteria and querying capabilities.
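To make the distinction concrete, here is a minimal, hypothetical sketch of that multi-entity data model in Python. The entity names, fields and the `assets_from_exhibition` helper are illustrative assumptions, not any real CMS schema; the point is simply that the asset is one node in a graph of equally important objects, and cross-referencing queries traverse those links.

```python
from dataclasses import dataclass, field

# Hypothetical, simplified entities: in a hybrid CMS/DAM the digital asset
# is just one node in a graph of equally important objects.
@dataclass
class Artefact:
    name: str
    country_of_origin: str
    provenance: str
    physical_location: str
    exhibitions: list = field(default_factory=list)   # names of exhibitions shown in

@dataclass
class Asset:
    filename: str
    artefacts: list = field(default_factory=list)     # one photo, many artefacts

def assets_from_exhibition(assets, exhibition):
    """Cross-reference: all digital assets depicting any artefact that was
    shown in the given exhibition."""
    return [a for a in assets
            if any(exhibition in art.exhibitions for art in a.artefacts)]

vase = Artefact("Ming vase", "China", "Bequest, 1921", "Store B",
                exhibitions=["Porcelain 2019"])
bowl = Artefact("Tea bowl", "Japan", "Purchase, 1987", "Gallery 4")
photo = Asset("IMG_0421.tif", artefacts=[vase, bowl])

print([a.filename for a in assets_from_exhibition([photo], "Porcelain 2019")])
# → ['IMG_0421.tif']
```

In a real system these relationships would live in a database rather than in memory, but the query shape – exhibition to artefacts to assets – is the same.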
In many conventional DAM systems this can pose a problem because the developers are used to the idea that the asset record is the atomic unit of the whole application, and this new multi-dimensional object construct can spoil their carefully architected, asset-centric data models. Further, many DAM vendors have been trying to mask underlying complexity in their applications to help end users, such as marketing personnel, whose needs are conceptually less sophisticated, although that might start to change soon.
It’s true that you do get these multi-item fields for other common DAM metadata tasks (photos of people being a good example), but with preservation there are typically many more of them, and often they can go multiple levels deep too – sub-objects, each with its own separate metadata schema and linked hierarchical relationships. As well as asset metadata, you need database management tools to control all that object data so that users can modify the values, deactivate anything no longer required and generally batch-manipulate them just as they do with asset records. As should be clear by now, the complexity (and the time/cost) can escalate rapidly, and it’s not a job for just any generalist DAM system.
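A hedged sketch of what ‘multiple levels deep’ means in practice: a nested object record (the record shape, field names and three-level hierarchy here are invented for illustration) and a generic batch-update helper of the kind those database management tools would need to provide.

```python
# Hypothetical object hierarchy: an artefact with a component, which has a
# fragment – each level carrying its own metadata.
object_record = {
    "type": "artefact",
    "metadata": {"status": "active", "location": "Store B"},
    "children": [
        {"type": "component",
         "metadata": {"status": "active", "material": "bronze"},
         "children": [
             {"type": "fragment",
              "metadata": {"status": "active", "condition": "fragile"},
              "children": []},
         ]},
    ],
}

def batch_update(record, field_name, old, new):
    """Recursively change a metadata value across a whole object hierarchy,
    e.g. deactivating records that are no longer required. Returns the
    number of records modified."""
    count = 0
    if record["metadata"].get(field_name) == old:
        record["metadata"][field_name] = new
        count += 1
    for child in record["children"]:
        count += batch_update(child, field_name, old, new)
    return count

print(batch_update(object_record, "status", "active", "inactive"))  # → 3
```

Every operation users expect on asset records – edit, deactivate, batch change – has to work across arbitrary depths like this, which is where the time/cost escalation comes from.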
Those vendors who have existing familiarity with preservation or heritage related projects have probably engineered their applications to deal with this kind of flexibility already by abstracting their data model to avoid some of the issues described above. However, that still doesn’t save them from the need to do some fairly in-depth metadata analysis to work out how to model the existing schema and decide whether rationalising it is a good idea or not.
Migrating Legacy CMS Applications
As alluded to in Theresa’s article, there is usually some unwieldy legacy application and a non-trivial data migration and cleansing task to get useful data out of it. In addition to object metadata, there are transactional records, such as records of loans, costs applied and profiles of partner institutions (such as other museums), which for various legal and financial reasons can’t be discarded. They also contain potentially useful Business Intelligence data in their own right.
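To illustrate the shape of that cleansing task, here is a deliberately simplified sketch, assuming a CSV export from a hypothetical legacy system. The column names and empty-value markers are invented; real migrations involve far messier data, but the pattern of normalising values while preserving transactional records (loans, in this example) rather than discarding them is the same.

```python
import csv
import io

# Invented sample of a legacy CMS export: inconsistent whitespace and
# several different markers for 'no value'.
LEGACY_EXPORT = """record_type,object_id,partner,value
loan, OBJ-001 ,British Museum,n/a
object,OBJ-002,,N/A
loan,OBJ-003,  Louvre ,n/a
"""

def cleanse(row):
    """Trim whitespace and normalise empty-value markers to None."""
    cleaned = {k: v.strip() for k, v in row.items()}
    for k, v in cleaned.items():
        if v.lower() in ("n/a", ""):
            cleaned[k] = None
    return cleaned

rows = [cleanse(r) for r in csv.DictReader(io.StringIO(LEGACY_EXPORT))]

# Transactional records are kept, not discarded with the legacy system.
loans = [r for r in rows if r["record_type"] == "loan"]
print(len(loans))  # → 2
```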
My experience with CMS-related DAMs is that physical objects tend to be less complex to deal with from the perspective of the system’s internal data structures. Some systems have a dedicated type/sub-type for them; others refer to them as ‘metadata only’ records, so the physical nature of the object is implicit. As Theresa’s article describes, the greater challenge is the tracking of non-digital items, especially the limitations of (and need to integrate with) the organisation’s existing artefact management hardware.
Tracking may still be done with barcodes rather than RFID, and depending on the scale of the collection, re-tagging everything might be impractical (or even risky in the case of artefacts in a fragile condition). So the replacement DAM software might need to be integrated with whatever readers are in use, and probably have some label printing capability too. The printer model used for generating physical tags may be quite old; a few times in the past I’ve witnessed custom label printing clients needing to be developed just to handle them. A cost/benefit analysis should compare the professional services expense of integrating new software with existing kit against the alternative of lower consulting fees but higher spending on new equipment. This needs to consider all of the different expense items holistically, rather than just the price of the gear.
As Theresa says, these legacy systems are often the bane of curators’ lives; however, a number of curators I’ve met have also developed a specialised form of what you might call ‘Software Stockholm Syndrome’ in their relationship with them. Despite a legacy application’s glacial performance and contemptuous user interface, some staff may have grown attached to it, and any replacement needs to replicate every last detail of the available functionality before they will even countenance the idea of giving it up – and even then it will be with intense suspicion.
Obviously not everyone in the organisation will agree; some of those who have spent less time with the incumbent system will despise it, and they are often the ones driving forward plans to have it replaced. This can generate a conflict between those who want to be first on the lifeboat and others who want to remain on the sinking ship. Anyone responsible for delivery of a replacement (whether internal project managers or external vendors and consultants) will need to develop strategies for negotiating between these two polar opposites. This is the other human or political dimension to replacing a CMS that you don’t hear about very often – although it is a consistent theme of Collections Management migrations in my experience. While you get change management issues with all software refresh projects, with Collections Management they can get particularly ugly if not handled sensitively.
CMS Workflow And The Significance Of The Terminology Used
On workflow, DAM systems are getting better at handling more sophisticated business process models, and these are now easier to apply to CMS usage scenarios than they once were. In the earlier CMS migrations I was involved in, workflow, for those from DAM backgrounds, was still a linear method that required sequential lists of personnel to process items in a set order. Such workflows usually became unworkable in real use and generally required complete custom re-writes before they were serviceable for many common tasks.
Some care is required with the definition of ‘workflow’ too. On a few occasions I’ve heard the entire collections management system referred to as ‘the workflow system’, whereas the more conventional definition, for those from a DAM, ECM or Document Management background, refers to a specific business process, such as booking a loan transaction and getting approval for it, or making changes to an asset’s metadata. These differing interpretations of the terminology can cause confusion and disputes later if not cleared up early on.
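In the narrower DAM/ECM sense, a workflow is small enough to sketch as a state machine. The states, actions and transitions below are invented for illustration – a hypothetical loan booking and approval process – but they show the scale of thing meant by ‘a workflow’, as opposed to an entire system.

```python
# Hypothetical loan approval workflow modelled as a state machine:
# (current_state, action) -> next_state
TRANSITIONS = {
    ("draft", "submit"): "pending_approval",
    ("pending_approval", "approve"): "booked",
    ("pending_approval", "reject"): "draft",
    ("booked", "dispatch"): "on_loan",
    ("on_loan", "return"): "closed",
}

def advance(state, action):
    """Apply an action to the current state, rejecting anything the
    process definition doesn't allow."""
    try:
        return TRANSITIONS[(state, action)]
    except KeyError:
        raise ValueError(f"action {action!r} not allowed in state {state!r}")

state = "draft"
for action in ("submit", "approve", "dispatch", "return"):
    state = advance(state, action)
print(state)  # → closed
```

A collections management system would contain many such processes (loans, acquisitions, condition checks, metadata approvals); calling the whole system ‘the workflow system’ blurs exactly that distinction.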
DAM & Collections Management Convergence Checklist
Below is a rough and ready checklist of points and questions that those contemplating a convergence of Collections Management and DAM initiatives need to consider. It is by no means comprehensive:
- Has the current operational process been analysed and rationalised so inefficient practices are not just being replicated in a flashy new software system?
- Do the candidate vendors respect the higher-than-average complexity of cultural or heritage metadata schemas? Are they blasé about the capabilities of their product and the ease with which they can accommodate you?
- Can their workflow and business process modelling tools at least get close to what you need without major modification?
- Has a cost/benefit study been conducted to decide the optimum mix of old and new – integrating with existing kit like barcode readers and printers versus replacing them? It won’t necessarily be either wholesale replacement or slavish replication of what exists now.
- What is the change management plan for introduction of this system and how will the concerns of legacy application users be addressed?
- Has the terminology and frame of reference been defined? Does everyone definitely understand exactly what they mean?
Preservation Initiatives And DAM Value Chains
There is a clear opportunity to avoid cost duplication and reduce the number of software systems deployed in cultural or heritage institutions; however, our on-going discussion of digital asset management value chains has some bearing here too. There are conflicting functional requirements between groups of users (like museum curators and marketers) that place all DAM solutions at risk of trying to spread themselves too thinly. As the saying goes, you can’t please all of the people, all of the time.
This might get solved by a merger of interests between DAM and CMS vendors, so you get specialists who have a DAM background but concentrate on the preservation market – which has already happened to an extent. However, given that a typical CMS has a much longer lifespan than a corporate DAM, with 20 years being not uncommon, this could simply re-create the same outdated legacy application problem all over again.
Whether or not it will actually happen is harder to predict, but in my opinion, an improved method would be a core integration platform with specific preservation services that run on it. This would be more scalable and easier to selectively upgrade in a piecemeal fashion as determined by changing needs.
Some CMS implementations can have ‘mega-project’ characteristics that dramatically increase both costs and risks of failure. Speaking from personal experience, they sometimes seem like World War One military operations where thousands of hours of time are slaughtered (by all parties) to gain a few yards advantage in terms of project progress.
Given that most European or North American institutions are (like everyone else) going through a period of austerity where fiscal responsibility is high on the agenda, this should be taken as the cue to radically revise the solution delivery method employed so that it is more suitable and avoids leaving another legacy application problem for a new generation of curators in 20 years’ time.