DAM Interoperability In 2017 – Where Do We Go From Here?


Tim Strehle of Digital Collections and Margaret Warren of ImageSnippets have contributed an article entitled Improving DAM Interoperability In 2017, which is part of our Improving DAM In 2017 series:

“DAM products have improved a lot in the last few years: They are now cloud-enabled and more user-friendly, they support video and let us share content better than ever. Integrating the DAM with other systems has also gotten easier because most systems have APIs by now. But point-to-point integrations are still the rule, where each integration between any two systems requires a software developer to write code tailored to them. Let’s look at the importance of interoperability for Digital Asset Management, what interoperability means in practice, and how we can improve it.” [Read More]

This is an excellent piece which concisely summarises the major issues relating to DAM interoperability.  The item gives both technical and non-technical readers a good understanding of the background to the factors driving the need for integration and why so little progress has been made, as well as some possible options for re-starting interoperability initiatives.  One of many positive aspects of Tim and Margaret’s article is that it has been diligently researched and includes a lot of links (i.e. evidence) that support its conclusions.  A cross-section of stakeholders and commentators is cited, with relevant quotes, sources and links, so you can see where Tim and Margaret have got their information from and decide whether you agree with them or not.

The piece makes some valid points about DAM interoperability and covers some of the previous attempts at standardisation which have fallen by the wayside.  Tim and Margaret also introduce the possibility of utilising the semantic web (and schema.org, specifically) as a potential interoperability framework.
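To make the schema.org suggestion a little more concrete, here is a minimal sketch (in Python, using only the standard library) of how a single digital asset’s metadata might be expressed as schema.org JSON-LD.  The schema.org types and properties used (ImageObject, contentUrl, creator and so on) are real, but the asset data and the particular mapping are hypothetical, purely for illustration; an actual interoperability profile would require the industry to agree on which properties to use and how:

    import json

    # A hypothetical mapping of DAM asset metadata onto schema.org terms,
    # serialised as JSON-LD. The values are invented for illustration; a
    # real interoperability profile would need agreement on which
    # schema.org properties to use and how.
    asset = {
        "@context": "http://schema.org",
        "@type": "ImageObject",
        "name": "Product launch hero image",
        "contentUrl": "https://dam.example.com/assets/12345/original.jpg",
        "thumbnailUrl": "https://dam.example.com/assets/12345/thumb.jpg",
        "creator": {"@type": "Person", "name": "Jane Photographer"},
        "copyrightHolder": {"@type": "Organization", "name": "Example Corp"},
        "keywords": "product launch, hero, 2017",
        "dateCreated": "2017-01-15",
    }

    print(json.dumps(asset, indent=2))

The appeal of this approach is that even a modest shared vocabulary along these lines would allow a consuming application to interpret assets from any compliant DAM without bespoke mapping code for each vendor.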

The observation about the number of discrete one-off third-party integrations DAM vendors have carried out (the ‘Cambrian Explosion’ section) is an interesting topic in its own right.  I have previously had discussions with ancillary DAM tool vendors (i.e. firms who develop components which can be used within DAM solutions) and they have expressed some irritation at having to run through the whole integration process with numerous different firms, over and over again.  The number of vendors, and the uncertainty about whether they might be compensated for their efforts in the form of licence sales, has persuaded several not to bother with individual DAM integrations any longer, but instead to choose some larger generic technology (e.g. a cloud file storage/sharing provider) and oblige prospective DAM vendor partners to make their products compatible with it.  This means inferior platforms (from a DAM perspective) set the pace, based on user volume alone.  The implication is that if the market doesn’t do something about standards, commercial natural selection will intervene and pick the providers with the most clout, whether the outcome is favourable for the majority of DAM software developers or not.

Part of the problem with interoperability is the short-sightedness of those involved and an almost wilful insistence on missing the point of why anyone would want to do this.  I have had conversations with some vendors who have said things like “we’re going to develop our own interoperability framework just for our product”.  Even those who understand the flawed logic at work in this kind of statement still tend to act as though it is what they really think, while stopping short of actually saying it.

One of the other key misconceptions is the idea that the main use-case for interoperability will be to facilitate the transfer of digital assets between different DAM systems.  This tends to make vendors nervous, as the prospect of making it easier for clients to migrate away from their platforms isn’t something many are likely to voluntarily sign up for.  As alluded to by Tim and Margaret in their piece, however, a typical integration requirement (and one which would certainly benefit from some kind of standardisation) is more likely to involve a third-party application that needs to consume digital assets sourced from a DAM system, rather than another competing DAM solution per se.  The fact that digital assets become more fungible (i.e. easy to exchange between products) does present a risk for anyone who fears they might be about to shed sections of their customer base; however, it is also more likely that their application will become more deeply embedded within a given organisation, which reduces the chances of that occurring (especially if the vendor is responsive and proactive about dealing with issue reports and feature requests).
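As a rough illustration of that consumption scenario, the sketch below (again Python, standard library only) shows what a third-party application pulling assets from a DAM might look like if vendors converged on even a simple shared convention.  The endpoint URL, query parameter and response shape are entirely hypothetical; at present, each vendor’s API would require its own variant of this code, which is precisely the point-to-point problem Tim and Margaret describe:

    import json
    import urllib.request

    # Hypothetical DAM search endpoint - each real vendor's API currently
    # differs, which is exactly the point-to-point integration problem.
    DAM_SEARCH_URL = "https://dam.example.com/api/v1/assets?keyword={kw}"

    def fetch_assets(keyword):
        """Ask the (hypothetical) DAM for assets matching a keyword."""
        with urllib.request.urlopen(DAM_SEARCH_URL.format(kw=keyword)) as resp:
            payload = json.load(resp)
        # Assumed response shape:
        # {"assets": [{"id": ..., "title": ..., "downloadUrl": ...}]}
        return payload.get("assets", [])

    def embed_in_cms(asset):
        """Stand-in for whatever the consuming application does with an asset."""
        print(f"Embedding '{asset['title']}' from {asset['downloadUrl']}")

    if __name__ == "__main__":
        for asset in fetch_assets("product-launch"):
            embed_in_cms(asset)

If even a handful of vendors supported the same search-and-download convention, a consuming application like the one above would work against all of them unchanged, which is a far lower bar than full semantic web adoption.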

I take the point about the semantic web and innovations like linked data, but I think the objectives and expectations of any standards first need to be scaled back and then slowly (and very incrementally) developed over time.  At this stage (and unless something or someone with the aforementioned market clout imposes them) these technologies may be more of a destination than a starting point.  To get this underway, something as simple as a centralised library or resource where DAM API documentation could be stored and accessed would help provide a focal point for everyone in the DAM industry to see what conventions and practices are beginning to emerge.  While lack of imagination and an inability to apply lateral thinking are usually negative characteristics of the DAM software market, in this case they might actually be advantageous for developing some standardised protocols for the exchange of digital assets.
