Finding Signs Of Life In DAM: Interoperability 1.0
This feature article has been contributed by Ralph Windsor, editor of DAM News, and is the second in a series entitled Finding Signs Of Life In DAM. See also the previous article about digital asset supply chains.
In the first part of this series, I attempted to diagnose why DAM has gained less traction than other enterprise software markets and proposed that the key issue is a lack of relevant metadata, which makes it harder than it needs to be to isolate digital assets and therefore to realise the productivity benefits of DAM. In the remaining articles, I am evaluating potential solutions.
The Context To DAM Interoperability
The last piece in the series outlined three possible solutions to the relevant metadata problem:
- Digital Asset Supply Chains
- Interoperability
- User Education
That item focused on the first point, the Digital Asset Supply Chain: the importance of identifying the source of digital assets upstream, before they are ingested, so that it is easier to offer users metadata suggestions. I discussed how the digital asset supply chain can be used to derive metadata so that users do not need to come up with all of it unaided, and I noted that users themselves must also be willing to provide hints to help make this more accurate and efficient.
The second consideration is interoperability. Why is interoperability important to DAM progressing as an enterprise technology? Without proper standards, every third party technology which has to interface with a DAM solution must do so in a bespoke manner. The implication is that either the technology must be built with multiple connectors to a range of different DAM solutions or DAM development teams have to build their own integration features, independently and with little or no opportunity to re-use existing work.
A further problem is the inability of vendors to specialise in a given area. If there is no opportunity to easily exchange data, end-users are forced to buy monolithic systems that try to solve every single requirement. There is little possibility for one vendor to partner with another and develop their expertise in a more focused manner. Moving digital assets between different products is not feasible without custom integration between every single node on the supply chain (see the reference to a 'Cambrian Explosion' of DAM integrations in Tim Strehle's DAM News article about DAM interoperability). This, in turn, makes it more difficult for end-users to take a best-of-breed approach towards deciding which DAM vendors they wish to work with; they are forced to choose a single provider, even though its capabilities in one functional realm might be less developed than another option's.
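The scale of this problem can be illustrated with a back-of-envelope calculation: without a shared standard, every tool-to-DAM pairing needs its own bespoke connector, so the number of integrations grows with the product of the two populations, whereas a common protocol needs only one adapter per product. The sketch below uses hypothetical market figures purely for illustration:

```python
def connectors_without_standard(num_dams: int, num_tools: int) -> int:
    # Every ancillary tool must integrate with every DAM individually.
    return num_dams * num_tools


def connectors_with_standard(num_dams: int, num_tools: int) -> int:
    # Each product only needs one adapter to the shared protocol.
    return num_dams + num_tools


# Hypothetical market: 100 DAM vendors, 50 ancillary tool vendors.
bespoke = connectors_without_standard(100, 50)   # 5000 separate integrations
standard = connectors_with_standard(100, 50)     # 150 adapters in total
```

The quadratic-versus-linear gap is the 'Cambrian Explosion' of integrations in miniature: the more participants the market gains, the worse the bespoke approach becomes relative to a standard.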
It is common to hear DAM vendors, consultants and analysts pontificate about how bad 'silos' are, yet as measured by the complexity involved in getting them to interoperate with other tools, DAM applications are among the worst offenders. I believe many DAM vendors prefer silos containing clients' digital assets because they are, at best, ambivalent about those assets being transferred to another platform and, at worst, insecure and fearful about losing out to competitors.
Due to the lack of interoperability standards, many ancillary tool vendors whose functionality DAM systems might draw upon select something which is not a DAM solution (e.g. Dropbox, Box etc.) as a generic integration target, because it is easier and cheaper to pick something that already has critical mass than to deal with DAM integration on a case-by-case basis. Many with some knowledge or expertise in Digital Asset Management (and DAM vendors in particular) complain about prospective customers deciding to use these less versatile products (with their much larger user-bases) as faux-DAM solutions. Without interoperability standards that might enable DAM vendors to collectively offer a user-base of comparable size, however, it is difficult to see what catalyst would cause that to change. The lack of interoperability in DAM encourages the growth of less specialised (and poorer) products because of lowest common denominator effects like the one I have described. These reasons explain why the DAM software industry is structurally inefficient, why it gets nowhere very fast and hence why it is being written off as 'dead' by some long-term stakeholders.
Interoperability Standards Will Facilitate Digital Asset Supply Chains
In the preceding digital asset supply chain article I gave some examples of business scenarios that depend heavily on the efficient operation of physical supply chains like parcel delivery, electricity supply and consumer goods. All of these have interoperability (and standards to support them) at their core. For almost everything that modern civilisation depends on, there has been a process of agreeing some conventions to make it possible to use multiple components provided by different suppliers and have a high degree of confidence that they will still work together. That is not the case in Digital Asset Management and I contend this is a key factor why it has not yet lived up to its promise.
To make a comparison, imagine if you had an electrical device, say something quite simple like a desk lamp. How do you make this work? Obviously, you plug it into a power outlet and turn on the switch. If this were a DAM software industry desk lamp, however, all the plugs and sockets are incompatible with each other, so you would need to disable your power source, attach the live and neutral connectors, locate (and fit) some protective covering to ensure the user does not get electrocuted, employ a health and safety consultant to carry out a risk assessment to verify that it was safe, then restore the power and have at least one 'support engineer' on-hand in case something went wrong. All this complicated activity might also necessitate a project manager who would plan and coordinate everyone involved (as well as deliver weekly progress reports and prepare PowerPoint presentations etc. for the 'stakeholder' – i.e. the person who needs to turn on the lamp). Now you want to turn it off? That wasn't in the original scope, so the power will have to be shut down once more (so there will need to be a light 'outage' for an agreed window) and an engineer will need to be briefed to design a switch, test it, fix any issues (including paying for replacement bulbs during the 'development' process) and then we can switch everything back on again. By the way, here is an invoice for this custom integration project for your kind attention.
Does anyone still need convincing that DAM is in desperate need of interoperability standards?
Why Is DAM Interoperability Not Yet A Reality?
Having established that there is a clear and unequivocal business case for interoperability standards to enable scalable digital asset supply chains, attention then must turn to how to implement them. Tim Strehle’s article (which I referred to earlier) reviewed the current state of DAM interoperability. The attempts at formulating the standards which he mentions are hampered by low rates of adoption and limited interest (at least as they relate to DAM). There are a few reasons for this, which I can summarise as follows:
- Standards bodies (like OASIS, who sponsored CMIS4DAM) require fees to be paid to join and participate. This actively discourages smaller vendors from contributing. Further, their processes for approving standards are slow and bureaucratic in nature. I was involved in the early stages of this and I can appreciate the need for robustness, governance and the rigour of a formalised approach, but if these principles are too rigidly enforced too soon, they stifle progress and tie everyone up in non-productive activities. The results of this (in regard to CMIS4DAM) are self-evident. One further point: contrary to some opinions, DAM is not a subset of ECM, and the CMIS standard itself has arguably failed to gain significant traction (although it should be noted that it does have support from a number of ECM vendors).
- In highly fragmented markets like DAM where there are no major players to speak of, it is exceptionally difficult to achieve critical mass if only a few firms participate. Having only large companies (i.e. $1bn+ turnover) who happen to offer a DAM system will not capture even 5% of the market, not least because these businesses only earn a fraction of their revenue from DAM. Even by moving down to the 'large small' operators, less than 20-35% of the market is likely to be represented. Many DAM systems in use in organisations (some of a very significant size) are developed by vendors who have fewer than ten DAM-related clients. Trying to coerce the whole sector into accepting a protocol solely by virtue of a small group of large and powerful interests will not work, because such interests just don't exist in DAM (at this stage in its development, at least).
- There is a perception that DAM interoperability is someone else's problem. I have seen tweets and comments along the lines of 'I don't know how interoperability will happen, but I imagine those who know about these things are dealing with it'. To those people, a couple of words of advice: they aren't. Don't believe the advertising; no one has addressed this properly and very little is being done about it by anyone. Nearly everyone in DAM thinks this is someone else's problem that will get solved, somehow, eventually, but no one knows by whom or how (let alone when).
- Some vendors still regard interoperability as a custom integration project (see my previous comparison with electric lamps). A few even view interoperability standards as a threat to their revenue streams (or those of their channel partners). This is about as short-sighted as opening a railway station in the mid-19th century and telling passengers that trains departing from it would have to be carried by hand to their destination by teams of human 'train lifters', rather than run on wheels and tracks, because you can charge more for the tickets if the service is delivered that way.
- Where an existing protocol is to be used, for example Semantic Web and Linked Data, the learning curve required to understand it also inhibits adoption. Long-term, I believe this is where DAM needs to end up (and I know a few people active in DAM already understand that). Unless there is a substantial increase in their wider adoption which forces everyone to pay attention, however, I believe this is more aspirational than a practical means to get everyone off the starting line.
- Discussions about standards amongst technical people (like software developers) can quickly get bogged down in minutiae and details which frustrate progress and fuel resentment rather than achieving consensus. All too often, decisions about standards get left solely in their hands, because 'this is technical, right?' This contributes to the 'not my problem' attitude prevalent when the DAM interoperability topic comes up. To make practical progress with a protocol, far wider participation is required than just technical personnel. Just as you cannot leave implementation of DAM systems solely in the hands of software developers, the same applies to any industry-wide standard.
Some of these problems (such as the last one) are generic and would affect any field, but the characteristics of the DAM market make them particularly difficult to resolve.
Cutting Through: Getting To The Point Where DAM Interoperability Is Feasible
There are not inconsiderable obstacles preventing DAM interoperability; however, many of these are self-inflicted and can therefore be addressed. In the spirit of this series of articles, I am going to try to do this right here.
In 2013, I wrote an article for CMSWire, The Building Blocks Of DAM Interoperability, where I mentioned four items: unique identifiers, standardised metadata frameworks, asset registries and embedded metadata. While I still think these are valid, the most important piece of the interoperability puzzle is a standardised metadata framework (for all the reasons previously discussed in the other articles in this series). Operations on digital assets and users also need to be considered, but a metadata framework is the foundation upon which more advanced schema can be introduced.
Interoperability Standards Development Using Agile Methods
When discussing interoperability in the past, a point which is usually acknowledged by those involved is the need for it to at least start by being as simple and easy as possible so participation is maximised. What usually happens not long after is either nothing at all, or an eagerness to move on to a given ‘pet topic’, some aspect of functionality etc which holds interest or appeal for those concerned. The necessity to keep it as simple as possible seems to have a de-magnetising effect on technical people and they find the subject dull and uninteresting as a result. So how should we proceed?
Methodologies like 'agile' are popular among software development teams. I am not as sold on their effectiveness as many are; however, I understand that an incremental approach can have advantages, and it echoes the same continuous improvement philosophy that underpins supply chain management principles. As a method to develop standards in complex, fragmented markets like DAM software, it could offer a means to generate some forward momentum.
DAM Interoperability 1.0
Based on the previous observation, here is my agile-style DAM Interoperability 1.0 specification for a base asset metadata node:

<asset>
</asset>

This is expressed in XML, but it could equally be some other structured notation (e.g. JSON).
DAM Interoperability 1.1
The key flaw with version 1.0 is that there is not a great deal of standardisation: developers can add more or less anything they like to the asset node, which does not make it very useful. How do we decide what metadata elements to incorporate into this framework? I cannot answer that question at this point because no one has ever asked it, so let's allow each DAM development team to define their own subsidiary node called 'extension' and put anything they like in it. For example:
<asset>
  <extension name="[name]">
  </extension>
</asset>
Where [name] = the name of each vendor and/or their DAM platform.
<asset>
  <extension name="XYZ">
    <id>1234</id>
    <title>My Asset</title>
    <copyright>J.Smith</copyright>
    <keywords>
      <keyword>example</keyword>
      <keyword>digital</keyword>
      <keyword>asset</keyword>
    </keywords>
  </extension>
</asset>
To make it clear, everything inside 'extension' is entirely arbitrary: the schema might be significantly different or quite similar between vendors. If the DAM you use doesn't have a 'title' node (nor any other) that doesn't matter. In addition, the node structure might vary a lot between types of assets, or remain the same (or it might change based on some other criteria). For this edition, that is not important; what matters is that every DAM development team has a clearly designated area, held within the same metadata packet, which does not interfere with anyone else's.
What have we achieved here? In terms of interoperability, not a lot: it is still not possible to identify core asset metadata in anything like a standardised fashion. What we do have, however, is the foundation which I described earlier as essential for interoperability standards to progress. Nearly every DAM vendor who offers a web-based API (with REST as a base protocol) should be able to implement the DAM Interoperability 1.1 specification, and achieving compliance should take a moderately skilled developer less than a couple of hours (many far less than that).
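As an illustration of how little work compliance involves, here is a minimal sketch (Python, standard library only; the function name and the vendor metadata are hypothetical) of a wrapper that takes whatever a vendor's existing API already returns and emits a DAM Interoperability 1.1 packet:

```python
import xml.etree.ElementTree as ET


def to_interop_1_1(vendor_name: str, metadata: dict) -> str:
    """Wrap a vendor's native metadata in a DAM Interoperability 1.1 packet:
    an <asset> root containing a single vendor-named <extension> node.
    The contents of <extension> are deliberately left arbitrary."""
    asset = ET.Element("asset")
    extension = ET.SubElement(asset, "extension", {"name": vendor_name})
    for key, value in metadata.items():
        if isinstance(value, list):
            # Pluralised container, e.g. <keywords><keyword>...</keyword></keywords>
            container = ET.SubElement(extension, key)
            for item in value:
                child = ET.SubElement(container, key.rstrip("s"))
                child.text = str(item)
        else:
            node = ET.SubElement(extension, key)
            node.text = str(value)
    return ET.tostring(asset, encoding="unicode")


# Hypothetical vendor 'XYZ' wrapping its native metadata.
packet = to_interop_1_1("XYZ", {
    "id": 1234,
    "title": "My Asset",
    "copyright": "J.Smith",
    "keywords": ["example", "digital", "asset"],
})
```

The point of the sketch is that no vendor has to change its internal schema at all; compliance is purely a thin serialisation layer over what already exists.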
DAM Interoperability 1.2
Those DAM developers who have now implemented DAM Interoperability 1.1 can congratulate themselves, instruct their marketing departments to issue press releases about how they are ‘leading DAM interoperability players’, at the ‘cutting edge of DAM technology’ etc. Since we are now all on-board with DAM Interoperability 1.1, it is a good time to discuss where we go with version 1.2.
At this point, we need to conduct a more detailed analysis and make some decisions about how to make version 1.2 a reality. Ideally, we need to analyse the APIs offered by as many different development teams as possible, plus those of related standards like IPTC, Dublin Core, PLUS etc., then abstract a series of base entities and incorporate these into the core schema (i.e. outside the 'extension' node and into the main area common to everyone). I do not expect subsequent versions to be as straightforward as previous ones, but developing standards (just like developing software) is never going to be an ideal career choice for those who yearn for the simple life.
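To make the promotion step concrete, here is a sketch (Python, standard library only; the mapping table and all field names are hypothetical) of what moving agreed base entities out of the vendor extension and into the shared core might look like. A mapping, derived from the kind of cross-API analysis described above, tells the serialiser which vendor fields correspond to core elements; everything unmapped stays inside 'extension':

```python
import xml.etree.ElementTree as ET

# Hypothetical mapping from one vendor's native field names to the
# agreed core schema elements (the output of the cross-API analysis).
CORE_FIELD_MAP = {"asset_id": "id", "headline": "title", "rights": "copyright"}


def to_interop_1_2(vendor_name: str, metadata: dict) -> str:
    """Emit a DAM Interoperability 1.2 packet: mapped fields become core
    <asset> children common to everyone; the rest remains vendor-specific."""
    asset = ET.Element("asset")
    leftover = {}
    for key, value in metadata.items():
        if key in CORE_FIELD_MAP:
            node = ET.SubElement(asset, CORE_FIELD_MAP[key])
            node.text = str(value)
        else:
            leftover[key] = value
    extension = ET.SubElement(asset, "extension", {"name": vendor_name})
    for key, value in leftover.items():
        node = ET.SubElement(extension, key)
        node.text = str(value)
    return ET.tostring(asset, encoding="unicode")


# Hypothetical vendor 'XYZ' with one field that has no core equivalent yet.
packet = to_interop_1_2("XYZ", {
    "asset_id": 1234,
    "headline": "My Asset",
    "rights": "J.Smith",
    "internal_rating": 5,
})
```

The design point is incremental: each new agreed entity simply moves a field from the extension into the core, so vendors already compliant with 1.1 never break, they just gradually converge.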
Fortunately, my colleagues who help me put together DAM News (and our other associated services) and I have built something to help with collecting all this together and carrying out further fact-finding, which is hosted at damstandards.org: http://damstandards.org/
If you happen to be a DAM vendor (commercial or otherwise) or the operator of a DAM-related standard, you can register on DAM Standards, request to become an editor and place details of your API on it. You can add PDFs and provide some links to further information. As with our vendor directory, we will announce those who participate on Twitter. We have a simple discussion board system for DAM interoperability 1.x and if we get enough interest, we will open this up for general participation (and possibly some other associated protocols). All this is free of charge and apart from the necessity to register and enter some basic details, there are no complicated membership procedures, legal agreements to sign etc. It should take you about 5-10 minutes.
In the near future, we will be introducing an option on our DAM vendor directory to indicate DAM Interoperability 1.x compliance which will get validated (and there are no plans to charge for this either). So there you have it, DAM vendors, a relatively easy opportunity to contribute to the development of a usable industry-wide DAM interoperability standard and some free PR/marketing exposure thrown in. What’s not to like?
In this article, I have sought to achieve two goals:
- Demonstrate that DAM interoperability is essential to the long-term success of this sector and show those who dismiss or ignore it that doing so is short-sighted and puts their own long-term survival (and everyone else's) at some peril.
- Provide a plan and some practical methods that might offer a chance of at least getting a discussion going about DAM interoperability and giving everyone an opportunity to describe their own method.
I fully expect some passionate debate about this subject and a lot of arguments about both the implementation detail as well as the wider strategy. Those are all positive by-products of this process. What should not be countenanced is a dismissal of any attempt to pursue a standard (whether through self-interest or laziness) or non-constructive 'we're all doomed' contributions. If you really believe that, you might as well leave this market, because there is no point in continuing to place your investors' capital at risk by being involved in it as a long-term endeavour (and that applies even if the investors and owner/operators are one and the same person).
In many ways, the premise of this series, Hector Medina's concern that 'DAM is dead', is really a question about whether Digital Asset Management (as an endeavour) is capable of achieving true scalability. To realise that goal, a level of industrialisation is necessary which has not been possible to date, because of how complicated it is to get DAM solutions and associated ancillary tools to work together (see my electric light comparison earlier). This is why there are still hundreds of DAM vendors who all consider this to be one of their key markets and why there is little evidence of any genuine consolidation trends, apart from some very selective M&A activity.
In the past I have discussed DAM with a number of private equity interests who were contemplating taking positions in DAM vendors. An observation made by some is that many firms are a bit too much like web design operations in terms of both ambition and size (as measured by turnover). There is both a lack of understanding of how to develop their products in a fashion which is strategically scalable (i.e. not just being able to deal with lots of data but having any kind of understanding of the digital asset infrastructures they are supposed to be building) and also a fear that by doing so, they might impact their already small revenue streams.
Whether or not the DAM market can agree some interoperability protocols is a pivotal point in its development. If it can, there will be an opportunity to scale (with both threats and opportunities to current incumbents, in equal measure). If not, it will eventually wither away to be usurped by an as yet-unspecified technology that can both deal with these problems and assimilate many of the current benefits which DAM is supposed to excel at.
I put this question to those who operate on the sell-side of DAM: what is it going to be? Are we all going to finally deal with the interoperability crisis in DAM, or is everyone satisfied to work for a business that (as measured by percentage of income from DAM alone) all too frequently barely competes with the average local convenience store? The time to provide an honest answer is now at hand.