DAM Innovation – Who Hit The Pause Button? (Part 2)


In part one of this two-part series, I discussed how the level of innovation in the Digital Asset Management industry has recently reached a plateau which, the evidence suggests, it cannot now advance beyond. I referred to Jeff Lawrence’s CMSWire article, which contained an aggregated selection of quotes from vendors who were asked to describe their road maps for 2015.

In part two, I will examine the final item in Jeff’s article, Business Intelligence and analytics, and then outline the potential risks to the DAM industry if the trends described in both of these DAM News items continue along the same path.

While writing all this, I also read the following by David Diamond on his DAM Survival Blog: Digital Asset Management Predictions for 2015. In particular, the last two points in his list resonated with me. I had thought along similar lines myself before reading his piece, and there are a number of common themes discussed here also. Anyone who has not already read David’s piece should do so in addition to my own.

Business Intelligence and analytics
This was the fourth item mentioned, along with user interfaces, video for social media and interoperability. There is evidence of an understanding by vendors that DAM users need to know where their assets are being used and to cross-reference that data with other applications. For example, this quote from Deanna Ballew of Widen:

“Our goal is to help marketers and creatives capture, organize, share and analyze their marketing content, so we’re providing insights that will help them make better decisions about each piece of content, based on its efficacy.” [Read More]

Deanna has broken down the people and processes in her summary, which suggests Widen have thought about a strategy for delivering business intelligence functionality that is relevant to DAM users. With that said, this is supposed to be about innovation, and it is similar to the issues I observed with the interoperability section: some good intentions, but not much information on the ‘what’ and ‘how’. The CMSWire article is billed as a ‘sneak peek’ into what is coming next in DAM, but I doubt much furtive behaviour would really be necessary to find out the information that is being offered.

There are attempts to clarify in more tangible form what is meant by innovation in analytics, for example, the following from Charlie Ward Wright IV of WAVE:

“Customers are asking ‘How can I mine all this data and make my content smarter?’ Customers want to know everything about their assets analytics, business intelligence and customer segmentation. WAVE is integrating, implementing new features around customer information, adding a basic CRM that tracks businesses and contacts against their asset usage, which is starting to become powerful. For example, WAVE turns all system users into contacts, which means anyone that uses the system can be tracked” [Read More]

It is easy to be excessively reductionist in my critique of the work undertaken and disregard the effort that has almost certainly been expended by the development team involved, but in terms of raw functionality, this is saying there is a user profile (or ‘account’ aka ‘contact’) and an audit trail which tracks transactions (i.e. numerical indexes representing actions that can be statistically analysed, not simple text logs of activity).  Users and assets are the two most important entities in nearly every DAM system you are likely to encounter (currently, at least). In my opinion, if you don’t have tracking of key events and the relationships between users and assets on a multi-user DAM then your product is fundamentally flawed from a long-term ROI assessment perspective since there are few other practical methods to gain the insight required to make properly informed decisions.
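
To make that concrete, the sketch below (written in Python with hypothetical field and event names – it is not drawn from WAVE or any other specific product) illustrates the kind of structured event record such tracking implies, and how it can be aggregated for ROI reporting rather than read as a plain text log:

    from collections import Counter
    from dataclasses import dataclass
    from datetime import datetime

    # A hypothetical audit-trail entry linking a user to an asset via an action.
    # The point is that events are structured data that can be statistically
    # analysed, not free-text log lines.
    @dataclass
    class AssetEvent:
        user_id: str
        asset_id: str
        action: str        # e.g. "download", "share", "embed"
        occurred_at: datetime

    def downloads_per_asset(events):
        """Count download events per asset – raw material for ROI reporting."""
        return Counter(e.asset_id for e in events if e.action == "download")

    events = [
        AssetEvent("u1", "a100", "download", datetime(2015, 1, 5)),
        AssetEvent("u2", "a100", "download", datetime(2015, 1, 6)),
        AssetEvent("u1", "a200", "share", datetime(2015, 1, 7)),
    ]
    print(downloads_per_asset(events))  # Counter({'a100': 2})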

It seems a lot of vendors agree with this, which is why I see this feature in lots of other DAM systems already. That being the case, what WAVE describe is not innovative; it is making a virtue out of a necessity and, once again, a path many others have already travelled before them. The right question is asked, but the answer is not sufficiently in-depth (and by their own admission, the question is one the users came up with themselves).

You could argue that those vendors who talk in generalist terms and describe objectives rather than hard facts are simply better at playing the PR/marketing game than their peers and can appear superior without necessarily having anything to back it up. That analysis might well be an accurate one, but it is further evidence that the points of differentiation between vendor products are narrowing to just marketing-related differences and there are fewer opportunities to innovate (that anyone is prepared to invest in, at least).

2015: The Year Most DAM Vendors Abandoned Metadata Innovation?
Another striking feature of both these articles is how little mention metadata gets. There are some who claim DAM is entirely about metadata and not much else. I would hesitate to go that far, but suffice to say, if you don’t have metadata, then Digital Asset Management ceases to exist also.

There is coverage of Picturepark’s adaptive metadata feature, but for reasons that only Jeff or the CMSWire editors can explain, this is in the User Interface section. A possible theory is that if they had included the metadata topic in a dedicated area (as you would normally expect with DAM) then it would only include Picturepark, since most of the other vendors have run out of anything more innovative they can do or say about the subject. It should also be noted that adaptive metadata is no longer new either; from my recollection, it was announced in 2013. I will say, however, that there need to be many more architectural innovations like this in DAM, and also an understanding of their importance by those handling the marketing for vendors (a point not lost on Picturepark, it must be acknowledged).

It is a bit of a surprise that metadata gets so little coverage given the number of professional digital asset managers who regularly complain about how poor and restrictive the metadata tools available in DAM solutions are. I know that many end up using third party applications to handle this task, such as that rarely-mentioned (and long overdue for retirement) workhorse of metadata management, Microsoft Excel. Is it really a tenable proposition in 2015 that if you want to catalogue a very large collection of digital assets, the most practical option is still usually not your DAM system of choice, but some 20+ year old desktop application?

I know from the kind of consulting enquiries my firm receives that cataloguing is a big issue for DAM users and it is a problem which isn’t going away. There is a significant and growing backlog of material waiting to have metadata applied to it, which represents a major bottleneck to assets being introduced into digital asset supply chains – with consequences for other activities that depend on the assets being available in a findable form. This is the kind of hard and boring problem that many active DAM users want DAM vendors to help provide solutions for, but the vendors don’t appear to be that interested in their plight and prefer to apply lipstick to four-legged livestock residing on their farms instead. After the first article, we had a few comments on the DAM News LinkedIn group, including this one (which I completely agree with) by Jim Jezioranski from DAM systems integrator, Otec Solutions:

“Existing clients want their supplier to focus on core features and stability but those are base expectations of a prospect so suppliers continue to work on the general appeal of their product. Too many vendors try to peddle incremental upgrades as innovation (HTML 5). Some of these are even long overdue and others are pet projects.” [Read More]

Recently, we had an application for a profile on our DAM vendor directory from a firm who offered metadata management tools. We had to decline their request since they did not meet the basic criteria for a DAM solution, having no representation of assets (among a small range of other points), but it did appear to me that they were offering many of the features that are essential for managing large-scale metadata operations, and they were delivering them in a manner that was far superior to a number of the DAM products I have seen.

As evidenced by the articles Deb Fanslow has written for DAM News, a trend in DAM is more trained librarians getting involved, often for commercial operations now as well as the academic and public sector roles with which they might historically have been more closely associated. My expectation is that as these individuals gain budget-holding responsibility, their willingness to put up with DAM tools that fail to provide the kind of metadata management capabilities they need will decline quite swiftly. A possible reaction might be for them to do the cataloguing using one of these specialist metadata management applications and then decide that delivery of the assets to end users could be better handled through systems which do a better job of presenting digital media – WCM, for example.

WCM And DAM Convergence Or DAM Market Capitulation?
It is the case that DAM systems offer facilities not found so readily in WCM, such as asset manipulation and re-saving media in different formats, but very few of the components used to provide these features are actually developed in-house by DAM vendors. A point that has been made before on DAM News is that DAM software implementation is a little like the automotive industry: most of the key components are produced by someone else and vendors are mainly responsible for integrating them into a cohesive unit – a contributory factor in explaining why there are so many competing products.
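
To illustrate that point, here is a minimal sketch (using Pillow purely as an example of the pattern – it is not any particular vendor’s implementation) of how a typical ‘re-save in another format’ feature is usually built: the vendor’s own code is mostly glue around a library produced by someone else.

    from PIL import Image  # third-party imaging library that does the actual work

    def create_rendition(source_path, dest_path, max_size=(1024, 1024), fmt="JPEG"):
        """Re-save an asset in a different format and size."""
        with Image.open(source_path) as img:
            img = img.convert("RGB")     # ensure a JPEG-compatible colour mode
            img.thumbnail(max_size)      # resize in place, preserving aspect ratio
            img.save(dest_path, fmt)

    # Example usage:
    # create_rendition("master.tiff", "web_preview.jpg")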

WCM vendors already have a platform which many have integrated with a range of other tools, in a similar manner to DAM. Many of the core features are also quite closely related, and when I liaise with developers of DAM systems, a number have previously worked in web development or even directly on WCM platforms (either implementing components for them or the platform code itself). The ease with which WCM vendors could re-tool to offer Digital Asset Management features via a module or add-on option should not be underestimated by the DAM software industry, since a lot of its development staff probably started their careers in that market anyway.

The DAM vendors who have diluted their core offer with a lot of features that try to compete with WCM are at risk of spreading themselves too thin and/or designing product platforms that are nightmarishly complex to maintain. Large firms like Adobe might be able to ride this out by throwing money at the problem, but as has been noted before, DAM is highly fragmented and composed of smaller vendors where even the better known brands have quite low numbers of full-time staff (more than one hundred remains a rarity). I doubt many have the required capital to see this process out (not to mention the lack of an established sales channel of the kind Adobe already possesses).

It is a given that WCM systems are still not DAM, but a few could be made into a reasonable facsimile of one – and to a level that many general users will deem satisfactory for their needs, especially if the functionality that WCMs are weak on is bolstered by some other more specialised tools that offer benefits over and above what many DAM systems can provide.

DAM Lite: The Start Of The DAM Bear Market?
There are a number of other trends in DAM which could confirm the hypothesis that the recent rapid growth in our industry, which has been in play for six years or more, might now be slowing. One of them is the rise of so-called ‘DAM Lite’ systems. These are products with stripped-down features, either implemented as separate editions with their own brands or using the same platform with more advanced features disabled and/or removed.

There are potential risks for anyone who pursues this strategy and as I will explain, they might boil over to affect others who do not as well. One which is already beginning to take shape in DAM is vendors competing with themselves. If users are offered two editions with different price points, they will usually look at the cheaper option first and attempt to find a way to justify a lower level of expenditure. Users of the lower cost option will still make demands for functional improvements, just as the more expensive product users do (in fact the cost as a ratio of their overall budget probably has more significance as far as they are concerned). While some might upgrade to the more advanced editions, I doubt the number of those who do is very high. To keep the cheaper edition competitive and win new customers, more functionality will need to get continuously added, especially as new entrants will be expanding the scope of what they provide at a comparable price point.

As long as the senior product can be enhanced with innovations that users are willing to pay for, there is a point of differentiation to justify the split pricing model. As I have discussed, however, that is drying up in DAM now. In very mature markets with few competing firms, vendors can get away with upgrades that are cosmetic or relatively insubstantial, but in those with hundreds of competitors and increasingly homogenised products (as is the case in Digital Asset Management in 2015) there will be pressure to reduce prices. At some point, if DAM vendors do not make some clear decisions about what they are offering and to whom, there will be a price crash as the bottom end of the market destroys demand in the middle and a stratification process ensues (i.e. very expensive or very cheap, not much in between).

Deflationary Trends In The Cloud And Everywhere Else Too
In Jeff’s CMSWire article, a number of the vendors discuss how they foresee Cloud as being the future. I think it has probably already attained that position of pre-eminence (having been promised for quite a few years already) and this might be a further indication that there is a lack of noteworthy innovations to discuss.

In addition to storage and virtual servers, Amazon, Google, Rackspace and others of their ilk are looking for an increasingly diverse range of methods to extract fees from customers. These include core services that Cloud DAM vendors depend on, such as video transcoding, media streaming and, more recently, workflow. Many DAM vendors who have consolidated around a single Cloud services provider are effectively channel partners or resellers who could not exist independently of them. The Cloud providers like Amazon and Google are already pursuing aggressive pricing strategies to acquire business from each other. It does not seem like it will be long before unlimited Cloud storage is offered free of charge or in return for some modest monthly subscription (it is already available to users of the Google Apps platform). This will contribute to the pressures in the DAM market, as the under-performing vendors will rely on the margin to survive and their stronger competitors will then spot an opportunity to acquire customers by lowering prices.
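
As an illustration of how thin that layer of dependence can be, a Cloud DAM vendor’s ‘video transcoding’ feature may amount to little more than a call into the provider’s own service. The sketch below uses AWS Elastic Transcoder via boto3 purely as an example of the pattern; the pipeline and preset identifiers are placeholders and no particular vendor’s code is implied.

    import boto3

    # A "transcode" feature that is really a thin wrapper around the cloud
    # provider's own service – the vendor adds little beyond this API call.
    transcoder = boto3.client("elastictranscoder", region_name="us-east-1")

    def transcode_to_web(source_key, output_key):
        return transcoder.create_job(
            PipelineId="REPLACE-WITH-PIPELINE-ID",      # placeholder, configured in AWS beforehand
            Input={"Key": source_key},
            Output={
                "Key": output_key,
                "PresetId": "REPLACE-WITH-PRESET-ID",   # placeholder, e.g. a generic web preset
            },
        )

    # Example usage:
    # transcode_to_web("masters/campaign.mov", "renditions/campaign.mp4")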

The wider economic picture has to be considered also. There are general deflationary trends in most developed economies now, with downward pressure on prices for lots of items, including core commodities like oil and industrial metals such as copper, as well as consumer items like food. The DAM sector moved in step with the wider tech market and avoided the worst effects of the financial crisis in 2008 because what was offered then was considered new and innovative and offered a potential for efficiency savings which had to be taken seriously.

That advantage is evaporating fast now and has not been replenished anywhere near as much as it needs to be. DAM software is purchased by companies operating across a cross-section of other markets who will face their own margin compression effects and will need to offset them onto their suppliers by reducing their cost base, i.e. the wider economic factors are exerting downward pressure on prices in general. The DAM industry appears not to have grasped what allowed it to buck the last negative economic trends and, instead, is demonstrating that it too can only compete on price and little else.

Conclusion
To summarise the key challenges facing DAM:

  • Lack of innovation.
  • Homogenised products that all look and behave alike.
  • Failure to address users’ more complex problems.
  • Increasingly prevalent differentiation based on price and not much else.
  • Greater supply of a wide range of components utilised by vendors at lower prices.
  • Competition from related fields and increased opportunity for users to integrate components themselves.
  • Deflationary wider economic environment.

These factors combined might be painful for many vendors who are currently comfortable but not gaining a lot of traction at the upper and lower ends of the price spectrum, especially those who offer very little which cannot be obtained at lower cost from the numerous competing interests now lining up to take their slice of the pie.

The last few years in DAM have produced what I understand economists call a ‘rising tide’ effect where all boats have risen and momentum has helped generate business for nearly everyone who offered a credible product and managed to market it competently. The next five year period seems like it might be a lot more difficult and just being in the game and doing what everyone else does won’t be enough to maintain satisfactory revenues any longer. The DAM software industry has to stop congratulating itself about the spike in demand enjoyed over the last few years and become genuinely more innovative, particularly in relation to some of the deeper, architectural issues that will drive more sophisticated, productivity enhancing features.  Alternatively, it must prepare for fierce price competition, with all the undesirable consequences that this effect might imply.

4 Comments

  • First off, Ralph, thanks for the mentions of my 2015 predictions article and for the nice words about Picturepark’s Adaptive Metadata technology. I’m obviously biased, but I do think that Adaptive Metadata is the only interesting thing that has happened in DAM in the past few years.

    But as you mentioned, it’s something that has already happened. So what do we do now?

    In truth, Adaptive Metadata is a new metadata foundation that enables us to rethink all sorts of aspects of conventional digital asset management practices–that was the idea behind it. It will be many years before we can safely say we’re getting bored with it, but that doesn’t mean Adaptive Metadata is the be-all and end-all. Much more is needed, from us and from other vendors.

    Perhaps the saddest commentary on our industry for me came recently when I overheard someone talking about how great it would be if you could easily “steal” features from competing systems to make your own better overnight. (This was not about DAM.)

    It got me thinking about what I would want to steal from other DAMs if I were to venture out on my own and start up a new company. Without a doubt, I would want Adaptive Metadata because I can’t even think of configuring a DAM now without it. (And it really pisses me off that our CRM doesn’t offer something similar.)

    But otherwise, what is there that’s worth stealing?

    As you mentioned, DAM systems are all starting to look and feel disturbingly similar, particularly in the “Lite” category. They’re becoming commodities and, as you also mentioned, this will greatly affect pricing once storage is free and so-called “high end” alternatives are gasping for breaths of justification. In fact, it’s no wonder that so many vendors are doing nothing to promote the notion of a DAM interoperability standard, because system lock-in is going to become an increasingly important business plan for those who don’t move forward.

    I’m a big fan of the CNBC show “Shark Tank.” It not only inspires me to see ordinary people come into “the tank” with extraordinary ideas, but I find it very interesting to see how the “sharks” respond to those ideas. I often imagine us DAM vendors standing up there trying to collectively convince the sharks that our industry is worth an investment. But the responses I guess we’d receive are similar to those they give others: “Your product isn’t focused enough for anyone to understand it” or “if there is a need for your product, how come I don’t know about it” or, my favorite, “I have to be interested in what I invest in and this doesn’t interest me.”

    In all cases, the responses end with “I’m out.”

    So what the hell is wrong with us – all of us? Is DAM so boring a concept that the world’s most talented developers, designers, salespeople and marketers go elsewhere? Is it so complicated that we fail to explain it adequately? Are we so blind that we’re creating solutions for problems that don’t really exist? Or perhaps the problem does actually exist, but we’ve just failed to properly address it.

    In any case, I was once fired from a DAM vendor for complaining (over and over) about lack of innovation in our own product. And now I see much of the industry complaining about the same thing. In some strange way, this gives me hope.

    David Diamond
    Director of Global Marketing
    Picturepark

  • Ralph, David, thanks for your insights. I’m happy to be (literally) on the same page with you :-)

    I really liked Jeff’s article: It was a good overview of what DAM vendors are planning to build. But you’re right, there’s not much innovation going on. I’m with David – Picturepark’s “Adaptive Metadata is the only interesting thing that has happened in DAM in the past few years.” (The most interesting thing that has NOT happened being the “DAM value chain” which DAM News wrote about in 2013.)

    The innovation I’m looking forward to is Linked Data in DAM and related systems (WCMS, CRM):

    Currently, we’re evolving data and metadata structures separately in many data silos. But none of these systems has all the information we need on its own; we must connect them. Think of islands with complex railway systems that turn out to be incompatible when you’re building bridges between the islands. Interoperability is still way too hard, and the lowest common denominator very low.

    An example: Try finding photos by the photographer “John Doe” across your DAM system(s), WCMS, photo agencies you’re buying from, and Google Images search. Assume that each of these sources has that information stored as structured, searchable metadata. You’ll likely find out that what one system calls a “photo” is named “image” or “picture” in other systems (and you’ll have to look for this in the “asset type”, “record type” or “object class” field). The photographer will be stored in a “creator”, “photographer” or “taken by” field, as “John Doe”, “Doe, John” or “john_doe”. And since there’s more than one John Doe in the world, you have no way of knowing which of them they’re referring to.

    That’s the kind of problem Linked Data tries to solve. Imagine how much easier it would be to aggregate and combine your own precious data, and join it with other people’s. Let’s see whether 2015 brings us closer to that goal. I’m not holding my breath for DAM vendors to invest in this, though (including the one I’m working for)…
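
    As a rough sketch of the idea (using the rdflib Python library, the Dublin Core vocabulary and placeholder URIs – not any real system), the photographer becomes a single identifiable resource rather than three differently spelled strings:

        from rdflib import Graph, Literal, Namespace, URIRef
        from rdflib.namespace import DCTERMS, FOAF, OWL

        EX = Namespace("http://example.org/")   # placeholder namespace for illustration

        g = Graph()
        photo = EX["photo/42"]
        photographer = EX["person/john-doe"]

        # The photo references a person resource, not the string "John Doe",
        # so "Doe, John" and "john_doe" can all resolve to the same identity.
        g.add((photo, DCTERMS.creator, photographer))
        g.add((photographer, FOAF.name, Literal("John Doe")))
        # Link to a shared authority record so other systems know *which* John Doe.
        g.add((photographer, OWL.sameAs, URIRef("http://example.org/authority/john-doe")))

        print(g.serialize(format="turtle"))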

    Tim Strehle
    Developer at Digital Collections

  • Jose Eugenio Grillo

    Deep analysis and perfect conclusion with relevant comments. Congrats.

  • Great article and discussion! As far as advancing metadata and standards, I can’t help but wonder what would happen if commercial DAM vendors got together with cultural heritage DAM vendors, and a few good UX designers and information architects…

    Library/archival systems (OPACs/catalogs, scholarly databases, DAMs, etc.) are famous for their extremely granular metadata and taxonomy capabilities (due to the need for abstract, hierarchical schemas and extensive vocabularies and authority files to describe both physical and digital entities and their many relationships) and for their preservation capabilities. It’s perhaps foolish to try to generalize the current commercial DAM offerings, but in comparison, they are better suited to handling works in progress, production/marketing workflows, and asset distribution. Both camps have been lambasted quite thoroughly for their UIs. It seems to me that each would benefit from stealing a bit from the other, and consulting some professional interface designers.

    And I am optimistic that if more information professionals enter the DAM field, there will be more dialogue with DAM vendors in regard to the information architecture capabilities of DAM systems. In the interest of avoiding the monolithic DAM system, API integration with taxonomy/thesaurus management software would be high on my list, as would flexible/abstract data models, and of course an info pro to ensure quality information governance. Oops, I just leaked some copy from my upcoming articles! ;-)
