How Different Will DAM Be In 2030? Part 2


This is the follow-up to the first part of How Different Will DAM Be In 2030, which discusses Martin Wilson’s article for Information Age: 7 predictions for digital asset management in 2030.  In this piece, I will consider the remaining predictions, 4-7.

4. Users will demand streamlined simplicity

“As DAM systems acquire more and more features, there’s a risk that they’ll become overwhelming to users. The result? Stripped down, easy-to-navigate interfaces will be key selling points. Think minimalist aesthetics and intuitive interfaces, with all the complexities tucked neatly away beneath the bonnet.” [Read More]

For anyone who has a DAM which has been in active use for a few years, this might come across a bit like the interface design equivalent of hoping for world peace and an end to global conflict rather than a prediction per se (with all the likelihood of being achieved which that analogy implies).  I cannot offer much of value on the topic of how to prevent the inhabitants of the world from killing and maiming each other; however, I do have an idea of what causes DAM software to become over-complex, and the blame lies with everyone, not just the developers (nor the vendors who employ them).

On Martin’s point about ‘minimalist aesthetics and intuitive interfaces, with all the complexities tucked neatly away beneath the bonnet’: most recent DAM applications I see have the easy part (the ‘minimalist aesthetics’) and have done so for quite a few years.  However, I think a good number of DAM UI designers appear to misunderstand the difference between graphic design and interaction design – in particular, how the interface will need to adapt as the volume of assets and product sophistication begins to scale up.

It is quite easy to implement a DAM interface which is very simple, stripped down, etc., and in which non-DAM experts can find assets relatively quickly.  The bigger problem is what happens when users want to conduct more intellectually demanding operations like advanced searches, organising assets into collections (and merging/replicating them) or re-purposing.  The situation tends to get even more convoluted when users move into the ‘administration’ side of the interface, especially to carry out asset cataloguing, metadata management, user management and workflow design.  You can also add custom extensions into the mix, since if the organisation is of any size, these will almost certainly end up being needed eventually.

Another source of excessive demands on the mental capital of DAM users is unexpected metadata.  This tends to happen when users who either have not been trained (or who do not care) add a lot of garbage metadata which generates some bizarre search results and other strange side-effects that no one anticipated. In efforts to address these issues and deal with competing demands for functionality, the developers add even more ad-hoc extensions which temporarily appear to fix the issues and keep some of the users happy, some of the time.
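To illustrate one guard-rail against this, below is a minimal sketch (the field names and vocabulary terms are hypothetical, not taken from any particular product) of validating user-supplied metadata against a controlled vocabulary before it enters the catalogue:

```python
# Minimal sketch: validate user-supplied metadata against a controlled
# vocabulary before accepting it into the catalogue. Field names and
# vocabulary terms are hypothetical.

CONTROLLED_VOCAB = {
    "department": {"marketing", "sales", "design", "legal"},
    "asset_type": {"photo", "video", "logo", "document"},
}

REQUIRED_FIELDS = ("title", "department", "asset_type")


def validate_metadata(metadata: dict) -> list:
    """Return a list of problems; an empty list means the record is acceptable."""
    problems = []
    for field in REQUIRED_FIELDS:
        if not metadata.get(field):
            problems.append(f"missing required field: {field}")
    for field, allowed in CONTROLLED_VOCAB.items():
        value = metadata.get(field)
        if value is not None and value not in allowed:
            problems.append(f"{field}={value!r} is not in the controlled vocabulary")
    return problems


# A record of the 'garbage metadata' variety described above:
print(validate_metadata({"title": "Q3 banner", "department": "marketnig"}))
# ['missing required field: asset_type',
#  "department='marketnig' is not in the controlled vocabulary"]
```

Rejecting (or at least flagging) such records at the point of entry is far cheaper than the ad-hoc fixes that otherwise accumulate downstream.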

DAM vendors and the management staff who work for the clients that commission them currently over-obsess about basic usage.  Very few want to engage with why interfaces end up becoming more complex than they need to be and how to go about rationalising them.  Both users and developers opportunistically reach for phrases like ‘less is more’ and ‘simplicity is best’, but only when it suits them.  For users, this is typically when they first start out with an application and cannot work out how to use it, or when it appears that an investment of time might be required which exceeds what they are willing to commit.  For developers, it is when they are asked to implement some tricky piece of functionality which they don’t like the look of.  The corollary of these positions involves quotes like “yes, but we must have feature abc, because of reason xyz” (users) or “we had to build it that way because you said you wanted feature abc and otherwise it would conflict with objective xyz” (developers).  Overlaying all of this is cost: devising applications that are both powerful and simple is expensive.  While these are considered worthwhile objectives at the outset of DAM initiatives, the motivation to sustain them tends to wane a few years after the initial roll-out.  The expedient measures (or ‘hacks’, in software developer parlance) that subsequently result are a major factor contributing to why users now find your DAM system hard to use.

I cannot pretend I have a pithy one-liner already prepared to answer all this, but if the following principles were taken into account now (in 2016), then by 2030 the prediction referred to might be achievable across the lifetime of the system, not just when it first gets launched:

  • Precisely define scope.  Keep functionality within a tight brief and avoid trying to solve multiple problems simultaneously.
  • Integrate existing (and functionally separate) applications where possible; don’t replicate the same functionality in two different applications.
  • Segment users into defined groups – don’t just treat everyone as an amorphous mass who will all start using the system at the exact same point in time.
  • Understand that users’ needs will evolve and try to anticipate how that could occur and the impacts it will bring.
  • Segment assets into different catalogues based on their perceived value to the business (and plan on needing to move assets between them regularly).
  • Continuously seek to educate all users and (if feasible) have the DAM product development team directly involved in these sessions so they hear user feedback first-hand.

Note that the list above isn’t a recipe that you can just follow and out comes ‘streamlined simplicity’; it is no more than a set of principles which might help stop your DAM app becoming a complicated mess of your own making.  If you are involved with DAM, in any capacity, you are a part of this problem.  Coming to terms with it (and taking responsibility) is the only way progress will be made, and some understanding of the dynamics of the situation could go a long way towards mitigating many of the effects described.

5. Big data integrations will provide business intelligence

“Big data in DAMs will enable users to make better decisions (and also demonstrate ROI). Current systems can provide metrics on the popularity of assets – for instance, how often an image has been downloaded. But we’ll see a whole new layer of analytics, such as marketing intelligence on how audiences engage with digital collateral.” [Read More]

This is more reasonable and achievable than the previous item.  As with some of Martin’s other predictions, I think this process is already happening in DAM now, and it is being discussed by vendors who are taking their cue from users.  Back in 2012, I wrote an article for DAM News about DAM and Big Data and noted more or less the same trend as Martin has independently identified: that asset usage data is likely to be collected outside the DAM and integrated with asset records.

Large tech firms such as Google, Facebook, Amazon etc. vicariously collect data from users because they know it offers the raw material to extract patterns of behaviour which can be used for predictive purposes (not always reliably, it should be noted).  Using on-line technologies, they have industrialised market research, and the scale of their operations and immediate feedback loop permits them to take it to a level not previously available to corporations.  Other organisations (both private and public) are now coming to the conclusion that they need to do the same (albeit on a smaller scale).  It is inconceivable that Digital Asset Management technology will not get put to use in the service of this activity because it represents a near-perfect opportunity to collect structured data with two ready-made entities (users and assets) as the focal points to organise around.
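To make the ‘two ready-made entities’ point concrete, here is a minimal sketch of the kind of structured usage event a DAM could emit for collection outside the system, with the user and the asset as the focal points.  The field names are hypothetical rather than any vendor’s actual schema:

```python
# Minimal sketch of a structured asset-usage event, organised around the two
# ready-made entities (user and asset). Field names are hypothetical, not any
# particular vendor's schema.
import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone


@dataclass
class AssetUsageEvent:
    user_id: str      # who acted
    asset_id: str     # what was acted upon
    action: str       # e.g. "search", "preview", "download", "share"
    channel: str      # where the asset was used, e.g. "web", "email"
    occurred_at: str  # ISO 8601 timestamp


def record_event(user_id: str, asset_id: str, action: str, channel: str) -> str:
    """Serialise an event for shipping to an external analytics store."""
    event = AssetUsageEvent(
        user_id=user_id,
        asset_id=asset_id,
        action=action,
        channel=channel,
        occurred_at=datetime.now(timezone.utc).isoformat(),
    )
    return json.dumps(asdict(event))


print(record_event("u-1042", "a-998", "download", "web"))
```

Because every event carries both identifiers, the analytics layer can aggregate in either direction: which assets a user engages with, or which users an asset reaches.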

I am less sure about some aspects of the latter part of this prediction, but I think I can see the direction of Martin’s thought process:

“And as metrics become integrated with machine learning capabilities, DAMs will even help users to make decisions about which assets to share on particular channels. It may even reach the audience intelligence level, with DAMs mining data of social media followers to identify the kind of content they might like.” [Read More]

I mentioned my reservations about AI and machine learning in the previous article.  Used in an advisory capacity, however – to inform you about facts which the algorithms have identified as significant and which you can optionally take or leave – this might yield some insight which you would not have otherwise uncovered.  This is a better use of AI, in my opinion, since there is still some capacity for a human being to critically evaluate the suggestions and filter out the ones which are clearly invalid.
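As a rough illustration of that advisory pattern (the suggestions, scores and threshold below are hypothetical stand-ins for whatever a real algorithm would report), the machine proposes and a human disposes:

```python
# Minimal sketch of the advisory pattern: an algorithm proposes keywords with
# confidence scores, and a human reviewer accepts or rejects each suggestion.
# The suggestions and threshold are hypothetical stand-ins.

def review_suggestions(suggestions: dict, threshold: float = 0.6) -> list:
    """Present machine-generated keyword suggestions above a confidence
    threshold for human review; return only the ones the reviewer accepts."""
    accepted = []
    for keyword, confidence in sorted(suggestions.items(), key=lambda kv: -kv[1]):
        if confidence < threshold:
            continue  # too speculative to bother the reviewer with
        answer = input(f"Apply keyword '{keyword}' ({confidence:.0%})? [y/N] ")
        if answer.strip().lower() == "y":
            accepted.append(keyword)
    return accepted


if __name__ == "__main__":
    machine_suggestions = {"beach": 0.92, "sunset": 0.81, "meeting": 0.31}
    print(review_suggestions(machine_suggestions))
```

The essential point is the final human veto: nothing the algorithm proposes reaches the asset record without someone having had the opportunity to reject it.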

6. Assets and software services will be decentralised

“…the trend in software, hosting and storage is now moving away from single servers and monoliths towards distributed systems. Microservices will become the de facto software architectural style and cloud hosting may even be superseded by ‘fog-hosting’, a form of decentralised networking that could bring greater efficiency of hosting, improved user experience and reduced pressure on networks.” [Read More]

I agree with this prediction and I think it is the one which is most likely to turn out to be accurate because it follows patterns which have historically occurred in other sectors.  The logic behind this is relatively straightforward: as functional requirements become more sophisticated, it becomes uneconomic for vendors to implement everything themselves in-house and they are obliged to sub-contract the less profitable elements of their offer to specialist providers.

I am not familiar with ‘fog-hosting’, but I think I understand the meaning behind the terminology: the implication is that rather than going to a single cloud provider and sourcing everything from them, multiple cloud operations are utilised, i.e. many clouds = fog.  This was a subject I discussed last year on DAM News, and developments like MAIDSAFE seem to point in that direction.  I find the libertarian political subtext of that project to be an unnecessary distraction which is likely to limit its adoption, and I anticipate that someone else may offer an alternative that employs similar principles (if they have not already done so) but without the insistence on obfuscating where each fragment of data is stored.  The core idea of the protocol is a good one, however, and has considerable potential for re-use, as sketched below.  In addition to storage, I would anticipate the same scenario with other services which DAM solutions may sub-contract tasks to.  This is a logical extension of using local software components and application libraries etc. and it is already well underway.
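As a rough sketch of the dispersal idea (the ‘stores’ below are in-memory stand-ins for independent cloud providers, and a real implementation would add encryption and redundancy such as erasure coding), fragments of a file can be spread across several providers and reassembled from a manifest:

```python
# Minimal sketch of the 'many clouds = fog' idea: split a file into fragments
# and disperse them across several independent stores. The stores here are
# in-memory dictionaries standing in for real cloud providers; a production
# version would add redundancy (e.g. erasure coding) and encryption.
import hashlib

stores = [dict() for _ in range(3)]  # stand-ins for three cloud providers

def disperse(data: bytes, chunk_size: int = 4) -> list:
    """Split data into chunks, place each on a store, and return a manifest of
    (store index, chunk hash) pairs needed to reassemble the file."""
    manifest = []
    for i in range(0, len(data), chunk_size):
        chunk = data[i:i + chunk_size]
        key = hashlib.sha256(chunk).hexdigest()   # content-addressed key
        store_index = (i // chunk_size) % len(stores)
        stores[store_index][key] = chunk
        manifest.append((store_index, key))
    return manifest

def reassemble(manifest: list) -> bytes:
    return b"".join(stores[idx][key] for idx, key in manifest)

manifest = disperse(b"digital asset payload")
assert reassemble(manifest) == b"digital asset payload"
print(f"{len(manifest)} fragments spread across {len(stores)} stores")
```

No single provider holds the whole asset, yet anyone holding the manifest can reconstruct it – which is the property the fog-hosting argument turns on.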

One point to note which could militate against this prediction is how technology models tend to alternate between consolidated and distributed phases over periods lasting around 10-15 years.  As such, microservices, ‘fog-hosting’ etc might become hot topics before 2020, but a decade later in 2030 they could have started to fall out of favour, with something else which seeks to bring everything back to a central point again acquiring momentum.  Using terms like ‘fog’ in relation to IT subjects may not be entirely beneficial from a marketing perspective, and I can foresee the centralisation-oriented tech vendors of the future using phrases along the lines of digital assets being ‘lost in the fog’ etc.  I predict some debate about how much users should depend on distributed technologies, and DAM risk management plans being required that can evaluate the relative merits of both sides of the argument.

7. The invisible DAM

The final prediction has some relationship with the previous one and is arguably a logical conclusion of DAM de-centralisation trends:

“It’s likely that the future of DAM lies in partnership, integration and interoperability. The result? Digital asset management could move from being a discrete system to a discreet system – an invisible function of a wider ecosystem of integrated software.  We may not even call it DAM anymore. That’s because when DAM started out, it was all about the media library – the storing, cataloguing and sharing of assets. But right now we’re in the middle of an evolutionary shift from media library to media hub. This means it will become the norm for DAMs to be integrated with other software, including design, content management and workflow systems. In short, un-siloed digital asset management combined seamlessly with applications for sales, marketing, design and everything else (kind of like an asset operating system).” [Read More]

The ‘asset operating system’ analogy is one I have heard before (I believe David Diamond has described it in these terms previously also).  I would expect this to be realised using internet connectivity (aka ‘the cloud’ or ‘the fog’ etc) as the method of communication.  This is already forming: storage facilities like Amazon S3 and Azure refer to files as ‘storage objects’ and have metadata associated with them.  If DAM continues to be adopted more widely, I can see its terminology becoming preferred, with ‘digital asset’ being substituted for ‘storage object’ because it is a more business-oriented description of what these entities really are.
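As a small illustration of the ‘storage object plus metadata’ model, here is a sketch using Amazon S3 via boto3; the bucket name, object key and metadata fields are hypothetical, and it assumes AWS credentials and a local file are already in place:

```python
# Minimal sketch of the 'storage object plus metadata' model using Amazon S3
# via boto3. The bucket name and metadata fields are hypothetical; this
# assumes AWS credentials are configured and banner.jpg exists locally.
import boto3

s3 = boto3.client("s3")

s3.put_object(
    Bucket="example-dam-assets",           # hypothetical bucket
    Key="campaigns/2016/banner.jpg",
    Body=open("banner.jpg", "rb"),
    Metadata={                             # user-defined metadata, stored
        "title": "Q3 campaign banner",     # alongside the object itself
        "department": "marketing",
    },
)

# Retrieving the object later returns the metadata alongside the bytes.
response = s3.head_object(
    Bucket="example-dam-assets",
    Key="campaigns/2016/banner.jpg",
)
print(response["Metadata"])
```

The storage object already behaves like a rudimentary digital asset – binary essence plus descriptive metadata – which is why the substitution of terminology seems plausible.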

There are some points which are not mentioned in Martin’s piece but will become issues.  At some stage, integrating assets and metadata is going to have to become far more straightforward than it is now.  For that reason, I can envisage some kind of universal asset identifier being realised to help address this.  I have mentioned blockchains and distributed ledgers in reference to this before on DAM News because they provide a ready-made identifier which can be used for other purposes also (and do not depend on the DAM industry getting its act together to agree interoperability standards).  Although these concepts look like the strong favourites to get widely adopted, it is certainly possible that something else might usurp them by 2030, especially as there appear to be significant internal disputes and conflicts among those responsible for agreeing blockchain protocols.  Irrespective of how that issue gets resolved, for global asset operating systems to be practical, it will be necessary to uniquely identify assets using a generic method that can be easily implemented by application developers.
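One generic method, sketched below, is to derive the identifier from the asset’s own bytes with a cryptographic hash, so that any application can recompute and verify it without a central registry; distributed ledger identifiers offer similar properties.  This is only an illustration of the idea, not a proposal for the eventual standard:

```python
# Minimal sketch of one generic approach to a universal asset identifier:
# derive it from the asset's own bytes with a cryptographic hash, so any
# application can recompute and verify it without a central registry.
# The 'asset:sha256:' prefix is a hypothetical naming convention.
import hashlib

def universal_asset_id(data: bytes) -> str:
    """Content-addressed identifier: identical bytes always yield the same
    ID, regardless of which system computed it."""
    return "asset:sha256:" + hashlib.sha256(data).hexdigest()

payload = b"example rendition bytes"
print(universal_asset_id(payload))

# Two independent systems holding the same bytes derive the same identifier:
assert universal_asset_id(payload) == universal_asset_id(bytes(payload))
```

Whatever scheme ultimately wins, it will need this character: trivial for any developer to implement, and independent of any single vendor’s database.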

I believe that infrastructure and protocols are where the real action in DAM will occur over the next decade, but if you are a hands-on end-user, these sorts of considerations will be lower down your priority list.  As suggested by Martin, I can see DAM functionality dividing into two distinct groups: management tools and search/selection interfaces. The former will be concerned with analytics, access control, monitoring, batch operations, ad-hoc cataloguing tasks and so on – i.e. DAM operations management.  People who currently have job titles like ‘Digital Asset Manager’ are likely to spend more time using these kinds of systems and will want functionality that allows them to leverage their time as much as possible without sacrificing precision and control where it is needed.

The second group are those whose priority is rapidly gaining access to assets, plus a more defined range of additional functionality like organising assets into collections and possibly manipulating them.  If there is a universal interoperability protocol for exchanging digital assets, it becomes possible to use one vendor’s front-end with the management capabilities of another.  Rather than one provider having to offer everything in a single monolithic product, it then becomes tenable for vendors to specialise in one side and develop a competitive advantage over those who are fighting wars on both fronts.  From this perspective, it might seem like the DAM will disappear or become invisible, but it will be more a case of the two major (but distinct) elements which are currently combined in DAM systems de-coupling from each other to the extent that they eventually become independent applications in their own right.
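A minimal sketch of that de-coupling might look like the following, where a front-end depends only on a shared contract rather than on a specific vendor’s back-end; the protocol and method names are hypothetical illustrations:

```python
# Minimal sketch of the de-coupling described above: a front-end written by
# one vendor talks to whichever management back-end implements a shared
# protocol. The contract and method names here are hypothetical.
from abc import ABC, abstractmethod


class AssetManagementBackend(ABC):
    """The interoperability contract a management-side vendor would implement."""

    @abstractmethod
    def search(self, query: str) -> list: ...

    @abstractmethod
    def fetch(self, asset_id: str) -> bytes: ...


class VendorABackend(AssetManagementBackend):
    """One vendor's management layer sitting behind the shared contract."""

    def search(self, query: str) -> list:
        return [{"id": "a-1", "title": f"result for {query!r}"}]

    def fetch(self, asset_id: str) -> bytes:
        return b"asset bytes for " + asset_id.encode()


def front_end_search(backend: AssetManagementBackend, query: str) -> None:
    # The front-end only knows the contract, not which vendor is behind it.
    for hit in backend.search(query):
        print(hit["id"], hit["title"])


front_end_search(VendorABackend(), "logo")
```

Swapping in a different vendor’s backend requires no change to the front-end – which is precisely what makes specialising on one side of the divide commercially viable.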

My expectation is that a lot of vendors operating now will still want to hold on to the whole piece (i.e. both elements) and some of the larger and better capitalised ones might even be able to pull that off for some time to come.  For the majority, however, it will become a case of choosing which battles to fight – or risk losing them all.  To offer an approximate comparison: some legacy DAM vendors from the earlier years of DAM in the 1990s built their own proprietary databases, but no one who entered the market in the last 15 years would even consider doing that now, as it is a massive duplication of cost and effort: database vendors can deliver products which are far superior to anything DAM vendors can build themselves.  Even the larger firms (the very few that operate in the DAM market) will probably be forced to allow users the flexibility to make their own choices about how much of their suite to utilise, or they run the risk of their products being deemed incompatible dead-ends and poor choices to replace any incumbents who have similar limitations.

Having looked through Martin’s article, although I do not agree with all his predictions, the general themes (and especially the latter items) appear plausible to me.  Undertaking any kind of market crystal ball-gazing exercise can seem like a fool’s errand, since there are so many unknown factors that can completely change the premises on which the predictions depend.  With that said, I do not think this is a pointless exercise and, given the lack of imagination collectively exhibited by the sell-side of our industry, any attempt to think about where DAM might be going over the longer term is certainly worth debating.


One comment

  • Still sounds a little like Hen’s Teeth and Unicorns. That isn’t a swipe taken at you or the author; I admire the vision and positive attitude and certainly hope for what is predicted. I’d be in heaven!

    Maybe I’ve just had a long string of bad experiences and there really are companies (users) and DAMs (vendors) with the stomach for the discipline involved on both sides to make these predictions come true. So far my experience says otherwise though, despite trying for these things insomuch as my capacity allows where I work.

    Maybe you could provide some context? Are we talking about smaller, presumably nimble DAM companies with something to prove and the skills and confidence to kickstart such efforts? Or will all this come from the middle ground who traditionally are happy to be “good enough” for lack of real competition? Or will this come from DAM offerings up on Mt. Olympus who have the money to truly get it right and the clients who are willing to pay for it?

    And that’s only the DAM side; if the users don’t follow up on the promise, even the perfect product will never work. This may be a bigger problem than the DAM, if that’s possible.

    I’m honestly confused how this will happen inside 15 years based on the last 15 years. But, I’ll cross my fingers, nonetheless. :)
