
Creating The Right Conditions For Innovation

This feature article was contributed by Martin Wilson of Asset Bank and is part of our Improving DAM In 2017 series.


In his own words, Ralph Windsor claims the DAM market is stuck in a rut, that there is now little true innovation by vendors. Even as someone on the vendor side, I find it hard to disagree with this analysis.

The products available now are fundamentally very similar to those on offer 15 years ago. Back then I remember describing a digital asset management application as “a searchable database of digital files”. That’s still a pretty accurate one-liner.

This may be about to change. It feels like the DAM industry is poised for a big shakeup, driven by rapid changes in the wider software development industry. There’s nothing wrong with the techies driving this: technology and marketing are no longer separate disciplines. Today’s CMO needs to be as comfortable with the technical aspects of their product, and processes such as analytics, as with traditional marketing theory. Developers work in agile teams and obsess over value, rather than features. Products evolve via goal-driven decisions and ruthless prioritisation based on data.

If existing DAM vendors want to be part of this (and escape the rut), they need to embrace the changes in software engineering that have already happened in other software product industries. DAM needs to catch up.

Why has innovation stalled?

Successful software applications deliver value by solving business problems: relieving pain points such as tedious manual processes, or doing something that was previously not possible (including exploiting or creating opportunities).

The DAM market is mature, and it’s reasonable to assume that most DAM applications now solve the problems of core digital asset management acceptably well.

It is obvious that the rate of innovation in a particular market will slow as it matures and all the quick-win solutions are implemented, but there is another reason why it may have slowed even more in the DAM industry. To understand that, we need to think about how software innovations happen. At a high level, there are two approaches to solving a problem with software: use currently available technology innovatively, or innovate the technology itself.

DAM vendors are dependent on third-party technologies

Most DAM vendors are really systems integrators and so tend to use the first approach. There’s nothing wrong with this: using existing technologies to solve clients’ problems is value-focused (lean), and in the early days much was achieved by applying existing technology to digital asset management workflows. But now we are at a point where many vendors who want to innovate further are dependent on third parties, waiting for them to make applicable technological advances.

A great example of this is the huge excitement about AI auto-tagging. Tagging assets is a boring, labour-intensive manual process, crying out for a better solution. The amount of time it takes to enter metadata in order to make assets findable is a massive pain point for users. Large organisations spend hundreds of person-hours on this every week, and this hasn’t changed since the early days. An innovative solution is much needed.

As soon as image recognition APIs appeared on the scene and were marketed as commercially credible, DAM vendors rushed to integrate their software with them. We were no exception: we launched our auto-tagging module in early 2016.

Unfortunately auto-tagging is still not accurate enough to be a game changer, and DAM vendors simply don’t have the resources to make significant advances in the underlying image recognition technology themselves. So if we want to say goodbye to manual tagging, we’ll have to wait for the likes of Google to improve their machine learning capabilities.
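In practice, the accuracy problem is usually managed with a confidence threshold: tags the recognition service is sure about are applied automatically, while the rest are queued for human review. Here is a minimal sketch of that pattern; `recognise_image` is a hypothetical stand-in for a third-party image recognition API, stubbed with canned results rather than a real service call.

```python
CONFIDENCE_THRESHOLD = 0.8  # below this, suggestions need human review

def recognise_image(image_path):
    """Stub standing in for a third-party image recognition API call."""
    return [
        {"tag": "beach", "confidence": 0.95},
        {"tag": "sunset", "confidence": 0.87},
        {"tag": "camel", "confidence": 0.41},  # a low-confidence guess
    ]

def auto_tag(image_path, threshold=CONFIDENCE_THRESHOLD):
    """Apply only tags the API is reasonably sure about; return the
    rest separately so a human can confirm or reject them."""
    results = recognise_image(image_path)
    accepted = [r["tag"] for r in results if r["confidence"] >= threshold]
    review = [r["tag"] for r in results if r["confidence"] < threshold]
    return accepted, review

accepted, review = auto_tag("holiday.jpg")
print(accepted)  # tags applied automatically
print(review)    # tags queued for manual review
```

The threshold is the vendor’s only real lever here: lowering it reduces manual work but lets more wrong tags through, which is exactly why the underlying recognition accuracy matters so much.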

Plenty of other pain points still exist in the core digital asset management processes carried out by almost every organisation. But solving them is dependent on technological advances that DAM vendors can do little to influence. Here’s an extreme example: we all know that finding the right image for use in your latest marketing campaign can be time-consuming. Imagine an application that can read your mind, finding the right image for you just as you realise you need one. When can we have that? Not yet, that’s for sure. (AI agents that analyse what you’re up to on your computer in real time and then make this data available to other applications can’t be far off.)

Be ready to innovate

This doesn’t mean DAM vendors just have to sit back and wait. Third-party APIs and cloud services are emerging at an unprecedented rate and integrating with them has never been simpler, now that REST reigns supreme. App Store-style marketplaces in Amazon AWS and Heroku make it easy for developers to market their API services and for other developers to use them.

Every new, innovative third-party technology has the potential to solve unsolved problems. DAM solutions need a technical architecture that enables emerging technology to be plugged in quickly. To be ready for the future, a DAM platform must be adaptable.
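One common way to achieve that adaptability is to hide each capability behind a narrow interface, so a new third-party service can be wrapped in an adapter and registered without touching core code. The sketch below illustrates the idea in Python; all the class and function names are illustrative, not taken from any real product.

```python
class TaggingProvider:
    """Interface every tagging plugin must implement."""
    def tags_for(self, asset_name):
        raise NotImplementedError

class KeywordFromFilename(TaggingProvider):
    """Trivial built-in provider: derive tags from the filename.
    A cloud image-recognition adapter would implement the same interface."""
    def tags_for(self, asset_name):
        stem = asset_name.rsplit(".", 1)[0]
        return stem.replace("-", " ").split()

PROVIDERS = {}

def register_provider(name, provider):
    """Plug a new provider into the platform at runtime."""
    PROVIDERS[name] = provider

def tag_asset(asset_name, provider_name):
    """Core code depends only on the interface, not on any provider."""
    return PROVIDERS[provider_name].tags_for(asset_name)

register_provider("filename", KeywordFromFilename())
print(tag_asset("summer-beach-campaign.jpg", "filename"))
```

When a better recognition API appears, supporting it becomes a matter of writing one small adapter class, rather than re-architecting the application.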

Look for new problems

Another source of innovation is unexplored problem domains, which present new opportunities to use existing technology innovatively. To leverage these, we need to challenge the idea of a single one-size-fits-all “DAM application”.

Instead of viewing a DAM solution as an independent application that users access when they want an asset, it should be seen as a platform underpinning all the other applications used across the enterprise. This goes beyond offering CMS plugins and a decent API: it means making it easy for domain-specific applications to incorporate any component of the DAM architecture, and building bespoke front-ends that support different parts of the business differently.

In niche business domains, either as-yet unexplored verticals or just requirements within a single enterprise, there is much more scope to innovate, using today’s technologies, as the unique problems of these domains are uncovered.

Some DAM vendors might not like the sound of this, especially those who used their product to escape the software consultancy treadmill. But this isn’t custom software development as we knew it, at the other end of the spectrum from an out-of-the-box implementation. The two approaches to software implementation are converging, as pluggable, reusable components enable bespoke applications to be assembled rapidly from out-of-the-box modules.

The right architecture

The technical architecture of a DAM system has a major effect on its ability to support future innovations. Some fairly recent technology trends have changed the game in software engineering and can’t be ignored.

Splitting up the application into domain-specific, independent modules has always been best practice. By deploying each of these on separate servers you are essentially creating microservices, with all the benefits they offer in terms of development, deployment, scalability and reuse.

Another important characteristic is an event-based architecture, which enables tasks to be carried out at the same time, rather than as a linear sequence. Now that processing power is cheap and easy to use (especially in the cloud), having components work in parallel whenever possible is essential. Users don’t like waiting.

The front-ends should be written in JavaScript, to provide modern, dynamic user experiences. Technologies such as React, which facilitate the reuse of front-end components, make it possible to provide bespoke user interfaces for individual clients to fit their workflows exactly, without exorbitant price tags.

Continuous delivery to the cloud

Even now, many enterprises still have their own data centres and haven’t fully embraced the cloud. I don’t think I’m going out on a limb by suggesting their number will dwindle, especially as encryption-at-rest and virtual private clouds start to allay security concerns, and internet speed becomes a non-issue.

At present, though, the market for on-premise solutions is still strong. But this model has serious drawbacks in terms of maintenance and upgrades, even when using containerisation technologies such as Docker and Kubernetes. Our solution can be deployed both in the cloud and on-premise; however, when we roll out new features or experiments to early adopters we always do this on our cloud offering first, enabling us to continuously deploy changes in response to user feedback.

Vendors offering only on-premise solutions will find that this slows down innovation. The build-measure-learn cycle of the lean startup approach to product development is hard if it takes weeks to gain permission to deploy a change to a client’s server.

So is it all (DAM) systems go?

It may be a while before all this becomes the norm. In some ways the industry is a victim of its own success. DAM solutions that have been around for years are very likely to be fully-featured (no, I didn’t say “bloated”) and monolithically architected: two seemingly unrelated characteristics that, in combination, mean the DAM industry is lagging behind software engineering practices in less mature markets. Here’s why:

  1. The barrier to entry is high. Assuming they know their stuff, a new entrant to the DAM market would build its solution using modern techniques and technologies. But how many new entrants are we likely to see? Unless it can position its product uniquely (which is the trick, of course), a startup would need a whole range of features ready for launch in order to compete with existing solutions. In lean startup parlance, the Minimum Viable Product (MVP) would be enormous. This makes it a risky proposition for new players. That’s not to say we won’t see any – if the gap widens between what’s on offer and what could be on offer, new vendors will step in.
  2. For existing vendors, the incentive to change their legacy applications is low. In many cases it would involve significant re-architecture work, which competes with pressure from the sales team to add new features. It’s risky to trust the vision of the technologists. Today’s cool architecture is tomorrow’s legacy application: is it worth the investment?
  3. The market for core DAM is still growing. Most vendors still make a good living selling a single, generic application that supports the common business processes we currently define as Digital Asset Management.

So, for our industry to become innovative again, we’re going to need new players willing to take a risk (and with funding) and/or some forward-thinking existing vendors willing to embrace change.
