How To Avoid Wasting Your DAM Budget: An ROI Oriented Approach To Digital Asset Management Implementation
This feature article was contributed by DAM News Editor, Ralph Windsor
One theme of 2013 was the number of vendors, consultants and analysts whose marketing departments fell over each other to produce ‘me too’ colourful infographics that told an engaging (but ultimately fictional) quantitative ROI story about DAM. Usually, this was something along the lines of x average hours saved multiplied by number of employees multiplied again by their hourly cost. The ‘average hours saved’ is where these graphics usually transition from accounting fact towards marketing fantasy.
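To illustrate how fragile this formula is, here is a short sketch (all figures hypothetical) of the calculation these infographics typically rely on:

```python
# The typical infographic ROI formula, showing how sensitive its output is
# to the unverifiable 'average hours saved' input. All figures hypothetical.

def infographic_roi(hours_saved_per_week, employees, hourly_cost, annual_system_cost):
    """Annual 'saving' minus system cost, per the marketing formula."""
    annual_saving = hours_saved_per_week * 52 * employees * hourly_cost
    return annual_saving - annual_system_cost

# Same organisation, same system cost; only the assumed hours saved varies.
for hours in (0.25, 1.0, 2.0):
    roi = infographic_roi(hours, employees=200, hourly_cost=30, annual_system_cost=100_000)
    print(f"{hours} hrs/week assumed -> 'ROI' of {roi:,.0f} per year")
```

A modest shift in the assumed 'average hours saved' swings the result from a loss to a handsome return, which is precisely why the figure makes for better marketing than accounting.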
DAM ROI Based On Empirical Evidence
Here at DAM News, we have been at pains to point out the folly of this endeavour at every opportunity. For anyone who wants to eschew those methods, however, there are few other readily available alternatives. Clearly, investments into DAM need to produce a positive return that is at a level which adequately compensates an organisation for the risks involved. It is easy to see why quantitative methods are popular as they are simple to both explain and understand. That does not mean, however, that they are either accurate or instructional for identifying how a business generates ROI from DAM – which it absolutely must do.
I have an alternative approach which might not offer as many opportunities for colourful graphics, but which uses empirical evidence based on the unique characteristics of each individual organisation where Digital Asset Management is being actively considered. Using a case-by-case analytical evaluation technique rather than some mythical ‘averages’ it is possible to more efficiently assess ROI and use the data obtained to optimise it once implementation has commenced. My contention is that this places ROI at the centre of DAM implementation decisions, rather than being a marketing or PR exercise that is forgotten about once contracts are signed.
The most crucial element of my method is to accept up-front that you will not be able to accurately predict ROI in advance. All the percentages you often see on infographics bear little relationship to each individual case with DAM. Even as averages they are flawed and subject to all manner of factors which may lead your organisation to generate a significantly different ROI (in a positive or negative direction). By acknowledging this basic fact about DAM ROI, the obvious method for dealing with investment decisions is to handle them incrementally and in very well defined stages that you can assess carefully before committing more funds.
The outline of my method is a series of questions grouped under three activities: analysis, implementation and evaluation.
Each is carried out in an incremental and iterative fashion – in other words there is not a single start, middle and end sequence, but the activities will be repeated across the lifetime of the system with the prior stage informing the following ones.
One of the key principles of this approach is to avoid making assumptions which can impact the accuracy of assessments and affect what data is collected to measure ROI. To do that, you need to have conducted an analysis of your requirements from an ROI perspective.
There is no definitive set of questions that will work in every case, but the following are a good series of DAM analysis points to start off with:
- What are the management objectives of the organisation and/or individual departments that it is composed of?
- What outcomes is each group of stakeholders looking for, both in general and as a result of the introduction of the DAM system?
- What are the key desirable characteristics of DAM solutions for the organisation?
- What are the facts about the organisation, its digital assets and how they are currently used?
- How, precisely, does the organisation currently manage its digital assets?
- Where does the organisation expect DAM to generate ROI?
- What are those expectations based upon?
- How will DAM help the organisation to realise its management objectives?
- What lessons were learned from prior implementations (within the organisation especially)?
- Is any post-implementation evaluation data available? What can be learned from it?
In answering these points, further questions will be revealed. You almost certainly will need to drill down into more specific DAM issues – for example, the extent to which your organisation’s digital assets are likely to be used on mobile devices and whether implementation of mobile UIs should be an early priority. At the initial stage, however, it is crucial to focus on why you think you need DAM in the first place. Enhancing productivity is usually the single biggest reason and you need to know what is currently impacting that so you can see how (or even if) this maps across to the stated ROI benefits of a given DAM solution.
Many organisations quickly gloss over the analysis stage so they can get to what they think is the ‘real work’ of implementation. To maximise ROI, implementation should be constrained (as far as possible) to only that which you can prove will contribute a positive return. The less staff time, business change and general upheaval created, the lower the impact and the cheaper the cost. Further, implementation generates ripple-like effects where the act of carrying out one task generates others as a result. You need to be fully conscious of why you are doing it and what, exactly, you hope to obtain as a result or the implementation activity can rapidly get out of control.
The following are some key questions to consider for DAM implementations.
- What is the absolute minimum amount of implementation work that can be contemplated at the lowest cost?
- What benefits are each of the implementation activities expected to offer?
- How can those benefits be assessed post-implementation?
- What existing systems or methods exist already and how can these be used?
- What working practices are the biggest obstacle to DAM and how can they be changed?
- What are the key implementation risks and how can they be managed?
- How will we test implementation work against the facts uncovered during analysis?
- How do the results of the evaluation inform subsequent analysis and implementation stages?
It should be noted that implementation does not just refer to the technical work involved in setting the system up, but covers everything from deciding you are going to progress a DAM initiative through to evaluating whether an element of the implementation worked or not. This means it will include activities like vendor selection, training and change management.
The important question from any implementation activity is whether or not it has generated positive ROI – or at least enabled that to happen in a subsequent iteration. Therefore, a critical issue is having the tools to evaluate it properly.
When considered from a purely ROI perspective, the biggest benefit of DAM solutions is the opportunity to collect accurate, real-time data about user activity in relation to digital assets. Making unsubstantiated assertions about ROI before implementation is guesswork. Post-implementation, you gain access to harder facts to support or refute an argument about a business case. These are subject to misinterpretation, or can be skewed by technical issues with the DAM software itself, but the act of collecting real numbers makes the next iterative analysis phase possible. You need better quality data so you can make more informed decisions, rather than indulging in conjecture or relying on second-hand figures from some other organisation’s DAM implementation or average numbers offered in analyst papers.
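As a sketch of what this looks like in practice, the snippet below derives one such empirical metric – the proportion of searches returning no results – from auditing data. The event schema shown is hypothetical; each DAM platform exposes its auditing data in its own format:

```python
# A minimal sketch of turning raw DAM audit events into an empirical ROI
# metric. The event schema (action, result_count) is hypothetical.

def search_null_rate(events):
    """Fraction of search events that returned no results."""
    searches = [e for e in events if e["action"] == "search"]
    if not searches:
        return 0.0
    nulls = sum(1 for e in searches if e["result_count"] == 0)
    return nulls / len(searches)

# Illustrative audit log: two of the four searches found nothing.
audit_log = [
    {"action": "search", "result_count": 14},
    {"action": "search", "result_count": 0},
    {"action": "download", "result_count": None},
    {"action": "search", "result_count": 3},
    {"action": "search", "result_count": 0},
]
print(search_null_rate(audit_log))  # 0.5
```

However it is obtained, a real number like this gives the next analysis iteration something concrete to work with, rather than someone else’s averages.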
Here are some ROI related evaluation questions you might want to consider:
- Has the introduction of DAM helped us with our wider objectives as an organisation? If so, exactly how?
- Precisely how has the DAM solution made each department or group of stakeholders more productive?
- Is the DAM solution being widely used by as many staff as was expected?
- Were the expectations about the first implementation phase realised? If not, why not? If they were, what facts prove that?
- What can be learned for any future implementation work?
- What data are we not collecting which would help us to make better decisions for subsequent phases?
- What were the best and worst aspects of the implementation and how can the latter be avoided and the former encouraged?
The above are quite general and (as with the other examples) intended to be no more than cues to prompt more specific questions. The DAM News feature article DAM Vital Signs – Performance Review Techniques To Enhance ROI will help with some more specific examples. In addition, this report by Gleanster Research, Future-Proof Your Investments in DAM (free to download with registration) is useful also.
Applying the Empirical ROI Method To DAM Initiatives
The three activities described above are mainly concerned with what decisions to make and questions to ask to keep ROI uppermost in mind. Over-arching the activities are a number of themes or principles which permeate through all DAM implementations:
- Understand the incontrovertible truths of Digital Asset Management
- Start small and continuously improve your DAM solution
- Focus on your objectives
- Relentlessly seek the truth about the ROI your DAM initiative is generating
- Be specific – but avoid getting bogged down in detail
- Be positive – but manage your risks
Below I explain each area and why it is important.
Understand the incontrovertible truths of Digital Asset Management
There are a small number of indisputable truths about Digital Asset Management which you are unlikely to find anyone arguing with. I cannot claim this is an exhaustive list, but if you find ROI factors and implementation decisions being proposed which appear to contradict these points then alarm bells should ring:
- The primary business case for implementing DAM is to enhance productivity.
- Digital assets are all unique – otherwise you would not need a DAM system to find them.
- Every organisation that implements DAM is also unique and you cannot apply one organisation’s ROI factors wholesale to another.
- The users accessing the DAM system (whether administrators or not) are also unique and have different requirements or objectives for using a DAM solution.
- The volume of digital assets being both used and generated by most organisations is increasing at an exponential rate.
As well as these truths about DAM, there will be others that apply to your organisation. Every implementation decision needs to be evaluated to see whether or not it is credible in this context. If not, there needs to be a fantastically brilliant reason that will enable your DAM implementation to buck the trend, otherwise the proposal is likely to be poorly conceived to start with.
To apply this to a real scenario: on a few occasions I have met clients who plan to offset the cost of their DAM system by generating revenue from their digital assets. Without the projected revenue, the business case for the full scope of requirements does not stand up – i.e. the system is not self-funding through productivity benefits alone (see point one of the above list). In nearly all cases, these initiatives will not meet ROI objectives and therefore have to be considered failures. There are some organisations who have digital assets which are highly marketable, but they are very few in number and the vast majority have already gone down this path and banked the available revenue opportunity years ago.
To take a less specific example, where you see ROI claims that quote the time saved by an ‘average’ user when searching for digital assets, this breaks the principle above that all assets and users are unique. The nature of the search being conducted is affected by numerous factors. If you ask someone to find an asset they were working on five minutes ago, but which they did not originally access via the DAM system, you might find that DAM solutions actually slow them down and appear to have a negative impact on ROI. Compare that with asking them to search for something they were working with a year ago (or maybe never even saw before) and productivity may increase dramatically – but even then, only if the cataloguing of the asset has been carried out diligently, in a manner that allows the user to find it using predictable and relevant descriptive keywords.
Start Small And Continuously Improve Your DAM Solution
Without question, the most successful DAM implementations I have witnessed are those where the sponsors have been able to resist the tendency towards over-aggrandised phase one objectives. As described in the activities section above, the amount of implementation should be minimised where possible and copious data collected at each stage to assess how effective it has been. There is a strong temptation to keep tacking functionality onto the list of requirements, and a general ‘sweet shop’ mentality can develop if someone is not prepared to act as the responsible adult and keep each stage as fiscally prudent as possible.
DAM implementations used to necessitate large-scale custom software development projects, but now, the existence of SaaS and open source options has required all vendors (not just those from the aforementioned groups) to consider how to enable incremental implementation where clients use much shorter implementation and testing cycles. Make use of this enhanced flexibility to preserve as much of your budget as possible so you can identify the biggest ROI opportunities and tick them off first.
Focus On Your Objectives
The scope of DAM solutions has expanded considerably in recent years. This is partly a consequence of vendors trying to play catch-up with each other to ensure they are not at a perceived disadvantage to industry peers. When demonstrating products to prospective end users they are usually eager to show these new features.
Unless you can remain focussed on the outcomes you want to gain from DAM as they relate to your specific circumstances right now, this feature bloat affliction that vendors suffer from can become infectious and transfer to your own proposed solution scope. Even if the vendor shows you something potentially interesting for the future, you need to consider how useful it will be in addressing your current objectives. I tend to find when reviewing DAM solutions with clients that placing features into some kind of theoretical ‘feature bank’ for future use, and using that as justification for favouring a given product, rarely pays off as expected. Usually, by the time the organisation gets round to implementing the proposed requirements at some indefinable point in the future, the nature of them has changed and the original solution will require some customisation anyway.
Relentlessly Seek The Truth About What ROI Your DAM Initiative Is Generating
Once the first iteration of a DAM implementation is complete, there is a tendency for managers to either lose interest or align themselves as the protectors of the solution (in a quasi-paternal manner). This is understandable as they will have to answer questions about it from colleagues and justify how they have spent the organisation’s money. That should not prevent them (even if not publicly) from relentlessly finding out what is happening with the DAM solution. If some of the other principles have been adhered to (such as starting small and collecting raw data about user behaviour) then the big area for enhancing ROI is post-implementation. It is at this stage that real data becomes available about how the system is being used.
Managers need to be prepared to ask difficult questions and doggedly pursue answers to them in the detail they consider necessary. After addressing core user requirements like searching, uploading etc, the DAM solution implementation should address the automated collection of auditing data so it is possible to carry out evaluations of the previous implementation work and to inform the follow-up analysis stages.
It is essential to eradicate any conjecture, assumptions, hopes, fears and other un-proven assertions about your DAM solution and replace them with proven facts. By all means, devise hypothetical theories as to why a given situation is occurring, but test those at every available opportunity and check them again if other data suggests they may now be invalid.
Be Specific – But Avoid Getting Bogged Down In Detail
The principles I have described depend on the collection of empirical data about the DAM solution. Accurate data implies that at least some of it must be quite specific. One practical issue is simply sorting through the volume of information, especially where it is not already neatly organised into summaries and the detail has to be manually consolidated (e.g. via spreadsheets). In most cases, you will be primarily concerned with summary data rather than detail, and this is what the reporting and Business Intelligence capabilities of the DAM solution should focus on. With that said, it should still be possible, at any time, to drill further into the supplied numbers without having to wait for someone (or something) to generate them for you. Detailed auditing data should be available on demand as and when you need it, alongside all the necessary summary reports.
Be Positive – But Manage Your Risks
One of the potentially negative side-effects of the principles described in this article is ‘analysis paralysis’, i.e. the existence of so much conflicting data and pressure from stakeholders that those responsible for the DAM solution do not know how to proceed.
It has to be acknowledged that sometimes it will be necessary to make decisions without all the information being available, or when it appears to contradict itself. Where such situations do occur, it is acceptable to take a decision which is not fully supported by a complete series of facts. However, you must also be aware that you are increasing your exposure to risk as a result, and any methods to mitigate that should be fully utilised.
An example might be a scenario where auditing data about search results suggests that users are always finding results, but user focus groups report that searches come up null very frequently and the asset cataloguing and/or metadata requires further attention. The manager may decide to hire professional picture researchers to enhance the quality of the metadata recorded about each asset to assuage some of the concerns of the users. To reduce risk, a small sample of assets might be selected first and then tests carried out to see if users still believe irrelevant search results are being returned. If so, this might suggest a training issue; if not, further investment in professional keywording could resolve the problem.
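A rough sketch of how that sampling test might be evaluated from auditing data follows; the counts and the helper function are purely illustrative:

```python
# Sketch of the sampling test described above: re-keyword a small sample of
# assets, then compare search failure rates against the untouched remainder.
# All counts are hypothetical figures gathered from auditing data.

def failure_rate(searches_failed, searches_total):
    """Proportion of searches that returned no relevant result."""
    return searches_failed / searches_total if searches_total else 0.0

control = failure_rate(searches_failed=180, searches_total=600)  # untouched assets
sample = failure_rate(searches_failed=30, searches_total=400)    # re-keyworded sample

if sample < control:
    print("Re-keywording helped: wider investment in cataloguing may pay off")
else:
    print("Little difference: investigate training or the search UI instead")
```

The point is not the arithmetic but the habit: testing a cheap, reversible sample before committing the full budget keeps the risk exposure proportionate to the evidence.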
As you can see, there is no silver bullet to assessing DAM ROI, apart from perhaps your own common sense. A summary to finish up on would include the following points:
- Ensure your DAM solution captures user behaviour via auditing and reporting data.
- Use actual data for ROI assessments, not hypothetical numbers from third party sources.
- Implement in discrete, incremental stages which you test.
- Make no assumptions and ensure there are verifiable facts to support all decisions.
- Concentrate on what you want to get out of your DAM solution right now and avoid projecting too far into the future.
- Carefully assess your risks and develop solid plans to mitigate them.
The principal benefit of this technique is that it requires a rigorous and thorough approach to evaluating DAM solutions where you focus on the task at hand. These methods can help you cut through the noise in the DAM market and ensure you get a decent return on your DAM investments.