How DAM Can Save You Money – Finally A Non-Zombie DAM Infographic
Many DAM vendors have recently been copying each other’s marketing strategies by promoting ROI infographics that claim to tell you exactly how much money a ‘typical’ DAM can save, without any real basis in fact or plausibility. My co-contributor drew the comparison with ‘zombie vendors’ a few months ago.
As in a zombie apocalypse movie, waves of them keep arriving, spreading their infection to all those who cross their path and turning vendors into hapless victims of software marketing madness (with the total loss of critical brain function that generally accompanies this affliction).
Last week, Canto sent us a DAM Infographic with the title How Digital Asset Management Can Save You Money. At first I was not looking forward to reviewing it, as this has become a rather depressing experience of late, for all the reasons mentioned in the article referred to above. Having had a proper look, however, the Canto production is better than most, and I hope a few other DAM marketing personnel (whether they represent vendors, consultants or analysts) might read this, take note and concentrate more on the ‘Info’ than the ‘Graphic’ in future.
Much of the credit must go to the core research itself, which was carried out by Gleanster Research. Basing the findings on feedback from actual users is a more suitable measure, and one that is easier to rationalise, than the more common (and flawed) approach of multiplying the average time taken for activity x by employee hourly rate y. With that said, there are still some issues and assumptions which, in the interests of balance, I will take to task now.
They refer to ‘top performing organizations’, which are measured by “Year over Year revenue”. It is not clear what kind of revenue is meant here: the revenue of the business as a whole, or revenue generated by the DAM? While DAM might help many organisations with the bottom line, it is less common for it to have a top-line impact unless the organisation happens to be a media operation that sells its digital assets to customers (which is probably not the case for most corporate DAM users). Year over Year revenue also suggests a bias towards high-growth firms.
The big opportunities for productivity improvement via DAM are (in my experience) in larger and more established businesses, where there is greater scope to save costs; these firms might not necessarily be enjoying double-digit year-on-year growth. The measure would also obviously exclude not-for-profit and public sector organisations, which can benefit from DAM equally and implement it just as effectively as (or better than) their commercial counterparts. Firms that are already large can also generate a larger ROI in absolute terms by sheer fact of their existing scale.
I am unsure whether Year over Year revenue growth is a useful measure for assessing top performers in relation to Digital Asset Management. I can see the motivation for linking DAM to some more tangible accounting measure, but in doing so perhaps they have sacrificed relevance? It is entirely possible I have misinterpreted what they mean by this measure, so I wait to be corrected by Canto and/or Gleanster.
There is a report which you can register for, but as I write this, registration is not working because it was conditional on attending Canto’s webinar held yesterday (4th December). Canto might want to place another link to the research somewhere near the graphic (with or without registration).
The two other factors, cost savings and digital asset utilisation, are better, but still have some limitations as measures. Hardware is mentioned as an efficiency item, along with support and licences. Some of this sounds reasonable (e.g. saving software licence costs by not needing to roll out desktop apps to do basic asset manipulation) but I would need to review the criteria used and how they are weighted. The hardware point I have heard mentioned before, and I am not sure to what extent it is still significant in DAM, as even on-premise systems tend to be consolidated on shared corporate servers (and it is not an issue at all with SaaS DAM products). I suppose there is the saving of not needing to make hard copies of print collateral so frequently, but I suspect that saving is negligible and unlikely to be much affected by whether or not a DAM is in place.
They claim that 92% of users implement DAM for cost-saving reasons; that sounds entirely reasonable to me, and the other 8% are probably either media businesses or have some other more specialist reason for implementing DAM.
The graphic makes some reference to DAM features which are widely understood and recognised but not always properly considered in terms of ROI, for example format conversion. This is a common DAM requirement and, in my experience, one of those features that gets used a lot. A high proportion of the requests that production staff in marketing communications departments receive (where no central DAM exists) are for image format conversions from colleagues who do not have a tool like Photoshop installed. This self-service element and the ability to distribute workload more widely across the business is one of the major benefits of DAM, so it is positive to see it brought to the fore.
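By way of illustration, here is a minimal sketch of the kind of self-service format conversion a DAM might carry out behind the scenes, assuming the Pillow imaging library is available; the file names and target format are hypothetical and not taken from any particular product.

```python
# Minimal sketch of DAM-style self-service format conversion, assuming Pillow.
# File names and the target format below are illustrative only.
from PIL import Image

def convert_image(source_path: str, target_path: str, target_format: str = "JPEG") -> None:
    """Convert an image to another format, flattening transparency where required."""
    with Image.open(source_path) as img:
        # JPEG cannot store an alpha channel, so convert to RGB first.
        if target_format.upper() == "JPEG" and img.mode in ("RGBA", "P"):
            img = img.convert("RGB")
        img.save(target_path, format=target_format)

# Example: a colleague without Photoshop requests a JPEG version of a TIFF original.
convert_image("brochure_cover.tif", "brochure_cover.jpg", "JPEG")
```

The point is less the code itself than the workload shift: once this sort of operation is on self-service, production staff no longer field routine conversion requests.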
The tips to improve ROI are mostly good ones too, especially the first item about configuring separate repositories for work-in-progress, production-ready and archived content. This is a strategy I recommend to clients who have hundreds of thousands or millions of assets (as is the phased approach with lots of testing and collection of user feedback at the conclusion of each stage).
The final point about using the DAM to auto-populate metadata is a little more contentious: sometimes this can offer productivity benefits, but at other times it generates bogus metadata that damages the integrity of search results. Some automation is undoubtedly useful, but depending on it exclusively is a potentially risky strategy. See our DAM findability article for more about this subject.
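To make the trade-off concrete, here is a minimal sketch of metadata auto-population from embedded EXIF data, assuming Pillow; the plausibility checks are hypothetical and stand in for whatever validation rules a real implementation would apply before trusting harvested values.

```python
# A minimal sketch of metadata auto-population, assuming Pillow for EXIF extraction.
# The validation step reflects the caveat above: automatically harvested values
# should not be trusted blindly, or they can pollute search results.
from PIL import Image
from PIL.ExifTags import TAGS

def auto_populate_metadata(path: str) -> dict:
    """Harvest embedded EXIF fields, keeping only values that pass basic checks."""
    record = {}
    with Image.open(path) as img:
        exif = img.getexif()
        for tag_id, value in exif.items():
            tag = TAGS.get(tag_id, str(tag_id))
            # Simple plausibility checks: skip empty strings and placeholder dates.
            if isinstance(value, str) and value.strip() and value.strip() != "0000:00:00 00:00:00":
                record[tag] = value.strip()
    return record

# Anything that fails validation is left blank for a human cataloguer to complete.
print(auto_populate_metadata("press_photo.jpg"))
```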
One area missing from the graphic (that I could see) was the ability to collect accurate, real-time Business Intelligence (BI) about your digital media operations. DAM systems put digital assets into an integrated data collection and management solution and therefore offer an almost unlimited range of opportunities to track events and measure or audit user activity, all of which you can use to help make better decisions. This is where the real ROI, as it applies to your organisation, can be discovered. It is essential to remember that:
- Digital assets are not commodities
- Your organisation is unique
- Your users are all individuals
- DAM implementations (even using the same product) are never identical
You cannot take someone else’s plan and fit it to your own circumstances without first carefully evaluating how much of it applies to you, although it is true that you can learn from the experiences of others and use them like case studies to make better decisions. The ‘I’ in ‘ROI’ implies a risk/reward effect that requires consideration and ongoing management. For all these reasons, being able to collect actual data, as it happens, is essential to measure ROI properly and to decide whether a given feature or component is making a positive contribution or not.
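As a rough illustration of the kind of event auditing described above, here is a minimal sketch using an SQLite store; the event names, fields and query are hypothetical rather than anything a particular DAM product actually exposes.

```python
# A minimal sketch of DAM usage auditing, assuming a simple SQLite store.
# Event names and fields are illustrative; a real DAM would expose this
# through its own reporting or BI layer.
import sqlite3
from datetime import datetime, timezone

conn = sqlite3.connect("dam_audit.db")
conn.execute("""CREATE TABLE IF NOT EXISTS events (
    occurred_at TEXT, user_id TEXT, event TEXT, asset_id TEXT)""")

def record_event(user_id: str, event: str, asset_id: str) -> None:
    """Record a single user action against an asset (download, search hit, conversion...)."""
    conn.execute("INSERT INTO events VALUES (?, ?, ?, ?)",
                 (datetime.now(timezone.utc).isoformat(), user_id, event, asset_id))
    conn.commit()

# Later, aggregate real usage data to see which features actually earn their keep.
record_event("jsmith", "format_conversion", "IMG-00042")
for event, total in conn.execute(
        "SELECT event, COUNT(*) FROM events GROUP BY event ORDER BY COUNT(*) DESC"):
    print(event, total)
```

Even a crude tally like this tells you which features are being used and by whom, which is a far firmer basis for an ROI calculation than a generic industry average.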
Canto were eager to point out that their graphic was re-tweeted by fellow vendor Widen. I also note that Widen appeared to grasp the Business Intelligence point discussed above (perhaps more so than Canto), since they included a link to a blog post (and video) about reporting metrics. There is also a DAM News feature item about metrics and auditing for DAM systems in our features area.
As is the DAM News house style, we rarely give anyone a perfect report card, so my critique here needs to be considered in that context. Taken overall, however, Canto have come up with a reasonable effort at producing a DAM Infographic that tells the reader something useful, and it is a worthwhile counterpoint to many of the others currently circulating on the DAM scene.
I would recommend that all those with an interest in our subject at least take a look and think about the implications for a while, before they too get bitten by the zombies and release yet another of these implausible DAM ROI Infographics.
Ralph,
I appreciate the thorough review of the infographic and underlying implications for some of the data. I thought I would address some of your comments to clarify a few points. Let me start by saying it’s a pleasure to get poked and prodded by someone who really has some domain knowledge on the subject. That makes your validation of some of the findings that much more valuable and tells me the metrics we did use were spot on for isolating key best practices in DAM adoption.
I’d like to take a stab at answering a few of your questions:
– Our benchmark classification of “Top Performers” is determined by one or more metrics that are relevant to the topic area. The calculation is actually a weighted algorithm that aggregates performance across all metrics to rank survey respondents. We then take the top quartile which are classified as “Top Performers” and we look at what they are doing differently. It’s sort of like a big pivot table.
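For illustration only (this is not Gleanster’s actual algorithm or weightings, and the metric names and numbers below are made up), a minimal sketch of that kind of weighted, top-quartile classification might look like this:

```python
# A minimal sketch of "big pivot table" style Top Performer classification.
# Metric names, weights and survey values are hypothetical.
import pandas as pd

survey = pd.DataFrame({
    "respondent":        ["A", "B", "C", "D", "E", "F", "G", "H"],
    "revenue_growth":    [0.12, 0.03, 0.20, 0.01, 0.15, 0.07, 0.25, 0.02],
    "cost_savings":      [0.30, 0.10, 0.25, 0.05, 0.35, 0.15, 0.40, 0.08],
    "asset_utilisation": [0.70, 0.40, 0.80, 0.30, 0.90, 0.50, 0.85, 0.35],
})

weights = {"revenue_growth": 0.4, "cost_savings": 0.3, "asset_utilisation": 0.3}

# Weighted aggregate score across all metrics, then flag the top quartile.
survey["score"] = sum(survey[m] * w for m, w in weights.items())
cutoff = survey["score"].quantile(0.75)
survey["top_performer"] = survey["score"] >= cutoff
print(survey.sort_values("score", ascending=False))
```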
– What type of revenue are we discussing? Plain old grassroots generic annual company revenue. Data from our surveys is self-reported, so there’s already bias in the data based on user knowledge of company revenue performance. As you point out, yes, it is possible for the data to be slightly skewed by high-growth companies, but the idea behind using revenue growth is to gain exposure into the practices of successful companies. This might leave a few struggling companies out of the running for determining best practices in DAM, but we are really looking for discernible trends in the data, not hard statistics. So revenue is added to the mix of metrics to more narrowly focus on success, assuming top-line growth is a metric companies would benchmark against. Also, revenue is only one of multiple metrics that impact the classification of Top Performer. We have found it to be a useful metric in amplifying certain insights from all Topic Areas we report on. People generally emulate companies that are growing. Also, there’s nothing wrong with learning from high-growth companies; they are doing something right, and they are just as likely to struggle with DAM or utilize best practices because they are forced to do something about it quickly. It would be foolish, if not impossible, to ascertain revenue generated by DAM, so that is not something we tried to benchmark, and I wouldn’t trust users’ feedback anyhow. Who measures that? Scratch that: how would you measure that consistently across respondents?
– As you point out, there are other big opportunities to gain benefits from DAM that are more intangible or indirectly have huge impacts on productivity or cost savings. I might add that the cut of Top Performers in the report you read would result in a bias towards DAM users, since “adoption” was a metric. We wanted to know what successful companies that tend to have high ratios of user adoption are doing differently. In many of our reports the metrics are more generic in nature, and therefore the results would actually isolate technology penetration and use. For example, learning that Top Performing companies were X times more likely to use SaaS-based solutions is pretty interesting, but it’s more interesting if you know that only 30% of respondents used DAM and Top Performers accounted for 75% of them.
– The report is also available on Gleanster.com and it is free with registration: http://www.gleanster.com/reports/future-proof-your-investments-in-dam
I appreciate the coverage and the candid evaluation of the metrics. You had some great insights to share that have clearly been influenced by the realities of the real world. Analyst reports always run the risk of being too disconnected from what it actually takes to apply the best practices; saying it doesn’t usually translate the same as actually doing it. Having said that, the metrics we used were successful in isolating and validating at least a few best practices, and that to me is a win. It’s our goal to generally share insight into what Top Performing companies are doing, but you should always take it with a grain of salt. It’s one data point that becomes two with analysis like yours. Grab a third data point and the buyer has a trend… I think you sort of have to take what you can get in this day and age. Happy we could provide a little defibrillator shock into an otherwise zombified expectation of DAM insights. ;)
Ian Michiels
Principal & CEO
Gleanster
Ian, thanks for the response to those points – they are all useful for people wanting more detail on the research background and the methods used. I’ve been reading through the report you have provided the link to in your comment and it’s one of the more credible ones I’ve read about DAM.
In addition to the report, “Future-Proof Your DAM Investment”, we also held a webinar with top branding firm Lippincott and XO Group (founders of theknot.com), where they shared the best practices and tips that have made them successful with their DAM. Gleanster also shared information from their research report during the webinar. To watch the pre-recorded webinar and get your own copy of the Gleanster report for FREE, click here: http://www.canto.com/white-paper/future-proof-your-investments-in-dam/