
Writing on his Tame Your Assets blog, Ian Matzen published an article last week: Are You a Digital Asset Hoarder or a Digital Curator?  The thesis of the article is that defaulting to a ‘never delete’ policy for managing your digital asset collection is not an optimal strategy because of the volume of low-value material that gets retained as a result.  Ian advocates having a collections management policy to help guide retention/deletion practices:

I am arguing for smart asset retention. If you can’t explain why, for example, you are keeping a gazillion images of your CEO in silly costumes, you have a problem. Or why those assets sitting on Pat’s computer desktop, yes, the ones downloaded from Google without rights metadata, have been uploaded into the DAMS, you’ve got some explaining to do. For archivists and librarians alike, collection management is a familiar concept and should be at forefront of every digital asset manager’s mind, whether we “weed” or retain all content. [Read More]

I think Ian makes some fair points on this issue, and I generally concur with them.  Several factors, however, may conspire to make such a policy more complex to implement than it first appears, although there are also some methods, which I will discuss later, that can help make rational management decisions about retention in DAM.

To start with, I will make it clear that if I had a binary choice between deleting data and keeping it, I am one of those people who usually sides with retaining it.  I come from the school of IT that says you can never have too many backups.  While managing large scale data repositories can be a demanding task, bitter experience has taught me that needing to have engineers go through all kinds of obscure recovery processes to try to retrieve deleted data is nearly always a worse problem to deal with (not least because of the regret which accompanies it).  This is a more straightforward technical issue though.  The problem Ian discusses is more of a qualitative one, i.e. deciding what data is worth holding on to (or not, as the case may be).  Later in the article, he offers this advice:

For collection development to succeed there needs to be a policy, also known as a retention policy, drawn up by your governance committee and then ratified by stakeholders. It should define, as specifically as possible, the asset types that get stored and managed, how long they are kept, and what gets tossed, such as duplicate, ephemeral, or low value assets. [Read More]

This sounds reasonable, however, thrashing out the finer details of how this will work in reality is often easier said than done.  In the case of digital assets, knowing (in advance) what is worth retaining is not without risks.  To take a couple of Ian’s examples, the multiple images of your CEO in silly costumes do seem like reasonable candidates to remove, but one of them might include material in the background which later acquires greater importance (e.g. to prove or disprove that a person or object was present at the time).  In Peter Krogh’s book about DAM, there is a photo of New York taken prior to September 2001 which shows an American flag with the Twin Towers behind it.  Immediately post-origination, the image might not have been considered particularly significant.  As noted by Peter, however, context (i.e. major geopolitical historical events) has considerably increased the power and value of the image in a way that would have been very difficult to predict.  In the case of duplicate assets, while the binary essence might be identical, the audit/usage and workflow records may well be different (and contain their own insight which might not be apparent initially).  To delete one of them, a judgement has to be made about which is more valuable (and the likely outcome is that most users will just choose randomly).  It would be fair to say that most DAM users underestimate how much metadata any given digital asset has, even one which has had no cataloguing data applied.
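To make the point about duplicates concrete, here is a minimal sketch in Python (the Asset structure and its field names are my own illustrative assumptions, not a reference to any particular DAM system): two assets can share an identical binary essence, detectable with a content hash, while each carries a unique history that deletion would destroy.

```
import hashlib
from dataclasses import dataclass, field

@dataclass
class Asset:
    """Hypothetical digital asset record: a binary payload plus its history."""
    path: str
    payload: bytes
    audit_log: list = field(default_factory=list)  # who did what, and when
    usage_count: int = 0                           # downloads, placements etc.

    def content_hash(self) -> str:
        """Binary duplicates share the same payload hash."""
        return hashlib.sha256(self.payload).hexdigest()

# Two 'duplicates': identical binary essence, entirely different histories.
a = Asset("dams/campaign/ceo.jpg", b"...jpeg bytes...", ["uploaded by Pat"], 412)
b = Asset("dams/archive/ceo.jpg", b"...jpeg bytes...", ["restored from backup"], 3)

assert a.content_hash() == b.content_hash()  # the same binary essence...
# ...but deleting either copy destroys its unique audit/usage records.
```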

At this point, those who have tendencies towards deletion rather than retention are likely to counter with the observation that you can make a case for retaining anything.  They are more right about that than they would imagine.  As most Digital Asset Managers are well aware, the whole repository is full of ‘special cases’.  The reason you need to manage digital assets in the first place is that they are all unique (even the ones that appear to be perfect copies).  The value of a digital asset to a user fluctuates depending on scarcity and context; as such, the business impact of getting a valuation wrong might be negligible, or it could be huge and have a long-lasting impact.  You just don’t know with complete certainty either way.

There is a wider digital transformation perspective to this which I do not regularly hear mentioned in discussions about the subject.  Most of the organisations that are currently profiting the most from digital technology regard data as an asset to be accumulated.  Simply acquiring it is obviously not sufficient by itself, and further metadata needs to be applied to add enough value to make it useful, but if you lack the raw materials to start with then the opportunity is lost.  Firms that are data-rich are now increasingly likely to be cash-rich.  For that reason, I advocate not removing any data you own if you can avoid it.  Data represents stored activity; it cost someone time and money to generate, so by disposing of it, you write down the value of that investment to zero, with no possibility of profiting from it at some future point.

The default position should be to never delete data, unless it meets one of the following strict criteria (expressed as a code sketch after the list):

  • The IPR is owned by someone else who did not give permission for your organisation to use it (i.e. it breaches copyright and is effectively stolen).
  • You have exceeded all available storage capacity and fully exhausted any means to extend it (for financial or other reasons).
  • There are either no funds to allow the data to be retained in off-line storage, or the data is sensitive and cannot be held outside the organisation for a significant reason (for example, legal compliance).
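As a sketch only, these rules might be encoded as a single predicate.  Every attribute name below is a hypothetical stand-in for a check a real organisation would have to implement itself:

```
def may_delete(asset, storage_exhausted: bool) -> bool:
    """Default to retention: return True only when one of the strict
    criteria above applies.  All names here are assumptions."""
    # 1. The IPR belongs to someone else and no permission was given.
    if asset.breaches_copyright:
        return True
    # 2. All capacity, and every means of extending it, is exhausted.
    if storage_exhausted:
        return True
    # 3. No funds for off-line retention, or the data cannot be held
    #    outside the organisation (e.g. for legal compliance reasons).
    if not asset.offline_retention_funded or asset.must_stay_in_house:
        return True
    return False  # everything else is retained
```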

While the above are reasonable principles, I have to acknowledge Ian’s guidance to devise a policy for assessing digital asset value to guard against unmanaged data hoarding.  Simply never deleting anything is impractical for most organisations, not least because while storage is cheap (as described in Ian’s article) it is neither infinite nor free.  Even if the availability of capacity is not yet an issue, you will still have stacks of digital assets which appear to be intrinsically worthless interfering, for an indefinite period, with searches for more valuable material and generally making getting any work done far harder than it would be if some kind of purge were enacted.

The best method I have come across to hedge the risk of losing data which might become valuable later against the operational challenges of retaining what are currently perceived to be low value digital assets is to use multiple catalogues, in other words to segment them.  Earlier in this article, I used the description ‘binary choice’ to refer to data deletion.  Ideally, digital asset repositories should make it possible to avoid this becoming a one-time, do or die decision about whether digital assets are kept or not.  Rather than a hard deletion, consider a deaccession policy which uses a levels-oriented approach (corresponding to catalogues based on current value) which assets move through, with business-specific criteria or rules that determine when that happens.  Lower value items which are strong candidates to be removed entirely exist in catalogues which are commensurate with their current role, and users either have to ask to search them or possibly even need to request access.  Should storage capacity become an issue, the lowest value collection is where candidates for removal are sought first.  This avoids having to make arbitrary decisions about what to retain.  If offline storage is available (and options like Amazon Glacier and various similar competitors make that a cheaper and more tenable proposition than it once was) then it might be possible to keep the metadata and possibly some smaller proxies (previews) so there is some memory that the material still exists.
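To make the levels-oriented idea more concrete, here is a minimal sketch; the tier names, thresholds, and asset attributes are all illustrative assumptions rather than a prescription:

```
from datetime import datetime, timedelta
from enum import Enum

class Tier(Enum):
    """Catalogues ordered by current perceived value (names assumed)."""
    ACTIVE = 1       # appears in default searches
    SECONDARY = 2    # searchable, but only on request
    DEACCESSION = 3  # access must be requested; removal (or transfer to
                     # offline storage such as Amazon Glacier) starts here

def review_tier(asset, now: datetime) -> Tier:
    """Business-specific demotion rules; the thresholds are illustrative."""
    idle = now - asset.last_used
    if idle > timedelta(days=2 * 365) and asset.usage_count == 0:
        return Tier.DEACCESSION
    if idle > timedelta(days=365):
        return Tier.SECONDARY
    return Tier.ACTIVE

def removal_candidates(assets):
    """When capacity runs short, look in the lowest value catalogue first."""
    now = datetime.now()
    return [a for a in assets if review_tier(a, now) == Tier.DEACCESSION]
```

In practice, the demotion rules would draw on whatever usage and audit metadata the repository already records, which is a further reason not to discard that metadata lightly.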

At present, while these sorts of facilities are offered in DAM solutions (especially transmission to offline storage) they tend not to be as flexibly implemented as they will need to be in the future as the volume of digital assets in circulation continues to increase.  I would imagine most DAM users have a system which offers an option for changing the status of an asset to ‘archived’, ‘admin only’, ‘not published’ etc.  The terminology varies, but the common factor is that the assets are not deleted, but are not available to general end users either.  These are a starting point, but they are unlikely to offer a satisfactory level of descriptive precision in the future, nor the opportunity to segment data as flexibly as will become necessary.  To an extent, this is also possible using asset/user permission controls (i.e. by putting assets into groups and then controlling access to those) but even then, the results may still end up finding their way into searches alongside higher value assets for anyone with access.  I have seen a few solutions that do provide multiple catalogues of the type that would be suitable, however, they tend to be oriented more towards preservation use-cases (e.g. Collection Management Systems as used by museums etc).  As often happens in the DAM software market, I can see more of these facilities getting replicated in business or marketing-oriented DAMs as their users begin to better appreciate the necessity for them.
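The difference between a simple status flag and genuine segmentation shows up in search behaviour.  In the sketch below (which assumes the Tier enum from the previous example, plus a hypothetical keywords field on each asset), downgraded material stays out of everyday results unless a user explicitly opts in:

```
def search(assets, query: str, include_tiers=(Tier.ACTIVE,)):
    """Default searches see only the active catalogue; lower value tiers
    must be named explicitly, so downgraded assets do not clutter the
    results of users who merely happen to have permission to view them."""
    q = query.lower()
    return [a for a in assets
            if a.tier in include_tiers
            and any(q in k.lower() for k in a.keywords)]

# An archivist can opt in to the lower value catalogues when needed:
# search(assets, "twin towers", include_tiers=(Tier.ACTIVE, Tier.DEACCESSION))
```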

I should make it clear that while this approach offers some advantages, it too has drawbacks.  For example, while decisions about the value of digital assets are likely to be less fraught and risky than fully deleting them, they will still provoke debate, and once an asset has been downgraded, the likelihood of it being reinstated will probably diminish.  Similarly, if users don’t have default access to assets which would have otherwise been deleted, then as far as some will be concerned, they might as well have been.  Lastly, although these features are relatively straightforward, they do introduce further complexity to the software, and the risks (and costs) of their implementation need to be managed, so this method is not a free lunch, by any estimation.

With all that said, I believe the approach described provides the basis of a model which could be used for a more versatile deaccession policy framework.  This is going to become an increasingly demanding issue for organisations to address.  While there will be a strong instinct to treat removal of data as an exercise akin to cleaning out your loft, in a digital environment where (for the most part) data will remain in the same pristine condition it was in when introduced, you need to think carefully about how to preserve its value without allowing your DAM initiative to turn into a data hoarder’s charter.


Why Digital Assets Are A Superior Source Of Value For Enterprises

October 17, 2016 Opinion

Last month, Lisa Grimm wrote an article for CMSWire: DAM Expands Its Reach in the Enterprise.  Amongst other topics, the item describes how DAM has incrementally moved into areas which used to be the preserve of ECM (Enterprise Content Management): “Some argue DAM is creeping into the space enterprise content management wanted to occupy a decade ago, albeit […]


How Different Will DAM Be In 2030? Part 2

October 6, 2016 Opinion

This is the follow-up item to the first part of How Different Will DAM Be In 2030 which discusses Martin Wilson’s article for Information Age: 7 predictions for digital asset management in 2030.  In this piece, I will consider the remaining predictions 4-7. 4. Users will demand streamlined simplicity “As DAM systems acquire more and more features, there’s a risk that they’ll […]


How Different Will DAM Be In 2030? Part 1

September 26, 2016 Opinion

Martin Wilson, of Asset Bank, has written an article for Information Age: 7 predictions for digital asset management in 2030.  The item has some similarities with Toby Martin’s piece which I discussed a couple of months ago (albeit the latter is about the medium term future of DAM, well before 2030).  I agree with a number of the points […]


Review Of Digital And Marketing Asset Management By Theresa Regli

September 9, 2016 Digital Asset Management Books

Last month, a book called Digital and Marketing Asset Management by Theresa Regli of Real Story Group was published.  There have been a few DAM-related publications released or updated this year and I am aware of a couple of others by John Horodyski and Dan McGraw which I still have to read.  This is quite different to how things were about five […]


DAM Is Metadata Operations Management

September 2, 2016 Opinion

I have read a few articles recently which advance the currently widely held opinion that DAM is exclusively a marketing concern.  This is not one I share (although I will acknowledge that, at present, marketing people are the largest single group of DAM users I come into contact with).  The more I analyse this situation (especially with […]


Blockchains As Emerging DAM Interoperability Activity Registers

August 25, 2016 Emerging DAM Technologies

In his Bits On Blocks blog, Antony Lewis has written an article titled: The emergence of blockchains as Activity Registers about the two different roles Blockchains play.  For those unfamiliar with the term, blockchains are the distributed ledger technologies used to record transactions in commodity digital assets (aka ‘cryptocurrencies’) like Bitcoin.  The item refers specifically to financial services, […]


DAM Guru Events Calendar Syndicated On DAM News

August 16, 2016 Industry News

Readers may have noticed that we are now syndicating events from the DAM Guru Events Calendar on DAM News (in addition to Tim Strehle’s Planet DAM aggregated feed).  The latest events are included on the sidebar to the right and there is a dedicated resources page with a longer list.  The DAM Guru Programme (DGP) calendar was […]


Clarifai vs Google Vision: Two Visual Recognition APIs Compared

August 12, 2016 Emerging DAM Technologies

As regular readers will be aware, I have been following various automated visual recognition tools for some time now and so far remain underwhelmed by the results (especially compared with the marketing hype that typically accompanies them).  A few months back, I checked out the Google Vision API with some test images and the results were partially successful, but […]


Review Of Metadata For Content Management By David Diamond

August 5, 2016 Digital Asset Management Books

David Diamond has released another book titled: Metadata For Content Management as a follow-up to DAM Survival Guide.  The new publication does not specifically refer to Digital Asset Management because David says he wants the topics discussed to be relevant to all types of system that manage content (i.e. it has a wider frame of […]
