Management Strategies For Developing Robust DAM Metadata Models
Faith Robinson, Director of DAM & ECM at Hasbro, posted an article about metadata models on LinkedIn a few weeks ago: It’s always about the metadata. This is one of a number of metadata-related pieces published recently which I have been intending to discuss on DAM News. The summary is that developing a robust metadata model to use as the basis for your DAM solution is an involved process that needs to be handled thoroughly and with a lot of care if you want the model to enhance findability and have longevity as a result (with particular reference to developments in Linked Data, the Semantic Web and Web 3.0). She describes a project she and her colleagues are working on where the metadata model has taken several months:
“The framework we have today is about 80% complete. When it’s finished in another few weeks, we will be able to set a new model in place for brand assets that will scale and grow and mature to eventually take on adaptive and workflow metadata. Could we have done this in a week – probably. Would I have been confident that it was right….probably not.” [Read More]
As the comments under the article from various others suggest, I agree with everything Faith says and it is all solid advice. What is worthy of further discussion is why it doesn’t always happen in the manner she describes, and how metadata models continue to get developed during lunch breaks and on the backs of envelopes (if anything gets written down at all).
There are some points I need to note: firstly, Faith’s job title is Director of DAM & ECM. A good number of organisations I come into contact with (including some very large ones) have no such individual who gets paid to consider these issues (at any level, much less a director). Some of those who have a limited grasp of how the ROI for DAM works are actually firing people who would have held this position, on the basis that they don’t need a human being now that they have DAM systems (which in my view is an excessively reductionist and short-sighted strategy). The sector different organisations operate in clearly has an impact on the level of importance attached to this role, and those that deal with large volumes of rich media data will tend to have a built-in bias towards considering Digital Asset Management before those that do not. However, if you subscribe to the idea that every organisation will eventually require a DAM strategy (and products to help implement it) then the necessity for this role is obvious, even if it has to be combined with several other functions due to budget constraints.
The other issue is a misunderstanding of the value of metadata and a lack of ability to plan strategically for how to achieve ROI from DAM. There seem to be three groups of stakeholders in DAM: those who understand the significance of metadata and why it is so critical to get right; a less-engaged set who go along with the idea because they have been told it is important (but will not publicly admit to not understanding why); and a third category who are focussed on the data files and think these are synonymous with assets, i.e. for whom the metadata is not a key concern.
To put this simply, if you cannot find assets in a DAM system, it is a waste of time implementing one at all and the ROI case is completely blown. The primary function of DAM systems is to allow people to find what was put into them; everything else you might want to do with one depends on that essential attribute. There is another choice quote in Faith’s article that deals with the findability point:
“If you fail at doing that, you risk the DAM system becoming the object of frustration. It gets labeled as “never having anything I need” and other than your travel card website, your DAM system can become the second most loathed site that everyone “has to use.” [Read More]
When the stakeholders who do not understand why metadata is essential are in charge of DAM initiatives (or others get persuaded that it isn’t really that necessary) the focus switches from metadata to management of binary data (files). This is a bit like running a business, saying you want to make lots of money and detailing all the things you plan to spend the (as yet unrealised) profits on. It might be a reasonable objective, but it doesn’t tell you how to achieve it; you require a strategy that will give everyone a point of reference to tell them what to do. Metadata models are the DAM equivalent of a business strategy. You can’t run a successful business without a plan, and you can’t build a successful DAM without a thoroughly researched and tested metadata model. Just as business plans tend to turn out more nuanced and complex than their owners expected when they first started working on them, so it is with metadata models, which is why they require some time to complete satisfactorily.
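For readers who have never seen one, it may help to make the idea of a metadata model concrete. The sketch below is a minimal, hypothetical fragment of what one might look like once formalised: a set of fields, which of them are mandatory, and the controlled vocabularies behind them. The field names and vocabulary values are invented for illustration and are not taken from Faith’s project or any particular DAM product.

```python
# A minimal, hypothetical fragment of a formalised metadata model.
# Field names and vocabularies are illustrative only.
from dataclasses import dataclass

@dataclass(frozen=True)
class MetadataField:
    name: str              # machine-readable field identifier
    label: str             # label shown to end users
    required: bool         # must be supplied at ingest
    vocabulary: tuple = () # allowed values; empty tuple = free text

BRAND_ASSET_MODEL = [
    MetadataField("asset_type", "Asset Type", required=True,
                  vocabulary=("photograph", "illustration", "video", "logo")),
    MetadataField("usage_rights", "Usage Rights", required=True,
                  vocabulary=("unrestricted", "internal_only", "embargoed")),
    MetadataField("description", "Description", required=False),
]

def validate(record: dict) -> list:
    """Return a list of problems with one candidate asset record."""
    problems = []
    for f in BRAND_ASSET_MODEL:
        value = record.get(f.name)
        if f.required and not value:
            problems.append(f"missing required field: {f.name}")
        elif value and f.vocabulary and value not in f.vocabulary:
            problems.append(f"{f.name}: {value!r} not in controlled vocabulary")
    return problems

# e.g. validate({"asset_type": "photo"}) flags both the unknown
# vocabulary value and the missing usage_rights field.
```

Even a toy fragment like this forces exactly the decisions that consume those months of work: which fields are mandatory, which take free text, and what belongs in each controlled vocabulary, all of which determine whether assets can be found later.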
With all that being said, it is important also to consider some real-world factors which might necessitate some expediency being injected into the metadata modelling process, as well as some further aspects which can still derail the strategy despite the best efforts of those who devised it. These particularly come into play in organisations that lack a dedicated DAM champion, where management of media assets is just one topic of many to be considered.
The first of these is time constraints: there is often a lot of senior management pressure to get something built so DAM can be ticked off the list. The compromise route is to counsel that this is not a straightforward task and that it is critical to the ROI of the system, rather than (as per the observation in Faith’s article) claiming it is a very quick job which can be delivered in short order. You might not get as long as you would want to work out a metadata model, but you have at least forewarned everyone that it is not a trivial task and created the possibility of getting more time to do it properly than might otherwise have been the case.
Another aspect is having access to everyone who will either supply assets or expect to get them out of the system. It seems quite common with some DAM solutions for one group of more vocal stakeholders to set the agenda and skew the metadata model towards their needs. There is an argument (which I subscribe to) that you should avoid implementing a single DAM solution that tries to answer the needs of every single user, but not everyone has that flexibility (especially smaller organisations). This is partly a function of the time constraint discussed above: if you don’t have enough hours set aside to consult a wide cross-section of users, the risk of stakeholder bias goes up. A way to mitigate this is to identify as many potential active users (and the interest groups they represent) as possible before you go and talk to them. It might not be appropriate to give everyone the same amount of time; for example, users from a graphics studio might need longer than someone in accounts who downloads a PowerPoint presentation template once a year.
Lastly, there is a school of thought which says you should start with a basic outline of a metadata model and then refine or revise it based on real feedback and end user testing. There is a conceptual logic to that which does make sense, but it is important to be conscious of the implementation constraints you have to work under. Your corporate DAM system isn’t Google; there are no teams of well-paid engineers able to analyse every facet of how it is used, develop minor adjustments, test against millions of users, refine and re-deploy without outages. The DAM software is likely to have some relatively flexible metadata controls (certainly more modern products will), but making wholesale changes to them after you have launched and trained everyone is likely to cause some upset among users who will have only just familiarised themselves with where they expect everything to be. In addition, if the revised metadata model turns out to be one the software cannot represent, changes to the system itself will be required. Modifying DAM systems that are in active use, with lots of assets already ingested into them, is considerably more effort than making the same changes before you get to that point: it implies development cost, then outages or at least ‘at risk’ periods during deployment. Note that this is the case for any system, whether you host it or the vendor does. The ‘agile metadata’ strategy is not without merit, but metadata models have a fundamental role in your DAM solution analogous to DNA: ideally, once defined, they can be left alone for an indefinite period. Just as you need a reasonably clear idea of your eventual destination before you embark on a journey, so metadata models are too important to become something you only start to think about properly when you are already on the move and knee-deep in numerous other implementation decisions.
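To illustrate why post-launch revisions are more than a configuration tweak, the hypothetical sketch below shows the kind of migration that a seemingly small model change (renaming a field and tightening its vocabulary) forces on every asset already ingested. None of the names refer to any real product’s API; they extend the illustrative model sketched earlier.

```python
# Hypothetical migration after a post-launch model change: the old
# free-text "type" field is renamed to "asset_type" and restricted
# to a controlled vocabulary. Every existing record must be remapped.
OLD_TO_NEW_ASSET_TYPE = {
    "photo": "photograph",
    "art": "illustration",
    "image": None,  # ambiguous legacy term: no safe automatic mapping
}

def migrate_record(record: dict) -> dict:
    """Remap one already-ingested asset record to the revised model."""
    migrated = dict(record)
    old_value = migrated.pop("type", None)
    new_value = OLD_TO_NEW_ASSET_TYPE.get(old_value)
    if new_value is None:
        # Flag for manual review: potentially one of thousands of
        # assets a human now has to look at individually.
        migrated["needs_review"] = True
    else:
        migrated["asset_type"] = new_value
    return migrated
```

Multiply the manual-review branch by tens of thousands of assets and the real cost of “we will fix the model later” becomes apparent, which is the practical argument for getting it substantially right before launch.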
Faith’s article is well worth reading both by those who are already familiar with this subject (who may have a similar role to hers) and by those who are less well versed in the complexities of implementing DAM. The former will get some useful arguments to persuade colleagues to allow enough time for the job to be done satisfactorily; the latter might better understand why it is so important.