Call for Contributions: DAM Migration


This month’s editorial theme is DAM Migration.  Whether it’s upgrading from a legacy platform to a new system, or embarking on your first DAM initiative, moving a large volume of files and their associated metadata can be a daunting task.  However, migration is also a great time to take stock and perform an audit of your organisation’s digital assets, and provides an ideal opportunity to purge duplicate, outdated or obsolete files and folders.
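
For the purging step, a content hash is often the simplest way to flag exact duplicates before deciding what to archive or delete.  The sketch below is a minimal Python example, assuming the assets are reachable on a local or mounted filesystem (the /mnt/dam/assets path is purely illustrative); near-duplicates such as resized or re-encoded versions would need a more sophisticated approach.

```python
import hashlib
from collections import defaultdict
from pathlib import Path


def file_digest(path: Path, chunk_size: int = 1 << 20) -> str:
    """Hash a file in 1 MB chunks so large video assets need not fit in memory."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()


def find_duplicates(root: str) -> dict[str, list[Path]]:
    """Group every file under `root` by content hash; any group with more than
    one entry is a set of byte-for-byte duplicates."""
    groups: dict[str, list[Path]] = defaultdict(list)
    for path in Path(root).rglob("*"):
        if path.is_file():
            groups[file_digest(path)].append(path)
    return {h: paths for h, paths in groups.items() if len(paths) > 1}


if __name__ == "__main__":
    for h, paths in find_duplicates("/mnt/dam/assets").items():  # illustrative path
        print(h[:12], *(str(p) for p in paths), sep="\n  ")
```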

DAM migration is certainly not for the faint-hearted and often requires a high level of technical expertise, spanning server-side scripting, database administration, API configuration, media processing, and mechanisms to track the migration’s progress and resume it once the inevitable bugs and glitches have been resolved.
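
As a rough illustration of the ‘track and resume’ point, the sketch below records each successfully migrated asset ID in a checkpoint file so the job can be stopped and restarted without re-transferring everything.  The checkpoint file name is arbitrary and migrate_one is a placeholder for whatever transfer logic your source and target systems actually require.

```python
import json
from pathlib import Path
from typing import Callable

CHECKPOINT = Path("migration_checkpoint.json")  # hypothetical progress file


def load_done() -> set[str]:
    """Asset IDs that were already migrated in a previous run."""
    return set(json.loads(CHECKPOINT.read_text())) if CHECKPOINT.exists() else set()


def migrate(asset_ids: list[str], migrate_one: Callable[[str], None]) -> None:
    """Run migrate_one(asset_id) for each asset not yet migrated, checkpointing
    after every success so the job can be resumed after a failure."""
    done = load_done()
    for asset_id in asset_ids:
        if asset_id in done:
            continue  # transferred in an earlier run
        try:
            migrate_one(asset_id)
        except Exception as exc:
            print(f"{asset_id}: failed ({exc}); fix the issue and re-run to resume")
            break
        done.add(asset_id)
        CHECKPOINT.write_text(json.dumps(sorted(done)))  # crude but durable progress record
```

In practice a database table is usually more robust than a flat JSON file, but the principle is the same: record progress outside the running process so that a crash or a bug fix doesn’t mean starting again from scratch.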

A DAM system comprises a whole lot more than just a bunch of files, and after years or even decades of use, the associated database records, metadata (such as keywords and tags), derivative files (such as thumbnails and video previews), categories, user permissions, rights management and analytical data all need to be taken into consideration and methodically transferred to the receiving system.  The devil really is in the detail here, and often something as simple as incorrect character encoding or a misconfigured database schema can bork the whole process.  Carrying out numerous tests with a diverse set of digital assets will help you identify edge cases before committing to the full run.
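
On the character-encoding point specifically, a cheap pre-flight check on the exported metadata can catch problems before they ever reach the receiving system.  The snippet below is a rough sketch that assumes the export is meant to be UTF-8; the mojibake heuristic is deliberately crude, and the record and field identifiers are hypothetical.

```python
MOJIBAKE_MARKERS = ("Ã", "â€", "Â")  # typical signs of UTF-8 text read as Latin-1


def check_field(raw: bytes, record_id: str, field: str) -> str | None:
    """Decode one metadata field as UTF-8, reporting anything suspicious."""
    try:
        text = raw.decode("utf-8")
    except UnicodeDecodeError as exc:
        print(f"{record_id}.{field}: not valid UTF-8 (byte offset {exc.start})")
        return None
    if any(marker in text for marker in MOJIBAKE_MARKERS):
        print(f"{record_id}.{field}: possible double-encoding: {text[:60]!r}")
    return text
```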

Having been personally involved in a number of large-scale DAM migrations, I’ve often found that we end up building dedicated wizard-type processes to assist with the task.  A typical scenario might involve exporting assets to a common format (e.g. Excel) so that users can review and modify the records, then re-uploading them and converting them to a transitory format (e.g. JSON) for sanitisation before they are imported, ingested, and catalogued by the receiving system.
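
As a simplified sketch of that workflow, the example below reads a reviewed spreadsheet export saved as CSV and writes sanitised JSON ready for import.  The ‘keywords’ column name, the semicolon delimiter and the file names are all assumptions rather than anything a particular DAM will expect.

```python
import csv
import json
from pathlib import Path


def sanitise(row: dict[str, str]) -> dict[str, object]:
    """Trim whitespace, split keyword lists and drop empty fields."""
    clean: dict[str, object] = {}
    for key, value in row.items():
        value = (value or "").strip()
        if not value:
            continue
        if key.lower() == "keywords":
            clean[key] = [kw.strip() for kw in value.split(";") if kw.strip()]
        else:
            clean[key] = value
    return clean


def csv_to_json(csv_path: str, json_path: str) -> None:
    """Convert the user-reviewed CSV export into the transitory JSON format."""
    with open(csv_path, newline="", encoding="utf-8-sig") as f:  # utf-8-sig copes with Excel's BOM
        records = [sanitise(row) for row in csv.DictReader(f)]
    Path(json_path).write_text(json.dumps(records, ensure_ascii=False, indent=2))


csv_to_json("reviewed_assets.csv", "assets_for_import.json")  # illustrative file names
```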

A DAM system that has its own well-designed API can greatly reduce the amount of manual entry by providing dedicated calls for the batch processing of files, metadata, keywords, permissions, users, and rights management.  Emerging AI technologies can further reduce the workload via image recognition and auto-tagging functionality, although care must be taken as such features may still be in their infancy and are likely to introduce their own caveats and complexities.  A ‘belt and braces’ approach to migration is preferable to a set of flaky, half-baked AI features, so it’s wise to employ a certain degree of scepticism and ask your vendor up-front about the DAM’s real-world capabilities and limitations.
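
By way of illustration only, a batch-oriented import might look something like the sketch below.  The endpoint path, payload shape and bearer-token authentication are hypothetical, so the real calls, batch limits and error semantics should come from your vendor’s API documentation.

```python
import requests

API_BASE = "https://dam.example.com/api/v1"  # hypothetical endpoint
API_TOKEN = "your-api-token"                 # issued by the vendor


def batch_import_metadata(records: list[dict], batch_size: int = 100) -> None:
    """POST metadata records to a (hypothetical) bulk endpoint in fixed-size
    batches, rather than making one round trip per asset."""
    headers = {"Authorization": f"Bearer {API_TOKEN}"}
    for start in range(0, len(records), batch_size):
        batch = records[start:start + batch_size]
        response = requests.post(
            f"{API_BASE}/assets/metadata/batch",
            json={"records": batch},
            headers=headers,
            timeout=60,
        )
        response.raise_for_status()  # surface failures rather than silently skipping a batch
        print(f"Imported records {start + 1}-{start + len(batch)}")
```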

Call for Contributions

We’re inviting our readers and subscribers to share their own tips, insights and experiences on the subject of DAM migration.  Has the vendor of your new platform provided sufficient guidance on moving away from your legacy system?  Do you have any tips or insights that have enabled your migration to go smoothly?  Are you struggling to come up with a migration strategy that enables users to review and edit metadata prior to ingestion?  Have you used an external supplier or consultant to handle your migration?

Contributed articles must be exclusive to DAM News (i.e. not previously published elsewhere), non-promotional and adhere to our editorial guidelines.

You can send your contributions to russell.mcveigh@activo-consulting.com.

We reserve the right to modify submissions in order to comply with our editorial guidelines and will notify the author should we need to make any changes.  We will also provide a link to your or your company’s website and/or LinkedIn profile.
