Canto Upgrades Cumulus To Version 9.2


DAM vendor Canto have announced version 9.2 of their long-standing DAM solution, Cumulus. The key feature revisions are:

  • Amazon S3 storage of assets
  • Upgraded workflow with review/approval cycles
  • What they call “Social DAM”
  • Mandatory fields collected at the point of upload
  • Duplicate asset checking

This is the quote from the press release by Canto boss, Jack McGannon:

“In addition to serving up large files from the edge of the internet via a content delivery network (CDN), the Amazon S3 Asset Store for Cumulus allows organizations to take advantage of the elastic performance of Amazon’s ‘pay for what you use’ pricing model for the physical storage of digital assets.  With Cumulus’ new hybrid cloud option, we tackle the full spectrum of enterprise IT digital asset management needs, offering Cumulus also as perpetually licensed on-premise software and as a cloud-hosted managed DAM.” [Read More]

There is a two-page blog article which includes a good level of detail about what you’re actually going to get, with screen grabs etc. They’ve kept the platitudes to a minimum, which is a marketing strategy that (finally) seems to be gaining some momentum among DAM vendors these days.

I couldn’t point to much on the upgrades list that is genuinely innovative when compared with the wider DAM industry. This release seems to follow the theme of the on-going DAM industry maturation process. Many vendors are now in a feature-equalisation phase where upgrades consist of copying features that competing products already offer and where they themselves might have been weaker in the past (and where they think they may have lost out in sales pitches etc). Although a minor point, the duplicate asset checking is possibly noteworthy. I would guess a checksum or hash-based method is included, as that seems to be standard for every DAM system worthy of the name these days, but their blog also mentions using custom business logic to test for duplicates:

“Sometimes, a simple content-based duplicate test is not enough. You might have custom business logic determining if an item is considered to be a duplicate. For that purpose, it is now possible to implement custom upload validators. Your code of such an upload validator gets executed before an uploaded file is added to the Cumulus catalog. You are not limited to comparing the file to existing Cumulus content, but you could also e.g. query an external system. For example, you could reject images using a wrong color model or when you can’t determine the correct license.” [Read More]

Presumably, if this uses some kind of hooks-based technique to check an asset upon an upload event, then it’s probably fairly easy for other products to do the same, so this might all come down to the description; but I would expect to see more permutations around key points in the asset life-cycle (their metadata ‘pre-filler’ being another case in point).
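To make the mechanics concrete, here is a minimal sketch in Python of the general pattern being described: hash the incoming file, test it against known checksums, then apply a custom business rule (the colour-model example comes from Canto’s own blog post). To be clear, this is not Canto’s actual validator API; every name here (`validate_upload`, `catalog_hashes` etc) is hypothetical.

```python
# Hypothetical sketch of an upload validator hook: hash-based duplicate
# detection plus a custom business rule. None of these names come from
# Canto's actual API; they are invented for illustration only.
import hashlib
from pathlib import Path

from PIL import Image  # pip install Pillow; used for the colour-model check


def file_sha256(path: Path, chunk_size: int = 1 << 20) -> str:
    """Compute a SHA-256 checksum, reading in chunks to cope with large assets."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()


def validate_upload(path: Path, catalog_hashes: set[str]) -> None:
    """Runs before a file is added to the catalogue; raises to reject it."""
    # 1. Checksum-based duplicate test (the 'standard' approach).
    checksum = file_sha256(path)
    if checksum in catalog_hashes:
        raise ValueError(f"Duplicate asset: {path.name}")

    # 2. Custom business logic, e.g. reject images in the wrong colour model
    #    (mirroring the example given in Canto's blog post).
    if path.suffix.lower() in {".jpg", ".jpeg", ".tif", ".tiff", ".png"}:
        with Image.open(path) as img:
            if img.mode == "CMYK":
                raise ValueError(f"Rejected {path.name}: CMYK colour model")

    catalog_hashes.add(checksum)  # accepted: record it for future checks
```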

The Social DAM item seems more like a bit of marketing hyperbole. I dealt with the wider topic on DAM News last year; it’s not really social media integration being offered. Instead, what they have done is copy some of the conventions of social media, like using ‘@’ to refer to users and subscribing to updates about assets or getting RSS feeds etc (features that have been present in DAM systems for a number of years already). In mitigation, more users are going to be familiar with social media styles of interaction these days, so it does make sense for them to have employed them.

The headline feature is clearly the S3 integration, and hybrid cloud/on-premise DAM appears to be the preferred strategy for formerly on-premise-only vendors as they come to terms with the inexorable trend towards cloud delivery and the asset storage cost savings it offers (amongst other benefits, it should be said). One issue which often crops up with the on-premise model is what happens when some external third party needs access to assets. I’ve not read up on how far they address this aspect, but it would be a point to verify. Just because an asset’s file might be stored in ‘the cloud’ (or, to be more precise, an Amazon-operated data centre somewhere in Virginia or Dublin etc) doesn’t mean it’s going to be generally accessible to other authorised users unless the DAM has been specifically configured to allow that and there is some interface to identify what the DAM contains.
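For illustration, one widely used way of granting an external, authorised party time-limited access to an S3-stored file is a presigned URL, sketched below with boto3. I have no information on whether Cumulus exposes anything like this; the bucket and key names are invented.

```python
# A common pattern for giving an external party time-limited access to an
# S3-stored asset: a presigned URL. Bucket and key names are hypothetical,
# and this is not necessarily how Cumulus exposes its S3 asset store.
import boto3

s3 = boto3.client("s3")

# The URL is valid for one hour; the recipient needs no AWS credentials.
url = s3.generate_presigned_url(
    "get_object",
    Params={"Bucket": "example-dam-assets", "Key": "campaigns/2014/hero.tif"},
    ExpiresIn=3600,
)
print(url)  # hand this to the agency/photographer/printer who needs the file
```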

On-premise DAM generally seems to be used where either the organisation knows it will have data security compliance problems with external hosting (e.g. the government and finance sectors) or the IT department is in exclusive control of the selection process and plans for everything to be held internally on their own kit. I have seen this several times with internal hosting: it’s all fine for the in-house staff, but a brick wall (plus a firewall) for anyone else. If your usage scenario is marketing-oriented, people like agencies, photographers, printers and journalists are likely to be heavy users and potentially suppliers of your assets. If they don’t have access, the ROI from the DAM is going to be impacted.

The stock answer to this problem from most IT departments seems to be to direct external users to some messy VPN option which they have to install themselves, with various associated tokens, dongles and other easily mislaid security hardware paraphernalia. If hybrid DAM is being proposed as a solution, make sure that it’s sufficiently hybrid to enable access for everyone who might need it, irrespective of the location or network they originate from. A lot of hybrid DAM solutions now seem to use a split internal/external asset access model, and that can work, but there are potential pitfalls relating to the extra management complexity, both for the software and for the administrators who need to decide what is allowed to go where. Hybrid DAM offers a lot of potential flexibility, but the trade-off is complexity and the need for a solid plan which is reviewed regularly to ensure it is working as intended.
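As a rough illustration of the kind of decision logic a split internal/external model implies, the sketch below routes assets to a storage tier based on metadata. The fields (`classification`, `approved_for_external`) are invented for illustration; a real system would drive this from its own metadata schema and approval workflow.

```python
# Hypothetical routing rule for a split internal/external hybrid model.
# All field names are invented; this is a sketch of the decision, not any
# vendor's actual implementation.
from dataclasses import dataclass


@dataclass
class Asset:
    name: str
    classification: str          # e.g. "public", "internal", "restricted"
    approved_for_external: bool  # set by a review/approval workflow


def storage_tier(asset: Asset) -> str:
    """Decide whether an asset lives on internal storage or the cloud tier."""
    if asset.classification == "restricted":
        return "internal"  # never leaves the on-premise store
    if asset.approved_for_external:
        return "external"  # replicated to the cloud-hosted tier (e.g. S3)
    return "internal"      # default to the safer option
```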

Overall, based solely on examination of the marketing communications, this seems like a decent enough release from Canto, considering that they have the disadvantage of an older product with a large existing user base (plus many partners and integrators etc). On a wider level, the Amazon S3 integration is a good marker of the far more distributed nature of DAM implementations these days, and I suspect storage will just be the start of that process. I am not sure how well this update will assist Canto in capturing new customers, although it shouldn’t do them any harm in that respect, and it might also prevent a few long-standing existing ones from looking elsewhere for software to service their Digital Asset Management needs.
