
Adobe SmartPic: A Brief Glimpse Of Metadata Insight?

by Ralph Windsor on March 17, 2015

Last week, Adobe’s PR people sent us a variety of links in relation to their summit which were demonstrations of technologies from their ‘labs’ research division.  One example which is specifically about DAM is called SmartPic.  There is a quote below, but to understand the first issue I will discuss, you also need to see the visual accompanying their description:

“Effective asset management requires a deep understanding of each piece of content. How should the asset be labeled? What visual elements should be identified? What campaign elements are associated with it?  Tracking this data is a huge challenge, and even the best content management tools require a carefully planned strategy for tagging each asset with the appropriate metadata.”

Reading this through the first time, it gives the impression that SmartPic is both a cataloguing tool and something which can be used to generate analytics about asset usage.  Given that most of the labour-intensive work in DAM starts with the cataloguing task, I would envisage that most DAM users will be interested first in what it can do about that problem; while the analytics is undoubtedly beneficial, I would suggest it sits further down the list of priorities for most (currently, at least).  If you read the above quote alongside the “Biker on mountain trail” visual, you might (as I did) put 2 and 2 together and make 22: it gives the impression that this technology will automatically generate captions and narrative descriptions.  I clarified this point with Adobe’s representatives and they did confirm (without any evasive language, to their credit) that it is only the tags, not the narrative, which SmartPic will automatically derive.

Having established that we are talking about a system which is an auto-classifier (i.e. data analytics and associated mechanised inferences) rather than something more sophisticated, the frame of reference is data engineering, not artificial intelligence, and I will analyse it accordingly.  Here is some more from the blog post:

“Of course, managing your metadata is just the beginning. To make the most of your assets requires the science of analytics — gaining insights from your data to understand which content is most effective and why. For many marketers that means pouring over spreadsheets, looking for connections between data points and interpreting the results…But what if there was a way to make the whole process better — a way to make tracking assets foolproof and analyzing the data associated with their use more efficient?”

Followed by:

“‘We’ve been looking at this for years, but it’s not a simple problem to solve,’ says Steve Hammond, senior director of Marketing Cloud product management at Adobe. ‘First you need to have a unique ID for each asset, and then you need to know where that asset is being placed.’”

It is some time since I was hands-on with any kind of serious low-level software development and implementation work, but this doesn’t strike me as especially difficult in terms of either information science or coding.  The unique identifier issue is an integral element of any DAM system (since it needs to isolate a single asset to function properly) and even if files are not uploaded to create asset records, you can still use hashes, checksums etc. (of varying types) to get an approximation of the same thing.  Correlating data entities sounds like a taxonomy design issue – so the principal level of effort falls on the information architect and/or metadata subject expert to provide a framework which can be shared across the enterprise.  I am not going to say this problem solves itself, but with those two key ingredients provided for you, the cake becomes quite a lot simpler to bake, and some might contend that it is the users doing the vast majority of the incremental ‘data entry’ work here (as it usually is with DAM).
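To illustrate how unremarkable the unique identifier part of the problem is, here is a minimal sketch of deriving a content-based asset ID from a file using a SHA-256 digest. This is purely illustrative – the function name and choice of algorithm are my own assumptions, not anything Adobe has described:

```python
import hashlib

def asset_id(path: str, chunk_size: int = 65536) -> str:
    """Return a hex digest of the file's bytes, usable as a
    content-derived asset identifier: identical files always
    produce the same ID, regardless of filename or location."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so large media files don't need to fit in memory.
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()
```

A DAM system would typically store this alongside a database-assigned ID, so that re-uploaded duplicates can be detected even before any cataloguing takes place.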

There are two points of interest here.  Firstly, it has the potential to be more innovative than many other examples offered by developers of DAM solutions in the last few months.  Secondly, it leverages existing metadata already entered by users, which is one of the areas we have suggested on DAM News that vendors should be focussing attention on.  The system essentially facilitates the making of connections, and the auto-tagging is a by-product (albeit a very useful one).  This is a point we have made before on DAM News: many of the emerging automated cataloguing methods depend on users being organised and having well thought-out processes for asset ingestion (if you define that as uploading and cataloguing, as I do).  If you are hoping to show up with the proverbial shoe-box of photos and then expect some mysterious and intelligent software to sift through them automatically for you, prepare to be disappointed.  This type of innovation will reward those who have already put in the time, money and effort to do the job properly and allow them to take further advantage of their existing investments in this area.

One potential downside with a leveraged approach to metadata generation like this is that garbage in might not just equal garbage out, but garbage squared or worse (to paraphrase the famous information science maxim).  In other words, if cataloguing is handled poorly for one batch of assets, the resulting metadata has the potential to become like polluting effluent that spreads everywhere else far more quickly than it might have done before.  I expect a number of Digital Asset Managers will be wary of what this system auto-suggests, and they will require more sophisticated controls than Adobe were perhaps planning to offer.  These are the trade-offs of any kind of automation tool, however, and calibrating them is another job users will probably never get away from.

From the industry perspective, SmartPic presents an intriguing challenge to Adobe’s DAM system competitors.  Based on feedback from a number of other consultants and DAM experts I speak to, I cannot say Experience Manager has an exactly stellar reputation as a DAM solution at present.  However, this line from Steve Hammond of Adobe is an indication of how they start this particular race a number of yards ahead of the chasing pack:

“Fortunately, we’re uniquely positioned to solve this problem. Adobe has a unique footprint across creative tools as well as the digital marketing and analytics landscape.”

If Adobe have an edge, it is their established brand in graphics production tools like Photoshop, Illustrator, InDesign etc.  This offers them a ready-made sales channel to prospective users and (crucially for this exercise) the ability to transparently collect metadata.  For other vendors, this suggests that they need to be able to offer the same deep level of integration, not only to read the contents of the files (including any embedded metadata in XMP etc.) but also to operate directly within the UI of the tools themselves.  I am aware many vendors claim to have these capabilities already, but a far smaller number do it competently, in a way that will enable them to directly support features such as SmartPic without a disjointed user experience.

I cannot yet see the analytics (in terms of asset usage across campaigns etc.) being as much of a point of interest for many DAM users, but Adobe are obviously thinking about how these trends will play out and positioning themselves to benefit when more users come to understand the advantages too.  In particular, it is the digital supply chain visibility aspect – being able to identify asset usage patterns – which starts to develop an advantage that will be harder for competitors to usurp.  For other DAM vendors, it is becoming a decision about whether they want to compete with Adobe across the supply chain or instead specialise in some more focussed aspect of it.  I would imagine both options are unappealing, but a clear decision either way will have to be made by those who plan to remain in the game over the longer term.  As is clear from even a brief analysis of SmartPic, the advantages come less from raw technical ability than from existing commercial positioning, which tends to be another defining characteristic of maturing markets.


Lisa Grimm March 18, 2015 at 5:35 pm

It would be hugely helpful for me to see the use of an asset across campaigns, but the current state of DAM analytics in AEM is almost entirely absent; to your point, it seems like they are trying to skip the basics – things like when an asset was uploaded, tagged, found, user data, etc. – and are skipping ahead to campaign planning. That’s all well and good as part of a digital campaign planning tool, but it doesn’t in and of itself improve their DAM structure (such as it is).

Managing metadata in AEM is a chore now, and I would certainly love automation, but it doesn’t work without a proper framework.

In short, I’d like to see Adobe make more of an effort in finding out what DAM users and managers really need, before leaping ahead to what marketing organizations think they want.
