Are G2 Crowd Digital Asset Management Software Reviews An Authoritative Source?

by Ralph Windsor on April 10, 2017

Last month, G2 Crowd issued a press release about their revised rankings of the ‘Best Enterprise Digital Asset Management Software’, based on their user reviews:

“G2 Crowd, the world’s leading business software review platform, today released the Spring 2017 Enterprise Digital Asset Management Software Grid report to help businesses make the best digital asset management technology buying decision.” [Read More]

G2 Crowd and their reviews have been discussed at some length on the DAM News LinkedIn group before.  The consensus from that conversation seemed to be that they were not entirely trustworthy, either because unscrupulous vendors could manipulate them or because of the opaque nature of the ‘G2 Score’ and the methods used to devise the quadrants that analysts are so keen on (and which are arguably of dubious merit, in my opinion).  Having spent some time looking over G2 Crowd myself, I find it lacking in authority, not least because G2 use some aggressive practices to obtain reviews which prospective purchasers of the products reviewed on their site should take into account.

Before I deal with that side, let’s consider some semantics around their use of the description ‘Best Enterprise Digital Asset Management Software’.  The idea that there is a ‘best’ anything in Digital Asset Management is flawed.  DAM is a wide-ranging field, and a product which is highly suitable for one group of users might be useless for another.  It could be argued that user reviews are the measure of ‘best’, but that depends on who the users writing the reviews are.  Do they work in the same sector as you?  Is their role similar?  Are the characteristics of their organisation the same?  How long had they been using the product before the review was written?  All these factors (and many others) can skew the results.

The second semantic issue is their description of all the firms listed as ‘Enterprise Digital Asset Management’.  At one time, ‘Enterprise’ (when applied to software) meant ‘big’, in terms of user volumes and cost, with a complete implementation being a minimum six-figure undertaking.  Lots of vendors therefore aspire to be ‘Enterprise’, because they prefer rich customers to less well-off ones.  For the majority of those listed on G2, however, these are unfulfilled aspirations (and are likely to remain so).  There are a few names who certainly do have those credentials, but they also don’t have very many reviews; in fact, some of the vendors who tend to service the expensive end of the market have no reviews at all.  As such, there is a potential argument for solution suppliers who want to be known as enterprise DAM vendors to ask to be removed from the G2 site, to avoid being deemed insufficiently ‘enterprise’.

Moving on to the reviews, the situation here is also quite unclear.  Some are almost certainly legitimate: the people who wrote them demonstrate a good knowledge of DAM, which suggests they genuinely use the products they are discussing.  In the mix, however, there are others where the use of similar phrases across a number of entries suggests some gaming by ambitious vendors.  For example, at least one firm has a perfect satisfaction score of 5/5 from every respondent, and in answer to the question ‘what do you dislike?’ there are phrases like ‘nothing, the product is perfect’, ‘there is nothing to dislike’ etc.  There are other reviews where the vendor has been given a 5/5 ‘perfect’ score, but the reviewer has gone on to list missing features or limitations (including additional costs in a few cases).  I don’t know how many other people share my view on this subject, but in my opinion, 100% should mean absolutely perfect and is practically impossible to attain.  The best software will never get developed, to paraphrase the saying about books.  I have seen a fair proportion of the products listed in action (whether via demos or as client implementations) and not one deserves 100%.  As discussed earlier, my assessment would also depend on the usage context, so the score attained would be lower or higher depending on what someone actually wanted to use the DAM for.

Using raw numerical methods, especially a blunt instrument like a five-point scoring system, is an exceptionally poor way to evaluate complex products; it encourages reviewers to contradict themselves solely to make it easier for the publisher (G2) to implement their own ranking criteria.  When clients propose procurement scoring systems for selecting products, I advise against them (as does virtually every other knowledgeable DAM consultant I have met or read articles by).  The reason is that the scores almost inevitably get fudged, or simply ignored, if they don’t agree with the consensus view of the purchasing authority, and you just can’t easily reduce sentiment about a product to a simple quantitative evaluation.

Where the situation with G2 changes from being merely over-simplistic to somewhat lacking in trustworthiness is their tactics for gathering reviews.  In the LinkedIn discussion about G2, I cited a response to a question on Quora.com: How do sites like TrustRadius, IT Central, and G2 Crowd plan to monetize their businesses?  The answers seem reasonable to me; essentially, G2 are trying to create a data warehouse which they can leverage through spin-off products and services like advertising, data reports etc.  The model is a well-worn one by now: buy the content cheap, re-package it, then sell it for more than you paid.  This means they need reviewers, because the copy reviewers write is the raw material required to enable the monetisation process.  To that end, G2 offer an Amazon gift card to anyone who submits a review.  Although that creates an incentive, it is not a huge one if the review is freely given.  The problem is that it isn’t always: G2 actively and aggressively solicit opinions by contacting customers of vendors featured on their site.  Not all the software vendors they cover like this aspect:

“Let me make this crystal clear, G2 Crowd. It is not acceptable for you to email my customers, imply we have a working relationship, and then bribe them for product reviews for your own benefit. It’s irresponsible, inappropriate, and frankly, pretty gross.” [Read More]

Some people in receipt of the gift card have a more positive view, however, and take to affiliate marketing sites to encourage others to participate:

“Do you use any software as part of your job? Tell G2 Crowd what business software products that you currently use at work and some of them you can review for $5 to $15 per company. LinkedIn profile is required so they can validate you work at a company! I signed up for the $5 Starbucks offer a few months back but I’ve been reviewing software and earned the maximum of $50 already on Amazon, I have purchased a salad spinner with this Amazon cash so I can eat more greens too. TMI? Maybe, too much free money? Never.” [Read More]

These two quotes highlight the key issue with G2, especially in a market like DAM.  Using questionable tactics to encourage people to write reviews and aggressively spamming users to harvest low-cost content makes the whole thing look cheap.  Combine that with the flawed methods employed for scoring suppliers, sprinkle in some fake reviews authored by those vendors who are partial to game-playing, and you get a concoction which could leave a rather unpleasant after-taste.

G2 are not alone in using these kinds of ‘review mill’ methods to achieve critical mass, nor are they the only ones who employ mechanical approaches to derive vendor rankings.  The DAM vendor who told me about the French fake DAM review site last year also pointed out that Capterra use a ‘social reach’ scoring method as one of their criteria to assess vendors.  They also demonstrated to me how this can easily be gamed by buying fake Twitter followers, so dubious practices are rife on these sorts of sites in one form or another.

I must emphasise that not everyone who has posted a review of a given product has done so for the wrong reasons; it is probably only a minority.  Just because a review appears on G2 Crowd and its author was given a fifteen-dollar voucher for their trouble does not invalidate what they say.  The fact that G2 have resorted to these methods, however, has muddied the waters, because you can’t easily tell who is real and who is just doing it because they want a cheap salad spinner (or because they set up an account to review the product developed by the DAM vendor they work for).

A few years ago, Tim Strehle wrote a blog article about the complications of getting reviews of DAM products, in which he quoted a comment I made on a discussion thread.  I observed that you don’t read many negative reviews of DAM solutions, partly because users fear writing them, since they may have to deal with representatives of their system supplier for a long time afterwards.  This is a point to keep in mind about reviews of more expensive software.  The other issue is that if you ask the majority of users what they think of their DAM less than six months after it has first been deployed, they will tend to have a positive opinion; partly this is relief at having made it through the demanding process that DAM implementations can sometimes turn into.  To avoid these effects, opinions should be solicited a year or more after the initial rollout, and preferably once some new people have been introduced who were not part of the team that chose the vendor.  This period of reflection affords some opportunity for a more balanced perspective.

Sites like G2 Crowd, Capterra etc. are not useless, as they do at least list some options for prospective DAM users to think about, but their authority is not something you should depend on, especially given that DAM solutions are relatively expensive purchases (even at the lower-cost end of the market).  This is partly the reason why we have stuck to a simple directory for our own list of DAM vendors rather than offering it as a reviewing facility.

Like a number of other issues in DAM currently, getting access to impartial, in-depth reviews of DAM products remains far more difficult than it should be.  While there are some detailed analyst reports available, these are not cheap and may be cost-prohibitive for a fair proportion of prospective DAM users.  The same criticism could be levelled at using the services of a consultant to prepare custom evaluations instead of buying pre-written reports.  This is something I have been talking to a number of other people about, and I hope to make an announcement later this year, but there is nothing imminent.  At this juncture, therefore, the best advice I can give is for users to educate themselves and try to talk to some real users, which you might need to do offline, in an environment where opinions can be given more freely.  Even if you are able to afford the services of a consultant and/or buy an analyst report, you can never know enough about DAM.  An investment of time spent educating yourself about it will pay a reliable stream of dividends in the form of superior knowledge and expertise.
