I read Chris Mellor’s article on Blocks & Files covering Enrico Signoretti’s GigaOm report on “unstructured data management suppliers,” and it has prompted me to share my unfiltered perspective on the IT analyst industry publicly.
Bridges will be burned in the process, undoubtedly.
My motivation for calling out the IT analyst industry, starting with GigaOm as an example
Is this revenge blogging over HubStor being overlooked in the report? It’s more important than that.
Analysts wield a great deal of power, and their influence shapes the market. As a result, I believe they should all be held to a high standard of integrity and thoroughness.
I hope that sharing my experience and views about the way the analyst game operates for enterprise IT products in practice (both the good and the bad of it) is valuable for the IT community. I hope it gives IT leaders insight into how analysts can mislead them.
I want IT leaders to make smarter technology purchase decisions that result in better outcomes for their companies and themselves.
After all, if you think about it, people’s jobs—and lives—are on the line. I know that sounds dramatic, but situations can result in catastrophe if a business cannot recover its data or access critical information promptly because of poor storage, backup, and data management infrastructure.
I’m not blaming all IT failures on analysts. That would be silly.
The world of enterprise IT is full of stories of technology vendors that are great at marketing and sales but horrible at customer service and product quality. There are times that customers have been on the losing end of great products that are neglected because of ivory-tower product strategy decisions. And haven’t we all witnessed wasteful spending on big-ticket products that do not align with objectives because senior people got involved in the vendor selection process? Analysts aren’t solely to blame, of course, but the fact is that analysts indeed play a role in such churn.
With that said, I plan to write a series of posts about the condition and practices of the IT analyst industry. We’ll start in this post by using Enrico’s report as an example to illustrate some common problems with the industry.
The scope and categorization problem
Analysts can help us gain a macro understanding of markets by offering accurate characterization of buying categories and their relevant technologies.
However, more than ever, analysts struggle to set a workable scope when they attempt to define these things. Why is that? The technology landscape has grown more complicated as products converge across historically siloed IT practices. And the various deployment architectures of on-premises versus cloud versus hybrid add yet another dimension.
Instead of covering the market realities with full resolution, I believe that analysts are prone to convenient oversimplifications. Maybe the analyst firm doesn’t have the bandwidth to cover it all, so they publish half-baked reports. Or, perhaps, the higher-ups at the analyst firm have determined that the report must be seven to 15 pages because—they believe—a 100-page report would be too much. I’m just guessing as to the reasons why, but the fact remains that analysts lean towards low-resolution coverage.
Case in point: GigaOm’s report on “unstructured data management suppliers” sets an incredibly broad and ambitious scope. The term “unstructured” excludes databases, refining “data management” somewhat, but what remains is still remarkably broad (and deep). You could argue that the unstructured data management arena encompasses backup, archiving, records management, collaboration, data security, data loss prevention, and various types of primary and secondary storage technologies. One would think that such a broad scope would yield hundreds of vendors, and indeed it could. It would require a massive report to do it justice. Yet Enrico portrays the world of unstructured data management as having only 16 relevant vendors. It’s laughable.
Thus, our first analyst fallacy is the oversimplification of markets such that readers walk away with a muddled perspective of the technologies and relevant vendors.
Poorly defined market scopes lead to a strange assortment of vendors
What vendors made the cut for Enrico? Chris Mellor tells us:
They are Apartavi (sic), AWS Macie, Cohesity, Commvault, Druva, Google Cloud DLP, Hitachi Vantara, Igneous, Komprise, NetApp, Panzura, Rubrik, Quantum, Scality, SpectraLogic and Veeam.
What an unusual assortment of fruits in this basket. We have Honeycrisp apples, Cara Cara and Navel oranges, a few different types of pears, some bananas, and a pineapple. If you’re hungry for fruit, these are the fruits of the world, according to the analyst. Never mind that you are aware of many others.
Hitachi Vantara and NetApp are known for primary storage arrays. Rubrik and Cohesity are hyperconverged secondary storage plays. Scality is known for object storage but is also a software-defined storage player like Komprise. Panzura is a cloud storage gateway company. Igneous and Aparavi are SaaS-based data management platforms. Commvault, Druva, and Veeam are more backup than anything. And AWS Macie and Google Cloud DLP are almost utility-class tools for specific data security and compliance use cases.
What are the product names AWS Macie and Google Cloud DLP doing in a list of vendor names?
Vendors like Komprise have licensed software, while others on the list offer subscription SaaS, and others are into the hardware game in a big way.
And yet, somehow, we have this smattering of dissimilar vendors (and two products) arranged nicely into this common evaluation framework.
Thus, we have our next fallacy, which is comparing dissimilar vendors or technologies within a common framework as if they are similar.
As professionals in the IT industry with precise requirements, we need logical and in-depth analysis of the market.
To illustrate, imagine you picked up the latest issue of Car and Driver magazine. In it, the editors evaluate the market of ground vehicles (a broad market scope) and arbitrarily list a Yamaha YZ450F motocross dirt bike, Tesla, a Chevy Suburban, Honda, and a Meepo AWD PRO. (Notice that some entries are specific products while others are companies offering a full spectrum of products.) If we were in the market for a vehicle, would this fictitious ground transportation radar be of much help? The comparison can be made, but it isn’t logical, so we wouldn’t put much stock in it. More likely, as buyers, we would have already decided we need a vehicle that seats seven people, so we’d seek an analysis of market options for large SUVs or minivans, perhaps. In the IT industry, however, such a study is hard to find.
Influenced and/or unfair inclusion criteria
Enrico’s Radar on Unstructured Data Management features such a hodgepodge of vendors (and some product names) that it raises the question: What selection criteria resulted in such a medley of names?
How can you include:
- Commvault but not Veritas, Micro Focus, and Open Text?
- Igneous, Komprise, and Aparavi but exclude HubStor and Clumio?
- Rubrik and Cohesity and not Actifio?
- Panzura but not Nasuni?
- Hitachi Vantara and NetApp but no Pure Storage and Dell EMC?
- Veeam and Druva and not Zerto, Acronis, Datto, etc?
- AWS Macie and Google Cloud DLP and not the dozens of other DLP and data classification offerings from vendors such as Varonis, Active Navigation, and Stealthbits?
On LinkedIn, I am a 1st-degree connection with Enrico, so HubStor is known to him. I am sure that names like Veritas, Actifio, Nasuni, Varonis, and Dell EMC are known to him also.
Analyst firms like Gartner will provide some transparency around the inclusion criteria that decide which vendors make it into a report. For instance, Gartner Magic Quadrant reports typically share the inclusion qualifications, which are usually based on a specific annual revenue threshold, and Gartner will sometimes name the vendors that did not meet the qualification criteria. Perhaps Enrico’s report offers similar insight, but my hunch is that it does not. While this transparency is better than no visibility into the inclusion criteria, revenue-based qualification is prone to problems. Vendors are often private and do not publicly disclose financials. And in many cases, revenue for a specific product category is easily manipulated: vendors can count other revenue streams to qualify, and there is no way to verify. The analyst firm takes the vendor at their word.
The next fallacy then is the omission of relevant vendors while claiming that the market analysis report covers the suppliers in a particular segment. It is akin to a market overview of dirt bikes that includes Honda, Yamaha, Kawasaki, and Suzuki but leaves KTM, Husqvarna, Sherco, and a half dozen others off the radar.
With several large vendors excluded from GigaOm’s report, revenue qualification clearly wasn’t the deciding factor for inclusion. By what measure, then, is a vendor considered a worthy “unstructured data management supplier?” Does one need to sponsor or subscribe to GigaOm’s analyst relations package to be on the radar? I’m just asking. Unfortunately, no matter how you slice it, these are bad optics for the analyst and their firm. Either there’s an integrity problem, or it is an issue of poor research practices.
Analyst firms covering enterprise IT must embrace the fact that they are dealing with nuanced markets that are rapidly changing. These complicated markets need to be analyzed with higher resolution, not less.
Complexity is not an argument for low-quality research and partial analysis because, despite the rapid technological changes, many aspects remain the same.
For example, backup is still backup. The workloads and infrastructure you need to protect are evolving, and so are the delivery models, but none of this changes the fact that there is a market of solutions to be analyzed.