The Future of ECM: Content Services & Systems of Understanding
Over the past few years, much has been written about the demise of Enterprise Content Management (ECM) systems and, more recently, about the emergence of content services platforms, applications and components.
Death of ECM & Rise of Content Services
In January of 2017, Michael Woodbridge of Gartner wrote in his blog:
"ECM is now dead (kaput, finite, an ex-market name), at least in how Gartner defines the market," and he announced the birth of Content Services.
John Mancini of AIIM responded by introducing the world to Systems of Understanding and ushering in the era of Intelligent Information Management, which was also the topic of his keynote at the AIIM Conference in Orlando.
Yes, as they say, the times, they are a-changing. The volume and diversity of content have continued to increase.
- Mobile devices have proliferated beyond imagining and invaded our day-to-day lives.
- The Cloud has emerged as the predominant force in IT infrastructure and in how applications and services are delivered to customers.
- Social media has forever changed how we share information and communicate.
- Consumer apps and the Shadow IT era have shown us that what users want matters, and that good is often good enough.
- And, more recently, as machine learning and artificial intelligence technologies have begun to mature, the opportunity to derive greater insight from content has come to the forefront.
This is the nature of technology, to change, to evolve. With this continual evolution, there are, inevitably, epochs in the market, periods of change that cause upheaval and perhaps some hand-wringing, but also create incredible, new opportunities for customers and the vendors that serve them.
I will leave the discussion about what to call this new era of ECM to brighter minds and to those in a better position to comment. (Or, perhaps, I will ignore my own better judgement and discuss this in a future blog posting.) What I really wanted to do today was focus on the future of ECM. I also wanted to move the discussion forward, as much has already been written about the impact of S.M.A.C. (Social, Mobile, Analytics and Cloud) on ECM.
So, over the course of this blog, I will share with you my thoughts on the future of ECM, focusing on some of the other key elements of the next-generation content platform, including modularity, connectivity, metadata models and extended, hybrid-Cloud infrastructures.
Modularity is key for innovative and future-proof solutions
For those of you who know me, no, this clearly won’t be a treatise on architecting ECM systems. Instead, I will look at the practicality of a modular architecture from two simple perspectives: one, in building new solutions on a Content Services platform; and, two, in ensuring that your next-generation system is future-proof.
What is the purpose of a Content Services Platform?
It is, very simply, a common construct upon which to build solutions. These are either bespoke solutions, where customers leverage the platform for their unique business needs and benefit; or, they are repeatable applications, where vendors or their partners develop tailored applications to address specific horizontal or vertical (industry) business needs. Modular services, or micro-services, make it easier to develop specific solutions or applications. Rather than dealing with an unwieldy, monolithic platform that tries to be all things to all people, developers can simply invoke modular services that deliver the specific functionality they need.
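To make this concrete, here is a minimal sketch of composing a solution from modular services. The service contracts and class names (`StorageService`, `SearchService`, `InvoiceApp`) are illustrative assumptions, not any real platform's API; the point is that an application invokes only the services it needs, and each can be swapped independently.

```python
from dataclasses import dataclass, field
from typing import Protocol

# Hypothetical, minimal service contracts -- illustrative names,
# not part of any real content services platform.
class StorageService(Protocol):
    def save(self, doc_id: str, body: bytes) -> None: ...
    def load(self, doc_id: str) -> bytes: ...

class SearchService(Protocol):
    def index(self, doc_id: str, text: str) -> None: ...
    def query(self, term: str) -> list: ...

@dataclass
class InMemoryStorage:
    _docs: dict = field(default_factory=dict)
    def save(self, doc_id, body): self._docs[doc_id] = body
    def load(self, doc_id): return self._docs[doc_id]

@dataclass
class SimpleSearch:
    _index: dict = field(default_factory=dict)
    def index(self, doc_id, text):
        for word in text.lower().split():
            self._index.setdefault(word, set()).add(doc_id)
    def query(self, term):
        return sorted(self._index.get(term.lower(), set()))

@dataclass
class InvoiceApp:
    """A bespoke solution composed from only the services it needs."""
    storage: StorageService
    search: SearchService
    def ingest(self, doc_id: str, text: str) -> None:
        self.storage.save(doc_id, text.encode())
        self.search.index(doc_id, text)

app = InvoiceApp(storage=InMemoryStorage(), search=SimpleSearch())
app.ingest("inv-001", "ACME invoice March")
print(app.search.query("invoice"))  # ['inv-001']
```

Because `InvoiceApp` depends only on the two contracts, a vendor could replace `SimpleSearch` with a new search engine without touching the application code, which is exactly the modularity argument made above.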
Remember the old argument about best-of-breed vs. integrated ECM platforms?
In many cases, customers sacrificed greater functionality or a superior user experience in order to avoid the pain of integrating best-of-breed solutions. With the advent of the Cloud and Cloud services, it’s becoming increasingly easy to find the exact service you need and simply embed it – perhaps alongside services from other vendors – to create the application you need for your business. As we move forward in this new era of micro-services, this has some very interesting implications for vendors, as they price their modular platforms, and for customers who may be only utilizing a portion of the capability of the platform.
Modularity is more than skin deep.
Customers should also look for next-generation platforms that architect more deeply for modularity to allow new advances in infrastructure and technology to be readily added and embedded in the platform. One of the hidden issues of monolithic ECM platforms is that, with many of the major releases, customers are faced with a full-scale migration to adopt the latest version. The costs and resource requirements associated with these “upgrades” are massive and one of the most significant drivers of high total cost of ownership (TCO) for the previous generation of ECM platforms.
Therefore, wouldn’t it be much better to adopt an ECM solution that was forward-compatible, one that allowed you to update the underlying database (e.g. to offer lower-cost, more performant NoSQL options), to add new options for content storage, or to incorporate a new search engine – all without replatforming? Believe it or not, some vendors do think about architecture and its impact on TCO for customers.
So, my argument is simple: as you are looking for a more modern, micro-services-based architecture to speed your development efforts, you should also look for modularity in the underlying platform and for an architecture that can continuously take advantage of the latest advances in infrastructure and supporting technologies.
Connectivity & Openness
I will go a step further and explore one of the key issues with legacy ECM systems and contrast this with a more modern approach that revolves around openness and connectivity.
There is a fallacy at the heart of ECM.
Enterprise Content Management was originally more of a philosophy than a technology: we believed that we could make unstructured information (content) easier to find and work with and, by doing so, we could help customers to be more efficient, more productive and more informed in their decision making. However, along the way to this simple but lofty ambition, our original concept morphed into an entire suite of technology and, at its heart, was an idea that a single repository was needed to make all this information readily accessible and sharable.
It seemed logical at the time. If organizations were going to deploy a single ECM system broadly, across the enterprise, then it made sense to have a common repository at the heart of this system – a single source of truth, as we used to say. The problem is that there are actually a number of issues with this approach:
- First and foremost, almost no one actually deployed only one ECM system. Instead, depending on the needs of the business, different departments purchased and deployed different systems.
- Second, most ECM suites grew through acquisition and, in many cases, this single-repository approach made it more difficult to properly integrate newly acquired functionality.
- Finally, not all products within any suite are created equal. So, in many cases, customers were forced to either purchase a new tool from a different vendor (see issue #1) or procure a less functional, less capable or less appealing tool in order to purchase from the same vendor.
I would also point out that many of these legacy ECM suites were architected 10, 15 or even 20 years ago and were really designed as proprietary systems. So, perhaps a “best of breed” or connected approach was difficult or even impossible to achieve back then. However, technology has certainly evolved since then. Isn’t it time that ECM evolves as well?
Your users should choose how they want to work.
In this modern, app-driven world, users are empowered to self-select the technologies that make them most productive. Microsoft SharePoint, Google Docs, EFSS tools like OneDrive, Box and Dropbox: these are all great examples of simple, lightweight apps that users employ to create, share and work with content. And this only begins to scratch the surface. How about messaging apps like Slack or Microsoft Teams, where content is routinely shared and distributed? Or business applications like Oracle, SAP, Workday or Salesforce, where content is often a critical element of the business process?
The truth is that the number and diversity of content, productivity and business apps that employees use will only continue to increase, which leaves the modern enterprise with two simple choices:
1. Limit user access to these apps and continue to attempt to force users to adopt enterprise standards (we know how this will end).
2. Allow users to work the way they want to work.
I think the answer here is apparent. Your next-generation ECM solution should provide out-of-the-box connectivity to your most common business and productivity apps. It should also provide an open, standards-based connectivity framework that enables you to easily integrate with new apps, when OOTB connectors aren’t available. This simple, best-of-both-worlds approach gives your users the flexibility they need, while enabling your organization to better leverage your content across the enterprise.
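One way such an open connector framework might look, sketched minimally in Python: a common `Connector` contract plus a registry keyed by source scheme. Everything here is an assumption for illustration (the class names, the URI scheme convention, the stub connectors); a real framework would add authentication, paging and change notification.

```python
from abc import ABC, abstractmethod

# Illustrative sketch of an open connector framework; the class and
# method names are assumptions, not a real vendor API.
class Connector(ABC):
    """Contract that every app connector implements."""
    @abstractmethod
    def fetch(self, item_id: str) -> dict: ...

class ConnectorRegistry:
    def __init__(self):
        self._connectors = {}
    def register(self, scheme: str, connector: Connector) -> None:
        self._connectors[scheme] = connector
    def fetch(self, uri: str) -> dict:
        scheme, _, item_id = uri.partition("://")
        if scheme not in self._connectors:
            raise KeyError(f"No connector registered for '{scheme}'")
        return self._connectors[scheme].fetch(item_id)

class SharePointConnector(Connector):  # stand-in for an OOTB connector
    def fetch(self, item_id):
        return {"source": "sharepoint", "id": item_id}

class SlackConnector(Connector):  # a custom connector added later
    def fetch(self, item_id):
        return {"source": "slack", "id": item_id}

registry = ConnectorRegistry()
registry.register("sharepoint", SharePointConnector())
registry.register("slack", SlackConnector())
print(registry.fetch("slack://C1234/msg-42")["source"])  # slack
```

The value of the abstract contract is that adding support for a new app, when no OOTB connector exists, means writing one small class and registering it, with no change to the platform itself.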
You should have choices about where your content lives.
If you’re like most organizations, you have multiple content silos in your organization. Perhaps these are departmental deployments of legacy ECM solutions, SharePoint instances scattered across the enterprise, or newer content stores for sync and share apps.
With legacy ECM, you have only one choice: have your users copy or move this content, slowly and painfully, into a centralized repository and then hope that they can find it again. With next-generation ECM, the answer is much simpler: connect to these different repositories to provide federated access to all this information. Again, out-of-the-box connectivity for common content stores and an open, extensible connector framework are key elements of a next-generation ECM solution.
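The federated pattern can be sketched in a few lines: fan a query out to each connected repository in parallel and merge the results, rather than migrating the content first. The repository search functions and result shapes below are purely illustrative stand-ins.

```python
from concurrent.futures import ThreadPoolExecutor

# Hedged sketch of federated access: each function stands in for a
# query against one connected repository.
def search_legacy_ecm(term):
    return [{"repo": "legacy-ecm", "title": f"{term} policy.doc"}]

def search_sharepoint(term):
    return [{"repo": "sharepoint", "title": f"{term} deck.pptx"}]

REPOSITORIES = [search_legacy_ecm, search_sharepoint]

def federated_search(term: str) -> list:
    """Query every connected repository and merge the hits."""
    with ThreadPoolExecutor() as pool:
        result_sets = pool.map(lambda fn: fn(term), REPOSITORIES)
    return [hit for results in result_sets for hit in results]

hits = federated_search("invoice")
print(sorted(h["repo"] for h in hits))  # ['legacy-ecm', 'sharepoint']
```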
By the way, once you have been given a choice about where content lives, you also have choices to make about where content will live in the future. A modern, next-generation ECM platform should support any number of storage options for your content. Does your regulatory environment require your content to be stored on-premises? Do you want to take advantage of low-cost, high-availability Cloud options like Google Drive, Box or even Amazon S3? Or, does a blended, hybrid approach better fit your needs? Great. These should all be storage options that a next-generation content platform provides.
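A platform that offers all of these storage options usually routes content by policy. Here is a deliberately tiny sketch of that idea, with a single metadata flag standing in for a real policy engine; the backend classes are placeholders, not real SDK clients.

```python
# Sketch of policy-driven storage routing: regulated content stays
# on-premises, everything else goes to cloud object storage.
class OnPremStore:
    name = "on-prem"
    def put(self, key, body): ...

class CloudObjectStore:  # e.g. an S3-compatible bucket in a real system
    name = "cloud"
    def put(self, key, body): ...

def choose_backend(doc: dict, on_prem, cloud):
    """Route by a metadata flag -- a simple stand-in for real policy."""
    return on_prem if doc.get("regulated") else cloud

backend = choose_backend({"id": "d1", "regulated": True},
                         OnPremStore(), CloudObjectStore())
print(backend.name)  # on-prem
```

A hybrid deployment is then just a routing rule away: the same application code writes through `choose_backend`, and the organization decides where the bytes land.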
You should choose when and where to deploy your next-generation solution.
As we’ve discussed above, the legacy, single-repository approach creates a number of challenges for your users and actually makes it harder, not easier to achieve the original vision for enterprise content management. It also creates a unique challenge for IT: how to deploy new content technologies in an organization that already has many.
With legacy ECM platforms, you once again don't have many choices. You can "lift and shift", discarding your previous investments and replatforming all of your current content solutions, or you can continue to live with the silos in your organization. With a modern ECM approach, however, you have much greater flexibility in adopting a next-generation platform. By offering connectivity as an option, existing investments can be left in place without leaving the content behind. This gives you a choice about when, or even if, you want to decommission older ECM environments. And that's really what it's all about: having a choice.
Once again, my argument is simple. Legacy ECM suites are closed systems, architected around a common repository and a singular way of doing things. Next-generation ECM is all about choice, openness and connectivity. Your users should have a choice about what tools they use to perform work. You should have a choice about where content lives. And, you should have a choice about when or even whether you migrate existing content stores. Connectivity makes this all possible.