(All views expressed in this blog are my personal views and do not belong to any group/organization I belong to. The blog may contain some data which may not have been verified by me. The blogger is not to be held responsible for any misrepresentation of facts or data inconsistency.)
As a senior consultant in product lifecycle management solutions, I engage customers operating in a variety of industries, with the goal of providing a solution that works in an environment where, often, other systems are:
The conversation invariably leads from an immediate, tactical need to a longer-term strategic vision, which frequently includes integration with these other systems that support the business operations at key moments of a product's life cycle and its production.
In that context, the incumbent application becomes the reference against which the new solution is measured. This is understandable, as the rationale is:
This yields a plan where the new application - which usually reflects the state of the art in current development trends - needs some form of integration with an earlier architecture. Eventually, we see the emergence of a "Franken-application" - part new, part recycled. The parallels between the "physical" world and computer software are somewhat relevant: how often have you had to forcibly upgrade an application that was otherwise working fine?
I recently came across an article that drew me to the Centre for Sustainable Design, which reported 10 take-aways about the Circular Economy (CE). In a CE, products' life spans are extended by a variety of means in order to reduce the need to manufacture yet another widget when the existing one only "needs fixing" - it is not completely worthless. The CE is "focused on closing the material loop, the more productive and efficient use of materials and extending the life of products circulating in the economy (and therefore reducing embodied materials and energy over lifecycle(s))".
Keeping the CE in mind makes it easier for proponents of a new (or improved) PLM application to defend their plan when integration is considered; otherwise, there is a lot of explaining to do to justify sending to the dustbin an application that has a lot of visibility within the enterprise. While tangible aspects such as hardware & software are easy to quantify and manage when considering an application deployment, we often underestimate intangible aspects such as perception and culture change that need to be addressed in the process.
Fortunately, when considering integration or support of existing applications, you are not alone. Let's look at major organizations that govern data exchange and integration processes:
1. Organization for the Advancement of Structured Information Standards (OASIS):
This is the organization that develops open standards for information interchange. ebXML, Odette and others are represented in one form or another under its aegis. Various specifications are maintained, such as Business Document Exchange (BDXR), Cloud Application Management for Platforms (CAMP), Open Data Protocol (OData), etc.
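As a taste of what these specifications look like in practice, here is a minimal sketch of an OData query issued over plain HTTP. The service root URL and entity set are purely illustrative; only the query options ($filter, $select, $format) follow the OData convention.

```python
# Minimal sketch: querying a hypothetical OData service.
# The service root URL and entity set name are illustrative only.
import requests

SERVICE_ROOT = "https://example.com/odata/Products"  # hypothetical endpoint

# Standard OData query options: filter the results, select a few fields, ask for JSON.
params = {
    "$filter": "Status eq 'Active'",
    "$select": "Id,Name,Revision",
    "$format": "json",
}

response = requests.get(SERVICE_ROOT, params=params, timeout=30)
response.raise_for_status()

# OData JSON responses wrap the result set in a "value" array.
for item in response.json().get("value", []):
    print(item["Id"], item["Name"], item["Revision"])
```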
2. Open Services for Lifecycle Collaboration (OSLC):
While OASIS provides the strategy, OSLC provides tactical support: it is where you'll find the tools and methods to integrate your incumbent lifecycle applications with your shiny new application. OSLC covers Change Management, ALM-PLM interoperability and Quality Management, among other lifecycle domains. Content negotiation is handled through a variety of formats (Resource Description Framework (RDF), JSON, XML).
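To make the content negotiation point concrete, here is a small sketch that requests the same (hypothetical) OSLC Change Management resource in two representations simply by varying the HTTP Accept header. The resource URI is made up for the example; the OSLC-Core-Version header follows the OSLC Core 2.0 convention.

```python
# Minimal sketch of content negotiation against a hypothetical OSLC
# Change Management resource: the same URI is requested first as RDF/XML,
# then as JSON, by changing only the Accept header.
import requests

RESOURCE_URI = "https://example.com/oslc/cm/changeRequests/1234"  # hypothetical

for media_type in ("application/rdf+xml", "application/json"):
    response = requests.get(
        RESOURCE_URI,
        headers={"Accept": media_type, "OSLC-Core-Version": "2.0"},
        timeout=30,
    )
    response.raise_for_status()
    print(media_type, "->", len(response.content), "bytes")
```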
3. LOng-Term Archiving and Retrieval (LOTAR):
LOTAR concentrates on ensuring the integrity of data that needs to be archived - and restored, when necessary - to support products and technical information. The aim is to give such information a treatment similar to what we have seen for image compression: reducing the massive amount of information required to handle complex products, without data loss. As you'd expect, a project that reaches its end-of-life and is not referenced by any other active project becomes a candidate for archival, releasing valuable resources for other projects & activities.
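As a purely illustrative sketch of that archival rule - a project becomes a candidate once it is inactive and no active project still references it - consider the snippet below. The Project structure and the selection logic are assumptions made for the example, not part of LOTAR itself.

```python
# Illustrative only: flag projects as archival candidates when they are
# inactive and no active project still references them.
from dataclasses import dataclass, field

@dataclass
class Project:
    name: str
    active: bool
    references: set[str] = field(default_factory=set)  # names of projects it depends on

def archival_candidates(projects: list[Project]) -> list[str]:
    # Anything referenced by an active project must stay available.
    still_needed = {ref for p in projects if p.active for ref in p.references}
    return [p.name for p in projects if not p.active and p.name not in still_needed]

portfolio = [
    Project("engine-v1", active=False),
    Project("engine-v2", active=True, references={"engine-v1"}),
    Project("legacy-tooling", active=False),
]
print(archival_candidates(portfolio))  # -> ['legacy-tooling']
```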
In particular, OSLC uses Linked Data at the core of its operations. Why is this important? As products become more complex, and applications generate and consume ever more data, we need to manage larger data sets - with fewer resources. McKinsey, a consulting firm, reckons the US "alone faces a shortage of between 140,000 and 190,000 people with analytical expertise" (while there, look at the interactive infographic). What can be done to tackle this challenge? You could turn to Apache's Marmotta, designed from the outset as a Linked Data platform; combined with other Apache projects such as Hadoop, Storm and Kafka, you can:
This allows you to bring your data under control and to extend the life of your legacy applications while deploying the latest gadgets in your environment. Yes, you might still need a data scientist to help out on the details: enterprise systems are likely to have proprietary data formats that need interpretation and mapping toward your converged architecture. No, I do not imply this convergence is trivial; it seldom is.
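For readers who want to see what "Linked Data at the core" can look like in code, here is a minimal sketch that loads a tiny RDF graph and runs a SPARQL query over it. It uses the rdflib library purely for illustration - a platform such as Marmotta would expose comparable RDF and SPARQL capabilities at server scale - and the vocabulary (ex:hasPart, ex:status) is invented for the example.

```python
# Minimal sketch of working with Linked Data: load a small RDF graph in
# Turtle syntax and run a SPARQL query over it. The vocabulary is invented.
from rdflib import Graph

TURTLE = """
@prefix ex: <http://example.com/plm#> .

ex:Widget100 ex:hasPart ex:Bracket7 ;
             ex:status  "released" .
ex:Bracket7  ex:status  "released" .
"""

graph = Graph()
graph.parse(data=TURTLE, format="turtle")

query = """
PREFIX ex: <http://example.com/plm#>
SELECT ?part WHERE {
    ex:Widget100 ex:hasPart ?part .
    ?part ex:status "released" .
}
"""

# Find released parts of the widget.
for row in graph.query(query):
    print(row.part)  # -> http://example.com/plm#Bracket7
```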
I did not dive into specific topics in this article: I will explore OSLC and the Apache projects in more depth in the coming ones. Stay tuned!