Along with people, Enterprise Data Management involves processes and tools. The way these interact defines how your enterprise can respond to changes - be it the introduction of new technologies or a supply chain interruption.
Pipes and plumbing
Business processes are often visualized as plumbing and its related hardware. Pipes are a familiar analogy for processes that feed their outputs into other processes, much like a plumbing network cascades and distributes a flow within a building or across town. The model is appealing because we can relate it to the individual processes we see day to day. Even with the many hoops and complications involved, the aim is usually to identify every pipe (a process or tool that transforms business objects) and every junction (further processes and tools where data is aggregated or distributed), in order to enforce a strict process.
Eventually, however, this often leads to a pipe dream. The problems can be multiple:
- Some business items can cause process delays that cascade into a water-hammer (ram) effect, destabilizing the flow unless buffers and dampers are built into the design;
- A rigid process flow limits the ability to adapt quickly to change - possibly causing significant opportunities to be missed;
- Such flows were often deployed on an earlier architecture model for which maintenance resources have become difficult to find.
Hardwired processes without room for tuning can be summed up as follows:
Once you automate, you're incapable of further improvement.
(Sean McAlinden, chief economist at the Center for Automotive Research, paraphrasing the perspective of Honda, a car manufacturer that reserves automation for tasks that are hazardous or less suited to humans than to machines, as reported in this article.)
What if you were to build on existing tools and accommodate varying process lengths? Furthermore, the tool itself could be interchanged at a moment's notice without significantly impacting the upstream or downstream processes. Strangely enough, this looks like a train, with cars and couplings: the cars are your processes and tools, and the couplings are the interfaces between them.
Trains: Cars and Couplings
A train representation involves cars and their couplings, which interconnect all the processes with some form of interaction. When a train starts, we don't expect the last car to move immediately: there is a delay that depends on the length of the overall assembly and on the slack in the couplings between the cars.
This architecture has the added benefit that dampers can be implemented between individual processes, so unexpected events can, to some extent, be absorbed without interrupting the flow. Realistically, extreme events can always cause interruptions; in some cases it's just not worth the investment, and there are other emergencies we should concentrate on (such as: "Earth struck by asteroid").
This architecture opens the way to loosely coupled containers of processes, tools and people. By defining convenient interfaces that can be standardized and used between containers, you can future-proof your business processes. The goal of such an interface is to reduce the work needed to implement changes, accelerating the deployment of innovative technologies and processes; entire containers can be replaced or even removed when they become obsolete, at a lower cost than re-piping hard-wired processes. An additional benefit is that it reduces the interruptions process stakeholders suffer through downtime or training on new tools and processes. These concerns are expressed frequently by enterprises - and integrators and consulting firms are listening and working on it, as this blog entry points out.
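As a minimal sketch of this "cars and couplings" idea, the snippet below models each container as a stage behind one standardized interface, so a stage can be swapped without touching the rest of the train. All names (Stage, LegacyPricing, NewPricing) are made up for illustration, not taken from any real product:

```python
from dataclasses import dataclass
from typing import Protocol


class Stage(Protocol):
    """The coupling: the only thing the rest of the train depends on."""
    def run(self, payload: dict) -> dict: ...


@dataclass
class LegacyPricing:
    """An existing tool wrapped behind the standard interface."""
    def run(self, payload: dict) -> dict:
        return {**payload, "price": 20.0}


@dataclass
class NewPricing:
    """A drop-in replacement: same coupling, different internals."""
    margin: float
    def run(self, payload: dict) -> dict:
        return {**payload, "price": round(15.0 * (1 + self.margin), 2)}


def run_train(stages: list[Stage], payload: dict) -> dict:
    """Each car pulls the next through the coupling, in order."""
    for stage in stages:
        payload = stage.run(payload)
    return payload


print(run_train([LegacyPricing()], {"product": "XYZ"}))
# Swap a car without touching the rest of the train:
print(run_train([NewPricing(margin=0.25)], {"product": "XYZ"}))
```

The point is that the caller only knows the `run` coupling; which tool sits behind it can change at a moment's notice.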
Connecting diverging architectures
In a perfect world, everyone speaks the same language and perceives things the same way. We know this is not the case in the real world - a flat in British English is an apartment in US English. Similarly, you'd "aller magasiner" in the Canadian French-speaking province of Québec, while in France you'd "aller faire du shopping", when heading out to shop.
To integrate your processes and tools into a coherent business process across diverse architectures, you should consider an abstraction layer that conforms to the semantic web model. By definition, it is
a way of linking data between systems or entities that allows for rich, self-describing interrelations of data available across the globe on the web.
The semantic web hinges on the Resource Description Framework (RDF), which links data (such as business objects, commonly referred to as 'Linked Data' in the field). Putting this into practice involves:
- Linked Data - the actual business data set managed through the semantic web approach
- Articulated relationships and business objects (through RDF) - because the information in your business tools needs to be enabled for the semantic web
- Web-facing applications - you'll need a front-end to expose your legacy applications that are not web-friendly, since the semantic web cannot mine data hosted on a proprietary architecture; and even the web-facing applications you already have will need to conform to an RDF format.
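As a hedged sketch of the "enablement" step above, the snippet below flattens one record from a hypothetical legacy tool into subject-predicate-object lines in an RDF-like notation. The base URI, record fields, and function name are all invented for illustration; a real deployment would use an RDF library and proper vocabularies:

```python
# Hypothetical base URI for the legacy system's exported resources.
BASE = "http://example.com/plm/"


def record_to_triples(record: dict) -> list[str]:
    """Flatten a legacy key/value record into subject-predicate-object lines."""
    subject = BASE + record["id"]
    return [
        f'<{subject}> <{BASE}{key}> "{value}" .'
        for key, value in record.items() if key != "id"
    ]


legacy_record = {"id": "XYZ", "TargetPriceIs": "20", "CurrentStateIs": "Active"}
for line in record_to_triples(legacy_record):
    print(line)
```

Once legacy data is expressed this way, any RDF-aware consumer can link it to data from the other tools without knowing anything about the proprietary system behind it.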
Airbus and IBM recently posted the results of one of their projects, supported by Open Services for Lifecycle Collaboration (OSLC, an organization that defines standards for Linked Data) practices that helped them resolve the complex engineering problem of icing on an airplane.
In a nutshell, the semantic web sets rules for representing information on the Web. It organizes data in an almost natural (semantic) fashion that is machine-readable. Consider the following statements, which come up during any product review:
- Ann: "Pete, can you provide me the state of consumer product XYZ, that we targeted at $20 ?"
- Pete: "XYZ is made of 3 parts: Part1, Part2, Part3; it is currently 'Active'. "
- Mary: "Pete, Ann, the supplier of Part3 is having difficulties, as one of their components comes from an area that's been hit by a natural disaster. Once we have it, we'll be able to start the approval process."
At its most complex, this information is gathered from data sources as varied as:
- a requirements management tool,
- a product portfolio application,
- an engineering CAD repository,
- a Bill-of-Material application,
- a change management application, and
- a process flow application.
This conversation can be articulated with the following Subject, Predicate and Object representation:
- ConsumerProduct ProductName XYZ
- XYZ TargetPriceIs 20
- XYZ MadeOf Part1,Part2,Part3
- XYZ CurrentStateIs ActiveState
- Part3 Issue PartDelayed
- ActiveState SignedBy None
There you have it: you've navigated your first RDF representation. The above has been simplified for ease of reading; you can learn more here.
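To make the walkthrough concrete, here is a minimal, hand-rolled triple store holding the statements above, with a tiny pattern-matching query. This is illustrative only - a real system would use an RDF library and SPARQL - and the `query` helper is an invented name, not a standard API:

```python
# The triples from the product review, as (subject, predicate, object) tuples.
TRIPLES = {
    ("ConsumerProduct", "ProductName", "XYZ"),
    ("XYZ", "TargetPriceIs", "20"),
    ("XYZ", "MadeOf", "Part1"),
    ("XYZ", "MadeOf", "Part2"),
    ("XYZ", "MadeOf", "Part3"),
    ("XYZ", "CurrentStateIs", "ActiveState"),
    ("Part3", "Issue", "PartDelayed"),
    ("ActiveState", "SignedBy", "None"),
}


def query(subject=None, predicate=None, obj=None):
    """Return every triple matching the pattern (None acts as a wildcard)."""
    return sorted(
        (s, p, o) for (s, p, o) in TRIPLES
        if subject in (None, s) and predicate in (None, p) and obj in (None, o)
    )


# Ann's question: what is the state of XYZ?
print(query("XYZ", "CurrentStateIs"))
# Mary's concern: which parts have issues?
print(query(None, "Issue"))
```

Note how each answer in the conversation is just a pattern match over triples that may originate in entirely different tools - that is the bridging the semantic web promises.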
Why is it important?
- Because you have invested a lot in many applications that support your enterprise, and you don't like to throw away applications that actually work
- Because training your resources is actually more expensive than you want to admit
- Because now that all this information is managed in silos, how about extracting more value from it by defining relationship rules that bridge these independent sources, reducing the effort to get the information and act on it in a more timely fashion, with automation where it makes sense?
Stay tuned.
(All views expressed in this blog are my personal views and do not represent any group/organization I belong to. The blog may contain some data which may not have been verified by me. The blogger is not to be held responsible for any misrepresentation of facts or data inconsistency. Abbreviations are used to lighten the reading - I don't intend to invent TLAs. Finally, all trademarks and registered trademarks are the property of their respective owners.)