“Every single time you make a merger, somebody is losing his identity. And saying something different is just rubbish.”
Today’s blog is derived from a wonderful question from a subscriber to the PAOSS blog.
I would be interested to know more about the data integrity / synchronisation approach and challenges, especially when various mergers/acquisitions are involved. For this reason, the service provider has various inventory systems and none of them can act as the master system. Please provide some insight on this topic. I don’t have any specific case in mind; I just want to know the overall approach, challenges and solutions.
This sure is a complex question to answer because a situation like this would normally require a highly customised, case-by-case approach rather than a generic solution to fit all situations. I even took the liberty of bouncing some ideas off a friend of mine and Data Magician, Doug Duke.
Whilst reading a book called “The Click Moment” recently, I was struck by the phrase “smallest executable steps” in relation to complex OSS. I think it fits this situation and I’m sure I’ll be using it in many future blogs.
A few thoughts:
- In a merger / acquisition (M&A) situation, you’d normally have to think that since each organisation’s systems were working in isolation before, they will be able to continue to work independently of each other. This buys you some time, which you’ll need.
- One of the objectives of a merger / acquisition is the improved efficiency gained from consolidating overheads. Reducing the number of inventory systems is one example, cutting hardware, license fees and possibly the head-count that currently operates them (or, as Doug says, rather than reducing head-count it should be about “freeing headcount from day-to-day management and operations to contribute more to development and expansion of the merged entity. Good people are still hard to find and retain and there is a big advantage in keeping and developing people who already know the organisation”). Let’s assume this is one of your prime objectives in this scenario.
- As the master planner of this consolidation program, you’ll need that time mentioned in point 1 to get a deeper understanding of the merged set of systems. There are many ways to capture your information (samples shown here) and your approach may be different depending on the systems and completeness of documentation you have.
- In M&A situations, there is generally a dominant organisation, so they’ll often keep the systems and processes that they’re familiar with. However, in doing so they miss out on gold nuggets hidden in the other systems. I have a great story to share about this tomorrow
- If you’re an outside consultant, then you’ll have an even tougher task as you’ll have to get to know both sets of systems. Ouch!! Have fun with that!
- Now that you have your data dictionary, entity-relationship diagrams and so on, you’ll have a better handle on where each data object is sourced from and what it is used for
- You now have the base knowledge to build your consolidation strategy from.
- The next step is to define your objectives for the consolidation. Is it to remove duplication, federate, cull functionality, merge functionality, cull systems, cull head-count, refine processes, etc? In Doug’s words, “What is the desired outcome / purpose of the proposed federation of the Inventory solutions? In other words, why can none of the systems be designated Master? If they are to remain separated, why can they not be synchronised only within their own domain?”
- Then there is the question of data politics. Doug’s view is that you “often run into issues of ownership and control at implementation” so system ownership may come into play within large organisations with multiple business units. Doug goes on to say, “Master data really needs to be just that – Master – with control over not only the data itself but also over its ownership, its structure, its use and its availability. My experience across telcos – not just OSS – is that operational islands (aka. silos) tend to operate unilaterally and in complete ignorance of their actions on other operational areas and the business overall. The result is that Master data solutions end up holding not just obsolete data but information that becomes completely irrelevant to the business.” So true!
- I’m all for ruthless simplification as a consolidation strategy. I’m sure you will be collecting and spending time / money on cultivating data that you have no tangible need for after consolidation. Do you know why every data object exists in your ecosystem?
- Once you’ve decided what data is necessary, it’s time to decide which data model best supports it
- As Doug stated, “I’m usually very reluctant to promote this concept [a higher-order inventory database to federate and be master over existing databases] due to adding an additional layer of complexity, which pushes total system complexity up exponentially. However, it’s potentially manageable if roles, responsibilities and data flows are clearly defined and maintained.” I agree.
- Another alternative is to look at the toolsets and potentially map data sources over to one toolset and decommission the others. If you have two OSS that are capable of discovering inventory via SNMP interfaces, then you may be able to choose the one with functionality that best suits your organisation’s needs. For example, if OSS #1 maps NMS #1 and #2, whilst OSS #2 maps NMS #3, but OSS #2 has the better functionality match, then you may look to switch data feeds from NMS #1 and #2 over to OSS #2. OSS #1 then becomes redundant
- You won’t always have a single system with master control. In most situations, multiple systems can each be master of certain data sets, but there shouldn’t be multiple masters of the same objects. You then rely on clever process design to bring it all together.
- An example is the case where the network itself (eg switches) must be the data master, but each network device tends to only have local inventory context and doesn’t have the end-to-end data set. That’s where a higher-order system must be master of end-to-end or cross-domain data sets such as circuit or network designs. Your data model may need a hierarchy of data objects, with different masters controlling different layers
- Then there are other concepts such as linking keys and alternate fields to help synchronise data across databases, as well as closed-loop data integrity, as we’ve discussed previously in “Synchronicity”
- The great thing about OSS data is that it is ideally suited to “smallest executable steps.” Lots of small steps (the devil is in the detail) are required to achieve a consolidation like this… although the big strategic positioning moves are usually required too.
- The challenge is finding the balance between minimising the complexity of change and making the big, bold moves at the right times to strip away the non-essential.
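To make the feed-switching example in the list above a little more concrete, here’s a minimal sketch of an NMS-to-OSS feed routing table. The system names (nms1, oss1, etc.) are hypothetical, mirroring the OSS #1 / OSS #2 scenario; a real consolidation would obviously involve much more than a routing table, but the logic of the decision is the same.

```python
# Sketch: NMS-to-OSS feed routing before and after consolidation.
# System names are hypothetical, mirroring the example in the text.

feeds = {"nms1": "oss1", "nms2": "oss1", "nms3": "oss2"}

def remap(feeds, target_oss, sources):
    """Point the given NMS feeds at target_oss; return the new routing."""
    return {nms: (target_oss if nms in sources else oss)
            for nms, oss in feeds.items()}

def decommission_candidates(feeds, all_oss):
    """An OSS that no longer receives any feeds can be retired."""
    in_use = set(feeds.values())
    return sorted(set(all_oss) - in_use)

# Switch NMS #1 and #2 over to the OSS with the better functionality match.
after = remap(feeds, "oss2", {"nms1", "nms2"})
# decommission_candidates(after, {"oss1", "oss2"}) -> ["oss1"]
```

Once every feed has been remapped, the decommission check gives you an objective trigger for retiring the redundant OSS rather than relying on tribal knowledge.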
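The “multiple masters, but never of the same objects” principle from the list above can be sketched as a declared set of master scopes plus a validation pass. The system names and object types here are hypothetical examples, not a prescribed model:

```python
# Sketch: declare which system masters which object types, then verify
# that no object type is claimed by two masters. Names are hypothetical.

MASTER_SCOPES = {
    "network_discovery": {"physical_port", "logical_interface"},  # the network is master of local inventory
    "inventory_oss": {"circuit_design", "network_topology"},      # higher-order, end-to-end data sets
    "crm": {"customer_service"},
}

def validate_single_master(scopes):
    """Map each object type to its single master; raise on conflicts."""
    owner = {}
    for system, object_types in scopes.items():
        for obj in object_types:
            if obj in owner:
                raise ValueError(
                    f"{obj!r} claimed by both {owner[obj]!r} and {system!r}")
            owner[obj] = system
    return owner

owners = validate_single_master(MASTER_SCOPES)
```

Making the scopes explicit like this also gives the data politics discussion something tangible to argue over: ownership disputes show up as validation failures rather than as silent double-mastering.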
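The linking keys and alternate fields mentioned above can be sketched as a simple reconciliation pass between two inventories: match on a primary linking key, fall back to an alternate field, and surface unmatched records for closed-loop follow-up. The field names (serial_no, hostname) and sample records are hypothetical:

```python
# Sketch: reconcile two inventories using a primary linking key
# ("serial_no") with an alternate-field fallback ("hostname").
# Field names and sample records are hypothetical.

def reconcile(inventory_a, inventory_b, key="serial_no", alt_key="hostname"):
    """Return (matched pairs, unmatched from A, unmatched from B)."""
    by_key = {r[key]: r for r in inventory_b if r.get(key)}
    by_alt = {r[alt_key]: r for r in inventory_b if r.get(alt_key)}
    matched, unmatched_a, claimed = [], [], set()
    for rec in inventory_a:
        other = by_key.get(rec.get(key)) or by_alt.get(rec.get(alt_key))
        if other is not None:
            matched.append((rec, other))
            claimed.add(id(other))
        else:
            unmatched_a.append(rec)  # candidates for closed-loop fix-up
    unmatched_b = [r for r in inventory_b if id(r) not in claimed]
    return matched, unmatched_a, unmatched_b

# One record matches on the key, one only via the alternate field, and
# one exists only in inventory B (an integrity gap to investigate).
inv_a = [{"serial_no": "S1", "hostname": "sw-01"},
         {"serial_no": None, "hostname": "sw-02"}]
inv_b = [{"serial_no": "S1", "hostname": "sw-01"},
         {"serial_no": "S9", "hostname": "sw-02"},
         {"serial_no": "S3", "hostname": "sw-03"}]
matched, only_a, only_b = reconcile(inv_a, inv_b)
```

Run regularly, the unmatched lists become the work queue for the closed-loop data integrity process, so discrepancies get corrected at source rather than accumulating.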