Over the past year or two the term ‘catalog-driven’ has been getting a lot of buzz in the telecommunications ordering space. The concept is simple: relationships and dependencies established in a centralized product catalog should be used by downstream systems to adapt and adjust their fulfillment behavior. The premise is that, once achieved, a provider would see faster time to market and lower operational costs, because fewer changes need to be made to the downstream systems when adding or modifying products and services.
The idea of a central catalog determining behavior in fulfillment systems is not new – in fact, the large monolithic providers, whose preference is to replace every system in a service provider’s ordering stack, have delivered it in some form for well over 10 years. That approach raises numerous concerns, the primary one being the transformation cost and the inability to leverage any legacy investments. The more recent trend, and one more easily digested by most CSPs, is enabling individual best-of-breed solutions – potentially from different vendors – to take advantage of this same ‘catalog-driven’ order management concept through the adoption of industry standards, such as those proposed by the TM Forum, along with flexible and open architectures.
A successfully completed implementation of a catalog-driven solution should benefit the CSP in two ways: first, operational costs should fall because fewer changes are needed when introducing new products or services; second, time to market – a key performance indicator today – should shrink for the same reason. In the end, though, ‘catalog-driven’ order management merely simplifies the same linear change process that has been in place in telecommunications for 30+ years. It is still a static procedure with numerous methodical steps and gates – it does not provide the ability to dynamically respond to real-time events affecting the ordering systems.
What if our ordering systems could react in real time to changing environments? What if they could change their behavior to adapt to positive or negative influences on the system as they occur? With today’s advanced analytics capabilities, that is exactly what is becoming possible. We see three key areas where an analytical data-mart built into an ordering platform benefits a provider: intelligence ‘on’ the process, intelligence ‘in’ the process and intelligence ‘driving’ the process.
Providing this level of insight and adaptation to the ordering process requires an embedded data-mart within the ordering platform. This is not the same as a business intelligence reporting solution pulling from a snapshot of the active selling database, because that approach introduces lags and delays in understanding what is really happening at that very moment. This analytical engine allows providers to become proactive instead of reactive.
Intelligence ‘on’ the process is an evolutionary step up from standard BI dashboard reporting: it moves the source data from a delayed reference to something that already happened to real-time access to what is happening now. It enables users to respond to key trends by viewing dashboards and key performance indicators in real time. This might provide visibility into sales by channel, margin, sales team, salesperson, store or region, or it might surface backorders and CPE fulfillment issues. Real-time access to this information is critical when supporting a large-scale roll-out of the latest super smartphone.
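To make the contrast with snapshot-based BI concrete, here is a minimal sketch of the idea: KPIs maintained as running aggregates that update the moment each sales event arrives, rather than being recomputed from a periodic database extract. The class and event fields (`channel`, `units`, `revenue`) are hypothetical illustrations, not part of any particular ordering platform.

```python
from collections import defaultdict

class SalesKpiTracker:
    """Maintains running sales KPIs per channel as order events arrive,
    so a dashboard reflects the current moment rather than a snapshot."""

    def __init__(self):
        self.revenue_by_channel = defaultdict(float)
        self.units_by_channel = defaultdict(int)

    def ingest(self, event):
        # event is a hypothetical sale record:
        # {"channel": str, "units": int, "revenue": float}
        self.revenue_by_channel[event["channel"]] += event["revenue"]
        self.units_by_channel[event["channel"]] += event["units"]

    def dashboard(self):
        # Read the live aggregates, e.g. to render a real-time KPI view.
        return {
            ch: {"units": self.units_by_channel[ch],
                 "revenue": round(self.revenue_by_channel[ch], 2)}
            for ch in self.revenue_by_channel
        }

tracker = SalesKpiTracker()
for e in [
    {"channel": "retail", "units": 2, "revenue": 1998.00},
    {"channel": "web", "units": 1, "revenue": 999.00},
    {"channel": "retail", "units": 1, "revenue": 999.00},
]:
    tracker.ingest(e)

print(tracker.dashboard())
```

The same dimensions mentioned above (margin, sales team, store, region) would simply become additional aggregation keys; the point is that each event updates the figures immediately, with no extract-and-load delay.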