From Real-Time Insights To Actionable Decisions - The Road Entrigna Paves

What problem does Entrigna solve?

With the tremendous increase in computing power and the steady decline in memory and storage costs, today's businesses are weighed down by a deluge of mostly disconnected data, much of it highly relevant to effective decision-making. Data associated with business applications, processes, transactions, operations, customers and customer insights, products and product insights, policies, systems, business partnerships, competition and more arrives in unmanageable volumes and in many different formats. Such data is highly volatile and inherently contains time-sensitive business intelligence reflecting momentary business performance. If detected in real time, this 'in-the-moment' business intelligence provides insights that can be used to dynamically determine the optimal action to take at that very moment. In essence, such data needs to be ploughed through to discover valuable knowledge and instantaneous insights, those insights turned into actionable decisions, and those decisions fed back in real time into business applications, processes and operations in a way that drives business value and profitability.

A few examples of such insights and decisions: product recommendations based on customer actions and purchase behavior, offer optimization based on customer insights, customer churn prediction, detection of customer disservice and recommendation of recovery actions, dynamic pricing of products and pricing optimization based on 'in-the-moment' shopping-versus-buying patterns, real-time prediction of perishable inventory shrink, and anomaly detection such as fraud along with recommended course-correction actions.

To derive intelligence and insights in real time from such volatile and varied data, there is a real need for technology with capabilities such as machine learning, data mining, rules processing, complex event processing, predictive analytics, operations-research-style optimization, artificial intelligence and other intelligence-generating mathematical algorithms, coupled with the flexibility to mix and match these capabilities when orchestrating more complex decisions. The breadth of decision frameworks is necessary because different business objectives require different analytical approaches. For example, a rules engine works well for recognizing a customer at a milestone, event processing is well suited to identifying potential customer-disservice scenarios, and optimization techniques work well for deciding which promotions to place in front of a customer.

Another challenge such technology must address is processing and correlating high-velocity data arriving in live streams from disparate sources and in a wide variety of formats. It should therefore be highly scalable and fault-tolerant, with extensive provisioning for distributed memory and massive parallelization of complex CPU-bound operations. From a licensing and maintenance point of view, it should also be cost-effective and economically viable. However, no single readily available technology offers all of these capabilities out of the box with seamless implementation.

Why is this problem a big deal?

There are commercially available technologies that individually specialize in one required capability or another: sophisticated business rules engines, technologies that excel at complex event processing, at data mining, at operations research, at traditional business intelligence, and so on. Each perhaps works well within its own realm of decision-making. In almost all cases these specialized technologies come from different product vendors. A few vendors offer some of them as part of a suite, but as independent products within that suite, so the technologies are not necessarily designed to talk to each other and interoperate. For real-life orchestration of complex business decisions, however, these capabilities need to be combined in a flexible and seamless fashion.


Many businesses leverage traditional business intelligence technologies alongside data warehouses and data marts with sophisticated data mining capabilities. Such traditional approaches are either time-driven or request-driven: operational and transactional data is staged and analytically processed in batch mode, and data mining and model building are static exercises, mostly intended to create reports and feed dashboards. Human reasoning is still needed to interpret the resulting insights and trends so that course-correction actions can be suggested and implemented to adapt business processes. This entire cycle of extracting insights and trends spans days to weeks, and by the time the insights are applied to business processes, the business environment may have evolved further, making them stale and potentially counter-productive.

Other businesses procure the individually specialized technologies mentioned above and make them interoperate by developing custom middleware. Even then, deriving insights and actionable decisions is not comprehensive, because the required decision orchestration remains split-brained across disparate technologies and is never fully realized. This approach demands large capital investment, both in procuring the disparate technologies and in building the custom middleware that makes them interoperate. Business initiatives intent on exploiting maximum advantage from real-time insights are consequently prone to long delays and long project timelines. Moreover, because decision orchestration cannot be fully realized, such initiatives get implemented with limited scope and many de-scoped requirements, so businesses lose much of the original value proposition, a loss typically measured in millions of dollars, along with their competitive edge.

How does Entrigna solve this problem?

Entrigna has developed a real-time decisions platform called RTES, the Real Time Expert System. RTES enables real-time decision capabilities by offering a full range of decision frameworks, packaged in one technology, that work together seamlessly: rules engine, complex event processing, machine learning, artificial intelligence, optimization, and clustering/classification. The RTES platform exposes these decision frameworks as built-in, modularized services that can be combined and applied to an organization's business data in real time to identify intelligent patterns that lead to real-time business insights. By packaging these decision capabilities in one technology, RTES enables seamless orchestration of higher-level decision services: a hybrid decision service can be configured as a network of individual decision services, wired in series and in parallel much like an electrical circuit. Individual decision services can be rules-based, machine-learning-based, classification-based, segmentation/clustering-based, predictive or regressive, or based on real-time optimization.
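
To make the electrical-circuit analogy concrete, here is a minimal Python sketch (not the RTES API; the individual service functions are hypothetical) of wiring decision services in series and in parallel:

```python
# Minimal sketch of composing individual decision services into a hybrid
# service, in series and in parallel. Purely illustrative, not RTES code.
from concurrent.futures import ThreadPoolExecutor

def in_series(*services):
    """Feed each service's output into the next, like components in series."""
    def composed(payload):
        for service in services:
            payload = service(payload)
        return payload
    return composed

def in_parallel(*services):
    """Run services on the same input and collect their outputs, like parallel branches."""
    def composed(payload):
        with ThreadPoolExecutor() as pool:
            futures = [pool.submit(service, payload) for service in services]
            return [f.result() for f in futures]
    return composed

# Hypothetical individual decision services
def rules_service(event):      return {**event, "eligible": event.get("spend", 0) > 100}
def clustering_service(event): return {**event, "segment": "frequent_buyer"}
def scoring_service(event):    return {**event, "score": 0.87}

hybrid_decision = in_series(rules_service,
                            in_parallel(clustering_service, scoring_service))
print(hybrid_decision({"customer_id": 42, "spend": 250}))
```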

Since RTES works on live streams of data, it promotes event-driven data integration. An event can be any number of things but is usually initiated by some action or activity: a customer shopping for tennis rackets online, the sale of 1,000 winter-gear items in the last hour, a snowstorm being forecast for the North-East, a flight being rescheduled. RTES extracts such events from live data streams in order to initiate the real-time decision process.
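
As a rough illustration of what event extraction might look like, here is a minimal sketch using hypothetical record fields and event types (not RTES's internal representation):

```python
# Minimal sketch: turning raw records from a live stream into events that
# kick off the real-time decision process. All names are hypothetical.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Event:
    event_type: str        # e.g. "product_browse", "snowstorm_forecast"
    source: str            # originating system or feed
    timestamp: datetime
    payload: dict          # raw attributes carried by the event

def extract_event(record: dict):
    """Map a raw stream record to an Event, or ignore it if no rule matches."""
    if record.get("action") == "search" and record.get("category") == "tennis rackets":
        return Event("product_browse", record.get("channel", "web"), datetime.now(), record)
    if record.get("type") == "weather_alert" and record.get("region") == "North-East":
        return Event("snowstorm_forecast", "weather_feed", datetime.now(), record)
    return None
```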

RTES processes live data, i.e. events, as they occur: combining each event with other valuable data or other events, deriving intelligence from the result, and deciding on an action to take. Sometimes knowledge of the event alone is sufficient to derive an insight and act. More often, additional data must be brought in to improve the intelligence, for example customer profile, transaction and sales history, channel interaction history, social activity history, or external data such as weather and traffic. RTES employs a data architecture strategy commonly referred to as data virtualization, which integrates disparate data sources in real time into a usable format while decoupling data processing from intelligence derivation.
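
The sketch below gives a flavor of that enrichment step; the connectors and their data are hypothetical stand-ins for a real data-virtualization layer:

```python
# Minimal sketch: enriching an event with context from disparate sources
# before any intelligence is derived from it. Connectors are hypothetical.
def enrich(event, connectors):
    """Combine the event with whatever each registered source knows about it."""
    context = {"event": event}
    for name, fetch in connectors.items():
        context[name] = fetch(event)   # each connector hides its own source format
    return context

connectors = {
    "profile": lambda e: {"customer_id": e["customer_id"], "tier": "gold"},
    "history": lambda e: [{"sku": "TR-100", "qty": 1}],            # recent transactions
    "weather": lambda e: {"forecast": "snowstorm", "region": "North-East"},
}

decision_input = enrich({"customer_id": 42, "action": "browse"}, connectors)
```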

To enable the derivation of intelligence, RTES makes it easy to combine different decision frameworks. For example, to implement a specific offer-optimization requirement, RTES enables the use of decision trees to determine a customer value score, clustering to segment customers based on their attributes, neural networks to assess purchase trends by customer segment, optimization to rank the most value-generating products, additional rules to further personalize offers to a specific customer, and CEP to augment offers based on external events such as weather and national events, all orchestrated seamlessly within one technology, i.e. RTES.
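
For intuition, here is a minimal sketch of that orchestration. Each function is a deliberately simplified stand-in for the corresponding decision framework; the names, scores and margins are hypothetical, not the RTES configuration:

```python
# Offer-optimization pipeline: each step stands in for one decision framework.
def value_score(customer):           # decision tree -> customer value score
    return 0.9 if customer["annual_spend"] > 1000 else 0.4

def segment(customer):               # clustering -> customer segment
    return "outdoor_enthusiast" if "tennis" in customer["interests"] else "general"

def trend(segment_name):             # neural network -> purchase trend for the segment
    return {"outdoor_enthusiast": ["rackets", "shoes"], "general": ["gift cards"]}[segment_name]

def rank_offers(products, score):    # optimization -> most value-generating products first
    margin = {"rackets": 30, "shoes": 12, "gift cards": 5}
    return sorted(products, key=lambda p: score * margin.get(p, 0), reverse=True)

def personalize(offers, customer):   # rules -> personalize to this specific customer
    return [o for o in offers if o not in customer.get("already_owns", [])]

def augment_for_weather(offers, event):   # CEP -> react to external events
    return offers + ["winter gear"] if event == "snowstorm_forecast" else offers

customer = {"annual_spend": 1500, "interests": ["tennis"], "already_owns": ["shoes"]}
offers = augment_for_weather(
    personalize(rank_offers(trend(segment(customer)), value_score(customer)), customer),
    "snowstorm_forecast")
print(offers)   # ['rackets', 'winter gear']
```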


Once actionable decisions are determined, RTES integrates them with business applications, processes and operations so that action can be taken in real time and business outcomes are affected; for example, presenting optimized, personalized offers that help a customer buy the products of his or her choice more easily, thereby meeting the business objective of increased product sales. RTES makes actionable decisions accessible by means of web services, messaging, direct WebSockets, database adapters and other custom adapters.
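
As a rough illustration of the web-service access path, here is a minimal Flask sketch; the endpoint, payload and decide() function are hypothetical, not RTES's actual API:

```python
# Minimal sketch: exposing a decision as a web service consumable by
# business applications. Endpoint and payload shape are hypothetical.
from flask import Flask, jsonify, request

app = Flask(__name__)

def decide(event: dict) -> dict:
    """Stand-in for an RTES decision service."""
    return {"customer_id": event.get("customer_id"), "offers": ["rackets", "winter gear"]}

@app.route("/decisions/offers", methods=["POST"])
def offers():
    event = request.get_json(force=True)   # event pushed by the calling application
    return jsonify(decide(event))          # actionable decision returned in real time

if __name__ == "__main__":
    app.run(port=8080)
```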


RTES-enabled machine-learning and AI-based predictive decision models can also learn and adapt based on feedback from the actions taken. Online predictive models learn from action feedback in real time, while offline predictive models learn in batch mode from feedback that is stored first and consumed later. Such feedback is typically captured anyway for other business purposes, such as enterprise data management and warehousing, and RTES can tap into those existing feedback channels without having to devise new ways of consuming feedback.
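
The contrast between online and offline learning can be sketched as follows; the model, its update rule and the feedback fields are hypothetical simplifications:

```python
# Minimal sketch: online vs. offline learning from action feedback.
class OnlineAcceptanceModel:
    """Updates its estimate of offer acceptance immediately as feedback arrives."""
    def __init__(self, learning_rate=0.05):
        self.rate = {}            # offer -> estimated acceptance probability
        self.lr = learning_rate

    def learn(self, offer, accepted: bool):
        current = self.rate.get(offer, 0.5)
        self.rate[offer] = current + self.lr * ((1.0 if accepted else 0.0) - current)

def offline_retrain(feedback_batch):
    """Batch counterpart: consume feedback stored earlier (e.g. in a warehouse)."""
    accepted = sum(1 for f in feedback_batch if f["accepted"])
    return accepted / max(len(feedback_batch), 1)

model = OnlineAcceptanceModel()
model.learn("winter gear", accepted=True)   # real-time feedback from the action taken
```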

How does Entrigna engage clients?

Below are the high-level steps Entrigna typically follows when initiating a client engagement.

Working collaboratively, Entrigna would engage with the client by listening to the client's requirements for leveraging real-time intelligence, paying close attention to business goals. This step is essential, all the more so because the concept of real-time intelligence is relatively new; as a product and services provider, Entrigna must refine the client's ideas, dotting the i's and crossing the t's, and in the process help the client realize far more business value from real-time insights than initially anticipated. This includes capturing what the client presumed were possible solutions versus what they assumed were infeasible, impractical, anecdotal or merely imaginative, ideas they thought could not be implemented at all because the technologies they knew of lacked the required capabilities. This step is, of course, preliminary, with the understanding that Entrigna would help uncover further cases and opportunities for real-time intelligence during the ensuing implementation.

Once the client's initial requirements are discovered, the next natural step is to thoroughly understand two important aspects of the business: 1] the business processes, applications and operations where real-time, insight-driven actions would be implemented, and 2] the data: the types of data, the potential sources of existing data, and the sources of new data that would come into play in mining for intelligence.


The next step is more involved: hypotheses for different intelligence scenarios are given shape in the form of algorithmic models. Entrigna would apply elements of data science to the client's specific needs. This is very much a collaborative step in which the client's subject matter experts work hand in hand with Entrigna to vet the intelligence models. Entrigna would quickly prototype more than one model, trained, tuned and tested; the differences between models typically lie in how the underlying decision frameworks are mixed and matched. Entrigna would then share and review the model results to verify that what the models predict is close to, or exceeds, what the client anticipated. If the accuracy of the results needs to improve, Entrigna would either further fine-tune the underlying decision algorithm or replace it with a more advanced one, ensuring that any such algorithm is applied within the mathematical feasibility boundaries set by data science.
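
A minimal sketch of that comparison step, with hypothetical candidate models and metrics, might look like this:

```python
# Minimal sketch: comparing prototype models against the client's expectation
# before selecting one or tuning further. Names and numbers are hypothetical.
candidates = {
    "rules_plus_decision_tree": {"accuracy": 0.81},
    "clustering_plus_neural_net": {"accuracy": 0.88},
}
anticipated_accuracy = 0.85   # client's expectation for this use case

for name, result in candidates.items():
    verdict = ("meets expectation" if result["accuracy"] >= anticipated_accuracy
               else "needs further tuning or a different algorithm")
    print(f"{name}: {result['accuracy']:.2f} -> {verdict}")
```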

Finally, once the models are finalized, Entrigna would determine how the decisions get integrated into and consumed by downstream applications, processes or operations, exposing the models as intelligent decision services with appropriate access mechanisms such as web services, messaging and so on.

Tell us if this blog helped and please do share your comments!!
