Evolution of data analytics

Analytics, or analytic studies, have undergone the usual evolution that all such knowledge artefacts do.

Businesses needed analysis: analysing the data they acquire or capture in their operations, along their growth or process, and at special moments demanding thorough examination. When these examinations required more detailed treatment, the onus shifted to quantification of values. Quantification gives rise to comparing the value of one operation with similar operations by others, at different times, or at different points of occurrence. Thus business analysis became more of a data analysis, where the data typically needed to be numbers, values and amounts that are finite but highly variable and, in most cases, not discrete but continuous.

Data analytics thus ended up being a hard-core number-crunching operation: examining aggregations, averages, variation from some prescribed value, and the clustering or sparseness of data. Mathematical and statistical operations had to be calculated, and to put this to use in commercial and professional field operations, everything from the basics to the advanced had to be ready at hand at any instant. With the increase in complexity and in the volume, type, dimensions and nature of data, the complexity of these operations multiplied exponentially.
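As a minimal sketch of the kind of number crunching involved, the snippet below (plain Python, using an illustrative list of sales figures and a hypothetical prescribed target) computes an aggregate, an average, the spread, and the variation from a prescribed value:

```python
from statistics import mean, pstdev

# Illustrative monthly sales figures and a hypothetical prescribed target
sales = [120.5, 98.0, 143.2, 110.7, 131.9]
prescribed_target = 125.0

total = sum(sales)                                   # aggregation
average = mean(sales)                                # averaging
spread = pstdev(sales)                               # dispersion of the data
deviations = [v - prescribed_target for v in sales]  # variation from the prescribed value

print(f"total={total:.1f}, average={average:.1f}, spread={spread:.1f}")
print("deviations from target:", [round(d, 1) for d in deviations])
```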

Data analytics underwent changes in other aspects too. The process of data analysis, and the human inquisitiveness about data and metrics, never had any bounds and will never diminish. The marginal demand for answers, relative to any amount of extra knowledge earned, increases exponentially, and that is exactly what has occurred. The instantaneity of the demand, along with the burgeoning complexity, the explosion of captured data and the ease of data-capturing technology, has made data analytics impossible through manual operations.

Data analytics has been, is, and will always be a mathematical and statistical operation projected onto data elements; this we knew, and know. But data is time-dependent and finite too! Every data element has an inherent time function associated with it and can never be considered everlasting. This, of course, is what we mean when by data we mean the data value. Complexity arises when we start considering metadata, data about data, or the various levels of abstraction of data as data in their own capacity. Metadata may change over time, and then there arises an exigency to create the next level of metadata to describe it. Every level of metadata is an abstraction of the next lower level of data, which in turn serves as the value for the level above; a typical metadata-data pair can therefore be repeated with as many re-iterations across the abstraction-level scale as one wishes. There should be some kind of structure that can dynamically hold these different levels of combination. This is the challenge of dynamic data analytics.
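A minimal sketch of a structure that could hold such value-metadata pairs across arbitrary abstraction levels (the class and field names here are hypothetical, not drawn from any particular tool) might look like this:

```python
from dataclasses import dataclass
from typing import Any, Optional

@dataclass
class DataNode:
    """One level of the abstraction scale: a value plus the metadata describing it.

    The metadata is itself a DataNode, so the value-metadata pairing can be
    re-iterated as many levels up the abstraction scale as one wishes.
    """
    value: Any
    captured_at: str                       # the time function tied to every data element
    metadata: Optional["DataNode"] = None  # next level of abstraction, if any

    def levels(self) -> int:
        """Count how many abstraction levels sit above this value."""
        return 0 if self.metadata is None else 1 + self.metadata.levels()

# A raw reading, described by column-level metadata, described in turn by schema-level metadata
reading = DataNode(
    value=42.7,
    captured_at="2016-03-01",
    metadata=DataNode(
        value={"name": "temperature", "unit": "C"},
        captured_at="2016-01-15",
        metadata=DataNode(value={"schema_version": 3}, captured_at="2015-12-01"),
    ),
)
print(reading.levels())  # -> 2
```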

In addition to this, there is the complexity of data as a process on the historical and time scale. A data set spanning different event time scales reveals one kind of information. A data set of a particular time can be operated upon by the same metadata set and rule set at different points, revealing meaning on a horizontal scale. Across different event scales, a different set of metadata might be needed to reveal meaning; this is the vertical analysis scale, where on a vertical time scale sets of rolled-up data can be brought to examination. This is done through aggregation, averaging and other mathematical and statistical techniques. The vertical study can be done in technology through a different kind of relational algebra, called column-based algebra in database nomenclature. Rolling up data from a horizontal event plane and then comparing it on the vertical time scale, or rolling it down along the reverse track, is the technique that data analytics tools have been performing for a while now. But the exigency drives us even further and deeper.
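A minimal sketch of such a roll-up, aggregating event-level rows up to a coarser time scale (the field names and the monthly grain are illustrative assumptions), is shown below:

```python
from collections import defaultdict

# Event-level rows on the horizontal plane: (event_date, amount)
events = [
    ("2016-01-05", 10.0), ("2016-01-20", 12.5),
    ("2016-02-03", 7.0),  ("2016-02-28", 9.5),
    ("2016-03-14", 11.0),
]

# Roll up to the vertical time scale: aggregate per month
rolled_up = defaultdict(float)
for event_date, amount in events:
    month = event_date[:7]      # "YYYY-MM" bucket on the vertical scale
    rolled_up[month] += amount  # aggregation, much as a columnar engine does per column

# Rolled-up values can now be compared month to month, or rolled back down
for month, total in sorted(rolled_up.items()):
    print(month, total)
```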

Previous data sets can be brought back to life and collated to produce a running trend study. Trend studies are not only time-dependent; they can be done over many other dimensions. Previous data sets that have lost their currency remain useful in aiding the current data to bring out deviations and to support review and study of the variations. This is now known as descriptive data analytics, and we have been using it extensively for a long time. The business analysis world is now a data analytics world, and no serious business can think of running without it.
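As a minimal sketch of that descriptive step, the snippet below collates a current period against a previous one (the quarterly figures are purely illustrative) to bring out the deviations:

```python
# Illustrative quarterly figures: a previous period that has lost its currency,
# and the current period it is collated with
previous = {"Q1": 100.0, "Q2": 110.0, "Q3": 105.0, "Q4": 120.0}
current  = {"Q1": 108.0, "Q2": 104.0, "Q3": 118.0, "Q4": 121.0}

# Deviation of the current values from the historical baseline, quarter by quarter
for quarter in previous:
    deviation = current[quarter] - previous[quarter]
    pct = 100.0 * deviation / previous[quarter]
    print(f"{quarter}: deviation {deviation:+.1f} ({pct:+.1f}%)")
```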

BEYOND:
The exigencies have always been felt before the popularization. Predictive analysis uses all kinds of statistical techniques to predict, by following some inherent trend, what the value could be at a future time, event or moment [not necessarily on the time scale]. With this, probabilistic and even stochastic operations come into play.
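A minimal sketch of such a prediction, fitting a simple least-squares trend to past values and extrapolating it one step forward (the series and the linear-trend assumption are illustrative):

```python
from statistics import mean

# Past observations indexed by period (the inherent trend lives in these values)
history = [102.0, 107.5, 111.0, 116.5, 121.0]
periods = list(range(len(history)))

# Ordinary least-squares fit of value = intercept + slope * period
x_bar, y_bar = mean(periods), mean(history)
slope = sum((x - x_bar) * (y - y_bar) for x, y in zip(periods, history)) / \
        sum((x - x_bar) ** 2 for x in periods)
intercept = y_bar - slope * x_bar

# Predicted value at the next period: heavily biased by the previous values, as noted above
next_period = len(history)
prediction = intercept + slope * next_period
print(f"trend slope {slope:.2f}, predicted value for period {next_period}: {prediction:.1f}")
```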

Still, the value would be heavily biased towards the previous values and the trend set by them. If predictive analysis can be described as the probabilistic, or most likely, study of the present or near-future times, then we have some more allusions to cater for.
Any probabilistic value is a speculation. Speculations can be more logical and rational, which means more in line with the previous values. But no such speculation is ever a certainty, and every such uncertainty must be, and almost always is, associated with multiple alternatives.

How does one vet or judge the alternatives? What is the best alternative, and with what “cost” associated? Which alternative, although it might be less than the best, may give much better benefit at future points? Which track of alternative paths should a decision maker choose to reach the objective with the least resources, the least time and the highest return? There are plenty of such questions to be answered in choosing among many alternative solutions, or the optimum solution from a solution set, where the optimum is delimited by different conditions and constraints: constraints that were there from the beginning, constraints that came along the way, and constraints that undergo changes themselves. Thus the optimum solution set depends on various dimensions, and this kind of complex multi-dimensional study can be solved only by what we now call prescriptive data analytics.
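A minimal sketch of that prescriptive choice, scoring hypothetical alternative paths against resource and time constraints and picking the one with the highest return (the alternatives, costs and limits are all invented for illustration):

```python
# Hypothetical alternative paths, each with its resource cost, time cost and expected return
alternatives = [
    {"name": "path A", "resource": 40, "time": 10, "return": 55.0},
    {"name": "path B", "resource": 25, "time": 14, "return": 48.0},
    {"name": "path C", "resource": 60, "time": 6,  "return": 70.0},
]

# Constraints: some there from the beginning, some that may change along the way
max_resource, max_time = 50, 12

feasible = [a for a in alternatives
            if a["resource"] <= max_resource and a["time"] <= max_time]

# The objective function: highest return among the feasible alternatives
best = max(feasible, key=lambda a: a["return"])
print("chosen:", best["name"], "with return", best["return"])
```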

Prescriptive data analytics is a hyper-cube operation in its own right!
One might be inclined to think that unless a full-fledged ERP and a complete data warehouse are finished, one cannot embark on this journey. In reality, the ERP-EDW path has little congruence with, or dependence on, the data analytics path. They can and should run in parallel. Just as the ERP-EDW path is evolutionary and can never be stopped, so is the journey track of data analytics; it evolves too! The evolution depends more on time, data volume, rule complexity, condition dynamics and constraint dynamics, that is, the dynamics of the change of the objective function.

The question of elasticity comes into its own significance here. The demand elasticity of meaning with respect to data explosion, rule increments, change management, diversification and various other factors calls for self-service, resource-elastic, easy-to-handle, fast-changing and fast-rendering depictions. This is the need of the day, brought in by the three main stages of DATA ANALYTICS: Descriptive, Predictive and Prescriptive!
