
Does data analytics matter?

An interview with Simon Gazikian

Vice President of Sierrabolics

Simon, what is your view on this subject?

That depends on how you define “data analytics”. In these early days of data capture and analysis, “data analytics” can mean a huge range of things, from simple presentation graphs to sophisticated prediction apparatus. In a practical sense, “data analytics” is perhaps best defined as “the ability to derive ROI from information”.

How would you quantify ROI from information?

The ROI obtained from information can be described in much the same way as ROI from any other investment of resources: in terms of cost and return. Costs can be measured in the traditional ways: expenditure of capital, investment of intellectual property, investment of time, investment of other finite resources. Return can ultimately be measured in traditional ways as well: the derived value of the final product. Value varies from use case to use case. In many business use cases, value is the incremental revenue produced as a result of the activity; in others it may be the attainment of an objective, such as solving a specific problem. In all cases, it can ultimately be reduced to the traditional measure used to assess ROI in non-analytic activities.
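To make the arithmetic concrete, here is a minimal Python sketch of that accounting; the cost categories and figures are hypothetical, chosen only to illustrate the calculation, not drawn from any real engagement.

    # Minimal ROI-from-information sketch; all figures are hypothetical.
    def roi(derived_value, total_cost):
        """Classic ROI: net gain relative to what was invested."""
        return (derived_value - total_cost) / total_cost

    # Costs, measured in the traditional ways (capital, time, resources).
    costs = {"capital": 50_000, "staff_time": 30_000, "compute": 20_000}
    total_cost = sum(costs.values())  # 100,000

    # Return: derived value of the final product, e.g. incremental revenue.
    incremental_revenue = 250_000

    print(f"ROI: {roi(incremental_revenue, total_cost):.0%}")  # ROI: 150%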

So can you tell us what factors drive the magnitude of ROI from information?

That depends on how the analysis of data is to be employed. Many data analysis efforts put a great amount of time and effort into the presentation and consideration of data and its attributes. But when the use cases are examined in detail, the desired end product is invariably to identify, apply, or guide a course of action. With that in mind, if data analysis is viewed as a cognitive system, it takes data and an objective as input, and it produces a directive for action as output. ROI then becomes a function of what goes into that cognitive system in terms of time, cost, and resources, and of the quality of its output. (“Cognitive system” is used here hypothetically and symbolically: for the purposes of this document it represents an abstract analysis tool, with no visibility into its function and internals. It is not a claim that any given tool actually functions as a cognitive system.)
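As a rough illustration of that abstraction, here is a hypothetical Python sketch; the names and shape are invented for this document and do not describe any particular product.

    # Hypothetical sketch of the "cognitive system" abstraction:
    # data + an objective go in, a directive for action comes out.
    from abc import ABC, abstractmethod
    from dataclasses import dataclass

    @dataclass
    class Directive:
        action: str  # the output: a concrete course of action

    class CognitiveSystem(ABC):
        """Abstract stand-in for any analysis tool; internals stay opaque."""

        @abstractmethod
        def analyze(self, data, objective: str) -> Directive:
            """ROI is then a function of what this call consumes
            (time/cost/resources) and the quality of the Directive."""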

So what do the time/cost/resource needs of this “cognitive system” and the quality of its output mean in real, practical terms?

The first-order issue is the quality of the output. If the quality of the output doesn’t lead to high-ROI actions, the rest of the cost equation doesn’t matter: the objective of data analysis cannot be achieved regardless of costs.

The second-order issue is the time/cost/resource requirements of a given analysis “cognitive system”. These govern several things.

First are the costs, which directly and heavily impact the effective ROI. If these costs are large, the return generated by the output must be many multiples of those costs to clear an acceptable minimum ROI and deliver value. That makes a successful data analysis effort much harder to achieve, and it reduces the number of opportunities that can be exploited through data analysis, because growing analysis costs raise the bar for acceptable outputs.
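To illustrate that bar-raising effect, a short sketch with hypothetical numbers:

    # How large analysis costs raise the bar; figures are hypothetical.
    min_acceptable_roi = 1.0  # require at least a 100% return on investment

    def required_return(cost, min_roi=min_acceptable_roi):
        """Smallest return the output must generate to clear the minimum ROI."""
        return cost * (1 + min_roi)

    print(required_return(10_000))  # 20,000: a cheap process clears a low bar
    print(required_return(50_000))  # 100,000: a costly process must earn far more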

Second are the resource requirements. If the cognitive system has sophisticated requirements, its application is limited by the availability of those resources (e.g. computing hardware, domain expertise, analysis/statistics expertise). In these cases, even if the costs are not an issue and the output is of sufficient quality, it may still not be possible to scale the process to all potential opportunities.

The last is time. In many use cases, time is critical. In business use cases, the speed of reacting to changing dynamics can be the defining differentiator between competing entities. In some cases, the value of data diminishes quickly over time; if it is not possible to extract and exploit that value before it expires, the data analysis effort cannot succeed. Time requirements also impose a limit on throughput: a process that takes less time increases both the number of opportunities that can be explored and exploited, and the depth with which they can be explored.

So how do you assess a given tool with respect to these considerations?

There are many tools and options available for data analysis efforts, providing different levels of features and capabilities. Some focus on numerics, some on visualizations. It is easy to become distracted by the number of features, tools, and workflows. It is important to keep in mind that, for the cognitive system concept discussed above, the ideal solution would be one where data and an objective are supplied as input and a high quality output is produced instantly, with no further intervention or expertise required of the operator. Ideally, such a solution could even be employed in a fully autonomous manner, with no operator at all.

This makes it clear that, assuming a tool generates a quality output, maximizing ROI means actively and aggressively minimizing the analysis process itself. Anything that doesn’t aggressively minimize time, costs, and resources is working against data analysis ROI.

How would you define a “quality output”?

A “quality output” must meet two important criteria.

The first is that it is actionable. The output should directly and immediately give you the information needed to decide on actions, without any further consideration or analysis. For example, if the goal is to make trading decisions in a financial market, the output should not consist of graphs, summary statistics, price predictions, and the like; it should unequivocally state a trading action such as buy or sell. If it does not, it is not a complete solution: it leaves part of the process, arguably the most crucial part, to the user/operator, who must interpret the intermediate data, form their own conclusions, and decide upon a course of action. That is a failing of the analysis tool.
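A hypothetical Python sketch of the contrast; the moving-average rule below is invented purely for illustration and is not a real trading strategy.

    # Contrast: intermediate output vs. actionable output (hypothetical data).
    prices = [101.0, 102.5, 101.8, 103.2, 104.0]

    # Incomplete output: summary statistics the operator must still interpret.
    summary = {"last": prices[-1], "mean": sum(prices) / len(prices)}

    # Actionable output: an unequivocal trading action.
    def trading_directive(prices, window=3):
        recent_avg = sum(prices[-window:]) / window
        return "buy" if prices[-1] > recent_avg else "sell"

    print(summary)                    # leaves the decision to the user
    print(trading_directive(prices))  # states the action directly: buy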

The second criterion is that it is sufficiently accurate to yield benefit beyond the status quo (we can define “status quo” as what would happen with no data analysis effort). As an example, consider a quality control process where data analysis is intended to increase production quality and manufacturing yields by reducing defective units. The status quo may be that 10 out of 100 units produced are defective. If the data analysis effort reduces this to 5 out of 100, then while not a perfect solution, it cuts the defect rate by 50%. Such a result meets the criterion of yielding a benefit beyond the status quo.
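Using the figures from that example, the benefit-beyond-status-quo test reduces to a short calculation:

    # Benefit-beyond-status-quo check, using the defect figures above.
    status_quo_rate = 10 / 100     # defect rate with no data analysis effort
    with_analysis_rate = 5 / 100   # defect rate after acting on the output

    reduction = (status_quo_rate - with_analysis_rate) / status_quo_rate
    print(f"Defect rate reduced by {reduction:.0%}")  # 50%
    print("Benefit beyond status quo?", with_analysis_rate < status_quo_rate)  # True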

What kinds of tools and technologies embody these essential traits?

The first consideration is whether a given toolset provides the end-to-end cognitive system described above: data + objective go in -> a quality output comes out. As mentioned in response to the last question, if a tool does not provide that end-to-end function of data in/output out, it is not a complete solution. At best it is suboptimal.

So Simon, with all of the above, why does Databolics, your flagship product, matter?

Given the above considerations, Databolics occupies a unique and high-value position: it has a unique and exclusive core analysis engine, it provides high quality outputs for a large set of use cases, and it actively minimizes the time, cost, and resources of the data analysis “cognitive system”. Despite the myriad of analytics solutions available today, few if any match Databolics on these traits essential for high-ROI analytics. It is hard to reach any conclusion other than: “Yes. Databolics matters. Indeed, it matters a lot.”

May 28, 2017