IT expenditures are forecast to grow due to the business speculation behind the buzzword BigData. Therefore, in order to avoid biased insights, some questions arise: how can we measure the quality of the extracted information? How can we get more value from BigData while avoiding useless expenditures? Are the insights provided by business intelligence, analytics, and prediction tools really reliable?
A meaningful example of the speculation behind BigData is the case of sentiment analyzers that process tweets and posts: are they a BigData bubble? (see also My Issue with BigData Sentiment Bubble: Sorry, Which Is the Variance of the Noise?)
As a matter of fact, insights and predictions are the final results of a transformation process (see figure) in which input data is processed by an algorithm. A clear and exhaustive explanation of the process behind any business intelligence tool is the pyramid of data science, which identifies five stages: 1) data gathering/selection, 2) data cleaning/integration/storage, 3) feature extraction, 4) knowledge extraction, and 5) visualization (see The Pyramid of Data Science).
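The five stages above can be sketched as a minimal pipeline. This is only an illustrative toy (the function names, the sample tweets, and the word-counting "sentiment" logic are all assumptions, not part of any specific tool):

```python
# Toy sketch of the five-stage pyramid of data science.
# All function names, fields, and sample data are illustrative assumptions.

raw_tweets = [
    {"text": "I love this product!", "lang": "en"},
    {"text": "", "lang": "en"},            # empty record, to be cleaned out
    {"text": "Terrible service...", "lang": "en"},
]

def gather(records):                        # 1) data gathering/selection
    return [r for r in records if r["lang"] == "en"]

def clean(records):                         # 2) data cleaning/integration/storage
    return [r for r in records if r["text"].strip()]

def extract_features(records):              # 3) feature extraction
    positive = {"love", "great", "good"}
    return [{"text": r["text"],
             "positive_words": sum(w.strip("!.,").lower() in positive
                                   for w in r["text"].split())}
            for r in records]

def extract_knowledge(features):            # 4) knowledge extraction
    n = len(features)
    return {"share_positive": sum(f["positive_words"] > 0 for f in features) / n}

def visualize(insight):                     # 5) visualization
    return f"{insight['share_positive']:.0%} of tweets contain positive words"

insight = visualize(extract_knowledge(
    extract_features(clean(gather(raw_tweets)))))
print(insight)  # 50% of tweets contain positive words
```

Note how garbage entering at stage 1 or 2 would silently distort the final percentage: this is exactly why the quality of the raw material matters.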
Anyhow, as with the production of goods such as cars, clothes, or a good meal in a restaurant, high-quality results are ensured both by the quality of the raw materials and by the quality of the transformation process.
Thus, if the raw material for analytics is data, how can we assess the quality of the supply, i.e. the data? ACID compliance and pessimistic locking already define best practices for guaranteeing data quality in terms of data management, thereby reducing maintenance costs and improving efficiency.
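As a concrete illustration of pessimistic locking, here is a minimal sketch using Python's built-in sqlite3 module (the table and values are invented for the example). In SQLite, `BEGIN IMMEDIATE` acquires the write lock up front, so a concurrent writer is blocked before the read-modify-write begins rather than detected afterwards:

```python
import sqlite3

# Illustrative sketch of pessimistic locking with SQLite; the "stock" table
# and its contents are made-up example data.
conn = sqlite3.connect(":memory:", isolation_level=None)  # manage transactions manually
conn.execute("CREATE TABLE stock (item TEXT PRIMARY KEY, qty INTEGER)")
conn.execute("INSERT INTO stock VALUES ('widgets', 10)")

conn.execute("BEGIN IMMEDIATE")   # pessimistic: take the write lock before reading
qty = conn.execute("SELECT qty FROM stock WHERE item = 'widgets'").fetchone()[0]
if qty > 0:
    conn.execute("UPDATE stock SET qty = ? WHERE item = 'widgets'", (qty - 1,))
conn.execute("COMMIT")

remaining = conn.execute("SELECT qty FROM stock WHERE item = 'widgets'").fetchone()[0]
print(remaining)  # 9
```

The point of locking before reading is that no other transaction can change `qty` between the SELECT and the UPDATE, which is precisely the kind of data-management guarantee that keeps the raw material trustworthy.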
However, from a procurement point of view, how can we evaluate the quality of the supply within a procurement process for sourcing data? Similarly, how can we evaluate the quality of the process that transforms the supply (data) into insights and valuable information?
As with the production of goods, a well-defined procurement process is ensured through well-written specification documents where requirements and key parameters/characteristics are clearly stated. Needless to say, a well-defined procurement process will ensure the quality of the supply, and the quality of the supply will ensure the quality of the final product/service. In this case, it will ensure the quality of the insights.
Undoubtedly, the huge amount of data available nowadays and new technological advances are generating more business opportunities than in the past, both to improve process efficiency and to define new business models (see Firm Infrastructure Vs Catalyst to Change Business Models: the Double Side of IT (Change Management)).
Thus, clearly defining which KPIs to look for and negotiating the aspects that really matter will ensure the best IT analytics services, both in terms of the opportunities to exploit thanks to the insights provided and in terms of cost savings.
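To make this less abstract, here is a minimal sketch of two data-quality KPIs that could be written into a data-sourcing specification. The KPI names, field names, and sample records are illustrative assumptions, not an established standard:

```python
# Sketch of simple data-quality KPIs for a data-sourcing contract.
# Field names, sample records, and KPI choices are illustrative assumptions.

records = [
    {"user": "alice", "rating": 5},
    {"user": "bob",   "rating": None},   # missing value
    {"user": "alice", "rating": 5},      # duplicate record
]

def completeness(rows, field):
    """Share of rows where the given field is present and non-null."""
    return sum(r.get(field) is not None for r in rows) / len(rows)

def uniqueness(rows):
    """Share of rows that are distinct."""
    distinct = {tuple(sorted(r.items())) for r in rows}
    return len(distinct) / len(rows)

kpis = {"completeness": completeness(records, "rating"),
        "uniqueness": uniqueness(records)}
print(kpis)  # both KPIs are 2/3 for this sample
```

A supplier contract could then state thresholds (e.g. completeness above some agreed level) as the "key parameters/characteristics" of the specification document mentioned above.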
As an example, if parameters for evaluating the quality of the data and of the analytics are not yet available, I would rather go to a restaurant where I know the quality of the suppliers than rely on reviews and advertisements based on questionable data suppliers, such as fake TripAdvisor users (see Tripadvisor: a case study to think why bigdata variety matters).
Being aware that ignoring BigData opportunities also means ignoring better restaurants with delicious meals and low prices, the companies that best define a procurement process for data sourcing will enjoy the best meal in the BigData restaurant.
Feelink – Feel and think approach for doing life!