Despite big data offering companies around the world the opportunity to gain invaluable insights into their industries and customers, many firms continue to struggle with the sheer volume of information available. The problems range from preventing bad data from entering data stores to ensuring that data flows continuously in support of operational efficiency. This means there is still a great deal of work to be done if firms are to extract every scrap of value from the data they collect.
In new research, StreamSets, the data performance management company, found that almost 90 per cent of the businesses surveyed still had bad data flowing into their stores, while only 12 per cent believed that their data flow performance management operates well. One of the most heavily cited issues in big data management is the quality of the information itself, with corrupted, misleading or inaccurate data causing problems. Even among firms committed to cleansing data throughout its lifecycle, bad data still ends up in storage. In addition, although a majority of respondents highlighted the ability to detect diverging data values as valuable, just 34 per cent of firms rated themselves capable of the task.
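To make the idea of "detecting diverging data values" concrete, here is a minimal sketch of the kind of check such a capability involves: comparing an incoming batch of values against a known baseline and flagging batches that drift too far. The function name, threshold and sample data are hypothetical illustrations, not anything from StreamSets or the report.

```python
# Minimal sketch: flag a batch whose values diverge from a baseline.
# All names, thresholds and data below are hypothetical examples.
from statistics import mean, stdev

def diverges(baseline, batch, z_threshold=3.0):
    """Return True if the batch mean sits more than z_threshold
    standard deviations away from the baseline mean."""
    mu, sigma = mean(baseline), stdev(baseline)
    if sigma == 0:
        return mean(batch) != mu
    z = abs(mean(batch) - mu) / sigma
    return z > z_threshold

baseline = [10, 11, 9, 10, 12, 10, 11]
print(diverges(baseline, [10, 11, 10]))    # similar values -> False
print(diverges(baseline, [95, 100, 102]))  # clearly divergent -> True
```

Real data flow tooling would of course track many fields, distributions and time windows at once, but the underlying comparison is of this shape.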
The findings also highlighted specific weak links in big data management. For example, performance degradation and rising error rates were mentioned by 44 per cent of respondents. Overall, a clear trend emerged: there is a notable gap between the capabilities firms actually have and the value they place on each individual competency.
StreamSets Chief Executive Officer Girish Pancha said: “In today’s world of real-time analytics, data flows are the lifeblood of an enterprise. The industry has long been solely fixated on managing data at rest and this myopia creates a real risk for enterprises as they attempt to harness big and fast data. It is imperative that we shift our mindset towards building continuous data operations capabilities that are in tune with the time-sensitive, dynamic nature of today’s data.”
Finally, another issue highlighted in the report is the continuing problem of data drift. In fact, 85 per cent of survey respondents said that unexpected changes in data semantics and structure could have a substantial impact on operations. Because of this, 53 per cent of firms have to alter their data flow pipelines multiple times each month, with 23 per cent making changes on a weekly basis, if not more. Hand coding remains a popular way of designing pipelines, with firms also using data integration solutions and legacy ETL (Extract, Transform and Load) tools.
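The "data drift" described above is a structural problem: records stop matching the schema a pipeline expects. The hand-coded checks many firms rely on look something like the sketch below, which reports missing fields, type changes and unexpected new fields for a single record. The schema and field names are invented for illustration only.

```python
# Illustrative sketch of detecting data drift: a structural mismatch
# between the schema a pipeline expects and an incoming record.
# The expected schema and field names are hypothetical examples.

EXPECTED_SCHEMA = {"user_id": int, "amount": float, "currency": str}

def detect_drift(record, expected=EXPECTED_SCHEMA):
    """Return a list of human-readable drift findings for one record."""
    findings = []
    for field, ftype in expected.items():
        if field not in record:
            findings.append(f"missing field: {field}")
        elif not isinstance(record[field], ftype):
            findings.append(
                f"type change: {field} is {type(record[field]).__name__}, "
                f"expected {ftype.__name__}")
    for field in record:
        if field not in expected:
            findings.append(f"new field: {field}")
    return findings

ok = {"user_id": 1, "amount": 9.99, "currency": "GBP"}
drifted = {"user_id": "1", "amount": 9.99, "region": "EU"}
print(detect_drift(ok))       # []
print(detect_drift(drifted))  # type change, missing field, new field
```

Every time such a check fires, someone has to amend the pipeline, which is exactly the monthly-or-weekly maintenance burden the survey describes.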
Montash is a multi-award-winning global technology recruitment business, specialising in permanent and contract mid-to-senior appointments across a wide range of industry sectors and IT functions, including:
ERP Recruitment, BI & Data Recruitment, Information Security Recruitment, Enterprise Architecture & Strategy Recruitment, Energy Technology Recruitment, Demand IT and Business Engagement Recruitment, Digital and E-commerce Recruitment, Leadership Talent, Infrastructure and Service Delivery Recruitment, Project and Programme Delivery Recruitment.
Montash is headquartered in Old Street, London, in the heart of the technology hub. Montash has completed assignments in over 30 countries and has appointed technical professionals from board level to senior and mid management in permanent and contract roles.