Big data is a term many of us have heard, but few of us know what it actually refers to. According to Laurie Miles, head of analytics at big data specialists SAS, the term has been around for many years; the data it describes has always been "big", it is simply growing ever larger. The data explosion is the drastic increase in the quantity of data and information being published or made available to view.
Traditionally, data was organized in databases; the internet made it all digital. Everything we do can now be recorded, translated, transmitted, and analyzed, by gadgets ranging from pocket-sized smartphones to super-intelligent computers, from smart home appliances to advanced CCTV cameras. It is the sheer volume of this data that is expected to produce a data explosion. To analyze that explosion and its causes, we must know how much data is being recorded and why the amount is growing so fast.
Around 90% of the data we have today is said to have been created in the past few years. According to IBM research, 2.5 billion gigabytes (GB) of data was generated daily in 2012, of which 75% was unstructured data in the form of text, audio, and video messages. The number of connected devices, i.e. smartphones, grew by approximately 70% between 2013 and 2017 and is still rising. This data is either stored in clouds offered by many providers or in open-source platforms such as Hadoop, Cassandra, and other NoSQL databases.
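As a quick sanity check on those figures, the unstructured share works out as follows. This is a minimal sketch using only the estimates quoted above, not fresh data:

```python
# Figures from the IBM estimate quoted above (2012).
daily_total_gb = 2.5e9      # 2.5 billion GB generated per day
unstructured_share = 0.75   # 75% of it is unstructured (text, audio, video)

unstructured_gb = daily_total_gb * unstructured_share
print(f"Unstructured data per day: {unstructured_gb / 1e9:.3f} billion GB")
# → Unstructured data per day: 1.875 billion GB
```

In other words, nearly 1.9 billion GB of the daily total was text, audio, and video rather than neatly structured database records.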
Cisco estimated that the information transferred through the internet in 2013 reached an annual record of 667 exabytes (EB). Early computers measured storage in kilobytes (KB); the latest storage capacities are expected to reach one yottabyte (YB), a scale that is hard to imagine. It is predicted that data production in 2020 will be 44 times what it was in 2009, and that a third of all data will live on the cloud. Google is the biggest contributor, processing 3.5 billion requests each day and storing 10 exabytes (EB) daily, that is, 10 billion gigabytes (GB). Facebook handles 2.5 billion pieces of content, 2.7 billion likes, and 300 million photo uploads, which together put more than 500 terabytes (TB) of data online.
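The byte units mentioned here, from KB up to YB, each scale by a factor of 1,000 in decimal SI terms. A minimal Python sketch, using only the estimates quoted above, makes the Google conversion concrete:

```python
# Decimal (SI) byte units: each step up is a factor of 1000.
UNITS = ["B", "KB", "MB", "GB", "TB", "PB", "EB", "ZB", "YB"]

def to_bytes(value: float, unit: str) -> float:
    """Convert a value in the given unit to bytes."""
    return value * 1000 ** UNITS.index(unit)

# Check: 10 exabytes expressed in gigabytes.
gb = to_bytes(10, "EB") / to_bytes(1, "GB")
print(f"10 EB = {gb:.2e} GB")
# → 10 EB = 1.00e+10 GB, i.e. 10 billion GB
```

The same helper shows why a yottabyte is so hard to picture: it is a further million-fold jump beyond the exabyte scale that Google and Cisco already operate at.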
Handling this much data is not easy, and many companies are working to make it secure. An IDC report valued the big data market at US $59 million, a figure expected to reach US $102 million by 2019. The growth of big data raises many questions: where will it end? Will it face an explosion? What is the limit of this data? Probably no one can answer that. Big data is clearly a hot topic, and it is attracting so much attention for justified reasons. All of these reasons point to one thing: management. People, organizations, setups, systems, everything that turns raw data into meaningful information, must be administered.