Monday 19 January 2015

Mastering small data: How big is big data?

There is a need to adequately master “SMALL DATA” design, structures, standards and functions if we want to effectively and sustainably handle the challenges of Big Data. Currently, our national database structures are perceived to be weak and vulnerable!

The hype in and around the concept and acclaimed deliverables of Big Data is so overwhelmingly tempting that it currently blinds developing countries and economies as they plan how to march forward into the 21st-century Information Technology ecosystem.

However, that does not in any way remove or diminish the fundamental need for, and importance of, appropriate and secure storage of data in local data centres. Thanks to Mainone, which has professionally pointed the right way forward by establishing a formidable, globally certified Tier III data centre in Nigeria.
 

Reliable Internet resources have revealed that the world’s technological per-capita capacity to store information has roughly doubled every 40 months since the 1980s. As of 2012, 2.5 exabytes (2.5×10¹⁸ bytes) of data were created every day. As of 2014, 2.3 zettabytes (2.3×10²¹ bytes) of data were created every day by super-power high-tech corporations worldwide. As far as the available data can lead us, no Nigerian corporate enterprise is currently in that Big Data league!
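To put that doubling rate in perspective, here is a minimal Python sketch of how a 40-month doubling period compounds. The 28-year window and the unit baseline are illustrative assumptions, not measured figures:

```python
# Illustrative only: compound a 40-month doubling period over ~three decades.
# The baseline of 1.0 and the 28-year window are hypothetical placeholders.

DOUBLING_PERIOD_MONTHS = 40

def capacity(baseline: float, months_elapsed: float) -> float:
    """Storage capacity after `months_elapsed`, doubling every 40 months."""
    return baseline * 2 ** (months_elapsed / DOUBLING_PERIOD_MONTHS)

# 28 years = 336 months -> 336/40 = 8.4 doublings, roughly a 338-fold increase.
growth_factor = capacity(1.0, 28 * 12)
print(f"Growth over 28 years: ~{growth_factor:.0f}x")
```

That is the quiet arithmetic behind the exabyte and zettabyte figures above: a modest-sounding doubling period turns into hundreds-fold growth within a working generation.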

The convergence in the global ICT ecosystem has revealed the importance and strength of embedded systems. This phenomenal trend has led to what is now known as Softwareization. Due to this complex trend of “Softwareization of Things” (SoT), every nation is looking inwards for strategies to adequately address the challenges and deliver the solutions required to respond to current and emerging demands in ICT. Nigeria must rethink her IT strategy, organize her information technology ecosystem and master the design, processing, retrieval and storage of SMALL DATA as a reliable gateway, veritable tool and migration strategy towards Big Data.

Simply defined, Softwareization is the Internetization of the convergence of Information Technology, Telecommunications and Broadcasting. These converged technologies have led to the monumental inflation of content on the Internet and compelled mankind to migrate from IPv4 to IPv6, giving birth to the “Internet of Things” (IoT), where every connectable device will acquire a recognized and secured IP address.
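A quick sketch using Python’s standard-library ipaddress module makes the scale of that migration concrete: IPv4 offers about 4.3 billion addresses in total, while IPv6 offers roughly 3.4×10³⁸, which is why every device can have its own address:

```python
# Compare the IPv4 and IPv6 address spaces using only the standard library.
import ipaddress

ipv4_space = ipaddress.ip_network("0.0.0.0/0").num_addresses  # 2**32
ipv6_space = ipaddress.ip_network("::/0").num_addresses       # 2**128

print(f"IPv4 addresses: {ipv4_space:,}")    # about 4.3 billion
print(f"IPv6 addresses: {ipv6_space:.3e}")  # about 3.4e38
print(f"IPv6 is {ipv6_space // ipv4_space:,} times larger")
```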

The power and significance of software as the blood that flows through the digital world becomes more evident by the day. Today it represents the blanket with which we wrap the earthly cold of our planet, keeping it warm for sustainable development.

Big data is “massively parallel software running on tens, hundreds, or even thousands of servers”. It is an all-encompassing term for collections of data sets so large and complex that they become difficult to process using traditional data-processing applications.
The challenges include analysis, capture, curation, search, sharing, storage, transfer, visualization and information privacy.
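As a toy illustration of that “massively parallel” pattern, the Python sketch below runs a map-reduce word count across a handful of worker processes on one machine; real big-data systems such as Hadoop or Spark apply the same split-map-merge idea across whole server farms. The sample chunks are, of course, made up:

```python
# A single-machine toy of the map-reduce pattern behind "massively parallel"
# big-data processing: split the data, count in parallel, merge the results.
from collections import Counter
from multiprocessing import Pool

def map_count(chunk: str) -> Counter:
    """Map step: count the words in one chunk of the data."""
    return Counter(chunk.split())

def reduce_counts(counters: list) -> Counter:
    """Reduce step: merge the partial counts from every worker."""
    total = Counter()
    for c in counters:
        total.update(c)
    return total

if __name__ == "__main__":
    chunks = ["big data small data", "small data first", "master small data"]
    with Pool(processes=3) as pool:
        partials = pool.map(map_count, chunks)
    print(reduce_counts(partials).most_common(3))
```

The point of the pattern is that each worker only ever sees a small, manageable slice; the scale comes from how many workers you can coordinate, not from any single machine.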

The trend towards larger data sets is due to the additional information derivable from analysis of a single large set of related data, as compared to separate smaller sets with the same total amount of data; it allows correlations to be found that spot business trends, prevent diseases, combat crime and so on. Global examples, according to Internet resources, include but are not limited to the following: Walmart handles more than 1 million customer transactions every hour, which are imported into databases estimated to contain more than 2.5 petabytes (2,560 terabytes) of data – the equivalent of 167 times the information contained in all the books in the US Library of Congress. Facebook handles 50 billion photos from its user base. The FICO Falcon Credit Card Fraud Detection System protects 2.1 billion active accounts worldwide.
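The following small sketch (with entirely made-up numbers) hints at why the single large, related set matters: two data sets held separately reveal little on their own, but joined on a shared key they expose a correlation a business could act on. It assumes Python 3.10+ for statistics.correlation:

```python
# Illustrative only (fabricated numbers): joining two separately-held data
# sets on a shared key exposes a correlation neither set shows alone.
from statistics import correlation  # Python 3.10+

# Data set A: daily temperature readings, keyed by day.
temps = {"d1": 22, "d2": 31, "d3": 27, "d4": 35, "d5": 24}
# Data set B: daily cold-drink sales, held by a different department.
sales = {"d1": 110, "d2": 205, "d3": 150, "d4": 260, "d5": 120}

# The "join": line both series up on the days they share.
common = sorted(temps.keys() & sales.keys())
x = [temps[d] for d in common]
y = [sales[d] for d in common]

print(f"Temperature vs. sales correlation: {correlation(x, y):.2f}")  # near 1.0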

The volume of business data worldwide, across all companies, doubles every 1.2 years, according to estimates. Windermere Real Estate uses anonymous GPS signals from nearly 100 million drivers to help new home buyers determine their typical drive times to and from work at various times of the day. Scientists regularly encounter limitations due to large data sets in many areas, including meteorology, genomics, connectomics, complex physics simulations, biological and environmental research, and e-Science in general.

Without any fear of contradiction, Big Data is critical and a very huge emerging market, towering in billions of dollars and deserving to be looked into by corporate giants in developing economies.
However, that adventure should not be plunged into blindly. There is a critical need to look inwards and restructure our national software architecture with formidable standards, ensuring that we build the local capacities, capabilities and smart skills that can conquer and master small data and effectively domesticate Big Data for global competitiveness.
