What Characterizes Big Data?

Big Data is a major buzzword these days, and there is a lot of hype around it. As the Big Data wave ushers in new data-driven paradigms, it is transforming the way data is used to extract insight at a granular level. Big Data analytics excels at analyzing such fine-grained data and turning it into actionable knowledge. Big Data involves networks of large, complex data sets that are difficult to manage with traditional IT systems. So what are the factors that characterize Big Data?

Dr. Demirhan Yenigan, Big Data expert and Professor of Analytics at GWU, opened a window on Big Data and its characteristics. Big Data consists of an immense amount of electronic data generated from the internet and its sources, including clicks, search patterns, preferences, videos, and social media platforms such as Facebook, YouTube, and Twitter. Data from all these sources travels at unprecedented velocity and needs to be stored, formatted, retrieved, and analyzed before insight can be drawn from it. Big Data is characterized by the following V's.

  • Volume (Scale of data): With Big Data, the volume is effectively unlimited and goes far beyond terabytes or petabytes. It is not possible to store and process such volumes using traditional data warehousing frameworks. However, technologies such as Hadoop, MapReduce, and Bigtable have eased this burden, as they can store and process massive amounts of data efficiently and cost-effectively.
  • Velocity (Speed of data): Velocity refers to the unprecedented speed at which data flows in from sources such as machines, networks, business processes, and social media. This data may be massive and continuous, generated by RFID tags, sensors, and smart meters, and can be processed in near real time. This helps businesses and researchers make strategic decisions for better ROI. Data velocity has three dimensions:
    • The Speed of Data: Data can be classified as data-in-motion or data-at-rest. For fast-moving data, consistency and completeness should be weighed when deciding whether to capture data at its arrival rate or with a lag time before producing analytic outputs.
    • The Lifespan of Data: It is important to understand how long the data remains valuable; based on this validity, data can be stored or discarded.
    • The Speed of Data Storage and Retrieval: The speed at which data is stored and retrieved is another dimension of data velocity; it determines the pace of Big Data storage, retrieval, and analysis.

The ability to capture, analyze and act on data in almost real time is where Real Time Big Data Analytics (RTBDA) is leveraged to provide a competitive edge.
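The MapReduce model mentioned under Volume can be illustrated with a minimal in-memory sketch. This toy word count only mimics the map and reduce phases of a real Hadoop job (which would distribute both phases across a cluster), and the sample documents are invented for illustration.

```python
from collections import defaultdict

def map_phase(documents):
    # Map: emit a (word, 1) pair for every word in every document.
    for doc in documents:
        for word in doc.lower().split():
            yield word, 1

def reduce_phase(pairs):
    # Reduce: sum the counts emitted for each distinct key.
    counts = defaultdict(int)
    for word, count in pairs:
        counts[word] += count
    return dict(counts)

docs = ["big data is big", "data flows fast"]
print(reduce_phase(map_phase(docs)))
# {'big': 2, 'data': 2, 'is': 1, 'flows': 1, 'fast': 1}
```

In a real framework the pairs emitted by the mappers are shuffled across machines so that all counts for one key reach the same reducer; the in-memory `defaultdict` here stands in for that shuffle step.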

  • Variety (Diversity of data): Variety refers to the range of structured and unstructured data formats from different sources. Data may arrive as video, audio, documents, emails, PDFs, social media posts, text messages, graphs, sensor readings, machine logs, cell phone records, GPS signals, and more. Big Data systems extract data from these sources and complete missing pieces of information through data fusion. The primary functions are:
    • Storing and retrieving data at a quick pace.
    • Processing, aligning and extracting varied data formats for combined analysis.

Formatting unstructured data imposes a definite structure, facilitating easy retrieval and further processing for in-depth analysis.
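As a rough illustration of aligning varied formats for combined analysis, the sketch below coerces records arriving as Python dicts, JSON strings, or CSV lines into one common schema. The `user`/`event` fields are an invented example schema, not a standard.

```python
import csv
import io
import json

def normalize(record):
    """Coerce a record from one of several formats into a common dict.

    The target schema (user, event) is a made-up example."""
    if isinstance(record, dict):       # already structured
        return {"user": record["user"], "event": record["event"]}
    text = record.strip()
    if text.startswith("{"):           # JSON-encoded string
        data = json.loads(text)
        return {"user": data["user"], "event": data["event"]}
    # Fall back: treat the string as a CSV line of the form "user,event".
    user, event = next(csv.reader(io.StringIO(text)))
    return {"user": user, "event": event}

raw = [
    {"user": "ada", "event": "click"},        # structured record
    '{"user": "bob", "event": "search"}',     # JSON string
    "cai,view",                               # CSV line
]
print([normalize(r) for r in raw])
```

Real pipelines face far messier inputs (audio, logs, free text), but the principle is the same: each source format is mapped onto a shared structure before combined analysis.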

  • Veracity (Certainty of data): Veracity describes the accuracy and reliability of data from a source. Biases, noise, and abnormalities in the data should be filtered out so that what is mined and stored is meaningful.
  • Value (Value of data): Big Data adds incremental value to your organization by allowing you to treat each customer individually in pursuit of the desired end goal. Big Data analytics stores and analyzes even the minutest details and adds value to your company by turning customer-centric data into business insight.
  • Variability (Consistency of data): Variability concerns the consistency of data, in terms of availability and reporting interval, so that the data actually portrays the reported events. When data contain peak values, statistical tests should determine whether a value is significant or merely noise.
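One common way to decide whether a peak value is significant or merely noise, as described under Variability, is a simple z-score test. The sketch below is a minimal illustration on made-up sensor readings, not a production data-quality check; the threshold of 2.0 standard deviations is an arbitrary choice for this small sample.

```python
import statistics

def flag_peaks(values, z_threshold):
    # Flag values whose z-score (distance from the mean in standard
    # deviations) exceeds the threshold as suspicious peaks.
    mean = statistics.mean(values)
    stdev = statistics.pstdev(values)
    if stdev == 0:
        return []  # all values identical: nothing stands out
    return [v for v in values if abs(v - mean) / stdev > z_threshold]

readings = [10, 11, 9, 10, 12, 10, 95]  # one suspicious spike
print(flag_peaks(readings, z_threshold=2.0))
# [95]
```

Whether a flagged value is a genuine event or a faulty reading still requires domain judgment; the statistical test only narrows down which values deserve scrutiny.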

Managing Big Data comes with a whole set of complexities, as data from multiple sources must be cleansed, matched, linked, and transformed to yield valuable insight. By ensuring proper hierarchies, relationships, and data linkages, the massive flow of data can be controlled and processed efficiently. With emerging technologies, you will soon be able to create, manipulate, and manage Big Data with far less effort.

When it comes to identifying opportunities in Big Data, Volume, Velocity, and Variety are the key characteristics to watch. New Big Data tools and techniques have revolutionized how unstructured data is handled and retrieved. Big Data analytics opens up opportunities at a high level, but each opportunity should be analyzed carefully so that it adds realistic business value to your organization.

For more information on Big Data Analytics and services contact us.



By Demirhan Yenigan | November 1st, 2016 | Business Analytics & AI

About the Author

Demirhan Yenigun, Managing Director, Analytic Services for Macrosoft


Demirhan Yenigun has 30 years of leadership, strategic thinking, and accomplishments implementing analytical solutions for Fortune 500 companies, helping them enhance their marketing and sales efforts. Demirhan is an Industry Advisor and Adjunct Professor of Data Analytics and Data Mining in the Decision Sciences Department of the GWU Business School. Demirhan is currently the CEO of Metrica Group and DomainGo.
