The term Big Data refers to the tools, processes, and procedures that allow an organization to create, manipulate, and manage very large data sets and the facilities that store them.
A defining feature of Big Data is the difficulty of working with it using relational databases and desktop statistics or visualization packages; it instead requires “massively parallel software running on tens, hundreds, or even thousands of servers.” What counts as Big Data varies with the capabilities of the organization managing it, ranging from a few dozen terabytes to many petabytes in a single data set.
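The parallel approach mentioned above typically splits a data set into chunks, processes each chunk independently on a separate worker, and then merges the partial results. A minimal sketch of that map/reduce pattern, using Python's `multiprocessing` module and a hypothetical word-count task (the chunk data and function names are illustrative, not from any particular framework):

```python
from multiprocessing import Pool

def count_words(chunk):
    # "Map" step: count word occurrences within a single chunk of text.
    counts = {}
    for word in chunk.split():
        counts[word] = counts.get(word, 0) + 1
    return counts

def merge_counts(partials):
    # "Reduce" step: combine the per-chunk counts into one total.
    total = {}
    for counts in partials:
        for word, n in counts.items():
            total[word] = total.get(word, 0) + n
    return total

if __name__ == "__main__":
    # Hypothetical chunks; in a real cluster each would live on a different server.
    chunks = ["big data big systems", "data moves fast", "big clusters"]
    with Pool(processes=3) as pool:
        partials = pool.map(count_words, chunks)  # chunks processed in parallel
    print(merge_counts(partials))
```

Here three local processes stand in for the "tens, hundreds, or even thousands of servers"; the structure (independent map work, then a merge) is the same at either scale.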
Big Data spans three dimensions: Variety, Velocity and Volume.
- Variety – Big Data extends beyond structured data, including unstructured data of all varieties: text, audio, video, click streams, log files and more.
- Velocity – Often time-sensitive, Big Data must be used as it is streaming into the enterprise in order to maximize its value to the business.
- Volume – Big Data comes in one size: large. Enterprises are awash with data, easily amassing terabytes and even petabytes of information.
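The Velocity point above can be sketched concretely: rather than loading a complete data set before analysis, a streaming consumer updates its result incrementally as each record arrives, so the answer is available immediately. A minimal sketch, assuming a Python generator stands in for a live feed (the readings are made-up illustration data):

```python
def rolling_average(stream):
    # Consume records one at a time, keeping only a running sum and count,
    # so the current average is available the moment each value arrives.
    total, n = 0.0, 0
    for value in stream:
        total += value
        n += 1
        yield total / n

# A generator stands in for a live, unbounded feed of sensor readings.
readings = iter([10.0, 20.0, 30.0])
for avg in rolling_average(readings):
    print(avg)
```

The key property is constant memory: the consumer never stores the stream, only a small running state, which is what makes time-sensitive use of fast-arriving data feasible.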