The Death of Business Intelligence

Big Data: The evolution

Posted in Business Intelligence by neilwilson1984 on January 10, 2014

Big Data is an ever-evolving marketplace, but just how will it develop in the future, and how will it meet the demands of the ever-growing volume of data that the market and its related software are asked to cope with?


With more and more data being created all the time, and the sheer volume rising at an exponential pace, it has become necessary to process and analyse it ever more quickly.

According to one expert, massive amounts of data must now be analysed on the spot. As a result, companies across the globe have invested somewhere in the region of $14 million in the in-memory processing market.

In-memory processing means that terabytes of data can be processed and analysed within just a few seconds. One company, according to a popular technology blog, recently recognised the importance of speed in data analysis when its sensor data ballooned to a full 5 terabytes per day.

The expert added that almost every Big Data vendor now has to create and release products specifically designed to ensure data is analysed as fast as possible.

Data quality

Another factor that will become increasingly important over the next year and beyond is the quality of the data being processed. Given the sheer volume being analysed and acted upon, it is vital that the data is of high enough quality to produce useful results rather than simply taking up valuable time and storage.

Because data analysis now happens far faster than human minds can follow, many decisions are made without a person being involved at all, with data cleansed, analysed and acted upon automatically. This has caused problems, however: a stream of bad-quality data can lead to viruses, data loss, financial losses and even fines.

What some companies have started to do, and what will become more of a trend moving forward, is to deploy systems that combat poor data quality by flagging issues instantly rather than waiting for problems to occur downstream. This allows bad-quality data to be weeded out very early in the process.
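To make the idea concrete, here is a minimal sketch of that kind of early flagging: each incoming record is checked against simple validation rules before it reaches any downstream analysis, so bad data is caught immediately rather than after a problem occurs. The record fields (`sensor_id`, `reading`) and the plausible range are hypothetical examples, not a real vendor's schema.

```python
def validate(record):
    """Return a list of quality issues found in a single sensor record."""
    issues = []
    if record.get("sensor_id") is None:
        issues.append("missing sensor_id")
    reading = record.get("reading")
    if not isinstance(reading, (int, float)):
        issues.append("non-numeric reading")
    elif not (-50.0 <= reading <= 150.0):  # assumed plausible range
        issues.append("reading out of range")
    return issues

def process_stream(records):
    """Split a stream into clean records and flagged (record, issues) pairs."""
    clean, flagged = [], []
    for record in records:
        issues = validate(record)
        if issues:
            flagged.append((record, issues))  # quarantined before analysis
        else:
            clean.append(record)
    return clean, flagged

clean, flagged = process_stream([
    {"sensor_id": "A1", "reading": 21.5},   # passes all checks
    {"sensor_id": None, "reading": 19.0},   # flagged: missing sensor_id
    {"sensor_id": "A2", "reading": 999.0},  # flagged: out of range
])
```

In a real deployment the flagged records would typically be routed to a quarantine queue or alerting system rather than a list, but the principle is the same: validation sits at the front of the pipeline, not the end.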

Specialist apps

In an age when everyone wants to use Big Data, it has become increasingly common for people who are not au fait with it to want to be part of the process. This has led to issues with usability and compatibility.

It is a problem that will be combated moving forward, with industry-specific apps becoming ever more prevalent to help companies harness Big Data even at the most basic levels.
