NSA leaks force defence sector to confront big data future
It is perhaps no revelation to say that, compared with the private sector, governments have been slow to engage with the big data movement. There is an undeniable need for them to step off the sidelines and play catch-up, and while some public bodies have indeed moved with the times, others are still languishing behind.
Big data is concerned with the accumulation, tagging, storage and subsequent manipulation of large data sets. The precise definition varies, as what was considered a large data set five years ago is fairly standard today. Consequently, big data can consist of anything from tens of terabytes up into the petabytes. Naturally, governments hold a wealth of current and historical data, often in the upper reaches of that scale, much of which is in urgent need of being accessed and analysed. The first problem is that time is of the essence. The longer it takes to establish a system, a piece of software, a methodology or a mindset, the larger the gap grows between the demand for data storage and organisations' ability to manage its increasing magnitude.
The private sector has, of course, been quick to engage with big data because of the monetary benefits on offer, but without similarly obvious incentives for the public sector, the gap is already large enough to appear daunting. Fortunately, this time-sensitive issue has not fallen on deaf ears, and today most major governments are acutely aware that they must adapt in line with the evolution of technology.
"They are coming around, and we see this at multiple levels," says Kim Andreasson, a managing director of DAKA advisory and a specialist in public sector cyber threats. "Open government data initiatives that have been coming out in the last couple of years are…
Download the feature to read more.