A little security for big data
By George Chang | May 20, 2013
- Performance is a key consideration when securing collected data and networks in a big data environment
- Policy creation and enforcement are more critical because of the larger volumes of data and the number of people who will require access to it
Securing big data in an organisation requires smart policy enforcement, thorough analytics and high-performance tools.
But what exactly is big data? Market research firm Gartner defines it as high-volume, high-velocity and/or high-variety information assets that require new forms of processing to enable enhanced decision making, insight discovery and process optimisation.
Some common examples include web logs, RFID (radio frequency identification) data, sensor networks, social network data, Internet text and documents, Internet search indexing, call detail records, medical records and photography archives.
For most companies in Malaysia, big data increases routine security challenges in existing data networks. The sheer amount of data proportionately heightens the need to prevent data leakage.
Data loss prevention technologies should be employed to ensure that information is not being leaked to unauthorised parties. Internal intrusion detection and data integrity systems must be used to detect advanced targeted attacks that have bypassed traditional protection mechanisms, for example, anomaly detection in the collection and aggregation layers. Packet data, flow data, sessions and transactions should all be scrutinised.
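To illustrate the idea of anomaly detection at the collection and aggregation layers, here is a minimal Python sketch that flags a source whose transfer volume jumps far outside its own recent baseline. The flow records, field names and threshold are hypothetical, and a real detection layer would inspect far richer packet, session and transaction attributes.

```python
"""Illustrative sketch only: a naive volume-based anomaly check over flow
records of the kind an internal detection layer might run on aggregated
collection-tier telemetry. Records and thresholds are hypothetical."""
from collections import defaultdict
from statistics import mean, stdev

# Hypothetical flow records: (source_ip, bytes_transferred) per interval.
flows = [
    ("10.0.0.5", 1_200), ("10.0.0.5", 1_350), ("10.0.0.5", 1_100),
    ("10.0.0.9", 900),   ("10.0.0.9", 950),   ("10.0.0.9", 48_000),
]

by_source = defaultdict(list)
for src, nbytes in flows:
    by_source[src].append(nbytes)

# Flag any source whose latest transfer sits far outside its own history.
for src, history in by_source.items():
    baseline, latest = history[:-1], history[-1]
    if len(baseline) >= 2 and stdev(baseline) > 0:
        z = (latest - mean(baseline)) / stdev(baseline)
        if z > 3:  # arbitrary cut-off for the sketch
            print(f"anomaly: {src} moved {latest} bytes (z-score {z:.1f})")
```

In this toy run, only the 48,000-byte transfer from 10.0.0.9 is flagged, because it dwarfs that host's own baseline of roughly 900 bytes per interval.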
Because big data involves information residing over a wide area from multiple sources, organisations also need to have the ability to protect data wherever it exists. In this regard, virtualised security appliances providing a complete range of security functionality must be positioned at key locations throughout the public, private and hybrid cloud architectures frequently found in big data environments.
Resources must be connected securely, and data in transit from its sources to big data storage must also be protected.
Big data lifecycle
It is important to define specific security requirements around the big data lifecycle. Typically, this begins with securing the collection of data followed by securing access to the data.
Performance is a key consideration when securing the collected data and the networks. Firewalls and other network security devices, such as those for encryption, must be of sufficiently high performance so they can handle the increased throughput, connections and application traffic.
In a big data environment, policy creation and enforcement are more critical than usual because of the larger volumes of data and the number of people who will require access to it.
With the right tools, vast amounts of information can be analysed, and this allows an organisation to understand and benchmark normal activities. These security tools include dedicated logging, analysis and reporting appliances that can securely aggregate log data from security and other syslog-compatible devices.
Such appliances will also analyse, report on and archive security events, network traffic, Web content and messaging data. Policy compliance can then be measured and customised reports easily produced.
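As a rough illustration of the kind of roll-up such logging and reporting appliances automate at scale, the short Python sketch below aggregates syslog-style lines into a per-device, per-severity summary. The log format and severity labels are assumptions for the example, not any particular appliance's schema.

```python
"""Illustrative sketch only: aggregating syslog-style lines from several
devices into a simple per-device, per-severity summary."""
import re
from collections import Counter

# Assumed line format: "<timestamp> <device> <severity> <message>"
raw_logs = [
    "2013-05-20T09:01:11 fw-edge-1 WARNING policy 42 denied tcp 10.0.0.7 -> 8.8.8.8",
    "2013-05-20T09:01:15 fw-edge-1 INFO session closed",
    "2013-05-20T09:02:02 waf-dc-2 CRITICAL sql injection signature matched",
]

pattern = re.compile(r"^(\S+)\s+(\S+)\s+(\S+)\s+(.*)$")
summary = Counter()
for line in raw_logs:
    m = pattern.match(line)
    if m:
        _, device, severity, _ = m.groups()
        summary[(device, severity)] += 1

# Print a minimal compliance-style report: events per device and severity.
for (device, severity), count in sorted(summary.items()):
    print(f"{device:10s} {severity:8s} {count}")
```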
Small change, big impact
Unfortunately, the difficulty in capturing, managing and processing information quickly in big data environments will continue to make security an afterthought in many firms. As portable storage and bandwidth continue to grow, the mobility of these larger datasets will also increase, resulting in breaches and disclosure of sensitive datasets.
Threats will likely come from intruders manipulating big data in such a way that business analytics and business intelligence tools generate false results, leading to management decisions that profit the intruders.
But remember, even small changes in big data can have a big impact on results! So, organisations must not ignore the need to secure big data assets – for security reasons, business intelligence or otherwise.
They must address big data issues in terms of authentication, authorisation, role-based access control, auditing, monitoring, and backup and recovery.
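A minimal sketch of the authorisation and auditing side of that list might look like the Python below: a role-based access control check that records every decision in an audit trail. The roles, permissions and user names are purely illustrative assumptions.

```python
"""Illustrative sketch only: a minimal role-based access control check with
an audit trail. Roles, permissions and users are hypothetical."""
from datetime import datetime, timezone

ROLE_PERMISSIONS = {
    "analyst":  {"read:weblogs", "read:reports"},
    "engineer": {"read:weblogs", "write:weblogs", "read:reports"},
    "auditor":  {"read:reports", "read:audit_log"},
}

USER_ROLES = {"alice": "engineer", "bob": "analyst"}
audit_log = []  # every access decision is appended here for later review

def authorise(user: str, permission: str) -> bool:
    """Grant or deny a request based on the user's role, and record the decision."""
    role = USER_ROLES.get(user)
    allowed = permission in ROLE_PERMISSIONS.get(role, set())
    audit_log.append(
        (datetime.now(timezone.utc).isoformat(), user, permission, allowed)
    )
    return allowed

print(authorise("bob", "write:weblogs"))    # False: analysts cannot write
print(authorise("alice", "write:weblogs"))  # True: engineers can
```

Keeping the audit log separate from the permission map is deliberate: monitoring and auditing should observe every decision, including denials, rather than only successful access.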
Going forward, big data analytics involving behavioural benchmarking and monitoring will also become ever more crucial in addressing next-generation information security challenges.
George Chang is Fortinet’s regional vice president for South-East Asia & Hong Kong. Fortinet provides network security appliances and unified threat management (UTM) solutions.