FREMONT, CA -- (Marketwired) -- 04/02/13 -- Dataguise (http://www.dataguise.com), a leading innovator of data security intelligence and protection solutions, today announced DG for Hadoop v4.3. The first and only solution of its kind to provide both masking and selective encryption for sensitive data in major Hadoop distributions, DG for Hadoop allows organizations to choose the most appropriate remediation technique for their privacy requirements. The new version also delivers expanded capabilities, including context-based search to identify sensitive data in unstructured files, simplified management with automatic notifications, and detailed audit reporting to demonstrate compliance.
Organizations globally are exploring the advantages of Hadoop and its ability to enable the analysis of data patterns previously inaccessible. However, compliance and security officers are mindful of the sensitive information located in these large data repositories and the lack of controls to prevent unauthorized access. Traditional approaches to securing Hadoop fail because they are too complex, expensive, and incapable of selectively protecting the data that matters in these large and diverse environments. DG for Hadoop provides an efficient, economical and effective method of determining where and how to secure sensitive data in Hadoop.
DG for Hadoop v4.3, part of the DgSecure suite of products, is built for the unique characteristics of Big Data, processing multiple terabytes of structured, unstructured and semi-structured data in only a few hours to protect sensitive data at the source, during ingestion and in the Hadoop Distributed File System (HDFS). Key features available in the latest generation software include:

- Both masking and selective encryption for sensitive data in major Hadoop distributions
- Context-based search to identify sensitive data in unstructured files
- Simplified management with automatic notifications
- Detailed audit reporting to demonstrate compliance
According to Gartner, "Dataguise DG for Hadoop is a security offering of great value in an insecure platform, which Hadoop certainly is today."(1) DG for Hadoop is deployed across Fortune 200 institutions and built for the enterprise to evaluate exposure risks and enforce the most appropriate remediation to prevent unauthorized access, financial penalties and negative brand impact. The solution allows the user to define and detect the data in a Hadoop installation that is sensitive in nature (credit card numbers, social security numbers, account numbers, personally identifiable information, etc.), analyze the company's risk from the exposure of that data and protect the information with masking or encryption so the data can be used safely.
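The detect-then-remediate workflow described above can be illustrated with a minimal sketch. This is a hypothetical example, not Dataguise's implementation: a production scanner such as DG for Hadoop uses context-aware detection across distributed files, whereas this sketch uses bare regular expressions for Social Security and credit card numbers and masks all but the last four digits.

```python
import re

# Illustrative patterns only (hypothetical); real sensitive-data discovery
# relies on context-aware detection, not standalone regexes.
PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d{4}[ -]?){3}\d{4}\b"),
}

def mask(match: re.Match) -> str:
    """Replace all but the last four digits with 'X', keeping separators."""
    text = match.group(0)
    total = sum(ch.isdigit() for ch in text)
    seen = 0
    out = []
    for ch in text:
        if ch.isdigit():
            seen += 1
            out.append(ch if seen > total - 4 else "X")
        else:
            out.append(ch)
    return "".join(out)

def scan_and_mask(record: str) -> str:
    """Detect sensitive values in a record and mask them in place."""
    for pattern in PATTERNS.values():
        record = pattern.sub(mask, record)
    return record

print(scan_and_mask("SSN 123-45-6789, card 4111-1111-1111-1111"))
# -> SSN XXX-XX-6789, card XXXX-XXXX-XXXX-1111
```

Masking of this kind preserves format and partial utility of the data for analytics, while encryption (the alternative remediation the release describes) keeps values recoverable for authorized users.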
"The various distributions of Apache Hadoop provide a high performance platform for managing large volumes of data, helping organizations harness the potential of Big Data to make informed decisions," said Ashar Baig, Founder & Principal Analyst, Analyst Connection. "For security solutions to be effective in this environment, they must secure the information effectively and do so without significant impact on operational performance. DG for Hadoop is a sophisticated solution that addresses these areas to provide assurance and confidence in dealing with sensitive data."
"With Hadoop deployments projected to grow for the foreseeable future, the threat to organizations that do not adopt a comprehensive approach to securing this data remains high," said Manmeet Singh, CEO, Dataguise. "DG for Hadoop provides a feature set unmatched by comparable alternatives, helping users benefit from the promise of Big Data without the potential risks."
Tweet this: @Dataguise Introduces Industry First in Big Data Compliance and Security with Latest Generation of DG for Hadoop - http://bit.ly/9nKnZX
Follow Dataguise on Twitter at: http://twitter.com/dataguise
Dataguise is the leading provider of data privacy protection and compliance intelligence for sensitive data assets stored in both Big Data and traditional repositories. Dataguise's comprehensive and centrally managed solutions allow companies to maintain a 360-degree view of their sensitive data, evaluate their compliance exposure risks, and enforce the most appropriate remediation policies, whether the data is stored on premises or in the cloud.
(1) Gartner, Blog: Hadoop 2013 - Part Two: Projects by Merv Adrian, Research Vice President, http://blogs.gartner.com/merv-adrian/