DLP, also known as data leakage prevention or data loss protection, describes the controls put in place by an organization to ensure that certain types of data (structured and unstructured) remain under organizational controls, in line with policies, standards, and procedures.
Controls to protect data form the foundation of organizational security and enable the organization to meet regulatory requirements and relevant legislation (for example, EU data protection directives, the U.S. Privacy Act, the Health Insurance Portability and Accountability Act [HIPAA], and the Payment Card Industry Data Security Standard [PCI DSS]).
DLP technologies and processes play important roles when building those controls.
The appropriate implementation and use of DLP reduce both security and regulatory risks for the organization.
DLP strategy presents a wide and varied set of components and controls that need to be contextually applied by the organization, often requiring changes to the enterprise security architecture.
It is for this reason that many organizations do not adopt a full-blown DLP strategy across the enterprise.
For hybrid cloud users or those utilizing cloud-based services partially within their organizations, it is beneficial to ensure that DLP is understood and is appropriately structured across both cloud and non-cloud environments.
Failure to do so can result in segmented and non-standardized levels of security leading to increased risks.
Discovery and Classification
This is the first stage of a DLP implementation and an ongoing and recurring process.
The majority of cloud-based DLP technologies are predominantly focused on this component.
The discovery process usually maps data in cloud storage services and databases and enables classification based on data categories (regulated data, credit card data, public data, and more).
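As a rough illustration of how such classification works, the sketch below uses regular-expression detectors plus a Luhn checksum to label text as cardholder data, PII, or public. The pattern set, labels, and function names are illustrative assumptions; production DLP engines use far richer detectors.

```python
import re

# Hypothetical pattern set; real DLP engines ship with much richer detectors.
PATTERNS = {
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def luhn_valid(number: str) -> bool:
    """Luhn checksum cuts false positives on card-like digit runs."""
    digits = [int(d) for d in re.sub(r"\D", "", number)][::-1]
    total = sum(digits[0::2]) + sum(sum(divmod(d * 2, 10)) for d in digits[1::2])
    return total % 10 == 0

def classify(text: str) -> set[str]:
    """Return the set of data categories detected in a blob of text."""
    labels = set()
    for match in PATTERNS["credit_card"].finditer(text):
        if luhn_valid(match.group()):
            labels.add("regulated: cardholder data")
    if PATTERNS["email"].search(text):
        labels.add("PII: email address")
    return labels or {"public"}
```

A discovery job would apply a classifier like this to each object found while mapping cloud storage, recording the labels for later policy decisions.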
Monitoring
Data usage monitoring for both ingress and egress traffic flows forms the key function of DLP.
Effective DLP strategies monitor the usage of data across locations and platforms while enabling administrators to define one or more usage policies.
The ability to monitor data can be executed on gateways, servers, and storage as well as workstations and endpoint devices.
Recently, the adoption of external services to assist with DLP “as a service” has increased, along with many cloud-based DLP solutions.
The monitoring application should be able to cover most sharing options available to users (email applications, portable media, and Internet browsing) and raise alerts on policy violations.
Enforcement
Many DLP tools provide the capability to interrogate data and compare its location, use, or transmission destination against a set of policies to prevent data loss.
If a policy violation is detected, the specified enforcement actions can be performed automatically.
Enforcement options can include the ability to alert and log, block data transfers or reroute them for additional validation, or encrypt the data before leaving the organizational boundaries.
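The policy-to-action mapping described above can be sketched as a small rule table; the `Policy` model, rule names, and default-allow behavior here are assumptions for illustration, not a real product's API.

```python
from dataclasses import dataclass

@dataclass
class Policy:
    category: str      # data category the rule applies to
    destination: str   # "internal" or "external"
    action: str        # "allow", "alert", "block", or "encrypt"

# Illustrative rule set: block card data leaving the boundary,
# encrypt PII before it leaves.
POLICIES = [
    Policy("cardholder_data", "external", "block"),
    Policy("pii", "external", "encrypt"),
]

def enforce(category: str, destination: str) -> str:
    """Return the enforcement action for a proposed data transfer."""
    for rule in POLICIES:
        if rule.category == category and rule.destination == destination:
            return rule.action
    return "allow"  # default-allow; a real deployment would also log
```

In practice the returned action would drive the gateway or endpoint agent: dropping the transfer, rerouting it for validation, or invoking encryption before the data crosses the organizational boundary.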
DLP tool implementations typically conform to one of the following topologies:
Data in motion (DIM)
Sometimes referred to as network-based or gateway DLP.
In this topology, the monitoring engine is deployed near the organizational gateway to monitor outgoing protocols such as hypertext transfer protocol (HTTP), hypertext transfer protocol secure (HTTPS), simple mail transfer protocol (SMTP), and file transfer protocol (FTP).
The topology can combine proxy-based deployment, bridging, network taps, and SMTP relays.
To scan encrypted HTTPS traffic, mechanisms that enable SSL/TLS interception (for example, a brokering proxy) must be integrated into the system architecture.
Data at rest (DAR)
Sometimes referred to as storage-based DLP. In this topology, the DLP engine is installed where the data is at rest, usually on one or more storage subsystems as well as file and application servers.
This topology is effective for data discovery and for tracking usage but may require integration with network- or endpoint-based DLP for policy enforcement.
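A minimal data-at-rest discovery pass can be sketched as a filesystem walk that flags files matching a sensitive pattern. The naive SSN-style detector and function name below are assumptions for illustration; real storage-based DLP engines handle many file formats, not just plain text.

```python
import os
import re

# Naive detector for SSN-formatted strings (illustrative only).
SSN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

def scan_at_rest(root: str) -> list[str]:
    """Walk a storage mount and return paths of files containing SSN-like data."""
    flagged = []
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            try:
                with open(path, "r", errors="ignore") as f:
                    if SSN.search(f.read()):
                        flagged.append(path)
            except OSError:
                continue  # skip unreadable files rather than abort the scan
    return flagged
```

The flagged inventory then feeds classification and, via integration with network or endpoint agents, enforcement.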
Data in use (DIU)
Sometimes referred to as client or endpoint-based.
The DLP application is installed on users' workstations and endpoint devices.
This topology offers insights into how users use the data, with the ability to add protection that the network DLP may not be able to provide.
The challenge with a client-based DLP is the complexity, time, and resources to implement across all endpoint devices, often across multiple locations and significant numbers of users.
Cloud-Based DLP Considerations
Data in the cloud tends to move and replicate:
Whether between locations, data centers, backups, or back and forth between the cloud and the organization, this replication and movement can present a challenge to any implementation.
Administrative access for enterprise data in the cloud could be tricky:
Make sure you understand how to perform discovery and classification within cloud-based storage.
DLP technology can affect overall performance:
Network or gateway DLP, which scans all traffic for predefined content, might affect network performance. Client-based DLPs scan all workstation access to data, which can affect the workstation’s operation.
The overall impact must be considered during testing.
Start with the data discovery and classification process.
These processes are more mature in cloud deployments and deliver immediate value to the data security process. Cloud DLP policy should address the following:
- What kind of data is permitted to be stored in the cloud?
- Where can the data be stored (in which jurisdictions)?
- How should it be stored (encryption and storage access consideration)?
- What kind of data access is permitted? From which devices and networks? From which applications? Through which tunnels?
- Under what conditions are data allowed to leave the cloud?
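The policy questions above can be expressed as a machine-checkable rule set. The structure and field names below are assumptions sketched for illustration; a real deployment would encode this in the DLP product's own policy language.

```python
# Hypothetical cloud DLP policy encoding the questions above as checkable rules.
CLOUD_DLP_POLICY = {
    "permitted_categories": {"public", "internal"},   # what may be stored
    "permitted_jurisdictions": {"EU", "US"},          # where it may be stored
    "encryption_required": True,                      # how it must be stored
    "permitted_networks": {"corporate_vpn"},          # which access paths
}

def storage_request_allowed(category: str, jurisdiction: str,
                            encrypted: bool, network: str) -> bool:
    """Check a proposed cloud storage operation against every policy dimension."""
    p = CLOUD_DLP_POLICY
    return (category in p["permitted_categories"]
            and jurisdiction in p["permitted_jurisdictions"]
            and (encrypted or not p["encryption_required"])
            and network in p["permitted_networks"])
```

Encoding the policy this way makes each of the bulleted questions an explicit, testable condition rather than an informal guideline.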
Encryption methods should be carefully examined based on the format of the data. Format-preserving encryption and information rights management (IRM) are increasingly popular in document storage applications; however, other data types may require vendor-agnostic solutions.
When implementing restrictions or controls that block or quarantine data items, it is essential to create procedures that prevent false-positive events from damaging business processes or hindering legitimate transactions.
DLP can be an effective tool when planning or assessing a potential migration to cloud applications.
Its discovery process analyzes the content of data going to the cloud, and the DLP detection engine can surface policy violations during data migration.