Protegrity-specific term definitions

This section defines the terms used throughout the documentation.

Application Data Security

Application data security is the process of protecting sensitive data used within an application during processing, storage, or transmission. Security measures such as encryption, tokenization, or data masking are applied to ensure that sensitive information remains secure from unauthorized access or data breaches. Application data security is important for ensuring that personal or financial information is protected in enterprise applications.

Application Programming Interface Security

Application programming interface security refers to the practice of protecting an Application Programming Interface (API) from unauthorized access, data breaches, or misuse. This involves applying security measures such as authentication, encryption, and access control. These measures secure the interactions between systems that communicate through APIs. API security is essential in modern applications where APIs are used to integrate services across cloud environments.

Application Protector

Application Protector refers to security tools that protect sensitive data within an application by applying encryption, tokenization, or other data protection techniques during processing and storage. It ensures that sensitive data remains secure even while in use by the application, preventing unauthorized access to the data.

Application Protector Container

An Application Protector container is a secure environment in which sensitive data is processed and protected within an application. Containers isolate the processing environment for the application from other systems, ensuring that sensitive data is encrypted or tokenized during use. Application protector containers help businesses ensure that sensitive information is securely handled throughout the application lifecycle.

Attribute-Based Access Control

Attribute-based access control (ABAC) is an access control model. It grants or denies access to resources based on attributes of the user, the resource, the environment, and other contextual factors. Attributes can include the role of the user, department, time of access, location, and even the type of device being used. ABAC allows for dynamic and fine-grained access control, adapting to changing conditions and requirements in real time.
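The decision logic described above can be sketched as a small Python function. This is a hedged illustration only; the attribute names and the business-hours rule are hypothetical, not part of any Protegrity API:

```python
from datetime import time

def abac_allow(user: dict, resource: dict, env: dict) -> bool:
    """Grant access only when user, resource, and environment attributes satisfy the policy."""
    return (
        user["department"] == resource["owning_department"]
        and user["clearance"] >= resource["sensitivity"]
        and time(9, 0) <= env["access_time"] <= time(17, 0)  # business hours only
    )

# Same user and resource; the decision changes with a contextual attribute (time of day).
print(abac_allow({"department": "finance", "clearance": 2},
                 {"owning_department": "finance", "sensitivity": 1},
                 {"access_time": time(10, 30)}))   # True
print(abac_allow({"department": "finance", "clearance": 2},
                 {"owning_department": "finance", "sensitivity": 1},
                 {"access_time": time(23, 0)}))    # False
```

Note how the same static role can yield different outcomes depending on context, which is what distinguishes ABAC from purely role-based models.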

Audit Store

The Audit Store is a database that stores logs on the ESA. The log data shows the health of the system and helps during troubleshooting. The Audit Store is also referred to as Insight.

Audit Store Dashboards

Audit Store Dashboards is a solution for viewing logs from the Audit Store. The log entries can be queried and viewed here. The information can also be viewed using charts, graphs, and tables. Audit Store Dashboards is also called Insight Dashboards.

Big Data Protector

A Big Data Protector secures large-scale data environments by applying encryption, tokenization, and other data protection techniques to sensitive information stored within big data platforms. It ensures that sensitive data within large datasets remains protected during analysis and storage.

Boost

A boost, or weight, in the context of a classifier, is the probability that a keyword qualifies for the scan, based on the defined regular expression pattern.

Case-Sensitive Tokenization

Case-sensitive tokenization is a tokenization process that maintains the case, that is, uppercase or lowercase, of the original data during tokenization. This is important for data fields where the case of the characters matters, such as in passwords, usernames, or other case-sensitive data. By preserving the case of data, case-sensitive tokenization ensures that applications and systems requiring case-specific information can still function correctly.
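The case-preserving property can be illustrated with a toy keyed substitution in Python. This is a hedged sketch only: it is not reversible and is not the algorithm any tokenization product uses; it merely shows a token keeping the upper/lowercase pattern of the input:

```python
import hashlib
import hmac
import string

def tokenize_case_sensitive(value: str, key: bytes) -> str:
    """Toy keyed substitution: the token keeps the upper/lowercase pattern of the input."""
    out = []
    for i, ch in enumerate(value):
        if ch.isalpha():
            # Derive a substitute letter from the position and the lowercased character,
            # then restore the original character's case.
            digest = hmac.new(key, f"{i}:{ch.lower()}".encode(), hashlib.sha256).digest()
            sub = string.ascii_lowercase[digest[0] % 26]
            out.append(sub.upper() if ch.isupper() else sub)
        else:
            out.append(ch)
    return "".join(out)

token = tokenize_case_sensitive("McArthur", b"secret-key")
print(token)  # same length and same case pattern as "McArthur"
```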

Centralized Key Management

Centralized key management is a system that manages encryption keys in a single location. This ensures secure generation, storage, and distribution of keys across various environments. Centralized management simplifies the process of securing sensitive data by applying consistent key management policies across the organization.

Classification

Classification is the grouping of data to represent a particular type of sensitive data after evaluation by the classifiers. For example, a collection of 16-digit values evaluated to be a potential credit card number is classified as a Credit Card Number (CCN).

Classifier

The classifier defines the data identification rules to classify sensitive data from the sampled data.

Cloud Data Security

Cloud data security encompasses the strategies and technologies used to protect sensitive data stored in cloud environments. This involves using encryption, tokenization, access controls, and other security measures to ensure that data remains secure as it moves between cloud services or is stored in cloud databases.

Cloud-Native Security

Cloud-native security refers to security practices and tools specifically designed for protecting cloud-based environments. These tools provide data protection, identity management, and secure access to cloud services while ensuring compliance with security policies.

Confidence Score

The confidence score is the score that determines the confidence or severity level in the data classification findings.

Containerization

Containerization is a technology that packages an application and its dependencies into a standardized unit called a container. This container can run consistently across different computing environments. Containers isolate applications from the underlying system, providing security, portability, and consistency across environments. Containerization is widely used to enhance application development, deployment, and security, particularly in cloud-native environments.

Coordinate

A coordinate represents the location of sensitive data. It can be any system, database, schema, table, column, or file path.

Cross-Border Data Protection

Cross-border data protection refers to the strategies and measures used to ensure that sensitive data remains secure when transferred between different countries. This includes complying with various data protection laws and ensuring that the data is encrypted or tokenized to prevent unauthorized access during transmission.

Cryptographic Key Management

Cryptographic key management is the process of generating, distributing, storing, and managing encryption keys used to secure sensitive data. Effective key management ensures that only authorized users can access these keys, protecting encrypted data from unauthorized access. Proper key management is essential for maintaining the security of encrypted data throughout its lifecycle.

Cryptoperiods

A cryptoperiod is the time span during which a specific key is authorized for use. It is also the duration during which the keys for a given system or application may remain in effect.

Data Breach Prevention

Data breach prevention refers to the measures and technologies used to prevent unauthorized access to sensitive data. This involves applying encryption, access controls, and monitoring to detect and block potential security threats before a data breach takes place.

Data Discovery

Data discovery is the process of identifying and cataloging sensitive data across systems in an organization. This helps businesses understand where sensitive data resides and ensures that appropriate security measures are applied to protect it. Data discovery is critical for ensuring compliance with data protection regulations.

Data Element Keys

Data element keys are generated when a data element is created. This key protects the sensitive data.

Data Encryption Keys

Data encryption keys (DEK) are used to protect data. In the Protegrity Data Security Platform, the repository key, signing key, data store key, and data element keys are the DEKs.

Data Integrity Monitoring

Data integrity monitoring involves tracking and verifying the accuracy and consistency of data over its lifecycle. It ensures that data has not been altered in an unauthorized manner, protecting it from tampering, corruption, or breaches.

Data Loss Prevention

Data loss prevention (DLP) is a strategy and set of tools designed to prevent sensitive data from being lost, leaked, or accessed by unauthorized users. DLP technologies monitor and control the movement of data across networks, storage, and endpoints to ensure that confidential information is not accidentally or maliciously exposed.

Data Obfuscation

Data obfuscation is the process of intentionally making sensitive data difficult to understand or interpret by unauthorized users. This is done by applying techniques like encryption, tokenization, or masking. It helps protect the data from being exposed in non-production environments, such as testing or development.

Data Protection Lifecycle

The data protection lifecycle refers to the stages through which data passes, from creation and storage to usage, sharing, and deletion. Throughout its lifecycle, data must be protected from unauthorized access or breaches using access controls, encryption, and monitoring.

Data Security Gateway

A Data Security Gateway (DSG) enforces data security policies as sensitive information moves between different systems or environments. This gateway ensures that data is encrypted, tokenized, or masked before transmission, preventing unauthorized access during data migration or exchange.

Data Store Keys

The data store key (DSK) is generated in the configured key store when a data store is created. It is protected by the master key. It is used only to protect the staging area that is located on the ESA.

Database Protector

A Database Protector secures sensitive data stored in databases by applying encryption, tokenization, and access controls to prevent unauthorized access. It ensures that sensitive information is protected from data breaches, insider threats, and other security risks.

Datastore

A datastore, or node, defines the different systems that are supported for a data discovery scan.

De-Identified Data

De-identified data refers to information with all personally identifiable information (PII) removed or obscured, making it impossible to link the data back to an individual. This process is commonly used in data privacy practices to protect personal privacy while still allowing the data to be used for research or analysis. De-identification is important for complying with data privacy regulations such as GDPR and HIPAA.

Dynamic Data Masking

Dynamic data masking is a security technique that hides sensitive data in real time. It shows masked values to unauthorized users while allowing authorized users to see the original data. This protects sensitive information while maintaining its usability in business applications.
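The behavior described above can be sketched as a minimal Python function, assuming a hypothetical rule that only the last four characters stay visible to unauthorized users:

```python
def mask_value(value: str, authorized: bool, visible_suffix: int = 4) -> str:
    """Return the clear value to authorized users; mask all but the last few characters otherwise."""
    if authorized:
        return value
    return "*" * max(len(value) - visible_suffix, 0) + value[-visible_suffix:]

# The stored data never changes; only the view presented to the caller differs.
print(mask_value("4111111111111111", authorized=False))  # ************1111
print(mask_value("4111111111111111", authorized=True))   # 4111111111111111
```

Because the masking is applied at read time, the underlying record is untouched and remains fully usable for authorized processes.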

Encryption at Rest

Encryption at rest refers to the encryption of data stored on physical devices such as hard drives, databases, or cloud storage. This ensures that data remains protected from unauthorized access when it is not being actively used.

Encryption in Transit

Encryption in transit refers to the practice of securing data as it moves across networks or between systems. This type of encryption ensures that sensitive information remains protected during transmission, preventing it from being intercepted or accessed by unauthorized users.

Endpoint

The endpoint is the point where protection is applied. In most cases, it is the protector.

Enterprise Data Warehouse Protector

An Enterprise Data Warehouse (EDW) Protector is a security solution that secures sensitive information within data warehouses by applying encryption or tokenization. These protectors ensure that data remains secure during both storage and processing, mitigating risks of unauthorized access. EDW Protectors are designed to handle the large datasets that are typically found in data warehouses while maintaining compliance with privacy regulations.

Enterprise Security Administrator

An Enterprise Security Administrator (ESA) is a centralized management tool that allows organizations to enforce data protection policies, including encryption and access controls, across their entire data infrastructure. It helps ensure compliance with security standards by managing data security policies at scale.

Federated Self-Service Data Security

Federated self-service data security refers to a security model where different teams within an organization manage their own data protection measures while adhering to centralized security policies. This allows departments or business units to control their own data security needs while maintaining compliance with overall corporate security standards. Federated models are often used in large enterprises where different teams handle varying levels of sensitive data.

Field-Level Encryption

Field-level encryption is an encryption method that protects sensitive information in individual data fields, such as credit card numbers or Social Security numbers. This approach allows businesses to encrypt specific pieces of data within a larger dataset without encrypting the entire dataset.

File Protector

A File Protector secures sensitive files by encrypting or applying access controls. This ensures that only authorized users can access or modify the data in sensitive files. It protects files stored in shared environments or transmitted between systems from unauthorized access or breaches.

Fine-Grained Data Protection

Fine-grained data protection involves applying security measures at the individual data field or record level, allowing precise control over which users can access specific pieces of sensitive information. This ensures that only authorized personnel can access protected data without affecting the rest of the dataset.

FIPS 140-2

The Federal Information Processing Standard (FIPS) 140 is used to accredit cryptographic modules. FIPS 140-2 refers to a specific version of this standard.

Format-Preserving Encryption

Format-preserving encryption (FPE) is an encryption method that allows sensitive data to be encrypted while retaining its original format. This ensures that encrypted data can still be processed by systems that require a specific data format, such as credit card numbers or Social Security numbers. FPE is useful for integrating encryption without modifying existing systems that rely on a particular data structure.
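The format-preserving property can be illustrated with a toy keyed digit shift in Python. This is a hedged sketch only: real FPE uses the NIST SP 800-38G modes (FF1), not this construction; the sketch merely shows a reversible transform whose output keeps the input's length and all-digits format:

```python
import hashlib
import hmac

def _shift(key: bytes, position: int) -> int:
    # Keyed, position-dependent shift in the range 0-9.
    return hmac.new(key, str(position).encode(), hashlib.sha256).digest()[0] % 10

def toy_fpe_encrypt(digits: str, key: bytes) -> str:
    """Toy illustration: per-position digit shift keeps length and digit-only format."""
    return "".join(str((int(ch) + _shift(key, i)) % 10) for i, ch in enumerate(digits))

def toy_fpe_decrypt(digits: str, key: bytes) -> str:
    return "".join(str((int(ch) - _shift(key, i)) % 10) for i, ch in enumerate(digits))

key = b"demo-key"
cipher = toy_fpe_encrypt("4111111111111111", key)
print(cipher.isdigit(), len(cipher))              # True 16 -- format preserved
print(toy_fpe_decrypt(cipher, key))               # 4111111111111111 -- reversible
```

Because the ciphertext is still a 16-digit string, downstream systems that validate or store card-number-shaped fields continue to work unchanged.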

Generalized Anonymization

Generalized anonymization refers to the process of replacing personally identifiable information (PII) with generalized or non-identifiable data, ensuring that individuals cannot be re-identified. This technique is used to protect privacy while allowing data to be analyzed or shared for research, reporting, or operational purposes. Anonymization is crucial for complying with data privacy regulations while still enabling data usage.
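Generalization can be sketched as coarsening quasi-identifiers so records describe groups rather than individuals. The field names and bucketing rules below are hypothetical examples, not a prescribed scheme:

```python
def generalize(record: dict) -> dict:
    """Coarsen quasi-identifiers so individuals cannot be singled out."""
    decade = (record["age"] // 10) * 10
    return {
        "age_band": f"{decade}-{decade + 9}",    # 34 -> "30-39"
        "zip_prefix": record["zip"][:3] + "**",  # "10027" -> "100**"
        "diagnosis": record["diagnosis"],        # analytic field kept as-is
    }

print(generalize({"age": 34, "zip": "10027", "diagnosis": "A01"}))
```

The generalized record still supports aggregate analysis (age bands, regions) while the exact identifying values are gone.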

Hadoop Protector

A Hadoop Protector secures sensitive data stored in Hadoop big data environments by applying encryption, tokenization, and access controls. This ensures that sensitive information in large-scale data platforms remains protected during storage and processing. Hadoop Protectors allow organizations to comply with data privacy regulations while leveraging the analytical power of Hadoop.

Homomorphic Encryption

Homomorphic encryption is an encryption technique that allows computations to be performed on encrypted data without first decrypting it. This means that data remains protected while being processed, preserving its confidentiality and integrity throughout the computation. Homomorphic encryption is especially useful for privacy-preserving analytics and cloud computing, where sensitive data needs to be processed securely.
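A minimal demonstration of the idea uses textbook (unpadded) RSA, which happens to be multiplicatively homomorphic. This is strictly an illustration: unpadded RSA is insecure in practice, and fully homomorphic schemes support richer operations than multiplication:

```python
# Tiny textbook RSA: n = 61 * 53 = 3233, e = 17, d = 2753 (inverse of e mod phi(n) = 3120).
n, e, d = 3233, 17, 2753

def enc(m: int) -> int:
    return pow(m, e, n)

def dec(c: int) -> int:
    return pow(c, d, n)

c1, c2 = enc(6), enc(7)
# Multiply the two ciphertexts; decrypting the result yields the product of the plaintexts,
# so the computation happened without ever exposing 6 or 7 in the clear.
print(dec(c1 * c2 % n))  # 42
```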

Insight

Insight is the solution for transferring, collating, receiving, and storing logs. It consists of Analytics, Audit Store, Insight Dashboards or Audit Store Dashboards, and td-agent.

Insight Dashboards

Insight Dashboards is the solution for viewing logs from the Audit Store. The log entries can be queried and viewed here. The information can also be viewed using charts, graphs, and tables. Insight Dashboards is also called Audit Store Dashboards.

Insight Discovery

Insight discovery refers to the process of extracting useful insights and patterns from data, particularly in large datasets. It involves analyzing data to identify trends, anomalies, or patterns that can inform decision-making. Insight discovery enables businesses to leverage data for strategic purposes without revealing sensitive information.

Job

A job is a task that you can create to scan a particular coordinate.

Key Encryption Keys

Key encryption keys (KEK) protect other keys. In the Protegrity Data Security Platform, the master key is the KEK.

Key Management Service

A key management service (KMS) is a service that securely generates, stores, and manages encryption keys used to protect sensitive data. KMS ensures that encryption keys are distributed and rotated securely, preventing unauthorized access to encrypted data. It simplifies the management of encryption keys across diverse environments, ensuring compliance with security policies.

Key Rotation and Aging

Key rotation and aging are security practices. Key rotation periodically replaces encryption keys; aging securely manages the lifecycle of the rotated keys. Regular rotation ensures that no encryption key remains in use for too long, reducing the chance of unauthorized access through compromised or outdated keys. Together, these practices help maintain strong data security.
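The rotation-and-aging cycle can be sketched as a small lifecycle model. The two-state lifecycle and field names here are simplified assumptions, not the platform's actual key model:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class ManagedKey:
    version: int
    created: datetime
    cryptoperiod: timedelta
    state: str = "active"   # simplified lifecycle: active -> deactivated

def rotate_if_due(current: ManagedKey, now: datetime) -> ManagedKey:
    """Retire a key whose cryptoperiod has elapsed and issue the next version."""
    if now - current.created < current.cryptoperiod:
        return current
    # Aged keys are kept for decrypting old data but no longer protect new data.
    current.state = "deactivated"
    return ManagedKey(version=current.version + 1, created=now,
                      cryptoperiod=current.cryptoperiod)

key = ManagedKey(version=1, created=datetime(2025, 1, 1), cryptoperiod=timedelta(days=90))
key = rotate_if_due(key, datetime(2025, 6, 1))
print(key.version)  # 2 -- the 90-day cryptoperiod elapsed, so a new key was issued
```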

Key States

A key state indicates where a key is in its lifecycle.

Key Store

The key store can be a hardware security module (HSM), or other supported key management service (KMS) that can store keys and perform cryptographic operations.

Mainframe Protector

A Mainframe Protector secures sensitive data stored in legacy mainframe systems by applying encryption, tokenization, and access controls to prevent unauthorized access. This ensures that sensitive information on mainframes remains protected, even when these systems integrate with modern IT infrastructures. Mainframe Protectors help organizations secure legacy systems without disrupting operations.

Master Key

The master key (MK) is generated and stored in the key store when the key management is initialized, the key store is switched, or the active key is rotated. The MK protects all data encryption keys in the policy repository.

NIST 800-57

The National Institute of Standards and Technology (NIST) Special Publication 800-57 defines best practices and recommendations for key management.

Node

See Datastore.

Originator Usage Period

The originator usage period (OUP) is the period of time in the cryptoperiod of a symmetric key during which cryptographic protection may be applied to data.

Persistent Data Encryption

Persistent data encryption refers to the continuous encryption of data throughout its lifecycle, from creation and storage to transmission and processing. This ensures that sensitive data is never exposed in plain text, always maintaining security. Persistent encryption helps protect data from unauthorized access or breaches, even when it moves between different systems or environments.

PKCS#11 Interface

The Public-Key Cryptography Standard #11 (PKCS#11) interface is a standard API for communicating with cryptographic devices and key management systems.

Platform Logging Upgrade

The platform logging upgrade (PLUG) enhances the logging capabilities of a system. It provides more detailed and granular tracking of system activities, such as data access, encryption, and policy enforcement. Improved logging ensures that all critical operations are recorded for audit, monitoring, and compliance purposes. PLUG is particularly useful in environments where comprehensive logging is necessary for data security audits and regulatory reporting.

Policy Enforcement Point

A policy enforcement point (PEP) is a security component that ensures data protection policies, such as access controls and encryption, are applied when data is accessed or transmitted. It acts as a checkpoint, verifying that only authorized users can access sensitive information and that security policies are consistently enforced across systems.

Policy Management

Policy management refers to the process of creating, enforcing, and maintaining data protection policies across an organization’s data infrastructure. These policies dictate how sensitive data is handled, stored, and accessed. It ensures compliance with security standards and regulations. Policy management systems help automate and centralize the enforcement of security policies, simplifying compliance efforts.

Policy Repository

The policy repository is the internal storage in the ESA. It stores policy information including the master key properties and all data encryption keys properties.

Policy-Based Access Control

Policy-based access control (PBAC) is an access control model that uses predefined policies to determine who can access specific resources. These policies are typically defined by administrators and are based on organizational rules and regulations. Unlike attribute-based access control (ABAC), which relies on dynamic attributes, PBAC operates based on static policies that dictate access rules. PBAC provides a centralized way to enforce consistent access controls across an organization.

Privacy-Enhanced Analytics

Privacy-enhanced analytics refers to techniques that allow businesses to analyze sensitive data without compromising the privacy of individuals. This often involves anonymization, encryption, or tokenization of data. It enables insights to be drawn from data without revealing personally identifiable information (PII). These techniques help organizations maintain compliance with privacy regulations while still benefiting from data-driven insights.

Protegrity Centralized Logging

Protegrity Centralized Logging consolidates logs from various systems and environments into a single, unified platform. It provides businesses with a comprehensive view of the security operations. This enables better monitoring, auditing, and analysis of security events across the enterprise. Centralized logging is essential for detecting security incidents and ensuring that compliance with data protection policies is maintained.

Protegrity Cloud Security Gateway

Protegrity Cloud Security Gateway secures data as it moves between on-premise systems and cloud environments. It ensures that sensitive information is encrypted, tokenized, or otherwise protected during transmission. This technology allows businesses to adopt cloud services while maintaining control over their data security policies, preventing unauthorized access or data breaches during migration.

Protegrity Cloud Shield

Protegrity Cloud Shield is a security solution that protects sensitive data stored in cloud environments. It applies encryption, tokenization, and access controls to ensure that data remains secure while benefiting from the scalability and flexibility of the cloud. Cloud Shield allows organizations to take full advantage of cloud services without compromising data security or compliance with privacy regulations.

Protegrity Compliance Monitor

Protegrity Compliance Monitor is a tool that helps organizations continuously track and report on compliance with data protection regulations. It provides real-time visibility into how data protection policies are being applied. This helps alert businesses to potential compliance risks before they result in breaches or regulatory violations. It enables organizations to stay ahead of compliance requirements and maintain robust data security.

Protegrity Data Security Platform

The Protegrity Data Security Platform is a comprehensive solution that provides encryption, tokenization, and data masking to protect sensitive information across an organization’s entire data infrastructure. It enables businesses to apply consistent data protection measures to structured and unstructured data, ensuring that sensitive information remains secure throughout its lifecycle.

Protegrity Gateway Technology

The Protegrity Gateway Technology ensures that sensitive data is protected as it moves between different systems, platforms, or environments. This is achieved by applying security measures like encryption or tokenization during transmission. This technology ensures that data remains secure across hybrid or multi-cloud environments, providing seamless data protection during migration and communication between systems.

Protegrity Integration Layer

The Protegrity Integration Layer allows businesses to integrate Protegrity’s data protection technologies into existing IT environments. It enables seamless enforcement of encryption, tokenization, and data masking policies across different systems. This integration ensures that data protection measures are consistently applied, regardless of the underlying infrastructure.

Protegrity Masking Engine

The Protegrity Masking Engine dynamically masks sensitive data in real time. This ensures that unauthorized users can only see masked values while authorized users can access the original data. It provides continuous data protection, allowing businesses to maintain security while ensuring usability for authorized personnel.

Protegrity Policy Engine

The Protegrity Policy Engine automates the enforcement of data protection policies. It allows businesses to define and apply consistent encryption, tokenization, and masking rules across data environments. This engine ensures that sensitive data is protected according to corporate security policies and regulatory requirements, reducing the risk of data breaches and non-compliance.

Protegrity Privacy Enhancer

The Protegrity Privacy Enhancer is a tool that anonymizes or tokenizes sensitive data. It ensures that personal information is protected and allows businesses to analyze or share data for operational purposes. This tool ensures compliance with privacy regulations, such as GDPR, by protecting sensitive data without sacrificing its usability for analytics or reporting.

Protegrity Soft Hardware Security Module

The Protegrity Soft Hardware Security Module (HSM) is housed internally within the ESA. It is used to generate keys and to store the master key.

Real-Time Data Protection

Real-time data protection refers to the continuous protection of sensitive data as it is accessed, created, or modified. This ensures that data remains secure during all operations, such as processing, transmission, and storage, reducing the risk of data breaches. Real-time protection enables organizations to respond quickly to security threats, ensuring that sensitive data is always protected.

Recipient Usage Period

The recipient usage period (RUP) is the period of time within the cryptoperiod of a symmetric key during which the protected information is processed.

Reference

A reference is a pre-populated list of names and addresses containing the data dictionaries, such as common names, cities, postal codes, or states. By default, the list is pre-populated with Data Discovery to support commonly used data dictionaries.

Referential Data

See Reference.

Regulatory Compliance Reporting

Regulatory compliance reporting involves generating reports that demonstrate an organization’s adherence to data protection regulations such as GDPR, HIPAA, and PCI DSS. These reports track the application of data protection measures like encryption and tokenization. It helps businesses prove compliance during audits. Compliance reporting is essential for avoiding penalties and maintaining trust with customers and regulators.

Repository Key

The repository key (RK) is generated in the configured key store when the key management is initialized or the active key is rotated. It is protected by the master key. It protects the policy repository in ESA.

Role-Based Access Control

Role-based access control (RBAC) is a method of restricting access to sensitive data based on the role of the user within an organization. RBAC ensures that users are granted access only to the information necessary to perform their job duties. This minimizes the risk of unauthorized access or data breaches. It simplifies the management of user permissions by grouping access levels according to roles.
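A minimal role-to-permission mapping illustrates the model. The role and permission names here are hypothetical, chosen only to show the least-privilege grouping described above:

```python
# Each role is granted only the permissions its duties require (hypothetical roles).
ROLE_PERMISSIONS = {
    "analyst": {"read_masked"},
    "dba": {"read_masked", "read_clear"},
    "security_officer": {"read_masked", "read_clear", "manage_policy"},
}

def rbac_allow(role: str, permission: str) -> bool:
    """Access is decided entirely by the user's role, not by contextual attributes."""
    return permission in ROLE_PERMISSIONS.get(role, set())

print(rbac_allow("analyst", "read_masked"))  # True
print(rbac_allow("analyst", "read_clear"))   # False -- not needed for the analyst's duties
```

Administering permissions per role rather than per user is what keeps RBAC manageable as the organization grows.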

Search and Sort of Protected Data

The search and sort of protected data feature allows organizations to perform search and sort operations on encrypted or tokenized data without the need to decrypt it. This ensures that sensitive data remains secure during these operations, while still allowing businesses to access and organize data for operational or analytical purposes. This capability helps balance data security with usability in data management.

Secure Data Vaulting

Secure data vaulting is the practice of storing sensitive data in highly secure environments to protect it from unauthorized access. The secure environments are known as vaults. Data vaults are used to store encrypted or tokenized data, ensuring that even if the vault is compromised, the data remains inaccessible without the proper decryption keys. Vaulting is essential for the storage and protection of long-term sensitive data.

Secure File Transfer

A secure file transfer involves encrypting files during transmission. This ensures that sensitive information is protected from unauthorized access while being shared between systems or across networks. This process uses encryption and access controls to secure files, protecting them from interception or tampering during transfer.

Signing Key

A signing key is generated in the configured key store when the ESA is installed and key management is initialized. It is protected by the master key. This key is used to sign the audits generated by protectors. It is used by the protector to add a signature to the log records generated for each data protection operation. The record is signed and sent from the protector to the ESA. The signing key helps to identify that the log records have not been tampered with and are received from the required protection endpoint or protector.

Static Lookup Tokenization

Static lookup tokenization replaces sensitive data with tokens using a predefined lookup table. This ensures that the same data is always tokenized consistently. This method is useful for creating a stable and consistent tokenization process across multiple systems, particularly for audit and compliance purposes. It ensures that tokenized data can be cross-referenced across systems without exposing the original sensitive data.
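The lookup-table mechanism can be sketched as a small in-memory vault. This is a hedged toy, assuming random numeric tokens; a production vault would be a hardened, persistent store:

```python
import secrets
import string

class LookupTokenizer:
    """Toy vault-style tokenizer: one stable random token per distinct value."""

    def __init__(self) -> None:
        self._vault: dict[str, str] = {}     # value -> token
        self._reverse: dict[str, str] = {}   # token -> value

    def tokenize(self, value: str) -> str:
        if value not in self._vault:
            token = "".join(secrets.choice(string.digits) for _ in value)
            # Re-draw on the (rare) collision with an existing token or the value itself.
            while token in self._reverse or token == value:
                token = "".join(secrets.choice(string.digits) for _ in value)
            self._vault[value] = token
            self._reverse[token] = value
        return self._vault[value]

    def detokenize(self, token: str) -> str:
        return self._reverse[token]

t = LookupTokenizer()
token = t.tokenize("4111111111111111")
print(t.tokenize("4111111111111111") == token)   # True -- same value, same token
print(t.detokenize(token))                       # 4111111111111111
```

The stability of the mapping is what lets tokenized data be cross-referenced across systems without exposing the original value.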

Unicode Tokenization

Unicode tokenization is the process of replacing sensitive data with tokens while maintaining the integrity of multilingual or special character data. It typically uses the Unicode character set. This method ensures that sensitive data, such as international names or addresses, is protected while allowing applications that use different languages to function properly. Unicode tokenization supports the global use of tokenization in multilingual databases.

Vaultless Tokenization

Vaultless tokenization is a method of replacing sensitive data with tokens without the need for a centralized storage vault. The tokens are generated without the need for a lookup table. This enhances scalability and reduces the risk of a centralized point of failure. This method ensures that sensitive data remains protected while simplifying token management.
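The derive-rather-than-store idea can be sketched with a keyed HMAC. Note the hedge: this sketch is one-way, whereas production vaultless tokenization (including Protegrity's) uses reversible algorithms; the point here is only that the token is computed from the value and a secret key, with no lookup table to maintain:

```python
import hashlib
import hmac

def vaultless_token(value: str, key: bytes, length: int = 16) -> str:
    """Derive a deterministic numeric token from the value and a secret key; nothing is stored."""
    digest = hmac.new(key, value.encode(), hashlib.sha256).digest()
    return str(int.from_bytes(digest, "big") % 10**length).zfill(length)

key = b"tenant-key"
t1 = vaultless_token("4111111111111111", key)
print(t1 == vaultless_token("4111111111111111", key))  # True -- reproducible anywhere with the key
print(t1.isdigit(), len(t1))                           # True 16
```

Because any node holding the key can recompute the token independently, there is no central vault to scale, replicate, or compromise.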

Weight

See Boost.

Zero Trust Architecture

Zero trust architecture is a security framework that assumes no user or system should be trusted by default, whether inside or outside the organization’s network. Instead, access to sensitive data and systems is continuously verified through strong authentication, encryption, and access control measures. This model reduces the risk of insider threats and external attacks by requiring strict verification for every access attempt.


Last modified: July 21, 2025