
Manage data lifecycle

Objectives:

At the end of this episode, I will be able to:

Understand and apply the recommended guidance for managing the data
lifecycle in your daily practice as an information security professional.

External Resources:

Manage data lifecycle

1. Data roles (i.e., owners, controllers, custodians, processors, users/subjects)
2. Data collection
3. Data location
4. Data maintenance
5. Data retention
6. Data remanence
7. Data destruction

=====================

What is Data Lifecycle Management (DLM)? -

The process of managing information, following the life of data from the moment
it’s first created and stored, up to the time it gets archived or destroyed when
it stops being useful.

This ensures that data is managed in a consistent and responsible manner, no
matter its origin or purpose. All of this is achieved through automated policies
that are defined in advance, to minimize the potential for human error anywhere
in the process.
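The automated, pre-defined policies described above can be sketched as a simple age-based rule table. The stage names and day thresholds below are illustrative assumptions, not a standard:

```python
from datetime import date, timedelta

# Hypothetical policy: the age at which a record moves to the next lifecycle
# stage. Names and thresholds are examples only; real ones come from your
# organization's data management plan and applicable regulations.
POLICY = [
    (timedelta(days=0),    "active"),
    (timedelta(days=365),  "archive"),
    (timedelta(days=2555), "destroy"),   # ~7-year retention, as an example
]

def lifecycle_stage(created: date, today: date) -> str:
    """Return the stage a record belongs in, based purely on its age."""
    age = today - created
    stage = POLICY[0][1]
    for threshold, name in POLICY:
        if age >= threshold:
            stage = name
    return stage

print(lifecycle_stage(date(2024, 1, 1), date(2024, 6, 1)))   # "active"
```

Running such a rule table on a schedule, rather than relying on people to remember, is what removes the human-error element the notes mention.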

Data Lifecycle Management (DLM) vs Information Lifecycle Management (ILM) -

In a broad sense, data lifecycle management and information lifecycle
management (ILM) are the same thing. You cannot have an effective ILM strategy
without implementing a strong system for data management or DLM.

NOTE: Because the difference is just one of scope, any ILM system inherently
includes a data lifecycle management framework as its core.

Think of DLM as the set of governing principles that defines and automates the
stages of useful life, and determines prioritization. In simpler terms, data
lifecycle management is the catalyst that pushes data from one stage to the next,
from creation to deletion. It's a system designed to answer the question:

"When should this information be deleted?"

Information lifecycle management is even more nuanced. It seeks to answer the
question:

"Is this information relevant and accurate?"

DLM deals with entire files of data, while ILM is concerned with what’s in the file.

ILM seeks to ensure every piece of data included in a record is accurate and
up-to-date for the useful life of the record. In the context of ILM, even
metadata becomes particularly important.

DLM is not concerned with the individual pieces of data within a given record, just
with the record itself.
Under the principles of DLM, as a record passes through defined lifecycle stages, it
becomes more and more obsolete. As a result, speed and accessibility are no longer
prioritized for stale data.

NOTE: Both DLM and ILM should form critical aspects of an organization’s overall
data protection strategy.

The 5 Stages of Data Lifecycle Management + PLANNING -

Data Lifecycle Management is a process that helps organizations manage the
flow of data throughout its lifecycle – from initial creation through to destruction.

PLAN - BEFORE ANYTHING HAPPENS....

Create a data management plan. Prior to starting the lifecycle, it is important
to plan how data will be managed throughout the lifecycle.

1. Data Creation - The first phase of the data lifecycle is the creation/capture
of data. Data is typically created by an organization in one of 3 ways:

Data Acquisition: acquiring already existing data which has been produced
outside the organization

Data Entry: manual entry of new data by personnel within the organization

Data Capture: capture of data generated by devices used in various processes
in the organization

2. Storage - Once data has been created within the organization, it needs to
be stored and protected, with the appropriate level of security applied, which
includes classifying it for easy retrieval and determining access controls. For
example, files can be marked as "internal" or "external," which then determines
how tightly they must be protected.
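As a sketch of how a classification mark drives handling, the mapping below ties the "internal"/"external" labels from the example to hypothetical controls; the rule values are illustrative assumptions, not an organizational standard:

```python
# Illustrative mapping from a classification label to the handling rules it
# implies. In this example, "internal" files are kept encrypted and may not
# leave the organization, while "external" files are cleared for sharing.
HANDLING = {
    "internal": {"encrypt_at_rest": True,  "external_sharing": False},
    "external": {"encrypt_at_rest": False, "external_sharing": True},
}

def may_share_externally(label: str) -> bool:
    # Default-deny: an unknown or missing label gets the most
    # restrictive treatment rather than slipping through.
    return HANDLING.get(label, {"external_sharing": False})["external_sharing"]
```

The default-deny lookup is the important design choice: an unclassified file should never be treated as the least-protected class by accident.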

A robust backup and recovery process should also be implemented to ensure
retention of data during the lifecycle.

3. Usage - Data is used to support activities in the organization. Data can be
viewed, processed, modified and saved. An audit trail should be maintained for
all critical data to ensure that all modifications to data are fully traceable.
Data may also be made available to share with others outside the organization.

Activities include employing sufficient protections such as strong encryption
and physical server security, as well as continually validating and auditing
files to ensure accurate data.
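The audit-trail requirement above can be sketched as a hash-chained, append-only log, so that tampering with any past entry is detectable. Field names are illustrative, and a real system would also protect the log's own storage:

```python
import hashlib
import json
from datetime import datetime, timezone

# Append-only audit trail: each entry carries the hash of the previous entry,
# chaining the history together so edits to old entries break the chain.
trail = []

def record_change(user: str, record_id: str, action: str) -> None:
    prev_hash = trail[-1]["hash"] if trail else "0" * 64
    entry = {
        "ts": datetime.now(timezone.utc).isoformat(),
        "user": user,
        "record": record_id,
        "action": action,
        "prev": prev_hash,
    }
    entry["hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()
    ).hexdigest()
    trail.append(entry)

record_change("alice", "cust-42", "modified address")
record_change("bob", "cust-42", "viewed")
```

Verifying the chain end-to-end is how an auditor confirms that "all modifications to data are fully traceable."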

4. Archival - The copying of data to an environment where it is stored in case
it is needed again in an active production environment, and the removal of
this data from all active production environments.

NOTE: A data archive is simply a place where data is stored, but where no
maintenance or general usage occurs. If necessary, the data can be restored to
an environment where it can be used.

5. Destruction - Data destruction or purging is the removal of every copy of
a data item from an organization. It is typically done from an archive storage
location.
The challenge of this phase of the lifecycle is to ensure that the data has been
properly destroyed. Before destroying data, it is important to confirm that the
data items have exceeded their required regulatory retention period.
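The retention check described above can be sketched as a guard that destruction tooling must pass. The 7-year period is an example only, and the legal-hold flag is an added assumption (a common practice, though not mentioned in these notes):

```python
from datetime import date, timedelta

# Example retention period; the real one is dictated by the applicable
# regulation, not hard-coded like this.
RETENTION = timedelta(days=7 * 365)

def safe_to_destroy(created: date, today: date, legal_hold: bool = False) -> bool:
    # Never destroy data under legal hold, regardless of age, and never
    # before the required retention period has been exceeded.
    return not legal_hold and (today - created) > RETENTION
```

Destruction jobs should refuse to run unless this guard returns True, making "did we check retention?" a property of the tooling rather than of individual diligence.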

Having a clearly defined and documented data lifecycle management process is key
to ensuring Data Governance can be carried out effectively within your organization.

NOTE: A good DLM strategy offers redundancy that can ensure data stays safe in
the event of an emergency. It also helps to ensure that customer data is
safeguarded from being duplicated in different parts of a data infrastructure, where
security may be a concern.

Quality Control (QC) is an assessment of quality based on internal standards,
processes, and procedures established to control and monitor quality.

Quality Assurance (QA) is an assessment of quality based on standards external
to the process and involves reviewing the activities and quality control
processes to ensure final products meet predetermined standards of quality.

Collection limitation - there should be limits to the collection of personal
data, and any such data should be obtained by lawful and fair means and, where
appropriate, with the knowledge or consent of the data subject.

http://www.oecdprivacy.org/

The following are roles to know: (7)


1. Data Subject - an individual who is the subject of personal data

2. Data Owner - "Masters of all"; responsible for the classification of data and
holds legal rights and complete control over the data they create. Data owners
determine the data's impact on the mission of the organization; understand the
replacement cost of the information (if it can be replaced); determine who has a
need for the data and under what circumstances the data should be released; and
identify when data is inaccurate or no longer needed and should be destroyed.

3. Data Controller - Determines the purpose(s) for which and the manner in
which data is to be processed

4. Data Processor - "Managers of all"; they process the data on behalf of the
Data Controller and are SPECIFICALLY responsible for the following:

. Adherence to appropriate and relevant data policy and data ownership guidelines

. Ensuring accessibility to appropriate users, maintaining appropriate levels
of dataset security

. Dataset maintenance & documentation

. Assurance of quality and validation of any additions to a dataset, including
periodic audits to assure ongoing data integrity
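The "periodic audits to assure ongoing data integrity" duty can be sketched with checksums: record a hash when a dataset is accepted, then re-verify it on a schedule. The dataset contents here are placeholders:

```python
import hashlib

# Record a SHA-256 checksum when a dataset version is accepted, and compare
# against it during later audits; any silent change flips the result.
def checksum(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

baseline = checksum(b"customer dataset v1")   # stored at acceptance time

def audit(data: bytes, expected: str) -> bool:
    """Return True if the dataset still matches its accepted checksum."""
    return checksum(data) == expected
```

Legitimate changes go through change management and produce a new baseline; an audit mismatch with no corresponding change record is the signal to investigate.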

5. Data Custodians - responsible for the safe custody, transport, and storage of
the data & implementation of business rules. A data custodian ensures:

. Access to the data is authorized and controlled
. Data stewards are identified for each data set
. Technical processes sustain data integrity
. Processes exist for data quality issue resolution in partnership with Data
Stewards

6. Data Stewards - responsible for utilizing an organization's data governance
processes to ensure fitness of data elements - both the content and metadata.

Data stewards focus on processes, policies, guidelines & responsibilities for
administering an organization's entire data in compliance with policy and/or
regulatory obligations. Data stewards ensure:

. Technical controls to safeguard data
. Data added to data sets are consistent with the common data model
. Versions of Master Data are maintained along with the history of changes
. Change management practices are applied in maintenance of the database/dataset
. Data content & changes can be audited

7. Administrator - grants permissions / access to data

Data remanence - the residual representation of digital data that remains even
after attempts have been made to remove or erase the data

Sanitization is a process to render access to target data on the media infeasible
for a given level of recovery effort.
Clear, Purge, & Destroy are actions that can be taken to sanitize media. The
categories of sanitization are defined as follows:

• Clear - applies logical techniques to sanitize data in all user-addressable
storage locations for protection against simple non-invasive data recovery
techniques; typically applied through the standard Read & Write commands to the
storage device, such as by rewriting with a new value or using a menu option
to reset the device to the factory state.

• Purge - applies physical or logical techniques that render Target Data
recovery infeasible using state of the art laboratory techniques.

• Destroy - renders Target Data recovery infeasible using state of the art
laboratory techniques & results in the subsequent inability to use the media
for storage of data.

The security goal of the overwriting (clearing) process is to replace target
data with non-sensitive data, which would be acceptable when you are asked to
overwrite media to allow for reuse in an environment operating at the same
security level.

Simply deleting a file would actually be Erasing.

Destruction - The media is made unusable by conventional equipment

Techniques:

Overwriting - writing streams of 1's & 0's in a "random" pattern over the
media, one or more times
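A toy, single-file illustration of an overwrite pass follows. It assumes a plain file on a conventional filesystem; journaling filesystems and SSD wear-leveling can retain copies this code never touches, so real sanitization should follow NIST SP 800-88 guidance:

```python
import os

def overwrite_and_delete(path: str, passes: int = 1) -> None:
    """Overwrite a file's contents with random bytes, then unlink it.

    Illustrative only: this does not defeat filesystem journals, snapshots,
    backups, or SSD wear-leveling, all of which may hold older copies.
    """
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(os.urandom(size))   # one "random" pass over the data
            f.flush()
            os.fsync(f.fileno())        # push the pass to the device
    os.remove(path)
```

This corresponds to the Clear category above: logical techniques through standard write commands, defeating simple non-invasive recovery only.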

Degaussing - use of STRONG !!! magnetic field(s) to erase data

Encryption - securing data in place via use of algorithms (stronger is better)
AND long work factors (the time required to break the encryption)

Crypto-Shredding - #1 approach for cloud data security; encrypting the data,
then deliberately destroying the encryption keys (including any key-encrypting
keys) so the remaining ciphertext is unrecoverable
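Crypto-shredding can be demonstrated conceptually as below. The XOR keystream is a toy stand-in for a real cipher (a production system would use AES-GCM or similar from a vetted library); the point is only that destroying the key destroys access to the ciphertext:

```python
import secrets

def xor(data: bytes, key: bytes) -> bytes:
    """Toy repeating-key XOR -- NOT real encryption, for illustration only."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

key = secrets.token_bytes(32)
ciphertext = xor(b"sensitive cloud record", key)

# Normal use: holding the key lets us recover the plaintext.
assert xor(ciphertext, key) == b"sensitive cloud record"

# Crypto-shred: destroy every copy of the key. The ciphertext may still sit
# in cloud backups and replicas, but without the key it is inert.
key = None
```

This is why crypto-shredding suits the cloud: you cannot physically reach every replica a provider holds, but you can control (and destroy) a key you never gave them.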

Physical Destruction - rendering unusable through a variety of means
(shredding, pulverizing, etc..)

Chemical Alteration - Plasma, Thermite, Acid etching, etc...

Phase Shift / transition (Curie Temp) - heating magnetic media above its Curie
temperature, at which point it loses its magnetic properties and the stored data

SSD vs HDD - degaussing works on magnetic HDDs but not on SSDs; SSD
wear-leveling can leave copies that overwriting never touches, so crypto-shredding
or physical destruction is preferred for SSDs

Cloud data - encrypt data while in storage and in use ==> upon exit, crypto-shred
the remaining data
