
Emerging Technologies and Cybersecurity: How They Can Secure Your Data

Organizations usually struggle to remember the difference between “information” and “data.” Mix this with the limited attention span of security teams in an ever-evolving threat landscape, and we get a perfect storm capable of taking down even mature security teams with well-defined standard operating procedures! Jokes aside, organizations debating inside versus outside, networks and micro-segmentation, cloud versus on-premises, and other such topics tend to overlook that the whole organization is built around data and information, and that their confidentiality, integrity, and availability are paramount to its existence.

Organizations generate tons of data in day-to-day operations. To some, that data is the information needed to bring down the organization, gain insider knowledge, gain a competitive advantage, or the loophole they have long been waiting for to extract profit. Thus, it is equally essential for organizations to know their data and have policies to handle it, and to give this prized possession the attention it deserves and warrants, rather than depending on other security measures and settling for security-through-obscurity.

The previous statement should haunt organizations that still believe they can hide behind the obscurity present in their environment. The question they should answer is: are they being truthful about the situation? Usually not, yet they continue to procrastinate on the matter without so much as making an actual attempt to address the real questions at hand. To elaborate, the issues organizations should be most concerned about are:

  1. Who can create data/information, and who owns it?
  2. When data/information is created, who can access and update it?
  3. How is it protected against unauthorized access or disclosure, at rest, in motion, or while being processed?
  4. How is data/information protected from accidental alteration or deletion?
  5. How does the organization get rid of data/information that has reached its end-of-life and must be destroyed?

There are frameworks available that can help organizations derive their processes and operating procedures by carefully analyzing the organization’s goals, deliverables, data, suppliers, consumers, participants, tools, and techniques. Some of these frameworks have been around for over 20 years (the Strategic Alignment Model and the Amsterdam Information Model). They have been adopted and evolved by DAMA into the DAMA-DMBOK Framework, which can help organizations visualize the relationships between the different activities they must undertake to account for data and manage it effectively: security, integrity and interoperability, storage and operations, and so on.

But before the organization gets deep into these discussions, it should be able to define the importance of its data and assign a proper classification to it. Though data classification warrants only one line here, all the other activities would be rendered fruitless if the organization did not put in the time and effort to understand the types of data it is working on and generating. Teams must take the time to define data classifications, i.e., public, confidential, sensitive, personal, etc., which usually requires more manual intervention and patience than teams have the liberty of.

Issues with improper data classification get exacerbated once other factors come into play, such as the cost and attention required to secure non-critical data, security alert/incident fatigue, etc. There is always a balance organizations must strike without burdening the users of data or burning out the security teams and their budget. Even though the situation may seem bleak, the cybersecurity industry has many solutions that use Machine Learning, supervised and unsupervised, to help organizations sift through terabytes of data, structured and unstructured, to understand its nature and help teams move on with day-to-day activities. Such solutions often perform “Discovery,” which allows the tools to access metadata, and sometimes the data itself, to understand the type of data.

This information can then be used to classify data for owners or stewards and provide actionable intelligence. In turn, these solutions can help organizations drastically increase the speed of deployment of appropriate security measures based on the type and criticality of data.
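At its simplest, discovery-driven classification boils down to scanning content for known identifier patterns and mapping the hits to a label. The patterns and labels below are illustrative assumptions; commercial discovery tools ship far richer rule sets and ML models:

```python
import re

# Hypothetical detection patterns; real tools use many more, plus ML scoring.
PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d{4}[ -]?){3}\d{4}\b"),
}

def classify(text: str) -> str:
    """Return a coarse classification label based on detected identifiers."""
    hits = {name for name, pat in PATTERNS.items() if pat.search(text)}
    if {"ssn", "credit_card"} & hits:
        return "sensitive"
    if "email" in hits:
        return "confidential"
    return "public"

print(classify("Contact: jane@example.com"))          # confidential
print(classify("Card on file: 4111 1111 1111 1111"))  # sensitive
print(classify("Quarterly newsletter draft"))         # public
```

The output label can then feed the access-governance and encryption decisions discussed below, so the expensive controls land only on the data that warrants them.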

The day-to-day data-protection activities of cybersecurity teams can start with looking at data at rest: ensuring it is secured from unauthorized access and that it is not copied or edited by anyone other than those allowed in the first place. And even where access to data has been granted, it is reviewed periodically.

In industry, we call these solutions Data Access Governance solutions, and they are usually combined with Access Governance solutions to provide security teams with a comprehensive view, letting them find patterns of which applications and related data a member should have access to. Creating and using data-to-process and data-to-role relationship (CRUDE: Create, Read, Update, Delete, Execute) matrices also helps organizations and security teams map data access needs and guide the definition of data security role groups, parameters, and permissions.
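A data-to-role CRUDE matrix is, mechanically, just a lookup from (role, dataset) to a set of permitted verbs. A minimal sketch, with assumed role and dataset names:

```python
# Data-to-role CRUDE matrix: C=create, R=read, U=update, D=delete, E=execute.
# Role and dataset names are illustrative assumptions.
CRUDE_MATRIX = {
    ("helpdesk", "customer_records"): "R",
    ("billing", "customer_records"): "RU",
    ("dba", "customer_records"): "CRUDE",
    ("analyst", "sales_reports"): "RE",
}

def is_allowed(role: str, dataset: str, action: str) -> bool:
    """Check whether a role holds the given CRUDE permission on a dataset."""
    return action.upper() in CRUDE_MATRIX.get((role, dataset), "")

print(is_allowed("helpdesk", "customer_records", "R"))  # True
print(is_allowed("helpdesk", "customer_records", "D"))  # False
```

In practice the matrix lives in an access-governance tool rather than code, but the shape is the same: every (role, dataset) pair without an entry defaults to no access.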

But again, these activities depend on the metadata the teams and solutions have at their disposal while working with the data. The importance of these aspects must jump from formal documentation to the actual implementation and usage of tagging mechanisms. With ever-evolving solutions for data management, these activities can also be automated with the help of data profiling solutions and metadata repositories. Data profiling solutions can work on unstructured data, help explore data content, validate it against existing metadata, and identify data quality gaps and deficiencies. Metadata repositories can store descriptive information about the data model, including data diagrams, definitions, lineage, and metadata imported from other tools and processes.

If this seems too much to handle, cybersecurity teams have tools to get the job done rather than depending on documents to keep track of these complex interactions. These tools often give data architects a unified view of the interactions and integrations of data with roles and system applications. They also provide security architects with much-needed clarity on the vulnerable surfaces, so they can devise mechanisms to secure those surfaces. Security-related metadata becomes a strategic asset to the teams, increasing the quality of transactions, reporting, and business analysis while reducing the cost of protection and the risks that lost or stolen information could pose to the organization.

For data at rest or in transit, data protection solutions have existed for a while now. They range from encryption to offsite data stores, secure tunnels, and packet sniffing tools. Enterprises can easily use OS-based or native storage hardware-supported encryption of the drives on endpoints. These mechanisms ensure that if endpoints are lost or stolen, the data is rendered useless to unintended people trying to access them. This can be extended to servers, application servers, databases, etc., to help organizations attain peace of mind if the physical infrastructure or cloud instances were attacked. However, these mechanisms carry CPU penalties or are costly because of the use of hardware-enabled encryption modules. These trade-offs are essential for the organization to keep account of, as not all data are created equal, and the funds allocated for data protection must be spent judiciously!

A similar example goes for data in transit. Secure tunnels (VPNs) have been around for a long time and still serve their purpose under the “work-from-home” policies that got all the limelight in recent years. However, Zero Trust architectures, if implemented correctly, can help organizations move away from the need for such secure tunnels by using role-based and permission-based Data Access Management and Governance. Combined with active firewalls and packet-sniffing tools, these increase the visibility of security organizations and reduce the surface area available to attackers. In recent years, with the mainstream usage of the cloud, many organizations have the option of access brokers that help security teams gain better control over data and provide much-needed visibility, with similar benefits across hybrid environments.

To handle sensitive employee and client data, cybersecurity teams can also use data obfuscation or masking solutions. These solutions shuffle or change the appearance of data without losing its meaning or the relationships it has to other data sets or systems, while retaining the capability to reproduce the original data. These tools change the appearance of data to the end user without changing the underlying data, which means they can mask data in transit, at rest, and in use (dynamically). One use case for these tools is displaying information to members who are not permitted to view the complete data but need partial access due to the nature of their job (e.g., helpdesk). Once again: just enough privilege for users to do their jobs, but not enough to cause any damage to the organization.
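The helpdesk scenario above usually comes down to showing only the last few characters of a value while preserving its length. A minimal dynamic-masking sketch (real masking products also handle format preservation, referential consistency across systems, and reversibility):

```python
def mask(value: str, visible: int = 4, mask_char: str = "*") -> str:
    """Mask all but the last `visible` characters, preserving length.

    Values shorter than `visible` are masked entirely so nothing leaks.
    """
    if visible <= 0 or len(value) <= visible:
        return mask_char * len(value)
    return mask_char * (len(value) - visible) + value[-visible:]

# What a helpdesk agent would see instead of the raw card number:
print(mask("4111111111111111"))             # ************1111
print(mask("jane@example.com", visible=0))  # ****************
```

Because the masking happens at display time, the underlying record is untouched and a role with higher privileges still sees the full value.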

At this stage, we must also look at data at rest from a different perspective. For a long time, we have depended on encryption to take care of data at rest. However, our secret weapon has been turned against us by attackers for a couple of years now, in the name of “ransomware”! It was a wake-up call for security leaders, who must now consider solutions long used by military organizations, like data diodes, one-way data networks, and off-site data stores.

These ideas are no longer the stuff of Hollywood movies; they have become regulatory requirements for banking and financial services, and should, sooner rather than later, be taken up by other sectors like healthcare, manufacturing, oil and gas, etc., to make disruption of their services by attackers a whole lot less life-threatening. The criticality of data and IT infrastructure in these domains deserves serious attention, and leaders, both internal and external, should pave the way by incentivizing the implementation of critical security standards and solutions to take back control from ransomware attackers, rather than depending on legal teams and insurance providers. With these fundamentals out of the way, let us see how an organization can take care of its data and embed these practices in its corporate culture:

  1. Data Security Requirements: Organizations must define their business requirements clearly, allowing security teams to architect the environment efficiently. Organizations must also be aware of the external regulatory restrictions that apply to them, analyze business rules and processes to identify security surfaces, and keep the documents that track data-to-process and data-to-role relationships updated. Organizations must create a central inventory of all relevant data regulations and the data subject area each regulation affects. Regulations, policies, required actions, and data will change over time; thus, teams should keep these documents as live and up to date as possible.
  2. Data Security Policies and Standards: Organizations should create data security policies based on business and regulatory requirements. Security policies describe the actions determined by the security requirements and help security teams define the step-by-step procedures that govern response and recovery from an incident. These policies are vital pieces of the puzzle, as they shape the team’s behaviour and provide actionable objectives for the security requirement documentation. Standards supplement policies and provide additional detail on how to meet their intention. In the case of data security, they are often used to determine the following:
    • Data Confidentiality Levels
    • Data Regulatory Categories
    • Security Role Definitions
    • Implementation of Controls and Procedures
  3. Security Awareness Standards: All the policies and standards are only helpful if members of the organization are aware of, and follow, the guidelines set by the security teams. Thus, it is of utmost importance that all of the organization’s members are made aware of the policies. This can be implemented through appropriate training exercises that every member must undertake. Formal or anonymous feedback surveys and interviews can help gauge the organization’s security awareness level, as can measuring the number of members who have completed security awareness training within targeted role populations. Risk assessment findings can also enhance these reports, providing qualitative data that should be fed back to the appropriate business units to make them more aware of their accountability for the data they manage.

And when the dust is just about to settle, teams must pull up their socks and work on the policies governing data retention and deletion, which are also critical for data security. This alone can keep the security team from burning out on managing and handling data that has lived past its usefulness to the organization.

There is no standardized, “one size fits all” retention period for data. For some organizations, regulatory requirements mandate keeping information for a certain amount of time. In other instances, there may be no such requirements, and the organization needs to determine the appropriate retention period itself, which can require case-by-case evaluations.
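Whatever periods the organization lands on, enforcing them is a simple date comparison once retention is recorded per data category. A sketch with an assumed retention schedule (the actual periods must come from regulation and policy, not from code):

```python
from datetime import date, timedelta

# Hypothetical retention schedule, in days; real values are policy-driven.
RETENTION_DAYS = {
    "financial_transactions": 7 * 365,  # e.g. statutory bookkeeping rules
    "access_logs": 90,
    "marketing_leads": 365,
}

def is_past_retention(category: str, created: date, today: date) -> bool:
    """True if a record has outlived its retention period and is due for secure deletion."""
    return today - created > timedelta(days=RETENTION_DAYS[category])

print(is_past_retention("access_logs", date(2023, 1, 1), date(2023, 6, 1)))  # True
print(is_past_retention("access_logs", date(2023, 5, 1), date(2023, 6, 1)))  # False
```

Running such a check on a schedule turns the retention policy from a document into an operational control that feeds the secure-deletion step below.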

Once data has reached the end of its retention period, it must be deleted securely. While the chosen disposal method depends significantly on the type of media used to store the information, an organization must also consider the information’s sensitivity and context.

Related to sensitivity, organizations must determine whether the storage media will remain within their control. A more robust disposal method should be used if the storage media are leaving their control. In current cloud infrastructure scenarios, teams must ensure the cryptographic destruction of the data storage or of the VM containing the data.
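Cryptographic destruction (crypto-shredding) rests on one idea: if data is only ever stored encrypted, destroying the key destroys the data, regardless of where copies of the ciphertext linger. The sketch below illustrates the principle with a toy SHA-256 counter-mode stream cipher; it is not a vetted cipher, and real systems use standards such as AES-GCM with keys held in a KMS or HSM:

```python
import hashlib
import secrets

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """Toy counter-mode stream cipher built on SHA-256 (illustration only)."""
    out = bytearray()
    for offset in range(0, len(data), 32):
        # Derive a 32-byte keystream block from the key and block counter.
        pad = hashlib.sha256(key + offset.to_bytes(8, "big")).digest()
        chunk = data[offset:offset + 32]
        out.extend(b ^ p for b, p in zip(chunk, pad))
    return bytes(out)

key = secrets.token_bytes(32)
plaintext = b"customer PII destined for cloud storage"
ciphertext = keystream_xor(key, plaintext)

# The same operation decrypts (XOR is its own inverse).
assert keystream_xor(key, ciphertext) == plaintext

# Crypto-shredding: once `key` is destroyed, the ciphertext is unrecoverable,
# so deleting the key is equivalent to destroying every copy of the data.
del key
```

This is why cloud providers expose key deletion as a first-class operation: the organization never has to locate and wipe every replica, snapshot, and backup, only the key.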

All the care and attention that data gets in an organization is proportional to the value it adds. In a world where data is the new oil, organizations must take due care to ensure that their competitive advantage does not become their competition’s prized possession, or worse, get held by attackers for ransom, leaving them to make a run to the bank and still depend on the attackers’ mercy.

These scenarios have been happening at a rate no one expected. But these incidents do allow us, once again, to highlight the importance of data security and make security practices part of organizations’ DNA. With this in mind, and considering that the constant evolution of technologies has led to an inevitable evolution of cyber-attacks and malicious acts, organizations need to act promptly and responsibly to keep their data, and that of their consumers, safe.
