
August 26, 2022


Six best practices towards building a compliant and secure hybrid-cloud infrastructure in the Financial Services Industry

The financial services industry is a constantly evolving space that continually innovates to keep pace with changing customer demands, market volatility, and technology disruption. One such evolution is the adoption of cloud computing. The Financial Services Industry (FSI) has been on the cloud journey for more than a decade; however, adoption soared during the pandemic and has been progressing at an incredible rate.

A popular strategy has been deploying a hybrid cloud at the enterprise level, given the financial services industry's dependency on legacy systems and on-premises data centers, and its concerns around compliance and security. According to IDC's 2020 CloudPath survey, 89% of banks reported that they are currently operating with, or planning to operate with, a hybrid cloud solution.

At the outset, it seems that hybrid cloud is here to stay, with several leading finance and banking organizations in the Fortune 100 increasingly adopting this strategy. To name a few, Bank of America launched a hybrid cloud with IBM, while Banco Santander partnered with Microsoft Azure to drive its hybrid cloud strategy.


Six best practices towards building a compliant and secure hybrid cloud infrastructure

Organizations are increasingly adopting the cloud, be it private, public, or hybrid. The first and foremost step in that journey is migrating on-premises data into the cloud. Due to uncontrolled data sprawl and the growth of unstructured data, the cloud migration journey becomes complex and stressful for most organizations. Data soon becomes a liability, with security, risk, and compliance concerns. What's the best way for organizations to migrate enterprise data to the cloud while reducing risks and costs in the face of so much data?

  1. Discover and index: The first step is to discover and tag unstructured data containing sensitive or private information. Analyze the file content of your unstructured data for personally identifiable information (PII), protected health information (PHI), or business-sensitive data. Then define risk profiles and file classifications using intelligent tagging. By combining risk identification with data classification, organizations can understand the risk that exists and quantify it easily. Artificial intelligence and machine learning can be used to turn data that merely exists into aligned, refined data that reduces risk and delivers business value.
  2. Ensure remediation of potentially sensitive data: Remediation is more than simply scanning and analyzing your environment for data that could expose your employees and customers to risk; it allows you to resolve those issues. Scan your systems for data that could expose customers and the business to risk, then use the right remediation tools to mitigate the risk of sensitive-data access and misuse and protect the business from adverse effects. Organizations can mitigate this risk through a robust, multi-approver remediation workflow that provides complete visibility into the request, approval, and execution phases of the remediation cycle.
  3. Quarantine sensitive data: When files contain personal or business-sensitive information and are accessible to a multitude of users, exposure is created and the risk of rogue usage increases exponentially. Quarantine provides the ability to move files to a specified location and isolate them so that no one is allowed access. The air gap provided by quarantine, with no means to access those files, helps prevent ransomware attacks on critical files while providing immediate protection. The key is to move sensitive data to a more secure location, such as an object-storage bucket. A provision for moving sensitive files from one file share to another, or from a file share to an object-store location, can make the process smoother and more flexible.
  4. Leverage intelligent re-permissioning: File permissions are usually assigned at the time of creation, based on the location and the storage specifications. Over the years, new people join the organization, and there is a risk of granting file access to unauthorized persons in the process. Using intelligent file re-permissioning, enterprises can apply access-based controls that ensure a consistent means of managing file permissions and help with risk mitigation.
  5. Maintain an immutable audit trail: Classify and track files to create an immutable audit report that can be used for regulatory and internal data governance. Blockchain technology can help: a combination of off-chain and blockchain technologies can be used so that references to any personally identifiable data can be erased when required. Whenever an audited file is modified, the change can be added to the blockchain, and stakeholders can then view a report of all audited changes for the dataset as a whole or for a specific file, as needed. This gives visibility into any changes made to PII in the scanned files, with details of users' modifications, updates, and deletions. Immutable audit reporting is the foundation on which enterprises can build secure file sharing across business units or even outside the enterprise. It empowers the enterprise to proactively mitigate risk, provide scalable security remediation, and generate immutable reports for validation.
  6. Intelligent identification and actionable reporting: This step involves identifying and classifying data sets containing files whose content fields are associated with regulatory requirements. Start from the premise of supporting the strictest definition of privacy, and identify the data set with the highest probability of adherence. Enterprises can use ready-made compliance templates or create their own based on their specific requirements. This identification is the first critical step toward remediating and meeting regulatory requirements. Another common regulatory obligation is reporting a consumer's data back to them, i.e., "tell me what data you have about me." Doing this across the unstructured data sprawl of enterprise environments is not easy, and most enterprises deploy point solutions that scan the environment for this single use case and produce static reports without any actionability. This adds technology overhead and costs millions of dollars without fully meeting regulators' requirements. Actionable reporting is the key: it allows data custodians to decide what data to store, how to store it, and what level of access and use is appropriate, with the individual's consent. This functionality helps meet the challenge posed by the massive volumes of personal digital footprints created by the digital revolution and the Internet of Things.
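The discover-and-index idea in step 1 can be sketched in a few lines of Python. The detection patterns, thresholds, and tag names below are illustrative assumptions, not any specific product's rules; a real deployment would use a vetted PII-detection library and locale-specific patterns.

```python
import re
from pathlib import Path

# Illustrative detection patterns; real scanners use far richer rule sets.
PII_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "us_ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def scan_file(path: Path) -> dict:
    """Count PII matches per category in one text file."""
    text = path.read_text(errors="ignore")
    return {name: len(pat.findall(text)) for name, pat in PII_PATTERNS.items()}

def classify(counts: dict) -> str:
    """Map match counts to a coarse risk tag (thresholds are assumptions)."""
    total = sum(counts.values())
    if total == 0:
        return "public"
    return "restricted" if total > 10 else "internal"
```

Combining the per-category counts with a classification tag is what lets an organization both see where risk lives and quantify it before migration.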
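The quarantine described in step 3 can be approximated on a POSIX file system by moving a flagged file into an isolated directory and stripping all access bits. The function name and directory layout here are hypothetical, and a production system would quarantine into separate, access-controlled storage rather than a sibling folder.

```python
import shutil
from pathlib import Path

def quarantine(file_path: Path, quarantine_dir: Path) -> Path:
    """Move a flagged file into an isolated quarantine directory and strip
    all access bits, approximating the 'air gap' described above."""
    quarantine_dir.mkdir(parents=True, exist_ok=True)
    target = quarantine_dir / file_path.name
    shutil.move(str(file_path), str(target))
    # Mode 000: no read/write/execute for anyone; the owner can chmod it
    # back during a controlled release from quarantine.
    target.chmod(0o000)
    return target
```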
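Step 4's re-permissioning can be sketched as reconciling each file's mode against a classification-based policy. The policy table and tag names below are assumptions for illustration; enterprise tooling would reconcile ACLs and group memberships, not just POSIX modes.

```python
from pathlib import Path

# Hypothetical policy: classification tag -> POSIX mode the file should carry.
POLICY = {"public": 0o644, "internal": 0o640, "restricted": 0o600}

def repermission(path: Path, tag: str) -> bool:
    """Reset a file's mode to match its classification tag.
    Returns True if the mode had drifted and was corrected."""
    wanted = POLICY[tag]
    if (path.stat().st_mode & 0o777) != wanted:
        path.chmod(wanted)
        return True
    return False
```

Running such a reconciliation on a schedule is one way to keep permissions consistent as people join and leave the organization.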
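The immutable audit trail in step 5 can be illustrated with a simple hash chain, where each entry commits to the previous entry's hash so that any rewriting of history is detectable. This is a sketch of the underlying idea, not a production blockchain; real deployments would anchor the chain in a distributed ledger and keep PII off-chain.

```python
import hashlib
import json

class AuditTrail:
    """Hash-chained audit log: each entry commits to the previous entry's
    hash, so tampering with any recorded change breaks verification."""

    def __init__(self):
        self.entries = []

    def record(self, file_id: str, action: str, user: str) -> dict:
        prev_hash = self.entries[-1]["hash"] if self.entries else "0" * 64
        entry = {"file_id": file_id, "action": action,
                 "user": user, "prev_hash": prev_hash}
        entry["hash"] = self._digest(entry)
        self.entries.append(entry)
        return entry

    def verify(self) -> bool:
        """Recompute every hash; False means history was tampered with."""
        prev = "0" * 64
        for e in self.entries:
            body = {k: v for k, v in e.items() if k != "hash"}
            if e["prev_hash"] != prev or e["hash"] != self._digest(body):
                return False
            prev = e["hash"]
        return True

    @staticmethod
    def _digest(body: dict) -> str:
        return hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
```

Stakeholders can then report over `entries` for the dataset as a whole or filter by `file_id` for a specific file, as the step describes.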



