To say 2020 was a strange year would be an understatement. Among the effects of the global pandemic were major changes in where and how people work and accelerated commitments to organizational digital transformation. Cyber attackers weren’t shy about taking advantage of the blind spots these shifts created. Fintech News reports that more than 80 percent of firms saw an increase in cyberattacks in 2020, while Arkose Labs found that cyber fraud jumped 20 percent, reaching 445 million attacks. Meanwhile, the average cost of a data breach reached an estimated $3.9 million in 2019, and the average time to identify a breach rose to seven months. All of this raises the question: have we become numb to data breaches? Are corporate boards now simply pricing breaches into their risk models? And on the Street, are investors doing the same, or is there still room for data breaches to surprise the cybersecurity community?
Even in the best case, nobody knows whether their entire environment is secure all the time. They have to assume some degree of compromise – especially now, when so many organizations have implemented “Bring Your Own Device” policies, introduced multi-cloud and other new environments, and absorbed additional environments through acquisitions. This puts the data estate we’re managing into a constant state of flux and uncertainty.
As we begin 2021, cybersecurity professionals need to recalibrate and reprioritize their organizations’ security requirements. Here are the imperatives based on current trends.
Shifting to a zero trust model
Ultimately, the industry is shifting toward a zero trust model centered on where data resides and on the identity of users. In this model, the telemetry of tracing where users go, where they’re coming from, and how they’re interfacing and interacting with data becomes the new foundation for securing the assets in your environment. We can’t just assume that “if we protect the edge, we’re going to be safe” because there is no edge anymore – the corporate network and perimeter are dissipating.
Get in front of security requirements for cloud-based assets
As organizations continue to migrate to and adopt newer cloud-native applications and CSP environments, we can expect more breaches in 2021. Most organizations don’t have a long-standing, solid security posture for these environments, and their newness and the lack of experience with them is going to create some exposure. From an operational perspective, organizations can avoid the hassle of maintaining instances – EC2, databases, or otherwise – by breaking out parts of services and applications to be hosted in these environments as part of their infrastructure. The downside of the “[blank] as a service” approach is the introduction of risk, as we saw with recent data breaches in supply-chain and third-party services. These breaches really illustrate the need for security people to help their business leaders understand the risks of these environments and take steps to help manage them.
Making security the responsibility of everybody in the organization can help mitigate and stop data breaches in the future. One way organizations are taking this on is through Cloud/API Centers of Excellence – cross-functional “think tanks” within an organization whose members develop patterns and practices for weaving security into the software development life cycle.
Create security for a perpetual “state of hybrid”
We’re seeing a perpetual state of hybrid that will continue in 2021, because there are some things that organizations cannot move out of the data center, such as the mainframe. Organizations can keep critical apps in a private cloud, run other standard apps in a public cloud, and have an SD-WAN or some other kind of private connection between them. As an industry, we’ve shifted to a new level of complexity in the way application networks are designed and deployed. We’re going to see more re-architecture and replatforming of the applications themselves and a departure from data center-native design patterns. Organizations will take greater advantage of cloud-native services. Rather than doing a “lift and shift” of applications they don’t want to host in their own data centers and simply parking them somewhere in the same estate, the thinking is “let’s redesign and get the benefits and optimization of whatever the cloud has to offer” – leveraging cloud database services in the way they’re intended to benefit the organization.
One of the trends related to creating security in this state of hybrid is “collect everything,” of which data lakes are a significant part. Organizations recognize a competitive advantage in being able to run predictive analysis of user behavior on whatever is in the environment. Take a food delivery service as an example – using data, the service can predict that a food order will be delivered in x minutes, and this intelligence is a competitive advantage over another service. The collection, storage, and analysis of all this activity data is happening across the board and is now becoming a standard or even a requirement to be competitive. This means there’s now much more data that organizations need to secure – something for which they probably didn’t previously have a plan or strategy.
Securing and supporting data lakes in 2021
A traditional data warehouse environment is generally contained on a single, concrete server. Historically, it was an instance to which you could apply ACLs or on which you could run instructions to create features like access control. A data lake is different. You’re connecting as you would with a data warehouse, but behind it are massively distributed sets of file stores across many servers, covering multiple S3 buckets. There are many different query engines able to access that data, and many ways to ingest, run, and process it and for developers and analysts to share information. Securing data lakes needs to be done in the context of this threat landscape.
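To make the contrast concrete, here is a minimal sketch of why distributed stores multiply the control points: a single logical dataset can span several independently governed stores, so one “can this user read this?” question becomes a check against every store the dataset touches. All names, ACLs, and datasets below are invented for illustration.

```python
# Per-store ACLs: store name -> set of principals allowed to read.
# In a real data lake, each of these would be an S3 bucket policy,
# a query-engine grant, or similar, managed independently.
STORE_ACLS = {
    "raw-events-bucket":     {"analyst", "etl-service"},
    "curated-orders-bucket": {"analyst", "bi-service"},
    "ml-features-bucket":    {"ml-service"},
}

# A logical dataset maps onto several physical stores
DATASET_STORES = {
    "order-analytics": ["raw-events-bucket", "curated-orders-bucket"],
}

def can_read(user: str, dataset: str) -> bool:
    """A user can read a dataset only if every underlying store allows it."""
    return all(user in STORE_ACLS[store] for store in DATASET_STORES[dataset])

print(can_read("analyst", "order-analytics"))     # True: allowed on both stores
print(can_read("ml-service", "order-analytics"))  # False: not granted on either store
```

The point of the sketch is the `all(...)`: in a warehouse there is one place to enforce access; in a lake the effective permission is the intersection of many independent controls, each of which can drift.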
Security teams need to work with business units to develop a design pattern that ensures data lakes are deployed in a manner that can be effectively secured. There will be a learning curve, because most security teams are short-staffed and not yet up to speed on cloud technologies or on deploying them in production without introducing additional risk. They’re between a rock and a hard place – needing the benefits of the data lake but unable to support the technology at scale.

There will continue to be a democratization of the data lake in 2021. If you look at why businesses are building data lakes in the first place, it’s to realize the promise of rich analytics and machine learning algorithms and to make their own business data useful in delivering new products and services to the market. Think about an organization like Uber Eats as an example. To get these business benefits, organizations must grant access to this data to thousands of users. Since the data lake contains highly distributed data, the security challenge is in managing and controlling so many independent services – and, more importantly, in deciding which services to be concerned about and which not. Security in the data lake is not just about user authentication and authorization; organizations need to know what users are doing with the data and why. Organizations will continue to look for solutions and best practices because auditors are asking these same questions, and we’ll likely see greater enforcement around these issues.
New strategies for overcoming cyber threat fatigue
Cyber threat fatigue will remain a factor in 2021 as organizations struggle to keep pace with a changing threat landscape. The key to managing this is to first identify where the risk is. Filter out events that have no risk associated with them. Refine your approach using telemetry to determine which users are accessing what data – then create a refined set of events that lets your insights and analytics platforms produce a dashboard that tells you “this is an event that matters, and here is all the other information that goes along with it.”
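The filter-then-enrich step described above can be sketched in a few lines. This is an illustrative toy, not a real SIEM schema: the event fields, the set of risky actions, and the dashboard shape are all assumptions.

```python
# Actions we (hypothetically) treat as carrying risk; everything else is noise
RISKY_ACTIONS = {"bulk_export", "privilege_change", "delete"}

def triage(events):
    """Drop no-risk events, then enrich the survivors with the context
    an analyst needs to decide whether the event matters."""
    dashboard = []
    for event in events:
        if event["action"] not in RISKY_ACTIONS:
            continue  # no risk associated -> filter out, reduce fatigue
        dashboard.append({
            "summary": f'{event["user"]} performed {event["action"]} on {event["resource"]}',
            "user": event["user"],
            "source_ip": event.get("source_ip", "unknown"),
            "resource": event["resource"],
        })
    return dashboard

events = [
    {"user": "alice", "action": "read", "resource": "orders"},
    {"user": "bob", "action": "bulk_export", "resource": "customers",
     "source_ip": "203.0.113.7"},
]
print(triage(events))  # only bob's bulk_export survives triage
```

In practice the “risky” set would come from telemetry and analytics rather than a hard-coded list, but the shape is the same: discard early, enrich what remains.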
The business risk approach to protecting networks and endpoints
What will organizations need to do to protect their networks and endpoints better in 2021? Start thinking from a business risk perspective. Who are the individuals in your organization apt to engage in the riskiest behavior? Assign different risk profiles to different people in the organization. For example, traveling salespeople are more likely to have access to customer information but less likely to have access to other types of PII. The highest-risk people in the organization are your systems administrators – people with highly privileged access to information. You need to prioritize who needs to be monitored most closely, who warrants the most scrutiny, and who requires fewer restrictions or requirements. There is no definitive way to build that out; you have to know the internal workings of your organization to make these calls correctly. To improve the prioritization of individual risk, security teams must partner with the business units to better understand the business purpose of what people are actually doing.
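A minimal sketch of what role-based risk profiles might look like in practice follows. The roles, tiers, and monitoring levels are illustrative placeholders – as noted above, the real values have to come from knowing your own organization.

```python
# Hypothetical role-to-risk mapping, per the examples in the text:
# sysadmins carry the highest risk, traveling salespeople sit in between.
RISK_PROFILES = {
    "sysadmin":    {"risk": "high",   "monitoring": "continuous"},  # privileged access
    "salesperson": {"risk": "medium", "monitoring": "standard"},    # customer data, travels
    "staff":       {"risk": "low",    "monitoring": "baseline"},
}

def monitoring_level(role: str) -> str:
    """Return the monitoring tier for a role, failing closed:
    an unknown role gets the strictest tier, not the loosest."""
    return RISK_PROFILES.get(role, RISK_PROFILES["sysadmin"])["monitoring"]

print(monitoring_level("sysadmin"))  # continuous
print(monitoring_level("intern"))    # unknown role -> continuous (fail closed)
```

The fail-closed default is the design choice worth noting: when you have not yet classified someone, it is safer to over-monitor than to under-monitor.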
Better security management during the rapid adoption of microservices
What are the threat models around microservices? What do organizations need to think about as they adopt these frameworks in the year to come? They need to consider more rapid release cycles and the ability to scale just part of an application, to name a few. Another factor is that serverless technology for microservices can break an application down into small functions, and those functions can live anywhere. Even the largest and slowest-to-adopt organizations are moving toward newer cloud-native technologies to reap the optimization, cost reduction, and other benefits they offer. To secure these environments, you need to think about how microservices interact. Historically, security teams would put security at the ingress of a network. Today they need to think about inter-service communication between these different microservices. There’s no longer a single point of ingress to the network, and a common problem is controlling the interactions between users and networks. The sheer number of development accounts in most organizations creates risk because they can all be used in an unsecured manner. Many born-in-the-cloud companies are trying to address API security to manage the interaction between microservices but, to date, there is no good standard for this across the board.
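The shift from edge security to inter-service security can be illustrated with a toy sketch: each caller signs its requests, and the callee verifies before acting, rather than trusting anything that arrived over the internal network. The scheme and names here are invented for illustration – real deployments typically use mTLS or signed tokens (e.g. JWTs) via a service mesh or API gateway.

```python
import hashlib
import hmac

# Demo-only shared secrets per calling service (a real system would use a
# secrets manager and rotate these)
SERVICE_SECRETS = {"orders-service": b"demo-secret-not-for-production"}

def sign(service: str, payload: bytes) -> str:
    """Caller side: sign the request payload with the service's secret."""
    return hmac.new(SERVICE_SECRETS[service], payload, hashlib.sha256).hexdigest()

def verify(service: str, payload: bytes, signature: str) -> bool:
    """Callee side: reject unknown callers and bad signatures.
    The network being 'internal' grants no trust by itself."""
    if service not in SERVICE_SECRETS:
        return False
    expected = sign(service, payload)
    return hmac.compare_digest(expected, signature)  # constant-time comparison

payload = b'{"order_id": 42}'
sig = sign("orders-service", payload)
print(verify("orders-service", payload, sig))        # True: authenticated caller
print(verify("orders-service", payload, "bad-sig"))  # False: rejected
```

The essential change from the perimeter model is that verification happens on every hop between services, not once at a single point of ingress.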
We’ve traded the relative simplicity of the monolithic data center for the ability to distribute data across a network. As we continue the shift to cloud and a microservices design model, we need to recognize more control points and more focal points and determine which security architecture needs to come first. This was a big challenge when there was just a data center; now we need to discover and understand cloud-native microservices environments and try to secure them before we deploy them, rather than bolting security on last.
Insider threat analytics and threat detection algorithms
There have always been malicious users, insiders, and careless people who introduce risk into the environment. Understanding who those people are in your environment – particularly now, with a dissolved perimeter, users connecting via VPN on their home networks, BYOD policies, and so on – will be very important in 2021. You need to identify what those users are accessing and doing, and really understand what those access points look like, so you can define what normal looks like and what’s anomalous. Rely on machine learning algorithms to help prioritize how you look at data threats and how you respond. The technology is available; execution needs to continue to improve.
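The “define normal, flag anomalous” idea above can be sketched with a simple statistical baseline standing in for a full ML pipeline. Here “activity” is records accessed per day; the data, metric, and threshold are all invented for illustration.

```python
import statistics

def is_anomalous(history: list, today: float, threshold: float = 3.0) -> bool:
    """Flag today's activity if it sits more than `threshold` standard
    deviations above this user's historical mean - a crude stand-in for
    the behavioral models a UEBA product would apply."""
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    return today > mean + threshold * stdev

# Recent days of normal access volume for one user
history = [100, 110, 95, 105, 98, 102, 107, 99, 101, 104]
print(is_anomalous(history, 103))  # False: within the user's normal band
print(is_anomalous(history, 500))  # True: could indicate bulk exfiltration
```

Real insider-threat analytics model many more dimensions than volume (time of day, resource sensitivity, peer-group behavior), but the structure is the same: a per-user baseline of normal, and alerts only on meaningful deviation from it.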
Get more from SOAR: the changing role of SOAR in 2021
In the distributed service environment, service A talks to service B, which talks to service C, which talks to the database. To identify potential threats and take appropriate action, you need to understand who the users are and trace them through the entire environment. In 2021, the role of SOAR will depend heavily on how effective your organization’s insights platforms are – UEBA and other types of solutions that aggregate, parse, and present consolidated results that say “here are the two events out of four million that require action.” The SOAR solution can then take those results, with that context, and take action upstream in any one of the tiers in the environment. From the audit perspective, being able to trace back and find something anomalous needs to be accompanied by the ability to look at all the data associated with those transactions. This all hinges on how effective your analytics and insights platforms are: better insights and better incoming telemetry will improve your ability to respond effectively.
From a data security perspective, in a data lake – a collection of services and data from highly differentiated environments – SOAR is a critical component in being able to respond to threats. Because these systems are highly distributed, there’s not a single choke point where you can control ingress the way you can with a database – instead there are lots of different access points. SOAR will become more tightly integrated and leverage more cloud-native functionality: looking at IAM roles and permissions, we see SOAR platforms being used to integrate more tightly at the access layer, dynamically provisioning things like a security group that can shut down various threats. Today, most organizations use SOAR at the endpoint and network level but less at the data level – that will change somewhat in 2021.
This topic and several other trends we anticipate impacting 2021 are discussed in our “Where Do We Go From Here? 2021 Security Predictions”. We invite you to listen to the fireside chat here.