Top Security Issues for the C-Suite: Q&A with BlueCat’s Director of Cybersecurity Solutions
Security continues to be top of mind for all organizations. Threats are more pervasive and sophisticated, and they appear as headlines in the news on a regular basis. According to Symantec’s Annual Internet Security Threat Report, 430 million new malware variants were discovered in 2015 alone[1].
As a result, BlueCat’s Director of Cybersecurity Solutions, Scott Penney, is in high demand at conferences, panels and forums[2] for his thoughts on security topics that are keeping the C-Suite awake at night.
Q: Should IT Security be recognized as a business enabler?
A: Yes, it certainly needs to be, but within reason. We all know that the IT Security team tends to be seen as the group most likely to interfere with a good business idea, and we have to acknowledge that as part of our collective history. Moving forward, this is no longer acceptable. “Shadow IT” has proven that if IT can’t keep up with the business, people will find a way to get around IT and do what they need to do to be successful.
For IT Security, the way to move from disabler to enabler is to start leveraging foundational infrastructure technologies that adapt to your technology needs, rather than the other way around.
This means that as you make rapid changes, you can’t have the baggage of re-architecting a big chunk of your infrastructure, or worse, buying a few new solutions to cover a new need – both of which can dramatically slow down your ability to stay relevant.
For example, technologies such as behavioral analytics engines can help in this area – the technology learns what’s normal over time and can alert when something suspicious occurs, regardless of the reason.
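As a rough illustration of that learn-then-alert loop, here is a minimal Python sketch. The entity names, interval counts, and z-score threshold are invented for the example; a production behavioral analytics engine would draw on far richer features than a single count.

```python
from collections import defaultdict
import statistics

class BehavioralBaseline:
    """Learns a per-entity baseline of event counts and flags outliers."""

    def __init__(self, min_samples=10, z_threshold=3.0):
        self.history = defaultdict(list)   # entity -> past interval counts
        self.min_samples = min_samples
        self.z_threshold = z_threshold

    def observe(self, entity, count):
        """Record one interval's event count; return True if it looks anomalous."""
        past = self.history[entity]
        anomalous = False
        if len(past) >= self.min_samples:
            mean = statistics.mean(past)
            stdev = statistics.stdev(past) or 1.0   # avoid division by zero
            anomalous = abs(count - mean) / stdev > self.z_threshold
        past.append(count)
        return anomalous

# Example: a host that normally generates ~50 events per interval.
baseline = BehavioralBaseline()
for count in [48, 52, 50, 47, 53, 49, 51, 50, 46, 54]:
    baseline.observe("host-a", count)
print(baseline.observe("host-a", 400))  # True: a sudden spike is flagged
```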
Q: Is identity and access management defining the new security perimeter?
A: In a lot of ways Identity and Access Management (IDAM) is becoming a foundational piece of security that enables us to build new and better security models for the future.
Being able to verify a user with confidence, and then link that user to an authorized device and a set of necessary applications and data can dramatically reduce the risk of a data-related incident.
However, IDAM is only as good as the underlying controls and processes it supports. Organizations still need to find ways to segment their users, applications and the data they require to assign access rights in a granular way. There are also critical processes around the ongoing management of access that need to be followed as people move around within companies and their security needs change.
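To make that user-to-device-to-application linkage concrete, here is a simplified sketch. The AccessPolicy structure and the authorize check are hypothetical illustrations of the idea, not the API of any particular IDAM product.

```python
from dataclasses import dataclass, field

@dataclass
class AccessPolicy:
    """Hypothetical policy linking a verified user to devices and apps."""
    user: str
    devices: set = field(default_factory=set)       # device IDs enrolled to this user
    applications: set = field(default_factory=set)  # apps the user's role permits

def authorize(policy: AccessPolicy, device_id: str, app: str) -> bool:
    """Grant access only when user, device, and application all line up."""
    return device_id in policy.devices and app in policy.applications

alice = AccessPolicy("alice", devices={"laptop-7731"}, applications={"payroll"})
print(authorize(alice, "laptop-7731", "payroll"))     # True
print(authorize(alice, "unknown-tablet", "payroll"))  # False: unenrolled device
```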
Q: As data centers become more distributed and the traditional network perimeter dissolves, what can we use to protect enterprise data?
A: Visibility becomes hugely important for ensuring security and data protection as distributed IT continues to progress. As we lose direct control over all of the many “things” that access our data or infrastructure, we’re going to need to shift our focus heavily toward gaining insight into their actions and using that insight to assess the level of risk that those actions represent.
“Things” include internal users, cloud services, guest devices, BYOD, and the many items that fall under the category of “Internet of Things” (IoT). Each device behaves differently and interacts with data in its own way.
The challenge is that the number of things participating in our infrastructure is growing out of control, and monitoring and managing them all is increasingly difficult.
To address this challenge, we need to focus on leveraging foundational infrastructure to monitor what each “thing” is doing and find ways to detect potential risks faster to prevent serious incidents before they have a chance to do damage. This may include leveraging DNS, DHCP, and other ubiquitous protocols as visibility points into what’s happening on the network in near real-time.
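As one example of using DNS as a visibility point, the sketch below parses BIND-style query log lines to build a per-client view of queried domains. The log format and sample lines are assumptions for illustration; real deployments would stream this data continuously rather than batch it.

```python
import re
from collections import defaultdict

# Hypothetical BIND-style query log lines; real formats vary by DNS server.
LOG_LINE = re.compile(r"client (?P<ip>[\d.]+)#\d+ .*query: (?P<domain>\S+) IN")

def domains_per_client(log_lines):
    """Build a map of client IP -> set of queried domains from DNS query logs."""
    seen = defaultdict(set)
    for line in log_lines:
        match = LOG_LINE.search(line)
        if match:
            seen[match.group("ip")].add(match.group("domain"))
    return seen

sample = [
    "client 10.0.0.5#53211 (example.com): query: example.com IN A",
    "client 10.0.0.5#53984 (evil.test): query: evil.test IN A",
]
for ip, domains in domains_per_client(sample).items():
    print(ip, sorted(domains))
```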
Q: With so many organizations struggling to manage identities within applications, what are some best practices to lower costs and risks?
A: It really comes down to tight integration between IT and the various lines of business (LOB) they support. Communication and information sharing are the cornerstones for success. While IT has to play traffic cop for access to critical systems, they can’t succeed unless they are informed about changes in business requirements – user role changes, data criticality evolution, and other factors.
One recommendation: if communication channels between IT and the lines of business don’t already exist, it’s time to start investing in them. Whether that means assigning LOB champions within IT or creating a formal communication process, the effort must be made.
Another recommendation (an unpopular, but essential one) is the need for a regular audit process around IDAM to catch the changes that fall through the cracks of everyday activity.
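Here is a sketch of what such an audit might look like in code, assuming hypothetical role-to-entitlement mappings exported from an IDAM system; the roles, rights, and user data are invented for the example.

```python
EXPECTED = {                     # entitlements each role should have
    "analyst": {"crm-read"},
    "manager": {"crm-read", "crm-write", "payroll-read"},
}

GRANTED = {                      # what users actually hold today
    "bob": ("analyst", {"crm-read", "payroll-read"}),   # stale grant from an old role
    "carol": ("manager", {"crm-read", "crm-write", "payroll-read"}),
}

def audit(expected, granted):
    """Report entitlements that exceed what a user's current role justifies."""
    for user, (role, rights) in granted.items():
        excess = rights - expected.get(role, set())
        if excess:
            print(f"{user} ({role}) holds unjustified rights: {sorted(excess)}")

audit(EXPECTED, GRANTED)   # flags bob's leftover payroll-read
```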
IT needs to embrace self-service in a big way. Give the people in the business who really understand the requirements the ability to fulfill those requirements themselves. This empowers them to do what they need to do in the right way, as opposed to finding ways to work around the system.
Finally, don’t fall behind. With so much data out there, changing so quickly, it becomes a nightmare to catch up and close any holes that appeared in the interim if you take your eye off the ball. Data classification must be a fundamental aspect of any service launch: it’s crucial to capture the right information up-front on the lifecycle of that data and the protection requirements to support it. Cutting corners or making assumptions at the front end only makes things harder to fix down the road, once an audit or other control point catches them.
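One way to enforce that up-front discipline is to make a classification record a hard prerequisite of service registration. The sketch below is illustrative only: the field names and the encryption rule are invented assumptions, not a standard schema.

```python
from dataclasses import dataclass

@dataclass
class DataClassification:
    """Hypothetical classification record required before a service launches."""
    dataset: str
    sensitivity: str        # e.g. "public", "internal", "confidential"
    retention_days: int     # lifecycle: how long the data may be kept
    encryption_required: bool

def register_service(name: str, classification: DataClassification):
    """Refuse to launch a service whose data handling is undefined or unsafe."""
    if classification.sensitivity == "confidential" and not classification.encryption_required:
        raise ValueError(f"{name}: confidential data must be encrypted")
    print(f"{name} registered: {classification}")

register_service(
    "expense-portal",
    DataClassification("receipts", "confidential", retention_days=2555,
                       encryption_required=True),
)
```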
Q: According to analysts, organizations take an average of 250 days to discover that they’ve been breached. How do you mitigate this risk?
A: Visibility is the key here. This wasn’t so hard when our entire critical infrastructure was protected within the four walls of a data center and we had informed staff who watched for strange behavior.
Today, we have distributed IT with the complexities of cloud, IoT, etc. Here’s just one example: a power company in the UK just rolled out 23 million smart meters to consumers, each of them with an IP address and a wireless connection. What visibility do they have into what each is doing?
To meet increasing visibility and detection challenges, organizations need to find ways to tap into existing information to understand normal operations, and then use that insight to identify when behavior starts to deviate. This is often the earliest indicator of compromise.
Here’s another example: if you have a Point of Sale machine that only talks to a dozen internal hosts for payment processing and inventory purposes, then suddenly it attempts to connect to a server in HR, that’s probably an issue that merits investigation. However, you wouldn’t know that attempt was even made unless you mine and analyze all the information around that system – firewall logs, DNS query logs, etc.
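Here is a minimal sketch of that kind of peer baselining, assuming connection pairs have already been mined from firewall or DNS logs; the host names are invented for the example.

```python
from collections import defaultdict

class PeerBaseline:
    """Tracks which destinations each host normally talks to."""

    def __init__(self):
        self.known = defaultdict(set)

    def learn(self, src, dst):
        """Record a routine connection during the learning period."""
        self.known[src].add(dst)

    def check(self, src, dst):
        """Return True if this destination is new for the source host."""
        return dst not in self.known[src]

pos = PeerBaseline()
for host in ["pay-01", "pay-02", "inventory-01"]:   # the POS terminal's normal peers
    pos.learn("pos-17", host)

print(pos.check("pos-17", "pay-01"))    # False: routine payment traffic
print(pos.check("pos-17", "hr-db-03"))  # True: a POS machine reaching HR is suspect
```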
Contact BlueCat to discuss your enterprise security challenges.

[1] Symantec, Annual Internet Security Threat Report.
[2] http://www.argyleforum.com