This webinar, part of the GovForward Multi-cloud Webcast series, will help you learn about strategies for securely moving workloads to the cloud. We will also share how an identity-centric approach to Privileged Access Management (PAM), based on Zero Trust principles, helps strengthen security and compliance posture alike.
Please join this interactive session as our experts discuss:
- How perception differs from reality in government security practices
- What factors can inhibit cloud migration and how to overcome them
- Ways to improve cyber resilience for on-prem and cloud environments
- Why the path to Zero Trust starts with identity
James Hansen (00:09):
Welcome to the GovForward Multi-cloud Webinar Series brought to you by Carahsoft and Government Executive Media Group. I'm James Hansen, the vice president and publisher of NextGov, the federal technology news division of Government Executive Media Group. Through this series, we're featuring more than 30 technology experts covering virtually all facets of cloud security. These FedRAMP marketplace and multi-cloud technology thought leaders will highlight how their solutions enable agencies to address network security, data, and IT modernization challenges. You can see the full schedule of programs along with hundreds of related resources at carahsoft.com/GovForward. And now I hope you enjoy today's webinar.
Torsten George (00:53):
Hello everybody. Thank you for joining us today as we embark on discussing strategies for securely moving workloads to the cloud. Today's webcast is part of Carahsoft's GovForward multi-cloud series. My name is Torsten George, I'm a cybersecurity evangelist at Centrify, and I will be your moderator today. So let's set the stage for our discussion: the number of cybersecurity incidents involving the government's aging systems has grown by more than a thousand percent since 2006, leading many agencies to cite information security as a major management challenge for their organization. While legislation encourages efforts to modernize IT and strengthen cybersecurity, many government agencies are struggling with the increased complexity that emerging technologies, like the cloud, are creating. Today we will explore strategies for securely moving workloads to the cloud and discuss how an identity-centric approach to security can help strengthen government agencies' security and compliance posture.
Torsten George (01:58):
We have invited three amazing panelists to share their thoughts and experiences with us. Let me introduce you to Nicole Keaton Hart, who's an experienced, influential leader and culture change agent in information technology and cybersecurity, driving security and digital transformation initiatives. With over 25 years in information technology, Nicole is a proven visionary and pragmatic leader. She is a cybersecurity strategy expert who is equally versed in enterprise risk management and enterprise IT governance. She has previously held key executive leadership positions as senior vice president at SunTrust Banks, virtual CISO and cybersecurity strategist, as well as state and local government CIO, keenly focused on business, IT, and cybersecurity synchronization across major industries such as financial services, consumer packaged goods, retail, oil and gas, and health IT. Today, Nicole serves as an executive in residence for Georgia State University and as a distinguished board member and CIO of Fractional CxO advisors.
Torsten George (03:15):
In addition, we're joined by Dr. Ron Ross, who is a fellow at the National Institute of Standards and Technology. His focus areas include cybersecurity, systems security engineering, and risk management. Ron leads the Federal Information Security Modernization Act Implementation project, which includes the development of security standards and guidelines for the federal government, contractors, and the United States critical infrastructure. He's published a wide variety of standards and special publications covering privacy and security controls as well as risk management frameworks. As if that weren't enough to keep him busy, Ron also leads the joint task force, an inter-agency group that includes the Department of Defense, the Office of the Director of National Intelligence, the U.S. Intelligence Community, and the Committee on National Security Systems, with responsibility for developing a unified information security framework for the federal government and its contractors. And finally, I would like to welcome David McNeely, who is the chief strategy officer at Centrify, where he's focused on helping customers meet the evolving security needs of the modern enterprise while contributing to the strategic vision of the company's product portfolio.
Torsten George (04:33):
David has been with Centrify for over 16 years, contributing to the company's high growth via product innovation. Prior to joining Centrify, he served in a variety of product roles at AOL and at Netscape Communications, which was acquired by AOL. David is also a fellow at the Institute for Critical Infrastructure, a think tank that brings together nationally recognized cybersecurity and national security leaders. So welcome, everybody. Let's get started with our expert round. The first topic we want to talk about is perception versus reality in government security practices. Obviously, the US government sector has some of the most prescriptive and effective regulatory mandates and security frameworks that you can find anywhere. In fact, many other verticals and countries borrow security controls from the Cybersecurity Framework or the Continuous Diagnostics and Mitigation program when assembling or revising their own regulations.
Torsten George (05:45):
Bearing this in mind, many observers are really struggling with a paradox: year after year, we hear about the poor compliance and security posture of government agencies in the U.S. Just last year, the Senate Homeland Security and Governmental Affairs Subcommittee on Investigations dug through a decade of inspector general reports for eight federal agencies that rated lowest for compliance with the cybersecurity framework. The primary finding was an overall failure to keep pace with even basic federal cybersecurity standards. So let's ask our experts if they can shed some light on this paradox. I'm singling out Ron here because he's involved in writing these frameworks. The US government has some of the most advanced cybersecurity regulatory frameworks, yet the agencies are struggling to implement them. Why do you think that is?
Ron Ross (06:47):
Oh, thanks, Torsten. It's a great question. We get asked that question quite frequently. I think you have to answer it in the context of what the federal government does. We have some of the most diverse missions and business operations, and one of the largest organizations in the world in number of people and number of systems. The complexity of those information systems, spanning industrial control systems, business systems, and small devices, makes for an incredibly complex infrastructure that the federal government runs. And it's also operating in a very hostile threat space, as everybody knows. So I think it's always going to be the case that the reports are going to have findings and discover vulnerabilities that haven't been attended to. But I look at all the progress the feds have made over the past 15 years, maybe 17 years since I've been leading the FISMA project. We started from ground zero.
Ron Ross (07:41):
We had no framework. We had very few controls. Privacy controls were nonexistent back in the early days. So we've come a long way, developing things like the Risk Management Framework, the Cybersecurity Framework, and the Privacy Framework. Just last week we released the update to NIST SP 800-53, which I believe is literally the best control catalog anywhere on the planet, integrating security and privacy controls into the same catalog. And the federal agencies have an enormous task. I have to give a shout-out to all my brothers and sisters out there on the front lines. They really have a very difficult and challenging job because they have an enormous attack surface to defend. I know you're going to talk about modernization later, and they're dealing with some older systems which are more vulnerable and, in some cases, can't be fixed.
Ron Ross (08:35):
They're transitioning to new technology, including cloud, but it takes time, and you're always going to have those vulnerabilities, weaknesses, and deficiencies pop up, especially in an infrastructure as large as ours. But overall, from what I've observed over the past decade or more, there's been enormous improvement in the federal information security infrastructure. I talk to agencies every day. They share a lot of their difficulties and challenges, but they're working 24/7. It's an enormous job. And it's like everything in security: every successful threat that actually exploits a vulnerability becomes headlines, but you don't see the hundreds of thousands and millions of cyber-attacks that are launched every day at our critical infrastructure and are pushed back and defended against very well.
Ron Ross (09:31):
So, there's always room for improvement, but I think overall, I would give the federal government high marks overall in where they started, where they are today and their plan for improving in the future.
Torsten George:
Very well said. I would agree the media always focuses on the negative examples and doesn't point to all the great work that has gone into this. Nicole, you obviously had exposure as a CIO in state and local government. Maybe you can bring a slightly different perspective to the table here, because state and local might be challenged differently than the federal government. So Nicole, any insights on why we might see all these great regulations but struggle to apply them?
Nicole Keaton Hart (10:18):
Yeah, absolutely. There are a couple of things. As I was listening to Ron speak, he packed a lot in there. At the state and local government level, one of the differences is whether these NIST frameworks, for example, are mandated versus voluntary for adoption. First and foremost, that just varies by municipality in terms of how well they put forward efforts to adopt frameworks like NIST. But if I put that aside for a second, there are a few other challenges that I see in the state and local government arena. Ron mentioned the legacy technology; adding to that and compounding the challenges that state and local governments face are inadequate supporting risk management infrastructures to identify and manage technology risk and, more specifically, cyber risk. Second, state and local government entities have a wide range of functions, which are often led by both appointed and elected officials.
Nicole Keaton Hart (11:33):
And that really shapes the definition of a stakeholder differently than what we see in the private sector. In the private sector, when you're looking at making technology investments and mitigating cyber risks, you're not often in a position where you're evaluating taxpayer dollars and putting forward compelling programs to mitigate cyber risk while, at the same time, another part of that government agency is putting forward an equally compelling story to combat homelessness or drug addiction, or to cover the cost of housing inmates at a county jail, for example. So the variety of requirements they have to deal with is vast, and I think that parallels what Ron mentioned in the federal sector. Last, I would say that historically, to accommodate the diversity of functions at the state and local government level, many entities have relied on processes, technology, and software solutions that were designed for government, which, similar to the pace of government in general, just have not moved as rapidly as the private sector in the identification and management of cyber risk.
Torsten George (12:59):
Very good points. I especially like the point about the transition to a risk-management-based approach. I always say risk is security's new compliance, because you had that old mentality of checking boxes and making it to the audit. But NIST and other oversight agencies have started propagating a more risk-based approach, and it's still a cultural challenge to make that transition. So I think Nicole made a great point there. David, let's get your input. You have been in contact with a lot of agencies over the years. What have you seen as it relates to them struggling to implement these best practices?
David McNeely:
Well, certainly we've heard very similar complaints and challenges with respect to how difficult it is to implement technologies into environments, especially given how diverse those environments are.
David McNeely (13:59):
I think I learned early on that some of the biggest challenges arise because some programs within some agencies are created for a particular project. Usually they're outsourced, so you get a set of computers and systems that back up and support that particular application. And yet the next project that comes along is going to look very different, because it was outsourced to a different agency or a different set of contractors who created the application. So consistency across the environment has always been a challenge, at least within the agencies that I've spoken with. That leads to having very different sets of computer systems and applications in place, which makes centralization of security management that much more difficult. That's actually one of the advantages for agencies of migrating to cloud-based environments: they get a chance to revisit how an application was constructed and modernize it as they move it into a cloud-based infrastructure.
David McNeely (15:13):
But I think a lot of the challenges around security have been based on more traditional, perimeter-based views. A lot of people are really waking up and looking a lot more seriously at Zero Trust architectures. NIST published a document that highlights that methodology, which is really a mindset shift in how to secure environments, based on assuming that the adversaries are already on the network. With that assumption in mind, how do you protect the assets? You have to enable the assets to defend themselves.
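The Zero Trust mindset David describes can be illustrated with a toy policy check: instead of trusting a request because it originates inside the network perimeter, every request is evaluated against identity, device posture, and the sensitivity of the resource. This is a minimal sketch; the attribute names and thresholds are illustrative assumptions, not from any specific product or from NIST's architecture document.

```python
from dataclasses import dataclass

@dataclass
class AccessRequest:
    user_authenticated: bool   # identity verified, e.g. MFA passed
    device_compliant: bool     # device posture check passed
    privilege_level: str       # "user" or "admin"
    resource_sensitivity: str  # "low", "moderate", or "high"

def authorize(req: AccessRequest) -> bool:
    """Zero Trust style check: no implicit trust from network location;
    verify identity and device for every request, and require elevated
    privilege for sensitive resources (least privilege)."""
    if not (req.user_authenticated and req.device_compliant):
        return False  # assume breach: unverified requests are denied
    if req.resource_sensitivity == "high" and req.privilege_level != "admin":
        return False  # sensitive assets need elevated, verified privilege
    return True

# A request from a non-compliant device is denied even for a known admin.
print(authorize(AccessRequest(True, False, "admin", "high")))  # False
print(authorize(AccessRequest(True, True, "user", "low")))     # True
```

The point of the sketch is that the decision is made per request and per asset, which is the "assets defend themselves" idea from the discussion.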
Torsten George:
And we will talk about this in a couple of minutes. So, some observers blame slow IT modernization for the shortcomings in agencies' security posture. In fact, a recent study by the Government Accountability Office found that the government's aging IT infrastructure and systems are becoming increasingly obsolete, costly, and, as a result, vulnerable. So let's keep it at a higher level, not yet talking about the example of cloud, but IT modernization in general. Nicole, what do you think are the main inhibitors for agencies to really move ahead with implementing modern technology?
Nicole Keaton Hart (16:32):
If I look at that, I'll touch on maybe four key common inhibitors. First, and again, my perspective is more at the state and local government level: a practical means to identify, develop, and acquire emerging skills in key areas like big data, data science, AI, cybersecurity, and so forth. Second, the absence of business cases that expand relationships and access to the private sector as a means to drive services and economic growth. Third, legacy systems and the integration challenges thereof. We've talked a little bit about that; I think all three of us mentioned legacy systems, and David made a point that was very prevalent in my experience with state and local government: the standalone, single-function systems. Being able, within that government setting, to define a compelling need for system integration to address the diversity of functions without the presence of a data-driven culture really makes things that much more difficult.
Nicole Keaton Hart (17:51):
When you're looking at it purely from the perspective of what to do with a legacy system, you keep that old framework, or that old mindset if you will, of just integrating legacy systems. So the absence of a data-driven culture certainly introduces challenges there. And when I talk about a data-driven culture, that's really just a culture that embraces the use of data in decision making, treats its data as a strategic asset of the entity, and in many ways makes that data widely available and accessible. Fourth, the ability to effectively portray the entity's or agency's cybersecurity posture to the board or their governing body. When it comes to the board and the information the board receives regarding cybersecurity, often there is a voluminous amount of data that's not particularly useful, because it doesn't convey the cybersecurity strategy. It doesn't convey key cyber risks. It doesn't convey the vendor ecosystem and key cyber relationships with third-party entities. And moreover, it doesn't paint the picture of the resource investment strategy required to achieve a defined level of success.
Torsten George (19:28):
Really good points. I think the talent gap, which was one of your first points, is especially challenging, correct? I mean, sometimes the salaries being paid by the commercial sector compared with government alone have an impact here. Ron or David, any additional insights?
Ron Ross:
I don't think I can add much more; Nicole did a great job of describing the difficult challenges, and I think they're largely the same for the federal government. One of the things I've observed over many years is that we've had this concept of enterprise architecture for a long, long time. Even so, it's applied at the individual federal agency level, and one of the questions I've always wondered about is: wouldn't it be better to apply it at the level of the federal government at large? Because there's a lot of stovepiping and duplication of functions and services across the federal government.
Ron Ross (20:25):
Now, they've tried to do a lot of shared services, but it seems to me that shared services should be the first place you go, and you only develop your own individual systems where there's an absolute need to do so. I think there's still a lot of "not invented here." Everyone likes to think they're special and that they have their own needs for these separate systems. But to me, this is directly related to cybersecurity, because the larger you let this infrastructure propagate, the more complexity you add, and duplicating functions and systems grows the attack surface. It also makes it incredibly expensive to have these duplicated functions across multiple agencies. This is not just a federal government problem, but we're one of the bigger entities; I'm sure this exists in the private sector as well. That's a cultural shift: you have to be able to change your mindset to go to the shared service concept first, and then only develop specialized systems, products, and services when you need that specialization.
Ron Ross (21:23):
I think the number one issue today is that the more of the old installed base you continue to run, the longer that attack surface is going to have exposure. That's where zero days really have a heyday, because once the adversary discovers something about your system or network that you don't know, and who could know everything with the complexity we have today, you're fighting a losing battle. I know in the next segment we're going to talk about the movement to cloud, but I think the number one thing that would help cybersecurity is more rapid modernization and reducing the attack surface by doing more shared services. You'd therefore have fewer systems, which by definition translates into a smaller attack surface that you can better defend.
Torsten George:
I think you hit the nail on the head. While you were talking, an example came to mind: I was dealing with a major federal agency that had a DC location and an Austin location, and both locations ran their own separate systems and produced their own KPIs and monitoring reports.
Torsten George (22:37):
And they were simply not comparable, so you couldn't even compare one part of that agency against another. It makes things extremely difficult. David, any insights from your field exposure?
David McNeely (22:51):
Well, I think Ron and Nicole covered it pretty well. I do agree there are a lot of challenges in these environments. It's just different at every single agency we talk with, as far as what their challenges are.
Torsten George (23:10):
Yeah, I mean, when I looked out the window this morning here in California, it was cloudy. I always talk about "cloudy with a chance of a data breach" when I talk about moving to the cloud. But let's dig more into the details as they relate to leveraging cloud, and even multi-cloud environments, in the government. According to the 2020 Cloud Threat Report, 88% of organizations are currently using public cloud infrastructure in one way or another. Nonetheless, the number is far lower when it comes to government agencies.
Torsten George (23:46):
In general, cloud adoption is primarily driven by agility, flexibility, and cost savings, which should also be applicable to the government. But major concerns about cloud adoption remain, deeply rooted in security challenges; in fact, 92% of organizations admit that they face a cloud security readiness gap. David, I know you were recently interviewed by Federal Computer Week to share your views on securing cloud resources. What do you think are the main inhibitors to a greater adoption rate of public cloud deployments in the US government?
David McNeely (24:25):
Well, first of all, I think we should probably clarify what we mean by cloud, because there are a lot of different uses of that word. I find most people tend to look first at Infrastructure-as-a-Service, because you can imagine taking a computer that's running an application on a Linux operating system, or maybe running Windows, and it's very easy to spin up a virtual machine at a cloud provider and just run the application in another data center. We call that lift and shift: getting things out of your existing data center and moving them to a cloud-based platform, with the goal of it hopefully being cheaper than having a machine on premises, or at least being able to restart it on net-new machines so that you're always getting the latest image, the latest build.
David McNeely (25:18):
So that's the first thing people tend to focus on, and that tends to be where we spend a bit more of our effort, as opposed to the other two, which are Platform-as-a-Service and SaaS-based applications. We do build and deliver a SaaS-based application to make it easier for our customers to use the set of features we make available. And that's probably where there could be a lot more use of common applications, in the shared services model Ron was talking about, if we were to have agencies look at using SaaS-based applications. Going back to Infrastructure-as-a-Service, a lot of the challenges I see arise because people take their existing security practices and concepts and simply move them to the cloud, reimplementing exactly the same things. They end up with the same concept of a firewall around a set of computers, maybe virtual remote access into that with a jump host, and maybe a vault that keeps up with privileged accounts and things like that. It is difficult to set up a solution that can enforce higher levels of security tied back into the infrastructure, but that certainly would be the goal: to get quite a bit more granular
David McNeely (26:44):
and assume that you need to protect each and every asset, each and every virtual machine, each and every application independently, as opposed to taking a network-style approach, just given the flexibility of these cloud providers.
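David's contrast between a single firewall perimeter and protecting each asset independently can be sketched as a per-asset policy table: access is decided by the asset's own policy, with no implicit "inside the network" grant. The asset names, roles, and policy fields below are hypothetical, for illustration only.

```python
# Perimeter model: one rule guards everything inside the network.
# Per-asset model: each VM or application carries its own access policy.

asset_policies = {
    "payroll-vm": {"allowed_roles": {"payroll-admin"}, "require_mfa": True},
    "public-web": {"allowed_roles": {"web-operator", "payroll-admin"},
                   "require_mfa": False},
}

def can_access(asset: str, role: str, mfa_done: bool) -> bool:
    """Evaluate the asset's own policy; unknown assets are denied by default."""
    policy = asset_policies.get(asset)
    if policy is None:
        return False
    if policy["require_mfa"] and not mfa_done:
        return False
    return role in policy["allowed_roles"]

print(can_access("payroll-vm", "web-operator", True))   # False
print(can_access("payroll-vm", "payroll-admin", True))  # True
```

Being "on the network" never appears in the decision; each virtual machine is, in effect, its own perimeter.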
Torsten George:
We have also seen the opposite: people who don't believe they can apply the same policies and security practices in a cloud environment and think they have to come up with something brand new. So, Nicole, any views on your end about the inhibitors that hold government agencies back from deploying cloud?
Nicole Keaton Hart (27:19):
Not a lot to add to what David said, but a couple of things. One of the things that I've seen in state and local government is a fear, though that's probably not the right word, a level of trepidation over how secure the cloud really is, especially for entities that are responsible for health data, HIPAA data, things of that nature. In my experience, what I've found through a series of assessments and engagements is that oftentimes we can devise cloud strategies or cloud solutions that are much more secure than that entity could ever afford to build on-prem, and that's a combination of technology and resources; that's just the way it's been. At some other agencies, I've also seen the opposite, where they have legacy technology and the vendor has stopped supporting it.
Nicole Keaton Hart (28:19):
So what's the real cost of running that infrastructure? Put the risk management measures, your level of vulnerability and potential exposure, aside for a second: what's the real cost of running legacy infrastructure that hasn't been maintained and doesn't have appropriate maintenance contracts on it? I've seen some of these entities struggle with how to increase their budget to move to the cloud, because it's a cost increase, because they haven't maintained the past environment. The entities that have done it well have, culturally, been able to examine business problems through a different lens and figure out how to leverage lighter-weight technology, such as cloud storage and cloud computing, as mechanisms to solve business problems. Last, the funding and budget cycle in government can be a limiting factor on technology innovation and adoption.
Nicole Keaton Hart (29:22):
Think about a cycle where, in August or so of the current year, you begin planning what your budget is going to look like for next year, for a one-year funding cycle. You begin that journey in August, and from August through October that budget goes through reviews. From October through December, it goes through final reviews, cuts, and revisions. At the beginning of January, a budget emerges that goes forward for approval and funding, and by the end of January you have your funded budget for the following fiscal year. You then begin looking at how to move forward with implementation and carrying out your vision in February, knowing that you're going to begin this cycle again in August. Certainly, that one-year funding cycle introduces some challenges in state and local government.
Torsten George (30:18):
So, Ron, what's your take on this?
Ron Ross:
Well, I could talk about this all day. This conversation actually goes back over 10 years; I remember sitting with the federal chief information officer back in 2009 on the first Cloud First initiative, and that was kind of the birth of the FedRAMP program. Of course, we were heavily involved in that program, and I think FedRAMP has really taken on a lot of the risk. We tend to be risk averse in the federal government; we don't like taking a lot of risks, and there's been this notion over the past decade that moving to the cloud is less secure than doing it on-prem. I don't think there's any evidence to back that up. The FedRAMP program has gone through a fairly rigorous process: the controls that go into the low-impact, moderate-impact, and now high-impact systems, and the assessment of those controls by independent third-party assessment organizations, which come in independently to the big cloud providers to make sure those controls are implemented correctly and operating as intended. That should give customers in the federal government a high sense of assurance about those cloud providers.
Ron Ross (31:32):
Having said that, it's difficult to move. A lot of people think that going to the cloud is just getting the data out of their sight. But in reality, you have to take advantage of what cloud really provides, which is not just virtualization; it's redesigning your mission and your business models. It's in essence taking that technology, whether it's Platform-as-a-Service, Infrastructure-as-a-Service, or Software-as-a-Service, and provisioning your assets as needed. At the same time, you can redesign your business functions to take advantage of that cloud technology. So it's not just moving this stuff; as David was saying, you have to redesign some things, and that's why in most cases there's an initial increase in cost going from your current legacy environment up to the cloud. But after that initial investment, those costs can drop off precipitously.
Ron Ross (32:25):
And this ties back to what we talked about earlier. Instead of everybody operating their own system at great cost, you can reduce your cost dramatically by going to the cloud. And if you're really, really risk averse, we have our first federal standard, FIPS 199. That's the great triage standard: you can divide all of your assets into low-impact, moderate-impact, and high-impact assets. If you're really worried, start moving your low-impact assets to the cloud first, then move up to the moderate. Then, if you really get into the cloud business, you can go up to the high impact. And we've got a lot of different types of cloud systems. Some people go to the public cloud, and we have some of the best cloud providers in the world in the public cloud space.
Ron Ross (33:11):
There's also the ability to do a private cloud, taking advantage of the same technology but doing it in-house, so to speak; the intelligence community does that with their intelligence community cloud. So there's no reason not to take advantage of this great technology to reduce costs, increase your cybersecurity, and actually build better capability for your customers. And that's really what matters at the end of the day.
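Ron's FIPS 199 triage suggests a simple migration ordering: categorize each system by its security impact level, then move low-impact systems to the cloud first, moderate next, high last. The sketch below illustrates that prioritization; the system names and their categorizations are made up for the example.

```python
# FIPS 199 assigns each system a security categorization of
# low, moderate, or high impact. A risk-averse migration plan
# moves low-impact systems to the cloud first.

systems = {
    "public-website":    "low",
    "grants-portal":     "moderate",
    "case-management":   "high",
    "newsletter-signup": "low",
}

# Numeric rank for each impact level, used as the sort key.
ORDER = {"low": 0, "moderate": 1, "high": 2}

def migration_order(inventory: dict) -> list:
    """Sort systems so lower-impact ones migrate first; ties break by name."""
    return sorted(inventory, key=lambda name: (ORDER[inventory[name]], name))

print(migration_order(systems))
# ['newsletter-signup', 'public-website', 'grants-portal', 'case-management']
```

In practice the categorization itself comes from the FIPS 199 process (confidentiality, integrity, and availability impact); this only shows how the triage drives a cautious migration sequence.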
Torsten George:
So besides leveraging FedRAMP-authorized services or making a business case around the synergies of cloud, can you share any other practical advice? Obviously, members of the audience listening in might have some inhibitors that are slowing them down. What practical tips can you give them to go to their superiors and somehow shorten that budget cycle? Nicole, do you see any chance to overcome some of these hurdles?
Nicole Keaton Hart (34:16):
Oh, absolutely. You know, even if we look at the value or the opportunity that some of the NIST best practices bring forward at a state and local government level. I'll share a story, an example. Oftentimes when I meet with a client or an entity, I do a simple exercise: taking a piece of paper, folding it into thirds, and asking the entity to write down on that piece of paper, first, what are your most critical or valuable assets; two, where are they stored; and three, who has access to them. Most entities and most agencies can't do that. If I look at the NIST cloud computing standards and roadmap, one of the top, if not the first, recommendations is for an agency to add their requirements. And that means in many cases, to Ron's point, taking the time to identify what your business processes are and also addressing gaps such as asset inventory, right? What does that really mean, to have a solid foundational asset inventory in place: hardware, applications, and user identities, right? Understanding who has access to what throughout the enterprise, and really looking at redesigning those with a keen focus on how your business as an entity actually performs work.
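[Editor's note] The three-question exercise Keaton Hart describes can be sketched as a tiny data model. This is purely illustrative; the asset names, locations, and groups are hypothetical, not from the webinar.

```python
from dataclasses import dataclass, field

@dataclass
class Asset:
    """One row of the folded-paper exercise: what it is, where it lives, who can touch it."""
    name: str
    location: str                 # e.g. "on-prem datacenter", "cloud (IaaS)"
    authorized_users: set = field(default_factory=set)

# Hypothetical inventory illustrating the exercise
inventory = [
    Asset("citizen_records_db", "on-prem datacenter", {"dba_team", "records_app"}),
    Asset("public_web_portal", "cloud (IaaS)", {"web_ops", "ci_pipeline"}),
]

def who_has_access(assets, asset_name):
    """Answer question three for a given asset; None if it isn't inventoried."""
    for asset in assets:
        if asset.name == asset_name:
            return sorted(asset.authorized_users)
    return None
```

The point of the exercise is that an agency that cannot fill in these three columns cannot reason about migration risk; the data structure is trivial, and that is the point.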
Torsten George (35:40):
Hmm. Very good.
Torsten George (35:43):
Okay. So, Ron, you're deeply engaged with writing best practices. Are there any best practices that NIST has established to assist government agencies in securely moving their workloads into the cloud? Are there any special publications that you would recommend to our audience?
Ron Ross (36:02):
I think to build up the confidence in a federal agency, you really have to be assured that the data you use now, whether you're processing, storing, or transmitting it, when it moves to the cloud environment, is going to be at least as well protected as when you were handling it on your side. And I think that goes back to FedRAMP; the program offers an enormous opportunity there because they've done all the heavy lifting. They decided what security controls are going into those different impact levels. All the federal agencies were part of that process back in 2009 and 2010, and those security controls are coming out of NIST SP 800-53. So, you have to be able to comply with the statutory requirements and all the OMB policies, but that's what the FedRAMP program does.
Ron Ross (36:51):
It's an extension. So you, as a customer on the federal side, can be sure that if you go to one of those cloud providers that's offering Infrastructure-, Platform-, or Software-as-a-Service, and those have been through the provisional authorization process, you are still the ultimate arbiter of the risk, because risk management can't be outsourced. So you've got to take that provisional ATO (authorization to operate), look at all the results of everything that is associated with that particular offering, and then make your own final risk-based decision. But this has taken a lot of the worry and the work out of that process. And the other thing I would say, to kind of pile on what Nicole was talking about: in addition to looking at your mission and your business processes, make sure you have a really thoughtful and careful look at the types of assets that you have.
Ron Ross (37:46):
You have to be able to understand what's important and what's not that important. If everything is important, then really nothing is important. And that's a common thing we see every day in federal agencies. They want to protect everything to the highest level, and you couldn't do that 20 years ago. You certainly can't do that today with trillions of lines of code, billions of devices, everything connected on the 4G, now going to the 5G, network. So we have an enormous complexity problem. So just understand your assets, and that's why we have FIPS 199, which we call, as I mentioned earlier, the triage standard. Every piece of data, every system is categorized as either low impact, moderate, or high impact, and it's really impact to the federal mission or business operation if that data is compromised or that system goes down. Once you've got that triaged, now you can make some better decisions.
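[Editor's note] The FIPS 199 "triage" Ross describes can be sketched in a few lines. FIPS 199 does categorize each system as low, moderate, or high impact, and the overall level is the highest impact across confidentiality, integrity, and availability (the high-water mark); the system names below are hypothetical examples, not from the webinar.

```python
# Impact levels in ascending order of severity
LEVELS = {"low": 1, "moderate": 2, "high": 3}

def categorize(confidentiality, integrity, availability):
    """Overall FIPS 199 impact for one system: the high-water mark of the three objectives."""
    return max(confidentiality, integrity, availability, key=LEVELS.__getitem__)

# Hypothetical systems triaged by their C/I/A impact ratings
systems = {
    "public_website": categorize("low", "moderate", "low"),   # -> "moderate"
    "benefits_db": categorize("high", "high", "moderate"),    # -> "high"
}

# Ross's suggested migration order: low-impact systems first, then moderate, then high
migration_order = sorted(systems, key=lambda name: LEVELS[systems[name]])
```

Once every system carries a category like this, the "move low first" strategy in the discussion becomes a simple sort rather than a judgment call made per migration.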
Ron Ross (38:40):
You don't have to boil the whole ocean at the same time. You can start with small successes: take a little bit of one program, one system, and thoughtfully move it to the cloud, and then show the bosses that, hey, we've done this, and it works well in this small case. And here's how much money I've saved you, here's how much better our services are for our customers. And if it's not cybersecurity, it's cost of the programs and value to the customer; those two propositions always get the attention of senior leaders in the federal government.
Torsten George:
Very good points. Very good advice for your peers out there. David mentioned earlier the term zero trust. To improve cyber resilience for on-premises as well as cloud environments, the use of the zero trust model has really returned to the spotlight, with more and more analyst firms and the media kind of giving it their stamp of approval.
Torsten George (39:33):
Zero trust is also quite a popular model within the U.S. government, with even the former federal CIO calling it a critical framework for the US government to be able to protect data and operate in this environment. The zero trust model, for those that might not be familiar with it, was first introduced in 2010 by Forrester Research in collaboration with the National Institute of Standards and Technology. So, it's not a new concept, but instead of using the traditional approach of trust but verify, the zero trust model really implements never trust, always verify as the guiding principle, considering that nowadays you can almost assume that a threat actor already exists in your network. David, because you brought that up earlier, what is your take on how a zero trust approach can improve government agencies' cyber resilience, especially as it comes to moving things into the cloud?
David McNeely (40:41):
I think I'm expanding on what Nicole was saying earlier about tips and things that might help people move to cloud. There's a lot of additional capabilities that would be extremely difficult, if not impossible, to do on-premises that you're simply empowered to do in cloud-based environments. Simple things like automatically scaling up a website to support the amount of demand or traffic that comes in the front door. If you are a public-facing agency and you had a need for citizens to come log into something, there are most likely going to be peaks in traffic; it's going to fluctuate. And this capability within the cloud-based environment just gives you more flexibility, more freedom to build out a system that can respond to the requirements of the agency much better.
David McNeely (41:37):
At the same time, there's a lot of capabilities and a lot of things we start looking at that drive security closer and closer to the compute and also the data. So, in order to accomplish that kind of, let's call it, elastic application, there's much more communication that needs to happen between each of the nodes within that application, if we're breaking up the application into little chunks that can communicate with each other so that we can enable them to scale out. So taking advantage of some of the cloud-based capabilities gives you more flexibility in the delivery of an application or service back to the agency or the citizens that need to use it, and at the same time gives you the chance to go back and take a second look at security.
David McNeely (42:28):
What we're doing within cloud-based environments is providing much more granularity, so that each and every workload gets its own identity. We're taking a closer look at authorization so that each and every compute entity in the environment has to authenticate strongly to every other. All of these are basic concepts of zero trust, where each and every executing object needs to be able to be trusted and to leverage a centralized policy that would then determine: can you communicate? So it's not just about humans communicating with computers or logging in and doing activity. It shifts our focus a bit more to how these computers talk to each other. If we look further out into the future, there are going to be a lot more computers talking to each other than humans accessing those computers, and that will be the bigger challenge. So, we need to enable what I would call full mesh, where every single object needs to be able to potentially authenticate every other object in order to carry out its job. But we want to maintain control over that environment and do it with a zero trust style approach, where everything gets a strong identity and we can apply strong authorization for that one executing function to be able to communicate with others or pick up datasets.
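[Editor's note] The pattern McNeely describes, every workload with its own identity and a centralized, default-deny policy governing which workloads may talk to each other, can be sketched as below. Real deployments would use cryptographic identities (for example, mutual TLS certificates) rather than strings; the workload names and flows here are hypothetical.

```python
# Centralized policy: the only flows allowed between workload identities.
# Anything not listed is denied (default deny), which is the zero trust posture.
ALLOWED_FLOWS = {
    ("web_frontend", "order_service"),
    ("order_service", "orders_db"),
}

def authorize(source_identity, target_identity):
    """Decide whether one authenticated workload may communicate with another."""
    return (source_identity, target_identity) in ALLOWED_FLOWS
```

Note what this buys you against lateral movement: even though `web_frontend` can reach `order_service` and `order_service` can reach `orders_db`, a compromised `web_frontend` cannot talk to `orders_db` directly, because that flow was never granted.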
Torsten George (43:51):
Yeah, very, very good. So obviously, there are many starting points on the path to zero trust. And in fact, Forrester points out different pillars where you need to establish security controls, ranging from workloads to networks, and, as you mentioned, human and non-human identities, devices, and ultimately data. However, one driving principle that should always be considered is: what are the hackers doing? What are they really exploiting? Which, according to post-mortem analysis, seems to be identities, because 80% of security breaches involve privileged credentials, according to Forrester Research.
Torsten George (44:32):
There are many ways to start with zero trust in an organization. Nicole, would you agree that the path to zero trust should start with identity? Are there any other things that people should focus on?
Nicole Keaton Hart (44:50):
I do in part, and the reason I say in part, let me put a little bit of context behind that. Identity management, first and foremost, is not well understood in many organizations, whether that's the public sector or the private sector, or perhaps it's only understood in part. But if you couple that with the digitalization of once paper-based processes or manual existing processes, the complexities and challenges become self-evident. And so, understand how the business actually operates and what's the best opportunity to insert a digital component to drive process efficiency, along with the identities involved, right? And not just humans, as David mentioned. Then in turn, I think you can place focus and emphasis on what the identities are, human, devices, et cetera, and the interactions that are required to facilitate business. So I don't totally disagree with that statement; I expanded on it a little bit.
Torsten George (46:03):
Ron, very quick. What's your take, and then we have to kind of wrap up, but what do you think?
Ron Ross:
Well, I think zero trust concepts and architecture are the key to our future cybersecurity success. You know, we've been fighting the cyber war with 20th century tools, techniques, and approaches, and we've had a one-dimensional strategy of penetration resistance: put up that firewall, put up that boundary, and hope they don't get in. Well, I think as you stated, Torsten, this is not always successful. We've got reams of empirical data now to show these cyber attacks are happening more frequently than we'd like. So I'm advocating a multi-dimensional strategy where you try to stop them at the front door if you can, and then you try to limit the damage they can do once they're inside, moving into cyber resiliency. And if I have permission to use an analogy, I know you like analogies.
Ron Ross (46:59):
So I'm going to use one. It's almost like you have a great lock on your front door, on your house. Maybe you've got triple locks on your front door, but if the bad guy comes through your front door, they can get into every room of your house and they can rip it apart. But what if they came in and every room in your house had a vault? That would stop them. So the second dimension of good cybersecurity, after penetration resistance, is damage limitation. And we can do that through two different approaches. One, you can use massive virtualization, micro-virtualization, and micro-segmentation to split up those pieces, and the virtual machines and the VM technology will churn that system faster than they have time to exploit any part of it, so you don't give them time on target. The second thing is about zero trust.
Ron Ross (47:47):
You make it very difficult to move laterally through that system. And identity is at the heart of that: strong access control, strong authorization and authentication, but as David said, we're doing it at a much more granular level. That's the analogy of the house, with each room in the house having a vault. Now that adversary has to get into every little nook and cranny again and again and again; their work factor goes up so high, they're going to go someplace else. So I'm a strong advocate of zero trust. It's got a lot of different flavors, as you said, but that's the way to go in the future. And I think the sooner we get there, the less time we'll be agonizing over these continuing cyber attacks.
Torsten George:
Very good. And I would like to thank our industry experts for their time and insights. To wrap up our discussion, I would like to ask our experts to summarize in just a sentence or two their recommendation for our audience today on how to accelerate their IT modernization efforts while at the same time improving cyber resilience for both on-premises and cloud environments.
Torsten George (48:54):
Nicole, why don't you start?
Nicole Keaton Hart (48:56):
Sure. I'd say start with an evaluation of the entity's cybersecurity mindset and take the necessary actions to raise the cyber IQ of that entity or that agency. And that means being able to embrace and evaluate frameworks such as zero trust on the front side of wide-scale technology investments. Second, take the steps to move toward a data-driven culture, and that means embracing cloud computing and cloud platforms as a means to address some of the inhibitors that we discussed today. And last, a practical means for state and local government entities, and perhaps at the federal level, is the development and implementation of IT centers of excellence that have the mission and the charge of really defining technology solutions that meet the needs of the state, of that agency, and of its constituents.
Torsten George (50:00):
Very good. Please go next.
Ron Ross:
I would say go back to the fundamentals, the foundational building blocks. Go back to enterprise architecture, look at your mission and business processes, try to consolidate where you can and modernize where you can, and look at the different criticality of your assets. Try to move those assets to the cloud as quickly as possible, do small experiments, and then build on that success. Show value
Ron Ross (50:28):
and the increase in security and the increased value to your customers by relying on the enterprise architecture concept, which will reduce complexity and reduce your cost. Consolidate, standardize, optimize: those are the three hallmarks of enterprise architecture. And if we can do those things first, the cybersecurity efforts will be a whole lot easier and much more effective.
Torsten George:
Good. David, please take it away.
David McNeely:
I think Dr. Ron Ross summarized it perfectly. The one thing I'll add is: pick a project. Just find something that you think could use the most help to improve. There's a lot of really cool capabilities within cloud-based environments that you can take advantage of. So, pick a particular project or application, one that would benefit the most, start with that, and get a strong success. And then from there you can build the rest of your plans to migrate other things.
Torsten George:
That's a nice closing statement. Thank you for tuning in today. We hope today's discussion provided you with helpful information and really some food for thought. Please stay safe and healthy during these unprecedented times. Thank you again.