Thought Leadership

Zero Trust Authentication within NIST 800-207 Framework

Published on Dec 12, 2023

Husnain Bajwa, Vice President of Product Strategy at Beyond Identity, discusses Zero Trust Authentication, zero trust architecture, and how they fit together in the context of NIST 800-207.

Transcription

Hi, I'm Husnain Bajwa; people generally call me HB, and I do product strategy here at Beyond Identity. Today, I'm going to talk to you about Zero Trust Authentication, zero trust architecture, and how these things fit together in the context of NIST 800-207.

So, when we talk about zero trust architecture, as the industry has evolved and we've come to a better understanding of what zero trust genuinely is, we now associate it with the identity-centric zero trust framework described by NIST in its 800-207 publication.

And associated with that, there's an enormous component that depends on Zero Trust Authentication. Today, we're going to show you how these pieces fit together. First, a little bit of history. The NIST standard calls out a subject or asset to describe an end user or device, and it calls out enterprise resources to describe applications.

This basic design of a subject or asset accessing an enterprise resource traces back all the way to 1961 and the creation of passwords on mainframes, and it runs through all of the generations of network computing that began in the early '80s and continued until single sign-on technologies came into favor in the late 2000s.

Once these single sign-on technologies came into the frame, the NIST architecture describes them generically, as a class of solution, as policy enforcement points. Policy enforcement point as a descriptor allows an architecture and an environment to adopt policy and SSO frameworks, gateways, VPNs, zero trust network access architectures, and identity-aware gateways in either a distributed or centralized manner, all orchestrated by a policy engine that can be centrally managed.
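To make that relationship concrete, here is a minimal, hypothetical Python sketch of a policy enforcement point deferring to a centrally managed policy engine; the request fields, rules, and thresholds are illustrative assumptions, not terms taken from NIST 800-207 itself.

# Minimal sketch: a policy enforcement point (PEP) asks a central policy
# engine whether a subject on a given device may reach an enterprise
# resource. All field names, rules, and thresholds are illustrative.
from dataclasses import dataclass

@dataclass
class AccessRequest:
    subject: str        # end user or workload identity
    device_id: str      # asset making the request
    resource: str       # enterprise resource (application)
    signals: dict       # posture and risk signals gathered at request time

class PolicyEngine:
    """Centrally managed decision point; PEPs only enforce its verdicts."""
    def decide(self, req: AccessRequest) -> bool:
        if req.signals.get("device_compliant") is not True:
            return False
        if req.signals.get("risk_score", 100) > 50:
            return False
        return req.resource in req.signals.get("entitlements", [])

class PolicyEnforcementPoint:
    """Stands in for an SSO gateway, VPN, ZTNA broker, or identity-aware gateway."""
    def __init__(self, engine: PolicyEngine):
        self.engine = engine

    def handle(self, req: AccessRequest) -> str:
        return "allow" if self.engine.decide(req) else "deny"

pep = PolicyEnforcementPoint(PolicyEngine())
print(pep.handle(AccessRequest(
    subject="hb@example.com",
    device_id="laptop-42",
    resource="payroll-app",
    signals={"device_compliant": True, "risk_score": 10,
             "entitlements": ["payroll-app"]},
)))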

Policy is a critical component of modern authentication, and it evolves enormously in the zero trust architecture described by NIST. The NIST view of functional components calls out data security, endpoint security, IAM, and security analytics.

If you look at this basic set, the functional components represent tools that most enterprise organizations have already deployed. Data security typically refers to encryption in transit, at rest, and now even in use.

Endpoint security typically refers to endpoint detection and response (EDR) platforms, which have replaced earlier generations of antivirus, although it can still include legacy antivirus programs. Endpoint tooling can also include MDMs, remote management systems, and other forms of endpoint conformance tooling.

IAM typically refers to the identity and device stores that are correlated with the authorization of users and devices to resources. This can be traced back even to the network access control systems that first defined this type of architecture for modern authentication.

And then, of course, we see the emergence of security analytics, a fundamental enabler of continuous authentication and risk mitigation in a world of machine learning and ever-larger data sets. From our standpoint, this is an enormously critical portion of the solution to focus on.

Policy engines have existed for a long time, and they're only as good as the data you put into them. Relying too heavily on legacy signals and legacy data results in stale decisions, based on information that doesn't reflect the current state of the environment.

Policy engines are also very important to the future of authentication as we see it. Continuous authentication is a major topic of work in the standards bodies. The OpenID Foundation's Shared Signals working group is currently developing the CAEP and RISC protocols, which will allow policy engines to authorize users for long sessions, enabling an excellent user experience, while still allowing those sessions to be de-authorized when compliance with policy changes or is revoked.
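As one hedged illustration of what such a shared signal might look like, the sketch below assembles a CAEP-style "session revoked" Security Event Token payload as a plain Python dict. The event-type URI follows the published CAEP drafts, but the issuer, audience, subject format, and reason text are invented for the example, and a real transmitter would serialize and sign this as a JWT.

# Illustrative CAEP "session revoked" Security Event Token (SET) payload,
# the kind of shared signal a policy engine could consume to end a
# long-lived session. Issuer, audience, and subject values are made up;
# a real transmitter would sign and deliver this as a JWT.
import json
import time
import uuid

SESSION_REVOKED = "https://schemas.openid.net/secevent/caep/event-type/session-revoked"

def build_session_revoked_set(issuer: str, audience: str, session_id: str) -> dict:
    now = int(time.time())
    return {
        "iss": issuer,
        "aud": audience,
        "iat": now,
        "jti": str(uuid.uuid4()),
        "events": {
            SESSION_REVOKED: {
                "subject": {"format": "opaque", "id": session_id},
                "event_timestamp": now,
                "reason_admin": {"en": "Device fell out of policy compliance"},
            }
        },
    }

print(json.dumps(build_session_revoked_set(
    "https://idp.example.com", "https://receiver.example.com", "sess-1234"), indent=2))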

Within this entire structure, getting fresher data is where we think there's an enormous opportunity. A just-in-time focus allows us to borrow modern supply chain techniques and reconsider all of the assumptions we've had about where signals come from, how much we need to cache them, and how much we can allow a decision to be fresh and based on real-time, or as near real-time as possible, data.
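A small, assumed sketch of that trade-off in code: a signal store that trusts a cached value only while it is younger than a short freshness window and otherwise goes back to the source. The fetcher callback and the 30-second window are assumptions for illustration.

# Sketch of a just-in-time signal store: a cached value is trusted only
# while it is younger than a short freshness window; otherwise the signal
# is re-fetched from its source. The fetcher and 30-second TTL are
# illustrative assumptions.
import time
from typing import Any, Callable

class FreshSignalStore:
    def __init__(self, fetcher: Callable[[str], Any], max_age_seconds: float = 30.0):
        self.fetcher = fetcher
        self.max_age = max_age_seconds
        self._cache: dict[str, tuple[float, Any]] = {}

    def get(self, signal_name: str) -> Any:
        cached = self._cache.get(signal_name)
        if cached and time.time() - cached[0] <= self.max_age:
            return cached[1]                      # still fresh enough to trust
        value = self.fetcher(signal_name)         # go back to the source
        self._cache[signal_name] = (time.time(), value)
        return value

store = FreshSignalStore(lambda name: {"firewall_enabled": True}.get(name))
print(store.get("firewall_enabled"))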

Going back to Jasson's talk earlier, from our standpoint, all of this begins with the public key. Public key cryptography is the fundamental cornerstone of a phishing-resistant authentication architecture. We further harden that by sealing our private keys in a secure enclave, a specialized security processor that hermetically seals those keys and provides strong guarantees through tamper-resistant hardware.
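The underlying flow is ordinary asymmetric challenge signing. The sketch below uses the Python cryptography library and generates the key pair in software purely for illustration; in the architecture described here, the private key would be created and held inside the secure enclave rather than in application memory.

# Conceptual sketch of phishing-resistant, public-key authentication:
# the authenticator signs a server-issued challenge with a private key,
# and the verifier checks it against the enrolled public key. The key is
# generated in software only for illustration; in practice it would live
# inside a secure enclave and never be exportable.
import os
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec

private_key = ec.generate_private_key(ec.SECP256R1())  # stays on the device
public_key = private_key.public_key()                  # enrolled with the identity provider

challenge = os.urandom(32)                              # server-issued nonce
signature = private_key.sign(challenge, ec.ECDSA(hashes.SHA256()))

try:
    public_key.verify(signature, challenge, ec.ECDSA(hashes.SHA256()))
    print("challenge verified: authentication succeeds")
except InvalidSignature:
    print("verification failed: authentication rejected")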

Once we have the enclave and the public key information, we can enrich it with device context. Device context encompasses most of what's shown here; some of it requires integrations, but some of it can be collected organically by endpoint agents residing on the device itself.

This is what Jasson referred to as the PA, the Platform Authenticator. We can further enrich the Platform Authenticator's decisions with partner integrations that leverage the modern, zero trust-aligned tools we find organizations deploying more and more.

Whether it's SIEM systems, EDR systems, antivirus, or modern RMM systems, we can take all of that information from the likes of Zscaler, Palo Alto, and CrowdStrike and incorporate it both into our decisions today and into our mitigations when an authorization condition fails or changes.
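To make that enrichment concrete, here is a hedged sketch that merges locally collected device context with third-party risk verdicts before rendering a decision. The field names, score scale, and thresholds are invented for the example; the labels simply echo the kinds of integrations mentioned above.

# Sketch of enriching an authentication decision with locally collected
# device context plus partner risk signals (EDR, SSE/ZTNA, RMM).
# Field names, score scales, and thresholds are assumptions.
from dataclasses import dataclass, field

@dataclass
class DeviceContext:
    disk_encrypted: bool
    os_patched: bool
    edr_running: bool

@dataclass
class PartnerSignals:
    # e.g. an EDR risk score, an SSE posture verdict, an RMM compliance flag
    risk_scores: dict = field(default_factory=dict)

def authorize(ctx: DeviceContext, partners: PartnerSignals) -> str:
    if not (ctx.disk_encrypted and ctx.os_patched and ctx.edr_running):
        return "deny"      # local posture check failed
    if any(score > 60 for score in partners.risk_scores.values()):
        return "deny"      # a partner tool flagged elevated risk
    return "allow"

print(authorize(
    DeviceContext(disk_encrypted=True, os_patched=True, edr_running=True),
    PartnerSignals(risk_scores={"edr": 12, "sse": 8}),
))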

So, even where traditional technologies and the current standards may not accommodate continuous authentication today, we're able to deliver a high degree of fidelity to the principles of continuous authentication as part of an identity and access management system and this Zero Trust Authentication framework.

We align strongly with the NIST 800-207 standard, and we try to go beyond it by looking at where this space is headed. We're super excited about the emerging protocols and all of the additional work still to be developed in the continuous authentication (CA) realm.

So, again: taking policies, managing them in one principled location, making them scalable, fine-grained, and as unconstrained as modern cloud architectures allow, and enriching them with real-time, or as near real-time as possible, signals is what we think Zero Trust Authentication is all about.

And we're super excited to show you more of what we have in store.
