Thursday, December 19, 2013

Why Role Based Provisioning is not RBAC

Talking with peers in the Identity space, it continues to strike me how two fundamentally different concepts remain a topic of confusion. I am specifically referring to role-based provisioning versus Role Based Access Control (RBAC).


RBAC is an approach to restricting system access to users with the appropriate authorization, enforced at the access control layer. RBAC is a non-discretionary access control mechanism which promotes the central administration of an organization-specific security policy.


This pattern of modelling access control has been around for a long time, and with raised compliance concerns around Segregation of Duties (SoD), RBAC has proven to be an effective model for enforcing it within the enterprise, which is why it is so widely used.


Role-based provisioning is the concept of clustering or grouping the way entitlements and attributes are provisioned to target resources. For instance, a particular role might imply a set of groups in the corporate Active Directory (AD) as well as access to a financial reporting system. The groups in AD would be entitlement-carrying attributes, but the provisioning role wouldn’t be limited to these and could of course also define other attributes such as department and division.


Role-based provisioning relies upon the target system to be the enforcer of the authorization layer. Of course the provisioning of entitlements can contain policies ensuring that SoDs are being considered and honored. Perhaps this is where the confusion occurs? Perhaps the terminology is being used too loosely among identity vendors.


The modern world of mobile users, connected wirelessly on portable devices such as tablets or smartphones, creates a new set of challenges which quickly introduce discussions around context-aware access controls. Context, of course, is nothing more than a set of key-value pair attributes defining where you are, what device you are on, what time zone you are in, or whatever it might be, but still simply attribute data derived from or provided by the user and the associated device.


Analysts, such as Gartner, predict that by 2020, 70% of enterprises will use attribute-based access control (ABAC) as the dominant mechanism to protect critical assets, up from less than 5% today.


Fair enough, even though I will believe that statement when I see it in reality. But let's not confuse this with role-based provisioning, which serves a completely different purpose. Let's try to defuse the confusion by defining the two models.


  • A role-based access control (RBAC) model grants access to resources based on a user's role, such as the user's job title or work responsibility. (Here we are talking about authorization and its enforcement.)


  • A role-based provisioning model automates the access entitlement provisioning process for a specific managed resource, based on the roles to which the user belongs. (Here we are talking about how to set attributes, whether entitlement-carrying or regular, not how to enforce them; a sketch follows below.)
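
To make the distinction concrete, here is a minimal, purely illustrative sketch in Java (the type and method names are mine, not any vendor API): the RBAC side answers an access question at request time, while the provisioning side only describes what should be written to target systems.

// Illustrative only: the same "role" word, two different jobs.
import java.util.Map;
import java.util.Set;

// RBAC: the access control layer evaluates the user's roles and enforces
// the decision at the moment the resource is accessed.
interface AccessControl {
    boolean isPermitted(String userId, String resource, String action);
}

// Role-based provisioning: the role expands into attributes and entitlements
// that are pushed out to target systems (for instance AD groups), and those
// systems then enforce access natively.
final class ProvisioningRole {
    final String name;
    final Set<String> adGroups;            // entitlement-carrying attributes
    final Map<String, String> attributes;  // e.g. department, division

    ProvisioningRole(String name, Set<String> adGroups, Map<String, String> attributes) {
        this.name = name;
        this.adGroups = adGroups;
        this.attributes = attributes;
    }
}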


What’s funny about Gartner’s predictions is the specific explanation of how “RBAC is one-dimensional because it cannot take context in the equation and will therefore fail to address challenges.” The analyst seems to have missed the specific paragraphs in the NIST standard about static and dynamic constraints, as well as temporal constraints, which address the topic of contextual information, and recent publications and research such as http://csrc.nist.gov/groups/SNS/rbac/documents/kuhn-coyne-weil-10.pdf, basically allowing attributes to dynamically impact the roles.
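
As a hedged illustration of such constraints (the class and attribute names are invented for this example, not taken from the NIST model or any product), a role can be modelled so that it only activates when the contextual attributes satisfy a condition, which is exactly the kind of attribute data described earlier:

// Illustrative sketch: a role whose activation depends on context attributes.
import java.util.Map;
import java.util.function.Predicate;

final class ConstrainedRole {
    final String name;
    final Predicate<Map<String, String>> activationConstraint;

    ConstrainedRole(String name, Predicate<Map<String, String>> activationConstraint) {
        this.name = name;
        this.activationConstraint = activationConstraint;
    }

    boolean isActive(Map<String, String> context) {
        return activationConstraint.test(context);
    }
}

// Example: "payment-approver" only activates from a managed device within office hours.
// new ConstrainedRole("payment-approver",
//     ctx -> "managed".equals(ctx.get("deviceType"))
//         && "office-hours".equals(ctx.get("timeWindow")));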


Having said the above, I do believe that context will become increasingly important, but when jumping into these discussions, please note the difference between these two concepts.

Monday, December 16, 2013

Native REST in OpenDJ and the REST2LDAP gateway

Directory services are an ideal way to structure and store identity data at exceptional scale. The long-serving protocol of choice is LDAP, and of course integrating with or building solutions around directories has often involved LDAP SDKs designed some 15 years ago with almost zero standards to build on. For Java platform developers, however, the JNDI API emerged.

Though the intent was for JNDI to be that standard on the Java platform, it hasn't evolved with the rest of the platform and lacks basic properties such as generics and concurrency support.
ForgeRock's OpenDJ SDK and the like provide an answer to this issue, but are still very LDAP-oriented, with the learning curve associated with LDAP operations and the LDAP data model.

Development using JNDI is time-consuming and far from intuitive, even for seasoned engineers. Annoying problems, such as the domain separator being a slash instead of a dot, result in confusion and difficult debugging, especially as we are dealing with URLs.
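
For contrast with the REST calls shown later in this post, here is roughly what a basic search looks like with plain JNDI (the host, port, credentials and base DN are examples only):

// A plain JNDI search against an LDAP directory.
import java.util.Hashtable;
import javax.naming.Context;
import javax.naming.NamingEnumeration;
import javax.naming.directory.InitialDirContext;
import javax.naming.directory.SearchControls;
import javax.naming.directory.SearchResult;

public class JndiLookup {
    public static void main(String[] args) throws Exception {
        Hashtable<String, String> env = new Hashtable<String, String>();
        env.put(Context.INITIAL_CONTEXT_FACTORY, "com.sun.jndi.ldap.LdapCtxFactory");
        env.put(Context.PROVIDER_URL, "ldap://localhost:1389");
        env.put(Context.SECURITY_AUTHENTICATION, "simple");
        env.put(Context.SECURITY_PRINCIPAL, "uid=stevie,ou=people,dc=example,dc=com");
        env.put(Context.SECURITY_CREDENTIALS, "wonder");

        InitialDirContext ctx = new InitialDirContext(env);
        try {
            SearchControls controls = new SearchControls();
            controls.setSearchScope(SearchControls.SUBTREE_SCOPE);
            NamingEnumeration<SearchResult> results =
                ctx.search("ou=people,dc=example,dc=com", "(uid=newuser)", controls);
            while (results.hasMore()) {
                System.out.println(results.next().getAttributes());
            }
        } finally {
            ctx.close();
        }
    }
}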

The future of LDAP is often debated since it is tricky and time-consuming to use, which results in higher development costs. Considering that LDAP is pretty much unavoidable in today's enterprises, it's surprising that fundamental LDAP training is not part of the required curricula for software engineers. At great cost, this important knowledge is ignored by most students coming out of university and is also often neglected by most startups building new and innovative solutions.

ForgeRock has spent a tremendous amount of effort to provide a RESTful interface around our directory, OpenDJ, exposing all the power of the LDAP protocol and OpenDJ but with the simplicity of REST, while at the same time maintaining high performance and scalability. Technically, this means that OpenDJ exposes its directory data, such as users, organizations and groups, over HTTP as JSON resources.
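
As a small sketch of what that looks like from a client's point of view, assuming the HTTP connection handler listens on port 8080 and exposes users under /users (adjust the URL and credentials to your deployment), reading a user is just an authenticated HTTP GET returning JSON:

// Minimal sketch: read a user as a JSON resource over HTTP with basic authentication.
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.util.Base64;

public class RestRead {
    public static void main(String[] args) throws Exception {
        URL url = new URL("http://localhost:8080/users/newuser"); // example endpoint
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        String credentials = Base64.getEncoder()
                .encodeToString("stevie:wonder".getBytes("UTF-8"));
        conn.setRequestProperty("Authorization", "Basic " + credentials);

        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(conn.getInputStream(), "UTF-8"))) {
            String line;
            while ((line = reader.readLine()) != null) {
                System.out.println(line); // the user entry printed as JSON
            }
        }
    }
}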

The business benefits from using the REST interface to OpenDJ because it means that applications relying on directory services have a significantly shortened time to market and development time. The simplicity of REST also ensures higher quality assurance and more thorough testing. As an example, in only a week's time, one of our partners built a web application for a hospital that included different views for different personnel (physicians, nurses etc.) without having to train its staff on LDAP and its best practices. It is clear that using the REST API reduces development costs and accelerates time to revenue for new services and applications.






Friday, December 6, 2013

Friday reflections on Software Releases

Imagine a car with a state-of-the-art navigation system, an aerodynamic body and fantastic ceramic brakes, doing 0-60 mph in 4 seconds, but lacking the capability of putting the car into reverse. Now, imagine a smartphone that has fantastic new features, a slick design and innovative human-computer interaction via touch screen, but lacks the capability to copy and paste text.

My guess is that you wouldn't buy a car that you can't reverse, but clearly when Apple launched the iPhone without copy-and-paste capability, people bought it. The Apple team made well-thought-through decisions to launch a product before all the details were fully implemented, and I can smell Agile development behind the scenes.

The goals of Agile development are efficiency and velocity: allowing a product to fail quickly if that is its ultimate destiny, or to adapt and include customer requirements, new or previously known, while at the same time getting features and functionality into the hands of customers quickly, either to solve business problems they might have and/or to gather more feedback on implemented features to improve the software.

In my mind, "release early, release often" is critical to a young product's success. It is a software development philosophy that was popularized by Eric S. Raymond in his The Cathedral and the Bazaar, where Raymond stated "Release early. Release often. And listen to your customers." This model is of course ideal for companies providing open source software, such as ForgeRock, where I work.

The Agile Manifesto states four important values for improving software development:

  • Individuals and interactions over processes and tools
  • Working software over comprehensive documentation
  • Customer collaboration over contract negotiation
  • Responding to change over following a plan


Each individual sprint should provide viable features solving real problems and capture the feedback from customers. Evolving the product is what makes it ultimately successful, and of course that is done by interacting with the customers. Having said all of the above, I do believe it's important not to neglect the details.

Agile development should never be an excuse for a lazy product manager (or owner, if you wish) not to scribble down the details and explain the requirements as part of a user story. The balance, as a product manager and dev team, is to understand that not all details need to be implemented in order for the software to work, and to stay true to the release-early, release-often philosophy.

Wednesday, November 27, 2013

XACML: Dead or Alive


In order to bring clarity to whether XACML is dead, as some analysts have claimed, or alive, we first need to understand what XACML is and where it fits in.


First of all, XACML stands for eXtensible Access Control Markup Language and is the current de facto standard for a declarative language to define access control policies. It is fully implemented in XML. It is an OASIS standard, and XACML 3.0 was released in January 2013. Of course, the idea behind a standard is to promote common terminology and ensure interoperability between access control implementations, independently of whatever vendor is behind a particular technology or solution.


A closer look at XACML shows that it is primarily aimed at attribute-based access control (ABAC) systems, where the foundation for policy declarations is based on attributes and their values; however, Role Based Access Control (RBAC) can also be expressed in XACML as a specialization of ABAC. Beyond the "eXtensible" in its name, XACML enables externalization and encourages the separation of access decisions via Policy Enforcement Points (PEPs).


The primary idea behind externalizing authorization and access policies from applications is to centralize access policy governance, but the benefits can be seen in three different areas, according to Axiomatics, a leading vendor and provider of XACML-based solutions.


  • In software development: Policy Enforcement Points (PEPs) are standardized components intended to be re-used. Rather than implementing application-specific logic in each application to determine what each user is allowed to do, PEPs make calls to a central Policy Decision Point (PDP), as sketched after this list. The end result, of course, is faster development and deployments.


  • In software life-cycle management: Fundamental change requests with regard to entitlements – for instance to meet regulatory compliance requirements – are managed with centralized policies. There is no need to change configurations or functionality in individual applications or services.


  • In operations: Entitlement-carrying attributes are widely managed in day to day line of business activities. Identity & Access Management solutions can to a large extent be embedded in existing business processes rather than demanding a separate administrative effort.
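
To make the first bullet concrete, here is a rough, vendor-neutral sketch (the interfaces are illustrative, not the OpenAZ API or any XACML vendor's API) of a PEP delegating its decisions to a central PDP:

// Vendor-neutral sketch of externalized authorization: the PEP in the
// application asks a central PDP for a decision and only enforces it.
import java.util.Map;

enum Decision { PERMIT, DENY, INDETERMINATE, NOT_APPLICABLE }

interface PolicyDecisionPoint {
    Decision decide(String subject, String action, String resource,
                    Map<String, String> environment);
}

final class PolicyEnforcementPoint {
    private final PolicyDecisionPoint pdp;

    PolicyEnforcementPoint(PolicyDecisionPoint pdp) {
        this.pdp = pdp;
    }

    void enforce(String subject, String action, String resource,
                 Map<String, String> environment) {
        if (pdp.decide(subject, action, resource, environment) != Decision.PERMIT) {
            throw new SecurityException("Access denied by central policy");
        }
    }
}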



Now, having described the above, it is important to separate Externalized Authorization Management (EAM) from the XACML standard and language.




EAM in the enterprise world is far from new and has been implemented on mainframes, e.g. via RACF, for decades. So just because enterprises are not leveraging XACML doesn't mean they are not doing EAM.


Some of the arguments for why XACML has been declared dead highlight poor adoption in a broad sense among large enterprises who have written their own authorization engines, that it is not suitable for the cloud and distributed deployments, and that refactoring and rebuilding existing in-house applications is not an option.



As we all know, the current trend over the last few years is that APIs are shifting more and more towards JavaScript Object Notation (JSON) in both requests and responses, which means that the more verbose XML notation is being dropped in favor of the less verbose JSON. The reason for that is of course ease of use, but also that we can reach better scalability when there is no need to parse XML. This would of course throw gasoline on the "XACML is dead" debate, but the community around XACML has been actively working on JSON and RESTful bindings for XACML, which should lead to increased XACML adoption. Also, let's not forget the OpenAZ project, which moves forward in providing a vendor-agnostic PEP API, open to anyone to use.


What about OpenAM and XACML?


  • Import/Export tool for XACML policies
    XACML can be a medium both to transfer and to store access control policies. In OpenAM, policies are stored in a proprietary format; XACML is not used per se to store them, so policies need to be converted to the native format for storage. As part of OpenAM, ForgeRock provides an Import/Export tool for XACML policies. More about that can be found in the documentation for OpenAM.


  • Java SDK for XACML
OpenAM has a Java API available to reach out to the XACML service; however, it currently only supports XACML v2.


  • XACML2 SAML profile
OpenAM provides the XACMLv2 query capability in the Fedlets.


When it comes to managing fine-grained entitlements through a GUI, this is something that is currently on the roadmap for OpenAM, and I know product management is looking into providing support for XACML v3. What is known is that there is business interest, implying that XACML is far from dead from big enterprises' and governments' point of view.


One of the potential issues with XACML v3 is the amount of traffic generated when millions of policies need to be evaluated. OpenAM is built with performance in mind, and clearly effort needs to go into making XACML v3 perform for the modern web, especially if it is to find a good fit inside OpenAM.


To conclude this post, I want to quickly deviate a bit from XACML. Recently I spent a lot of time at a large prospect customer whose entire Identity Management system was built up around SPML, and we all know how people have been claiming SPML is dead. A key requirement for this prospect was to keep serving the current deployment and deliver the provisioning capabilities via SPML, but rip out and replace the backend. They had no intention of changing this interface! Clearly SPML is not dead, and neither is XACML, and I doubt XML will disappear anytime soon. More likely, the turkey going into the oven is dead. Happy Thanksgiving!

Sunday, November 24, 2013

Thoughts around Product Strategy

I have had reason lately, due to an upcoming workshop within my company, to give product strategy some thought. Product strategy is an important factor in the success or failure of a product or a suite of products, and it's important to stay on top of it, possibly fine-tuning it to fit the needs of an identified customer base.

Setting a product strategy is a critical step when beginning the journey of a product, or for that matter a set of products. The strategy defines and outlines the direction that needs to be taken, but also the goal of a particular product.

Defining a product strategy is part of the job of a product manager, and it's necessary that this strategy answers some critical questions: who will buy this product, and why? What business problems does this product solve, and what value does it deliver for the customer buying it?

It is also important, however, to outline the minimal capabilities a product requires to fulfill customer needs, and to define the characteristics of why that product is better, or easier, or whatever the unique selling point associated with the product might be.

The product strategy's overall purpose is to help your company achieve corporate goals, and it should therefore be aligned to the corporate strategy. The corporate strategy should direct the goals for the product strategy. For maximum success, it's extremely important that domain expertise is valued and empowered by management within an organization developing a product. This is the job of the product manager. An unempowered product manager is a product manager who will struggle with the task; ultimately the product will suffer and the team behind it will lose morale. If the company strategy is weak in providing direction to product strategies, this may also impact the individual product strategies negatively.

Input should come from those able and capable of understanding the market segment. Getting a good understanding of the competitive landscape is critical, and it's important to point out that this applies not only to capabilities, features, ease of use and so on, but also to the pricing model as the basis for selling it.

Developing the product strategy, although owned by product management, involves multiple functions within the company and close collaboration between these different functions. Often you will find conflicts between functions which need to be resolved or bridged. However, ownership should never slip into the hands of others.

A product strategy should not be a 60-page document outlining all the details, but a message that answers the questions mentioned above in a concise and well-articulated way, easy enough for everyone to understand to allow buy-in. The roadmap and the execution or project plan are merely vehicles to reach the goal of the strategy, and if all goes well, the product provides a revenue stream for the company.

To conclude, the most important factor around product strategy is to have empowered owners of the product strategy, to allow for maximum buy-in from the teams developing, marketing, selling, deploying and supporting a product.

Wednesday, November 20, 2013

ForgeRock Common REST API overview


One of the unique features of the ForgeRock Open Identity Stack is that all components of the stack share a single, easy-to-use RESTful web API. REST stands for Representational State Transfer and is a technique that relies on stateless, client-server, cacheable communication over the HTTP protocol. Not only is REST a more lightweight alternative to traditional SOAP-based web services, but its HTTP nature also makes it easy to use in a firewall-controlled environment.


The Open Identity Stack contains three different products, each with individual modules and capabilities: OpenAM, OpenDJ and OpenIDM. The Common REST API, or CREST, can be used to access and leverage all the underlying modules and features with a set of easy-to-remember REST operations (CRUDPAQ):



  • Create: Add a resource that does not yet exist
  • Read: Retrieve a single resource
  • Update: Replace an existing resource
  • Delete: Remove an existing resource
  • Patch: Modify part of an existing resource
  • Action: Perform a predefined action
  • Query: List a set of resources



OpenAM
OpenAM offers a RESTful API for authentication, logout, cookie information, token attribute retrieval and token validation, authorization, OAuth 2.0 authorization, OpenID Connect 1.0, self-registration, password management, managing identities, managing realms and logging.



OpenDJ
The present implementation in OpenDJ maps JSON resources onto LDAP entries, meaning REST clients can in principle do just about anything an LDAP client can do with directory data.


OpenIDM
OpenIDM provides an implementation that allows you to manipulate managed objects as well as system objects.


Interacting with the RESTful API
There are a number of ways and programming languages with which you can easily interact with the ForgeRock Common REST API. An easy way to invoke REST calls is to get hold of a REST client that lets you provide the necessary details, save calls and tweak them as you play with the interface. One REST client that is often referred to in our documentation is curl (http://curl.haxx.se), a free command-line tool for submitting data with URL syntax.


Some samples using ForgeRock Common REST


Let's retrieve a user from OpenDJ, authenticated as stevie with password wonder. (The URL below assumes the OpenDJ REST interface is listening on localhost port 8080 and exposing users under /users; adjust it to match your deployment.)


$ curl \
 --request GET \
 --user stevie:wonder \
 http://localhost:8080/users/newuser?_prettyPrint=true
{
 "_rev" : "000000005b337348",
 "schemas" : [ "urn:scim:schemas:core:1.0" ],
 "contactInformation" : {
   "telephoneNumber" : "+1 408 555 1212",
   "emailAddress" : "newuser@example.com"
 },
 "_id" : "newuser",
 "name" : {
   "familyName" : "New",
   "givenName" : "User"
 },
 "userName" : "newuser@example.com",
 "displayName" : "New User",
 "meta" : {
   "created" : "2013-04-11T09:58:27Z"
 },
 "manager" : [ {
   "_id" : "opope",
   "displayName" : "Olivia Pope”
 } ]
}


In OpenIDM we can simply create a new user using:


$ curl --header "Content-Type: application/json" \
 --header "X-OpenIDM-Username: openidm-admin" \
 --header "X-OpenIDM-Password: openidm-admin" \
 --request PUT --data '{ "userName":"joe", "givenName":"joe", "familyName":"smith", "email":"joe@example.com", "phoneNumber":"555-123-1234", "password":"TestPassw0rd", "description":"My first user" }' http://localhost:8080/openidm/managed/user/joe


In OpenAM we can perform an authentication with the following call (the endpoint URL below assumes OpenAM is deployed locally under /openam on port 8080; adjust it to match your deployment):


$ curl --request POST \
 --header "X-OpenAM-Username: demo" --header "X-OpenAM-Password: changeit" \
 --header "Content-Type: application/json" --data "{}" \
 http://localhost:8080/openam/json/authenticate
{ "tokenId": "AQIC5w...NTcy*", "successUrl": "/openam/console" }


The above are just three simple calls to showcase the ease of use and flexibility of the Common REST API the Open Identity Stack offers. Check out the suggested reading links for more samples and information on how to leverage the capabilities exposed by the API.


Suggested reading


OpenAM:




OpenDJ:
