Friday 25 July 2014

In the light of the Internet of Things and OpenIDM

I am a huge fan of hotels that meet my standard of living, and during ForgeRock's recent IRM Summit in Phoenix, AZ, I spent some time aligning ForgeRock's OpenIDM with a rather interesting use-case that could benefit a lot of hotels.

Imagine showing up late at the Waldorf=Astoria Biltmore Hotel straight from an 18-hour journey from across the pond, with no real desire to queue up to get checked in and sorted out. What could be more convenient than having your hotel room information texted to your cellphone the moment you land, with your NFC-enabled smartphone already set up to let you into your room for a quick shower and instantaneous relaxation?

With OpenIDM, supporting this type of use-case is easy, and it was in fact demonstrated at the summit using a lock that opens and closes based on the credentials set up on your smartphone. Provisioning user account information, along with devices and things, is straightforward in the RESTful world unfolding in front of our eyes, and Gemalto provides this type of cloud service.
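As a rough illustration of how simple that provisioning can be, here is a hedged sketch of creating a managed user over OpenIDM's REST interface. It assumes a default OpenIDM installation listening on localhost:8080 with the out-of-the-box administrative credentials, and the attribute names follow the default managed user schema:

$ curl --request POST \
  --header "X-OpenIDM-Username: openidm-admin" \
  --header "X-OpenIDM-Password: openidm-admin" \
  --header "Content-Type: application/json" \
  --data '{"userName": "jdoe", "givenName": "John", "sn": "Doe", "mail": "jdoe@example.com"}' \
  "http://localhost:8080/openidm/managed/user?_action=create"

The same RESTful pattern extends naturally to devices and things: they can be modeled as additional managed object types that are created, linked to a user, and synchronized to target systems in exactly the same way.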

Make sure to check out this video, which demonstrates the use-case and explains the details.

Friday 9 May 2014

10-minute demonstration of Roles in OpenIDM 3.0

In this 10-minute video, role-based provisioning is demonstrated. We will see how to create a new role, assign that role to a user, remove the role from the user, and delete the role.
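For the impatient, the same steps can be approximated over REST. The sketch below assumes a default OpenIDM 3.0 installation on localhost:8080 with the standard admin credentials; the role name, the user lookup, and the exact patch syntax are illustrative and may differ slightly between releases:

$ curl --request PUT \
  --header "X-OpenIDM-Username: openidm-admin" \
  --header "X-OpenIDM-Password: openidm-admin" \
  --header "Content-Type: application/json" \
  --header "If-None-Match: *" \
  --data '{"properties": {"name": "employee", "description": "Sample employee role"}}' \
  "http://localhost:8080/openidm/managed/role/employee"

$ curl --request POST \
  --header "X-OpenIDM-Username: openidm-admin" \
  --header "X-OpenIDM-Password: openidm-admin" \
  --header "Content-Type: application/json" \
  --data '[{"operation": "add", "field": "/roles/-", "value": "managed/role/employee"}]' \
  "http://localhost:8080/openidm/managed/user?_action=patch&_queryId=for-userName&uid=jdoe"

Removing the role from the user is the mirror image (a patch with a remove operation), and deleting the role is a DELETE request against managed/role/employee.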


Tuesday 6 May 2014

Installing and integrating OpenDJ and OpenIDM in 10 minutes

In this 10-minute tutorial, you will learn how to install OpenDJ and OpenIDM, and how to configure OpenIDM to reconcile with OpenDJ.
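Once the LDAP connector and mapping are in place, a reconciliation run can also be triggered over REST. A minimal sketch, assuming OpenIDM on localhost:8080 with default admin credentials and a mapping named systemLdapAccounts_managedUser (the name used in the standard LDAP samples):

$ curl --request POST \
  --header "X-OpenIDM-Username: openidm-admin" \
  --header "X-OpenIDM-Password: openidm-admin" \
  "http://localhost:8080/openidm/recon?_action=recon&mapping=systemLdapAccounts_managedUser"

The call returns a reconciliation run ID that can be used to query the outcome of the run.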



Friday 28 March 2014

Using OpenAM Realms for Large Scale Access Management

ForgeRock OpenAM provides a powerful Realms facility, enabling OpenAM to be used in internet-scale deployment scenarios. Realms can be used to divide user populations and their associated configurations, with each realm treated as a discrete administrative unit.


For each Realm, an authentication process can be defined, specifying the location of the authentication repository for users and groups, such as an Active Directory or LDAP directory, and the type of authentication required. Policies can also be defined per Realm to determine whether or not an authenticated user is allowed to access a resource protected by OpenAM. Realms allow for the separation of configuration data for the services used per realm, as well as for federation purposes.


For administrative purposes, delegated administration enables user populations to be separated and sliced via realms, so that they can be managed in a distributed fashion. Delegated administration is a critical component in the deployment of large user populations, where central administration becomes unwieldy and too complex to be practical.


Service providers can use Realms to centralize authentication and authorization for multiple customers (often referred to as tenants) from different companies. Realms provide the boundaries, separating the tenants from each other and ensuring that they do not authenticate to each other or obtain authorization to access resources from each other. Delegated administration can therefore occur at the tenant level.


The ForgeRock IRM suite shares a common RESTful API, allowing Realms to be created, managed, and configured via REST and also through the CLI and the Administrative Console.


For example, you can create a Realm via REST simply by issuing an HTTP POST (assuming here that OpenAM is deployed at openam.example.com), as follows:


$ curl --request POST \
  --header "iplanetDirectoryPro: AQIC5w...2NzEz*" \
  --header "Content-Type: application/json" \
  --data '{ "realm": "testRealm" }' \
  "https://openam.example.com:8443/openam/json/realms?_action=create"
{"realmCreated":"/testRealm"}


The only required field is realm. Note, however, that the realm will not be active until its status is set.
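Setting the status is another REST call. The following sketch reuses the SSO token from the example above and assumes the same deployment URL; the attribute name sunOrganizationStatus reflects the realm's underlying data store attributes and may vary between OpenAM versions:

$ curl --request PUT \
  --header "iplanetDirectoryPro: AQIC5w...2NzEz*" \
  --header "Content-Type: application/json" \
  --data '{ "sunOrganizationStatus": "Active" }' \
  "https://openam.example.com:8443/openam/json/realms/testRealm"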


The caveat with the current Realms implementation in OpenAM is that logs are not yet separated per Realm. This is being addressed in future product enhancements, as ForgeRock moves towards a full-blown multi-tenant architecture.


For the curious and the technically savvy, given the long history and maturity of OpenAM, it’s worth mentioning that Realms were previously called “organizations” (prior to OpenSSO). This is why the OpenAM SDK often refers to an “organization object”, rather than a “realm object”, for backward compatibility.


In summary, OpenAM Realms offer a convenient way to slice a large-scale user population into manageable chunks that can be configured separately from one another. Realms also enable delegated administration - critical in distributing the administration load and in addressing the challenges of consumer-facing identity and access management for the modern web.

Thursday 27 March 2014

Adaptive Risk Authentication and the IRM Pillar Dynamic Intelligence

The requirements and demands placed on Identity and Access Management solutions have shifted. Viewed with a fresh pair of eyes, the industry now calls this Identity Relationship Management (IRM). One of the pillars defining IRM is Dynamic Intelligence rather than Static Intelligence: static intelligence covers what we already know about a user, whereas dynamic intelligence includes things that might be spontaneous, such as the geographical location of a login.


In the past, access patterns were predictable: employees logged in from their usual desktop PCs and accessed the applications required to perform their job tasks. As the modern world has changed user behavior, that predictability is gone. Enterprise users access applications and services from a wide variety of locations and devices, including laptops, smartphones, and their traditional desktop PCs. Looking beyond the enterprise at consumers, these users can leverage any device, from their home TV to their car.


A modern IRM solution must be able to cater for these dynamic access patterns and must understand the myriad circumstances in which a user accesses a particular service in order to grant access. Ultimately, authorization must also be dynamic and must provide the appropriate content or entitlements on the fly. A concrete example is a user trying to access services while logged in from a different country. An adaptive IRM system will adjust to the particular circumstances and will potentially ask for additional authentication beyond simple credentials.

This use-case can be addressed with the latest release of ForgeRock OpenAM, which provides an Adaptive Risk authentication module (introduced in OpenAM 10). Adaptive Risk is not an authentication mechanism in itself; rather, it identifies potential risks (such as logging in from a different country, as in the example above), assigns a risk score, and then determines whether the login should require additional means of authentication. The additional authentication requirement can be anything from a one-time password to some type of hardware mechanism.
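From a client's point of view, nothing changes in how authentication is initiated; the Adaptive Risk module simply sits in an authentication chain and raises the bar when the risk score crosses its threshold. A rough sketch of logging in against such a chain over REST follows; the chain name adaptiveChain, the deployment URL, and the demo credentials are assumptions for illustration only:

$ curl --request POST \
  --header "X-OpenAM-Username: demo" \
  --header "X-OpenAM-Password: changeit" \
  --header "Content-Type: application/json" \
  "https://openam.example.com:8443/openam/json/authenticate?authIndexType=service&authIndexValue=adaptiveChain"

If the accumulated risk score stays below the configured threshold, the response contains a token ID; otherwise the chain falls through to the next module, and OpenAM responds with the callbacks for that additional authentication factor.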


An authentication mechanism that adapts to specific circumstances is a perfect example of the IRM pillar of “Dynamic Intelligence” and provides additional assurance that a user really is who she claims to be.


Some of the risk factors that can be defined and evaluated dynamically include IP ranges, IP history, known cookies (and their values), time since last login, profile attributes and their values, and geographical location.


Risk-based, adaptive authentication was pioneered in the financial sector. As the technology has become mainstream, however, more and more organizations are deploying adaptive authentication to deal intelligently with the growing number of mobile users and to mitigate the risk posed by users who fraudulently claim to be someone else in order to commit malicious actions.

Thursday 20 February 2014

Two-factor authentication for mobile with OATH

Nearly all major smartphone and tablet platform vendors, such as Apple, Google, Microsoft, and BlackBerry, support software tokens as an inexpensive yet secure way to perform two-factor authentication. The majority of these vendors leverage OATH HOTP for this service, and the fact that ForgeRock OpenAM supports OATH HOTP is a good reason to drill into this a little deeper.

Two-factor (or multi-factor) authentication is nothing new and is commonly found in applications where the basic authentication process of providing just a password, a PIN code, or the swipe of a card isn’t enough. The purpose of multi-factor authentication is to lower the probability that the user trying to authenticate is presenting false credentials as evidence of his or her identity. The rationale goes: the more factors, the higher the probability that the user is who he or she claims to be.

In the case of two-factor authentication, the two factors are typically something the user knows (such as the classic password, or private identity details such as a mother’s maiden name or a first pet’s name) and something the user has (such as an access card or a cellular phone). A third factor might be something the user is, such as biometric data from the face or a fingerprint, or voice characteristics.

Now, OATH (short for Open Authentication) is a collaborative effort by various members of the IT industry to provide a reference architecture for universal strong authentication across all users and devices over all networks. OATH is open and royalty free, for anyone to implement and use. HOTP is short for HMAC-based One-Time Password. The idea behind OATH is to reduce complexity and lower the cost of ownership by allowing customers to replace proprietary security systems, which are often complex and expensive to maintain.

ForgeRock OpenAM can be configured to support OATH-based HOTP for two-factor authentication. For the technically savvy who like to try things out, there is an easy-to-follow guide on configuring OpenAM in this way. As a side note, OpenAM also supports TOTP, or time-based one-time passwords. HOTP and TOTP are described in RFC 4226 and RFC 6238, respectively.
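To make the HOTP mechanics a little more concrete, here is a small, purely illustrative bash/openssl sketch of the RFC 4226 algorithm. It is not how OpenAM implements the module; it just walks through the HMAC, dynamic truncation, and modulo steps using the test key and counter published in the RFC (expected output: 755224):

#!/bin/bash
# HOTP per RFC 4226: HMAC-SHA1 over an 8-byte counter, then dynamic truncation.
KEY="12345678901234567890"   # RFC 4226 test vector key
COUNTER=0                    # RFC 4226 test vector counter

# HMAC-SHA1 over the 8-byte big-endian counter, as hex
HMAC_HEX=$(printf '%016x' "$COUNTER" | xxd -r -p | \
           openssl dgst -sha1 -hmac "$KEY" -binary | xxd -p)

# Dynamic truncation: the low nibble of the last byte gives the offset
OFFSET=$(( 16#${HMAC_HEX:39:1} ))
BIN_CODE=$(( 16#${HMAC_HEX:$(( OFFSET * 2 )):8} & 0x7fffffff ))

# Reduce to a six-digit one-time password
printf '%06d\n' $(( BIN_CODE % 1000000 ))

TOTP differs only in that the counter is derived from the current Unix time divided by a time step (typically 30 seconds), as described in RFC 6238.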

There are a number of use-cases where this added security comes in handy.

  • Self-service password resets
    One of the most common problems users encounter is forgetting their passwords and being unable to log in to work or to purchase goods and services. The traditional approach to this issue is to leverage so-called challenge/response questions - a set of predefined or user-defined questions whose answers are used to reset the password. In the light of what people tend to share on social media sites, these types of questions typically introduce a weakness in the security perimeter. In fact, the U.S. Federal Financial Institutions Examination Council (FFIEC) and the U.S. Federal Deposit Insurance Corporation (FDIC) strongly caution financial organizations against adopting authentication methods that use personal information for authentication purposes. With OATH tokens, an easy second factor can be introduced, thereby mitigating some of the associated risks.

  • Accessing cloud applications via single sign-on
    In the new and modern world without boundaries, moving freely between on-premise applications and off-premise cloud applications might cause your company's security experts sleepless nights. Introducing step-up authentication with an OATH HOTP when accessing cloud apps could be one precaution that lets those security experts sleep better.

  • Step-up or risk-based authentication
    Assume that, from within an application, you are accessing sensitive information, or that you are logging in with an unusual pattern (for example, from a different network or a new device). Providing a second authentication factor in this case again mitigates some of the risk, and OATH is a cost-effective way to do so.



To conclude, we have discussed what OATH is and some of its typical use-cases. OpenAM provides the necessary capabilities to implement a cost-effective two-factor (or multi-factor) authentication process that increases user and consumer acceptance of stronger authentication. OATH allows your organization to remain compliant and to follow the guidelines set out by the FFIEC, the FDIC, and others, while reducing the risk and implications of identity theft. It does so by offering multiple factors of authentication, allowing users and consumers to more accurately prove that they are who they claim to be.





Tuesday 18 February 2014

OAuth 2.0 Explained, and What's the Business Value?


With the latest release of ForgeRock OpenAM in November 2013, there is a lot of talk about OAuth 2.0 and how OpenAM can act both as an OAuth 2.0 authorization server and as an OAuth 2.0 client for installations where web resources are protected by OpenAM. In this article, we’ll focus on OpenAM as an OAuth 2.0 authorization server and on the business value of this capability.


Assume you rock up at a fancy restaurant in your $200k sports car and hand the key to the 18-year-old valet tasked with driving it off to a parking lot a block away. Somehow you wish you could restrict the performance of your car: keep it below a certain speed, block access to the trunk and glove compartment, prevent the built-in communication package from making outgoing calls, and only allow it to travel a certain distance.


The solution, of course, is a valet key, designed specifically for this type of situation. The same situation exists on the modern web. As a user, you might want to share certain resources and capabilities from one service with another - but not the full shebang, and you certainly do not want to hand the credentials for your social media sites to third parties.

This is where OAuth 2.0 comes in. OAuth 2.0 is an open standard for access delegation and authorization that avoids explicitly sharing password information. OAuth 2.0 allows service providers to hand a “valet key” to a third party, granting limited access to services, possibly with an attached time constraint. In other words, OAuth 2.0 provides the mechanism for clients to access resources on behalf of the user or owner of those resources. The end user can select and authorize third-party access to resources without ever sharing secret credentials such as a username and password.


In simpler terms, OAuth 2.0 empowers people to perform delegated authorization on their own.


This concept is, of course, nothing new, and it is seeing more and more application on the modern web, where businesses wish to expose their APIs and share information with third-party applications in a secure manner.


In the world of OAuth 2.0, four roles are defined by the standard:


  • Resource Owner
    An entity capable of granting access to a protected resource. When the resource owner is a person, it is referred to as an end user.
  • Resource Server
    The server that hosts the protected resource, capable of accepting and responding to protected resource requests using access tokens. The resource can be anything, for example, photos, in the case of flickr.com.
  • Client
    An application that makes protected resource requests on behalf of the resource owner and with his authorization. The term "client" does not imply any particular implementation characteristics (for example, whether the application executes on a server, a desktop, or other device).
  • Authorization Server
    The server that issues access tokens to the client after successfully authenticating the resource owner and obtaining authorization.


In addition to these four roles, two different types of tokens are defined by the standard:


Access Token: Access tokens are credentials used by the client to access protected resources. An access token is a string that represents an authorization issued to the client. Tokens represent specific scopes and durations of access, granted by the resource owner and enforced by the resource server and authorization server. The access token provides an abstraction layer, replacing different authorization constructs, such as traditional credentials (username/password), with a single token that is understood by the resource server.


Refresh Token: Although not mandated by the specification, access tokens ideally have an expiration time, which can be anywhere from a few minutes to several hours. Refresh tokens are credentials that are used to obtain access tokens. Refresh tokens are issued to the client by the authorization server and are used to obtain a new access token when the current access token becomes invalid or expires.


OpenAM exposes a RESTful API that allows administrators to read, list, and delete OAuth 2.0 tokens. OAuth 2.0 clients can also manage their own tokens.
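To ground this in something concrete, here is a rough sketch of obtaining an access token from OpenAM acting as the authorization server, using the resource owner password credentials grant. The deployment URL, the registered client (myClientID and its secret), the demo account, and the cn scope are illustrative assumptions:

# Exchange resource owner credentials for an access token
$ curl --request POST \
  --user "myClientID:password" \
  --data "grant_type=password&username=demo&password=changeit&scope=cn" \
  "https://openam.example.com:8443/openam/oauth2/access_token"

# The JSON response typically carries access_token, refresh_token, token_type and expires_in.
# The token can then be inspected via the tokeninfo endpoint:
$ curl "https://openam.example.com:8443/openam/oauth2/tokeninfo?access_token=<access_token>"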


The Business Value of OAuth 2.0



Now that we have a rough idea of the purpose of OAuth 2.0, let’s look at how it fits the business puzzle and why it can be important to you and your organization.


More and more frequently, organizations are exposing their own APIs to attract third-party developers who can take advantage of the organization’s services to build apps and widgets, with the ultimate motive of driving top-line revenue. Companies like Amazon, Netflix, and even Sears do this to attract intermediaries who can connect buyers with them as sellers and create convenient ways to transact.


For this strategy to work, however, a certain level of trust must be established between the parties involved. Identity is an important piece of this puzzle, and it is imperative that organizations secure the services they expose via APIs. OAuth 2.0 is a core component of any Identity Relationship Management (IRM) platform and serves to address these security issues while providing a convenient way to deal with authorization.


From a developer perspective (as described by Ryan Boyd of Google), before the dawn of OAuth 2.0 inside Google, developers spent 80% of their time dealing with authorization in new applications. The ability to handle authorization more easily and in a standardized way cuts costs, saves time, and brings applications to market quicker.
 



Wednesday 12 February 2014

Bridging Enterprise and Cloud in the shadow of Mr. Snowden

As the software industry and the way we procure enterprise software shift from on-premise deployments with proprietary licensing to off-premise subscriptions, there is an emerging need to control the hybrid environment being created. For small startups with fewer than a hundred people on the payroll, it is natural to leverage cloud services for file storage, customer relationship management, mail and calendar, and so on, but if you look at older and/or bigger companies, there is often a mix of on-premise and off-premise deployments. Despite what the industry hype says, not everybody is in the Cloud, but many are taking tentative steps to explore this model.


Now, the problem that emerges from a hybrid environment is, of course, the issue of compliance and the regulatory concerns associated with shuffling identity data to SaaS providers, as well as the actual “shuffling” itself. It needs to be secure, in sync with any authoritative sources, and the appropriate logs need to be kept.
Whenever an enterprise outsources sensitive business information, such as personal information about employees, contractors, and partners, to a SaaS vendor in the Cloud, there is a risk of data privacy issues and access control complexities, not to mention the risk of foreign governments accessing the information for reasons other than national security. This is especially relevant in light of the recent interview with Mr. Snowden, the NSA whistleblower, in which, on the question of whether the NSA spies on Siemens, Mercedes, or other successful German companies, he states that such information would be collected in the name of national interest.




For companies and organizations embracing cloud services, it is critical not only to enforce and monitor controls, but also to make sure that users are who they claim to be and that the appropriate entitlements within the SaaS application are managed. Looking at this from the other side of the fence, SaaS vendors want customers to be on-boarded quickly and securely and want to offer some of the capabilities these concerns demand. This is where the need arises for Identity Bridges, or IDM systems that can manage both traditional enterprise applications and new SaaS applications.


In a hybrid environment, there is often a local, on-premise authoritative source for employee data, managed by, for example, the Human Resources department. Alternatively, there might be a directory of some sort, such as an LDAP directory or Microsoft Active Directory. In these cases it is convenient if cloud services are simply seen as additions to the current infrastructure and do not require additional management, such as juggling exported CSV files from an HR system that need to be uploaded or, worse, on-boarding and off-boarding users to a cloud service manually.


However, Jonathan Lehr predicts that identity and entitlements will move beyond the Active Directory paradigm as we know it today, especially with tools such as Workday already storing employee data in the Cloud.


ForgeRock OpenIDM provides an extensive integration layer that spans both on-premise enterprise applications and systems and cloud-based SaaS solutions, such as Salesforce and Google Apps. This allows an enterprise to control the provisioning flow and ensure that entitlements are set according to applicable policies. Identities can be kept in sync using OpenIDM’s discovery engine, reconciling and synchronizing down to the attribute level, while at the same time recording provisioning activities for monitoring, regulatory, and compliance purposes.


Managing boundaryless identities is one thing, but there are still fears, especially among Europeans, that data stored in the Cloud could be vulnerable to foreign surveillance. Sensitive data flows quickly between servers, possibly in multiple countries, making it very complex to regulate and protect - but at least with the appropriate tools, proper identity management can be done.

Wednesday 29 January 2014

Tackling some traditional IdM use-cases with OpenIDM

Although OpenIDM is built for consumer-facing identity management as part of the ForgeRock Identity Relationship Management stack, it provides a number of typical capabilities that traditional enterprises can use to tackle some of their problems. Since many of our customers are investigating alternatives to their Sun Identity Manager environment, I thought it would be helpful to describe the use-cases with this in mind.


Let's look at four typical use-cases that a Sun Identity Manager customer might have deployed and discuss how OpenIDM matches up.


1.) Orphan account detection
Sun IdM provides a reconciliation engine that allows customers to use XPRESS rules to define correlations between target resource accounts and the virtual identity in Sun IdM. Reconciliation runs per resource, compares accounts, and produces situations indicating whether accounts are matched, unmatched, not known, and so on.


OpenIDM offers a similar reconciliation engine, allowing these correlation rules to be migrated from XPRESS to JavaScript. The reconciliation results are similar to what Sun IdM offers, and OpenIDM also exposes the capability of invoking custom reactions to a discovered situation, such as running a script or invoking a BPMN 2.0 workflow. As in Sun IdM, reconciliation also provides the information needed to produce reports, such as orphan account reports.


A key differentiator from traditional IdM vendors is that OpenIDM is made for the consumer-facing world, where scale and performance are critical.


2.) Authoritative source driven provisioning
Sun IdM provides the ActiveSync mechanism, whereby certain connectors or resource adapters are extended with the capability of reacting to changes in near real time (via scheduled polling).


The ActiveSync process then discovers CREATE, UPDATE, or DELETE situations on resource accounts, and three different workflows parse a set of forms (typically referred to as ActiveSync forms) to manage the attribute transformations and the flow of identity data.


OpenIDM offers a similar capability and leverages the same set of connectors as Sun IdM. In the world of OpenIDM, this capability is referred to as LiveSync. LiveSync typically runs as a scheduled background process and, instead of UserForms and XPRESS to define the transformations, these are specified in mappings describing the flow from one system to another. The LiveSync life-cycle offers a number of hooks that allow you to specify actions, such as running custom scripts or invoking workflows, offering the same flexibility and capabilities as Sun IdM.
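For completeness, a LiveSync poll can also be triggered (or tested) on demand over REST against the system object endpoint. A minimal sketch, assuming OpenIDM on localhost:8080 with default admin credentials and a connector configured under the name ldap:

$ curl --request POST \
  --header "X-OpenIDM-Username: openidm-admin" \
  --header "X-OpenIDM-Password: openidm-admin" \
  "http://localhost:8080/openidm/system/ldap/account?_action=liveSync"

In a production setup, the same action is normally driven by the scheduler rather than called by hand.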


3.) Password management
A typical quick win and low-hanging fruit with Sun IdM was that once resource adapters or connectors were configured, password management came with the setup. Sun IdM allows you to specify governing password policies according to company requirements and to enforce them during password resets. Sun IdM could also intercept passwords on Active Directory by deploying a special plugin on the AD domain controllers. Self-service password resets were by default managed using challenge/response questions that could be specified by an administrator, self-defined, or a combination of the two.

OpenIDM provides equivalent functionality: managing passwords, specifying policies using flexible regular expressions in JavaScript rules, resetting and changing passwords accordingly, and leveraging challenge questions for self-service resets. OpenIDM also provides a plugin for AD to intercept passwords and allow them to be synchronized, as well as a plugin for OpenDJ that exposes the same capability there.


4.) Self-service requests
Sun IdM allows you to quickly and easily expose custom workflows that can interact with the virtual identity and the underlying integrated resources, for example to update attributes or to provision new accounts. OpenIDM exposes the same capability but, instead of using a proprietary workflow definition language, leverages the industry standard BPMN 2.0 to specify workflows.
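Because workflows are exposed through the same RESTful API as everything else, a self-service request can be kicked off with a single call. The sketch below assumes OpenIDM on localhost:8080 with default admin credentials and a deployed BPMN 2.0 process definition whose key is contractorOnboarding; both the key and the parameter are hypothetical:

$ curl --request POST \
  --header "X-OpenIDM-Username: openidm-admin" \
  --header "X-OpenIDM-Password: openidm-admin" \
  --header "Content-Type: application/json" \
  --data '{"_key": "contractorOnboarding", "justification": "New contractor starting Monday"}' \
  "http://localhost:8080/openidm/workflow/processinstance?_action=create"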



So, although OpenIDM really targets a different market segment with its consumer-facing approach and its focus on scalability, it can address some of the typical and traditional use-cases often found within enterprises. OpenIDM also provides an ideal platform to extend your enterprise to the Cloud, bridging that gap with straightforward user provisioning and administration. Furthermore, OpenIDM gives you the opportunity to expose identity management services via a common RESTful API that covers all capabilities in the product.


Friday 24 January 2014

How Open Source Software can impact your Business

The holidays are over and, no matter what Santa brought us for Christmas, it’s time again to shift focus back to the world of Digital Identity. ForgeRock is a unique player in the Identity and Access Management space, given our Open Source nature and our ability to deliver a comprehensive software stack to solve IAM-related business problems.


As we all know by now, open source software has a number of great advantages over proprietary software and I thought I would revisit some of these in this post.


Security
While no software can claim perfection, many recent studies provide a clear indication that if the source code is open for more people to inspect, vulnerabilities and bugs are more likely to be discovered and fixed.


Proprietary software vendors force their customers to accept whatever security their software has, and the pace at which patches and updates are released. In an open source model, customers have the option of fixing problems themselves, or of narrowing down the problem and raising the issue with the community for a fix. With closed source software, as a customer, you simply have no idea what surprises the code might have in store for you.



Customizability
While working in the field deploying Identity Management solutions at customer sites, I often cursed the fact that I never had access to the source code - so I could never make minor tweaks, such as adding to or altering the behavior of an integration with a target resource.


One of the true advantages of open source software is that business users can pick up any piece of software, modify it to fit their needs, and be done with it. Doing that with proprietary software is infinitely more difficult. Often, tricks such as decompiling with JAD must be used, which might violate the license agreement but is sometimes necessary just to get the job done.



Quality
Despite the saying “too many cooks spoil the broth”, there is research indicating that open source software (up to 1,000,000 lines of code) has a higher level of quality, largely due to the transparency and openness of the source code. More qualified developers can scrutinize the code, and bug fixes are delivered quicker through distributed collaboration. In this context, it is worth mentioning that ForgeRock OpenIDM has 247,163 lines of code as of this writing.



Freedom
Selecting open source software is often a conscious decision by a business to liberate itself from the effects of a traditional proprietary vendor’s “lock-in” strategy. Open source software gives its users greater control, better interoperability, and access to a (hopefully) thriving community of skilled developers who are well versed in the solution’s source code.


Another important aspect is the ability to take a project forward independently. Consider what happened when Sun Microsystems was acquired by Oracle, who already had an extensive Identity and Access Management stack with significant investments. Oracle made the decision to render many of the open source projects “non-strategic” going forward, essentially allowing the projects to die, but providing others with the freedom to pick these projects up and continue. In this way, open source provides some kind of insurance regarding the longevity of a project.



Flexibility and Interoperability
The ability to make changes to the source code, and the fact that many open source projects are less resource-intensive and do not follow traditional proprietary vendor upgrade schemes, allows you to be more flexible and agile. Many open source projects also take great pride in following standards, which enables greater interoperability with other components. In the age of cloud computing, interoperability has become a critical must-have.



It is easy to advocate the benefits of open source software, but all good things have a flip side, and there are some principal risks. Open source software is often easy to adopt, with a “try before you buy” philosophy. This practice can lead to unmanaged software assets, which can introduce technical and potentially legal challenges (such as intellectual property management, audit compliance, and security). The community is critical to realising the benefits of open source software. Other questions you need to ask pertain to the type of open source license that is used: is it a viral GPL, or a more business-friendly CDDL?


The barrier to entry for any open source software is low, and it is important to recognize that this low barrier, combined with the challenges outlined above, can result in high risk at a high cost. The way to mitigate this risk is, of course, to ensure that there is proper insurance in the form of a vendor backing the software. If you instead decide to “build it yourself”, assigning a set of engineers and maintaining the software on your own, be aware that the cost of maintenance increases over time. Even if the initial entry cost is low in terms of staffing, this cost will increase, and there is always the risk of competent people leaving the company.


The potential risks aside, when open source software is managed properly, the results are cost optimization, flexibility, and innovation, which should be on the minds of all CIOs.
