Archive for the ‘Identity’ Category

Troubleshooting the D1IM LDAP Authentication Module

October 10, 2015

During a recent Go-Live with Dell One Identity Manager 6.1.2 we ran into some issues establishing the IT-Shop authentication via the LDAP Authentication Module. I'd like to use this blog post to share all the know-how we were able to gather along the way.

So here’s the setup we were using:
We had a custom D1IM 6.1.2 IT-Shop application published to a set of Windows 2008 R2 based IIS web servers. The authentication had to be established against Novell eDirectory 8.8 SP8 using LDAPS via port 636.

This is the way it was intended to work out:
The user lands on the IT-Shop login page and is asked to enter a username and password. D1IM looks up the LDAPAccount table for a user with the given user name. When found, D1IM takes the Distinguished Name of the identified LDAPAccount object and uses it, in combination with the given password, to perform a bind via the LDAPADSI provider against the configured LDAP directory to authenticate the user. If the authentication succeeds, the user is logged into the IT-Shop as the Person object connected to the LDAPAccount object.
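The search-then-bind pattern described above can be sketched as follows. This is a minimal illustration only: the directory lookup and the bind are stubbed with plain Python dictionaries, and names like `find_ldap_account` and the sample DN are my own placeholders, not actual D1IM APIs.

```python
# Sketch of the search-then-bind flow the LDAP Authentication Module uses.
# The LDAPAccount lookup and the LDAP bind are stubbed so the flow is
# visible without a live directory; all names here are illustrative.

# Stand-in for the LDAPAccount table: user name -> Distinguished Name.
LDAP_ACCOUNTS = {
    "jdoe": "cn=jdoe,ou=users,o=example",
}

# Stand-in for the directory's credential store: DN -> password.
DIRECTORY_CREDENTIALS = {
    "cn=jdoe,ou=users,o=example": "s3cret",
}

def find_ldap_account(user_name):
    """Step 1: resolve the Distinguished Name for the given user name."""
    return LDAP_ACCOUNTS.get(user_name)

def bind(dn, password):
    """Step 2: simulate the LDAP bind with DN and password."""
    return DIRECTORY_CREDENTIALS.get(dn) == password

def authenticate(user_name, password):
    """Full flow: resolve the DN first, then bind with it."""
    dn = find_ldap_account(user_name)
    if dn is None:
        return False
    return bind(dn, password)
```

The point of the two steps is that the user never has to know their own DN; the lookup in step 1 is done with the application's credentials, and only the final bind uses the user's password.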

Here’s how we approached the activation of the LDAP Authentication Module:
We went into Designer, activated the Dynamic LDAP Authentication Module, and fed it with the configuration data, specifically the LDAP RootDN, the FQDN of the LDAP server (which initially was an F5 FQDN load-balancing across all the LDAP servers), the port number, and the authentication method. With this in place, we went into the web portal configuration tool to activate the LDAP authentication module.

Then we were pretty much done… At least, we thought we were. Testing the LDAP-based authentication ran into the first issue: in the IT-Shop log we saw the following error message:

Value RootDN is required for authentication

As we had configured the RootDN in the module configuration, we cross-checked the current setting against the actual RootDN of the LDAP directory and of the domain object established in D1IM, finding no fault at all. So we opened a service request with Dell Software Support.

Finding #1 was that the IT-Shop is not able to use the configuration of the LDAP authentication module. Instead, a manual entry in the web.config is needed, as the configuration cannot define the necessary key. Dell provided this information in a KB article pretty quickly. The KB can be found here:
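The general shape of such an entry is roughly the following. Note that the key name and value format here are purely illustrative placeholders of my own; the actual keys and values are defined in the Dell KB article:

```xml
<!-- Illustrative sketch only: the real key names come from the Dell KB article. -->
<configuration>
  <appSettings>
    <!-- Hypothetical placeholder for the LDAP authentication module settings -->
    <add key="SomeLdapAuthenticationKey"
         value="RootDN=o=example;Server=ldap.example.com;Port=636" />
  </appSettings>
</configuration>
```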

So we put our LDAP configuration into the web.config and retested, which ran into the next issue: looking at the IT-Shop log, we now found the next error message:

The server is not operational

So we used VINSProviderTest.exe to test logging into the LDAP directory from one of our web servers. Using the LDAPNovell provider, the login worked fine; using the LDAPADSI provider, which the LDAP authentication module relies on, it failed. So we suspected that the LDAP authentication module was simply not compatible with Novell eDirectory for LDAP authentication. While mailing back and forth with Dell Support, we came to the conclusion that our customer would need a custom authentication module talking LDAPNovell, which would have been a massive challenge due to the limited time until Go-Live.

But here’s the good news: it does work with Novell eDirectory for LDAP authentication if all the environment configuration is right.

As I mentioned, we had configured the F5 FQDN that load-balances all LDAP requests across the LDAP servers in the infrastructure. Prompted by Dell Support, we looked at the SSL certificates of the LDAP servers and found that the web servers did not have the Root CA certificate installed, so they did not trust the certificates at all. Installing the root certificate removed the warnings on the server certificates, but it did not help us authenticate. Next, we checked the certification path of the server certificates and recognized that they were server specific: nothing in them covered the F5 load balancer. So, as a next step, we asked the LDAP team whether there was any chance of adding subject alternative names carrying the F5 FQDN to their certificates, so that the LDAPADSI provider could authenticate successfully. But this approach was unsuccessful due to timing constraints with regard to our Go-Live.
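The mismatch we hit can be illustrated with a small hostname-vs-SAN check. This mirrors the kind of verification the TLS client performs against a certificate's subject alternative names; the SAN entries and hostnames below are made-up examples:

```python
# Check whether a hostname (e.g. the F5 FQDN) is covered by a certificate's
# DNS subject alternative names. Simplified sketch: exact matches plus a
# single leading wildcard label, which is the common certificate case.

def hostname_matches_san(hostname, san_entries):
    """Return True if hostname matches one of the DNS SAN entries."""
    host = hostname.lower().rstrip(".")
    for entry in san_entries:
        san = entry.lower().rstrip(".")
        if san.startswith("*."):
            # A wildcard covers exactly one leading label.
            suffix = san[1:]                      # e.g. ".example.com"
            label, sep, rest = host.partition(".")
            if sep and "." + rest == suffix:
                return True
        elif host == san:
            return True
    return False
```

In our situation, the equivalent of `hostname_matches_san("f5-ldap.example.com", ["ldap01.example.com", "ldap02.example.com"])` came back false: every certificate named only its own server, never the load balancer.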

Dell also published a KB article describing this issue, which can be found here:

So we moved on to the next workaround: connecting to one of the LDAP servers directly, bypassing the F5 load balancer. But even this was unsuccessful. We were close to the goal, but we still ran into the issue that authentication failed with the error message "The server is not operational". What came next has nothing to do with D1IM at all, but we were not aware of what was coming…

Being totally desperate, we got hold of one of the best Novell eDirectory engineers in the customer's organization to run an LDAP trace while we tried to log in. What we saw was pretty upsetting:

SSLv3 handshake failed: no matching ciphers found

So we started looking into the security configuration. Based on the customer organization's security policy, the system integrators had done some hardening on the IIS servers used for the D1IM IT-Shop, disabling everything except TLSv1.2 and disabling weak encryption ciphers. The next step in our troubleshooting was an approach by one of our system integrators: using the OpenSSL tool suite to probe the SSL/TLS features supported by the current instance of Novell eDirectory. The result was pretty sobering: SSLv3 and TLSv1.0, but nothing more secure. So the LDAP engineers opened a service request with Novell, which ended with being told that Novell eDirectory does not support TLSv1.1 and TLSv1.2. Such support will arrive with Novell eDirectory 9.0 sometime in 2016.
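The same kind of probing can be done from Python's `ssl` module instead of the OpenSSL command line: pin a connection to exactly one protocol version and see whether the handshake completes. The host and port below are placeholders; a real eDirectory LDAPS endpoint would be probed on port 636.

```python
# Probe whether a server completes a TLS handshake at exactly one version,
# roughly equivalent to `openssl s_client -connect host:636 -tls1_2`.

import socket
import ssl

def make_probe_context(version):
    """Build an SSLContext that allows only the given TLS version."""
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
    ctx.check_hostname = False
    ctx.verify_mode = ssl.CERT_NONE      # probing only; don't verify certs
    ctx.minimum_version = version
    ctx.maximum_version = version
    return ctx

def supports_tls_version(host, port, version, timeout=5.0):
    """Return True if the server handshakes successfully at that version."""
    ctx = make_probe_context(version)
    try:
        with socket.create_connection((host, port), timeout=timeout) as sock:
            with ctx.wrap_socket(sock, server_hostname=host):
                return True
    except (ssl.SSLError, OSError):
        return False

# Example (placeholder host): does the directory speak TLS 1.2?
# supports_tls_version("ldap.example.com", 636, ssl.TLSVersion.TLSv1_2)
```

In a setup like ours, this probe would have succeeded for TLSv1.0 but failed for TLSv1.2, making the handshake mismatch visible without an LDAP trace.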

After getting an exceptional approval from the customer's IT security office, we lowered the SSL/encryption level of our D1IM web servers. Since then, we have been successfully authenticating against the customer's Novell eDirectory.

So if you ever want to use the D1IM LDAP authentication module in the IT-Shop portal, this should cover most of the potential issues you can run into.

I also wanted to use this blog post to say "many thanks" to Rene O. from Dell Support and Ralph G. from the Dell development department in Dresden, who never lost faith in D1IM and the authentication module. I think we all learned a lot while dealing with this special service request.

Categories: D1IM, IAM, Identity, IDM, Programming, Tools

First look onto EmpowerID

August 13, 2015

I've had the chance to get a demo of one of the .NET-based IAM tools earlier this week: I had the pleasure of getting some insight into the EmpowerID product platform. As I come from a legacy Voelcker background, was shifted through the acquisition by Quest Software in 2010, and then joined my current (vendor-neutral, although we do have unique competencies for a couple of IDM / IAM / IAG tools) employer, I'm still working mainly with what is now Dell One Identity Manager, which is completely based on the Microsoft .NET stack, as the EmpowerID suite is.

There are a couple of things in common: both tools come with their very own database-based meta-directory (EmpowerID supports Microsoft SQL Server; Dell supports Microsoft SQL Server as well as an Oracle database server), both tools use the IIS server for their web applications, both tools ship with a graphical workflow designer, and both tools ship with a bundle of native connectors but can be extended through their API capabilities to program custom connectors against their extensible meta-directory. There are a couple more things in common between D1IM and the EmpowerID suite, but there is also a whole bunch of differences between the two solutions approaching the same problem. The main issue with the EmpowerID suite, at least for the European market, is the missing SAP connector to provision into the SAP security stack. They do have a connector to provision identities from or to SAP HCM natively, which Dell is currently missing as a native connector.

From what I've seen during that 90-minute demo, I'd like to get a demo installation of the EmpowerID solution suite to do some hands-on experiments discovering the tool. It looked pretty nice, pretty quick, and pretty responsive, although the majority of the configuration and administration is done through web interfaces.

A must read by Ian Glazer (@iglazer)

August 1, 2015

One of the must-reads during my family vacation was the speech Ian Glazer gave at CIS 2015, titled "Identity is having its TCP/IP moment". He's talking about using standards-based IAM. His conclusion (which I totally agree with): not using standards is the wrong way. OK, he expresses it with the phrase "the Banyan Vines of identity".

He gave this as a speech without any slide deck. The speech can be read on his private blog:

Ian also embedded a video recording of his speech for all of us who are too impatient to read the whole text. But I do recommend reading the text anyway, as it's even more impressive than "just the recording". Thanks to Ian for such a great speech.

Categories: Identity, IDM

Biometry is broken

January 28, 2015

From what we've seen at 31C3, biometry is broken. It's not just broken if we obtain the fingerprints of a potential victim by extracting them, using forensic methods, from whatever the victim might have touched before. With what we've seen in the talk given by starbug (attention: the talk was given in German), the physical barrier is broken. There is no longer a need to save a glass that was touched by the victim. The fingerprint can be reconstructed from a sufficiently high-quality photograph of the victim's fingertips.

During his talk, starbug already gave some insight into what might be next: 4K video. I'm curious about what he might come up with for the next congress. Maybe he's already working on extracting fingerprints from 4K videos.

So you might want to say fingerprints are broken, but what about other biometric factors? Let's run through them:

  • fingerprints – broken
  • retina scans – broken (at least if the quality of the picture is good enough)
  • face scans – broken (as shown in the video)
  • heart beat – not broken yet
So let's keep up with what's left on the list: heart beat. There was a startup showing up with the idea of a wristband using your unique heartbeat signature as an identification token. Sounds pretty cool so far. But here it comes: I've been talking to different people about two different approaches that might break this as well.

The first approach (although it is much more theoretical and has a moral and ethical impact) I've been discussing with a doctor. In the end, she told me that it would be possible to use a pacemaker to re-program an individual's heartbeat. It has not been done before, but it's possible.

As for the second approach, I was talking to a guy who has been working in device security for quite a while. From his expertise, it shouldn't be the biggest deal to set up a specific electric signal that will look like a valid heartbeat to the device.

So from where we are right now, there are only two conclusions:

  1. Don't trust biometrics as a single source of identification. They might be used in combination with other forms of authentication, but never ever alone.
  2. Biometric devices need to get better. They need to be able to determine whether they are scanning a printed version of a fingerprint, face, or retina, or whether they are scanning a real human being. This will raise prices for devices.

Categories: IAM, Identity, Privacy, Security

30C3 talk on Identity Ecosystems

January 2, 2014

During the 30th Chaos Communication Congress, hosted by the Chaos Computer Club in Hamburg, Germany, from December 27th to 30th, 2013, Christoph Engemann gave a talk on NSTIC and COM 238, the two identity policy proposals of NIST (USA) and the European Commission, highlighting similarities, differences, and potential conflicts.

A complete recording of the talk can be found here: CCC-TV – Europe, the USA and Identity Ecosystems


Categories: IAM, Identity, IDM

Passwords must die – we’re on the way

For several months now, and across identity-related conferences, there has been one ongoing hot topic, represented by a popular hashtag within the IAM crowd (some call them / us identirati): #PasswordsMustDie

I already spent some time in March blogging a few lines on #PasswordsMustDie in the article "Passwords must die – but how". And over the past weeks I spent some time looking around in various places to see how things are going on the way to kill passwords. There's a bunch of news in that space that I'd like to wrap up pretty quickly.

The FIDO alliance

The FIDO alliance (FIDO stands for Fast IDentity Online) was formed as a non-profit organization in the summer of 2012 to change the nature of user authentication. Some very well-known names among the members of the FIDO alliance are:

  • Google
  • Lenovo
  • PayPal
  • PingIdentity
The alliance is still growing, making its way to bring a FIDO plugin supporting various FIDO authenticators, such as hardware-based tokens, fingerprints, and voice identification, as well as combinations of those, differentiating them into two kinds of tokens:
  1. Identification tokens as unique identifiers being associated with an online identity
  2. Authentication tokens for identity proofing

Mozilla Persona

In April 2013, the Mozilla Identity team announced the second beta of Persona as a simple way to log in to various services and websites using any modern internet browser. Their simple goal: eliminate passwords on the web. Although the base of services and websites is still small, I do expect them to grow their services base over the coming months.


Both the FIDO alliance and Mozilla Persona show that there is something going on to kill passwords. These initiatives will see a major boost in usage as soon as some bigger services start supporting their technology and approach. As long as services like Twitter and LinkedIn just enable their users to use two-factor authentication as a reaction to various security incidents, passwords remain in use, although only as a single part of authentication. Once the first popular service starts using technologies such as those offered by FIDO or Mozilla, we might see some real security improvements.

Categories: Cloud, IAG, IAM, Identity, Security, Strategy

Continuity vs. Reinvention

In a lead architect role, I'm currently involved in a project to design the migration of a very mature IAM implementation to a newer release of the same IAM solution suite within a complex infrastructure. I don't want to shed light on the technical details here, but I would like to discuss the impact on the end user in such a migration.

The customer uses the web-based end user portal heavily, with a user base of more than 100,000 end users all over the world. Due to the lack of out-of-the-box features in the currently used release, the customer spent a lot of time and money extensively customizing the web portal. In fact, there are NO standard modules in use anymore; the whole web portal was reimplemented by the customer and their current service provider to offer the best-of-breed value for their end users.

Facing the challenge of migrating the existing solution to a newer release of the same IAM suite, we are facing some issues here:

  • because we plan to jump two major releases, much of the backend technology, the database structure, and the web portal designer engine and controls have changed, been replaced, or been discontinued
  • to enhance the end user experience, the customer's service provider implemented a bunch of custom controls in the web portal using the existing web portal designer engine, some of which are based on out-of-the-box controls
  • the web portal implementation handles some tasks in a very special way: not all tasks initiated through the web portal are handled and executed by the IAM solution's backend engine (as would be the usual approach); some are driven directly out of the web portal itself without ever interacting with the backend engine (e.g. several web service calls)

All of these issues and their consequences have to be taken into consideration when planning the migration strategy as well as the technical implementation itself. Leaving effort and cost aside, we come down to the following decision:

Rebuild the web portal as it exists today OR reinvent the web portal using the newer technology and oob controls?

Or, just rephrasing the question: continuity or reinvention? Let's discuss both of them a bit before coming to my final conclusion.


Keeping an eye on the huge end user base of more than 100,000 users, it might be worth spending the effort to rebuild the same heavily customized web portal as it exists today. This minimizes the impact on the end user in the final phase of the migration and might keep the help desk and the IAM team in a comfortable situation. But the price to be paid is high: all controls that can't be migrated automatically using the IAM suite's migration features have to be rebuilt on the new platform. So effort has to be spent that has already been spent and paid for by the customer once. Keeping a high level of customization increases the complexity of the final implementation and the potential maintenance effort, but also the risk when migrating to newer versions of the IAM suite in a few years. It also makes the customer dependent on the implementing service provider, as the implementation know-how stays on the service provider's side. The solution vendor might not be able to support all of the implementation if it is too heavily customized, which does not bring any value to the customer either.


Reinvention means migrating the complete web portal to the standard solution features, limiting customization to the lowest level possible (which in the best case would be the customer's corporate identity). The nearer the implementation is to the delivery standard, the better the supportability by the solution vendor instead of the implementing service provider, which puts the customer in a much more comfortable position for future business with the vendor as well as with the service provider landscape. On the other hand, there is an impact on the huge end user base, as they will have to relearn and/or adapt to the new web portal and its functionality, since it will have a different look and behavior. This will also impact the customer's help desk and IAM team, as they will be the point of contact for all the end users who have trouble adjusting to the new solution and the new way of handling things. This can be mitigated by providing training material, webcasts, and regular updates during the implementation process to make key users and power users aware of the upcoming changes. Utilizing the group of key users and power users streamlines the flow of knowledge and information from the help desk and the IAM team, through the key users and power users, to the regular end user.


From an architect's view, my conclusion is pretty clear: I'm recommending the reinvention of the end user web portal, although there will be an impact on the huge end user base. Why do I want to do that?

  1. Bringing the solution back toward the out-of-the-box standard as far as possible makes the solution less complex and improves the maintenance situation for the customer itself as well as for the service provider (which might not necessarily be the implementing service provider)
  2. The implementation effort is lower than keeping the web portal as it is, which would mean spending a lot of time and material on continuity by reimplementing everything that has been implemented so far
  3. The end user impact will peak at the beginning of the rollout but will decrease quickly

From a realist's view, knowing the customer for a while, I'm pretty sure we will end up with a mixture of continuity and reinvention. But the strategy I'd like to propose to my customer is clear: decrease customization over time. Maybe it's worth spending the money on reimplementing the existing solution in the newer release and then starting a process of moving feature by feature back to standard.

Categories: IAG, IAM, Identity, IDM, Migration, Strategy