Release the light

I just re-read a previous blog I'd written on organizational diagnosis (July 2010). It felt timely, as I'm currently in the middle of three different organizational assessments. This is what some people call the 'discovery' phase, when I'm finding out what I think I need to know about the organization in order to make a judgment on what course(s) of action to suggest or recommend.

As I said in last week's blog, I am working with three clients in this assessment phase. I always enjoy this part of the design activity because I get to meet all kinds of people, each with their own perspective on the same situation. Each of the three cases has taken a somewhat different approach in this phase – so, as I always say on my organization design training courses, 'there is no right way'.

The interesting thing about the assessment phase is making sense of the data. What do you do with a set of interview notes, or a mix of interview notes and survey data, or a mix of 1:1 notes and group discussion notes? How do you find the themes and patterns that make for a good diagnosis and a useful set of recommendations?

As I was buying one of those light reflecting crystals to give to my mother yesterday, I thought that organizations are a bit like the crystal. In an ideal situation each facet is a part of the whole, but each is catching and shedding light from its own perspective. The different views and information I am collecting in the discovery phase suggest the facets of the crystal. Collectively the facets make up the crystal ball that catches the light and throws rainbows on the wall. I know this may be a bit far-fetched, but it did start me wondering whether, rather than looking to cure a problem, we could be looking to release the light in our assessments.

I say this because one of the clients has a long history and they are rightly keen that a redesign does not, in their words, 'throw the baby out with the bathwater', but rather that it reflects the long history and all the good things about it and reshapes them for the future. I guess much as one can recut the facets on a crystal ball.

Side bar: The crystal ball metaphor also suggests looking into the future, and perhaps designing for it – though in my sardonic moments I am reminded of Scott Adams's (of Dilbert fame) view that, "There are many methods for predicting the future. For example, you can read horoscopes, tea leaves, tarot cards, or crystal balls. Collectively, these methods are known as 'nutty methods.' Or you can put well-researched facts into sophisticated computer models, more commonly referred to as 'a complete waste of time.'"

Going back to my re-read: in that earlier blog I said that the NTL Handbook of Organizational Development and Change describes diagnosis as 'a collaborative process between organization members and the OD practitioner to collect relevant information, organize it, and feed the data back to the client system in such a way as to build commitment, energy and direction for action planning'.

This description outlines a three-step process: a) collect the data, b) organize it, and c) feed the data back to the client system.

It's interesting to note the use of the word 'collaborative' because each of the steps involves a different form of collaboration.

Step 1 is agreeing the data collection method. In most cases clients have a wealth of organizational data, both quantitative and qualitative. They can pull out documents like strategic plans, business cases, customer and employee satisfaction information and so on that they feel are relevant to the task in hand but are not readily accessible or in the public domain. It is important, but not enough, to do desk research of various kinds on this internal data and on external sources (annual reports, white papers, analysts' commentaries, etc.).

Equally, the consultant can suggest methods of approach: interviews, town halls, surveys, focus groups and so on. Deciding the right mix can be a good first collaboration in the design process. If the choice is to conduct interviews and focus groups, i.e. direct interaction with stakeholders, then the process of facilitating the conversations involves collaboration between the consultants and the organizational members, or others, they are talking with. One has to assume a certain level of willing co-operation from the interviewees/participants, and that they are genuine in telling the story as they see it.

Step 2 is organizing the data. I think this is less of a collaborative process between client and consultant and more of an activity for the consultant(s). In two of the three projects I am working on I have colleagues working with me; in these instances the consulting team is collaborating on the organization of the data.

Incidentally, if you are interested in methods of assessing qualitative data, e.g. from interviews, there's a useful book, InterViews: Learning the Craft of Qualitative Research Interviewing. For small projects I generally organize unstructured data like interview notes 'by hand', but there are good software packages available that academic researchers use: ATLAS.ti is one and NVivo is another.

In organizing the data, how does the consultant know how to sift out what is important from what is 'noise'? I think this is where more client/consultant collaboration comes in. With a client I was working with a couple of weeks ago, I began by sifting the information gathered from face-to-face interviews into cells in a Word table. I constructed a table with five headings: main element, sub-elements, staff quotes, suggestions, work-stream. I was using an adaptation of Nadler and Tushman's congruence model, so there were four main elements: Work, People, Informal Organization and Formal Organization. For this particular client, because so many people had talked about the client base, I added a fifth element, 'Client'.

Within each element I then had sub-elements drawn from the interview data. So, for example, under Informal Organization I had 'Trust', 'Inheritance from the past', and 'Response to change'. Against each of these sub-elements were direct (unattributed), but in some cases slightly adapted, quotes from staff – so the originator could not be identified. The fourth column held our – consultant – suggestions on what actions could be taken, and the fifth our suggestions on emerging project work-streams or existing functional areas/projects that the actions could be assigned to.
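For anyone who prefers something a little more structured than a Word table, here is a minimal sketch in Python of how the same five columns might be held and sorted. It is purely illustrative: the element and sub-element names come from the example above, while the sample quote, suggestion and work-stream labels are invented placeholders, not from a real client.

from collections import defaultdict

# The four congruence-model elements used in the example above, plus the
# client-specific fifth element 'Client'.
MAIN_ELEMENTS = ["Work", "People", "Informal Organization",
                 "Formal Organization", "Client"]

# Each row mirrors the five columns of the table: main element, sub-element,
# staff quote, suggestion, work-stream. The quote below is a made-up,
# anonymized placeholder.
rows = [
    {
        "main_element": "Informal Organization",
        "sub_element": "Trust",
        "staff_quote": "People hold back what they really think in meetings.",
        "suggestion": "Agree ground rules for open feedback sessions.",
        "work_stream": "Culture and behaviours",
    },
    # ... further rows sifted from the interview notes ...
]

def rows_for_element(all_rows, element):
    """Return every row filed under one main element."""
    return [r for r in all_rows if r["main_element"] == element]

def sub_element_index(all_rows):
    """Group rows by (main element, sub-element), handy for feedback sessions."""
    index = defaultdict(list)
    for r in all_rows:
        index[(r["main_element"], r["sub_element"])].append(r)
    return index

if __name__ == "__main__":
    for element in MAIN_ELEMENTS:
        print(f"{element}: {len(rows_for_element(rows, element))} observation(s)")

The point is not the tooling but the discipline: once every observation carries its element, sub-element and suggested action, it becomes much easier to see where the weight of comment falls and to hand agreed actions to a work-stream.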

Step 3 is feeding the data back to the client system. In the case outlined above, we presented the client's leadership team with a PowerPoint overview of the findings and then dived into a discussion of the detailed table, which is where the collaboration was really evident. The leadership team was able to take the various observations and see, from a collective view, whether each reflected a common reality or was an outlier, how important it was, how relevant and appropriate the suggestions were, and where to assign agreed actions (either functionally or to a work-stream). We came out of the meeting with a whole lot of changes to our original thinking, but ones which were much more appropriate to the client's experience and intentions.

This type of co-creation is a very powerful way to achieve common ground and as much committed support as you can hope for. It also removes the notion of 'expert' from the consultant and puts it in the hands of the client group. Taking the medical analogy I wrote about in the July 2010 blog, this approach is a form of second opinion on an initial diagnosis, but one where the second opinion comes from the client as expert. Facilitated carefully, the discussion can also focus on methods of 'releasing the light' rather than on curing some perceived ills.

I'd be delighted to hear your ideas on how to capture, organize and present assessment data in ways that release the light.