The #EODF18 Conference – Tasting Menu or Mezze?

Strangers to the world of organisation design might have thought that the European Organisation Design Forum Conference in Budapest last weekend (19 and 20 October) offered an eclectic tasting menu – ‘a set menu of smaller plates [that] requires your chefs to flex their creative muscles to develop a multi-course culinary experience.’

Like a tasting menu, the 2-day set programme consisted of sample portions of many different organisation design concepts, approaches, tools and ideas, served in several courses [forums] for a fixed price [conference fee] under the banner ‘Designing for Business Ecosystems’.

There are different views on whether tasting menus are a good idea. On the one hand, they enable chefs ‘to dazzle their guests with their skills, ideas, inspirations (and egos) … each bite-size course showcasing a skill, an innovation or an idea’.  On the other hand, they can be ‘an off-putting “straitjacket” for guests’.

Comments from people I talked with at the conference reflected both views and yet, regardless of view, also reflected an energy about the ‘menu’.  Partly, this was because there were two tasting menu options offered simultaneously – much as an omnivore and a vegetarian can enjoy a tasting menu together – in our case a tech one and a human one.

  • On the tech menu we experienced, among other tastes, sli.do, quantified, networkapp, Colony – ‘a platform for open organisations built on the Ethereum Blockchain’ – and Connectivity, a session on how you can map connectivity.
  • On the human menu we were offered ‘who’s in the room and what connects us?’ (a deep democracy activity), open space sessions, Nora Bateson on ecosystems, playback theatre, and random coffee pairings.

At points the tech and human menus intersected – Karel Foltyn, Amazon’s Head of HR, MEU, giving a keynote over Skype, described some of his world, telling us how digitalization and global connectivity support diverse teams of humans and robots in working together.

This presentation was, in spite of lots of testing beforehand, a little clunky in its tech delivery – something we were all good natured about, since all of us had experienced similar.  I laughed when, the following day, someone sent me Tom Fishburne’s wonderful cartoon on digital transformation, which neatly illustrated that none of us can yet rely on tech to deliver what we hope it will.  I can’t remember if it was in this session or another one that all the lights failed and we were treated to an instant pop-up array of light points from people’s smartphone torches.

People had very different reactions to the use of sli.do and quantified to ask questions and put points to the keynote speakers, and to the networkapp for finding their way round the conference programme and notices.  As with a tasting menu, some felt the ‘straitjacket’ of tech, wanting a personal connection with people rather than a tech-mediated one.  Others felt these apps showcased ideas that we could experiment with and learn from.

The human sessions similarly evoked mixed reactions.  For some, the deep democracy action-session proved a bridge too far.  (Deep democracy is ‘the principle behind a community building process that hears all voices and roles, including our collective experiences of altered states, and subtle feelings and tendencies. It is a principle that makes space for the separable, the barely speakable and the unspeakable’).  For others it was a powerful experience that provoked insights and awareness.

Thinking again on the conference and how participants interacted with its diverse elements I am now wondering whether the conference ‘chefs’ (organisers) had artfully designed less of a prescriptive tasting menu and more of a mezze – ‘a collection of small platters where a guest can edit their experience to a certain extent – swiping hummus with a piece of kobez bread from here, sampling a crisp falafel from there’.

I recall the ‘Law of Two Feet’ was up on the main wall: “If, during the course of the gathering, any person finds themselves in a situation where they are neither learning nor contributing, they can go to some more productive place.” People who didn’t want to stay in one of the scheduled sessions wandered out to the meeting areas to chat, work, drink coffee, or eat something from the splendid variety of delicious food offered throughout the two days.

Four of the scheduled sessions were run as 8 parallel 45-minute open spaces, each with a conference participant hosting a topic they wanted to talk about and others showing up to discuss it with them.  The range of topics was wide – probably over 30 offered – with participants wandering between them and/or sticking with the ones that caught their interest.

And then there were awards – not the Great British Bake Off, but a similar idea – for the best organisation design article, with over 1000 votes polled from EODF members.  Three articles were shortlisted from a longlist: Boosting Performance Through Organization Design, How to Create an Agile Organisation, and 10 components that successfully abolished hierarchy.  Here, you can see the 3 article authors giving us tasters of their articles, and find out who the winner is.

I had my own course on the menu in being awarded the Paul Tolchinsky Award – a true honour and a privilege – and had a lot of fun telling my story and watching it being immediately recreated and given artistic shape and meaning by the Playback Theatre performers.  (If you want to know more on Playback Theatre techniques look here, but briefly it’s ‘an interactive form of improvisational theatre in which audience members tell stories from their lives and watch them enacted on the spot’.)  I also enjoyed the ability of the graphic artist, Szilard Strenner from Grafacity, to convert this, and each other session, into an amazing visual.

Nora Bateson’s session felt like a tasting menu in its own right as she introduced us to mesmerising concepts and ideas, enticed us to imbibe new words – symmathesy, transcontextual (the inter-relationships that integrate complex systems), warm data – and urged us not to ‘commit the violence of reductionism’, a point that led me to the thought that we were missing conference debate on thornier issues.  These were hinted at in some of the one-to-one conversations but not in any groups, at least not in any of those that I participated in.

Maybe next year we will make space for ‘the barely speakable’ – for example, how are we designing in situations of resource scarcity, job precarity, surveillance and political repression, and what responsibility should we take to tackle these types of organisational/societal challenges?

Putting that thought aside for the moment, all in all, I now see the conference as a delightful mezze/tasting menu fusion.  It worked beautifully at many levels (real chefs take note).  People I spoke with left feeling well-satisfied but not overfull.  Thank you organising team – you cooked up a treat.

Readers: What makes a well-designed conference for you?  Let me know.

Image: Lima Floral & The Palomar: One-off tasting menu experience

Futures and horizons

Do you believe that you now own the last car you will ever buy?  Maybe that sounds improbable, but a piece on the UK’s BBC website argues the case quite well, saying ‘The central idea is pretty simple: Self-driving electric vehicles organised into an Uber-style network will be able to offer such cheap transport that you’ll very quickly – we’re talking perhaps a decade – decide you don’t need a car anymore.’

Someone sent me the article knowing that I was facilitating a workshop at the CIPD conference in November on OD&D and Change Skills to Drive New Business Models.  Integral to the workshop is the idea that, to help our organisations adapt to the future, we OD&D practitioners must keep future focused and develop horizon-scanning skills.

Although there’s some overlap and debate between the terms futures, foresight and horizon scanning within the research/academic literature, in general, discussions of futures and foresight provide ‘a conceptual framework for a number of forward-looking approaches to informed decision making that includes long term considerations.’ (FAO 2013).

Keeping future focused is tricky but Philip Tetlock, in his book Superforecasting, reassuringly tells us:

  • The future can indeed be foreseen, at least in the near term. An analogy is weather forecasting – you may feel confident that you’ll need an umbrella this Thursday, but not that you will or won’t need one at a point in 2023.
  • Some people are much better at it than others.  The ‘foxes’, who ‘know many small things (tricks of their trade), are skeptical of grand schemes, see explanation and prediction not as deductive exercises but rather as exercises in flexible “ad hocery”’, are much better at forecasting than the ‘hedgehogs’, those who ‘know one big thing’, ‘aggressively extend the explanatory reach of that one big thing into new domains, and display bristly impatience with those who do not get it’.
  • Forecasting is not a divine gift, but a skill that can be practised and improved. Tetlock offers an online course, Superforecasting Fundamentals, that I’m tempted by but will stop myself from registering for (as I’m practising paring down the amount of stuff I do).

An alternative to the course is the excellent resource The Futures Toolkit, ‘designed primarily as a resource for those who are new to futures thinking but should also prove useful to more experienced practitioners’.  It introduces futures thinking and offers a toolset ‘for gathering intelligence about the future, exploring the dynamics of change, describing what the future might be like’ …

One of the tools in the kit is horizon scanning, an approach used to identify early warnings of potential threats, risks, emerging issues and opportunities, from the immediate up to 12 months away, and to explore how these trends and developments may combine and play out and what organisational impact they may have, ‘allowing for better preparedness and the incorporation of mitigation and exploitation into … decision making processes.’  (What is Horizon Scanning, 2016).

Because horizon scanning is more about early warnings, i.e. short-term forecasting, it can be used to help answer a question Tetlock asks:  ‘Is it a worse error to fail to try to predict the potentially predictable or to waste our time trying to predict the unpredictable?’ Consistent and continuous horizon scanning can stop the error of failing to predict the predictable, and what you do predict is likely to be more accurate if you are a fox thinker than a hedgehog thinker.

But we can’t stop at only looking at early warnings.  What about longer-term forecasting?  An extension of the single horizon scanning model is the Three Horizons model described as ‘a simple, intuitive way to encourage a conversation about the challenges in the present, our aspirations for the future and the kinds of innovation we might need in order to address both at the same time’.

  • The first horizon ‘describes the current way of doing things, and the way we can expect it to change if we all keep behaving in the ways we are used to’.
  • The third horizon is the future [long way out] system.  It is those new ways of living and working that will bring new patterns into existence.  It is transformative.
  • The second horizon is ‘the transition and transformation zone of emerging innovations that are responding to the shortcomings of the first horizon and anticipating the possibilities of the third horizon’.

Knowing what horizon scanning is, and how it links to futures/forecasting, is one thing – knowing what horizons to scan and how to scan them is another.  How do you actually horizon scan?  I do it via five main paths (a small automation sketch follows the list):

  • Scanning a range of magazines, journals, and newspapers for general coverage. The ones I skim read have changed over the years, but I find I’m still consistently and thoroughly reading The Economist, New Scientist, and The Atlantic.
  • Subscribing to daily/weekly email newsletters for focused coverage – again I get several; some examples:
    • Tech: MIT Tech Review, Geekwire,  Information Week in Review
    • Science:  Science Daily
    • Business: Strategy+Business, Fast Company, Harvard Business Review
    • Innovation:  Stanford Social Innovation Review, Open Data Institute
  • Looking at blogs and other info on specialised websites.  I have a lot bookmarked (I must pare them down) but I find I look most at The Long Now, which has thought-provoking blogs, as does the Practical Ethics blog – the latest is ‘should vegans avoid almonds and avocados?’  Workplace Insight has info on workplace trends, and Shaping Tomorrow is an ‘AI-driven, systems thinking model that delivers strategic foresight and anticipatory thinking in real-time’.  The big consultancies track ‘megatrends’ in various ways – see PwC’s example here.
  • Reading reports and surveys from various sources – think tanks, professional bodies, government and public sector organisations.
  • Talking to people and going to events – if you’re in the UK the Royal Society of Arts has excellent events, which are often livestreamed or available as webcasts and podcasts if you can’t get to the event itself.

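For those who like to semi-automate part of this scanning routine, here’s a minimal sketch (in Python, using the third-party feedparser library) that pulls recent items from a couple of RSS feeds and flags anything mentioning a watch-list term. The feed URLs and keywords are illustrative placeholders only – swap in your own sources and topics.

```python
# A minimal sketch of automating part of a horizon-scanning routine:
# pull recent headlines from a few RSS feeds and flag those matching
# watch-list keywords. The feed URLs and keywords are illustrative
# placeholders, not a recommended source list. Requires 'feedparser'.
import feedparser

FEEDS = [
    "https://www.technologyreview.com/feed/",    # example tech feed (placeholder)
    "https://www.sciencedaily.com/rss/all.xml",  # example science feed (placeholder)
]
WATCHLIST = ["autonomous", "self-driving", "automation", "gig economy"]

def scan(feeds, watchlist):
    """Return (source, entry title, link) for entries mentioning a watch-list term."""
    hits = []
    for url in feeds:
        feed = feedparser.parse(url)
        for entry in feed.entries:
            text = (entry.get("title", "") + " " + entry.get("summary", "")).lower()
            if any(term in text for term in watchlist):
                hits.append((feed.feed.get("title", url),
                             entry.get("title"),
                             entry.get("link")))
    return hits

if __name__ == "__main__":
    for source, title, link in scan(FEEDS, WATCHLIST):
        print(f"{source}: {title}\n  {link}")
```
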
What happens with this scanning activity?  The idea is to apply it to your organisation design thinking.  For example, suppose we won’t need to own a car anymore in ten years (and the early warnings are pretty evident on this).  What impact will that have on insurance providers?  Car manufacturers? Car maintenance outlets?  Car retailers?  Organisational benefits – if you offer a car as one, or offer mileage payments?  Commutes to work?  People who drive cars for a living?  Software developers?  It’s likely that the reach of self-driving cars will extend to all organisations.  How could/should you start factoring it into your design work now?

My view is that horizon scanning is an essential skill for organisation designers.  Do you agree?  If so, how do you do it? Let me know.

Image: Economist, Unclouded vision

Worrying about employee engagement

There are things about employee engagement that worry academics and researchers.  They are, on the whole, sceptical of the concept and urge caution in taking on board what management consultants and others offer as engagement advice, tools and promises.

For example, MacLeod and Clarke in their 2011 Report to Government – Engaging for success: enhancing performance through employee engagement – say that ‘Early on in the review, when we spoke to David Guest, Professor of Organisational Psychology and Human Resource Management at Kings College London, he pointed out that much of the discussion of engagement tends to get muddled as to whether it is an attitude, a behaviour or an outcome or, indeed, all three. He went on to suggest that “… the concept of employee engagement needs to be more clearly defined […] or it needs to be abandoned”.’

I’ve been trawling research articles to get more of a handle on what it is that worries the academics, given that, as MacLeod and Clarke say, ‘there is too much momentum and indeed excellent work being done under the banner of employee engagement to abandon the term’ (and the accompanying activity).

John Purcell, in a ‘Provocation Paper’ Disengaging from Engagement, puts it succinctly: ‘The problem is not just one of defining what engagement is but the way it is being used, with implications for the study and practice of employment relations and HRM.’  He makes the point that ‘Boiling engagement measures down to one score is particularly worrying’ (and explains why in the paper).   Purcell discusses two types of employee engagement:

  1.  ‘Work engagement relates to an individual’s psychological state of mind while at work. Work engagement is seen as a ‘positive, fulfilling work-related state of mind that is characterised by vigour, dedication and absorption’.  Purcell makes the point that ‘What emerges [from a description of an engaged employee] is a profile of a person so engrossed in work that it can only ever apply to a minority of employees’.   He discusses several concerns with work engagement, including unconvincing attempts to link work engagement to organisational outcomes, such as labour turnover and performance, the way workers who are not fully engaged are described in negative terms, the dangerous reduction of work relations to individual attributes and failings, and the influence of positive psychology on the types of measures used to test engagement and the way the surveys are often designed.
  2. Employee or behavioural engagement, which ‘relates to the managerial practices that appear to be linked to employees becoming engaged. There is usually explicit reference made to social exchange theory and reciprocity, especially to “perceived organisational support”…’  The difference between behavioural (or employee) and work engagement is nicely put by Truss (2014): employee engagement ‘is an approach taken by organisations to manage their workforce, rather than a psychological state experienced by employees in the performance of their work; is more relevant to HRM and employment relations; “doing” engagement, rather than being engaged’.  However, employee/behavioural engagement also raises concerns for Purcell.  He is concerned with difficulties in showing conclusive and causal evidence between engagement and performance, the lack of an agreed definition of employee engagement, and the use of a composite score that ranks organisations, and within them, down to the level of the individual line manager.

Other researchers raise similar warnings about engagement (without differentiating in quite the same way between work and employee engagement).  In their paper The Meaning, Antecedents and Outcomes of Employee Engagement: A Narrative Synthesis, the authors report that ‘Out of 5771 items identified in our search, only 172 empirical studies met the quality threshold, suggesting that a great deal of what has been written about engagement could be described as incomplete or under-theorized, leaving considerable scope for further development of the field.’

Confirming this, research presented in The Meaning and Measurement of Employee Engagement: A Review of the Literature tells us that ‘Employee engagement is an emerging topic that has gained considerable attention from human resources professionals and researchers who posit engagement as a key driver of organizational success. Nevertheless, there exist mixed definitions of the construct and ambiguities in its theoretical underpinnings. This confusion in turn presents problems for both the measurement of the construct and its use when implementing and evaluating strategies aimed at building employee engagement. Such disagreements also raise questions about the reliability and validity of extant measures of engagement, and hence their value to both academics and practitioners.’

A further paper, Exploring Different Operationalizations of Employee Engagement and Their Relationships With Workplace Stress and Burnout, makes the point that ‘Many empirical studies of employee engagement show positive relationships with desirable work-related outcomes, yet a consistent understanding of the construct remains elusive (Saks & Gruman, 2014). We propose that this lack of clarity is leading to an increased risk that employee engagement is becoming overly generalized and that, as a consequence, its utility in both theory and practice is compromised.’

Summarising the research quoted above, along with other articles I scanned, there is a common view that there’s confusion and lack of clarity around engagement.  This results in academic researchers worrying about:

  • How we understand concepts of engagement.  It is evident that different researchers conceptualise engagement differently – Purcell, for example, talks about work and employee engagement, while others blur these boundaries.  The differences in conceptualizing engagement give rise to different definitions of the term – MacLeod and Clarke, quoted at the start, found 50 ‘engagement’ definitions.
  • The lack of context in which we examine engagement.   Several researchers commented on how little contextualisation there is in engagement discussions.  They pointed out that different contexts will reflect ‘engagement’ differently.  For example, I did not come across research that looked at the national/cultural differences related to either work or employee engagement (although it looks as if the UK organisation Engage for Success is interested in looking at this).  It is very likely that different national cultures have different norms, assumptions and values around engagement.
  • The normative values and assumptions we bring to asking questions on engagement. In the quest for employee engagement, and given the amount of news coverage and opprobrium heaped on supposedly disengaged workers, we appear to be making an assumption that engagement is good, and non-engagement or disengagement is bad. Take this example, from an article in Forbes: ‘These “actively disengaged” employees wander the halls like ravenous zombies, eager to spread their contagion throughout the organization. No matter how idyllic your workplace culture is, these workers will always pose an imminent threat. That makes identifying and removing them a matter of workplace productivity life and death’.

As Purcell points out, ‘It is disingenuous to portray work in the positive glow of engagement without recognising the very different experience of many who fail to be [psychologically] engaged often for very good reasons. Problems of job insecurity, zero hours contracts and real pay reductions for many do not get recognition’.

  • The question of who or what has power and agency in relation to engagement. If we think that engagement is ‘associated with a sustainable workload, feelings of choice and control, appropriate recognition and reward, a supportive work community, fairness and justice, and meaningful and valued work’ then where does the power and agency lie in ‘doing’ and getting engagement and being engaged?  There are multiple players and factors in the power/agency mix – players include managers, leaders who want to achieve a certain ‘score’, job designers, and employees.  Factors include performance and reward systems, disciplinary and grievance procedures, levels of autonomy and decision making accorded to people and physical work environments.  All of these have a part to play in both psychological and behavioural engagement.

Having looked at some of the research and considered what worries academics and researchers, I’m now wondering if, in the day-to-day search for employee engagement, we’re stuck in unexamined clichés and stereotypes of what engagement is and why we are interested in it.   I ask myself how many organisations spend member time conceptualizing, defining, and contextualizing engagement for their specific organisation.

Perhaps we should be using the research to ask different questions about engagement and arrive at different perspectives on it – in ways that address the academics’ worries, improve practitioner understanding of how to ‘do’ engagement, and deliver better outcomes as a result of both.  What’s your view?  Let me know.

Designing resilient teams

I’ve been working on a series of 4 x 20-minute online, in-house masterclasses (see my related blog) on team resilience.

Their focus is on the design aspects of team resilience, not the development aspects.  The design aspects are the ‘hard’ or formal organisational elements that are easy to describe and capture in words.  They are the structures, systems and business processes that help deliver the organisation’s products/services.  ‘Hard’ elements include policies, structures, decision and authority rights, and governance.  This handout, Four elements of the Nadler and Tushman model (HO1), helps explain.

These masterclasses are not about the development aspects of team resilience.  The development aspects are the ‘soft’ or informal organisational elements that are difficult to describe and capture in words.  They include behaviours, interpersonal relationships, culture, and lived values.  (I maintain that design and development are very distinct and different disciplines, although they inter-relate at points.)  See Exhibit 1 in this article, which illustrates the distinction.

Here’s a summary of each of the four masterclasses. They each follow the same 3-part format:  What’s the idea?  How does it work?  Try it out.

1              Designing resilient teams

What’s the idea? The idea is that teams can weather disturbances (positive and negative) like team members moving out or joining, sudden unexpected deadlines, technology glitches, and so on, if they are designed to do so.  The things that make for resilience can be seen in ecological systems.   Steven Forth explains this in his piece What Makes for a Resilient Team. The C S Holling article he mentions, Resilience and Stability of Ecological Systems, gives a much more detailed (23 dense pages) explanation, but summarises by saying:

‘A management approach based on resilience … would emphasize the need to keep options open, the need to view events in a regional rather than a local context, and the need to emphasize heterogeneity.  Flowing from this would be not the presumption of sufficient knowledge, but the recognition of our ignorance; not the assumption that future events are expected, but that they will be unexpected. The resilience framework can accommodate this shift of perspective, for it does not require a precise capacity to predict the future, but only a qualitative capacity to devise systems that can absorb and accommodate future events in whatever unexpected form they may take’.

How does it work? Six principles for team design come from this perspective:

  • Have overlapping skill sets on the team
  • Have lots of overlapping connections
  • Be open to new people and ideas – (discuss the design aspects of this)
  • Design good connections outside the team (both inside the organization and outside)
  • Have access to a ‘bench’ that can step in and refresh the team’s skills
  • Develop team autonomy through decision, delegation and authority rights

Try it out: Assess your own team’s composition against the six principles.  Agree how you can develop team resilience using the six principles.  (Note: Steven Forth has a useful graphic to illustrate.)
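
If it helps to make the assessment concrete, here’s a rough scoring sketch built around the six principles. The principle names mirror the list above; the 0–2 scale and the scoring approach are my own illustrative convention, not part of Steven Forth’s or Holling’s work.

```python
# A rough self-assessment sketch against the six principles above.
# Scoring scale is an illustrative convention: 0 = not in place,
# 1 = partly in place, 2 = well established.
PRINCIPLES = [
    "Overlapping skill sets on the team",
    "Lots of overlapping connections",
    "Openness to new people and ideas",
    "Good connections outside the team",
    "Access to a 'bench' that can refresh the team's skills",
    "Team autonomy through decision, delegation and authority rights",
]

def assess(scores: dict) -> None:
    """Print each principle's score and flag the weakest areas to work on first."""
    for principle in PRINCIPLES:
        score = scores.get(principle, 0)
        flag = "  <- focus here" if score == 0 else ""
        print(f"[{score}/2] {principle}{flag}")
    total = sum(scores.get(p, 0) for p in PRINCIPLES)
    print(f"Overall resilience score: {total}/{len(PRINCIPLES) * 2}")

# Example: a team strong on skills but weak on external connections and a bench.
assess({
    "Overlapping skill sets on the team": 2,
    "Lots of overlapping connections": 1,
    "Openness to new people and ideas": 2,
    "Good connections outside the team": 0,
    "Access to a 'bench' that can refresh the team's skills": 0,
    "Team autonomy through decision, delegation and authority rights": 1,
})
```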

2              Restructuring teams (changing the organisation chart)

What’s the idea? The idea is that managers trying to solve a problem tend, first, to look at an organisation chart and shift people around it, rather than looking at the work, the skills needed to do the work, and finally the people who could do the work.  A Q5 Partner 3-minute video thinkpiece explains clearly why a chart-based approach to solving a problem is not going to work.  An organisation chart only tells us certain things.  It does not tell us, for example, how the work flows, where the handover points are, or what the linkages are between teams as the work flows – all part of designing effectively. See the handout What does an organisation chart tell us and not tell us.

How does it work?  Restructuring teams begins with thinking about what it is you are trying to solve and deciding whether it is a design issue or something else.  Once you think it is a design issue, the next steps are agreeing the work your team is there to do (the purpose), developing some design criteria, mapping the activities that comprise the work of your team as it flows through the team, clustering the activities in a way that meets your design criteria, developing design options, determining the linkages within your team and across to other teams, and testing your options before planning to implement the new design.

Try it out: In a team meeting watch the video Got a wicked problem? First tell me how to make toast (9 minutes).  Discuss how the ideas in this are applicable to the issue you are trying to address.  Assuming it is a design issue, use your insights to follow the steps in the paragraph above.

3              Designing collaborative teams

What’s the idea?  The idea is that collaboration does not happen just by chance.  Collaboration structures can be designed by using models of collaboration combined with the different stages of design methodology.

How does it work?  Collaboration can be expressed in four modes.  (Note: collaborative projects often use more than one of the modes during a project in order to achieve the desired goal.)

Open & Hierarchical Anyone can contribute but the person, company or organisation in charge of the project decides which ideas or solutions to develop.

Open & Flat There is no authority who decides which innovations will be taken further, because anyone can contribute to the process and use the delivered results.

Closed & Hierarchical The participants have been chosen by the authority who also decides which ideas will be chosen and developed.

Closed & Flat The group of participants chosen by an authority share ideas and make the decisions and contributions together.

Using a four-phase design methodology – discover the problem, define the problem and solution ideas, develop and refine the solution ideas, deliver the solutions – ask questions by phase.  For example, in the discover phase ask: what is the challenge?  Who are the stakeholders? Do you know who you want to collaborate with (closed) or do you need to allow anyone to contribute (open)?  Can only selected people join the decision making in this phase (hierarchical) or can anyone participate in making decisions (flat)?
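
To make the mode-by-phase questioning concrete, here’s a small sketch of how the four modes and phase prompts could be captured as a simple planning aid. The mode and phase names come from the description above; the structure and the prompt wording are my own illustrative framing, not a defined part of the methodology.

```python
# A small planning-aid sketch: capture the collaboration modes and
# per-phase prompt questions so a team can talk through each phase.
from enum import Enum

class Participation(Enum):
    OPEN = "anyone can contribute"
    CLOSED = "participants chosen by an authority"

class Decision(Enum):
    HIERARCHICAL = "an authority decides which ideas go forward"
    FLAT = "participants decide together"

# Illustrative prompt questions per design phase (my own wording).
PROMPTS = {
    "discover": ["What is the challenge?", "Who are the stakeholders?"],
    "define": ["Which problem statements and solution ideas do we take forward?"],
    "develop": ["Who refines the chosen ideas, and how are trade-offs decided?"],
    "deliver": ["Who signs off, and who uses the delivered results?"],
}

def plan_phase(phase: str, participation: Participation, decision: Decision) -> None:
    """Print the chosen collaboration mode and prompt questions for one design phase."""
    print(f"{phase.title()} phase: {participation.name} & {decision.name}")
    print(f"  Participation: {participation.value}; decisions: {decision.value}")
    for question in PROMPTS[phase]:
        print(f"  - {question}")

# Example: open contribution in discovery, but decisions kept with the project lead.
plan_phase("discover", Participation.OPEN, Decision.HIERARCHICAL)
```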

Try it out: Identify a problem or opportunity your team is working on. Use the methods outlined to arrive at a collaborative structure that you can test.

4              Designing for remote team members

What’s the idea? If you are in a ‘team’ then there is an underlying assumption that some or all of the work you do requires working with others in your team to produce something – a paper, a policy, a design, a customer outcome, or similar.  People who work in teams that comprise some face-to-face members and some remote members, or teams that comprise only geographically dispersed members, often have difficulty in forming a cohesive, high-performing team that effectively delivers the desired outcome.  Paying attention to designing a supportive context for remote team members helps address issues of isolation, lack of community, and heading off-track.  Implementing design solutions for working with remote coworkers results in better work and healthier communication for everyone.

How does it work?  Designing a good working environment for remote team members involves:

  1. Agreeing how to capture and store information that everyone needs access to.
  2. Adjusting face to face methods and techniques for team community building to make them appropriate for remote workers.  Moodthy Al-Ghorairi suggests ‘hold a Slack channel meeting where all key players in a project get to speak directly to each other. For multilingual teams, gmail and Workplace may be better options because of the auto-translate feature. Use systems that make it easy for team members to communicate frequently with each other to avoid misunderstandings and missed cues.’
  3. Expecting and planning for asynchronous discussions and having the tools for doing this effectively (this includes teaching people to use the tools, expecting them to use them, and developing on-going learning tips for using them to full advantage)
  4. Keeping team goals and tasks firmly in everyone’s view – some teams find Trello effective for this but there are multiple other options.

Try it out:  Identify what Moodthy Al-Ghorairi calls ‘a single source of truth for documenting procedures, workflows, how-to’s and on-boarding. Use a group password manager to manage logins. Set up a Slack bot for repeatable questions (status reports, who’s up for pizza on Friday) or turning conversations into shared knowledge easily.’ Agree how you will use it to develop team performance.
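
If you want to try the Slack bot idea, here’s a minimal sketch using Slack’s Bolt framework for Python in Socket Mode. The question/answer pairs are placeholders for your own team’s repeatable questions, and you’d need a Slack app with the usual bot and app-level tokens set up first.

```python
# A minimal sketch of a Slack bot that answers repeatable questions by keyword.
# Uses Slack's Bolt for Python in Socket Mode; the token environment variable
# names follow Bolt's conventions, and the FAQ entries are placeholders.
import os
import re
from slack_bolt import App
from slack_bolt.adapter.socket_mode import SocketModeHandler

app = App(token=os.environ["SLACK_BOT_TOKEN"])

FAQS = {
    r"status report": "Weekly status reports live in the shared drive (placeholder answer).",
    r"pizza": "Pizza poll goes out every Friday at 11:00 - vote in the social channel (placeholder).",
    r"onboarding": "New-starter onboarding checklist is pinned in this channel (placeholder).",
}

@app.message(re.compile("|".join(FAQS), re.IGNORECASE))
def answer_faq(message, say):
    """Reply to any message that matches one of the FAQ keywords."""
    text = message.get("text", "").lower()
    for pattern, answer in FAQS.items():
        if re.search(pattern, text):
            say(answer)
            return

if __name__ == "__main__":
    # Socket Mode avoids exposing a public HTTP endpoint; it needs an app-level token.
    SocketModeHandler(app, os.environ["SLACK_APP_TOKEN"]).start()
```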

What design masterclasses would you offer in a series on team resilience?  Let me know.

Image: Resilience, Lily Gordon

Could we use a Design Authority?

There’s been a lot of talk recently about ‘organisational alignment’.  For example, Jonathan Trevor and Barry Varcoe in an HBR article tell us that ‘Most executives today know their enterprises should be aligned. They know their strategies, organizational capabilities, resources, and management systems should all be arranged to support the enterprise’s purpose. The challenge is that executives tend to focus on one of these areas to the exclusion of the others, but what really matters for performance is how they all fit together.’

If reading the article prompts you to want a course on organisational alignment, MIT offers Building Game-Changing Organizations: Aligning Purpose, Performance, and People, or if a video is more your taste watch Stanford University’s How to Align Your Organization to Execute Strategy

For those of us who enjoy the academic research and theoretical underpinnings of a buzz-phrase like ‘organisational alignment’, take a look at ‘Organizational alignment: A model to explain the relationships between organizational relevant variables’, International Journal of Organizational Analysis. It’s an excellent article.

Assuming the need to align organisational elements in order to successfully deliver the strategy (I’ve already assumed there is a strategy), and assuming that organisational alignment falls within the remit of organisation design activity, how do designers know that aligning is actually happening?  Is there a way of tracking progress from unaligned to aligned and then re-aligned – qualitatively and quantitatively – in order to satisfy people who need to see an ROI on what they might think is nugatory alignment activity, and in order to satisfy organisation designers that their work is helping the organisation to align?

Unfortunately, information on tracking alignment progress is where the popular articles fall down, and the research points to ‘more research needed’.

None of the work on organisational alignment I’ve come across talks about governing and tracking progress of organisational alignment through a Design Authority.  The concept and use of a Design Authority is common in the software world and well known to Enterprise Architects and, I think, could be applied to organisational alignment.

If you are not familiar with the term, in the Enterprise Architecture world a design authority is a role or body that ‘provides assurance that solution designs are fit for purpose, working to ensure that each component meets requirements and integrates and works within the complex enterprise architecture. This requires development and imposition of architecture and design controls; defining and enforcing architecture standards, methodologies, processes, tools and frame works against which services and projects can deliver.’

This type of Design Authority ensures alignment of specific enterprise elements.  On-line you can find many Terms of Reference for these technical Design Authorities.  The UK’s University of Reading one is typical, describing the role and function of their Authority as follows:

‘The Design Authority Group

  • Reviews requirements to make sure they are clear and have the appropriate level of detail and clarity.
  • Ensures that the requirements of the solution are being met by the proposed solution design.
  • Engages with projects, programmes and workstreams to build and maintain the design pipeline.
  • Engages with projects, programmes and workstreams to ensure that the correct design governance is being applied (e.g. suitable authorship of design and expert input).
  • Reviews the technical input and subject matter expertise input into the proposed solution design, covering areas such as the definition of requirements, legal compliance, security considerations, functional fit, technological capability, cost, support modelling (such as skill and resource requirements) and delivery capability.
  • Assesses the feasibility of the proposed solution, specifically the functional capability and the organisational fit.
  • Provides a broader review and assessment of solution and delivery interdependencies and integration / interface requirement.
  • Ensures technical risk is being managed.
  • Escalates / reports design submissions and approvals to the ISMG.
  • Escalates challenges to architecture principles and design guidelines to the Enterprise Architecture Advisory Board (EAAB).
  • Reviews and approves design change’

Looking at this, I think it provides a useful model that could be used for tracking and monitoring business organisational alignment/design.

I can hear a number of objections being raised to the idea of governing and tracking organisation alignment and design work through a Design Authority.  These include: it’ll become a meaningless, bureaucratic process; it’s too time consuming; designs ‘emerge’, they can’t be ‘governed’; we have enough governance in our organisation – there’s no place to introduce more; it’s too difficult to track organisation alignment progress; you can’t measure alignment because there’s no trackable cause/effect; and so on.

I can hear far fewer voices saying ‘This is a great idea.  Let’s explore it’, but I think it’s worth a go at overcoming negativity bias on this topic.   Let’s consider what benefits a Design Authority would bring to organisational alignment work.  Although this depends on the organisation and its degree of maturity, in general it would bring:

  • A degree of discipline to a design process – if you work in an organisation that employs external consultants to help with design work, you’ll find they take different approaches, use different languages, and tend to focus on the bit they are being paid to look at and not on the interdependencies with other parts of the organisation and the knock-on effects. A Design Authority could look across the design work and make sensible join-up suggestions and/or recommend a consistent design process/taxonomy that headed an organisation towards alignment.
  • A forum for overseeing whole system alignment and appropriate integration. Too often leaders are focused on day-to-day delivery at the expense of more strategic and longer-term considerations. A Design Authority could take a more systemic view asking questions like: Are we connecting and collaborating in an efficient way?  Is there stuff we need to abandon?  (On this, read about Peter Drucker’s planned abandonment exercise – one I use a lot in my work).  Are there gaps in delivery? Are customers experiencing a ‘seamless service’?  Are we managing our interdependencies and interfaces effectively?
  • A monitoring and tracking function for specific organisation design work. In many cases design work stalls, or stops at the end of the design phase without effectively transitioning, or fails to respond to context changes quickly enough.  An effective Design Authority could keep tabs on progress and intervene as needed – perhaps to help maintain momentum, perhaps to recommend calling it a day and closing the work down, perhaps to show where specific designs were not meeting overall alignment criteria.
  • A learning capability for organisational alignment and design thinking. A Design Authority that requested and reviewed learning from the design process and from other pieces of work that had gone down similar routes, and helped develop the skills of line managers in organisation design and alignment work, would be an organisational additionality.

To get these kinds of benefits means acknowledging the concerns of the naysayers, and developing a Design Authority which does not prove their point and which is effective and value-adding.  Brian MacDonald wrote a blog, Making the Design Authority More Effective, that gives five pointers.  He says:

  • ‘Start with an executive sponsor or “champion” Newly formed Design Authorities always need a senior executive sponsor or champion who can coach them on how to be effective within the organisation and help them put their recommendations into practice.
  • Define “effective” The stakeholders will have widely divergent perspectives on what makes a Design Authority effective.
  • Establish process early and document decisions The process followed by a DA should have a degree of formalism to it such that people new to the DA can understand how it functions and how they can successfully engage with it.
  • Evolve the DA as it gains credibility and influence Successful DAs … tend to be those that have started small and have then evolved as they have been given greater scope and responsibility.
  • Restrict membership One of the big challenges with any DA is keeping the number of participants to a workable number while still providing required coverage for complex topics and multiple stakeholder organisations. (A technique that can help with this challenge is to establish different categories of participation).’

Do you think a well-designed Design Authority would add value in shaping organisational alignment?  Let me know.

Image: Organisational alignment is the glue

Agreeing what organisation design means

Last week I asked if being a Certified Organisation Design Professional is an individual or organisational value-add, which provoked a bit of discussion, with one person tweeting ‘not before we agree on what organisation design means … ‘

That gave me pause for thought.  What does organisation design mean?  The original tweet statement is ambiguous and open to interpretation.  Is it about the philosophy of organisation design – what it means in the great scheme of things, rather like the question ‘what does life mean?’

Or is it a practical question that is more about scope – what would be in a ‘package’ of organisation design if we were buying it?  Which, of course, is what people are doing when they buy organisation design consultants’ time and expertise.

Or is it about the words themselves – a definition for the phrase ‘organisation design’.  Or is it something else that relates organisation design to meaning?

I couldn’t just drop the question because during the week it came up in various meetings I was in.  Not as the blunt question – what does organisation design mean? – but more as a subtle probing and poking: first as I ran a pilot of a one-day organisation design course for line managers, second as I worked with a colleague to develop the outlines for 5 x 30-minute tip-sessions for general awareness about organisation design, third in a discussion with the Organisation Design Forum Board, who are working on their annual strategic plan, and fourth as I ran a one-hour intro to organisation structure charts – what they do and don’t tell us.

Focusing on the conversation with the ODF Board, this went in the more philosophical direction.  Organisation design means designing good work in a context that retains the human spirit, where efficiency and effectiveness metrics are balanced with positive social contribution.  Socio-tech thinking came up as something to explore further in this part of the discussion. Developing this idea that organisation design means fulfilling a business (say efficiency) imperative alongside a social obligation to people led the discussion towards ethics and another question: does organisation design mean adherence to a code of ethics?

Out of curiosity – according to an article in the current issue of HBR, a great attribute to have; they say we should cultivate it – I wondered how an actual philosopher would tackle the question ‘What does organisation design mean?’  This took me down several interesting routes – I learned about analytic and synthetic methods, about instrumental and intrinsic value, and got lost in the maze of possibilities.  Sharply pulling myself out of wandering, I thought we could hand the question to a research student interested in philosophy and organisation design.

If the question is more about the scope of organisation design – what is in the ‘package’ of it that gets sold by consultants to clients – there are plenty of examples to share on what people put in their packages.  Take a look at Change Works’ Designed Organization (TM) 7-step approach, for example, or Kates Kesler’s Five Milestone Design Process, or the Axelrod Group’s Conference Model® for organisation design.   (Note: the principals in these organisations are members of the ODF Advisory Group.)

Each of these consultants’ packages of organisation design – what it means to them – differs.  I wonder if we have enough conversations with clients on the question ‘what does organisation design mean?’ in respect of the various ‘packages’ on offer, in order to find one that is a good match.  Whether we could agree on the ‘right’ package is a moot point – the philosophy investigation led me to thinking that what organisation design means is subjective rather than objective.

If the question is more about the definition of organisation design – again there are lots to choose from.  Take Nicolay Worren’s blog ‘What is organisation design?’ From this a reader learns that OD means more than ‘boxology’, involving ‘the creation of roles, processes and structures to ensure that the organization’s goals can be realized’.  The Center for Organizational Design says, ‘Organizational design is a step-by-step methodology which identifies dysfunctional aspects of work flow, procedures, structures and systems’.   McKinsey describes organisation design as ‘going beyond lines and boxes to define decision rights, accountabilities, internal governance, and linkages’.

It’s striking that what these definitions have in common is that they are about the ‘hard’ aspects of the organisation – coming from the roots of systems theory.  They are not about the ‘soft’ aspects that come from the roots of social and behavioural science and form the basis of organisation development.   The two fields are distinctly different.  I saw these distinctions played out in another experience I had last week.

I spent Wednesday in the day-surgery ward of a hospital. Not me having surgery but someone I was accompanying.  From my companion status I was able to observe how the organisation design – systems, processes, decisions made, technologies, hierarchies of staff, protocols followed, floor layouts, and so on – played out in the course of the day.

But I also observed that we (patient plus companion) felt safe in the process, cared for and treated with kindness and dignity – the interactions of the staff between themselves and with us spoke of development activity that complemented the design activity.

Synthesising these various lines of enquiry leads me to suggest that to answer the question ‘What does organisation design mean?’ we have to look from at least the three perspectives I’ve discussed:

  1. What does organisation design work mean in a more philosophical sense for organisational stakeholders and how can our work have a positive outcome and meaning for them?
  2. What do we mean by the process of doing organisation design – what’s in the ‘package’ of it and what is the methodology we use?
  3. What do we understand by the words ‘organisation design’ in order to arrive at a (systems) definition of it that does not blur the design with development?

Whether we can agree any of these three, I don’t know.  Also, I’m not sure what the value would be in agreeing in order to underpin an Organisation Design Certification.  If the process of certifying is rigorous (see my questions on the Certification) and focuses on the design rather than the development aspects of organisation, this may be more valuable to organisations and practitioners than getting to any objective agreement on what organisation design means.  (Though perhaps I’m wrong in assuming that agreement implies an objective interpretation and application.)

The forthcoming Organisation Design Forum Conference (October 19-20) offers a forum for discussing the question.  Maybe I’ll give it a go.

What do you think organisation design means? Should we agree on it?  Let me know.

Image:  Do we all agree?

And now I’m Certified

‘Thank you for your CODP application and supporting materials. The review committee has evaluated the application and documentation combined, and found that you have provided all necessary means, as well as met all requirements for certification. Therefore, I am happy to inform you that the committee has granted you the certification – you can now refer to yourself as a Certified Organization Design Professional.’

That’s the email I got a few days ago.  It came as the result of evidencing that I met the criteria for certification and sending in the application payment.  (The 2018 payment is $150.00 but it is going up in 2019.  So, if you are interested in applying – go for it now to get the current rate.)

The criteria info states:

‘As an organization design professional, you can become certified if you meet a set of criteria divided into education and practice – both criteria are estimated based on your achievements through the past two years.’

You might be asking why I decided to apply for Certification.  I asked myself the same question – after all, it’s a time and money commitment that is currently (as it’s a new certification) of uncertain value.  And, in applying, I’d be making myself vulnerable to peer review.

The rational part of me kept telling myself I’m already over-committed to stuff and I need to practice saying ‘no’ to taking on anything that will take time – I didn’t need the added pressure of applying for Certification.  But as Mark Rowlands says in his wonderful book, Running with the Pack, ‘there was a small, sneaky, irrational part of me that always knew I was going to be standing at the starting line of this race’.  Though in my case it wasn’t the starting line of a race, but the starting section of the application form.  (Helpfully this is ‘Your full name’, so I felt confident on that question.)

The small, sneaky part of me that over-ruled the rational part of me did it by presenting four reasons why applying would be ‘a good thing’ to do. I’d be:

  1. Participating in a new venture that is worth supporting
  2. Contributing to an effort to professionalize organisation design
  3. Reflecting on what I have learned and developed in the past two years
  4. Testing and learning from the application process and criteria for myself

I’ll discuss each of these in turn.

Participating in a new venture that is worth supporting.  Organisation design is what academics call a fragmented field that (adapting from a paper on knowledge management) ‘lacks a common conceptual core; it is cross‐disciplinary, it addresses a wide variety of organisational phenomena, and it has difficulty distinguishing itself from many related areas of organisational/consulting practice’.   In my view, any effort that in the words of the Organisation Design Community (ODC) helps ‘research, practice, and learning intersect to produce valuable design knowledge and applications’ is worth supporting.  The Certification is a new part of the several activities orchestrated, individually and collectively, by the ODC, the Organization Design Forum, and the European Organization Design Forum designed to do that.

Contributing to an effort to professionalize organisation design.   A definition of ‘a profession’ that I agree with says: ‘A Profession is a disciplined group of individuals who adhere to ethical standards and who hold themselves out as, and are accepted by the public as possessing special knowledge and skills in a widely recognised body of learning derived from research, education and training at a high level, and who are prepared to apply this knowledge and exercise these skills in the interest of others.’  Over the years I’ve been involved in the field there has been no recognised endorsement of professionalism, although there are numerous short and long programmes that teach organisation design (see my blog on this).  However, now that organisation design is the ‘hot topic’, it’s time that it became a recognised ‘profession’ with a code of ethics and a process for quality assuring the practitioners in a way that gives confidence to buyers of organisation design work.  Note that this is early days on the ‘professionalizing’ road and the handbook of certification explicitly states ‘Certification does not warrant or guarantee the individual’s expertise in the field of organization design, nor does it signify that the individual is equipped to manage a given project within the field.’

Reflecting on what I have learned and developed in the past two years.   I’ve been in the Organization Design field for over 20 years and I think I have expertise in it.  With this, I’m conscious that, in the words of researcher Elizabeth Jones, ‘Expert professionals act at a level of automaticity with knowledge that enables efficient, effective and unselfconscious practice. They must also extend the theoretical and research knowledge that informs their practice and engage in critical enquiry into their own practice. Through these processes, professionals acquire new knowledge and skills as they develop a well-elaborated and improving theory of practice.’

My rational self pointed out that completing an application form hardly constitutes reflective practice.  (For more on that read the classic, and excellent, Donald Schon book,  The Reflective Practitioner: How Professionals Think In Action).  However, it did require me to look back over two years, see what I’d been writing about, working on and learning.  In that look-back process I did get some insights on how my practice has changed.

Testing the application process for myself.  Having got myself across the start line of form completion and completed the name, education, etc. sections, I tackled the four questions that form the meat of the process:

  • Describe your general experience with organizational design
  • Describe below how you meet the education requirement by providing information in the table with educational activity you have participated in
  • Using the table below, please describe in detail how you have achieved 1040 hours practical organizational design experience within the past two years
  • Describe how your background has supported your work as an organization design professional

This turned out to take quite a bit of time and effort (as my rational self had predicted).   Sifting through files, memories, and documents for the right combination of practical, theoretical, educational and developmental information resulted in a couple of trash-bags of documents I don’t know why I kept so long, a re-ordering of my on-line files to make info retrieval easier, and finally some paragraphs I felt happy enough with to submit.

In getting to this point I also ended up with ten questions about the Certification:

  1. Do trainers facilitating the accredited courses need be certified practitioners? (I think not).
  2. Who supervises the reviewer panel?
  3. The requirements for course accreditation are very detailed.  Are the reviewers looking for this type of information in the individual practitioner certification?
  4. How much value do the individual certification process and the course accreditation process add to organisations wanting organisation design skills?
  5. Are there bursaries for people/organisations who can’t afford the certification/accreditation fees?
  6. Is three years too long before individual re-certification given the current pace of organisational change?
  7. Should the assessment process be more rigorous – for example the requirement to submit a portfolio of evidence?
  8. Should there be more emphasis placed on the ‘reflective practitioner’ in the assessment process?
  9. Should people being certified also agree to conform to a code of ethics? (Note: the ODC members agree to adhere to the Academy of Management Code of Ethics.  The ODF and EODF do not have a response when the term ‘ethics’ is entered into their search box).
  10. Is there a plan to start quality assuring the practitioners?

As I’m an Advisory Board Member of the EODF and ODC I can pick these up with the ‘relevant authorities’.

Meanwhile I can happily report that the certification process focused my mind on what matters in my OD work, encouraged me to reflect on my OD practice, and provided a lovely review point of the highs and lows in my recent OD career.

Do you think a Professional Certification in Organisation Design Practice is an individual or organisational value-add?  Let me know.