A new chapter begins

Employee experience research — Why organisations move from projects to programmes 

Shift from fragmented feedback projects to cohesive, data-driven EX programmes for insights and actions.

A decade ago, in a class called “Innovation Management,” our professor told us that innovation comes in shocks, not in a steady upward path. The world of employee experience research is currently in the midst of such a shock. So, what’s driving it?

Over the past five years, new capabilities from technology providers such as Qualtrics, Glint and Remesh have helped us understand the employee experience (EX). We’ve seen how these technologies have enabled a different way of gathering feedback for organisations and the advisory firms that support them. Similarly, human resource information systems have grown much richer. And they’re easier to connect to EX research software.

Many organisations have made considerable investments in these new technologies. But most only use these capabilities ad hoc — stacking new initiatives on top of old ones and creating an ever-more-complicated web of separate employee feedback projects. 

At best, these projects don’t interfere with each other. At worst, they create irritation for employees, confusion among those tasked to act on results and disappointment among senior leaders. After all, leadership expected great synergies and extra value from the expensive new EX research tool. 

To generate a better return on these investments, we must take lessons from the past five years and apply new approaches. That means no longer stacking independent projects but instead making a fundamental change and working more strategically. 

That step toward a more strategic approach is the innovation shock we’re presently experiencing as we move from EX projects to EX programmes. 

The employee experience baseline and its challenge

Just about any organisation with more than 500 employees will gather employee experience feedback through an annual survey. Maybe they run the survey every two years instead of annually, but the principle is the same: asking numerous questions, processing and sharing data via dashboards and PowerPoint reports at all levels of the organisation, and providing a more in-depth analysis for senior leadership.

These companywide surveys won’t disappear any time soon. They provide a great understanding of what’s going well in the organisation, what could go better and whether any hot spots warrant more support. But they also leave two big gaps: frequency and comprehensiveness.

Filling the gaps

Tackling the low frequency of insights has been the first step for many, with the tool of choice being the pulse survey, which generally comes in two flavours. 

First, short, frequent pulse surveys track key topics between annual surveys, closing the frequency gap. Second, more elaborate pulse surveys gather deeper insights into a specific topic (such as mental well-being) or from a targeted subgroup of employees (for example, everyone at a given location), addressing the comprehensiveness gap as well.

Another way to gather more frequent feedback is to apply lifecycle surveys, automatically triggered when a certain condition is met, such as “with us for one week” or “started new role.” Insights from these surveys are often highly actionable and combine well with the more strategic insights from annual and pulse surveys. Many of our clients have begun implementing some form of these.
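
To make the trigger idea concrete, here is a minimal sketch (in Python) of how such lifecycle rules could be expressed. The field names, tenure windows and survey names are illustrative assumptions, not a reference to any particular platform; most survey tools offer equivalent configuration rather than code.

```python
from datetime import date, timedelta

# Illustrative lifecycle triggers: each rule pairs a condition with the
# survey it should launch. Field names and windows are hypothetical.
LIFECYCLE_RULES = [
    ("onboarding_week_1", lambda e: date.today() - e["hire_date"] == timedelta(days=7)),
    ("onboarding_week_4", lambda e: date.today() - e["hire_date"] == timedelta(days=28)),
    ("new_role_check_in", lambda e: e.get("role_change_date") == date.today() - timedelta(days=14)),
]

def surveys_due(employee: dict) -> list[str]:
    """Return the lifecycle surveys an employee should be invited to today."""
    return [name for name, condition in LIFECYCLE_RULES if condition(employee)]

# Example: an employee who joined exactly one week ago.
new_joiner = {"id": "E-1042", "hire_date": date.today() - timedelta(days=7)}
print(surveys_due(new_joiner))  # ['onboarding_week_1']
```

Running a check like this daily against the HR information system keeps each invitation tied to the moment it refers to, which is what makes the resulting feedback so actionable.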

Digging new holes

Pulse and lifecycle surveys can provide tremendous value by filling the gaps left by an annual survey. However, this is where EX professionals often begin to realise they’ve dug themselves a new hole: They haven’t built the infrastructure to ensure all these initiatives work well together. Let’s investigate that a bit more.
  1. Surveys are done as projects led by a topic specialist.

    For example, the rewards team focuses on understanding employees' preferences alongside the companywide engagement survey. Or the health, safety and environment (HSE) department, interested in well-being and burnout prevention, commissions its own well-being surveys. At the same time, the onboarding team is running lifecycle surveys for feedback at weeks 1, 4 and 16 after joining. 

    None of these departments coordinate tools or methodologies with the others, nor do they align on survey dates and audience. Each group reinvents the wheel multiple times as projects progress, missing the opportunity for various synergies. 

  2. Every project searches for its own tool. 
    That means data reside in different tools, based on varying specifications. As a company gathers more EX feedback data, expectations rise for insights from combined data sources. How does the onboarding experience drive engagement? And how does that translate into ramp-up time and/or retention? Handling these types of analyses is challenging enough. Having to connect data from multiple sources with varying specifications makes the task that much harder, and it leaves stakeholders disappointed by how few insights result. 
  3. Methodologies don’t automatically make sense together.

    Even if the data are joined, the combination may not add much value. Each project was designed in a vacuum, with too little thought given to how the projects could create valuable insights together. 

    For example, a common question about “intent to stay” is included, but the response scales don't match, making the data incomparable (a joining-and-harmonisation sketch follows this list). Or surveys pose similar questions that are worded slightly differently. Another common challenge is that a few questions would have obvious value in a combined database but little value within the context of a particular project, so they never get asked. 

  4. Each project carries its own branding and competes for attention and participation. 
    Employees don’t understand why they’re asked the same question repeatedly. Because of the different brands and communications, they start to tune out those surveys that aren’t pushed hard by their leaders. 
  5. Survey moments and audiences aren’t coordinated.
    This results in some employees being asked to respond to multiple surveys while whole departments are never surveyed at all, leaving a hole in the analysis. Aside from an unbalanced data set, this creates unnecessary frustration and irritation among employees and therefore lower response rates, poorer data and fewer insights. 
  6. Functions act on insights in silos.
    No one is clear on who's responsible for improvements, which leads to less-than-satisfactory action plans. The lack of coordination between projects can spread to change management, leaving stakeholders frustrated: they now know what they need to improve to meet their business challenges, but not how to accomplish those improvements.

So, while filling the gaps left by the annual survey, we EX professionals have dug ourselves a new hole: a lack of connectivity. Every project is still designed and executed in a vacuum, as if no other initiatives, surveys, focus groups, interviews or other research were taking place. This disconnect prevents organisations from generating the maximum value from their efforts and tools.
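
As a minimal illustration of points 2 and 3 above, the sketch below (Python with pandas, using hypothetical column names) joins onboarding and engagement extracts on an employee identifier and maps two differently scaled “intent to stay” items onto a common 0–100 range. Rescaling Likert items this way is a simplification, but it shows the kind of harmonisation that has to be designed in up front rather than bolted on afterwards.

```python
import pandas as pd

# Hypothetical extracts from two separate survey tools.
onboarding = pd.DataFrame({
    "employee_id": [1, 2, 3],
    "intent_to_stay_5pt": [4, 5, 2],   # 1-5 scale
})
engagement = pd.DataFrame({
    "employee_id": [1, 2, 3],
    "intent_to_stay_7pt": [6, 7, 3],   # 1-7 scale
    "engagement_score": [72, 88, 41],
})

def rescale(series: pd.Series, low: int, high: int) -> pd.Series:
    """Map a Likert item onto a common 0-100 range."""
    return (series - low) / (high - low) * 100

# Join on the shared identifier, then harmonise the two response scales.
combined = onboarding.merge(engagement, on="employee_id")
combined["intent_to_stay_onboarding"] = rescale(combined["intent_to_stay_5pt"], 1, 5)
combined["intent_to_stay_engagement"] = rescale(combined["intent_to_stay_7pt"], 1, 7)

print(combined[["employee_id", "intent_to_stay_onboarding",
                "intent_to_stay_engagement", "engagement_score"]])
```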

The answer: Build a programme from a solid blueprint

Avoiding the trap of unconnected projects requires a shift in philosophy. Organisations should start planning for a programme, not separate projects.

What does such a plan look like? The answer depends on many factors. But organisations should consider the following core principles:

Whatever action you take on EX should start with driving business success. That means identifying your stakeholders and understanding their various needs. 

As an example, the board will want the ability to monitor how their strategic plans are unfolding. But team leaders need to understand EX at a much more tactical level: What’s stopping their teams from fully thriving, and how can those hurdles be removed? Another example is HR, which needs to ensure it delivers a compelling employee value proposition that attracts the right people and retains key employees. There’s not a single person in leadership who can’t benefit from EX insights. But no one is helped by generic, high-level insights. 

Aside from leadership, many others will need EX insights. Think about IT — is the tech provided helping employees? HSE — is the work environment supporting safety protocols? Or learning and development — are you offering the right enablement programmes?

Mapping out what the business needs and how EX research will drive the specific key performance indicators attached to those needs is where any good plan starts. The plan can be small or wide ranging, as long as it’s clear. 

Creating an understanding of the purpose and goals the research serves also helps manage expectations and make success visible.  

After the relevant business needs are mapped out, the next step is to understand what data are needed to provide the required insights. Consider not only EX data but also operational data already available in the organisation. 

What are the gaps in the current data sets? This analysis will lead to the right type of activity to bridge any gaps identified; for example, using onboarding or exit surveys to understand attrition. Or combining engagement survey data with recorded safety incidents to understand what drives people to ignore protocols in certain situations. 

How an organisation expands on an existing EX programme is crucial. Organisations should build on current EX feedback programmes and add any activity required to help the business. New activities should be well connected to existing ones, especially where necessary to support an identified business need. 

Finally, organisations must pay close attention to who they ask to provide EX feedback. A good set of governance rules will go a long way toward ensuring that no one is bombarded with feedback requests and that surveys are gathering the right information from the right audience. 
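
One way such a governance rule could be operationalised is sketched below, purely as an assumption: a simple cooldown check that stops anyone from being invited to more than one survey within a set window. The four-week window and the function names are hypothetical; the point is that the rule is explicit and applied centrally rather than left to each project.

```python
from datetime import date, timedelta
from typing import Optional

# Illustrative governance rule: no employee is invited to more than one
# survey within the cooldown window. The window length is an assumption.
COOLDOWN = timedelta(weeks=4)

def eligible_for_invite(last_invited: Optional[date], today: Optional[date] = None) -> bool:
    """Return True if the employee may be invited to another survey."""
    today = today or date.today()
    return last_invited is None or today - last_invited >= COOLDOWN

# Example: invited three weeks ago, so not yet eligible under a four-week rule.
print(eligible_for_invite(date.today() - timedelta(weeks=3)))  # False
print(eligible_for_invite(None))                               # True: never invited
```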

Finally comes the million-dollar question: How will the organisation use the insights to respond to the business needs identified at the beginning? Making an impact requires setting the right priorities and identifying the right solutions. The former will follow from the data analysis; for the latter, leaders should draw on the available tools and in-house knowledge. 

Ultimately, the organisation will need to drive that action, but the programme's goal is to make this as easy as possible. Get the impact right and it will outweigh the costs, ensuring a return on investment (ROI). 

Although change can only come after the analysis, the infrastructure for quick and effective action can be established early on. And because business needs were identified at the outset, a business case for any proposed change can be drafted quickly, with a clear ROI calculation. 

One example is an investigation into attrition drivers, which is often challenging for organisations. The combination of experience and attrition data will show what drives attrition within the organisation and for various personas. This makes it possible to set a target for improving the most critical driver, provide a cost calculation and estimate the expected impact on attrition. Combining those insights with a projected reduction in attrition costs makes presenting a compelling business case easier. 
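
For illustration, the arithmetic behind such a business case might look like the sketch below. Every figure is a made-up assumption; the value lies in making the assumptions explicit so stakeholders can challenge them.

```python
# Back-of-the-envelope attrition business case. All numbers are assumptions.
headcount = 2_000
cost_per_leaver = 30_000        # recruitment, onboarding and lost productivity
attrition_reduction = 0.02      # target: attrition down by two percentage points
programme_cost = 250_000        # cost of acting on the most critical driver

leavers_avoided = headcount * attrition_reduction          # 40 people
savings = leavers_avoided * cost_per_leaver                # 1,200,000
roi = (savings - programme_cost) / programme_cost          # 3.8, i.e. 380%

print(f"Leavers avoided: {leavers_avoided:.0f}")
print(f"Projected savings: {savings:,.0f}")
print(f"ROI: {roi:.0%}")
```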

Setting aside budgets beforehand will help speed up the time required to act on the business case. Establishing responsibilities early as part of the plan ensures a continuous drive, including monitoring change implementation effectiveness. 

Data, needs and action

To summarise, an effective EX research programme thrives when it’s:

  • Rooted in business needs
  • Supported by data that provide direction and credibility
  • Designed to drive action systematically

If you have questions on how to transition from unconnected individual projects to a coordinated EX programme, please contact us.
