Chief People Officer’s quick guide to generative artificial intelligence 

13 April 2023

In just a matter of weeks, an obscure tool called ChatGPT ripped a hole in time, sending us into the future. We went from “GPT what?” to “What can I do with ChatGPT?” in less than a month.

Now we are asking, in mainstream conversations, questions like: How can we use generative AI tools and applications at work? How should our kids be using them for homework? What jobs will be disrupted? Generative AI is already making a swift and dramatic impact on how we work, learn and create, with more to come. And with every moment spent researching these tools and discussing their applications, the full force of their impact becomes clearer. Exciting times, indeed!

No doubt generative AI will reshape work, and it’s clear that we are just getting started. It is not without limitations and risks, however, as highlighted in an open letter signed by Elon Musk and AI experts warning of an “out-of-control” AI race with potential risks to society and humanity. Certainly there is much to consider, with the implications for people at the top of experts’ minds.

So, what does this mean for HR, and what do Chief People Officers (CPOs) need to know to join this revolution? 

Brief note on generative AI, for those just tuning in

Generative AI systems like ChatGPT use algorithms to recognize underlying patterns in existing material and, based on user prompts and input parameters, generate new content, including audio, images, text and video.

Why is it getting so much hype? As Microsoft says of Copilot (its new integrated AI personal productivity tool), “it can turn your words into the most powerful productivity tool on the planet.” Indeed, generative AI can improve efficiency by accelerating manual and repetitive tasks such as writing emails or summarizing large documents; it can personalize experiences by creating content tailored to a specific audience; and it can mimic writing styles or enable the creation of entirely new material through a continuous loop of prompts. It is much like a personal assistant spending an entire day scouring the internet to answer a random question, but delivering the result in a matter of seconds. What a time to be alive!

So why are people calling for caution? In its attempt to meet a user’s request for an answer, generative AI can produce false or biased information that still reads as believable, without regard for whether that information has real-world consequences.

Companies now must work out how to use generative AI and other tools in ways that maximize business impact without succumbing to potential pitfalls. This creates a new set of responsibilities that HR needs to address today.

What is first up for HR to consider?

HR leaders have a clear role in the deployment of AI and AI tools in the workforce, given these tools’ ability to augment human work and their potential impact on the jobs of the future. Enterprise technology and data leaders typically run the selection and tech/data governance around these tools, so HR will need to collaborate quickly to secure an early voice in both the deployment of these tools and questions around access and usage. To maintain people-centric work designs and human-based organizations, it is critical that senior leaders across the organization are aligned on how people and AI will work together and on the potential impact of this new technology on strategic workforce plans. Below are some short- and mid-term considerations for CPOs and their teams based on our experience.

Short-term considerations on the implications of AI and AI tools:

  • Engage employees in redesigning work
    AI is changing how work is performed, substituting some tasks and augmenting others. Ironically, while AI technology can make certain tasks easier, it also introduces complex implications for organizations. Consider the need for HR to ensure employees remain engaged and motivated, especially in jobs where generative AI will heavily impact how work gets done. Ensure employees have a voice in how AI is introduced into their work so its impact can be optimized, and provide upskilling and reskilling so they can engage in new productive work. Understand the advantages that AI brings to the work, and focus on building skills, and even credentials, in areas that complement the use of AI, as well as in the technology itself.
  • Consider how to mitigate risks stemming from the use of proprietary data
    Employees who experiment with generative AI may inadvertently use it to process proprietary data, thereby training external generative AI models, which some tools would then reference when creating content for external users. Work with your Risk and Compliance team to evaluate and avoid such risks. Communicate with and train leaders and employees on what they can do with AI, as well as on the risks that arise from mishandling and sharing sensitive information with generative AI tools, thus building a strong cyber resilience culture.
  • Develop a plan to address emerging skills shortages
    New skill sets in data management, governance and ethics are rising up the priority list. Define which are most critical for your organization, where you need this talent made available, and how you can develop employees with these skills. Remember, in this emerging space, there might not be enough expert talent in the market for all interested organizations. Think broadly about how to build, borrow and buy these skills to ensure your company is staying ahead.
  • Hone skills that allow for the effective use of AI
    Effectively overnight, the ability to sense- and fact-check generative AI output has become a crucial skill set. Data literacy and vigilance, proper source evaluation, validation and attribution will all become even more critical. These are not new skills, but the demand has dramatically increased in scope and significance. Employees will need to hone both their curiosity and critical thinking to shape and contextualize the content that is being generated by AI. Be sure to communicate these skills and demonstrate their application.
  • Redefine roles in light of this growing field

    Entire careers and job roles are evolving as we speak. Many of these jobs are in short supply today and will need to be developed in-house, potentially in partnership with outside experts. As organizations explore generative AI use cases, some roles that are rising in prominence include:

    • AI Utilization Director to regulate how tools are used and improve programmatic accuracy and relevance
    • AI Implementation Specialist to integrate AI technology into operations, requiring both technical expertise and business knowledge to bridge the gap between the two
    • AI Product & Adoption Manager to support internal customer needs, and to ensure effective deployment and adoption
  • Ensure your ethical AI guidelines have been updated
    The ethical considerations around emerging AI are many, varied and growing. Recently, generative AI’s inaccurate responses and missteps in public demos have drawn much attention, but deeper bias in generated content may soon prove more pernicious. Because these models “learn” through training on existing data, they risk perpetuating undesirable historical biases, much as humans do. For example, an AI-created job description for a senior position might default to male pronouns, or an AI-generated photo filter could lighten a subject’s skin color. Just as they tackle unconscious bias in the human workforce, HR can be proactive in addressing bias from AI tools. Ensure your company has, and adheres to, a policy on ethical AI and that it has been updated to include generative AI applications. As a general rule, put humans in front with the tech behind (that is, have humans make the last check, and don’t leave final decisions up to the tech). Ensure there are checklists of things to follow and monitor for those building AI applications. Running regular “adverse impact” assessments will help to ensure that talent decisions are driven by humans, not AI (a minimal illustration follows this list).
  • Pilot the use of AI and AI tools within HR
    Put someone in your own organization in charge of leading the adoption of AI and AI tools within HR. Compensation teams can use AI to support tasks like writing job descriptions and tailoring compensation. Talent teams can generate new competencies and competency models. Learning teams can begin recommending more customized curricula. Employee communications can be drafted and refined more quickly. Start to implement and adopt so you can learn, within your own function, what the real-world implications are for your company, and so the HR team isn’t left in the dust.
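
To make the “adverse impact” assessment mentioned above concrete, here is a minimal sketch in Python. It assumes the widely used four-fifths (80%) rule as the screening heuristic, and the group labels, function names and decision counts are hypothetical illustrations rather than any specific tool or methodology.

```python
# Minimal sketch of a periodic "adverse impact" check on AI-assisted screening
# decisions, using the common four-fifths (80%) rule as the heuristic.
# All group labels and counts below are hypothetical.

from collections import defaultdict


def selection_rates(decisions):
    """decisions: list of (group, selected) tuples -> selection rate per group."""
    counts = defaultdict(lambda: [0, 0])  # group -> [selected, total]
    for group, selected in decisions:
        counts[group][1] += 1
        if selected:
            counts[group][0] += 1
    return {g: sel / total for g, (sel, total) in counts.items()}


def adverse_impact_flags(decisions, threshold=0.8):
    """Flag groups whose selection rate falls below `threshold` times the
    highest group's rate (the four-fifths rule)."""
    rates = selection_rates(decisions)
    top = max(rates.values())
    return {g: rate / top for g, rate in rates.items() if rate / top < threshold}


# Hypothetical outcomes from an AI-assisted shortlisting tool
decisions = ([("group_a", True)] * 40 + [("group_a", False)] * 60
             + [("group_b", True)] * 25 + [("group_b", False)] * 75)

print(adverse_impact_flags(decisions))
# {'group_b': 0.625} -> group_b's selection rate is 62.5% of group_a's,
# below the 80% bar, so this batch of decisions warrants human review.
```

In practice, a check like this would run on real decision logs from any AI-assisted screening or recommendation tool, and any flagged group would trigger a human review, consistent with the “humans make the last check” principle above.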

Mid-term considerations:

  • Optimize the combination of talent and technology
    In Reinventing Jobs: A 4-Step Approach for Applying Automation to Work (HBR Press, 2018), Jesuthasan and Boudreau demonstrate, through several dozen case studies and examples, that companies that lead with the work instead of the technology are better equipped to find the optimal combinations of humans and automation. Those companies see where automation can best substitute for highly repetitive, rules-based work; where it can augment human creativity, critical thinking and empathy; and where it can create new human work. As AI starts to permeate the different bodies of work in your organization, ensure your leaders have the tools and capabilities to deconstruct jobs into tasks, redeploy those tasks to the best option (human or machine) and reconstruct new, more impactful jobs. It will be essential to develop leaders who can approach work automation from a humanistic perspective rather than a purely technical one.
  • Ensure the impact on organizational culture is understood
    AI technology will ultimately transform how organizations work. The opportunity for CPOs is to further infuse the organizational culture with human-centric values like innovation, curiosity and discovery. Watch out for the emergence of subcultures between those who use generative AI and those who don’t. There may also be concerns of inequity surrounding this new technology, fueling anxiety over the impact of AI on people’s jobs and career progression. Support business leaders in describing how AI should integrate into the work culture. Take steps to ensure that its use and performance align with the organization's values and goals.
  • Be ahead of the curve in carving out future roles

    Remember that every algorithm has a human “parent.” Even as AI takes over some tasks, you’ll need humans to train and run those systems. Organizations may soon find themselves needing to create entirely new roles to manage AI-driven tools. Consider how best to build these skills in-house and ensure your organization is attractive to digital talent who might be interested in roles such as:

    • Prompt Engineer to apply data science expertise in developing curated, relevant content
    • Output Auditor to minimize substantive errors and bias, as well as improve accuracy and relevance
    • Data Management Specialist to manage, process and curate large volumes of data to ensure its quality and utilization in training AI models (particularly for models that are key to future business strategies)
  • Cultivate a mindset of perpetual reinvention
    Traditional jobs have already changed, and reinvention must become a new muscle that organizations learn to flex forevermore.  In the simplest terms, encourage workers to look at how their jobs are done at other firms, and then bring back ideas. Create forums for idea-sharing and challenging the status quo, harnessing collective creativity. Set aside budget for “test-and-learn” experiments and create capacity in the system for people to “play.” Engage your talent at the front lines in mapping out how AI can best be deployed. This is a critical opportunity to close the gap between executives and workers on how AI is disrupting the organization’s work. 
  • Stay ahead of legal and regulatory compliance
    The use of generative AI poses several legal and regulatory pitfalls. The data concerns mentioned above are compounded when employees provide prompts or publish content that violates copyright or privacy laws. At a minimum, provide basic training so employees know not to enter private, sensitive or confidential information. Better yet, put rigorous guardrails around the use of these AI-based tools (a simple illustration follows this list). Certify your employees (via a pass/fail course) before allowing them to use the tools; set up alerts or disclaimers for internal content downloads; and, of course, ensure that GDPR and similar laws are being upheld.
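
As one simple illustration of the guardrails described above, the sketch below scans a prompt for obviously sensitive patterns before it is sent to an external generative AI tool. The patterns, function names and blocking policy are hypothetical; a real deployment would follow your organization’s data-classification rules and existing data loss prevention tooling.

```python
import re

# Illustrative prompt "guardrail": scan text for obviously sensitive patterns
# before it is submitted to an external generative AI tool. The patterns and
# policy here are examples only, not a complete or production-grade control.

SENSITIVE_PATTERNS = {
    "email_address": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    "us_ssn_like": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "payment_card_like": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}


def check_prompt(prompt):
    """Return the names of sensitive patterns found in the prompt."""
    return [name for name, pattern in SENSITIVE_PATTERNS.items()
            if pattern.search(prompt)]


prompt = "Summarize this note and email it to jane.doe@example.com"
findings = check_prompt(prompt)
if findings:
    print(f"Blocked: prompt appears to contain {', '.join(findings)}.")
else:
    print("Prompt passed basic checks.")
```

A check like this could sit in front of any approved generative AI tool, pairing the employee training described above with an automated backstop and an audit trail of blocked submissions.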

This opportunity is too great to miss

To be clear, the unfolding story of generative AI is one of significant risk and reward. It makes chatbots, internet search results, and business platforms more engaging and helpful. Tedious time-sinks like data entry now happen in minutes, not hours. There’s even an app that translates visual data into audio descriptions, making the world and the office more accessible. Digital personal assistant, anyone?

Organizations could build their own in-house generative AI tools and train them on their proprietary data for any number of business applications, and our collective ideas will only multiply in the coming months. Entire industries will emerge, evolve and shrink from this moment on.

As the technology advances, companies will continue to find ways to leverage generative AI that add value and minimize risk. While there is business risk, the opportunity to drive up productivity by deploying the same workforce (augmented by AI) in new ways, and the chance to make jobs more attractive, are simply too big to overlook. It’s best to get ahead.

The one truth to hold in our minds is that humans plus AI deliver the real advantage, and we must actively manage this combination. The role of CPOs in shaping how we grasp and prepare for the impacts on our businesses, our leaders, our workforce and society has therefore never been more important. The good news is that AI has the potential to help level and even expand the playing field as we all learn together.

Mercer is a Marsh McLennan company. We help companies with people, risk and strategy. 

About the author(s)
Jesse Bramall

Mercer Principal and US Skills Advisory Solutions Lead

William Self

Mercer Partner and Workforce Strategy & Analytics Leader

Ben Hoster

Marsh McLennan Advantage, Transformative Technologies Director

Chris Lomas

Digital Products Director, Oliver Wyman
