Opinion

AI’s Downside: Mediocrity Is the New Standard

The term "workslop" captures one of AI's most significant risks with real precision. Here are three recommendations for how leaders can set a clear direction.

By Thomas Hanssen, CEO at CfL. This opinion piece was published in Finans on 20 February 2026.

We speak enthusiastically about AI as a driver of productivity, innovation, and growth. But in many places right now, I’m seeing the opposite: more noise and less judgement. At best, mediocrity is becoming the new standard. At worst, it leads to mindless work.

So far, we’ve only seen the beginning of the effects of generative AI. The most skilled are already reaping significant efficiency gains, while truly transformative business models have yet to materialise.

In many organisations, the result is what’s now being referred to as “workslop”—a term introduced by Harvard Business Review (HBR) in the autumn of 2025. And it’s spot on: workslop is AI-generated content that looks polished but lacks substance and does nothing to move the work forward. The result? We lose momentum because we have to start over—and if we push ahead anyway, quality erodes.

In a recent HBR article from January 2026 (“Why People Create AI ‘Workslop’—and How to Stop It”), researchers report that around 40% of employees have received workslop in the past month—and that the cost in large organisations runs into the millions annually.

When “Good Enough” Becomes a Business Risk

The most pressing risk is mediocrity at scale. When AI is used uncritically to produce content such as texts, proposals, offers, training materials, and presentations, quality standards begin to erode. That poses a real risk to professional standards, business outcomes, and organisational culture.

CfL collaborates with several researchers, including SDU professor Alf Rehn, who has described the quality of students’ work as follows:

  • Garbage is gone: Everyone uses ChatGPT and is therefore lifted to a mid-level. But even within all the “garbage,” there were occasionally independent, brilliant ideas. Those are now disappearing.
  • Replicate the mediocre: Average is still average. Students are satisfied with “good enough”—they’ve just become better at spelling.
  • Excellence is gone: Independent thinking and execution have dropped to a mid-level, because no one dares to take a chance without ChatGPT.

This example illustrates how AI can make us complacent. That’s why it’s crucial to remain critical of both the input we provide and the output we receive—especially when the work is shared with others.



ChatGPT and other large language models (LLMs) are extremely good at producing long, polished text. But if the input isn’t well thought through, the output will reflect that. It may look impressive, but in many cases it’s just words without substance—something you would never share if you had written it yourself.

LLMs are not better than us—they are simply faster. They are useful for research, inspiration, and structuring ideas, but they do not understand our reality. And if the sender isn’t willing to invest their own thinking into the input and output, why should the recipient engage with it?

It’s an easy parallel to draw to business: without excellence, we don’t stand out. “Good enough” is simply not good enough.

3 Recommendations to Avoid Workslop

This is where leadership comes into play. We need to take a much more deliberate stance on how technology is used—and what we are willing to stand behind. My recommendations for avoiding workslop are:

  1. Prioritise quality over quantity. Establish clear editorial standards for AI use: What counts as finished work? Which sources, checkpoints, and domain-specific criteria must be met? Workslop emerges when AI output is delivered unfiltered, leaving colleagues to clean up afterwards. Make the sender—not the model—accountable for quality.

  2. Train judgement—not just prompting. Mediocrity is countered through metacognition: the ability to plan, evaluate, and refine one’s own thinking while using AI. This is the capability that ultimately determines whether AI elevates or erodes quality.

  3. Define AI quality for your core tasks. Develop clear standards and approval processes for outputs such as analyses, decision briefs, and customer-facing deliverables, regardless of whether the draft is AI-generated or not. This is how you avoid the “good enough” mindset.

Ultimately, this is about trust and professional pride. We lose both if we accept that AI’s role is simply to fill pages, slides, and inboxes. Our task is not to accelerate at any cost, but to stay on course: from noise to substance, from pilot to practice, from average to excellence. Leadership means having the courage to set standards, including for how we use technology.

