LLM output is optimized to sound plausible, which makes anyone evaluating it especially susceptible to a Dunning-Kruger-style blind spot: the less you know about a subject, the harder it is to notice what the model got wrong.
When a middle manager prompts an LLM to produce something an employee would normally produce, the manager can't do that job themselves, so they can't find anything wrong with the output. But when they ask it to do their own job, they can see the output is nonsense riddled with errors.
The obvious conclusion: the manager's own job is safe, but the LLM can replace everyone who reports to them.