We have reached a bit of a crossroads in recruitment.
On one hand, 82% of us in the HR world are leaning on AI to help find and hire new people.
It makes sense because the sheer volume of applications is staggering.
But on the other hand, 2025 showed us some pretty uncomfortable truths about what happens when we let machines take over entirely.
As we head into 2026, the conversation is shifting.
It is no longer about whether we should use AI, but how we use it responsibly without losing the “human” in human resources.
Most large companies now use Applicant Tracking Systems (ATS) to manage candidate applications.
While this streamlines the process, it means that roughly 75% of CVs are never actually seen by a human eye.
The danger here is automated bias.
A recent study from the University of Washington found that AI screening tools favoured white-associated names 85% of the time.
Even more shocking? Female-associated names were preferred only 11% of the time.
It is tempting to think of AI as a neutral judge, but in reality it is more like a mirror. It doesn’t have its own moral compass; it simply looks at the data we give it and tries to find a pattern.
This is where the problem starts.
If an AI tool is trained on twenty years of resumes from a company that has historically hired a specific demographic, the algorithm learns that those traits are the ‘blueprint’ for success. It begins to view anything outside that narrow window as a risk.
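For readers who like to see the mechanics, here is a deliberately over-simplified, purely illustrative sketch of that effect in Python. The “university cluster” feature, the numbers and the scoring rule are all made up; the point is only that a model which learns from a skewed hiring history will reproduce the skew.

```python
# Illustrative sketch only: a toy "screening model" trained on biased history.
# The feature, labels and numbers are hypothetical, not taken from any study.

from collections import defaultdict

# Twenty years of (hypothetical) hiring decisions. Feature: which university
# "cluster" the candidate came from; label: whether they were hired.
historical_cvs = [
    ("cluster_a", True), ("cluster_a", True), ("cluster_a", True),
    ("cluster_a", True), ("cluster_b", False), ("cluster_b", False),
    ("cluster_b", True), ("cluster_b", False),
]

# "Training" here is just memorising the historical hire rate per cluster.
hires, totals = defaultdict(int), defaultdict(int)
for cluster, hired in historical_cvs:
    totals[cluster] += 1
    hires[cluster] += hired

def screening_score(cluster: str) -> float:
    """Score a new CV by how often similar CVs were hired in the past."""
    return hires[cluster] / totals[cluster] if totals[cluster] else 0.0

# A strong candidate from the under-represented cluster is scored below
# an average candidate from the historically favoured one.
print(screening_score("cluster_a"))  # 1.0  -> sails through the filter
print(screening_score("cluster_b"))  # 0.25 -> quietly rejected
```

Nothing in that toy model is malicious; it is simply pattern-matching on history, which is exactly the problem.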
For HR teams in 2026, this creates a massive barrier to diversity, equity and inclusion efforts.
If the software is quietly filtering out certain candidates before the human even sees them, your diversity strategy is broken before it has even begun.
When an algorithm is trained on historical data, it often just ends up repeating the mistakes of the past.
The data suggests that if your AI rejects a brilliant candidate who then goes to a competitor, it could cost your business an average of £2.4 million in the long term.
That is an expensive mistake to make for the sake of “efficiency”.
It’s not just about who we are rejecting – it’s also about who is getting through.
We are seeing a massive spike in AI-generated CVs and even deepfake candidates.
We have moved far beyond candidates simply using ChatGPT to tidy up their bullet points – we’re now seeing the rise of the fully synthetic candidate.
67% of large companies report a spike in application fraud, and the tactics have become incredibly advanced:
Candidates are using generative tools to create perfectly forged certificates, portfolios and even references that look indistinguishable from the real thing.
In remote-first environments, some applicants are using real-time video and audio filters to mask their identity, or even having a more experienced person take the interview behind a digital mask.
It is no coincidence that the tech industry is the top target, with 65% of fraud cases. The high salaries and remote nature of the work make it a goldmine for those using AI to bypass traditional vetting.
People are using AI to fabricate entire careers or qualifications. It is happening most in tech, marketing and finance, yet nearly half of HR professionals haven’t had any training on how to spot these fakes. In 2026, verification is going to be just as important as the interview itself.
It is not all doom and gloom, though: AI is genuinely brilliant at increasing admin efficiency.
Using it for admin can cut HR labour time by about 35%, which is a huge win for many teams.
Last year, some teams managed to cut their time-to-hire by 50%.
In a market where 250 people might apply for a single role, that speed is vital.
But the goal for 2026 should be to use that saved time to actually talk to people.
93% of hiring managers still say human involvement is the most essential part of the process.
Our COO, Lucy Harvey, puts it perfectly:
“Keeping AI as a support tool, rather than relying on it to automate everything, is key to maintaining a human-first approach to recruitment and building real connections.”
The key is balance.
Use AI to handle the scheduling and the paperwork, but keep a human in the loop for the actual decisions.
We need to hire based on individual skillsets and team fit, not just whoever used the right keywords to trick a machine.
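For teams that want to formalise that balance, the sketch below shows one way to split the work: the model may score and order applications, but only a named human can advance or reject anyone. The Application type, ai_assist and record_decision are hypothetical placeholders rather than features of any particular ATS.

```python
# Minimal sketch: the model may sort and summarise, but only a named human
# can reject or advance a candidate. All names and helpers are hypothetical.

from dataclasses import dataclass

@dataclass
class Application:
    candidate: str
    cv_text: str
    ai_score: float = 0.0

def ai_assist(applications: list[Application]) -> list[Application]:
    """Admin layer: score and order the pile, but never drop anything from it."""
    for app in applications:
        # Stand-in for whatever model you use; every CV stays in the list.
        app.ai_score = min(len(app.cv_text) / 1000, 1.0)
    return sorted(applications, key=lambda a: a.ai_score, reverse=True)

def record_decision(app: Application, reviewer: str, advance: bool) -> dict:
    """Decision layer: the advance/reject call is attributed to a person."""
    return {"candidate": app.candidate, "reviewer": reviewer, "advanced": advance}

# Usage: AI orders the queue, a human reads the CVs and decides.
queue = ai_assist([Application("A. Candidate", "..."), Application("B. Candidate", "...")])
decisions = [record_decision(app, reviewer="Hiring manager", advance=True) for app in queue]
```

The design point is the separation: efficiency lives in ai_assist, accountability lives in record_decision.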