Australia is short 370,000 digitally skilled workers. Within that, we're missing roughly 130,000 "digital experts" in fields like data science, cybersecurity, cloud engineering, and AI. Those numbers get quoted in every government report, every industry panel, every LinkedIn post from a tech CEO trying to justify a visa sponsorship.

But here's what nobody wants to talk about. A meaningful chunk of that gap is self-inflicted. We're not just failing to find talent. We're actively filtering it out.

The Machine That Says No

Most large Australian employers now use some form of AI-powered screening in their applicant tracking systems. These tools parse resumes, match keywords, score candidates, and rank them before a human ever sees them. In theory, this saves time. In practice, it's creating a blind spot the size of a small country.

The problem is simple. These systems are trained on patterns from existing successful hires. They look for specific job titles, specific certifications, specific career trajectories. If you've spent ten years as a "Data Analyst" at a bank and you apply for a "Data Scientist" role at a SaaS company, the algorithm sees a mismatch, even if your actual skillset covers 90% of what the role requires.

Career switchers get hammered. A journalist who taught themselves Python and built data pipelines for a newsroom? Rejected. A teacher who completed a data science bootcamp and spent 18 months freelancing on analytics projects? Rejected. A marketing manager who's been running attribution models and A/B tests for five years? Doesn't have "analyst" in their title, so rejected.
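To make the failure mode concrete, here's a toy sketch of the two-stage pattern described above: a hard title gate that runs before any skills matching ever happens. Every name, keyword list, and threshold here is an illustrative assumption, not any vendor's actual scoring logic.

```python
# Toy model of title-gated screening. All keywords and weights are
# illustrative assumptions, not a real ATS configuration.

TITLE_KEYWORDS = {"data scientist", "data analyst"}
REQUIRED_SKILLS = {"python", "sql", "statistics", "data pipelines"}

def screen(candidate: dict) -> str:
    # Stage 1: title pattern-match. Career switchers fail here.
    if candidate["last_title"].lower() not in TITLE_KEYWORDS:
        return "rejected (title mismatch)"
    # Stage 2: skills overlap. Only ever reached if stage 1 passes.
    overlap = len(REQUIRED_SKILLS & set(candidate["skills"])) / len(REQUIRED_SKILLS)
    return "shortlisted" if overlap >= 0.75 else "rejected (skills)"

# A self-taught journalist with a 100% skills match never gets scored:
journalist = {
    "last_title": "Journalist",
    "skills": ["python", "sql", "statistics", "data pipelines"],
}
print(screen(journalist))  # rejected (title mismatch) -- skills never examined
```

The skills evidence is sitting right there in the application. The gate ordering guarantees it is never read.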

Titles Are Not Skills

This is the core issue. We've built screening systems that confuse titles with capabilities. A "Digital Marketing Manager" at a 15-person startup might be doing the work of a data analyst, a campaign strategist, a CRM administrator, and a content producer. But the ATS sees one title and maps it to one box.

Meanwhile, a "Digital Marketing Manager" at a 5,000-person enterprise might be managing one channel with a team of six specialists doing the actual work. Same title. Completely different skillset. The algorithm can't tell the difference, and most hiring managers don't look deep enough to figure it out.

The Career-Switcher Penalty

Australia's tech sector has been loudly advocating for more diverse pathways into digital careers. Government-funded bootcamps. University micro-credentials. Industry reskilling programs. We're spending hundreds of millions encouraging people to transition into tech.

Then those people apply for jobs and get auto-rejected because their resume doesn't pattern-match to someone who's been in tech since graduation. We're funding the supply side while blocking those same people at the demand side. It's genuinely absurd.

What Good Screening Actually Looks Like

1. Screen for demonstrated capability, not career history

Instead of filtering on job titles and years of experience, look for evidence of the actual skills you need. GitHub contributions. Portfolio projects. Certifications with practical assessments. Even a well-written cover letter that demonstrates domain understanding tells you more than a title match.

2. Build your shortlist from skills taxonomies, not keyword lists

The difference matters. A keyword search for "Tableau" misses the candidate who's expert in Power BI and Looker and could learn Tableau in a week. A skills taxonomy approach says "data visualisation proficiency" and catches all three. Most modern ATS platforms support this. Almost nobody configures them properly.
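The Tableau example above can be sketched in a few lines. The taxonomy below is a minimal illustrative assumption (a real one would be far larger, e.g. built on a framework like SFIA), but it shows the mechanism: match on the capability, not the literal tool name.

```python
# Minimal sketch of taxonomy-based matching vs literal keyword matching.
# The taxonomy contents are illustrative assumptions only.

SKILL_TAXONOMY = {
    "data visualisation": {"tableau", "power bi", "looker", "qlik"},
    "data wrangling": {"pandas", "sql", "dbt"},
}

def keyword_match(resume_tools: set, keyword: str) -> bool:
    """Keyword approach: only the literal tool name counts."""
    return keyword in resume_tools

def taxonomy_match(resume_tools: set, skill: str) -> bool:
    """Taxonomy approach: any tool that evidences the skill counts."""
    return bool(resume_tools & SKILL_TAXONOMY[skill])

candidate = {"power bi", "looker", "sql"}
print(keyword_match(candidate, "tableau"))              # False: missed
print(taxonomy_match(candidate, "data visualisation"))  # True: caught
```

Same candidate, same resume. One configuration throws them away; the other surfaces them.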

3. Create explicit career-switcher pathways

Some of the best hires I've placed in the last two years have been career changers. A former management consultant who moved into product management. An ex-accountant who became a business intelligence analyst. A hospitality manager who transitioned into customer success at a SaaS company. Every one of them outperformed candidates with "perfect" backgrounds because they brought cross-domain thinking that pure-play candidates lacked.

But these people would have been auto-rejected by most ATS configurations. They got through because a human recruiter read their application and recognised the transferable skills.

4. Audit your rejection data

When was the last time you looked at who your ATS rejected? Most companies never do. They see the shortlist, interview from it, hire from it, and assume the system worked. But the candidates who got filtered out are invisible. You don't know what you lost.

Run a quarterly audit. Pull a random sample of 50 rejected applications. Have a human review them. If even 10% should have made it to interview stage, your screening process has a serious leak.
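The audit itself is almost trivially simple to run. A sketch, assuming you can export rejected application IDs from your ATS and have reviewers record a yes/no verdict on each sampled one:

```python
# Sketch of the quarterly rejection audit. Assumes an exported list of
# rejected application IDs and human reviewers marking each sample
# True ("should have reached interview") or False.

import random

def draw_audit_sample(rejected_ids: list, sample_size: int = 50) -> list:
    """Draw a random sample of rejections for human review."""
    return random.sample(rejected_ids, min(sample_size, len(rejected_ids)))

def leak_rate(verdicts: list) -> float:
    """Share of sampled rejects a human would have interviewed."""
    return sum(verdicts) / len(verdicts)

# After review: 7 of 50 sampled rejects flagged as wrongly filtered.
verdicts = [True] * 7 + [False] * 43
print(f"leak rate: {leak_rate(verdicts):.0%}")  # 14% -- above the 10% threshold
```

At that rate, extrapolated across every role you've run through the same configuration, the audit pays for itself in the first quarter.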

The Uncomfortable Implication

If we fixed our screening processes, we wouldn't eliminate the skills gap. There are genuine shortages in deep technical disciplines that take years to develop. You can't bootcamp your way to being a senior machine learning engineer.

But we'd shrink the gap significantly. Tens of thousands of capable people are being turned away by systems that can't see past a job title. Every one of them represents a hire that didn't happen, a team that stayed understaffed, a project that got delayed.

We don't just have a skills gap. We have a recognition gap. And we built the tools that created it.

The next time someone tells you they can't find talent, ask them how many applications their ATS rejected last month. Ask them what criteria it used. Ask them if a human ever reviewed the rejects.

The answer to Australia's tech talent shortage isn't just training more people. It's getting dramatically better at recognising the people we already have.