Here's something no job posting ever tells you: your resume probably never reaches a human. Not because you're underqualified. Not because the role filled internally. Because a piece of software looked at your file, didn't like what it found, and discarded it before any recruiter opened their laptop.
Applicant tracking systems — ATS — handle initial screening at roughly 98% of Fortune 500 companies and most mid-sized firms. They don't read your resume the way a person does. They parse it. They extract data points — name, contact info, work history, skills, education — and score it against the job requirements. If your score is below a threshold, you're out. No human involved.
The parsing problem
Here's where people get surprised: ATS software isn't reading your beautifully formatted resume. It's extracting raw text from it. And if your formatting is doing anything unusual — a two-column layout, tables, text in headers or footers, graphics, or icons — the parser often mangles it or skips it entirely.
That modern two-column template you downloaded? The one with your skills listed as percentage bars on the left side? An ATS might read that as: "Marketing Manager Project Management 85%". That's not a skill. That's garbled text. It gets ignored.
- Multi-column layouts: the parser reads across rows, not down columns — jumbling everything
- Text inside tables: many ATS versions skip table content entirely
- Headers and footers: contact info placed in document headers is often invisible to parsers
- Images and icons: cannot be extracted as text, period
- Decorative or non-standard fonts: some fonts map glyphs to non-standard character codes, so the extracted text comes out as gibberish
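The multi-column failure is easy to demonstrate. Here's a toy sketch (not any real ATS parser, just an illustration of the behavior) of what happens when an extractor walks each visual row left-to-right instead of finishing the left column before starting the right one:

```python
# Toy illustration: a two-column resume layout as two lists of lines.
# These column contents are made up for the example.
left_column = ["Skills", "Project Management", "SQL"]
right_column = ["Experience", "Marketing Manager", "Acme Corp, 2019-2024"]

# A row-wise extractor reads across the page, interleaving the columns.
row_wise = [f"{left} {right}" for left, right in zip(left_column, right_column)]

print(row_wise[1])  # "Project Management Marketing Manager"
```

The second extracted line is "Project Management Marketing Manager" — a skill fused to a job title. Neither phrase survives intact, so neither matches cleanly downstream.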
The keyword problem
Even if your formatting is clean and the parser extracts everything correctly, you're still not done. ATS scores your resume against the job description using keyword matching. And here's the thing: it's mostly exact matching, not semantic matching.
If the job description says "project management" and you wrote "managing projects" — a human would consider those identical. Many ATS systems don't. If the job wants "SQL" and you wrote "relational database management", you may not get credit even though you're describing the same skill.
The system isn't evaluating whether you can do the job. It's checking whether your text contains specific strings of characters. It's closer to a keyword search than an intelligence test.
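To make that concrete, here's a toy scorer built on the exact-substring assumption described above. This is an illustration of the failure mode, not any vendor's actual algorithm; the function name and phrases are invented for the example:

```python
def keyword_score(resume_text: str, required_phrases: list[str]) -> float:
    """Toy ATS-style scorer: fraction of required phrases found
    verbatim (case-insensitive) in the resume text."""
    text = resume_text.lower()
    hits = sum(1 for phrase in required_phrases if phrase.lower() in text)
    return hits / len(required_phrases)

resume = "Led teams managing projects using relational database management."
required = ["project management", "SQL"]

print(keyword_score(resume, required))  # 0.0 — describes both skills, matches neither
```

A human reads that resume line and sees both required skills. The exact matcher sees neither, because "managing projects" is not the string "project management" and "relational database management" does not contain "SQL". That gap is exactly why tailoring your wording to the job description matters.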
File format matters more than you think
PDF vs DOCX is genuinely contested advice. The honest answer: it depends on the ATS. Older systems handle DOCX more reliably. Modern systems are mostly fine with PDF. When you have no idea which ATS a company uses, a cleanly exported PDF is usually the safer bet — but only if the PDF contains searchable text, not a scanned image.
If you exported your resume from Canva, Google Slides, or anything that treats the document as a visual canvas, there's a real chance your resume text is invisible to ATS. It's a picture of a resume. Not a resume.
What you can actually do
The fix is less exciting than the problem. Single-column layout. Standard section headings ("Work Experience", not "Where I've Been"). Contact info in the document body, not the header. Keywords pulled from the actual job description. No tables, no text boxes, no graphics.
It won't win any design awards. It will get through the system.
One more thing: different companies use different ATS systems, and those systems have different parsing behaviors. A resume that sails through Greenhouse might stumble on Taleo. The best you can do is optimize for the most common failure points — which the above covers — and tailor your keywords for each application.
The frustrating reality is that ATS screening was designed to handle volume, not find the best candidate. Most recruiters know this. They hate it too. But until that changes, the people who understand how the system works will keep having an edge over the people who don't.