I wasn't supposed to be doing a security audit today.
The ask was simpler: scope two sync bugs in a client's e-commerce platform, give an estimate. Standard stuff. But you can't look at a codebase without looking at a codebase. And what I found in the first 60 minutes changed the conversation entirely.
Here's what was sitting there, in a live production system, actively serving customers:
A config file with every credential in the business. Shopify API keys. Google service account. AWS credentials. CRM tokens. All in one file. Publicly accessible over HTTP. No auth, no protection, no .htaccess rule blocking it. Anyone who knew to look could pull it. We confirmed they've had a breach before. This is probably why.
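Closing that hole takes minutes. A sketch of the deny rule, Apache 2.4 style - the filename patterns here are illustrative, not the client's actual file; the real long-term fix is moving secrets out of the webroot entirely:

```apache
# .htaccess - deny direct HTTP access to anything that looks like a config file.
# (Patterns are placeholders; match them to the actual filenames.)
<FilesMatch "\.(ini|env|conf|bak|json)$">
    Require all denied
</FilesMatch>
```

This is a tourniquet, not a cure - every credential in that file still needs rotating, because it has to be assumed copied.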
SQL injection. Direct string concatenation feeding user input into database queries. The kind of thing that gets covered in the first week of any web security course. Still there, in production, in 2026.
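The fix has been standard for decades: parameterized queries, where the driver sends values separately from the SQL so input can never become code. The platform is PHP, but the pattern is identical everywhere - here's a self-contained sketch in Python with sqlite3, since I can't share the actual queries:

```python
import sqlite3

# The vulnerable pattern, schematically:
#   query = "SELECT * FROM orders WHERE email = '" + user_input + "'"
# A quote in user_input terminates the string and the rest runs as SQL.

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE orders (id INTEGER, email TEXT)")
cur.execute("INSERT INTO orders VALUES (1, 'a@example.com')")

user_input = "a@example.com' OR '1'='1"  # classic injection payload

# Parameterized: the payload is treated as a literal string, never as SQL.
cur.execute("SELECT * FROM orders WHERE email = ?", (user_input,))
print(cur.fetchall())  # [] - the payload matches nothing
```

Run the same payload through string concatenation and it returns every row in the table. That's the entire difference.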
Open admin panels. No IP restriction. No additional auth layer. Just a URL and you're in.
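Even before a real auth layer gets built, a few lines of server config take the panels off the open internet. A sketch, again Apache-style - the path, IP range, and password file are placeholders, not the client's setup:

```apache
# Restrict /admin to a known IP range, with basic auth as the fallback.
# Multiple Require lines inside <Location> default to "any one suffices".
<Location "/admin">
    AuthType Basic
    AuthName "Restricted"
    AuthUserFile /etc/apache2/.htpasswd
    Require ip 203.0.113.0/24
    Require valid-user
</Location>
```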
Google Tag Manager installed three times. Different container IDs. At least two of them duplicates or ghosts from old campaigns. Nobody cleaned up after the last agency.
robots.txt blocking all crawlers. A single Disallow: / - the site was telling every crawler, Google included, to stay out. It had probably been there for months. SEO dead on arrival, silently.
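The fix is almost nothing, which is part of why the bug survives so long - a Disallow: / blocks everything, while a Disallow: with no value, or a scoped path, blocks almost nothing. A sketch of a sane replacement (the admin path is a placeholder):

```
# What was there - blocks the entire site for every crawler:
#   User-agent: *
#   Disallow: /

# What it should say - crawl everything except internal paths:
User-agent: *
Disallow: /admin/
```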
No deployment pipeline. Changes go straight to prod over SFTP. No staging, no review, no rollback. Every deploy is a prayer.
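A pipeline doesn't have to be elaborate to beat SFTP-to-prod. A minimal sketch as a GitHub Actions workflow - the secrets, paths, and release-symlink approach are placeholder assumptions, not the client's setup, and SSH key provisioning is omitted for brevity:

```yaml
name: deploy
on:
  push:
    branches: [main]
jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Lint every PHP file before anything ships
        run: find . -name '*.php' -print0 | xargs -0 -n1 php -l
      - name: Deploy to a per-commit release dir, then flip a symlink
        env:
          HOST: ${{ secrets.DEPLOY_HOST }}
        run: |
          # Sync into releases/<sha>, then atomically repoint "current".
          # Rollback = repoint the symlink at the previous release dir.
          rsync -az --delete ./ deploy@"$HOST":/var/www/releases/${{ github.sha }}/
          ssh deploy@"$HOST" "ln -sfn /var/www/releases/${{ github.sha }} /var/www/current"
```

Even this much gives you review, a syntax gate, and a one-command rollback - three things the current process has zero of.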
The two sync bugs I came in to scope are still there. We'll get to them.
But before any feature work touches this codebase, the foundation has to be fixed. Exposed credentials mean the blast radius of any breach is total - not just the app, but every service it connects to. You can't build on that.
The order of operations matters. Security holes first. Deployment pipeline second. Then sync bugs. Then anything else.
There's a version of this where I scope the two bugs, send the estimate, and move on. The client gets what they asked for. The config file stays publicly accessible for another six months until something bad happens.
That's not how I want to work.
Sometimes the most valuable thing you can do when you open a codebase isn't the thing you were asked to do. It's noticing what's on fire and saying so.
TL;DR
- A 60-minute audit of a live PHP e-commerce platform found: publicly exposed credentials, SQL injection, open admin panels, triple-installed GTM, robots.txt blocking all crawlers, and no deployment pipeline
- Feature work can wait. Security and deployment hygiene come first.
- Sometimes the most valuable thing you can do in a codebase audit isn't the thing you were asked to do