The Same Developer's Greatest Hits

A developer left the company and we inherited their projects. Every codebase was a new adventure. SQL injection in the login page was just the warm-up act.

By John Croucher
War Stories · Security · Early Career · Web Development

Every developer leaves a fingerprint in their code. The way they name variables, how they structure things, the patterns they fall back on. Usually it’s harmless. Sometimes you can even learn something from it.

And then there was this one developer.

We worked at the same company. They’d been there a while before I joined, and they’d built a lot of the client projects that were in active support. When they eventually left, we inherited everything they’d been working on.

That’s when the discoveries started.

You’d open a project and know it was theirs within minutes. The same patterns, the same shortcuts, the same complete disregard for anything resembling security. It was like recognising someone’s handwriting, except instead of neat cursive it was written in crayon on a wall.

The Password That Unlocks Everything

One of the first projects we dug into after they left was a web application with a standard login form. Username, password, submit button. Nothing unusual on the surface.

I was poking around the authentication code and found the SQL query that handled login. It was raw SQL, no parameterised queries, no ORM, nothing sanitised. Just a string concatenated straight into the query.

That alone was bad enough. But then I looked more closely at how the password check actually worked.

They’d used LIKE instead of =.

For anyone who doesn’t work with databases, LIKE is meant for pattern matching. You use it when you want to search for things that partially match, like finding all customers whose name starts with “John”. The % character is a wildcard that matches any sequence of characters, including nothing at all.

So when you use LIKE to check a password, and someone types % as their password, it matches everything. Every password in the database.
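
A minimal sketch of the flaw, using SQLite in memory. The table, column names, and stored password here are all hypothetical, and I’ve used parameterised placeholders (which the original code didn’t) so the LIKE bug is isolated from the separate SQL injection problem:

```python
import sqlite3

# In-memory stand-in for the application's user table (names are made up).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE users (username TEXT, password TEXT)")
db.execute("INSERT INTO users VALUES ('admin', 's3cret-hunter2')")

def vulnerable_login(username: str, password: str) -> bool:
    # LIKE treats % as "match anything", so a password of '%' matches every row.
    row = db.execute(
        "SELECT 1 FROM users WHERE username = ? AND password LIKE ?",
        (username, password),
    ).fetchone()
    return row is not None

def fixed_login(username: str, password: str) -> bool:
    # Exact comparison: % is just a literal character here.
    row = db.execute(
        "SELECT 1 FROM users WHERE username = ? AND password = ?",
        (username, password),
    ).fetchone()
    return row is not None

print(vulnerable_login("admin", "%"))  # True: one-character master key
print(fixed_login("admin", "%"))       # False: wildcard no longer works
```

(In a real system the password would be hashed and compared server-side, which makes LIKE impossible to misuse this way in the first place.)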

I tested it. Typed in a valid username, put % as the password, hit login.

I was in.

I tried another username. %. In again.

Every single account on the system was accessible with a one-character password. The admin account, user accounts, all of them. Just %.

It was the kind of bug where you stare at the screen for a moment, close your laptop, and go make a cup of tea before deciding how to explain this to the client.

The Credit Card Scenic Route

A different project for a different client, but the moment I opened the code I recognised the style immediately. Same developer.

This one was an e-commerce site with a multi-step checkout. You know the kind: enter your details, choose shipping, enter payment, confirm and submit. Pretty standard stuff.

Except for how they handled the credit card details.

On the payment step, the user enters their card number, expiry date, and CVV into a form. Normal so far. The form submits to the server. Still normal.

But then, instead of storing those details securely on the server side while the user completes the remaining steps, the server sent them straight back to the browser. In hidden form fields. The full card number, the expiry, the CVV, all of it, sitting right there in the HTML.

When the user moved to the next step, all that card data got submitted back to the server again. And then sent back to the browser again in more hidden fields. Back and forth, step after step, until the final confirmation page where it was all submitted one last time to actually process the payment.

The credit card details were doing laps. Server to browser, browser to server, server to browser again. Every step of the checkout was a round trip with the full card number sitting in the page source for anyone to see.

Right-click, View Source, and there it all was.

It’s the kind of approach where you can almost see the thought process. “I need the card details on the last step, so I’ll just carry them through the form.” Technically it achieved the goal. The payment went through. They probably tested it, saw it worked, and moved on to the next project before anyone thought to look at how it actually worked under the hood.

The fact that it was broadcasting payment card data in plain HTML across multiple page loads, that part apparently didn’t come up.

The Pattern

What got me about this developer wasn’t any single bug. Everyone writes bad code sometimes, especially early in their career. I certainly did.

It was the consistency. Every project had the same fingerprints. No input sanitisation anywhere. Security treated as someone else’s problem. Solutions that technically worked but were held together with the coding equivalent of duct tape and optimism.

And because they’d been the only one working on most of these projects, nobody had ever reviewed any of it. The code went straight from their machine to production. We only found out how bad things were because they left and we had to pick up the pieces.

The Uncomfortable Truth

The thing is, from the client’s perspective, everything worked. Users could log in. Payments went through. The sites did what they were supposed to do.

That’s the uncomfortable reality of security vulnerabilities. They’re invisible when everything is going well. The login page worked perfectly; it just also worked perfectly for anyone who knew to type %. The checkout processed payments successfully; it just also showed your card number to anyone who knew how to view page source.

The bugs that scare me most aren’t the ones that break things. They’re the ones that work perfectly while being completely, silently wrong.

What I Took Away

These projects taught me to never assume that working code is correct code. A feature that passes every test can still be fundamentally broken in ways that don’t show up until someone looks at it with the wrong intentions.

They also taught me the value of code review. Not the “looks fine, approved” kind, but the kind where someone actually reads the code and thinks about what it does. Both of these issues would have been caught instantly by anyone with basic security knowledge. A second pair of eyes was all it would have taken. If we’d had proper reviews in place while they were still at the company, none of this would have made it to production.

Every time we opened another one of their projects, there was a moment of genuine dread. Not “what’s messy in here” dread, but “what’s been running in production for three years that nobody’s ever looked at” dread.

We fixed what we found, and then went through every other project they’d built. It was a long few weeks. But I still think about how long those projects were out there, working perfectly, wide open, and nobody knew.