Just recently, researchers published a study showing that over 29 million American health records were compromised between 2010 and 2013. Many of those were stolen outright, but many were also exposed through hacks and vulnerabilities in weak systems. This isn't anything new. We hear about big companies losing big data almost daily now, and we never hear about smaller companies losing any, but they do; it just isn't big news. 29 million records, however, is not something to brush off lightly. Back when client data was kept in manila folders in a cabinet somewhere, data breaches were rare. You would notice someone walking out of a building with 29 million folders. You just would. Every year, however, we digitize more and more content. We've gotten good at that. Digitizing. But maybe we got too good at it too early. Maybe we didn't realize that fitting 29 million records on a thumb-size drive could be a problem one day.
This isn't an article about healthcare data, however. It's about data security in general. CitiBank was hacked in 2011 through a vulnerability that would make any programmer cringe. 200,000 customers had their information compromised because someone didn't check whether an account ID belonged to the currently logged-in user. This was 100% programmer error, something that could have been fixed in 10 minutes, yet it went unnoticed for who knows how long. Worse still, it's something that should have been caught well before the page was ever made public facing.
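To make the bug concrete, here is a minimal sketch of the class of flaw described, commonly called an insecure direct object reference. The names, data, and functions below are purely illustrative, not Citi's actual code; the point is how small the missing check is.

```python
# Hypothetical account store for illustration only.
ACCOUNTS = {
    "acct-1001": {"owner": "alice", "balance": 5200},
    "acct-1002": {"owner": "bob", "balance": 310},
}

def get_account_insecure(account_id):
    # The vulnerable pattern: trust whatever ID arrives in the request.
    # Any logged-in user can read any account just by changing the ID.
    return ACCOUNTS.get(account_id)

def get_account(account_id, current_user):
    # The ten-minute fix: confirm the record belongs to the logged-in user
    # before returning it.
    account = ACCOUNTS.get(account_id)
    if account is None or account["owner"] != current_user:
        raise PermissionError("account does not belong to this user")
    return account
```

The difference between the two functions is two lines of code, which is exactly why this kind of bug is so easy to write and so cheap to fix.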
So why does it keep happening? I ask myself as I read these stories. The answer is really rather simple; or rather, it's a combination of many smaller answers. These aren't Matrix-style hackers who break into the hospital director's computer at night and ask to meet under a freeway overpass. The truth is that it's relatively hard to "break" into a system without the proper credentials and without knowing a fair amount about how the system works. A proper system, that is: one that was built with security in mind from the beginning. That, however, seems harder and harder to find nowadays.
Because Security Isn't Pretty
Security isn't flashy, and it's expensive in both time and resources. That doesn't bode well in the monthly meeting where you have to break down what you worked on in the past 7 days. You could spend a week combing through every single process on your website, find one truly dangerous vulnerability, patch it up, and be a hero... in your mind, only to be reprimanded for taking so long to do it. I've seen it happen, and it saddens this programmer's heart a bit. I take my job seriously, and I try to make the best product that I can, so when I see someone disrespect that work, I take it personally.
I once worked at a company where a particular website kept getting hacked repeatedly. No one knew why, but in a vicious cycle it would end up getting flagged by anti-virus software, losing the company traffic and money. The solution? Delete whatever file was causing the issue and move on with life. Then a month later it would happen again, and the solution was again to spend 10 minutes on it and move on. Eventually, a developer took it upon himself to figure out why this was happening, on his own time. Because as developers, we do take responsibility for the work we put out there. After a couple of days, he found the cause of the problem. And I mean the real cause, the one that would solve the problem once and for all. He fixed it, and it was never discussed again. That's usually how these things go. Unless you can present it in PowerPoint to your manager's managers, it's pretty much not an important issue.
And this is what I mean by it not being pretty. It's not something you can see, show your managers, and put on a graph.
Lowest Bidder Anyone?
Many times, particularly for government sites, contracts go to the cheapest bidder. This isn't any secret; it's visible to anyone who has ever visited the DMV website and wondered why pretty much every single link just asks you to call a 1-800 number. Even more relevant was the more recent healthcare.gov fiasco, a day-one disaster in which almost nobody could sign up or do much of anything with the website. This doesn't just apply to government sites, however. Many organizations live by this business model. They need something done, so they outsource it to someone else, sometimes in another country, but almost always at a low cost. Why pay $100k for a custom CMS when an up-and-coming company can do it for $15k? Because you get what you pay for, that's why.
I've worked on code that was brought in from another company and almost quit on the spot when I realized that I would be the one maintaining it and building on it. The name of the game is saving money, and all sides play those cards. If it means hiring a startup made up of fresh college grads to save a few thousand, then that's what will be done.
Because Companies Can't Seem To Keep Good Talent
This is a problem that I've seen up close at previous jobs, when the company is "restructuring" and replacing each good developer with two interns. The truth is that college doesn't prepare you for the job. I've worked with interns fresh out of college who had no idea what HTML was and who had never heard of a server. I've also worked with fantastic developers who weren't respected in their positions and so moved on. It's a real shame to see it happen, because they take with them knowledge that can't be reproduced by reading a few emails.
You want the best working on your product, not the cheapest. That's why big companies like Google and Microsoft do what they do: they don't mind spending money on talent, and they don't mind giving fresh young minds a chance either. And when was the last time Google lost 29 million records?
Because Programming Is Hard When You Do It Right
Because programming isn't just a job where you go in from 9 to 5, stay under the radar, collect a paycheck, and move on. At least it shouldn't be. But for many, unfortunately, it is. For me, it's owning a project and making sure it functions to the best of its ability. It's asking questions, taking notes, and getting things figured out and working. When you don't do that is when vulnerabilities are born. Sometimes you have to learn new technologies to keep up, and sometimes you have to work those extra hours to make sure it works. But that's how you do it right.
Here's a story from a few decades ago, but one that is still relevant. It involves a radiation therapy machine called the Therac-25. It was a fine beast, except for the fact that between 1985 and 1987 it was involved in at least six accidents where patients were given massive radiation overdoses, in some cases on the order of a hundred times the intended dose. The failures were chalked up to bad software design and bad development practices: the machine had dropped the hardware safety interlocks of its predecessors and relied entirely on software that couldn't prevent such things from happening. The Therac-25 case is taught in college computer science courses so that future software developers know there are ethical standards they should be held to.
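The lesson usually drawn from the Therac-25 is that software should refuse to act on any inconsistent or out-of-range state rather than assume the operator's console is correct. A minimal sketch of that idea, with made-up names and limits (this is not how the actual machine worked, just an illustration of a software interlock):

```python
# Hypothetical per-treatment dose ceiling, in arbitrary units.
MAX_SAFE_DOSE = 200

def fire_beam(requested_dose, mode, interlock_ok):
    # A software interlock: every precondition is checked explicitly,
    # and any failure refuses to fire instead of proceeding.
    if not interlock_ok:
        raise RuntimeError("hardware interlock not confirmed")
    if mode not in ("xray", "electron"):
        raise ValueError("unknown beam mode")
    if requested_dose <= 0 or requested_dose > MAX_SAFE_DOSE:
        raise ValueError("dose outside safe range")
    return ("fired", mode, requested_dose)
```

The design choice worth noting is that the safe path is the only path that reaches the final return; there is no way to fire with an unchecked state, which is exactly the property the Therac-25's software lacked.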
Because People Let It Happen
Websites get hacked because the people running them allow them to get hacked. Because a company never bothers to change the password to the super-important back end that controls all of the data, a password that dozens of ex-employees still have in their hands. Take the CitiBank hack, for example: a proper QA team would have caught that bug long before the page hit the internet. Websites get hacked because they're made by regular people, under stress from made-up deadlines set by managers who want to look good, who may or may not have the skills to properly secure them. That is the sad reality, and it will unfortunately continue until companies decide to do things the right way.
Walter Guevara is a software engineer, startup founder, and programming instructor at a coding bootcamp. He is currently building things that don't yet exist.