The definition of "computer scientist" has changed many times over the past century, and even more so now, with the popularization of the field and the various movements springing up with "teach code" and "anyone can be a programmer!" as their slogans. Year by year, we're narrowing down what it means to be a scientist of technology. It's starting to mean: to code, to program, to drag and drop buttons on a screen, or "canvas", as it were. It's starting to mean: to store data in a database, and to configure a connection string. And more importantly, it means to delve into it strictly in a work environment, taking it out of our real world.
And that's a relatively new thing. It's a sign of the times, really. Anything that's popularized will eventually cease to be what it once was. It will be morphed and manipulated, pushed and pulled, until it resembles something quantifiable. Something we can buy an online course for, for $39.99. We see it all the time in many other aspects of life. Religion, for example, once a personal and life-defining concept, can now be tuned in to on channel 'x' and heightened by amount 'y'. Health can be delivered to your door for a monthly fee as well. And for the better part of a decade, I followed that same model with computer science. I didn't question it; I just fell into it. Maybe as a child I had different intentions for my adult self. But eventually, we all give in. We want to fit in and be normal. And so I too called myself a computer scientist for some time.
'science' - the intellectual and practical activity encompassing the systematic study of the structure and behavior of the physical and natural world through observation and experiment.
We call it science now, however. We refer to ourselves as scientists, envisioning lab coats and expensive laptops, heightening our sense of self-worth. But we're not scientists. We don't study any world but the one confined to an office building and a 4 x 4 cubicle, and our observations amount to visiting a website and copying and pasting things that we don't ourselves understand. Our experimentation revolves around figuring out why someone else's script doesn't work in our environment, and feeling successful once it does. But we still call it science.
That's not how it once was, however. For one, we didn't give it a name. We didn't romanticize it. And you couldn't just "study" it, because it didn't exist yet. You needed to understand the most basic of principles before you even attempted to begin, and in many cases, you had to create it yourself. That was true of all science. Many disciplines still follow those guidelines by their very nature. Biologists still need to put boots to the dirt, collect samples, analyze, learn, and listen. And not long ago, computer scientists had to dig up spare vacuum tubes in order to make their creations come to life.
The scientists of today think deeply instead of clearly. One must be sane to think clearly, but one can think deeply and be quite insane. - Nikola Tesla
Society is different now, though. We learn a trade, or memorize one I should say, in order to complete arbitrary tasks, usually for financial gain. We learn to hate this science and to ignore it once the workday is done. And eventually, we forget about the science altogether. Only the utilitarian task remains: the same 'foreach(var i = 0...', the repeated 'if (textbox.text != ''...)'. But it doesn't have to be that way. You don't need a degree in mathematics to study mathematics. You don't need a Ph.D. in biology to look for patterns in cell replication and use your computer science skills to bring them to life.
So let's take a minute to look beyond the "computer". Beyond the for loop and the if-else, to something less practical and more important. Alan Turing is one of the more memorable names in the world of computer science, for many reasons, such as his work in cryptography and his Turing machine, which can be considered the first functional model of what we now call a computer. We learn of him in college, we make movies about his life, and again, we romanticize the idea of technology. But I think that muddies the water a bit, as his skill isn't really something that can be pinpointed and given an identifier.
But Alan Turing wasn't just a computer scientist. He didn't sit in a cubicle writing decrypt functions for 8 hours a day. He studied the nature, and the physical reality, of this thing we call technology. The science, if you will. We know him for his work on computers, but he was also a mathematician, a logician, and a mathematical biologist. And that's what we lost along the way. We left behind the bits and pieces of science and understanding, and kept the shell around them, which is empty and void of knowledge. Ask any programmer right now how a computer works, and he will tell you just that and nothing else about its nature. It works, because it needs to work. Because it starts with A, and because it ends in B.
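And that nature is worth a look, because it's surprisingly small. A Turing machine is nothing but a tape, a read/write head, and a table of rules. Here's a minimal sketch in Python (the state names, the helper, and the bit-inverting example machine are my own illustration, not Turing's notation):

```python
# A minimal sketch of a Turing machine: a tape of symbols, a head,
# and a transition table mapping (state, symbol) -> (new symbol, move, new state).

def run_turing_machine(tape, rules, state="start", blank="_"):
    """Step the machine until it reaches the 'halt' state; return the tape."""
    tape = list(tape)
    head = 0
    while state != "halt":
        # Reading past either end of the tape yields the blank symbol.
        symbol = tape[head] if 0 <= head < len(tape) else blank
        new_symbol, move, state = rules[(state, symbol)]
        if 0 <= head < len(tape):
            tape[head] = new_symbol
        head += 1 if move == "R" else -1
    return "".join(tape)

# An example machine: invert every bit, moving right; halt on blank.
invert = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}

print(run_turing_machine("1011", invert))  # prints 0100
```

That's the whole trick. Everything a modern computer does reduces, in principle, to a rule table like that one, which is exactly the kind of understanding the cubicle rarely asks of us.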
Ada Lovelace, or the Countess of Lovelace to us mere humans, is regarded as the first computer programmer for her work on the Analytical Engine. She was brought up a mathematician and logician, to steer her away from the footsteps of her father, Lord Byron, in poetry. But she donned many hats. In the end, she was a writer, a mathematician, a logician, and a computer scientist, though she preferred to refer to herself as an "Analyst (& Metaphysician)" and worked in the realm of "poetical science".
I've read the following, and thoroughly enjoyed it and recommend it to anyone who's interested in a time in technology that's getting harder and harder to find nowadays.
And that's my takeaway point. Computer Science is many things to many people. It's a word. It has as little meaning as foreach. Before we begin to call ourselves scientists, we should stop for a minute and try to understand what that actually means.