Learn Programming, Not Programming Languages
At a recent meetup for our local freeCodeCamp group, I heard a question asked that I’ve often seen before. It usually goes something like this:
“Should I learn a language that will keep me relevant in the next 5-10 years or should I learn one that will help me get a job now?”
Industry standards don't change often, so what works now will likely still work in 5-10 years' time. But this question masks a much simpler question that many beginners tend to ask: “What language should I learn?”
So many choices
College freshmen enrolled in Computer Science usually don’t need to worry, since they don’t get to choose. They use whatever the professor decides. But for self-taught developers, it can be easy to get lost. There seem to be so many languages, and plenty of conflicting opinions all over the internet. Quite a few languages are appropriate as a starter language. Some have low initial learning curves, such as Python and Ruby. Others are widely used in industry, such as Java and C#. Many learners get so caught up in deciding which language to learn that they lose sight of their purpose.
What should I choose?
A real life reminder
The question at the meetup inspired me to write this post, but I also got a reminder of this at the start of this semester. I will be taking the Introduction to Operating Systems course, and C is the natural choice for this class. While I had used C++ before in my data structures course, I was not a CS major at the time, so I didn’t put much effort into learning the language (or the material). I passed, but did fairly poorly on the implementation projects.