I've been thinking

On-demand learning is ruining the tech industry.

Everyone has heard about the so-called “types of learning styles”: auditory, visual, kinesthetic, and so on. Almost everything has been said about them. But what if there is a different categorization of learning? Maybe one that gives more insight? Well, I think there is. This time we split the act of learning into just two categories. The first one I’ll call "preliminary learning": the classic learning we do in school, where we acquire knowledge that might be useful in the future, without being 100% confident that it will in fact be useful. The second category is "on-demand learning": the act of acquiring knowledge when the situation demands it, neither sooner nor later. With these two learning techniques in mind, what can we say about the benefits and drawbacks of each? And how does our modern environment change how we learn?

If you ask a great programmer how to become good at coding, the answer will most likely be something along the lines of “just start coding: pick a project you are passionate about, start coding, fail, and learn from your mistakes”. Good advice. That’s how many of the best programmers were made. But, if you think about it, whenever you get stuck while coding, there is always Stack Overflow, documentation, YouTube, and hundreds of other online resources to help you. You are never learning alone. That’s how, little by little, bug by bug, tutorial by tutorial, one becomes a great programmer. This is a perfect example of on-demand learning going well. Maybe even the best example, because self-learning at this scale is a relatively new strategy that the internet made possible. The immediate access to information and the low cost of failure in software make on-demand learning feasible; in fact, not only feasible but highly successful.

There are multiple reasons why on-demand learning is rapidly becoming the default learning strategy in tech. Almost every day there are new technologies to learn: new frameworks, new programming languages, new requirements; even established technologies are in constant metamorphosis. Indeed, on-demand learning makes a lot of sense. I’m not going to spend 20 hours learning the latest JavaScript framework if I’m not 100% sure I’m going to use it in the future. Just look at the emergence of coding bootcamps and online courses with titles along the lines of “get a job in tech in one month”. Even some prominent universities have been gradually taking a more hands-on approach to learning.

The problem starts when the undeniable success of on-demand learning in the web development world spreads uncontrollably to other areas. People start to perceive this technique as the optimal learning strategy regardless of the situation. In the blink of an eye, people, companies, and institutions appear to forget its many downsides, and the tremendous power of “traditional” preliminary learning.

Needless to say, both techniques are useful in different ways. But when you are confronted with a new piece of knowledge, do you know which strategy to use? Or how to use it? Those aren’t trivial questions. Sure, you might have a general idea or even a preference. But there are fundamental limitations to each approach. So, to gain more insight, let’s make another categorization, not of learning styles, but of things to learn. Let’s put everything one can learn onto a generality spectrum. At one end of the spectrum, we have all the specific pieces of knowledge around a niche subject: knowledge that doesn’t translate to other areas. At the opposite end, we have knowledge and skills that are useful in multiple situations. As an example, things like logic and math are on the general side, while things like programming languages or a particular IDE are on the specific side of the spectrum.

[Figure: the generality spectrum, from specific to general knowledge]

No matter how you conceptualize this distinction — specific vs. general, human knowledge vs. nature knowledge, or concepts vs. tools — most people are aware of it. But being aware is not enough. A good mental framework not only helps in understanding, but also shows deeper connections. Therefore, I will stick to the more pragmatic categorization of "concepts vs. tools" and see where it leads us.

Tools are things you use to get work done, or at least to make getting work done easier. Without tools, you can’t do anything productive. Concepts, on the other hand, are much more abstract. They mostly aren’t necessary for any specific task but are helpful in many different ones. Let’s give an example. Imagine a software engineer who just got his coding bootcamp certificate. He has no other coding experience or formal education. If this hypothetical developer is determined enough, he could probably build a complex and fully functional web application in a few weeks. He doesn’t need to know any computer science theory, just the tools and frameworks. But, of course, the website would probably be faster and more reliable if he knew about concepts like algorithm complexity and data structures, instead of only tools like React and Node.
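To make that concrete, here is a minimal, hypothetical sketch in TypeScript (chosen only because the example mentions React and Node; the function and variable names are made up). The framework is irrelevant here: the difference between the two versions comes purely from a concept, namely data structures and algorithmic complexity.

// Hypothetical scenario: checking which of a user's follower IDs are already
// cached locally.
const cachedIds: number[] = Array.from({ length: 100_000 }, (_, i) => i);

// Tool-level solution: Array.includes scans the whole array on every call,
// so filtering n followers against m cached ids costs O(n * m).
function findCachedSlow(followerIds: number[]): number[] {
  return followerIds.filter((id) => cachedIds.includes(id));
}

// Concept-level solution: a Set gives average O(1) lookups, so after building
// the set once, the same filter costs roughly O(n + m).
const cachedSet = new Set(cachedIds);
function findCachedFast(followerIds: number[]): number[] {
  return followerIds.filter((id) => cachedSet.has(id));
}

Someone who has only learned the tool will happily ship the first version and it will work; someone who has also learned the concept will reach for the second version without thinking twice, and the application will scale far better.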

This much, I assume, won’t be that controversial. But what are the implications? First, if concepts aren’t necessary to perform any given task, you will most likely only learn tools while completing that task, never concepts. As a result, you can spend your entire career learning just the minimum necessary to complete your work, which is primarily tools. Sure, if you take this attitude, you will probably be fine, but you will never know how much better your work could have been if you had taken the time to learn concepts.

Putting it bluntly, on-demand learning is excellent for learning tools, but it sucks at learning concepts. The more general a piece of knowledge is, the less likely you are to learn it with an on-demand approach. Preliminary learning, on the other hand, is quite bad for learning tools but is the only way of learning concepts. What blows my mind is that for most of us preliminary learning stops after college. That’s a big deal! Of course, I don’t blame the individuals; in the modern tech industry it is increasingly difficult to find time to learn concepts. It is hard to justify spending time on things that maybe, just maybe, will be useful in the future.

The worst part is that the industry seems to be moving even further in the direction of on-demand learning. And don’t get me wrong, we need people who are simply proficient and productive with tools. For some jobs, tools are enough. But what happens when companies and institutions value productivity and tool knowledge more than conceptual understanding? Well, I believe a lot of the problems in modern commercial software can be traced back to exactly that. Just imagine how many bugs and security issues stem from a lack of basic conceptual understanding. Most of the time, a tool can be used much more effectively if you understand the concepts behind it.

Fundamentally, this emphasis on short-term productivity is a problem rooted deep in modern tech culture. Maybe we can survive with this obsession in the short term, but sooner or later it will catch up with us. In fact, we can already see some of its effects. Just look at all the security vulnerabilities being discovered every day. Not to mention that every piece of software seems to get more and more bloated by the day. And maybe this approach has relatively harmless effects on web development, but what about more essential areas? Would you trust someone to take an on-demand learning approach to developing healthcare software? What about infrastructure? Probably not; you should expect deep conceptual understanding from someone doing critical work. But where do we draw the line? What is safe enough to allow on-demand learning? There is no straightforward answer, but there are some cultural changes that might steer the ship in the right direction.

If there is one thing worth taking away from this article, it is that taking the time to do preliminary learning really pays off in the long run. In the moment, it may seem like a waste of time. But trust me, that’s the productivity trap. When companies and individuals acknowledge the importance of conceptual understanding, many of the problems mentioned earlier will solve themselves. I predict that companies that value conceptual understanding will eventually gain a considerable advantage. In fact, I firmly believe that preliminary learning should be part of the paid workload. But regardless of that, college should definitely not be the end of your concept learning, but just one step in a lifelong journey.

"Living is worthwhile if one can contribute
in some small way to this endless chain of progress."
- Paul Dirac