Taking computer science courses is already paying off in my career. Nothing too significant (yet), but I am witnessing small wins.
For example, this past summer I suffered through HPCA (High Performance Computer Architecture), a course historically offered only in the longer semesters (i.e. fall and spring). In the course, I learned a lot of theory: barrier synchronization, processing pipelines, instruction scheduling, multi-level caches, branch predictors, cache coherence protocols, and much more (much of which I have already purged from my memory). However, since most of that knowledge was primarily theory, I've been able to directly apply very little of it to my day-to-day job.
Until a couple days ago.
While at work, a co-worker pointed out the `unlikely` calls sprinkled throughout our C code base, used as branch prediction hints. And thanks to theory, I was able to explain to another co-worker how the compiler (with its understanding of the underlying CPU architecture) rearranges the assembly (or machine code) to avoid branch mispredictions: when the wrong instruction is fetched, the processing pipeline must flush the speculatively fetched instructions behind the mispredicted branch, temporarily stalling the pipeline and ultimately reducing throughput.
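To make that concrete, here's a minimal sketch of how `unlikely` is commonly defined on GCC and Clang (via the `__builtin_expect` builtin); the exact macro in our code base may differ, and `process` below is just a made-up example:

```c
#include <stddef.h>
#include <stdio.h>

/* Common GCC/Clang definitions: __builtin_expect tells the compiler
 * which branch outcome to expect, so it can lay out the hot path as
 * the fall-through (straight-line) code. */
#define likely(x)   __builtin_expect(!!(x), 1)
#define unlikely(x) __builtin_expect(!!(x), 0)

/* Hypothetical example: the error path almost never runs, so we hint
 * that the condition is expected to be false. */
int process(const char *buf, size_t len)
{
    if (unlikely(buf == NULL || len == 0)) {
        fprintf(stderr, "bad input\n");
        return -1;
    }

    /* Hot path: compiled as the fall-through, keeping the pipeline fed. */
    size_t sum = 0;
    for (size_t i = 0; i < len; i++)
        sum += (unsigned char)buf[i];
    return (int)(sum & 0x7f);
}
```

With the hint, the compiler can move the error path out of line, so the common case runs as straight-line code that the fetch stage tends to predict correctly.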
And this is just one example of how taking my master's has paid off. There have been a number of other situations (like understanding shared memory and interprocess communication) where I've been able to apply theory from academia to practice at work.
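For the curious, here's a minimal sketch of the kind of POSIX shared-memory setup I mean; the name and size are made up for illustration, and a second process opening the same name would see the same bytes:

```c
#include <fcntl.h>
#include <stdio.h>
#include <string.h>
#include <sys/mman.h>
#include <unistd.h>

#define SHM_NAME "/demo_shm"  /* hypothetical name */
#define SHM_SIZE 4096

int main(void)
{
    /* Create (or open) a named shared memory object and size it. */
    int fd = shm_open(SHM_NAME, O_CREAT | O_RDWR, 0600);
    if (fd == -1) { perror("shm_open"); return 1; }
    if (ftruncate(fd, SHM_SIZE) == -1) { perror("ftruncate"); return 1; }

    /* Map it into this process's address space; other processes
     * mapping the same object share these pages. */
    char *mem = mmap(NULL, SHM_SIZE, PROT_READ | PROT_WRITE,
                     MAP_SHARED, fd, 0);
    if (mem == MAP_FAILED) { perror("mmap"); return 1; }

    strcpy(mem, "hello from process A");

    munmap(mem, SHM_SIZE);
    close(fd);
    shm_unlink(SHM_NAME);  /* remove the name once done */
    return 0;
}
```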
Thanks to the program, I’m feeling much more competent and confident working as a software developer at Amazon Web Services.