Topics commonly missed by non-CS students

My programming education has been almost exclusively self-directed so far (books, the internet, discussions with a couple of friends). I've completed some small personal projects and spent some time experimenting with various languages and concepts. I feel like I have a reasonable grasp on the basics that I've learnt, but I'm aware there are probably gaping holes in my knowledge.

I was wondering if anyone who took a computer science course (or anyone else) could share any insight into the kinds of useful theoretical (or more practical) topics that are taught there that autodidacts commonly miss.

I would assume concepts/conceptual areas are missed for a few reasons:
  • People simply haven't thought about learning them
  • The ideas are difficult to get your head around without some in-depth explanation
  • Despite being useful, the topic is (initially) uninteresting, and requires some extra motivation to get started learning

To give an example, my expectation was that many self-taught people would lack a solid understanding of algorithm design, and separately an understanding of what the computer is actually doing at a low level. To rectify these in a partially combined fashion, I have been working my way through the Kleinberg and Tardos 'Algorithm Design' book and learning the basics of x86 assembly language using 'Programming from the Ground Up' by Jonathan Bartlett (freely distributed at http://download-mirror.savannah.g...grammingGroundUp-1-0-booksize.pdf). The plan is that once I have a sufficient grasp of assembly, I'll try to implement the algorithms using it.

I'm very interested to hear what thoughts you all have on this.

Andrew

Edited by Andrew Reece on
Well, I don't know the answer to your question (it's a hard one).
But, if I were to recommend one book for self-study it would be:

http://www.amazon.com/Computer-Sy...Perspective-Edition/dp/013409266X

because it goes through the whole computer architecture and programming stuff in some detail. It is expensive, though.

Edited by Piotr Madalinski on
Data structures and algorithms are the obvious things, but also usually the easiest to fix -- big-O is not that hard to understand, and Wikipedia will give you an overview of pretty much all the major algorithms (sorting, searching, etc.)
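
To make the big-O point concrete, here's a throwaway Python sketch (my own example, not from any particular course) of the two searches you'd meet first: linear search scans every element (O(n)), while binary search on sorted data halves the range each step (O(log n)).

```python
def linear_search(xs, target):
    # O(n): check every element in turn
    for i, x in enumerate(xs):
        if x == target:
            return i
    return -1

def binary_search(xs, target):
    # O(log n): halve the sorted search range each step
    lo, hi = 0, len(xs) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if xs[mid] == target:
            return mid
        elif xs[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

xs = list(range(0, 100, 2))   # sorted even numbers 0..98
print(linear_search(xs, 42))  # 21
print(binary_search(xs, 42))  # 21
```

Same answer either way, but on a million sorted elements the second one does ~20 comparisons instead of up to a million -- that's the whole idea behind the notation.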

Networking and databases are more difficult -- they're both complex and difficult to really get started on. Not impossible: sqlite is simple enough, and there are plenty of micro-servers to go pick apart on GitHub.
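
For instance, Python ships sqlite in its standard library, so a first taste of SQL is only a few lines (a minimal sketch -- the table and data here are made up for illustration):

```python
import sqlite3

# An in-memory database: enough to get a feel for tables and queries.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE langs (name TEXT, year INTEGER)")
conn.executemany("INSERT INTO langs VALUES (?, ?)",
                 [("C", 1972), ("Pascal", 1970), ("Scheme", 1975)])
rows = conn.execute(
    "SELECT name FROM langs WHERE year < 1973 ORDER BY year").fetchall()
print(rows)  # [('Pascal',), ('C',)]
conn.close()
```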

To be honest, though: you're not missing out on much by not taking a CS degree. Independent effort counts for much more, even if you *are* in school -- the coursework alone will leave you hilariously unprepared for actually doing anything useful. For example, we used assembly in my systems class, but we used an extremely stripped-down assembler in a VM. (I spent more time trying to write a fast division function (PEP8 lacks mul/div instructions) than listening to lecture in that class...) I learned far more about assembly by playing around with compiler output than anything else.
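
(For the curious: the usual trick on a machine without a divide instruction is shift-and-subtract long division. A rough Python model of the idea -- not my actual Pep/8 code, and the function name is my own:)

```python
def divide(dividend, divisor):
    # Restoring shift-and-subtract division for non-negative ints:
    # build the quotient one bit at a time, most significant bit first.
    assert dividend >= 0 and divisor > 0
    quotient, remainder = 0, 0
    for bit in reversed(range(dividend.bit_length())):
        # Shift the next dividend bit into the running remainder.
        remainder = (remainder << 1) | ((dividend >> bit) & 1)
        quotient <<= 1
        if remainder >= divisor:
            remainder -= divisor
            quotient |= 1
    return quotient, remainder

print(divide(100, 7))  # (14, 2)
```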
btaylor2401
Data structures and algorithms are the obvious things, but also usually the easiest to fix -- big-O is not that hard to understand, and Wikipedia will give you an overview of pretty much all the major algorithms (sorting, searching, etc.)

Really? Is the whole topic of data structures and algorithms left out of modern CS courses? If so, what in the name of Your Favourite Deity do they teach?!

Back in the early-to-mid-90s, a typical CS course would include subjects on what I think of as "the basics", including:

  • Data structures and algorithms
  • Digital circuits (combinatorial and sequential)
  • Programming languages and compilers
  • Databases
  • Operating systems
  • Theory of computation (e.g. languages and automata, computability)

Plus a bunch of area-specific subjects like:

  • Numerical analysis
  • AI
  • Logic (e.g. non-classical logics)
  • Graphics

And that's on top of the basic science subjects (e.g. calculus, statistics, discrete mathematics).

Things will have changed because different stuff is important (e.g. networking/web stuff is arguably more important than numerical analysis for most careers, plus new fields like cloud computing and bioinformatics), but I would hope that the basics are still the basics.

azmr
To give an example, my expectation was that many self-taught people [...]

My experience is that the one drawback of being entirely self-taught is that you're not forced to study things which are important but whose relevance isn't obvious at this very moment. One obvious example is archaic data structures which aren't useful in the modern era, but teach something important about the way that new data structures are designed or analysed.
Pseudonym73
btaylor2401
Data structures and algorithms are the obvious things, but also usually the easiest to fix -- big-O is not that hard to understand, and Wikipedia will give you an overview of pretty much all the major algorithms (sorting, searching, etc.)

Really? Is the whole topic of data structures and algorithms left out of modern CS courses? If so, what in the name of Your Favourite Deity do they teach?!

No? But azmr was asking what *non*-CS students would miss, and I've yet to see Data Structures and Algorithms offered as a core course. (Though, part of the discrete-math-lite course we call "Liberal Arts Math" (yes, seriously) comes close.)

Good points on compilers and theory of computation, I'd forgotten about that. I had a really good book for theory that walked you through lambda calculus and automata and proving the halting problem by having you implement them, need to dig that up. (It was in Ruby, iirc.)

EDIT: Also, if you're not doing the college thing *at all* (or at least not in a STEM field), study statistics. It's nowhere near as intuitive as it looks, almost everyone gets it wrong, and it is *incredibly* powerful once you get it. (Systems design in games is almost entirely statistics -- probabilities and distributions. If you want to know what your profiling results *actually mean*, you need to understand testing, etc.)
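
As a tiny example of why the spread matters as much as the average (numbers invented for illustration): two profiling runs can have identical mean frame times while one is far noisier, and only the variance tells you that.

```python
import statistics

# Two hypothetical sets of frame times (ms) with the same mean;
# the spread is what tells you whether the difference is real.
run_a = [16.1, 16.3, 15.9, 16.2, 16.0, 16.1]
run_b = [12.0, 21.0, 14.5, 18.2, 11.9, 19.0]

for name, samples in [("A", run_a), ("B", run_b)]:
    mean = statistics.mean(samples)
    stdev = statistics.stdev(samples)
    print(f"run {name}: mean={mean:.2f} ms, stdev={stdev:.2f} ms")
```

Both runs average 16.1 ms, but run B swings by several milliseconds per frame -- report only the mean and you'd call them identical.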

Edited by Bryan Taylor on
Formal education is always a bit different. Back in the day, before video streaming/YouTube etc., some CS didactics in Germany were ridiculous. Well, you got the paper, which gives you a job/status and so on.

Now 50% of the children in Europe go to university, and many are getting two or more master's degrees, but I don't think kids are any more clever than in the past. I talk a lot with people about their formal education, and many highly qualified people have admitted that 70 to 90% of their formal education was a waste in terms of time and learning efficiency. Conversely, many students think they are getting the best education available in the world, wherever they study. Might be self-protection.
But at least in Germany, that piece of paper is needed. Not many companies trust self-educated people.

I know the case of a German person (which I can prove): he studied CS in the UK, got a master's degree, and applied for a job at a German/Bavarian university. They offered him a salary of around 30k euros p.a. on a one-year contract. I think he now teaches at Oxford in a permanent position for at least three times that money. Poor Germany.

But back to the topic: I think two weeks on this can close gaps:
https://www.quora.com/What-are-th...beginner-to-advance-in-algorithms

Edited by Carsten Holtkamp on Reason: typo
This recommendation may not be on topic. I'm not a computer science guy, so I don't know what I don't know. But I'm learning with each video, so I would recommend Computerphile (and its sister channel Numberphile).
btaylor2401
No? But azmr was asking what *non*-CS students would miss, and I've yet to see Data Structures and Algorithms offered as a core course.

Ah, right, got it. I pass computer science and fail reading comprehension.
My CS course (at the top engineering school in Sweden) had a pretty good first-year curriculum and some decent courses in the master's program. Good in the sense that it exposed me to a lot of stuff I wouldn't have pursued on my own. We did procedural and functional programming (Pascal + Scheme) with a focus on data structures and algorithms the first semester, then followed up with OOP, though the main concern was still data structures. Had we continued with low-level programming/computer architecture and numerical analysis plus some kind of personal project, I would have been pretty happy.

Instead, the second year we had more stuff on OOP and then some kind of team-based project course where we had to choose from a pile of projects and then work at a company or institution for a whole semester doing that stuff. Which in theory sounds like a valuable thing, until you look at the project folder and realize that every single project was some kind of mobile app, web thingy, or "big data" project. Almost every project description was throwing around buzzwords like scrum/agile, UML, OOP and so on. If at least 4-5 of those 30 projects had been 'old-school' programming, I wouldn't complain.
Computer architecture wasn't introduced until the fourth year. By that point I had already taught myself that stuff, and I ended up dropping out from sheer frustration with the curriculum. We did have a course in graphics where we wrote a raytracer and a software renderer from scratch in C, which was probably the best course.

If I could do it all over again I would major in math + physics and do most of the cs stuff on my own.

Edited by Gus Waldo on