Everybody here is right on the money. You go to college or university for the paper, but you teach yourself. I learned fucking nothing at college. In 3 years, I had:
* A bare-bones algorithms course that could have been covered in a week
* An intro C++ course that covered nothing beyond functions, basic datatypes, and basic IO (didn't even cover classes, multi-file programs, compiler shit, anything)
* A follow-up C++ course that went as far as building your own linked list, but taught nothing about memory management.
* A shitty C# course (which was supposed to be the more advanced level of language-oriented classes) that just focused on building UIs the whole semester.
* A shitty game development elective that went through FPS Maker, RPG Maker, and Game Maker. (Though I actually learned something in the latter two, because I was the "scripting guy" for RPG Maker and did a bunch of Ruby, and the professor let me make a game in C++ instead of using Game Maker.)
* An assembly language course using a stack-based virtual computer. This was actually a good one, and taught me a lot about reasoning around a program stack and working with stack frames. The professor sucked, though: the final said we could write a real assembly program for any OS and assembler of our choice, but he never told anybody he would simply fail anything that wasn't written for his specific Windows assembler. I got 50%.
As a professional programmer, I'd estimate that roughly 1% of what I do day-to-day is something I actually learned in college. Everything else was self-taught. You're going to have to work your ass off just doing it: reading is good, but you need to actually program.
The good news is that anything you need to learn, you can teach yourself. Your degree will get your foot in the door, and as long as your actual skill can carry you the rest of the way, you're golden.