First, the fundamentals. If you've got the right mindset, and you know what you want the machine to do, you can probably write something to make it do it in some half-hearted way. The trick is to make it do it fast (algorithmic complexity) and well (formalise the problem, consider corner cases, implement to spec). And that is algorithms, the core of the subject.
Most developers will, admittedly, spend most of their time gluing together other people's libraries. But abstractions leak and you'll need to look inside the black box. You may well need to write your own libraries. And remember that nothing a computer does is magic, so you should be able to get up to your elbows in gubbins on any part. So, even if your day to day job isn't coding clever algorithms, it should be your bread and butter. Put another way, do you want to be a mechanic, or an engineer?
So, algorithms, algorithms, algorithms. That includes data structures and complexity, and all that. I give two thumbs up to Cormen et al's Introduction to Algorithms. It covers the basics and does some very interesting advanced topics, and has lots of references to papers for further stuff.
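To make the complexity point concrete, here's a toy sketch in Python (the function names and data are mine, just for illustration): both functions answer the same question, but one does O(n) work and the other O(log n), and that gap is exactly what the algorithms books teach you to see.

```python
# Two ways to answer "is x in this sorted list?":
# linear scan is O(n); binary search is O(log n).

def linear_search(xs, x):
    """Check every element in turn: O(n) comparisons."""
    for item in xs:
        if item == x:
            return True
    return False

def binary_search(xs, x):
    """Repeatedly halve the sorted search space: O(log n) comparisons."""
    lo, hi = 0, len(xs)
    while lo < hi:
        mid = (lo + hi) // 2
        if xs[mid] == x:
            return True
        if xs[mid] < x:
            lo = mid + 1
        else:
            hi = mid
    return False

data = list(range(0, 1_000_000, 2))  # the even numbers, already sorted
print(binary_search(data, 123456))   # True, found in ~20 comparisons
print(binary_search(data, 123457))   # False: the odd numbers aren't there
```

On a million elements that's roughly twenty comparisons instead of half a million, and the same shape of argument scales all the way up to the hard stuff.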
Add basic computer science theory too, as it provides a rich toolkit of ideas, plus useful vocabulary. Computation theory with regular languages and context-free grammars, lambda calculus, that kind of thing. First-order predicate calculus. Probably learn ML, Haskell and/or Scheme.
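For a small taste of the lambda calculus end of this, here's a sketch in Python rather than ML/Haskell/Scheme (same idea, more familiar syntax): Church numerals, where a number n is just "apply a function n times".

```python
# Church numerals, straight out of the lambda calculus:
# numbers represented purely as functions.

zero = lambda f: lambda x: x                              # apply f zero times
succ = lambda n: (lambda f: lambda x: f(n(f)(x)))         # one more application
add  = lambda m: lambda n: (lambda f: lambda x: m(f)(n(f)(x)))

def to_int(n):
    """Convert a Church numeral back to an ordinary int by counting applications."""
    return n(lambda k: k + 1)(0)

one = succ(zero)
two = succ(one)
print(to_int(add(two)(two)))  # 4
```

It looks like a party trick, but it's the kind of thing that permanently changes how you think about what a "value" or a "function" is.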
Eventually, it all hits hardware, so learn computer architecture. Hennessy and Patterson will do the trick. Great book. Readings in Computer Architecture was also pretty awesome. Digital logic and maybe a little VLSI help round it out. Abstractions leak, so it's still useful if you never pick up a soldering iron.
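The digital logic end of this is simpler than it sounds. As a sketch (in Python, standing in for actual gates), here's the adder circuit every CPU contains, built from nothing but XOR, AND and OR:

```python
# Digital logic in miniature: a half adder and a full adder
# built from the boolean gates hardware actually uses.

def half_adder(a, b):
    """Add two bits: the sum bit is XOR, the carry bit is AND."""
    return a ^ b, a & b

def full_adder(a, b, carry_in):
    """Chain two half adders; OR combines the two possible carries."""
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, carry_in)
    return s2, c1 | c2

# 1 + 1 + carry 1 = binary 11: sum bit 1, carry out 1
print(full_adder(1, 1, 1))  # (1, 1)
```

String 32 of those together and you've got the adder in an ALU. Seeing that arithmetic bottoms out in a handful of gates is a good inoculation against treating the machine as magic.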
Then there are the other staples of a traditional computer science undergrad degree: Graphics, security and cryptography, networking, compilers, databases, etc. They pretty much all have interesting ideas associated with them. Learn a bit of all of them, and dig deeper on the ones that take your fancy.
You've probably noticed that I'm being heavy on the theory, and very light on the specific technologies. That's because it really is all about the ideas. The technologies keep changing, and while ideas change too, it's at a much slower rate, and it allows you to see those things that are invariant.
Learning specific languages is necessary, of course. They're fundamental tools, and if you can't use a language effectively, things are pretty bad. Moreover, some languages are really effective ways of getting across concepts. On the other hand, some languages are sufficiently baroque that people become experts in the language and its tricks, not programming. *cough* C++ *cough* People can end up thinking they're designing in the abstract, when in fact they're designing in a particular language. Obviously, what you design must be implementable in the language used, and sometimes it makes sense to play to the language's features and idioms, but fundamentally, you should be designing something that architecturally makes sense even to someone who doesn't know the language. Again, as an example, some people have their concept of what object orientation should be broken by C++.
So, learn languages, but I'd recommend wide rather than deep. Both is best.
Then there are specialised algorithmic areas that have some really interesting stuff, but don't show up on the wider radar much. Good pseudorandom number generators and hashes. Compression and error detecting/correcting codes. That kind of stuff. Between my undergrad and now, machine learning has become a real subject with lots of interesting things to learn. Outside but near to computer science, there's plenty of discrete maths. Play with stuff like game theory, perhaps?
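To show how small and strange this corner can be, here's xorshift32, a classic toy generator due to Marsaglia, sketched in Python. Three shifts and three XORs give a surprisingly decent pseudorandom sequence (and, to be clear, it's illustrative only: never use something like this where security matters).

```python
# xorshift32 (Marsaglia): a full pseudorandom generator in three lines
# of bit-twiddling. Statistically passable for toys; NOT cryptographic.

def xorshift32(state):
    """One step of xorshift32; state must be a non-zero 32-bit int."""
    state ^= (state << 13) & 0xFFFFFFFF
    state ^= state >> 17
    state ^= (state << 5) & 0xFFFFFFFF
    return state

def stream(seed, n):
    """Generate n successive values from the given non-zero seed."""
    out, s = [], seed
    for _ in range(n):
        s = xorshift32(s)
        out.append(s)
    return out

# Deterministic: the same seed always gives the same sequence.
print(stream(42, 3) == stream(42, 3))  # True
```

Understanding why those particular shift amounts work, and why the state must never be zero, is a pleasant little rabbit hole in itself.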
A lot of this is about getting the best possible toolkit to think with. So, practice is useful. There are lots of puzzly things that both give you practice, and may throw you new stuff too. Project Euler is fun, as are competitions like Google's Code Jam, and there is an apparently never-ending set of interview puzzles, all of which make good exercises. I'm sure you can find more.
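For flavour, Project Euler's very first problem asks for the sum of all multiples of 3 or 5 below 1000. A sketch of both the obvious brute force and the neater closed form (via inclusion-exclusion and the arithmetic-series formula):

```python
# Project Euler problem 1: sum the multiples of 3 or 5 below 1000.
# Brute force is a one-liner; inclusion-exclusion with the
# arithmetic-series formula answers it in constant time.

def brute_force(limit):
    return sum(n for n in range(limit) if n % 3 == 0 or n % 5 == 0)

def closed_form(limit):
    def sum_multiples(k):
        m = (limit - 1) // k          # how many multiples of k lie below limit
        return k * m * (m + 1) // 2   # k * (1 + 2 + ... + m)
    # Count multiples of 3 and of 5, then un-double-count multiples of 15.
    return sum_multiples(3) + sum_multiples(5) - sum_multiples(15)

print(brute_force(1000), closed_form(1000))  # 233168 233168
```

The point of the exercise isn't the answer; it's noticing that the second solution exists, and that it still works when the limit has fifteen digits and the loop doesn't.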
This has been fairly algorithmically-focused. There's a lot of process and design stuff that is incredibly important, but I'm not sure it's something that's easy to pick up from books. There are books like Code Complete which I guess are worth reading, but I think this is a particular area where experience seems to do its thing, or at least it's not well-enough served by books.
On the other hand, I'm actually quite positive about books on management and leadership. Too many people treat managing teams or leading projects as this incidental thing they fall into. Whether or not you like it or are interested in it, if you're going to do it, do it right. As managers like to write documents, there are plenty of decent business books in this area, and it's worth reading a few. Even if you're not managing, you can understand what your manager should be doing, how to make their life easier, and use them to make your life easier. Books can be useful because they allow you, even in a possibly dysfunctional management environment, to understand how things can and should work.
I guess the idea here is to be an idea infovore. Try to keep learning new things, but particularly favour those things that give you new ways of looking at the world, new tricks, and ways to get to the nub of it all.