I'm programming a synthesizer right now, and I make music as a hobby, so I want a dedicated machine for audio processing that can handle all of my music stuff. I'm totally willing to program my own DAW from scratch, because I only need very simple tools. Everything I've ever used is either bloated, buggy, or not my style of workflow. All I need is a tracker, sampler and synthesizer with some effects.
The thing about audio programming is that it's a very simple concept, yet most systems today make it incredibly frustrating for a beginner to get into. You shouldn't need to dig through trenches full of boilerplate and bloat in order to write to an audio buffer. SDL is the best thing I've found so far, but it's far from perfect. And now that I have some working audio code, I can port it to any system or library. It's literally a matter of making a wrapper that can fetch that buffer and write to it.
It may not be technically "correct," or efficient, but I've extended my SDL wrapper so that I can output buffers of pixels to the screen as well. While it may be a pain in the ass to write your own line drawing algorithm, it sure teaches you a lot when you implement everything from scratch. And it's plenty fast for simple 2D games too. I've been working on a sort of cellular automata sandbox game; if you've ever played Powder Game (dan-ball.jp), it's kind of like that.
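For reference, once the screen is just a pixel buffer, line drawing reduces to deciding which indices to write. This is a sketch of the classic integer-only Bresenham algorithm; the buffer dimensions, pixel format, and helper names are placeholders, not the author's actual code.

```c
#include <stdint.h>
#include <stdlib.h>

/* Placeholder framebuffer: WIDTH x HEIGHT 32-bit pixels. */
#define WIDTH  320
#define HEIGHT 240

static uint32_t pixels[WIDTH * HEIGHT];

/* Bounds-checked single-pixel write. */
static void put_pixel(int x, int y, uint32_t color)
{
    if (x >= 0 && x < WIDTH && y >= 0 && y < HEIGHT)
        pixels[y * WIDTH + x] = color;
}

/* Classic integer Bresenham: steps one pixel at a time, tracking the
 * accumulated error to decide when to advance on each axis. */
static void draw_line(int x0, int y0, int x1, int y1, uint32_t color)
{
    int dx = abs(x1 - x0), sx = x0 < x1 ? 1 : -1;
    int dy = -abs(y1 - y0), sy = y0 < y1 ? 1 : -1;
    int err = dx + dy;
    for (;;) {
        put_pixel(x0, y0, color);
        if (x0 == x1 && y0 == y1)
            break;
        int e2 = 2 * err;
        if (e2 >= dy) { err += dy; x0 += sx; }
        if (e2 <= dx) { err += dx; y0 += sy; }
    }
}
```

No floating point, no dependencies: exactly the kind of thing that's tedious to derive but simple once you see it.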
I've actually become very fond of this way of programming, where everything is just output to a buffer. For someone who does multimedia programming, it makes everything very unified and streamlined. It's a shame that, with the advent of GPUs and modern languages, this sort of thing is falling by the wayside. I think it's deprived us of an environment for experimenting and learning to program in an interactive, low-level way, where you really have to think about how everything works. If there's one way I could recommend to improve your programming skills, it would be this.
Working on this level also shows how much we still need to improve on the dev-tools side of things. Modern languages have definitely gone way off course in terms of making it easier to solve problems. Take Java or Python, or other "paradigm"-based monstrosities, riddled with superfluous keywords and built-in functionality. C might be ugly, and have lots of platform-specific or undefined behavior, but at least I can understand why those design decisions might have been made. I cannot, however, understand how one could decide to design a syntax where:
>[0]*4 allocates a list of four zeros,
>but [[0]]*4 allocates a list of four REFERENCES to ONE inner list containing zero,
>so mutating one "row" silently mutates all four.
I used to admire Python for its terseness as a scripting language, but after using it for some assignments in school it seems completely insane to me. I actually like C a lot for the fact that it's not trying to be anything it's not, or sticking to some dogma. At the end of the day, there is a very minimal set of syntax that you absolutely need to know, and everything else you can pick up easily when you need it.
The actual problems I see are usually very domain-specific. As an example, for my synth I've taken to using X-macro lists to fill out a bunch of switch statements and declarations, gather parameters and submodule names into contiguous arrays, and so on. My sandbox game also has its automata rules defined using a combination of macros, initializer lists and bit-fields. Although it works, the rules of the preprocessor are quite inflexible and messy to work with. I'd really like to see arbitrary compile-time code execution and meta-programming in C. It would solve a lot of problems and extend the usefulness of the language by a wide margin. The stuff that Jonathan Blow is doing with Jai's meta-programming is very interesting to me, though I'm not sure about the rest of the language.

It seems like as people become more disillusioned with the shiny-new-language/library/paradigm meme, we will gradually begin to improve our existing tools, and I think that will have an impact on how we look at hardware and systems programming too. In the future, I dream that I'll basically have a bunch of specialized machines all working together, each with its own architecture and idiosyncrasies: a separate device for audio, synthesis, graphics, programming, etc., all reporting back to a central interface. A bit of a pipe dream, but I think it's definitely possible, and it's something I'm aiming for.
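To illustrate the X-macro pattern mentioned above: a single module list is expanded three different ways, so the enum, the name table, and the switch statement can never drift out of sync. The module names here are made up for the example; the real synth's list would differ.

```c
#include <string.h>

/* One list of synth modules, expanded multiple ways below.
 * Module names are hypothetical examples. */
#define MODULE_LIST \
    X(OSC)          \
    X(FILTER)       \
    X(ENVELOPE)     \
    X(DELAY)

/* Expansion 1: an enum of module ids, plus a count. */
#define X(name) MOD_##name,
enum module_id { MODULE_LIST MOD_COUNT };
#undef X

/* Expansion 2: a contiguous array of name strings, indexed by id. */
#define X(name) #name,
static const char *module_names[MOD_COUNT] = { MODULE_LIST };
#undef X

/* Expansion 3: the same list fills out a switch statement. */
static const char *module_describe(enum module_id id)
{
    switch (id) {
#define X(name) case MOD_##name: return "module " #name;
        MODULE_LIST
#undef X
    default:
        return "unknown";
    }
}
```

Adding a module is now a one-line change to `MODULE_LIST`, which is exactly the appeal; the downside, as noted, is that the preprocessor's expansion rules get messy fast once you need anything fancier than this.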