Hacker News
I mean yeah, I agree, but is it that hard to keep relevant technology in the mix? I'm not saying everything has to be cutting edge!
Many professors view teaching as a secondary obligation. Even if they don't, it takes more time to learn to teach something than simply to learn it. Our field is moving so fast that, outside of the major innovations, it would be quite difficult to keep up as a good teacher on everything while also doing research and doing the actual teaching. In addition, most new tech isn't very interesting or useful. Every couple of months I get another peek at SOTA Python or JS, and the "innovation" is just another layer of duct tape that doesn't really improve much.

Cool tech usually also sees faster adoption in academia. Rust courses were offered at the uni I went to back in 2017, for example. According to my friends still involved with the uni, there has also been a strong shift toward data science/engineering and HCD since then, both fields that saw major practical improvements.

Sure, but are C++ or Java really that outdated? AFAIK that's what most schools teach, maybe with some JavaScript as well. It's not like they're teaching Fortran or COBOL.

And with the advent of AI coding, I'd hope they can spend more time on system design, as that's where I've found new grads are generally lacking.
