You can still continue to master actual software engineering while others spend their time turning their minds into a palimpsest of tricks and lessons for coaxing reasonable output out of one model after another after another, output you'd still have to vet yourself anyway.
While I think a lot of the AI hype is just hype - everyone saying these things has _hitherto untold riches_ levels of financial incentive to say them - I think it's also undeniable that LLMs speed up many aspects of coding.
I also think that AI might be the beginning of the end of copyright. Before, everyone with money had a clear and tremendous incentive to keep copyright strong; now, all of a sudden, trillions of dollars are basically predicated on the idea that LLMs aren't violating copyright. Copyleft has been a major tool in the FOSS toolbox. If that's weakening, I don't want free software to ALSO be locked out of agentic programming.
It's the corrupting nature of capitalism laid bare: politicians all over the world are falling over themselves to pave the way for foreign companies to exploit their constituents' IP, a net loss for so many of those constituents.
A true tragedy of the commons unfolding before us.
I get why, and I get why it's the only realistic choice, but it really is showing the weaknesses of modern politics.
I love AI because I love building things and it lets me build more things I like faster.
If anything it's anti-capitalist: for example, I built a software Bluetooth proxy for Docker that let me use the underlying BT device for Home Assistant, even though the HA docs said I'd have to buy a new device. There is no way I'd have done that without AI.
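For context, giving a container access to the host's Bluetooth adapter generally looks something like the following. This is a hedged sketch of the common approach, not my exact proxy setup; it assumes a BlueZ/D-Bus stack on the host and the official Home Assistant image, and the config path is a placeholder:

```shell
# Sketch: share the host Bluetooth adapter with a Home Assistant container
# by mounting the host D-Bus socket, so the container can talk to BlueZ
# (bluetoothd) running on the host. Image tag and paths are illustrative.
docker run -d \
  --name homeassistant \
  --net=host \
  -v /run/dbus:/run/dbus:ro \
  -v /path/to/config:/config \
  ghcr.io/home-assistant/home-assistant:stable
```

The proxy approach goes further than this, but the D-Bus mount is the usual starting point when the docs suggest buying a dedicated dongle instead.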
And I've built many, many random projects that I'd never have thought of doing without AI.