It's not about "middle management" - lots of professional and technical fields operate under a de facto apprenticeship system, with long-term on-the-job training to build up the familiarity and expertise needed to QA a novice's work and catch their common errors.
And working with AI coding or legal assistants feels remarkably like the same experience: an experienced supervisor doing a quality check on a junior employee's output and maybe introducing subtle revisions to the work product.
Unless college could somehow become an experience that makes new junior coders equivalent to senior supervisors QA'ing AI work product, there is really nothing it can do. And that would be by today's standards; by 2030, when today's high school seniors graduate from college, the AIs will probably be far more capable than even that.
I suspect we're going to see this a lot starting now: a whole generation of new laptop-class college grads are going to get shut out of their expected careers because "cheap AI substitutes for new grads" are available. It's probably easiest to get a lump of older workers to retire, maybe early; mid-career workers are anybody's guess, but I suspect a strong culling of anyone who isn't an AI-tool-wielding pro.
But a lot of unemployable yet talented young people are going to be very disruptive one way or another.
I don’t know. That should be true of people with 1-2 years of experience, too. I just think when everything contracts, there is a flight to the “known,” and if you’re still in school, you’re unknown to the workforce. The solve, I think, is a “just do things” and/or apprenticeship model…if you’re technical, or technically inclined, just write code. Even bad code. But again, I think the on-ramp is having someone vouch for you as a known worker, and that creates a chicken-and-egg problem.
https://x.com/airkatakana/status/1926179334005338621