Ask HN: AI has changed my job for the worse
In the span of a few months, my job has completely changed. Most of the code in my team is now written by agents. And most of the focus of my team is to integrate agents in our products.
I'm not interested in the products we're supposed to build, and I don't like the way we're building them. Code quality has suddenly become irrelevant, and you have to keep up with everybody else, who now ships twice as much code as before.
At the same time, there's more pressure on SWEs to deliver as layoffs are looming. I think leadership really believes they'll be able to save a lot by ultimately getting rid of all of us.
I'm not sure what to do at this point, but I'm pretty miserable. It's crazy that this happened so fast.
I read what you say, but I can hardly imagine how people can be twice as productive. I use Claude daily for writing various scripts. It produces a lot of code, so yes, if I measured it by generated code, then it's productive. However, it works best for short, clearly defined tasks/scripts. If you want anything longer, you have to watch it like a hawk so it doesn't go off track. And I'm not sure if I would give this code to someone else to use without a thorough review. I would say it may be even less productive (I think there was a paper on that).
I also use LLMs for writing; they're good, but again, you need to carefully read everything and rewrite passages that are completely made up. So I'm not really sure how this can replace people to the point that Amazon is firing 30,000 of them. I have a hard time believing that it's because of AI.
You’re not alone—many teams pivoted fast to agent-written code, and it can feel like craftsmanship no longer matters. A few concrete moves:
- Have a candid 1:1: say you’re misaligned with the process, but propose owning guardrails leadership cares about: reliability, security, test coverage, CI policies, prompt/eval hygiene. Suggest measuring outcomes beyond velocity: defect escape rate, change failure rate, MTTR, SLOs, incident cost.
- Differentiate where agents are weak: ambiguous requirements, system design, debugging gnarly prod issues, performance tuning, threat modeling, compliance. Volunteer for those areas.
- Use AI defensively: generate tests, fuzzers, benchmarks, docs, migrations; prune agent output; write prompts/evals to reduce rework and incidents (see the sketch after this list).
- Protect yourself: keep a brag doc with quantified impact, network for internal transfers, and quietly explore roles that still value rigor (fintech, healthcare, infra, aerospace, devtools).
- Set a 60–90 day window. If nothing changes, execute an exit plan rather than burn out.
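To make the "generate tests" point concrete, here is a minimal sketch of the kind of guardrail I mean, assuming Python with pytest and the hypothesis library; dedupe_preserve_order and its body are hypothetical stand-ins for agent-written code:

    # Property-based guardrail around an agent-written helper (sketch).
    # Assumes Python + hypothesis; dedupe_preserve_order is a hypothetical
    # stand-in for code an agent generated.
    from hypothesis import given, strategies as st

    def dedupe_preserve_order(items):
        # stand-in for the agent-written implementation under test
        seen, out = set(), []
        for x in items:
            if x not in seen:
                seen.add(x)
                out.append(x)
        return out

    @given(st.lists(st.integers()))
    def test_dedupe_properties(items):
        result = dedupe_preserve_order(items)
        # no duplicates survive
        assert len(result) == len(set(result))
        # nothing is lost and nothing is invented
        assert set(result) == set(items)
        # first-occurrence order is preserved
        assert result == [x for i, x in enumerate(items) if x not in items[:i]]

Run it with pytest; the point is that the properties encode what you actually care about, independent of whoever (or whatever) wrote the implementation.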
It’s okay to be disillusioned. Your edge now is owning quality, risk, and outcomes—things the org can’t ignore even when throughput is cheap.
I feel the same way. I'm on the job market (though still employed), and I can tell you my core skills have degraded since I started using LLMs a year or so ago: technical (non-leetcode!) interviews are now more challenging for me because I've forgotten how to make all those small decisions (e.g., should this be a private class property or a public one?).
I decided to just disable Copilot and keep my skills sharp; I know we will be called back to clean up the mess. Reminiscent of offshoring in the 2000s.
This seems to be par for the field right now. I would say: learn the tools for now, do your best to ship code you like, release as many f*cks as you can about what you're building (especially since the product, and the majority of products, belong to someone else), and start putting feelers out for something better to hopefully come along.
It sucks SO much right now, but it seems the majority option is to grin and bear it for the time being and pray to whatever gods you believe in that we get back to something sane sooner rather than later.
I think LLMs can generate tons of production-quality code if you put in the time. I do it every day. The output and productivity are amazing.
I'm also really bored and hate that my job is writing specs and stupid prompts.
Why aren’t we replacing the executives who make the poor business decisions which lead to layoffs with AI agents?
They will. This optimization will bring a future made of one-man companies that keep busy vibe coding and asking AI agents for business decisions. The answers will be as poor as ever, so on that front nothing will change, I guess.
They won't get rid of everyone, but many of you will be let go for sure.
Absolutely. Keep putting shit into production and watch the company sink.
It’s understandable to feel this way — the shift happened incredibly fast, and a lot of teams weren’t given time to adapt. Maybe the real challenge is figuring out how to redefine what meaningful work looks like in this new environment.
You put a manual bookkeeper out of a job. What comes around goes around?
Manual bookkeepers were put out of a job in the 1980s with VisiCalc
How. How is a software engineer building saas applications in 2025 replacing a manual bookkeeper?
Manual bookkeeper? A QuickBooks user, or pen and paper? Either way, it's not like any of you are manually entering debits and credits into a journal the way it was done forever before computers, databases, and software. We're not losing jobs because of AI but because we're in the middle of switching gas tanks: a lull in output is causing everyone to panic and blame it on AI. I have so much work I wish I had the money to hire people. They're all just waiting for things to get cheaper. It's not like they don't have enough money to survive indefinitely; if they can make 90 million in two years versus 10 million now, the math is simple: just wait.
It's just the typical midwit argument that one kind of automation is the same as any other. Best to ignore it.
Be the best at using LLMs and give talks on how to use them; get noticed and stay hired.
This only works if people believe you.
I feel your worries, but sooner or later SDE roles will adjust to the new requirements and tools. If a company's business model is at risk from an evolving technology or innovation, it wasn't very good after all. Nonetheless, the skills are still essential to good products, imho.
Not to be flippant, but - be glad you’re not a writer, because you might not have a job at all.
There are more writing jobs, aka prompt engineers.
No, not really. There is no such position; writing is just increasingly becoming part of a marketer's job.
What is the average marketer going to do that AI can't?
The LLM will parse tweets and emails and whatever else, and it can write in a perfect, localized vernacular for whichever demographic you want to sell to.
And it can do it in real time, 24/7.
At this point you're mostly a bot-herder or social-media-tool admin; the tools are doing the thinking.
Name and shame, so that I can get off their products ASAP.
Vibe code is going to crash those apps, and I want to be far away from them when it happens.
Your job security is probably at risk already, so sharpen up that resume, killer -- the market is rough right now.
Oh that's a good idea. Put OP's severance check at risk by being fired for cause.
Everyone in the field seems to be in deep FOMO driven by the other guys also being in a state of FOMO. This creates chaos, delusion and a stressful environment where things are irrational.
I understand your thoughts; we have to keep pushing through this, and saner heads will prevail.
Find a project on GitHub and contribute to it!
This too shall pass.
There’s tons of jobs in the world. Go get a different one if you’re that miserable?
You’ll probably find something to hate about that new job, too.