It’s hard for people who haven’t experienced the loss of experts to understand. Not a programmer, but I worked in aerospace engineering for 35 years. The drive to transfer value to execs and other stakeholders by reducing the cost of the people who literally create that value always ends up costing more.
those executives act like parasites. They bring no value and just leech the life from the companies.
executives act like parasites
WE MAED TEH PROFITZ!!!1!!1
which is ironic, since without them the profits would likely soar. Doing bad shit 101 is to pin the consequences of your actions on others and falsely claim the things others have managed to achieve as your own accomplishments.
IMO without execs, employees would get paid for a greater percentage of their labor and profits would go down.
what profits? the ones that end up in the pockets of the executives?
Yes
Executives think they are the most important part of the company. They are high level managers, that is all.
I’d argue the CEO is the most important person, usually. We see dipshits like Musk and turn around and bag on all of them.
Think of a business, doesn’t matter if it’s local or national. How do the employees act? Do they seem happy and like they’re doing useful work? Or do they look downcast and depressed?
Sometimes it’s the local manager staving off corporate bullshit, but company culture mostly rolls down from the CEO. The saying “shit rolls downhill” works both ways.
The CEO and C-suite are the least important people in a company. They can be changed out with relatively little interruption, and it takes a lot longer to see the effects. But if the on-the-ground workforce stops producing, the effects are immediate and long lasting.
Well, yeah, but those costs are for tomorrow’s executive to figure out, we need those profits NOW
It’s utterly bizarre. The customers lose out by receiving an inferior product at the same cost. The workers lose out by having their employment terminated. And even the company loses out by having its reputation squandered. The only people who gain are the executives and the ownership.
This is absolutely by design. The corporate raider playbook is well-read. See: Sears, Fluke, DeWalt, Boeing, HP, Intel, Anker, any company purchased by Vista (RIP Smartsheet, we barely knew ye), and so on. Find a brand with an excellent reputation, gut it, strip mine that goodwill, abandon the husk on a golden parachute, and make sure to not be the one holding the bag.
What happened to Anker?
They were acquired by Opta Group in 2023. Since then, the quality has declined while prices increased. And around the time of their acquisition, they started doing some shady stuff when claiming USB-IF compliance. The cables were blatantly not USB-IF compliant.
Another example: I personally love my Anker GaN Prime power bricks and 737. Unfortunately, among my friends and peers, I am the exception. The Prime chargers are known for incorrectly reading cable eMarkers and then failing to deliver the correct power. This has been an issue for me twice so far, but I was able to work around it.
On a more generic scale (whatever that means), we went from coding serious stuff in Ada with contracts and designs and architectures, to throwing everything in the trash while forgetting any kind of pride and responsibility in less than 50 years. AI is the next step in that global engineering enshittification (I hate that word but it’s appropriate).
Whether AI has a future or not, no one can deny that SWE is an absolute mess of shitty practices. If AI stays as it is, we’re going down with it.
<cough>Boeing<cough>
Everyone. But Boeing did a pretty fucked up job of it.
Imagine a company that fires its software engineers, replaces them with AI-generated code, and then sits back, expecting everything to just work. This is like firing your entire fire department because you installed more smoke detectors. It’s fine until the first real fire happens.
Sure but they’re not going to fire all of them. They’re going to fire 90% then make 10% put out the fires and patch the leaks while working twice as many hours for less pay.
The company will gradually get worse and worse until bankrupt or sold and the c-suite bails with their golden parachutes.
This is a bad analogy.
It would be more akin to firing your fire departments, because you installed automatic hoses in front of everyone’s homes. When a fire starts, the hoses will squirt water towards the fire, but sometimes it’ll miss, sometimes it’ll squirt backwards, sometimes it’ll squirt the neighbour’s house, and sometimes it’ll squirt the fire.
I don’t know. I look at it like firing all your construction contractors after you’ve built out all your stores in a city. You might need some construction trades to maintain your stores, and you might need to relocate a store every once in a while, but you don’t need the same construction staff on hand as you did during the initial build out.
In my experience, you actually need more people to maintain and extend existing software compared to the initial build out.
Usually because of scalability concerns, increasing complexity of the system and technical debt coming due.
Most extension today is enshittification. We’ve also seen major platforms scale to the size of Earth.
If you’re only going to maintain and don’t have a plan to add features beyond duct-taping AI to the software, what use is keeping a dev team at the size you needed when creating new code?
While true, that is a weak analogy. Software rots and needs the constant attention of competent people, or the shit stacks up.
I’m not saying you can fire everyone, but the maintenance team doesn’t need to be the size of the development team if the goal is to only maintain features.
It works for a while. Keep a few seniors and everything will be fine. Then you want new features and that’s when shit hits the fan. Want me to add a few buttons? 1 month because I have to study all the random shit that was generated last week.
Twitter and Tumblr are operating on skeleton crews but are able to make changes.
Craigslist is still around even though it hasn’t changed much since the ’90s.
There is an entire industry of companies that buy old MMOs and maintain them at a low cost for a few remaining players.
Southwest Airlines still runs ticketing on a Windows 95 server.
I think you’ll see more companies accept managed decline as a business strategy.
It’s funny you use southwest as an example in this. I flew with them for the first time this year and it was easily the worst technical experience from an IT perspective that I have ever had. Sure I got from point A to point B, but everything involved with buying the ticket, getting through security, tracking my flight, boarding time, etc was worse than every other flight I’ve been on. The app was awful and basic features like delay notifications or pulling up the digital ticket made an already expensive as hell experience way more stressful. Windows 95 isn’t keeping up
But no one is flying Southwest for a best in class experience. It doesn’t have to be a great system to use, just a system that does the bare minimum.
Twitter, Tumblr, Craigslist: those web sites are feature complete and require low maintenance.
Southwest Airlines: good for them, but if the servers have issues, they will lose billions while trying frantically to find the retired guy who maintained that monster.
Software engineer here. You’re completely wrong. The amount of work it takes to maintain and extend the functionality of existing software is even bigger than the original cost of building it.
Spend some time understanding how software teams work and you’ll understand. There’s a reason C-suites are hoping AI-generated code can replace developers: they can’t hire enough of them.
Is there really a need to extend functionality like there was 10 years ago?
Yes. That’s at least half of the work I do on a daily basis. How else do companies in the same market compete with each other if they can’t add functionality and instead remain static? That’s a quick way to lose market share to your competition.
We’re at a point of effective monopoly and vastly increased costs of creating competition.
The spigot of free money has been turned off, so most projects today need a planned-out ROI, which is why enshittification has become such a big thing recently. Improvement for competition’s sake is out the door unless the incumbent is weak or a jump is needed because the existing revenue stream is collapsing.
What are you going on about?
I don’t work in a space with a monopoly.
My employer doesn’t have free money. They compete in a huge market and earn money while doing so.
Not every company has the business model you described. The world would not run if that was the case.
Yeah, but a lot of the discussion has been about those companies given how well they pay and how dominant they are in the industry.
As a software engineer, I’m perfectly happy waiting around until they have to re-hire all of us at consulting rates because their tech stacks are falling the fuck apart <3
The jabronigrammers before me seem to have made a fine mess without the aid of an AI tool as it is…
deleted by creator
There’s also the tribal knowledge of people who’ve worked somewhere for a few years. There’s always a few people who just know where or how a particular thing works and why it works that way. AI simply cannot replace that.
Institutional knowledge takes years to replace.
deleted by creator
there’s so much negligent work here I swear they did it on purpose.
Depending on the place, it’s the “work insurance” - companies would usually think twice before firing the only person who can understand the spaghetti. Now they won’t need said person to generate “working” code
That’s what I expect if I’m fired and rehired: at least +25% on my salary.
We hired a junior at work from a prestigious university. He uses ChatGPT all the time but denies it. I know that because all his comments in the code are written like some new Tolkien book. Last time I checked his code, I told him it had something like 20 bugs and told him how to fix that because I’m not a bad guy. The next day, he came back with a program that was very very different. Not knowing how to apply my fixes, he used another prompt and the whole thing was different with new bugs. I told my boss I was not wasting time on that shit again.
Well, also if the guy was just dumping AI generated code arbitrarily into your product, that pretty significantly risks the copyright over the entire product into which the generated stuff was integrated (meaning, anyone can do whatever the fuck they want with it).
deleted by creator
I’m an IP attorney who’s been pretty specialized in ML-enabled technologies for a decade now, and I have worked in-house for Fortune 500 companies, so I’m pretty familiar with how these queries are often handled, especially at multinats. There honestly probably isn’t someone in your legal department with all three of: seniority, an understanding of (and the habit of keeping up with) the legal nuances, and an understanding of the underlying technologies. The overlap, in my experience, is few and far between.
Literally anybody who thought about the idea for more than ten seconds already realized this a long time ago; apparently this blog post needed to be written for the people who didn’t do even that…
You underestimate the dumbassery of Pencil-Pushers in tech companies (& also how genuinely sub-human they can be)
MBAs are like surgeons; their every solution is to cut.
Stop giving me bad ideas lol
e: I suck at markdown
To your point: at my last company party I got drunk and kept complimenting people by calling them human.
Although I agree, I think AI code generation is the follow-up mistake. The original mistake was offshoring coding in order to fire qualified engineers.
Not all offshore work is terrible, that’d be a dumb generalization, but there are some terrible shops out there. A few of our clients that opted to offshore are being drowned in absolute trash code. Given that we always have to clean it up anyway, I can see the use case for AI instead of that shop.
I think the core takeaway is you shouldn’t outsource core capabilities. If the code is that critical to your bottom line, pay for quality (which usually means no contractors, local or not).
If you outsource to other developers or AI it means most likely they will care less and/or someone else can just as easily come along and do it too.
…shouldn’t outsource core capabilities.
This right here.
The core takeaway is that, with a few exceptions, the executives still don’t understand jack shit, and when a smooth-talking huckster dazzles them with ridiculous magic to make them super rich, they all follow him to the poke.
Judges and executives understand nothing about computers in 2025. That’s the fucked-up part. AI is just how we’re doing it this time.
Companies that are incompetently led will fail and companies that integrate new AI tools in a productive and useful manner will succeed.
Worrying about AI replacing coders is pointless. Anyone who writes code for a living understands the limitations that these models have. It isn’t going to replace humans for quite a long time.
Language models are hitting some hard limitations, and we’re unlikely to see improvements continue at the same pace.
Transformers, mixture-of-experts, and some training efficiency breakthroughs all happened around the same time, which gave the impression of an AI explosion, but the current models are already taking advantage of all of them, and we’re seeing pretty strong diminishing returns on larger training sets.
So language models, absent a new revolutionary breakthrough, are largely as good as they’re going to get for the foreseeable future.
They’re not replacing software engineers; at best they’re slightly more advanced syntax checkers/LSPs. They may help with junior-developer-level tasks like refactoring or debugging… but they’re not designing applications.
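To put rough numbers on the diminishing-returns point: the scaling-law papers model loss as a power law in training data, so every 10x more data buys a smaller improvement than the last 10x did. Here’s a toy sketch of that shape; the coefficients are made up for illustration only, not measurements from any real model.

```python
# Toy illustration of diminishing returns from bigger training sets.
# Scaling-law work models loss roughly as: loss(N) ~ L_inf + a * N**(-alpha).
# All numbers below are invented purely for illustration.

L_INF = 1.7    # hypothetical irreducible loss
A = 400.0      # hypothetical scale coefficient
ALPHA = 0.3    # hypothetical exponent

def loss(n_tokens: float) -> float:
    """Toy power-law loss as a function of training tokens."""
    return L_INF + A * n_tokens ** (-ALPHA)

prev = None
for n in (1e9, 1e10, 1e11, 1e12, 1e13):
    cur = loss(n)
    gain = "" if prev is None else f"  (improvement: {prev - cur:.3f})"
    print(f"{n:.0e} tokens -> loss {cur:.3f}{gain}")
    prev = cur
# Each 10x jump in data buys roughly half the improvement of the previous
# one, which is the "diminishing returns" curve described above.
```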
Companies that are incompetently led will fail and companies that integrate new AI tools in a productive and useful manner will succeed.
Using AI to lobby for bailouts? Very clever!
What most people forget is that as a programmer/designer/etc., your job is to take what your client/customer tells you they want, listen to them, and then try to give them what they ACTUALLY NEED, which is something I think needs to be highlighted. Most people making requests to programmers don’t really even know what they want, or why they want it. They had some meeting and people decided, ‘Yes, we need the program to do X!’ without realizing that what they’re asking for won’t actually get them the result they want.
AI will be great at giving people exactly what they ask for… but that doesn’t mean it’s what they actually needed…
Great points. Also:
… AI will be great at giving people exactly what they ask for …
Honestly, I’m not even sure about this. With hallucinations and increasingly complex prompts that it fails to handle, it’s just as likely to regurgitate crap. I don’t even know if AI will get to a better state before all of this dev-firing starts to backfire and sours most companies on even touching AI for development.
Humans talk with humans and do their best to come up with solutions. AI takes prompts and looks at historical human datasets to try and determine what a human would do. It’s bound to run into something novel eventually, especially if there aren’t more datasets to pull in because human-generated development solutions become scarce.
AI will never not-require a human to hand hold it. Because AI can never know what’s true.
Because it doesn’t “know” anything. It only has ratios of usage maps between connected entities we call “words”.
Sure, you can run it and hope for the best. But that will fail sooner or later.
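A toy sketch of that “ratios between words” idea. A real LLM is a neural network over tokens, not a lookup table of counts, but what it learns still boils down to “given this context, which continuations were common in the training data”; it has no notion of whether any of them are true. The tiny corpus and the counts here are invented purely for illustration.

```python
# Minimal bigram "model": counts which word tends to follow which.
from collections import Counter, defaultdict

corpus = "the fire department puts out the small fire before the fire spreads".split()

follows: dict[str, Counter] = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def next_word_probs(word: str) -> dict[str, float]:
    """Ratios of what followed `word` in the training text."""
    counts = follows[word]
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

print(next_word_probs("the"))   # {'fire': 0.666..., 'small': 0.333...}
# The "model" only knows which continuations were frequent; it cannot
# tell you whether the sentence it produces is actually true.
```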
AI will. LLM won’t.
Also, LLM doesn’t usually have memory or experience. It’s the first page of Google search every time you put in your tokens. A forever trainee that would never leave that stage in their career.
Human abilities like pattern recognition, intuition, and the accumulation of proven knowledge combine to make us more and more effective at finding the right solution to anything.
The LLM bubble can’t replace that, and it also actively hurts it, as people get distanced from actual knowledge by the code door of the LLM. They learn how to formulate their requests instead of learning how to do the stuff they actually need. This outsourcing makes sense when you need a cookie recipe once a year; it doesn’t when you work in a bakery. What makes the dough behave each way? You don’t need to ask, so you wouldn’t know.
And the difference between asking on Lemmy and asking a chatbot is the supremely convincing manner in which it tells you things, while forums, Q&A boards, and blogs run by people usually have some of those human qualities behind the replies, and also the option for someone else to throw a bag of dicks at the suggestion of formatting your system partition or turning stuff off and on.
That stuff should really get worked out in the agile process as the customer reacts to each phase of the project.
Getting the real requirements nailed down from the start is critical, not just doing the work the customer asked for. Otherwise, you get 6 months into a project and realize you must scrap 90% of the completed work; the requirements from the get-go were bad. The customer never fundamentally understood the problem and you never bothered to ask. Everyone is mad and you lost a repeat customer.
Yeah, but with agile they should be checking the product out when it’s a barely working PoC, to determine if the basic idea is what they expect, and as it advances they should be seeing each stage. You’ll never get the proper requirements by second-guessing what they say.
Yesterday the test team asked me for 3 new features to help them. I thought about it for a few minutes and understood that these features are all incompatible. You can get one and only one. Good luck finding an AI that understands this.
Exactly. And if AI somehow finds a way to do it, end users will find even more ways to do it wrong
I’m just a dabbler at coding, and even I can see that getting rid of programmers and relying on AI will lead to disaster. AI is useful, but only for the smallest scraps of code, because anything bigger gets too muddled. For me, it liked to come up with its own stupid ideas and then insist on getting stuck on them, so I had to constantly reset the conversation. But I managed to have it make a useful little function that I couldn’t have thought up myself, as it used some complex mathematical things.
Also, relying on it is a quick way to kind of get things done without understanding at all how they work. Eventually this will lead to such horrible, insecure code that no one can fix or maintain it. Though maybe that’s a good thing in the end, since it will bring those shitty companies to ruin. Any leadership in those companies should be noted down now, though, so they can’t pretend later that they had nothing to do with it.
Even when I ask AI how to do a process, it frequently responds with answers for the wrong version (even though I gave it the version), parameters that don’t work, hand-wavy answers that are useless, etc.
I find it’s the most useful when asking it for small snippets of code or dealing with boilerplate stuff. Anything more complicated usually results in something broken.
give it a few more years
Yeah, like bitcoins, NFTs, and Tesla driving themselves.
Bitcoins are there, aren’t they?
Yes, still a scam and still useless for most people. We’ve been waiting forever, maybe like AI.
Given that even Stack Overflow is now mostly being answered by AI, don’t expect that to actually get better, unless you’re counting on sensitive coding data being “legally” siphoned from AI users.
Yeah, for many things it’s easier to just give up on having the AI do it. Even if you somehow succeed, what it gives you will likely be such a mess that it’s not worth it.
AI mostly seems useful when you don’t know a specific concept and just need the base ideas. That said, given it’s often confidently wrong and doesn’t involve humans actively steering you toward better ideas, I still find Stack Overflow more helpful. Sometimes the answer to your problem is to stop doing what you are trying to do and attack the problem from a different angle.
Yeah, though for questions people on Stack Overflow would consider stupid or beneath them, AI was superior, as it never gets angry with you. The way I see it, AI is kind of like being able to ask questions of the knowledge it consists of. But since it has also been fed garbage, and it’s all in there like the ingredients of a soup, who knows what affects what, so you really don’t want to trust it too much.
I also find it is best when I’m very specific and give as many identifiers as possible. App name, version, OS, quoted error code, etc.
Like relying on automated systems for aircraft so much. You get things like planes going into landing mode because they think they are close to the runway.
A reason I didn’t see listed: they are just asking for competition. Yes by all means get rid of your most talented people who know how your business is run.
And can reproduce the whole business in a weekend with the help of AI. There are no moats anymore.
I wonder if there will eventually be a real Butlerian Jihad
Maybe after Herbert’s idiot son dies and someone else gets the rights
His books are so shit and totally miss the point of Frank Herbert’s books.
The jihad starts with the tech bros’ butlers, that’d be very poetic
I’m sorry, I mostly agree with the sentiment of the article in a feel-good kind of way, but it’s really written like how people claim bullies will get their comeuppance later in life, but then you actually look them up later and they have high paying jobs and wonderful families. There’s no substance here, just a rant.
The author hints at analogous cases in the past of companies firing all of their engineers and then having to scramble to hire them back, but doesn’t actually get into any specifics. Be specific! Talk through those details. Prove to me the historical cases are sufficiently similar to what we’re starting to see now that justifies the claims of the rest of the article.
Come back in 3 years and the “historical cases” will have appeared.
https://old.reddit.com/r/jobs/comments/1inh2hl/meta_just_laid_off_3600_peopleheres_why_this/
or just wait 14 hours
I disagree. For example:
Now, six months later, you realize that your AI-generated software is riddled with security holes. Whoops! Your database is leaking private financial data like a sieve
We saw this a thousand times before there was AI. AI is like a cheap contractor fresh out of school, and companies that use it extensively will get the same results. It’s a pragmatic thing, not some phantasm about bullies. I have said “I told you so” to previous managers so many times that I trust it will happen again and again.
I’m fine with this. Let it all break, we’ve earned it.
I haven’t seen anybody point this out yet. The owners of tech were never in it for the “tech”. It’s just a tool for them to wiggle their way up to the top. Trying to hit the jackpot so that they can wrest control of society from the current “old rich”.
I just hope people won’t go back to these abusive jobs. The oligarchy that runs the US has shown it is more than happy to lay people off to cool wages and the Fed is more than happy to blame workers getting paid a reasonable amount as the cause of inflation.
Article about bad AI decisions
Thumbnail is AI
Lmao