sureglymop 23 hours ago [-]
At the end of last year I started to feel a little burned out and seemingly had lost my curiosity, motivation and passion for computers.
I started again with something I don't do at my day job, assembly and low level programming. It's been a blessing to learn and realize that things can actually be much simpler at that level. I'm especially interested in anything that actually needs one to drop down to assembly level and can't be done in some low level languages. E.g. implementing coroutines, jit compilation, self modifying binaries etc. I don't think I'll ever be able to use this knowledge professionally but more importantly, it's fun! Any more ideas for fun stuff are welcome. :)
Slow_Hand 1 day ago [-]
For the first time this year (and nearing 40) I'm teaching myself web development and learning to code - and I love it. I like the problem solving, and the creativity, and having the facility to build things for myself.
The threat of LLMs undermining my opportunities in the labor pool is never far from my mind as I move toward a seemingly burning building, but you know what? I will do it regardless, because even though my prospects for employment may be diminished, I'm enjoying the craft, and I like being creative with it.
I intend to embrace LLMs as an augmentation of my will once I get a good grasp on how to code proficiently. Maybe I'm too late for the heyday of coding by hand, but if these tools allow me more power to solve the ACTUAL problem, then that's alright. The point is to solve problems, right? Not to write code.
bluefirebrand 1 day ago [-]
> The point is to solve problems, right?
I want to solve problems correctly and in a high quality manner.
LLMs do not enable me to do this any better in my opinion. They enable me to do it faster* but worse
* I'm not entirely sold on it being actually faster either
jordanbeiber 1 day ago [-]
There’s a place for everything.
Most coding tasks take place outside of pure tech companies, if I’d venture a guess.
And let’s be honest, enterprises in general do not value that quality - and they face very little in terms of technical challenges that can’t be solved by code on stack-overflow or github.
What most enterprises lack is knowledge about themselves though - this is more a business problem than a technical one however.
Slow_Hand 1 day ago [-]
Yeah. It remains an open question if they'll be a net positive in the end. If they end up being helpful, good. And if they end up being mostly a waste of our time, then we'll go back to where we were. More or less.
It seems like there's still some juice to squeeze from this technology, though. So my money is on a net positive. For now.
Even if they end up being bad practice for production code, we can probably agree that they're decent at mock-ups, experimentation, and quick proof of concepts. That at least has some value.
I see a lot of people with zero coding/engineering experience trying to make their own products, and very few of those products with long term staying power. We've got a long way to mature with how we use this tech. This moment feels like the introduction of the home microwave: A lot of terrible, terrible meals were cooked while people briefly forsook their stove to use the miraculous microwave for everything. Eventually people figured out what tasks the microwave was suitable for and then went back to the oven for all but simple re-heating.
bluefirebrand 1 day ago [-]
I've been feeling this a lot
I don't know what I'm going to do next but I have very little interest in or respect for LLM assisted coding, so I think the industry is likely not a good place for me anymore
If I could retire I would but I'm not quite 40. I don't have the savings to stop working now. So I'll figure out something new I guess
What a bummer. I loved my career to this point and I'm very sad that LLMs have ruined it
I guess I'm glad I'm not alone in this
hyperhello 1 day ago [-]
What have they ruined, exactly? You can do all the same things you used to, can’t you?
chickenimprint 1 day ago [-]
You can't get paid doing them.
If coding goes away, decades of experience become worthless instantly. Not all of it, but the vast majority, enough to justify starting over in another career.
In that world, it will have become more cost-effective for most companies to spend most of their budget on inference vendors and employ a few low-paid LLM wranglers, even if the final output is of terrible quality. No point in competing for that kind of employment experience with that kind of pay.
dasil003 1 day ago [-]
I really don't get this point of view at all. I acknowledge that two years into my quarter century of experience, most of what I knew was easily replaceable by the AI of today. After two decades of experience, however, syntax and specific algorithm and language knowledge was perhaps 10% of my value, nowhere near the vast majority.
The idea that low-paid LLM wranglers are going to push out the experienced engineers just doesn't wash. What I think is much more likely to happen is that the number of software engineers greatly reduces, but the remaining ones actually get paid more, because writing code is no longer the long pole, and having fewer minds designing the system at a high level will allow for more cohesive higher-level design and less focus on local artisanal code quality.
To be honest, AI is just the catalyst and excuse for unwinding the overhiring that happened during the gold rush of the last 20 years: the internet and smartphone revolutions, zero interest rates, and the pandemic effect.
tetris11 23 hours ago [-]
> language knowledge was perhaps 10% of my value, nowhere near the vast majority.
Do you not see LLMs catching up with your experience fast?
You might not lose your job, but you'll definitely have to take a pay cut
chickenimprint 21 hours ago [-]
> What I think is much more likely to happen is the number of software engineers greatly reduces, but the remaining ones actually get paid more.
You realize that this is contradictory, right? If the number of competitors remains the same, yet there are far fewer jobs, it's a buyer's market: companies have to offer very little to find someone desperate enough.
> It will allow for more cohesive higher-level design, and less focus on local artisanal code quality.
I don't buy this, LLM code is extremely bloated. It never reuses abstractions or comes up with novel designs to simplify systems. It can't say no, it just keeps bolting on code. In a very very abstract sense you might be right, but that's outside the realm of engineering, that's product design.
dasil003 17 hours ago [-]
You raise some good points about the economics, that's where I feel the least confident, but let me explain my reasoning.
Software has eaten the world, and thus the value of maintaining software has never been higher. Engineers are the people who understand how software works. Therefore, unless we move away from software, the value of software engineering remains high.
AI does not reduce software; it increases the amount of software, makes messier software, and generally increases the surface area of what needs to be maintained. I could be wrong, but as impressive as LLMs' language and code processing capabilities are, I believe there is a huge chasm that will likely never be crossed between the human intent of systems and their implementation that only human engineers can actually bridge. And even if I'm wrong, there's another headwind, which is that, as Simon Willison has pointed out, you can't hold an LLM accountable, and therefore corporate leaders are very unlikely to put AI in any position of power, because all the experience and levers they have for control are based on millennia of evolution and a shared understanding of human experience; in short they want a throat to choke.
The other factor is that while AI can clearly replace rote coding today, I think the demos oversell the utility of that software. Sure, it's fine for getting started, but you quickly paint yourself into a corner if you attempt to run a business on that code over time, where UX cohesion, operational stability, and data integrity are paramount and not something that can be solved for without a lot of knowledge and guardrails.
So, net of all this, where I think we land is that a lot of jobs based purely on knowledge of one slow-changing system and specific code syntax will go away, but there will be engineers who maintain all the same code, they'll just cover more scope with LLM assisted tools. You put your finger on something: I do believe this moves engineering closer to product design. But I still think there's a huge amount on the engineering side that LLMs won't be able to do any time soon (for both the technical and the social reasons stated above), and ultimately I don't see the boundary the same way you do, as software engineers we have always had to justify our systems by their real world interaction.
chickenimprint 15 hours ago [-]
> Software is everywhere and thus the value of maintaining software and the value of software engineering remains high.
This is an unfinished argument. What if we get coding agents to maintain software? What if frequent rewriting becomes cheap enough? Something that's a tenth or one hundredth of your salary doesn't have to be good to make for a good business decision. Why do you think every native application has been replaced by slop made up of 10 layers of JS frameworks on top of electron? Nothing matters as long as the product is cheap and fast to pump out, barely works on modern hardware, and makes dough.
> AI does not reduce software, it increases the amount of software.
There's not infinite demand for software. If AI inference costs take 50% of the prior payroll expenses, while making a company twice as efficient, that means we need 4 times as much demand in software engineering at the same salary for everyone to keep their job. What new or improved subscription, app, website, device, or other software product does the world need right now? 99.9% of people use the same 5 apps. Most of their free time, attention, and disposable income has already been captured by trash that is unbeatable due to network effects. Are we all going to sell shitty LLM frontends to businesses until they notice they could have done the same thing themselves? There might be an explosion in new software, but no one there to care about using it.
> I believe there is a huge chasm that will likely never be crossed between the human intent of systems and their implementation that only human engineers can actually bridge.
Maybe, or the AI might just be missing context. Think of all the unwritten culture, practices, and conversations the LLM hasn't been made aware of.
> In short they want a throat to choke.
You're responsible for those under you anyway, this doesn't help. Banking on those in charge being irrational forever in a way that is bad for business, and without ever noticing, is a bad gamble.
> The other factor is that while AI can clearly replace rote coding today [...], X is not something that can be solved for without a lot of knowledge and guardrails.
I'm talking about the world the AI-maximalists predict is rapidly approaching, not where we are today. None of that knowledge and none of those guardrails are hard to grasp intellectually, compared to advanced mathematics for example. Put your institutional knowledge in a .md file and add another agent that enforces guardrails in a loop. The only way out I see is a situation where there are complex patterns that we intuitively grasp, but can't articulate. Patterns that somehow span too much data or don't have enough examples for LLMs to pick up on.
> There will be engineers who maintain all the same code, they'll just cover more scope with LLM assisted tools.
So fewer jobs with lesser qualifications?
> Ultimately I don't see the boundary the same way you do, as software engineers we have always had to justify our systems by their real world interaction.
I've seen the way engineers design products, and I like products designed by engineers, but no layperson does. Laypeople don't want power, privacy, or agency. They don't care about how things work, and they lie to themselves and others about what they really want. They don't want a native desktop app that streams high-quality audio from a self-hosted collection; they want a subscription that autoplays algorithmic slop through a React Native app on their iPhone. Do you really think you're better at appealing to/fleecing customers than people with actual UX, marketing, and behavioral psychology experience? This example only applies to mass-market software, but I'm sure it's not much different in other fields. Engineers keep thinking they could do everyone else's job, but they don't do so well in practice.
dasil003 5 hours ago [-]
I'm sort of shocked at how little of my argument seemed to land with you in any way. I'm wondering how many cycles of software hype have you been through? Were you here for the PC revolution, the .com era, smartphone mass adoption?
There's a lot of what-ifs, and worst case scenarios in your reply that I simply don't find likely. I am not drinking the koolaid from the AI maximalists or the doomers. I could be wrong of course, no one can predict the future, but to me the very real, novel and broad utility of LLMs that we are just learning to harness combined with the investment outlook are leading to a mania that has people overestimating where things will land when the dust settles. If I'm wrong then I guess I'll join the disenfranchised masses picking up pitchforks, but I'm not going to waste time worrying about that until I see more evidence that it's actually going that badly.
So far, what I see is that software engineers are the ones getting the most actual utility out of AI tooling. The reason is that it still requires precision of thought and specificity to get anything sustainable out of AI coding tools. Note this doesn't mean that engineers can design better apps than proper designers; rather, my point is that designers and other disciplines still cannot go much further than prototypes. They still need engineers to write the prompts, test the output, maintain the system, and debug things when they go wrong. I have worked long enough with large cross-functional teams to know that the vast majority of folks in non-engineering functions simply cannot get enough specificity and clarity into their requests for an LLM to turn them into a working system that will work over time. They will hit a wall very quickly where new features add bugs faster than they improve things, and the whole thing collapses under its own weight like a mansion of popsicle sticks.
And by the way, I don't consider AI-assisted coding to require less qualification than regular coding. Sure, you don't need to know as much syntax or as many algorithms, but you absolutely need data modeling, performance, reliability, debugging, consistency, and migration knowledge in order to use AI to contribute to any software that powers a real business. And yeah, you might need to develop your product and business sensibilities, but to me that's what's been happening throughout the history of computing. Wiring up ENIAC certainly required qualifications that were not needed for assembly programming, which in turn required certain things that C programmers did not need, and so forth; harnessing the increasing compute power and complexity required new qualifications. I don't think AI will ultimately be that different: it will change the way we work, but it doesn't replace what senior engineers do.
bluefirebrand 1 day ago [-]
> What I think is much more likely to happen is the number of software engineers greatly reduces
So you just believe you'll be one of the ones left standing?
Best of luck to you
bluefirebrand 1 day ago [-]
I'm not spending my precious time on this earth reviewing code from my coworkers that they couldn't be bothered to write without using LLMs
And that's really just the tip of the iceberg. LLM usage metrics being introduced by management to ensure the licenses they pay for are being used. New productivity metrics that require LLMs and low standards to reach, and that's before we even get into my ethical problems with the technology
So, yes. Their existence is ruining my love for technology
hellojesus 1 day ago [-]
My coworkers have started writing code with copilot. It's kind of okay but also not really.
I've been enjoying teaching them how the things they're producing with LLMs work, because they have no idea and constantly break their builds because of it. At the same time it helps me improve my craft, because I get to refine the bits I don't fully understand and see implementations I wouldn't have voluntarily chosen before, which lets me explore their benefits and limitations. LLMs actually make this process slightly less painful, because at least now when I send them away to work for the day they have something to review when we next meet, vs. pre-LLM days when they would basically have written nothing because they were stuck.
I still don't use LLMs to code beyond whatever search providers auto-provide when I'm looking up documentation. I don't think I'm good enough to use them. Maybe one day. But for now I don't have to, because I'm not facing the breadlines for writing things myself.
hyperhello 1 day ago [-]
I mean, if it’s that bad they’ll be released from service at your company, and you’ll be recognized as superior. This only helps you.
bluefirebrand 1 day ago [-]
In my experience management only cares about velocity, not quality. I believe this is pretty universal across the industry
When hand writing code we could strike a tolerable balance between quality and velocity. With LLM coding we cannot. Velocity is high, quality is low. I don't believe there is any fixing that despite what the many LLM coding shills on this website would have you believe
ares623 14 hours ago [-]
I've been feeling it too.
Up until a few weeks ago, I've been able to successfully avoid using AI at work. But then mandates happened and now I'm being forced to use them. Absolutely no guidance from leadership though. "Just figure it out amongst yourselves". Other folks in the company have similar reservations but I feel I'm the only one who has very strong feelings about it in the "morality" and ethical sense. I just can't ignore what the tech is built upon and is doing to other people. All so people like us can open 20 PRs in a day. PRs that don't even get merged because no one can keep up with reviews. For tickets that before would've been labeled as "not worth it". For a job that wasn't even that hard to begin with.
Funny thing is that 1-2 years ago when it was all still new I was more open minded. It was a shiny new tool and naturally I would like to try it out. I was one of the first finding potential use-cases for the team. But the more I looked and learned about it the more I hated it.
And I am not a Luddite. Before all of this, I would personally spend my _free_ time using and reading about random and often obscure tools and languages like Lisp, Clojure, Slackware. I'd spend hours curating my Emacs config. I was learning "k8s the hard way" back when it was the hottest tech thing. Does that sound like a Luddite?
I don't have the privilege to just pivot to another career so I have no choice but to stick it out. My only consolation is that when my kids are working age in a couple of years, when they ask me what the fuck happened, I can look them in the eye and honestly say that I did what I can and I did not cheer it on.
adampunk 1 day ago [-]
Are you replying to the right post? This post is about someone excited to learn again.
Straight up, why aren’t you excited to learn?
bluefirebrand 1 day ago [-]
I view using LLMs as the absolute opposite of learn which is why I'm turned off of them
I think I misread this post, though. I initially read it as someone who was excited to leave software and learn something new.
Your post made me re-read it, now I'm not sure. Maybe the author is excited to learn a new LLM based workflow. If so, you're right that I have nothing in common with how they feel.
I started again with something I don't do at my day job, assembly and low level programming. It's been a blessing to learn and realize that things can actually be much simpler at that level. I'm especially interested in anything that actually needs one to drop down to assembly level and can't be done in some low level languages. E.g. implementing coroutines, jit compilation, self modifying binaries etc. I don't think I'll ever be able to use this knowledge professionally but more importantly, it's fun! Any more ideas for fun stuff are welcome. :)
The threat of LLMs undermining my opportunities in the work pool are never far from my mind as I move towards a seemingly burning building, but you know what? I will do it regardless, because even though my prospects for employment may be diminished I'm enjoying the craft, and I like being creative with it.
I intend to embrace LLMs as an augmentation of my will once I get a good grasp on how to code proficiently. Maybe I'm too late for the heyday of coding by hand, but if these tools allow me more power to solve the ACTUAL problem, then that's alright. The point is to solve problems, right? Not to write code.
I want to solve problems correctly and in a high quality manner.
LLMs do not enable me to do this any better in my opinion. They enable me to do it faster* but worse
* I'm not entirely sold on it being actually faster either
Most coding tasks take place outside of pure tech companies, if I’d venture a guess.
And let’s be honest, enterprises in general do not value that quality - and they face very little in terms of technical challenges that can’t be solved by code on stack-overflow or github.
What most enterprises lack is knowledge about themselves though - this is more a business problem than a technical one however.
It seems like there's still some juice to squeeze from this technology, though. So my money is on a net positive. For now.
Even if they end up being bad practice for production code, we can probably agree that they're decent at mock-ups, experimentation, and quick proof of concepts. That at least has some value.
I see a lot of people with zero coding/engineering experience trying to make their own products, and very few of those products with long term staying power. We've got a long way to mature with how we use this tech. This moment feels like the introduction of the home microwave: A lot of terrible, terrible meals were cooked while people briefly forsook their stove to use the miraculous microwave for everything. Eventually people figured out what tasks the microwave was suitable for and then went back to the oven for all but simple re-heating.
I don't know what I'm going to do next but I have very little interest in or respect for LLM assisted coding, so I think the industry is likely not a good place for me anymore
If I could retire I would but I'm not quite 40. I don't have the savings to stop working now. So I'll figure out something new I guess
What a bummer. I loved my career to this point and I'm very sad that LLMs have ruined it
I guess I'm glad I'm not alone in this
If coding goes away, decades of experience become worthless instantly. Not all of it, but the vast majority, enough to justify starting over in another career.
In that world, it will have become more cost-effective for most companies to spend most of their budget on inference vendors and employ a few low-paid LLM wranglers, even if the final output is of terrible quality. No point in competing for that kind of employment experience with that kind of pay.
The idea that low-paid LLM wranglers are going to push out the experienced engineers just doesn't wash. What I think is much more likely to happen is the number of software engineers greatly reduces, but the remaining ones actually get paid more, because writing code is no longer the long pole, and having fewer minds designing the system at a high level will allow for more cohesive higher-level design, and less focus on local artesenal code quality.
To be honest AI is just the catalyst and excuse for overhiring that happened due to the gold rush over the last 20 years related to the internet and smart phone revolutions, zero-interest rate, and pandemic effect.
Do you not see LLM's catching up with your experience fast?
You might not lose your job, but you'll definitely have to take a pay cut
You realize that this is contradictory, right? If the number of competitors remains the same, yet there are far fewer jobs, it's a buyer's market: companies have to offer very little to find someone desperate enough.
> It will allow for more cohesive higher-level design, and less focus on local artesenal code quality.
I don't buy this, LLM code is extremely bloated. It never reuses abstractions or comes up with novel designs to simplify systems. It can't say no, it just keeps bolting on code. In a very very abstract sense you might be right, but that's outside the realm of engineering, that's product design.
Software has eaten the world, and thus the value of maintaining software has never been hire. Engineers are the people who understand how software works. Therefore unless we move away from software, the value of software engineering remains high.
AI does not reduce software, it increases the amount of software, makes messier software and generally increases the surface area of what needs to be maintained. I could be wrong, but as impressive as LLM's language and code processing capabilities are, I believe there is a huge chasm that will likely never be crossed between the human intent of systems and their implementation that only human engineers can actually bridge. And even if I'm wrong, there's another headwind which is that, as Simon Willison has point out, you can't hold an LLM accountable, and therefore corporate leaders are very unlikely to put AI in any position of power, because all the experience and levers they have for control are based on millenia of evolution and a shared understanding of human experience; in short they want a throat to choke.
The other factor is that while AI can clearly replace rote coding today, I think the demos oversell the utility of that software. Sure it's fine to get started, but you quickly paint yourself in a corner if you attempt to run a business on that code overtime where UX cohesion, operational stability and data integrity over time are paramount and not something that can be solved for without a lot of knowledge and guardrails.
So net of all this, where I think we land is a lot of jobs that are based purely on knowledge of one slow-changing system and specific code syntax will go away, but there will be engineers who maintain all the same code, they'll just cover more scope with LLM assisted tools. You put your finger on something, that I do believe this moves engineering closer to product design, but I still think there's a huge amount on the engineering side that LLMs won't be able to do any time soon (both for technical and the social reasons stated above), and ultimately I don't see the boundary the same way you do, as software engineers we have always had to justify our systems by their real world interaction.
This is an unfinished argument. What if we get coding agents to maintain software? What if frequent rewriting becomes cheap enough? Something that's a tenth or one hundredth of your salary doesn't have to be good to make for a good business decision. Why do you think every native application has been replaced by slop made up of 10 layers of JS frameworks on top of electron? Nothing matters as long as the product is cheap and fast to pump out, barely works on modern hardware, and makes dough.
> AI does not reduce software, it increases the amount of software.
There's not infinite demand for software. If AI inference costs take 50% of the prior payroll expenses, while making a company twice as efficient, that means we need 4 times as much demand in software engineering at the same salary for everyone to keep their job. What new or improved subscription, app, website, device, or other software product does the world need right now? 99.9% of people use the same 5 apps. Most of their free time, attention, and disposable income has already been captured by trash that is unbeatable due to network effects. Are we all going to sell shitty LLM frontends to businesses until they notice they could have done the same thing themselves? There might be an explosion in new software, but no one there to care about using it.
> I believe there is a huge chasm that will likely never be crossed between the human intent of systems and their implementation that only human engineers can actually bridge.
Maybe, or the AI might just be missing context. Think of all the unwritten culture, practices, and conversations the LLM hasn't been made aware of.
> In short they want a throat to choke.
You're responsible for those under you anyway, this doesn't help. Banking on those in charge being irrational forever in a way that is bad for business, and without ever noticing, is a bad gamble.
> The other factor is that while AI can clearly replace rote coding today [...], X is not something that can be solved for without a lot of knowledge and guardrails.
I'm talking about the world the AI-maximalists predict is rapidly approaching, not where we are today. None of that knowledge and none of those guardrails are hard to grasp intellectually, compared to advanced mathematics for example. Put your institutional knowledge in a .md file and add another agent that enforces guardrails in a loop. The only way out I see is a situation where there are complex patterns that we intuitively grasp, but can't articulate. Patterns that somehow span too much data or don't have enough examples for LLMs to pick up on.
> There will be engineers who maintain all the same code, they'll just cover more scope with LLM assisted tools.
So fewer jobs with lesser qualifications?
> Ultimately I don't see the boundary the same way you do, as software engineers we have always had to justify our systems by their real world interaction.
I've seen the way engineers design products, and I like products designed by engineers, but no layperson does. Laypeople don't want power, privacy, or agency. They care about how things work, and they lie to themselves and others about what they really want. They don't want a native desktop app that streams high-quality audio from a self-hosted collection, they want a subscription that autoplays algorithmic slop through a react native app on their iPhone. Do you really think you're better at appealing to/fleecing customers than people with actual UX, marketing, and behavioral psychology experience? This example only applies to mass-market software, but I'm sure it's not much different in other fields. Engineers keep thinking they could everyone else's job, but they don't do so well in practice.
There's a lot of what-ifs, and worst case scenarios in your reply that I simply don't find likely. I am not drinking the koolaid from the AI maximalists or the doomers. I could be wrong of course, no one can predict the future, but to me the very real, novel and broad utility of LLMs that we are just learning to harness combined with the investment outlook are leading to a mania that has people overestimating where things will land when the dust settles. If I'm wrong then I guess I'll join the disenfranchised masses picking up pitchforks, but I'm not going to waste time worrying about that until I see more evidence that it's actually going that badly.
So far what I see is that software engineers are the ones getting the most actual utility of AI tooling. The reason is that it still requires a precision of thought and specificity to get anything sustainable out AI coding tools. Note this doesn't mean that engineers can design better apps than proper designers, rather my point is that designers and other disciplines still can not go much further than prototypes, they still need engineers to write the prompts, test the output, maintain the system, and debug things when they go wrong. I have worked long enough with large cross-functional teams to know that the vast majority of folks in non-engineering functions simply can not get enough specificity and clarity in their requests to allow an LLM to turn it into a working system that will work over time. The will hit a wall very quickly where new features add bugs faster than they improve things, and the whole thing collapses under its own weight like a mansion of popsicle sticks. And by the way, I don't consider AI-assisted coding to require less qualification than regular coding. Sure you don't need to know as much syntax or algorithms, but you absolutely need to know data modeling, performance, reliability, debugging, consistency, and migration knowledge in order to use AI to contribute to any software that powers a real business, and yeah you might need to develop your product and business sensibilities, but to me that's what been happening throughout the history of computing. Wiring up ENIAC, certainly required qualifications that were not needed for assembly programing, which in turn required certain things that C programmers did not need and so forth, but harnessing the increasing compute power and complexity required new qualifications. I don't think AI will ultimately be that different, it will change the way we work, it doesn't replace what senior engineers do.
So you just believe you'll be one of the ones left behind?
Best of luck to you
And that's really just the tip of the iceberg: LLM usage metrics being introduced by management to ensure the licenses they pay for are being used, new productivity metrics that require LLMs and low standards to reach. And that's before we even get into my ethical problems with the technology.
So, yes. Their existence is ruining my love for technology
I've been enjoying teaching them how the things they're producing with LLMs work, because they have no idea and constantly break their builds because of it. At the same time, it helps me improve my craft, because I get to refine the bits I don't fully understand, as well as see some implementations I wouldn't have voluntarily chosen before, which lets me explore their benefits and limitations. LLMs actually make this process slightly less painful, because at least now when I send them away to work for the day they have something to review when we next meet, vs. pre-LLM days when they would basically have written nothing because they were stuck.
I still don't use LLMs to code beyond whatever search providers automatically provide when I'm looking up documentation. I don't think I'm good enough to use them. Maybe one day. But for now I don't have to, because I'm not facing the breadlines for writing things myself.
When hand-writing code, we could strike a tolerable balance between quality and velocity. With LLM coding we cannot: velocity is high, quality is low. I don't believe there is any fixing that, despite what the many LLM coding shills on this website would have you believe.
Up until a few weeks ago, I'd been able to successfully avoid using AI at work. But then mandates happened, and now I'm being forced to use it. Absolutely no guidance from leadership, though. "Just figure it out amongst yourselves." Other folks in the company have similar reservations, but I feel I'm the only one with very strong feelings about it in the moral and ethical sense. I just can't ignore what the tech is built upon and what it is doing to other people. All so people like us can open 20 PRs in a day. PRs that don't even get merged, because no one can keep up with reviews. For tickets that before would've been labeled "not worth it". For a job that wasn't even that hard to begin with.
Funny thing is that 1-2 years ago, when it was all still new, I was more open-minded. It was a shiny new tool, and naturally I wanted to try it out. I was one of the first to find potential use cases for the team. But the more I looked and learned about it, the more I hated it.
And I am not a Luddite. Before all of this, I would spend my _free_ time using and reading about random, often obscure tools and languages like Lisp, Clojure, and Slackware. I'd spend hours curating my Emacs config. I was learning "k8s the hard way" back when it was the hottest tech thing. Does that sound like a Luddite?
I don't have the privilege to just pivot to another career, so I have no choice but to stick it out. My only consolation is that in a couple of years, when my kids are working age and ask me what the fuck happened, I can look them in the eye and honestly say that I did what I could and did not cheer it on.
Straight up, why aren’t you excited to learn?
I think I misread this post, though. I initially read it as someone who was excited to leave software and learn something new.
Your post made me re-read it, and now I'm not sure. Maybe the author is excited to learn a new LLM-based workflow. If so, you're right that I have nothing in common with how they feel.