This analogy has always been bad any time someone has used it. Compilers directly transform via known algorithms.
Vibecoding is literally just random probabilistic mapping between unknown inputs and outputs on an unknown domain.
Feels like saying that because I don't know how my engine works, my car could've just been vibe-engineered. People have put thousands of hours into making certain tools work to a given standard and spec, reviewed by many, many people.
"I don't know how something works" != "This wasn't thoughtfully designed"
To be fair, there seems to be a weird dissonance between the marketing (fire your workers because AI can do everything now) and the reality (actually you need to spend time, effort, and expertise to set up a good environment for AI tools and monitor them).
So when people just YOLO the latter, they don't get the results they expect.
I'm personally in the middle, chat interface + scripts seems to be the best for my productivity. Agentic stuff feels like a rabbit hole to me.
If agentic coding worked as well as people claim on large codebases, I would be seeing a massive shift at my job... I'm really not seeing it.
We have access to pretty much all the latest and greatest internally at no cost, and the majority of code still seems to be written and reviewed by people.
AI-assisted coding has been a huge help to everyone, but straight-up agentic coding doesn't seem to scale to these very large codebases. You need to keep it on the rails ALL THE TIME.
Yup, same experience here at a much smaller company. Despite management pushing AI coding really hard for at least 6 months and having unlimited access to every popular model and tool, most code still seems to be produced and reviewed by humans.
I still mostly write my own code and I’ve seen our claude code usage and me just asking it questions and generating occasional boilerplate and one-off scripts puts me in the top quartile of users. There are some people who are all in and have it write everything for them but it doesn’t seem like there’s any evidence they’re more productive.
As a second anecdote: at Amazon last summer, things swapped from nobody using LLMs to almost everyone using them in ~2 months, after a fantastic tech talk and a bunch of agent scripts being put together.
I would think this is reasonable. My general understanding at Amazon is that things are expected to work via API boundaries (not quite the case at Google).
Just wanted to say I've felt very similarly recently. Honestly feels like we need a place to continue to discuss post-tech career paths for mid-career engineers.
I've been considering becoming an electrician but it is also quite a career shift.
Restarting a career seems so hard. Looking at engineering programs and having to spend thousands just doesn't sit well. I suspect many of us will just keep doing "software" until nobody will pay us for it anymore.
That argument seems completely nullified by the fact that the president unilaterally changes his mind on tariffs every other day, while setting up a manufacturing base can take 10+ years.
Seems safer for many businesses to just continue operating outside of the US and maintain a more consistent business relationship with every other country in the world.
I hate to say it, but I completely agree. I don't want to be a downer, and I'm sure OP's project is cool, but I legitimately could not understand what it was based on this landing page. No videos, demos, or details.
This is a really great idea; I'm going to try it out. I similarly just cannot stand reviewing vibe-coded PRs all day, but this sounds genuinely useful.
This idea sounds somewhat flawed to me based on the large amount of evidence that LLMs need huge amounts of data to properly converge during their training.
There is just not enough available material from previous decades to trust that the LLM will learn to anywhere near the same degree.
Think about it this way: a human in the early 1900s and one today are pretty much the same, just in different environments with different information.
An LLM trained on 1/1000 the amount of data is just at a fundamentally different stage of convergence.
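To put a rough number on that intuition, here's a back-of-the-envelope sketch using the parametric loss fit from the Chinchilla paper (Hoffmann et al., 2022). The coefficients are the published fitted values; the model size and token counts are made-up round numbers for illustration, not anything measured from historical corpora:

```python
# Chinchilla parametric fit (Hoffmann et al. 2022):
#   L(N, D) = E + A / N**alpha + B / D**beta
# where N = parameter count, D = training tokens.
# Published fitted coefficients:
E, A, B = 1.69, 406.4, 410.7
alpha, beta = 0.34, 0.28

def loss(n_params: float, n_tokens: float) -> float:
    """Predicted training loss for a model with n_params trained on n_tokens."""
    return E + A / n_params**alpha + B / n_tokens**beta

N = 70e9                  # hypothetical 70B-parameter model
full = loss(N, 1e12)      # trained on ~1 trillion tokens
scarce = loss(N, 1e9)     # same model, 1/1000 of the data

# The data term B / D**beta dominates in the scarce case: cutting the
# corpus 1000x adds roughly a full nat of predicted loss.
print(f"full-data loss ~ {full:.2f}, scarce-data loss ~ {scarce:.2f}")
```

Whatever you think of the exact coefficients, the shape of the curve is the point: the data term decays polynomially, so a 1000x smaller corpus leaves the model at a visibly different point on the loss curve, which is the "different stage of convergence" claim above.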
Feels like this combination (usually) creates a race to the bottom instead of an expansion of new ideas.
LLMs kind of feel somewhere in the middle