Jarkko Sakkinen

There will soon be a need for a book, an "LLMless cookbook" for kids, so that they don't fuck up their careers with misguided beliefs and fantasies.

One recipe could be "build your own stash with local repositories". It's not really about any "secret sauce" but more about reducing friction for random experiments to learn from. It's lossless compression of *your own memory*, whereas a neural network is lossy compression of the training material, plus additional inference generated on top of whatever you feed it (fuck, that was complicated to phrase lol, sick layered).
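A minimal sketch of that recipe, with hypothetical paths (a bare repository used as a local "stash" remote for throwaway experiments):

```shell
set -e
# Hypothetical stash location; any local path works as a git remote.
stash=$(mktemp -d)/experiments.git
git init -q --bare "$stash"

# A scratch experiment that pushes its history into the stash.
scratch=$(mktemp -d)
git -C "$scratch" init -q
git -C "$scratch" remote add stash "$stash"
echo 'random experiment' > "$scratch/notes.txt"
git -C "$scratch" add notes.txt
git -C "$scratch" -c user.name=me -c user.email=me@example.invalid \
    commit -q -m 'first experiment'
git -C "$scratch" push -q stash HEAD:refs/heads/main

# The experiment now survives even if the scratch tree is deleted.
git --git-dir="$stash" log --oneline main
```

The point is that a plain directory path is a perfectly good remote, so every throwaway experiment can be archived with zero hosting friction.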

For a person like me, an LLM-generated implementation is a +1 variation of something I could have, and literally always also would have, written manually. It's for expanding the search space of alternative ways to branch the code. We're looking for quality over quantity after all, aren't we?

There is one recent example where I did use LLM agents successfully, in a very planned, instructed and sandboxed manner: a huge file recovery operation that reconstructed a couple of lost Git repositories from ddrescue dumps of three different computers.
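The core trick in that kind of recovery (a generic sketch, not the exact procedure used here): loose objects carved out of a raw image can be dropped into a fresh repository's object store, and git itself verifies and reads them. The "donor" repository below simulates the carved objects so the sketch is self-contained:

```shell
set -e
work=$(mktemp -d)

# Simulate carved loose objects with a donor repo; in the real operation
# these files would be extracted from the ddrescue image instead.
git init -q "$work/donor"
oid=$(echo 'lost commit payload' | git -C "$work/donor" hash-object -w --stdin)

# Drop the recovered objects into a fresh repository and let git read them.
git init -q "$work/rebuilt"
cp -r "$work/donor/.git/objects/." "$work/rebuilt/.git/objects/"
git -C "$work/rebuilt" cat-file -p "$oid"
```

From there, `git fsck --lost-found` in the rebuilt repository enumerates dangling commits and trees, which is how the history gets stitched back together.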

A third example is actually a bit obscure: for the rootns patch set that I'm working on, I use a bug I've found in claude code to make the agent go insane by removing its cwd beneath it while it is overstressed with an "impossible" workload.

What these examples show is that since I have the craft under control, even my use of agents is very "tool-oriented". Non-tool-oriented use would OFC be compensating for my own shortcomings.

AI companies are throwing a generation of potentially great engineers under the bus. That's sad to watch.