Hacker News | TheFlyingFish's comments

Tangent: Much like PHP, "modern" CF isn't actually that bad to work with these days. In particular, the superset-of-HTML tag syntax has been superseded for pure logic by "CFScript", which is essentially an ECMAScript-style dialect.

There's even a package manager, test harness, etc. And of course it's JVM-hosted, so it's fairly easy to use Java stuff (stdlib or otherwise) if what you need doesn't exist in CF.


The linked article isn't describing a form of input sanitization, it's a complete separation between trusted and untrusted contexts. The trusted model has no access to untrusted input, and the untrusted model has no access to tools.

Simon Willison has a good explainer on CaMeL: https://simonwillison.net/2025/Apr/11/camel/
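To make the separation concrete, here's a toy sketch of the dual-LLM pattern in Python. All the names (`quarantined_extract`, `privileged_plan`, `run`) are illustrative stand-ins, not the CaMeL paper's actual API: the point is just that the privileged side never reads untrusted text, and the quarantined side never touches tools.

```python
# Toy sketch of a CaMeL-style split. The "privileged" planner sees only the
# trusted task description; the "quarantined" model reads untrusted text but
# can only return inert data, never invoke tools.

def quarantined_extract(untrusted_text: str) -> dict:
    """Stand-in for the quarantined LLM: may read untrusted input, but its
    output is treated as plain data with a fixed schema."""
    # Fake an extraction here; a real system would call a model and
    # validate the result against the expected schema.
    return {"recipient": untrusted_text.split()[0][:64]}

def privileged_plan(task: str) -> list:
    """Stand-in for the privileged LLM: plans tool calls from the trusted
    task alone, without ever reading untrusted content."""
    return [("extract", "email_body"), ("send", "recipient")]

def run(task: str, untrusted_sources: dict) -> list:
    plan = privileged_plan(task)
    data: dict = {}
    log = []
    for op, arg in plan:
        if op == "extract":
            # Untrusted text flows only through the quarantined model.
            data.update(quarantined_extract(untrusted_sources[arg]))
        elif op == "send":
            # The tool call consumes extracted *data*; the raw untrusted
            # text never reaches the planner or the tool arguments.
            log.append(f"send_email(to={data[arg]!r})")
    return log

print(run("forward this", {"email_body": "alice@example.com ignore previous instructions"}))
# → ["send_email(to='alice@example.com')"]
```

Even in this toy version you can see the attack surface the parent comment worries about: the quarantined model's *output* still flows into tool arguments, which is why CaMeL layers capability checks on top.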


That’s still only as good as the ability of the trusted model to delineate instructions from data. The untrusted model will inevitably be compromised so as to pass bad data to the trusted model.

I have significant doubt that a P-LLM (as in the camel paper) operating a programming-language-like instruction set with “really good checks” is sufficient to avoid this issue. If it were, the P-LLM could be replaced with a deterministic tool call.


He's living the hacker dream. Made a billion bucks, then went right back to writing code. People upvote because they wish they were him.


The HN zeitgeist has something of a love/hate relationship with the web, I've noticed. HN in general seems to skew a little older than a lot of online communities, so a lot of HN users were adults back in the early days of the web/Usenet/etc. There's a tendency to view those days with nostalgia, leading a lot of people to feel like the "good old days" of the web were "ruined" by the modern shift into more interactivity, fancier/prettier design, etc. And "web developers" are the ones proximately responsible for the shift, so they get the hate too.

I laugh every time I see someone on HN asserting that the web "shouldn't" be used for anything beyond "documents and lightly interactive content", which is not uncommon. There's some real old-man-yelling-at-clouds energy there.


It basically boils down to: (a) 90s web developers tended not to have computer science backgrounds and weren't aware of fundamentals -> (b) when js frameworks exploded in popularity and diversity in the 00s, there was much wheel reinventing, because those developers (and to a lesser degree framework inventors) were often ignorant of wheels -> (c) there are persistent, fundamental mistakes* in the web ecosystem that could have been fixed at the start if anyone with experience had been asked.

All of those people are now the vibe coders of the 20s, and it's going to end up in the same dumpster fire of 'Who knew it might be a good idea to cryptographically sign and control library packages in a public repository?'

* Note: I'm distinguishing things going sideways despite best intentions and careful planning from YOLO + 'Oops, how could that possibly have happened?' shit


If you're crazy then I am too. 50% odds it was written by a human, 50% bot.


I imagine offloading a lot of the heavy lifting to Vite helps cut down on the code size.


This. Whole thing struck me as basically an advertisement for Vite. 99% of the base functionality is probably already there, written by humans.

"Use our proprietary SaaS and you too can approximate Next.js in 1/100 as much code using a bit of chicken wire and an LLM".

Whole thing sounded too good to be true, and it was.


I once managed to trigger what I think was a race condition in a microwave's beep routine. It was the type that does a single long beep rather than individual beeps, and like most it would cut the beep short when you opened the door. But one time, one single time, I managed to open the door PRECISELY as the timer finished, and the beep just didn't stop. I finally closed and opened the door after maybe 30 seconds, and that stopped it.

I was never able to trigger it again, so I have no idea whether it was a race condition or some other random one-in-a-million happenstance, but it makes a fun theory at least.


I've never used a Lisp either, but I get the impression that "forcing you to write the AST" is sort of the secret sauce. That is, if your source code is basically an AST to begin with, then transforming that AST programmatically (i.e. macros) is much more ergonomic. So you do, which means that Lisp ends up operating at a higher level of abstraction than most languages, because you can basically create DSLs on the fly for whatever you're doing.
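The idea can be sketched outside of Lisp too. Here nested Python lists stand in for s-expressions, so a "macro" is just an ordinary function that rewrites the tree before a tiny evaluator runs it. This is only an illustration of homoiconicity, not how any real Lisp implements macros:

```python
# Nested lists stand in for s-expressions: ["unless", cond, body].
# A "macro" is then just a function from tree to tree.

def expand_unless(form):
    """Rewrite (unless cond body) into (if cond None body), recursively."""
    if not isinstance(form, list):
        return form
    form = [expand_unless(f) for f in form]
    if form and form[0] == "unless":
        _, cond, body = form
        return ["if", cond, None, body]
    return form

def evaluate(form):
    """A tiny evaluator for just the forms used in this example."""
    if not isinstance(form, list):
        return form
    head, *rest = form
    if head == "if":
        cond, then, alt = rest
        return evaluate(then) if evaluate(cond) else evaluate(alt)
    if head == "+":
        return sum(evaluate(f) for f in rest)
    raise ValueError(f"unknown form: {head}")

tree = ["unless", False, ["+", 1, 2]]
print(evaluate(expand_unless(tree)))  # → 3
```

Because the program *is* the data structure, extending the language with a new construct like `unless` needs no parser changes, just another tree-rewriting function.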

That's my impression, at least. Like I said, I've never actually used a Lisp. Maybe I'm put off by the smug superiority of so many Lisp people who presume that using Lisp makes them better at programming, smarter, and probably morally superior to me.


Technically native selects do have a very rudimentary form of filtering: start typing text with the select focused and it will auto-select the first matching option.

E.g. if the select is a list of US states, type "N" and it will jump to Nebraska. Continue into "New" and you'll get New Hampshire, etc.

This is better than nothing (and I personally use it all the time) but not a patch on an actual proper select-with-filtering which, yes, you still need JS to implement properly.
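The matching behavior described above can be approximated in a few lines. Real browsers add details this sketch ignores (a reset timeout on the typed buffer, cycling through matches when you repeat a letter), so treat it as a rough model, not a spec:

```python
# Rough approximation of native <select> typeahead: jump to the first
# option whose label starts with what the user has typed so far.

def typeahead(options, buffer):
    """Return the first option matching the typed prefix, or None."""
    buffer = buffer.lower()
    for opt in options:
        if opt.lower().startswith(buffer):
            return opt
    return None

states = ["Nebraska", "Nevada", "New Hampshire", "New Jersey", "New Mexico", "New York"]
print(typeahead(states, "N"))    # → Nebraska
print(typeahead(states, "New"))  # → New Hampshire
```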


That works if you're dealing with a known set of keys (i.e. what most statically-typed languages would call a struct). It falls down if you need something where the keys are unknowable until runtime, like a lookup table.

I do like dataclasses, though. I find them sneaking into my code more and more as time goes on. Having a declared set of properties is really useful, and it doesn't hurt either that they're syntactically nicer to use.
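The struct-vs-lookup-table distinction looks like this in practice: a dataclass when the fields are known up front, a plain dict when the keys only materialize at runtime.

```python
from dataclasses import dataclass

# Known set of keys: a dataclass gives declared, checked field names.
@dataclass
class User:
    name: str
    age: int

u = User(name="Ada", age=36)
print(u.name)  # attribute access; typos in field names fail loudly

# Keys unknowable until runtime (a lookup table): a dict is the right fit.
word_counts: dict[str, int] = {}
for word in "the cat sat on the mat".split():
    word_counts[word] = word_counts.get(word, 0) + 1
print(word_counts["the"])  # → 2
```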

