
The dollar sign is charming.

There are a few points here:

1) Not all data is text. In fact, very little of the data people see/work with day-to-day is raw text. It's silly to transform a PNG image into text to be able to pipe it around. (Or to pipe around its filename instead and have a dozen tools all having to open and re-parse it each time.)

2) There's nothing in PowerShell preventing you from serializing a piece of data to text if you want to. The key is: you don't have to. (See the sketch after this list.)

3) Systems that depend on 50,000 CLI tools all having their own ad-hoc text parsers are cemented, mummified, cannot change. You can't change the output format of ps (to use an example in this thread) without breaking an unknown number of CLI tools. Even if you come up with a great way to improve the output, doesn't matter, you've still broken everything. This is less (but not none!) of an issue with PowerShell. I like computers to evolve to become better over time, and text-based CLIs are a huge anchor preventing that.
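To illustrate point 2 with a minimal PowerShell sketch: the pipeline carries live objects by default, and you opt into text (JSON, CSV, plain strings) only at the point you actually need it. And because downstream commands bind to property names rather than column positions, adding a new property to Get-Process's output breaks nothing.

    # Objects flow through the pipeline; no text parsing anywhere.
    Get-Process | Where-Object { $_.CPU -gt 100 } | Select-Object Name, Id

    # Serialize to text only when you actually want text.
    Get-Process | Select-Object Name, Id | ConvertTo-Json   # or ConvertTo-Csv, Out-String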



Unfortunately, PowerShell relies on everything running on .NET (well, I think COM works, too). The idea of a shell that can expose live objects is useful, but in practice PowerShell's platform limitations make it a far-from-ideal implementation of that concept. Something built on a platform-agnostic protocol would be better.


Live objects don’t usually expose any protocols at all. They only expose an ABI. Do you know any platform-agnostic OO ABI, besides what’s in .NET?

If you wrap your objects in some platform-agnostic protocol like JSON, you're going to waste an enormous amount of CPU time parsing/formatting those streams at the object boundaries.


You can run streams of many millions of JSON objects pretty much as fast as the IO can feed it... most of the time, in situations like this, you're constrained by IO speed, not CPU... assuming you are working with a stream that has flow control.

I tend to dump out data structures to line-terminated JSON, and it works out really well for streams; it can even be gzipped almost transparently. Parse/stringify has never been the bottleneck... it's usually memory (to hold all the objects being processed, unless you block/push back on the stream) or IO (the feed/source of said stream can't keep up).
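A minimal PowerShell sketch of that pattern (the file name is hypothetical): each line is one JSON object, written with -Compress so it stays on a single line, and parsed lazily as it streams past, so memory stays bounded.

    # Writing side: one compact JSON object per line (NDJSON).
    $record = [pscustomobject]@{ level = 'error'; msg = 'disk full' }
    $record | ConvertTo-Json -Compress | Add-Content .\events.ndjson

    # Reading side: Get-Content streams line by line; nothing is held in
    # memory beyond the object currently in flight.
    Get-Content .\events.ndjson | ForEach-Object { $_ | ConvertFrom-Json } |
        Where-Object { $_.level -eq 'error' }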


Even if printing and parsing is computationally cheap, memory allocation is less so.

If you expose JSON, each serialize/deserialize round-trip will produce new instances of objects with the same data.

The architecture of PowerShell means commands in the pipeline can process the same instances, without duplicating them.

Another good thing about passing raw objects instead of JSON: live objects can contain stuff that is expensive or impossible to serialize, like an OS handle to an open file. Sure, with JSON you can pass file names instead, but this means commands in your pipeline need to open/close those files. Not only is this slower (opening a file requires a kernel call, which in turn does various security checks on the user's group memberships and the file system's inherited permissions), it can even cause sharing-violation errors when two commands in the pipeline try to access the same file.
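A sketch of that last point (Open-LogFile and Read-Header are made-up names, but the mechanism is plain PowerShell): the FileStream is opened once, and the same live instance flows to the downstream command, with no second open and no serialization.

    # Hypothetical functions; the same live .NET FileStream instance flows
    # down the pipeline: no second open(), no sharing-violation risk.
    function Open-LogFile {
        param([string]$Path)
        [System.IO.File]::OpenRead($Path)    # emits a live FileStream
    }

    function Read-Header {
        param([Parameter(ValueFromPipeline)][System.IO.FileStream]$Stream)
        process {
            $buf  = New-Object byte[] 8
            $null = $Stream.Read($buf, 0, $buf.Length)  # reuses the upstream handle
            [BitConverter]::ToString($buf)
            $Stream.Dispose()
        }
    }

    Open-LogFile 'app.log' | Read-Header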


And just think how much faster it would be if there were no serialization and I/O involved at all...


And my method can work over distributed systems with network streams... There are advantages to streams of text.


What advantages?

PowerShell works over networks just fine, thanks to a standardized protocol, ISO/IEC 17963:2013, a.k.a. WS-Management.
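For example (the computer name here is hypothetical), remoting ships objects, not screen-scraped text:

    # Runs Get-Process on the remote machine over WS-Management (WinRM);
    # results come back as deserialized objects with named properties intact.
    Invoke-Command -ComputerName server01 -ScriptBlock {
        Get-Process | Sort-Object CPU -Descending | Select-Object -First 5 Name, CPU
    }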


> Even if you come up with a great way to improve the output, doesn't matter, you've still broken everything.

You phrase this as if changing things for the sake of change were a good thing. It is not.

Well, perhaps it is good for the software vendor, but from the customer's point of view, having to re-learn how to do the same stuff over and over every other year is a PITA.


This is why I have trouble getting along with Linux users.

Without change, there's no improvement.


It's often the customers who are complaining that your current output is not suitable.


First off, you can pipe binary data around. Most tools just expect text.

Secondly, if you use DSV to parse ps output, as you should, adding a new column at the end won't break anything. A fancier parser won't even break if you add one in the middle, but that's usually not worth the effort to write.
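A rough sketch of that approach, here in PowerShell on a Unix-like host (hypothetical, and simplified: it only grabs early columns, since the trailing COMMAND field can contain spaces and would defeat a naive split). Because fields are looked up by header name rather than by fixed position, a column appended on the right breaks nothing:

    # Parse `ps` text output by header name, not by column position.
    $lines  = ps aux                            # native command; array of strings
    $header = -split $lines[0]                  # USER PID %CPU %MEM ... COMMAND
    $usrCol = [array]::IndexOf($header, 'USER')
    $pidCol = [array]::IndexOf($header, 'PID')

    foreach ($line in $lines | Select-Object -Skip 1) {
        $fields = -split $line
        [pscustomobject]@{ User = $fields[$usrCol]; Pid = $fields[$pidCol] }
    }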



