Hacker News

He already did: https://en.wikipedia.org/wiki/J_(programming_language)

The J programming language, developed in the early 1990s by Kenneth E. Iverson and Roger Hui,[4][5] is a synthesis of APL (also by Iverson) and the FP and FL function-level languages created by John Backus.[6]

To avoid repeating the APL special-character problem, J uses only the basic ASCII character set, resorting to the use of the dot and colon as inflections[7] to form short words similar to digraphs.
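A few standard J primitives illustrate how the dot and colon inflections work (shown in a J session; the inflected form is a distinct primitive, not an operator applied to the base symbol):

```j
   2 + 3        NB. + : plus
5
   12 +. 18     NB. +. : GCD (also logical OR on booleans)
6
   +: 5         NB. +: (monadic) : double
10
   *: 4         NB. *: (monadic) : square
16
   %: 16        NB. %: (monadic) : square root
4
```

So each ASCII glyph effectively anchors a small family of related verbs, standing in for what APL would spell with separate special characters.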



Thanks. However, J was conceived in the 1980s; it uses ASCII for ease of communication, not because ASCII is the best set of symbols. In this sense APL was somewhat less constrained - its symbols were new, even though some of them were typed on teletypes with tricks such as overstriking.

It looks like, in the case of J, a lot of consideration was given to what's relatively easy to do within the chosen set of symbols. The question is: if ASCII were not a requirement, what would be a modern set of primitives, efficient as a tool for thought? APL, at the time, was trying to extend - and improve - mathematical notation, no less...


> The question is: if ASCII were not a requirement, what would be a modern set of primitives, efficient as a tool for thought?

I love this question.

Moving from APL to J is really a step backward in terms of notation. With the Unicode standard flourishing, we might finally see something that can improve upon the mathematical notation we use and make instructing computers easier. I don't think anyone would be able to answer, though… Perhaps one could extend the APL symbols to cover all of the J primitives?


I'm trying to find out: does the APL set of symbols still look justified today, even with Unicode's capabilities?

If yes, then Iverson created a really well-thought-out set which has stood the test of time, and we can hope it will continue to serve efficiently as a good notation - and maybe be extended. If not, then we face the question of what a good notation would be.

References to mathematical notation, which developed over decades and centuries, are good - but that notation sometimes looks less than perfect. Iverson was trying to make it better - for example, by putting the notation on one line and getting rid of operator precedence. Is that a success from today's point of view?
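Concretely, both APL and J drop the usual precedence rules: every expression is evaluated strictly right to left, with all verbs at equal precedence. In J:

```j
   3 * 2 + 1    NB. parsed as 3 * (2 + 1), not (3 * 2) + 1
9
```

Under conventional mathematical precedence the same expression would give 7; Iverson's rule trades familiarity for a uniform reading order with no precedence table to memorize.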



