xelxebar a day ago

Man, I feel like APL has unlocked some latent part of my brain.

I'm a few years into seriously using APL and now work in it professionally doing greenfield development work.

Starting out, solving puzzles and such was fun, but when I tried to write real programs I hit a huge wall. It took concerted effort, but learning to think with data-first design patterns and laser-focusing on human needs broke through that barrier for me.

Writing APL that feels good and is maintainable ends up violating all kinds of cached wisdom amongst developers, so it's really hard to communicate just how brutally simple things can be and how freeing that is.
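(One concrete flavor of that data-first style, as an illustrative sketch rather than code from this thread: instead of looping record by record, you lay the data out as whole arrays and apply primitives across them, e.g. in Dyalog APL:

      prices ← 3 1.5 2.25
      qty ← 2 4 1
      +/ prices × qty   ⍝ elementwise product, then sum-reduce: 14.25

The explicit loop disappears into two primitives, which is a big part of that brutal simplicity.)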

  • gtani 10 hours ago

    Interesting, how did you choose APL?

I worked in APL2 full-time years ago, on big asset-backed bond models, big as in some of the largest workspaces the IBM support people had ever seen. It never occurred to me to pick it up again, but I have been looking for the Polivka/Pakin book I learned out of (the edition prior to their APL2 edition).

    • xelxebar 8 hours ago

      I came to APL slowly, originally motivated by some combination of fascination with the syntax and desire to break into the financial sector.

However, what got me to invest in earnest study was hitting the beginner's wall and realizing that I had no idea what Iverson was on about with his design principles.

APL is really different these days, as far as I hear. Dyalog is the only vendor actively working on the language, and the old hats tell me that things like dfns, trains, and various operators make modern APL quite different from the APL of even just 15 years ago.
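      (For anyone who hasn't seen them, a minimal sketch of two of the features mentioned, dfns and trains, both computing an average; these are standard Dyalog idioms, not code from this thread:

            mean ← {(+/⍵)÷≢⍵}   ⍝ dfn: anonymous function, ⍵ is its right argument
            mean 1 2 3 4        ⍝ 2.5
            avg ← +/÷≢          ⍝ the same function written as a tacit fork (train)
            avg 1 2 3 4         ⍝ 2.5

      Neither form existed in the classic APLs of the mainframe era.)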

  • ralegh a day ago

    Could you give some examples of where you're using it?

    • xelxebar 8 hours ago

      My YAML loader[0] is where I first broke through the wall. It's still languishing in a relatively proof-of-concept state but does exhibit the basic design principles.

There's also a Metamath verifier that does parallel proof verification on the GPU. It's unpublished right now because the whole thing is just handwritten code in my notebook at the moment. Hoping to get this out this month, actually.

      A DOOM port is bouncing around in my notes as well as a way to explore asynchronous APL.

I'm also helping Aaron Hsu with his APL compiler[1] for stuff adjacent to my professional work, which I can't comment on much, unfortunately.

Et hoc genus omne (and everything of that sort).

[0]: https://github.com/xelxebar/dayaml

[1]: https://github.com/Co-dfns/Co-dfns

  • ogogmad 11 hours ago

    I'm thinking I'd like to learn array languages (APL, J) and maybe use them professionally. Maybe their time has come.

noosphr a day ago

Missing the (1970) tag, and the paper text.

3836293648 21 hours ago

It's one of those broken sites where you can't even access the text. And I am signed in; it just doesn't load the PDF.

boznz a day ago

Can't access the text, but it "sounds" very advanced for 1970. Gemini 2.5 didn't give me much about it, so I'm a little perplexed about its relevance.

  • polytely 21 hours ago

You can't imagine something being relevant because the AI doesn't know about it? That seems more like a fault of the AI, if you ask me. There is a huge amount of information that hasn't been, or cannot be, captured in the data LLMs are trained on.

ogogmad 12 hours ago

How does this compare to a modern GPU?

  • bear8642 10 hours ago

Reading the abstract, it seems like a precursor of some kind.