I finally realized why I prefer tacit programming: I'm lazy and impatient.
Sure, tacit code has all sorts of nice properties (in my case, the most salient being that it's easily generated and manipulated by other code), but I'm a tacit programmer because that's how I learned J.
The key insight is the J session. I guesstimate that one person in a million has less patience than I do. I love the immediacy of interactive programming. I require the feedback cycle: Try something, fail, make a small change, try again, fail, and so on.
But the speed of the feedback cycle comes at the cost of its upper bound: one line. I write tacitly because 90% of the lines of J I've written were typed into the J session first. Tacit definitions fit onto one line. Explicit definitions do not (generally). The fact that they're broken up onto several lines is often lauded as a key advantage over tacit definitions.
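To make the one-line contrast concrete, here is the standard textbook mean example (my illustration, not one from the post): the tacit fork fits on a single session line, while the explicit equivalent conventionally spans several lines in a script.

```j
NB. Tacit: a one-line fork, typed straight into the session
mean =: +/ % #

NB. Explicit equivalent: a multi-line definition, written in a script
mean2 =: 3 : 0
(+/ y) % # y
)
```

Both compute the arithmetic mean of a list; only the tacit form can be iterated on line-by-line at the session prompt.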
Of course, you could emulate explicit definitions in the session, via assignments. But assigning nouns with the intent of converting the resulting code to a script is a painful and error-prone process. Locals shadowing globals, locale mischief, forgetting to persist the code that assigned a particular noun (easy to overlook because the name already exists in this J instance), picking the wrong line to persist because the J session is not a log, etc. I prefer my code context-free.
So session-programming (tacit programming) complements my style and personality. It has made me significantly more productive. I solve "hard" problems in hours ("hard" as defined by my colleagues). Often, I don't even have to think in advance about what the right thing to do is; I try a bunch of things and let the interpreter tell me which one is correct 1].
I've been known to take that approach to the next level by auto-generating thousands of functions which I expect to solve the problem, generating a characteristic dataset, selecting the fastest correct program, and copying it into my script, without ever analyzing this now-canonical solution. Tacit code and the mindset it fosters make that easy, too.
Of course, this immediacy has its drawbacks, too. For one thing, it's addictive. J is now, more-or-less, the only programming language I use. I used to be fluent in, literally, a dozen. I was hyper-proficient in Perl. But no other language I already know has an interactive shell. Nor are they terse enough to make one useful if they did.
So even when I know a solution would be easier and faster in Perl (a lot of what I do is string processing), I still barrel my way through it in J, because without a shell I feel like I'm groping in the dark. J is now my golden cage. And the problem only gets worse with time, as I get better at J while my other fluencies wane.
I also recognize that explicit definitions are easier to read and maintain. Even for me. I am reminded of this every time I try to break a 1000-character tacit definition, constructed iteratively in the session, into its components 2]. Explicit definitions can also be faster, in particular when large intermediate nouns need to be used more than once to create the final result. Or when one can take advantage of the special code for assignments. (But certainly not when you've got 3 : 0"1 or some such.)
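A toy example of the intermediate-noun point (my own sketch, not from the post): computing each item's deviation from the mean, relative to the mean. In the tacit fork the mean appears twice and is computed twice; the explicit version assigns it to a local noun once and reuses it.

```j
NB. Tacit: the fork (+/ % #) occurs twice, so the mean is computed twice
relDev =: (] - +/ % #) % (+/ % #)

NB. Explicit: the mean is assigned to a local noun and reused
relDev2 =: 3 : 0
m =. (+/ y) % # y
(y - m) % m
)
```

For a large argument, the explicit form saves the second pass over the data.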
I see it as a sign of maturity in myself that I've started writing more explicit code. But only because I'm confident I've learned the lessons tacit programming has to teach me. It provides a different kind of insight. And any code I write in the future, be it tacit, explicit, or in Perl, will be tinged by that.
I could never have hacked it in the days when you had to flowchart a business process, type up your program on a typewriter, proofread it (REALLY carefully), hand it to some operator to convert to punch cards, schedule time on a compiler, wait for your appointed hour, compile your program, schedule time on your target computer, wait for the appointed time, execute and debug your program. Wash, rinse, repeat. No way in hell.
A professor reported to me he once lost several days' time to an omitted semicolon. My mother says she once broke down and cried at work because she fixed a bug, re-ran the program -- and saw the same erroneous results. She'd forgotten to recompile. The next window for a re-compile/run was past her deadlines. Then there are the fun "I dropped the card drawer" stories. Maybe that's where the term "program crash" comes from.
Sing along with me: CTRL+Up, CTRL+Left, ')', home, '@:(', home, ...
Originally posted in the "Function programming" thread on the Forums.