Location: Bayesian Enhanced Strategic Trading (BEST) in Hoboken, NJ
We looked at some examples of coding in J7, both in the HTTP server and the GTK version. We then looked at some of the difficulties of converting image-handling code from J6 to J7, and at an elegant technique for addressing a common, practical problem. Next, we examined some of the interesting and unexpected ways J handles derivatives and integrals. We digressed from these specifics to more general musings about what characterizes robust versus brittle systems, and about how debugging still relies crucially on the insight of a developer who understands the code, especially in light of the complications introduced by parallelism. Finally, we considered a pair of sophisticated algorithms: one with a J implementation that looks as though it could be better adapted to the language, and another not yet implemented but laid out for consideration of what might be good to have available.
Meeting Agenda for NYCJUG 20110208
----------------------------------
1. Beginner's regatta: simple J HTTP Server example: see "HelloWorldInJHS.pdf".
   Example GTK app: see "MinesweeperConfigAndCode.pdf" and "Other GTK Minesweeper Examples.pdf".
2. Show-and-tell: converting from J6 to J7 for image-handling: see "Handling Images in J7.pdf".
   Solving a common, practical problem in an unintuitive (?) way; see "Veeder Counter.pdf".
3. Advanced topics: see "Derivatives and Integrals in J.pdf".
   Robust technology and debugging parallel applications: see "Robust Versus Brittle Technology.pdf" and "PhotoshopDevelopersInterviewInCACM.pdf".
4. Learning, teaching and promoting J, et al.: some progress in clarifying, elucidating and simplifying an implementation of the Levenberg-Marquardt Algorithm: see "Levenberg-MarquardtAlgoUpdateOnWiki.pdf".
   Another candidate: see "Crank-Nicolson Method for solving heat equation.pdf".
This example of a minesweeper game provides some good examples of interactive graphics using GTK in J.
We talked about how good Cliff Reiter's book (Fractals, Visualization, and J) is and what it would need in order to be updated so that the examples work in the current version of J across all platforms. This e-mail exchange brought up some ideas.
We also discussed the techniques here for renaming files with embedded numbers so that they alphabetize in numeric order.
(I just discovered this introduction to calculus and J (pdf) by Ken Iverson.)
Brushing up on some calculus recently, I wondered how I could use J to figure out integrals symbolically. Then it occurred to me that if d. 1 gives me the first derivative and d. 2 gives me the second derivative, then, logically, d. _1 should give me the negative one derivative which should be the integral. I tried this and it worked! I was pleasantly surprised, though it may actually be documented.
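The idea can be checked on a simple power function. The following session is my own sketch (not from the original notes), assuming d. _1 behaves as described:

```j
NB. Sketch (my own check): if d. _1 is the antiderivative,
NB. then ^&3 d. _1 should behave like x^4 % 4.
F =: ^&3 d. _1           NB. antiderivative of f(x) = x^3
F 1 2 3                  NB. compare with the next line
(4 %~ ^&4) 1 2 3         NB. x^4 % 4 at the same points
```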
I looked up some basic derivatives and integrals, as seen here, from this set of calculus notes by Paul Dawkins:
Playing around with these a little showed me this in J:
   ^ d. 1               NB. First deriv of ^ is itself
^
   ^. d. 1              NB. First deriv of ln is reciprocal
%
   ^ d. _1              NB. First integral of ^ is itself
^
   ^. d. _1             NB. First integral of ln(X) is (X * ln(X))-X
(] * ^.) - ]
   (^. d. _1) d. 1      NB. Deriv of above should be reducible...
(^. + ] * %) - 1"0
It looks like this last one, the derivative of the integral of ^., could be symbolically reduced further, though I may be missing a subtlety. The rightmost fork inside the parentheses - (] * %) - roughly, "something times its reciprocal", should reduce to "1". However, this fork does allow for the possibility of dyadic use, in which case the "reciprocal" becomes "divides". I'm not sure if this is a relevant possibility as conventional mathematics focuses on monadic functions for the general case of "f(x)".
However, if my reasoning is correct, then it looks like (^. + ] * %) would simplify to (^. + 1:). Further simplification of (^. + 1:) - 1"0 looks like it should reduce to plain ^. which is what we would expect for the derivative of the integral of this verb. However, my calculus is rusty and I may be overlooking something that belies this simplification.
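Short of a symbolic proof, one way to gain confidence in this simplification is a numeric spot-check; the following is my own sketch, comparing the composite verb against plain ^. at a few sample points:

```j
NB. Numeric spot-check (my own sketch): the derivative of the
NB. integral of ^. should agree with ^. itself at sample points.
f =: (^. d. _1) d. 1     NB. J's symbolic result: (^. + ] * %) - 1"0
(f ,: ^.) 1 2 3 4 5      NB. the two rows should match
```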
Learning, Teaching and Promoting J
We talked a little bit about what makes systems robust versus brittle after taking a look at the contribution to "Risks in Computing" by Paul Robinson - see the attachment "Robust Versus Brittle Technology".
Brain as Most Important Debugger
This interview with the developers of Photoshop in CACM has a number of interesting points, one of the most relevant to our recent discussions being the jump in complexity that accompanies parallelism.
The following quote from the interview illustrates both the immensity of this difficulty and how the solutions still require basic brain-power and individual insight. The individuals speaking here are Clem Cole ("COLE") and Russell Williams ("WILLIAMS").
A Long-Running Problem with Parallelism
While Photoshop comes with a unique set of problems and constraints, many of the engineering challenges it presents will undoubtedly seem familiar to any software engineer who has ever attempted to achieve parallelism in an application. Still, to get a handle on the issues Photoshop's engineers are facing today, we must first consider the application's history with parallelism over the past 15 years.
COLE: You've been writing software for a long time, and for the past 11 years you've been working with Photoshop and have become increasingly engaged with its parallel aspects. Which parts of that have proved to be easy and what has turned out to be surprisingly hard?
WILLIAMS: The easy part is that Photoshop has actually had quite a bit of parallelism for a long time. At a very simplistic level, it had some worker threads to handle stuff like asynchronous cursor tracking while also managing asynchronous I/O on another thread. Making that sort of thing work has been pretty straightforward because the problem is so simple. There's little data shared across that boundary, and the goal is not to get compute scaling; it's just to get an asynchronous task going.
I should note, however, that even with that incredibly simple task of queuing disk I/O requests so they could be handled asynchronously by another thread, the single longest-lived bug I know of in Photoshop ended up being nestled in that code. It hid out in there for about 10 years. We would turn on the asynchronous I/O and end up hitting that bug. We would search for it for weeks, but then just have to give up and ship the app without the asynchronous I/O being turned on. Every couple of versions we would turn it back on so we could set off looking for the bug again.
COLE: I think it was Butler Lampson who said the wonderful thing about serial machines is you can halt them and look at everything. When we're working in parallel, there's always something else going on, so the concept of stopping everything to examine it is really hard. I'm actually not shocked your bug was able to hide in the I/O system for that long.
WILLIAMS: It turned out to be a very simple problem. Like so many other aspects of Photoshop, it had to do with the fact that the app was written first for the Macintosh and then moved over to Windows. On the Macintosh, the set-file-position call is atomic - a single call - whereas on Windows, it's a pair of calls. The person who put that in there didn't think about the fact that the pair of calls has to be made atomic whenever you're sharing the file position across threads.
COLE: Now, of course, you can look back and say that's obvious.
WILLIAMS: In fact, the person who originally put that bug in the code was walking down the hallway one of the many times we set off looking for that thing, smacked his forehead, and realized what the problem was - 10 years after the fact.
Algorithms to Implement: Solvers
We talked about how important it is to have a "deep bench" of algorithms implemented in J. It's most satisfying to have a problem and be able to solve it by stealing code. Also, when people are searching for instances of algorithms, a J solution can be intriguing to someone who has not previously encountered the language. Additionally, a J solution could be a source of insight if it's sufficiently succinct to allow us to hold the whole thing in our head at one time. However, even if not initially comprehensible, the interactivity of a J solution provides a powerful tool for further understanding.
An existing algorithm with a non-J-like solution in J is the Levenberg-Marquardt Algorithm. This is on the wiki as a work-in-progress to polish it down to a J "Jem" from the raw ore of FORTRAN-like code which is the current implementation. The main progress so far has been getting the example to work properly - an important step. However, even this apparently correct working example gives results which differ slightly from those proposed as the correct solutions, so the unwieldy code remains a bit frustrating to work with. Having a surer set of test cases would be a big step forward.
Given the difficulty of refining non-J-like code, another approach is to implement an algorithm from a clear statement of it. That's the idea behind presenting the Crank-Nicolson method for solving a heat equation. Heat equations have found use beyond their original domains in Physics, most notably in Finance, so this could have far-reaching applicability.
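As a starting point for such an implementation, here is a rough sketch of my own (the names and parameters are illustrative, not from any existing wiki page) of Crank-Nicolson for the one-dimensional heat equation u_t = alpha * u_xx on [0,1] with zero boundary values. Each time step solves A u(t+dt) = B u(t), where A and B are tridiagonal:

```j
NB. Speculative Crank-Nicolson sketch (my own; not an existing wiki page).
n  =: 9                        NB. number of interior grid points
dx =: % n+1                    NB. grid spacing
dt =: 0.001                    NB. time step
alpha =: 1                     NB. diffusivity
r  =: alpha * dt % 2 * *: dx   NB. Crank-Nicolson ratio alpha*dt/(2*dx^2)
id =: =/~ i. n                 NB. identity matrix
od =: 1 = | -/~ i. n           NB. mask of sub- and super-diagonals
A  =: (id * 1 + 2*r) - r * od  NB. implicit (left-hand) operator
B  =: (id * 1 - 2*r) + r * od  NB. explicit (right-hand) operator
step =: (%. A) +/ .* B +/ .* ] NB. one step: u <- (A inverse) B u
u0 =: 1 o. o. dx * >: i. n     NB. initial profile sin(pi*x)
u  =: step^:100 u0             NB. advance 100 time steps
```

For this initial profile the exact solution decays by the factor exp(-alpha*pi^2*t), which gives an easy way to test the sketch; inverting A outside the loop is fine at this size, though a production version would want a proper tridiagonal solve.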