Saturday, March 31, 2007

A jumble of final Justice League comments

I finished off the last DVD of Justice League (Unlimited) recently, so I thought it'd be fitting to publish my final round of comments in one broad sweep. As usual, I can't accurately call a potpourri of miscellaneous observations a "review", but it's as close as I'll get to one. The 'Net isn't exactly suffering from a scarcity of media reviews. (I try to aim for unique, original content whenever possible - otherwise, it just feels pointless to add more noise to the signal.) Also, I freely admit that I'm not at all well-acquainted with comic books.
  • I almost can't express in mere words the amusement I've been getting from having Powers Boothe both on Justice League as the voice of the impressive Gorilla Grodd and on 24 as an aggressive VP/acting president. When I shut my eyes while watching 24, it's like Gorilla Grodd is threatening to bomb countries in retaliation for terrorism or snapping at subordinates. Come to think of it, Grodd acts more like a visionary leader than the 24 VP all around (except for a plot to turn all humans into huge, hyper-intelligent apes like him, *groan*).
  • Am I the only one who sometimes sees the fights on Justice League and wonders why a superhero doesn't just use power _____ to get something done, or how the holy hand grenade a villain was able to defeat a superhero so easily? And am I the only one who has noticed that "Martian Manhunter" should pretty much be able to throweth the smackdown on anyone? Aside from being a remarkably gifted telepath, he has superhuman strength and durability, flexible shapeshifting, the ability to alter the solidity of his form, and flight. In fact, he has a wealth of options available to him in any situation. I'm a little annoyed that such a character, the very definition of a powerhouse, so often seemed to fill a supporting role.
  • OK, I suppose it's silly to keep harping on the unbelievability of a superhero cartoon, even one that's so careful to keep itself somewhat grounded, but I just need to say it: the ultimate superpower is incredible luck. This matters even more for the heroes without superhuman powers. No matter what trap a hero is in, some avenue of escape happens to be there, or some convenient environmental feature can be turned against the antagonist. Not to mention that a hero is always prepared - not for the situations that didn't transpire, but for the ones that did. Batman is always right. "Wait a moment while I grab the antidote for this exact poison out of one of my belt compartments..."
  • Was AMAZO in the final episode? That would have come in handy, wouldn't it?
  • Lex Luthor, no more than a human, rose to the top as the prime villain as Justice League continued. I like that. There's something poetic about an arch-nemesis having essential flaws. I once wrote a story whose main villain was a peasant sorcerer - someone for whom magic was a means to class warfare. Voldemort has a non-wizard parent. Of course, the problem is that having Lex lord it over a roomful of superpowered goons, no matter how omniscient he is, stretches my suspension of disbelief to a degree Elongated Man can't hope to match. (Work, imagination, work!)
  • One quality of Justice League that bears repeating is its humor. The show kept itself from slipping into camp, which is an ever-present danger when mixing dashes of humor into incredible premises, but it nevertheless allowed its characters to be themselves, which includes expressing their senses of humor, dry or flashy. Listening to someone quietly sing a pop song to himself while throwing a hefty object through a glass door is one of those little comedy bits whose effect is simultaneously brilliant and hard to explain.
  • Episodes about reincarnated lovers, specifically lovers in Egypt, don't work for me after sitting through a viewing of Mummy 2.
  • To display my personal bias again, I found it irritating how Vigilante and Shining Knight steadily became more prominent. Their voices are great (Nathan Fillion's delivery is so much fun), but to me they're stupidly unoriginal concepts for heroes.
  • The theme for Unlimited makes me cringe. I like some uses of electric guitar, more or less depending on my mood, but my common practice was to mute the player when it was on the DVD menus. The title sequence and end credits I either muted or skipped.
  • Partly because I watched the series on DVD, I didn't mind continuing storylines at all; I like a show with stories too large to fit into one or two episodes. Characters and their relationships should change over time. I have more respect for heroes and villains that undergo serious damage, physically and emotionally, and then cope. Something minor like Hawkgirl not wearing her alien garb anymore has deeper meaning.
  • My overall opinion is quite positive. As with Batman: The Animated Series, not every episode is great, but the ones that are stick with you and demand repeated viewings. If only to justify the cost of purchase...

Friday, March 30, 2007

IRC channel as mutual admiration society

This entry by "somian" from the use.perl.org feed is a fascinating read. It analyses his experiences on #perl on Freenode. I've never been there, although I did pop in on #gentoo once, when I had a vexing question for which RTFM wasn't possible because I didn't know which FM T R. Anyway, somian's thoughts apply to many other online chats and forums, even though I'm not familiar with his specific situation.

On one hand, I see what he means. I've felt similar disappointment with the "groupthink" I've found - especially annoyance at the assumption that my nerd mojo somehow implies that I hold a whole set of opinions or behave a particular way.

But the tendency of people in a group to mercilessly attack outsiders is a deep, human-naturey thing (see my post on mockery). Part of the reason people form groups at all is some common thread between them. Frankly, they didn't form the group to talk (in a non-dismissive way) about people who are different from them - they formed the group to talk about their interests.

Practical upshot: nerds aren't immune from mob mentality, regardless of whether they happen to be "right" about something or not.

Saturday, March 24, 2007

Schematron

Schematron is one of those bits of technology, like the indoor grill, that people may not realize they need. Also like the indoor grill, there may have been times when they recognized the deficiency of the alternatives and wished for some unspecified better way. Schematron is an XML schema language whose basic concept is a set of XPath-expressed rules or patterns for validating a document. Each rule matches some element, then tests a set of related assertions in the context of that element. It appears to be less a full replacement for other schema languages than a complement to them: existing schema languages essentially say "accept documents that look like this", whereas in Schematron each individual fact about valid document structure is a separate, explicit assertion. If anyone needed motivation to better learn XPath (which I like to think of as "advanced regular expressions for XML"), Schematron is it.
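To make the rule-and-assert idea concrete, here is a minimal sketch of what a Schematron check looks like. The schema and document are invented, and I'm assuming lxml's libxml2-backed Schematron class for the validation step:

```python
from lxml import etree

# A tiny, invented Schematron schema: one pattern with one rule and two assertions.
SCHEMA = b"""
<schema xmlns="http://purl.oclc.org/dsdl/schematron">
  <pattern id="book-checks">
    <rule context="book">
      <assert test="title">A book must have a title.</assert>
      <assert test="@id">A book must carry an id attribute.</assert>
    </rule>
  </pattern>
</schema>
"""

DOC = b"<library><book id='b1'><title>Dark City</title></book></library>"

schematron = etree.Schematron(etree.XML(SCHEMA))  # compile the rules
print(schematron.validate(etree.XML(DOC)))        # True only if every assert holds
```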

Schematron has gone through standardization, for those concerned with interoperability. Note that ISO Schematron is separate from the other, pre-standard versions out there. There is a beta release of ISO Schematron that is a reference implementation, though not officially so. This implementation takes the remarkable approach of transforming a Schematron document, through XSLT, into an output XSLT document. Then the output XSLT performs Schematron validation by transforming any XML document into validation output. XSLT produces XSLT. Another intriguing part of the implementation is that it is a "skeleton" with templates meant to be overridden for producing output. So the same skeleton XSLT has been included in an XSLT that outputs plain text, an XSLT that stops on the first validation error, and an XSLT that outputs validation results in XML (naturally), the SVRL language. I should mention that there are other, platform-of-choice implementations of Schematron, too.
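If the two-stage approach sounds stranger than it is, the pipeline amounts to something like the following sketch (Python with lxml; the filenames are placeholders I made up, not the actual names in the skeleton distribution):

```python
from lxml import etree

# Stage 1: the skeleton stylesheet compiles a Schematron schema into a validator stylesheet.
# "skeleton.xsl", "rules.sch", and "instance.xml" are placeholder filenames for this sketch.
skeleton = etree.XSLT(etree.parse("skeleton.xsl"))
validator_xslt = skeleton(etree.parse("rules.sch"))   # XSLT in, XSLT out

# Stage 2: the generated validator turns any instance document into a validation report
# (plain text or SVRL, depending on which variant of the skeleton produced it).
validator = etree.XSLT(validator_xslt)
report = validator(etree.parse("instance.xml"))
print(str(report))
```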

It goes without saying, but Schematron may also be the best name for XML technology, ever.

Unrelated Babble: I'm sure others have said this, but have you noticed how Lost and Heroes have a yin-yang thing going on? Heroes is about incredible people in ordinary surroundings. Lost is about (to some degree) ordinary people in an incredible situation/setting. For a while, Heroes had the comic-booky downfall of feeling plot-driven and hokey. For a while (pre-hiatus), Lost had the melodramatic downfall of running in place while characters develop their relationships but not much else happens. But quite recently, Heroes has probed more into its characters' motivations (did you see the episode with the story of Claire's adoption?) and Lost has had its characters accomplishing more. To do well, these shows need to balance up (out?) the yin-yang.

Sunday, March 18, 2007

sneakernet in miniature

I was starting to transfer a largish media file from my primary computer to my MythTV computer over my home network (which consists of one cheap hub and some cables), when I realized it might be faster to copy the file to one of my USB thumb drives and carry the drive over to the MythTV computer.

I started to transfer the file onto the thumb drive, then realized that while it was going faster, it wasn't really fast. In addition, I would need to transfer the file off the thumb drive and onto one of the internal drives in the MythTV computer, to ensure speedy access for viewing.

Finally, I decided to switch to burning the file to a CD (the file wasn't that large). The burner's so quick that it took no more than a few minutes. Having the file on disc meant that I didn't even need the copy on my primary computer, so I removed the file after burning. The MythTV computer could play the video straight from the disc, which meant it didn't take up any space on the MythTV computer either. Transferring by CD was a solution that was fast (only the time to burn the file to CD), cheap (I bought my current stack of blank discs so long ago I actually don't remember what the cost was), convenient (this tech is well-established and well-supported all around), and storage-efficient (as long as CD-reading is snappy enough, the data only needs to take up space on one medium - given that the data's not vital in any sense). What's AppleTV?

A still better solution might have been to stream the file from one computer to the other over the network, but the amount of time/effort required to set it up would be tedious. Or would it? Speaking out of my laziness, I simply don't know, although I have heard good things about VLC.

Tuesday, March 13, 2007

conservation of information

I've been working on a project to rewrite an existing piece of custom software for a different platform, as part of a general move from one technology stack to another. (I've written some of my impressions about learning ASP.Net - as well as playing around with F#.) I'm living the truth that whenever a total rewrite happens in software development, information is (should be) conserved - it just changes shape. XML information can become JSON information, and this transformation can even be performed, with some caveats, via an XSLT file; look at the bottom links on http://json.org . In the original, widget X may convey some particular scrap of data, while in the rewrite widgets Y and Z together convey the same scrap. Code that's quite long in one version may become just a few smaller, more effective lines in another version, but the lines express the same algorithm. Decentralized but closely-related code in the first will hopefully be gathered together in the second, yet once again the end effect should be identical. I feel a little like a conspiracy theorist mumbling "See? See? Underneath it all, everything's connected! You're distracted by appearances, but the underlying motivations are what matter! Follow the data! Follow the data!"
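As a trivial illustration of "same information, different shape", using only Python's standard library (the element and field names are made up):

```python
import json
import xml.etree.ElementTree as ET

XML_IN = "<widget id='x42'><label>Record</label><width>120</width></widget>"

# Reshape the XML into an equivalent JSON object: nothing added, nothing lost.
elem = ET.fromstring(XML_IN)
as_dict = {"id": elem.get("id"),
           "label": elem.findtext("label"),
           "width": int(elem.findtext("width"))}
print(json.dumps(as_dict))   # {"id": "x42", "label": "Record", "width": 120}

# And back again, to make the point that the mapping runs both ways.
round_trip = ET.Element("widget", id=as_dict["id"])
ET.SubElement(round_trip, "label").text = as_dict["label"]
ET.SubElement(round_trip, "width").text = str(as_dict["width"])
print(ET.tostring(round_trip, encoding="unicode"))
```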

Conservation of (program) information is a widely-applicable principle. I knew I'd read something similar, but in that case it was called conservation of complexity. Kurt Cagle wrote it a while ago, and Rick Jelliffe, quoting him from a mailing list, gave the idea a name: Cagle's Law of Constant Complexity, which just says that complexity can be moved around, into libraries or standards/specs for example, but not eliminated.

The task of moving complexity around reminded me of a Zed Shaw blog rant about indirection vs. abstraction that was responded to on "discipline and punish". According to the rant, indirection is for achieving flexibility, extensibility, and replaceability in code; abstraction is for achieving simplicity in the code's interface. Layers of indirection can lead to complex code, so indirection is not abstraction. According to the response, the distinction between indirection and abstraction "is a bit shaky", because indirection can also hide messy implementation details, and abstraction is for concretely constraining the problem domain rather than creating simplistic (leaky) generalizations. That is, an abstraction defines what can be done with any file-like entity, like open()-ing and read()-ing, as opposed to generalizing a file to a directory entry - which is less helpful, because chdir() works on the directory entries that are really sub-directories and fails on the ones that are really files.
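The file example might look something like this in Python - just a sketch, with invented names, of the "constrain the domain" flavor of abstraction:

```python
# Abstraction: constrain the problem domain to "things that can be read",
# and say nothing about directories, paths, or chdir().
class StringSource:
    """An invented file-like object: all it promises is read()."""
    def __init__(self, text):
        self._text = text
    def read(self, size=-1):
        return self._text if size < 0 else self._text[:size]

def shout(source):
    # Works on anything honoring the file-like abstraction: real files,
    # sockets wrapped as files, StringSource, ...
    return source.read().upper()

print(shout(StringSource("dark city")))   # DARK CITY
with open(__file__) as f:                 # a real file object fits the same abstraction
    print(shout(f)[:20])
```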

Conservation of information means indirection and abstraction can't eliminate the information implicit in code, only move it out of the programmer's face. To take a well-known example (since Java is the lingua franca), someone performing calendar work may fiddle with a GregorianCalendar even though he or she may have initially thought a Calendar would do the trick. Hypothetically, if this separation did not exist but the API could accomplish the same tasks, and a GregorianCalendar was just called a Calendar, the information for Gregorian calendar processing would still be there; it would just be "embedded" in Calendar. The fact that the programmer doesn't need to write some code because he or she can utilize a library doesn't mean the code isn't there, just that it's hidden. By the way, there's a JSR to put a Joda-Time-inspired API in plain Java.
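A toy Python analogue of the point (the class names here are hypothetical, not any real library's API):

```python
# Hypothetical: the calendar-system details don't vanish when the API is simplified;
# they're just embedded where the caller no longer sees them.
class Calendar:
    """The only type the caller is shown."""
    def __init__(self, year, month, day):
        self._impl = _GregorianRules(year, month, day)   # hidden inside

    def is_leap_year(self):
        return self._impl.is_leap_year()

class _GregorianRules:
    """Where the Gregorian-specific information actually lives."""
    def __init__(self, year, month, day):
        self.year, self.month, self.day = year, month, day

    def is_leap_year(self):
        y = self.year
        return y % 4 == 0 and (y % 100 != 0 or y % 400 == 0)

print(Calendar(2000, 3, 4).is_leap_year())   # True
print(Calendar(1900, 3, 4).is_leap_year())   # False
```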

For actual information conservation, I advise you to keep it away from black holes. Depending on who you talk to, it may be lost forever if it gets too close.

Thursday, March 08, 2007

lightweight bout

Hey, remember when Spring was lauded as being the light/agile/simple/pick-your-buzzword alternative to vanilla J2EE/JEE technology? Oh, right, that was just yesterday. My RSS blog feeds did a lot of pointing at Guice, which apparently (I haven't used Guice) does simple Spring tasks better than Spring, although the linked comparison says the two are interoperable. So you can possess your frosted baked good and consume it concurrently.

Bonus link: I can't resist sharing this, because no one should miss out on learning the SOA facts.

Sunday, March 04, 2007

Dark City addendum

One observation I forgot to include in the Dark City post was that I finally understand what people were referencing when they made little hand waves and said "sleeeep" as a joke. That gesture-command happens a lot in this movie, several times in one scene! As special powers go, it's not too shabby, either.

It kind of sucks that I've caught on after the fad passed. Ever seen Colin do that to Ryan when Ryan was floundering on episodes of Whose Line Is It Anyway (US)?

Dark City

I have a new entry in the "movies I saw for the first time on cable" category, alongside Fargo: Dark City. By writing anything about it, I'm probably expected to mention the Matrix, but I won't and you can't make me! More generally, Dark City's similarities to other movies I like reduce it in my eyes, because those other movies were more enjoyable. Minority Report has dreamlike precog visions in place of Dark City's dreamlike memory flashes, not to mention a major character afflicted by doubts about his innocence. A darkly-styled city appears in many other movies (is Sin City disqualified because the entire movie is incredibly black anyway?). Seeing people hover and emit mental blasts in this movie isn't terribly exciting if the viewer (me) has seen X-Men. The bar for people merely acting loopy was set pretty high by Twelve Monkeys. Ignoring the M movie, trapping people in constructed realities was handled with style by the "Perchance to Dream" episode of the Batman animated series and "Legends" of the Justice League series. Someone having memory trouble is pretty much the thrust of Memento. Freaky guys, even kids, with ghostly skin? Check the horror aisle.

Saying that a movie is in good company doesn't necessarily indicate whether it's entertaining, just that it's overshadowed. Some of the ingredients of this movie predisposed me to want to like it. Kiefer Sutherland is so ingrained in my mind as Jack Bauer that I get a strange pleasure out of seeing him in anything else, because it doesn't feel right (the Sentinel doesn't count). In Dark City he's a batty scientist with a halting speech pattern. He bugged me at first, but then I started to like him, maybe because he's off-screen for a few stretches of time and I felt the absence - to me he was the most interesting character in the movie. The others lacked personality, or if they had personalities I failed to be intrigued. For the Strangers, this is to be expected. For Jennifer Connelly, not so much. Then again, for Connelly it may be expected. I can't say for sure, because I'm a poor judge of acting. In her case, I'm hopelessly biased, too, on account of the (mostly harmless! really!) fascination I have with watching her, a fascination that's similar to the Eliza Dushku fascination I mentioned elsewhere. It doesn't help that she's in three off-kilter and critically-despised movies that I inexplicably like: Hulk, Labyrinth, Rocketeer. Her presence hardly lifted this movie, but she did fine with what she was given (I guess).

Indeed, Dark City felt weighed-down and dreary to me, which may have been its intention. Blame the great disparity between the abilities of the Strangers and the humans, blame the humorlessness, blame the well-justified despair a dark city has, blame the nasty-looking knives the Strangers pull out at the drop of a hat. On the other hand, it is a fine-looking movie. Fine lighting, fine compositions of the shots, pretty good effects. Perhaps I'm too happy to like this movie. Eh, it was worth the time to see it on cable.

Saturday, March 03, 2007

comparative roundup of FP paradigm aspects

Functional programming is a broad topic. When I've read about it, the context has often been either its applications in a specific language, longish screeds about abstract concepts (Church numerals?), or teachings about other subjects entirely (Scheme, I'm looking at you!). But the important bits of FP, the parts that people are probably referring to when they advocate "learning about FP just to expand mental horizons", are independent of that stuff. As I understand it, FP is a way of structuring a program; it's a paradigm. Both to organize my own thoughts and to help dissolve FP's perceived mystery, here is a roundup of FP's answers to fundamental programming concerns, with comparisons to others' answers to the same concerns. Nothing new or detailed here (go elsewhere for that), but I hope the conciseness and comparisons could: 1) aid in quicker understanding of what makes FP different, 2) serve as a very-high-level introduction, 3) answer the honest question "how is FP applicable to real programming problems again?". I would have appreciated something like this when I first started reading up on it.
  • Programming metaphor. FP's metaphor seems to be math. People in FP talk about "provable" programs. Functions in FP are more like mathematical functions that always map the same arguments to the same result, rather than procedures that happen to return a value. In contrast, OOP's metaphor is objects passing messages, and (high-level) imperative programming's metaphor is...well, a simplified computer architecture, in which memory locations (variables) have names and operators can be used to represent the end goal of several machine instructions. Declarative's metaphor is a document of assertions written in terms of the problem domain.
  • Code modularization. Code is easier to handle when the humble programmer can divide a program into parts. Of course, FP uses functions for this, but its functions are values with types, just like any other program entity ("first-class"). OOP uses objects. Imperative uses functions/procedures that group a series of statements, but those functions have no presence in the code aside from being targets of flow control. Declarative might use an 'include' to designate one document as a part of another document. In practice, no matter the paradigm, there's a higher level, the module or package level, for collecting several units of modularized code into a single reusable chunk or library.
  • Localized code effects. Closely related to code modularization. Code should not have much of an "action-at-a-distance" quality. Code units must be connected in some way, but this connection should be as minimal and well-defined as possible, to make unit changes less hazardous and the units' execution more easily understood. In particular, program state over there should not affect program state here. In FP (only the paradigm in the "pure" abstract is under discussion here), there is no modifiable program state, because that might cause functions to not always map the same arguments to the same value. In effect, an FP program is a series of independent, generalized data transformations. FP accomplishes iteration by recursion, but when a function ends by simply returning the result of another function call (a "tail call"), the compiler/interpreter can save space by tossing out the intermediate call frame (cutting out the middle man whose only remaining purpose is forwarding the result) and returning the tail call's result directly to the function that called the intermediate function in the first place. In OOP, code's effect is localized by encapsulating or hiding data inside objects' private scopes. Imperative has local function variables.
  • Deferred code execution. Often, code needs to be manipulated but nevertheless deferred in execution, such as handler code that must be connected up to an event-processing framework (like a GUI toolkit). In FP, functions are values, so functions can be passed as arguments to other, "higher-order" functions. An object in OOP would send a message to another object to identify itself as a Listener or Observer of that object's events. Or, alternatively, an object might always delegate or forward certain messages to other designated objects. More generally in FP, if one wishes to delay the execution of a particular operation at runtime, one way is to construct a no-argument function, known as a "thunk", around the operation, and then pass the thunk around until the operation is really needed, at which point the code just runs the thunk and performs the operation.
  • Reconfiguring code. To avoid redundant effort, code should be reusable. In some happy cases, the code can be reused as-is because the requirement is the same (actually, the problems themselves can sometimes be mapped onto each other to accomplish this - one could sort a set of expensive-to-copy data by creating a parallel array of references to the data and then sorting the reference array with any code that can sort arrays, comparing by dereferencing the values). But many times a requirement will be similar but distinct, and the code must be reconfigured to be reused. The easiest solution is to add a parameter to the code. For FP-style functions, which are values with types of their own, this can take the form of currying: evaluating a function with an incomplete number of arguments returns not the function's result but a new function that takes the remaining arguments. Currying add(a,b) by evaluating add(3) would return a new function that could be named add3 - a function that returns the result of adding 3 to its single parameter (see the sketch after this list). Moreover, the fact that functions can be parameters to functions means that FP makes it easy to create "light-weight frameworks": the parts of a generalized function or "framework" that vary can be smaller, distinct functions that are passed to it like any other parameters and then called within its body. In OOP, a slew of patterns might apply to reconfiguring code for a new situation: Adapter, Mediator, Decorator, Strategy, among others. Declarative might have a technique known as an "overlay" for merging a new document fragment with another named fragment.
  • Metaprogramming. One of the great epiphanies of programming, one that sets it apart from other disciplines, is metaprogramming - programs manipulating programs. Here, comparing programming to writing would imply a book that could write books, and comparing programming to construction would imply a house that could build a house (although I suppose one could say metaprogramming is like building a robot that can build robots - er, Skynet?). FP's enhanced notion of functions pays off greatly in this context, because functions can now create other functions. "Lambda" is the traditional name for the magic function, keyword, or operator that creates a new function (value) from a code block and parameter list. Any outside variables referenced by the new function in lexical scope are captured at creation time and "ride along" with the function - this is a closure, because it "closes over" the scope. Since closures capture variables, they pertain more to "impure" FP than to FP per se. In OOP, metaprogramming support refers to constructing new original objects (as opposed to instantiating an existing class), especially when the OOP is prototype-based rather than class-based - class-based OOP can still do it if the language supports metaclasses. Metaprogramming can also be done with macros/templates within any paradigm, but unless this facility is part of the language (like the Lisp-y ones), the macros may be part of a messy pre-processing step that is notoriously tricky.
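For the curious, here is a minimal Python sketch of a few of the tools named above - a higher-order function, a thunk, currying by partial application, and a closure. Python isn't a pure FP language, so treat this as a rough analogy rather than the real thing:

```python
from functools import partial, reduce

# Higher-order function: a function passed around like any other value.
doubled = list(map(lambda n: n * 2, [1, 2, 3]))            # [2, 4, 6]

# Thunk: wrap an operation in a no-argument function to defer it.
def expensive():
    return sum(range(10**6))
# ...pass `expensive` around freely; call it only when the value is needed:
total = expensive()

# Currying add(a, b) by fixing its first argument
# (Python doesn't curry automatically; functools.partial approximates it).
def add(a, b):
    return a + b
add3 = partial(add, 3)
print(add3(4))                                             # 7

# Closure: the returned lambda "closes over" factor.
def make_scaler(factor):
    return lambda x: x * factor
print(make_scaler(10)(5))                                  # 50

# Iteration expressed as a fold over the data rather than a mutating loop.
print(reduce(lambda acc, n: acc + n, [1, 2, 3, 4], 0))     # 10
```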
Yow. I was hoping this would turn out to be smaller as I was writing it. I haven't even covered any real FP techniques or patterns; I've only talked in generalities about the tools, and not examples of how to effectively use them! There's no mention of concepts related to/enabled by FP, like lazy evaluation or pattern matching! The breadth and longevity of the topic have foiled me again!

Note that a paradigm is mostly independent of a type system, and also of the question of compilation vs. interpretation vs. JIT from bytecode. Comparing paradigms is about how each one enables differences in coding style, not how the paradigms and code happen to be implemented. Comparing languages and language implementations, well, that's a different question.

I should also point out that FP, or any other paradigm, blends with other paradigms in the wild. A common tactic, one that has the virtue of being more familiar to the OOP-trained, is to make an FP-style function an object that happens to be callable with arguments. A different tactic, one that may have advantages in underlying flexibility for esoteric tricks, is to use a closure like an object, where the variables in the closure's scope act like an object's scope. Yet another tactic is to rely on conventions, practices, and libraries to use one paradigm like another, as with C code that uses structs and functions for object-oriented "lite" - in its early stages, C++ worked by translating C++ code to backend C code. The quite permeable boundaries between paradigms are why knowledge of FP can come in handy in contexts outside languages like Haskell.
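Both of those first two tactics fit in a few lines of Python (again, only a sketch):

```python
# Tactic 1: an FP-style function as an object that happens to be callable.
class Adder:
    def __init__(self, n):
        self.n = n
    def __call__(self, x):
        return x + self.n

add3 = Adder(3)
print(add3(4))         # 7 - used exactly like a function

# Tactic 2: a closure used like an object, with the closed-over variable as its "field".
def make_counter():
    count = [0]                      # a one-slot box standing in for instance state
    def increment():
        count[0] += 1
        return count[0]
    return increment

tick = make_counter()
print(tick(), tick())  # 1 2
```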