Thursday, December 20, 2007

peeve no. 254 is "argument from common sense"

"Common sense is the collection of prejudices acquired by age eighteen." -Albert Einstein (link)

In the midst of an argument, the most satisfying retort is exposing the opponent's reasoning as a fallacy, such as one of these. But the most frustrating reasoning to deflate is reasoning that's not even wrong: its truth value is meaningless, mu, zero (and not the way C conditionals judge zero). The "argument from common sense" is in this category of fallacy. When someone advances or defends an idea on the basis of what he/she calls common sense, no recourse is available. One could possibly gain some satisfaction by responding with "I find your abundance of faith in unexamined assumptions disturbing", but that still wouldn't be a real counterargument.

Real counterarguments to the argument from common sense would be meaningless, because the argument from common sense is meaningless, because common sense is meaningless in an objective debate. I don't mean that the concept of "common sense" is useless; rather, its very usefulness lies in its function as a catchall term for implicit knowledge. It doesn't have a fixed definition. Without sharp definitions, debates can spin endlessly in confusion, purely from misunderstanding of what each participant is communicating. (Not incidentally, this is why the legal documents of a society under the rule of law are careful to define all important nouns. If you use the GPL, you really should know exactly what a "derivative work" is.)

Moreover, a bare argument from common sense exudes sheer ignorance of observer bias and disregard for rigorously evaluating truth via proof--how much of common sense is the result of investigation, as opposed to culture and mindless conformity? As I've heard some people say, tongue in cheek, "common sense isn't common". A subset of common sense knowledge for one person, or even an entire group, might not be a subset of common sense knowledge for another person, or even another entire group. Is it common sense to use global variables sparingly, if at all? Is it common sense to indent code in a consistent manner? Is it common sense to change the default/root password? Is it common sense to finally cease using <font> tags, for Bob's sake? Maybe for some people, those procedures are so blindingly obvious that using the description "common sense" is pretty much an insult to the words. For other people, perhaps not...

The argument from common sense is not an argument. If trying to justify a statement by referring to common sense, find some other way. Look closer at what "common sense" really means in that specific case. At least make your assumptions explicit, so everyone knows you aren't using common sense as a logical hand-wave (no need to look here, it's common sense!). Common sense has done a lot for you. Don't push it into being something it isn't.

Tuesday, December 18, 2007

seriously, two movies based on the Hobbit?

According to the official Hobbit movie blog, two movies will be shot simultaneously, with Peter Jackson as one of the executive producers. Also, speculation is rather pointless right now, considering the hoped-for release dates are in 2010-2011.

Overall, this seems like good news to me. A good adaptation of Hobbit has been a long time coming. But I never thought two movies would be necessary. LOTR is usually sold as three books, and each book is dense. I've always considered Hobbit to be the "gateway drug" for LOTR: less intimidating, generally not as potent/complicated (look, fewer subplots!), and hints of what the heavier stuff is like. Looking back at how the adaptation worked for LOTR, massively cutting here and adding there until we had three long movies (longer if you opt for the extended DVDs), I'm worried about what will happen this time around, when the problem is reversed: not enough story for two movies. Will it turn out like King Kong?

Yet another problem will repeat itself for the screenwriters: the choice of which scenes belong in which movie. In retrospect, Two Towers may have been diminished because of the parts that shifted to Return of the King. I'm curious where the dividing line will fall for two Hobbit movies. The book's sense of scale shifts toward the end; onward from when Smaug appears, the book is suddenly no longer about just Bilbo and dwarves, but disagreements between peoples culminating in a Battle of Five Armies. Cut the story off too soon, and the first movie won't have much of a climax to speak of. Cut the story off too late, and the second movie won't have much plot left to cover at all. This should be interesting to observe.

Monday, December 17, 2007

why "Art Vandalay" writes Rippling Brainwaves

I was reading On The Meaning of "Coding Horror" (short version: the blogger's a coding horror because all professional coders are--and the best coping tactic is to be humble and teachable), when I followed a link to a past entry, Thirteen Blog Clichés. I agree with most of the list; fortunately, Rippling Brainwaves hasn't had many violations, although that's partly because its basis is unmodified Blogger (items 1, 4, 6, 13). Avoidance of randomly-chosen inline images (2), personal life (8), the topic of blogging (10), mindless link propagation (11), top-n lists (12) happened naturally because I try to publish using two fundamental guidelines: 1) only include content I would honestly find interesting if it was on someone else's blog (content Golden Rule), 2) only link elsewhere if I supplement that link's content with substantial and original commentary. The first exception to guideline two is when I judge that the link must be shared, but I don't have anything to add because the link is just that good or important.

However, the third item in the list, "No Information on the Author", needs further mention. The item's word choices are emphatic, so I'll quote in entirety.

When I find well-written articles on blogs that I want to cite, I take great pains to get the author's name right in my citation. If you've written something worth reading on the internet, you've joined a rare club indeed, and you deserve proper attribution. It's the least I can do.

That's assuming I can find your name.

To be fair, this doesn't happen often. But it shouldn't ever happen. The lack of an "About Me" page-- or a simple name to attach to the author's writing -- is unforgivable. But it's still a problem today. Every time a reader encounters a blog with no name in the byline, no background on the author, and no simple way to click through to find out anything about the author, it strains credulity to the breaking point. It devalues not only the author's writing, but the credibility of blogging in general.

Maintaining a blog of any kind takes quite a bit of effort. It's irrational to expend that kind of effort without putting your name on it so you can benefit from it. And so we can too. It's a win-win scenario for you, Mr. Anonymous.


These are excellent points excellently expressed. I hope the rebuttals are worthy:
  • Like any blogger, I'm always glad to discover that my precious words are connecting to someone rather than going softly into the Void. However, I think it's up to me to decide what I "deserve". Attribution to my proper name isn't one of my goals, because becoming a celebrity is not one of my goals. Crediting the blog and linking back is thanks enough, because that action has the potential to prevent future precious words from going softly into the Void. And by all means, be ethical enough to not plagiarize.
  • This blog is of my ideas, the titular Rippling Brainwaves. It isn't about (original) news. It isn't about (original) research or data, unless you count the anecdotal set of personal experiences I use for reference. The information in this blog, which is mostly opinions anyway, can and should be evaluated on its own merits. In this context lack of a byline should have no bearing on the credibility of the author, the writing itself, or "blogging in general". Here, the ideas reign or fall apart from the author. Think of it like a Turing Test: if a computer was writing this blog, or a roomful of monkeys at typewriters, etc., would that make any difference, in this context? Part of what makes the Web special, in my opinion, is the possibility that here, as in the World of Warcraft, everyone can interact apart from the prejudices and spacetime limitations implicit in face-to-face/"first life". Objectivity++.
  • Given that credibility is not in question, and I function as the writer of this blog, not the topic (item 8, "This Ain't Your Diary"), I'm not convinced that my identity is of interest. I don't mind stating that I am not a famous individual, I do not work at an innovative software start-up or even at one of the huge software corporations, I am not a consistent contributor to any FLOSS projects, and in almost all respects my often-uneventful but quietly-satisfying life is not intriguing. More to the point, I don't want anyone who reads to "color" my words with knowledge of the writer: don't you dare belittle my ideas because of who I am.
  • Probably the most childish or indefensible reason I have for being a "Mr. Anonymous" is anxiety about real-life repercussions from what I write, to both career and personal relationships. I also appreciate the freedom of expression which comes from disregarding such repercussions, and I believe uninhibitedness benefits the readers, too. Who wouldn't like the chance to say whatever one wants? I'm nicer in person, yet I'm guessing that's the typical case. It's rare to hear someone comment, "I don't have road rage. I only have hallway rage."
  • Lastly, I won't dispute the assertion that "Maintaining a blog of any kind takes quite a bit of effort". Perhaps I should be putting in more, at least in order to achieve more frequent updates. Currently, and for the foreseeable future, I will post as I wish, and not until I judge the post is worthy and ready. I started blogging for the same reasons as many others: 1) I enjoy doing it, 2) I feel that I (at times) have valuable thoughts to share, 3) I hunger for greater significance, 4) I wish to give back to the Web, 5) I like communicating with people who have common interests, 6) I relish correcting and counterbalancing what others write. Those reasons are how I "benefit" from the blog. Are those reasons "irrational"? Eh, if you say so. Either way, I can't stop/help myself. As I keep repeating, this blog isn't about me, my career, or my finances. Ad income wouldn't earn much anyhow, and even if it did, I wouldn't want to look like someone who's trying to drive Web traffic for profit. The blog is about what I'm compelled to publish. (And inserting ads would morally obligate me to use my real name. Hey, if you're planning to potentially profit off people, you should at least have the decency to tell them where the money's going.)
I'll end by saying that, notwithstanding the above, I'm certainly willing to consider revealing more about my identity. But what information, why, and how is it worth forgoing the benefits of anonymity?

Friday, December 14, 2007

peeve no. 253 is overuse of the word 'satire'

Here's a quotable line: "When someone uses pointed humor to discredit my opinions, it's vicious meanness. When someone uses pointed humor to discredit your opinions, it's satire." (I was inspired by Mel Brooks' "Tragedy is when I cut my finger. Comedy is when you walk into an open sewer and die."--link)

Regardless of the above, satire does have an objective definition, and that definition doesn't match up with some of the more common usages of "satire" I've been reading. I bet some of you same offenders have been misusing "irony" and "decimated". As I understand it (he yelled from his padded cell), satire is the timeless technique of using exaggerated humor in public discourse to starkly emphasize the ridiculousness of an opposing viewpoint. Satire is a part of language; it predates telecommunication, mass communication, photography, and practical electricity. Effective satire, especially satire which stands a minute chance of influencing others, requires a corresponding degree of intelligence and wit. A Modest Proposal is an excellent example. In politics, which is a subject I attempt to studiously avoid in this blog even in the face of an imminent presidential election year, satire is a welcome diversion from the typical "shout louder and more often" tactic. I respect satire.

Having described satire, I can be clearer about what satire is not. Admittedly, my idealization of satire may not perfectly fit the official/academic category of satire.
  • Satire is not synonymous with political humor, because not all political humor is satire. A joke which has a political subject might not be a satirical joke.
  • Satire is not synonymous with sarcasm, although both include the idea of statements whose intended purpose is to express the opposite of their literal meaning, and both include the idea of statements that attack something. (Side comment: some comments I've heard called "sarcastic" should be called "facetious".) However, sarcasm is the more general of the two. Satire may be an extended application of sarcasm to a particular viewpoint.
  • Satire is not an attack on a politician's personal appearance, personality, mannerisms, etc. It's an attack on his or her beliefs or actions.
  • Satire is more than a hodgepodge of one or two line quips on political issues. Satire employs greater depth in narrower focus, as it attempts to illustrate that a viewpoint is not merely funny, but ludicrous.
  • Satire is more than creating unflattering fictional caricatures of an opponent or viewpoint--perhaps that technique is satire, but no more than the lowest form of it. Satire's goal is to make a point. Having the fictional caricature say something like "rustle me up some ding-dongs" may be worth a chuckle (but hardly more), yet it neither discredits follies nor imparts wisdom.
I enjoy humor as much as anyone else. But "satire" is too valuable both as a word and as an art-form to be diluted by pretenders and anyone those pretenders satisfy.

Tuesday, December 04, 2007

peeve no. 252 is artless programming

I intended to ramble on for a bit about how readability/ease of understanding is not at all guaranteed by the mere use of a runtime-typed language, but then I recognized that the point I planned to make was in a sense the opposite of a previous peeve, no. "251": equating programming and art. The main thrust of 251 is that programmers don't need to babble on about pretty code and being artists when each programming task (ideally) has an objective purpose and objective obstacles known as bugs. Regardless of the opinions of a few folks, code is likely to be far more appreciated for fewer bugs than for beauty.

However, I can't deny the existence of programming style, nor can I ignore its manifestation or absence in any particular piece of code. The level of style exhibited by code has been analyzed and quantified in a few ways: lists of code "smells", program-structure calculations such as cyclomatic complexity, lint-like tools for flagging potential problems (tried JSLint, by the way?), and, of course, automated style-enforcement programs. The sense of style I am describing includes and goes beyond all those.

It also goes beyond bug reduction, the province of 251. Code listings A and B can be equally bug-free, but the level of style in A can still be unmistakably superior to B. Style is part of what Code Complete tries to communicate. Duplicated code has less style than modular code. Cryptic variable names have less style than informative variable names; so do verbose variable names. Objects and functions that overlap and do too much have less style than objects and functions with clear-cut responsibilities. On the other hand, style indicators are more what you'd call guidelines than actual rules, because style is an elusive concept which escapes hard definition.
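To make a couple of those indicators concrete, here's an invented before/after pair (the names and the tax figure are mine, not from any real codebase). Both versions are equally bug-free; only one has style.

```java
public class Totals {
    // Low style: cryptic names and copy-pasted logic.
    static double f(double[] a, double[] b) {
        double t = 0;
        for (int i = 0; i < a.length; i++) t += a[i] * 1.08;
        for (int i = 0; i < b.length; i++) t += b[i] * 1.08;
        return t;
    }

    // Higher style: informative names, a named constant, and one helper
    // instead of duplicated loops. Behavior is identical.
    private static final double SALES_TAX_MULTIPLIER = 1.08;

    static double totalWithTax(double[] inStorePrices, double[] onlinePrices) {
        return sumWithTax(inStorePrices) + sumWithTax(onlinePrices);
    }

    private static double sumWithTax(double[] prices) {
        double total = 0;
        for (double price : prices) total += price * SALES_TAX_MULTIPLIER;
        return total;
    }
}
```

No lint tool or complexity metric flags the first version; only a reader's sense of style does.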

One example of the difference between code with style and code without style has been burned into my memory. In one of my first few undergraduate classes, the professor divided the students into three groups to write a pseudo-code algorithm for translating a day of the month into a day of the week, but only for one specified month. Thanks to my influence (I was the first one to suggest it, anyway), the group I was in used the modulus operator (%) as part of the algorithm. One or both of the other groups made an algorithm without the modulus operator, a patchwork algorithm having several simple "if" statements my group's algorithm did not need. All the algorithms solved the (trivial) problem of producing the correct output for each input; the benefit of having several people mentally trace/check the code was probably why the students broke into groups in the first place. Need I state which of the algorithms had the most style, regardless of my all-too-obvious bias?
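For the curious, here's a hypothetical reconstruction of that classroom exercise in Java (the starting weekday is invented for illustration; assume the 1st of the month falls on a Monday):

```java
public class DayOfWeek {
    private static final String[] NAMES = {
        "Monday", "Tuesday", "Wednesday", "Thursday",
        "Friday", "Saturday", "Sunday"
    };

    // The modulus approach: one expression replaces a ladder of "if"s.
    static String viaModulus(int dayOfMonth) {
        return NAMES[(dayOfMonth - 1) % 7];
    }

    // The patchwork approach: peel off whole weeks with successive "if"s.
    // Correct, but every bound has to be exactly right.
    static String viaIfLadder(int dayOfMonth) {
        int d = dayOfMonth;
        if (d > 28) d -= 28;
        else if (d > 21) d -= 21;
        else if (d > 14) d -= 14;
        else if (d > 7) d -= 7;
        return NAMES[d - 1];
    }
}
```

Both methods agree on every day of the month; the difference is purely one of style.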

Here's another quick example, from a few days ago. Out of curiosity, I was looking over some checked-in code. A utility class contained a handful of methods. One of those methods had many complicated steps. In defense of the code's style level, the method's steps were sensibly grouped into a series of "helper" methods, called in succession. The gaffe was the visibility of those helper methods: public. Is it likely that other code would try to call any of those helper methods individually? Is it even likely that other code would use that narrowly-applicable utility class? No on both counts. The public visibility of the helper methods causes no bugs, and realistically speaking will never result in bugs. Nevertheless, objects that "overshare" a contract/interface to other code represent poor style.
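A hypothetical sketch of the same gaffe and its fix, with invented names; the point is only the visibility keywords:

```java
import java.util.Arrays;

public class ReportBuilder {
    // The one operation other code actually needs:
    public String build(String rawData) {
        String cleaned = stripComments(rawData);
        String sorted = sortRecords(cleaned);
        return formatAsTable(sorted);
    }

    // The checked-in code declared these helper steps public. They are
    // implementation details; declaring them private keeps the class's
    // contract down to the single method above.
    private String stripComments(String s) {
        return s.replaceAll("(?m)#.*$", "");
    }

    private String sortRecords(String s) {
        String[] records = s.trim().split("\\s+");
        Arrays.sort(records);
        return String.join(",", records);
    }

    private String formatAsTable(String s) {
        return "| " + s.replace(",", " | ") + " |";
    }
}
```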

The programmer who checked in this code is not dumb, yet he had a style "blind spot". Style is subjective, but it can be learned. Constructively-minded code reviews should put an emphasis on improving the style of everyone involved.

If you've read Zen and the Art of Motorcycle Maintenance, you may also have noticed that its conclusions about language rhetoric resemble the above conclusions about programming style. Quality craftsmanship can arise from the mental melding of craftsman and material. It's neither objective nor subjective, neither art nor science. A common expression of programmers is a language or framework "fitting my brain". Artless programming proceeds without the attempt to achieve this deep awareness.

Wednesday, November 28, 2007

drop the superiority complex

I know, easier said than done, and I fall into this more than I should...

Neither career nor talents grant anyone the authority to degrade everyone else. On a website aimed toward computer technology professionals but on a non-tech topic, I just read an ugly comment stating (in different words) the unlikelihood of a previous commenter having anything valuable to say to "people who think for a living". I'm not surprised or bothered by attacks on basic credibility; disagreement without acrimony would be the real anomaly. What makes me cringe are the haughty assumptions behind the statement:
  1. People who work on computer technology think for a living.
  2. Others, like the statement's victim, don't think for a living.
  3. "Thinking for a living" is a qualification for having worthy opinions to express.
I won't deny that strong aptitude for logical/analytical thought, as used by people to successfully work with computer technology, is rather atypical. I won't deny that deep knowledge in a particular subject domain such as computer technology allows that person to not be as easily confused by casual problems in that domain. I even won't deny that the requisite intellectual capability should be an important factor in decisions for career and education.

However, people who don't fall into those categories can still make coherent, valid points. They may be smart, clever, and accomplished. Their problem-solving skills and ingenuity may be impressive. They may have a wealth of skills for human interaction (which would be foolish not to learn from). The striking creativity they may exhibit is partly why they want PCs at all. "Thinking for a living", as opposed to dumb repetitive labor like ditch-digging, encompasses many options besides working with computer technology.

I know people who either can't do my chosen career, or don't want to. That doesn't make them inferior. After all, the fact that I may not be able to succeed in their careers, or want to, doesn't make me inferior to them...

Friday, November 23, 2007

remembering Beast Wars

By "Beast Wars" I mean the 1996-1999 CGI Transformers series, in which sentient robots transformed into animal forms (as the series continued, it incorporated various vehicle-inspired forms too). I know Beast Wars much better than the original cartoon series, which the summer Transformers movie was based on. After seeing that movie, I was compelled to peruse my Beast Wars DVDs and view some selected episodes. Here's why Beast Wars is still so good:
  • Memorable characters. I'm inclined to think this is common to all Transformers series. Admittedly, those looking for subtlety and balanced characterization should perhaps keep looking. Those who want to be entertained by a rich variety of strongly-defined personalities (so strongly-defined that it borders on caricature) should be satisfied. Of course, defining characters, especially the minor ones, in broad, easily-grasped strokes is the norm for TV shows. Beast Wars had some great ones, and it seems unfair to only list seven: Optimus Primal the just leader, Megatron (different robot, same name) the megalomaniacal manipulator, Tarantulas the insane genius, Black Arachnia the slippery schemer, Rattrap the wisecracking infiltrator, Waspinator the bullied underling, Dinobot the conscientious warrior. The cast of the show changed incredibly often, yet each addition and subtraction made an impact, particularly in the second season.
  • Malevolent aliens. This element of the show is not as corny as it sounds; it acts more as a bubbling undercurrent of anxiety or a convenient plot driver/game-changer than a central focus. An advanced race of alien beings has an intense preexisting interest in the planet of the series' setting, and to them the crash-landed embattled transformers are an intolerable disruption. Yet the sophisticated alien race always acts indirectly through the use of proxy devices of incredible power (only at the very end do they send a single representative). The episodes that actually do revolve around the aliens' actions all have two-word titles starting with the letters "O" and "V". These episodes make up no more than a handful, but include the two episodes which make up the exhilarating cliffhanger finale of the first season.
  • Advanced but limited robots. Some fictional stories strive to be plausible by trying to match, or at least not massacre, the unwritten rules of reality: excessive coincidences can't happen, people can't act without motivation, human heroes and villains can't be entirely good or entirely bad. Other fictional stories are set in a fictional reality with a fictional set of rules, but the story must still be "plausible" to those rules. The Beast Wars "reality" has its own set. Although a transformer can be endlessly repaired from devastating injuries, its true mystical core, known as a spark, must not be lost, else it ceases to be the same animate object. Too much exposure to too much "energon" in its surroundings will cause it to overload and malfunction, but transforming to a beast form nullifies the effect. A transformer in suspended animation as a "protoform" is vulnerable to reprogramming even to the extent of changing allegiances. Transwarp technology operates through space and time. However, the attention to internal consistency is tempered by...
  • A sense of fun. Beast Wars was filled with humor. Robots that can take a beating are ideal for slapstick gags (arguably, the show takes this freedom too far on a few occasions). A few of the characters have bizarre and unique eccentricities, such as speech patterns or personality quirks, which are mined for amusement. For instance, one transformer has been damaged into confusing his consciousness with his ant form, so one of his titles for Megatron is "queen". The transformers' dialog is peppered with high- and lowbrow jokes, mostly at each others' expense. Between Rattrap, Dinobot, Megatron, and Black Arachnia, sarcasm isn't in short supply. When one transformer claims time spent in a secretive relationship with an enemy to be "scout patrol", Rattrap comments, "Find any new positions?"
  • Battles galore. As befits a show whose title contains "wars", almost every episode contains one or more conflicts. The most common type is shootouts, naturally, but the wily transformers on either side employ diverse strategies and tactics to try to gain a decisive advantage: hand-to-hand combat, communication jamming, viruses/"venom", ambushes, stalking, feints, double-crosses, gadgets made by Tarantulas, etc. Surprisingly, Megatron stages a fake "defeat" of his forces in one episode (so a spaceworthy craft will be forged by combining everyone's equipment), and calls a truce that lasts for a few episodes (so he can refocus his attention to the imminent threat from the aliens). It's probably unnecessary to note that these battles are bloodless, even in the minority of battles that happen while in beast form; however, beheading and dismembering (and denting) are common results of warfare. In fact, if Waspinator wasn't so often assigned to collect up the pieces for later repair, he would have very few functioning comrades.
  • Connections to the original series. Foremost: the transformers of this series are descendants of the original transformers, whose battles in the original series are collectively called the Great War. Autobots preceded the basically-decent maximals. Decepticons preceded the basically-aggressive predacons. Intriguingly, during the Beast Wars the maximals and predacons are officially at peace, although the maximals apparently exert greater control than the predacons. Megatron is a rogue who was openly pursuing resources for war, before the maximal research vessel was diverted from its mission to chase Megatron to the planet, ultimately causing each spacecraft to crash. To state more of the connections between Beast Wars and the original series would spoil too much, because over time the writers increasingly intertwined the two series.
  • Vintage CGI. You may think that a CGI TV series that started in 1996 wouldn't be as visually impressive now. You're right. Fortunately, the people who worked on the show were all-too-familiar with the limitations, which means they generally did what they could do well. They avoided (or perhaps cut in editing?) what they couldn't. Apart from a few often-used locations, the "sets" are minimal. The planet has life-forms that aren't usually seen, sometimes giving the impression that the transformers are the only objects in the world that move around. On the other hand, the "shots" are creative and effective in composition, angle, zoom, etc., conveying mood and expressing information such that viewers are neither confused nor bored. The transformers, thanks partly to astute body movements and voice acting, are more emotive than one might guess (really, their faces shouldn't be able to move as much as they do--this is an easy little detail to forget about). Each season's CGI noticeably improved on the previous one's. Just as Toy Story (1995) benefited by rendering toys instead of people, Beast Wars rendered robots. Nevertheless, the unconvincing fur and hair textures on the beast forms are best ignored.
  • Not Beast Machines. This is an excellent reason to cover after CGI quality. Beast Machines was the series that followed Beast Wars. It was pretty, angst-y, stark, and altogether dazzling in design. It had a built-in audience of Beast Wars fans...who stopped watching. Beast Machines was just a drag. The comparison to Beast Wars was night and day, even in the literal sense of being much blacker. Beast Machines had the unintentional side effect of illustrating how special Beast Wars was.

Friday, November 16, 2007

has MVC in webapps been over-emphasized?

Most of the prior entries' titles don't contain a question mark. This one does, because I've become torn on this topic. As I understand it, MVC (Model-View-Controller) refers to the interface design principle of preserving code/implementation separation between the data (Model), the appearance or rendering (View), and the processing of user interactions (Controller). MVC in webapps typically means...
  • The Model is the data structures and data processing. Ultimately speaking, the Model consists of the information manipulation which is the prime purpose of the webapp, even if it merely accomplishes database lookup.
  • The View is the components, templates, and formatting code that somehow transforms the Model into Web-standard output to be sent to the client.
  • The Controller processes HTTP requests by running the relevant 'action' code, which interacts with the Model and the View. Part of the Controller is a pre-built framework, since the Controller's tasks, like parsing the URL, are so common from webapp to webapp.
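The three roles above can be sketched, framework-free, in a few lines (all names invented; a real webapp would receive the parameters from the HTTP layer and the Controller framework would handle URL parsing):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

public class MiniMvc {
    // Model: the data and its manipulation -- the webapp's real job.
    static class GuestbookModel {
        private final List<String> entries = new ArrayList<>();
        void addEntry(String name) { entries.add(name); }
        List<String> entries() { return new ArrayList<>(entries); }
    }

    // View: transforms Model data into Web-ready output.
    static String render(List<String> entries) {
        StringBuilder html = new StringBuilder("<ul>");
        for (String e : entries) html.append("<li>").append(e).append("</li>");
        return html.append("</ul>").toString();
    }

    // Controller: maps request parameters to Model calls, then delegates
    // rendering to the View.
    static String handle(GuestbookModel model, Map<String, String> params) {
        if (params.containsKey("add")) model.addEntry(params.get("add"));
        return render(model.entries());
    }
}
```

Even at this toy scale, each role can change independently: the View could emit a syndication feed, or the Model could swap in a database, without touching the other two.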
Post-Struts, MVC may seem somewhat obvious. However, speaking as someone who has had the joy of rewriting an old webapp, no more than a jumble of .jsp and some simplistic "beans", to run on a properly-organized MVC framework, the benefit of MVC is palpable. I'm sure the difference shows itself just as strongly in, say, the comparison between Perl CGI scripts written before the Internet bubble and recent Perl modules that use Catalyst. (As an aside, I can't stop myself from remarking that well-written and quite-maintainable Perl does exist; I've seen it.)

Nevertheless, I wonder if the zeal for MVC has misled some people (me included) into the fool's bargain of trading a minor uptick in readability/propriety for a substantial drag in productivity. For instance, consider the many template solutions for implementing the View in JEE. Crossing borders: Web development strategies in dynamically typed languages serves as a minimal overview of some alternative ways. I firmly believe that embedding display-oriented, output-generation code into a template is better than trying to avoid it at all costs through "code-like" tags; markup is a dreadful programming language. Any CFML backers in the audience? I don't mind excellent component tag libraries that save time, but using a "foreach" tag in preference to a foreach statement just seems wonky.

Indeed, it is true that including code inside a template may make it harder to distinguish static from dynamic content, and the burden of switching between mental parsing of HTML and programming language X falls upon the reader. Developers won't mind much, but what about the guy with the Queer Eye for Visual Design, whose job is to make the pages look phat? In my experience, the workflow between graphic artist and web developer is never as easy as everyone wishes. Page templates don't round-trip well, no matter how the template was originally composed in the chosen View technology. A similar phenomenon arises for new output formats for the same data. When one wishes to produce a syndication feed in addition to a display page, the amount of intermingled code will seem like a minor concern relative to the scale of the entire template conversion.

Consider another example of possibly-overdone MVC practices, this time for the Model rather than the View. One Model approach that has grown in popularity is ActiveRecord. Recently my dzone.com feed showed a kerfuffle over the question of whether or not ActiveRecord smells like OO spirit. The resulting rumble of the echo chamber stemmed more from confusion over definitions than irreconcilable differences, I think; in any case, I hope I grasp the main thrust of the original blog entry, despite the unfortunate choice of using data structures as a metaphor for ActiveRecord objects. Quick summary, with a preemptive apology if I mess this up:
  • In imperative-style programming, a program has a set of data structures of arbitrary complexity and a set of subroutines. The subroutines act on the data structures to get the work done.
  • In object-oriented-style programming, a program has a set of (reusable) objects. Objects are units having data structures and subroutines. As the program runs, objects send messages to get the work done. Objects don't interact or share data except by the messages. The advantage is that now a particular object's implementation can be anything, such as an instance of a descendant class, as long as the messages are still handled.
  • Since ActiveRecord objects represent a database record by freely exposing data, other code in the program can treat the ActiveRecord like an imperative data structure. To quote the blog: "Almost all ActiveRecord derivatives export the database columns through accessors and mutators. Indeed, the Active Record is meant to be used like a data structure." You know how people sometimes complain about objects having public "getters" and "setters" even when the getters/setters aren't strictly necessary for the object to fulfill its responsibility? Roughly the same complaint here.
  • The blog entry ends by reiterating that ActiveRecord is still a handy technique, as long as its messy lack of information hiding (leaky encapsulation) is isolated from other code to 1) avoid the creation of brittle dependencies on the associated table's schema, and 2) avoid the database design dictating the application design.
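The summary above can be boiled down to a bare-bones sketch. This is a hypothetical, ORM-free illustration of the ActiveRecord pattern (class and column names invented), showing exactly the encapsulation leak under discussion: the object mirrors a table row and exposes every column as public state.

```python
# A bare-bones ActiveRecord sketch (hypothetical; no real ORM behind it).
class ActiveRecord:
    table = None
    columns = ()

    def __init__(self, **values):
        for col in self.columns:
            setattr(self, col, values.get(col))   # column == public attribute

    def save(self, db):
        # the row knows how to persist itself
        db.setdefault(self.table, []).append(
            {c: getattr(self, c) for c in self.columns})

class User(ActiveRecord):
    table = "users"
    columns = ("id", "email")

db = {}                      # stand-in for a database connection
u = User(id=1, email="a@example.com")
u.email = "b@example.com"    # any code can poke the "schema" directly --
u.save(db)                   # this is the leaky encapsulation the post describes
```

Note how nothing stops calling code from depending on the column names, which is precisely the brittle dependency the blog entry warns about.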
In my opinion, the real debate here is not whether ActiveRecord objects break encapsulation and thereby aren't shining examples of OO (they do, so they aren't), but the degree to which anyone should care in any particular case. Rigidly-followed MVC, which aims to divide Model from View, would tend to encourage interfaces that prevent the Model data source, i.e. a database or remote service or file, from affecting the View. The result is layers of abstraction.

In contrast, I'm steadily less convinced of the need for doing that. The number of times I've 1) completely swapped out the Model data source, or 2) significantly tweaked the Model without also somehow exposing that tweak in the View, is minuscule. Besides, the majority of webapps I work on are almost all View anyway. Therefore, as long as the View is separate from the Model, meaning no program entity executes both Model and View responsibilities, inserting an abstraction or indirection layer between the two is probably unnecessary extra work and complexity. When the coupling chiefly consists of feeding a dynamic data structure, like a list of hashes of Objects, into the View, the weakness of the contract implies that the Model could change in numerous ways without breaking the call--the parameter is its own abstraction layer. Of course, that flexibility comes at the cost of the View needing to "know" more about its argument to use it successfully. That information has in effect shifted from the call's interface into the body of the called code.
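The "list of hashes" coupling looks something like the following Python sketch (all function names and keys are hypothetical). The Model hands the View a plain list of dicts, so either side can evolve without a formal interface layer between them:

```python
# Sketch of the "weak contract": Model feeds the View a plain list of dicts.
def model_fetch_posts():
    # hypothetical Model: could be a DB query, a remote service, a file...
    return [
        {"title": "peeve no. 254", "date": "2007-12-20"},
        {"title": "DIY expletives", "date": "2007-11-06"},
    ]

def view_render(posts):
    # the View must "know" the dict keys -- the contract knowledge lives
    # here, in the body of the called code, not in a typed interface
    return "\n".join(f"{p['date']}: {p['title']}" for p in posts)

page = view_render(model_fetch_posts())
```

The Model can add keys, reorder fields, or swap its data source freely; only a change to the two keys the View actually reads would break the page.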

Lastly, after retooling in ASP.Net, the importance of a separate Controller has also decreased for me. I can appreciate the necessity of subtler URL-to-code mapping, and the capability to configure it on the application level in addition to the web server level. Yet the concept of each dynamic server page consisting of a page template and an event-driven code-behind class is breathtakingly simple and potent as a unifying principle. Moreover, the very loose MVC enforcement in ASP.Net (although explicit MVC is coming) is counterbalanced by the strong focus on components. To be reusable across Views, components must not be tied too closely to a specific Model or Controller. The Controller code for handling a post-back or other events is assigned to the relevant individual components as delegates. The Model...er, the Model may be the weak point, until the .Net Entity Framework comes around. As with any other webapp framework, developers should have the discipline to separate out the bulk of the nitty-gritty information processing from the page when it makes sense--fortunately, if that code is to be reused in multiple pages, the awkwardness acts as a nudge.
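The template-plus-code-behind idea is framework-agnostic, so here is a language-neutral sketch in Python (all class and handler names invented; this is the shape of the pattern, not ASP.Net's actual API). The page declares components, and the "Controller" code is attached to individual components as callables, playing the delegate role:

```python
# Sketch of page template + code-behind with per-component event delegates.
class Button:
    def __init__(self, id):
        self.id = id
        self.on_click = None          # Controller code lives here, per component

    def click(self):
        if self.on_click:
            self.on_click(self)

class Page:
    def __init__(self):
        self.controls = {}            # components declared by the "template"

    def add(self, control):
        self.controls[control.id] = control
        return control

# "code-behind": wires handlers onto the declared components
page = Page()
save = page.add(Button("save"))
events = []
save.on_click = lambda sender: events.append(f"{sender.id} clicked")  # delegate

page.controls["save"].click()    # simulated post-back dispatching to the handler
```

Because each handler is bound to its component rather than to a central Controller, the component stays reusable across pages that wire different handlers.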

The practical upshot is to use MVC to separate the concerns of the webapp, while ensuring the cost of that separation stays proportionate to the gain.

Wednesday, November 14, 2007

another case of unintentionally funny

I've stumbled on another unintentional funny from somewhere on the Web. This is a concluding analysis after converting a function to be truly tail-recursive:
Comparing those two the first one increases the stack to a point until it starts to pop the stack. The tail recursive version does not use the stack to keep a “memory” what it has still to do, but calculates the result imediatly. Comparing the source though, in my view the tail recursive version takes all beaty out of the recursive implementation. It nearly looks iterative.

Hmm...tail recursion making a recursive function "nearly look" iterative. Go figure.

(What makes this funny, of course, is the fact that in situations without mutable variables, tail-recursive functions are the way to achieve iteration. And converting recursion into iteration through an explicit stack is another well-known technique...)
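All three forms from that parenthetical can be shown side by side. A sketch in Python (which, note, does not actually eliminate tail calls, so the middle version is illustrative of the shape rather than of constant stack use):

```python
# A plain recursive sum, its tail-recursive (accumulator) form, and the
# same computation converted to iteration with an explicit stack.
def rsum(xs):
    if not xs:
        return 0
    return xs[0] + rsum(xs[1:])        # work remains after the call, so the
                                       # stack grows until it starts to pop

def rsum_tail(xs, acc=0):
    if not xs:
        return acc
    return rsum_tail(xs[1:], acc + xs[0])  # result accumulated immediately;
                                           # no pending work -- reads "nearly
                                           # iterative," which is the joke

def rsum_stack(xs):
    stack, total = list(xs), 0
    while stack:                       # recursion replaced by an explicit stack
        total += stack.pop()
    return total

assert rsum([1, 2, 3]) == rsum_tail([1, 2, 3]) == rsum_stack([1, 2, 3]) == 6
```

In a language with guaranteed tail-call elimination, the middle version *is* the loop; that equivalence is exactly what the quoted author stumbled onto.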

Tuesday, November 06, 2007

DIY expletives

I was about to apologize for writing something self-indulgent, but then I remembered this is a blog. Moving along...

I've fallen into the habit of saying DIY (do-it-yourself) expletives. Making up expletives isn't difficult. With practice, driven by the frustrations of everyday life, the original sound combinations just flow off the lips like, uh, spittle. Observe these helpful tips, which are more guidelines than rules:
  • Keep each individual "word" short. Expletives beyond two syllables can be unwieldy and unsatisfying. The goal is to embody anger and rage--emotions of violent aggression.
  • The limitation on the length and complexity of each word doesn't apply to the "sentence" as a whole. Speaking multiple words in rapid succession is great for overall effect, even more so in situations when loudness is prohibited.
  • To start a "word", I like to use the consonants k, t, d, m, s. To a lesser degree I also sprinkle in usage of p, h, b, n, g (hard or soft pronunciation, but more often hard), and f. More complicated consonantal sounds can be handy for adding flavor, such as "st", "ch", "sp". By all means include gutturals, if you can without hurting yourself or working too hard.
  • I like to use the vowel sounds "ah", "oo", "oh". Other vowel sounds can help to break up monotony, perhaps in the middle or end of a statement, but in any case shouldn't draw attention too far away from the consonants. Following up a vowel sound with another vowel sound ("oo-ee" for example) is probably not conducive to maintaining peppy rhythm and tempo.
  • "Words" with multiple syllables (not more than two or three per word, recall!) are simply constructed by combining monosyllabic words.
  • Emphasize whichever syllable you like. Variety is what I prefer.
  • The last tip is counterintuitive: don't let the preceding tips slow you down or stop you from creating a word that feels right. Practice and developing the habit are more important.
Some quick examples:
  • "Mah-TOO-choh!"
  • "Fah soo poh NOOST!"
  • "KOH-dahb hoom BAYSH-tah!"
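The tips above are mechanical enough to automate. Purely for fun, here is a little Python generator that follows them (the sound inventories are the ones suggested above; everything else about it is my own invention):

```python
# DIY expletive generator, following the tips: short words built from the
# suggested consonant starts and vowel sounds, several words per outburst.
import random

STARTS = ["k", "t", "d", "m", "s",            # primary consonants
          "p", "h", "b", "n", "g", "f",       # secondary sprinkles
          "st", "ch", "sp"]                   # flavor clusters
VOWELS = ["ah", "oo", "oh"]

def syllable(rng):
    return rng.choice(STARTS) + rng.choice(VOWELS)

def word(rng):
    # one or two syllables per word, per the "keep it short" tip
    return "".join(syllable(rng) for _ in range(rng.randint(1, 2))).capitalize()

def expletive(rng, words=3):
    # multiple short words in rapid succession for overall effect
    return " ".join(word(rng) for _ in range(words)) + "!"

print(expletive(random.Random()))
```

Run it a few times and the output lands squarely in "Mah-TOO-choh!" territory; emphasis, per the tips, is left to the speaker.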
I'm not absolutely sure what led me to start muttering violent gibberish, but I suspect my Star Wars and Firefly DVDs deserve some of the blame.