By "human brain as VM" I don't mean running bytecode on wetware. (That would give a sinister twist to the phrase "write once, run anywhere", eh?.) I mean the mental process of understanding the operation of lines of source code, sometimes called tracing within and without debuggers. I have assumed that other developers use the same strategy I do when confronted with unfamiliar code: mentally simulating what and how the computer will execute. That is, building up a mental model of a language and the various scopes of a particular program, then applying that mental model to the actual instructions to produce a projected result.
But just a few days ago I attended a training presentation in which the source code on the slide had no chance at all of accomplishing what the presenter said it did. I'm pretty sure the presenter wasn't stupid or deliberately careless. My conclusion was that he had composed the code either by copying it incompletely from elsewhere or by incorrectly translating an abstract algorithm into its concrete counterpart. Either way, he had most certainly not parsed and traced the program like a "computer" (in quotes because every medium- or high-level language is technically an abstraction over the actual computer's execution of its instruction set and memory).
How common is it for developers to read, write, and otherwise think about code without putting it through the mental lexer, the mental parser, or even, as a last resort, the mental Turing machine? I ask because I'm genuinely curious. I don't feel that I understand code unless I can visualize how the individual character elements fit together (syntax) and how the data flows through the program as it runs (semantics); is this not the case for others? Is this another instance of my rapport with languages and grammar and logic? Am I wrong to read through not only the prose of a training book but also its code examples, and to refuse to continue until I can run that code through my mental VM without throwing compile-time syntax errors?
I know some (ahem, unnamed) people whose primary mode of development is tracking down examples, copying them, and modifying them as little as possible to get them to work. Example code is an excellent aid to understanding as well as an excellent shortcut in writing one's first program; I'm not disputing that. Patterns and techniques and algorithms should be publicized and implemented as needed; I'm not disputing that either. And I understand that the scary corners and edges of a programming language (static initializers in Java?) are probably better left alone until really needed, to avoid excessive hassles and complications.
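For readers who haven't wandered into that particular corner: a static initializer is a block that runs exactly once, when the JVM loads the class. A minimal sketch, with class and contents invented purely for illustration:

```java
public class Config {
    static final java.util.Map<String, String> DEFAULTS = new java.util.HashMap<>();

    // Static initializer: runs once at class-load time, before any instance
    // is created or any static method is called. Handy, but easy to misuse.
    static {
        DEFAULTS.put("timeout", "30");
        DEFAULTS.put("retries", "3");
    }
}
```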
However, if someone wants to save development time and effort, especially over the long run (small programs tend to morph into HUGE programs), isn't it better to take the time to learn and understand the language or API, and to reuse code through a proper library mechanism, rather than copying code around like a blob of text and forfeiting one's responsibility to, well, know what's going on? Don't forget that copy-and-paste code examples, by their very nature, fall out of date obscenely fast. Often it seems that people write code a certain way because they soaked it up one day and now cling to it like a life preserver. Multiline strings built by concatenating individual lines with "\n", in a language that offers real multiline string literals, never cease to amaze me. As a last plea, keep in mind that as good APIs evolve, common use cases tend to become "embedded" in the API itself. Congratulations: your obsolete copy-and-paste code example, written against version OldAndBusted, is both inferior to and harder to maintain than the one- or two-line API call in version NewHotness. Try to keep up.
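To pin down the multiline-string complaint, here is a sketch in Java, assuming a version with text blocks (Java 15 and later); the class name and string contents are made up for illustration:

```java
public class Banner {
    // The soaked-it-up-one-day habit: stitch lines together by hand.
    static final String OLD_WAY =
            "Usage: tool [options]\n" +
            "  -h  show help\n" +
            "  -v  show version\n";

    // The same literal using the language's own multiline feature (a text block).
    static final String NEW_WAY = """
            Usage: tool [options]
              -h  show help
              -v  show version
            """;
}
```

The same dynamic plays out at the API level: the hand-rolled read-a-file-into-a-string loop that every tutorial once carried became a one-line call to java.nio.file.Files.readString in Java 11.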