...is the chance to wait day after day for the surround sound option to return to all the TV series that had it before! Even better, total silence about it in Netflix's established avenues for public communication! (No, extremely vague and evasive answers from Customer Support don't count.) Sign up, and for a low monthly price you too can give your patience a thorough endurance session.
Yeah, I know, #firstworldproblems. But at least I'm not airing a "one-percenter" problem ("it's so hard to find the right personal chef!").
Showing posts with label Rants. Show all posts
Tuesday, January 24, 2012
Sunday, September 11, 2011
peeve no. 265 is users blaming the computer
No, user of the line-of-business program, the computer isn't the trouble-maker. It could be from time to time, if its parts are old or poorly-treated, but problems at that level tend to be much more noticeable than what you're describing. Generally, computers don't make occasional mistakes at random times. Despite what you may think, computers are dogged rather than smart. Computers do as instructed, and by "instructed" I mean nothing more than configuring the electricity to move through integrated circuits in a particular way. Computers can't reject or misunderstand instructions. No "inner presence" exists that could possibly do so.
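To make the "dogged, not smart" point concrete, here's a minimal hypothetical sketch in Python. The machine executes exactly what was written, wrong or not; the function name and the intended task are invented for illustration.

```python
# A computer never "decides" to misbehave; it runs exactly what was
# written. This hypothetical routine was meant to sum the numbers
# 1 through n, but the author wrote range(1, n), which stops at n - 1.
def sum_first_n(n):
    total = 0
    for i in range(1, n):   # bug: should be range(1, n + 1)
        total += i
    return total

# The "computer error" is perfectly deterministic and repeatable:
# 1 + 2 + 3 + 4 = 10, not the intended 1 + 2 + 3 + 4 + 5 = 15.
print(sum_first_n(5))   # prints 10, every single time
```

Run it a thousand times and it produces the same "mistake" a thousand times. That regularity is exactly why "the computer did something random" is almost never the right diagnosis.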
I understand that placing blame on "the computer" can be a useful metaphor for our communication. But the distinction I'm drawing this time is substantive. To identify the precise cause of the issue that you've reported, a more complete picture is necessary. Your stated complaints about the computer's misdeeds really are complaints about something else. The reason to assign blame properly isn't to offer apologies or excuses. Figuring out the blame is the first step in correcting the issue and also in preventing similar issues.
- Possibility one is a faulty discussion of the needed behavior for the program, way back before any computer played a role. Maybe the right set of people weren't consulted. Maybe the right people were involved, but they forgot to mention many important details. Maybe the analyst missed asking the relevant questions. Now, since the program was built with this blind spot, the issue that you reported is the eventual result.
- Possibility two is a faulty translation of the needed behavior into ideas for the program. Maybe the analyst assumed too much instead of asking enough questions. Maybe the analyst underestimated the wide scope of one or more factors. Maybe the analyst was too reluctant to abandon an initial idea and overextended it. Maybe the analyst neglected to consider rare events that are not so rare.
- Possibility three is faulty writing of the program itself. Maybe the coders overestimated their understanding of their tools and their work. Maybe the coders had comprehensive knowledge but didn't correctly or fully express what they intended. Maybe a fix had unfortunate side effects. Maybe the tests weren't adequate.
- Possibility four is faulty data. Like blaming the computer, blaming the data is a symptom. Maybe something automated quit abruptly. Maybe manual entry was sloppy. Maybe the data is accurate and nevertheless unexpected. Maybe someone tried to force shortcuts. Maybe management is neither training nor enforcing quality control.
- Possibility five is faulty usability, which faulty data might accompany. "Usable" programs ease information processing from the standpoint of the user. Maybe the program isn't clear about what the user can do next. Maybe unknown terminology is everywhere. Maybe needless repetition encourages boredom and mistakes. Maybe, in the worst cases, staff decide to replace or supplement the program with pen marks on papers or fragile spreadsheets containing baroque formulae. Downfalls in usability may disconnect excellent users from excellent programs.
- Possibility six is the dreaded faulty organization, in which various units disagree or the decision-makers are ignorant. Maybe definitions are interpreted differently. Maybe the "innovators" are trying to push changes informally. Maybe the realms of each unit's authority are murky and negotiable at best. Maybe units are intentionally pulling in opposite directions. Regardless, the program probably will fail to reconcile the inherent contradictions across the organization.
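As a concrete sketch of possibility three, here's a hypothetical Python routine whose author overestimated their understanding of the tool: in Python, a mutable default argument is created once, not once per call, so state quietly leaks between calls that were meant to be independent. The names are invented for illustration.

```python
# "Possibility three" in miniature: the coder's mental model of the
# tool is wrong. The default list below is built once at definition
# time and shared across every call that omits the argument.
def add_line_item(item, order=[]):   # bug: one shared default list
    order.append(item)
    return order

first = add_line_item("widget")
second = add_line_item("gadget")     # intended to start a fresh order
# second is ["widget", "gadget"], not ["gadget"] -- and first is the
# very same list object.

# The corrected version expresses what the coder actually intended:
def add_line_item_fixed(item, order=None):
    if order is None:
        order = []               # a genuinely fresh list per call
    order.append(item)
    return order
```

Nothing here is the computer "acting up". The program does precisely what its text says; the text just doesn't say what its author believed it said.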
Friday, July 29, 2011
peeve no. 264 is omniduction
Two thinking processes generally receive attention: deduction and induction. Logical consequences and generalizations. Debaters construct their arguments from these basic blocks. Participants who rely on other tactics are subject to being systematically dismantled by opponents. (There's also abduction, but I'm ignoring it here.)
Au contraire! This is an incomplete account of actual reasoning in human society. Besides deduction and induction, I propose a third process: "omniduction". Omniduction is the creation of many small details out of unassailable preconceptions. Its rallying cry is, "I don't need the data! I know what the data is already because I know that ___(preconception)___ could never ever be false!" The miracle of omniduction is akin to spinning gold from straw. Reality becomes so simple when facts are produced from grandiose assumptions, rather than the other way around.
Unfortunately, omniduction tends to be problematic when practitioners try to converse. Unless everyone happens to be applying omniduction identically, the produced universes could be in conflict. When facts follow directly from overall assumptions, discussion of evidence is futile.
- "My theory is correct. Your information must be wrong."
- "The complex specifics of this situation must be irrelevant. My flawless system of beliefs doesn't require such minutiae to render a verdict."
- "Truths cannot be complicated. If you'd only consider the issue from my perspective, you'd realize that a small set of self-contained opinions can explain anything."
Friday, May 27, 2011
peeve no. 263 is offhand Java bashing
It's a sign of immature writing to insist on inserting opinions and extra comments. Within opinionated persuasive essays (or blog rants), this is an expected practice because it's the whole point. Within a tutorial, it's an irritating distraction. If you're writing disinterested and objective prose about a technological topic, then it's...not...about...you!
When someone's stated goal is to educate, as opposed to spreading a point of view, side-swipes at programming languages should be left out. There's certainly a time and place to "fight the good fight" for your faction, but not by dumping stray sentences in the middle of a how-to on neutral ground. I understand, Advanced Technical Writer, that you're eager to give off the impression that you're an independent thinker of exquisite taste. Why not fully express your sophisticated perspective in a separate long-form document, rather than confining it to a cryptic handful of sentences or footnotes in everything you write? Tossing off some quick one-sided insults doesn't do justice to yourself or the reader. Advocate in your advocacy articles. Propagandize in your propaganda.
I would've written this example shorter, but Java is awful, am-I-right? Now watch me apply the words "noise" and "clutter" to perfectly understandable and correct syntax. Isn't it a shame that I used an anonymous class here, when other languages (wink-wink, nudge-nudge) offer different possibilities? I apologize for writing loops and not overriding operators; my example language ties my hands so tight that it burns. Sorry about these checked exceptions ruining my 'sunny-scenario-only-dear-bob-don't-copy-it-as-is' code sample. Please don't laugh at the parentheses in my 'internal DSL', I did the best I could with the terrible tools available. Oh, how I long to have avoided that well-known design pattern through the deep meta-programming magicks that you naive mortals cannot comprehend. And the explicit data types torture me so...
Friday, April 15, 2011
peeve no. 262 is verification by bluster
In some ways, human nature is flawed. Complaining about it won't help. But here I go anyway...
Many arguments in debates are intellectually wrong or flimsy at best. This is well known and oft mentioned by pedants. In my opinion, these transparently fallacious arguments are less insidious than their emotional counterpart, "verification by bluster". With a sufficient combination of affirmation, bravado, machismo, etc., humans appear to forget about the concept of proof.
- Confidence levels may be out of proportion to the supporting data. In a card game this would be called a bluff. If anyone asks "Are you sure?", then the answer is always "100%", in a firm voice and accompanied by a steady gaze. Do it right and humans just won't bother with whether the steely assurance is grounded in reality. This is particularly important in sales or management; complete detailed honesty about actual evidence would merely demonstrate "lack of belief".
- Naive arguers sometimes act as if winning means discrediting the other side's facts. Not necessarily true. It might be enough to invent a sufficiently nasty name for a part of the opposing argument, then repeat it endlessly.
- It's similarly naive to act as if an individual data point "speaks for itself". It's a comparatively simple matter of weighting. Take the contradictory data point, juxtapose it against a supportive data point, state assertively that you're providing "full context". Humans are unlikely to request entire data sets in any case, because analysis might involve, y'know, math.
- In general, complexity is a bad tack. Messy corner cases and ambiguous conclusions aren't gratifying. Exceptions to beloved rules are downright unsettling to some personality types. Hence the more profitable route of argumentation is to accept simplistic thinking, not challenge it. It won't work to spout admonitions to replace laughably coarse-grained prejudices with complicated observations. Confronted with the world as it is, right now, humans instead prefer a personalized/sanitized perspective in which solutions are easy, relationships are causal, and heroes and villains are well-defined.
Friday, October 29, 2010
peeve no. 261 is Obi and programming language mentality
The Force can have a strong influence on the weak-minded. (--Obi-Wan)
Similarly, a programming language can have a strong influence on weak-minded programmers. If your programming language "changed your life", "made you a better coder", "expanded your mind", etc., then I suggest that you may be going about this programming thing the wrong way. If a lack of language support for OO results in you producing a tangled web of dependencies among procedural functions, or easy access to global/file scope results in you spraying program state everywhere, then the programming language is your crutch. It's the guide-rail for your brain.
I'm glad that you found a programming language that you enjoy, but consider me unimpressed by rather impractical claims about the ways that it alters your consciousness.
Wednesday, June 30, 2010
peeve no. 260 is factional prejudice in IT
Faction: a group or clique within a larger group, party, government, organization, or the like

I don't know what it is about IT that results in professionals forming factions. Use text editor X! Align your punctuation thusly! Avoid all software from company Q! Programming language R is your cool friend that all the popular kids like! Every problem should be flattened via solution U! Your VCS is DOA!
I understand that people develop attachments to their favorite tools and passion for their work; that's fine. What drives me nuts is the all-too-common exaggeration of these feelings. Acting as a cheerleader for your chosen IT factions can be fun, but if you ground your personal identity in it then you've officially gone mental. An IT professional should make technological decisions based on actual requirements and well-understood trade-offs, not based on which option fits your personal "style". Of course, usability and even nebulous subjective appeal are nevertheless legitimate factors in the decision, simply because a difficult and hated option is detrimental to productivity and morale. The difference is a decision-maker who critically weighs all applicable criteria instead of just always "voting with the faction". Emotional investment is present without overpowering every other fact. Such a person doesn't say "I'm a BLUB programmer"; rather, "I know the most pertinent portions of the syntax and semantics of BLUB, and I like BLUB because of BLURB". (I've complained before about grouping programmers by language.)
Although I'm ranting against factional prejudice, I caution readers not to interpret my opinion as a case of "binary" thinking (i.e. thinking characterized by a simplistic division into only two categories). Some factional prejudices are quite beneficial and practical, when kept within reasonable and specific bounds. For example, past behaviors of a company or open source project are good reasons to be wary of current behavior. A platform's reputation for lackluster efficiency is a worthwhile justification to avoid it unless the platform demonstrates improvement (or the efficiency is sufficient for the task). Evaluate factional prejudice according to its motivation, relevance, and realism.
For businesses at least, technology isn't an accessory or an avenue of personal expression. It serves a purpose beyond the satisfaction of one's ego. The One True Way might not be what's right for the situation and the client.
Tuesday, July 29, 2008
peeve no. 259 is calling supernatural stories "sci-fi"
Of all the mislabeling of entertainment that goes on, none sets me on edge quite like "sci-fi" referring to stories that contain the supernatural. I appreciate that authors don't feel obligated to confine their ideas into neat little categories (nor should they!) and that consequently no category is adequate. I recognize that almost from the very start sci-fi, science fiction, bifurcated into stories based on the one hand on the "hard" science of experimentally-verified theories, and on the other hand on "soft" scientific or technological speculation. Moreover, a work's scientific "hardness" lies on a fuzzy spectrum because some of its ideas could be hard and others soft. If the work is episodic, then each episode could be different on the scale. I enjoy both hard and soft sci-fi, although I prefer that a story strive to stick to its chosen hardness rather than leaping sloppily over the line and back.
My limit is really pretty simple: once a story is so soft on science that it includes the supernatural, its sci-fi credentials should be revoked at least temporarily. I'm okay with exposition like "the 'telepathic' race communicates with one another through radio waves". I'm not at all okay with "the being made of energy can create and transform matter at will". Or rather, I'm not okay with that character's story being labeled "sci-fi". Science's relevance to that story is slim.
The dividing line shouldn't be that hard to distinguish. If a story entails a concept that science contradicts, casts doubt upon, or dismisses from consideration, then the story shall not be deemed science fiction. If a story's suspension of disbelief involves jettisoning the conventional scientific perspective, then the story is outside science--beyond it, if you prefer. It's a supernatural story. Such stories might have value for any number of reasons, but worthiness of the "sci-fi" description isn't one.
Tuesday, July 15, 2008
peeve no. 258 is dynlang users who disrespect Perl
Before I get my rant on, I'll do the lawyerly thing and attempt to clarify my terms to avert misunderstanding.
- "dynlang" is 9 characters shorter than "dynamic language", the name of a fuzzy category of programming languages that allow, support, and exploit the capability to make run-time changes to a program's very structure--its data types, classes/data structures, method/function dispatch. Although program interpretation is far better suited than compilation to implement such a capability, it's too simplistic to assume that all dynamic languages are necessarily interpreted and all static languages are necessarily compiled, and for that matter too simplistic to assume that a particular language implementation never does both. (Groovy cheerleaders are fond of mentioning that their dynamic language is compiled, while compiled Objective-C code sends object messages at run-time.)
- "Disrespect" refers to an attitude, not to an engineering-like objective evaluation of trade-offs. An example of the latter is "Java's static typing nature can result in code that is difficult to adapt to unanticipated purposes." An example of disrespect is "Java sucks. And one of the reasons it sucks, and the people who choose to use it also suck, is that the stupid types strewn throughout a class do nothing but get in the way when I just want to pass in an instance of MyTempObject, and get the idiotic compiler to stop tying my hands on my behalf". (By the way, I apologize for the inaccuracy. This example of Java disrespect is a bit too lucid and courteous to match the corresponding real statements elsewhere on the Web.)
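As a minimal sketch of the run-time malleability the definition above describes, here it is in Python (one dynlang standing in for the category; the class and method names are invented):

```python
# In a dynamic language, a program's structure is open at run time:
# a class can gain a method after its instances already exist.
class Greeter:
    pass

g = Greeter()                  # an instance of the method-less class

def hello(self):
    return "hi"

Greeter.hello = hello          # structural change while running
print(g.hello())               # the existing instance picks it up
```

Perl permits the same kind of surgery (via packages and symbol tables), which is precisely why it belongs to the category whose general advantages dynlang users already accept.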
So I know that Perl has its problems. In fact, my encounters with Perl nowadays stem from two sources: 1) ad-hoc administrative tasks and/or data processing (but if I need to interact with one or more standard formats, databases, or JVM APIs then I switch to Groovy), 2) upkeep of legacy Perl that acts either as "glue" between systems or as rough internal CGI interfaces for simple yet vital business tasks (I'd also note that the "legacy" Perl has no firm date for replacement). I'm not bothered when language cheerleaders call attention to Perl's weaknesses; this practice is hardly a new phenomenon, and when the latest critic repeats the same years-old tired refrain I can barely manage to react at all. No, what gets me worked up is when someone is dismissive, contemptuous, or mocking toward Perl, which has on numerous occasions, despite its acknowledged icky portions, served my purposes, enabled me to achieve unconventional solutions, and taught me intermediate-to-advanced programming concepts. Without Perl, I might have had to hack together a comparatively ugly combination of grep, find, sed, awk, etc., and shell script. (At the time I'm describing, pretty much the only other programming language available on the system was C or C++, and later Java.)
I'm not too irked by cheerleaders for static languages who throw rocks at Perl for fun. Their derisiveness is a subset of their general antipathy for dynamic languages. If Perl were the epitome of language design perfection then they would still pronounce it junk because of characteristics that by definition apply to any dynamic language: performance overhead, lack of mandatory type enforcement, multiple inheritance, chaotic data structures that are modifiable at will, and so on.
As the title says, the true irritants are dynamic language users who explicitly or implicitly assert that Perl is horrible, terrible. These users know the advantages (and disadvantages) of a dynamic language. They know Perl is a dynamic language. They know that they would probably rather use Perl than a static language to solve the same problems (okay, maybe some of them would instead use ML-family, Haskell, or statically-compiled Lisp-family--just maybe). Moreover, the reasons they give for why Perl is abominable sometimes strongly resemble the reasons static language cheerleaders give for avoiding dynamic languages altogether: too many operators and punctuation in general, syntax features that can interact in complicated ways, too few built-in data types (no "official" numbers or strings, just scalars!), the choice to use or ignore OO, special variables that affect execution, accumulated design cruft that makes some tasks too awkward or tricky. The stranger cases are dynamic language users who formerly used Perl primarily, but after changing their habits suddenly stop recognizing anything valuable about Perl at all. Perhaps they used Perl for the general benefits of a dynamic language, all the while having strong distaste for the actual "Perl-ness", and as soon as they could obtain a dynamic language without that flavor they were glad to kick it to the curb. Or (shudder) perhaps some current dynamic language users spent a long time solely in the static language camp, where they grew accustomed to decrying Perl as a mysterious, confounding mess, but after using and liking a dynamic language they continued to denounce Perl out of habit rather than out of any remaining scorn for dynamism.
Perl is what it is (except when it purposely breaks compatibility in some ways for Perl 6), though it does keep changing and code written using current recommended practices is superior to what you might remember. All I wish is for dynamic language users to give Perl the credit it's earned, is still earning, will earn. Stop mercilessly bashing it as if it ate your parents. But know that I join you in disliking sigils, list/scalar context distinctions, and autovivification.
Tuesday, March 04, 2008
peeve no. 257 is the idea of Internet "governance"
I'm not sure if this opinion places me in the category of realist or idealist, but the whole idea of Internet "governance" irks me. The Internet is a "network of networks", and a network is a connection between devices. It's a simple idea, while at the same time being fiendishly complicated to put into practice. The Internet is "devices talking to one another".
Nobody should find it necessary to comment that the Internet is decentralized, or to comment that the value of the Internet is at the "edges", i.e. the devices actually sending and receiving data. In simple terms, when ma and pa click on an icon for a Web browser to "go to the Internet", the closer metaphor isn't switching TV channels or turning pages in a massive encyclopedia (Information Superhighway, remember that?). They're telling the computer to "make a phone call" to other computers. This point was easier to emphasize back when people plugged a regular phone line right into the computer's internal or external modem. IPTV and VOIP probably interfere further with the casual user's attempt at a metaphorical understanding...
If the fundamental Internet is just computers talking, then discussing its fundamental "governance" is nutty. Who "governs" TCP, UDP, and so on? Who "governs" HTML, CSS? Who "governs" SMTP, POP, IMAP? Who "governs" IP, IPv6? The Internet is communication. What matters is that the communication works right here, right now. Standards and protocols enable valid communication. The "penalty" is miscommunication, e.g., a malformed data packet or a web page that's "broken" in an idiosyncratic renderer.
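The "devices talking" picture can be sketched in a few lines of Python, with the client and server collapsed onto one machine for convenience. The only "governance" in sight is the agreed-upon protocol: bytes in, bytes out.

```python
# A toy conversation between two endpoints over TCP. Swap the loopback
# address for a remote one and this is, structurally, the Internet.
import socket
import threading

def serve_once(server_sock):
    conn, _ = server_sock.accept()
    data = conn.recv(1024)
    conn.sendall(b"echo: " + data)   # the entire "service"
    conn.close()

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))        # port 0: pick any free port
server.listen(1)
port = server.getsockname()[1]
t = threading.Thread(target=serve_once, args=(server,))
t.start()

client = socket.create_connection(("127.0.0.1", port))
client.sendall(b"hello")
reply = client.recv(1024)            # b'echo: hello'
client.close()
t.join()
server.close()
```

No central authority appears anywhere in that exchange. The endpoints merely agree on TCP's rules, and the "penalty" for breaking them is the conversation failing.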
No entity--company, government, standards body--"governs" the Internet. The Internet isn't owned by them either. I want to believe this is true. The irresistible tendency to see the Internet as a marketplace, public diary, fan convention, software development technique, leisure entertainment, newsroom, information repository, and so forth masks the reality. There is no such thing as cyberspace and no such thing as blogosphere. The Internet is device A exchanging data with remote device B, via a path of complex technologies and intermediaries. Once almost all important information interaction happens through this method, and strict governance extends to that same level of micromanaging communication, previous losses of privacy will seem tame.
Nobody should find it necessary to comment that the Internet is decentralized, or to comment that the value of the Internet is at the "edges", i.e. the devices actually sending and receiving data. In simple terms, when ma and pa click on an icon for a Web browser to "go to the Internet", the closer metaphor isn't switching TV channels or turning pages in a massive encyclopedia (Information Superhighway, remember that?). They're telling the computer to "make a phone call" to other computers. This point was easier to emphasize back when people plugged a regular phone line right in to the computer's internal or external modem. IPTV and VOIP probably interfere further with the casual user's attempt at a metaphorical understanding...
Thursday, February 07, 2008
peeve no. 256 is overuse of the word 'powerful'
Like the other word overuse peeves ("random" and "satire"), I have no issue with the word "powerful" when it's used as I think it should be. For instance, when something is full of power, "powerful" is a fine description. In other cases, when the definition and/or quantification of the relevant "power" is debatable, a better word choice would be both more interesting and more accurate.
- An experience provoked strong emotions, so therefore it was powerful, Mr. Critic? Fascinating. How many watts was it?...Oh, the experience had emotional or "psychic" power. How much emotional work did this power achieve? Enough to form one semi-vivid, long-term memory?...Would it do the same to any person? If not, does that imply each person has a different degree of emotional inertia? Hmm? Stop walking away from me...
- This software is powerful, Mr. Marketer? Fascinating. Where is the power lever? If I turn up the juice, will it use up more electricity? Does it have a "turbo boost" button I can press to make it run faster?...Yes, I know about nice and renice, but wouldn't those affect powerless software the same?...Oh, I see, the number of features is what makes it powerful. If I don't plan to ever use those features, will the software still be powerful for me? Eh?...Of course, how silly of me, what makes the software powerful is its set of sophisticated commands. When those coarse-grained commands make too many assumptions about what I expect, are those commands still powerful? Don't get angry, I just want to know what you mean...
- This programming language is powerful, Mr. Advocate? Fascinating. Does that mean it can do more than existing general-purpose languages?...No? Does that mean it has all the same features as language X, and more?...What? Cramming every feature and possibility into a language would be awful? Well then, does the powerfulness of the programming language mean that it runs fast?...I suppose you're right to say that's an implementation detail, and in any case the algorithm design can be far more important than the language for execution performance. What about the design?...You're saying succinctness is power. I recall reading about that once...
- Politician X is powerful, Mr. Pundit? Fascinating. He must be a fairly good athlete, then?... Ah, I see, you mean he's well-connected. So it isn't him that's powerful, but a network of people doing favors for each other?...Right, in a democracy his power would have come from the citizens who elected him. He's powerful because at election time he was popular enough?...Huh, he's powerful now, because he's good at persuasion. His powerfulness is really power redirected from everyone around him?...Naturally, he would also have official government powers according to his post. His powers allow him to command government employees. But if they consent because of his position in government, isn't the position the part that is powerful? Stop interrupting my questions...
Thursday, January 31, 2008
peeve no. 255 is an API call changing global state
Ooooo, there's nothing quite like spending most of the workday trying to track down the root cause of a single problem.
Except for when the root cause turns out to be an API call changing global state.
It's like the spammy counterpart to the inversion of control technique. "You know how convenient it can be to define an application's callbacks/delegates/listeners, and just let library code handle the Big Picture as much as it can? What if the library code also manipulated the application code's variables and resources on its behalf, without even being asked!" It's an idea so innovative it'll prompt you to smack your forehead in response. Or smack your forehead against a desk. Repeatedly.
It's like the code equivalent of Yzma's frustration with Kronk. "...why did I think you could do this? This one simple thing. It's like I'm talking to a monkey..." I ask my API for help with one task, the task the API call is clearly named for. I provide the API call with what it needs to carry out the task. And at execution time, the API call then proceeds to surgically cut a piece of data, which sends the entire program crashing down like a chandelier.
Testing it is like repeatedly stepping on rakes lying on the ground. Each time the code runs (stepping on the rake), failure seemingly comes out of nowhere (the handle) to bash you. So you try changing some of the application code, merely to see if the change might make a difference. Thwack. Grab your stress-relief squeeze ball ("that little sucker just saved your life, monitor!") for a moment, adjust a different section of the application code, try again. Thwack.
The Wheel of Morality for today contains the following tips, any of which the spin could land on: 1) minimize global state by whatever options are available, 2) minimize the number of incidental side-effects per procedure and document the side-effects that remain, 3) minimize the general "action-at-a-distance" quality of all code, 4) minimize the assumptions a reusable library makes about the application, 5) minimize surprise without introducing inconvenience by somehow providing multiple library code paths--for instance, by including an optional DoMyWorkForMe parameter, a separate super-deluxe FunctionWithStateMunging, or an OptimizedForYourConvenience façade object.
As for my situation, in which the offending API is written and maintained by a vendor and distributed only in bytecode form (say hello to my little friend, the .Net Reflector), 6) frakking minimize the frelling respect and tolerance and money for that kriffing vendor.
Thursday, December 20, 2007
peeve no. 254 is "argument from common sense"
"Common sense is the collection of prejudices acquired by age eighteen." -Albert Einstein (link)
In the midst of an argument, the most satisfying retort is exposing the opponent's reasoning as a fallacy, such as one of these. But the most frustrating reasoning to deflate is reasoning that's not even wrong: its truth value is meaningless, mu, zero (and not the way C conditionals judge zero). The "argument from common sense" is in this category of fallacy. When someone advances an idea on the basis of, or defended by, what he/she calls common sense, no recourse is available. Possibly one could gain some satisfaction by responding with "I find your abundance of faith in unexamined assumptions disturbing", but that still wouldn't be a real counterargument.
Real counterarguments to the argument from common sense would be meaningless, because the argument from common sense is meaningless, because common sense is meaningless in an objective debate. I don't mean that the concept of "common sense" is meaningless, but that its very usefulness is its function as a catchall term for implicit knowledge. It doesn't have a fixed definition. Without sharp definitions, debates can endlessly spin in confusion purely from misunderstanding of what each participant is communicating. (Not incidentally, this is why the legal documents of a society under the rule of law are careful to define all important nouns. If you use the GPL, you really should know exactly what a "derivative work" is.)
Moreover, a bare argument from common sense exudes sheer ignorance of observer bias and disregard for rigorously evaluating truth via proof--how much of common sense is the result of investigation, as opposed to culture and mindless conformity? As I've heard some people say, with tongue-in-cheek, "common sense isn't common". A subset of common sense knowledge for one person, or even an entire group, might not be a subset of common sense knowledge for another person, or even another entire group. Is it common sense to use global variables sparingly, if at all? Is it common sense to indent code in a consistent manner? Is it common sense to change the default/root password? Is it common sense to finally cease using <font> tags, for Bob's sake? Maybe for some people, those procedures are so blindingly obvious that using the description "common sense" is pretty much an insult to the words. For other people, perhaps not...
The argument from common sense is not an argument. If trying to justify a statement by referring to common sense, find some other way. Look closer at what "common sense" really means in that specific case. At least make your assumptions explicit, so everyone knows you aren't using common sense as a logical hand-wave (no need to look here, it's common sense!). Common sense has done a lot for you. Don't push it into being something it isn't.
Friday, December 14, 2007
peeve no. 253 is overuse of the word 'satire'
Here's a quotable line: "When someone uses pointed humor to discredit my opinions, it's vicious meanness. When someone uses pointed humor to discredit your opinions, it's satire." (I was inspired by Mel Brooks' "Tragedy is when I cut my finger. Comedy is when you walk into an open sewer and die."--link)
Regardless of the above, satire does have an objective definition, and that definition doesn't match up with some of the commoner usages of "satire" I've been reading. I bet some of you same offenders have been misusing "irony" and "decimated". As I understand it (he yelled from his padded cell), satire is the timeless technique of applying humor of exaggeration in public discourse to starkly emphasize the ridiculousness of an opposing viewpoint. Satire is a part of language; it predates telecommunication, mass communication, photography, and practical electricity. Effective satire, especially satire which stands a minute chance of influencing others, requires a corresponding degree of intelligence and wit. A Modest Proposal is an excellent example. In politics, which is a subject I attempt to studiously avoid in this blog even in the face of an imminent presidential election year, satire is a welcome diversion from the typical "shout louder and more often" tactic. I respect satire.
Having described satire, it should be clearer what satire is not. Admittedly, my idealization of satire may not perfectly fit the official/academic category of satire.
- Satire is not synonymous with political humor, because not all political humor is satire. A joke which has a political subject might not be a satirical joke.
- Satire is not synonymous with sarcasm, although both include the idea of statements whose intended purpose is to express the opposite of their meaning. Both include the idea of statements that attack something. (Side comment: some comments I've heard called "sarcastic" should be called "facetious".) However, sarcasm is the more general of the two. Satire may be an extended application of sarcasm to a particular viewpoint.
- Satire is not an attack on a politician's personal appearance, personality, mannerisms, etc. It's an attack on his or her beliefs or actions.
- Satire is more than a hodgepodge of one or two line quips on political issues. Satire employs greater depth in narrower focus, as it attempts to illustrate that a viewpoint is not merely funny, but ludicrous.
- Satire is more than creating unflattering fictional caricatures of an opponent or viewpoint--perhaps that technique is satire, but no more than the lowest form of it. Satire's goal is to make a point. Having the fictional caricature say something like "rustle me up some ding-dongs" may be worth a chuckle (but hardly more), yet it neither discredits follies nor imparts wisdom.
Tuesday, December 04, 2007
peeve no. 252 is artless programming
I intended to ramble on for a bit about how readability/ease of understanding is not at all guaranteed by the mere use of a runtime-typed language, but then I recognized that the point I planned to make was in a sense the opposite of a previous peeve, no. 251: equating programming and art. The main thrust of 251 is that programmers don't need to babble on about pretty code and being artists when each programming task (ideally) has an objective purpose and objective obstacles known as bugs. Regardless of the opinions of a few folks, code is likely to be far more appreciated for fewer bugs than for beauty.
However, I can't deny the existence of programming style, nor can I ignore its manifestation or absence in any particular piece of code. The level of style exhibited by code has been analyzed and quantified in a few ways: lists of code "smells", program-structure calculations such as cyclomatic complexity, lint-like tools for flagging potential problems (tried JSLint, by the way?), and, of course, automated style-enforcement programs. The sense of style I am describing includes and goes beyond all those.
It also goes beyond bug reduction, the province of 251. Code listings A and B can be equally bug-free, but the level of style in A can still be unmistakably superior to B. Style is part of what Code Complete tries to communicate. Duplicated code has less style than modular code. Cryptic variable names have less style than informative variable names; so do verbose variable names. Objects and functions that overlap and do too much have less style than objects and functions with clear-cut responsibilities. On the other hand, style indicators are more what you'd call guidelines than actual rules, because style is an elusive concept which escapes hard definition.
One example of the difference between code with style and code without style has been burned into my memory. In one of my first few undergraduate classes, the professor divided the students into three groups to write a pseudo-code algorithm for translating a day of the month into a day of the week, but only for one specified month. Thanks to my influence (I was the first one to suggest it, anyway), the group I was in used the modulus operator (%) as part of the algorithm. One or both of the other groups made an algorithm without the modulus operator, a patchwork algorithm having several simple "if" statements my group's algorithm did not. All the algorithms solved the (trivial) problem of producing the correct output for each input; the benefit of having several people mentally trace/check the code was probably why the students broke into groups in the first place. Need I state which of the algorithms had the most style, regardless of my all-too-obvious bias?
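A sketch of the two approaches, assuming (hypothetically; I don't recall the actual month) a 31-day month whose 1st falls on a Tuesday:

```python
# Hypothetical reconstruction of the classroom exercise: map a day of the
# month (1-31) to a weekday, for one fixed month whose 1st is a Tuesday.
WEEKDAYS = ["Tuesday", "Wednesday", "Thursday", "Friday",
            "Saturday", "Sunday", "Monday"]


def day_of_week_modulus(day_of_month):
    """My group's version: one modulus operation."""
    return WEEKDAYS[(day_of_month - 1) % 7]


def day_of_week_patchwork(day_of_month):
    """The other groups' version: a patchwork of simple "if" statements."""
    offset = day_of_month - 1
    if offset >= 28:
        offset -= 28
    elif offset >= 21:
        offset -= 21
    elif offset >= 14:
        offset -= 14
    elif offset >= 7:
        offset -= 7
    return WEEKDAYS[offset]
```

Both produce the correct output for every input; only one of them has style.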
Here's another quick example, from a few days ago. Out of curiosity, I was looking over some checked-in code. A utility class contained a handful of methods. One of those methods had many complicated steps. In defense of the code's style level, the method's steps were sensibly grouped into a series of "helper" methods, called in succession. The gaffe was the visibility of those helper methods: public. Is it likely that other code would try to call any of those helper methods individually? Is it even likely that other code would use that narrowly-applicable utility class? No on both counts. The public visibility of the helper methods causes no bugs, and realistically speaking will never result in bugs. Nevertheless, objects that "overshare" a contract/interface to other code represent poor style.
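In Python terms (with hypothetical class and method names, since I'm not reproducing the vendor's code), the fix is as simple as marking the helpers internal, so the class advertises only the one method other code should call:

```python
class ReportBuilder:
    """Hypothetical narrowly-applicable utility class."""

    def build(self, rows):
        # The one genuinely public method: the complicated steps,
        # sensibly grouped into helpers called in succession.
        cleaned = self._strip_blanks(rows)
        totals = self._sum_columns(cleaned)
        return self._render(totals)

    # Leading underscores: internal steps, not part of the contract.
    def _strip_blanks(self, rows):
        return [row for row in rows if any(row)]

    def _sum_columns(self, rows):
        return [sum(column) for column in zip(*rows)]

    def _render(self, totals):
        return ", ".join(str(t) for t in totals)
```

Python's underscore convention is advisory rather than enforced, but the point stands in any language: the contract an object shares with other code should be as small as its realistic callers need.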
The programmer who checked in this code is not dumb, yet he had a style "blind spot". Style is subjective, but it can be learned. Constructively-minded code reviews should put an emphasis on improving the style of everyone involved.
If you've read Zen and the Art of Motorcycle Maintenance, you may also have noticed that its conclusions about language rhetoric resemble the above conclusions about programming style. Quality craftsmanship can arise from the mental melding of craftsman and material. It's neither objective nor subjective, neither art nor science. A common expression of programmers is a language or framework "fitting my brain". Artless programming proceeds without the attempt to achieve this deep awareness.
Monday, October 22, 2007
peeve no. 251 is equating programming and art
I think I understand. Someone can appreciate art by contemplating a work's brilliant juxtaposition of shapes or colors, or by admiring the high level of craftsmanship, or by projecting his or her own reactions onto it (hence, art appreciation has a subjective aspect). Similarly, someone can appreciate code by contemplating its achievement of minimal complexity through a brilliant combination of simpler elements, or by admiring the high level of knowledge and skill applied to it, or by reacting to the symmetric use of whitespace.
Oh, right. The visual distribution of whitespace is entirely beside the point in code appreciation. Know why?
Code has a specific reason to exist. It has a purpose. (When the sentient programs in Matrix Reloaded, whether agents or exiles or Smith, talk about purpose, they're showing a laudable degree of self-awareness.) It does something. It operates. It executes. Data isn't merely one of the concerns addressed by a particular piece of code; data is in a fundamental sense the only real concern. Every cruddy program in existence performs one or more of the universal tasks of creating, reading, updating, and deleting data. True, some programs may do no more than yank pieces of data from here and there, transform or summarize the pieces, and dump out a transitory result. However, what came out (a display screen, a printout) still differs significantly from the data that came in, so data creation has happened. Even games process data: the data is a singularly trivial but highly-lucrative stream resulting from the iteration of a loop of the abstract form "Game State += UI input".
The definite purpose of every snippet of programming code, data processing, is what separates it from art. There is no inherently common purpose of art. Well, besides being the expression of the artist's vision, but a work's success in fulfilling this purpose is necessarily a subjective judgment by the artist, and once the work is finished that purpose is as completed as it will ever be. (Remaking and revising a work to better express the artist's original vision is the creation of a new work. Hi, George Lucas!) This difference in essence between code and art underpins several important distinctions. Although comparing programming and art may be instructional, these are reasons why equating programming and art is not quite right.
- Primarily, programming is objective in a way that art isn't. Code has a specific purpose, which is processing data in a specific way. If a machine executes the code, but the data processing has not occurred correctly, the cause may be complicated and subtle (or so simple as to be overlooked), but something has undeniably failed. The purpose(s) of art are not so explicit and obvious, which means the failure(s) of an artwork are not so explicit and obvious. Horrible art may be "ugly", but horrible programming is borked.
- For computers and code to feasibly/acceptably automate data processing, the objectively-evaluated accuracy just described must be consistently reliable. Therefore, computers and code must be deterministic. Parts of the data processing may be "indeterminate", like a string of pseudo-random numbers, or parts of the execution order itself may be indeterminate, like a set of parallel threads, but taken as a whole, the guaranteeing of accurate results implies determinism. Determinism, in turn, implies that inaccuracies do not "happen spontaneously", but have a root cause. The root cause in any particular situation may in fact be an erratic one, due to flaky memory for instance, but nevertheless the inaccuracy has a determinate cause. More to the point, if the programming is at fault, then this cause is known as a bug. Since all parts of the code are intertwined, and code, while it can be fault-tolerant, certainly isn't self-conscious and self-correcting (that's what programmers are for!), it follows that, speaking in general of code from the metaphor of a "black box", an inaccuracy could potentially be due to any part of the code. When the code as a whole can objectively fail, then every part of the code can objectively fail, too. An artwork is much more forgiving of its parts. Critics of any art form will comment that a work had some "regrettable qualities", but "by and large, succeeded". Programming has a much more menacing antagonist, the bug, and no matter how artistically or aesthetically correct the code is, one lil' bug means the code, at least under some conditions, is fatally flawed.
The practical conclusion is that the degree of beauty exhibited by code, as appreciated by discriminating programmers, is beside the point if it doesn't imply fewer short-term and long-term bugs. Evaluating and debating code prettiness is not something programmers need to do in order to have substantive discussions about software development. Why is using a popular, well-supported framework likely better than making your own? Yours probably has bugs you don't know about yet. Why is code modularization good? It cuts down duplication, enabling easier and more effective bug fixes. Why is uncluttered syntax good? It affords the programmer more time and attention for the subtler bugs of analysis and design. Why is increasing development agility (whether or not ye gods have deigned you Agile-Approved) good? It minimizes the possibilities for bugs to creep in. Why is understandable code good? It makes the tasks of finding and fixing bugs more productive later.
Don't show and tell me how shiny my code will be, thanks to your silver bullet. Show and tell me how bug-free, how effortlessly correct my code will be, thanks to your silver bullet. I am not an artist.
- Continuing the bug discussion, the dirty little non-secret of programming is that, with few exceptions, no sufficiently complex code is entered without mistakes the first time, or the second time, or the third time. The wizards and geniuses of programming confess to regularly forgetting about some minor detail (is it array.length or array.length()?), and mistyping identifiers (what was the name of the variable for the number of combinations?). Speaking out of my ignorance and lack of artistic talent, the mental exertion/concentration required to perfectly perform the concrete actions of programming--obeying syntax, tracking semantics, remembering the general flow of the design/algorithm, error-free typing--outweighs the similar demand on the artist. In response, helpful programming-oriented editors arose, built for relieving some of the strain with auto-completion and other code assists.
- Moreover, bugs in the manual transcription of a design/algorithm are some of the easier ones to catch. Programming is an excellent illustration of how easy it is to gingerly proceed through a canned series of steps, appearing to conscientiously complete each one, and end up creating complete horsepucky (or, if you prefer, bulldooky). As people swiftly discover after exiting school, the unwritten assumptions and the errors in reasoning about the data domain can yield code which processes the wrong data, processes data irrelevant to the user's needs, or processes the data insufficiently for easy use. Consequently, transactions with programming clients differ from transactions with art patrons. The art patron (editor, supervisor, commissioning official) has a set of loosely-defined outlines/parameters for the artist to follow, and the art should be acceptable if it is within those boundaries. The programming client quite likely also has a set of loosely-defined requirements, but, because code is not art, code is strictly defined: this input transformed to that output. Loose definitions of the purpose are insufficient for code because loose results from code are insufficient. Take a stupid, crayon-level example: calculating sales tax. Is the purpose of "calculating sales tax" met by code which assumes a tax rate for California? Yes--but only if the input and output are Californian! A programming client who loosely states he or she wants code that "calculates sales tax", receives that code, and runs it in Wyoming, is not happy.
- A greater amount of precision distinguishes programming requirements from artistic ones. Time does, as well. Artworks can be unchanging, and remain sublime expressions of the human spirit--this "requirement" of the artwork doesn't shift. In contrast, as the turn of the century reminded everyone, a shift in requirements around pristine code can make it useless. "Functioning as designed" turns to "bug". It's a cascading effect: the world changes, the data that represents the world changes, the processing of the data changes, the code that implements the data processing changes. No doubt, certain codebases are long-lived, but only by embodying the most general of concepts and nurturing areas of change. emacs persists both because the need to edit text persists and because it supports arbitrary extensions.
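The sales-tax scenario in the list above is easy to sketch. A hypothetical pair of functions (the function names are invented; the 7.25% and 4% figures are stand-ins for a California and a Wyoming rate, used purely for illustration):

```python
# Two versions of "calculate sales tax". The first bakes in an assumed
# California rate; the second forces the loosely-stated requirement to be
# made explicit by demanding the rate as input. Rates are illustrative only.

def sales_tax_loose(amount):
    return amount * 0.0725        # hidden assumption: California

def sales_tax_strict(amount, rate):
    return amount * rate          # the client must state the real requirement

# Fine for the Californian client...
print(round(sales_tax_loose(100.0), 2))        # 7.25
# ...wrong for the client in Wyoming, who needed a different rate:
print(round(sales_tax_strict(100.0, 0.04), 2))  # 4.0
```

The loose version "works", and that's exactly the trap: nothing about it fails until it meets data from outside its unwritten assumption.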
The practical conclusion is that the degree of beauty exhibited by code, as appreciated by discriminating programmers, is beside the point if it doesn't imply fewer short-term and long-term bugs. Evaluating and debating code prettiness is not something programmers need to do in order to have substantive discussions about software development. Why is using a popular, well-supported framework likely better than making your own? Yours probably has bugs you don't know about yet. Why is code modularization good? It cuts down duplication, enabling easier and more effective bug fixes. Why is uncluttered syntax good? It affords the programmer more time and attention for the subtler bugs of analysis and design. Why is increasing development agility (whether or not ye gods have deigned you Agile-Approved) good? It minimizes the possibilities for bugs to creep in. Why is understandable code good? It makes the tasks of finding and fixing bugs more productive later.
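One of those questions, why modularization cuts down duplication and makes bug fixes stick, fits in a few lines. A contrived sketch (all names invented):

```python
# Copy-pasted logic: the same subtle bug (a wrong boundary, say) would
# have to be hunted down separately in every copy.
def volume_up(v):
    return min(v + 5, 100)

def brightness_up(b):
    return min(b + 5, 100)

# Modularized: the clamping rule lives in one place, so a bug fix or a
# changed requirement (new maximum, new step size) lands exactly once.
def step_up(value, step=5, maximum=100):
    return min(value + step, maximum)

def volume_up2(v):
    return step_up(v)

def brightness_up2(b):
    return step_up(b)

print(volume_up2(98))  # 100
```

The payoff isn't that the second version is prettier; it's that a fix to `step_up` fixes every caller at once.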
Don't show and tell me how shiny my code will be, thanks to your silver bullet. Show and tell me how bug-free, how effortlessly correct my code will be, thanks to your silver bullet. I am not an artist.
Friday, September 07, 2007
peeve no. 250 is unnecessary surrogate keys
The Wikipedia definition of surrogate key is here. In my database experience, it takes the form of an "autonumber" column, also referred to as "serial" or "identity". Whatever the database product or nomenclature, each new row has a guaranteed-unique, generated value in that column. The commonness of surrogate key columns verges on mundane.
As with the other peeves, my irritation has subjective and objective aspects. Objectively, a surrogate key has the downside of being fairly useless for queries for multiple rows--which also makes its index a fairly useless optimization. This is tempered by the syntactical ease of retrieving one known row, but that case is really more of a "lookup" than a "query", don't you think? (I would contrast the selection and projection of tuples, but why bother...)
Another objective danger with unnecessary surrogate keys, perhaps the worst, is the duplication of data which shouldn't be duplicated. By definition the surrogate key has no relationship with the rest of the columns in the row, so it isn't a proper primary key for the purpose of database normalization. If the cable company's database identifies me by an auto-generated number, and I sign up again as a new customer paying promotional rates after I move, the database won't complain at all. In practice, of course, the entry program should aid the user in flagging possible duplicates, but clearly that lies outside the database and the data model. Admittedly, judging whether "Joe Caruthers in Apt B3" is a duplicate of "Joseph Caruthers in Building B Apartment #3" is a decision best left up to an ape descendant rather than the database.
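The duplicate-customer problem is easy to demonstrate with SQLite. A sketch (the table layouts and the customer data are invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Surrogate key only: the database happily accepts the "same" customer
# twice, because the generated id is the only thing checked for uniqueness.
cur.execute("""CREATE TABLE customer (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    name TEXT, address TEXT)""")
cur.execute("INSERT INTO customer (name, address) VALUES (?, ?)",
            ("Joe Caruthers", "Apt B3"))
cur.execute("INSERT INTO customer (name, address) VALUES (?, ?)",
            ("Joe Caruthers", "Apt B3"))
print(cur.execute("SELECT COUNT(*) FROM customer").fetchone()[0])  # 2

# A key on the real data (still imperfect -- spelling variants slip through,
# as noted above) at least rejects the literal duplicate.
cur.execute("""CREATE TABLE customer2 (
    name TEXT, address TEXT,
    PRIMARY KEY (name, address))""")
cur.execute("INSERT INTO customer2 VALUES (?, ?)",
            ("Joe Caruthers", "Apt B3"))
try:
    cur.execute("INSERT INTO customer2 VALUES (?, ?)",
                ("Joe Caruthers", "Apt B3"))
except sqlite3.IntegrityError:
    print("duplicate rejected")
```

Neither table catches "Joseph Caruthers in Building B Apartment #3", of course; that judgment call stays with the ape descendant.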
The objective tradeoffs of a surrogate key are worthy of consideration, but my subjective disdain for unnecessary surrogate keys goes beyond that. I feel violated on behalf of the data model. Its relational purity has been sullied. Lumping in an unrelated number (or value) to a row of real data columns feels out of place, like adding a complex "correction factor" to a theoretically-derived equation so it fits reality. But the unnecessary surrogate key has the opposite effect: it causes the table to have less resemblance to what it models. An unnecessary surrogate key leads someone to wonder if the table was the victim of a careless and/or thoughtless individual who went around haphazardly numbering entities to shut the database up about primary keys. Normalization, what's that?
I've seen enough to realize the convenience of a surrogate key, especially for semi-automated software like ORM mappers. It's also clear that sometimes a surrogate key is the only feasible way of uniquely identifying a set of entities, particularly when those entities are the "central hubs" of the data model and therefore the corresponding primary keys must act as the foreign keys for a large quantity of other tables. The technical downsides of a surrogate key can be mitigated by also including a real primary key for the table.
If the overall design doesn't make obvious the choice of each table's primary key, that's a clue the design should be reevaluated. (You did design the tables before creating them, right?) Don't forget that sometimes the primary key consists of every column. The table that serves as the "connecting" or "mediating" table in a many-to-many join is a good example, because it's essentially a table of valid combinations.
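A "connecting" table where the primary key is every column, sketched in SQLite (the schema is invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# A junction table for a students <-> courses many-to-many join: each row
# *is* a valid combination, so the primary key is simply every column.
# No surrogate key needed.
cur.execute("""CREATE TABLE enrollment (
    student_id INTEGER NOT NULL,
    course_id  INTEGER NOT NULL,
    PRIMARY KEY (student_id, course_id))""")

cur.execute("INSERT INTO enrollment VALUES (1, 101)")
cur.execute("INSERT INTO enrollment VALUES (1, 102)")  # same student, fine
cur.execute("INSERT INTO enrollment VALUES (2, 101)")  # same course, fine
try:
    cur.execute("INSERT INTO enrollment VALUES (1, 101)")  # same combination
except sqlite3.IntegrityError:
    print("combination already recorded")
```

The composite key documents the table's meaning, a set of valid combinations, better than any generated number could.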
Thursday, June 28, 2007
peeve no. 249 is the casual use of 'random'
As rants go, the capriciousness of this one is up near level 11. I hardly expect anyone else to share my irritation. Nevertheless, I must tell the World.
In my opinion, the everyday use of the word "random" is nearly always inaccurate. If someone you were talking with abruptly switched the conversation's subject, neither that person nor the shift was random. It may have seemed random, but only because you weren't insightful or knowledgeable enough to fathom the connection.
To be fair, the mental leaps people are capable of making can be incredibly subtle and creative, even to the point at which the likeliness of anyone else duplicating the leap is nil, whether with or without the exact same set of information. Don't call it random. Mental leaps are essential to progress.
If someone shifts mood suddenly, the chances are good that it isn't random, although someone's true motivations may be cloudy at best. The cases that come closest to randomness are those in which the emotional imbalance is clearly linked to a chemical imbalance - and in those cases the underlying cause is definite!
Humor that consists of events happening outside of rhyme and reason is not random, either. It's a writer frantically hurling nonsensical ideas when he or she doesn't care enough to write organized material. Unfortunately, such "humor" is its own undoing, because sooner or later the audience will notice the absence of any point, then naturally wonder why they're settling for such. One should also note that even "random" humor can't, in fact, be truly random. True randomness alienates an audience. Seriously. Use some dice or something to pick a set of elements, then combine the elements. The result will not cause laughter so much as perplexity.
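The dice experiment above is trivial to run. A sketch with made-up word lists, seeded so the "randomness" is reproducible (which, fittingly, is the point of the post after this one):

```python
import random

# Roll the dice: combine arbitrary elements and see if anything funny emerges.
subjects = ["a walrus", "the tax code", "my kneecap", "Tuesday"]
verbs = ["interrogates", "marinates", "refactors", "applauds"]
objects = ["a lighthouse", "existential dread", "the soup", "gravity"]

rng = random.Random(42)  # seeded: even this "randomness" is deterministic
for _ in range(3):
    print(rng.choice(subjects), rng.choice(verbs), rng.choice(objects))
# Perplexing, yes. Funny? Not really -- the combinations have no point.
```

Run it a few times with different seeds; the output stays resolutely unfunny, because nothing constructed it to work.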
Humor isn't random; it must be constructed so that it works.
Sometimes the actions of large organizations, or the managers of those organizations, may be labeled as "random". If thinking that makes one feel better, fine. Specific decisions perhaps actually have random elements (if two products are so similar in price and features as to be practically indistinguishable, who cares which one will be bought?). The reasoning behind other decisions may seem shaky or hasty, at best. But you better believe that when manager 3 suddenly starts advocating wonder technology 6, the idea did not enter his head by accident.
Tragedies, or the victims of tragedies, feel "random". My genuine (discomforting) advice is to abandon the illusion that the Totality Of Reality operates according to well-defined rules at any time, tragedy or not. I have barely any control over what happens to me. Neither do you. Those in supreme positions of power don't either; they could be eliminated by microscopic organisms, an assassination, or a coup. If "random" is a synonym for "out-of-control", then "random" is just the way things are.
Ugh, now I'm depressed. Stupid Total Perspective Vortex.
Friday, April 13, 2007
peeve no. 248 is politics on nonpolitical sites
I just seem to be racking up these peeves lately, don't I? Well, I've been busier, and unlike other posts these peeves almost write themselves, are short, and require no inspiration beyond trivial dissatisfaction...
Seeing purely political diatribes, stories, links, ads, etc. on a web site that isn't geared toward politics unavoidably puts me in a bad mood (sequestering politics in a distinct "Politics" section is, of course, fine, as is politics with a connection to the site's topic matter). If I agree with the political point, I become upset that the point had to be made at all, because it's so obviously right! If I disagree with the political point, I become upset that there are nerfherding idiots polluting the 'Net all the time! The politics must surely be annoying for readers in countries outside the US, too...
If I want to read politics, there's no shortage of politically-geared sites I could go to. Drop the off-subject politicking everywhere else, please!
Friday, April 06, 2007
peeve no. 247 is equating FLOSS to communism
As with some of my other rants, my problem is not so much the existence of something, but how it is used - its connotations and motivations. Merely comparing FLOSS (Free/Libre/Open-Source Software) to communism per se is not what bothers me. Moreover, to some degree the comparison makes sense. FLOSS code wouldn't be FLOSS if it wasn't possible for the masses to use it, share it, study it, refine it, etc. The code is effectively in a massive "communal pool" (laying aside complications of legal incompatibilities between differently-licensed code...). Developers produce according to their abilities, and users consume according to their needs. If only communism worked this well in practice.
On the other hand, an important distinction divides FLOSS from communism: code can be duplicated at little cost, which makes economic analogies less apt because the "supply side" is virtually limitless. This is the premise of shareware; customers can handle the "manufacturing" and packaging and distribution of the product on their own, reducing everyone's cost/risk. Another distinction is the control of the "means of production". These days, compilers and interpreters and entry-level SDKs often cost nothing except the time and bandwidth to download them. Depending on the target platform, no special equipment beyond a common PC may be required to produce code, either. Of course, there's still the cost of labor, but until computers comprehend human language, brainstorm brilliant ideas, design attractive images, and so forth, developers will continue to be required. The essential point is that the topics communism concerns itself with are not directly applicable to software methodology, so communism answers questions software methodology doesn't ask and ignores other questions software methodology does ask.
Unfortunately, generally speaking, the people who compare FLOSS to communism aren't doing it to start up an interesting discussion. No, they do it to end the discussion with a strategy not far removed from name-calling. To turn people away from a vehicle, say "deathtrap". To turn people away from a philosophy (where I live, anyway), say "communism". And this is what irks me, not that the connection may be somewhat approximate, but that anyone thinks it is a convincing argument against FLOSS. "He has pointy ears, like one of those evil leprechauns, so he must be up to no good!"
What's so scary about sharing? What's so problematic about large numbers of people working together, benefitting everyone at once? Developers can still be paid to make the code (some projects have feature "bounties", and one of the dreams of any FLOSS project founder is to be hired by a company to work on that project full-time). The code can still be sold, though the right of anyone to copy the code does mean the most competitive price will be close to if not exactly zero. Speaking more fundamentally, not all code has to be FLOSS. Or someone could compromise. Conceivably, code could be proprietary at first, until the owner decides he or she has milked it enough, then released as FLOSS. Dual-licensing is another avenue, although many developers will be understandably reluctant to contribute to dual-licensed software. And let's be frank. Some software simply won't be written, or written well, unless pay is involved. I've worked on stuff that was so dull, it felt like work.
Charges of communism are pointless. Embrace the commune. Be not afraid. Think of it as a kibbutz, if that helps.