Friday, June 12, 2009

I'm GLAD I'm not programming in natural human language

At least one of the hard, undeniable facts of software development doesn't really swipe a developer across the face until he or she starts work: the difficulty of extracting good information from people. Someone may say "the bill is a set rate per unit," yet a mere five minutes later, after further interrogation, be coaxed into conceding that "customers pay a flat rate for items in categories three and four." Similarly, the reliability of the words "never" and "always" should be closely scrutinized when they're spoken by people who aren't carefully literal. (On the other hand, xkcd has illustrated that literal responses are inappropriate in contexts like small talk...)

I'm convinced that this difficulty is partly caused by natural human language itself. It's too expressive. It isn't logically rigorous. By convention it supports multiple interpretations. While these attributes enable it to metaphorically branch out into new domains and handle ambiguous or incompletely understood "analog" situations, the same attributes make it too imprecise for ordering around a computing machine. Just as "I want a house with four bedrooms and two bathrooms" isn't a sufficient set of plans to build a house, "I want to track my inventory" isn't a sufficient set of software instructions to build a program (or even a basis on which to select one to buy).

Every time I perform analytical/requirements-gathering work, I'm reminded of why I doubt that natural human language will ever be practical for programming, and why I doubt that my job will become obsolete any time soon. I can envision what the programming would be like. In my head, the computer sounds like Majel Barrett.

Me: "Computer, I want to periodically download a file and compare the data it contains over time using a graph."
Computer: "Acknowledged. Download from where?"
Me: "I'll point at it with my mouse. There."
Computer: "Acknowledged. Define periodically."
Me: "Weekly."
Computer: "Acknowledged. What day of the week and what time of the day?"
Me: "Monday, 2 AM."
Computer: "Acknowledged. What if this time is missed?"
Me: "Download at the next available opportunity."
Computer: "Acknowledged."
Me: "No, wait, only download at the next available opportunity when the system load is below ___ ."
Computer: "Acknowledged. What if there is a failure to connect?"
Me: "Retry."
Computer: "Acknowledged. Retry until the connection succeeds?"
Me (getting huffy): "No! No more than three tries within an eight-hour interval."
Computer: "Acknowledged. Where is the file stored?"
Me: "Storage location ______ ."
Computer: "Acknowledged. What if the space is insufficient?"
Me: "Remove the least recent file."
Computer: "Acknowledged. What data is in the file?"
Me (now getting tired): "Here's a sample."
Computer: "Acknowledged. What is the time interval for the graph?"
Me: "The last thirty data points."
Computer: "Acknowledged. What is the color of the points? Does the graph contain gridlines? What are the graph dimensions? How will the graph be viewed?"
Me: "Oh, if only I had an analyst!"

6 comments:

  1. Anonymous 12:33 AM

    That (declaratively) is exactly how we should program! When it gets tedious, you create a ... software library.

    "Download from here with retry strategy 1-3-delta and storage strategy 4-6-sigma."

  2. But maybe by the time we have this language, the computer will also really understand human language. Your example would then be:
    Me: Computer, I would like to download economic data from bbc-dot-com's homepage; value estimates of Google (GOOG), and let me see a graph over time, will you?
    Computer: Okay... here you go!

  3. Anonymous 4:44 AM

    Two words: sensible defaults.

  4. Anonymous 5:48 AM

    I wonder what programming in Lojban would be like. It's syntactically unambiguous and based on logic, so there wouldn't be nearly as much of a problem.

  5. OK, I should have used a more complicated example, but I got bored with it pretty quickly, even as is. (The "huffy" and "tired" descriptions are rank exaggerations, of course. I really wouldn't be that temperamental.)

    @Anonymous #1: I'm not claiming that the declarative paradigm isn't useful. My point is more that the more expressive a declarative language must become, the less benefit it offers over general-purpose programming languages, particularly the good ones.

    @Johan: I actually thought of this later. Some of the "mashup editor" demos (Pipes? Popfly?) I've seen might be able to handle the example with a minimal number of clicks in the interface and no actual code at all. It would just be a sequence of drag-and-drop actions: a "URL source", some "filters" to extract the data, and a "graph" output. Again, the example wasn't great at making the point.

    @pozorvlak: As I was writing, one of the first examples I thought of was video encoding/transcoding, where there are a ton of knobs and switches (Xvid, for example) to specify. But I realized that, almost every time I've done it, I haven't known the trade-offs of fiddling with the settings, and the programs I like are the ones that have (drum-roll) sensible defaults. Those are important in any GUI or API.

    @Anonymous #2: To the extent that I can see natural human language working as a programming language, I assume it would need to sit somewhere halfway between the two: unambiguous, but without devolving into descriptions of algorithms and data structures (those would still have to be written by whoever implemented the in-between language). It reminds me a bit of "fluent interfaces", as in the sketch below. Based on my experience, I'm still highly skeptical that I would enjoy using a "programming language for everybody". Isn't that the motivation for COBOL?
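
    A fluent interface along those lines might look something like this; it's a hypothetical sketch, not a real library, where each chained method just records its settings:

    ```python
    class ScheduledDownload:
        """Hypothetical fluent builder: every method records its settings and
        returns self, so the calls chain into something close to a sentence."""

        def __init__(self, url):
            self.settings = {"url": url}

        def weekly(self, day, hour):
            self.settings.update(day=day, hour=hour)
            return self

        def retries(self, max_tries, within_hours):
            self.settings.update(max_tries=max_tries, within_hours=within_hours)
            return self

        def storage(self, path, on_full):
            self.settings.update(path=path, on_full=on_full)
            return self

        def graph(self, last_points):
            self.settings["last_points"] = last_points
            return self

    # Reads almost like the original request...
    job = (ScheduledDownload("http://example.com/data.csv")
           .weekly(day="monday", hour=2)
           .retries(max_tries=3, within_hours=8)
           .storage("/var/data", on_full="evict_oldest")
           .graph(last_points=30))
    ```

    ...but notice that every ambiguity from the dialogue has simply become a parameter. The precision still has to come from somewhere.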

  6. The last example can be solved easily by defining default options, e.g. in a file. Imagine the computer were a real AI whose behaviour you "tweak" by setting all the options. (At best, of course, this could be done interactively, with the computer asking questions and storing the results.)

    But we don't even have voice-based input everywhere yet, so these... visions rarely become reality.

    PS: And humans are a lot better than computers at UNDERSTANDING something. Yes, that even includes "dumb" people. Computers are stupid; AI is a joke in 98% of situations...

