Sunday, November 26, 2006

peeve no. 244 is breaking upgrades

Preface: I'm a little hesitant about embarking on this rant, because I'm fully aware that I may come out looking like an imbecile, n00b, or [insert preferred term of mockery here]. But then I remember that a written rant is the thinking man's version of a gorilla tantrum, so ranting doesn't flatter me anyway.

Until a few days ago, I was running Debian testing ("etch"--kudos on trying to get the next release out sooner). Well...more or less. My installation started out quite a while ago as Mepis, but over time I replaced almost everything with plain Debian. I used some packages from Debian unstable (Sid) here and there. Then I read that many users run the unstable distribution, it doesn't break as much as you might think, it always has the latest and greatest, and it's not quite as frozen as testing gets close to a release. So I pointed my apt at pure unstable (can you guess where this is going?).

I upgraded a few of the packages I care about most and enjoyed the new features. Then I hit an upgrade that involved one of those "program A depends on framework B depends on infrastructure C" cases that make me glad for automatic dependency handling. I was running hotplug, because that was part of the initial install (the kernel was a distro-patched version 2.6.12, I believe). I waited until I had a lengthy, contiguous block of time, and let the upgrade happen. Oooops. The udev package made it very clear that it wanted to run on a later kernel. This might faze some folks, but kernel shlepping is not new to me considering I ran Gentoo back before it offered any precompiled kernel at all, so I installed the Debian meta-package that depends on the latest kernel, made sure the necessary bits were in my grub config, and rebooted.

This wouldn't be a rant if it had worked. The kernel ran through the boot process, appeared to load the modules it should, and started executing items in the initrd. End result: something similar to "can't mount /dev/hdb3 on /". The filesystem on that partition is ext3! I started to feel the primal rage that comes on whenever, from my perspective, software refuses to do as it's told. I rebooted back into the other kernel. I searched the Web for similar troubles. I'm inclined to blame udev for not doing its job, or maybe the initrd creation tool. I fell into a cycle of fiddling with the packager or rerunning a tool, rebooting, calling out dirty names, and repeating the process. After I had enough, I decided to admit I had no clue what the cause was. Some unknown magic smoke on my system wasn't magically smoking like it should. I was desperate. My new kernel couldn't even get me to a real shell. I humbled myself before apt, and upgraded the world.

Afterwards, the new kernel still wouldn't boot, the old kernel booted and initialized fine, and pretty much everything in userland was down for the count. No X, no network. As a broken guy with a broken system, I copied the important stuff on my / partition to my home partition, tossed in one of my LiveCDs long enough to download and burn an iso, and then installed Ubuntu. I have vowed to confine my repository list to the basics, because breaking upgrades suck huge donkey balls. Apologies for any unpleasant mental images.

Bonus rants: It's been an interesting change going from Konqueror to Nautilus (is it still called Nautilus?), amaroK to rhythmbox, KWin to Metacity. Here are my complaints:
  • The volume/mixer applet for the KDE panel showed levels for several outputs, and I could adjust one of the levels just by clicking or dragging the level bar in the panel. The Gnome applet appears to require clicking on the icon, waiting until a slider pops up, then moving it. And if you want to change something other than Master, double-click it to get a complete set of sliders.
  • amaroK's panel icon dynamically changed as the current track progressed. So a quick glance at the icon was sufficient to tell how far in the track was. You don't miss this feature until you've had it and then been without it. It would also be nice if rhythmbox had the "stop after playing the current track" command that amaroK has.
  • In KDE, I could assign a permanent keyboard shortcut to a window. For instance, I could assign a shortcut to my browser window on desktop three, a different shortcut to amaroK on desktop one, and switch from one to the other with a single multi-key combination. (I like to use the "Win" keys for this purpose). Gnome has a window-selector panel applet that pops up a list of all open windows (Alt-Tab merely cycles between windows on the current desktop) on click, but that requires mousing over and clicking on the applet, scanning the window list, and clicking on the desired window. When I was going through my minimalist phase (on Gentoo), I could use and assign keyboard shortcuts for what seemed like everything in IceWM (don't mention Fluxbox, I'll slug you). Surely Metacity can do the same?

Tuesday, November 21, 2006

subjective impressions of Objective-C

After I read a comment on some blog expressing high praise for the features of Objective-C, I've been going through some introductory materials found by Google. I must say that the experience has been eerie. It's almost as if someone were storing C and Smalltalk on the same volume, there was some corruption, and the recovered file(s) ended up mashed together. I confess to knowing about as much about Smalltalk as Objective-C (that is, close to nil - HA, I made a funny!), but the resemblance is obvious. I think I have a clearer understanding of how people can assert() that Java is C++ trying to get back to its Smalltalk roots, but not quite getting there.

First, the parts I like. The Categories capability is intriguing. It reminds me of the roles that are part of Perl 6. I also like the choices the language offers: either use objects with a type of "id" and send messages to whatever object you like, or use static object types and Protocols when that level of dynamic behavior is undesirable or unnecessary. Even better, Objective-C inherits the high-performing "close-to-the-metal" compilation of C, with an extra library that handles the fancy object tricks at runtime. I kept wondering why I hadn't heard more about this language, and why it didn't seem to be in more widespread use (by the way, I own nothing made by Apple).

Then my reading uncovered several justifiable reasons why Objective-C didn't hit the big time, at least on the scale of Java or C++. It doesn't have a true standard, which means that the chance of multiple entities implementing it is correspondingly lower. On the other hand, one open implementation can act as a de facto standard, so this criticism may not apply to GNUstep. Another problem for me is the syntax. Punctuation ([ : - +) is used in ways that I've never seen before. Perl is worse in this way, and I suppose that programmers who seriously use Objective-C become accustomed to it. Something else that bugs me is Objective-C's strong association with specific framework(s). A standard library is fine, of course, in order for a language to be useful, but I expect there to be competing libraries or toolkits for anything more complicated. I also wish that Objective-C were implemented on a common platform (Parrot, CLR, JVM), which would eliminate this issue. But then Objective-C would be competing with languages that have dynamically-typed OO as well as convenient syntax niceties that elevate the programmer above C's level. Frankly, although Objective-C is fascinating to study, I don't think it fits any important niches anymore, except possibly the niche currently occupied by C++. If you need Smalltalk-like abilities, use Smalltalk. If you just want to crunch numbers or strings at a high level of abstraction, use OCaml or Haskell. If you deeply need code that performs well, use C with a good compiler or even assembly. If you just need to solve a common problem, use a scripting language.


Saturday, November 18, 2006

the existence of mathematical entities

I've started reading The Road to Reality by Roger Penrose. I'm not far into the book at all, which stands to reason since I'm reading it slowly and in small doses. I was surprised to read that mathematical entities exist as Platonic ideals, which in effect seems to mean that their existence is neither physical nor subjective. (It reminded me of the three Worlds described by Karl Popper). In contrast, I believe that mathematics is a human creation having no independent being. That is, mathematics exists in the minds of people. It has the same claim to reality as Arda.

But you may argue, "Numbers are self-evidently real, at least because we use them to count objects and then accurately operate on those quantities". Say that I accept your argument for the time being and grant you all the whole numbers. I can do this because the whole numbers and simple 'rithmetic make up only a small sliver of modern math. Consider complex or "imaginary" numbers, which involve the square root of -1 (i), or even just negative numbers or 0. Historically speaking, these concepts did not come easily to humanity. Are these more abstract math concepts still applicable and useful, in spite of perhaps not having direct analogues to normal experience? Sure, but that's not the point. The point is to realize that from the logical standpoint of the entire math system, all numbers, whether whole or zero or negative or complex, as well as all equations and functions, have the same degree of "existence"; all belong to a system of axioms, definitions, postulates, theorems, etc. And this system was not discovered by people. It was invented, bit by bit, by starting with elementary ideas and then extending those ideas. 1 did not exist until a person wanted to represent how many noses he had.

One of the important ways in which the human creation of mathematics differs from the human creation of Rumpelstiltskin is that math proceeds logically (whereas Rumpelstiltskin isn't that logical at all). This means that mathematicians assume the truth of as few statements as they can get away with, and then show that putting those statements together leads to everything else they need. Hence the obsession of mathematicians with proofs--proofs mean that someone can rely on the truthfulness of conclusion Z13(w) provided that same someone relies on the truthfulness of some definitions and axioms. Proofs make the implicit truth explicit. The discoveries of mathematics, because the discoveries come about logically, consist of finding out what was implicitly accepted as true all along. So pi, or e, or i, are not ideal entities that eternally exist until someone finds them. That is, not like the speed of light, c. Mathematical entities, even constants, are just consequences of certain definitions. In effect, no humans created or fully comprehended the infinitely deep Mandelbrot set; they simply defined a set by defining a function that operates on complex numbers as previously defined. No human has counted every number even in the set of integers, either, but the set is infinitely large because it is defined that way.
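
To make the definitional point concrete, here's a minimal (modern-day, Python) sketch of the Mandelbrot set. Membership falls out of nothing but the iteration rule z → z² + c; note that max_iter is an arbitrary cutoff standing in for "stays bounded forever", which no finite computation can actually check:

```python
def in_mandelbrot(c, max_iter=100, bound=2.0):
    """Approximate membership of c in the Mandelbrot set.

    The set is *defined* as all complex c for which iterating
    z -> z**2 + c from z = 0 never escapes to infinity. We can
    only approximate "never" with a finite number of steps.
    """
    z = 0 + 0j
    for _ in range(max_iter):
        z = z * z + c
        if abs(z) > bound:  # escaped; c is definitely outside the set
            return False
    return True  # still bounded after max_iter steps; presumed inside

print(in_mandelbrot(0 + 0j))   # the origin stays at 0 forever
print(in_mandelbrot(1 + 1j))   # escapes within a couple of iterations
```

Nobody "explored" the set into existence; the dozen lines of definition are the whole thing, and every image of it is just a consequence being worked out.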

Math does not originate or exist in the (completely objective) physical world, although the physical world displays enough order that mathematical models can correspond to it with varying degrees of accuracy. Math also does not originate or exist in a (completely objective) hypothetical world of Platonic ideals. Math is all the logical consequences of specific definitions, and the definitions in turn are/were developed by humans who used their powers of abstraction to create logical ideas out of physical reality. The same thought processes that enable generalized language (the word and representational concept of "chair" as opposed to the specific instance I'm sitting on at the moment) also enable the creation of mathematical entities. And logic, the process of combining true (abstract) statements into new true (abstract) statements, enables anyone to discover what they believed all along.

blog updates

I switched this blog over to the new "beta", and then I applied some labels (or tags?) to the posts and tried out the blog customization. Not too shabby. Now the WordPress blogs don't make me feel quite as inferior.

Here's some further explanation about the subject each label represents, with the caveat that this list will soon be out of date:
  • .Net/F#. (ASP).Net or F#
  • Blog Responses. A response to a post on another blog
  • Metaposts. The blog
  • Philosophical Observations. Sincere but amateurish remarks about (what passes here as) deep philosophical topics
  • Rants. Rants that serve no constructive purpose
  • Reviews. Reviews of movies, books, TV
  • Software Development. Techniques, debates, and general observations of software development and programming languages
  • Software Explorations. My investigations into existing software, or any original (to me) ideas for/about software
  • Trivial Observations. Miscellaneous topics and comments

Thursday, November 16, 2006

the power of the sun in the palm of my hand

I use my mobile phone pretty much just for phone calls. Color me odd. But when I saw that a mobile SimCity was available, I couldn't resist giving it a whirl, if nothing else for the sake of nostalgia.

I bought SimCity Classic off the store shelf a while ago, copyright 1989, 1991. The box I still have is the same one pictured in the Wikipedia article. In the lower left corner it lists the system requirements: "IBM PC/XT/AT/PS2, Compatibles", "Supports EGA, CGA, Hercules mono and Tandy Graphics", "Requires 512K (EGA 640K)", "Mouse and Printer Optional". I ran it on an XT-compatible with CGA graphics and 512K, and no mouse. It was DOS, of course, so the game had its own windowing system, along with a menu option to animate only the top window for better game performance. Here's a line from the manual: "Simulator reaction time is also greatly affected by your computer's clock speed and type of microprocessor. If you have an XT-compatible running at 4.77 MHz, life in SimCity will be much slower than on a 386 running at 33 MHz." Its copy protection was a red card with cities and population data. On startup, the game would ask for a statistic off the card. Any product I purchased from Maxis seemed to be reasonable on the issue of backup copies and copy protection. Considering the number of times I experienced a floppy disk failure, it was only right. Later versions of SimCity lost my interest, since the family's computer never kept current enough to play them.

Enough reminiscing. The mobile version I played had similarities to what I remembered, but actually was more complex in a few ways (to start with the obvious, a much wider range of colors). And the graphics weren't top down, but perspective-drawn, as if looking down at the city from a high oblique angle - the kind of view from a helicopter. The game was just as responsive running on my cell phone as it had been on my XT-compatible, maybe more so. My point is that technological advances happen so gradually that it can sometimes take a common point of reference--in this case, the game SimCity--to illustrate the great difference. My puny phone can run a game that is very similar to a game that ran on a desktop, and the phone version is even enhanced. It feels like I have the power of the sun in the palm of my hand! Almost makes you wonder if all that speculation about the Singularity isn't a huge load.

Along the same line, I ran Fractint when my family finally had a 386 (no coprocessor, which sometimes was a pain when running this program). It was an outstanding program created through the collaboration of many coders working over the Internet, or maybe it started on CompuServe for all I know; either way, Fractint must have been the first time I'd heard of people doing that. By the time I came around to it, Fractint's feature list was incredible. In addition to many fractals and even fractal subtypes, the display parameters could be tweaked to do a slew of wild effects. It also offered a long list of video modes, any combination of resolution and colors one might want. Rather than sharing the generated images with others, the user could simply use a menu option to save the Fractint parameters that resulted in a given image and send the parameter file to another Fractint user, who could then use the corresponding menu option to load the parameters into Fractint and regenerate the image. Many of the fractals have informative documentation screens that explain the origin of the fractal and its calculation method. I could just go on and on.

As you may guess, I've kept my DOS Fractint files around just like the way I kept my SimCity materials. Any sentimental coot knows that Linux has a project, DOSEMU, that can run (Free)DOS excellently. DOSBox may be better for games, but I have had no trouble running Fractint with DOSEMU. After adding the line "$_hogthreshold = (0)" to my /etc/dosemu/dosemu.conf, which seems to be the equivalent of saying "sure, you can have my CPU, I don't need it for anything!", DOSEMU generates even the worst, most-complicated fractals of Fractint so quickly it's not even funny, in spite of the layer of indirection. Having to wait to see your fractal was such a large part of the experience...if all programs were written to perform as well as Fractint, latency in software would be a laughable concern. Here in my room is a computer that has so much more processing power than that 386, and it mostly sits idle, or whips through a web page render, or decodes a media file. It's a criminal case of waste.
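
For anyone wanting to replicate that tweak, the edit amounts to appending one line to DOSEMU's config. The /etc/dosemu/dosemu.conf path is the Debian-era default and an assumption here (it may live elsewhere on other systems, and editing it needs root), so this sketch works on a local copy:

```shell
# $_hogthreshold = (0) tells DOSEMU never to yield the CPU while DOS idles --
# i.e. "sure, you can have my CPU, I don't need it for anything!"
# The real file would be /etc/dosemu/dosemu.conf (root-owned); we demo on a copy.
conf=dosemu.conf.local
touch "$conf"
# Append the setting only if it isn't already present (single quotes keep the
# literal $_ from being expanded by the shell).
grep -q '_hogthreshold' "$conf" || echo '$_hogthreshold = (0)' >> "$conf"
grep '_hogthreshold' "$conf"
```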

Tuesday, November 14, 2006

First rule of proselytization is to respect the infidel

The idea behind the clash of civilizations post was that the Java (and C#, etc.) folks have different values than the dynamic languages camp, so arguments that are completely convincing to one side are considered irrelevant by the other side. They talk over each other's heads.

However, if someone wants to try to talk across that gap, and possibly help someone cross, I think that insults are not a good start. "In defense of J2EE's complexity" addresses the layered, many-pieced architecture of J2EE through a historical point of view. As a general rule, technology (especially software) is invented, popularized, and evolved to serve a specific need, even if that need may be quite broad. J2EE wasn't an accident! The problem with this blog entry is the argument at the end: that software engineering should be complex enough to keep out BOZOs. chromatic called him on it over on his O'Reilly blog, because, obviously, there are many reasons for a non- or anti-BOZO to use a simpler technology.

An example of a good way to communicate with your intellectual opponents is this post by Sam Griffith. He's the same writer who previously expounded on a Java macro language, which I linked to from my MOP post a while ago. He gives Java its credit, but at the same time explains how Java could learn some things from other languages/platforms that had similar goals. Another good example is the Crossing Borders series over at IBM developerWorks (even if it exhibits the annoying trait of non-Java content in a Java-centered context). Each page in the series demonstrates an alternative approach to the common Java way of accomplishing a task, and then it compares and contrasts. Some of the articles, like one on Haskell, honestly don't seem to offer much for the Java developer, and one on web development seems to basically suggest that we should go back to embedding straight Java code in our page templates. But the one on the "secret" sauce of RoR is enlightening.

Personally, I often read these articles with a half-smirk, because the pendulum swing here is clear: we started out writing web stuff in C, then switched to Perl, then Java, and now back to Python/Ruby/PHP or what have you. The other reason for my smirk is because I'm now forced to work in ASP.Net anyway. But if Microsoft can do anything, it's copy and clone, so there's work afoot to use IronPython for more rapid web work.

Saturday, November 11, 2006

the SEP field of social interaction

Disclaimer: I am not a serious student of sociology, although I have had a spiffy liberal arts education. Moving along...

A post about mockery and now this. When I started this blog, I assumed it would mostly revolve around my chosen specialty/vocation. What can I say? I haven't had much to write about lately, tech-wise. I do have the following tidbit of commentary. The agreement between Microsoft and Novell about SUSE has inspired me to rewrite the well-known Gandhi quote "First they ignore you, then they laugh at you, then they fight you, then you win" to "First Microsoft ignores you, then Microsoft spreads FUD about you, then Microsoft pretends to peacefully coexist with you, then Microsoft or you crushes the other". Not as poetic, but accurate.

Enough preamble ramble. My experiences in society have reminded me of the Hitchhiker's series once more. This time, it's the Somebody Else's Problem field. The SEP field is a field that accomplishes the same effect as a typical invisibility field, but by masking everything in the field as "somebody else's problem". It exploits the natural human (and all aliens?) impulse to not take responsibility for something unless necessary.

An SEP field-like effect can occur in social interactions, too. An SEP field social interaction consists of someone not acknowledging his counterpart's very existence. As I once heard someone joke, "His existence is not pertinent to my reality". The SEP field effect is not the same as ignoring someone else, because that would imply that the other person was noticed and then disregarded. It is also not the same as indifference, because that also implies an uncaring awareness of the other person. Within the SEP field, people may as well not exist at all, because in either case the impact is precisely null. In fact, this could be a useful detection method for social SEP fields. If someone's absence is not noted in any way, then clearly an SEP field may be masking that individual when he/she is present.

I should further note that the SEP field of social interaction is not a reliable indicator of cruelty or inhumanity. Simply put, someone would have to acknowledge someone else exists in order for an act of cruelty or inhumanity to occur. Those within the SEP field are not necessarily hated, ignored, low-class, etc. They're just nonexistent to others. I don't know if the SEP field has been explored in sociology, but I'm reminded of the categories of Gemeinschaft and Gesellschaft, where I think of Gemeinschaft as "society in which people interact based on relationships" and Gesellschaft as "society in which people interact based on purpose". I'm more inclined to put the SEP field of social interaction in Gesellschaft, though I suppose it really fits in neither, given that SEP fields negate normal society.

How can one defeat an SEP field? Hell if I know. Oh, wait: Hell is other people. Aw, now I'm just confused.

Monday, November 06, 2006

how I met your Whedon

So, I've been hooked on "How I Met Your Mother" since the first episode. It's just a good show. And it happens to star Alyson Hannigan, who played Willow on Buffy. Then Alexis Denisof, who played Wesley on Buffy and then on Angel, guest-starred in a few episodes. Fun times, but relatively unsurprising considering Alyson and Alexis are married.

The trend continued when Amy Acker, who played "Fred" on Angel, managed to slip into the very last episode of season one. That makes three alumni from Whedon shows. Until tonight, when Morena Baccarin, who played Inara on Firefly, was on as, go figure, a gorgeous rival to Alyson's character. Even the most skeptical viewers would have to admit that these casting choices have gone beyond coincidental. The ultimate topper is that a minuscule role in tonight's episode, an unnamed "Guy", was played by none other than Tom Lenk, who played Andrew on Buffy! It's a conspiracy, I tell you, a conspiracy! Is anyone else seeing this? I feel like I'm taking crazy pills!

On Wednesday, Nathan Fillion, who played some sort of superdemon thing on Buffy and then Capt. Mal in Firefly, will be on Lost. It will also be the last episode before Lost breaks for a while so its production schedule can catch up. I'm thinking it will be a good one. I resisted the urge to comment on the last episode. I read online that what transpired was planned from the start, I've learned to trust the writers, and I appreciated the grand significance of it all.

For those who are keeping track, Neil Patrick Harris is gay. Not that there's--oh, I can't finish the line, it's so old and moldy.

Sunday, November 05, 2006

you are who you mock

Disclaimer: I am not a serious student of pop culture studies, media or film studies, sociology, anthropology, etc., etc. Moving along...

Recently I've been paying special attention to mockery in media. Anyone with a passing acquaintance with human nature would guess correctly that the trigger was when I experienced one of the groups or subcultures I identify with being mocked. (The applicable truism is that people find it hard to even care about something unless it affects them personally).

It was on a TV show some weeks ago. I won't delve into that with any greater detail, since this is one of those cases in which specific examples might distract from considering mockery as a general concept. I will add that the remark was well-aimed and well-executed, and clearly hilarious to anyone not taking it seriously. I'm not considering here whether mockery is ethical, justifiable, effective, enjoyable, or necessary (IMNSHO: the context and degree and intention are all pivotal factors). I'm also not considering self-mockery or lighthearted mockery between friends.

For me, the interesting insight I've reached after observing the mockery of one group by another is how it defines the group doing the mocking. Even casual or off-the-cuff mockery broadcasts a group's values, mores, and norms through a huge megaphone, and in more than one sense.

First, a sincere mocking reinforces the identity of the group by indicating who the group is not. We are not Them. There is no intersection between our groups, because if there were, we would be talking trash about ourselves!

Second, mockery can indicate what the mocking group fears, hates, or doesn't understand. One of the primary uses for humor is coping. Before mocking another group, I may know full well that they are not all idiots or degenerates; I may possibly be on good terms with someone in that group. But if something about that group disturbs me, I can defang it by laughing at it as if it were a characteristic of an idiot or degenerate. I simply declare it unworthy of uncomfortable consideration.

Third, mockery may illustrate how a group measures the status of its members, because mockery consists of bestowing an uncomplimentary status on another group. If my group equates status with intelligence, then you could reasonably expect me to mock the intelligence of other groups. A group that values fashion may mock the clothing of others. And so on and so forth. You don't mock what you don't measure, because the funny part is how low he/she/they measure up.

Fourth, the method and manner of mockery can betray a group's concepts of humor or disgust. If a group mocks another group by calling them a bunch of grues, then clearly the group believes that grues are either funny or distasteful. If I mock you by calling you a jackass or an even more colorful metaphor, then I must not think much of jackasses.

You are who you mock. More practically, even mockery can be a possible vector of understanding between groups.