Showing posts with label Computer Industry Observations. Show all posts

Friday, November 30, 2012

Twitter is the de facto OpenID

OpenID was intended as a way for website visitors to log in to one website and then reuse that logged-in "identity" on other websites as well. The wonderful benefit of OpenID is avoiding the hassle of setting up separate identities for separate websites. And if someone is already logged in to the "providing" website, then they can skip the login step altogether on the "relying" websites, which more or less only need to ask the provider who's currently logged in. (The typical security precautions apply: the easiest first steps to prevent someone else from reusing that information are to log out from all websites and quit the web browser.)
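The provider/relying-party division of labor described above can be sketched in miniature. Everything here is invented for illustration--real OpenID works through browser redirects, discovery, and signed assertions rather than direct method calls--but the shape is the same: one site holds the session, the others merely ask.

```python
# Toy sketch of a providing site and a relying site. All names are
# invented for illustration; this is not the OpenID protocol itself.

class IdentityProvider:
    """The 'providing' website: users log in here once."""
    def __init__(self):
        self._sessions = {}      # token -> username
        self._next = 0

    def log_in(self, username, password):
        # (Real password checking omitted in this sketch.)
        token = f"tok-{self._next}"
        self._next += 1
        self._sessions[token] = username
        return token

    def who_is(self, token):
        """Relying sites ask: who does this token belong to?"""
        return self._sessions.get(token)

class RelyingSite:
    """A website that skips its own login by asking the provider."""
    def __init__(self, provider):
        self.provider = provider

    def greet(self, token):
        user = self.provider.who_is(token)
        return f"Welcome back, {user}!" if user else "Please log in."

provider = IdentityProvider()
site = RelyingSite(provider)
token = provider.log_in("alice", "hunter2")
print(site.greet(token))   # -> Welcome back, alice!
```

Note that the relying site never sees a password; it only ever holds a token, which is also why logging out at the provider (or quitting the browser) cuts off the relying sites.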

If the preceding paragraph led to the reaction, "Huh? OpenID sounds good but I've never heard of it," then it's clear why some commentators declare OpenID a failure. It's currently in use and it will most likely survive for a long time to come, but it never achieved widespread popularity. In my opinion, a surprising competitor has surpassed it to become the top identity source: Twitter. Here's why.
  • Publicity. By any measure, Twitter is well-known and constantly visited. This is vital for a successful identity source. If an identity source is relatively unknown or dormant, websites won't have a strong reason to accommodate it. And the fewer websites that accommodate it, the less appealing it is as an identity source, which then causes fewer websites to accommodate it, which then causes it to be less appealing as an identity source... The upshot is that an identity source thrives when it's famous for something other than providing identities to other websites.
  • Upkeep. Twitter's reputation for availability has fluctuated. Nevertheless, as a major website (see the previous point), it has an obvious interest in ensuring that visitors, apps, and other websites can log in quickly without problems. OpenID was by necessity a secondary option for logging in, so neither the providing nor the relying websites were especially careful about ensuring OpenID functioned properly—OpenID visitors were less valuable anyway, precisely because the website didn't force them through the information-gathering portion of login setup. All too often, either the providing or the relying website rejected the other due to miscellaneous errors. Since OpenID was a low priority, it wasn't as well-tested or maintained while the website's normal login code evolved. As happened with RSS feeds, website redesigns accidentally broke OpenID access, and nobody ever repaired it. OpenID's independent and noncommercial existence protected it from the potential extinction of any single company, but it didn't have a dedicated team of paid support staff responsible for its bugs, like Twitter's login does.
  • Informality. In comparison to other websites, Twitter's login setup is quite rapid, undemanding, and painless. Twitter needs very little information for the activities it supports. Thus, whether on purpose or not, it's well suited to the incidental task of supplying uncomplicated login information. Given that the motivation behind the search for an identity source is dodging irksome/repetitive login setup, this characteristic of Twitter is ideal.
For anyone who wants to scatter their comments throughout the Web, Twitter has greater practicality than OpenID. Who knows, someday I may stoop to use it to micro-blog...

Tuesday, November 29, 2011

this is an undefined entry

Standardization is indispensable. Through independent standards, various products can interoperate well, regardless of market competition. On the other hand, successful standards have appropriate limits. Too much constraint impairs vital product flexibility1. To balance these concerns while remaining unambiguous, a standard may explicitly mark certain topics as "undefined": it identifies them and then declines to rule on them. Hence, a well-known technological gaffe is to assume too much about what a standard leaves undefined.
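Here's a toy sketch of how that plays out, using an invented mini-spec rather than any real standard: the "spec" defines a function's result only for non-negative input, so two implementations that diverge wildly on negative input both conform.

```python
# An invented "standard": floor-of-square-root is defined only for
# n >= 0; behavior for n < 0 is explicitly undefined. Both
# implementations below conform; they just exploit the freedom
# differently (one does nothing useful, one fails loudly).

def spec_sqrt_floor_a(n):
    """Conforming implementation A: silently returns 0 for n < 0."""
    if n < 0:
        return 0          # the "do nothing" flavor of undefined
    return int(n ** 0.5)

def spec_sqrt_floor_b(n):
    """Conforming implementation B: refuses undefined input."""
    if n < 0:
        raise ValueError("input is undefined by the spec")  # the "fail" flavor
    return int(n ** 0.5)

# On defined input, every conforming implementation must agree:
assert spec_sqrt_floor_a(10) == spec_sqrt_floor_b(10) == 3
```

The gaffe the post describes is a caller who tests against implementation A, concludes that negative input "returns 0", and is then shocked when implementation B throws.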

I appreciate such indistinct confessions. In fact, one of my cynical unfulfilled wishes is to exercise that option in other contexts.
  • I'll finish the project by the agreed deadline. On the condition that everyone else finishes the preceding tasks promptly. Else the finish date is undefined.
  • I'll contribute a proportional share to the joint cost of supper. On the condition that I stopped by an ATM recently. Else I should warn you that the amount that I will find in my wallet is undefined.
  • I'll advocate the replacement of the software package. On the condition that the result is an unqualified improvement. Else my original level of support for the replacement is undefined.
  • I'll follow the specifications. On the condition that the specifications are exact matches for what I was already planning. Else my fidelity to the specifications is undefined.
  • I'll write a program that handles the input correctly. On the condition that all the input is correctly entered. Else the program's actions are undefined.
  • I'll stay alert at all times in long meetings. On the condition that I slept soundly the previous night. Else my attentiveness is undefined.
  • I'll continue adding more examples to the list. On the condition that I don't start to feel bored. Else the number of more examples is undefined.


1The product can exploit the flexibility in myriad ways. It could do nothing or fail. Or it could provide additional benefit. It could violate unstated expectations. Or it could guarantee a result which the standard doesn't. It could optimize for greater efficiency by performing fewer checks. Or it could optimize for greater infallibility by performing innumerable checks.

Wednesday, October 26, 2011

shared links are different than RSS

The point of RSS is automation. RSS is an established standard1 so that no human needs to keep revisiting a long list of vetted topical sources that regularly publish worthwhile content. With RSS, a computer can rapidly and flawlessly determine whether another computer offers new content. If available, the content itself can be automatically retrieved, too.
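The automated check can be sketched with nothing but the standard library. The feed content and function names below are made up for illustration; a real reader would also fetch the feed over HTTP (ideally with conditional GETs), which this omits.

```python
# Minimal sketch of what an RSS reader automates: parse a feed and
# report only the items not seen before.
import xml.etree.ElementTree as ET

def new_items(feed_xml, seen_guids):
    """Return (title, guid) pairs for unseen items, updating seen_guids."""
    root = ET.fromstring(feed_xml)
    fresh = []
    for item in root.iter("item"):
        guid = item.findtext("guid") or item.findtext("link")
        if guid and guid not in seen_guids:
            seen_guids.add(guid)
            fresh.append((item.findtext("title"), guid))
    return fresh

# A tiny invented RSS 2.0 feed for demonstration:
feed = """<rss version="2.0"><channel><title>demo</title>
  <item><title>first post</title><guid>post-1</guid></item>
  <item><title>second post</title><guid>post-2</guid></item>
</channel></rss>"""

seen = {"post-1"}
print(new_items(feed, seen))   # -> [('second post', 'post-2')]
```

A computer can run this check against hundreds of sources every few minutes without boredom or oversight, which is precisely the point.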

Human link sharing is different. When humans share links, those links are likely intermixed with irrelevant items like unsolicited opinions. The links may be about a wide range of uninteresting topics. Although link sharing distributes the work of revisiting valuable websites and flagging fresh information, the flaws of a manual procedure continue to apply. How reliable is the crowd at performing this task? As a unit, they're more reliable than one person, but I daresay that a program is better still.

RSS is for mechanizing the syndication of "publishing sources". I'd say that humans who share links are more accurately classified as additional publishing sources than as "John Henry competitors" to efficient RSS software. I'll concede that link sharing can be a more convenient answer than RSS to the question, "How do I obtain a simple list of interesting links to visit?" But RSS is superior if the question is, "How do I, a unique individual, follow the ongoing activity of every source in a list that I control?"

Shared links won't replace RSS for my needs. To suggest otherwise is to misunderstand the purpose of RSS. When a website shuts down its feeds due to "lack of interest", I'll gladly turn my attention elsewhere.

1Ha! RSS is a standard in the sense that you pick whichever one you like.

Wednesday, June 30, 2010

peeve no. 260 is factional prejudice in IT

Faction: a group or clique within a larger group, party, government, organization, or the like
I don't know what it is about IT that results in professionals forming factions. Use text editor X! Align your punctuation thusly! Avoid all software from company Q! Programming language R is your cool friend that all the popular kids like! Every problem should be flattened via solution U! Your VCS is DOA!

I understand that people develop attachments to their favorite tools and passion for their work; that's fine. What drives me nuts is the all-too-common exaggeration of these feelings. Acting as a cheerleader for your chosen IT factions can be fun, but if you ground your personal identity in it then you've officially gone mental. An IT professional should make technological decisions based on actual requirements and well-understood trade-offs, not based on which option fits a personal "style". Of course, usability and even nebulous subjective appeal are nevertheless legitimate factors in the decision, simply because a difficult and hated option is detrimental to productivity and morale. The difference is a decision-maker who critically weighs all applicable criteria instead of just always "voting with the faction". Emotional investment is present without overpowering every other factor. Someone doesn't say "I'm a BLUB programmer" but rather "I know the most pertinent portions of the syntax and semantics of BLUB, and I like BLUB because of BLURB". (I've complained before about grouping programmers by language.)

Although I'm ranting against factional prejudice, I caution readers not to interpret my opinion as a case of "binary" thinking (i.e., thinking characterized by a simplistic division into only two categories). Some factional prejudices are quite beneficial and practical when kept within reasonable and specific bounds. For example, the past behavior of a company or open source project is a good reason to be wary of its current behavior. A platform's reputation for lackluster efficiency is a worthwhile justification to avoid it unless the platform demonstrates improvement (or the efficiency is sufficient for the task). Evaluate factional prejudice according to its motivation, relevance, and realism.

For businesses at least, technology isn't an accessory or an avenue of personal expression. It serves a purpose beyond the satisfaction of one's ego. The One True Way might not be what's right for the situation and the client.

Monday, May 26, 2008

the earliest ever "second system"?

From chapter 2 of The Code Book by Simon Singh, section "Mr. Babbage Versus the Vigenère Cipher" (page 64 in my edition):
These mathematical tables were calculated by hand, and the mistakes were simply the result of human error. This caused Babbage to exclaim, "I wish to God these calculations had been executed by steam!" This marked the beginning of an extraordinary endeavor to build a machine capable of faultlessly calculating the tables to a high degree of accuracy. In 1823 Babbage designed "Difference Engine No. 1," a magnificent calculator consisting of 25,000 precision parts, to be built with government funding. Although Babbage was a brilliant innovator, he was not a great implementer. After ten years of toil, he abandoned "Difference Engine No. 1," cooked up an entirely new design, and set to work building "Difference Engine No. 2."

Monday, April 28, 2008

why computer tech IS distinguishable from magic

Arthur C. Clarke's third law is "any sufficiently advanced technology is indistinguishable from magic" (this law has been handy for sci-fi writers who write Halloween episodes, for example). Would modern computers be indistinguishable from magic by someone from a past era?

Naw.

Three hobgoblins, if not more, scuttle the illusion that computers are magic: entropy wrecks the hardware, information's continually changing shape makes software obsolete, and engineers (hardware and software) are fallible. In the typical portrayal of magic, it doesn't have these problems. Imagine if a magus had to contend with them:
  • Entropy. Eventually, a wand would stop sending out sparks like it used to. Worse, it might start short-circuiting. Quests for hidden enchanted objects would be pointless, since those objects would long since have lost all effectiveness. Ancient books would fall apart, of course. Potions would go bad (or go good or go neutral, whatever). Flying brooms would start to give out, hopefully not during a high flight, and develop the same troubles as other wooden things, like splinters (ouch!). The power factor would demand special attention since magical actions expend so much energy, particularly in transformations of matter. No matter what the source of magical power is, that source wouldn't be inexhaustible (even if it's human thought/concentration/imagination/emotion that would still imply that each application leaves the person's consciousness reduced somehow). Anything that can operate while disconnected from a magical power source would need periodic recharging, but a "standby" or "hibernate" mode would help. Related to the power factor is the dissipation factor. The efficiency of magicked widgets wouldn't be absolute, resulting in a certain amount of magic leakage. Presumably, the leakage wouldn't be desirable, and overuse might produce an (apocalyptic?) overheated state. On the other hand, purposely approaching this limit for the sake of performance might be reasonable if done conscientiously. One would hope that the manufacturer would be overly conservative in estimating the threshold.
  • Information malleability. First of all, spells would be much more specific than commonly assumed. The desired effects represent a finely-detailed confluence of information; else the vivified brooms will never stop carrying water in buckets from here to there, which would be an exceedingly serious bug, eh? The spell must precisely communicate its intent because the information must come from somewhere, and the information to produce a specified effect is always conserved. This means that the spell to transform someone into a newt is different than the spell to transform someone into a toad. And what if needs change in the future? For instance, now the toad must be able to retain the power of speech? Well, with any luck, the original spell-writer will have accommodated that by making the preserved human attributes configurable through additional optional spell clauses--if the spell-writer wasn't enough of a soothsayer to predict that need, then a new, complex spell is required. As for the toad, if the human's vocabulary wasn't extracted and stored just in case during the process of the first transformation, then the sorcerer has no other option than starting again with a different subject. Then there's the situation of a kitchen spell that automatically washes pans with steel wool. That's great until buying a Teflon skillet. A golem guard under instructions to pulverize uninvited visitors, like pushy salespeople, may seem great until it mashes your next-door neighbor, who came by to drop off the newspapers that were delivered during your vacation. A spell to create one hundred units of currency out of air is fairly convenient in the U.S. (your 15% interest rate credit card doesn't count), but chanting the same spell in Japan is not.
  • Human fallibility. Oh, sure, tales of less-than-stellar mages aren't hard to find, and neither are tales of inadvisable attempts at something poorly understood. However, it's relatively unusual to find accounts of a normal practitioner who has a level 5 certification and occasionally fails to perfectly complete a level 3 incantation. Oops, the caster slurred a few words. When he or she was visualizing what should happen, concentration dropped for a moment or the envisioned vehicle's interior wasn't considered. Too many drops of the sixth ingredient went in, and now the concoction is worthless. The crystal ball is a tad oblong, causing everything to look horizontally stretched. A Trinket of Power has some flaws that necessitate tapping it several times whenever it intermittently stops functioning. When the spellbook was copied, several errata crept in. Some unnoticed iocaine powder was on the witch's fingers when she was stirring the cauldron, and consequently the antidote mysteriously acted like a poison. Instructions for the bewitching of the shoes didn't explicitly state that the shoes had to be made by leprechauns. Wearing an Invisibility Anklet or an Invincibility Brooch is fine, but wearing both at once causes problems during the night of the first quarter of the moon. And the Bow of Elvish Accuracy is compatible only with FairyGenuine arrows, although any Crossbow of Standard Aim should at least wing the target with arrows that claim to match the Warlock Council's Archery Dictates.
Computer technology is indistinguishable from magic. Heh. I wish.

Tuesday, March 04, 2008

peeve no. 257 is the idea of Internet "governance"

I'm not sure if this opinion places me in the category of realist or idealist, but the whole idea of Internet "governance" irks me. The Internet is a "network of networks", and a network is a connection between devices. It's a simple idea, while at the same time being fiendishly complicated to put into practice. The Internet is "devices talking to one another".

Nobody should find it necessary to comment that the Internet is decentralized, or to comment that the value of the Internet is at the "edges", i.e. the devices actually sending and receiving data. In simple terms, when ma and pa click on an icon for a Web browser to "go to the Internet", the closer metaphor isn't switching TV channels or turning pages in a massive encyclopedia (Information Superhighway, remember that?). They're telling the computer to "make a phone call" to other computers. This point was easier to emphasize back when people plugged a regular phone line right into the computer's internal or external modem. IPTV and VOIP probably interfere further with the casual user's attempt at a metaphorical understanding...

If the fundamental Internet is just computers talking, then discussing its fundamental "governance" is nutty. Who "governs" TCP, UDP, and so on? Who "governs" HTML, CSS? Who "governs" SMTP, POP, IMAP? Who "governs" IP, IPv6? The Internet is communication. What matters is that the communication works right here, right now. Standards and protocols enable valid communication. The "penalty" is miscommunication, e.g., a malformed data packet or a web page that's "broken" in an idiosyncratic renderer.

No entity--company, government, standards body--"governs" the Internet. The Internet isn't owned by them either. I want to believe this is true. The irresistible tendency to see the Internet as a marketplace, public diary, fan convention, software development technique, leisure entertainment, newsroom, information repository, and so forth masks the reality. There is no such thing as cyberspace and no such thing as the blogosphere. The Internet is device A exchanging data with remote device B, via a path of complex technologies and intermediaries. If almost all important information exchange comes to happen through this method, and strict "governance" extends into micromanaging that communication, then previous losses of privacy will seem tame by comparison.

Saturday, January 19, 2008

the pigs are walking?

If the title doesn't make sense, I highly recommend reading Animal Farm. When the farm animals overthrow the farmers, the pigs are the leaders. By the end of the book, the pigs have started walking; a grand revolution of government has ultimately produced leaders just like the old. In the Lost episode Exposé, Dr. Arzt states that the pigs are walking when he's questioning the (de-facto) leaders.

In this case, I just read a commentary piece which at one point gives several examples of huge, monolithic, feature-bloated, overly-complex programs. One of them is Firefox.

But as people may or may not remember, Firefox (er, Phoenix) first became popular as a light, focused alternative to Mozilla (now SeaMonkey). (Actually, at that point in time I was happily using Galeon--never used Epiphany either.)

This isn't really a sighting of "pigs walking", because the commentary piece is referring to the perspective of developers of POSIX-ish systems, not the perspective of a user. And speaking as a user, if Firefox adds features that make my browsing better without hindering my most common tasks, I won't complain.

Friday, January 18, 2008

public bulletin: open source software is heterogeneous

I was just reading some commentary (not linked because there's no shortage of commentary available) about Sun's purchase of MySQL. I feel that I should do the responsible thing and get the word out.

Open source software is heterogeneous.

Let me explain before this statement leads to any humorous misunderstandings. FLOSS is heterogeneous because it's a conceptual clumping of any software which is available under the right license(s). Similarly, "LAMP"--Linux/Apache/MySQL/(Perl|PHP|Python|Ruby)--isn't a product. It's a collection of software. And as more and more people know, what's commonly called the "Linux platform" is also a collection of software, gathered around the Linux OS/kernel. One of the real-but-under-publicized advantages of running FLOSS is how easy and convenient it can be, nowadays, to mix and match the software.

All this is enabled by the common denominator of FLOSS: the licenses. MySQL's license (er, one of them) ensures it will remain available for the software combinations it's currently in. Speculation about how Sun's purchase of a company will affect "open source" is incredibly vague, because FLOSS isn't a company nor an industry nor an organization. "Movement" is closer, but even that erroneously suggests cohesive goals across all the participants. Since it's a matter of licensing and collaboration, perhaps it should be a verb, not a noun. FLOSS is an activity some software-developing entities do. Even the company often filling the bogeyman role, Microsoft, has been doing more FLOSS lately.

But by all means, Web commentators, continue to muse on the Sun-MySQL deal as if it will drastically upset those using FLOSS. (I admit that when companies pay developers full-time to work on the software, that makes a difference.)

Saturday, July 28, 2007

stating the obvious about Microsoft and Open Source

Microsoft has a web site that serves as a "gateway" to its ventures related to Open Source. Also, Microsoft will submit its "shared source" licenses to OSI. I've read some...questionable analysis about these overtures, so here's a helpful list of obvious points. I may be a little sloppy in confusing "open source" with "free software" (Stallman essay on the topic). I think it's a given that Microsoft is more open to open source concepts than free software concepts.
  • I don't have the numbers (nor do I care), but one obvious truth is that Microsoft earns obscene amounts of revenue from selling software, no matter what discounts it may offer in certain situations. It also benefits from customer inertia, because once a customer has started using particular software the cost of migrating off it can be steep. Naturally, many other software companies do the same. Now, the licensing of open source software makes this typical strategy much less viable. (Open source software can be redistributed by users, which drives down the selling-price, and open source software can be modified by users, which means they don't need to depend on the company for future releases or bug fixes). Hypothetically, if Microsoft moved to pure open-source licensing, it would lose truckloads of moola. Not a likely move. On the other hand, switching to an open source license for software which is free to acquire, like Java, makes more sense to software companies.
  • Microsoft is mind-blowingly massive. Again, an obvious point, but there it is. Microsoft has its hands in OS, office suites, databases, development tools, Web technology, media, video games (PC and console). It even makes mice, you know. Microsoft's massiveness means several things. First, that categorical statements about the quality of Microsoft's offerings are almost always over-generalizations. Saying that one will refuse to consider the (possible) technical merits of open sourced Microsoft code solely because it's from Microsoft is foolish and uninformed. Microsoft has some incredibly smart people whom I can only assume are worth more to Microsoft as real coders than figureheads. Refusing to consider the code based on ethical concerns or fear of patent litigation, well, that's different. Second, Microsoft's massiveness means that Microsoft can pick pieces to open up to varying extents while leaving the rest closed. Microsoft gains a few favorable public-relations nods, but keeps its identity. I imagine it will primarily open software on its periphery and in contexts that have a greater chance of being appreciated (open-sourcing a standard-document converter, for instance).
  • Executives at Microsoft have repeatedly denounced free software, on principle. Historically, Microsoft hasn't done much to grant its users additional privileges beyond, well, using. If it's taking steps in that general direction, expect those steps to be slow, rather forced, and accompanied by restatements of its stances.
  • Microsoft grows by expanding into apparently-profitable new markets. From this point of view, the open source software community represents yet another opportunity. Unfortunately, as a vocal and dominant player in proprietary software, Microsoft also happens to be a defining symbol in that community of what open source stands against. Personally, I believe that defining a viewpoint in terms of its opposite ("I'm not them, I'm something else!") is lame, and there's no shortage of other targets that fit the criterion "aggressively makes closed software". And at least one knows Microsoft isn't illegally exploiting open source software the way some others do, considering that it supports mere interoperability so poorly.
  • Microsoft has a slew of competitors, some worrisome, some laughable. Those competitors have been starting to leverage open source, though perhaps more for public-relations' sake than in a hope to create better software. As product comparison checkboxes go, "connected to open source" has risen in importance. It helps to put customers a little more at ease and enhances the trustworthiness quotient of the company. The more other companies can check that box, the less desirable Microsoft seems. I think it's safe to assume Microsoft would rather compromise a tad on open source instead of losing market share.
  • When Microsoft dabbles in open source, it doesn't use the same open source licenses as everyone else. To be fair, it's not alone; as is their prerogative after some legal consulting, companies will create or modify an open source license when none of the licenses is exactly right. Nevertheless, why doesn't Microsoft just use the (myriad) OSI-approved licenses, which is an easier track to "official Open Source"? To me, this is the most significant indication of an underlying lack of sincerity. In using existing open source licenses, companies gain the stamp "Open Source". My impression of companies that craft new licenses for approval is that they are trying to reshape the stamp to fit them, not acknowledging the stamp for what it is.

Sunday, March 18, 2007

sneakernet in miniature

I was starting to transfer a largish media file from my primary computer to my MythTV computer over my home network (which consists of one cheap hub and some cables), when I realized it might be faster to copy the file to one of my USB thumb drives and carry the drive over to the MythTV computer.

I started to transfer the file onto the thumb drive, when I realized that it was going faster, but not really fast. In addition, I would need to transfer the file off the thumb drive and onto one of the internal drives in the MythTV computer, to ensure speedy access for viewing.

Finally, I decided to switch to burning the file to a CD (the file wasn't that large). The burner's so quick that it took no more than a few minutes. Having the file on disc meant that I didn't even need the copy on my primary computer, so I removed the file after burning. The MythTV computer could play the video straight from the disc, which meant it didn't take up any space on the MythTV computer either. Transferring by CD was a solution that was fast (only the time to burn the file to CD), cheap (I bought my current stack of blank discs so long ago I actually don't remember what the cost was), convenient (this tech is well-established and well-supported all around), and storage-efficient (as long as CD-reading is snappy enough, the data only needs to take up space on one medium - given that the data's not vital in any sense). What's AppleTV?
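A rough back-of-envelope comparison of the three routes explains why the burner won. The throughput figures below are assumptions for illustration (a 10 Mbps hub at modest effective utilization, a USB 1.1 "full speed" thumb drive, a 24x CD burner) and the 500 MB file size is hypothetical--none of these numbers come from the post itself.

```python
# Hypothetical transfer-time comparison; all rates are assumed, not
# measured. CD "1x" is 150 KB/s, so 24x is about 3.6 MB/s.
FILE_MB = 500                      # hypothetical "largish media file"

rates_mb_per_s = {
    "cheap hub (10 Mbps, ~60% effective)": 10 / 8 * 0.6,   # ~0.75 MB/s
    "USB 1.1 thumb drive (~1 MB/s)": 1.0,
    "24x CD burn (~3.6 MB/s)": 24 * 0.15,
}

for path, rate in rates_mb_per_s.items():
    print(f"{path}: ~{FILE_MB / rate / 60:.1f} minutes")
```

Under these assumptions the burn finishes in two to three minutes, while the hub takes over ten--and the thumb-drive route actually pays its cost twice (copying on, then off again), which matches the "faster, but not really fast" impression.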

A still better solution might have been to stream the file from one computer to the other over the network, but the time and effort required to set that up would be tedious. Or would it? Speaking out of my laziness, I simply don't know, although I have heard good things about VLC.

Thursday, November 16, 2006

the power of the sun in the palm of my hand

I use my mobile phone pretty much just for phone calls. Color me odd. But when I saw that a mobile SimCity was available, I couldn't resist giving it a whirl, if nothing else for the sake of nostalgia.

I bought SimCity Classic off the store shelf a while ago, copyright 1989, 1991. The box I still have is the same one pictured in the wikipedia article. In the lower left corner it lists the system requirements: "IBM PC/XT/AT/PS2, Compatibles", "Supports EGA, CGA, Hercules mono and Tandy Graphics", "Requires 512K (EGA 640K)", "Mouse and Printer Optional". I ran it on an XT-compatible with CGA graphics and 512K, and no mouse. It was DOS, of course, so the game had its own windowing system, along with a menu option to only animate the top window to allow better game performance. Here's a line from the manual: "Simulator reaction time is also greatly affected by your computer's clock speed and type of microprocessor. If you have an XT-compatible running at 4.77 MHz, life in SimCity will be much slower than on a 386 running at 33 MHz." Its copy protection was a red card with cities and population data. On startup, the game would ask for a statistic off the card. Any product I purchased from Maxis seemed to be reasonable on the issue of backup copies and copy protection. Considering the number of times I experienced a floppy disk failure, it was only right. Later versions of SimCity lost my interest, since the family's computer never kept current enough to play them.

Enough reminiscing. The mobile version I played had similarities to what I remembered, but was actually more complex in a few ways (to start with the obvious, a much wider range of colors). And the graphics weren't top-down but perspective-drawn, as if looking down at the city from a high oblique angle--the kind of view from a helicopter. The game was just as responsive running on my cell phone as it had been on my XT-compatible, maybe more so. My point is that technological advances happen so gradually that it can sometimes take a common point of reference--in this case, the game SimCity--to illustrate the great difference. My puny phone can run a game very similar to one that ran on a desktop, and the phone version is even enhanced. It feels like I have the power of the sun in the palm of my hand! Almost makes you wonder if all that speculation about the Singularity isn't a huge load.

Along the same line, I ran Fractint when my family finally had a 386 (no coprocessor, which sometimes was a pain when running this program). It was an outstanding program created through the collaboration of many coders working over the Internet--or maybe it started on CompuServe, for all I know; either way, Fractint must have been the first time I'd heard of people doing that. By the time I came around to it, Fractint's feature list was incredible. In addition to many fractals and even fractal subtypes, the display parameters could be tweaked to produce a slew of wild effects. It also offered a long list of video modes, any combination of resolution and colors one might want. Rather than sharing the generated images with others, the user could simply use a menu option to save the Fractint parameters that produced a given image and send the parameter file to another Fractint user, who could then use the corresponding menu option to load the parameters into Fractint and regenerate the image. Many of the fractals had informative documentation screens explaining the origin of the fractal and its calculation method. I could just go on and on.
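Fractint's parameter files are a nice early example of sharing inputs instead of outputs: a tiny parameter blob is all that travels, and the recipient regenerates an identical image. Here's a minimal sketch of that idea with an invented JSON format (nothing like Fractint's actual .par files) and a bare-bones Mandelbrot iteration.

```python
# "Share parameters, not pixels": the JSON string is the whole payload,
# and rendering it reproduces the exact same grid of iteration counts.
# The parameter format here is invented for illustration.
import json

def mandelbrot_cell(cx, cy, max_iter):
    z = 0j
    c = complex(cx, cy)
    for i in range(max_iter):
        z = z * z + c
        if abs(z) > 2:
            return i            # escaped: return the iteration count
    return max_iter             # stayed bounded: treat as "inside"

def render(params):
    """Regenerate the same grid of iteration counts from parameters."""
    x0, y0, x1, y1 = params["window"]
    w, h, m = params["width"], params["height"], params["max_iter"]
    return [[mandelbrot_cell(x0 + (x1 - x0) * i / w,
                             y0 + (y1 - y0) * j / h, m)
             for i in range(w)] for j in range(h)]

params = {"window": [-2.0, -1.2, 0.8, 1.2],
          "width": 8, "height": 6, "max_iter": 50}
wire = json.dumps(params)       # this tiny string is all that's shared
assert render(json.loads(wire)) == render(params)   # identical "image"
```

The payoff is the same one Fractint users enjoyed: a few dozen bytes of parameters stand in for an arbitrarily large image, at the cost of recomputing it on the receiving end.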

As you may guess, I've kept my DOS Fractint files around, just as I kept my SimCity materials. Any sentimental coot knows that Linux has a project, DOSEMU, that can run (Free)DOS excellently. DOSBox may be better for games, but I have had no trouble running Fractint with DOSEMU. After adding the line "$_hogthreshold = (0)" to my /etc/dosemu/dosemu.conf--which seems to be the equivalent of saying "sure, you can have my CPU, I don't need it for anything!"--DOSEMU generates even the worst, most complicated fractals of Fractint so quickly it's not even funny, in spite of the layer of indirection. Having to wait to see your fractal was such a large part of the experience...if all programs were written to perform as well as Fractint, latency in software would be a laughable concern. Here in my room is a computer that has so much more processing power than that 386, and it mostly sits idle, or whips through a web page render, or decodes a media file. It's a criminal case of waste.