Recently in Bits and Bytes Category

Software in the Age of Sampling


I've just returned from Øredev in Malmö, Sweden, where I delivered Software in the Age of Sampling twice: the first time to a healthy turnout, the second to a nearly empty theater, in order to get a take without demo snafus. The video below is Take Two.

Abstract

Over the last generation or so, software development has changed profoundly, but some of these changes occurred so slowly they have barely been recognized.

Software was once built by skilled but peculiar artisans, who meticulously crafted their original, green-fields commissions from first principles. Today, their frontiers are gone, and very few of us build anything new from the ground up. Instead, existing resources are rehashed, recombined, and remixed to produce "new" mash-ups based upon the work of others. It's collaborative, this 21st century scavenging, and it has become a state-of-the-art approach to software development.

These changes in the way we produce software have much in common with the changes that have transformed the music industry over the last thirty years. The garage band era of original composition has given way to the direct borrowing of scraps of other people's pieces and a cavalier disregard for traditional originality.

This session will explore how software developers in the age of sampling have as much in common with contemporary high-tech music "producers" as they do with traditional engineers.

The Cobbler's Children


The cobbler's children are the last to be shod.

So it is, too, for software developers these days. I'm reminded of this hoary trope when I think about the state of twenty-first century programming languages and technology, which remain very much mired in the 95-character ASCII/EBCDIC punchcard-era practices of the fifties and sixties.

Punchcard

This reverie was triggered by a chance Twitter sighting of a 'blog post by Canadian computer scientist Greg Wilson on why Donald Knuth's vision of Literate Programming had failed.

Greg's piece resonated with me because it addresses two themes I've trotted out myself on a couple of occasions. Such is the nature of this sort of triggering.

The first is the enduring irony that, despite having delivered all manner of post-ASCII visual GUI bliss and razzle-dazzle to countless users of all descriptions, programmers themselves must still trade in primitive text-based program representations that would be familiar to time-traveling hackers from the Nixon Administration. With a handful of exceptions, today's production code would be quite at home on a keypunch (sans case). These textual accounts of programmer intent remain the canonical, authoritative representations of code, from which any supporting executable binary accounts must be derived. For most, the vast majority, in fact, programs are ASCII characters that live in ASCII files, in directories, which these days, might be under some sort of source control, but still...

Now, one of Greg's points in his "ASCII" passage was to remind us that Knuth's vision of integrating complex typeset annotations like equations and tables with code, and binding them to the codebase, is even now only clumsily possible, using embedded HTML in "doc" comments.

And, it's true, we don't do this well. In general, tying diagrams, equations, and commentary to code, presenting them conveniently alongside it, and making it even remotely possible to keep them consistent as the code changes, is just not something one sees well supported. And given the polyglot nature of so many applications these days, this ASCII common denominator too often places important cross-language artifacts beyond the reach of refactoring tools that might, given a richer representation, have had a better shot at keeping such support material in sync. Think XML configuration, definition, and data files, even schema.
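To make the clumsiness concrete, here is a small, purely hypothetical Java example of the state of the art Greg laments: an equation rendered as embedded HTML in a "doc" comment, with nothing binding the markup to the code it describes.

    /**
     * Computes the amplitude of a damped oscillation at time t.
     * The governing equation, as best a doc comment can render it:
     * <p>
     * a(t) = a<sub>0</sub> e<sup>-&lambda;t</sup> cos(&omega;t)
     * </p>
     * Nothing ties this markup to the parameters below; rename one,
     * or change the formula, and the annotation rots silently.
     */
    public static double amplitude(double a0, double lambda,
                                   double omega, double t) {
        return a0 * Math.exp(-lambda * t) * Math.cos(omega * t);
    }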

Cobbler

The cobbler's children lament, then, is born of the envy one feels over the fact that so many of our customers get to enjoy much richer representations of the important objects in their domains than we programmers do.

This theme has been showing up of late when I talk about the ubiquitous Ball of Mud phenomenon. These talks have come to have several "DVD" (select your own) endings, one of which explores the idea that one source of our perception of muddiness is the primitive tools and representations we currently employ to navigate and visualize our codebases.

This, in turn, is a partial result of working with refactoring tool developers at UIUC a few years back, where I came to believe that program representation is one of the most interesting and important unsolved problems in programming language research. By this, I meant getting beyond 95-character ASCII into a richer, round-trip, AST-level representation amenable to analysis, refactoring, and annotation. Such representations would need to be cast somewhere beneath text, but above the level of, say, Java .class files.
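Purely by way of illustration, and assuming nothing about any particular tool's API, such a representation might be sketched like so: nodes retain their structure for analysis and refactoring, and carry an open-ended set of annotations for the diagrams, equations, and commentary that plain ASCII leaves stranded.

    import java.util.ArrayList;
    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;

    // A hypothetical program representation pitched beneath text but
    // above bytecode.
    abstract class AstNode {
        final List<AstNode> children = new ArrayList<>();
        final Map<String, Object> annotations = new HashMap<>();

        // Documentation, layout, and cross-language links travel with
        // the node they describe.
        void annotate(String key, Object value) {
            annotations.put(key, value);
        }
    }

    class MethodNode extends AstNode {
        String name;  // a rename refactoring rewrites this in place;
                      // attached annotations never fall out of sync
    }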

This is neither a new nor an original idea, but it remains, alas, unfinished business. Ever wondered why refactoring tools for C and C++ are so slow in coming? This is part of the problem.

The second reverie that Greg triggered, partially reinforced by another Twitter citation, this Redis Road Map from Paul Smith, was this one:

One of the underlying premises of the now discredited waterfall division of labor between skilled architect/designers and rote programmer/coders was the implicit, presumably self-evident assumption that the code was unreadable, and that design must therefore be conveyed by "higher-level" descriptions like documents and cartoons.

Nowadays, a fundamental article of faith underlying the craftsmanship, clean code, and iterative, emergent design communities is that the code itself can be wrought to carry a dual burden: conveying to the compiler a comprehensive set of execution instructions, while at the same time communicating, at a suitable level of literacy and abstraction, the details and intent of the artifact's design. A lot of stock has been placed in this conceit, so one can only hope it holds up.

It is clear that code can do far better than the woefully minimal degree of adequacy that many shops tolerate. But the question of whether, even at our best, meticulously tended source code will be enough remains an open one.
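A tiny, hypothetical illustration of the article of faith in question: both fragments below give the compiler identical instructions, but only the second even attempts to communicate the design to a human reader.

    class Account {
        // Instructions for the compiler, and little else:
        boolean chk(int d) {
            return d > 90;
        }

        // The same behavior, wrought to carry intent as well:
        static final int GRACE_PERIOD_IN_DAYS = 90;

        boolean isDelinquent(int daysOverdue) {
            return daysOverdue > GRACE_PERIOD_IN_DAYS;
        }
    }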

"Real" Software Engineering


This video of a presentation by Glenn Vanderburg entitled Real Software Engineering came up last week during one of those periodic flurries of contrary opinion on Twitter regarding whether or not software development is engineering. Glenn's 51-minute talk explains why, after having made a painstaking, convincing case that what we do is utterly unlike what any other known engineering discipline does, he nonetheless aligns himself with the "pro" engineering perspective.

Real Software Engineering - Glenn Vanderburg from Engine Yard on Vimeo.

It's a well-prepared and delivered piece, and well worth your time. He opens by acknowledging something that anyone who has been in this field for long already knows: that the kind of Software Engineering that has been taught in Computer Science programs for the last forty years has been hopelessly, comically out of touch with day-to-day software development reality.

His opening examination of where "Software Engineering" went astray is particularly compelling; he does so by going back and examining some primary sources. For instance, the legendary 1968 NATO meeting that established the field had some moments that seemingly foreshadowed today's Agile orthodoxy, before heading off into the weeds for a generation the next year. Winston Royce has evidently been saying, for 42 years, that his original waterfall paper has been tragically misunderstood. Glenn makes a good case that this is so. You may, of course, actually read the paper and decide for yourself. Parnas's A Rational Design Process: How and Why to Fake It is here too. Glenn has some fresh background on Parnas's use of the term "rational".

Galloping Gertie

I thought I caught a welcome, albeit uncredited, whiff of Petroski in the second part of the talk, where he describes how science, art, mathematics, craft, tradition, and empiricism guide what real engineers really do. And no talk on the limits of engineering would be complete without an appearance from Galloping Gertie.

I particularly enjoyed Glenn's treatment of the perennial and enduring misperception of the roles of engineers and coders that the industry inherited from its lengthy flirtation with the waterfall model. This conceit went something like this: The "real" engineering effort involved in engineering software is in the design, not the implementation. Hence, design must be something distinct, something more demanding, than mere coding. The software engineer's job, then, was to produce, from a given set of requirements, some artifact that could be "thrown over the wall" to the coders for "construction".

Of course, this analogy is off. The design of the program itself is the part of this process that is akin to automotive or aircraft design. Construction, building, or fabrication is the process of reproducing the shrink-wrapped media, or invoking and executing the application over the web. For aeronautical engineers, fabricating each individual aircraft remains quite expensive. Though software engineering began during an era of pocket protectors and mechanical pencils, when CPU cycles were still scarce, fabrication for us now is essentially free. Given this perspective, Glenn continues, the folks dismissed as blue-collar coders in the waterfall pecking order are the real engineers, because engineering is about making stuff that works. And it is with this contention that Vanderburg rests his case.

Which is fine, as far as it goes. I guess, after all that, I feel less obligated to align myself with the engineering fraternity than does Glenn, given how different making software turned out to be from the other disciplines he showcased, but that's probably a matter of taste. I'm just not sure I have a dog in this fight. There are lots of disciplines besides engineering that deliver stuff that works: cinema, pharmaceuticals, agriculture, bakeries, blacksmiths, composers ... I could go on. What might we yet learn by analogy from disciplines outside our mathematics and engineering roots?

Of course, the other great unspoken truth about the software engineering tradition in Computer Science has been that software engineering has always really focused on the admittedly considerable challenges associated with managing, organizing, and yes, even leading a large, infantry scale industrial organization whose objective it is to produce software, often to the detriment of the more technical issues of interest to those in the trenches.

Ironically, one of the successes of the Agile movement has been to encourage the emergence of more autonomous, "commando"-scale units within the kinds of World War II-era waterfall shops that had dominated the industry.

Indeed, these hierarchical corporate traditions, the clashes of these kinds of cultures with the agile mindset, and the daunting demands of scale are all issues that might have merited additional examination, and that continue to contribute to the perception that software engineering is out of touch.

Refactoring's Original Sin: Part II

Makeover

Part I of this piece made the case that Refactoring's Original Sin was having allowed Refactoring to come to be perceived as completely cosmetic elective surgery. Worse yet, the resulting improvements benefited only the innards of your application, not the parts the users see.

Even as early as the early nineties, there was a recognition that allowing refactoring to become pigeonholed as an unproductive polishing pass could do an injustice to the cause of the kind of iterative, incremental design process that we'd grown accustomed to while working in Smalltalk.

We did make several attempts to characterize this approach as an integrated, iterative, incremental process in which design pervaded the object lifecycle. A few years earlier, Lehman and Belady had observed that while design was an entropy-decreasing process, maintenance was an entropy-increasing process, one that inevitably eroded structure.

Yet, somehow, the Smalltalk image had avoided this fate, and the development style we'd vicariously absorbed working with Smalltalk was seemingly allowing our code to do so as well. The way this worked was that, as you added new code, you'd keep an eye peeled for opportunities to better integrate it with the code around you. And then ... take them. This casual, continual shift in focus from function to form and back again helped to forestall the kind of death-by-duct-tape that seemed so unavoidable when working on classical code. And I think it is fair to say that it was by cultivating these form-focused improvements to the structure, clarity, and presentation of the code by hand that we learned what refactorings were.

For instance, we trotted out an extended abstract, A Fractal Model of the Lifecycles of Reusable Objects, at a few workshops during the early nineties. Conceived as a wry variation on Boehm's Spiral Model, it sketched a three-phase scheme whereby an Initial Prototype formed the nucleus of a system that was organically grown through a succession of fine-grained, opportunistic Expansion and Consolidation phases. Expansion had an exploratory focus, and explicitly tolerated a certain degree of structural erosion in the cause of mastering functionality.

Consolidation, on the other hand, was a form-focused phase in which opportunities to harvest the benefits of hindsight were exploited to repair any damage done during prior expansions, and to ensure the application was poised to address the challenges to come. Anywhere in the universe, counteracting entropy requires an investment of countervailing energy, and it was here that entropy reduction was done.

By PLoP '94, Bill Opdyke and I had put together a somewhat more detailed treatment of this material that tried to cast Refactoring in the context of this kind of process: Lifecycle and Refactoring Patterns that Support Evolution and Reuse. One goal of this work was to once again underscore the idea that refactoring should normally be an integral part of incremental development, not a coarse-grained recovery project. Which is not to say that such an effort might not be called for if a codebase had gone badly to seed, but that with conscientious cultivation the need for major reclamation efforts could be prevented.

Software Tectonics

Our PLoP '95 Software Tectonics idea can be seen in this light: entropic stress accumulating in a system risks being released with increasingly severe, power-law consequences.

By then, Refactoring was poised to become a household word, among hackers, anyway. John Brant and Don Roberts were already working on the first practical automated refactoring tool, for Smalltalk; Martin Fowler was soon to embark on a landmark cataloging effort that would, in turn, give Erich Gamma a roadmap that would lead to the refactoring facilities of the Eclipse JDT; and Kent Beck and his cohorts were soon to deem Refactoring one of the pillars of eXtreme Programming.

So, one of the reasons that Martin's TradableQualityHypothesis piece resonated with me so deeply is that this is precisely the kind of perception battle I still find myself in when discussing Refactoring to this day. Our failure to frame it as an integral, fundamental, indispensable element of a continuous, organic design and development process has left code quality, and the kind of poised, responsive, adaptive posture in the face of change that it makes possible, vulnerable to the remorseless whims of the budget axe.

There is still work to be done.

Brownfield Software


I spent an informative 51 minutes yesterday watching this InfoQ video presentation on Brownfield Software: Industrial Waste or Business Fertilizer by Josh Graham of Thoughtworks.

Josh describes an intensive, heroic effort to bring a large, festering, vintage 1997 legacy Java system under control.

Among the highlights: his admonition that, no matter how great the temptation may be, one should resist the urge to rip out those homebrew frameworks and replace them with more modern, standard ones.

The testimonials from the theretofore downtrodden hackers who'd been pinned down in the trenches on this project were touching as well.

Oh yeah, and they used AOP too. For logging, naturally....

They also spoke of the indispensability of engineers with superior textual skills in an environment like this, which, I gather, means the kinds of folks who for whatever reason are able to make some sense out of code most of us would find hopelessly inscrutable.

Kakonomics


Came across this blog entry by Gloria Origgi, Kakonomics: or The Strange Preference for Low-Quality Outcomes, courtesy of a tweet from @ronjeffries.

The piece opines that in many systems, a state of collusive mediocrity can emerge and persist between agents, whereby an exchange of low-quality outcomes and minimal rewards is mutually tolerated as part of an Evolutionarily Stable Strategy.

It's easy to imagine this sort of low-expectations relationship emerging between an IT organization and mud maintainers, and how it might account for the kind of neglect some codebases must seemingly endure.

The relationship between coder and code itself might lend itself to this sort of analysis, along the lines of Joshua Kerievsky's notion of Sufficient Design.

One wishes the author had opted for a different name. I can almost hear every preschooler (and preschooler-at-heart) in the room giggle each time I read it.

Refactoring's Original Sin: Part I


Before there was Refactoring there was refactoring. There were people who refactored just fine before Refactoring was invented.

I remember right well.

Back during the late 1980s, the OOPSLA-era Object boom was in full bloom. I had joined a new research group led by Ralph Johnson at the University of Illinois at Urbana-Champaign, and some of us were availing ourselves of the privilege of getting the best apprenticeship in object-oriented programming a body could ever get: spelunking around the PARC Smalltalk-80 codebase.

Gauguin and Tahiti

Here we found code that had clearly been crafted by insanely bright, committed pioneers who had stuck with it, lived amongst it, inhabited it, as they acknowledged themselves, polished it and cultivated it. And incrementally improved it. It was clear to anyone who visited the place, and it felt like a place, that the kind of code you were browsing was the fruit of a patient, sustained, collaborative process of cultivation. All the code was there. You were free to roam, and explore as you pleased.

It was a place that enticed you to join it. Smalltalk development enticed one to "go native", like Gauguin's Tahiti. And you grew and cultivated your code next to theirs, and when you saw an opportunity to improve it, you did. And your code grew better too.

Of course, even then, during the dusk of the Cold War, not every working programmer left every first draft untouched. If you were overconscientious, you might routinely rework such drafts as part and parcel of bringing an implementation into being. Sometimes these ideas would come into focus only once the code was actually on the air. In those days, when you wanted to improve the design of a chunk of existing code, you said you were going to clean it up, or take a final polish pass over it, or maybe tidy it up a bit. An enlightened boss might tolerate such an indulgence at the end of a successful push. But even then, the suspicion that you might be engaging in self-indulgent lily-gilding was never far behind.

Often, your focus would be strictly on improving the code's clarity, structure, or aesthetics, without changing the way it worked at all. By analogy with number theory, one could casually describe a reworked or improved version as being better factored than its predecessor. It became natural to describe the newer take as a refactored version of the older one, and more broadly, to describe this kind of focused improvement activity as refactoring. Still, this was most often less a distinct phase than an acute, opportunistic shift of focus. It was typically part of preparing the groundwork for some subsequent functional improvement.
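For example (a hypothetical before-and-after, rendered in Java rather than the Smalltalk of the era; LineItem is an assumed helper type): the second version works exactly the way the first one did, but is better factored, because the discount rule now has a name, a home, and a single point of change.

    // Before: the pricing policy is buried in the loop.
    class InvoiceBefore {
        double total(java.util.List<LineItem> items) {
            double sum = 0;
            for (LineItem item : items) {
                double price = item.unitPrice() * item.quantity();
                if (item.quantity() > 100) {
                    price *= 0.95;  // bulk discount
                }
                sum += price;
            }
            return sum;
        }
    }

    // After: a refactored version; same behavior, clearer structure.
    class InvoiceAfter {
        double total(java.util.List<LineItem> items) {
            double sum = 0;
            for (LineItem item : items) {
                sum += discountedPrice(item);
            }
            return sum;
        }

        private double discountedPrice(LineItem item) {
            double base = item.unitPrice() * item.quantity();
            return item.quantity() > 100 ? base * 0.95 : base;
        }
    }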

And so it came to pass that in the early days of the reign of George Bush the Elder, Bill Opdyke and Ralph Johnson set about the task of cataloging some of these Things You Could Do to Programs that Didn't Change the Way They Worked but Made Them Better, with an eye towards studying them more formally, and seeing whether they could be automated. And refactoring became Refactoring, with a Capital-R.

A Refactoring became a much more narrowly cast behavior-preserving program transformation, which served their research agenda fine, but cast the broader, more casual usage aside. Refactoring, the activity, had henceforth to be one in which only such sorts of transformations were applied. A distinct activity.

Subsequent authors, such as Martin Fowler (Refactoring) and Joshua Kerievsky (Refactoring to Patterns), have probably inadvertently reinforced this impression with their detailed focus on the surgical tactics and skills required to accomplish particular Refactorings, at times seemingly at the expense of broader, more holistic therapeutic approaches to day-to-day codebase health maintenance.

Adam and Eve

So, what was Refactoring's Original Sin? Letting the idea be framed as a distinct phase, a polishing pass, an optional brush-up, where working code was turned into "higher quality" working code that worked the same way as the original working code. Framed this way, it could sound like an utterly optional spasm of indulgent, flamboyant, narcissistic navel-gazing by self-indulgent prima donna programmers, rather than an inherent, indispensable, vital element of a process that cultivated the growth and sustained health of both the codebase and the team.

The problem with "refactoring" is not that it is a bad name for Refactoring proper, but that the idea has always cast a wider shadow, in part because of its gestation in precisely the kinds of iterative, incremental incubators we nostalgically recalled above.

As a result of having been framed in this fashion, the perception persists to this day that refactoring is a distinct, coarse-grained, and dispensable luxury, rather than a natural, inherent, integral part of a healthy, sustainable development process.

The problem is that the refactoring process in which Refactoring plays this indispensable role, and the value of the resulting code, have never really been properly framed.

And a consequence of this is that refactoring itself becomes a casualty in precisely the kinds of TradableQualityHypothesis triage games that the Martin Fowler piece that triggered this one so eloquently showcased.

Thomas Jay Peckish II on 21st Century Software Engineering


Open-Closed Principle? OCP? Make that private and Young Hakaz gonna copy-paste that pretty little banzai meditation garden of yours and do what they want to it. Encapsulate this yo.
--Thomas Jay Peckish II

Yo Hakaz! I chased it / and I faced it / and I cut it / and I pasted / I could taste it / 'til I wasted it.
Peace.

--Thomas Jay Peckish II

Thomas Jay Peckish II on Life and Death


Skills are important. A surgeon screws up, a person dies. A programmer screws up, a process dies. The stakes are identical. Unless you think a process is worth less than a human life...
--Thomas Jay Peckish II

Thomas Jay Peckish II on Programmers


All too many programs are made in the images of the people who create them: complex, technically sound, but ineffective, often woefully so, at communicating with human beings. Why are we surprised?

--Thomas Jay Peckish II

The tweet looked like:

Too many programs are made in the images of those who create them: complex, technically sound, but poor at communicating with human beings.

which may actually be a tad better...

 

The continuing controversy over Scrum Certification evidently inspired the following "press release", which was circulating at the Agile 2007 Conference this year. While its authenticity, or, to be more precise, its lack thereof, is no longer in dispute, you may nonetheless find it amusing.

REVOLUTIONARY NEW PROCEDURE DECREASES SCRUM MASTERBATION

Press Release, 15 August 2007

*REDACTED*, Inc., is pleased to announce that we will be able to offer safe, somewhat affordable Scrum Circumcision™, for the first time, to our preferred clients, beginning in September of 2007.

Scrum Circumcision™ is perfect for the software professional who is looking for a way to distinguish himself from, shall we say, the unwashed masses.

This revolutionary procedure entails the surgical removal of the patient's foresight. The effects are similar to a traditional XP lobotomy, though Scrum Circumcision™ is typically considerably less invasive.

Research has demonstrated that excising one's foresight can reduce premature generalization, improve code hygiene, and increase pairing satisfaction.

Customers usually report that they feel their XP-ness is dramatically enhanced after undergoing this procedure, and often experience stronger, more robust iterations.

Warning: stubbornly retaining your foresight may cause you to be subjected to ridicule and derision by your peers. Trim that foresight now, and transform yourself into a peak performer.

Act today. Our certified Mohelim are equipped with the latest, most modern tools available, and are waiting for your call. Remember, this procedure becomes more painful the longer you wait.

Foresight You're Not Gonna Need It™.

*REDACTED* president *REDACTED* denounced the faux press release as a "crass, puerile fabrication". Said *REDACTED*:

While we have mixed feelings about the necessity for formal certification procedures for any agile methodology, we see no reason to question or mock the motives of anyone advocating such credentialization. Frankly, I'm disgusted.

Regrettably, it appears this fabrication may have been the work of one of our own employees. Should this turn out to be the case, we assure you that the individual responsible will be sternly reprimanded.

Thomas Jay Peckish II on Metaphor


When it comes to artifacts fashioned from bits, metaphor is destiny...
--Thomas Jay Peckish II

Thomas Jay Peckish II on Finding the Last Bug


Bugs are like prime numbers. You can never really find the last one, but after the first billion or so, they start to thin out a little bit...
--Thomas Jay Peckish II

Thomas Jay Peckish II on Refactoring


We'll know that refactoring has really arrived when Microsoft starts marketing Word as a refactoring tool for your prose...
--Thomas Jay Peckish II


You know, over the last couple of days since I've posted this hoary quip, it's occurred to me that there actually may be something to the idea of bringing refactoring tool ideas into the realm of prose cultivation. I wonder if there are tools like this out there already?


Indeed, witness this observation on this topic...

The Mind of the Theoretician


I have struggled for, let's say, way longer than is good for a person, to comprehend the mind of the theoretical computer scientist. I've met with mixed success at best over the years, but this post offers more insight than most into the monastic deliberations of this secular priesthood...

A postscript: I've been casually browsing "Ernie" occasionally for the last three or four months, but I'd never hunted around enough to discover that Ernie's Geppetto is UIUC DCS's own Jeff Erickson. I'm reminded of one of those horror stories that ends with "run for your life, that phone call was made from inside the building!". I might add that I completely concur with Steve Freeman's observation that Erickson's work in this area is likely applicable, with appropriate revision, to conference committees in general.

Freedom from Choice


In ancient Rome
There was a Poem
About a Dog
Who found two Bones
He picked at One
He licked the Other
He went in Circles
He dropped Dead
--Devo, Freedom of Choice

I stumbled across an interesting discussion from Ruby-land through a link on Martin Fowler's website. At least I thought I did; I can't find the reference now. No matter, I suppose.

It's an intriguing, but somewhat iconoclastic, idea from David Heinemeier Hansson, by way of meme consolidator Tim Case: that the notion that a framework should be an ultra-flexible jack-of-all-trades has run its course. The claim is that frameworks that employ fixed conventions in lieu of elaborate configuration mechanisms are easier to learn and use, at least in the most typical and common cases. A framework should be really, really good at one thing. It should do one thing, and do it well.

At first blush, this would seem at odds with conventional framework development lore. Nonetheless, I find the idea quite compelling, given the right circumstances…

My first thought is that it is a sign of a healthy, mature ecosystem that the carrying capacity exists for frameworks that do a single thing well. Code that does one thing right can be easier to read, and easier to extend (using code), than code that straddles a range of requirements. There is an element of neoteny, of reverting to a less mature form, in this. This is what a green-fields framework often looks like too, at first.

Normally, over time, as new requirements emerge, the framework grows to accommodate a range of requirements. The code that accommodates this variety, this diversity, is refactored as the framework grows; the commonalities rise, as if in some code-chromatographic centrifuge, to the upper levels of the class hierarchy, and out into distinct components, and yes, out into the data, into databases and configuration files.

The prize: flexibility, market share, and libraries of redeployable abstractions. The price: inevitable complexity, and a steeper learning curve.

Reversion to convention marks a break with such evolutionary trends, with midlife growth. It is tempting to speculate that such breaks are inevitably generational, that drawing-back-to-leap entails a fresh draft, a new generation, a new codebase. Hence, I suspect, the phenomenon discussed here is not warmed-over YAGNI (You Aren't Gonna Need It); not a response to premature generalization, but a late, mature reaction to a rising tide, and a mature understanding of where flexibility is needed in a given domain, and where it is superfluous.

Neoteny is a "back to the future" reaction, in some ways, but not a "worse is better" reflex.

An analogy drawn from the realm of stadia in the United States came (once again) to mind. During the sixties, multipurpose facilities that could accommodate the needs of several sports were constructed. They met the needs of all of them adequately, but met none of them well. Is a Swiss Army Knife ever really better than the right tool for the right job? Or is it just easier to afford and to carry?

During the nineties (and beyond) these cookie-cutter concrete white elephants were razed, and replaced with single-purpose facilities better suited to just baseball, or just football. Even adjusted for inflation, these replacements were considerably more expensive than the stadia they replaced. But a bigger, more affluent population, craving the thrill of vicarious victory enough to support skyboxes and three-figure seat prices, was able to support this extravagance.

Could it be that, in a red-shifted, expanding universe, the growth of the internet itself, and of web application demand, is such that we now have niches for single-purpose frameworks that might only have been filled by less satisfactory general-purpose code even five years ago? There is a slightly decadent, postmodern redolence about all this.

Is there any way that conventional frameworks might exhibit this same level of convenience? Perhaps, if their elements are engineered to WORK OUT OF THE BOX. By exhibiting, generating, and maintaining reasonable defaults, programmers might provide the same kind of 80%/20% Pareto's Law hands-free convenience to the bulk of their clients, while still being able to cater to more exotic tastes as well.
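A sketch of what that might look like, with hypothetical names throughout: the framework manufactures a sensible configuration by convention, works out of the box for the eighty percent, and still lets the exotic twenty percent override whatever departs from convention.

    // Hypothetical framework configuration: convention first,
    // escape hatches second.
    class Repository {
        String table;
        String primaryKey;

        // Convention: an entity class named Customer is backed by a
        // "customers" table keyed by "id". Zero configuration needed.
        static Repository forEntity(Class<?> entity) {
            Repository r = new Repository();
            r.table = entity.getSimpleName().toLowerCase() + "s";
            r.primaryKey = "id";
            return r;
        }

        // Exotic tastes override only what differs from convention.
        Repository table(String name) {
            this.table = name;
            return this;
        }

        Repository primaryKey(String column) {
            this.primaryKey = column;
            return this;
        }
    }

    // The common case:  Repository.forEntity(Customer.class)
    // The exotic case:  Repository.forEntity(Invoice.class)
    //                             .table("legacy_invoices")
    //                             .primaryKey("invoice_no")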

Now, to be fair, one of the factors at play here is that (post-)modern scripting languages like Ruby are easier to change than relatively hidebound type-pecked tongues such as C++ and (alas, now) Java. I'm recalling Kent Beck's observation that were coding to become infinitely fast, we wouldn't need to worry very much about reuse. The fast-on-their-feet scripting cultures of Amazon and Google lend some credence to this world view, it would seem…

Freedom of Choice
is What you Got
Freedom from Choice
is What you Want
--Devo, Freedom of Choice

The Least Understood Thing in Computing


Journalist Mariann Unterluggauer has latched onto this challenge question from OOPSLA 2005. Here's the question, with my reply:

Q: What is the least understood thing in computing that if only it were understood, everything would change?
A: That's easy: Reflection...

Microsoft Seeks to Acquire Weapons of Mass Computation


In a defiant keynote address delivered to a stunned audience at the Supercomputing 2005 Conference in Seattle, Microsoft Chairman William "Bill" Gates vowed today to acquire the technology to produce weapons of mass computation as quickly as possible. Microsoft's program poses a provocative challenge to existing supercomputing powers such as IBM, Sun, and Cray.

This technology has heretofore been available only to an exclusive club comprising only the most technically advanced powers. That Microsoft, considered by some in the supercomputing elite as a rogue player, now has made the development of this technology a priority is raising considerable alarm.

Chairman Gates

Researchers at the University of Illinois at Urbana-Champaign's Software Architecture group expressed skepticism that Microsoft could develop the powerful refactoring technology required to transform its raw business software into the highly enriched form necessary for high-performance computation. "It takes years to cultivate the powerful high-performance 'codes' necessary to pose a serious threat. Without this [refactoring] technology, they are at least seven years away," said Thomas Jay Peckish II, of UIUC. "Were they able to somehow acquire this technology from an outside source, they could develop a weapons-grade potential in as little as two years."

A key element of Microsoft's alleged program is a black project known only as XBox, which threatens to put weapons-grade supercomputing technology in the hands of any rogue adolescent anywhere on Earth. Microsoft claims this suitcase-sized device has no dual-use potential, and is intended only to pacify children.

Some scoff at this potential threat. "This is a Business Intelligence failure of monumental proportions," suggested one analyst.

Gates did acknowledge that Microsoft is working on advanced cluster technology, the same technology that has been denounced by some human rights watchdogs as dangerous in the hands of anyone but the most highly trained software technology experts. Gates disclosed that Microsoft has been developing these weapons under its Windows Cluster 2003 Program. He defended Microsoft's work in this area, pointing out that competitors have been developing and deploying such technology for years.

Microsoft has launched a charm offensive to counteract the negative publicity it has been receiving as a result of these developments. They’ve booked the Experience Music Project Facility for a concert for an exclusive group of supercomputing dignitaries this evening. The mysterious EMP facility, it might be pointed out, is rumored to have been surreptitiously funded by reclusive Microsoft billionaire Paul Allen.

The official libretto for Gates's address can be found here. Their official strategy can be found here.


Photo ©2005 by Kathleen Ricker

Photran in the News


Click here to see the full article.

Oh, and any similarities between the piece above and this earlier Thomas Jay Peckish II interview are almost completely coincidental.

Then there is this Photran sighting in Ward Cunningham and Bjorn Freeman-Benson's 'blog about efforts to broaden the reach of the Eclipse CDT. Our collective efforts drew this plaudit from Bjorn: "I dream of an idyllic future where all Eclipse projects are as open and transparent and cooperative as CDT/Photran/PTP in our quest to create great frameworks." Aw shucks...


See Photran in action at the Supercomputing 2005 Conference, at both the Fortran Company and Los Alamos National Labs booths.

I Have Nothing to Declare But My Genius



With mainstream language design mired in ennui and retreating into formalism, the field has been effectively ceded to a ragtag, de-facto coalition of old-school dynamic stalwarts, scripting language designers, and ad-hoc domain-specific API architects. A generation of research in this area can be distilled down into three overarching ideas; the rest is filigree.

This talk will explore these ideas, examine how and why these currents are converging, and show why the large scale, dispersed, heterogeneous, polyglot world of 21st century computing demands nothing less than this degree of commitment to dynamism.


Brian Foote has been programming professionally since the dawn of the Carter Administration, mostly in the service of academic researchers of various stripes. His association with the academy has enabled him to dabble extensively in research. His interests include objects, programming, programming language design, reflection, metalevel architecture, patterns, and software devolution. His exposure to Smalltalk during his impressionable formative years indelibly shaped his attitudes towards software architecture and design. Brian is one of five people to have attended every OOPSLA conference to date. He is the author of a forthcoming self-help volume entitled "Late Binding as a Philosophy of Life".

I get the same sense of security knowing that my program is statically type-safe as I do knowing that my seat cushion can be used as a flotation device.
--Donald Bradley Roberts, author of the award-winning autobiography Outsourced to Indiana, during Q&A...

Devolution in the Air


Over the years, one of my favorite OOPSLA pastimes has been attempting to discern the Zeitgeist. What memes are moving, as the marketing 'droids might say. This time around, several distinctive themes emerged. I'll discuss the most vivid of them here. I'll talk about the others, Web Services, the Reflection and Metalevel Architecture Revival, and Refactoring Coming of Age, under separate cover.

The first, and most prominent, was Software Devolution.

By this I mean an overdue discussion of the death of quality, the embrace of failure, the acceptance of uncertainty, our exhausted capitulation in the face of complexity, the realization that the programme of the closed-world formalist is mocked by reality.

Acknowledging devolution means acknowledging that not only are things worse than we've dared to admit, they are getting worse, and will continue to do so, at an ever accelerating pace. We are entering a worse-is-better world of sweatshops and generative robo-code, where the craftsmen must inevitably retreat to their redoubts in boutiques and curio shops.

Acknowledging devolution means abandoning our fetish for correctness, and learning what to do when (not if) something goes wrong.

Devolution was a theme that impinged from so many directions this year that it was impossible to ignore it. Software Devolution is something of an umbrella under which I am lumping a number of disparate trends. Perhaps the notion will seem more clear once I've cited some of the developments that made this theme seem palpable.

The most glaring evidence of this theme could be seen in Martin Rinard's Breakthrough Ideas presentation. Rather than cling to the notion that every error can even be caught, let alone avoided, Rinard deals statistically with the consequences and ensuing carnage entailed in just "eating" them. The approach would seem to have more than a little in common with what the genetic programming people are doing. I need to look at his work more closely to see if this is so.

Embracing the reality of programmer fallibility and failure was a theme of Mary Beth Rosson's The End of Users keynote. She observed that something like ninety percent of all spreadsheets have errors, and that we are in something of a state of denial of the potential consequences of this reality.

Agile methodologists have been promoting the primacy of bottom-up, feedback-intensive, code- and test-based design as an alternative to the top-down foresight fetish crowd's traditional ideas. Martin Fowler's Finding Good Design keynote went beyond even this, offering more than a nod to the possibility that high-minded design ideals are honored primarily in the breach in practice, and questioning whether the very idea of design would retain its relevance.

For a moment, we all dared to ponder the question of whether craft in design is anything more than gilded filigree, soon to be as anachronistic as a hand-crafted circuit board.

My own Breakthrough Ideas triptych (previewed exclusively here a few weeks back) further reinforced the notion that the gap between what we practice and what we preach is large, and is growing larger. In Praise of Cut and Paste sought to "out" a ubiquitous practice among framework designers and, well, everyone else, that has been regarded as something of a dirty little secret by practitioners, while being scorned by high-road methodologists and academics alike.

Big Bucket of Glue suggested, among other things, that integration isn't pretty, that it is often quick and dirty, that essential complexity is here to stay, and that some part of the system must bear the structural stigmata associated with wrangling the rest of the system into a coherent whole. It can be seen as a potential integration-layer sequel to our hoary spasm of pomposity, Big Ball of Mud.

The End-to-End Principle in Programming Language Design suggested that mainstream languages such as Java focus too closely on minutiae like protection and type checking at the micro level, while neglecting support for overarching, end-to-end concerns, from real, honest-to-God runtime facilities such as validators to, well, who knows what. Features designed badly at the language level must often be reinvented a level up, in a dynamic or adaptive object model layer, in order to properly accommodate the needs of any end-to-end domain-specific languages that might emerge. One example of where this phenomenon can be seen is the architectural eyesore that the services layer is becoming. It is bad, and getting worse by the minute.

Devolution can be seen in the enthusiastically received Scrapheap Challenge workshop, where participants dumpster-dive for scraps from the web as fast as they can to solve programming problems.

There was something decidedly devolutionary in Grady Booch’s overdue exhortation to cast models aside, and in his candid admission that no one reads code anyway.

The winners of our Sudoku exercise in the Extravagaria III: Hunting Creativity workshop used twenty-first century scrapheap techniques (they Googled the problem), mocking the attempts of the organizers to demonstrate genetic crossover using the other two groups, who'd treated the task as relatively solitary chores.

The notion of "Software Devolution" is drawn from the colorful hatchet job James Noble did on my Biography for the Sashimi vs. the Purple Robots panel. If the shoe fits. It is a notion that fits wonderfully under the aegis of the Post-Modern School of Programming founded by Noble and fellow traveler Robert Biddle at the inaugural Onward! a few years back. We are All Devo indeed.

Among the mantras of this movement are paeans to the centrality of The Program as the essential focus of computer science. Code is seen not only as the source of choice for interesting problems to study, but as the remedy of choice for addressing them. We should address problems with code with code, like real computer scientists, and not with closed form analysis, like mere mathematicians.

Implicit in this is the hope that our discipline is finally becoming mature enough to confidently address indigenous subject matter with indigenous tools and ideas, rather than seeing ourselves as the outcast spawn of some tawdry, long forgotten one-night stand involving the mathematics and physics departments.

The primacy of code, and of runtime mechanisms as a condign recognition of the intricacy and fallibility of code, was a central theme of my own closing address to the Dynamic Language Symposium. More of that elsewhere, soon, I hope.

Devolution can be seen in the decline in the impulse towards Linguistic Imperialism, the cooling of the language wars, the increasing irrelevance of Utopian Linguistic Monocultures, and the grudging acceptance that ours is a heterogeneous polyglot world, where applications are increasingly expected to be able to cooperate.

A devolutionary cast can be seen as well in the impulse to use generated code in place of code crafted by human beings. Is generation a symptom, a code smell, signaling the inadequacy of our languages and tools, or is it a harbinger of things to come, the wave of the future, a sign that the John Henrys with their TTL-drivin' keyboards will ultimately be vanquished by these generative steam drills?

Indeed, Richard Gabriel is suggesting that the ultrascale volumes of code that our futures will demand can only be supplied by an automated process that relies on increasingly greater degrees of abstraction. This prophet of a grim Worse is Better world may find himself once again the reluctant head of yet another vanguard.

Devolution has two somewhat conflicting connotations, both of which, in the right context, in the right light, fit this discussion to a tee. Devolution suggests de-evolution, revanchism, neoteny, and regression, a descent towards something worse than what we have now. Not a good thing. Devolution also connotes decentralization, a surrender to local control, a process of increasing federation, a spinning-off of constituent parts, as seen in the establishment of a Scottish Parliament, or the refactoring of the Soviet Union into Russia and friends. Think of a process of ball-of-mud componentization, or an annealing process that lets local clusters emerge. Not necessarily retrograde, not necessarily a bad thing at all.

Devolution could be seen as well in the conference organization itself, where satellite symposia offered an alternative to the "take my ball and leave" secession of the Aspects and Agile movements. The adjacent RubyConf, and embedded Wiki and Dynamic Languages events brought fresh plankton to what could have otherwise been a stagnating meme pool, and may point the way to a more federated future for long-in-the-tooth conferences like OOPSLA.

Is devolution a natural consequence of growth? Of maturity? Of riding an exponential? Of variety and diversity? Fratricide ebbs because there is more room, and a more diverse herd; self-sustaining communities calve from the central glacier, and set out on their own…

Bastards! Bastards! You're All Bastards!


My final official duty of this year’s OOPSLA was to serve on Onward!’s Yoshimi vs. the Pink Robots panel. I was asked by panel moderator James Noble to assume the guise of a grizzled, curmudgeonly veteran hacker. Something of a stretch, to be sure :-), but I figured I’d be up to it.

The panel takes its name from a song by The Flaming Lips. James opened the panel by showing the video for this number. It's actually pretty catchy. You can find it by clicking the album cover to the right. It was likely the intention of the panel's framers that the beleaguered Yoshimi was to be cast as a surrogate for the users, and that the Pink Robots were to be associated with us inscrutable hackers. I'd rather hoped that this question be left open, and that the audience be allowed to see things the other way. In hindsight, it's not clear it was even raised, let alone settled.

I’d found as many soda pop cans as I could to decorate my spot at the dais. I donned my trusty old engineer’s cap, primarily to get the bright lights out of my eyes, though the thought that it fit the character didn’t hurt.

I don’t remember much about my opening position statement, except what’s below. What I do know is that I’d been told that the ACM was going to finally start recording talk and panel videos this year with an eye towards putting them into the Digital Library. What, I thought, about those occasions where one has made a complete jackass of one’s self on a panel, and had wanted to crawl off into a corner to die? What then?

Then I thought about the performance OOPSLA 2004 Conference Chair John Vlissides gave as the (literally) propeller-headed “Jimmy the Freshman” in OOPSLA 2004’s sporadically amusing but occasionally execrable “Dating Design Patterns” skit.

John had asked me to help him cast this skit a few weeks before. I’d cast it without reading it, out of fear for the worst. I’d read it shortly before rehearsals began, and my fears were not fully allayed. This skit, I had warned him, had the potential to substantially undermine the very dignity not merely of everyone involved, but indeed, of the entire conference. John replied: “What dignity?”

It was in this spirit that I resolved to put such concerns as I might have had to rest once and for all.

In the spirit of OOPSLA 2005's focus on creativity and the humanities, as established by the stirring keynote address of former poet laureate Robert Hass earlier in the week, I elected to recite an original piece that Don Roberts and I had put together, based on the work of one of our country's most celebrated contemporary lyricists.

The original idea had come to me, as so many ideas do, earlier that morning, in the shower. Cleanliness is next to Godliness, and Hygiene has served me over the years as a generous, albeit fickle, muse. This, however, was one of the most horrific shower visions I'd ever had the displeasure of contemplating. An acid hallucination gone terribly awry. Like something out of Hitchcock, it was a flashback of James's over-the-top introduction of Mary Beth Rosson's talk, only hellishly different. Indeed, it was the end of a fateful downward spiral that had begun, as Eugene Wallingford has noted, the day before.

Panel moderator James Noble's outsized neo-pompous HMS Pinafore histrionics and desiccated antipodean wit are utterly inimitable. But I took a stab at channeling 'im anyway. My re-enactment of my surreal low-culture vision from hell went like this:

I've been around forever
And I wrote the very first code
I put behavior and data together
I am hacker (sic)
And I write the code

I write the code that makes the shareholders smile
I write the code that the users defile
I write the code that makes the testers cry
I write the code
I write the code

The Fountainhead is on my short list of all-time favorite movies, although, I must confess, part of the pleasure I take in it is a partially (but only partially) ironic appreciation of its unrelentingly self-absorbed dialog. It’s great. Objectivism indeed. That said, Rand’s characterization of Roark’s total passion for his work, and his utter disregard for what society thinks of it, one way or the other, fits many in our vanishing indigenous programming culture to an uncanny degree.

I don't build in order to have clients.
I have clients in order to build.
--Architect Howard Roark, in The Fountainhead, by Ayn Rand

An enduring source of frustration among programmers is the degree to which users have no conception of the glory, the majesty, the grandeur, of living in the code. Of living among, and crafting, these magnificent artifacts. The medieval artisans who crafted the great cathedrals of Europe, it has been said, would often take care to craft even that which could never be seen from the ground with the same care as that which could, because they knew God could see it. This, alas, is often the lot of much of the craftsmanship that goes into the code. But we put it there anyway. And the users have no idea.

This being OOPSLA XX, I was in something of a reflective frame of mind. I started my professional programming career on 20 January 1977, the day Jimmy Carter was inaugurated, or James VI by my American Regents reckoning. I worked in a fairly large research group (about thirty folks in all), with a dozen-person technical staff. We worked for a guy who was the spitting image of the pointy-haired boss in Dilbert, only with a 170 IQ.

I have a confession to make. In those days, when programmers got together and spoke among themselves, which was fairly frequently, because no one else would talk to us, when we’d talk about users at all, it would be to talk about what #$%^&%$ morons users were. What’s more, we were not alone. Furthermore, I have it on some good authority, that in some circles, this perception continues to exist, among some programmers, even to this day.

Users, we were sure, didn’t know what they wanted. Who did? Why we did!

Who were these users desecrating our cathedrals? Who were these money changers? Chase them from our temples!


The Agile insurgency is increasingly viewed as an orthodox movement. A lot of the process guys have traded their keyboards for cufflinks. Waterfall has been effectively vanquished, though it lingers like a winter cough. If I had a beef with anyone, it would be with the managerial caste, not my users.

At one point, I shamelessly recycled one of my vintage tales about the raw materials for software, caffeine and sugar, being cheap and abundant, whereas labor was anything but. I then observed that were the sugar- and coffee-exporting countries ever to get into cahoots, they could do to the American software industry what OPEC did to the overall economy during the seventies, which is to say, cripple it with an embargo on its vital raw materials.


People come to panels for red meat, for conflict, for entertainment. The scholarly may opt for high-fiber technical material in tracks with names like "Type Theory IX", or for the spicy, exotic mind-candy alternatives otherwise found in Onward!, but by and large, people come to panels hoping to see the upper-middle-brow equivalent of a Claymation celebrity death match.

And so, I fear in hindsight that many found our panel to be a tepid, desultory love-fest. And people just don’t seem to get too excited watching a love-fest, at least so long as the participants remain fully clothed. Who, outside of a few members of the diplomatic corps perhaps, enjoys watching people agree?

For me, the point where I broke character was when legendary waterfall apologist Larry Constantine asked us to be serious for a moment, and I fell for it. Rather than maintaining my shop-steward-for-the-hackers, Commando-vs.-Infantry persona, I tried instead to be reasonable. Yawn.

Mr. Inimitable ended the panel with a (fortunately nearly unintelligible) video of a salty exchange from someone else's nightmare, fellow Kiwi Peter Jackson's, if I am not mistaken. James then closed this one with one of his trademark tumescent, tweedy tirades, a heartfelt tribute to someone special.

And it will be coming soon to a Digital Library near you. May God have mercy on our souls.

John Vlissides Photo ©2004 by Munawar Hafiz

Martin!


I (almost) had the opportunity to introduce Martin Fowler's (superb) talk on Finding Good Design at OOPSLA 2005 this week. Fortunately, an alert functionary in the ACM's Office of Protocol, correctly having observed that my name was nowhere to be found in the Conference Peerage, was able to heroically and single-handedly avert what could have been an appalling affront to the conference's prestige.

Eugene Wallingford has done his usual masterful job providing the play-by-play for the talk itself. I might inject some color commentary once my own OOPSLA brain-dump has progressed a little further, but in the meantime, a few folks asked for the libretto for what became the scripted part of Ralph Johnson's intro. Here it is. Ralph did a terrific job paring it down. I'm continually amazed by what he can do with only arrow and delete keys to improve a piece.


It is fitting that we've convened in a place called Fashion Valley to talk about design. For in the realm of High Technology, just as in the realm of High Fashion, Design is a central concern.

And, in the glamorous and exotic world of object-oriented software style, of code couture, if you will, the very best designers, the doyens of design, are known, as in the fashion world, by a single name: Kent, Ward, Ralph, Rebecca, and ... Martin.

Martin's is a life filled with glamorous models; his sketches of smashingly attired anorexic stick figures are transformed overnight, in factories in the Far East, into the season's hottest products.

Whether you commission designer code or buy it off the rack... ...chances are you've had the opportunity to admire Martin's work.

This season, when some trade rag paparazzi breathlessly ask: Who did your code? The answer the A-List Coder longs to give is ... Martin.


Martin is, as well, the master of the Total Code Makeover. He is the Father of Modern Code Cosmetology.

Martin is a gifted distiller, a distiller of memes. He scours the hinterlands, patiently hand-selecting only the finest raw materials. Martin lets them age, as if in an oak barrel. He takes ideas rough, fresh, impetuous, and raw, and ages them until they are mellifluous, smooth, soothing, and, finally, palatable.


And he is, as I'm fond of saying, a pretty good writer for an Englishman...

He is the lead author of UML Dispelled, er, Distilled, still the volume of choice for those looking for a relatively painless introduction to this cartoon cult. Not least among this volume's virtues, it has often been said, is its brevity.

His seminal work on Analysis Patterns sets him apart from the mere scavengers that stalk our field.

His landmark volume on Refactoring took an obscure technology cultivated in the cornfields of rural Illinois, and turned it into a household word.

He's got a new book out as well, Patterns of Enterprise Application Architecture. I think it is about Star Trek, but I'm not sure.

Martin is the man who brought back basic black with his Martin Fowler Signature Line for Addison-Wesley.

Technical fads rise and fall like hemlines. But at the edge of the runway, on the banks of the Refactoring Rubicon, one designer stands alone. So, here to preview his fall collection is the fabulous Martin! Martin Fowler:

The Aristocrats of Brigadoon

|

It was 1986, the space shuttle was grounded, a lame duck second-term Tory was watching his poll numbers erode, and I found myself heading off to OOPSLA.

Some things never change.

Nineteen years later, we find ourselves convening for the twentieth time. This is OOPSLA XX. (Next year will mark the twentieth anniversary of OOPSLA, back where it all began, in Portland, but that event will, nonetheless, be OOPSLA XXI. We number a lot of things starting with zero in our line of work, after all.)

However you reckon it, the task of commemorating this auspicious occasion had fallen to the five stoic, steadfast, reliable souls who’d managed to attend every OOPSLA to date. A distinguished lot, present company excluded, of course. We, along with a handful of indispensable fellow travelers (Richard Gabriel, Ken Bauer, Linda Rising, Laura Weiner, and Jeff McKenna), constituted the unofficial OOPSLA 2005 Backward! committee.

We'd wrestled sporadically with the format for this event over the last year. I'd originally conceived of it, the morning after an evening of moderate carousing in Vancouver last year, as an Irish Wake. The idea would have been that it was being held at some time in the future. Anyone with a story could come up to tell it. Libation would, of course, be abundant. It would be the kind of event that would finally end in the bar at two o’clock in the morning.

Shopworn stereotypes aside, the Irish Wake is one of the most dignified and effective retrospective rituals mankind has ever devised. I know. As per her wishes, I’d helped my family to conduct one for my own mother a few years back.

However, an Alert FunctionarySM in the ACM's Office of Protocol deemed such an enterprise to be in potentially questionable taste, so off we went trolling for other formats: a nursing home in the year 2025, a twelve-step meeting for object addicts; the list went on.

By the afternoon of the day before, we’d still not settled on a final format for our plenary event the following evening. We convened an emergency lunch meeting of the three non-professors on the perfect attendance list, and professional retrospective organizer extraordinaire (and Backward! committee member) Linda Rising. We decided to run our retrospective as … a retrospective.

We nodded our heads in satisfaction with our plan, as perfect attendee Rebecca Wirfs-Brock produced from her wallet a six dollar bill bearing an ad for an art house film with which she was quite smitten, The Aristocrats. The film is built around different renditions of what is billed as the world’s funniest, or at least filthiest, joke. Rebecca told me the joke on the way out. I laughed on and off for over twenty minutes. Though I said nothing at the time, I knew then and there that our intrepid little band of OOPSLA perfect attendees finally had a name.


Our retrospective itself was preceded by a sumptuous, outrageously decadent dessert spread sponsored by Sun Microsystems, to commemorate (only) the tenth anniversary of the Java programming language. It was like something out of Charlie and the Chocolate Factory. A persistent rumor had it that Sun had shelled out $20,000 for six hundred helpings. Suffice to say there was enough molten chocolate on hand to have dipped every attendee there up to his neck in the stuff.

The bridge from the reception to the retrospective was to have been provided by designated Java Oompa Loompa (note the OOs) James Gosling, but he proved unable to attend. We were, however, as a dividend of the Backward! committee’s video production efforts, able to display a clip of Gosling’s OOPSLA 1996 talk, The Feel of Java, that Don Roberts had carefully selected for this very occasion.

With Gosling sidelined, the duty of holding up the Java Standard fell (figuratively, this time) to his designated stunt double: Steele. Guy Steele. It is customary that a stunt double be less handsome, less erudite, and more athletic than his corresponding leading man. I regret that I must inform you that Steele (who is depicted here vividly indicating Java’s age using the fingers on both hands) fell short on all three counts.

I’d noted as well that Steele had delivered what was unquestionably the cleverest talk I’ve ever heard in my life, Growing a Language at OOPSLA ’98 in Vancouver. As part of our Backward! commemoration, we’d embarked on the process of transferring some of the OOPSLA video archives to DVD late this summer, with the hope of setting up a Wayback/Time Machine space in San Diego. We used the video of Steele’s talk to test the equipment, and it attracted quite a crowd. Though we had about fifty other vintage events to show, Steele’s talk was the only one we exhibited. Other ‘bloggers have cited it as among the most impressive things they saw this year at OOPSLA. I could have sold a hundred of ‘em.

The knowledge that one is about to speak induces a kind of mental tunnel vision, or information triage, that makes concentrating on the speaker before you more difficult. I don’t recall that Steele said anything he hasn’t said before about Java, other than there were unspecified things they’d do differently now.

We opened the retrospective proper with our five perfect attendees, The Aristocrats (left to right), Ralph Johnson, Ed Gehringer, Allen Wirfs-Brock, Rebecca Wirfs-Brock, and your humble correspondent, fielding a few softballs. I’d noted that not only had Allen and Rebecca attended every OOPSLA, but that they had been married for nearly thirty years. Any bets as to whether OOPSLA, let alone Java, will make it that far?

I’d introduced the proceedings brandishing a bottle of Clos du Bois (get it?) Merlot, and vowed that once we five were down to two, we’d crack open that bottle and drink it. I wish I could remember where that cliché came from. I think it was a bottle of Bordeaux, and a fighter squadron, in the original tale.

Once the retrospective began, it fell to Linda Rising, Mary Lynn Manns, and fellow lifer Allen Wirfs-Brock to save the day. Allen came armed with a veritable litany of who-was-here-when questions that managed to give some of the older folks in the crowd a pretty good workout, what with the repetitive standing and sitting. Allen and Linda subsequently widened the panel to anyone who’d been at OOPSLA ’86 in Portland. That produced a satisfactory quorum of the village elders.


Q: What kept me coming back?

A: The free booze.

A: OOPSLA, said I, was an intellectual Brigadoon, a place you could return to every year and pick up conversations in mid-sentence. A place like a cocktail party on the Bizarro World, where you could glibly discuss topics like multimethod dispatch, and be surrounded by the other thousand people on Earth who thought that that was fascinating.

There was a steady stream of questions from the floor. One somehow involved the idea of removing the OOs from OOPSLA, and just re-branding the conference as Programming Languages and Systems. I quipped that were we to banish the OOs from OOPSLA, Rik SmOOdy and I would have to move to Serbia. I’m not sure whether anybody got it.

There was a query as to our most embarrassing OOPSLA moment. I recalled with horror the reception that Al Gore’s science advisor was given as the after-dinner speaker at the last OOPSLA banquet in Washington, DC in 1993. Rebecca grabbed the baton and recalled how noisemakers distributed during one of the panels, in combination with wine distributed during the meal, led to this genuinely appalling breach of etiquette.

I fondly recalled, at one point, Alan Kay’s banquet presentation on the Vivarium project at the very first OOPSLA. I repined as to how he’d managed to gore every ox in the house, and was struck that one could actually do that at a research conference. For good or ill, my life was changed.

I recalled as well, as I often do, how he’d observed, as he so often did, that “Good ideas don’t always scale,” which prompted Henry Lieberman, the next day, to inquire, “So what do we do, just scale the bad ones?”

Someone lobbed a slow ball down the middle for us at one point, asking us what things would be like down the road. Having been spending my time in the High Performance Computing world of late, I’m convinced that things seven or so years from now will be quite a bit more different from now than now is from seven years ago. With Moore’s Law exhausted, and a multicore breakout in the works, and PS3s and their ilk poised to bring supercomputer power to every teenager on the planet, we’ll be faced with the challenge of coming up with a programming model. MPI meets Actors meets Darwin meets Worse-is-Better. How will we squander this bounty?


After a while, I recalled the mix of deference and fascination, mingled with a dash of pity and boredom with which I’d have greeted such a gathering of elders nineteen years ago, and wondered what the folks in our audience, few of whom had bailed out in the midst of this, were thinking. Regardless, I was really enjoying myself, at least, as can be seen in the picture below depicting fellow OOPSLA gadfly Dave Ungar wistfully waxing eloquent over I can’t remember what as I gaze on in rapt attention.

In an elegiac moment, Ungar also recalled the year that we discovered that a Sarcasm Birds-of-a-Feather session was inherently impossible to organize if you tried to get the right people to come to it.

There were moments where what we were doing felt a tad indulgent. I recalled my youthful garage band sessions, which were way more fun to perform than they were to listen to, and thought, well, if that’s what we’re doing, then what the hell, we deserve it. As the retrospective wound down, I felt oddly, uncharacteristically sanguine about it all. It’s hard to deny that we have made a difference over the last nineteen years.

What with the dessert spread and all, this affair seemed more akin to a Jewish Wedding than an Irish Wake. We got insulin shock, alright, but nothing to wash it down. But this event, in the end, was what it needed to be, and about as much fun as it could have been, at least without the liquor…

Photos ©2005 by Munawar Hafiz

Inside the Hacker's Studio

|

A number of us at the CS Department at UIUC were able to end our afternoon today, at 1700 CDT, by watching a live (and flawless) webcast of the 2004 A. M. Turing Award Lecture from SIGCOMM in Philadelphia. The page for the webcast stated that "Ironically, this will be the first ACM Turing Lecture that is webcast live." I'm at something of a loss to explain why this is ironic. For those of us who were at OOPSLA 2004, it would be our second consecutive realtime Turing presentation.

The webcast was led in with an ACM logo accompanied by organ music that began to remind me of a hockey arena. I was waiting for the Canadian national anthem. That it gradually dawned on me that the piece being played was a decidedly languid rendition of “Thus Spake Zarathustra” did little to diminish this effect. It turns out that the University of Pennsylvania's Irvine Auditorium, in which the lecture was to be delivered, is home to a 1.1e04-pipe pipe organ (see picture).

The prize was awarded in June to Vint Cerf and Bob Kahn for their pioneering work on Internet protocols. They elected to deliver their Turing "lecture" at the ACM venue of their choice: SIGCOMM.

The lecture was entitled: Assessing the Internet: Lessons Learned, Strategies for Evolution, and Future Possibilities.

I use the term "lecture" advisedly, because this dynamic duo of the datagram chose not to attempt a tag team lecture at all, but resorted instead to that quotidian staple of twenty-first century life, the talk show format.

The effect, given the stage set, and the gray-bearded, distinguished but phlegmatic, slightly diffident demeanor of moderator Lyman Chapin was nothing if not a probably unintentional echo of James Lipton’s “Inside the Actor’s Studio”.

The effect was in stark contrast to Alan Kay's California Casual high-tech presentation of a year ago (as well as with the "IP on Anything" tee shirt Cerf claimed to have sported during the eighties). Kahn and Cerf dressed like pall bearers, decidedly old-school (perhaps literally), and the faux-faculty club set showcased the shines on their shoes. The pair had evidently invested at least a little of their $50,000 shares of the prize money in wardrobe enhancement. Chapin began by patiently unzipping a leather folder and removing his paper notes. Cerf and Kahn sat with cheat sheets on their laps. For the second year in a row, no Microsoft prosthetics were in view. Indeed, no visuals of any kind were in use here.

I’ve seen this set-piece mock interview format attempted before, with mixed results. At worst, it can seem stiff and more than a little contrived. Fortunately, once they settled into things, this format seemed less and less obtrusive in this case.

Indeed, after a few minutes, given the big screen, the lecture room full of computer types, and the heady, but not excessively clinical subject matter, I started to get a bona fide conference buzz, and even began to take a few notes. Hence, this post. It’s a tad disorganized, which I ascribe in somewhat equal parts to my selective culling, and the desultory conversational format itself. The webcast will be out there on the web, of course (and it is altogether fitting and proper that it should be so, as they say in Pennsylvania), so if you’re curious as to what was actually said, by all means chase it down yourself.

Kahn opened the metaphorical anteing up with the observation that the service economy being built atop his brainchild was like a “cafeteria in cyberspace”. Very biological, very Base-4 (Bio=Base 4, CS=Base 2) ATCG. Timely stuff…

I reached for my notepad in earnest when they started to talk about layering, and how layering was, it seemed, honored as often in the breach as by the Hoyle. I’d been looking for ammunition for a Layering as Golden Calf screed, and what better place to find it than a Turing rap. Kahn talked of layering as an implementation strategy, and one of ‘em observed that for efficiency’s sake, a more intimate design can be accommodated as well.

Then, they drifted into end-to-end allusions that are evidently orthodox shorthand before a SIGCOMM crowd. I've blathered on about the sheer brilliance of the end-to-end argument as recently as last week, and hence won't dwell on it here. The End-to-End notion casts the internet's transport facilities themselves as a substrate, a medium, that does not pre-judge, confine, or constrain what applications can be at the Edge of the Internet.

One of ‘em (story of their lives, I’ll bet) observed that we are in a race to get those who know why some of this stuff is the way it is to say so and record this information before they are gone and no one knows at all. This is the same actuarial time bomb we are seeing with our Fortran refactoring work. Actually it is the first of several parallels I noticed.

The second was one of the evening’s recurring themes: the difficulty of even incremental evolutionary change in the face of the ossifying architectural fait accompli that the Internet has become. “Creeping incrementalism” was the best we can do, if that, it was lamented. Darwin would be utterly unsurprised to hear this… Duct tape is forever, it was more or less observed, with resignation…

Everyone committed to the internet protocols, it was said at one point, but didn’t actually use them.

Cerf (I think) extolled the virtues of a logical architecture that allows federation. (Hmmm, what year did “federation” become a buzzword?)

Cerf likened the basic internet protocols to a linear algebraic set of basis vectors, from which arbitrary vectors could be composed. I found myself thinking that, as with the Von Neumann architecture, a general, blank-check mechanism for bit transport, as with bit transformation, had proven wrong, for at least the second time, those who say general solutions never work.

At one point, it was asked how, given that the laureates themselves couldn’t recognize that they were changing the world, we would know which, among the ideas drifting through the memepool now, would have the same kind of impact. Now, I’d think, the answer is as it always has been. We’ll deploy an army of the unwitting to try them all, and let the devil (or Darwin) take the hindmost, as always.

Cerf observed that if one wanted to design a voice network, building IP first, and then running voice over it, would be the last thing one would do. But the bottom feeders are doing it. That’s what biology does: it works with what it has. Herb Simon’s satisficing; Gabriel’s Worse-is-Better…

There was less brainstorming as to the future than the billing would have had you believe. The most interesting gambit here was into the same turf Rick Rashid tried to mine at OOPSLA last fall regarding keeping “it all” around. To hear Kahn tell it, information will come into being like churn in the rapid quantum foam, most of it annihilating itself before ever amounting to much, some being judged worthy of enduring long enough to be subject to at least a modest amount of additional consideration…

Yet another recurring theme was networking's return to its wireless roots. The notion of an ethernet began with wireless in Hawaii. A world without wires, the laureates promised, will reduce the number of remotes for which we spend half our time hunting.

In contrast with 2003 Turing Laureate Alan Kay, who, at times, seems to have the uncanny ability to channel Don Rickles, and at times appears to have as his goal to leave no ox in the room ungored, attempts at humor here were few and far between. Their parting sound bite was something of an exception to this: “if it is really important, probably no one is in charge”…

What were the lessons learned? What are the strategies for evolution? Let's see, layers are good as a logical ideal, but are often ignored where the bytes hit the information highway. They could easily have riffed on patterns vs. implementation, but this memeplex seems to have spared this crowd. They seemed decidedly pessimistic about the prospects for anything but incremental evolution in infrastructural areas where the annealing process has cooled for perhaps the last time, and tertiary constraints hamstring further evolution. They are more optimistic that there is always room at "the edge" for innovation, that the end-to-end frontier remains as open as ever.

To summarize:

  • Evolution: Tough problem
  • Layering: Honored in the breach
  • Future: Who knows?

Well, okay, after having literally slept on it for a few hours, there were a couple of somewhat forward looking notions that have come back to mind. One was Cerf's discussion of the challenges JPL encountered extending the internet to Mars. At the point where the planets are the furthest apart, this entails a forty-minute round trip through the luminiferous ether. What's more, orbiting platforms and rotating planets conspire to obscure the targets for hours at a time. Such highly dispersed, and sporadically available, nodes required considerable modifications to the stock design, it was said.

Another thing I didn't know, being from outside the COMM world and all, was that Kahn has had a longtime interest in object mobility. Not having pursued this, I don't know if he means the same thing by this that people in the object-oriented world used to, but if he did, this is one of those objects-have-failed tales we could have milked more a couple of years ago. We never got "objects floating in a sea of computation", or the "animal kingdom" model. We got computation rooted to platforms, with data plying the ether, which is more of a "vegetable kingdom" model. For all sorts of reasons, security not being the least among them, this may have been, in hindsight, a more prudent (though less exciting) way to have gone...

A recurring theme was security. Cerf observed, to my surprise, that even in 1975, security work was going on in parallel with that on the public internet, but the security work was (and evidently remains) classified. The laureates acknowledged that they'd never pondered subjects like the socio-economics of spam when they were building this thing. They recalled the outrage engendered by the first email advertisement. I recalled what an obvious, unthinkable abuse of this honor-system infrastructure such things seemed then, as they, in an elegiac moment, recalled the same.


Cerfing USA: Another memorable issue raised was America's inexplicable desire to unilaterally surrender her basic research capacity in Computer Science on the altar of budget cuts, particularly at DARPA, which had patiently stocked the internet punch bowl for a generation, a fact often conveniently forgotten by the free marketeers who hawk the internet as a triumph of the private sector. Without DARPA's high-RISC agenda, Dave Patterson (the ACM president) observed, neither ACM nor Stanford would have a president right now.

It's interesting how different communities have different cultures and characters, and how these characters were affected by their genesis. Objects were Scandinavian Design meets PARC / California. The Supercomputing Community are Oppenheimer's grandchildren, seeded by DOD and DOE. The Internet pioneers lived in a house that DARPA built.

The Turing Award has been called the Nobel Prize in Computer Science, in the same sense, one must presume, that a Daytime Emmy Award could be thought of as a Nobel Prize in Soap Opera.
--Thomas Jay Peckish II, on an unusual strain of Stockholm Syndrome...

Crimson Crux offers more of a straight play-by-play that complements my color commentary. They evidently thought it was funnier than I did. I guess Kay is a tough act to follow.
While the post at motz compliments my post...

Thomas Jay Peckish II on Coaching

|

Beware of any enterprise that employs a coach.
--Thomas Jay Peckish II

In Praise of Cut 'n' Paste

|

"Cut 'n' Paste", the practice of creating a new software component or application by making a source copy of an existing body of code and "beating on it" mercilessly until it meets some new set of requirements, is often treated with disdain and contempt by the traditional software engineering community and (post-)modern object-oriented methodologists alike. Despite this, the ubiquity and enduring popularity of this approach begs the question: What are these people doing right?

Evidence suggests that cut 'n' paste is a nearly essential, and nearly universal, practice when new framework applications are constructed, be they for GUI code, Eclipse plug-ins, or what have you. Adaptations to complex applications such as compilers often demand an incremental, test-as-you-go approach as well.

The advantages to this approach, naturally, are the complements of those associated with factoring out duplication. A copy provides a safe sandbox off to the side of a vital development stream. Changes made to copies cannot disrupt production code, and like duplicate genes, copies can evolve new functionality without jeopardizing existing functionality in the original. Mere copies can avoid, for a time, the intricacies associated with engineering away duplicate functionality.

Indeed, expedient code copying has long been observed to be characteristic of an early, exploratory phase in the evolution of object-oriented components and systems, but its virtual indispensability to framework developers has been less widely acknowledged.

An OOPSLA 2005 Onward! Breakthrough Ideas Micro-Screed

Big Bucket of Glue

|
Fishbowl Room

Back in 1993, a small group of undergraduates toiling in a round, windowless room at the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign produced what was certainly the decade's, and perhaps the century's, most revolutionary application: Mosaic. Its praises have by-and-large been duly sung, but one of its lesser known, and most remarkable, attributes was its architecture: it was what I call a "Big Bucket of Glue". The Mosaic application actually contained very little new code. It was, instead, constructed of a thin veneer of C adhesive code that bound together a rag-tag collection of existing applications. It was a harbinger of things to come.

With the increasing popularity of scripting languages with names beginning with letters like "P", this style of design has become one of our current era's most frequently deployed systems-level architectures. Drawing on an increasingly broad, if not rich, legacy of existing applications and components of all sizes, shapes, and varieties, architects increasingly slather on the glue, and ship these "Big Buckets of Glue" while their competitors are still inking high-road cartoons on their whiteboards.

While the Big Ball of Mud has remained the design of choice for application developers, the "Big Bucket of Glue" is increasingly the weapon of choice for systems-level integrators. Indeed, the wildly popular Eclipse platform can be seen as a collection of black-box Java components bound together with reflective XML glue, as the sketch below illustrates.
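Lest the phrase seem like mere hand waving, here is a minimal sketch of the idea in Java. This is emphatically not Eclipse's actual plug-in registry: the manifest format is invented, and the two "components" are ordinary JDK classes standing in as plug-ins. The point is only how thin the reflective layer that binds black-box components named in XML can be (a recent JDK is assumed for the text block):

    import javax.xml.parsers.DocumentBuilderFactory;
    import org.w3c.dom.Document;
    import org.w3c.dom.Element;
    import org.w3c.dom.NodeList;
    import java.io.ByteArrayInputStream;

    // A toy "Big Bucket of Glue": black-box components are named in an XML
    // manifest, and a thin reflective layer instantiates them without any
    // compile-time knowledge of what they are.
    public class GlueBucket {
        static final String MANIFEST = """
            <plugins>
              <plugin class="java.util.ArrayList"/>
              <plugin class="java.lang.StringBuilder"/>
            </plugins>""";

        public static void main(String[] args) throws Exception {
            Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new ByteArrayInputStream(MANIFEST.getBytes()));
            NodeList plugins = doc.getElementsByTagName("plugin");
            for (int i = 0; i < plugins.getLength(); i++) {
                String name = ((Element) plugins.item(i)).getAttribute("class");
                Object component = Class.forName(name)
                    .getDeclaredConstructor().newInstance(); // the reflective glue
                System.out.println("bound: " + component.getClass().getName());
            }
        }
    }

All the interesting coupling lives in the manifest: the components need know nothing of one another, and new ones are added by editing the XML alone, which is more or less the trick Eclipse plays at scale.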

An OOPSLA 2005 Onward! Breakthrough Ideas Micro-Screed

The End-to-End Principle and Programming Language Design

|

Earlier this year, I finally came across one of the greatest "classic" papers I'd never read: "End-to-End Arguments in System Design" by Saltzer, Reed, and Clark. The end-to-end argument was developed in the realm of internet protocol design. Succinctly put (and this format allows for no alternative) it argues that facilities, such as error checking, which one might be tempted to place at the lower levels of a system or network, might actually be redundant, and therefore useless at the lower levels, when they must be included in the upper levels of the system as well.

We argue that it applies as well to programming language design as it does to networking. For instance, facilities that some languages provide for security, protection, synchronization, namespace management, or type checking must often be re-implemented in application programs in a variety of different ways, many at odds with the manner in which these facilities were provided in the underlying languages. Such duplication raises the question of why these now redundant facilities need be present in these programming languages at all.

This argument suggests that a simpler, more dynamic approach to programming language design, based on an end-to-end analysis of the kinds of facilities users really use, may be better suited to the demands of our clustered, networked, threaded, multi-core 21st century world.
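A sketch may make the networking version of the argument concrete. TCP already checksums every packet in transit, yet a careful file-transfer application must verify the assembled file end-to-end anyway, since disks, middleware, and truncated writes all live outside the transport's guarantees; that is why the low-level check can never be more than a performance optimization. The Java below is an invented illustration, not anyone's actual protocol (and it assumes a JDK recent enough to have java.util.HexFormat):

    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.security.MessageDigest;
    import java.util.HexFormat;

    // End-to-end integrity check: the receiver recomputes a digest over the
    // bytes that actually landed on its disk and compares it with one the
    // sender published out of band. No per-packet checksum can substitute
    // for this check -- the end-to-end argument in miniature.
    public class EndToEndCheck {
        static String digestOf(Path file) throws Exception {
            byte[] hash = MessageDigest.getInstance("SHA-256")
                .digest(Files.readAllBytes(file));
            return HexFormat.of().formatHex(hash);
        }

        public static void main(String[] args) throws Exception {
            String expected = args[0];             // digest sent out of band
            String actual = digestOf(Path.of(args[1]));
            if (!expected.equals(actual))
                throw new IllegalStateException("transfer corrupted end-to-end");
            System.out.println("verified: " + actual);
        }
    }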

An OOPSLA 2005 Onward! Breakthrough Ideas Micro-Screed

Thomas Jay Peckish II on Photran

|

NCSA: So, tell us about Photran…

TJP2: My pleasure. Photran is a plug-in for the Eclipse IDE that provides support for Fortran development.

NCSA: …and Eclipse is?

TJP2: Yes. Eclipse is an open-source platform for building IDEs that was originally developed at IBM. It’s a descendant of their VisualAge for Java product. That product was originally written in Smalltalk. As the Java juggernaut picked up steam, IBM felt compelled to re-write it in Java. They subsequently renamed this product Eclipse.

NCSA: And made it all open source?

TJP2: Yes, they “open-sourced” the entire codebase. This struck some as a bold gambit at the time, but Eclipse has turned into the finest IDE on “the market” for Java development. It’s pretty much brought about a mass extinction among competing IDEs. How do you compete with free, after all?

NCSA: So Eclipse is a free IBM product?

TJP2: Not anymore. It’s still free, of course, but IBM spun the Eclipse effort off to an autonomous non-profit entity, the Eclipse Foundation, so as to not have it be seen as in competition with other vendors. It’s ironic though, IBM has been doing a better job of supporting Java than Sun itself in some respects. I wish they’d co-operate more. I wouldn’t mind seeing IBM acquire Sun, truth be told :-) At this point, they seem like a good match. But I’m a programmer, not an investment banker, so take this with a grain of salt.

NCSA: Agreed. Um, …a “plug-in”? What’s that mean? It sounds so convenient.

TJP2: It is. Actually, all Eclipse is is a collection of plug-ins. Plug-ins are medium grained, relatively autonomous Java components. They are woven together using a layer of XML glue, using a relatively modest collection of high-level integration conventions. This style of embedding more coarsely grained black-box components using mortar like XML (or Python, or Perl) is decidedly post-modern – Actually, Mosaic was one of the first high-profile exemplars of this style, if you ask me, though their “glue layer” was a thin coat of sloppy C code that bound together a hodge-podge of existing utilities. The rest is history, as you know.

NCSA: …Ahem. plug-ins?

TJP2: So anyway, Eclipse is nothing more than a constellation of hundreds and hundreds of plug-ins, flying in close formation (to mix metaphors). Indeed, IBM’s popular WebSphere product is nothing more than a large collection of plug-ins that run on top of Eclipse. IBM has made quite a lucrative business out of selling services for WebSphere. It’s as if GM were giving away cars, and making its money by selling gasoline and providing mechanics. But I digress.

In any case, this architecture means that anyone who wants to write their own plug-ins will see them integrated into Eclipse with all the fit, finish and flair of the existing plug-ins. After all, Eclipse itself is nothing but plug-ins, and reflective glue. And since it is open-source, you have lots of worked examples in front of you, and can use the source for existing plug-ins as the basis for writing your own.

NCSA: …and that’s how you built your Photran plug-in?

TJP2: Yes. Let me get back to your original question. Photran is a Fortran plug-in (or, to be precise, collection of plug-ins) for Eclipse. Like the rest of Eclipse, it is itself written in Java. We built it by taking another set of plug-ins, the CDT (C Development Toolkit), and then hacked away at it. Our ultimate goal is to provide facilities for Fortran programmers that meet the same standards as those that Eclipse is providing for Java programmers. Right now, we provide CVS repository management, Fortran code editing with syntax highlighting, powerful search-and-replace facilities, and code outlining. But what we really want to do is be able to refactor Fortran as easily as, or more easily than, you can now refactor Java using Eclipse.

NCSA: “Refactor”? What on Earth is refactoring? I’ve never heard of it.

TJP2: Our party line is that a refactoring is a “behavior preserving program transformation”. That sounds pretty clinical, doesn’t it? By “behavior preserving” we mean that it is a change that doesn’t change the way the program works.

NCSA: So what's the use in making a change to a program if it doesn’t change the way it works? That sounds like a waste of time.

TJP2: I demur. For example, you might discover that it would be easier to discern the intent of a program if somehow, as if by magic, all the occurrences of a variable, say “K”, were replaced with the name “SUM”.

NCSA: Couldn’t you do that with any editor?

TJP2: You could try, but depending upon how your program was written, you’d have to tread carefully, because any number of variables might be named “K”, and you’d only really want to change one specific incarnation of “K”.

NCSA: So a refactoring tool has to be smarter than an editor?

TJP2: Exactly. A refactoring tool knows about the structure, syntax, and semantics of its target language, and can guarantee that only the occurrences of the specific “K” you’d selected are refactored. This is the kind of thing that distinguishes simple code editing from refactoring: our substitution of “K” with “SUM” can be made as a first-class program transformation, as the sketch below suggests.
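For the reader who wants the hazard spelled out, the fragment below (in Java rather than Fortran, purely for brevity, and with names invented for the occasion) shows two unrelated variables that happen to share the name "k". A blind textual replace of "k" with "sum" would clobber the innocent loop counter; a rename refactoring, which resolves each occurrence to its declaration first, touches only the accumulator:

    // Two unrelated variables named "k": textual search-and-replace cannot
    // tell them apart, but a rename refactoring can, because it binds each
    // occurrence to its declaration before rewriting anything.
    public class RenameHazard {
        static int total(int[] xs) {
            int k = 0;                   // the "k" we want to rename to "sum"
            for (int x : xs) k += x;
            return k;
        }

        static int[] doubled(int[] xs) {
            int[] out = new int[xs.length];
            for (int k = 0; k < xs.length; k++)  // an unrelated "k": leave it be
                out[k] = 2 * xs[k];
            return out;
        }
    }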

NCSA: So you can guarantee that you won’t screw your code up?

TJP2: Yup. That’s the beauty of refactoring. That’s where the “program transformation” part of “behavior preserving program transformation” comes in. We can bring some fancy computer science programming language theory to bear on the problem and show that our transformations do what they are supposed to do, only what they are supposed to do, and don’t break anything else in the process. This takes a tremendous burden off the programmer. It’s an amazing amenity.

NCSA: Well, renaming variables is nice, but…

TJP2: Yeah, another example, one of my favorites, is what the Java folks call the “Extract Method” refactoring. We’ll call it something like “Extract Procedure” in Fortran. This refactoring allows you to select a code passage in your source program, and in one fell swoop, convert that code into a free-standing method or subroutine, replacing it and any other occurrence of that code fragment with properly constructed invocations of the new routine. Your jaw drops the first time you see it in action. So, back to your earlier question: neither of these manipulations changes the way the code works one whit. But they make it easier to read, and thereby potentially easier to change. The computer itself doesn’t really care about these changes, but programmers who have to work with the code certainly do.
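A minimal before-and-after sketch, again in Java and with invented names, may convey the effect; the tool, not the programmer, performs the rewrite, and the program's behavior is unchanged:

    // Before: the same average-printing passage appears twice.
    class Before {
        void report(int[] a, int[] b) {
            int sumA = 0;
            for (int x : a) sumA += x;
            System.out.println("avg: " + (double) sumA / a.length);

            int sumB = 0;
            for (int x : b) sumB += x;
            System.out.println("avg: " + (double) sumB / b.length);
        }
    }

    // After Extract Method: the passage becomes a named routine, and both
    // occurrences are replaced with properly constructed invocations of it.
    class After {
        void report(int[] a, int[] b) {
            printAverage(a);
            printAverage(b);
        }

        void printAverage(int[] xs) {
            int sum = 0;
            for (int x : xs) sum += x;
            System.out.println("avg: " + (double) sum / xs.length);
        }
    }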

NCSA: So, are refactorings mainly cosmetic manipulations?

TJP2: You might think so in the case of giving variables new names. It must sound like high-tech copy editing to you. In the case of extracting routines, though, the changes are more than skin deep. You are improving the code’s structure. Moving duplicated, difficult to read code into one place at once makes the code easier to maintain, easier to understand, and easier to reuse. The structure of the entire application becomes more modular, as redundancy subsides, and autonomous modules, routines and data structures emerge. It feels a bit like rehabilitating a run-down house, or gentrifying a downtrodden neighborhood.

NCSA: All the software buzz-junkies are high on the likes of Python, Java, and even XML these days. Why Fortran? It sounds so retro; so old-school…

TJP2: We became interested in Fortran when it began to impinge upon our consciousness again from two directions at once. On one hand, we were given the opportunity by NASA to try to bring what we’d learned over the years about object-oriented software engineering, frameworks, and architecture to a large high-performance Fortran framework for simulating Gamma Ray Bursts in Type 1a Supernovae. This framework, IBEAM, was written primarily in Fortran. Actually, one of its principal authors is at NCSA now: Prof. Paul Ricker. Maybe you know him.

NCSA: Um, I might. And on this other hand?

TJP2: Yes, at around the same time, DARPA and IBM became interested in ways of improving the productivity of programmers in high-performance computing by an order-of-magnitude, under the aegis of their PERCS program. And, as your readers are surely aware, Fortran still maintains quite the presence in the high-performance arena.

NCSA: And you think refactoring can really improve the performance of Fortran programmers this dramatically?

TJP2: Of all the technologies in the program, I believe refactoring has the potential to produce a larger immediate benefit than any of the others. Indeed, I think full-featured refactoring tools can have a greater impact on Fortran programs than they have had for Java or Smalltalk, for a number of reasons. For one thing, Fortran is a more complex language than either. More bookkeeping needs to be done to keep declarations consistent than in some of the newer languages. And Fortran has gone through three or four major transformations over the years. There is a lot of Fortran code still mired in the Fortran 66 “GOTOs considered necessary” era. Refactoring tools are ideal for modernizing this kind of code. Fortran 77 code uses arrays for nearly every kind of abstraction; modern data structures weren’t introduced until Fortran 90. Finally, Fortran 2003 will be adding support for objects. Object-oriented transformations play a central role in Java refactoring tools. We’re looking forward to refactoring Fortran to objects one day not too far in the future.

We recently did a presentation where we took a small piece of vintage Fortran code, a “dowdy dusty deck”, and performed what we called a “Total Code Makeover” on it. Incrementally, one step at a time, we brought this program into the seventies, then the nineties, and then into the realm of tomorrow’s object-oriented Fortran code.

NCSA: So you had to learn Fortran?

TJP2: Not at all. I’m an old Fortran hand. I wrote my first Fortran program on punch cards back during the Nixon administration, right across the street from here at DCL. That was before they added the second addition, the one that looks like a minimum security cellblock. I spent a number of years doing scientific programming in Fortran for a group that was trying to read minds with brainwaves over at the Psychology Dept. They never got that far, but they were well-funded. One of my greatest frustrations there was that none of the purportedly more modern programming languages that became available at the time were as well-suited to scientific programming as the Fortran 66 descendants we were using. Still, I envied those who were able to program in fancier, frillier languages like Pascal, C, and then C++. Nonetheless, no other language dealt with arrays as well, or as efficiently, as Fortran. That’s still pretty much the case. And, in the meantime, Fortran has evolved to match the feature sets of many of its successors.

NCSA: Well, there are some who’d pay a small fortune for a tool like the one you’re building, but yours is free, right?

TJP2: That’s right. Lock, stock, and barrel. You can find our tool at http://www.photran.org. We’d be delighted to hear from any of your readers who’d be interested in testing and using our tool. And they’re welcome to contribute too.

NCSA: Well, this sounds like fascinating work. Thank you for your time.

TJP2: And thank you. That URL again is: http://www.photran.org...

Backward! Ho

|

I am, God help me, one of five souls who will, should we all arrive intact, have attended all twenty OOPSLA Conferences. The others are Ralph Johnson, Rebecca Wirfs-Brock, Allen Wirfs-Brock, and Ed Gehringer. We are putting together an observance of this auspicious occasion, called "Backward!", that will be held on Tue. 18 October 2005. It is expected that another slightly less auspicious, but nevertheless noteworthy occasion, the Tenth Anniversary of the introduction of the Java programming language, will be observed at this very same event...

Refactoring and the Closing of the Frontier

|

On the 20th of September, 2004, Fortran, the first, and thereby the oldest, high-level programming language, turned fifty. Most people regard computer programming as a relatively new avocation, but the reality is that this industry can now boast of a heritage of over half a century of code.

I use the term "heritage" advisedly. The more customary term is "legacy". That term is, of course, laden with negative connotations. And indeed, in the case of most of this code, most see little worth boasting about. The sheer burden of this accumulating legacy, however, is finally, belatedly, beginning to alter the way we think about software development, both in the academy, and in industry.

Like some of you, I came of age in what might be called the “green fields” era of software development. Systems were built from the ground up. It wasn’t unusual for a single individual to write every line of code that ran an application, or even a system, more or less from first principles.

Large systems too were built in relative isolation from first principles by teams both large and small that wielded what might be thought of as homebrew materials and handmade tools.

Today, the green fields are gone; the frontier is closed. In Illinois, only a few thousand acres of virgin prairie remain. Today’s developers are confronted by construction sites that have seen extensive prior development. Instead of green fields, they must master broken field running, avoiding, or otherwise coming to terms with, existing obstacles. Some sites are so devastated as to be eligible for Superfund status...

Jellyfish in Space

|

Does software have a shape? Cope asked this years ago…

Software and Speciation; Allopatry…

A jellyfish in space: if only software could be written on the space shuttle, in the weightless void.

Most programs have primitive shapes, they are infophages, a tube around a gut, a python… A stream…

We’re running a chop shop for code, a van conversion outfit, a limo conversion shop.

Where are Ferrari, Porsche, Earl? We are shade tree mechanics, not designers. They reused ideas. Lots of ‘em…

Construction Sites are Always Muddy

| | Comments (1)

Construction sites are always muddy. Kitchens are always messy, at least while meals are being prepared.

People want assembly to be antiseptic, and free. It ain't. It takes time, and skill, to fit materials to the site, and the task, and craft them into an integrated whole. People don't want to live or work in prefab trailer park modules. They want real homes, real stores, real offices.

I've been re-reading some of Christopher Alexander's work, and remain skeptical that either good buildings or good code can be seamlessly, effortlessly conjured up out of Lego blocks.

In the Shadow of the Moon

|

No OOPSLA XIX Zeitgeist compendium can ignore the momentum that the Eclipse community continues to exhibit this year. A packed reception on Monday night put an impressive array of contributions on display, notable both for their breadth and depth. Eclipse is moving into what can only be called a Nation Building phase...

It's a simply gorgeous day in Vancouver, and there is some hope that a total eclipse of the moon will be visible later this evening. I knew IBM had a lot of clout...

What a Bunch of Clowns

|

A nod of some sort to the "two cultures" of art and technology has become obligatory at OOPSLA over the last few years. An OOPSLA 2004 Panel chaired by Jim Haungs on whether software design is math or science or arts or crafts revisited the enduring theme of similes for software development (among other things). The four panelists shown to the left, from left to right, are Richard P. Gabriel, Martin Fowler, Steve McConnell, and Ralph Johnson. Shown to the right is late addition Robert Biddle.

Two newbies I know later dubbed this panel the single most impressive thing they'd seen at OOPSLA. I found it pleasant, and at times thought provoking (and I am not one who is easily provoked to thought), though I found myself at the disadvantage of being already familiar with the likely positions of most of the panelists.

Ralph took the eminently sensible position, as is his custom, that software development is, above all, engineering, and gave an engaging presentation on the travails of nineteenth and twentieth century bridge engineering based on the work of Henry Petroski. In the early days of railroading, railroad bridges were as prone to failure as software is now, frequently with more lethal results.

Biddle mined the same fertile post-modern vein that has served him and his antipodean sidekick James Noble so well on the Onward! circuit over the last three years. My recollections of his presentation have melded into a conflated kaleidoscope of notions and reconstructed memories like "the death of orthodoxy" and "making do in a post-precision heterogeneous polyglot playground of a world". I'm pretty sure Robert didn't quite say any of these things, but they are in the spirit of what he said, which was in the spirit of the Zeitgeist he was trying to summon out of the humanities. In this respect, he may have been the panelist most true to the panel's theme. I recall some mention of nihilism and certain French academicians. All in all, the sensibility was a generation beyond the outmoded technotopianism that our own village elders still hew to, and was in considerably better touch with life in the trenches.

McConnell, predictably, hawked his more troublesome notion of software construction. My main beef with the construction metaphor is that it perpetuates an enduring confusion between software design and software product fabrication. Software design has much more in common with the process of deciding on the layout of a house than it does with the process of merely driving the nails at the site. We are more like the people who design cars than the people who assemble them. The analog to auto assembly for us is disk replication. Despite this fact, the industry has exhibited a persistent desire to treat software development as an essentially blue collar activity. Metaphors like McConnell's do little to dispel this harmful perception.

It's customary to conclude such discussions with the observation that no metaphor is perfect, and that a programmer is, after all, just a programmer, as the agile folk are fond of saying. This discussion was no different. "Programming is like nothing else," opined programmer cum poet Richard P. Gabriel.


I had been content, of late, to leave this issue there, but the panelists had managed to re-convince me instead that metaphor was more important than ever. Metaphor matters. As I pondered their positions, I was struck by a metaphor drawn from the arts: We're all a bunch of clowns.

So, "What a bunch of clowns!", I exclaimed. Not just the panelists, all of us. Programming is like ... comedy.

Another software simile was born.

For instance, APL programmers and comedians share an affinity for pithy one liners. Take this one for instance: What poetry has in common with code is that no one ever reads it.

Doing either well demands creativity, and a mix of improvisation and scripting skill. Comedians test, iterate and polish their material until it executes perfectly.

Fred Brooks, after all, is said to have found that there is a factor of twenty difference between the best and the worst comedians...


I'd missed most of Dick's position, alas, and had a hard time recalling what Martin had said, at first. Then I remembered that he'd offered yet another software simile: programmers are like lawyers. I'm pretty sure I shopped this around at a panel a couple of years ago. I liked Martin's treatment, though he could have played up the domain expertise angle more. I've posted my 2002 notes here.

This reference also had the effect of reminding me of one of Martin's previous forays into the realm of object-oriented jurisprudence at the Show Trial of the Gang of Four back in Denver in 1999. Martin can be seen in the solicitor's wig, third from the left. He'd painstakingly supervised the trimming of the solicitors' wigs beforehand, so as to ensure they were shorter than the presiding judge's, and thereby maintain the dignity of the court.


Gabriel et al. Photograph ©2004 by Munawar Hafiz

Some subsequent comments on POMO and Biddle from Ian Clysdale...

My Dog Ate My Keynote

| | Comments (1)

A couple of people have asked me whether I was planning to post anything substantial about OOPSLA 2004. To them I say patience, patience: red meat and fiber take longer to digest than do frilly pastries. For openers, here are a few thoughts on the OOPSLA 2004 Keynote Address.

I apologize in advance for the verbosity of what follows. This is a raw core dump of sorts. Making it taut will take more thought. I’m afraid I’m prone to occasional bouts of logorrhea after traveling abroad.


Microsoft has a checkered past with respect to participation in the broader research community. For too many years, Microsoft was conspicuous only by its absence at OOPSLA. This low profile once prompted one wag to quip that "Listening to Microsoft talk about innovation is like listening to a scavenger talk about hunting."

Hence, it's been satisfying over the last several years, dating from around the turn of the century or so, to see the strides Microsoft has made towards the goal of turning this around. They are cranking out more books and papers than ever these days, and their people are getting out more. The acquisition of high-profile free-agents like Ward Cunningham has added more muscle to a lineup that had traditionally been able to boast of a strong bench, but of few long ball hitters.

It was for these reasons, and others, that I was excited at the prospect of hearing from Microsoft's homegrown Research Czar, Rick Rashid.

At this point, I must turn to my notes. Ahem, here they are:


Dr. Richard F. Rashid; Founder and Head of Microsoft Research; a CMU alumnus, who'd worked on Mach; a guy who'd gotten a few bits under his fingernails in his day. Indeed, he observed that current users of Apple’s OS X may still be packing a few lines of his original kernel code in their rigs. I knew nothing of Rick’s background until this morning, and his stock was rising fast (in my book, at least).

The talk was seemingly cobbled together from two strikingly distinct sources. The first part of the talk opened with amusing eyewash, recycled from the likes of COMDEX Conferences gone by.

The future, Rashid predicted, would be about the data. Hey, not a bad hook. His images reinforced an emerging sense that that long-delayed digital convergence the digerati have been braying about for the last few years may finally be upon us.

Now, what they’ve been predicting is a cornucopia, an age of resource abundance, a horn of plenty. Petaflops galore. Scores of cores on every desktop. 3D horsepower that rivals Pixar’s. Bandwidth too cheap to meter. The difference these days is that we can finally see it coming. Yes, he’s there too: the rise of the GPU, multicore processors, the whole shebang.

Now, Rashid is talking about heady stuff: Black box flight recorders for humans. The lawyers will love this. We yanks will need to set to work to fix the fourth amendment for anything like this to work.

Hmmm. I guess Planetary Scale is a buzz phrase these days. Will we be able to consult realtime satellite photos to see if there are fresh parking tickets on the windshields of our cars?

Terra Scale / Galactic Scale. Hmmm, 1998 Turing Laureate Jim Gray is talking about Galactic Scale computing. Impressive. Has any mere marketing department ever exhibited megalomania of this magnitude?

http://skyserver.stss.org || http://GPGPU.com || http://www.iova.net || http://skyquery.net

I recalled the climax of Arthur Clarke’s 2010: Something Wonderful is About to Happen. It feels as if Moore’s Law has been treading water of late, expending calories on a phase transition, as if melting ice, rather than generating predictable increases in raw heat. It feels as if, after a period of relative stagnation, things are about to change. It feels exciting. 2010 culminated with a new sun in the sky alongside the old. Jupiter transformed through fusion. The sky was yellow and the sun was blue. I wonder what color our new sun will really turn out to be. But the timeframe is starting to sound about right.

Next, Rashid dug his steam shovel into the ubiquitous computing vein. Smart Personal Object Technology, the Ministry of Software calls it. They have a tiny CLR for everyday gadgets. Neat.

We’ve been digging into sensor nets, control systems, and MPI-worthy clusters at home, and his tales reinforced our hope that the ore dug from this vein will yield more than base metal one of these days.

Yup, he had me up until right about here.


It was at this point that Rashid’s talk took its first ominous turn for the worse. For one thing, it began to dawn on me that there hadn’t been anything very meaty, very original, in his presentation up until this point. I felt as if I were in a Mexican restaurant, filling up on chips, and wondering when the main course would arrive.

The larger, more serious problem was that his focus had changed abruptly from fuzzy lens ink blot fantasies to what Microsoft calls “Software Factories”.

More copies of “Software Factories” left OOPSLA 2004 in the hands of attendees than any other book in OOPSLA history. Given the healthy attendance, the numbers for SF must have easily eclipsed those posted by the Beck/Gamma Eclipse book IBM bought us all last year. This ploy, in turn, obliterated Gamma’s own record, set in 1994 by the GoF book release’s “reading frenzy” in Portland. Of course, the 1994 books were actually sold, whereas these were comps.

Not that I’m complaining or anything. What Irishman minds hearing the words “On the House”? Why look a gift-horse in the mouth, even if he isn’t the fastest pony on the track? But I digress…

Now, as luck would have it, our Software Architecture Group at the University of Illinois at Urbana-Champaign conducted a series of writers’ workshops a few months ago that examined a fair bit of this material. As I recall, we had some substantial reservations about where this effort was heading.

For one thing, the notion of “Software Factories” is way off the mark. Something less grandiose; more intimate; might have worked better. I can’t recall the alternatives we came up with (borrowing “weaver” from the AOP crowd might have been one suggestion). I’ll poll our contingent later in the week.

Our more serious reservation is that we think this “programme” is far too ambitious, far too grandiose, to bear substantial fruit over the short haul outside of a handful of set-piece, niche domains.

Proponents would likely counter with “you’ve got to start somewhere.” I suppose … but …

Nooooo, is this Model Driven Development again? For the love of God, noooooo. A proper rant on the excesses of MDA/MDD advocates is beyond the scope of this screed. The upshot: this stuff is being hyped way, way too prematurely.


The presentation turned next to Visual Studio 2005. VS05 will evidently include tools for developing domain-specific languages. This sounds like a worthwhile direction, but it wasn’t all that easy to tell how this will work.

VS05 will support building programs using diagrams as well as code, and, mirabile dictu, refactoring too.

It began to dawn, somewhere in the sluggish recesses of my shopworn mind, that this all was beginning to sound like a response to Eclipse and WebSphere. Let’s see: we’ll have domain-specific plugins, multiple language support, tools for turning code into pretty pictures, and vice versa, refactoring support, an utterly shameless Java knock-off. There is even something named “Corona” in there somewhere. Get it?

Like U.S. Grant, Microsoft is fighting its customary unrelenting war of attrition. Siege tactics. Lastest with the Mostest, as Nathan Bedford Forrest might have said (instead).

And, that’s all fine as far as it goes. I use Visual Studio from time to time, and I like it well enough. A commercial alternative to open-source tools like Eclipse is a healthy thing, in my estimation.

Indeed, I heard a lot of people complain that the VS2005 roll-out announcement reeked of crass commercialism. I simply can’t see the problem. I mean, God help us should someone show up at OOPSLA with software for sale. Doing a roll-out at OOPSLA strikes me more as a gesture of respect than of disdain. It’s no wonder exhibitors are so scarce these days.

No, the problem I had with this part of the talk, other than what I’ve prattled on about already, is that he delegated about half of his keynote to an underling. Sure, the Year 2525 stuff and the roll-out were fine, and a fellow like Rashid surely leads a busy life.

Nonetheless, I felt like we were served leftovers, carrion even, and I craved something meatier and fresher. It seemed as if Rick had left his A-Game at home in Seattle. I wanted to run out to try to find Rick Rashid’s dog, to see if I could cajole him into coughing up the talk I wanted to hear instead...

--BF, who reserves the right to revise and extend his remarks.
--BF, who has leaned on alimentary metaphors perhaps more than a person should over the course of a blog post…


If you don’t have the courage to say what you think, there isn’t much use in thinking it, is there?
--Thomas Jay Peckish II


Hey, I just noticed the Visual Studio 2005 Beta DVD in my OOPSLA totebag. Thanks guys, I'll give it a look...

Is "Croquet" Coyly Eponymous?

| | Comments (4)

Alan Kay has had what, by anybody's standards, would have to be called a good year. He recently bagged the Draper and Kyoto Prizes, and is poised to deliver his Turing Award lecture tomorrow night. His praises have been duly and extensively sung elsewhere; let it suffice to say that he has more than ample laurels upon which to rest, should he have so desired.

At the frightful risk of appearing to indulge in "lèse-majesté", I must sheepishly concede that I have, on a few occasions, described Kay as the Orson Welles of Object-Oriented Programming. I'm not completely sure why this comparison came to mind, but genius, pure preternatural genius, was surely a factor. And, like Welles, Kay's best-known, most monumental achievements came at a remarkably young age. Then too, one can, if one squints just the right way, discern a certain physical resemblance between the two.

Unlike Welles, Kay has not indulged in the luxury of resting on his laurels until both he and they have gone to seed. Instead, there's a good chance that he'll be demonstrating Croquet, a remarkably collaborative environment that foreshadows what can be done with the coming cornucopia of cycles, bandwidth, and graphical power.

It usually takes a few days to discern OOPSLA's Zeitgeist, if any, but this time it's evident already that the buzz junkies will have more than the usual methadone to amuse them...

--BF, who valiantly resisted the temptation to entitle this posting "Citizen Kay"...

The Hajj

|

Given the vagaries of the lunar calendar, this must be at least the second time that OOPSLA has occurred during Ramadan. (Yes, it's a Google question of some sort, I know, I know.) It was in this spirit that we set out on our nineteenth annual pilgrimage, to OOPSLA XIX, in Vancouver, BC.

This is OOPSLA's third time in Vancouver, and our fourth or fifth time in Canada, depending on how one scores Minneapolis. Our air/land journey from Urbana consumed 16:14, besting James Noble's travel time from New Zealand by at least two hours.

OOPSLA 2004 Wiki page for Thomas Jay Peckish II.

Brain Surgery

|

Our web server for the NASA IBEAM Project, http://brain.cs.uiuc.edu, underwent surgery yesterday for an acute respiratory problem, the complications of which included high fever and loss of consciousness. The procedure went smoothly, though more loss of blood was encountered than expected (alas, on the part of the surgeon). Brian and Brain are resting comfortably, and a complete recovery is expected for both.

The procedure, I'm pleased to report, was neither "brain surgery" nor "rocket science".

Anyway, our server was crashing due to a faulty cooling fan. I rather suspected something of the sort when it would lock up and refuse to do anything, only to return to its old self after being powered down for a few minutes. Still, there was no obvious fan problem to be seen once the covers came off.

However, the BIOS Hardware status page told the tale quite vividly. I could see the fan speed for CPU0 drop from 5400 RPM down to 500, and then to zero, while the CPU temperature rose to nearly 212 F. A smoking gun; a nearly toasted Athlon.


I mention this only because it reinforced something I've been thinking about while building test cases of late: why can't more software components work this way? I want more of this kind of runtime validation and status reporting in more of the objects I build and use.

Now, this is surely not a new idea; just one I'm personally becoming quite sold on / smitten with of late. Why can't more of these kinds of monitoring and instrumentation facilities be built into more of our components? As with motherboards, is this a by-product of technical maturity?
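
Here, by way of illustration, is a minimal sketch in Java of the sort of self-reporting component I have in mind. The names (SelfReporting, Health, BoundedQueue) are hypothetical, coined for this post rather than drawn from any particular library:

    import java.util.ArrayList;
    import java.util.List;

    enum Health { OK, DEGRADED, FAILED }

    // The software analogue of the BIOS hardware status page.
    interface SelfReporting {
        Health health();          // current vital signs
        List<String> symptoms();  // human-readable diagnostics
    }

    class BoundedQueue implements SelfReporting {
        private final int capacity;
        private final List<Object> items = new ArrayList<>();

        BoundedQueue(int capacity) { this.capacity = capacity; }

        void put(Object o) { items.add(o); }
        Object take() { return items.remove(0); }

        // The component watches its own invariants, the way a
        // motherboard watches fan speed and CPU temperature.
        public Health health() {
            if (items.size() > capacity) return Health.FAILED;
            if (items.size() > 0.9 * capacity) return Health.DEGRADED;
            return Health.OK;
        }

        public List<String> symptoms() {
            List<String> s = new ArrayList<>();
            if (items.size() > 0.9 * capacity)
                s.add("queue at " + items.size() + " of " + capacity);
            return s;
        }
    }

A test harness, or the running system itself, could then poll health() periodically, just as I polled that BIOS page, and raise an alarm before the silicon, err, object, actually toasts.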

Thomas Jay Peckish II on Code Clutter

|

That production code is littered / cluttered with "error handling" or "exception" code speaks volumes about the state of contemporary programming language design...
--Thomas Jay Peckish II
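
Lest the point seem abstract, consider a hypothetical Java fragment (the names are invented for illustration). One line of actual intent, swaddled in mandatory scaffolding:

    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.io.IOException;

    class ClutterDemo {
        static int readCount(String path) {
            try (BufferedReader r = new BufferedReader(new FileReader(path))) {
                return Integer.parseInt(r.readLine().trim());  // the intent
            } catch (IOException e) {
                return 0;   // and what, exactly, should happen here?
            } catch (NumberFormatException e) {
                return 0;   // ...or here?
            }
        }
    }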

O'Reilly on Scripting

|

Scripting languages seem to be in the vanguard of programming language design and/or evolution these days, leaving more traditional, more academic languages (and more traditional academics) in the dust. Much more should be said about this. For the moment, I just want to remember where I found this: O'Reilly Network: Why Scripting Languages Matter [May. 14, 2003] and this: O'Reilly Network: The Crafty Turk [Jul. 21, 2003]

--BF, wishing all my bottom-feeding friends a Happy 2004...

PS, I also bookmarked this nugget 'o wisdom from Timmy Boy too, though I don't recall why: O'Reilly Network: Refolding the Instructions [Apr. 09, 2003]. Probably because it seemed consonant with Christopher Alexander's thinly-veiled Zen allusions...

Johnson Braves the Mire

|

I had the distinct privilege this afternoon of hearing Ralph Johnson (yes, that Ralph Johnson) present our hoary chestnut, Big Ball of Mud, to his Software Engineering seminar. The mere prospect of this caused me to start reflecting on it...

I still find myself haunted by Kent Beck's critique that this masterpiece of equivocation ought instead to have had the "courage of its convictions".

I've become increasingly receptive to this perspective over the years. Is there something about the metaphors we use to describe software that blinds us to its fundamental nature? Is that untidy tangle we dismiss as anomalous really part and parcel of building this stuff?

The set of systems that can feasibly be built at all is strictly larger than the set that can be built elegantly. For many tasks, and for many teams, this architectural style may be the only possible match.

We made a point of saying, and so it is oft said, that this paper should not be seen as an endorsement of Big Balls of Mud. Indeed, we made it clear that we were, by no means, recommending these kinds of designs. At the same time, we resisted considerable pressure to utterly repudiate this architecture, for a variety of reasons.

For one thing, tangled legacy systems leave programmers no choice but to cope as well as they can. Infantry-style teams and processes ensure systems made in the team's image. Some problem domains may pose requirements so inherently muddled that they must inevitably be mirrored in any possible solution.

While I still can't say I recommend this approach, I'm convinced that it was high time that someone try to describe and explain it. This architecture may as well be placed honestly and openly on the table, since its spectre looms large over its more respectable alternatives.

Noble et al. have nominated this work as perhaps one of Computer Science's first post-modern works. I'm quite sanguine about this characterization. I realized during Ralph's lecture that a legitimate deconstruction of our argument, that is to say, a characterization of our unstated posture, might be that small teams of skilled craftsmen can beat an underskilled human-wave approach every time. Were we really advocating a world made in our (presumed) image?

And, this Slashdot nugget on role fragmentation suggests yet another possible perspective on this multi-faceted issue. (And, yes, it feels rather pointless to echo a Slashdot post in one's own weblog, but what the hell...)

Ralph observed at one point that Big Balls of Mud are what you get when you throw an army of Visual Basic programmers at a problem, and further, that that's just what you ought to expect to get. It's yet another corollary to Conway's Law. Given that you've elected to employ a large number of modestly talented "infantry level" coders, a haphazard hodge-podge is what you should properly expect.

Now, back to Beck's challenge. Kent made this remark, I've recently realized, back when his ideas about Extreme Programming were taking shape. XP would ultimately relegate concerns about design aesthetics to a secondary, or even tertiary, position in its pantheon. You Aren't Gonna Need It demanded that design flourishes be eschewed in favor of immediate, established requirements. It is a defiantly utilitarian process that ultimately came to scoff at the petty egotistical inclinations of designers towards generality, extensibility, and elegance. Indeed, it gently mocks these inclinations as a sort of wasteful hubris. Looking back, it seems that Kent had begun to see the Sirens of Elegance, of High Road Architecture, as among the major obstacles on the road to a more dependable software development process.

Outlook Mystery Hex

|

Have you ever been mystified, err, well, let's be candid, annoyed by Outlook's hexadecimal failure codes? This site has the low-down on 'em.

Now as to the enduring mystery of why I'm still using Outlook...

Software that Doesn't Stink

|

I came across some fresh speculation about the enduring mystery of why so much of the software we build is so bad. This discourse at Better Living Through Software, entitled Software that Doesn't Stink, links the discussion's instigators before adding a good-sized chunk of supporting commentary.

The notion that programmers who have been around for a while develop a tolerance for a certain degree of "code clutter" rang true for me. What's more, I don't see this only as a possible abdication of concern for the system's design integrity, but possibly also as a necessary means of holding conflicting, centrifugal forces at bay. Big Ball of Mud speculates that programmers differ in their capacity and willingness to cope with such clutter. The Selfish Class raised the possibility that inscrutable code may, ironically, resist change better than more readable code.

The observation that there is an appalling amount of waste in many organizations rings true, up to a point. There are a lot of forces at play here, and no single factor will be able to account for it all.

It seems like many organizations assume they need much larger teams, with their attendant support, than they really do. They employ an "infantry model" to staff their projects, when a "commando team" model might be sufficient, as well as more effective...

Joshua speculates that the software industry attracts a certain sort of idealist: persons enamored with the seemingly unlimited malleability of this stuff we call software, and captivated by the sheer power that command of such stuff brings with it. One of the most exhilarating things about programming is that, with enough effort, one can master the intricacies of what is "under the hood" all the way down to the silicon.

Or, one could. During the '70s and, again, with the advent of the microprocessor, a single individual could master vast swaths of all the code that ran on their machines. These John Henry programmers could single-handedly design, code, and test quite substantial applications. Making all their design decisions inside their own heads kept communication overhead to an absolute minimum.

Are the most accomplished soloists less adept at ensemble work? Are commando coders hamstrung when coordinated cooperation is called for? Did those scurrilous calumnies about the purportedly poor social skills of the cream of our indigenous hacker culture have any truth to them whatsoever?

In any case, how surprising can it be that people who were trained to be crackerjack solo coders find that collaborative work moves, by comparison, at a stately, even glacial, pace?

Joshua observes that coders seem to do what they are good at doing, what they know how to do, regardless, it would seem, of whether doting on these dimensions for improvement is really a good use of their time. You get hand-coded inner loops that exhibit near-theoretical peak performance, alongside business logic that, at times, seems oblivious to the task at hand...

He offers two additional pathologies. One is competition among programmers to excel at, say, optimization, at the expense of mastering the intricacies of the domain. This sounds plausible; craftsmen of every stripe engage in this sort of one-upsmanship. I'm reminded of a story about the plasterers who stuccoed my house (which I will try to find).

The other is designing for extensibility. This too has, at times, had a certain cachet among hackers. Therein lies a tale...

Learning to Compromise

|

On the way to OOPSLA 2003, I found myself reflecting on how the design of any system beyond a certain size is inherently about making compromises. Perfection in any design effort involving trade-offs along a multitude of design dimensions, many of which are impossible to quantify, and some of which are intrinsically subjective, becomes not only impossible to achieve, but nearly meaningless, or at least profoundly impractical, to even contemplate. Any good design is an orgy of compromises.

Much of my thinking along these lines was shaped by reading Henry Petroski's work. There are other tentacles that reach all over the place. I'll try to find some.

My thought, though, was this: if compromise is going to be an essential part of any practicing programmer's work, why are we so fixated on inculcating a passion for perfection in students, and so bad at teaching them to be good compromisers?

To be sure, there are realms where quality is measurable, and perfection can be pursued. Algorithm optimization and code succinctness come to mind. These skills can be taught, and better still, evaluated. This may be one reason we teach them first. Are we good at tempering the pursuit of these qualities with the judgement to effectively trade them off? Or can only experience teach this?

Programmers are Like Attorneys

|

Programmers are like attorneys. They know a lot about the law, but don’t know much about their client’s domain. They learn new domains every time they take a new case. Over time, they may cultivate specialties, as they develop more expertise in a particular area.

Still, their attitude is “Have Law, Will Travel” (talk about dating yourself).

People think that a good lawyer can litigate anything, or that a good programmer can hack anything. Both statements could not be further from the truth. You'd not want to hire a probate lawyer to defend against a drug charge, nor a union pension fund disbursement programmer to write a realtime data acquisition system.

It takes years to master the law, but mere mastery of the law is often not enough. One must also know as much about the domain being litigated as a denizen of that domain.

Like lawyers, we spend years acquiring arcane expertise in realms like programming languages, and optimization techniques. Once we hang out our shingles and begin to practice our trade, business rules, medical details, or even rocket science must become our stocks in trade as well.

Like lawyers, what programmers do is often misunderstood, and both professions have found themselves the targets of derision, even contempt. Garbage Collection guru Henry Baker once likened the role lawyers play in modern society to that of the samurai in ancient Japan. No one really liked having either around, he said, but if the other guys had theirs, you had to have yours, and whoever had the most, or the best, would probably prevail.

Another attribute that programmers once shared with the legal profession was decent compensation. Sigh. And a modicum of prestige. Sigh. Some of the last of the secular priesthoods.

Both engage in a level of face-to-face interaction with their customers that resists attempts at outsourcing. The work products of both exhibit a custom, one-of-a-kind character.

Both adhere to rigid dress codes, and work irregular hours.

A Programming Language Parable

|

A programming languages parable from Shelly Powers, by way of Mariann Unterluggauer...

Thomas Jay Peckish II on Programming Language Design

|

No programming language construct has held out more promise, yet failed more spectacularly, than the Comment Statement...
--Thomas Jay Peckish II

Thomas Jay Peckish II on the Cartoon Cult

|

A flaccid cabal of cartoon worshipers, drooling over UML centerfolds...
--Thomas Jay Peckish II, at OOPSLA 2000

UML: A form of pornography favored by project managers... Porn for Project Managers...
--Thomas Jay Peckish II

Live James Burke Scraps from OOPSLA '99

|

77 Tutorials
25 Workshops

30 of 152 papers
2136 or so people...



James Burke from Oxford, alas...

Excruciatingly boring... He opens with a joke about a millipede...

Knowledge manufacturing, a.k.a. innovation...

Oh no, not The Box...

Carburetor: perfume spray, gasoline... A Frenchman...

A paean to serendipity and bisociation...

Descartes and reductionism... A world of noodlers. Specialization vs. holistic hippies.

Planning the Hunt: delegation of labor, hierarchy, standard mammoth catching practice.

Flint and language to convey skill. A precise linear sequence, he claims...

More than one meaning in more than one context... Break the reductionist/specialized rules (yawn).

Why does he call reductionists noodlers?

Looms and putting people out of work...

Riots and change in the apparel chain...

Linen rags are free. You can make paper...

Since 1991, the Americans have been able to turn their first stringers from building bombs to building toys. It has turned out to matter, it seems. No one expected this. Not even us. Cold war demobilization has given us a windfall technology dividend...

Graduate vocabulary: 12,000 words, out of the 418,424 words in English...

He praises reductionism as he belittles it. Cheap talk about creativity is really a fig leaf for mobilizing the masses of dullards to step into line with enthusiasm. Mr. Smith goes to Oxford. His is a soothingly egalitarian message, so much so that I don't trust it.

Standards and information scarcity: he is going to claim that intelligence is not scarce. He claims there were only a few niches for clever folk to act like wizards. He is going to claim this is changing.

Novel juxtaposition: it can be more plentiful...

Even Burke is hanging out his red light. A Knowledge Web. Form being more important than content...

He beats on one and one is there...

Jim Burke's Everybody Gets a Trophy Day

Foreseeing patterns of innovation... Weave a few patterns...

He's an optimist... We went to the moon... Several times...

Thomas Jay Peckish II on Top-Down Design

|

Top-down, Waterfall design is the technical equivalent of Scientific Creationism.
--Thomas Jay Peckish II

Thomas Jay Peckish II on Relational Database Rows

|

As far as I am concerned, RDBMS rows are just objects that don't know how to behave...
--Thomas Jay Peckish II

A Trivial, Nearly Clerical Exercise

|

Those of you who are old enough to remember the era of 16-bit to 32-bit code conversion, especially in the Microsoft world, will recall that on the surface, it seemed like a trivial, nearly clerical exercise. In its own way, it seemed every bit as ridiculous as the Y2K problem. Yet, for many of the same reasons, it was an expensive and tedious chore…

I Have Nothing to Declare But My Genius

|

Oscar Wilde on Strong Type Checking

Wilde, when visiting America for the first time, was asked upon arriving at the New York Customs House if he had anything to declare, and is said to have replied: I have nothing to declare except my genius.
--Oscar Fingal O’Flahertie Wills Wilde (1854-1900)

Wilde is, of course, one of Ireland's most gifted minters of colorful epigrams...

Thomas Jay Peckish II on Open Source

|

Give away a thousand lines of source code to your competitor, and you may ruin your business. Give away a million lines, and you may ruin his...
--Thomas Jay Peckish II

Controlling the Emergent Entity

|

At OOPSLA '97, Gregor Kiczales et al. discussed, among other things, an idea they referred to as Aspect-Oriented Programming, or AOP. They employed a faux conversational format, which at times seemed forced and a little contrived. The work struck me as potentially interesting, but premature. At the least, the level of fanfare exhibited for something as preliminary as this seemed unwarranted. Kiczales spoke of the difficulty of remodularizing to control the emergent entity. This, in turn, prompted the following quip:

If we can't remodularize to control the emergent entity, then what can we do, repolarize the main deflector dish to emit a tachyon pulse?
--Thomas Jay Peckish II

Thomas Jay Peckish II on Java

|

These guys are making all the right compromises...
--Thomas Jay Peckish II, upon the occasion of Java's incorporation of inner classes, a flawed, but probably adequate substitute / excuse for closures...
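
For those who weren't following along at the time, a minimal sketch of the compromise in question; the Adder interface is my own hypothetical stand-in:

    // Pre-closure Java: an anonymous inner class captures the
    // (necessarily final) local n and plays the role of a closure.
    interface Adder { int add(int x); }

    class ClosureSubstitute {
        static Adder makeAdder(final int n) {
            return new Adder() {
                public int add(int x) { return x + n; }
            };
        }

        public static void main(String[] args) {
            Adder plusFive = makeAdder(5);
            System.out.println(plusFive.add(2)); // prints 7
        }
    }

The flaws are plain enough: only final locals can be captured, and the ceremony is considerable. But it gets the job done.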

The Fractal Model

|

The Fractal Model: What points do we have to make?

Reusable objects are the result of an evolutionary process that is quite distinct from the one that produces traditional software.

Reusable objects result from a three-phase iterative process that unfolds throughout the system, at all levels, throughout the lifecycle, within and beyond individual applications. These phases are: prototyping, exploration, and consolidation.

This perspective builds on earlier lifecycle perspectives, particularly Boehm's Spiral Model, but is unique because it recognizes that the lifecycles of individual classes and frameworks are distinct from those of the applications that spawn them.

This process has characteristics in common with natural evolutionary processes.

Contemporary OOA&D bears a much closer resemblance to traditional approaches than it does to this process.

An organization that wishes to produce reusable objects must be willing to deploy technical and managerial talent in a quite unconventional way. Front-loaded, waterfall-based approaches discourage evolution.

The disciplined reuse of object-oriented components, abstract classes, and frameworks stands in stark contrast to the casual reuse of conventional libraries and program skeletons.

Objects evolve as a result of successive refactorings. Languages differ in their suitability for refactoring, and better tools are needed for it.

Structure emerges as objects evolve. There is typically a progression from a casual "white-box" inheritance-based organization to a more mature and refined "black-box" component-based structure. Not all objects complete this journey.
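
A minimal, hypothetical Java sketch of that progression (the Report names are my own inventions): in the early, white-box phase, reuse means subclassing, and reusers must peer at the parent's innards; in the mature, black-box phase, the same role is played by a component hidden behind a narrow interface:

    // White-box reuse: extend and override; the subclass is
    // coupled to the parent's implementation details.
    abstract class ReportWhiteBox {
        void run() { System.out.println(header()); }
        abstract String header();
    }

    // Black-box reuse: compose a component behind a narrow
    // interface; the internals stay hidden.
    interface HeaderSource { String header(); }

    class ReportBlackBox {
        private final HeaderSource source;
        ReportBlackBox(HeaderSource source) { this.source = source; }
        void run() { System.out.println(source.header()); }
    }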


An obvious but important observation: for any given problem, the optimal solution is not the general solution. That said, general solutions, which reveal themselves only as experience with multiple reapplications is gained, may come much closer to optimal than initial conjectures.

Solutions based upon superficial analysis will hew closely to the surface structure of the problem. (For instance, recursive descent vs. LR(k) approaches).

Would a story about the landmarks in the problem and solution spaces diverging as the solution space becomes populated with more interesting objects be helpful?

There is more than an incidental correspondence between the expansionary phase and the high temperature phases in simulated annealing.


The observation that the result of our evolutionary process may, in the end, shed most of its object-oriented baggage should support, and not undermine, our tale...

Thomas Jay Peckish II on Necessity

|

A good sign that a programming language feature is necessary is when a lot of people go to a good deal of trouble to reinvent it in cumbersome ways on top of existing languages...
--Thomas Jay Peckish II

OOPSLA '86 Reflections

| | Comments (1)

OOPSLA 1986 Zeitgeist: Objects are good for: Nice User Interfaces / Distributed Processing


An irony: OOP does a much better job of allowing one to solve "Pascal-ish" problems than does Pascal.

People like "readable" papers. (People evidently thought these were "readable")...

Distribution of library materials is important. Someone called Click Art the only successful attempt so far at distributing a library of anything...


Randy Smith's (XEROX) Alternate Reality Kit was perhaps the single most impressive thing I saw at OOPSLA '86.

The Most Impressive List:

  • Danny Bobrow
  • Henry Lieberman
  • Peter Deutsch
  • Alan Borning

I was taken with Bobrow's notion of Linguistic Imperialism.

Both Kristen Nygaard and Alan Kay gave good talks.

Peter Deutsch observed that it is necessary for us to borrow from the graphic arts (where have I heard this before?). This was the other impressive thing.

Mark Sherman (CMU-ITC) gave a convincing testimonial for MacApp.

Other notions propounded: OOP isn't important; OO Environments are.

Parallelism / Concurrency was underrepresented.


It's one thing to rearrange things, but a new idea, that's Christmas.
--Bob Weir

Further, it is impossible to hoard knowledge. Don't be afraid to be wrong now and then, because you will be anyway. The alternative is that there is no correction in the loop.

Trust thy judgement, for thou art stuck with it in any case, and it is more comfortable to do so.
--Thomas Jay Peckish II


Best Exchange:

Kay: Good ideas don't always scale well...

Lieberman: So what do we do? Just scale the bad ones?


Tim O'Shea is quite funny.

A lesson from OOPSLA: The "big boys" skim nearly everything, and read only things in their specialty...


Software Maintenance: People perceive maintainers sort of as software busboys, cleaning up after the designer and analyst have eaten. Academic work in this area seems to have been retarded by the fact that people seem reluctant to become the world's most famous busboy...
--Thomas Jay Peckish II

On Learning Smalltalk-80

|

A Fundamental Observation: there will be no substitute for just plain knowing how everything works. Once we get to that point, the rest will be simple.

It may be the case that it is necessary to write bad programs before one can write good ones...

"Teams" and Protocols in Smalltalk-80

|

One starts to consider the effect of analyzing groups of cooperating objects, such as the collection hierarchy, or the MVC stuff, relative to outside objects.

Such groups could be:

  • interactively defined
  • informal
  • dynamic

This gets at the problem that work is done by informal "teams" of objects that, taken together, constitute a holon or discrete whole, and present a coherent subset of their total protocol to the world.

Protocols could be formalized as objects distinct from other Organizations in the system.
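
Were one to sketch this notion in Java rather than Smalltalk-80 (every name below is hypothetical), a protocol might be reified as a set of message names, and a "team" asked whether its members, taken together, present that protocol to the world:

    import java.lang.reflect.Method;
    import java.util.*;

    // A protocol, reified as an object of its own.
    class Protocol {
        final Set<String> messages;
        Protocol(String... names) {
            messages = new HashSet<>(Arrays.asList(names));
        }
    }

    // An informal, dynamic team of cooperating objects.
    class Team {
        final List<Object> members = new ArrayList<>();
        void add(Object member) { members.add(member); }

        // Do the members, jointly, cover the protocol?
        boolean presents(Protocol p) {
            Set<String> understood = new HashSet<>();
            for (Object m : members)
                for (Method meth : m.getClass().getMethods())
                    understood.add(meth.getName());
            return understood.containsAll(p.messages);
        }
    }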

About this Archive

This page is an archive of recent entries in the Bits and Bytes category.

Airplane Reading is the previous category.

Patternalia is the next category.

Find recent content on the main index or look in the archives to find all content.


Brian's Links

Homepages

Diarists

Brian Marick
Martin Fowler
Ralph Johnson (That's Completely Bogus!)
Dave Thomas (The Pragmatist)
Glenn Vanderberg
Patrick Logan
Lambda the Ultimate
Joshua Allen (Better Living Through Software)
Mariann Unterluggauer (Motz)
James O. Coplien
Eugene Wallingford
Blaine Buxton
Nickieben Bourbaki
Travis Griggs
Ivan Moore
Mike Mikinkovich
Superboy & Ward
Rebecca Wirfs-Brock
Nat Pryce
Tim Ottinger
Forrest Chang
Gregor Hohpe
Sam Gentile
Robert Hanson
Chad Fowler
Jonathan Edwards
James Robertson
Bruce Eckel
Andrew Stopford
Tully Monster
Grady Booch
Dave's Ramblings
ShiningRay
Solveig Haugland
Dave Hoover
But Uncle Bob
Doug Schaefer
Smallthought
Ted Leung
blog.talbot.ws
The Farm
Ian Clysdale (Random)
Gilad Bracha
Keith Devens
Urbana-Champaign Techophiles
Stefan Lauterer (Tinytalk)
Planet Python
Chris Koenig
Peter Lindberg (Tesugen)
Jason Yip
Sean McGrath
Jeff Erickson (Ernie's 3D Pancakes)
Steve Freeman (Mock Turtle Soup)
hakank (komplexitetemergens)
Deciduous Ponderings
Take One Onion
Project.ioi.st
Ken Schreiner
Hen so.com
Michael Mahemoff (Software as She's Developed)
Tootruthy
Champaign Media Watch
Jason E. Sweat's Weblog (PHP, etc.)
Raymond Lewallen (Code Better)
Keith Ray
Raymond Chen (The Old New Thing)
Neil Gafter
Joe Walnes
Ivan Moore
LD/dazza (Lost in La Manche)
Scott Rosenberg (Wordyard)
Dave Stagner (Sit down and shut up!)
Walter Korman (Lemurware)
Munawar Hafiz (The space between)
Rafael de F. Ferreira (Rafael Rambling)
Mike Hostetler (Where Are The Wise Men)
Jordan Magazine
Andriy Solovey (Software Creation)
Mike Griffiths (Ideas and essays on code development)
Ashish Shiraj (Ashish Kumar -- Software Test Engineer)
Nathaniel T. Schutta (Just a thought...)
Lynn Cherny (Ghostweather R&D Blog)
Dominique Boucher (The Scheme Way)

Powered by Movable Type 5.14-en