Recently in Bits and Bytes Category

Software in the Age of Sampling


I've just returned from Øredev in Malmö, Sweden where I delivered Software in the Age of Sampling twice. The first time to a healthy turnout, the second to a nearly empty theater, to get a take without demo snafus. The video below is Take Two.

Abstract

Over the last generation or so, software development has changed profoundly, but some of these changes occurred so slowly they have barely been recognized.

Software was once built by skilled but peculiar artisans, who meticulously crafted their original, green-fields commissions from first principles. Today, those frontiers are gone, and very few of us build anything new from the ground up. Instead, existing resources are rehashed, recombined, and remixed to produce "new" mash-ups based upon the work of others. It's collaborative, this 21st century scavenging, and it has become a state-of-the-art approach to software development.

These changes in the way we produce software have much in common with the changes that have transformed the music industry over the last thirty years. The garage band era of original composition has given way to the direct borrowing of scraps of other people's pieces and a cavalier disregard for traditional originality.

This session will explore how software developers in the age of sampling have as much in common with contemporary high-tech music "producers" as they do with traditional engineers.

The Cobbler's Children


The cobbler's children are the last to be shod.

So it is too for software developers these days. I'm reminded of this hoary trope when I think about the state of twenty-first century programming languages and technology, because it is still very much mired in the 95-character ASCII/EBCDIC punchcard-era practices of the fifties and sixties.

[Image: Punchcard]

This reverie was triggered by a chance Twitter sighting of a 'blog post by Canadian computer scientist Greg Wilson on why Donald Knuth's vision of Literate Programming had failed.

Greg's piece resonated with me because it addresses two themes I've trotted out myself on a couple of occasions. Such is the nature of this sort of triggering.

The first is the enduring irony that, despite having delivered all manner of post-ASCII visual GUI bliss and razzle-dazzle to countless users of all descriptions, programmers themselves must still trade in primitive text-based program representations that would be familiar to time-traveling hackers from the Nixon Administration. With a handful of exceptions, today's production code would be quite at home on a keypunch (sans case). These textual accounts of programmer intent remain the canonical, authoritative representations of code, from which any supporting executable binary accounts must be derived. For most of us, for the vast majority, in fact, programs are ASCII characters that live in ASCII files, in directories, which, these days, might be under some sort of source control, but still...

Now, one of Greg's points in his "ASCII" passage was to remind us that Knuth's vision of integrating complex typeset annotations like equations and tables with code, and binding them to the codebase, is even now only clumsily possible, using embedded HTML in "doc" comments.
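
By way of illustration (the example is mine, not Greg's, and deliberately minimal), here is roughly the state of that art: a formula and a table shoehorned into a Javadoc comment as embedded HTML, bound to the code by nothing but hope:

    /**
     * Computes the balance after compound interest accrues.
     * <p>
     * The governing formula, as best HTML can render it:
     * <pre>  A = P (1 + r/n)<sup>nt</sup>  </pre>
     * <table border="1">
     *   <tr><th>Symbol</th><th>Meaning</th></tr>
     *   <tr><td>P</td><td>principal</td></tr>
     *   <tr><td>r</td><td>annual interest rate</td></tr>
     *   <tr><td>n</td><td>compounding periods per year</td></tr>
     *   <tr><td>t</td><td>years</td></tr>
     * </table>
     */
    public static double balance(double p, double r, int n, double t) {
        return p * Math.pow(1 + r / n, n * t);
    }

Nothing binds that table to the parameter list; rename a parameter, and the comment quietly rots.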

And, it's true, we don't do this well. In general, tying diagrams, equations, and commentary together with code, presenting them conveniently alongside it, and making it even remotely possible to maintain them and keep them consistent with changes to the code, is just not something one sees well supported. And given the polyglot nature of so many applications these days, this ASCII common denominator too often places important cross-language artifacts beyond the reach of refactoring tools that might, given a richer representation, have had a better shot at keeping such support material in sync. Think XML configuration, definition, and data files, even schema.
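
To make the cross-language point concrete, consider this hypothetical Spring-style bean definition (the names are invented). To a refactoring tool working from ASCII source, the class name below is just a string:

    <!-- beans.xml: rename PaymentService in the Java sources, and no
         rename refactoring will touch this reference; the breakage
         waits until runtime to announce itself. -->
    <bean id="paymentService" class="com.example.billing.PaymentService">
        <property name="retryLimit" value="3"/>
    </bean>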

[Image: Cobbler]

The cobbler's children lament, then, is born of the envy one feels over the fact that so many of our customers get to enjoy much richer representations of the important objects in their domains than we programmers do.

This theme has been showing up of late when I talk about the ubiquitous Ball of Mud phenomenon. These talks have come to have several "DVD" (select your own) endings, one of which explores the idea that one source of our perception of muddiness is the primitive tools and representations we currently employ to navigate and visualize our codebases.

This, in turn, is partially a result of having worked with refactoring tool developers at UIUC a few years back, where I came to believe that program representation is one of the most interesting and important unsolved problems in programming language research. By this, I meant getting beyond 95-character ASCII into a richer, round-trip, AST-level representation amenable to analysis, refactoring, and annotation. Such representations would need to be cast somewhere beneath text, but above the level of, say, Java .class files.
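
To make the notion slightly less abstract, here is a minimal sketch, in Java, of the kind of annotatable, round-trippable node such a representation might be built from. The design and the names are invented for illustration; a real representation would need to be considerably richer:

    import java.util.ArrayList;
    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;

    // Somewhere beneath text, above .class files: a tree one can analyze,
    // refactor, and annotate, from which the source remains recoverable.
    class AstNode {
        final String kind;    // e.g. "MethodDecl", "Identifier", "Comment"
        final String lexeme;  // source text of a leaf; null for interior nodes
        final List<AstNode> children = new ArrayList<>();
        // Room for the riches: typeset docs, diagrams, analysis results...
        final Map<String, Object> annotations = new HashMap<>();

        AstNode(String kind, String lexeme) {
            this.kind = kind;
            this.lexeme = lexeme;
        }

        // Round-tripping: concatenating the leaves reconstitutes the source.
        String toSource() {
            if (children.isEmpty()) return lexeme == null ? "" : lexeme;
            StringBuilder sb = new StringBuilder();
            for (AstNode child : children) sb.append(child.toSource());
            return sb.toString();
        }
    }

The two essential properties here are the ones text alone lacks: a place to hang annotations, and enough structure to analyze, while the source stays recoverable.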

This is neither a new nor an original idea, but it remains, alas, unfinished business. Ever wondered why refactoring tools for C and C++ are so slow in coming? This is part of the problem.

The second reverie that Greg triggered, which was partially reinforced by another Twitter citation, this Redis Road Map from Paul Smith, was this one:

One of the underlying premises of the now discredited waterfall division of labor between skilled architect/designers and rote programmer/coders was the implicit, presumably self-evident assumption that the code was unreadable, and that design must therefore be conveyed by "higher-level" descriptions like documents and cartoons.

Nowadays, a fundamental article of faith underlying the craftsmanship, clean code, and iterative, emergent design communities is that the code itself can be wrought to carry a dual burden: conveying to the compiler a comprehensive set of execution instructions, while at the same time communicating, at a suitable level of literacy and abstraction, the details and intent of the artifact's design. A lot of stock has been placed in this conceit, so one can only hope it holds up.

It is clear that code can do far better than the woefully minimal degree of adequacy that many shops tolerate. But the question of whether, even at our best, meticulously tended source code will be enough remains an open one.

"Real" Software Engineering


This video of a presentation by Glenn Vanderburg entitled Real Software Engineering came up last week during one of those periodic flurries of contrary opinion on Twitter regarding whether software development is, or is not, engineering. Glenn's 51-minute talk explains why, after having made a painstaking, convincing case that what we do is utterly unlike what any other known engineering discipline does, he nonetheless aligns himself with the "pro" engineering perspective.

Real Software Engineering - Glenn Vanderburg from Engine Yard on Vimeo.

It's a well-prepared and well-delivered piece, and well worth your time. He opens by acknowledging something that anyone who has been in this field for long already knows: that the kind of Software Engineering that has been taught in Computer Science programs for the last forty years has been hopelessly, comically out of touch with day-to-day software development reality.

His opening examination of where "Software Engineering" went astray is particularly compelling; he does so by going back and examining some primary sources. For instance, the legendary 1968 NATO meeting that established the field had some moments that seemingly foreshadowed today's Agile orthodoxy, before heading off into the weeds for a generation the next year. Winston Royce has evidently been saying, for 42 years, that his original waterfall paper has been tragically misunderstood. Glenn makes a good case that this is so. You may, of course, actually read the paper and decide for yourself. Parnas's A Rational Design Process: How and Why to Fake It is here too. Glenn has some fresh background on Parnas's use of the term "rational".

[Image: Galloping Gertie]

I thought I caught a welcome, albeit uncredited, whiff of Petroski in the second part of the talk, where he describes how science, art, mathematics, craft, tradition, and empiricism guide what real engineers really do. And no talk on the limits of engineering would be complete without an appearance from Galloping Gertie.

I particularly enjoyed Glenn's treatment of the perennial misperception of the roles of engineers and coders that the industry inherited from its lengthy flirtation with the waterfall model. This conceit went something like this: the "real" engineering effort involved in engineering software is in the design, not the implementation. Hence, design must be something distinct from, and something more demanding than, mere coding. The software engineer's job, then, was to produce, from a given set of requirements, some artifact that could be "thrown over the wall" to the coders for "construction".

Of course, this analogy is off. The design of the program itself is the part of this process that is akin to automotive or aircraft design. Construction, building, or fabrication is the process of reproducing the shrink-wrapped media, or invoking and executing the application over the web. For aeronautical engineers, fabricating each individual aircraft remains quite expensive. Though software engineering began during an era of pocket protectors and mechanical pencils, when CPU cycles were still scarce, fabrication for us now is essentially free. Given this perspective, Glenn continues, the folks dismissed as blue-collar coders in the waterfall pecking order are the real engineers, because engineering is about making stuff that works. And it is with this contention that Vanderburg rests his case.

Which is fine, as far as it goes. I guess, after all that, I feel less obligated to align myself with the engineering fraternity than does Glenn, given how different making software turned out to be from the other disciplines he showcased, but that's probably a matter of taste. I'm just not sure I have a dog in this fight. There are lots of disciplines besides engineering that deliver stuff that works: cinema, pharmaceuticals, agriculture, bakeries, blacksmiths, composers, ... I could go on. What might we yet learn by analogy from disciplines outside our mathematics and engineering roots?

Of course, the other great unspoken truth about the software engineering tradition in Computer Science has been that software engineering has always really focused on the admittedly considerable challenges associated with managing, organizing, and yes, even leading a large, infantry-scale industrial organization whose objective is to produce software, often to the detriment of the more technical issues of interest to those in the trenches.

Ironically, one of the successes of the Agile movement has been to encourage the emergence of more autonomous, "commando"-scale units within the kinds of World War II-era waterfall shops that had dominated the industry.

Indeed, these hierarchical corporate traditions, the clashes of these kinds of cultures with the agile mindset, and the daunting demands of scale are all issues that might have merited additional examination, and that continue to contribute to the perception that software engineering is out of touch.

Refactoring's Original Sin: Part II

[Image: Makeover]

Part I of this piece made the case that Refactoring's Original Sin was having allowed Refactoring to come to be perceived as completely cosmetic elective surgery. Worse yet, the resulting improvements benefited only the innards of your application, not the parts the users see.

Even as early as the early nineties, there was a recognition that allowing refactoring to become pigeonholed as an unproductive polishing pass could do injustice to the cause of the kind of iterative, incremental design process that we'd become accustomed to while working in Smalltalk.

We did make several attempts to characterize this approach as an integrated, iterative, incremental process in which design pervaded the object lifecycle. A few years earlier, Lehman and Belady had observed that while design was an entropy-decreasing process, maintenance was an entropy-increasing process, one that inevitably eroded structure.

Yet, somehow, the Smalltalk image had avoided this fate, and the development style we'd vicariously absorbed working with Smalltalk was seemingly allowing our code to do so as well. The way this worked was that, as you added new code, you'd keep an eye peeled for opportunities to better integrate it with the code around you. And then ... take them. This casual, continual shift in focus from function to form and back again helped to forestall the kind of death-by-duct-tape that seemed so unavoidable when working on classical code. And I think it is fair to say that it was by cultivating these form-focused improvements to the structure, clarity, and presentation of the code by hand that we learned what refactorings were.

For instance, we trotted out an extended abstract A Fractal Model of the Lifecycles of Reusable Objects at a few workshops during the early nineties. Conceived as a wry variation on Boehm's Spiral Model, it sketched a three phase scheme whereby an Initial Prototype formed the nucleus of a system that was organically grown through a succession of fine-grained, opportunistic Expansion and Consolidation phases. Expansion had an exploratory focus, and explicitly tolerated a certain degree of structural erosion in the cause of mastering functionality.

Consolidation, on the other hand, was a form-focused phase in which opportunities to harvest the benefits of hindsight were exploited to repair any damage done during prior expansions, and to ensure the application was poised to address the challenges to come. Anywhere in the universe, counteracting entropy requires an investment of countervailing energy, and it was here that this entropy reduction was done.

By PLoP '94, Bill Opdyke and I had put together a somewhat more detailed treatment of this material that tried to cast Refactoring in the context of this kind of process: Lifecycle and Refactoring Patterns that Support Evolution and Reuse. One goal of this work was to once again underscore the idea that refactoring should normally be an integral part of incremental development, not a coarse-grained recovery project. Which is not to say that such an effort might not be called for if a codebase had gone badly to seed, but that with conscientious cultivation the need for major reclamation efforts could be prevented.

[Image: Software Tectonics]

Our PLoP '95 Software Tectonics idea can be seen in this light: entropic stress accumulating in a system risks release with increasingly severe power-law consequences.
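
To gloss the seismic analogy (the gloss is mine, not the paper's): the Gutenberg-Richter law has the frequency of earthquakes falling off as a power of the energy they release, roughly N(s) ∝ s^(-α) for releases of size s. Small slips are frequent and cheap; the great ruptures are rare but ruinous. The longer entropic stress accumulates unrelieved by consolidation, the further out that tail you risk landing.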

By then, Refactoring was poised to become a household word, among hackers, anyway. John Brant and Don Roberts were already working on the first practical automated refactoring tool for Smalltalk, and Martin Fowler was soon to embark on a landmark cataloging effort that would, in turn, give Erich Gamma a roadmap that would lead to the refactoring facilities of the Eclipse JDT. And Kent Beck and his cohorts were soon to deem Refactoring one of the pillars of eXtreme Programming.

So, one of the reasons that Martin's TradableQualityHypothesis piece resonated with me so deeply is that this is precisely the kind of perception battle I still find myself in when discussing Refactoring to this day. Our inability to have framed it as an integral, fundamental, indispensable element of a continuous, organic design and development process has left code quality, and the kind of poised, responsive, adaptive posture in the face of change that it makes possible, vulnerable to the remorseless whims of the budget axe.

There is still work to be done.

Brownfield Software


I spent an informative 51 minutes yesterday watching this InfoQ video presentation on Brownfield Software: Industrial Waste or Business Fertilizer by Josh Graham of Thoughtworks.

Josh describes an intensive, heroic effort to bring a large, festering, vintage 1997 legacy Java system under control.

Among the highlights: his admonition that, no matter how great the temptation may be, you should avoid the urge to rip out those homebrew frameworks and replace them with more modern, standard ones.

The testimonials from the theretofore downtrodden hackers who'd been pinned down in the trenches on this project were touching as well.

Oh yeah, and they used AOP too. For logging, naturally....

They also spoke of the indispensability of engineers with superior textual skills in an environment like this, which, I gather, means the kinds of folks who for whatever reason are able to make some sense out of code most of us would find hopelessly inscrutable.

Kakonomics


Came across this blog entry by Gloria Origgi, courtesy of a tweet from @ronjeffries on Kakonomics: or The Strange Preference for Low-Quality Outcomes.

The piece opines that in many systems, a state of collusive mediocrity can emerge and persist between agents, whereby an exchange of low-quality outcomes and minimal rewards is mutually tolerated as part of an Evolutionarily Stable Strategy.

It's easy to imagine this sort of low-expectations relationship emerging between an IT organization and mud maintainers, and how it might account for the kind of neglect some codebases must seemingly endure.

The relationship between coder and code itself might lend itself to this sort of analysis, along the lines of Joshua Kerievsky's notions of Sufficient Design.

One wishes the author had opted for a different name. I can almost hear every preschooler (and preschooler-at-heart) in the room giggle each time I read it.

Refactoring's Original Sin: Part I


Before there was Refactoring there was refactoring. There were people who refactored just fine before Refactoring was invented.

I remember right well.

I remember back during the late 1980s, when the OOPSLA-era Object boom was in full bloom. I had joined a new research group led by Ralph Johnson at the University of Illinois at Urbana-Champaign. And some of us were availing ourselves of the privilege of getting the best apprenticeship in object-oriented programming a body could ever get: spelunking around the PARC Smalltalk-80 codebase.

[Image: Gauguin and Tahiti]

Here we found code that had clearly been crafted by insanely bright, committed pioneers who had stuck with it, lived amongst it, inhabited it, as they acknowledged themselves, polished it and cultivated it. And incrementally improved it. It was clear to anyone who visited the place, and it felt like a place, that the kind of code you were browsing was the fruit of a patient, sustained, collaborative process of cultivation. All the code was there. You were free to roam, and explore as you pleased.

It was a place that enticed you to join it. Smalltalk development enticed one to "go native", like Gauguin's Tahiti. And you grew and cultivated your code next to theirs, and when you saw an opportunity to improve it, you did. And your code grew better too.

Of course, even then, during the dusk of the Cold War, not every working programmer left every first draft untouched. If you were overconscientious, you might routinely rework such drafts as part and parcel of bringing an implementation into being. Sometimes these ideas would come into focus only once the code was actually on-the-air. In those days, when you wanted to improve the design of a chunk of existing code, you said you were going to clean it up, or take a final polish pass over it, or maybe tidy it up a bit. An enlightened boss might tolerate such an indulgence at the end of a successful push. But even then, the suspicion that you might be engaging in self-indulgent lily-gilding was never far behind.

Often, your focus would be strictly on improving the code's clarity, structure, or aesthetics, without changing the way it worked at all. By analogy with number theory, one could casually describe a reworked or improved version as being better factored than its predecessor. It became natural to describe the newer take as a refactored version of the older one, and more broadly, to describe this kind of focused improvement activity as refactoring. Still, this was most often less a distinct phase than an acute, opportunistic shift of focus. It was typically part of preparing the groundwork for some subsequent functional improvement.

And so it came to pass that in the early days of the reign of George Bush the Elder, Bill Opdyke and Ralph Johnson set about the task of cataloging some of these Things You Could Do to Programs that Didn't Change The Way They Worked But Made Them Better, with an eye towards studying them more formally, and seeing whether they could be automated. And refactoring became Refactoring, with a Capital-R.

A Refactoring became a much more narrowly cast behavior-preserving program transformation, which served their research agenda fine, but cast the broader, more casual usage to the side. Refactoring, the activity, had henceforth to be one in which only such sorts of transformations were to be applied. A distinct activity.
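
For the record, the narrow, Capital-R sense looks something like this Extract Method, sketched here in Java with invented names; the observable behavior before and after is identical:

    import java.util.List;

    class Invoicing {
        record LineItem(double price, double quantity) {}

        // Before: one method doing two jobs.
        static double invoiceTotalBefore(List<LineItem> items) {
            double total = 0;
            for (LineItem item : items) {
                total += item.price() * item.quantity();
            }
            return total * 1.0825;  // fold in the (invented) sales tax
        }

        // After: identical observable behavior, better factored.
        static double invoiceTotalAfter(List<LineItem> items) {
            return applySalesTax(subtotal(items));
        }

        static double subtotal(List<LineItem> items) {
            double total = 0;
            for (LineItem item : items) {
                total += item.price() * item.quantity();
            }
            return total;
        }

        static double applySalesTax(double amount) {
            return amount * 1.0825;
        }
    }

Everything else, the judgment about when and why such steps are worth taking, was left behind with the lowercase word.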

Subsequent authors, such as Martin Fowler (Refactoring) and Joshua Kerievsky (Refactoring to Patterns), have probably inadvertently reinforced this impression with their detailed focus on the surgical tactics and skills required to accomplish particular Refactorings, at times seemingly at the expense of broader, more holistic therapeutic approaches to day-to-day codebase health maintenance.

[Image: Adam and Eve]

So, what was Refactoring's Original Sin? Letting the idea be framed as a distinct phase, a polishing pass, an optional brush-up, where working code was turned into "higher quality" working code that worked the same way as the original working code. Framed this way, it could sound like an utterly optional spasm of indulgent, flamboyant, narcissistic navel-gazing by self-indulgent prima donna programmers, rather than an inherent, indispensable, vital element of a process that cultivated the growth and sustained health of both the codebase and the team.

The problem is not that "refactoring" is a bad name for Refactoring proper, but that the idea has always cast a wider shadow, in part because of its gestation in precisely the kinds of iterative, incremental incubators we nostalgically recalled above.

As a result of having been framed in this fashion, the perception persists to this day that refactoring is a distinct, coarse-grained, and dispensable luxury, rather than a natural, inherent, integral part of a healthy, sustainable development process.

The problem is that the refactoring process in which Refactoring plays this indispensable role, and the value of the resulting code, have never really been properly framed.

And a consequence of this is that refactoring itself becomes a casualty in precisely the kinds of TradableQualityHypothesis triage games that the Martin Fowler piece that triggered this one so eloquently showcased.

Thomas Jay Peckish II on 21st Century Software Engineering


Open-Closed Principle? OCP? Make that private and Young Hakaz gonna copy-paste that pretty little bonsai meditation garden of yours and do what they want to it. Encapsulate this, yo.
--Thomas Jay Peckish II

Yo Hakaz! I chased it / and I faced it / and I cut it / and I pasted / I could taste it / 'til I wasted it.
Peace.

--Thomas Jay Peckish II

Thomas Jay Peckish II on Life and Death


Skills are important. A surgeon screws up, a person dies. A programmer screws up, a process dies. The stakes are identical. Unless you think a process is worth less than a human life...
--Thomas Jay Peckish II

Thomas Jay Peckish II on Programmers


All too many programs are made in the images of the people who create them: complex, technically sound, but often woefully ineffective at communicating with human beings. Why are we surprised?

--Thomas Jay Peckish II

The tweet looked like:

Too many programs are made in the images of those who create them: complex, technically sound, but poor at communicating with human beings.

which may actually be a tad better...

 

The continuing controversy over Scrum Certification evidently inspired the following "press release", which was circulating at the Agile 2007 Conference this year. While its authenticity, or, to be more precise, its lack thereof, is no longer in dispute, you may nonetheless find it amusing.

REVOLUTIONARY NEW PROCEDURE DECREASES SCRUM MASTERBATION

Press Release, 15 August 2007

*REDACTED*, Inc., is pleased to announce that we will be able to offer safe, somewhat affordable Scrum Circumcision™, for the first time, to our preferred clients, beginning in September of 2007.

Scrum Circumcision™ is perfect for the software professional who is looking for a way to distinguish himself from, shall we say, the unwashed masses.

This revolutionary procedure entails the surgical removal of the patient's foresight. The effects are similar to a traditional XP lobotomy, though Scrum Circumcision™ is typically considerably less invasive.

Research has demonstrated that excising one's foresight can reduce premature generalization, improve code hygiene, and increase pairing satisfaction.

Customers usually report that they feel their XP-ness is dramatically enhanced after undergoing this procedure, and often experience stronger, more robust iterations.

Warning: stubbornly retaining your foresight may cause you to be subjected to ridicule and derision by your peers. Trim that foresight now, and transform yourself into a peak performer.

Act today. Our certified Mohelim are equipped with the latest, most modern tools available, and are waiting for your call. Remember, this procedure becomes more painful the longer you wait.

Foresight: You're Not Gonna Need It™.

*REDACTED* president *REDACTED* denounced the faux press release as a "crass, puerile fabrication". Said *REDACTED*:

While we have mixed feelings about the necessity for formal certification procedures for any agile methodology, we see no reason to question or mock the motives of anyone advocating such credentialization. Frankly, I'm disgusted.

Regrettably, it appears this fabrication may have been the work of one of our own employees. Should this turn out to be the case, we assure you that the individual responsible will be sternly reprimanded.

Thomas Jay Peckish II on Metaphor


When it comes to artifacts fashioned from bits, metaphor is destiny...
--Thomas Jay Peckish II

Thomas Jay Peckish II on Finding the Last Bug


Bugs are like prime numbers. You can never really find the last one, but after the first billion or so, they start to thin out a little bit...
--Thomas Jay Peckish II

Thomas Jay Peckish II on Refactoring


We'll know that refactoring has really arrived when Microsoft starts marketing Word as a refactoring tool for your prose...
--Thomas Jay Peckish II


You know, over the last couple of days since I've posted this hoary quip, it's occurred to me that there actually may be something to the idea of bringing refactoring tool ideas into the realm of prose cultivation. I wonder if there are tools like this out there already?


Indeed, witness this observation on this topic...

The Mind of the Theoretician


I have struggled for, let's say, way longer than is good for a person, to comprehend the mind of the theoretical computer scientist. I've met with mixed success at best over the years, but this post offers more insight than most into the monastic deliberations of this secular priesthood...

A postscript: I've been casually browsing "Ernie" occasionally for the last three or four months, but I'd never hunted around enough to discover that Ernie's Geppetto is UIUC DCS's own Jeff Erickson. I'm reminded of one of those horror stories that ends with "run for your life, that phone call was made from inside the building!". I might add that I completely concur with Steve Freeman's observation that Erickson's work in this area is likely applicable, with appropriate revision, to conference committees in general.

Freedom from Choice


In ancient Rome
There was a Poem
About a Dog
Who found two Bones
He picked at One
He licked the Other
He went in Circles
He dropped Dead
--Devo