Part I of this piece made the case that Refactoring's Original Sin was allowing Refactoring to come to be perceived as purely cosmetic elective surgery. Worse yet, the resulting improvements benefited only the innards of your application, not the parts the users see.
Even in the early nineties, there was a recognition that allowing refactoring to become pigeonholed as an unproductive polishing pass could do an injustice to the cause of the kind of iterative, incremental design process that we'd become accustomed to when working in Smalltalk.
We did make several attempts to characterize this approach as an integrated, iterative, incremental process in which design pervaded the object lifecycle. A few years earlier, Lehman and Belady had observed that while design was an entropy-decreasing process, maintenance was an entropy-increasing process that inevitably eroded structure.
Yet, somehow, the Smalltalk image had avoided this fate, and the development style we'd vicariously absorbed working with Smalltalk was seemingly allowing our code to do so as well. The way this worked was that, as you added new code, you'd keep an eye peeled for opportunities to better integrate it with the code around it. And then ... take them. This casual, continual shift in focus from function to form and back again helped to forestall the kind of death-by-duct-tape that seemed so unavoidable when working on classical code. And I think it is fair to say that it was by cultivating these form-focused improvements to the structure, clarity, and presentation of the code by hand that we learned what refactorings were.
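To make the rhythm concrete, here is a minimal sketch in Python (the scenario and all names are invented for illustration, not drawn from the Smalltalk work described above): a feature is first added by copy-and-paste during expansion, and a consolidation pass then extracts the shared form while preserving the function.

```python
# Expansion: the original report, and a second one added in a hurry
# by duplicating its formatting logic.
def inventory_report(items):
    lines = ["INVENTORY"]
    for name, count in items:
        lines.append(f"  {name}: {count}")
    return "\n".join(lines)

# Consolidation: the shared formatting is extracted into one place...
def format_report(title, rows):
    lines = [title]
    for label, value in rows:
        lines.append(f"  {label}: {value}")
    return "\n".join(lines)

# ...and both reports become thin wrappers around it.
def inventory_report_v2(items):
    return format_report("INVENTORY", items)

def sales_report(figures):
    return format_report("SALES", figures)

# The refactoring preserves behavior: old and new versions agree.
assert inventory_report([("nails", 3)]) == inventory_report_v2([("nails", 3)])
```

The point is not the example's triviality but its timing: the extraction happens as part of the same sitting in which the duplication was introduced, not in a later cleanup project.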
For instance, we trotted out an extended abstract, A Fractal Model of the Lifecycles of Reusable Objects, at a few workshops during the early nineties. Conceived as a wry variation on Boehm's Spiral Model, it sketched a three-phase scheme whereby an Initial Prototype formed the nucleus of a system that was organically grown through a succession of fine-grained, opportunistic Expansion and Consolidation phases. Expansion had an exploratory focus, and explicitly tolerated a certain degree of structural erosion in the cause of mastering functionality.
Consolidation, on the other hand, was a form-focused phase in which opportunities to harvest the benefits of hindsight were exploited to repair any damage done during prior expansions and to ensure the application was poised to address the challenges to come. Anywhere in the universe, counteracting entropy requires an investment of countervailing energy, and it was here that this entropy reduction was done.
By PLoP '94, Bill Opdyke and I had put together a somewhat more detailed treatment of this material that tried to cast Refactoring in the context of this kind of process: Lifecycle and Refactoring Patterns that Support Evolution and Reuse. One goal of this work was to once again underscore the idea that refactoring should normally be an integral part of incremental development, not a coarse-grained recovery project. Which is not to say that such an effort might not be called for if a codebase had gone badly to seed, but that with conscientious cultivation the need for major reclamation efforts could be prevented.
Our PLoP '95 Software Tectonics idea can be seen in this light: entropic stress accumulating in a system risks being released in increasingly severe, power-law-distributed events.
By then Refactoring was poised to become a household word, among hackers, anyway. John Brant and Don Roberts were already working on the first practical automated refactoring tool for Smalltalk. Martin Fowler was soon to embark on a landmark cataloging effort that would, in turn, give Erich Gamma a roadmap that would lead to the refactoring facilities of the Eclipse JDT. And Kent Beck and his cohorts were soon to deem Refactoring one of the pillars of eXtreme Programming.
So, one of the reasons that Martin's TradableQualityHypothesis piece resonated with me so deeply is that this is precisely the kind of perception battle I still find myself in when discussing Refactoring to this day. Our failure to frame it as an integral, fundamental, indispensable element of a continuous, organic design and development process has left code quality, and the kind of poised, responsive, adaptive posture in the face of change that it makes possible, vulnerable to the remorseless whims of the budget axe.
There is still work to be done.