Mailing List Archive



Re: [tlug] Giving a program priority briefly



 > Curt Sampson writes:

 > > One particularly interesting point to me, that's perhaps not obvious, is
 > > that I've found that the refactoring generally comes when I'm about to
 > > add new stuff, not after it's added. I find it's when you're faced with
 > > adding new features that you most clearly realize the problems with the
 > > current organization of the code.

Doesn't that lead to precisely what I said?

Darren Cook writes:

 > Yes. You could argue that refactoring before you know what
 > functionality you will need next is like optimizing without first
 > profiling your code.

Nonsense.  Some optimizations are obvious and cost nothing in
readability, as are some refactorings.  It is well known that human
beings normally maintain "handles" to 7 plus/minus 2 "chunks" of
information.  When you arrange those chunks in a sequence, you'll have
6 plus/minus 2 "joins", and as you look at the joins, you will see
structure ("this really doesn't need to be a single function" ->
refactor!) or redundancy ("X in this part starts at the same value
that Y ends with in the last part" -> optimize!).  It's silly to make
proclamations like this; a substantial fraction of the work done by
most people will violate them.
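
To make the two "joins" concrete, here is a minimal Python sketch (my
own example, not from the thread) of the redundancy case: while
reading adjacent chunks of a loop you notice that the value one
iteration ends with is exactly the value the next iteration recomputes,
so you carry it across the join instead.

```python
def path_length_naive(points):
    """Sum of distances between consecutive (x, y) points."""
    total = 0.0
    for i in range(len(points) - 1):
        x1, y1 = points[i]       # redundant: this was points[i + 1] last time
        x2, y2 = points[i + 1]
        total += ((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5
    return total


def path_length(points):
    """Same result; each point is unpacked once, not twice."""
    total = 0.0
    x1, y1 = points[0]
    for x2, y2 in points[1:]:
        total += ((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5
        x1, y1 = x2, y2          # the join: this end is the next start
    return total
```

The rewrite is "costless in readability" in exactly the sense above:
nothing about the function's meaning changes, only the duplicated
work at the seam between chunks disappears.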

Note that Darren and Curt are talking about entirely different levels
or scales of work (although the "7 plus/minus 2" formulation can
encompass both with some degree of degradation).  One does not plan
capabilities required for unknown features in 3 minutes of design (at
least not with any degree of usefulness); one does not do profiling in
5 minutes of optimization -- the profiling itself will take 10.

Time and motion studies[1] show that engineers do focus fairly well on
one of those functional areas at one level of structure for up to
about 10 minutes at a time.  But guess what?  They also show that you
can't *stop* them from focusing in that way!  That's just the way the
brain works, it would seem.  It also seems to be the case that for
many problems, the "refactor-add features-optimize" order is natural,
but not universally so -- and Curt's interjection bears that out.  (It
is imposing such an order that I referred to as the "waterfall model".)

The thing that impresses me is that not one of you mentions
*specification* and in particular *listening to clients of your code*.
That lack evidently makes Darren nervous, though.  He can't bring
himself to mention it by name, but it's there:

 > Stepping back in to the real world though, if you need to publish
 > an API that programmers outside your immediate control will use,
 > then some pre-release refactoring based on guessing future needs
 > may be a good idea.

About which Victor Vyssotsky says: "[Developers] won't tell you they
don't understand [the specification]; they will happily invent their
way through the gaps and obscurities."

My conclusion is that the name of the game is specification.  For the
rest of the work, do what comes naturally and it will work out.  Of
course there are best practices, like profiling before and after an
optimization, that you can learn.  But rather than discussing "the
natural order of tasks" and the like, the important thing is to *plan*
what you're going to do in the light of *research* into needs.  At the
sub-hour level of task, both research and planning can be implicit and
often undocumented, and as for the work

                              Just Do It

But at higher levels, effective workers generally specify, plan, do,
with formal documentation at each step.
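
The "profile before and after" practice mentioned above can be
sketched in a few lines with Python's standard-library profiler.  This
is my own illustration, not anything from the thread; the function
names are invented.

```python
import cProfile
import io
import pstats


def slow_sum_of_squares(n):
    """Deliberately naive loop, standing in for the code under study."""
    total = 0
    for i in range(n):
        total += i * i
    return total


def profile(fn, *args):
    """Run fn under cProfile; return its result and a stats report."""
    pr = cProfile.Profile()
    result = pr.runcall(fn, *args)
    buf = io.StringIO()
    pstats.Stats(pr, stream=buf).sort_stats("cumulative").print_stats(5)
    return result, buf.getvalue()


# Measure first, then optimize, then measure again with the same
# harness -- the point is comparing both sides of the change.
result, report = profile(slow_sum_of_squares, 10_000)
```

Comparing the two reports, rather than trusting intuition about where
the time goes, is precisely what distinguishes a measured optimization
from a guessed one.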


Footnotes: 
[1]  Now about 25 years old, I would guess, so there may be more
recent, better research that invalidates my argument.


