What was one of the first things you were taught when learning to program?  “Comment Your Code!”  And of course, like all programming students, you ignored that advice.  Or, if you are like me, you made vague comments along the lines of “variable called var.” Tonight I opened up some code I haven’t touched in two years.  Code that, when I wrote it, made perfect sense to me… at the time. The code was for my binary clock project, BinBoo.  So I need your help: check out the code below and see if you can help me remember what it does!

The Code


// Not final code used!
int calculateTimeBits() {
  int hours = hourFormat12();
  int minutes = minute();
  int timeLEDs=0;

  timeLEDs = hours;
  timeLEDs = hours << 8;

  timeLEDs += (minutes/10)<<4;
  timeLEDs += minutes%10;

  if (isPM())
    bitSet(timeLEDs, 11);

  return timeLEDs;
}
Just kidding.  After about an hour, I figured it out.  The frustrating thing?  After I remembered what I was trying to do, I checked, and I had an Evernote note that I wrote about it.  I just never bothered to comment the actual code.

Barrel Shifting

The idea behind the code was to represent each LED of my Binary Clock in a 16-bit variable.  The function “calculateTimeBits” would take the current time, convert it to “bits” which another function would read to set the state of each LED.  On the front of the clock there are four columns, each representing a digit of time.  Some would call this a BCD (binary-coded decimal) clock. Figure 1 shows the columns and the binary weights of each light.  The time shown in this image is “20:08”.  (That’s 8:08PM).
BinBoo’s Front Panel, Annotated
  Since I had enough LEDs and BinBoo’s control board had enough I/O, I wanted to convert the time into BCD. The problem was, I couldn’t figure out why my LEDs were getting scrambled.  Then it hit me: I was loading the bits in one direction and reading them back out in the other.  So my display was correct, except it read MM:HH and not HH:MM!

Properly Commented

Here’s the new code.  In addition to loading the data correctly, I added some comments so I can avoid this problem the next time I want to make a change.  Hopefully that won’t be for another couple of years.

// "Correct" code
int calculateTimeBits() {
  int hours = hour();   // time will be displayed in "military" (24-hour) time
  int minutes = minute();
  int timeLEDs = 0;     // 16-bit variable, one bit per LED

  // get the ones digit of minutes
  // XXXX XXmm
  timeLEDs = (minutes % 10);
  // move it over:  XXXX mmXX
  timeLEDs = timeLEDs << 4;

  // get the tens digit of minutes and combine with the previous
  // XXXX mmMM
  timeLEDs += (minutes / 10);
  // move it over:  XXmm MMXX
  timeLEDs = timeLEDs << 4;

  // get the ones of hours and combine
  // XXmm MMhh
  timeLEDs += (hours % 10);
  // move it over:  mmMM hhXX
  timeLEDs = timeLEDs << 4;

  // finish up with the "tens" of the hour
  // mmMM hhHH
  timeLEDs += (hours / 10);

  return timeLEDs;
}
Much better to read, isn’t it?  Visually, you might think, “Hey, that’s still backwards!”  Well, that’s okay, because when it gets displayed, the bits are “reversed.”  But that code is for another post. What’s some code you later regretted not commenting?  Or, what are your favorite commenting tips?
Author

Fan of making things beep, blink and fly. Created AddOhms. Stream on Twitch. Video Host on element14 Presents and writing for Hackster.IO. Call sign KN6FGY.

2 Comments

  1. Quite true, and a lesson for all budding programmers: if you are working from a reference (book, website, etc), make sure you give that work credit so the next poor slob who has to maintain your code has a chance of understanding your algorithm. If you have no work to cite, explain the algorithm before you code it. It isn’t as critical in a higher-order programming language like C, C++, Python, Java et al; but it’s ESSENTIAL if you have to write in assembler.

    I’ve spent my entire 40-year IT career dealing with assembler code on various platforms. You don’t need to comment every instruction, but you should explain your algorithm in a “concept block” prior to its first instruction. This also serves a purpose in higher-order languages, just not as vitally as with machine code (particularly in an instruction set architecture with which you may not be completely expert!).

    Always consider that next “poor slob” who will come after you and will have to fix/enhance your code. It always happens, given enough time.

    I’ll never forget one contract engagement I walked into about 30 years ago: the programming team I’d been hired to lead knew about ten instructions and (ab)used every one of them. When I noticed that these code fragments had no comments to them, I blew sky high. The ex-team lead (whom I was replacing) said “we didn’t wanna waste the storage space”. WTF? You mean you’d rather someone scratch their head for a couple days figuring out what you were trying to do than ‘waste’ maybe 60 bytes telling them? SMH. It’s a strange world out there.

  2. Spotted this on a T-shirt :-

    Comments?
    If code is hard to write, it should be hard to read!
