Wednesday, April 15, 2009

Gdb set input-radix

Today I messed up a debugging session when I forgot the gdb syntax for setting the value of a variable in the program being debugged. It wouldn't have been a problem for most variables, but I wanted to set the variable i. I typed:
  (gdb) set i 100
The surprising response to that -- from gdb version 6.3.0.0-1.132.EL3rh -- was:
  Input radix now set to decimal 100, hex 64, octal 144.
Say what? Of course the correct syntax for setting a variable in the debug program is:
  (gdb) set var i = 100
You see, gdb was trying to be nice by interpreting "i" as shorthand for "input-radix", a sub-command of "set". If my variable had had some other name that didn't trigger the shorthand, I would have gotten a friendly error message and no ill effects:
  (gdb) set flags 100
A syntax error in expression, near `100'.
Having the numbers you enter interpreted as base-100 is not very useful. I ended up killing my session and restarting, because I didn't figure out the trick for setting the input-radix back to 10 until I had gotten gdb into what seemed to be a useless state. The actual sequence of events went like this:
  (gdb) set i 100
Input radix now set to decimal 100, hex 64, octal 144.
(gdb) set i 10
Input radix now set to decimal 100, hex 64, octal 144.
(gdb) # Huh? Oh yeah, base100("10") == base10("100").
(gdb) set i 1
Nonsense input radix ``decimal 1''; input radix unchanged.
(gdb) # D'oh! base100("1") isn't 10, it's 1.
(gdb) set i 10
Invalid number "10".
(gdb) # Oops!
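The arithmetic behind that comedy of errors is easy to model. The function below is a hypothetical Python sketch of positional digit interpretation -- not gdb's actual code -- showing why "10" typed under radix 100 comes out as one hundred, and why "1" can never mean ten:

```python
# Hypothetical sketch (not gdb's source): interpret a string of decimal
# digit characters positionally under an arbitrary input radix.
def parse_in_radix(s, radix):
    """Each character is a decimal digit 0-9; positions are weighted by radix."""
    value = 0
    for ch in s:
        digit = int(ch)
        if digit >= radix:
            # Mirrors gdb's 'Invalid number' complaint: the digit doesn't
            # exist in this base (e.g. the digit 1 under radix 1).
            raise ValueError(f"digit {digit} invalid in radix {radix}")
        value = value * radix + digit
    return value

print(parse_in_radix("10", 100))  # 100 -- so "set i 10" just re-requests radix 100
print(parse_in_radix("1", 100))   # 1   -- and "set i 1" asks for radix 1
```

Under radix 1 the only valid digit is 0, which is why gdb then rejected "10" outright.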
Notice that even though it said the radix was unchanged, gdb really did change the radix to base-1! At that point, I thought I was toast, since it couldn't recognize any numbers except 0. I killed my session, losing my breakpoints and history. Later I realized that there is a way out: use a hex number:
  (gdb) set i 0xa
Input radix now set to decimal 10, hex a, octal 12.
Now, set input-radix could be a handy little thing if it were restricted to values you might actually want to use as the base for numbers you type into the debugger. Say, 2, 8, 10, and 16. But 100? There's not even a way to type digits with values between 16 and 99: gdb recognizes "F" as 15 -- as long as there's no variable called "F" in the debug context -- but it doesn't recognize "G" as 16. You can even set the input radix to 0! Watch this:
  (gdb) set i 0x0
Input radix now set to decimal 4294967295, hex ffffffff, octal 37777777777.
The corresponding set output-radix command is restricted to 8, 10, and 16. Base 2 would have been nice, but whatever. It's ridiculous that set input-radix allows anything at all, including 0 and 1.
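The F-but-not-G behavior comes down to which characters count as digits. Here's a minimal Python sketch of such a digit scanner (a hypothetical model, not gdb's source) that knows only 0-9 and the hex letters a-f, which is why bases above 16 buy you nothing:

```python
# Hypothetical sketch: a digit scanner limited to 0-9 and a-f,
# modeling why digit values 16-99 can't be typed at all.
def digit_value(ch):
    if ch.isdigit():
        return int(ch)
    if ch.lower() in "abcdef":
        return ord(ch.lower()) - ord("a") + 10
    raise ValueError(f"unrecognized digit {ch!r}")

print(digit_value("F"))  # 15 -- the largest single digit you can type
# digit_value("G") raises ValueError: there is no digit with value 16
```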
