Niels Möller nisse@lysator.liu.se writes:
> hex_digits[17] = "0123456789abcdef";
That looks ugly but +1, IMHO.
IIRC, I've seen it used like this too:
hex_digits[17] = "0123456789abcdef\0";
A less idiomatic but tidier approach would be
hex_digits[16] = { '0', '1', '2', ... 'e', 'f' };
I'm hoping no compiler complains about missing ASCII NUL in a "string" defined that way.
Several base64 implementations use the last approach, but mostly for EBCDIC compatibility rather than to silence false-positive compiler warnings. Does Nettle care about EBCDIC targets? Gnulib's base64 code has the snippet below, but it assumes 'char' is 8-bit.
/Simon
/* With this approach this file works independent of the charset used
   (think EBCDIC).  However, it does assume that the characters in the
   Base64 alphabet (A-Za-z0-9+/) are encoded in 0..255.  POSIX
   1003.1-2001 require that char and unsigned char are 8-bit
   quantities, though, taking care of that problem.  But this may be a
   potential problem on non-POSIX C99 platforms.

   IBM C V6 for AIX mishandles "#define B64(x) ...'x'...", so use "_"
   as the formal parameter rather than "x".  */
#define B64(_)			\
  ((_) == 'A' ? 0		\
   : (_) == 'B' ? 1		\
   : (_) == 'C' ? 2		\
   : (_) == 'D' ? 3		\
   ...
   : (_) == '8' ? 60		\
   : (_) == '9' ? 61		\
   : (_) == '+' ? 62		\
   : (_) == '/' ? 63		\
   : -1)

signed char const base64_to_int[256] = {
  B64 (0), B64 (1), B64 (2), B64 (3),
  ...
  B64 (252), B64 (253), B64 (254), B64 (255)
};