Just discovered a curious quirk in C#'s handling of the char type. Most of the time you can think of it as a character type, a single character within a string, but actually it isn't. It has an implicit conversion to int, but not to string, so
Console.WriteLine('A' + 'B');
prints 131 (65 + 66), not AB.
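A quick sketch of the behaviour, including two of the usual ways to get string concatenation instead (bringing a string into the expression, or calling ToString):

```csharp
using System;

class CharDemo
{
    static void Main()
    {
        // char converts implicitly to int, so + does integer addition:
        Console.WriteLine('A' + 'B');            // 131 (65 + 66)

        // Involving a string switches + to concatenation:
        Console.WriteLine("" + 'A' + 'B');       // AB
        Console.WriteLine('A'.ToString() + 'B'); // AB
    }
}
```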
I ran into a problem with this because I am currently working on a project that has to read and write lots of INI files. I've written a set of classes for handling them, but rather than call the Convert class every time I read a number, and ToString every time I write one, I created an AutoValue class that carries out all the conversions automatically. This lets me write code like
string some = ini["asection"]["some"];
decimal other = ini["asection"]["other"];
ini["asection"]["other"] = other + 1;
It does this by storing the value internally as a string, providing a constructor for each type, and defining implicit conversions in and out of each type. However, it went wrong when I tried to do something like
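A minimal sketch of that design, assuming the class looks roughly like this (the member names and the exact set of supported types here are my illustration, not the actual code):

```csharp
using System;
using System.Globalization;

// Stores every value as a string and converts on the way in and out.
public sealed class AutoValue
{
    private readonly string _raw;

    // A constructor per supported type.
    public AutoValue(string s)  { _raw = s; }
    public AutoValue(int i)     { _raw = i.ToString(CultureInfo.InvariantCulture); }
    public AutoValue(decimal d) { _raw = d.ToString(CultureInfo.InvariantCulture); }
    public AutoValue(char c)    { _raw = c.ToString(); }

    // Implicit conversions out of each type.
    public static implicit operator string(AutoValue v)  => v._raw;
    public static implicit operator int(AutoValue v)     => int.Parse(v._raw, CultureInfo.InvariantCulture);
    public static implicit operator decimal(AutoValue v) => decimal.Parse(v._raw, CultureInfo.InvariantCulture);
    public static implicit operator char(AutoValue v)    => v._raw[0];

    // Implicit conversions in.
    public static implicit operator AutoValue(string s)  => new AutoValue(s);
    public static implicit operator AutoValue(int i)     => new AutoValue(i);
    public static implicit operator AutoValue(decimal d) => new AutoValue(d);
    public static implicit operator AutoValue(char c)    => new AutoValue(c);
}

class Demo
{
    static void Main()
    {
        AutoValue v = 42;         // implicit conversion in
        int n = v;                // implicit conversion out
        decimal d = v;
        Console.WriteLine(n + 1); // 43
        Console.WriteLine(d);     // 42
    }
}
```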
var c = ini["something"]["else"];
bool b = c == 'A';
because the compiler converted c to an int, even though there is also an implicit conversion to char. I had to change it to
bool b = (char)c == 'A';
to make it use the correct one.
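The pitfall can be reproduced without the INI machinery. Here Val is a hypothetical stand-in for AutoValue with just the two relevant conversions; overload resolution prefers the built-in int == int operator (char converts to int more readily than to anything else), so the string gets parsed as a number at runtime:

```csharp
using System;

// Minimal stand-in for AutoValue: string storage plus implicit
// conversions to int and char (illustrative, not the real class).
sealed class Val
{
    private readonly string _raw;
    public Val(string s) { _raw = s; }
    public static implicit operator int(Val v)  => int.Parse(v._raw);
    public static implicit operator char(Val v) => v._raw[0];
}

class Demo
{
    static void Main()
    {
        Val c = new Val("A");

        // bool b = c == 'A';    // compiles, but picks int == int, so it
        //                       // calls int.Parse("A") and throws
        //                       // FormatException at runtime

        bool b = (char)c == 'A'; // the cast selects the char conversion
        Console.WriteLine(b);    // True
    }
}
```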