Standard and User-Defined Conversions
~ 3 minute read.
I just came across a surprising bug in my code which I did not expect to cause any problem at all. The simplified code goes like this:
```cpp
void foo(const std::string& str) { printf("string!"); }
void foo(const bool b) { printf("bool!"); }

// ...

foo("0");
```
What do you expect? Dumb question, sure, I wouldn't be mentioning this if it simply printed `string!`.

It indeed prints `bool!`, because the literal `"0"` (aka `const char[2]`) is converted to `const char*` and then to `bool` using standard conversions¹, which are therefore prioritized over the user-defined conversion² from `const char[2]` via `std::string(const char*)`.
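To make the ranking observable without running the original snippet, here is a small sketch (the `which` helper is mine, not from the original post): the two implicit conversions in the winning sequence can be checked with `std::is_convertible`, and tagging each overload with a return value shows which one resolution picks.

```cpp
#include <string>
#include <type_traits>

// Both steps of the winning sequence are standard conversions:
// const char[2] decays to const char*, which then converts to bool.
static_assert(std::is_convertible<const char[2], const char*>::value,
              "array-to-pointer decay is a standard conversion");
static_assert(std::is_convertible<const char*, bool>::value,
              "pointer-to-bool is a standard conversion");

// Hypothetical helper mirroring the overload set, tagged for inspection.
inline const char* which(const std::string&) { return "string"; }
inline const char* which(bool) { return "bool"; }
```

With this, `which("0")` returns `"bool"` (standard conversion sequence wins), while `which(std::string("0"))` returns `"string"` (exact match, no conversion needed).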
The solution, according to an answer on Stack Overflow, is to use `<type_traits>`:
```cpp
#include <type_traits>

template<typename T, typename S = std::enable_if_t<std::is_same<T, bool>{}>>
void foo(const T b) { printf("bool!"); }
```
That Stack Overflow answer describes this as an "elegant" workaround. As I later found out, it requires C++14 features, so here is C++11-compatible code:
```cpp
#include <type_traits>

template<typename T,
         typename S = typename std::enable_if<std::is_same<T, bool>::value>::type>
void foo(const T b) { printf("bool!"); }
```
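To see why this fixes the original call, here is a self-contained sketch (the tag return values are mine, used instead of `printf` for illustration): the template only participates in overload resolution when `T` is exactly `bool`, so for `"0"` the deduced `T` is `const char*`, SFINAE removes the template, and the literal falls through to the `std::string` overload.

```cpp
#include <string>
#include <type_traits>

// Overload taking a string; returns a tag so the choice is observable.
const char* foo(const std::string& str) { return "string!"; }

// Constrained overload: only exists when T is exactly bool.
template<typename T,
         typename = typename std::enable_if<std::is_same<T, bool>::value>::type>
const char* foo(const T b) { return "bool!"; }
```

Now `foo("0")` selects the `std::string` overload, while `foo(true)` still selects the constrained template.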
And finally, thank you Vladimír Vondruš for pointing out an, in my opinion, even more elegant solution:
```cpp
void foo(const char* str) { printf("string!"); }
void foo(const bool b) { printf("bool!"); }
```
The string literal now reaches the `const char*` overload through a standard conversion as well (just array-to-pointer decay), so it is correctly chosen over the `bool` one. Depending on your use case, you may still want to provide a `const std::string&` overload.
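A sketch of what that full overload set might look like (the delegation via `c_str()` is my own suggestion, not from the original post): the exact-match `const char*` overload handles literals, and the `std::string` overload simply forwards to it.

```cpp
#include <string>

// Exact match for string literals after array-to-pointer decay.
const char* foo(const char* str) { return "string!"; }

// Convenience overload for std::string callers, delegating to the above.
const char* foo(const std::string& str) { return foo(str.c_str()); }

// Only genuine bool arguments land here now.
const char* foo(bool b) { return "bool!"; }
```

`foo("0")` picks the `const char*` overload, `foo(std::string("0"))` the `std::string` one, and `foo(true)` the `bool` one.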
Written in 20 minutes, extended in 10 minutes.