unum_open takes the following parameters:

    UNumberFormatStyle style,
    const UChar* pattern,
    int32_t patternLength,
    const char* locale,
    UParseError* parseErr,
    UErrorCode* status
If the style is 0, the pattern and length are used to construct a number
format from the pattern. Otherwise the pattern and length are ignored, and the
style selects one of the standard styles to use. This is entirely undocumented.
In fact, the only way to use the pattern arguments is to specify UNUM_IGNORE
as the style, but this is not listed as a valid style to pass to unum_open, so
its use is a mystery.
There used to be two perfectly reasonable, separate APIs for this, but it
looks like they were unified when the UParseError parameter was added. I think
the unification was probably a mistake.
There also are two different kinds of pattern that (in theory) could be used,
the DecimalFormat patterns and the RuleBasedNumberFormat patterns. Currently
the code always assumes the DecimalFormat pattern. The docs, of course, don't
describe the contents of the pattern at all; you're just supposed to know that
even though there is this NumberFormat abstraction layer, the code basically
assumes that it is always DecimalFormat underneath...
If the API is split again then we'd have to analyze the pattern to determine
which kind of pattern it is. The syntax is distinct enough to make this
possible, I think.
I'm not sure the openPattern API belongs at this level at all-- it isn't on
the NumberFormat side. I guess it is to avoid having a C variant of
DecimalFormat by rolling everything into unum, but you end up over-unifying
things and requiring stuff like having to analyze the pattern string to
disambiguate.