It is well understood that operations on Decimal instances must rely on the context. The idea here is to have overflow and rounding applied correctly upon instance creation, without going through a great deal of additional effort.
Why do you think this is a "great deal" of effort?
(1) If your data comes from an external source, it makes sense to say "create a decimal from this" instead of "create a string from this, then a decimal from that, then another decimal from that with the right rounding." (See the sketch after this list.)
(2) If this is instead done by inheritance, diamond inheritance appears quickly. With a mixin that is probably OK, but it is still not desirable.
(3) Is there any reason *not* to honor an optional context when creating a Decimal?
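To make (1) concrete, here is a minimal sketch of the multi-step dance versus one-step construction. It assumes the stdlib decimal module, where Context.plus applies the context's precision and rounding while the plain constructor does not, and where Context.create_decimal already provides context-honoring construction:

    from decimal import Decimal, Context, ROUND_HALF_EVEN

    ctx = Context(prec=6, rounding=ROUND_HALF_EVEN)
    raw = "3.14159265358979"   # value arriving from an external source

    # The multi-step dance: construct exactly, then re-round via the context.
    exact = Decimal(raw)       # constructor keeps every digit, ignoring ctx.prec
    rounded = ctx.plus(exact)  # unary plus under ctx applies precision/rounding
    print(rounded)             # Decimal('3.14159')

    # The one-step version being argued for: construction that honors a context.
    print(ctx.create_decimal(raw))  # Decimal('3.14159')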
The machinery is already there when creating one as the result of a calculation, and it is even there to change the context at arbitrary times -- why not allow it on input/output, which is when you are most likely to be dealing with an external data source?
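A short sketch of that asymmetry, again assuming the stdlib decimal module: arithmetic and context switching both honor the context, while plain construction bypasses it entirely.

    from decimal import Decimal, getcontext, localcontext

    getcontext().prec = 4

    # Creating a Decimal as the result of a calculation honors the context...
    print(Decimal(1) / Decimal(3))      # Decimal('0.3333')

    # ...and the context can be changed at arbitrary times...
    with localcontext() as ctx:
        ctx.prec = 10
        print(Decimal(1) / Decimal(3))  # Decimal('0.3333333333')

    # ...but construction from external input ignores it entirely:
    print(Decimal("0.333333333333"))    # all twelve digits kept; prec is ignored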