... I don't have any problem implementing a context stack. But its usefulness isn't clear to me, so I think it's better to go for the sure stuff, and when Decimal gets heavy use, if everybody agrees to add a context stack, we'll go for it then. What do you think?
YAGNI (You Ain't Gonna Need It). Most apps will never change the context. Most of the rest will set it once at program start (typically just to increase precision), and never change it again. The same is true of the elaborate FPU control registers in the Pentium HW most of the world uses today, and the "numeric context" here is just a small generalization of that.
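For the common case described here -- setting the context once at program start to increase precision and never touching it again -- a minimal sketch using the stdlib decimal module might look like:

```python
from decimal import Decimal, getcontext

# Raise precision once at startup; every later Decimal operation
# in this thread uses the modified (thread-local) context.
getcontext().prec = 50

# 1/7 is now computed to 50 significant digits.
s = str(Decimal(1) / Decimal(7))
print(s)
```

No stack is involved: there is one ambient context per thread, and the program simply mutates it once.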
This isn't to say that numeric context (or the elaborate FPU control registers) isn't useful: it's extremely useful, but generally only to a handful of numeric experts writing library utilities with severe "works in all possible edge cases" requirements. It will be their responsibility to document the effects their code has on context (when does it signal inexact? overflow? etc.), and because they will change context visible to the caller only in the documented ways, they have no use for a stack: they'll generally need to restore *parts* of the context to entry conditions while changing other parts in defined ways -- same as, e.g., the builtin addition function.
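The "restore parts, change other parts" pattern can be sketched as follows. `precise_div` is a hypothetical library routine (not part of the decimal module): it does its internal work in a private copy of the caller's context at boosted precision, leaves the caller's precision/rounding/traps untouched, but deliberately propagates any signals it raised (inexact, etc.) back to the caller, as its documentation would promise:

```python
from decimal import Decimal, getcontext, Inexact

def precise_div(a, b, extra_digits=10):
    """Hypothetical helper: divide at extra precision, restoring the
    caller's settings but propagating raised signals -- i.e. restore
    *parts* of the context, change other parts in defined ways."""
    ctx = getcontext()
    work = ctx.copy()            # private working context
    work.prec += extra_digits    # the part we change internally
    result = work.divide(Decimal(a), Decimal(b))
    # Propagate flags (e.g. Inexact) to the caller's context, leaving
    # its precision, rounding mode, and traps exactly as we found them.
    for signal, raised in work.flags.items():
        if raised:
            ctx.flags[signal] = True
    return result

q = precise_div(1, 3)
print(q)                          # more digits than the caller's prec
print(getcontext().flags[Inexact])  # signal now visible to the caller
```

A simple push/pop stack couldn't express this: popping would wipe out the flag changes the routine is contractually required to leave behind.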