No language (that I can think of) provides built-in type support for arbitrary-precision arithmetic, and for good reason: it is computationally expensive.
If I follow you, languages shouldn't provide standard interfaces for network I/O either, because, you know, that's computationally expensive and slow. I guess I won't follow you...
They're not built-in types; they're types provided by a library. What the original poster seems (to me) to be confused about is the idea that they can write code similar to

var x = 1.234

in some language and it will magically make x a 'Decimal' object, and that only JavaScript will 'convert' it into a float/double. This is not true: they all will, as I say, because no language has a built-in type representing decimals. The literal 1.234 is already a float before it gets assigned to anything. If you want decimals, then you have to create one explicitly, like
var x = new Decimal("1.234")
or however your library requires.
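To make that concrete, here's a minimal sketch in JavaScript, assuming the decimal.js library (any arbitrary-precision decimal library works along the same lines):

var Decimal = require("decimal.js");

// A numeric literal is parsed into an IEEE 754 double before any library
// code ever runs, so binary rounding has already happened:
var x = 0.1 + 0.2;
console.log(x); // 0.30000000000000004

// Constructing from strings keeps the exact decimal value, because the
// digits never pass through a double on the way in:
var a = new Decimal("0.1");
var b = new Decimal("0.2");
console.log(a.plus(b).toString()); // "0.3"

// Passing a numeric literal instead of a string defeats the purpose: a
// literal like 1.23456789012345678901 has already been rounded to the
// nearest double before Decimal ever sees it.

The string constructor is the important part: it's the only way to hand the library the decimal digits you actually wrote.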