Swift has long suffered a problem with its `Decimal` type: unapparent loss of precision.

This happens with all common ways of initializing:
```swift
let badDecimal = Decimal(3.133)  // 3.132999999999999488
let badDecimal: Decimal = 3.133  // 3.132999999999999488
```
But not these ones:
```swift
let goodDecimal = Decimal(string: "3.133")                                // 3.133
let goodDecimal = Decimal(sign: .plus, exponent: -3, significand: 3133)  // 3.133
```
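The root cause is that a numeric literal like `3.133` is materialized as a binary `Double` before `Decimal` ever sees it. Passing the same `Double` explicitly reproduces the loss, which shows where it happens:

```swift
import Foundation

// Decimal(3.133) resolves to Decimal.init(_: Double), so it is
// equivalent to converting the (inexact) Double representation:
let viaDouble = Decimal(Double(3.133))
print(viaDouble)                    // 3.132999999999999488
print(viaDouble == Decimal(3.133))  // true
```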
Furthermore, this also applies to JSON decoding, since `JSONDecoder` relies on `JSONSerialization` under the hood, which is presumed to parse decimal numbers as `Double` and then initialize `Decimal` via its lossy `Double` initializer, as exemplified above. A common workaround is to receive sensitive `Decimal` values as strings and parse them with the working string initializer; however, the format of a JSON payload is oftentimes out of one's control.
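For reference, here is a minimal sketch of that string workaround, assuming a hypothetical payload that sends the amount as a JSON string:

```swift
import Foundation

// Hypothetical model for a payload like { "amount": "3.133" }.
struct StringPrice: Decodable {
    let amount: Decimal

    private enum CodingKeys: String, CodingKey {
        case amount
    }

    init(from decoder: Decoder) throws {
        let container = try decoder.container(keyedBy: CodingKeys.self)
        // Decode the raw string, then parse it losslessly.
        let raw = try container.decode(String.self, forKey: .amount)
        guard let amount = Decimal(string: raw) else {
            throw DecodingError.dataCorruptedError(
                forKey: .amount,
                in: container,
                debugDescription: "Invalid decimal string: \(raw)"
            )
        }
        self.amount = amount
    }
}

let stringJSON = #"{ "amount": "3.133" }"#.data(using: .utf8)!
let price = try JSONDecoder().decode(StringPrice.self, from: stringJSON)
print(price.amount) // 3.133
```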
This is something that Apple will most likely fix at some point. In the meantime, `PreciseDecimal` has your back.

This library declares a lightweight `PreciseDecimal` type as a wrapper around `Decimal`, with precise `init` and `Decodable` implementations.
```swift
let goodDecimal = PreciseDecimal(3.133)  // 3.133
let goodDecimal: PreciseDecimal = 3.133  // 3.133
```
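How can a `Double`-backed literal stay precise? One plausible technique (an assumption for illustration, not necessarily this library's actual implementation) is to convert the `Double` to its shortest round-trip string, which Swift's default description produces, and hand that to `Decimal(string:)`:

```swift
import Foundation

extension Decimal {
    // Hypothetical helper: "\(value)" yields the shortest string that
    // round-trips to the same Double ("3.133", not "3.1329999..."),
    // and Decimal(string:) then parses that text exactly.
    init?(precise value: Double) {
        self.init(string: "\(value)")
    }
}

print(Decimal(precise: 3.133)!) // 3.133
```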
It also works out of the box with `Decodable`:

```swift
struct Price: Decodable {
    let amount: PreciseDecimal
}

let json = #"{ "amount": 3.133 }"#.data(using: .utf8)!
let goodDecimal = try JSONDecoder().decode(Price.self, from: json).amount // 3.133
```
Note, however, that `PreciseDecimal` does not solve every precision issue: it falls short for very high precision numbers. Case in point:
```swift
let a = PreciseDecimal(1234567890.0123456789)
let b = Decimal(string: "1234567890.0123456789")!

print(a) // 1234567890.0123458
print(b) // 1234567890.0123456789
```
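The culprit is the same `Double` bottleneck: a `Double` carries only about 15 to 17 significant decimal digits, so the 20 significant digits above cannot survive the trip through a float literal. This is visible without `Decimal` at all:

```swift
// The literal is stored as the nearest representable Double, which
// has already discarded the trailing digits before any conversion:
let truncated = 1234567890.0123456789
print(truncated) // 1234567890.0123458
```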
So if you're going to be dealing with more than 6 decimal places, this library is not for you. Instead, the best solution as it currently stands is to represent decimals as strings, especially when it comes to JSON serialization.
It's up to Apple and only Apple to introduce real `Decimal` literals into the language, as well as to fix the JSON serialization mechanisms in Foundation.
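The reason this needs language-level support: Swift's `ExpressibleByFloatLiteral` hands the conformer an already-converted value (typically a `Double`), so any conformance, including `Decimal`'s own, inherits the binary rounding before user code ever runs. A minimal sketch demonstrating the trap:

```swift
import Foundation

struct MyDecimal: ExpressibleByFloatLiteral {
    let value: Decimal

    // The literal arrives here as a Double; precision is already lost
    // before this initializer gets a chance to do anything about it.
    init(floatLiteral value: Double) {
        self.value = Decimal(value)
    }
}

let m: MyDecimal = 3.133
print(m.value) // 3.132999999999999488
```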
An annotation-based approach (such as a property wrapper) was deliberately ruled out, because it's very easy to forget to annotate properties: there are no compiler checks or tests to ensure the slight change in behavior it provides, and the omission leads to sneaky bugs down the road.
The library's scope and implementation are intentionally kept as lightweight as possible, in optimistic anticipation of a painless obsolescence once Apple fixes `Decimal`. That said, do feel free to suggest additions if I've missed a vital piece of functionality that definitely belongs in this library.
As for when Apple will fix `Decimal`: I don't know. However, you're not alone in your discontent. Here are two relevant issues you can upvote to improve the chances of Apple seeing them: