Description
| | |
|------------------|-----------------|
|Previous ID | SR-6440 |
|Radar | None |
|Original Reporter | @gwynne |
|Type | Bug |
|Status | Resolved |
|Resolution | Done |

**Additional Detail from JIRA**

| | |
|------------------|-----------------|
|Votes | 0 |
|Component/s | Foundation |
|Labels | Bug |
|Assignee | None |
|Priority | Medium |

md5: ba2ce7916df7171b8a4d2917feb20190
**Issue Description:**
When decoding JSON, requesting an integer value which is too large for the requested integer type (e.g. `10000000000000000000000000000000000000`) will throw the following exception on macOS:

```
dataCorrupted(Swift.DecodingError.Context(codingPath: [VaporTests.CodableTests.(TestModel in _3EDCAB1B5E5ADB4E80B35729410FF649).CodingKeys.bomb], debugDescription: "Parsed JSON number <10000000000000000000000000000000000000> does not fit in Int64.", underlyingError: nil))
```
However, on Linux, the value `Int64.max` is silently returned instead.
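For illustration, here is a minimal sketch of the divergence; the model and the `bomb: Int64` property are assumptions taken from the coding path in the error above:

```swift
import Foundation

// Hypothetical model mirroring the coding path in the error above
// ("bomb" is assumed to be an Int64 property).
struct TestModel: Decodable {
    let bomb: Int64
}

let json = "{\"bomb\": 10000000000000000000000000000000000000}".data(using: .utf8)!

do {
    let model = try JSONDecoder().decode(TestModel.self, from: json)
    // Linux: reaches this line and prints 9223372036854775807 (Int64.max).
    print("decoded:", model.bomb)
} catch {
    // macOS: throws DecodingError.dataCorrupted
    // ("Parsed JSON number <...> does not fit in Int64.").
    print("threw:", error)
}
```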
As far as I can determine, this corresponds to the documented behavior of `strtol()` on overflow, per https://github.com/apple/swift-corelibs-foundation/blob/master/Foundation/JSONSerialization.swift#L826.
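The relevant contract is that `strtol()` clamps to `LONG_MAX` (`Int64.max` on 64-bit platforms) on overflow and signals the error only through `errno`, so a caller that never checks `errno` sees the clamped value as a successful parse. A small standalone sketch of that behavior:

```swift
#if canImport(Glibc)
import Glibc
#else
import Darwin
#endif

let tooBig = "10000000000000000000000000000000000000"
errno = 0
let parsed = strtol(tooBig, nil, 10)

// Per the C standard, strtol returns LONG_MAX on overflow and sets
// errno to ERANGE; ignoring errno makes the result indistinguishable
// from a legitimately parsed Int.max.
print(parsed == Int.max)   // true
print(errno == ERANGE)     // true
```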
The exception thrown on macOS is raised at https://github.com/apple/swift-corelibs-foundation/blob/master/Foundation/JSONEncoder.swift#L1832, but that check never fires on Linux.
The test case that exposed this behavior can be seen in this commit (which removes the test because of the behavior difference): vapor/vapor@7ebe247
Credit for finding this issue goes to https://github.com/vzsg.