The latest release of kotlinx.serialization, 1.5.0, added a new interface called ChunkedDecoder, which supports decoding large values in smaller chunks (for example with the help of okio). This helps avoid OutOfMemoryError and can also improve performance.
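For context, a custom serializer can opt into the new interface roughly like this (a minimal sketch; the `ChunkedStringSerializer` name is mine, but `ChunkedDecoder.decodeStringChunked` is the actual experimental API added in 1.5.0):

```kotlin
import kotlinx.serialization.ExperimentalSerializationApi
import kotlinx.serialization.KSerializer
import kotlinx.serialization.descriptors.PrimitiveKind
import kotlinx.serialization.descriptors.PrimitiveSerialDescriptor
import kotlinx.serialization.descriptors.SerialDescriptor
import kotlinx.serialization.encoding.ChunkedDecoder
import kotlinx.serialization.encoding.Decoder
import kotlinx.serialization.encoding.Encoder

@OptIn(ExperimentalSerializationApi::class)
object ChunkedStringSerializer : KSerializer<String> {
    override val descriptor: SerialDescriptor =
        PrimitiveSerialDescriptor("ChunkedString", PrimitiveKind.STRING)

    override fun serialize(encoder: Encoder, value: String) =
        encoder.encodeString(value)

    override fun deserialize(decoder: Decoder): String {
        // Fall back to an ordinary string decode when the format
        // does not implement ChunkedDecoder.
        if (decoder !is ChunkedDecoder) return decoder.decodeString()
        val sb = StringBuilder()
        // decodeStringChunked invokes the lambda once per chunk, so the
        // format never has to buffer the full value in a single String.
        decoder.decodeStringChunked { chunk -> sb.append(chunk) }
        return sb.toString()
    }
}
```

In practice the consuming lambda would stream each chunk to a sink (a file, an okio `BufferedSink`, a hash) instead of accumulating into a `StringBuilder`; the accumulation here is only to keep the sketch self-contained.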
There is "partial" support in the dev branch. It is partial in that it is supported on the serialization side but not in the parsers (yet). This means the parser provides the string/text in whatever size it happens to deliver, and the serializer passes it on in chunks. The parsing code currently assumes that strings are provided contiguously, but this is not a hard requirement: if a parser delivers them in chunks, chunked parsing will support that. I haven't added such functionality to the platform-independent parser yet (and for DOM it makes no sense at all).
Kotlin/kotlinx.serialization#2012
At the moment it is only supported by the JsonDecoder. Are there any plans to support this in your XML serialization library as well?