optional JSON/JSONB passthrough for postgres #8141
base: master
Conversation
The cockroachdb CircleCI test failed with a segmentation fault, apparently after all the tests had run. I did a local test with cockroachdb turned on and it seems to run OK.
This would be amazing!!!! 🤙🏻
I really like how large INTs can be represented in JSON, but the default JS parser can't handle big INTs, and they break JSON.stringify too. We need a different JSON parser and JSON stringifier to deal with big INTs. TypeORM can't use these big INTs today, so you can't, for example, create an index on a 64-bit timestamp inside a JSON value. This fixes that!
export interface ColumnJsonOptions {

    /**
     * Return type of HSTORE column.
Wrong comment.
@@ -128,6 +141,18 @@ export function Column(typeOrOptions?: ((type?: any) => Function)|ColumnType|(Co
    if (options.type === "hstore" && !options.hstoreType)
        options.hstoreType = reflectMetadataType === Object ? "object" : "string";

    if (options.type === "json" && !options.jsonType)
I think it's better to remove this block for now, to ensure it's a 100% non-breaking change.
I spent quite a while thinking about this at the time. If a user manages to map a string property of some entity to a json column in the database, in the expectation that the json column will only ever contain strings, then of course their code will break if anyone ever inserts anything into that json column which is not a simple json string. The actual 'break' this change would imply is that in this corner case we would see quotes around their string where we didn't before (because the json parser is not being invoked). I felt the balance was in favor of protecting the user from the consequences of their bad code design.
Happy to pull this out (and any associated unit test) if it is really the only way to get this merged, but since (for me) the only code that would break is arguably already broken, and this change should actually improve the behavior of that broken code, I think there's an argument for keeping this in. I'm going to fix everything else, rebase, and request a re-review.
@@ -56,6 +56,18 @@ export function PrimaryColumn(typeOrOptions?: ColumnType|PrimaryColumnOptions, o
    if (!options.type)
        throw new ColumnTypeUndefinedError(object, propertyName);

    if (options.type === "json" && !options.jsonType)
same comment as above^
Force-pushed from 2118d47 to 71e0710
Per review of PR typeorm#8141, improve comments/docs where the original was unclear/misleading
There are other builds, unrelated to this PR, failing with QueryFailedError: SQLITE_ERROR: cannot start a transaction within a transaction, so I'm going to assume that is not related to my update.
Virtually certain the failing test is due to e4d4636 @TheProgrammer21
@pleerock fixed the failing test on the master branch. I don't really get why it was failing initially, since it ran without problems in my PR.
If the JSON retrieved from the database is simply to be re-serialized as a string, parsing it into a javascript object is a waste of resources. The default JSON parser does not support high numeric precision for JSON numbers, whereas the JSONB format in postgres does. Using a value transformer with the new passthrough option permits the use of a custom parser (e.g. json-bigint). Closes typeorm#8128
Duplicate the core jsonp tests for a test entity which uses the jsonType: 'string' option
Per review of PR typeorm#8141, improve comments/docs where the original was unclear/misleading
Remove inadvertent duplication of log message, which may be causing issues with some unit tests.
Force-pushed from dd6e796 to 5799aac
@TheProgrammer21 thanks, that seems to have done the trick for me.
I don't think I want to merge this because it introduces two-way behavior. I would like to understand why we have a JSON.stringify there at all? @imnotjames maybe you know why?
I suspect it is because, by default, node-postgres implements exactly the behavior that you wanted to avoid: if passed a string, it will parse it into JSON, as opposed to treating it as a JSON string. This is based on a vague memory rather than current research, but it would explain the JSON.stringify; it would put quotes around a plain string.
@softwareplumber and what does node-pg give you back? Does it give you an object? I think the appropriate solution to this problem is to find out why JSON.stringify was used and remove it if there are no strong reasons behind it.
That's not going to solve my problem, though.
At the end of the day, JSON is transmitted to postgres over the wire as
a string. At some point in the stack, the conversion will occur.
Removing JSON.stringify from the typeorm layer will simply mean that the
same thing happens in the postgres API layer, which will use the
standard JSON.stringify, which will fail if the object passed to it
contains a large integer.
The same problem applies in reverse when reading from the database.
The way this update works is that the node-postgres connection is
created with a custom type parser which prevents node-postgres applying
JSON.parse to JSON values incoming from the database.
If jsonType is 'object', JSON.stringify is invoked by typeorm on
outgoing JSON parameters, and JSON.parse is applied to incoming JSON
values. I have inspected the pg-types source to verify that JSON.parse
is all that is applied by node-postgres, so this should maintain the
exact same functionality as was the case before I applied the custom
type parser.
If jsonType is 'string', typeorm doesn't apply any transformation. We
send JSON strings to postgres and get JSON strings back from postgres.
The only actual type conversion occurs on the postgres server, which is
quite capable of handling large integer values (and other high-precision
numerics). This is the new functionality I actually need.
So, post this PR, typeORM either does both stringify and parse, or does
neither, as controlled by jsonType. This is arguably cleaner than the
current situation, where JSON.stringify is applied to outgoing parameters
but we rely on node-postgres to parse incoming data.
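The division of labour described above can be sketched in standalone Node. Everything here is illustrative: bigIntTransformer is a hypothetical value transformer (typeorm transformers are { to, from } pairs), and the regex-based codec is a crude stand-in for a real library like json-bigint, assuming jsonType: 'string' hands the raw JSON text to the transformer untouched:

```javascript
// With jsonType 'string', typeorm applies neither JSON.parse nor
// JSON.stringify, so a value transformer can supply a BigInt-safe codec.
const bigIntTransformer = {
  // Outgoing: tag BigInt values during stringify, then strip the tags so
  // they land on the wire as bare JSON numbers.
  to: (obj) =>
    JSON.stringify(obj, (_k, v) => (typeof v === "bigint" ? `@@${v}@@` : v))
      .replace(/"@@(-?\d+)@@"/g, "$1"),
  // Incoming: quote long integer literals before parsing, then revive
  // them as BigInt (naive regex, illustration only).
  from: (raw) => {
    const quoted = raw.replace(/:\s*(-?\d{16,})/g, ':"$1"');
    return JSON.parse(quoted, (_k, v) =>
      typeof v === "string" && /^-?\d{16,}$/.test(v) ? BigInt(v) : v
    );
  },
};

console.log(bigIntTransformer.to({ ts: 9223372036854775807n }));
// {"ts":9223372036854775807}
console.log(bigIntTransformer.from('{"ts":9223372036854775807}').ts);
// 9223372036854775807n
```

The only conversion typeorm itself would perform in this mode is none at all; precision is preserved end to end because the value never passes through a Number.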
Regards
Jon.
@softwareplumber sorry for the late answer here... But can't we come up with a more elegant solution? Something like:
What do you think about it?
I think this issue can be closed via #8766. Let me know if not.
Yes, sure, this PR can be summed up as *optionally* pass through JSON, whereas #8766 appears to be *always* pass through JSON, which was (grits teeth) my original proposal, except that it was shot down as a breaking change.
since its
I had to revert my change introduced in #8766. More details in #8777. Also, based on the problem I faced, the changes proposed in this PR won't play well. So, I'll ask you to answer my question regarding the further implementation of this issue.
@softwareplumber it's been a long while, but can you maybe review the changes once more regarding big int replacement during stringify?
Description of change
Fixes #8128, though better than originally proposed. Implements a mechanism for postgres JSON/JSONB types similar to HSTORE. Permits developers to use their own JSON parser, which is essential for supporting high precision/scale in JSON, while maintaining backward compatibility.
Pull-Request Checklist
- Code is up-to-date with the master branch
- npm run lint passes with this change
- npm run test passes with this change (though I've only run for postgres)
- Fixes #0000