
incorrect JSON generated for arrays passed in for columns of type json #374

Closed
spollack opened this issue Jun 10, 2013 · 7 comments

@spollack
Contributor

To repro:

  1. Using a postgres 9.2 database, create a table with a column of type json.
  2. Run an insert statement, passing in a javascript array as the parameter value for that column.
  3. pg generates an incorrect json string, and postgres errors with "invalid input syntax for type json".

For example, if my input array is:
["foo", "bar", "baz"]

The generated json string from pg is:
{"foo", "bar", "baz"}

In other words, the array gets wrapped in curly braces, not square brackets.

You can see where this is happening in lib/utils.js. The issue is that we are trying to treat the array as a postgres array input, even though it should be treated as JSON input, because the database column is of type json.
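
To illustrate (a minimal sketch -- toPostgresArray below just approximates the output of the array branch; it is not the actual lib/utils.js code):

// illustrative only: approximates the postgres-array serialization vs. plain JSON
function toPostgresArray(arr) {
  return '{' + arr.map(function (el) { return JSON.stringify(el); }).join(',') + '}';
}

var input = ['foo', 'bar', 'baz'];
console.log(toPostgresArray(input));  // {"foo","bar","baz"}  -- rejected by a json column
console.log(JSON.stringify(input));   // ["foo","bar","baz"]  -- valid JSON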

I'd be happy to do a PR to fix this, but I'm not sure of the right strategy here -- it seems like to fix this problem we need to know the column datatype, not just the input datatype. Do we have that context available?

Thanks,
Seth

@brianc
Owner

brianc commented Jun 25, 2013

Hey Seth. Unfortunately, no, we don't have the column data types available for input parameters, and keeping them available is beyond the scope of node-postgres. I think an ORM or some DAL would be a good place to put that info.

There really is no way to fix inbound type conversion other than to manually convert your data to simple types in userland code. I tried to keep the inbound parameter conversions as simple as possible; the more smarts I've added to outbound parameter conversions, the more problems it's caused. So... the best 'fix' for this is: if you're inserting non-primitive data into Postgres, you'll need to turn it into a form Postgres will understand when it receives it as a parameter.
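
For example, a minimal sketch of that userland conversion (the table and column names here are just placeholders):

// stringify the value yourself so pg sends a plain string parameter
client.query('INSERT INTO my_table (data) VALUES ($1)', [JSON.stringify(['foo', 'bar'])], function(err) {
  // the json column receives the string '["foo","bar"]'
});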

That being said, there could be a bug in this particular circumstance, and I'm not 100% sure of the issue from what you were saying. Could you include a small code sample I could look at, to make sure it's something to be pushed out into user code and not an actual bug in the implementation?

@spollack
Contributor Author

Sure, here is a code sample:

var pg = require('pg');

var conString = "postgres://postgres:postgres@localhost:5432/postgres";

var client = new pg.Client(conString);
client.connect(function(err) {
    console.log(err);
    client.query('CREATE TABLE test (id SERIAL NOT NULL, data JSON)', function(err) {
        console.log(err);
        var myArray = ['foo', 'bar'];
        client.query('INSERT INTO test (data) VALUES ($1)', [myArray], function(err) {
            // NOTE this is where things go sideways
            console.log(err);
        })
    })
});

That results in the following output:

node scripts/custom/pg_json_array_bug.js
null
null
{ [error: invalid input syntax for type json]
  length: 149,
  name: 'error',
  severity: 'ERROR',
  code: '22P02',
  detail: 'Expected ":", but found ",".',
  hint: undefined,
  position: undefined,
  internalPosition: undefined,
  internalQuery: undefined,
  where: 'JSON data, line 1: {"foo",...',
  file: 'json.c',
  line: '673',
  routine: 'report_parse_error' }

The key thing here is that the JavaScript value passed in happens to be an array, not a plain object. This is legal per the JSON spec and works fine at the Postgres level. The issue is that pg assumes that JSON data at the top level is an object. One workaround (which I've taken in my code) is to wrap the array in a top-level object.
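
For reference, a minimal sketch of that wrapping workaround (the items key is just illustrative):

var myArray = ['foo', 'bar'];
// wrapping the array in an object takes the plain JSON.stringify path instead of
// the array-to-postgres-array path, so the column receives {"items":["foo","bar"]}
client.query('INSERT INTO test (data) VALUES ($1)', [{ items: myArray }], function(err) {
  console.log(err); // null
});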

@brianc
Owner

brianc commented Jun 25, 2013

Yeah, I see the issue now. It's because of the inbound parameter conversion here:

https://github.com/brianc/node-postgres/blob/master/lib/utils.js#L55

Arrays are checked and turned into "postgres arrays" before the default JSON.stringify. So in some situations one may want arrays converted into Postgres arrays, and in others want them converted into JSON. No way to please everyone here.

A few options... one is to punch over that function like so:

var pg = require('pg');
pg.utils.prepareValue = function(val) {
  if(val instanceof Date) {
    // dateToString is the date formatter from lib/utils.js (not shown here)
    return dateToString(val);
  }
  if(typeof val === 'undefined') {
    return null;
  }
  if(!val || typeof val !== 'object') {
    return val === null ? null : val.toString();
  }
  // no array branch: arrays and objects both fall through to JSON.stringify
  return JSON.stringify(val);
}

Basically remove the array converter.
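
With that override in place (a sketch -- it assumes prepareValue is reachable as pg.utils.prepareValue, as in the snippet above), the insert from the earlier repro should serialize the array as JSON:

var myArray = ['foo', 'bar'];
client.query('INSERT INTO test (data) VALUES ($1)', [myArray], function(err) {
  // the parameter is now sent as ["foo","bar"], which the json column accepts
  console.log(err);
});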

Another would be to do something like this in your code:

client.query('INSERT INTO test (data) VALUES ($1::JSON)', [JSON.stringify(myArray)], function(err) {
    // the value is pre-stringified JSON, so the insert succeeds
});

There's really no way to be beautifully clean about it, since we don't know the inbound parameter types; we just make very broad assumptions about them.

@spollack
Contributor Author

Agreed, no great solution here. Your manual stringification of the inbound parameter is probably the cleanest workaround. Thanks for looking at this.

@spollack
Contributor Author

I guess we'll close this.

@matthew-dean

So, is a column datatype of an array of json not supported? I can't find the answer to this in the documentation.

@brianc
Owner

brianc commented Jun 13, 2014

If you can show me a bit of what exactly you're trying to achieve, I should be able to answer better. Some of the inbound parameter conversion handling got quite a bit better in a recent release.
