Currently there is no easy way to import data into a database. While it is fairly straightforward to use \copy with a csvq database, a more generic way of doing bulk imports from a file on disk should be supported, since similar functionality exists in almost every database's native command-line client. usql should match this native functionality with its own \import command.
Similarly, since the csvq driver may not be available, a separate \import command should not rely on its presence and should instead use Go's standard library. Since it is already possible to export query results to disk in CSV and JSON formats, via \o and \pset format <TYPE>, a native usql \import should also support these formats.
As such, we propose a \import function similar to the following:
\import [-<TYPE>] <TABLE> [<FILE>]
Where TABLE is the name of the table to import into, TYPE is either csv, json, or another format to be supported in the future, and FILE is a path to a file on disk.
When FILE is not specified, \import will look for a file named TABLE.TYPE in the current working directory. If TYPE is also not specified, it will look for files with the extensions csv and then json. If no matching file is found, an error is returned.
If TYPE is not specified, the command will first attempt to detect the import file's type via its extension, and, failing that, via Go's http.DetectContentType.
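Detection along these lines might look like the sketch below. Note that http.DetectContentType only sniffs broad media classes (JSON typically comes back as text/plain), so a small extra heuristic on the first non-space byte is assumed here; detectType is an illustrative name, not usql's API:

```go
package main

import (
	"bytes"
	"fmt"
	"net/http"
	"path/filepath"
	"strings"
)

// detectType guesses the import format: first from the file
// extension, then by sniffing the file's leading bytes. Hypothetical
// helper, shown for illustration only.
func detectType(name string, head []byte) string {
	switch ext := strings.ToLower(filepath.Ext(name)); ext {
	case ".csv", ".json":
		return ext[1:]
	}
	// http.DetectContentType implements WHATWG content sniffing and
	// reports JSON as plain text, so peek at the first non-space byte
	// to tell JSON ('{' or '[') apart from CSV.
	if ct := http.DetectContentType(head); strings.HasPrefix(ct, "text/plain") {
		if t := bytes.TrimLeft(head, " \t\r\n"); len(t) > 0 && (t[0] == '{' || t[0] == '[') {
			return "json"
		}
		return "csv"
	}
	return ""
}

func main() {
	fmt.Println(detectType("export.json", nil))
	fmt.Println(detectType("export", []byte("book_id,title\n1,my book title\n")))
}
```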
The \import command would then process the data and prepare an INSERT statement for each row found in the CSV or JSON. It is a requirement that CSV files have a header row containing the column names, and it is expected that JSON data have the same shape as the output generated by tblfmt's JSON encoder.
As such, with a properly implemented \import command, the following would be possible:
pg:booktest@localhost/booktest=>\pset format json
Output format is json.
pg:booktest@localhost/booktest=> select * from authors;
[{"author_id":1,"name":"Unknown Master"}]
pg:booktest@localhost/booktest=> select * from books;
[{"book_id":1,"author_id":1,"isbn":"1","book_type":"FICTION","title":"my book title","year":2016,"available":"2021-06-17T23:15:04.682659Z","tags":"{}"},{"book_id":2,"author_id":1,"isbn":"2","book_type":"FICTION","title":"changed second title","year":2016,"available":"2021-06-17T23:15:04.682659Z","tags":"{cool,disastor}"},{"book_id":3,"author_id":1,"isbn":"3","book_type":"FICTION","title":"the third book","year":2001,"available":"2021-06-17T23:15:04.682659Z","tags":"{cool}"}]
pg:booktest@localhost/booktest=>\o export.json
pg:booktest@localhost/booktest=>\g
pg:booktest@localhost/booktest=>\o
pg:booktest@localhost/booktest=>\import books export.json