Hi,
I'm looking to upload some large data frames (~4 GB) into BigQuery. The package currently uploads via JSON, which is not very efficient. Could an option to upload using the Parquet format be added to the package? That would improve compression and therefore upload times.
I have experienced the same issue, and I believe the library should allow the user to choose the data format used for loads. I just submitted a pull request that you might want to try out: #609. You only need to pass the source_format argument to bq_table_upload(), as in the sketch below:
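A minimal sketch of what that call might look like with the patched version. The project, dataset, and table names and `my_large_df` are placeholders, and the `source_format` argument assumes the change proposed in #609:

```r
library(bigrquery)

# Authenticate and point at the destination table
# (project/dataset/table names are placeholders).
bq_auth()
tbl <- bq_table("my-project", "my_dataset", "my_table")

# Upload the data frame as Parquet instead of the default JSON.
# source_format is only available with the patched / development version.
bq_table_upload(
  tbl,
  values = my_large_df,
  source_format = "PARQUET"
)
```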
In the development version, which you can install with pak::pak("r-dbi/bigrquery"), you will be able to choose the format in which the data is transmitted: bq_table_upload(..., source_format = "PARQUET").