Documentation
- Document BigQuery data type to pandas dtype conversion for `read_gbq`. (#269)
Dependency updates
- Update the minimum version of `google-cloud-bigquery` to 1.9.0. (#247)
- Update the minimum version of `pandas` to 0.19.0. (#262)
Internal changes
- Update the authentication credentials. Note: you may need to set `reauth=True` in order to update your credentials to the most recent version. This is required to use new functionality such as the BigQuery Storage API. (#267)
- Use `to_dataframe()` from `google-cloud-bigquery` in the `read_gbq()` function. (#247)
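As a hedged sketch of the credential refresh noted above: `"my-project"` is a placeholder project id, and the `read_gbq` call itself is shown commented out because it requires live GCP credentials.

```python
# Passing reauth=True discards the cached credentials and re-runs the
# authentication flow, storing credentials in the newer format needed
# for features such as the BigQuery Storage API.
query = "SELECT 1 AS x"
kwargs = {"project_id": "my-project", "reauth": True}  # placeholder project id

# import pandas_gbq
# df = pandas_gbq.read_gbq(query, **kwargs)
```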
Enhancements
- Fix a bug where pandas-gbq could not upload an empty DataFrame. (#237)
- Allow `table_schema` in `to_gbq` to contain only a subset of columns, with the rest being populated using the DataFrame dtypes. (#218) (contributed by @JohnPaton)
- Read `project_id` in `to_gbq` from provided credentials if available. (contributed by @daureg)
- `read_gbq` uses the timezone-aware `DatetimeTZDtype(unit='ns', tz='UTC')` dtype for BigQuery `TIMESTAMP` columns. (#269)
- Add `use_bqstorage_api` to `read_gbq`. The BigQuery Storage API can be used to download large query results (>125 MB) more quickly. If the BigQuery Storage API can't be used, the BigQuery API is used instead. (#133, #270)
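A minimal sketch of the enhancements above, assuming placeholder names (`"my_dataset.my_table"`, `"my-project"`); the `to_gbq`/`read_gbq` calls need live GCP credentials, so they are shown commented out while the dtype behaviour runs locally with pandas alone.

```python
import pandas as pd

# read_gbq now returns BigQuery TIMESTAMP columns with this
# timezone-aware dtype instead of a naive datetime64[ns]:
ts_dtype = pd.DatetimeTZDtype(unit="ns", tz="UTC")

df = pd.DataFrame(
    {
        "name": ["a", "b"],
        "value": [1.5, 2.5],
        "created": pd.to_datetime(["2019-01-01", "2019-02-01"], utc=True),
    }
)
assert df["created"].dtype == ts_dtype  # same dtype read_gbq now produces

# Partial table_schema: only "created" is specified; the BigQuery types
# of "name" and "value" are inferred from the DataFrame dtypes.
table_schema = [{"name": "created", "type": "TIMESTAMP"}]

# import pandas_gbq
# pandas_gbq.to_gbq(df, "my_dataset.my_table", project_id="my-project",
#                   table_schema=table_schema, if_exists="replace")
#
# Large query results can opt in to the faster BigQuery Storage API:
# pandas_gbq.read_gbq("SELECT * FROM `my_dataset.my_table`",
#                     project_id="my-project", use_bqstorage_api=True)
```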