
Conversation

@tswast (Contributor) commented Apr 4, 2019

This is a notebooks tutorial, modeled after the Jupyter notebook example
code for BigQuery. Use some caution when running these tests, as they
run some large-ish (5 GB processed) queries and download about 500 MB
worth of data. This is intentional, as the BigQuery Storage API is most
useful for downloading large results.
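The tutorial's approach can be sketched as a pair of notebook cells. This is a hedged illustration, not the tutorial's actual code: the `--use_bqstorage_api` flag reflects the `google-cloud-bigquery` magics API of this era, and the public-dataset query below is an illustrative stand-in for whatever the tutorial actually runs.

```python
# Cell 1: load the BigQuery cell magic shipped with google-cloud-bigquery.
%load_ext google.cloud.bigquery

# Cell 2: run a query and pull the (large) result into a pandas DataFrame
# named `df`, downloading rows via the BigQuery Storage API rather than
# paging through the slower REST API.
%%bigquery df --use_bqstorage_api
SELECT name, SUM(number) AS total
FROM `bigquery-public-data.usa_names.usa_1910_current`
GROUP BY name
ORDER BY total DESC
```

The Storage API path only pays off for results large enough that download time dominates, which is why the tests deliberately move hundreds of megabytes.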

@tswast added the bigquery label Apr 4, 2019
@tswast requested review from alixhami and shollyman April 4, 2019 17:58
@googlebot added the cla: yes label Apr 4, 2019
@shollyman (Contributor) left a comment


I see why you'd like to convert simple queries into a set of projection columns and a row filter for the storage case, as it would avoid the intermediate query.

The only issue is the ongoing testing cost. The dependencies table is ~7 GB, which is probably still reasonable given that the conversion can short-circuit the query job and the query itself looks cacheable.
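The conversion shollyman describes could be sketched as a toy translator: map a trivial `SELECT ... FROM ... WHERE ...` onto the two things a BigQuery Storage API read session needs (selected fields and a row restriction), and bail out to a normal query job for anything more complex. Everything here is illustrative: `simple_query_to_read_options` and its regex are hypothetical, not part of any shipped library, and a real implementation would need a proper SQL parser.

```python
# Hypothetical sketch: translate a trivial SELECT into BigQuery Storage API
# read options (projection columns + row filter), skipping the query job.
import re

_SIMPLE_SELECT = re.compile(
    r"^\s*SELECT\s+(?P<cols>[\w\s,]+?)\s+FROM\s+(?P<table>[\w.`-]+)"
    r"(?:\s+WHERE\s+(?P<filter>.+?))?\s*$",
    re.IGNORECASE | re.DOTALL,
)

def simple_query_to_read_options(sql):
    """Return (selected_fields, row_restriction) for a trivial SELECT.

    Returns None when the query is too complex to map onto a direct
    table read (joins, aggregates, GROUP BY, subqueries, ...), in which
    case the caller should fall back to an ordinary query job.
    """
    match = _SIMPLE_SELECT.match(sql)
    if match is None:
        return None
    selected_fields = [col.strip() for col in match.group("cols").split(",")]
    row_restriction = match.group("filter") or ""
    return selected_fields, row_restriction

# Example: this query needs no intermediate job; read the table directly.
fields, row_filter = simple_query_to_read_options(
    "SELECT name, version FROM `bigquery-public-data.pypi.dependencies` "
    "WHERE name = 'pandas'"
)
```

The short-circuit is what keeps the testing cost down: when the translator succeeds, only the projected columns matching the filter are scanned and streamed, instead of materializing a query result first.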

@tswast merged commit bb8c80e into master Apr 9, 2019
@tswast deleted the tswast-bqstorage-pandas branch April 9, 2019 21:13
plamut pushed a commit to plamut/python-bigquery-storage that referenced this pull request Sep 2, 2020
…oogleCloudPlatform/python-docs-samples#2087)

* Add magics tutorial with BigQuery Storage API integration. This is a notebooks tutorial, modeled after the Jupyter notebook example code for BigQuery. Use some caution when running these tests, as they run some large-ish (5 GB processed) queries and download about 500 MB worth of data. This is intentional, as the BigQuery Storage API is most useful for downloading large results.
* Update deps.
* Don't run big queries on Travis.
plamut pushed a commit to googleapis/python-bigquery-storage that referenced this pull request Sep 10, 2020
Linchin pushed a commit that referenced this pull request Aug 18, 2025
parthea pushed a commit to googleapis/google-cloud-python that referenced this pull request Aug 21, 2025
parthea pushed a commit to googleapis/google-cloud-python that referenced this pull request Sep 16, 2025

Labels

bigquery
cla: yes (This human has signed the Contributor License Agreement.)

4 participants