Supercharge BigQuery
with BigFunctions
Upgrade your data impact
with 100+ ready-to-use BigQuery Functions
(+ build a catalog of functions)
BigFunctions is:
- a framework to build a governed catalog of powerful BigQuery functions at YOUR company.
- 100+ open-source functions to supercharge BigQuery that you can call directly (no install) or redeploy in YOUR catalog.
As a data-analyst
You'll have new powers! (such as loading data from any source or activating your data through reverse ETL).
As an analytics-engineer
You'll feel at home with the BigFunctions style, which imitates that of dbt (with a yaml standard and a CLI).
You'll love the idea of getting more things done through SQL.
As a data-engineer
You'll easily apply software-engineering best practices through unit testing, CI/CD, pull-request validation, continuous deployment, etc.
You will love avoiding reinventing the wheel by using functions already developed by the community.
As a central data-team player in a large company
You'll be proud of providing a governed catalog of curated functions to your 10000+ employees with mutualized and maintainable effort.
As a security champion
You will enjoy the ability to validate the code of functions before deployment thanks to your git validation workflow, CI Testing, binary authorization, etc.
As an open-source lover
You'll be able to contribute so that a problem solved for you is solved for everyone.
All BigFunctions, each represented by a yaml file in the `bigfunctions` folder of the GitHub repo, are automatically deployed in public datasets so that you can call them directly from your BigQuery project without installing anything.
Give it a try! Execute this SQL query from your GCP project:
```sql
select bigfunctions.eu.faker("name", "it_IT")
```

Explore all available bigfunctions here.
You can also deploy any bigfunction in your project! To deploy `my_bigfunction` defined in the `bigfunctions/my_bigfunction.yaml` file, simply call:
```bash
bigfun deploy my_bigfunction
```

Details about the `bigfun` command line are given below.
bigfun CLI (command-line-interface) facilitates BigFunctions development, test, deployment, documentation and monitoring.
Install it with:

```bash
pip install bigfunctions
```

```
$ bigfun --help
Usage: bigfun [OPTIONS] COMMAND [ARGS]...

Options:
  --help  Show this message and exit.

Commands:
  deploy  Deploy BIGFUNCTION
  docs    Generate, serve and publish documentation
  get     Download BIGFUNCTION yaml file from unytics/bigfunctions...
  test    Test BIGFUNCTION
```

Functions are defined as yaml files under the `bigfunctions` folder. To create your first function locally, the easiest way is to download an existing yaml file from the unytics/bigfunctions GitHub repo.
For instance, to download `is_email_valid.yaml` into the `bigfunctions` folder, do:

```bash
bigfun get is_email_valid
```

You can then update the file to suit your needs.
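A function's yaml file describes its signature and code. The exact schema is defined by the framework, so the sketch below is purely illustrative (field names are an assumption; refer to the yaml files in the repo for the real structure). A simple SQL function might look like:

```yaml
# Illustrative sketch only -- check existing yaml files in the repo
# for the actual field names and structure.
type: function_sql
description: Returns true if `email` looks like a valid email address
arguments:
  - name: email
    type: string
output:
  name: is_valid
  type: boolean
code: regexp_contains(email, r'^[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}$')
```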
- Make sure the `gcloud` command is installed on your computer.
- Activate the application-default account with `gcloud auth application-default login`. A browser window should open, and you should be prompted to log into your Google account. Once you've done that, `bigfun` will use these credentials to connect to BigQuery through the BigQuery Python client.
- Get or create a `DATASET` where you have permission to edit data and where the function will be deployed.
- The `DATASET` must belong to a `PROJECT` in which you have permission to run BigQuery queries.
You can now deploy the function `is_email_valid` defined in the `bigfunctions/is_email_valid.yaml` file by running:

```bash
bigfun deploy is_email_valid
```

The first time you run this command it will ask for `PROJECT` and `DATASET`. Your inputs will be written to a `config.yaml` file in the current directory so that you won't be asked again (unless you delete the entries in `config.yaml`). You can also override this config at deploy time: `bigfun deploy is_email_valid --project=PROJECT --dataset=DATASET`.
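To give an idea of what gets persisted, the generated `config.yaml` might look like the following (the key names here are an assumption; inspect the file created on your machine for the exact ones):

```yaml
# Hypothetical content of config.yaml -- key names may differ
project: PROJECT
dataset: DATASET
```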
Test it with:

```sql
select PROJECT.DATASET.is_email_valid('paul.marcombes@unytics.io')
```

To deploy a javascript function which depends on npm packages, there are a few requirements in addition to the ones above.
- You will need to install each npm package on your machine and bundle it into one file. For that, you need to install Node.js.
- The bundled js file will be uploaded into a Cloud Storage bucket to which you must have write access. The bucket name must be provided in the `config.yaml` file in a variable named `bucket_js_dependencies`. Users of your functions must have read access to the bucket.
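Concretely, the bucket entry sits in `config.yaml` (the `bucket_js_dependencies` name comes from the docs above; the surrounding keys are shown only for context and are an assumption):

```yaml
# config.yaml -- only bucket_js_dependencies is documented above;
# the other keys shown for context are illustrative
project: PROJECT
dataset: DATASET
bucket_js_dependencies: your-bucket-name
```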
You can now deploy the function `render_template` defined in the `bigfunctions/render_template.yaml` file by running:

```bash
bigfun deploy render_template
```

Test it with:

```sql
select PROJECT.DATASET.render_template('Hello {{ user }}', json '{"user": "James"}')
```

To deploy a remote function (e.g. a Python function), there are a few requirements in addition to the ones of the *Deploy your first function* section.
- A Cloud Run service will be deployed to host the code (as seen here), so you must have permission to deploy a Cloud Run service in your project `PROJECT`. The `gcloud` CLI will be used directly to deploy the service (using `gcloud run deploy`), so make sure you are logged in with `gcloud` by calling `gcloud auth login`. A browser window should also open, and you should be prompted to log into your Google account. WARNING: you read correctly, you have to authenticate twice: once for the BigQuery Python client (to deploy any function, including remote, as seen above) and once now to use `gcloud` (to deploy a Cloud Run service).
- A BigQuery Remote Connection will be created to link BigQuery with the Cloud Run service, so you must have permission to create a remote connection. The BigQuery Connection Admin and BigQuery Admin roles have this permission.
- A service account will be automatically created by Google along with the BigQuery Remote Connection. BigQuery will use this service account of the remote connection to invoke the Cloud Run service. You must therefore have permission to authorize this service account to invoke the Cloud Run service. This permission is provided in the role `roles/run.admin`.
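As a sketch, granting the invoker role to the connection's service account could look like this (`gcloud run services add-iam-policy-binding` is a standard gcloud command; the service name, region, and service-account email are placeholders to adapt to your setup):

```bash
# Placeholders throughout -- adapt to your Cloud Run service, its region,
# and the service account created with the BigQuery Remote Connection.
gcloud run services add-iam-policy-binding SERVICE_NAME \
  --region=REGION \
  --member="serviceAccount:CONNECTION_SERVICE_ACCOUNT_EMAIL" \
  --role="roles/run.invoker"
```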
You can now deploy the function `faker` defined in the `bigfunctions/faker.yaml` file by running:

```bash
bigfun deploy faker
```

Test it with:

```sql
select PROJECT.DATASET.faker("name", "it_IT")
```

How to correctly highlight sql, python and javascript code in yaml files?

In yaml files, multiline strings are highlighted as plain strings by default, which makes the `code` field hard to read (all the code appears in one string color). To highlight the code according to its python / javascript / sql syntax, you can install the YAML Embedded Languages VSCode extension.
BigFunctions is fully open-source. Any contribution is more than welcome!
- Add a star on the repo to show your support
- Join our Slack and talk with us
- Suggest a new function here
- Raise an issue there
- Open a Pull-Request! (See contributing instructions).
Contributors
