
Insert Multiple Rows with Snowflake API on New Spreadsheet (Instant) from Google Drive API

Pipedream makes it easy to connect APIs for Snowflake, Google Drive and 3,000+ other apps remarkably fast.

Trigger workflow on
New Spreadsheet (Instant) from the Google Drive API
Next, do this
Insert Multiple Rows with the Snowflake API

Trusted by 1,000,000+ developers from startups to Fortune 500 companies



Getting Started

This integration creates a workflow with a Google Drive trigger and Snowflake action. When you configure and deploy the workflow, it will run on Pipedream's servers 24x7 for free.

  1. Select this integration
  2. Configure the New Spreadsheet (Instant) trigger
    1. Connect your Google Drive account
    2. Select a Drive
    3. Configure Push notification renewal schedule
    4. Optional: Select one or more Folders
  3. Configure the Insert Multiple Rows action
    1. Connect your Snowflake account
    2. Select a Database
    3. Select a Schema
    4. Select a Table Name
    5. Select one or more Columns
    6. Configure Row Values
    7. Optional: Configure Batch Size
    8. Optional: Configure Max Payload Size (MB)
    9. Optional: Configure Enable Batch Processing
  4. Deploy the workflow
  5. Send a test event to validate your setup
  6. Turn on the trigger

Details

This integration uses pre-built, source-available components from Pipedream's GitHub repo. These components are developed by Pipedream and the community, and verified and maintained by Pipedream.

To contribute an update to an existing component or create a new component, create a PR on GitHub. If you're new to Pipedream component development, you can start with quickstarts for trigger and action development, and then review the component API reference.

Trigger

Description: Emit new event when a new spreadsheet is created in a drive.
Version: 0.1.17
Key: google_drive-new-spreadsheet

Google Drive Overview

The Google Drive API on Pipedream allows you to automate various file management tasks, such as creating, reading, updating, and deleting files within your Google Drive. You can also share files, manage permissions, and monitor changes to files and folders. This opens up possibilities for creating workflows that seamlessly integrate with other apps and services, streamlining document handling, backup processes, and collaborative workflows.
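
For example, you can call any Drive endpoint directly from a Node.js code step using the connected account. The snippet below is a minimal sketch of that pattern, not part of this trigger's code: it assumes you attach a google_drive app prop to the step and simply lists a few recently created spreadsheets.

import { axios } from "@pipedream/platform";

export default defineComponent({
  props: {
    // Attach the same connected Google Drive account used by the trigger
    googleDrive: {
      type: "app",
      app: "google_drive",
    },
  },
  async run({ $ }) {
    // List a few recently created spreadsheets using the account's OAuth token
    return await axios($, {
      url: "https://www.googleapis.com/drive/v3/files",
      headers: {
        Authorization: `Bearer ${this.googleDrive.$auth.oauth_access_token}`,
      },
      params: {
        q: "mimeType = 'application/vnd.google-apps.spreadsheet' and trashed = false",
        orderBy: "createdTime desc",
        pageSize: 5,
        fields: "files(id, name, mimeType, createdTime)",
      },
    });
  },
});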

Trigger Code

import newFilesInstant from "../new-files-instant/new-files-instant.mjs";

export default {
  ...newFilesInstant,
  key: "google_drive-new-spreadsheet",
  type: "source",
  name: "New Spreadsheet (Instant)",
  description: "Emit new event when a new spreadsheet is created in a drive.",
  version: "0.1.17",
  props: {
    googleDrive: newFilesInstant.props.googleDrive,
    db: newFilesInstant.props.db,
    http: newFilesInstant.props.http,
    drive: newFilesInstant.props.drive,
    timer: newFilesInstant.props.timer,
    folders: {
      ...newFilesInstant.props.folders,
      description: "(Optional) The folders you want to watch. Leave blank to watch for any new spreadsheet in the Drive.",
    },
  },
  hooks: {
    ...newFilesInstant.hooks,
    async deploy() {
      // Emit sample records on the first run
      const spreadsheets = await this.getSpreadsheets(5);
      for (const fileInfo of spreadsheets) {
        const createdTime = Date.parse(fileInfo.createdTime);
        this.$emit(fileInfo, {
          summary: `New File: ${fileInfo.name}`,
          id: fileInfo.id,
          ts: createdTime,
        });
      }
    },
  },
  methods: {
    ...newFilesInstant.methods,
    shouldProcess(file) {
      return (
        file.mimeType.includes("spreadsheet") &&
        newFilesInstant.methods.shouldProcess.bind(this)(file)
      );
    },
    getSpreadsheetsFromFolderOpts(folderId) {
      const mimeQuery = "mimeType = 'application/vnd.google-apps.spreadsheet'";
      let opts = {
        q: `${mimeQuery} and parents in '${folderId}' and trashed = false`,
      };
      if (!this.isMyDrive()) {
        opts = {
          corpora: "drive",
          driveId: this.getDriveId(),
          includeItemsFromAllDrives: true,
          supportsAllDrives: true,
          ...opts,
        };
      }
      return opts;
    },
    getSpreadsheetsFromFiles(files, limit) {
      return files.reduce(async (acc, file) => {
        const spreadsheets = await acc;
        const fileInfo = await this.googleDrive.getFile(file.id);
        return spreadsheets.length >= limit
          ? spreadsheets
          : spreadsheets.concat(fileInfo);
      }, []);
    },
    async getSpreadsheets(limit) {
      const foldersIds = this.folders;
      if (!foldersIds.length) {
        const opts = this.getSpreadsheetsFromFolderOpts("root");
        const { files } = await this.googleDrive.listFilesInPage(null, opts);
        return this.getSpreadsheetsFromFiles(files, limit);
      }
      return foldersIds.reduce(async (spreadsheets, folderId) => {
        const opts = this.getSpreadsheetsFromFolderOpts(folderId);
        const { files } = await this.googleDrive.listFilesInPage(null, opts);
        const nextSpreadsheets = await this.getSpreadsheetsFromFiles(files, limit);
        return (await spreadsheets).concat(nextSpreadsheets);
      }, []);
    },
  },
};
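
Downstream steps receive the emitted Drive file resource as steps.trigger.event. As a rough sketch, assuming the https://www.googleapis.com/auth/drive scope (which the Sheets API also accepts for reads) and an illustrative A1:C100 range, a Node.js step could pull the new spreadsheet's cell values, which already arrive as an array of arrays, ready for the Insert Multiple Rows action:

import { axios } from "@pipedream/platform";

export default defineComponent({
  props: {
    googleDrive: {
      type: "app",
      app: "google_drive",
    },
  },
  async run({ steps, $ }) {
    // The trigger emits the Drive file resource; its id is the spreadsheet ID
    const spreadsheetId = steps.trigger.event.id;
    const { values = [] } = await axios($, {
      url: `https://sheets.googleapis.com/v4/spreadsheets/${spreadsheetId}/values/A1:C100`,
      headers: {
        Authorization: `Bearer ${this.googleDrive.$auth.oauth_access_token}`,
      },
    });
    // A brand-new spreadsheet may be empty, so default to an empty array
    return values;
  },
});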

Trigger Configuration

This component may be configured based on the props defined in the component code. Pipedream automatically prompts for input values in the UI and CLI.
Label | Prop | Type | Description
Google Drive | googleDrive | app | This component uses the Google Drive app.
N/A | db | $.service.db | This component uses $.service.db to maintain state between executions.
N/A | http | $.interface.http | This component uses $.interface.http to generate a unique URL when the component is first instantiated. Each request to the URL will trigger the run() method of the component.
Drive | drive | string | Select a value from the drop down menu.
Folders | folders | string[] | Select a value from the drop down menu.

Trigger Authentication

Google Drive uses OAuth authentication. When you connect your Google Drive account, Pipedream will open a popup window where you can sign into Google Drive and grant Pipedream permission to connect to your account. Pipedream securely stores and automatically refreshes the OAuth tokens so you can easily authenticate any Google Drive API.

Pipedream requests the following authorization scopes when you connect your account:

https://www.googleapis.com/auth/drive

About Google Drive

Google Drive is a file storage and synchronization service which allows you to create and share your work online, and access your documents from anywhere.

Action

Description: Insert multiple rows into a table
Version: 0.1.4
Key: snowflake-insert-multiple-rows

Snowflake Overview

Snowflake offers a cloud database and related tools to help developers create robust, secure, and scalable data warehouses. See Snowflake's Key Concepts & Architecture.

Getting Started

1. Create a user, role and warehouse in Snowflake

Snowflake recommends you create a new user, role, and warehouse when you integrate a third-party tool like Pipedream. This way, you can control permissions via the user / role, and separate Pipedream compute and costs with the warehouse. You can do this directly in the Snowflake UI.

We recommend you create a read-only account if you only need to query Snowflake. If you need to insert data into Snowflake, add permissions on the appropriate objects after you create your user.
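
If you prefer to script this instead of clicking through the Snowflake UI, the sketch below shows the kind of statements involved, run from Node.js with the snowflake-sdk driver. All object names (PIPEDREAM_WH, PIPEDREAM_ROLE, PIPEDREAM_USER, MYDB.PUBLIC.EVENTS) and the account identifier are placeholders; adjust the grants to match the tables this integration will insert into.

import snowflake from "snowflake-sdk";

// Placeholder objects -- substitute your own names and a strong password
const statements = [
  "CREATE WAREHOUSE IF NOT EXISTS PIPEDREAM_WH WITH WAREHOUSE_SIZE = 'XSMALL' AUTO_SUSPEND = 60",
  "CREATE ROLE IF NOT EXISTS PIPEDREAM_ROLE",
  "CREATE USER IF NOT EXISTS PIPEDREAM_USER PASSWORD = '<strong-password>' DEFAULT_ROLE = PIPEDREAM_ROLE DEFAULT_WAREHOUSE = PIPEDREAM_WH",
  "GRANT ROLE PIPEDREAM_ROLE TO USER PIPEDREAM_USER",
  "GRANT USAGE ON WAREHOUSE PIPEDREAM_WH TO ROLE PIPEDREAM_ROLE",
  "GRANT USAGE ON DATABASE MYDB TO ROLE PIPEDREAM_ROLE",
  "GRANT USAGE ON SCHEMA MYDB.PUBLIC TO ROLE PIPEDREAM_ROLE",
  // Only needed because this integration inserts rows; omit for read-only access
  "GRANT SELECT, INSERT ON TABLE MYDB.PUBLIC.EVENTS TO ROLE PIPEDREAM_ROLE",
];

const connection = snowflake.createConnection({
  account: "myorg-myaccount", // placeholder account identifier
  username: "ADMIN_USER",     // a user allowed to create roles, users, and warehouses
  password: process.env.SNOWFLAKE_ADMIN_PASSWORD,
});

// Run the statements in order, since the grants depend on the objects created first
const exec = (sqlText) =>
  new Promise((resolve, reject) => {
    connection.execute({
      sqlText,
      complete: (err, _stmt, rows) => (err ? reject(err) : resolve(rows)),
    });
  });

connection.connect(async (err) => {
  if (err) throw err;
  for (const sqlText of statements) {
    await exec(sqlText);
    console.log(`OK: ${sqlText}`);
  }
  connection.destroy(() => {});
});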

2. Enter those details in Pipedream

Visit https://pipedream.com/accounts. Click the button to Connect an App. Enter the required Snowflake account data.

You'll only need to connect your account once in Pipedream. You can connect this account to multiple workflows to run queries against Snowflake, insert data, and more.

3. Build your first workflow

Visit https://pipedream.com/new to build your first workflow. Pipedream workflows let you connect Snowflake with 3,000+ other apps. You can trigger workflows on Snowflake queries, sending results to Slack, Google Sheets, or any app that exposes an API. Or you can accept data from another app, transform it with Python, Node.js, Go or Bash code, and insert it into Snowflake.
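
As a concrete example of that transform step, the Node.js sketch below reshapes records from a hypothetical earlier step (steps.fetch_orders, with made-up field names) into the array of arrays the Insert Multiple Rows action expects; you would then reference this step's return value from the action's Row Values field.

export default defineComponent({
  async run({ steps }) {
    // Hypothetical previous step returning an array of objects
    const records = steps.fetch_orders.$return_value ?? [];

    // The Snowflake action expects an array of arrays, one inner array per row.
    // Keep the element order aligned with the Columns you select on the action.
    return records.map((r) => [
      r.id,
      r.customer_email,
      Number(r.total),
      r.created_at,
    ]);
  },
});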

Learn more at Pipedream University

Action Code

import snowflake from "../../snowflake.app.mjs";
import { ConfigurationError } from "@pipedream/platform";

export default {
  type: "action",
  key: "snowflake-insert-multiple-rows",
  name: "Insert Multiple Rows",
  description: "Insert multiple rows into a table",
  version: "0.1.4",
  annotations: {
    destructiveHint: false,
    openWorldHint: true,
    readOnlyHint: false,
  },
  props: {
    snowflake,
    database: {
      propDefinition: [
        snowflake,
        "database",
      ],
    },
    schema: {
      propDefinition: [
        snowflake,
        "schema",
        (c) => ({
          database: c.database,
        }),
      ],
    },
    tableName: {
      propDefinition: [
        snowflake,
        "tableName",
        (c) => ({
          database: c.database,
          schema: c.schema,
        }),
      ],
      description: "The table where you want to add rows",
    },
    columns: {
      propDefinition: [
        snowflake,
        "columns",
        (c) => ({
          tableName: c.tableName,
        }),
      ],
    },
    values: {
      propDefinition: [
        snowflake,
        "values",
      ],
    },
    batchSize: {
      type: "integer",
      label: "Batch Size",
      description: "Number of rows to process per batch. Automatically calculated based on data size if not specified. Recommended: `50-200` for wide tables, `100-500` for narrow tables.",
      optional: true,
      default: 100,
      min: 10,
      max: 1000,
    },
    maxPayloadSizeMB: {
      type: "integer",
      label: "Max Payload Size (MB)",
      description: "Maximum payload size per batch in MB. Helps prevent `413 Payload Too Large` errors.",
      optional: true,
      default: 5,
      min: 1,
      max: 10,
    },
    enableBatching: {
      type: "boolean",
      label: "Enable Batch Processing",
      description: "Enable automatic batch processing for large datasets. Disable only for small datasets (< 50 rows) or troubleshooting.",
      optional: true,
      default: true,
    },
  },
  async run({ $ }) {
    let rows = this.values;
    let inputValidated = true;

    if (!Array.isArray(rows)) {
      try {
        rows = JSON.parse(rows);
      } catch (parseError) {
        throw new ConfigurationError("The row data could not be parsed as JSON. Please ensure it's a valid JSON array of arrays.");
      }
    }

    if (!rows || !rows.length || !Array.isArray(rows)) {
      inputValidated = false;
    } else {
      rows.forEach((row, index) => {
        if (!Array.isArray(row)) {
          console.log(`Row ${index + 1} is not an array:`, row);
          inputValidated = false;
        }
      });
    }

    // Throw an error if input validation failed
    if (!inputValidated) {
      throw new ConfigurationError("The row data you passed is not an array of arrays. Please enter an array of arrays in the `Values` parameter above. If you're trying to add a single row to Snowflake, select the **Insert Single Row** action.");
    }

    const expectedColumnCount = this.columns.length;
    const invalidRows = rows.filter((row, index) => {
      if (row.length !== expectedColumnCount) {
        console.error(`Row ${index + 1} has ${row.length} values but ${expectedColumnCount} columns specified`);
        return true;
      }
      return false;
    });

    if (invalidRows.length > 0) {
      throw new ConfigurationError(`${invalidRows.length} rows have a different number of values than the specified columns.
Each row must have exactly ${expectedColumnCount} values to match the selected columns.`);
    }

    // Add batch processing options
    const batchOptions = {
      batchSize: this.batchSize,
      maxPayloadSizeMB: this.maxPayloadSizeMB,
      enableBatching: this.enableBatching,
    };

    try {
      const response = await this.snowflake.insertRows(
        this.tableName,
        this.columns,
        rows,
        batchOptions,
      );

      // Handle different response formats (batched vs single insert)
      if (response.summary) {
        // Batched response
        const { summary } = response;
        $.export("$summary", `Successfully inserted ${summary.totalRowsProcessed} rows into ${this.tableName} using ${summary.totalBatches} batches`);

        // Export detailed batch information
        $.export("batchDetails", {
          totalRows: summary.totalRows,
          totalBatches: summary.totalBatches,
          successfulBatches: summary.successfulBatches,
          failedBatches: summary.failedBatches,
          batchSize: summary.batchSize,
          processingTime: new Date().toISOString(),
        });

        // Export batch results for debugging if needed
        $.export("batchResults", summary.results);
        return response;
      } else {
        // Single insert response (small dataset or batching disabled)
        $.export("$summary", `Successfully inserted ${rows.length} rows into ${this.tableName}`);
        return response;
      }
    } catch (error) {
      // Enhanced error handling for batch processing
      if (error.summary) {
        // Partial failure in batch processing
        const { summary } = error;
        $.export("$summary", `Partial success: ${summary.totalRowsProcessed}/${summary.totalRows} rows inserted. ${summary.failedBatches} batches failed.`);
        $.export("batchDetails", summary);
        $.export("failedBatches", summary.results.filter((r) => !r.success));
      }

      // Re-throw the error with additional context
      if (error.message.includes("413") || error.message.includes("Payload Too Large")) {
        throw new ConfigurationError(
          `Payload too large error detected. Try reducing the batch size (current: ${this.batchSize}) or enable batching if disabled. ` +
          `You're trying to insert ${rows.length} rows with ${this.columns.length} columns each. ` +
          `Original error: ${error.message}`,
        );
      }
      throw error;
    }
  },
};
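
The heavy lifting (splitting rows into batches and issuing the inserts) happens inside this.snowflake.insertRows in snowflake.app.mjs, which is not shown on this page. As a rough illustration of how the Batch Size and Max Payload Size (MB) props interact, the helper below estimates a batch plan; it is a sketch, not the component's actual algorithm.

// Illustration only -- not the component's real batching code
function planBatches(rows, { batchSize = 100, maxPayloadSizeMB = 5 } = {}) {
  const totalBytes = Buffer.byteLength(JSON.stringify(rows), "utf8");
  const avgRowBytes = Math.ceil(totalBytes / rows.length);
  // Cap rows per batch by both the configured batch size and the payload budget
  const rowsPerBatch = Math.max(
    1,
    Math.min(batchSize, Math.floor((maxPayloadSizeMB * 1024 * 1024) / avgRowBytes)),
  );
  return {
    rowsPerBatch,
    totalBatches: Math.ceil(rows.length / rowsPerBatch),
  };
}

// Example: 10,000 narrow rows of three columns each
const rows = Array.from({ length: 10_000 }, (_, i) => [
  `id-${i}`,
  i,
  new Date().toISOString(),
]);
console.log(planBatches(rows, { batchSize: 200, maxPayloadSizeMB: 5 }));
// => { rowsPerBatch: 200, totalBatches: 50 } -- here the 5 MB cap is not the limiting factor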

Action Configuration

This component may be configured based on the props defined in the component code. Pipedream automatically prompts for input values in the UI.

Label | Prop | Type | Description
Snowflake | snowflake | app | This component uses the Snowflake app.
Database | database | string | Select a value from the drop down menu.
Schema | schema | string | Select a value from the drop down menu.
Table Name | tableName | string | Select a value from the drop down menu.
Columns | columns | string[] | Select a value from the drop down menu.
Row Values | values | string | Provide an array of arrays. Each nested array should represent a row, with each element of the nested array representing a value (e.g., passing [["Foo",1,2],["Bar",3,4]] will insert two rows of data with three columns each). The most common pattern is to reference an array of arrays exported by a previous step (e.g., {{steps.foo.$return_value}}). You may also enter or construct a string that will JSON.parse() to an array of arrays.
Batch Size | batchSize | integer | Number of rows to process per batch. Automatically calculated based on data size if not specified. Recommended: 50-200 for wide tables, 100-500 for narrow tables.
Max Payload Size (MB) | maxPayloadSizeMB | integer | Maximum payload size per batch in MB. Helps prevent 413 Payload Too Large errors.
Enable Batch Processing | enableBatching | boolean | Enable automatic batch processing for large datasets. Disable only for small datasets (< 50 rows) or troubleshooting.
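
To make the Row Values format concrete, here is a small sketch (assuming three selected columns with illustrative names) of values that pass and fail this action's validation:

// Assume three Columns are selected, e.g. FILE_ID, FILE_NAME, CREATED_AT (names are illustrative)
const valid = [
  [ "abc123", "Q3 budget", "2024-05-01T12:00:00Z" ],
  [ "def456", "Headcount", "2024-05-02T09:30:00Z" ],
];

// A string is also accepted as long as it JSON.parse()s to an array of arrays
const alsoValid = JSON.stringify(valid);

// Fails validation: this row has 2 values but 3 columns were selected,
// so the action throws a ConfigurationError before any insert runs
const invalid = [
  [ "abc123", "Q3 budget" ],
];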

Action Authentication

Snowflake uses API keys for authentication. When you connect your Snowflake account, Pipedream securely stores the keys so you can easily authenticate to Snowflake APIs in both code and no-code steps.

Snowflake recommends you create a new user, role, and warehouse when you integrate a third-party tool like Pipedream. This way, you can control permissions via the user / role, and separate Pipedream compute and costs with the warehouse. You can do this directly in the Snowflake UI.

We recommend you create a read-only account if you only need to query Snowflake. If you need to insert data into Snowflake, add permissions on the appropriate objects after you create your user.

About Snowflake

A data warehouse built for the cloud

More Ways to Connect Snowflake + Google Drive

Insert Row with Snowflake API on Changes to Specific Files (Shared Drive) from Google Drive API
Insert Row with Snowflake API on Changes to Specific Files from Google Drive API
Insert Row with Snowflake API on New or Modified Comments from Google Drive API
Insert Row with Snowflake API on New or Modified Files from Google Drive API
Insert Row with Snowflake API on New or Modified Folders from Google Drive API
Insert Row with Snowflake API on New Shared Drive from Google Drive API
Insert Row with Snowflake API on New Files (Instant) from Google Drive API
Insert Row with Snowflake API on New Presentation (Instant) from Google Drive API
Insert Row with Snowflake API on New Spreadsheet (Instant) from Google Drive API
Insert Multiple Rows with Snowflake API on Changes to Specific Files (Shared Drive) from Google Drive API
Insert Multiple Rows with Snowflake API on Changes to Specific Files from Google Drive API
Insert Multiple Rows with Snowflake API on New or Modified Comments from Google Drive API
Insert Multiple Rows with Snowflake API on New or Modified Files from Google Drive API
Insert Multiple Rows with Snowflake API on New or Modified Folders from Google Drive API
Insert Multiple Rows with Snowflake API on New Files (Instant) from Google Drive API
Insert Multiple Rows with Snowflake API on New Presentation (Instant) from Google Drive API
Insert Multiple Rows with Snowflake API on New Shared Drive from Google Drive API
Execute Query with Snowflake API on New Presentation (Instant) from Google Drive API
Execute Query with Snowflake API on Changes to Specific Files (Shared Drive) from Google Drive API
Execute Query with Snowflake API on Changes to Specific Files from Google Drive API

Changes to Files in Drive from the Google Drive API: Emit new event when a change is made to one of the specified files. See the documentation
Changes to Specific Files from the Google Drive API: Watches for changes to specific files, emitting an event when a change is made to one of those files. To watch for changes to shared drive files, use the Changes to Specific Files (Shared Drive) source instead.
Changes to Specific Files (Shared Drive) from the Google Drive API: Watches for changes to specific files in a shared drive, emitting an event when a change is made to one of those files
New Access Proposal from the Google Drive API: Emit new event when a new access proposal is requested in Google Drive
New Files (Instant) from the Google Drive API: Emit new event when a new file is added in your linked Google Drive
New Files (Shared Drive) from the Google Drive API: Emit new event when a new file is added in your shared Google Drive
New or Modified Comments (Instant) from the Google Drive API: Emit new event when a comment is created or modified in the selected file
New or Modified Files (Instant) from the Google Drive API: Emit new event when a file in the selected Drive is created, modified or trashed.
New or Modified Files (Polling) from the Google Drive API: Emit new event when a file in the selected Drive is created, modified or trashed. See the documentation
New or Modified Folders (Instant) from the Google Drive API: Emit new event when a folder is created or modified in the selected Drive
New Presentation (Instant) from the Google Drive API: Emit new event each time a new presentation is created in a drive.
New Shared Drive from the Google Drive API: Emits a new event any time a shared drive is created.
New Spreadsheet (Instant) from the Google Drive API: Emit new event when a new spreadsheet is created in a drive.

New Row from the Snowflake API: Emit new event when a row is added to a table
New Query Results from the Snowflake API: Run a SQL query on a schedule, triggering a workflow for each row of results
Failed Task in Schema from the Snowflake API: Emit new events when a task fails in a database schema
New Database from the Snowflake API: Emit new event when a database is created
New Deleted Role from the Snowflake API: Emit new event when a role is deleted
New Deleted User from the Snowflake API: Emit new event when a user is deleted
New Role from the Snowflake API: Emit new event when a role is created
New Schema from the Snowflake API: Emit new event when a schema is created
New Table from the Snowflake API: Emit new event when a table is created
New Update Role from the Snowflake API: Emit new event when a role is updated
New Update User from the Snowflake API: Emit new event when a user is updated
New Usage Monitor from the Snowflake API: Emit new event when a query is executed in the specified params
New User from the Snowflake API: Emit new event when a user is created
New, Updated, or Deleted Warehouse from the Snowflake API: Emit new events when a warehouse is created, altered, or dropped

Add Comment with the Google Drive API: Add an unanchored comment to a Google Doc (general feedback, no text highlighting). See the documentation
Copy File with the Google Drive API: Create a copy of the specified file. See the documentation for more information
Create Folder with the Google Drive API: Create a new empty folder. See the documentation for more information
Create New File From Template with the Google Drive API: Create a new Google Docs file from a template. Optionally include placeholders in the template document that will get replaced from this action. See documentation
Create New File From Text with the Google Drive API: Create a new file from plain text. See the documentation for more information
Create Shared Drive with the Google Drive API: Create a new shared drive. See the documentation for more information
Delete Comment with the Google Drive API: Delete a specific comment (requires ownership or permissions). See the documentation
Delete File with the Google Drive API: Permanently delete a file or folder without moving it to the trash. See the documentation for more information
Delete Reply with the Google Drive API: Delete a reply on a specific comment. See the documentation for more information
Delete Shared Drive with the Google Drive API: Delete a shared drive without any content. See the documentation for more information
Download File with the Google Drive API: Download a file. See the documentation for more information
Find File with the Google Drive API: Search for a specific file by name. See the documentation for more information
Find Folder with the Google Drive API: Search for a specific folder by name. See the documentation for more information
Find Forms with the Google Drive API: List Google Form documents or search for a Form by name. See the documentation for more information
Find Spreadsheets with the Google Drive API: Search for a specific spreadsheet by name. See the documentation for more information
Get Comment By ID with the Google Drive API: Get comment by ID on a specific file. See the documentation for more information
Get Current User with the Google Drive API: Retrieve Google Drive account metadata for the authenticated user via about.get, including display name, email, permission ID, and storage quota. Useful when flows or agents need to confirm the active Google identity or understand available storage. See the documentation
Get File By ID with the Google Drive API: Get info on a specific file. See the documentation for more information
Get Folder ID for a Path with the Google Drive API: Retrieve a folderId for a path. See the documentation for more information
Get Reply By ID with the Google Drive API: Get reply by ID on a specific comment. See the documentation for more information
Get Shared Drive with the Google Drive API: Get metadata for one or all shared drives. See the documentation for more information
List Access Proposals with the Google Drive API: List access proposals for a file or folder. See the documentation
List Comments with the Google Drive API: List all comments on a file. See the documentation
List Files with the Google Drive API: List files from a specific folder. See the documentation for more information
List Replies with the Google Drive API: List replies to a specific comment. See the documentation for more information
Move File with the Google Drive API: Move a file from one folder to another. See the documentation for more information
Move File to Trash with the Google Drive API: Move a file or folder to trash. See the documentation for more information
Reply to Comment with the Google Drive API: Add a reply to an existing comment. See the documentation
Resolve Access Proposals with the Google Drive API: Accept or deny a request for access to a file or folder in Google Drive. See the documentation
Resolve Comment with the Google Drive API: Mark a comment as resolved. See the documentation
Search for Shared Drives with the Google Drive API: Search for shared drives with query options. See the documentation for more information
Share File or Folder with the Google Drive API: Add a sharing permission to the sharing preferences of a file or folder and provide a sharing URL. See the documentation
Update Comment with the Google Drive API: Update the content of a specific comment. See the documentation for more information
Update File with the Google Drive API: Update a file's metadata and/or content. See the documentation for more information
Update Reply with the Google Drive API: Update a reply on a specific comment. See the documentation for more information
Update Shared Drive with the Google Drive API: Update an existing shared drive. See the documentation for more information
Upload File with the Google Drive API: Upload a file to Google Drive. See the documentation for more information

Execute SQL Query with the Snowflake API: Execute a custom Snowflake query. See our docs to learn more about working with SQL in Pipedream.
Insert Multiple Rows with the Snowflake API: Insert multiple rows into a table
Insert Single Row with the Snowflake API: Insert a row into a table
Query SQL Database with the Snowflake API: Execute a SQL Query. See our docs to learn more about working with SQL in Pipedream.

Explore Other Apps

1-24 of 3,000+ apps by most popular

Node: Anything you can do with Node.js, you can do in a Pipedream workflow. This includes using most of npm's 400,000+ packages.
Python: Anything you can do in Python can be done in a Pipedream workflow. This includes using any of the 350,000+ PyPI packages available in your Python powered workflows.
Notion: Notion is a new tool that blends your everyday work apps into one. It's the all-in-one workspace for you and your team.
OpenAI (ChatGPT): OpenAI is an AI research and deployment company with the mission to ensure that artificial general intelligence benefits all of humanity. They are the makers of popular models like ChatGPT, DALL-E, and Whisper.
Anthropic (Claude): AI research and products that put safety at the frontier. Introducing Claude, a next-generation AI assistant for your tasks, no matter the scale.
Google Sheets: Use Google Sheets to create and edit online spreadsheets. Get insights together with secure sharing in real-time and from any device.
Telegram: Telegram is a cloud-based, cross-platform, encrypted instant messaging (IM) service.
Google Drive: Google Drive is a file storage and synchronization service which allows you to create and share your work online, and access your documents from anywhere.
HTTP / Webhook: Get a unique URL where you can send HTTP or webhook requests.
Google Calendar: With Google Calendar, you can quickly schedule meetings and events and get reminders about upcoming activities, so you always know what's next.
Schedule: Trigger workflows on an interval or cron schedule.
Pipedream Utils: Utility functions to use within your Pipedream workflows.
Shopify: Shopify is a complete commerce platform that lets anyone start, manage, and grow a business. You can use Shopify to build an online store, manage sales, market to customers, and accept payments in digital and physical locations.
Supabase: Supabase is an open source Firebase alternative.
MySQL: MySQL is an open-source relational database management system.
PostgreSQL: PostgreSQL is a free and open-source relational database management system emphasizing extensibility and SQL compliance.
AWS (Premium): Amazon Web Services (AWS) offers reliable, scalable, and inexpensive cloud computing services.
Twilio SendGrid (Premium): Send marketing and transactional email through the Twilio SendGrid platform with the Email API, proprietary mail transfer agent, and infrastructure for scalable delivery.
Amazon SES: Amazon SES is a cloud-based email service provider that can integrate into any application for high volume email automation.
Klaviyo (Premium): Klaviyo unifies your data, channels, and AI agents in one platform, spanning text, WhatsApp, email marketing, and more, driving growth with every interaction.
Zendesk (Premium): Zendesk is award-winning customer service software trusted by 200K+ customers. Make customers happy via text, mobile, phone, email, live chat, social media.
ServiceNow (Premium, Beta): The smarter way to workflow.
Slack: Slack is the AI-powered platform for work, bringing all of your conversations, apps, and customers together in one place. Around the world, Slack is helping businesses of all sizes grow and send productivity through the roof.
Microsoft Teams: Microsoft Teams has communities, events, chats, channels, meetings, storage, tasks, and calendars in one place.