Why Build an Apache Kafka® Connector
Series: Building Kafka Connectors, the Why and How
Part 1: Why Build a Kafka Connector
Part 2: How to Build a Kafka Connector
Speakers
Sree Karuthody, Sr. Manager, Technology Partnerships
Jeff Bean, Partner Solution Architect
Sid Rabindran, Dir., Partner Programs and Tech Ecosystem Partners
Lisa Sensmeier, Partner Marketing
Agenda
What is Event Streaming
Kafka and Connecting to Kafka
Value of building a connector: key business drivers behind connecting to Kafka
The Verified Integrations Program: benefits, how to participate, and process
Q and A
What is Event Streaming
The New Enterprise Reality
Innovate or be disrupted
● Deliver new, real-time customer experiences
● Create new business models
● Deliver massive internal efficiencies & risk reduction
Every company is a software company
● Capital One: 10,000 of 40,000 employees are software engineers
● Goldman Sachs: 1.5B lines of code across 7,000+ applications
Innovation is about events, not “data”
● Event = something happened
● Your business = a continually updating stream of events
● Your success = your ability to respond to these events
The Rise of Event Streaming
Data as a continuous stream of events
60% of Fortune 100 companies use Apache Kafka
Event Streaming Enables New Outcomes
Auto / Transport, without event streaming: call for driver availability; no knowledge of driver arrival; no data on feature usage. With event streaming: real-time driver-rider match; real-time ETA; real-time sensor diagnostics.
Banking, without event streaming: nightly updated account balance; batch fraud checks; batch regulatory reporting. With event streaming: real-time account updates; real-time credit card fraud alerts; real-time regulatory reporting.
Retail, without event streaming: post-order “out of stock” emails; no upsell through personalization; batch point-of-sale reports. With event streaming: real-time inventory; real-time recommendations; real-time sales reporting.
Confluent is Enabling Event-driven Transformation across Industries
Healthcare & Pharma: patient monitoring, prescription control, lab alerts, medication tracking
Banking & Capital Markets: Customer 360, fraud detection, mainframe offload, trade data capture
Retail: inventory management, product catalog, A/B testing, customized experiences
Telecommunications: personalized ads, Customer 360, network integrity
Automotive & Transportation: connected car, fleet management, manufacturing data processing
Travel & Leisure: visitor segmentation, booking systems, pricing services, fraud detection
Why Confluent
Confluent Enables Your Event Streaming Success
Confluent founders are the original creators of Kafka
The Confluent team wrote 80% of Kafka commits
Confluent Platform extends Apache Kafka to be a secure, enterprise-ready platform
Confluent helps enterprises successfully deploy event streaming at scale and accelerate time to market
Hall of Innovation CTO Innovation Award Winner 2019
Confluent Platform: a Complete Event Streaming Platform
Operations and Security: security plugins | Role-Based Access Control | Control Center | Replicator | Auto Data Balancer | Operator
Development & Stream Processing: Connectors | Clients | REST Proxy | MQTT Proxy | Schema Registry | KSQL | Connect | Streams
Apache Kafka: continuous commit log
Mission-critical reliability, backed by support, services, training & partners
Freedom of choice: datacenter or public cloud; self-managed software or the fully managed Confluent Cloud service
With Confluent: a Universal Event Pipeline
Contextual event-driven applications: real-time inventory, real-time fraud detection, real-time Customer 360, machine learning models, real-time data transformation, and more
Connected via Streams, Connect, and Clients to data stores, logs, 3rd-party apps, and custom apps/microservices
Kafka and Connecting to Kafka
Without Kafka and Confluent
Confluent Platform: enterprise streaming platform built by the original creators of Apache Kafka
● End-to-end monitoring and analytics
● Enterprise-grade security
● Multi-datacenter replication
● SQL-based stream processing
● Cloud-native deployment with a Kubernetes operator
● Run as self-managed software or as a fully managed service with Confluent Cloud
Connects the edge, cloud, datacenter, data lakes, databases, IoT, SaaS apps, mobile, microservices, and machine learning
Deploy and Stream with Confluent on Any Cloud
Confluent Platform: the enterprise distribution of Apache Kafka; self-managed software you can deploy on any platform, on-premises or in public clouds
Confluent Cloud: Apache Kafka re-engineered for the cloud; a fully managed service available on the leading public clouds
Apache Kafka™ Connect API – Streaming Data Capture
Connectors link sources (JDBC, Mongo, MySQL, ...) and sinks (Elastic, Cassandra, HDFS, ...) to the Kafka pipeline
● Fault tolerant
● Manages hundreds of data sources and sinks
● Preserves data schema
● Part of the Apache Kafka project
● Integrated within Confluent Platform’s Control Center
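As a sketch of how such a connector is used in practice, the JSON below could be POSTed to the Kafka Connect REST API to start a JDBC source connector streaming a database table into Kafka. The connector name, connection URL, table, and topic prefix are illustrative assumptions, not values from this deck:

```json
{
  "name": "jdbc-orders-source",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
    "tasks.max": "1",
    "connection.url": "jdbc:mysql://db.example.com:3306/shop",
    "mode": "incrementing",
    "incrementing.column.name": "id",
    "table.whitelist": "orders",
    "topic.prefix": "mysql-"
  }
}
```

With a config like this, Connect polls the `orders` table for rows with a growing `id` column and publishes each new row to the `mysql-orders` topic, handling offsets and task failover itself.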
Confluent Hub
Connectors are posted as Verified Partner Integrations or as Confluent-provided and supported connectors
confluent.io/hub
Value of building a connector
Leverage the Kafka and Confluent Ecosystem
Search/log, alternative processing, databases/data stores (JDBC), data lake/data warehouse, file/messaging/custom (MQTT, FTP, JMS, SQS), and container/deployment platforms
New Opportunities
Kafka adoption continues to grow
● Engage with Early Adopter and Early Majority Kafka users
● Help customers access new, event-driven applications
● Expand your reach into new markets, new use cases, and new opportunities
● Extend the use of your product with a more complete solution offering
Go to Market with Confluent: Marketing Benefits
● Visibility and access on Confluent Hub
● Joint go-to-market activities, including logo exchange, a blog opportunity, online talks, field events, and lead sharing
● Marketing development fund availability
● Joint collateral and joint tutorials
● Confluent sales enablement activities
What our Partners are Saying

"Customers want to ingest data streams in real time from Apache Kafka into Kinetica for immediate action and analysis. Because the Confluent Platform adds significant value for enterprises, we built out the Kinetica connector using Connect APIs, offering a deeper level of integration with the Confluent Platform." -- Irina Farooq, CPO, Kinetica

“Neo4j and Confluent share a customer base that is determined to push the boundaries of detecting data connections as interactions and events occur. Driven by customer need to realize more value from their streaming data, we have integrated Neo4j and Kafka, both as a sink and a source, in a Confluent setup. As a result, Confluent and Neo4j customers will be able to power their financial fraud investigations, social media analyses, network & IT management use cases, and more with real-time graph analysis.” -- Philip Rathle, VP of Products, Neo4j

"A Verified Gold Connector with Confluent Platform is important to our customers who want a validated and optimized way to enable operational data flows to and from Couchbase, an enterprise-class NoSQL database, and Kafka. With the Kafka Connect API, we have a deeper level of integration with the Confluent Platform, so together our joint solution is truly enterprise ready for any application modernization or cloud-native application initiative." -- Anthony Farinha, Senior Director, Business Development, Couchbase

“In collaboration with Confluent we developed a Verified Gold connector that enables our customers to achieve the highest throughput rates possible. It also enables highly secure, resilient, and flexible connections between DataStax database products built on Apache Cassandra™ and Confluent’s event streaming platform. We promised our joint enterprise customers a fully supported microservices-based application stack, and this partnership delivers on that promise.” -- Kathryn Erickson, Senior Director of Strategic Partnerships, DataStax

"Imply is a real-time analytics solution, built on Apache Druid, to store, query, and visualize event-driven data. Connecting Imply to Kafka and Confluent Platform enables high-throughput streaming and sub-second interactive queries at scale. Together they enable enterprise data applications to analyze clickstreams, user behavior, network telemetry, and more." -- Gian Merlino, Chief Technology Officer, Imply

"Attunity, a division of Qlik, partners with Confluent to provide technologies that solve the very real problem of streaming data continuously so you can run your business in real time." -- Itamar Ankorion, Managing Director, Data Integration and SVP, Technology Alliances, Qlik
Verified Integrations Program
https://www.confluent.io/verified-integrations-program/
What is the Verified Integrations Program?
Helps you build and verify your connector or integration, at no cost to the partner
Verification Levels
Gold: Kafka Connect API
Standard: other integrations, such as:
● Consumer / producer
● Stream processor
● Platforms
● Complementary products
How We Help
● Best practices documentation
● Dev/test assistance
● Verification service and support
Program Benefits: TSANet
● Multi-vendor case management process and tool
● Solves multi-vendor problems faster and easier
● Alleviates customer concerns about finger pointing
Why Gold? The Best Integration for Confluent Platform
The Kafka Connect API is a free, easy-to-use, fault-tolerant framework and the commercial standard for Apache Kafka in Confluent Platform:
● Standardized sources and sinks
● Schema Registry integration
● Control Center integration
● Better user experience
● Better scalability
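To illustrate the Schema Registry integration, a Connect worker (or an individual connector, via per-connector overrides as sketched below) can be configured to serialize record values as Avro, with schemas registered and versioned automatically. The Schema Registry URL here is an assumption for illustration:

```json
{
  "key.converter": "org.apache.kafka.connect.storage.StringConverter",
  "value.converter": "io.confluent.connect.avro.AvroConverter",
  "value.converter.schema.registry.url": "http://schema-registry.example.com:8081"
}
```

Because converters are configured on the worker or connector rather than coded into the connector itself, the same connector can emit Avro, JSON, or other formats without any changes, which is part of what the standardized source/sink model buys partners.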
Process
1. Initiated: discussions to start the process
2. Guidance: building the integration or connector
3. Submitted: start the testing with Confluent
4. Verified: published on the Confluent Hub; start demand-generation activities
Documentation Requirements
Gold Verification Guide: criteria detail, best practices, testing, quickstart guide / tutorial
Standard Verification Guide: criteria detail, integration points, quickstart / tutorial
How to Submit
Gold: contact info; connector; documentation (product brief, end-to-end usage, version info, config info, testing and results)
Standard: contact info; software package; documentation (product brief, evaluation guide, demonstration checklist)
Building the Connector
Have a connector? Review the checklist, contact us to discuss, and submit your connector.
New connector? Review the Connector Verification Guide and contact us to start the process.
Sign up or contact us: confluent.io/verified-integrations-program
Q&A
Sign up with questions or to start the process: Verified Integrations Program, confluent.io/verified-integrations-program/
Attend the online talk, part 2: How to Build a Kafka Connector, confluent.io/online-talks/
Kafka Summit San Francisco: kafka-summit.org/ (code KS19Online25 for 25% off)