
Commit 0185f4a

Authored by: topefolorunso, octavia-squidington-iii, natikgadzhi, ChristoGrab, darynaishchenko
✨Source Youtube Analytics - Migrate Python CDK to Low-code CDK to Manifest-only (#42838)
Co-authored-by: Octavia Squidington III <octavia-squidington-iii@users.noreply.github.com>
Co-authored-by: Natik Gadzhi <natik@respawn.io>
Co-authored-by: Christo Grabowski <108154848+ChristoGrab@users.noreply.github.com>
Co-authored-by: darynaishchenko <darina.ishchenko17@gmail.com>
Co-authored-by: Daryna Ishchenko <80129833+darynaishchenko@users.noreply.github.com>
1 parent: c6c51eb · commit: 0185f4a

34 files changed: +1985 −4297 lines

airbyte-integrations/connectors/source-youtube-analytics/README.md

Lines changed: 28 additions & 53 deletions
@@ -1,89 +1,64 @@
-# Youtube-Analytics source connector
+# Youtube analytics source connector
 
+This directory contains the manifest-only connector for `source-youtube-analytics`.
+This _manifest-only_ connector is not a Python package on its own, as it runs inside of the base `source-declarative-manifest` image.
 
-This is the repository for the Youtube-Analytics source connector, written in Python.
-For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/sources/youtube-analytics).
+For information about how to configure and use this connector within Airbyte, see [the connector's full documentation](https://docs.airbyte.com/integrations/sources/youtube-analytics).
 
 ## Local development
 
-### Prerequisites
-* Python (~=3.9)
-* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation)
+We recommend using the Connector Builder to edit this connector.
+Using either Airbyte Cloud or your local Airbyte OSS instance, navigate to the **Builder** tab and select **Import a YAML**.
+Then select the connector's `manifest.yaml` file to load the connector into the Builder. You're now ready to make changes to the connector!
 
+If you prefer to develop locally, you can follow the instructions below.
 
-### Installing the connector
-From this connector directory, run:
-```bash
-poetry install --with dev
-```
-
-
-### Create credentials
-**If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/youtube-analytics)
-to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_youtube_analytics/spec.yaml` file.
-Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
-See `sample_files/sample_config.json` for a sample config file.
-
-
-### Locally running the connector
-```
-poetry run source-youtube-analytics spec
-poetry run source-youtube-analytics check --config secrets/config.json
-poetry run source-youtube-analytics discover --config secrets/config.json
-poetry run source-youtube-analytics read --config secrets/config.json --catalog sample_files/configured_catalog.json
-```
+### Building the docker image
 
-### Running unit tests
-To run unit tests locally, from the connector directory run:
-```
-poetry run pytest unit_tests
-```
+You can build any manifest-only connector with `airbyte-ci`:
 
-### Building the docker image
 1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md)
 2. Run the following command to build the docker image:
+
 ```bash
 airbyte-ci connectors --name=source-youtube-analytics build
 ```
 
 An image will be available on your host with the tag `airbyte/source-youtube-analytics:dev`.
 
+### Creating credentials
+
+**If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/sources/youtube-analytics)
+to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `spec` object in the connector's `manifest.yaml` file.
+Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
 
 ### Running as a docker container
-Then run any of the connector commands as follows:
-```
+
+Then run any of the standard source connector commands:
+
+```bash
 docker run --rm airbyte/source-youtube-analytics:dev spec
 docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-youtube-analytics:dev check --config /secrets/config.json
 docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-youtube-analytics:dev discover --config /secrets/config.json
 docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-youtube-analytics:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
 ```
 
-### Running our CI test suite
+### Running the CI test suite
+
 You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md):
+
 ```bash
 airbyte-ci connectors --name=source-youtube-analytics test
 ```
 
-### Customizing acceptance Tests
-Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information.
-If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py.
-
-### Dependency Management
-All of your dependencies should be managed via Poetry.
-To add a new dependency, run:
-```bash
-poetry add <package-name>
-```
+## Publishing a new version of the connector
 
-Please commit the changes to `pyproject.toml` and `poetry.lock` files.
+If you want to contribute changes to `source-youtube-analytics`, here's how you can do that:
 
-## Publishing a new version of the connector
-You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?
-1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-youtube-analytics test`
-2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)):
+1. Make your changes locally, or load the connector's manifest into Connector Builder and make changes there.
+2. Make sure your changes are passing our test suite with `airbyte-ci connectors --name=source-youtube-analytics test`
+3. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)):
 - bump the `dockerImageTag` value in `metadata.yaml`
-- bump the `version` value in `pyproject.toml`
-3. Make sure the `metadata.yaml` content is up to date.
 4. Make sure the connector documentation and its changelog are up to date (`docs/integrations/sources/youtube-analytics.md`).
 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention).
 6. Pat yourself on the back for being an awesome contributor.
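
The updated README points contributors at the `spec` object in `manifest.yaml` for the exact shape of `secrets/config.json`. As a rough sketch (the field names below are assumptions, not values taken from this commit), a YouTube Analytics config typically holds Google OAuth credentials and could be scaffolded like this:

```python
# Hypothetical scaffold for secrets/config.json. The field names are
# assumptions; the authoritative schema is the `spec` object in manifest.yaml.
import json
from pathlib import Path

sample_config = {
    "credentials": {
        "client_id": "<google-oauth-client-id>",
        "client_secret": "<google-oauth-client-secret>",
        "refresh_token": "<google-oauth-refresh-token>",
    }
}

Path("secrets").mkdir(exist_ok=True)
Path("secrets/config.json").write_text(json.dumps(sample_config, indent=2))
```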

airbyte-integrations/connectors/source-youtube-analytics/acceptance-test-config.yml

Lines changed: 1 addition & 1 deletion
@@ -4,7 +4,7 @@ connector_image: airbyte/source-youtube-analytics:dev
 acceptance_tests:
   spec:
     tests:
-      - spec_path: "source_youtube_analytics/spec.json"
+      - spec_path: "manifest.yaml"
   connection:
     tests:
       - config_path: "secrets/config.json"
Lines changed: 112 additions & 0 deletions
@@ -0,0 +1,112 @@
#
# Copyright (c) 2023 Airbyte, Inc., all rights reserved.
#
import csv
import io
from dataclasses import dataclass
from typing import Any, Callable, Generator, Mapping, MutableMapping, Optional, Union

import requests

from airbyte_cdk import Decoder
from airbyte_cdk.sources.declarative.migrations.state_migration import StateMigration
from airbyte_cdk.sources.declarative.requesters.http_requester import HttpRequester
from airbyte_cdk.sources.types import StreamSlice, StreamState


@dataclass
class CustomDecoder(Decoder):
    def is_stream_response(self) -> bool:
        return False

    def decode(self, response: requests.Response) -> Generator[MutableMapping[str, Any], None, None]:
        fp = io.StringIO(response.text)
        reader = csv.DictReader(fp)
        for record in reader:
            yield record


@dataclass
class JobRequester(HttpRequester):
    """
    Sends request to create a report job if it doesn't exist yet.
    """

    JOB_NAME = "Airbyte reporting job"

    def send_request(
        self,
        stream_state: Optional[StreamState] = None,
        stream_slice: Optional[StreamSlice] = None,
        next_page_token: Optional[Mapping[str, Any]] = None,
        path: Optional[str] = None,
        request_headers: Optional[Mapping[str, Any]] = None,
        request_params: Optional[Mapping[str, Any]] = None,
        request_body_data: Optional[Union[Mapping[str, Any], str]] = None,
        request_body_json: Optional[Mapping[str, Any]] = None,
        log_formatter: Optional[Callable[[requests.Response], Any]] = None,
    ) -> Optional[requests.Response]:
        response = super().send_request(
            stream_state,
            stream_slice,
            next_page_token,
            path,
            request_headers,
            request_params,
            request_body_data,
            request_body_json,
            log_formatter,
        )

        stream_job = [r for r in response.json()["jobs"] if r["reportTypeId"] == self._parameters["report_type_id"]]

        if not stream_job:
            self._http_client.send_request(
                http_method="post",
                url=self._get_url(
                    path=path,
                    stream_state=stream_state,
                    stream_slice=stream_slice,
                    next_page_token=next_page_token,
                ),
                request_kwargs={"stream": self.stream_response},
                headers=self._request_headers(stream_state, stream_slice, next_page_token, request_headers),
                json={"name": self.JOB_NAME, "reportTypeId": self._parameters["report_id"]},
                dedupe_query_params=True,
                log_formatter=log_formatter,
                exit_on_rate_limit=self._exit_on_rate_limit,
            )
            response = super().send_request(
                stream_state,
                stream_slice,
                next_page_token,
                path,
                request_headers,
                request_params,
                request_body_data,
                request_body_json,
                log_formatter,
            )

        return response


class ReportsStateMigration(StateMigration):
    def should_migrate(self, stream_state: Mapping[str, Any]) -> bool:
        return stream_state.get("state") or stream_state.get("date")

    def migrate(self, stream_state: Mapping[str, Any]) -> Mapping[str, Any]:
        if stream_state.get("date"):
            # old format state before migration to low code
            cursor_value = str(stream_state["date"])
            stream_state = {
                "state": {"date": cursor_value},
                "parent_state": {"report": {"state": {"date": cursor_value}, "lookback_window": 0}},
            }
            return stream_state

        cursor_value = stream_state["state"]
        cursor_value["date"] = str(cursor_value["date"])
        stream_state["parent_state"]["report"]["state"] = cursor_value
        stream_state["parent_state"]["report"]["lookback_window"] = 0
        return stream_state
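
To make the intent of `ReportsStateMigration` concrete, here is a minimal usage sketch (not part of the commit; the import path is an assumption about where the custom components live):

```python
# Minimal sketch of the state migration above (not part of the commit).
# The import path is assumed; adjust it to wherever the components module lives.
from components import ReportsStateMigration

# Legacy Python-CDK state kept only a report date cursor.
legacy_state = {"date": "20240101"}

migration = ReportsStateMigration()
assert migration.should_migrate(legacy_state)

# migrate() wraps the old cursor into the low-code substream state shape:
# {"state": {"date": "20240101"},
#  "parent_state": {"report": {"state": {"date": "20240101"}, "lookback_window": 0}}}
print(migration.migrate(legacy_state))
```

The other two classes cover behavior the declarative framework does not express on its own and are meant to be referenced from the connector's `manifest.yaml`: `CustomDecoder` parses the CSV report payloads, and `JobRequester` creates the "Airbyte reporting job" once if no job for the stream's report type exists yet.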

airbyte-integrations/connectors/source-youtube-analytics/main.py

Lines changed: 0 additions & 9 deletions
This file was deleted.
