This project was designed to leverage **several** `Google Cloud Platform` cloud services to create a robust and automated system which can be interacted with via `C++` code.

Why `C++`? The use case for that language in the context of this project would be, for example, having a `C++` based physical device pull the latest `firmware` it needs from the cloud, or really any other kind of data.

Ultimately, a secure API to pull data from a `BigQuery` database table was designed, and it can be called from **any** programming language as long as the correct credentials are sent in `json` format to the `Auth_Func`.
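
Below is a minimal client-side sketch of such a call. It is written in `Python` for consistency with the rest of this README, but the same `POST` could be made from `C++` or any other language; the endpoint URL and credentials are placeholders, not the project's real values.

```python
import requests

# Placeholder URL - substitute the deployed Auth_Func endpoint.
AUTH_FUNC_URL = "https://REGION-PROJECT_ID.cloudfunctions.net/Auth_Func"

# Credentials for a user that already exists in Firebase Authentication.
credentials = {"email": "name@domain.com", "password": "my-password"}

# Auth_Func validates the json payload and, if every check passes,
# returns the stringified list of values produced by Compute_Func.
response = requests.post(AUTH_FUNC_URL, json=credentials, timeout=30)
print(response.status_code)
print(response.text)
```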
The programming language `Python` is utilized to create the entire backend component on `Google Cloud Platform`.

`Firebase` is also utilized as part of the authentication system.

------
**`Google Cloud Platform` technologies:**

- `Artifact Registry` - Stores the latest Docker image builds for `Cloud Run`
- `BigQuery` - Data warehouse which stores the table of data we interact with
- `Cloud Build` - Monitors the GitHub repo and automates the process of deploying new code into the cloud
- `Cloud Functions (Gen 2)` - The backend for the project. Within each Cloud Function is the `Python` code.
- `Cloud Run` - Generation 2 of `Cloud Functions` is actually `Cloud Run` behind the scenes, so the `Python` code executes here.
- `Cloud Scheduler` - Cron job scheduler for any job in the cloud
- `Cloud Storage` - Stores the latest `Python` code for the `Cloud Functions`
- `Secret Manager` - Secure storage system for sensitive data

**`Firebase` technologies:**

- `Authentication` - Stores the user account data

-----
### Cloud Functions
1. `Auth_Func` - Publicly Accessible (could be private as well)
2. `Compute_Func` - Private
3. `Insert_Int_Func` - Private

Let's talk about each:

1. `Auth_Func` - The authentication system.

- Everything starts here. Before any data can be extracted from the `BigQuery` database table, a `request` from the user has to pass multiple checks before the code can proceed:

  1. The `request` cannot be empty and the `json` must have the correct keys, `email` and `password`.
  2. The value of the `email` key of the `json` must be a string.
  3. The value of the `email` key of the `json` must be in the correct format, e.g. `"name@domain.com"`.
  4. The value of the `password` key of the `json` must be a string.
  5. The `email` and `password` must correspond to an existing user within `Firebase Authentication`.

- If everything checks out, then the `Compute_Func` `Cloud Function` gets called directly from the `Auth_Func`, as sketched below.
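
A minimal sketch of those checks, assuming a `Cloud Functions (Gen 2)` HTTP handler built with `functions_framework`; the URLs, API-key handling, and the exact Firebase sign-in call are illustrative assumptions rather than the project's actual code:

```python
import re

import functions_framework
import requests

EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

# Placeholder values - in the real project these would come from
# configuration / Secret Manager, not hard-coded constants.
COMPUTE_FUNC_URL = "https://REGION-PROJECT_ID.cloudfunctions.net/Compute_Func"
FIREBASE_API_KEY = "..."


@functions_framework.http
def auth_func(request):
    payload = request.get_json(silent=True)

    # 1. The request cannot be empty and must contain the expected keys.
    if not payload or "email" not in payload or "password" not in payload:
        return ("Bad request", 400)

    email, password = payload["email"], payload["password"]

    # 2. and 4. Both values must be strings.
    if not isinstance(email, str) or not isinstance(password, str):
        return ("Bad request", 400)

    # 3. The email must be in a valid format, e.g. "name@domain.com".
    if not EMAIL_RE.match(email):
        return ("Bad request", 400)

    # 5. The credentials must belong to an existing Firebase Authentication
    #    user. One way to verify this is the Identity Toolkit REST endpoint.
    sign_in = requests.post(
        "https://identitytoolkit.googleapis.com/v1/accounts:signInWithPassword",
        params={"key": FIREBASE_API_KEY},
        json={"email": email, "password": password, "returnSecureToken": True},
        timeout=30,
    )
    if sign_in.status_code != 200:
        return ("Unauthorized", 401)

    # All checks passed: call Compute_Func and relay its result. A real call
    # to a private function would also attach an identity token.
    result = requests.post(COMPUTE_FUNC_URL, timeout=30)
    return (result.text, result.status_code)
```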
2. `Compute_Func` - The computation system (computation in the sense of enumerating, listing out).

- The job of the `Compute_Func` is simply to extract the values of the 5 most recent records from a `BigQuery` table.
- Those values are then placed into a list.
- That list is converted to a string, and the string is returned to the `Auth_Func` (see the sketch below).
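
A sketch of that query logic; the table name and the `value` / `inserted_at` column names are assumptions for illustration, since the real schema is not part of this README:

```python
import functions_framework
from google.cloud import bigquery

# Placeholder table - the real project/dataset/table names differ.
TABLE = "my-project.my_dataset.my_table"


@functions_framework.http
def compute_func(request):
    client = bigquery.Client()

    # Pull the values of the 5 most recent records, assuming "most recent"
    # means ordering by a timestamp column.
    query = f"""
        SELECT value
        FROM `{TABLE}`
        ORDER BY inserted_at DESC
        LIMIT 5
    """
    rows = client.query(query).result()

    # Place the values into a list, convert the list to a string,
    # and return that string to the caller (Auth_Func).
    values = [row["value"] for row in rows]
    return str(values)
```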
3. `Insert_Int_Func` - The data insertion system.

- The `Insert_Int_Func` operates independently from the `Auth_Func` and `Compute_Func`.
- `Insert_Int_Func` is triggered every **6 hours** by `Cloud Scheduler`.
- A randomized integer in the range 0-75 is inserted into a `BigQuery` table, as sketched below.
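
A sketch of that insertion; the table and column names are the same illustrative assumptions as above, and only the 0-75 range and the 6-hour schedule come from this README:

```python
import random
from datetime import datetime, timezone

import functions_framework
from google.cloud import bigquery

# Placeholder table - the real project/dataset/table names differ.
TABLE = "my-project.my_dataset.my_table"


@functions_framework.http
def insert_int_func(request):
    """Invoked by Cloud Scheduler, e.g. a "0 */6 * * *" cron schedule (every 6 hours)."""
    client = bigquery.Client()

    # Randomized integer in the 0-75 range, plus an assumed timestamp column
    # so that the "most recent" records can be ordered later.
    row = {
        "value": random.randint(0, 75),
        "inserted_at": datetime.now(timezone.utc).isoformat(),
    }

    # Stream the single row into the BigQuery table.
    errors = client.insert_rows_json(TABLE, [row])
    if errors:
        return (f"Insert failed: {errors}", 500)
    return "OK"
```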