Commit 962de9c (1 parent: d4f9617)

fix Tensorflow -> TensorFlow (#2783)
File tree

25 files changed (+34 −34 lines)


doc/design/alps_submitter.md

Lines changed: 1 addition & 1 deletion
@@ -158,7 +158,7 @@ LABEL class
 ```
 
 ### Semantic Analyze
-Feature Expressions except for Tensorflow Feature Column API should raise an error.
+Feature Expressions except for TensorFlow Feature Column API should raise an error.
 ```sql
 /* Not supported */
 select * from kaggle_credit_fraud_training_data
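The semantic-analyze rule in the hunk above can be sketched as a small validation step. This is a minimal, hypothetical sketch: the analyzer is assumed to see each feature expression as a call name, and the set of allowed calls below is illustrative, not ALPS's actual Feature Column list.

```python
# Hypothetical sketch of the semantic check: any feature expression outside
# the TensorFlow Feature Column API raises an error. The allowed-call set
# is illustrative only.
TF_FEATURE_COLUMN_CALLS = {"DENSE", "SPARSE", "NUMERIC", "BUCKET", "CROSS", "EMBEDDING"}

def check_feature_expr(call_name):
    """Raise for a feature expression outside the TensorFlow Feature Column API."""
    if call_name.upper() not in TF_FEATURE_COLUMN_CALLS:
        raise ValueError(
            "unsupported feature expression %r: only TensorFlow "
            "Feature Column API calls are allowed" % call_name)
```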

doc/design/codegen_couler_use_ir.md

Lines changed: 3 additions & 3 deletions
@@ -10,12 +10,12 @@ In this design, we'll only have `codegen_couler.go` to generate programs to run
 
 For example, to run the training steps, we intend to use couler API call: `couler.{xgboost,tensorflow,elasticdl}.train(model_def, data)` to generate a training step. The `train` function must take lots of parameters to run the training job as the SQLFlow statement describes, see [here](https://github.com/sql-machine-learning/sqlflow/blob/develop/python/runtime/tensorflow/train.py#L52) as an example.
 
-To implement the single `codegen_couler.go` to support generate code that can run either Tensorflow/XGBoost/ElasticDL/ALPS programs, we have below two choices:
+To implement the single `codegen_couler.go` to support generate code that can run either TensorFlow/XGBoost/ElasticDL/ALPS programs, we have below two choices:
 
 1. `couler.{tensorflow/xgboost/elasticdl}.train` have different arguments defined, so we can do:
 
 ```go
-if ir.ModelType == "Tensorflow":
+if ir.ModelType == "TensorFlow":
     tfFiller := TFFiller{
         Estimator: generateTFEstimatorCode(ir),
         FeatureColumns: generateFeatureColumnsCode(ir),
@@ -55,6 +55,6 @@ To implement the single `codegen_couler.go` to support generate code that can ru
 
 We intend to use the solution **No.2** described above for these reasons.
 
-1. If a data scientist needs to add a new type of engine (SVM, SKLearn, PyTorch, etc.) other than Tensorflow/XGBoost/ElasticDL, he/she can use python only to define a `couler.{new_engine}.train` function without modifying the Go code in SQLFlow.
+1. If a data scientist needs to add a new type of engine (SVM, SKLearn, PyTorch, etc.) other than TensorFlow/XGBoost/ElasticDL, he/she can use python only to define a `couler.{new_engine}.train` function without modifying the Go code in SQLFlow.
 1. `codegen_couler.go` have less code.
 1. All submitter unit tests can run in couler.
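The Python-only extensibility that reason 1 in the hunk above describes could be sketched as a registry of per-engine `train` functions. Everything here is hypothetical: `register_engine`, the engine name, and the return value are illustrative, not couler's real API.

```python
# Hypothetical sketch: adding a new engine's train step in Python only,
# without touching SQLFlow's Go code. Names are illustrative.
ENGINE_TRAIN_FUNCS = {}

def register_engine(name):
    """Decorator that registers a train() function under an engine name."""
    def wrap(fn):
        ENGINE_TRAIN_FUNCS[name] = fn
        return fn
    return wrap

@register_engine("sklearn")
def sklearn_train(model_def, data):
    # A real implementation would emit a couler workflow step here.
    return "couler step: train %s on %s" % (model_def, data)

def train(engine, model_def, data):
    # The generic call codegen_couler.go would emit for every engine.
    return ENGINE_TRAIN_FUNCS[engine](model_def, data)
```

With this shape, `codegen_couler.go` only ever emits the generic `train(engine, ...)` call, and each engine's differences live on the Python side.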

doc/design/couler_sqlflow.md

Lines changed: 1 addition & 1 deletion
@@ -104,7 +104,7 @@ Note: You can check more details about the IR definition from [ir.go](/pkg/ir/ir
 
 ### SQLFLow Submitter Python Module
 
-An SQLFlow submitter Python module `runtime.{tensorflow,xgboost,elasticdl}.train` accepts an SQLFlow IR with protobuf text format, and then submit a Tensorflow, XGBoost or ElasticDL training job, we can call it like:
+An SQLFlow submitter Python module `runtime.{tensorflow,xgboost,elasticdl}.train` accepts an SQLFlow IR with protobuf text format, and then submit a TensorFlow, XGBoost or ElasticDL training job, we can call it like:
 
 ``` bash
 cat ir.proto_text | python -m runtime.xgboost.train
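The pipe usage in the hunk above implies the submitter module reads the protobuf-text IR from stdin. Below is a hedged sketch of that shape: `parse_ir_text` and the `key: value` parsing are stand-ins, since the real runtime modules would parse the IR with the protobuf library, and the module entry point would call `parse_ir_text(sys.stdin.read())`.

```python
# Hypothetical sketch of consuming an IR piped in as protobuf text format.
# The toy parser below handles flat 'key: "value"' lines only; it is a
# stand-in for real protobuf text-format parsing.
def parse_ir_text(text):
    """Parse simple 'key: value' lines from a protobuf-text-like IR."""
    ir = {}
    for line in text.splitlines():
        line = line.strip()
        if ":" in line:
            key, _, value = line.partition(":")
            ir[key.strip()] = value.strip().strip('"')
    return ir
```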

doc/design/diag_attribute_error.md

Lines changed: 1 addition & 1 deletion
@@ -31,7 +31,7 @@ this can active three advantages at least:
 
 1. Early testing, we can do early testing before running the job; users can wait less time and cluster save resources.
 2. More accurate diagnostic message.
-3. Model developers do not have to involve dependencies other than Keras or Tensorflow.
+3. Model developers do not have to involve dependencies other than Keras or TensorFlow.
 
 ## Design
 
doc/design/model_zoo.md

Lines changed: 1 addition & 1 deletion
@@ -101,7 +101,7 @@ $ sqlflow delete repo model_image:v0.1
 
 For a SELECT program using the SQLFlow syntax extension, the SQLFlow server converts it into a [workflow](workflow.md) and submit the workflow job to a workflow engine like Argo/Tekton on Kubernetes. Each step in the workflow is one SQL statement.
 
-By default, we use a default Docker image to run the training, predicting or explaining job. The default Docker image contains pre-made Tensorflow estimator models, Keras models defined in [sqlflow_models repo](https://github.com/sql-machine-learning/models) and XGBoost. To use a custom model repo Docker image, write SQL statements mentioned above:
+By default, we use a default Docker image to run the training, predicting or explaining job. The default Docker image contains pre-made TensorFlow estimator models, Keras models defined in [sqlflow_models repo](https://github.com/sql-machine-learning/models) and XGBoost. To use a custom model repo Docker image, write SQL statements mentioned above:
 
 ```sql
 SELECT * FROM employee WHERE onboard_year < 2019

doc/model_parameter.md

Lines changed: 1 addition & 1 deletion
@@ -206,7 +206,7 @@ TBD
 
 TBD
 
-## Tensorflow Parameters
+## TensorFlow Parameters
 
 ### TRAIN
 

doc/sqlflow.org_cn/index.md

Lines changed: 2 additions & 2 deletions
@@ -12,8 +12,8 @@
 1. [模型解释]()
 1. [模型列表](model_list.cn.md)
 1. [有监督学习]()
-1. [Tensorflow Estimator 模型]()
-1. [Tensorflow Keras 模型]()
+1. [TensorFlow Estimator 模型]()
+1. [TensorFlow Keras 模型]()
 1. [XGBoost 模型]()
 1. [无监督学习]()
 1. [部署]()

doc/talk/201906/sqlflow.slide

Lines changed: 1 addition & 1 deletion
@@ -81,7 +81,7 @@ TensorFlow model inputs are dense tensors, which are normally high-dimensional m
 - Hash the name string into a 64-bit integer, then bucketizing hashed value into 100 dimension columns.
 .image column_fea2.png _ 600
 
-Generated code using Tensorflow "Feature Column" should look like:
+Generated code using TensorFlow "Feature Column" should look like:
 .code feature_column_code.pysample
 
 * Design: Feature Derivation

doc/talk/20190620/sqlflow.slide

Lines changed: 1 addition & 1 deletion
@@ -63,7 +63,7 @@ TensorFlow model inputs are dense tensors, which usually are high-dimensional ma
 - Hash the name string into a 64-bit integer, then bucketing hashed value into 100 dimension columns.
 .image ../201906/column_fea2.png _ 600
 
-Generated code using Tensorflow "Feature Column" should look like:
+Generated code using TensorFlow "Feature Column" should look like:
 .code ../201906/feature_column_code.pysample
 
 * Design: Feature Derivation

docker/step/README.md

Lines changed: 2 additions & 2 deletions
@@ -1,6 +1,6 @@
-# A Slim Tensorflow Step Image
+# A Slim TensorFlow Step Image
 
-This image is used when submitting Argo workflows to run Tensorflow/PAI Tensorflow jobs. To build this image, you should follow the below steps:
+This image is used when submitting Argo workflows to run TensorFlow/PAI TensorFlow jobs. To build this image, you should follow the below steps:
 
 1. Go to SQLFlow root directory
 ```bash
