
Commit e748c7c

Merge branch 'master' into neo-pytorch-inf
2 parents fe05be0 + ffee0c8

File tree

82 files changed (+5453, −4851 lines)


advanced_functionality/multi_model_bring_your_own/multi_model_endpoint_bring_your_own.ipynb

Lines changed: 5 additions & 1 deletion
@@ -11,6 +11,10 @@
   "\n",
   "For the inference container to serve multiple models in a multi-model endpoint, it must implement [additional APIs](https://docs.aws.amazon.com/sagemaker/latest/dg/build-multi-model-build-container.html) in order to load, list, get, unload and invoke specific models. This notebook demonstrates how to build your own inference container that implements these APIs.\n",
   "\n",
+  "**Note**: Because this notebook builds a Docker container, it does not run in Amazon SageMaker Studio.\n",
+  "\n",
+  "This notebook was tested with the `conda_mxnet_p36` kernel running SageMaker Python SDK version 2.15.3 on an Amazon SageMaker notebook instance.\n",
+  "\n",
   "---\n",
   "\n",
   "### Contents\n",
@@ -553,7 +557,7 @@
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
-  "version": "3.6.5"
+  "version": "3.7.6"
   }
   },
   "nbformat": 4,

advanced_functionality/mxnet_mnist_byom/mxnet_mnist.ipynb

Lines changed: 4 additions & 6 deletions
@@ -28,7 +28,7 @@
   "## Introduction\n",
   "In this notebook, we will train a neural network locally on the location from where this notebook is run using MXNet. We will then see how to create an endpoint from the trained MXNet model and deploy it on SageMaker. We will then inference from the newly created SageMaker endpoint. \n",
   "\n",
-  "The neural network that we will use is a simple fully-connected neural network. The definition of the neural network can be found in the accompanying [mnist.py](mnist.py) file. The ``build_graph`` method contains the model defnition (shown below).\n",
+  "The neural network that we will use is a simple fully-connected neural network. The definition of the neural network can be found in the accompanying [mnist.py](mnist.py) file. The ``build_graph`` method contains the model definition (shown below).\n",
   "\n",
   "```python\n",
   "def build_graph():\n",
@@ -98,10 +98,10 @@
   "source": [
   "### Training\n",
   "\n",
-  "It is time to train the network. Since we are training the network locally, we can make use of mxnet training tools. The training method is also in the accompanying [mnist.py](mnist.py) file. The notebook assumes that this instance is a `p2.xlarge`. If running this in a non-GPU notebook instance, please adjust num_gpus=0 and num_cpu=1 The method is shown below. \n",
+  "It is time to train the network. Since we are training the network locally, we can make use of mxnet training tools. The training method is also in the accompanying [mnist.py](mnist.py) file. The method is as follows. \n",
   "\n",
   "```python \n",
-  "def train(data, hyperparameters= {'learning_rate': 0.11}, num_cpus=0, num_gpus =1 , **kwargs):\n",
+  "def train(data, hyperparameters= {'learning_rate': 0.11}, num_cpus=1, num_gpus =0 , **kwargs):\n",
   " train_labels = data['train_label']\n",
   " train_images = data['train_data']\n",
   " test_labels = data['test_label']\n",
@@ -133,15 +133,13 @@
   "outputs": [],
   "source": [
   "from mnist import train\n",
-  "model = train(data = data, num_cpus=0, num_gpus=1)"
+  "model = train(data = data, num_cpus=1, num_gpus=0)"
   ]
   },
   {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
-  "If you want to run the training on a cpu or if you are on an instance with cpus only, pass appropriate arguments. \n",
-  "\n",
   "## Set up hosting for the model\n",
   "\n",
   "### Export the model from mxnet\n",

autopilot/model-explainability/explaining_customer_churn_model.ipynb

Lines changed: 13 additions & 12 deletions
@@ -8,7 +8,7 @@
   "\n",
   "Kernel `Python 3 (Data Science)` works well with this notebook.\n",
   "\n",
-  "_This notebook was created and tested on an ml.m5.large notebook instance._\n",
+  "_This notebook was created and tested on an ml.m5.xlarge notebook instance._\n",
   "\n",
   "## Table of Contents\n",
   "\n",
@@ -101,9 +101,8 @@
   "source": [
   "import shap\n",
   "\n",
-  "from kernel_explainer_wrapper import KernelExplainerWrapper\n",
+  "from shap import KernelExplainer\n",
   "from shap import sample\n",
-  "from shap.common import LogitLink, IdentityLink\n",
   "from scipy.special import expit\n",
   "\n",
   "# Initialize plugin to make plots interactive.\n",
@@ -235,7 +234,7 @@
   "metadata": {},
   "outputs": [],
   "source": [
-  "churn_data = pd.read_csv('./Data sets/churn.txt')\n",
+  "churn_data = pd.read_csv('../Data sets/churn.txt')\n",
   "data_without_target = churn_data.drop(columns=['Churn?'])\n",
   "\n",
   "background_data = sample(data_without_target, 50)"
@@ -252,7 +251,10 @@
   "cell_type": "markdown",
   "metadata": {},
   "source": [
-  "Next, we create the `KernelExplainer`. Note that since it's a black box explainer, `KernelExplainer` only requires a handle to the predict (or predict_proba) function and does not require any other information about the model. For classification it is recommended to derive feature importance scores in the log-odds space since additivity is a more natural assumption there thus we use `LogitLink`. For regression `IdentityLink` should be used."
+  "Next, we create the `KernelExplainer`. Note that since it's a black box explainer, `KernelExplainer` only requires a handle to the\n",
+  "predict (or predict_proba) function and does not require any other information about the model. For classification it is recommended to\n",
+  "derive feature importance scores in the log-odds space since additivity is a more natural assumption there thus we use `logit`. For\n",
+  "regression `identity` should be used."
   ]
   },
   {
@@ -263,17 +265,16 @@
   "source": [
   "# Derive link function \n",
   "problem_type = automl_job.describe_auto_ml_job(job_name=automl_job_name)['ResolvedAttributes']['ProblemType'] \n",
-  "link_fn = IdentityLink if problem_type == 'Regression' else LogitLink \n",
+  "link = \"identity\" if problem_type == 'Regression' else \"logit\"\n",
   "\n",
-  "# the handle to predict_proba is passed to KernelExplainerWrapper since KernelSHAP requires the class probability\n",
-  "explainer = KernelExplainerWrapper(automl_estimator.predict_proba, background_data, link=link_fn())"
+  "# the handle to predict_proba is passed to KernelExplainer since KernelSHAP requires the class probability\n",
+  "explainer = KernelExplainer(automl_estimator.predict_proba, background_data, link=link)"
   ]
   },
   {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
-  "Currently, `shap.KernelExplainer` only supports numeric data. A version of SHAP that supports text will become available soon. A workaround is provided by our wrapper `KernelExplainerWrapper`. Once a new version of SHAP is released, `shap.KernelExplainer` should be used instead of `KernelExplainerWrapper`.\n",
   "\n",
   "By analyzing the background data `KernelExplainer` provides us with `explainer.expected_value` which is the model prediction with all features missing. Considering a customer for which we have no data at all (i.e. all features are missing) this should theoretically be the model prediction."
   ]
@@ -326,7 +327,7 @@
   "outputs": [],
   "source": [
   "# Since shap_values are provided in the log-odds space, we convert them back to the probability space by using LogitLink\n",
-  "shap.force_plot(explainer.expected_value, shap_values, x, link=link_fn())"
+  "shap.force_plot(explainer.expected_value, shap_values, x, link=link)"
   ]
   },
   {
@@ -348,7 +349,7 @@
   "source": [
   "with ManagedEndpoint(ep_name) as mep:\n",
   " shap_values = explainer.shap_values(x, nsamples='auto', l1_reg='num_features(5)')\n",
-  "shap.force_plot(explainer.expected_value, shap_values, x, link=link_fn())"
+  "shap.force_plot(explainer.expected_value, shap_values, x, link=link)"
   ]
   },
   {
@@ -396,7 +397,7 @@
   "metadata": {},
   "outputs": [],
   "source": [
-  "shap.force_plot(explainer.expected_value, shap_values, X, link=link_fn())"
+  "shap.force_plot(explainer.expected_value, shap_values, X, link=link)"
   ]
   },
   {
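The diff above replaces the repo's `KernelExplainerWrapper` and the `LogitLink`/`IdentityLink` classes with shap's built-in `KernelExplainer` and string link names. A self-contained sketch of that pattern, using a toy scikit-learn classifier and synthetic data instead of the notebook's Autopilot estimator, might look like this.

```python
import numpy as np
import shap
from sklearn.linear_model import LogisticRegression

# Toy stand-ins for the notebook's Autopilot estimator and 50-row background sample.
rng = np.random.default_rng(0)
background = rng.normal(size=(50, 4))
labels = (background[:, 0] + background[:, 1] > 0).astype(int)
model = LogisticRegression().fit(background, labels)

# For classification, explain predict_proba in log-odds space with link="logit";
# for regression, link="identity" would be used instead.
explainer = shap.KernelExplainer(model.predict_proba, background, link="logit")
shap_values = explainer.shap_values(background[:5], nsamples=100)
```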

autopilot/model-explainability/kernel_explainer_wrapper.py

Lines changed: 0 additions & 57 deletions
This file was deleted.

aws_marketplace/creating_marketplace_products/Bring_Your_Own-Creating_Algorithm_and_Model_Package.ipynb

Lines changed: 3 additions & 21 deletions
@@ -258,8 +258,6 @@
   "\n",
   "# Get the region defined in the current configuration (default to us-west-2 if none defined)\n",
   "region=$(aws configure get region)\n",
-  "# specifically setting to us-east-2 since during the pre-release period, we support only that region.\n",
-  "region=${region:-us-east-2}\n",
   "\n",
   "fullname=\"${account}.dkr.ecr.${region}.amazonaws.com/${algorithm_name}:latest\"\n",
   "\n",
@@ -595,14 +593,6 @@
   "Now that you have verified that the algorithm code works for training, live inference and batch inference in the above sections, you can start packaging it up as an Amazon SageMaker Algorithm."
   ]
   },
-  {
-  "cell_type": "markdown",
-  "metadata": {},
-  "source": [
-  "#### Region Limitation\n",
-  "Seller onboarding is limited to us-east-2 region (CMH) only. The client we are creating below will be hard-coded to talk to our us-east-2 endpoint only."
-  ]
-  },
   {
   "cell_type": "code",
   "execution_count": null,
@@ -611,7 +601,7 @@
   "source": [
   "import boto3\n",
   "\n",
-  "smmp = boto3.client('sagemaker', region_name='us-east-2', endpoint_url=\"https://sagemaker.us-east-2.amazonaws.com\")"
+  "smmp = boto3.client('sagemaker')"
   ]
   },
   {
@@ -807,21 +797,13 @@
   "A Model Package is a reusable model artifacts abstraction that packages all ingredients necessary for inference. It consists of an inference specification that defines the inference image to use along with an optional model weights location.\n"
   ]
   },
-  {
-  "cell_type": "markdown",
-  "metadata": {},
-  "source": [
-  "#### Region Limitation\n",
-  "Seller onboarding is limited to us-east-2 region (CMH) only. The client we are creating below will be hard-coded to talk to our us-east-2 endpoint only. (Note: You may have previous done this step in Part 3. Repeating here to keep Part 4 self contained.)"
-  ]
-  },
   {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
-  "smmp = boto3.client('sagemaker', region_name='us-east-2', endpoint_url=\"https://sagemaker.us-east-2.amazonaws.com\")"
+  "smmp = boto3.client('sagemaker')"
   ]
   },
   {
@@ -982,7 +964,7 @@
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
-  "version": "3.6.5"
+  "version": "3.6.10"
   }
   },
   "nbformat": 4,

aws_marketplace/curating_aws_marketplace_listing_and_sample_notebook/Algorithm/Sample_Notebook_Template/title_of_your_product-Algorithm.ipynb

Lines changed: 4 additions & 3 deletions
@@ -13,6 +13,7 @@
   "\n",
   "This sample notebook shows you how to train a custom ML model using <font color='red'> For Seller to update:[Title_of_your_Algorithm](Provide link to your marketplace listing of your product)</font> from AWS Marketplace.\n",
   "\n",
+  "> **Note**: This is a reference notebook and it cannot run unless you make changes suggested in the notebook.\n",
   "\n",
   "#### Pre-requisites:\n",
   "1. **Note**: This notebook contains elements which render correctly in Jupyter interface. Open this notebook from an Amazon SageMaker Notebook Instance or Amazon SageMaker Studio.\n",
@@ -844,9 +845,9 @@
   ],
   "metadata": {
   "kernelspec": {
-  "display_name": "Python 3",
+  "display_name": "conda_python3",
   "language": "python",
-  "name": "python3"
+  "name": "conda_python3"
   },
   "language_info": {
   "codemirror_mode": {
@@ -858,7 +859,7 @@
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
-  "version": "3.8.4"
+  "version": "3.6.10"
   }
   },
   "nbformat": 4,

aws_marketplace/curating_aws_marketplace_listing_and_sample_notebook/ModelPackage/Sample_Notebook_Template/title_of_your_product-Model.ipynb

Lines changed: 4 additions & 3 deletions
@@ -11,6 +11,7 @@
   "\n",
   "This sample notebook shows you how to deploy <font color='red'> For Seller to update:[Title_of_your_ML Model](Provide link to your marketplace listing of your product)</font> using Amazon SageMaker.\n",
   "\n",
+  "> **Note**: This is a reference notebook and it cannot run unless you make changes suggested in the notebook.\n",
   "\n",
   "#### Pre-requisites:\n",
   "1. **Note**: This notebook contains elements which render correctly in Jupyter interface. Open this notebook from an Amazon SageMaker Notebook Instance or Amazon SageMaker Studio.\n",
@@ -416,9 +417,9 @@
   ],
   "metadata": {
   "kernelspec": {
-  "display_name": "Python 3",
+  "display_name": "conda_python3",
   "language": "python",
-  "name": "python3"
+  "name": "conda_python3"
   },
   "language_info": {
   "codemirror_mode": {
@@ -430,7 +431,7 @@
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
-  "version": "3.8.4"
+  "version": "3.6.10"
   }
   },
   "nbformat": 4,

aws_marketplace/using_model_packages/auto_insurance/src/model_package_arns.py

Lines changed: 6 additions & 1 deletion
@@ -27,6 +27,8 @@ def get_vehicle_damage_detection_model_package_arn(current_region):
   def get_vehicle_recognition_model_package_arn(current_region):
       mapping = {
           "us-east-1" : "arn:aws:sagemaker:us-east-1:865070037744:model-package/vehicle-5bbb43353155de115c9fabdde5167c06",
+          "us-east-2" : "arn:aws:sagemaker:us-east-2:057799348421:model-package/vehicle-5bbb43353155de115c9fabdde5167c06",
+
           "ap-northeast-1" : "arn:aws:sagemaker:ap-northeast-1:977537786026:model-package/vehicle-5bbb43353155de115c9fabdde5167c06",
           "ap-northeast-2" : "arn:aws:sagemaker:ap-northeast-2:745090734665:model-package/vehicle-5bbb43353155de115c9fabdde5167c06",
           "ap-southeast-1" : "arn:aws:sagemaker:ap-southeast-1:192199979996:model-package/vehicle-5bbb43353155de115c9fabdde5167c06",
@@ -35,6 +37,9 @@ def get_vehicle_recognition_model_package_arn(current_region):
           "ap-south-1": "arn:aws:sagemaker:ap-south-1:077584701553:model-package/vehicle-5bbb43353155de115c9fabdde5167c06",
           "ca-central-1":"arn:aws:sagemaker:ca-central-1:470592106596:model-package/vehicle-5bbb43353155de115c9fabdde5167c06",
           "eu-west-1" : "arn:aws:sagemaker:eu-west-1:985815980388:model-package/vehicle-5bbb43353155de115c9fabdde5167c06",
-          "eu-west-2" : "arn:aws:sagemaker:eu-west-2:856760150666:model-package/vehicle-5bbb43353155de115c9fabdde5167c06"
+          "eu-west-2" : "arn:aws:sagemaker:eu-west-2:856760150666:model-package/vehicle-5bbb43353155de115c9fabdde5167c06",
+          "us-west-2" : "arn:aws:sagemaker:us-west-2:594846645681:model-package/vehicle-5bbb43353155de115c9fabdde5167c06",
+          "us-west-1" : "arn:aws:sagemaker:us-west-1:382657785993:model-package/vehicle-5bbb43353155de115c9fabdde5167c06"
+
       }
       return mapping[current_region]
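With us-east-2, us-west-1, and us-west-2 added to the mapping, the lookup can be driven by the session's own region. A hedged sketch follows; the `ModelPackageArnProvider` class name and import path are assumptions about how the helper in `model_package_arns.py` is exposed, so adjust them to the notebook's actual layout.

```python
import boto3

# Assumed import path/class name for the helper in src/model_package_arns.py.
from src.model_package_arns import ModelPackageArnProvider

region = boto3.session.Session().region_name  # e.g. "us-west-2"
arn = ModelPackageArnProvider.get_vehicle_recognition_model_package_arn(region)
print(arn)
```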

aws_marketplace/using_model_packages/generic_sample_notebook/A_generic_sample_notebook_to_perform_inference_on_ML_model_packages_from_AWS_Marketplace.ipynb

Lines changed: 3 additions & 1 deletion
@@ -1,6 +1,7 @@
   {
   "cells": [
   {
+  "attachments": {},
   "cell_type": "markdown",
   "metadata": {},
   "source": [
@@ -10,6 +11,7 @@
   "\n",
   "If such a sample notebook does not exist and you want to deploy and try an ML model package via code written in python language, this generic notebook can guide you on how to deploy and perform inference on an ML model package from AWS Marketplace.\n",
   "\n",
+  "> **Note**: This is a reference notebook and it cannot run unless you make changes suggested in the notebook.\n",
   "\n",
   "> **Note**:If you are facing technical issues while trying an ML model package from AWS Marketplace and need help, please open a support ticket or write to the team on aws-mp-bd-ml@amazon.com for additional assistance.\n",
   "\n",
@@ -935,7 +937,7 @@
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
-  "version": "3.6.5"
+  "version": "3.6.10"
   }
   },
   "nbformat": 4,

aws_marketplace/using_model_packages/improving_industrial_workplace_safety/src/model_package_arns.py

Lines changed: 3 additions & 3 deletions
@@ -13,9 +13,9 @@ def get_construction_worker_model_package_arn(current_region):
           "eu-west-1": "arn:aws:sagemaker:eu-west-1:985815980388:model-package/construction-worker-v1-copy-06-3f94f03fae021ca61cb609d42d0118c2",
           "eu-west-2": "arn:aws:sagemaker:eu-west-2:856760150666:model-package/construction-worker-v1-copy-06-3f94f03fae021ca61cb609d42d0118c2",
           "us-east-1": "arn:aws:sagemaker:us-east-1:865070037744:model-package/construction-worker-v1-copy-06-3f94f03fae021ca61cb609d42d0118c2",
-          "us-east-2": "arn:aws:sagemaker:us-west-1:382657785993:model-package/construction-worker-v1-copy-06-3f94f03fae021ca61cb609d42d0118c2",
-          "us-west-1": "arn:aws:sagemaker:us-west-2:594846645681:model-package/construction-worker-v1-copy-06-3f94f03fae021ca61cb609d42d0118c2",
-          "us-west-2": "arn:aws:sagemaker:us-east-2:057799348421:model-package/construction-worker-v1-copy-06-3f94f03fae021ca61cb609d42d0118c2"}
+          "us-east-2": "arn:aws:sagemaker:us-east-2:057799348421:model-package/construction-worker-v1-copy-06-3f94f03fae021ca61cb609d42d0118c2",
+          "us-west-1": "arn:aws:sagemaker:us-west-1:382657785993:model-package/construction-worker-v1-copy-06-3f94f03fae021ca61cb609d42d0118c2",
+          "us-west-2": "arn:aws:sagemaker:us-west-2:594846645681:model-package/construction-worker-v1-copy-06-3f94f03fae021ca61cb609d42d0118c2"}
       return mapping[current_region]
   @staticmethod
   def get_machine_detection_model_package_arn(current_region):
