Models and Pipelines are Core SDK Abstractions¶
This notebook demonstrates how models can be deployed as steps in pipelines, either implicitly or explicitly. Older concepts such as Deployment, PipelineVariant, and PipelineConfigBuilder are no longer necessary; we simply deploy and undeploy pipelines.
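In outline, the two styles look like this (a minimal sketch; the model and pipeline names are illustrative, and wl is the Wallaroo client created in the setup cell below):

# Convenience style: deploy an uploaded model directly as a one-step pipeline
pipeline = wl.upload_model(name="my-model", path="my_model.onnx").deploy("my-pipeline")

# Explicit style: build a pipeline, add model steps, then deploy
model = wl.upload_model(name="my-model", path="my_model.onnx")
pipeline = wl.build_pipeline("my-pipeline").add_model_step(model).deploy()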
[1]:
import json
import os

import wallaroo
from wallaroo.pipeline import Pipeline

# Local ONNX model and a small smoke-test input file.
model_path = "keras_ccfraud.onnx"
data_path = "dev_smoke_test.json"

with open(data_path, "rb") as f:
    data = json.load(f)

# Connect to the Wallaroo instance.
wl = wallaroo.Client()
Convenience Style - single model¶
Let's create and deploy a pipeline with a single model.
[2]:
pipeline = wl.upload_model(name="ccfraud2", path=model_path).deploy("single-model")
Waiting for deployment - this will take up to 45s ..... ok
We can easily undeploy it.
[3]:
pipeline.undeploy()
[3]:
{'name': 'single-model', 'create_time': datetime.datetime(2021, 5, 22, 4, 9, 11, 19234, tzinfo=tzutc()), 'definition': "[{'ModelInference': {'models': [{'name': 'ccfraud2', 'version': '76cacac7-1dc7-41fc-8b7a-1e5dc17a093c', 'sha': '4dc88d159249ccce83942ada69b919cb91455d5fd0e4bfc287de3f21d1aafb1b'}]}}]"}
Now let’s redeploy it.
[4]:
pipeline.deploy()
Waiting for deployment - this will take up to 45s ...... ok
[4]:
{'name': 'single-model', 'create_time': datetime.datetime(2021, 5, 22, 4, 10, 11, 32734, tzinfo=tzutc()), 'definition': "[{'ModelInference': {'models': [{'name': 'ccfraud2', 'version': '76cacac7-1dc7-41fc-8b7a-1e5dc17a093c', 'sha': '4dc88d159249ccce83942ada69b919cb91455d5fd0e4bfc287de3f21d1aafb1b'}]}}]"}
[5]:
pipeline.infer(data)
[5]:
[InferenceResult({'check_failures': [],
'elapsed': 103147,
'model_name': 'ccfraud2',
'model_version': 'b75496c3-5e00-461d-b921-feaf3ae7dec8',
'original_data': {'tensor': [[1.0678324729342086,
0.21778102664937624,
-1.7115145261843976,
0.6822857209662413,
1.0138553066742804,
-0.43350000129006655,
0.7395859436561657,
-0.28828395953577357,
-0.44726268795990787,
0.5146124987725894,
0.3791316964287545,
0.5190619748123175,
-0.4904593221655364,
1.1656456468728569,
-0.9776307444180006,
-0.6322198962519854,
-0.6891477694494687,
0.17833178574255615,
0.1397992467197424,
-0.35542206494183326,
0.4394217876939808,
1.4588397511627804,
-0.3886829614721505,
0.4353492889350186,
1.7420053483337177,
-0.4434654615252943,
-0.15157478906219238,
-0.26684517248765616,
-1.454961775612449]]},
'outputs': [{'Float': {'data': [0.001497417688369751],
'dim': [1, 1],
'v': 1}}],
'pipeline_name': 'single-model',
'time': 1644014192840})]
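infer() returns a list with one InferenceResult per request. As a minimal sketch of reading the score back out (assuming the SDK exposes a data() accessor that returns the output tensors; the exact accessor may differ between SDK versions):

results = pipeline.infer(data)
# Assumption: InferenceResult.data() returns the output tensors as arrays.
score = results[0].data()[0]
print(f"fraud probability: {float(score[0][0]):.6f}")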
[6]:
pipeline.undeploy()
[6]:
{'name': 'single-model', 'create_time': datetime.datetime(2021, 5, 22, 4, 11, 11, 65536, tzinfo=tzutc()), 'definition': "[{'ModelInference': {'models': [{'name': 'ccfraud2', 'version': '76cacac7-1dc7-41fc-8b7a-1e5dc17a093c', 'sha': '4dc88d159249ccce83942ada69b919cb91455d5fd0e4bfc287de3f21d1aafb1b'}]}}]"}
Convenience Style - chain a second model on the end¶
Let's create a pipeline, deploy it, add another model step to the end, and redeploy it.
[7]:
pipeline = wl.upload_model(name="ccfraud", path=model_path).deploy("chained-model")
model2 = wl.upload_model(name="ccfraud2", path=model_path)
Waiting for deployment - this will take up to 45s ...... ok
[8]:
pipeline.add_model_step(model2)
[8]:
{'name': 'chained-model', 'create_time': datetime.datetime(2021, 5, 22, 4, 12, 11, 65536, tzinfo=tzutc()), 'definition': "[{'ModelInference': {'models': [{'name': 'ccfraud2', 'version': '76cacac7-1dc7-41fc-8b7a-1e5dc17a093c', 'sha': '4dc88d159249ccce83942ada69b919cb91455d5fd0e4bfc287de3f21d1aafb1b'}]}}]"}
[9]:
pipeline.deploy()
ok
[9]:
{'name': 'chained-model', 'create_time': datetime.datetime(2021, 5, 22, 4, 12, 42, 86529, tzinfo=tzutc()), 'definition': "[{'ModelInference': {'models': [{'name': 'ccfraud2', 'version': '76cacac7-1dc7-41fc-8b7a-1e5dc17a093c', 'sha': '4dc88d159249ccce83942ada69b919cb91455d5fd0e4bfc287de3f21d1aafb1b'}]}}]"}
[10]:
pipeline.undeploy()
[10]:
{'name': 'chained-model', 'create_time': datetime.datetime(2021, 5, 22, 4, 13, 17, 86529, tzinfo=tzutc()), 'definition': "[{'ModelInference': {'models': [{'name': 'ccfraud2', 'version': '76cacac7-1dc7-41fc-8b7a-1e5dc17a093c', 'sha': '4dc88d159249ccce83942ada69b919cb91455d5fd0e4bfc287de3f21d1aafb1b'}]}}]"}
Convenience Style - replace a model¶
We can replace a model by clearing the pipeline's steps and adding a new step with the replacement model.
[11]:
pipeline = wl.upload_model(name="ccfraud", path=model_path).deploy("replaced-model")
Waiting for deployment - this will take up to 45s ...... ok
[12]:
pipeline.clear().add_model_step(model2).deploy()
ok
[12]:
{'name': 'replaced-model', 'create_time': datetime.datetime(2021, 5, 22, 4, 14, 17, 86529, tzinfo=tzutc()), 'definition': "[{'ModelInference': {'models': [{'name': 'ccfraud2', 'version': '76cacac7-1dc7-41fc-8b7a-1e5dc17a093c', 'sha': '4dc88d159249ccce83942ada69b919cb91455d5fd0e4bfc287de3f21d1aafb1b'}]}}]"}
[13]:
pipeline.undeploy()
[13]:
{'name': 'replaced-model', 'create_time': datetime.datetime(2021, 5, 22, 4, 14, 51, 86529, tzinfo=tzutc()), 'definition': "[{'ModelInference': {'models': [{'name': 'ccfraud2', 'version': '76cacac7-1dc7-41fc-8b7a-1e5dc17a093c', 'sha': '4dc88d159249ccce83942ada69b919cb91455d5fd0e4bfc287de3f21d1aafb1b'}]}}]"}
Replace a model in the middle of a pipeline¶
Each “add_<step>” method has an equivalent “replace_with_<step>” method; see the pipeline documentation for details.
[14]:
model1 = wl.upload_model("preprocess", "./test_resources/pre_process.py").configure('python')
model2 = wl.upload_model("noopfloats", "./test_resources/no-op-floats.onnx").configure('onnx')
model3 = wl.upload_model("postprocess", "./test_resources/post_process.py").configure('python')
Let's create a pipeline with three model steps.
[15]:
python_pipeline = (wl.build_pipeline("pythonpipeline")
                   .add_model_step(model1)
                   .add_model_step(model2)
                   .add_model_step(model3)).deploy()
Waiting for deployment - this will take up to 45s ........ ok
We’ll replace the step at index 1 (second step in the pipeline) with a different model step.
[16]:
model2_replacement = wl.upload_model("some-other-model", "./test_resources/no-op-floats.onnx")
[17]:
python_pipeline.replace_with_model_step(1, model2_replacement).deploy()
ok
[17]:
{'name': 'pythonpipeline', 'create_time': datetime.datetime(2021, 5, 22, 4, 15, 2, 86529, tzinfo=tzutc()), 'definition': "[{'ModelInference': {'models': [{'name': 'ccfraud2', 'version': '76cacac7-1dc7-41fc-8b7a-1e5dc17a093c', 'sha': '4dc88d159249ccce83942ada69b919cb91455d5fd0e4bfc287de3f21d1aafb1b'}]}}]"}
[18]:
python_pipeline.undeploy()
[18]:
{'name': 'pythonpipeline', 'create_time': datetime.datetime(2021, 5, 22, 4, 15, 38, 86529, tzinfo=tzutc()), 'definition': "[{'ModelInference': {'models': [{'name': 'ccfraud2', 'version': '76cacac7-1dc7-41fc-8b7a-1e5dc17a093c', 'sha': '4dc88d159249ccce83942ada69b919cb91455d5fd0e4bfc287de3f21d1aafb1b'}]}}]"}
Explicit Style with all options¶
Now we'll create a pipeline with models that have configuration options set.
[19]:
model1 = wl.upload_model(name="ccfraud", path=model_path).configure(filter_threshold=0.123)
model2 = wl.upload_model(name="ccfraud2", path=model_path).configure()
[20]:
pipeline = wl.build_pipeline("all-options")
pipeline.add_model_step(model1)
pipeline.add_model_step(model2)
pipeline.deploy()
Waiting for deployment - this will take up to 45s ...... ok
[20]:
{'name': 'all-options', 'create_time': datetime.datetime(2021, 5, 22, 4, 14, 17, 86529, tzinfo=tzutc()), 'definition': "[{'ModelInference': {'models': [{'name': 'ccfraud2', 'version': '76cacac7-1dc7-41fc-8b7a-1e5dc17a093c', 'sha': '4dc88d159249ccce83942ada69b919cb91455d5fd0e4bfc287de3f21d1aafb1b'}]}}]"}
[21]:
pipeline.undeploy()
[21]:
{'name': 'all-options', 'create_time': datetime.datetime(2021, 5, 22, 4, 14, 19, 86529, tzinfo=tzutc()), 'definition': "[{'ModelInference': {'models': [{'name': 'ccfraud2', 'version': '76cacac7-1dc7-41fc-8b7a-1e5dc17a093c', 'sha': '4dc88d159249ccce83942ada69b919cb91455d5fd0e4bfc287de3f21d1aafb1b'}]}}]"}