YOUR CODE! AT SCALE! Amazon SageMaker Script Mode

9,407 views

mikegchambers

A day ago

Amazon SageMaker has pre-built algorithms, and can orchestrate containers that you provide... but what's the middle ground?
Amazon SageMaker Script Mode will take your ML code and handle the creation of containers for you.
I use code from one of my Git repos in this video; you can find it here:
github.com/learn-mikegchamber...
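For reference, a minimal sketch of the Script Mode pattern the video demonstrates, using the SageMaker Python SDK. The script name, bucket path and container version here are placeholders rather than the repo's exact code:

```python
# Minimal sketch of launching a Script Mode training job (SageMaker Python SDK v2).
# The script name, bucket path, and framework version are placeholders.
import sagemaker
from sagemaker.sklearn.estimator import SKLearn

role = sagemaker.get_execution_role()  # assumes this runs inside SageMaker

estimator = SKLearn(
    entry_point="train.py",        # your own training script
    framework_version="1.2-1",     # a pre-built scikit-learn container version
    instance_type="ml.m5.large",
    instance_count=1,
    role=role,
)

# SageMaker starts the managed container, injects train.py, downloads the
# 'train' channel from S3 into the container, and runs the script.
estimator.fit({"train": "s3://my-bucket/abalone/train.csv"})
```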
If you liked this video, please subscribe, and connect with me on LinkedIn:
/ mikegchambers
Times:
0:00 - Intro
3:51 - Code Review
11:37 - Script Mode Training
24:53 - Script Mode Inference
28:23 - Outro

Comments: 31
@AIwithMahmudul · a year ago
Very good one, straight to the point. Thanks for making it
@KellyDiversified · a year ago
Outstanding, Mike - thank you!
@mikegchambers · a year ago
Glad you enjoyed it!
@satishb9975 · 4 months ago
Thank you very much, this is really awesome.
@Kmysiak1 · a year ago
This is awesome! I've been struggling with the SageMaker SDK, and this lets me use pure Python and its open-source ML packages on top of the compute resources of AWS. I like to separate my ML pipeline into its own notebooks (i.e. processing, training, tuning, etc.). Can we use multiple scripts? You've earned my subscription.
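One way to approach the multiple-scripts question, sketched with hypothetical file names: the framework estimators accept a source_dir, and the entry point can import sibling modules from it.

```python
# Hypothetical layout:
#   src/
#     train.py       <- entry point; can `import features`
#     features.py    <- shared helper code used by train.py
from sagemaker.sklearn.estimator import SKLearn

estimator = SKLearn(
    entry_point="train.py",
    source_dir="src",              # the whole directory is packaged into the container
    framework_version="1.2-1",
    instance_type="ml.m5.large",
    instance_count=1,
    role=role,                     # assumes `role` is defined as in the sketch above
)
```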
@DeepakSingh-ji3zo · a year ago
@mikegchambers Excellent content! I was lost in the Abalone code and you saved me, thanks again. Can you please let me know how I can use this endpoint to create an inference pipeline (independent of this notebook)?
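On calling the endpoint independently of the notebook: a deployed endpoint can be invoked from any script or notebook that has AWS credentials. A sketch with a placeholder endpoint name and sample payload:

```python
# Calling an already-deployed endpoint from a separate script or notebook.
# The endpoint name and the CSV feature string are placeholders.
from sagemaker.predictor import Predictor
from sagemaker.serializers import CSVSerializer

predictor = Predictor(
    endpoint_name="scriptmode-abalone-endpoint",  # hypothetical endpoint name
    serializer=CSVSerializer(),
)
print(predictor.predict("0.455,0.365,0.095,0.514,0.2245,0.101,0.15"))
```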
@saivinilpratap9208 · a year ago
Thanks for the video, Mike! It was insightful and really helpful. It would be even better if you could take a model that exists locally, convert it into a SageMaker-compatible Python script, and record that process as you go.
@mikegchambers · a year ago
Yeah, you can do that: host models for inference only. I'll keep it in mind for other videos.
@of7104 · 2 months ago
@mikegchambers Great video - just wondering if there would be any additional adaptations to make if you were training a deep learning model with e.g. PyTorch/fastai?
@mehuljan26 · a year ago
Great content! How do you think we can perform distributed computing on GPUs with PyTorch/TensorFlow in Script Mode?
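For the two deep-learning questions above, a hedged sketch: the same Script Mode pattern applies when you swap in the PyTorch estimator, and multi-GPU / multi-node training is mostly a matter of instance_count plus a distribution config. The versions and distribution option below are assumptions; check what your SDK and container versions support.

```python
# Sketch: swap in the PyTorch estimator for deep learning workloads.
# Versions and the distribution option are assumptions, not from the video.
from sagemaker.pytorch import PyTorch

estimator = PyTorch(
    entry_point="train.py",
    framework_version="1.13",         # assumed available PyTorch container version
    py_version="py39",
    instance_type="ml.p3.16xlarge",   # multi-GPU instance type
    instance_count=2,                 # two nodes
    role=role,
    # One documented data-parallel option; DDP-style configs also exist.
    distribution={"smdistributed": {"dataparallel": {"enabled": True}}},
)
estimator.fit({"train": "s3://my-bucket/train/"})
```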
@maryamr.aliabadi6101 · a year ago
Thanks for your great video. I assume this Jupyter notebook can't run on a local host and needs to run on a SageMaker notebook. Is that right? And where should the Python script be located?
@hariscolic5215 · a year ago
Nice video! I'd be interested to see how to deploy an endpoint with a custom inference script for the input and output handlers, if you've got time on your hands!
@mikegchambers · a year ago
Sounds like a plan! :)
@kscolina · a year ago
@mikegchambers Yes, please.
@mikegchambers · a year ago
@kscolina So I've been working on a demo project. Can I confirm what is wanted in terms of input and output handlers? Are we talking about a pipeline model with data pre-processing?
@kscolina · a year ago
@mikegchambers I'm not sure of that yet. By the way, I raised a question in a separate reply. :)
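For reference on the handler topic discussed in this thread, a sketch of the hook functions the SageMaker framework serving containers look for in an inference script; the JSON handling shown is illustrative, not from the video:

```python
# inference.py - sketch of the override hooks the framework serving containers
# look for; any hook you don't define falls back to the container default.
import json
import os

import joblib
import numpy as np


def model_fn(model_dir):
    # Load the artefact that training saved into SM_MODEL_DIR.
    return joblib.load(os.path.join(model_dir, "model.joblib"))


def input_fn(request_body, request_content_type):
    # Deserialize the request payload into something the model can consume.
    if request_content_type == "application/json":
        return np.array(json.loads(request_body)["instances"])
    raise ValueError(f"Unsupported content type: {request_content_type}")


def predict_fn(input_data, model):
    return model.predict(input_data)


def output_fn(prediction, accept):
    # Serialize the prediction for the response.
    return json.dumps({"predictions": prediction.tolist()})
```

Typically such a script is supplied as the entry_point of the model object (for example SKLearnModel) before deploy() is called.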
@karangupta_DE · 4 months ago
Great video. SM_MODEL_DIR, SM_CHANNEL_TRAIN and SM_CHANNEL_TEST - do these have default /opt/ml locations defined in the SageMaker Python SDK? And once we pass the S3 bucket location, is the data from the S3 bucket automatically pulled into the SageMaker training job containers?
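A sketch of how those values typically appear inside the training container: the SM_* environment variables point at /opt/ml paths, and with the default File mode each channel passed to fit() is downloaded from S3 into /opt/ml/input/data/&lt;channel&gt; before the script runs.

```python
# train.py (excerpt) - reading the locations SageMaker sets inside the container.
# SM_MODEL_DIR resolves to /opt/ml/model, and each channel passed to fit()
# (e.g. 'train', 'test') is downloaded from S3 to /opt/ml/input/data/<channel>
# before the script starts (in the default File mode).
import argparse
import os

parser = argparse.ArgumentParser()
parser.add_argument("--model-dir", default=os.environ.get("SM_MODEL_DIR"))
parser.add_argument("--train", default=os.environ.get("SM_CHANNEL_TRAIN"))
parser.add_argument("--test", default=os.environ.get("SM_CHANNEL_TEST"))
args, _ = parser.parse_known_args()
print(args.model_dir, args.train, args.test)
```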
@user-hx8ex7gq5t · 4 months ago
In Canvas, does a Python script for training the data (including the algorithm used to train it) get created that I can download, other than the model notebook?
@GS-gi9bc · a year ago
Hi Mike. Very informative presentation. I need to create a model trained only on mainframe code artifacts (COBOL, JCL, DB2, etc.). I have a full set of GPT prompts, scripts and templates that generate all variations of full program code for my industry, built over the first 4 months of this year. The biggest drawback that prevents companies from adopting the LLM approach is that public models don't give them secure protection of their code and data. If someone could guide me on how to create a locally hosted model that can be interrogated by language, token or template, we could make a lot of money. The model doesn't need to be trained for email replies, Excel formulas, document summaries, etc. It needs to absorb our entire code base and add it to any working model that has some level of intelligent COBOL / mainframe code-generating prowess. Is there some way to co-opt the ChatGPT 4 code base for COBOL, SQL and JCL and add it to our code base on a local machine? I think in 2 years this will be the standard method of project development. Some companies may soon be overrun by those that are willing to be the initial movers in this arena.
@keerthang5557 · a year ago
Hello Mike, I need additional Python modules, which I plan to place in a requirements.txt file. Will that be picked up by the SKLearn container so the modules get installed?
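On the requirements.txt question: to the best of my knowledge the framework containers, including the scikit-learn one, pip-install a requirements.txt found in source_dir before running the script. A sketch with hypothetical package names:

```python
# Hypothetical layout:
#   src/
#     train.py
#     requirements.txt   <- e.g. "category_encoders==2.6.3"
from sagemaker.sklearn.estimator import SKLearn

estimator = SKLearn(
    entry_point="train.py",
    source_dir="src",              # requirements.txt found here is pip-installed at startup
    framework_version="1.2-1",
    instance_type="ml.m5.large",
    instance_count=1,
    role=role,
)
```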
@sndrstpnv8419 · a year ago
How do you use VS Code: 1) locally on your computer, running the computation remotely on AWS SageMaker (and if so, how do you connect to SageMaker), or 2) with VS Code running on AWS (and if so, how do you set it up)?
@user-hx8ex7gq5t · 4 months ago
Does this endpoint need an API Gateway, and maybe a Lambda, so I can run inference from outside the AWS world?
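A SageMaker endpoint is invoked through the signed AWS API, so a common pattern for callers outside AWS is API Gateway in front of a Lambda that forwards to the endpoint. A sketch with placeholder names:

```python
# Sketch of a Lambda handler behind API Gateway that forwards requests to the
# SageMaker endpoint. The environment variable and content type are placeholders.
import json
import os

import boto3

runtime = boto3.client("sagemaker-runtime")


def handler(event, context):
    response = runtime.invoke_endpoint(
        EndpointName=os.environ["ENDPOINT_NAME"],  # hypothetical configuration
        ContentType="text/csv",
        Body=event["body"],                        # payload passed through from API Gateway
    )
    return {
        "statusCode": 200,
        "body": json.dumps({"prediction": response["Body"].read().decode("utf-8")}),
    }
```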
@u123123123123123 · a year ago
Great! As an MLOps engineer trying to persuade data scientists to use SageMaker, I found this useful. Basically, they can use the same notebook to pass different hyperparameters and data to generate different training jobs, am I right?
@mikegchambers · a year ago
Absolutely. You can use notebooks as you would normally, use the SageMaker SDK to launch training jobs at scale, and use many more tools like AutoML, Data Wrangler, etc. So much power, yet in a familiar interface.
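A sketch of that workflow: re-using the same estimator definition in a loop with different hyperparameters launches a separate training job per configuration. The hyperparameter name and values below are illustrative.

```python
# Sketch: one notebook, several training jobs with different hyperparameters.
from sagemaker.sklearn.estimator import SKLearn

for n_estimators in (50, 100, 200):
    estimator = SKLearn(
        entry_point="train.py",
        framework_version="1.2-1",
        instance_type="ml.m5.large",
        instance_count=1,
        role=role,
        hyperparameters={"n-estimators": n_estimators},  # arrives in train.py as --n-estimators
    )
    estimator.fit({"train": "s3://my-bucket/train.csv"}, wait=False)  # launch and move on
```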
@Kmysiak1 · a year ago
@mikegchambers Do you mind doing a video on AutoML, Data Wrangler, etc.? I swear you could explain ML to my grandmother lol
@mikegchambers · a year ago
@Kmysiak1 On the way!
@hejden · a year ago
It seems unnecessarily complicated that SageMaker demands we put the training code in a separate script; it would be easier if we could just put it in the notebook with everything else. It also makes it difficult to monitor and debug the actual training script when it's implemented separately and run as a monolith. Why is this required, and could you just as well put everything in the notebook?
@mikegchambers · a year ago
Hey. I hear your frustration, and I have some thoughts here. This method is all about running the code at scale, and specifically not running it inside the notebook itself. In other words, we are using the notebook not for ML code, but to orchestrate other infrastructure to run our ML code. So when it comes to debugging, you would want to do that debugging earlier in the process, with the ML code (probably in a notebook somewhere), and once you're happy with it, move to the method described here. As for debugging the 'at-scale' production deployment, there are ways to do this that I didn't cover in this video, but I think I should in a future video. I hope that helps put things into perspective. I appreciate you raising the point, and I'll see how I can clarify it in the future.
@hejden · a year ago
@mikegchambers Thank you for the response :) When you say "in a notebook somewhere", are you talking about somewhere in AWS SageMaker? I would like to use SageMaker both for development (running on smaller datasets, checking that the model is set up correctly, monitoring convergence, etc.) and then later maybe for large-scale training. Where do I turn for the former?
@mikegchambers · a year ago
@hejden Yes, absolutely. You can spin up a notebook in SageMaker Notebooks, SageMaker Studio, or even SageMaker Studio Lab (for free) and run the ML code in the notebook 'locally' (local to the notebook server). When you're happy, you can 'migrate' the code to production scale as shown here. It's basically the setup I run through in this video: I show how the ML code works in the notebook, then get it working in the container using SageMaker. Maybe what I could clarify is that in this video I use the same notebook to explore the solution and then get it working in SageMaker-managed containers. There is no need to have both of these steps in the same notebook, and in many real-world scenarios you probably wouldn't. Steps:
- Get your ML code working. This could be done on your own machine, or on a notebook server like SageMaker Notebooks, Studio, or Studio Lab. This code should include methods to load data, and to serialise and deserialise the model for storage.
- Transfer the code into a .py file with the function hooks that SageMaker will be looking for across the lifecycle of the ML tasks (load data, save model, etc.).
- Create some SageMaker code to orchestrate getting your .py file into a managed SageMaker container. This code can run anywhere you have access to the AWS SDKs: your own machine, an EC2 instance, a SageMaker Notebook, or Studio (but probably not Studio Lab at this time).
- Run your orchestration code and SageMaker will handle the rest.
As a preference, I run all code in SageMaker when I can. I don't like local development and dealing with dependencies etc.; it sounds like this is your preference too. Make sense? (I'm typing this on my phone, fingers crossed there are not too many typos!)
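To make those steps concrete, a sketch of the entry-point shape the scikit-learn container expects; the column name, algorithm and file names are placeholders rather than the repo's exact code:

```python
# train.py - minimal Script Mode skeleton: parse SageMaker's arguments, load the
# channel data, train, and save the model where SageMaker expects to find it.
import argparse
import os

import joblib
import pandas as pd
from sklearn.ensemble import RandomForestRegressor


def model_fn(model_dir):
    # Hook used by the serving container when the model is later deployed.
    return joblib.load(os.path.join(model_dir, "model.joblib"))


if __name__ == "__main__":
    parser = argparse.ArgumentParser()
    parser.add_argument("--model-dir", default=os.environ.get("SM_MODEL_DIR"))
    parser.add_argument("--train", default=os.environ.get("SM_CHANNEL_TRAIN"))
    parser.add_argument("--n-estimators", type=int, default=100)  # example hyperparameter
    args, _ = parser.parse_known_args()

    # Placeholder: assumes a CSV with the label in a column named "target".
    df = pd.read_csv(os.path.join(args.train, "train.csv"))
    X, y = df.drop(columns=["target"]), df["target"]

    model = RandomForestRegressor(n_estimators=args.n_estimators).fit(X, y)

    # Anything written to SM_MODEL_DIR is uploaded to S3 as the model artefact.
    joblib.dump(model, os.path.join(args.model_dir, "model.joblib"))
```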
Game On! - SageMaker STUDIO vs SageMaker NOTEBOOKS
11:53
mikegchambers
21K views
What is Amazon SageMaker?
14:26
mikegchambers
63K views
Deploy LLMs (Large Language Models) on AWS SageMaker using DLC
57:06
AWS Sagemaker tutorial | Build and deploy a Machine Learning API with Python
53:32
Computer vision engineer
10K views
Amazon SageMaker Notebooks - Intro to Jupyter and hands on!
26:34
mikegchambers
16K views
How to Deploy ML Solutions with FastAPI, Docker, & AWS
28:48
Shaw Talebi
4.4K views
Build ML models using SageMaker Studio Notebooks - AWS Virtual Workshop
1:16:15
Deliver high-performance ML models faster with MLOps tools
1:01:08
AWS Developers
10K views