Training BERT #3 - Next Sentence Prediction (NSP)
2021-05-25 14:56:47 UTC | https://youtu.be/1gN1snKBLP0
Now we also want to import torch, and we're going to use two sentences here.
So both of these are from the Wikipedia page on the American Civil War and these are both consecutive sentences.
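The exact sentences aren't shown in the transcript, so here is a sketch of the setup with two hypothetical stand-ins:

    import torch

    # Two consecutive sentences, standing in for the pair taken from the
    # Wikipedia page on the American Civil War.
    text = "The American Civil War was a civil war in the United States."
    text2 = "It was fought between the Union and the Confederacy."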
So going back to what we looked at before, we would be hoping that BERT outputs a 0 label for this sentence pair.
Because sentence B is the next sentence after sentence A.
This one being sentence B, this one being sentence A.
So execute that and we now have three different steps that we need to take.
And they are: tokenization, then creating a classification label (the 0 or the 1) so that we can train the model.
And then from that we calculate the loss. So the first step there is tokenization.
We tokenize, and it's pretty easy. All we do is inputs = tokenizer(...), and then we pass text and text2.
And we are using PyTorch here so I want to return a PyTorch tensor.
Make sure that's 'pt', as in return_tensors='pt'.
We also need to initialize those, so tokenizer = BertTokenizer.from_pretrained(...).
And we'll just use bert-base-uncased for now. Obviously you can use another BERT model if you want.
And copy that and initialize our model as well.
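Putting those steps together, a minimal sketch using the Hugging Face transformers library:

    from transformers import BertTokenizer, BertForNextSentencePrediction

    # Initialize the tokenizer and the NSP model from the same checkpoint.
    tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')
    model = BertForNextSentencePrediction.from_pretrained('bert-base-uncased')

    # Tokenize sentence A and sentence B together, returning PyTorch tensors.
    inputs = tokenizer(text, text2, return_tensors='pt')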
Okay now rerun that.
And we'll get this warning; that's because we're using a model that is meant to be trained or fine-tuned.
So it's just telling us that we shouldn't really use this for inference. You need to train it first.
And that's fine because that's our intention.
Now from these inputs we get a few different tensors.
So we have input_ids, token_type_ids and attention_mask.
Now for next sentence prediction we do need all of these.
So this is a little bit different to masked language modeling.
With masked language modeling we don't actually need token_type_ids.
But for next sentence prediction we do.
So let's have a look at what we have inside these.
So input ids is just our tokenized text.
And you see that we pass these two sentences here.
And they're actually both within the same tensor here, input_ids.
And they're separated by this 102 in the middle, which is a separator token.
So before that, all of these tokens are our text variable, or sentence A.
And then afterwards we have our text2 variable, which is sentence B.
And we can see this mirrored in the token type ids tensor as well.
So all the way along here up to here that's our sentence A.
So we have zeros for sentence A.
And then following that we have ones representing sentence B.
And then we have our attention mask, which is just ones, because the attention mask is a one where there's a real token and a zero where we have a padding token.
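To make that concrete, roughly what those three tensors look like (token ids abbreviated; the actual values depend on the sentences):

    print(inputs.keys())
    # dict_keys(['input_ids', 'token_type_ids', 'attention_mask'])

    # input_ids:      [[101, ...sentence A..., 102, ...sentence B..., 102]]
    #                     ^ [CLS]                ^ [SEP]                ^ [SEP]
    # token_type_ids: [[0, 0, ..., 0, 1, 1, ..., 1]]  0s = sentence A, 1s = sentence B
    # attention_mask: [[1, 1, ..., 1]]                all ones, since there is no padding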
So we don't really need to worry about that tensor at all.
Now the next step here is that we need to create a labels tensor.
So to do that we just write labels, and we just need to make sure that when we do this we use a LongTensor.
OK. So we use a LongTensor, and in here we need to pass a list containing a single value, which is either a zero for 'is the next sentence' or a one for 'is not the next sentence'.
In our case our two sentences are supposed to be together.
So we would pass a zero in here.
And run that.
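As a sketch:

    # 0 = sentence B really is the next sentence; 1 = it is not.
    labels = torch.LongTensor([0])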
And if we have a look at what we get from there you see that we get this integer tensor.
So now we're ready to calculate our loss which is really easy.
So we have our model up here which we have already initialized.
So we just take that.
And all we do is pass our inputs from here into our model as keyword arguments.
So that's what these two asterisks are for.
And then we also pass labels to the labels parameter.
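A sketch of that forward pass:

    # ** unpacks input_ids, token_type_ids and attention_mask as keyword
    # arguments; passing labels makes the model also return a loss.
    outputs = model(**inputs, labels=labels)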
OK. And that will output a couple of tensors for us so we can execute that and let's have a look at what we have.
So you see that we get these two tensors: we have the logits and we also have the loss tensor.
So let's have a look at the logits, which we should be able to recognize from earlier on.
We saw those two output nodes, and we had the two values: index zero for 'is next' and index one for 'is not next'.
So let's have a look.
You see here that we get both of those.
This is our activation for is the next sentence.
This is our activation for is not the next sentence.
And if we were to take the argmax of those output logits, we get zero, which means it is the next sentence.
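For example:

    pred = torch.argmax(outputs.logits)
    print(pred)  # tensor(0) -> sentence B is predicted to follow sentence A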
OK. And we also have the loss, and this loss tensor is only output if we pass our labels here.
Otherwise we just get a logits tensor.
So when we're training obviously we need labels so that we can calculate the loss.
And if we just have a look at that we see it's just a loss value which is very small because the model is predicting a zero and the label that we've provided is also a zero.
So the loss is pretty good there.
So that is how NSP works.
Obviously it's slightly different if you're actually training your model and I am going to cover that in the next video.
So I'll leave a link to that in the description.
But for now that's it for this.
Streamlit for ML #5.3 - Publishing Components to Pip
2022-02-28 17:00:29 UTC | https://youtu.be/lZ2EaPUnV7k
Today we're going to look at how we can actually publish the component that we've been building for Streamlit.
So what that means is we can actually pip install the component and then use it in any Streamlit app.
Just as we would a normal Python framework.
So we pip install it and then we just import and use it.
So this is an article I've been putting together that kind of covers this.
So we're basically just going to be going through this.
And what you're going to see at the end is we can actually do this.
So you see here we have this: from st_card_component import card_component.
And then we just do card_component(title, subtitle, body, link).
And that will create our card.
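So, once installed, usage looks roughly like this; the keyword arguments here are illustrative rather than the package's exact signature:

    # pip install st-card-component
    from st_card_component import card_component

    card_component(
        title="A card title",          # hypothetical arguments
        subtitle="A short subtitle",
        body="Some body text for the card.",
        link="https://example.com",
    )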
So there's not that much to go through. It's pretty straightforward I think.
So let's jump into it.
So a little bit of background on how pip is actually working here.
When you pip install something you're actually installing it from this here.
The Python Package Index.
So, PyPI. I think that's how you pronounce it.
And I can go in here and I can search for pandas.
And it's going to show us pandas or it's going to show us a lot of different pandas.
I think it's this one.
OK. Yeah. Pip install pandas at the top there.
Now we can also find the st-card-component package I've already built beforehand.
So if we open this. I think it's this one. Yeah.
It's like version 0.10 at the moment.
So this is the current component. Look there's me.
And you can go ahead and install that.
It's slightly different to the component that we have been putting together.
Not as generic.
I built it for a particular use case which you'll probably see pretty soon.
So what we're going to do is create another one, st-card-component-2 or something.
I don't know. You have to give it a unique name so we can't use the same name again.
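For reference, the name is whatever gets declared in the packaging config; a minimal hypothetical setup.py sketch, assuming setuptools:

    from setuptools import setup, find_packages

    setup(
        name="st-card-component-2",  # must be unique on PyPI
        version="0.1.0",
        packages=find_packages(),
        install_requires=["streamlit"],
    )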
So we'll go ahead and start with that.