| title | published | url | video_id | channel_id | id | text | start | end |
---|---|---|---|---|---|---|---|---|
Fast intro to multi-modal ML with OpenAI's CLIP
|
2022-08-11 13:03:08 UTC
|
https://youtu.be/989aKUVBfbk
|
989aKUVBfbk
|
UCv83tO5cePwHMt1952IVVHw
|
989aKUVBfbk-t1219.5
|
And actually that would be 0, i, and then I'm going to show it, I'm going to plt.show.
| 1,219.5 | 1,239.5 |
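The step described in this segment is just indexing into the result set and displaying the matched images with matplotlib. A minimal, hedged sketch of that kind of step, assuming `images` is the list of PIL images that was embedded earlier and `scores` holds the query-to-image similarity scores (both names are placeholders, not necessarily the ones used in the video):

```python
import numpy as np
import matplotlib.pyplot as plt

# scores[i] is assumed to be the similarity between the text query and images[i].
top_k = np.argsort(-scores)[:5]  # indices of the five highest-scoring images

for i in top_k:
    plt.imshow(images[i])  # display each match in ranked order
    plt.axis("off")
    plt.show()
```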
Fast intro to multi-modal ML with OpenAI's CLIP
|
2022-08-11 13:03:08 UTC
|
https://youtu.be/989aKUVBfbk
|
989aKUVBfbk
|
UCv83tO5cePwHMt1952IVVHw
|
989aKUVBfbk-t1229.5
|
OK, cool. So yeah, I mean, that's it. So the first item, as we would expect, is a dog in the snow.
| 1,229.5 | 1,251.5 |
Fast intro to multi-modal ML with OpenAI's CLIP
|
2022-08-11 13:03:08 UTC
|
https://youtu.be/989aKUVBfbk
|
989aKUVBfbk
|
UCv83tO5cePwHMt1952IVVHw
|
989aKUVBfbk-t1239.5
|
So after that we get dogs and we get like these snowy areas. The reason for that is that we just don't have any more images of dogs in the snow.
| 1,239.5 | 1,256.5 |
Fast intro to multi-modal ML with OpenAI's CLIP
|
2022-08-11 13:03:08 UTC
|
https://youtu.be/989aKUVBfbk
|
989aKUVBfbk
|
UCv83tO5cePwHMt1952IVVHw
|
989aKUVBfbk-t1251.5
|
This one, I don't know what this is. It's like a toy; maybe it's a dog, maybe it's a bear.
| 1,251.5 | 1,260.5 |
Fast intro to multi-modal ML with OpenAI's CLIP
|
2022-08-11 13:03:08 UTC
|
https://youtu.be/989aKUVBfbk
|
989aKUVBfbk
|
UCv83tO5cePwHMt1952IVVHw
|
989aKUVBfbk-t1256.5
|
I'm not sure. But I suppose technically that's like a dog in the snow.
| 1,256.5 | 1,269.5 |
Fast intro to multi-modal ML with OpenAI's CLIP
|
2022-08-11 13:03:08 UTC
|
https://youtu.be/989aKUVBfbk
|
989aKUVBfbk
|
UCv83tO5cePwHMt1952IVVHw
|
989aKUVBfbk-t1260.5
|
So we have that. So yeah obviously the model is performing pretty well and I think that's very cool that we can do that so easily.
| 1,260.5 | 1,282.5 |
Fast intro to multi-modal ML with OpenAI's CLIP
|
2022-08-11 13:03:08 UTC
|
https://youtu.be/989aKUVBfbk
|
989aKUVBfbk
|
UCv83tO5cePwHMt1952IVVHw
|
989aKUVBfbk-t1269.5
|
And yeah, I mean, CLIP is, I think, an amazing model that we can use to do a load of cool things across both the text and image domains, which is super interesting.
| 1,269.5 | 1,297.5 |
Fast intro to multi-modal ML with OpenAI's CLIP
|
2022-08-11 13:03:08 UTC
|
https://youtu.be/989aKUVBfbk
|
989aKUVBfbk
|
UCv83tO5cePwHMt1952IVVHw
|
989aKUVBfbk-t1282.5
|
And if you think about it, just a couple of years ago this sort of thing was impossible, and it didn't seem like it was going to be happening anytime soon, at least not to this degree of accuracy.
| 1,282.5 | 1,305.5 |
Fast intro to multi-modal ML with OpenAI's CLIP
|
2022-08-11 13:03:08 UTC
|
https://youtu.be/989aKUVBfbk
|
989aKUVBfbk
|
UCv83tO5cePwHMt1952IVVHw
|
989aKUVBfbk-t1297.5
|
So this is really cool. Here I've obviously shown you how to do a text-to-image search.
| 1,297.5 | 1,312.5 |
Fast intro to multi-modal ML with OpenAI's CLIP
|
2022-08-11 13:03:08 UTC
|
https://youtu.be/989aKUVBfbk
|
989aKUVBfbk
|
UCv83tO5cePwHMt1952IVVHw
|
989aKUVBfbk-t1305.5
|
You can do more than this, though. In reality, what we're doing is kind of searching through the vectors.
| 1,305.5 | 1,319.5 |
Fast intro to multi-modal ML with OpenAI's CLIP
|
2022-08-11 13:03:08 UTC
|
https://youtu.be/989aKUVBfbk
|
989aKUVBfbk
|
UCv83tO5cePwHMt1952IVVHw
|
989aKUVBfbk-t1312.5
|
So it doesn't matter you know which direction you're doing that search. The vectors are all the same.
| 1,312.5 | 1,325.5 |
Fast intro to multi-modal ML with OpenAI's CLIP
|
2022-08-11 13:03:08 UTC
|
https://youtu.be/989aKUVBfbk
|
989aKUVBfbk
|
UCv83tO5cePwHMt1952IVVHw
|
989aKUVBfbk-t1319.5
|
So if you want to do a text-to-text search with CLIP, you could. If you want to do an image-to-image search, you could.
| 1,319.5 | 1,331.5 |
Fast intro to multi-modal ML with OpenAI's CLIP
|
2022-08-11 13:03:08 UTC
|
https://youtu.be/989aKUVBfbk
|
989aKUVBfbk
|
UCv83tO5cePwHMt1952IVVHw
|
989aKUVBfbk-t1325.5
|
If you want to do image-to-text, or all of those things at once, you could.
| 1,325.5 | 1,337.5 |
Fast intro to multi-modal ML with OpenAI's CLIP
|
2022-08-11 13:03:08 UTC
|
https://youtu.be/989aKUVBfbk
|
989aKUVBfbk
|
UCv83tO5cePwHMt1952IVVHw
|
989aKUVBfbk-t1331.5
|
In the end, you're just searching through vectors, so what is behind those vectors doesn't really matter so much.
| 1,331.5 | 1,349.5 |
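Since CLIP places text and images in the same vector space, the "direction" of the search really is just a choice of which vectors you compare, as the last few segments say. A hedged sketch of that idea, assuming the `clip-ViT-B-32` checkpoint available through the sentence-transformers library and two placeholder image files:

```python
from PIL import Image
from sentence_transformers import SentenceTransformer, util

# CLIP encodes text and images into one shared vector space.
model = SentenceTransformer("clip-ViT-B-32")

img_embs = model.encode([Image.open("dog1.jpg"), Image.open("dog2.jpg")])  # image vectors
txt_embs = model.encode(["a dog in the snow", "two dogs playing"])         # text vectors

print(util.cos_sim(txt_embs, img_embs))  # text-to-image search scores
print(util.cos_sim(img_embs, img_embs))  # image-to-image
print(util.cos_sim(txt_embs, txt_embs))  # text-to-text
```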
Fast intro to multi-modal ML with OpenAI's CLIP
|
2022-08-11 13:03:08 UTC
|
https://youtu.be/989aKUVBfbk
|
989aKUVBfbk
|
UCv83tO5cePwHMt1952IVVHw
|
989aKUVBfbk-t1337.5
|
OK. So I think that's it for this video. I think CLIP is super interesting and I hope that you do as well. In the future, or very soon actually,
| 1,337.5 | 1,353.5 |
Fast intro to multi-modal ML with OpenAI's CLIP
|
2022-08-11 13:03:08 UTC
|
https://youtu.be/989aKUVBfbk
|
989aKUVBfbk
|
UCv83tO5cePwHMt1952IVVHw
|
989aKUVBfbk-t1349.5
|
I'm going to be going into a lot more detail on CLIP.
| 1,349.5 | 1,364.5 |
Fast intro to multi-modal ML with OpenAI's CLIP
|
2022-08-11 13:03:08 UTC
|
https://youtu.be/989aKUVBfbk
|
989aKUVBfbk
|
UCv83tO5cePwHMt1952IVVHw
|
989aKUVBfbk-t1353.5
|
So if you are interested in that subscribe and click on the little notification button and you will get a notification about that pretty soon.
| 1,353.5 | 1,384.5 |
Fast intro to multi-modal ML with OpenAI's CLIP
|
2022-08-11 13:03:08 UTC
|
https://youtu.be/989aKUVBfbk
|
989aKUVBfbk
|
UCv83tO5cePwHMt1952IVVHw
|
989aKUVBfbk-t1364.5
| 1,364.5 | 1,384.5 |
|
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t0.0
|
Today we're going to be having a look at multilingual sentence transformers.
| 0 | 10 |
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t4.4
|
We're going to look at how they work, how they're trained, and why they're so useful.
| 4.4 | 17.6 |
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t10.88
|
We're going to be focusing on one specific training method which I think is quite useful
| 10.88 | 27.44 |
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t17.6
|
because all it really needs is a reasonably small data set of parallel data which is simply
| 17.6 | 33.28 |
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t27.44
|
translation pairs from a source language like English to whichever other language you're using.
| 27.44 | 40 |
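For clarity, "parallel data" here is nothing more exotic than source/translation sentence pairs. A tiny illustrative example (the pairs below are made up to echo the phrases used later in the video):

```python
# Each entry is (source-language sentence, target-language translation).
parallel_pairs = [
    ("I love plants", "Amo le piante"),            # English -> Italian
    ("I like plants", "Mi piacciono le piante"),
    ("I have an orange dog", "Ho un cane arancione"),
]
```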
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t33.28
|
So obviously if you are wanting to train a sentence transformer in a language that
| 33.28 | 45.12 |
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t40.0
|
doesn't really have that much data, particularly sentence similarity data,
| 40 | 53.68 |
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t46.0
|
this can be really useful for actually taking a high performing, for example, English sentence
| 46 | 61.28 |
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t53.68
|
transformer and transferring that knowledge or distilling that knowledge into a sentence
| 53.68 | 68.4 |
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t61.28
|
transformer for your own language. So I think this will be pretty useful for a lot of you.
| 61.28 | 72.08 |
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t69.03999999999999
|
And let's jump straight into it.
| 69.04 | 84.4 |
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t72.08
|
Before we really get into the whole multilingual sentence transformer part of the video,
| 72.08 | 89.68 |
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t85.36
|
I just want to sort of give an impression of what these multilingual sentence transformers
| 85.36 | 98.8 |
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t89.68
|
are actually doing. So on here we can see a single English sentence or brief phrase down
| 89.68 | 106.4 |
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t98.8
|
at the bottom, I love plants, and the rest of these are all in Italian. So what we have here
| 98.8 | 115.36 |
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t106.39999999999999
|
are vector representations, dense vector representations, of these phrases. And a monolingual
| 106.4 | 121.68 |
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t115.36
|
sentence transformer, which is what most sentence transformers are, will only cope with one language.
| 115.36 | 129.04 |
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t121.68
|
So we would hope that phrases that have a similar meaning end up within the same sort of vector
| 121.68 | 141.12 |
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t129.04000000000002
|
space. So like we have for amo le piante here, and I love plants, these are kind of in the same space.
| 129.04 | 150.4 |
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t141.12
|
A monolingual sentence transformer would do that for similar sentences. So in English, we might
| 141.12 | 156.88 |
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t150.4
|
have I love plants and I like plants, which is actually what we have up here. So this here is
| 150.4 | 164.24 |
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t156.88
|
Italian for I like plants. And we would hope that they're in a similar area, whereas irrelevant or
| 156.88 | 173.2 |
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t165.04000000000002
|
almost contradictory sentences we would hope would be far off somewhere else like our vector over here.
| 165.04 | 179.2 |
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t173.2
|
So that's how obviously a monolingual sentence transformer works. And it's exactly the same for
| 173.2 | 185.84 |
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t179.2
|
a multilingual sentence transformer. The only difference is that rather than having a single
| 179.2 | 193.04 |
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t185.83999999999997
|
language, it will comprehend multiple languages. And that's what you can see in this visual. So
| 185.84 | 199.84 |
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t194.16
|
in this example, I have I love plants and amo le piante, they have the same meaning,
| 194.16 | 207.36 |
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t199.84
|
they have the same meaning just in different languages. So that means that they should be as
| 199.84 | 215.84 |
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t207.36
|
close together as possible in this vector space. So here we're just visualizing three dimensions.
| 207.36 | 223.04 |
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t215.84
|
In reality, it'll be a lot more. I think most transformer models go with 768 dimensions.
| 215.84 | 230.16 |
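To make the idea concrete, here is a hedged sketch of encoding the example phrases with an off-the-shelf multilingual sentence transformer; the model name is just one 768-dimensional multilingual checkpoint from the sentence-transformers library, not necessarily the one discussed in the video:

```python
from sentence_transformers import SentenceTransformer, util

# A pretrained multilingual model (example checkpoint, 768-dimensional output).
model = SentenceTransformer("paraphrase-multilingual-mpnet-base-v2")

sentences = ["I love plants", "Amo le piante", "I have an orange dog"]
embeddings = model.encode(sentences)

print(embeddings.shape)                      # (3, 768)
print(util.cos_sim(embeddings, embeddings))  # the two plant sentences should score highest
```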
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t223.04
|
But obviously we can't visualize that, so we have 3D here. So we want similar
| 223.04 | 235.92 |
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t230.16
|
sentences from different languages to end up in the same area. And we also want to be able to
| 230.16 | 243.44 |
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t235.92
|
represent relationships between different sentences that are similar. And we can kind of see that
| 235.92 | 250 |
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t243.44
|
relationship here. So we have mi piacciono le piante and amo le piante and I love plants are all
| 243.44 | 260.08 |
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t250.0
|
kind of in the same sort of area. Mi piacciono le piante, so I like plants, is obviously separated
| 250 | 264.96 |
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t260.08
|
somewhat, but it's still within the same area. And then in the bottom left down there we have
| 260.08 | 274.64 |
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t265.52
|
un cane arancione, which means I have an orange dog. So obviously, you know, that's really nothing
| 265.52 | 280.16 |
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t274.64
|
to do with I love plants. Although I suppose you could say you're talking about yourself, so
| 274.64 | 289.68 |
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t280.15999999999997
|
maybe it's a little bit similar, but otherwise they're completely different topics. So that's
| 280.16 | 295.76 |
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t289.68
|
kind of what we want to build. Something that takes sentences from different languages and maps them
| 289.68 | 303.2 |
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t295.76
|
into a vector space, which has some sort of numerical structure to represent the semantic
| 295.76 | 309.44 |
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t303.2
|
meaning of those sentences. And it should be language agnostic. So obviously we can't, well,
| 303.2 | 314 |
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t310.0
|
maybe we can train on every language. I don't know any models that are trained in every single
| 310 | 323.2 |
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t314.0
|
language, but we want it to be able to comprehend different languages and not be biased towards
| 314 | 328.96 |
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t323.84
|
different phrases in different languages, but to have a very balanced comprehension of all of them.
| 323.84 | 342.56 |
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t328.96
|
Okay. So that's how the vectors should look. So what would the training data
| 328.96 | 349.04 |
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t342.56
|
for this look like? And what are the training approaches? So like I said before, there's two
| 342.56 | 352.88 |
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t349.03999999999996
|
training approaches that I'm going to just briefly touch upon, but we're going to focus on the
| 349.04 | 364.08 |
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t352.88
|
latter of those. So the first one that I want to mention is what the M-U-S-E, MUSE or Multilingual
| 352.88 | 370.8 |
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t364.08
|
Universal Sentence Encoder Model was trained on, which is a multitask
| 364.08 | 385.84 |
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t370.8
|
translation bridging approach to training. So what I mean by that is it uses a dual encoder
| 370.8 | 397.04 |
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t387.28000000000003
|
structure and those encoders deal with two different tasks. So on one end you have the
| 387.28 | 404.4 |
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t397.04
|
parallel data training. So when we say parallel data, these are sentence pairs in different
| 397.04 | 412.4 |
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t404.40000000000003
|
languages. So like we had before, we had Amo le piante and I love plants, which is just the
| 404.4 | 419.84 |
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t412.40000000000003
|
Italian and English phrases for I love plants. So we would have our source language
| 412.4 | 428.88 |
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t419.84
|
and also the translation, or the target language, which is probably the better way to put it.
| 419.84 | 435.36 |
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t429.76
|
So we have the source and translation, that's our parallel data set. And what we're doing is
| 429.76 | 439.76 |
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t435.35999999999996
|
optimizing to get those two vectors or the two sentence vectors produced by either one of those
| 435.36 | 451.52 |
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t439.76
|
sentences as close as possible. And then there is also the source data. So we basically have
| 439.76 | 460.24 |
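The parallel-data side of that multi-task setup boils down to pulling the two encoders' vectors for a translation pair together. The snippet below is only an illustrative objective sketched in PyTorch, not MUSE's actual loss:

```python
import torch
import torch.nn.functional as F

# Given a batch of source-sentence vectors and their translation vectors,
# push each pair together by maximising cosine similarity.
def translation_bridging_loss(src_emb: torch.Tensor, trg_emb: torch.Tensor) -> torch.Tensor:
    return (1 - F.cosine_similarity(src_emb, trg_emb, dim=-1)).mean()

src = torch.randn(8, 768)  # stand-in source-language sentence vectors
trg = torch.randn(8, 768)  # stand-in target-language (translation) vectors
print(translation_bridging_loss(src, trg))
```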
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t451.52
|
like sentence similarity or NLI data, but we have it just for the source language. So we have source
| 451.52 | 470.48 |
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t460.24
|
sentence A and source sentence B. And we train on both of these. Now this is, it works and that's
| 460.24 | 478.88 |
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t470.48
|
good, but obviously we train on a multi-task architecture here and training on a single task
| 470.48 | 484 |
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t478.88
|
in machine learning is already hard enough. Training on two and getting them to balance
| 478.88 | 491.12 |
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t484.0
|
and train well is harder. And the amount of data, at least for Muse and I believe for,
| 484 | 495.2 |
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t491.68
|
if you're training using this approach, you're going to need to use a similar amount of data,
| 491.68 | 502.32 |
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t496.16
|
is pretty significant. I think Muse is something like a billion pairs, so it is pretty high.
| 496.16 | 509.6 |
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t503.36
|
And another thing is that we also need something called hard negatives in the training data
| 503.36 | 517.12 |
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t509.6
|
in order for this model to perform well. So what I mean by hard negative is, say we have our,
| 509.6 | 524.4 |
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t517.84
|
you know, we have our source sentence A here and we have this source B, which is like a similar
| 517.84 | 532.24 |
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t524.4
|
sentence, a high similarity sentence. They mean basically the same thing. We'd also have to add a
| 524.4 | 542.16 |
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t532.24
|
source C, and this source C will have to be similar in the words that it uses to source A,
| 532.24 | 546.24 |
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t542.16
|
but actually means something different. So it's harder for the model to differentiate between
| 542.16 | 552.24 |
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t546.24
|
them. Again, the model would have to figure out, you know, these two sentences are not similar,
| 546.24 | 557.04 |
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t552.24
|
even though they seem similar at first, but they're not. So it makes the task, the training
| 552.24 | 566.4 |
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t557.04
|
task harder for the model, which of course makes the model better. So that is training approach
| 557.04 | 572.4 |
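A hard negative, then, is a third sentence added to each training example that overlaps heavily in wording with the anchor but differs in meaning. A made-up example of what one such row might look like:

```python
# Illustrative (anchor, positive, hard negative) training row.
triplets = [
    {
        "source_a": "I love plants",                 # anchor
        "source_b": "Plants are something I adore",  # similar meaning (positive)
        "source_c": "I love ants",                   # similar wording, different meaning (hard negative)
    },
]
```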
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t566.4
|
number one. And we've mentioned the parallel data there. That's the data set we're going to be using
| 566.4 | 582.56 |
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t572.4
|
for the second training approach. And that second training approach is called multi-lingual
| 572.4 | 595.84 |
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t582.56
|
knowledge distillation. So that is a mouthful and takes me a while to write down, sorry. So
| 582.56 | 606.64 |
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t596.88
|
multi-lingual knowledge distillation. So this was introduced in 2020 by, you know, who we mentioned
| 596.88 | 613.28 |
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t606.64
|
before, the sentence transformers people, Nils Reimers and Iryna Gurevych. And the sort of
| 606.64 | 619.52 |
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t613.28
|
advantage of using this approach is that we only need the parallel data set. So we only need those
| 613.28 | 627.76 |
All You Need to Know on Multilingual Sentence Vectors (1 Model, 50+ Languages)
|
2021-11-04 13:00:10 UTC
|
https://youtu.be/NNS5pOpjvAQ
|
NNS5pOpjvAQ
|
UCv83tO5cePwHMt1952IVVHw
|
NNS5pOpjvAQ-t619.52
|
translation pairs and the amount of training data you need is a lot smaller. And using this approach,
| 619.52 | 635.92 |
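The multilingual knowledge distillation approach described in these last segments is implemented in the sentence-transformers library. The following is a hedged sketch of how such a run is typically wired up; the teacher and student checkpoints and the parallel-data file path are placeholders rather than the exact ones the video goes on to use:

```python
from torch.utils.data import DataLoader
from sentence_transformers import SentenceTransformer, models, losses
from sentence_transformers.datasets import ParallelSentencesDataset

# Teacher: a well-performing (here English) sentence transformer whose vector
# space the student should reproduce (768-dim, so the student can match it).
teacher = SentenceTransformer("all-mpnet-base-v2")

# Student: a multilingual transformer with mean pooling on top.
word_emb = models.Transformer("xlm-roberta-base")
pooling = models.Pooling(word_emb.get_word_embedding_dimension())
student = SentenceTransformer(modules=[word_emb, pooling])

# Parallel data: a TSV of "source sentence <tab> translation" pairs.
train_data = ParallelSentencesDataset(student_model=student, teacher_model=teacher)
train_data.load_data("parallel-sentences-en-it.tsv")  # placeholder path

loader = DataLoader(train_data, shuffle=True, batch_size=32)
loss = losses.MSELoss(model=student)  # student vectors regress onto teacher vectors

student.fit(train_objectives=[(loader, loss)], epochs=1, warmup_steps=100)
```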