| title (string, len 12–112) | published (string, len 19–23) | url (string, len 28) | video_id (string, len 11) | channel_id (string, 5 classes) | id (string, len 16–31) | text (string, len 0–596) | start (float64, 0–37.8k) | end (float64, 2.18–37.8k) |
|---|---|---|---|---|---|---|---|---|
| How to Build Custom Q&A Transformer Models in Python | 2021-02-12 13:30:03 UTC | https://youtu.be/ZIRmXkHp0-c | ZIRmXkHp0-c | UCv83tO5cePwHMt1952IVVHw | ZIRmXkHp0-c-t4176.72 | zero. So when you consider that with our 63.6% accuracy here that means that this | 4,176.72 | 4,190.24 |
| How to Build Custom Q&A Transformer Models in Python | 2021-02-12 13:30:03 UTC | https://youtu.be/ZIRmXkHp0-c | ZIRmXkHp0-c | UCv83tO5cePwHMt1952IVVHw | ZIRmXkHp0-c-t4186.04 | model is actually probably doing pretty well. It's not perfect of course but it's | 4,186.04 | 4,196.24 |
| How to Build Custom Q&A Transformer Models in Python | 2021-02-12 13:30:03 UTC | https://youtu.be/ZIRmXkHp0-c | ZIRmXkHp0-c | UCv83tO5cePwHMt1952IVVHw | ZIRmXkHp0-c-t4190.240000000001 | doing pretty well. So overall that's I mean that's everything for this video | 4,190.24 | 4,201.52 |
| How to Build Custom Q&A Transformer Models in Python | 2021-02-12 13:30:03 UTC | https://youtu.be/ZIRmXkHp0-c | ZIRmXkHp0-c | UCv83tO5cePwHMt1952IVVHw | ZIRmXkHp0-c-t4196.240000000001 | we've gone all the way through this. If you do want to code for this I'm gonna | 4,196.24 | 4,206.64 |
| How to Build Custom Q&A Transformer Models in Python | 2021-02-12 13:30:03 UTC | https://youtu.be/ZIRmXkHp0-c | ZIRmXkHp0-c | UCv83tO5cePwHMt1952IVVHw | ZIRmXkHp0-c-t4201.52 | make sure I keep a link to it in the description so check that out if you | 4,201.52 | 4,212 |
| How to Build Custom Q&A Transformer Models in Python | 2021-02-12 13:30:03 UTC | https://youtu.be/ZIRmXkHp0-c | ZIRmXkHp0-c | UCv83tO5cePwHMt1952IVVHw | ZIRmXkHp0-c-t4206.64 | just want to sort of copy this across. But for now that's everything so thank | 4,206.64 | 4,232.8 |
| How to Build Custom Q&A Transformer Models in Python | 2021-02-12 13:30:03 UTC | https://youtu.be/ZIRmXkHp0-c | ZIRmXkHp0-c | UCv83tO5cePwHMt1952IVVHw | ZIRmXkHp0-c-t4212.0 | | 4,212 | 4,232.8 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t0.0 | Hi and welcome to this video on question answering with Bert. | 0 | 8.56 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t4.0 | So firstly we're going to have a look at the transformers library | 4 | 11.76 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t8.56 | and we're going to look at how we can find a Q&A model. | 8.56 | 14.88 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t12.4 | And then we're going to look at the Q&A pipeline. | 12.4 | 20 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t14.88 | So we're going to look at actually loading a model in python using the transformers library. | 14.88 | 25.28 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t20.0 | We're going to look at tokenization, how we load a tokenizer and what exactly tokenizer is actually doing. | 20 | 29.52 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t25.28 | And then we're going to take a look at the pipeline class, | 25.28 | 34.96 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t29.520000000000003 | which is essentially a wrapper made available by the Hugging Face Transformers library. | 29.52 | 40.96 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t34.96 | And it basically just makes our job in terms of building a Q&A pipeline incredibly easy. | 34.96 | 45.6 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t40.96 | So we're going to cover all those, it's going to be quite straightforward and quite simple. | 40.96 | 47.36 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t45.6 | So let's just get straight into it. | 45.6 | 54.16 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t47.36 | Okay so when we're doing question answering, we're essentially asking the model a question | 47.36 | 60.64 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t54.16 | and passing a context, which is what you can see here for the model to use to answer that question. | 54.16 | 65.68 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t62.0 | So you can see down here we have these three questions. | 62 | 70.16 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t65.67999999999999 | So what organization is the IPCC a part of? | 65.68 | 76.4 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t70.16 | And then the model will read through this and use its language modeling to figure out | 70.16 | 82.4 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t76.4 | which organization the IPCC is part of, which is not inherently clear from reading this. | 76.4 | 88.64 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t82.4 | We can see we've got IPCC here and is a scientific intergovernmental body | 82.4 | 91.2 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t88.64 | under the auspices of the United Nations. | 88.64 | 98.64 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t92.48 | So clearly the IPCC is a part of the United Nations, but it's not clear. | 92.48 | 104.08 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t98.64000000000001 | It's not definitively saying that in this, but once we've actually built this model, | 98.64 | 108.24 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t104.08000000000001 | it will quite easily be able to answer each one of these questions without any issues. | 104.08 | 114 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t108.24 | So the first thing we want to do is go over to the Hugging Face website. | 108.24 | 123.84 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t114.47999999999999 | And on the Hugging Face website, we just want to go over to the Models page. | 114.48 | 124.64 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t123.83999999999999 | So it's here. | 123.84 | 130.64 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t125.84 | Okay and on this Models page, the thing that we want to be looking at | 125.84 | 133.28 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t130.64 | is this question and answering task. | 130.64 | 136.8 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t133.28 | So here we have all these tasks because when you're working with transformers, | 133.28 | 139.28 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t136.8 | they can work with a lot of different things. | 136.8 | 145.28 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t140.24 | Text summarization, text classification, generation, loads of different things. | 140.24 | 147.68 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t145.28 | But what we want to do is question answering. | 145.28 | 153.76 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t147.68 | So we click on here and this filters all of the models that are available to us | 147.68 | 156.8 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t154.24 | just purely for question and answering. | 154.24 | 164.4 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t158.24 | So this is the sort of power of using the Hugging Face Transformers library. | 158.24 | 169.6 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t164.4 | It already has all these pre-trained models that we can just download and start using. | 164.4 | 175.04 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t171.20000000000002 | Now, when you want to go and apply these to specific use cases, | 171.2 | 179.44 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t175.04000000000002 | you probably want to fine tune it, which means you want to train it a little bit more | 175.04 | 180.8 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t179.44 | than what it is already trained. | 179.44 | 186.4 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t181.68 | But for actually getting used to how all of this works, | 181.68 | 190.32 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t186.4 | all you need to do is download this model and start asking questions | 186.4 | 193.36 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t190.32 | and understanding how everything is actually functioning. | 190.32 | 195.92 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t193.36 | So obviously there's a lot of models here. | 193.36 | 199.12 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t195.92000000000002 | We've got 262 models for question answering, | 195.92 | 201.12 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t199.12 | and there's new ones being added all the time. | 199.12 | 205.44 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t201.84 | A few of the ones that I would recommend using are the DeepSets models. | 201.84 | 211.2 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t207.44000000000003 | So here are the DeepSets models, eight of them for question answering. | 207.44 | 214.96 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t211.20000000000002 | The one that we will be using is this BERT Base Case Squad 2. | 211.2 | 219.28 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t214.96 | Another one that I would definitely recommend trying out is this Electra Base Squad 2. | 214.96 | 222.64 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t219.28 | But we will be sticking with BERT Base. | 219.28 | 227.36 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t222.64000000000001 | Now, it's called DeepSet here because it's from the DeepSet AI company, | 222.64 | 230.96 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t227.36 | and this model is being pulled directly from their GitHub repository. | 227.36 | 234.64 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t230.96 | So DeepSet is actually the GitHub organization, | 230.96 | 237.68 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t234.64 | and then this is the repository BERT Base Case Squad 2. | 234.64 | 241.2 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t238.4 | BERT is obviously the model BERT from Google AI. | 238.4 | 245.04 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t242.0 | Base is the base version of BERT. | 242 | 249.84 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t245.04 | So you can see here we have BERT large, that's just a large model. We're using the base model. | 245.04 | 254.88 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t249.84 | Case just refers to the fact that this model will differentiate between | 249.84 | 258.24 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t254.88 | uppercase and lowercase characters or words. | 254.88 | 262.64 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t258.24 | The alternative to this would be uncased here where there's no differentiation | 258.24 | 264.4 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t262.64 | between uppercase and lowercase. | 262.64 | 270.88 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t264.4 | And then Squad 2 refers to the question answering data set that this model has been trained on, | 264.4 | 272.96 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t270.88 | which is the Squad 2 version. | 270.88 | 275.2 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t272.96 | So we're going to take this model. | 272.96 | 280.96 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t275.2 | So you see DeepSet BERT Base Case Squad 2, and we are going to load it into here. | 275.2 | 287.6 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t282.0 | And all we need to do to do that is from transformers. | 282 | 290.48 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t287.59999999999997 | So this is the Hug & Face Transformers library. | 287.6 | 298.48 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t292.24 | We want to import BERT for question and answer. | 292.24 | 305.6 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t298.48 | So this is a specific class and using this class we can initialize a few different models, | 298.48 | 307.36 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t305.6 | not just this specific model. | 305.6 | 309.68 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t307.36 | So you can see here we have this BERT base case. | 307.36 | 314.24 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t309.68 | We can also initialize this BERT large uncased Roberta. | 309.68 | 318.16 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t314.24 | And if there's a Distill BERT as well, we can also load those in. | 314.24 | 324.4 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t319.92 | And what this does is it loads that BERT base case. | 319.92 | 332.48 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t324.4 | And what this does is it loads that specific model with its question and answering layer | 324.4 | 333.76 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t332.47999999999996 | added on there as well. | 332.48 | 338.4 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t333.76 | So this model has been trained with the extra layer specifically for question answering. | 333.76 | 341.92 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t338.4 | And we need to use BERT for question answering to load that. | 338.4 | 347.76 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t341.91999999999996 | Otherwise, if you are not using it with a specific use case and you're just wanting | 341.92 | 352.56 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t347.76 | to get the model itself, you can just use the auto model class like that. | 347.76 | 355.68 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t352.56 | But we want it for question answering, so we load this one. | 352.56 | 361.12 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t355.68 | Another thing to note is that we are using the PyTorch implementation of BERT here. | 355.68 | 368.48 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t361.92 | So Transformers works by having both TensorFlow and PyTorch as alternative frameworks working | 361.92 | 369.68 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t368.48 | behind the scenes. | 368.48 | 371.6 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t369.68 | In this case, we're using PyTorch. | 369.68 | 376.56 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t371.6 | If you want to switch over to TensorFlow, all you do is add TF in front of that class. | 371.6 | 380.48 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t378.4 | So that is our model. | 378.4 | 389.84 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t380.48 | And to actually load that in, all we do is copy this and we use the from pre-trained | 380.48 | 390.34 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t389.84000000000003 | method. | 389.84 | 396.32 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t392.16 | And then this is where the model name from over here comes into play. | 392.16 | 398.96 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t396.32 | So we've got deep set BERT base case squad 2. | 396.32 | 402.88 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t401.36 | And we just enter that in there. | 401.36 | 405.76 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t402.88 | And we just enter that in there. | 402.88 | 417.52 |
| How to Build Q&A Models in Python (Transformers) | 2021-02-19 15:00:21 UTC | https://youtu.be/scJsty_DR3o | scJsty_DR3o | UCv83tO5cePwHMt1952IVVHw | scJsty_DR3o-t414.4 | Okay, and with that, we've actually just loaded the model. | 414.4 | 418.64 |
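
The transcript segments above describe loading a BERT question-answering checkpoint with the Hugging Face Transformers library and wrapping it in the `pipeline` class. A minimal sketch of that flow, assuming the `transformers` package is installed and that the model named in the transcript corresponds to the `deepset/bert-base-cased-squad2` checkpoint (the exact model id is an assumption based on the name mentioned), might look like this:

```python
# Sketch of the workflow described in the transcript segments above.
# Assumes: pip install transformers, and a PyTorch backend.
from transformers import BertForQuestionAnswering, BertTokenizerFast, pipeline

model_name = "deepset/bert-base-cased-squad2"  # assumed id for "DeepSet BERT Base Case Squad 2"

# Load the QA model (BERT with a question-answering head) and its tokenizer.
# Prefixing the model class with "TF" (TFBertForQuestionAnswering) would use
# the TensorFlow implementation instead of PyTorch, as the transcript notes.
model = BertForQuestionAnswering.from_pretrained(model_name)
tokenizer = BertTokenizerFast.from_pretrained(model_name)

# The pipeline class wraps tokenization, inference, and answer decoding.
qa = pipeline("question-answering", model=model, tokenizer=tokenizer)

context = (
    "The Intergovernmental Panel on Climate Change (IPCC) is a scientific "
    "intergovernmental body under the auspices of the United Nations."
)
result = qa(question="What organization is the IPCC a part of?", context=context)
print(result["answer"], result["score"])
```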