{
"cells": [
{
"cell_type": "code",
"execution_count": 23,
"metadata": {},
"outputs": [],
"source": [
"from dotenv import load_dotenv\n",
"load_dotenv()\n",
"import os \n",
"\n",
"api_key = os.environ.get('GOOGLE_API_KEY')"
]
},
{
"cell_type": "code",
"execution_count": 24,
"metadata": {},
"outputs": [],
"source": [
"import google.generativeai as genai\n",
"genai.configure(api_key = api_key)"
]
},
{
"cell_type": "code",
"execution_count": 25,
"metadata": {},
"outputs": [],
"source": [
"Model = genai.GenerativeModel(model_name=\"models/gemini-1.5-pro\")\n",
"\n",
"file = genai.upload_file(\"video.mp4\")"
]
},
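{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Not part of the original run: uploaded videos are processed asynchronously,\n",
"# so a sketch like this (assuming the SDK's genai.get_file and the file.state\n",
"# field) can wait until the file is ACTIVE before it is used in a prompt.\n",
"import time\n",
"\n",
"while file.state.name == \"PROCESSING\":\n",
"    time.sleep(5)\n",
"    file = genai.get_file(file.name)\n",
"\n",
"if file.state.name == \"FAILED\":\n",
"    raise ValueError(\"Video processing failed\")"
]
},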
{
"cell_type": "code",
"execution_count": 27,
"metadata": {},
"outputs": [],
"source": [
"response = Model.generate_content([\"Tell me the summury of this video\",file])"
]
},
{
"cell_type": "code",
"execution_count": 28,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"'This video explains linear regression, a statistical technique used to model the relationship between an output variable and one or more input variables. In simpler terms, this means fitting a line through data points and making predictions using that line. The formula for this is y equals mx + b, where y is the output variable, also called the dependent variable. The x is the input variable, called the independent variable. The m and b variables are the coefficients that are solved for in linear regression. The m variable controls the slope of the line. The b variable controls the intercept of the line. These are also referred to as beta1 and beta0. This equation can also have multiple input variables x1, x2, and x3.\\n\\nThe video also explains how to find the best fit for the regression line using residuals or the differences between the data points and the line. Squaring the residuals, then totaling these squares for a given line will get the sum of squared error (SSE). The beta coefficients that minimize the SSE are the most appropriate for the data.\\n\\nTo validate the regression, machine learning practitioners often divide the data into a training set and a test set. They use the training set to fit the regression line, then use the test set to validate the line. They use the r-squared, standard error, prediction intervals, and statistical significance metrics to evaluate the regression.'"
]
},
"execution_count": 28,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"response.text"
]
},
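{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Illustrative only (not from the video): a minimal numpy sketch of the\n",
"# y = mx + b fit and the sum of squared errors (SSE) that the summary\n",
"# above describes; the sample data here is invented.\n",
"import numpy as np\n",
"\n",
"x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])\n",
"y = np.array([2.1, 4.3, 6.2, 8.1, 9.9])\n",
"\n",
"# Solve for slope m and intercept b by least squares\n",
"A = np.column_stack([x, np.ones_like(x)])\n",
"(m, b), *_ = np.linalg.lstsq(A, y, rcond=None)\n",
"\n",
"residuals = y - (m * x + b)\n",
"sse = np.sum(residuals ** 2)\n",
"print(f\"m={m:.3f}, b={b:.3f}, SSE={sse:.4f}\")"
]
},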
{
"cell_type": "code",
"execution_count": 29,
"metadata": {},
"outputs": [
{
"data": {
"text/markdown": [
"This video explains linear regression, a statistical technique used to model the relationship between an output variable and one or more input variables. In simpler terms, this means fitting a line through data points and making predictions using that line. The formula for this is y equals mx + b, where y is the output variable, also called the dependent variable. The x is the input variable, called the independent variable. The m and b variables are the coefficients that are solved for in linear regression. The m variable controls the slope of the line. The b variable controls the intercept of the line. These are also referred to as beta1 and beta0. This equation can also have multiple input variables x1, x2, and x3.\n",
"\n",
"The video also explains how to find the best fit for the regression line using residuals or the differences between the data points and the line. Squaring the residuals, then totaling these squares for a given line will get the sum of squared error (SSE). The beta coefficients that minimize the SSE are the most appropriate for the data.\n",
"\n",
"To validate the regression, machine learning practitioners often divide the data into a training set and a test set. They use the training set to fit the regression line, then use the test set to validate the line. They use the r-squared, standard error, prediction intervals, and statistical significance metrics to evaluate the regression."
],
"text/plain": [
"<IPython.core.display.Markdown object>"
]
},
"metadata": {},
"output_type": "display_data"
}
],
"source": [
"from IPython.display import Markdown\n",
"display(Markdown(response.text))"
]
},
{
"cell_type": "code",
"execution_count": 30,
"metadata": {},
"outputs": [],
"source": [
"response = Model.generate_content([\"what is the equations provided in the video\",file])"
]
},
{
"cell_type": "code",
"execution_count": 31,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"'The narrator in this video describes the basic process of using linear regression, a type of statistical modeling that assumes a linear relationship between variables. In the equation, the output or dependent variable y can be expressed as a function of an input or independent variable, such as x, or as several input variables, x1, x2, x3, etc. He shows the two main ways these equations are generally expressed:\\n\\ny=mx+b\\nf(x) = mx + b\\ny = β1x + β0\\ny = β2x2 + β1x1 + β0\\ny = β3x3 + β2x2 + β1x1 + β0\\ny = β4x4 + β3x3 + β2x2 + β1x1 + β0\\ny = β5x5 + β4x4 + β3x3 + β2x2 + β1x1 + β0\\n\\n\\n'"
]
},
"execution_count": 31,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"response.text"
]
},
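{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Also illustrative (not from the video): fitting the multi-variable form\n",
"# y = β3x3 + β2x2 + β1x1 + β0 listed above by least squares, with made-up data.\n",
"import numpy as np\n",
"\n",
"rng = np.random.default_rng(0)\n",
"X = rng.normal(size=(50, 3))  # columns are x1, x2, x3\n",
"true_betas = np.array([1.5, -2.0, 0.7])\n",
"y = X @ true_betas + 0.3 + rng.normal(scale=0.1, size=50)\n",
"\n",
"# Append a column of ones so the intercept β0 is estimated as well\n",
"X1 = np.column_stack([X, np.ones(len(X))])\n",
"betas, *_ = np.linalg.lstsq(X1, y, rcond=None)\n",
"print(\"estimated [β1, β2, β3, β0]:\", np.round(betas, 3))"
]
},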
{
"cell_type": "code",
"execution_count": 32,
"metadata": {},
"outputs": [
{
"data": {
"text/markdown": [
"The narrator in this video describes the basic process of using linear regression, a type of statistical modeling that assumes a linear relationship between variables. In the equation, the output or dependent variable y can be expressed as a function of an input or independent variable, such as x, or as several input variables, x1, x2, x3, etc. He shows the two main ways these equations are generally expressed:\n",
"\n",
"y=mx+b\n",
"f(x) = mx + b\n",
"y = β1x + β0\n",
"y = β2x2 + β1x1 + β0\n",
"y = β3x3 + β2x2 + β1x1 + β0\n",
"y = β4x4 + β3x3 + β2x2 + β1x1 + β0\n",
"y = β5x5 + β4x4 + β3x3 + β2x2 + β1x1 + β0\n",
"\n",
"\n"
],
"text/plain": [
"<IPython.core.display.Markdown object>"
]
},
"metadata": {},
"output_type": "display_data"
}
],
"source": [
"display(Markdown(response.text))"
]
},
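{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Optional cleanup (not part of the original run): remove the uploaded video\n",
"# from the File API once it is no longer needed. Assumes the SDK exposes\n",
"# genai.delete_file for files created with genai.upload_file.\n",
"genai.delete_file(file.name)"
]
},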
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": []
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.10.0"
}
},
"nbformat": 4,
"nbformat_minor": 2
}