{
"nbformat": 4,
"nbformat_minor": 0,
"metadata": {
"colab": {
"name": "Untitled0.ipynb",
"provenance": [],
"include_colab_link": true
},
"kernelspec": {
"name": "python3",
"display_name": "Python 3"
},
"accelerator": "GPU"
},
"cells": [
{
"cell_type": "markdown",
"metadata": {
"id": "view-in-github",
"colab_type": "text"
},
"source": [
"<a href=\"https://colab.research.google.com/github/hzwer/Practical-RIFE/blob/main/Colab_demo.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
]
},
{
"cell_type": "code",
"metadata": {
"id": "FypCcZkNNt2p"
},
"source": [
"%cd /content\n",
"!git clone https://github.com/hzwer/Practical-RIFE"
],
"execution_count": null,
"outputs": []
},
{
"cell_type": "code",
"metadata": {
"id": "1wysVHxoN54f"
},
"source": [
"!gdown --id 1O5KfS3KzZCY3imeCr2LCsntLhutKuAqj\n",
"!7z e Practical-RIFE/RIFE_trained_model_v3.8.zip"
],
"execution_count": null,
"outputs": []
},
{
"cell_type": "code",
"metadata": {
"id": "AhbHfRBJRAUt"
},
"source": [
"!mkdir /content/Practical-RIFE/train_log\n",
"!mv *.py /content/Practical-RIFE/train_log/\n",
"!mv *.pkl /content/Practical-RIFE/train_log/\n",
"%cd /content/Practical-RIFE/\n",
"!gdown --id 1i3xlKb7ax7Y70khcTcuePi6E7crO_dFc\n",
"!pip3 install -r requirements.txt"
],
"execution_count": null,
"outputs": []
},
{
"cell_type": "markdown",
"metadata": {
"id": "rirngW5uRMdg"
},
"source": [
        "Please upload your video to /content/Practical-RIFE/video.mp4, or use our demo video."
]
},
{
"cell_type": "code",
"metadata": {
"id": "dnLn4aHHPzN3"
},
"source": [
"!nvidia-smi\n",
"!python3 inference_video.py --exp=1 --video=demo.mp4 --montage --skip"
],
"execution_count": null,
"outputs": []
},
{
"cell_type": "markdown",
"metadata": {
"id": "77KK6lxHgJhf"
},
"source": [
        "Our demo.mp4 is 25FPS. You can adjust the parameters to your own preference.\n",
        "For example:\n",
        "`--fps=60 --exp=1 --video=mydemo.avi --png`"
]
},
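  {
   "cell_type": "code",
   "metadata": {
    "id": "example-own-video"
   },
   "source": [
    "# Sketch (not run by default): interpolate your own upload to 60FPS and also dump PNG frames.\n",
    "# Assumes you uploaded a file named mydemo.avi to /content/Practical-RIFE/ (hypothetical name\n",
    "# taken from the example flags above). Uncomment to run:\n",
    "# !python3 inference_video.py --fps=60 --exp=1 --video=mydemo.avi --png"
   ],
   "execution_count": null,
   "outputs": []
  },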
{
"cell_type": "code",
"metadata": {
"id": "0zIBbVE3UfUD",
"cellView": "code"
},
"source": [
        "from IPython.display import display\n",
        "import moviepy.editor as mpy\n",
        "# With --exp=1 on the 25FPS demo, inference_video.py names its output demo_2X_50fps.mp4\n",
        "display(mpy.ipython_display('demo_2X_50fps.mp4', height=256, max_duration=100.))"
],
"execution_count": null,
"outputs": []
},
{
"cell_type": "code",
"metadata": {
"id": "tWkJCNgP3zXA"
},
"source": [
        "# Image interpolation: generate frames between two stills, then assemble a GIF\n",
        "!python3 inference_img.py --img demo/I0_0.png demo/I0_1.png\n",
        "!ffmpeg -r 10 -f image2 -i output/img%d.png -s 448x256 -vf \"split[s0][s1];[s0]palettegen=stats_mode=single[p];[s1][p]paletteuse=new=1\" output/slomo.gif"
],
"execution_count": null,
"outputs": []
}
]
}