
Media (1)
-
Revolution of Open-source and film making towards open film making
6 October 2011
Updated: July 2013
Language: English
Type: Text
Other articles (21)
-
Automatic backup of SPIP channels
1 April 2010
When setting up an open platform, it is important for hosting providers to have fairly regular backups available to guard against any potential problem.
This task relies on two SPIP plugins: Saveauto, which performs a regular backup of the database as a MySQL dump (usable in phpMyAdmin), and mes_fichiers_2, which creates a zip archive of the site's important data (the documents, the elements (...) -
Automated installation script of MediaSPIP
25 April 2011
To overcome installation difficulties, mainly due to server-side software dependencies, an "all-in-one" installation script written in bash was created to facilitate this step on a server running a compatible Linux distribution.
You must have SSH access to your server and a root account in order to use the script, which will install the dependencies. Contact your hosting provider if you do not have these.
The documentation for using this installation script is available here.
The code of this (...)
On other sites (1333)
-
Reducing a ffmpeg two-pass xml profile to a single pass profile
14 October 2013, by M.frank
I am trying to reduce the XML profile below to a single-pass profile rather than a two-pass profile.
The settings should (if possible) stay the same. The profile looks like this:
<?xml version="1.0" encoding="utf-8"?>
-i %infile% -vcodec libx264 -aspect 4:3 -s 480x360 -r 25.000 -vb 380000 -vprofile main -level 3.0 -pix_fmt yuv420p -pass 1 -sn -an -y %outfile%
-i %infile% -vcodec libx264 -aspect 4:3 -s 480x360 -r 25.000 -vb 380000 -vprofile main -level 3.0 -pix_fmt yuv420p -pass 2 -sn -acodec libvo_aacenc -ab 48000 -ar 22050 -ac 2 -y %outfile%
My idea was to just remove the first-pass command line, but I want to make sure that this is correct.
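A plausible single-pass reduction, sketched from the two commands above (untested assumption): drop the first (-pass 1) line entirely, and in the second line remove -pass 2 so x264 runs in one-pass bitrate mode, keeping the audio settings that only appeared in pass two:

```shell
-i %infile% -vcodec libx264 -aspect 4:3 -s 480x360 -r 25.000 -vb 380000 -vprofile main -level 3.0 -pix_fmt yuv420p -sn -acodec libvo_aacenc -ab 48000 -ar 22050 -ac 2 -y %outfile%
```

Note that without the two-pass statistics, the encoder can no longer distribute bits with knowledge of the whole clip, so quality at the same average bitrate may be slightly lower.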
-
How to save variables at start of script for use later if script needs to be re-run due to errors or bad user input
28 November 2023, by slyfox1186
I have a script that uses GitHub's API to get the latest version number of each repository that I am trying to download and compile.

Without a specialized GitHub token you are only allowed 50 API calls a day, versus the 5000 a day allowed with an API user token.

I want to parse all of the repositories and grab the version numbers up front, so that someone who accidentally cancels the build in the middle of it (for whatever reason) won't have to eat up their 50-call daily API allowance.

Essentially, store each repo's version number; if the user then needs to re-run the script, any version numbers that have already been saved will be skipped (eliminating an API call), and any that still need to be sourced will be fetched and then stored for use in the script.

I am kind of lost for a method on how to go about this. Maybe some sort of external file could be generated?

What my script does is build FFmpeg from source code, along with all of the external libraries that can be linked to it, each also built from its latest source code.

The code calls the function git_ver_fn and passes arguments to it, which are parsed inside the function and directed to one of two other functions, git_1_fn or git_2_fn. Those functions pass the parsed arguments on to the curl command, which changes its URL based on the arguments. The script uses the jq command to capture the GitHub version number and the download link for the tar.gz file.

It is the version number I am trying to figure out the best way to store, in case the script fails and has to be re-run, which would eat up the 50-call API limit that GitHub imposes without a token. I can't post my token in the script because GitHub deactivates it, and the users would then be SOL if they need to run the script more than once.


curl_timeout='5'

git_1_fn()
{
    # Query the GitHub API for the latest repo version
    github_repo="$1"
    github_url="$2"

    if curl_cmd="$(curl -m "$curl_timeout" -sSL "https://api.github.com/repos/$github_repo/$github_url")"; then
        g_ver="$(echo "$curl_cmd" | jq -r '.[0].name')"
        g_ver="${g_ver#v}"
        g_ssl="$(echo "$curl_cmd" | jq -r '.[0].name')"
        g_ssl="${g_ssl#OpenSSL }"
        g_pkg="$(echo "$curl_cmd" | jq -r '.[0].name')"
        g_pkg="${g_pkg#pkg-config-}"
        g_url="$(echo "$curl_cmd" | jq -r '.[0].tarball_url')"
    fi
}

git_2_fn()
{
    # Query the VideoLAN GitLab API for the latest repo version
    videolan_repo="$1"
    videolan_url="$2"

    if curl_cmd="$(curl -m "$curl_timeout" -sSL "https://code.videolan.org/api/v4/projects/$videolan_repo/repository/$videolan_url")"; then
        g_ver="$(echo "$curl_cmd" | jq -r '.[0].commit.id')"
        g_sver="$(echo "$curl_cmd" | jq -r '.[0].commit.short_id')"
        g_ver1="$(echo "$curl_cmd" | jq -r '.[0].name')"
        g_ver1="${g_ver1#v}"
    fi
}

git_ver_fn()
{
    # Dispatch to the matching API helper based on the tag/flag arguments
    local v_flag v_tag url_tag

    v_url="$1"
    v_tag="$2"

    if [ -n "$3" ]; then v_flag="$3"; fi

    if [ "$v_flag" = 'B' ] && [ "$v_tag" = '2' ]; then
        url_tag='git_2_fn'; gv_url='branches'
    fi

    if [ "$v_flag" = 'X' ] && [ "$v_tag" = '5' ]; then
        url_tag='git_5_fn'
    fi

    if [ "$v_flag" = 'T' ] && [ "$v_tag" = '1' ]; then
        url_tag='git_1_fn'; gv_url='tags'
    elif [ "$v_flag" = 'T' ] && [ "$v_tag" = '2' ]; then
        url_tag='git_2_fn'; gv_url='tags'
    fi

    if [ "$v_flag" = 'R' ] && [ "$v_tag" = '1' ]; then
        url_tag='git_1_fn'; gv_url='releases'
    elif [ "$v_flag" = 'R' ] && [ "$v_tag" = '2' ]; then
        url_tag='git_2_fn'; gv_url='releases'
    fi

    case "$v_tag" in
        2) url_tag='git_2_fn';;
    esac

    "$url_tag" "$v_url" "$gv_url" 2>/dev/null
}

# begin source code building
git_ver_fn 'freedesktop/pkg-config' '1' 'T'
if build 'pkg-config' "$g_pkg"; then
    download "https://pkgconfig.freedesktop.org/releases/$g_ver.tar.gz" "$g_ver.tar.gz"
    execute ./configure --silent --prefix="$workspace" --with-pc-path="$workspace"/lib/pkgconfig/ --with-internal-glib
    execute make -j "$cpu_threads"
    execute make install
    build_done 'pkg-config' "$g_pkg"
fi

git_ver_fn 'yasm/yasm' '1' 'T'
if build 'yasm' "$g_ver"; then
    download "https://github.com/yasm/yasm/releases/download/v$g_ver/yasm-$g_ver.tar.gz" "yasm-$g_ver.tar.gz"
    execute ./configure --prefix="$workspace"
    execute make -j "$cpu_threads"
    execute make install
    build_done 'yasm' "$g_ver"
fi
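One common answer to the "external file" idea is a plain key=value cache on disk. The sketch below is a hypothetical illustration, not part of the original script: the file name ver_cache.txt and the helpers cache_get/cache_set are assumptions. Before calling the GitHub API for a repo, the script would first try cache_get; on a miss it would do the real API call and then cache_set the result, so a re-run only spends API calls on repos that were never resolved.

```shell
# Hypothetical version cache (assumed names: ver_cache.txt, cache_get, cache_set).
ver_cache="${ver_cache:-ver_cache.txt}"

# Print a previously stored version for a repo; return non-zero on a miss.
cache_get()
{
    local repo="$1"
    [ -f "$ver_cache" ] || return 1
    grep -m1 "^${repo}=" "$ver_cache" | cut -d'=' -f2- | grep .
}

# Store (or overwrite) a repo's version in the cache file.
cache_set()
{
    local repo="$1" ver="$2"
    if [ -f "$ver_cache" ]; then
        grep -v "^${repo}=" "$ver_cache" > "${ver_cache}.tmp" || true
        mv "${ver_cache}.tmp" "$ver_cache"
    fi
    printf '%s=%s\n' "$repo" "$ver" >> "$ver_cache"
}
```

Used inside git_ver_fn, the pattern would look roughly like: if g_ver="$(cache_get "$v_url")" then skip the curl call, otherwise fetch as before and finish with cache_set "$v_url" "$g_ver". Deleting ver_cache.txt forces a full refresh.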



-
Revision 0ba1542f12: Vidyo: Support for one-pass rc-enabled SVC encoder
7 November 2014, by Deb Mukherjee

Changed Paths:
Modify /examples/vp9_spatial_svc_encoder.c
Modify /vp9/vp9_cx_iface.c
Modify /vpx/src/svc_encodeframe.c
Modify /vpx/svc_context.h
Modify /vpx/vp8cx.h
Modify /vpx/vpx_encoder.h

Vidyo: Support for one-pass rc-enabled SVC encoder

Adds support for a one-pass rc-enabled SVC encoder with callbacks for getting per-layer packets. The callback function registration is implemented as an encoder control function. If the callback function is not registered, the old way of aggregating packets with a superframe takes effect. One more control function, "VP9E_GET_SVC_LAYER_ID", has been implemented to get the temporal/spatial id from the encoder within the callback. This can be used to get the ids to put on RTP packets.

Change-Id: I1a90e00135dde65da128b758e6c00b57299a111a