
Other articles (51)
-
Updating from version 0.1 to 0.2
24 June 2013 — Explanation of the notable changes when moving from MediaSPIP version 0.1 to version 0.3. What's new
Regarding software dependencies: the latest versions of FFMpeg (>= v1.2.1) are used; the dependencies for Smush are installed; MediaInfo and FFprobe are installed to retrieve metadata; ffmpeg2theora is no longer used; flvtool2 is no longer installed, in favour of flvtool++; ffmpeg-php, which is no longer maintained, is no longer installed (...)
-
Customising by adding your logo, banner or background image
5 September 2013 — Some themes take into account three customisation elements: adding a logo; adding a banner; adding a background image.
-
Writing a news item
21 June 2013 — Present changes to your MediaSPIP site, or news about your projects, using the news section.
In MediaSPIP's default theme, spipeo, news items are displayed at the bottom of the main page, below the editorials.
You can customise the form used to create a news item.
News item creation form: for a document of the "news item" type, the fields offered by default are: publication date (customise the publication date) (...)
On other sites (10182)
-
How to pipe frames to stdout using FFMPEG and Golang
6 April 2020, by Sven.DG — I am fairly new to Golang; my goal is to build a script in Golang that:

- transcodes the input .MOV video file into frames, using FFMPEG;
- runs some downstream process that does something with the emitted frames.

Currently I have the following code as a first attempt:



package main

import (
	"fmt"
	"io"
	"os/exec"
	"strconv"
)

const (
	frameX    = 240
	frameY    = 135
	frameSize = frameX * frameY * 3
)

func main() {
	ffmpeg := exec.Command("ffmpeg", "-i", "project/data/sample.mov", "-vf", "fps=fps=1/2", "-s", strconv.Itoa(frameX)+"x"+strconv.Itoa(frameY), "-f", "rawvideo", "pipe:1") //nolint
	ffmpegOut, _ := ffmpeg.StdoutPipe()

	if err := ffmpeg.Start(); err != nil {
		panic(err)
	}

	reader(ffmpegOut)
	fmt.Println("Completed. YAY")
}

func reader(ffmpegOut io.Reader) {
	buf := make([]byte, frameSize)
	fmt.Println("Looping..")
	counter := 0
	for {
		if _, err := io.ReadFull(ffmpegOut, buf); err != nil {
			// fmt.Println(err)
		}
		if buf[0] != 0 {
			fmt.Println("Got a frame!!")
			counter++
			fmt.Println(counter)
		}
	}
	fmt.Println("total amount of frames", counter)
}




This is mainly based on this example: https://github.com/pion/example-webrtc-applications/blob/master/gocv-receive/main.go



However, when I run this code, I appear to get an endless amount of data in my downstream reader. I would expect the counter to equal the number of frames, but judging by the rate at which the counter increases that is clearly not the case, so I am obviously doing something wrong. Can anybody point me in the right direction?


I used the sample .mov file with 1280x720 resolution from here: https://file-examples.com/index.php/sample-video-files/sample-mov-files-download/.
The video is about 31 seconds long, so I would expect to see 15 frames downstream in the reader.



Cheers!
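
For reference, here is a minimal sketch of the same fixed-size read loop with explicit end-of-stream handling, written in Python purely to illustrate the pattern. Note two assumptions that differ from the command above: -pix_fmt rgb24 is added so that the 3-bytes-per-pixel frame size actually matches what ffmpeg emits, and the loop breaks on a short read instead of ignoring the error.

import subprocess

FRAME_X, FRAME_Y = 240, 135
FRAME_SIZE = FRAME_X * FRAME_Y * 3  # only valid for 3-bytes-per-pixel formats such as rgb24

cmd = [
    "ffmpeg",
    "-i", "project/data/sample.mov",
    "-vf", "fps=fps=1/2",
    "-s", f"{FRAME_X}x{FRAME_Y}",
    "-pix_fmt", "rgb24",   # force a known pixel format so FRAME_SIZE is correct
    "-f", "rawvideo",
    "pipe:1",
]

proc = subprocess.Popen(cmd, stdout=subprocess.PIPE)

counter = 0
while True:
    buf = proc.stdout.read(FRAME_SIZE)
    if len(buf) < FRAME_SIZE:
        # empty or short read: ffmpeg has closed its stdout, so stop instead of spinning forever
        break
    counter += 1

proc.wait()
print("total number of frames:", counter)

The key difference from the Go attempt above is that the end-of-stream condition is not silently ignored: once ffmpeg finishes, the loop exits and the counter reflects the number of complete frames actually received.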


-
How to pipe rawvideo to v4l2loopback using ffmpeg?
3 August 2020, by sepehr78 — I am trying to process a video with OpenCV in Python and then send each frame to a virtual camera (i.e., v4l2loopback). I have seen questions where OpenCV output is piped to ffmpeg and saved to a file, and other questions where a video file is piped to v4l2 using ffmpeg, but none where the two are combined. I can do either of the two things on its own, but not combined.



My Python code uses a subprocess to pipe each frame to ffmpeg. The ffmpeg command for piping the output of OpenCV to an .mp4 file is



ffmpeg -y -f rawvideo -vcodec rawvideo -s 1280x720 -pix_fmt bgr24 -i - -vcodec libx264 -crf 0 -preset fast output.mp4




This works and I have tested it.



The ffmpeg command to pipe a video file to the v4l2 virtual camera is



ffmpeg -re -i input.mp4 -map 0:v -f v4l2 /dev/video0




This also works and I have tested it.



I tried combining the above two commands and came up with



ffmpeg -y -f rawvideo -vcodec rawvideo -s 1280x720 -pix_fmt bgr24 -i - -vcodec libx264 -crf 0 -preset fast -map 0:v -f v4l2 /dev/video0




but I get the following error





[NULL @ 0x55a12fcc60] Unable to find a suitable output format for ''
 : Invalid argument





I would be glad if anyone could help me figure this out.



Thanks.
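
For what it's worth, one way to combine the two steps, sketched here as untested Python (assuming 1280x720 BGR frames and that /dev/video0 is the v4l2loopback device), is to skip the libx264 encode entirely, since the v4l2 output takes raw frames rather than an H.264 bitstream, and let ffmpeg convert the piped bgr24 frames to yuv420p:

import subprocess

import cv2

WIDTH, HEIGHT = 1280, 720

# Read raw bgr24 frames from stdin and write them to the v4l2loopback device,
# converting to yuv420p on the way; no libx264 step is involved because
# /dev/video0 expects raw frames, not an encoded bitstream.
cmd = [
    "ffmpeg",
    "-f", "rawvideo",
    "-pix_fmt", "bgr24",
    "-s", f"{WIDTH}x{HEIGHT}",
    "-i", "-",
    "-f", "v4l2",
    "-pix_fmt", "yuv420p",
    "/dev/video0",
]

proc = subprocess.Popen(cmd, stdin=subprocess.PIPE)

cap = cv2.VideoCapture("input.mp4")  # or any other OpenCV frame source
while True:
    ok, frame = cap.read()
    if not ok:
        break
    # frame is an HxWx3 uint8 BGR array, which matches -pix_fmt bgr24
    proc.stdin.write(frame.tobytes())

proc.stdin.close()
proc.wait()

Whether yuv420p is the right target depends on what the application reading from the loopback device expects; yuyv422 is another common choice for v4l2 consumers.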


-
How to add BGM when piping opencv images to ffmpeg - python
23 March 2020, by karobben — Basically, I know how to stream by piping opencv to ffmpeg using python.
But my problem is that I can't add an audio file (BGM) to it. Does anyone know how to do this with python? My python code is:
import cv2 as cv
import subprocess as sp

# ffmpeg command
command = ['ffmpeg',
           #'-re', '-stream_loop', '-1',
           #'-i', '/home/pi/scrpt/Blive/StarBucks_BGN.mp3',
           '-y',
           '-f', 'rawvideo',
           '-vcodec', 'rawvideo',
           '-pix_fmt', 'bgr24',
           '-s', "{}x{}".format(width, height),
           '-r', str(fps),
           '-i', '-',
           '-c:v', 'libx264',
           '-pix_fmt', 'yuv420p',
           '-preset', 'ultrafast',
           '-f', 'flv',
           rtmpUrl]

I know that I can achieve this by:
sudo raspivid -o - -t 0 -w 1280 -h 720 -fps 24 -b 1000000 |
ffmpeg -re -stream_loop -1 -i "/home/pi/scrpt/Blive/StarBucks_BGN.mp3" \
-f h264 -i - -vcodec copy -r 30 -acodec aac -b:a 100k -preset ultrafast \
-tune zerolatency -f flv "rtmp://"

So, I tried to add

-re -stream_loop -1 -i "/home/pi/scrpt/Blive/StarBucks_BGN.mp3"

into the python pipe, but it crashed with:

[libx264 @ 0x1952aa0] using cpu capabilities: ARMv6 NEON
[libx264 @ 0x1952aa0] profile Constrained Baseline, level 3.1
[libx264 @ 0x1952aa0] 264 - core 148 r2748 97eaef2 - H.264/MPEG-4 AVC codec - Copyleft 2003-2016 - http://www.videolan.org/x264.html - options: cabac=0 ref=1 deblock=0:0:0 analyse=0:0 me=dia subme=0 psy=1 psy_rd=1.00:0.00 mixed_ref=0 me_range=16 chroma_me=1 trellis=0 8x8dct=0 cqm=0 deadzone=21,11 fast_pskip=1 chroma_qp_offset=0 threads=6 lookahead_threads=1 sliced_threads=0 nr=0 decimate=1 interlaced=0 bluray_compat=0 constrained_intra=0 bframes=0 weightp=0 keyint=250 keyint_min=25 scenecut=0 intra_refresh=0 rc=crf mbtree=0 crf=23.0 qcomp=0.60 qpmin=0 qpmax=69 qpstep=4 ip_ratio=1.40 aq=0
[flv @ 0x1951d00] FLV does not support sample rate 48000, choose from (44100, 22050, 11025)
[flv @ 0x1951d00] Audio codec mp3 not compatible with flv
Could not write header for output file #0 (incorrect codec parameters ?): Function not implemented
Stream mapping:
Stream #1:0 -> #0:0 (rawvideo (native) -> h264 (libx264))
Stream #0:0 -> #0:1 (mp3 (native) -> mp3 (libmp3lame))
Last message repeated 1 times
Traceback (most recent call last):
File "With_BG.py", line 57, in <module>
p.stdin.write(BG.tostring())
BrokenPipeError: [Errno 32] Broken pipe
</module>
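
One hedged guess at a fix, based on the crash log above (the FLV muxer refusing 48 kHz mp3 audio): keep the mp3 as a second input in the Python command list, but re-encode the audio to AAC at 44100 Hz, mirroring what the raspivid shell pipeline already does. This is only a sketch; the width, height, fps and rtmpUrl placeholders below stand in for the values used in the question.

import subprocess as sp

width, height, fps = 1280, 720, 24   # as in the raspivid pipeline above
rtmpUrl = "rtmp://"                  # the same (redacted) stream URL as in the question

command = ['ffmpeg', '-y',
           # input 0: looping background music
           '-re', '-stream_loop', '-1',
           '-i', '/home/pi/scrpt/Blive/StarBucks_BGN.mp3',
           # input 1: raw BGR frames piped from OpenCV via stdin
           '-f', 'rawvideo',
           '-vcodec', 'rawvideo',
           '-pix_fmt', 'bgr24',
           '-s', "{}x{}".format(width, height),
           '-r', str(fps),
           '-i', '-',
           # video from the pipe, audio from the mp3
           '-map', '1:v', '-map', '0:a',
           '-c:v', 'libx264',
           '-pix_fmt', 'yuv420p',
           '-preset', 'ultrafast',
           # FLV rejects 48 kHz mp3, so re-encode to AAC at 44100 Hz
           '-c:a', 'aac', '-ar', '44100', '-b:a', '100k',
           '-f', 'flv',
           rtmpUrl]

p = sp.Popen(command, stdin=sp.PIPE)
# ... then write each OpenCV frame with p.stdin.write(frame.tobytes()) as before.

The explicit -map options only make the intent clear; the essential change relative to the error output is forcing AAC at 44100 Hz so that the FLV muxer accepts the audio stream.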