
Other articles (65)
-
Customising by adding your logo, banner or background image
5 September 2013
Some themes support three customisation elements: adding a logo; adding a banner; adding a background image.
-
Publishing on MédiaSpip
13 June 2013
Can I post content from an iPad tablet?
Yes, if your installed Médiaspip is at version 0.2 or higher. If necessary, contact the administrator of your MédiaSpip to find out.
-
Adding user-specific information and other changes to author-related behaviour
12 April 2011
The simplest way to add information to authors is to install the Inscription3 plugin. It also lets you modify certain user-related behaviours (refer to its documentation for more information).
It is also possible to add fields to authors by installing the champs extras 2 and Interface pour champs extras plugins.
On other sites (13936)
-
How to use ffmpeg to calculate the output video file size before converting?
6 July 2016, by flashvnn
I am starting to develop a video converter application; it has a feature allowing users to choose a video.
Then they can preview the output file size before they decide to convert the video.
Can ffmpeg calculate the output file size before converting it?
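A rough sketch of one common approach, assuming the target video and audio bitrates of the planned conversion are already known (the file name and the 1500/128 kb/s figures below are only illustrative): probe the source duration with ffprobe, then estimate size as (video bitrate + audio bitrate) * duration / 8. This ignores container overhead, so treat the result as an approximation rather than an exact prediction:
# Read the source duration in seconds from the container.
duration=$(ffprobe -v error -show_entries format=duration -of default=noprint_wrappers=1:nokey=1 input.mp4)
# Estimated output size for a 1500 kb/s video + 128 kb/s audio target, in MiB.
echo "$duration" | awk '{ printf "%.1f MiB\n", (1500 + 128) * 1000 / 8 * $1 / 1048576 }'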
-
NSOperationQueue's threads just don't die
8 November 2013, by u2Fan
Sorry, it's a bit wordy, but I wanted to make sure I was clear! ;-)
I have an iOS app that uses FFMPEG for streaming RTSP. I've multi-threaded FFMPEG using NSOperationQueue such that most of its work, other than painting the image to the screen, of course, happens in background threads.
Works great! ...except for the fact that the threads the NSOperationQueue creates never die!
I init the queue in the class's init method with:
self->opQ = [[NSOperationQueue alloc] init];
[self->opQ setMaxConcurrentOperationCount:1];
I add methods to the queue using blocks:
[self->opQ addOperationWithBlock:^{
    [self haveConnectedSuccessfullyOperation];
}];
Or:
[self->opQ addOperationWithBlock:^{
    if (SOME_CONDITION) {
        [self performSelectorOnMainThread:@selector(DO_SOME_CRAP) withObject:nil waitUntilDone:NO];
    }
}];
Later, when I need to tear down the RTSP stream, in addition to telling FFMPEG to shut down, I call:
[self->opQ cancelAllOperations];
This does indeed stop the threads from doing any work, but it never actually destroys them. Below, you'll see a screenshot of threads that are doing nothing at all. This is what my threads look like after starting/stopping FFMPEG several times.
I seem to remember reading in Apple's documentation that NSOperations and the threads they are run on are destroyed once they are done executing, unless otherwise referenced. This doesn't appear to be the case.
Do I just need to destroy the NSOperationQueue, then re-init it when I need to start up FFMPEG again (I just realized I haven't tried this)? Does anyone know how I can kill these extra threads?
THANKS!
-
Extracting metadata from incomplete video files
17 July 2013, by npgall
Can anyone tell me where metadata is stored in common video file formats? And whether it would be located towards the start of the file, or scattered throughout?
I'm working with a remote object store containing a lot of video files, and I want to extract metadata, in particular the video duration and video dimensions, from those files without streaming the entire file contents to the local machine.
I'm hoping that this metadata will be stored in the first X bytes of files, and so I can just fetch a byte range starting at the beginning instead of the whole file, passing this partial file data to
ffprobe
.For testing purposes I created a 22MB MP4 file, and used the following command to supply only the first 1MB of data to ffprobe :
head -c1024K '2013-07-04 12.20.07.mp4' | ffprobe -
It prints:
avprobe version 0.8.6-4:0.8.6-0ubuntu0.12.04.1, Copyright (c) 2007-2013 the Libav developers
built on Apr 2 2013 17:02:36 with gcc 4.6.3
[mov,mp4,m4a,3gp,3g2,mj2 @ 0x1a6b7a0] stream 0, offset 0x10beab: partial file
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'pipe:':
Metadata:
major_brand : isom
minor_version : 0
compatible_brands: isom3gp4
creation_time : 1947-07-04 11:20:07
Duration: 00:00:09.84, start: 0.000000, bitrate: N/A
Stream #0.0(eng): Video: h264 (High), yuv420p, 1920x1080, 20028 kb/s, PAR 65536:65536 DAR 16:9, 29.99 fps, 30 tbr, 90k tbn, 180k tbc
Metadata:
creation_time : 1947-07-04 11:20:07
Stream #0.1(eng): Audio: aac, 48000 Hz, stereo, s16, 189 kb/s
Metadata:
creation_time : 1947-07-04 11:20:07
So I see the first 1MB was enough to extract the video duration, 9.84 seconds, and the video dimensions, 1920x1080, even though ffprobe printed the warning about detecting a partial file. If I supply less than 1MB, it fails completely.
Would this approach work for other common video file formats to reliably extract metadata, or do any common formats scatter metadata throughout the file?
I'm aware of the concept of container formats and that various codecs may be used to represent the audio/video data inside those containers. I'm not familiar with the details, though. So I guess the question may apply to common combinations of containers + codecs? Thanks in advance.