
Media (91)
-
999,999
26 September 2011
Updated: September 2011
Language: English
Type: Audio
-
The Slip - Artworks
26 September 2011
Updated: September 2011
Language: English
Type: Text
-
Demon seed (wav version)
26 September 2011
Updated: April 2013
Language: English
Type: Audio
-
The four of us are dying (wav version)
26 September 2011
Updated: April 2013
Language: English
Type: Audio
-
Corona radiata (wav version)
26 September 2011
Updated: April 2013
Language: English
Type: Audio
-
Lights in the sky (wav version)
26 September 2011
Updated: April 2013
Language: English
Type: Audio
Other articles (100)
-
Multilang: improving the interface for multilingual blocks
18 February 2011
Multilang is an additional plugin that is not enabled by default when MediaSPIP is initialised. After it is activated, a preconfiguration is automatically put in place by MediaSPIP init so that the new feature is immediately operational. It is therefore not necessary to go through a configuration step for this.
-
Enhancing it visually
10 April 2011
MediaSPIP is based on a system of themes and templates ("squelettes"). Templates define where information is placed on the page, defining a specific use of the platform, while themes define the overall graphic design. Anyone can propose a new graphic theme or template and make it available to the community.
-
Possibility of farm deployment
12 April 2011
MediaSPIP can be installed as a farm, with a single "core" hosted on a dedicated server and used by a multitude of different sites. This makes it possible, for example: to share the set-up costs between several projects or individuals; to deploy a multitude of unique sites quickly; and to avoid having to put every creation into a digital catch-all, as is the case with the big general-public platforms scattered across the (...)
On other sites (6078)
-
How do I get an InputStream out of a Mono?
30 October 2022, by Einfari
Bear with my noobness: I am learning WebFlux. I had a simple application that takes a video and extracts the audio using FFprobe and FFmpeg, so I thought of redoing it reactively, but I am failing miserably...


Controller:


@PostMapping("/upload")
public String upload(@RequestPart("file") Mono<FilePart> filePartMono, final Model model) {
    Flux<String> filenameList = mediaComponent.extractAudio(filePartMono);
    model.addAttribute("filenameList", new ReactiveDataDriverContextVariable(filenameList));
    return "download";
}


Function to get audio streams out of the video:


public Mono<FFprobeResult> getAudioStreams(InputStream inputStream) {
    try {
        return Mono.just(FFprobe.atPath(FFprobePath)
                .setShowStreams(true)
                .setSelectStreams(StreamType.AUDIO)
                .setLogLevel(LogLevel.INFO)
                .setInput(inputStream)
                .execute());
    } catch (JaffreeException e) {
        log.error(e.getMessage(), e);
        throw new MediaException("Audio formats could not be identified.");
    }
}
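
A side note, not part of the original post: Mono.just(...) wraps a value that has already been computed, so the blocking FFprobe.execute() call runs eagerly on whatever thread invokes getAudioStreams. A common Reactor pattern, assuming the same Jaffree call chain as above, is to defer the work with Mono.fromCallable and push it onto a blocking-friendly scheduler, for example:

// Minimal sketch (same Jaffree imports as above, plus reactor.core.scheduler.Schedulers).
// Defers the blocking probe and moves it off the event-loop threads.
public Mono<FFprobeResult> getAudioStreams(InputStream inputStream) {
    return Mono.fromCallable(() -> FFprobe.atPath(FFprobePath)
                    .setShowStreams(true)
                    .setSelectStreams(StreamType.AUDIO)
                    .setLogLevel(LogLevel.INFO)
                    .setInput(inputStream)
                    .execute())
            .subscribeOn(Schedulers.boundedElastic())
            .onErrorMap(JaffreeException.class,
                    e -> new MediaException("Audio formats could not be identified."));
}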


Attempt 1:


public Flux<String> extractAudio(Mono<FilePart> filePartMono) {
    filePartMono.flatMapMany(Part::content)
            .map(dataBuffer -> dataBuffer.asInputStream(true))
            .flatMap(this::getAudioStreams)
            .subscribe(System.out::println);
    ...
}


Attempt 2:


public Flux<String> extractAudio(Mono<FilePart> filePartMono) {
    filePartMono.flatMapMany(Part::content)
            .reduce(InputStream.nullInputStream(), (inputStream, dataBuffer) -> new SequenceInputStream(
                    inputStream, dataBuffer.asInputStream()
            ))
            .flatMap(this::getAudioStreams)
            .subscribe(System.out::println);
    ...
}


Attempt 3:


public Flux<String> extractAudio(Mono<FilePart> filePartMono) {
    DataBufferUtils.write(filePartMono.flatMapMany(Part::content), OutputStream.nullOutputStream())
            .map(dataBuffer -> dataBuffer.asInputStream(true))
            .flatMap(this::getAudioStreams)
            .subscribe(System.out::println);
    ...
}


Attempts 1 and 3 seem to behave the same in the end; FFprobe complains as follows:


2022-10-30 11:24:30.292 WARN 79049 --- [ StdErr] c.g.k.jaffree.process.BaseStdReader : [mov,mp4,m4a,3gp,3g2,mj2 @ 0x7f9162702340] [warning] STSZ atom truncated
2022-10-30 11:24:30.292 ERROR 79049 --- [ StdErr] c.g.k.jaffree.process.BaseStdReader : [mov,mp4,m4a,3gp,3g2,mj2 @ 0x7f9162702340] [error] stream 0, contradictionary STSC and STCO
2022-10-30 11:24:30.292 ERROR 79049 --- [ StdErr] c.g.k.jaffree.process.BaseStdReader : [mov,mp4,m4a,3gp,3g2,mj2 @ 0x7f9162702340] [error] error reading header
2022-10-30 11:24:30.294 ERROR 79049 --- [ StdErr] c.g.k.jaffree.process.BaseStdReader : [error] tcp://127.0.0.1:51532: Invalid data found when processing input
2022-10-30 11:24:30.295 INFO 79049 --- [oundedElastic-3] c.g.k.jaffree.process.ProcessHandler : Process has finished with status: 1
2022-10-30 11:24:30.409 ERROR 79049 --- [oundedElastic-3] c.e.s.application.MediaComponent : Process execution has ended with non-zero status: 1. Check logs for detailed error message.



Attempt 2 produces many of these:


Exception in thread "Runnable-0" java.lang.StackOverflowError
 at java.base/java.io.SequenceInputStream.read(SequenceInputStream.java:198)



Could anybody point me in the right direction? What am I doing wrong? By the way, I am printing to the console just to see a result, but in the end I need to take all of the returned streams and pass them as arguments to another function that will finally extract the audio, so I need to figure that out as well.


Thank you in advance.
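
One way to sidestep the chunk-by-chunk problem, sketched below rather than taken from the post, is to buffer the whole upload into a temporary file and hand FFprobe a complete, seekable input. Files.createTempFile and FilePart.transferTo are standard JDK/Spring calls; the assumption that Jaffree's FFprobe accepts a Path via setInput, and the use of getStreams()/getCodecName() on the result, should be checked against the Jaffree version in use.

// Minimal sketch, assuming FFprobe.setInput(Path) and FFprobeResult.getStreams() exist
// in the Jaffree version in use; extra imports: java.nio.file.Files, java.nio.file.Path,
// reactor.core.scheduler.Schedulers.
public Flux<String> extractAudio(Mono<FilePart> filePartMono) {
    return filePartMono
            // buffer the multipart body into a temp file so FFprobe gets a seekable input
            .flatMap(filePart -> Mono.fromCallable(() -> Files.createTempFile("upload-", ".video"))
                    .flatMap(tempFile -> filePart.transferTo(tempFile).thenReturn(tempFile)))
            // run the blocking probe off the event loop
            .flatMap(path -> Mono.fromCallable(() -> FFprobe.atPath(FFprobePath)
                            .setShowStreams(true)
                            .setSelectStreams(StreamType.AUDIO)
                            .setInput(path)
                            .execute())
                    .subscribeOn(Schedulers.boundedElastic()))
            // one element per detected audio stream; the codec name is just a placeholder value
            .flatMapMany(result -> Flux.fromIterable(result.getStreams()))
            .map(stream -> String.valueOf(stream.getCodecName()));
}

From there, the temporary file path (or the probed stream list) can be passed on to the FFmpeg step that actually extracts the audio, and the file deleted once that pipeline completes.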


-
Trying to merge two videos from my photo roll with Flutter ffmpeg without success
4 April 2023, by Stéphane de Luca
My goal is to merge two videos I pick from my photo roll.
My code starts as follows:


// videos[0] contains: "content://media/external/video/media/2779"
final v1 = await videos[0].getMediaUrl();
if (v1 == null) return;
final v1Path = await LecleFlutterAbsolutePath.getAbsolutePath(uri: v1);



But printing v1Path gives a path with a jpeg extension: /data/user/0/com.example.shokaze/cache/OutputFile_1669939088711.jpeg, whereas I thought it would have been mp4, as it is a video.

Why is it so?


Another question I have is: how can I make a relevant path so that the ffmpeg output video appears in my photo roll after its creation? Should I do the following and provide outputPath to the code?

The command it executes is:

-i /data/user/0/com.example.shokaze/cache/OutputFile_1669940421875.jpeg -i /data/user/0/com.example.shokaze/cache/OutputFile_1669940428723.jpeg -filter_complex '[0:0][1:0]concat=n=2:v=1:a=0[out]' -map '[out]' /data/user/0/com.example.shokaze/app_flutter/output.mp4


And I get an error:

I/flutter (30190): error 1


My code is as follows:


String output = "content://media/external/video/media/output";
final outputPath = await LecleFlutterAbsolutePath.getAbsolutePath(uri: output);
if (outputPath == null) return;



The full code is as follows:


// Makes the final video by merging all videos from the mixing table
void makeFinalVideo() async {
  if (videos.length < 2) return;

  final v1 = await videos[0].getMediaUrl();
  if (v1 == null) return;
  final v1Path = await LecleFlutterAbsolutePath.getAbsolutePath(uri: v1);
  if (v1Path == null) return;
  //String v1 = "";
  final v2 = await videos[1].getMediaUrl();
  if (v2 == null) return;
  final v2Path = await LecleFlutterAbsolutePath.getAbsolutePath(uri: v2);
  if (v2Path == null) return;
  String output = "content://media/external/video/media/output";
  final outputPath = "";
  // await LecleFlutterAbsolutePath.getAbsolutePath(uri: output);
  // if (outputPath == null) return;

  Video.merge(v1Path, v2Path, outputPath);
}


class Video {
  /// Merges the video [v1] with [v2] as [output] video located in app doc path
  static void merge(String v1, String v2, String output) async {
    final appDocDir = await getApplicationDocumentsDirectory();

    //final appDir = await syspaths.getApplicationDocumentsDirectory();
    String rawDocumentPath = appDocDir.path;
    final outputPath = '$rawDocumentPath/output.mp4';

    final command =
        '-i $v1 -i $v2 -filter_complex \'[0:0][1:0]concat=n=2:v=1:a=0[out]\' -map \'[out]\' $outputPath';
    //await execute(command);
    try {
      final r = await FFmpegKit.execute(command);

      //.then((rc) => print("FFmpeg process exited with rc $rc"));
      print("Result: $r");
    } catch (e) {
      print("Exception: $e");
    }
  }
}
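
A small diagnostic sketch, not taken from the post: FFmpegKit reports the outcome through the returned session, so reading its return code and accumulated logs usually surfaces FFmpeg's real error message instead of the bare "error 1" shown above. The import paths assume the ffmpeg_kit_flutter package already used in merge().

// Hedged sketch: run a command and report the session's outcome.
import 'package:ffmpeg_kit_flutter/ffmpeg_kit.dart';
import 'package:ffmpeg_kit_flutter/return_code.dart';

Future<void> runAndReport(String command) async {
  final session = await FFmpegKit.execute(command);
  final returnCode = await session.getReturnCode();

  if (ReturnCode.isSuccess(returnCode)) {
    print('Merge succeeded');
  } else {
    // The session logs contain FFmpeg's own error output,
    // which is far more informative than the exit status alone.
    final logs = await session.getAllLogsAsString();
    print('Merge failed (code $returnCode): $logs');
  }
}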



-
Trying to get pixels from a transparent AVI file to use them in a Raylib image but AVFrame comes out empty
29 November 2024, by Kolt Penny
I was trying to open a video file with FFmpeg's API using C++. So far, all the steps up until calling av_read_frame produce no errors:

- avformat_open_input, avformat_find_stream_info, avcodec_find_decoder, avcodec_alloc_context3, avcodec_parameters_to_context and avcodec_open2 pass the error checking.
- AVFrame* av_frame and AVPacket* av_packet are correctly allocated.






The following code is where I do the rest of the checks to get the pixel data:


int bufferSize = av_codec_ctx->width * av_codec_ctx->height * 4;
unsigned char* data = new unsigned char[bufferSize];

while (av_read_frame(av_format_ctx, av_packet) >= 0) {
    if (av_packet->stream_index == video_stream_idx) {
        if (avcodec_send_packet(av_codec_ctx, av_packet) == 0) {
            if (avcodec_receive_frame(av_codec_ctx, av_frame) == 0) {

                memcpy(data, av_frame->data[0], bufferSize);

                break; // We only need the first frame
            }
        }
    }
}



The issue is that up until ... if (avcodec_receive_frame(av_codec_ctx, av_frame) == 0) { ..., inspecting the data shows the packet does carry information in its buffer, but once it is received into the frame, everything turns into zeroes.

None of the function calls return any error, so it should be working, yet the frame ends up with no data and consequently comes out empty.


Why could this be? Should I keep looping on avcodec_receive_frame(av_codec_ctx, av_frame) until it comes out not empty?
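
A possible explanation, with a sketch that is not from the original post: decoded AVFrames are usually in a planar YUV (or YUVA, for sources with alpha) pixel format, so av_frame->data[0] holds only the luma plane, padded per linesize, rather than width*height*4 packed RGBA bytes; in addition, avcodec_receive_frame can legitimately return AVERROR(EAGAIN) until the decoder has been fed enough packets. The sketch below keeps feeding packets until a frame arrives and then converts it to packed RGBA with libswscale; the variable names mirror the question, and the actual pixel format of the poster's AVI is an assumption.

// Hedged sketch, not the original poster's code: keep feeding packets until the
// decoder produces a frame, then convert it to tightly packed RGBA with libswscale.
// av_format_ctx, av_codec_ctx, av_packet, av_frame and video_stream_idx are assumed
// to be set up exactly as in the question.
extern "C" {
#include <libavcodec/avcodec.h>
#include <libavformat/avformat.h>
#include <libswscale/swscale.h>
}
#include <vector>

std::vector<unsigned char> decodeFirstFrameRgba(AVFormatContext* av_format_ctx,
                                                AVCodecContext* av_codec_ctx,
                                                AVPacket* av_packet,
                                                AVFrame* av_frame,
                                                int video_stream_idx) {
    std::vector<unsigned char> rgba;

    while (av_read_frame(av_format_ctx, av_packet) >= 0) {
        // A failed receive often just means EAGAIN ("send more packets"),
        // so a frame is frequently not available on the very first iteration.
        bool got_frame = av_packet->stream_index == video_stream_idx &&
                         avcodec_send_packet(av_codec_ctx, av_packet) == 0 &&
                         avcodec_receive_frame(av_codec_ctx, av_frame) == 0;
        av_packet_unref(av_packet); // the decoder keeps its own reference

        if (!got_frame) {
            continue;
        }

        const int w = av_frame->width;
        const int h = av_frame->height;
        rgba.resize(static_cast<size_t>(w) * h * 4);

        // Decoded frames are normally planar YUV/YUVA, not packed RGBA, so
        // data[0] alone is only the luma plane; convert the whole frame instead.
        SwsContext* sws = sws_getContext(
            w, h, static_cast<AVPixelFormat>(av_frame->format),
            w, h, AV_PIX_FMT_RGBA, SWS_BILINEAR, nullptr, nullptr, nullptr);
        if (sws == nullptr) {
            break; // unsupported pixel format
        }

        uint8_t* dst_data[4] = { rgba.data(), nullptr, nullptr, nullptr };
        int dst_linesize[4] = { w * 4, 0, 0, 0 };
        sws_scale(sws, av_frame->data, av_frame->linesize, 0, h, dst_data, dst_linesize);
        sws_freeContext(sws);
        break; // we only need the first frame
    }
    return rgba;
}

The resulting buffer is laid out the way Raylib expects for PIXELFORMAT_UNCOMPRESSED_R8G8B8A8, so it can back an Image with width = av_frame->width, height = av_frame->height and mipmaps = 1.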