
Media (3)
-
Elephants Dream - Cover of the soundtrack
17 October 2011
Updated: October 2011
Language: English
Type: Image
-
Valkaama DVD Label
4 October 2011
Updated: February 2013
Language: English
Type: Image
-
Publishing an image simply
13 April 2011
Updated: February 2012
Language: French
Type: Video
Other articles (65)
-
No talk of "market", "cloud", etc.
10 April 2011
The vocabulary used on this site tries to avoid any reference to the fashions that flourish so freely on the web 2.0 and in the companies that make a living from it.
You are therefore invited to banish the use of terms such as "Brand", "Cloud", "Market", etc.
Our motivation is above all to create a simple tool, accessible to everyone, that encourages the sharing of creations on the Internet and lets authors keep as much autonomy as possible.
No "Gold or Premium contract" is therefore planned, no (...)
-
Customising with your own logo, banner or background image
5 September 2013
Some themes support three customisation elements: adding a logo; adding a banner; adding a background image.
-
Publishing on MediaSPIP
13 June 2013
Can I post content from an iPad tablet?
Yes, if your MediaSPIP installation is at version 0.2 or higher. If in doubt, contact your MediaSPIP administrator to find out.
On other sites (10470)
-
Flutter FFMPEG - Getting the same waveform output image for different audio files
28 November 2020, by mtkguy
I'm trying to generate waveform images for different audio files, but no matter what I do I keep getting the same waveform image for every audio file. The first audio file returns the correct waveform image, but subsequent files also return that same image as output. Yes, I already added the overwrite flag to my command, but it doesn't change anything. Here's my code:


class RecordController extends GetxController {
  Rx<File> cachedAudioFile = Rx<File>();
  RxString waveFormImagePath = "".obs;

  Future<int> getWaveForm(String url) async {
    final FlutterFFmpeg _flutterFFmpeg = FlutterFFmpeg();
    final FlutterFFmpegConfig _flutterFFmpegConfig = FlutterFFmpegConfig();
    // clear out path for new wave form
    if (waveFormImagePath.value != "") {
      File(waveFormImagePath.value).delete();
    }
    waveFormImagePath.value = "";
    Directory appDocumentDir = await getApplicationDocumentsDirectory();
    String rawDocumentPath = appDocumentDir.path;
    waveFormImagePath.value = [rawDocumentPath, '/audioWaveForm.png'].join();

    cachedAudioFile.value = await DefaultCacheManager().getSingleFile(url);
    String inputFilePath = cachedAudioFile.value.path;
    String commandToExecute = [
      '-y -i $inputFilePath -filter_complex "compand,aformat=channel_layouts=mono,showwavespic=s=4000x2000:colors=white" -frames:v 1 ',
      waveFormImagePath.value
    ].join();
    _flutterFFmpeg
        .execute(commandToExecute)
        .then((rc) => print("FFmpeg process exited with rc $rc"));
    int returnCode = await _flutterFFmpegConfig.getLastReturnCode();
    var output = await _flutterFFmpegConfig.getLastCommandOutput();
    print(output);
    print(waveFormImagePath.value);

    return returnCode;
  }
}


I'm using the GetX package to manage state, so I can use waveFormImagePath anywhere in my UI, and it's reactive. Somewhere in my UI I have this:

FileImage(File(recordController?.waveFormImagePath?.value))



Any help would be appreciated.
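One often-suggested explanation (an assumption here, not something the question confirms) is that Flutter's image cache keys FileImage entries by file path, so reusing the single audioWaveForm.png path can surface a stale image even after ffmpeg has overwritten the file on disk. A minimal Python sketch of the unique-output-name idea (build_waveform_command is a hypothetical helper; only the ffmpeg arguments come from the question):

```python
import itertools
import os

_counter = itertools.count()

def build_waveform_command(input_path, output_dir):
    # Unique output name per call, so a cache keyed on the file path
    # (as Flutter's FileImage cache is) can never serve a stale image.
    output_path = os.path.join(output_dir, "waveform_%d.png" % next(_counter))
    # Filter graph exactly as in the question's Dart code.
    filter_graph = ("compand,aformat=channel_layouts=mono,"
                    "showwavespic=s=4000x2000:colors=white")
    # -y keeps the overwrite behaviour the question already uses.
    command = '-y -i %s -filter_complex "%s" -frames:v 1 %s' % (
        input_path, filter_graph, output_path)
    return command, output_path

cmd_a, out_a = build_waveform_command("/tmp/a.mp3", "/tmp")
cmd_b, out_b = build_waveform_command("/tmp/b.mp3", "/tmp")
```

With distinct paths, each FileImage is a distinct cache entry, so the second waveform cannot be masked by the first.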


-
combine Scheduled audio ffmpeg
5 September 2018, by Erfan Esmaeilzadeh
I am developing a music-making application for Android. I need to create a system that can export an MP3 from the user's music project.
I can save the name of every file for each second of music in a string, as in the following example:
a.mp3-b.mp3-c.mp3|| d.mp3-e.mp3-f.mp3|| g.mp3-h.mp3-i.mp3 => final.mp3
Each segment between "||" is equivalent to one second, and the audio files within that segment must be combined. After that, the combined seconds are put together.
The most important thing is the timing of each sound. Some sounds must be combined at one time and some sounds at other times.
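The schedule string above can be split into per-second groups before any ffmpeg work is done. A minimal Python sketch (parse_schedule is a hypothetical name; the string format is exactly the one shown in the question):

```python
def parse_schedule(schedule):
    """Split the schedule string into one list of file names per
    second of output audio."""
    seconds = []
    for segment in schedule.split("||"):
        # Files inside one second are separated by '-'.
        files = [name.strip() for name in segment.split("-") if name.strip()]
        seconds.append(files)
    return seconds

groups = parse_schedule(
    "a.mp3-b.mp3-c.mp3|| d.mp3-e.mp3-f.mp3|| g.mp3-h.mp3-i.mp3")
# Each inner list would then be mixed together (e.g. with ffmpeg's
# amix filter), and the per-second results concatenated into final.mp3.
```

The mixing and concatenation steps themselves would still need an ffmpeg filtergraph; this only shows the grouping the question describes.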
-
seeking with av_seek_frame returns invalid data
27 June 2016, by mtadmk
Based on the dranger tutorial I'm trying to implement seeking in a video file. The problem is with seeking to the first milliseconds: it always returns 0 when seeking backward, or some other place in the file (depending on the file format).
E.g. with this video file, seeking forward to 500 pts always returns 6163 pts. Seeking backward to 500 always returns 0. I have no idea why.
The code to do this:
import java.util.Arrays;
import java.util.List;
import org.bytedeco.javacpp.BytePointer;
import org.bytedeco.javacpp.DoublePointer;
import org.bytedeco.javacpp.PointerPointer;
import org.bytedeco.javacpp.avcodec;
import org.bytedeco.javacpp.avformat;
import org.bytedeco.javacpp.avutil;
import org.bytedeco.javacpp.swscale;
import static org.bytedeco.javacpp.avcodec.av_free_packet;
import static org.bytedeco.javacpp.avcodec.avcodec_close;
import static org.bytedeco.javacpp.avcodec.avcodec_decode_video2;
import static org.bytedeco.javacpp.avcodec.avcodec_find_decoder;
import static org.bytedeco.javacpp.avcodec.avcodec_flush_buffers;
import static org.bytedeco.javacpp.avcodec.avcodec_open2;
import static org.bytedeco.javacpp.avcodec.avpicture_fill;
import static org.bytedeco.javacpp.avcodec.avpicture_get_size;
import static org.bytedeco.javacpp.avformat.AVSEEK_FLAG_BACKWARD;
import static org.bytedeco.javacpp.avformat.av_dump_format;
import static org.bytedeco.javacpp.avformat.av_read_frame;
import static org.bytedeco.javacpp.avformat.av_register_all;
import static org.bytedeco.javacpp.avformat.av_seek_frame;
import static org.bytedeco.javacpp.avformat.avformat_close_input;
import static org.bytedeco.javacpp.avformat.avformat_find_stream_info;
import static org.bytedeco.javacpp.avformat.avformat_open_input;
import static org.bytedeco.javacpp.avutil.AVMEDIA_TYPE_VIDEO;
import static org.bytedeco.javacpp.avutil.AV_PIX_FMT_RGB24;
import static org.bytedeco.javacpp.avutil.AV_TIME_BASE;
import static org.bytedeco.javacpp.avutil.av_frame_alloc;
import static org.bytedeco.javacpp.avutil.av_frame_get_best_effort_timestamp;
import static org.bytedeco.javacpp.avutil.av_free;
import static org.bytedeco.javacpp.avutil.av_make_q;
import static org.bytedeco.javacpp.avutil.av_malloc;
import static org.bytedeco.javacpp.avutil.av_q2d;
import static org.bytedeco.javacpp.avutil.av_rescale_q;
import static org.bytedeco.javacpp.swscale.SWS_BILINEAR;
import static org.bytedeco.javacpp.swscale.sws_getContext;
import static org.bytedeco.javacpp.swscale.sws_scale;
public class SeekTest1 {
private avformat.AVFormatContext videoFile;
private final avcodec.AVPacket framePacket = new avcodec.AVPacket();
private avcodec.AVCodecContext videoCodec;
private avutil.AVFrame yuvFrame;
private swscale.SwsContext scalingContext;
private avutil.AVFrame rgbFrame;
private int streamId;
private long lastTime;
public static void main(String[] args) {
if (args.length > 0) {
new SeekTest1().start(args[0]);
} else {
new SeekTest1().start("/path/to/file");
}
}
private void start(String path) {
init(path);
//test for forward
// List<Integer> seekPos = Arrays.asList(0, 100, 0, 200, 0, 300, 0, 400, 0, 500, 0, 600, 0, 700);
//test for backward
List<Integer> seekPos = Arrays.asList(10000, 8000, 10000, 7000, 10000, 6000, 10000, 4000, 10000, 2000, 10000, 1000, 10000, 200);
seekPos.forEach(seekPosition -> {
readFrame();
seek(seekPosition);
});
readFrame();
cleanUp();
}
long seek(long seekPosition) {
final avutil.AVRational timeBase = fetchTimeBase();
long timeInBaseUnit = fetchTimeInBaseUnit(seekPosition, timeBase);
//changing flag to AVSEEK_FLAG_ANY does not change behavior
int flag = 0;
if (timeInBaseUnit < lastTime) {
flag = AVSEEK_FLAG_BACKWARD;
System.out.println("seeking backward to " + timeInBaseUnit);
} else {
System.out.println("seeking forward to " + timeInBaseUnit);
}
avcodec_flush_buffers(videoCodec);
av_seek_frame(videoFile, streamId, timeInBaseUnit, flag);
return timeInBaseUnit;
}
private long fetchTimeInBaseUnit(long seekPositionMillis, avutil.AVRational timeBase) {
return av_rescale_q((long) (seekPositionMillis / 1000.0 * AV_TIME_BASE), av_make_q(1, AV_TIME_BASE), timeBase);
}
private void readFrame() {
while (av_read_frame(videoFile, framePacket) >= 0) {
if (framePacket.stream_index() == streamId) {
// Decode video frame
final int[] frameFinished = new int[1];
avcodec_decode_video2(this.videoCodec, yuvFrame, frameFinished, framePacket);
if (frameFinished[0] != 0) {
av_free_packet(framePacket);
long millis = produceFinishedFrame();
System.out.println("produced time " + millis);
break;
}
}
av_free_packet(framePacket);
}
}
private long produceFinishedFrame() {
sws_scale(scalingContext, yuvFrame.data(), yuvFrame.linesize(), 0,
videoCodec.height(), rgbFrame.data(), rgbFrame.linesize());
final long pts = av_frame_get_best_effort_timestamp(yuvFrame);
final double timeBase = av_q2d(fetchTimeBase());
final long foundMillis = (long) (pts * 1000 * timeBase);
lastTime = foundMillis;
return foundMillis;
}
private avutil.AVRational fetchTimeBase() {
return videoFile.streams(streamId).time_base();
}
private void init(String videoPath) {
final avformat.AVFormatContext videoFile = new avformat.AVFormatContext(null);
av_register_all();
if (avformat_open_input(videoFile, videoPath, null, null) != 0) throw new RuntimeException("unable to open");
if (avformat_find_stream_info(videoFile, (PointerPointer) null) < 0) throw new RuntimeException("Couldn't find stream information");
av_dump_format(videoFile, 0, videoPath, 0);
final int streamId = findFirstVideoStream(videoFile);
final avcodec.AVCodecContext codec = videoFile.streams(streamId).codec();
final avcodec.AVCodec pCodec = avcodec_find_decoder(codec.codec_id());
if (pCodec == null) throw new RuntimeException("Unsupported codec");
if (avcodec_open2(codec, pCodec, (avutil.AVDictionary) null) < 0) throw new RuntimeException("Could not open codec");
final avutil.AVFrame yuvFrame = av_frame_alloc();
final avutil.AVFrame rgbFrame = av_frame_alloc();
if (rgbFrame == null) throw new RuntimeException("Can't allocate avframe");
final int numBytes = avpicture_get_size(AV_PIX_FMT_RGB24, codec.width(), codec.height());
final BytePointer frameBuffer = new BytePointer(av_malloc(numBytes));
final swscale.SwsContext swsContext = sws_getContext(codec.width(), codec.height(), codec.pix_fmt(), codec.width(), codec.height(),
AV_PIX_FMT_RGB24, SWS_BILINEAR, null, null, (DoublePointer) null);
avpicture_fill(new avcodec.AVPicture(rgbFrame), frameBuffer, AV_PIX_FMT_RGB24, codec.width(), codec.height());
this.videoFile = videoFile;
this.videoCodec = codec;
this.yuvFrame = yuvFrame;
this.scalingContext = swsContext;
this.rgbFrame = rgbFrame;
this.streamId = streamId;
}
private static int findFirstVideoStream(avformat.AVFormatContext videoFile) {
int videoStream = -1;
for (int i = 0; i < videoFile.nb_streams(); i++) {
if (videoFile.streams(i).codec().codec_type() == AVMEDIA_TYPE_VIDEO) {
videoStream = i;
break;
}
}
if (videoStream == -1) throw new RuntimeException("Didn't find video stream");
return videoStream;
}
private void cleanUp() {
av_free(this.rgbFrame);
av_free(yuvFrame);
avcodec_close(videoCodec);
avformat_close_input(videoFile);
}
}
The file mentioned above should be provided as the input argument.
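The conversion in fetchTimeInBaseUnit is plain rational arithmetic, and the keyframe-snapping behaviour is easier to see with the numbers written out. A sketch of the same rescaling in Python (rescale_q hand-rewrites av_rescale_q; the 1/90000 stream time base is an illustrative assumption, and Python's round differs from FFmpeg's near-infinity rounding only at exact halves):

```python
from fractions import Fraction

def rescale_q(value, src_base, dst_base):
    # Equivalent of FFmpeg's av_rescale_q: convert `value` ticks of
    # src_base into ticks of dst_base.
    return round(value * Fraction(src_base) / Fraction(dst_base))

AV_TIME_BASE = 1000000

# 500 ms expressed in AV_TIME_BASE units, as the Java code does first:
micros = int(500 / 1000.0 * AV_TIME_BASE)

# ...then rescaled into a hypothetical 1/90000 stream time base:
ticks = rescale_q(micros, Fraction(1, AV_TIME_BASE), Fraction(1, 90000))
# av_seek_frame is asked for this tick value; without
# AVSEEK_FLAG_BACKWARD the demuxer may snap to the NEXT keyframe,
# which is why a request for 500 ms can come back as a much later pts,
# while AVSEEK_FLAG_BACKWARD snaps to the previous keyframe (often 0).
```

The rescaling itself is exact here; the surprising output values in the question come from the keyframe snapping, not from the arithmetic.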