
Media (1)
-
Rennes Emotion Map 2010-11
19 October 2011
Updated: July 2013
Language: French
Type: Text
Other articles (80)
-
Customizing by adding your logo, banner or background image
5 September 2013. Some themes support three customization elements: adding a logo; adding a banner; adding a background image.
-
Writing a news item
21 June 2013. Present changes to your MediaSPIP, or news about your projects, on your MediaSPIP via the news section.
In MediaSPIP's default theme, spipeo, news items are displayed at the bottom of the main page, below the editorials.
You can customize the form used to create a news item.
News creation form: for a document of the news type, the fields offered by default are: publication date (customize the publication date) (...)
-
User profiles
12 April 2011. Each user has a profile page where they can edit their personal information. In the default top-of-page menu, a menu item is created automatically when MediaSPIP is initialized, visible only when the visitor is logged in to the site.
Users can edit their profile from their author page; a "Modifier votre profil" link in the navigation is (...)
On other sites (9578)
-
SDL2 won't play with more than 6 audio channels
13 June 2020, by Hiko Haieto
I am trying to stream (raw) video and audio from a capture device as part of my home media setup (with my PC acting much like the receiver in a typical home-theatre setup). The biggest problem I haven't been able to get past is that ffplay (which uses SDL2 as its audio backend) won't play all 8 channels of a 7.1 stream: two simply get dropped, despite ffplay recognising the 8-channel input and despite my specifying a 7.1 layout.

I have been able to confirm that all 8 channels are present in the source by first using ffmpeg to save the output of a speaker test to a file and playing that back with both mplayer (which works) and ffplay (which doesn't). I also wrote some minimal code to play the audio directly through SDL's API with the same result, so it's not the fault of ffplay. I might simply use mplayer if it weren't for the fact that piping output from ffmpeg adds too much latency for real-time use. I am using libSDL version 2.0.12 and ffplay 4.2.3, both of which are the latest at the time of writing and are ostensibly supposed to support 7.1 audio.

Using output recorded from speaker-test -c 8, I use the following to play it back in mplayer:


mplayer -channels 8 -rawaudio channels=8 -format s16le -demuxer rawaudio speaker-test.pcm

and the following to play it back in ffplay:

ffplay -f s16le -ac 8 -af 'channelmap=channel_layout=7.1' speaker-test.pcm

No matter what I try, the two side channels get dropped. I couldn't figure out how to play raw pcm in SDL, so I repeated the same tests with wav output and used the following code to play it back:

#include <SDL2/SDL.h>

int main(int argc, char **argv) {
 SDL_Init(SDL_INIT_AUDIO);

 // Load the 8-channel WAV; wavSpec is filled in from the file header.
 SDL_AudioSpec wavSpec;
 Uint32 wavLength;
 Uint8 *wavBuffer;
 SDL_LoadWAV("speaker-test.wav", &wavSpec, &wavBuffer, &wavLength);

 // Open the default device with the file's own spec; no obtained spec
 // is requested, so SDL converts silently if the device differs.
 SDL_AudioDeviceID deviceID = SDL_OpenAudioDevice(NULL, 0, &wavSpec, NULL, 0);

 // Queue the whole buffer and let it play for 30 seconds.
 SDL_QueueAudio(deviceID, wavBuffer, wavLength);
 SDL_PauseAudioDevice(deviceID, 0);
 SDL_Delay(30000);

 SDL_CloseAudioDevice(deviceID);
 SDL_FreeWAV(wavBuffer);
 SDL_Quit();
 return 0;
}

The above code exhibits the same behaviour of dropping the two additional side channels, despite using the latest version of SDL, which should have supported 7.1 for many releases now. Why might this be happening, and how might I fix it?
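
A minimal diagnostic sketch (not from the original post; it assumes the same speaker-test.wav file): opening the device with an obtained spec and SDL_AUDIO_ALLOW_ANY_CHANGE makes SDL report what the device actually accepted instead of converting silently, which shows whether the channels are lost at device-open time:

#include <SDL2/SDL.h>
#include <stdio.h>

int main(void) {
 SDL_Init(SDL_INIT_AUDIO);

 SDL_AudioSpec want, have;
 Uint32 wavLength;
 Uint8 *wavBuffer;
 SDL_LoadWAV("speaker-test.wav", &want, &wavBuffer, &wavLength);

 // Allow any change, so 'have' reflects the device's real configuration
 // rather than SDL quietly converting to it behind the scenes.
 SDL_AudioDeviceID dev = SDL_OpenAudioDevice(NULL, 0, &want, &have,
                                             SDL_AUDIO_ALLOW_ANY_CHANGE);
 printf("requested %d channels, device opened with %d\n",
        (int)want.channels, (int)have.channels);

 SDL_CloseAudioDevice(dev);
 SDL_FreeWAV(wavBuffer);
 SDL_Quit();
 return 0;
}

If have.channels comes back as 6, the drop happens when the device is opened (a driver or device default) rather than inside SDL's converter.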


-
Inconsistent behaviour of ffmpeg -i flag
19 May 2021, by zeebrah
I'm trying to write a Bash script that downloads an audio track from YouTube and converts it to Apple's ringtone format, .m4r. The tools I use are youtube-dl and ffmpeg. The former seems to work fine, but I have a strange issue with the latter: when I try to pass the file name parametrically, the shell prints that the file or directory can't be found.

I use the same method of expanding user input (or my defaults) for both commands, but it only seems to work with youtube-dl.

An example of what doesn't work and prints the error mentioned above. Obviously, all the files are in fact there and accessible:


youtube-dl -i --extract-audio --audio-format m4a -o $filepath_interim --audio-quality 0 $video_link
ffmpeg -i $filepath_interim -acodec copy -f ipod -ss $offset -t $length $filepath_out

So I tried doing it with '' strings and with $(command) syntax; it all failed. The only way I found to make it work is to hard-code the path values, but this defeats the whole purpose of my script.

youtube-dl -i --extract-audio --audio-format m4a -o $filepath_interim --audio-quality 0 $video_link
ffmpeg -i ~/Downloads/ringtone.m4a -acodec copy -f ipod -ss $offset -t $length ~/Downloads/ringtone.m4r

I want to figure out why this keeps happening: is it in any way specific to ffmpeg, or am I just missing some piece of knowledge about $name expansion?

Minimal working example. Run it with bash, or pack it into a script and run that. If your path to bash is different, change the shebang (a sketch of a working variant follows the example):

#!/usr/local/bin/bash
# initialise
filepath_interim="~/Downloads/ringtone.m4a"
filepath_out="~/Downloads/ringtone.m4r"
video_link="https://www.youtube.com/watch?v=dQw4w9WgXcQ"
offset=0
length=30

# main part
youtube-dl -i --extract-audio --audio-format m4a -o $filepath_interim --audio-quality 0 $video_link
ffmpeg -i $filepath_interim -acodec copy -f ipod -ss $offset -t $length $filepath_out
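
For reference, a minimal sketch of the same script with the tilde replaced by $HOME (an assumption about the cause, not from the original question: the shell never tilde-expands inside quotes, so ffmpeg receives the literal path ~/Downloads/ringtone.m4a, while youtube-dl likely works because it expands ~ in its output template itself):

#!/usr/local/bin/bash
# initialise
# $HOME expands inside double quotes; a quoted ~ is passed through literally.
filepath_interim="$HOME/Downloads/ringtone.m4a"
filepath_out="$HOME/Downloads/ringtone.m4r"
video_link="https://www.youtube.com/watch?v=dQw4w9WgXcQ"
offset=0
length=30

# main part: quote the expansions so paths with spaces survive word splitting
youtube-dl -i --extract-audio --audio-format m4a -o "$filepath_interim" --audio-quality 0 "$video_link"
ffmpeg -i "$filepath_interim" -acodec copy -f ipod -ss "$offset" -t "$length" "$filepath_out"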

-
How to minimize latency in an ffmpeg stream in Java?
13 July 2022, by Taavi Sõerd
I need to stream an ffmpeg video feed in Android Studio with minimal latency. The code below achieves that when playing on a Galaxy S21 Ultra, but on a Galaxy Tab playback runs as if in slow motion. When I set the buffer size to 0 I get minimal latency, but then I can't actually see the video at all, as it's completely corrupted (all gray and colored noise).
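
As a quick sanity check outside the app, the same stream can be played with ffplay's low-latency options (a sketch; the multicast address is a placeholder, not from the original post, and the probe values are illustrative):

ffplay -fflags nobuffer -probesize 32768 -analyzeduration 0 udp://@239.0.0.1:1234

If ffplay plays smoothly on the tablet while the app does not, the latency is being added inside the decode loop rather than on the network.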


public class Decode implements Runnable {
public Activity activity;
AVFrame pFrameRGB;
SwsContext sws_ctx;
ByteBuffer bitmapBuffer;
Bitmap bmp;
byte[] array;
int imageViewWidth = 0;
int imageViewHeight = 0;
boolean imageChanged = true;
int v_stream_idx = -1;
int klv_stream_idx = -1;

boolean imageDrawMutex = false;

boolean imageIsSet = false;
ImageView imageView = MainActivity.getmInstanceActivity().findViewById(R.id.imageView);

 String mFilename = "udp://@" + MainActivity.connectionIP;
UasDatalinkLocalSet mLatestDls;

public Decode(Activity _activity) {
 this.activity = _activity;
}

public void create_decoder(AVCodecContext codec_ctx) {
 imageChanged = true;

 // Determine required buffer size and allocate buffer
 int numBytes = av_image_get_buffer_size(AV_PIX_FMT_RGBA, codec_ctx.width(),
 codec_ctx.height(), 1);
 BytePointer buffer = new BytePointer(av_malloc(numBytes));

 bmp = Bitmap.createBitmap(codec_ctx.width(), codec_ctx.height(), Bitmap.Config.ARGB_8888);

 array = new byte[codec_ctx.width() * codec_ctx.height() * 4];
 bitmapBuffer = ByteBuffer.wrap(array);

 sws_ctx = sws_getContext(
 codec_ctx.width(),
 codec_ctx.height(),
 codec_ctx.pix_fmt(),
 codec_ctx.width(),
 codec_ctx.height(),
 AV_PIX_FMT_RGBA,
 SWS_POINT,
 null,
 null,
 (DoublePointer) null
 );

 if (sws_ctx == null) {
 Log.d("app", "Can not use sws");
 throw new IllegalStateException();
 }

 av_image_fill_arrays(pFrameRGB.data(), pFrameRGB.linesize(),
 buffer, AV_PIX_FMT_RGBA, codec_ctx.width(), codec_ctx.height(), 1);
}

@Override
public void run() {
 Log.d("app", "Start decoder");

 int ret = -1, i = 0;
 String vf_path = mFilename;

 AVFormatContext fmt_ctx = new AVFormatContext(null);
 AVPacket pkt = new AVPacket();


 AVDictionary multicastDict = new AVDictionary();

 av_dict_set(multicastDict, "rtsp_transport", "udp_multicast", 0);

 av_dict_set(multicastDict, "localaddr", getIPAddress(true), 0);
 av_dict_set(multicastDict, "reuse", "1", 0);

 av_dict_set(multicastDict, "buffer_size", "0.115M", 0);

 ret = avformat_open_input(fmt_ctx, vf_path, null, multicastDict);
 if (ret < 0) {
 Log.d("app", String.format("Open video file %s failed \n", vf_path));
 byte[] error_message = new byte[1024];
 int elen = av_strerror(ret, error_message, 1024);
 String s = new String(error_message, 0, 20);
 Log.d("app", String.format("Return: %d", ret));
 Log.d("app", String.format("Message: %s", s));
 throw new IllegalStateException();
 }
 
 if (avformat_find_stream_info(fmt_ctx, (PointerPointer) null) < 0) {
 //System.exit(-1);
 Log.d("app", "Stream info not found");
 }


 avformat.av_dump_format(fmt_ctx, 0, mFilename, 0);

 int nstreams = fmt_ctx.nb_streams();

 for (i = 0; i < fmt_ctx.nb_streams(); i++) {
 if (fmt_ctx.streams(i).codecpar().codec_type() == AVMEDIA_TYPE_VIDEO) {
 v_stream_idx = i;
 }
 if (fmt_ctx.streams(i).codecpar().codec_type() == AVMEDIA_TYPE_DATA) {
 klv_stream_idx = i;
 }
 }
 if (v_stream_idx == -1) {
 Log.d("app", "Cannot find video stream");
 throw new IllegalStateException();
 } else {
 Log.d("app", String.format("Video stream %d with resolution %dx%d\n", v_stream_idx,
 fmt_ctx.streams(v_stream_idx).codecpar().width(),
 fmt_ctx.streams(v_stream_idx).codecpar().height()));
 }

 AVCodecContext codec_ctx = avcodec_alloc_context3(null);
 avcodec_parameters_to_context(codec_ctx, fmt_ctx.streams(v_stream_idx).codecpar());


 AVCodec codec = avcodec_find_decoder(codec_ctx.codec_id());


 AVDictionary avDictionary = new AVDictionary();

 av_dict_set(avDictionary, "fflags", "nobuffer", 0);


 if (codec == null) {
 Log.d("app", "Unsupported codec for video file");
 throw new IllegalStateException();
 }
 ret = avcodec_open2(codec_ctx, codec, avDictionary);
 if (ret < 0) {
 Log.d("app", "Can not open codec");
 throw new IllegalStateException();
 }

 AVFrame frm = av_frame_alloc();

 // Allocate an AVFrame structure
 pFrameRGB = av_frame_alloc();
 if (pFrameRGB == null) {
 //System.exit(-1);
 Log.d("app", "unable to init pframergb");
 }

 create_decoder(codec_ctx);

 int width = codec_ctx.width();
 int height = codec_ctx.height();

 double fps = 15;
 

 while (true) {
 try {
 Thread.sleep(1);
 } catch (Exception e) {

 }

 try {
 if (av_read_frame(fmt_ctx, pkt) >= 0) {
 if (pkt.stream_index() == v_stream_idx) {
 avcodec_send_packet(codec_ctx, pkt);

 if (codec_ctx.width() != width || codec_ctx.height() != height) {
 create_decoder(codec_ctx);
 width = codec_ctx.width();
 height = codec_ctx.height();
 }
 }

 if (pkt.stream_index() == klv_stream_idx) {

 byte[] klvDataBuffer = new byte[pkt.size()];

 for (int j = 0; j < pkt.size(); j++) {
 klvDataBuffer[j] = pkt.data().get(j);
 }

 try {
 KLV k = new KLV(klvDataBuffer, KLV.KeyLength.SixteenBytes, KLV.LengthEncoding.BER);
 byte[] main_payload = k.getValue();

 // decode the Uas Datalink Local Set from main_payload binary blob.
 mLatestDls = new UasDatalinkLocalSet(main_payload);

 if (mLatestDls != null) {

 MainActivity.getmInstanceActivity().runOnUiThread(new Runnable() {
 @RequiresApi(api = Build.VERSION_CODES.Q)
 @Override
 public void run() {
 MainActivity.getmInstanceActivity().updateKlv(mLatestDls);
 }
 });
 }
 } catch (Exception e) {
 e.printStackTrace();
 }
 
 }

 int wasFrameDecoded = 0;
 while (wasFrameDecoded >= 0) {
 wasFrameDecoded = avcodec_receive_frame(codec_ctx, frm);

 if (wasFrameDecoded >= 0) {
 // get clip fps
 fps = 15; //av_q2d(fmt_ctx.streams(v_stream_idx).r_frame_rate());

 sws_scale(
 sws_ctx,
 frm.data(),
 frm.linesize(),
 0,
 codec_ctx.height(),
 pFrameRGB.data(),
 pFrameRGB.linesize()
 );

 if(!imageDrawMutex) {
 MainActivity.getmInstanceActivity().runOnUiThread(new Runnable() {
 @Override
 public void run() {
 if (imageIsSet) {
 imageDrawMutex = true;
 pFrameRGB.data(0).position(0).get(array);
 bitmapBuffer.rewind();
 bmp.copyPixelsFromBuffer(bitmapBuffer);

 if (imageChanged) {
 (imageView).setImageBitmap(bmp);
 imageChanged = false;
 }

 (imageView).invalidate();
 imageDrawMutex = false;
 } else {
 (imageView).setImageBitmap(bmp);
 imageIsSet = true;
 }
 }
 });
 }
 }
 }
 av_packet_unref(pkt);

 }
 } catch (Exception e) {
 e.printStackTrace();
 }

 if (false) {
 Log.d("threads", "false");

 av_frame_free(frm);

 avcodec_close(codec_ctx);
 avcodec_free_context(codec_ctx);

 avformat_close_input(fmt_ctx);
 }
 }
}

This code runs in Android Studio with Java. I'm quite new to this topic, so I'm not really sure where to start.
What could be the cause of this?
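
For what it's worth, fflags is a format-level option, so the dictionary passed to avcodec_open2 above is unlikely to affect demuxer buffering. A sketch of handing the low-latency options to avformat_open_input instead, assuming the same bytedeco/JavaCPP FFmpeg bindings as the code above (the import paths and option values are illustrative assumptions):

import org.bytedeco.ffmpeg.avformat.AVFormatContext;
import org.bytedeco.ffmpeg.avutil.AVDictionary;

import static org.bytedeco.ffmpeg.global.avformat.avformat_open_input;
import static org.bytedeco.ffmpeg.global.avutil.av_dict_set;

public class LowLatencyOpen {

 // Sketch: open a UDP source with demuxer-side buffering kept to a minimum.
 static AVFormatContext openLowLatency(String url) {
  AVDictionary opts = new AVDictionary();

  av_dict_set(opts, "fflags", "nobuffer", 0);   // don't buffer packets in the demuxer
  av_dict_set(opts, "probesize", "32768", 0);   // probe less input before starting
  av_dict_set(opts, "analyzeduration", "0", 0); // skip the long stream analysis pass
  av_dict_set(opts, "max_delay", "0", 0);       // cap demuxer reordering delay (microseconds)

  AVFormatContext fmtCtx = new AVFormatContext(null);
  if (avformat_open_input(fmtCtx, url, null, opts) < 0) {
   return null; // caller logs av_strerror, as in the code above
  }
  return fmtCtx;
 }
}

If the tablet still lags after that, the Thread.sleep(1) per loop iteration and the drawing path through runOnUiThread in the code above are the next places to look, since both can throttle a decoder that is already behind.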