
Media (1)
-
MediaSPIP Simple: future default graphic theme?
26 September 2013
Updated: October 2013
Language: French
Type: video
Other articles (75)
-
Improving the base version
13 September 2013
Nicer multiple selection
The Chosen plugin improves the ergonomics of multiple-selection fields; compare the two images that follow.
To enable it, activate the Chosen plugin (Site general configuration > Plugin management), then configure it (Templates > Chosen) by turning Chosen on for the public site and listing the form elements to enhance, for example select[multiple] for multiple-selection lists (...)
-
Emballe médias: what is it for?
4 February 2011
This plugin manages sites that publish documents of all types.
It creates "media" items: a "media" is an article in the SPIP sense, created automatically when a document is uploaded, be it audio, video, image, or text; only one document can be linked to a given "media" article;
-
The plugin: mutualisation management
2 March 2010
The mutualisation management plugin manages the various MediaSPIP channels from a master site. Its purpose is to provide a pure-SPIP solution to replace the previous one.
Basic installation
Install the SPIP files on the server.
Then add the "mutualisation" plugin at the root of the site, as described here.
Customize the central mes_options.php file as you wish. As an example, here is the one used by the mediaspip.net platform:
<?php (...)
On other sites (3235)
-
JavaCV/FFmpeg causes crash only on Lollipop
13 November 2015, by WeirdHat
I get reports of this crash from users of my app on Galaxy Note devices running Android Lollipop (it might happen on other devices too, but most of my users have Galaxy Notes because it's a drawing app). Below is the relevant code from my AsyncTask, which exports a series of images (plus audio) to a video file using JavaCV/FFmpeg, followed by the stack trace from the reported crash. It works fine on my Note 8.0 running KitKat (no Lollipop update is available for that device), and it works fine in Lollipop on the emulator, so I'm having a hard time tracking down the cause and don't know what to do.
@Override
protected String doInBackground(final File... params) {
    FFmpegFrameRecorder recorder = new FFmpegFrameRecorder(params[0], canvas_width, canvas_height, 0);
    FrameGrabber audiograbber = null;
    if (audio != null) audiograbber = new FFmpegFrameGrabber(audio);
    Frame audioframe;
    Frame image;
    AndroidFrameConverter converter = new AndroidFrameConverter();
    try {
        recorder.setVideoCodec(avcodec.AV_CODEC_ID_H264);
        recorder.setFrameRate(framerate);
        recorder.setPixelFormat(0); // 0 == AV_PIX_FMT_YUV420P
        if (audio != null) {
            audiograbber.setFrameRate(framerate);
            audiograbber.start();
            recorder.setAudioChannels(audiograbber.getAudioChannels());
            recorder.setSampleRate(audiograbber.getSampleRate());
        }
        recorder.start();
        Bitmap drawframe = createBitmap(canvas_width, canvas_height, Bitmap.Config.ARGB_8888);
        Canvas c = new Canvas(drawframe);
        Paint p = new Paint();
        // The loop header was garbled in the original post; "numframes" is a
        // placeholder for whatever bound the app actually uses.
        for (int frame = 0; frame < numframes; frame++) {
            // code to draw image
            image = converter.convert(drawframe);
            recorder.record(image);
        }
        if (isCancelled()) {
            bgbitmap.recycle();
            drawframe.recycle();
            return "Cancelled";
        }
        if (audio != null) {
            while ((audioframe = audiograbber.grabFrame()) != null) {
                recorder.record(audioframe);
            }
        }
        recorder.stop();
        bgbitmap.recycle();
        drawframe.recycle();
        if (audio != null) audiograbber.stop();
        return "Saved " + params[0];
    } catch (Exception e) {
        e.printStackTrace();
    }
    return "Failed";
}
-
*** *** *** *** *** *** *** *** *** *** *** *** *** *** *** ***
Build fingerprint: 'samsung/treltexx/trelte:5.1.1/LMY47X/N910CXXU1COH4:user/release-keys'
Revision: '21'
ABI: 'arm'
pid: 12039, tid: 12053, name: GCDaemon >>> com.weirdhat.roughanimator <<<
signal 6 (SIGABRT), code -6 (SI_TKILL), fault addr --------
Abort message: 'art/runtime/gc/collector/mark_sweep.cc:381] Can't mark invalid object'
r0 00000000 r1 00002f15 r2 00000006 r3 00000000
r4 b366bdb8 r5 00000006 r6 00000002 r7 0000010c
r8 00000001 r9 b446f550 sl b4429000 fp 9a975c60
ip 00002f15 sp b366b008 lr b6db9cb9 pc b6ddd3ac cpsr 60070010
backtrace:
#00 pc 0003b3ac /system/lib/libc.so (tgkill+12)
#01 pc 00017cb5 /system/lib/libc.so (pthread_kill+52)
#02 pc 000188c7 /system/lib/libc.so (raise+10)
#03 pc 00015165 /system/lib/libc.so (__libc_android_abort+36)
#04 pc 00012fac /system/lib/libc.so (abort+4)
#05 pc 00242f17 /system/lib/libart.so (art::Runtime::Abort()+170)
#06 pc 000ad991 /system/lib/libart.so (art::LogMessage::~LogMessage()+1360)
#07 pc 0013ec53 /system/lib/libart.so (bool art::gc::accounting::HeapBitmap::AtomicTestAndSet(art::mirror::Object const*, art::gc::collector::MarkSweepMarkObjectSlowPath const&)+422)
#08 pc 0013ed27 /system/lib/libart.so (art::gc::collector::MarkSweep::MarkObjectParallel(art::mirror::Object const*)+142)
#09 pc 0013ff23 /system/lib/libart.so (art::gc::collector::MarkSweep::MarkRootParallelCallback(art::mirror::Object**, void*, art::RootInfo const&)+26)
#10 pc 0025893d /system/lib/libart.so (art::ReferenceMapVisitor::VisitQuickFrame()+1024)
#11 pc 00258cad /system/lib/libart.so (art::ReferenceMapVisitor::VisitFrame()+224)
#12 pc 0024c8e9 /system/lib/libart.so (art::StackVisitor::WalkStack(bool)+276)
#13 pc 0024e617 /system/lib/libart.so (art::Thread::VisitRoots(void (*)(art::mirror::Object**, void*, art::RootInfo const&), void*)+994)
#14 pc 0013ef9f /system/lib/libart.so (art::gc::collector::CheckpointMarkThreadRoots::Run(art::Thread*)+126)
#15 pc 0025b44d /system/lib/libart.so (art::ThreadList::RunCheckpoint(art::Closure*)+296)
#16 pc 0013dc0d /system/lib/libart.so (art::gc::collector::MarkSweep::MarkRootsCheckpoint(art::Thread*, bool)+96)
#17 pc 0014165d /system/lib/libart.so (art::gc::collector::MarkSweep::PreCleanCards()+172)
#18 pc 001417d3 /system/lib/libart.so (art::gc::collector::MarkSweep::MarkingPhase()+126)
#19 pc 001418b9 /system/lib/libart.so (art::gc::collector::MarkSweep::RunPhases()+176)
#20 pc 00138527 /system/lib/libart.so (art::gc::collector::GarbageCollector::Run(art::gc::GcCause, bool)+246)
#21 pc 0015865b /system/lib/libart.so (art::gc::Heap::CollectGarbageInternal(art::gc::collector::GcType, art::gc::GcCause, bool)+1426)
#22 pc 00159af5 /system/lib/libart.so (art::gc::Heap::ConcurrentGC(art::Thread*)+56)
#23 pc 000003ef /system/framework/arm/boot.oat
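The abort happens on ART's GC daemon thread ("Can't mark invalid object"), which usually points at a corrupted managed heap or a bad GC root rather than at any one Java line. One way to narrow it down is a minimal export that exercises only FFmpegFrameRecorder and the converter. This is a debugging sketch, not a known fix; the class name, output file, 320x240 size, and frame count are made up for illustration:
import java.io.File;
import android.graphics.Bitmap;
import android.graphics.Canvas;
import android.graphics.Color;
import org.bytedeco.javacpp.avcodec;
import org.bytedeco.javacv.AndroidFrameConverter;
import org.bytedeco.javacv.FFmpegFrameRecorder;

// Minimal repro: if this alone aborts the runtime on an affected Lollipop
// device, the problem lives in the JavaCV/FFmpeg native layer (or its JNI
// use); if it survives, bisect back toward the app's drawing loop.
public final class RecorderSmokeTest {
    public static void record(File out) throws Exception {
        FFmpegFrameRecorder recorder = new FFmpegFrameRecorder(out, 320, 240, 0);
        recorder.setVideoCodec(avcodec.AV_CODEC_ID_H264);
        recorder.setFrameRate(12);
        recorder.setPixelFormat(0); // same value the app passes (yuv420p)
        recorder.start();
        Bitmap bmp = Bitmap.createBitmap(320, 240, Bitmap.Config.ARGB_8888);
        Canvas canvas = new Canvas(bmp);
        AndroidFrameConverter converter = new AndroidFrameConverter();
        for (int i = 0; i < 60; i++) {
            canvas.drawColor(Color.rgb(4 * i, 128, 255 - 4 * i)); // synthetic frame
            recorder.record(converter.convert(bmp));
        }
        recorder.stop();
        recorder.release();
        bmp.recycle();
    }
}
-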
squeeze image while capturing video with FFmpegFrameRecorder
12 April 2016, by Saty
I am trying to stream video with FFmpegFrameRecorder using JavaCV. Everything works great, except that the resulting videos come out a bit squeezed in height.
I am using the code below, which half of the internet community uses for live streaming; a sketch of a possible fix follows the code.
public class MainActivity extends Activity implements OnClickListener {

    private final static String LOG_TAG = "MainActivity";

    private PowerManager.WakeLock mWakeLock;

    private String ffmpeg_link = "rtmp://username:password@xxx.xxx.xxx.xxx:1935/live/test.flv";
    //private String ffmpeg_link = "/mnt/sdcard/new_stream.flv";

    private volatile FFmpegFrameRecorder recorder;
    boolean recording = false;
    long startTime = 0;

    private int sampleAudioRateInHz = 44100;
    private int imageWidth = 320;
    private int imageHeight = 240;
    private int frameRate = 30;

    private Thread audioThread;
    volatile boolean runAudioThread = true;
    private AudioRecord audioRecord;
    private AudioRecordRunnable audioRecordRunnable;

    private CameraView cameraView;
    private IplImage yuvIplimage = null;

    private Button recordButton;
    private LinearLayout mainLayout;

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_LANDSCAPE);
        setContentView(R.layout.activity_main);
        initLayout();
        initRecorder();
    }

    @Override
    protected void onResume() {
        super.onResume();
        if (mWakeLock == null) {
            PowerManager pm = (PowerManager) getSystemService(Context.POWER_SERVICE);
            mWakeLock = pm.newWakeLock(PowerManager.SCREEN_BRIGHT_WAKE_LOCK, LOG_TAG);
            mWakeLock.acquire();
        }
    }

    @Override
    protected void onPause() {
        super.onPause();
        if (mWakeLock != null) {
            mWakeLock.release();
            mWakeLock = null;
        }
    }

    @Override
    protected void onDestroy() {
        super.onDestroy();
        recording = false;
    }

    private void initLayout() {
        mainLayout = (LinearLayout) this.findViewById(R.id.record_layout);
        recordButton = (Button) findViewById(R.id.recorder_control);
        recordButton.setText("Start");
        recordButton.setOnClickListener(this);
        cameraView = new CameraView(this);
        LinearLayout.LayoutParams layoutParam = new LinearLayout.LayoutParams(imageWidth, imageHeight);
        mainLayout.addView(cameraView, layoutParam);
        Log.v(LOG_TAG, "added cameraView to mainLayout");
    }

    private void initRecorder() {
        Log.w(LOG_TAG, "initRecorder");
        if (yuvIplimage == null) {
            // Recreated after frame size is set in surface change method
            yuvIplimage = IplImage.create(imageWidth, imageHeight, IPL_DEPTH_8U, 2);
            //yuvIplimage = IplImage.create(imageWidth, imageHeight, IPL_DEPTH_32S, 2);
            Log.v(LOG_TAG, "IplImage.create");
        }
        recorder = new FFmpegFrameRecorder(ffmpeg_link, imageWidth, imageHeight, 1);
        Log.v(LOG_TAG, "FFmpegFrameRecorder: " + ffmpeg_link + " imageWidth: " + imageWidth + " imageHeight " + imageHeight);
        recorder.setFormat("flv");
        Log.v(LOG_TAG, "recorder.setFormat(\"flv\")");
        recorder.setSampleRate(sampleAudioRateInHz);
        Log.v(LOG_TAG, "recorder.setSampleRate(sampleAudioRateInHz)");
        // re-set in the surface changed method as well
        recorder.setFrameRate(frameRate);
        Log.v(LOG_TAG, "recorder.setFrameRate(frameRate)");
        // Create audio recording thread
        audioRecordRunnable = new AudioRecordRunnable();
        audioThread = new Thread(audioRecordRunnable);
    }

    // Start the capture
    public void startRecording() {
        try {
            recorder.start();
            startTime = System.currentTimeMillis();
            recording = true;
            audioThread.start();
        } catch (FFmpegFrameRecorder.Exception e) {
            e.printStackTrace();
        }
    }

    public void stopRecording() {
        // This should stop the audio thread from running
        runAudioThread = false;
        if (recorder != null && recording) {
            recording = false;
            Log.v(LOG_TAG, "Finishing recording, calling stop and release on recorder");
            try {
                recorder.stop();
                recorder.release();
            } catch (FFmpegFrameRecorder.Exception e) {
                e.printStackTrace();
            }
            recorder = null;
        }
    }

    @Override
    public boolean onKeyDown(int keyCode, KeyEvent event) {
        // Quit when back button is pushed
        if (keyCode == KeyEvent.KEYCODE_BACK) {
            if (recording) {
                stopRecording();
            }
            finish();
            return true;
        }
        return super.onKeyDown(keyCode, event);
    }

    @Override
    public void onClick(View v) {
        if (!recording) {
            startRecording();
            Log.w(LOG_TAG, "Start Button Pushed");
            recordButton.setText("Stop");
        } else {
            stopRecording();
            Log.w(LOG_TAG, "Stop Button Pushed");
            recordButton.setText("Start");
        }
    }

    //---------------------------------------------
    // audio thread, gets and encodes audio data
    //---------------------------------------------
    class AudioRecordRunnable implements Runnable {

        @Override
        public void run() {
            // Set the thread priority
            android.os.Process.setThreadPriority(android.os.Process.THREAD_PRIORITY_URGENT_AUDIO);

            // Audio
            int bufferSize;
            short[] audioData;
            int bufferReadResult;

            bufferSize = AudioRecord.getMinBufferSize(sampleAudioRateInHz,
                    AudioFormat.CHANNEL_CONFIGURATION_MONO, AudioFormat.ENCODING_PCM_16BIT);
            audioRecord = new AudioRecord(MediaRecorder.AudioSource.MIC, sampleAudioRateInHz,
                    AudioFormat.CHANNEL_CONFIGURATION_MONO, AudioFormat.ENCODING_PCM_16BIT, bufferSize);
            audioData = new short[bufferSize];

            Log.d(LOG_TAG, "audioRecord.startRecording()");
            audioRecord.startRecording();

            // Audio Capture/Encoding Loop
            while (runAudioThread) {
                // Read from audioRecord
                bufferReadResult = audioRecord.read(audioData, 0, audioData.length);
                if (bufferReadResult > 0) {
                    //Log.v(LOG_TAG, "audioRecord bufferReadResult: " + bufferReadResult);
                    // Changes in this variable may not be picked up despite it being "volatile"
                    if (recording) {
                        try {
                            // Write to FFmpegFrameRecorder
                            Buffer[] buffer = {ShortBuffer.wrap(audioData, 0, bufferReadResult)};
                            recorder.record(buffer);
                        } catch (FFmpegFrameRecorder.Exception e) {
                            Log.v(LOG_TAG, e.getMessage());
                            e.printStackTrace();
                        }
                    }
                }
            }
            Log.v(LOG_TAG, "AudioThread Finished");

            /* Capture/Encoding finished, release recorder */
            if (audioRecord != null) {
                audioRecord.stop();
                audioRecord.release();
                audioRecord = null;
                Log.v(LOG_TAG, "audioRecord released");
            }
        }
    }

    class CameraView extends SurfaceView implements SurfaceHolder.Callback, PreviewCallback {

        private boolean previewRunning = false;
        private SurfaceHolder holder;
        private Camera camera;
        private byte[] previewBuffer;
        long videoTimestamp = 0;
        Bitmap bitmap;
        Canvas canvas;

        public CameraView(Context _context) {
            super(_context);
            holder = this.getHolder();
            holder.addCallback(this);
            holder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
        }

        @Override
        public void surfaceCreated(SurfaceHolder holder) {
            camera = Camera.open();
            try {
                camera.setPreviewDisplay(holder);
                camera.setPreviewCallback(this);
                Camera.Parameters currentParams = camera.getParameters();
                Log.v(LOG_TAG, "Preview Framerate: " + currentParams.getPreviewFrameRate());
                Log.v(LOG_TAG, "Preview imageWidth: " + currentParams.getPreviewSize().width + " imageHeight: " + currentParams.getPreviewSize().height);
                // Use these values
                imageWidth = currentParams.getPreviewSize().width;
                imageHeight = currentParams.getPreviewSize().height;
                frameRate = currentParams.getPreviewFrameRate();
                bitmap = Bitmap.createBitmap(imageWidth, imageHeight, Bitmap.Config.ALPHA_8);
                /*
                Log.v(LOG_TAG, "Creating previewBuffer size: " + imageWidth * imageHeight * ImageFormat.getBitsPerPixel(currentParams.getPreviewFormat()) / 8);
                previewBuffer = new byte[imageWidth * imageHeight * ImageFormat.getBitsPerPixel(currentParams.getPreviewFormat()) / 8];
                camera.addCallbackBuffer(previewBuffer);
                camera.setPreviewCallbackWithBuffer(this);
                */
                camera.startPreview();
                previewRunning = true;
            } catch (IOException e) {
                Log.v(LOG_TAG, e.getMessage());
                e.printStackTrace();
            }
        }

        public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
            Log.v(LOG_TAG, "Surface Changed: width " + width + " height: " + height);
            // We would do this if we want to reset the camera parameters
            /*
            if (!recording) {
                if (previewRunning) {
                    camera.stopPreview();
                }
                try {
                    //Camera.Parameters cameraParameters = camera.getParameters();
                    //p.setPreviewSize(imageWidth, imageHeight);
                    //p.setPreviewFrameRate(frameRate);
                    //camera.setParameters(cameraParameters);
                    camera.setPreviewDisplay(holder);
                    camera.startPreview();
                    previewRunning = true;
                } catch (IOException e) {
                    Log.e(LOG_TAG, e.getMessage());
                    e.printStackTrace();
                }
            }
            */
            // Get the current parameters
            Camera.Parameters currentParams = camera.getParameters();
            Log.v(LOG_TAG, "Preview Framerate: " + currentParams.getPreviewFrameRate());
            Log.v(LOG_TAG, "Preview imageWidth: " + currentParams.getPreviewSize().width + " imageHeight: " + currentParams.getPreviewSize().height);
            // Use these values
            imageWidth = currentParams.getPreviewSize().width;
            imageHeight = currentParams.getPreviewSize().height;
            frameRate = currentParams.getPreviewFrameRate();
            // Create the yuvIplimage if needed
            yuvIplimage = IplImage.create(imageWidth, imageHeight, IPL_DEPTH_8U, 2);
            //yuvIplimage = IplImage.create(imageWidth, imageHeight, IPL_DEPTH_32S, 2);
        }

        @Override
        public void surfaceDestroyed(SurfaceHolder holder) {
            try {
                camera.setPreviewCallback(null);
                previewRunning = false;
                camera.release();
            } catch (RuntimeException e) {
                Log.v(LOG_TAG, e.getMessage());
                e.printStackTrace();
            }
        }

        @Override
        public void onPreviewFrame(byte[] data, Camera camera) {
            if (yuvIplimage != null && recording) {
                videoTimestamp = 1000 * (System.currentTimeMillis() - startTime);

                // Put the camera preview frame right into the yuvIplimage object
                yuvIplimage.getByteBuffer().put(data);

                // FAQ about IplImage:
                // - For custom raw processing of data, getByteBuffer() returns an NIO direct
                //   buffer wrapped around the memory pointed by imageData, and under Android we can
                //   also use that Buffer with Bitmap.copyPixelsFromBuffer() and copyPixelsToBuffer().
                // - To get a BufferedImage from an IplImage, we may call getBufferedImage().
                // - The createFrom() factory method can construct an IplImage from a BufferedImage.
                // - There are also a few copy*() methods for BufferedImage<->IplImage data transfers.

                // Let's try it..
                // This works but only on transparency
                // Need to find the right Bitmap and IplImage matching types
                /*
                bitmap.copyPixelsFromBuffer(yuvIplimage.getByteBuffer());
                //bitmap.setPixel(10, 10, Color.MAGENTA);
                canvas = new Canvas(bitmap);
                Paint paint = new Paint();
                paint.setColor(Color.GREEN);
                float leftx = 20;
                float topy = 20;
                float rightx = 50;
                float bottomy = 100;
                RectF rectangle = new RectF(leftx, topy, rightx, bottomy);
                canvas.drawRect(rectangle, paint);
                bitmap.copyPixelsToBuffer(yuvIplimage.getByteBuffer());
                */

                //Log.v(LOG_TAG, "Writing Frame");
                try {
                    // Get the correct time
                    recorder.setTimestamp(videoTimestamp);
                    // Record the image into FFmpegFrameRecorder
                    recorder.record(yuvIplimage);
                } catch (FFmpegFrameRecorder.Exception e) {
                    Log.v(LOG_TAG, e.getMessage());
                    e.printStackTrace();
                }
            }
        }
    }
}
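A guess at the squeeze, with a sketch (this is an assumption, not a confirmed diagnosis): initRecorder() fixes the output at 320x240, while surfaceCreated()/surfaceChanged() later adopt whatever preview size the camera actually reports. If that preview size has a different aspect ratio (say 720x480), every frame gets scaled down to fit 320x240 and looks squeezed vertically. One way out is to recreate the recorder once the real preview size is known, before start() is called, so input and output keep the same aspect ratio:
// Hypothetical helper; call it from surfaceChanged() (while not recording)
// instead of only updating imageWidth/imageHeight there.
private void resetRecorder(int previewWidth, int previewHeight) {
    imageWidth = previewWidth;
    imageHeight = previewHeight;
    yuvIplimage = IplImage.create(imageWidth, imageHeight, IPL_DEPTH_8U, 2);
    recorder = new FFmpegFrameRecorder(ffmpeg_link, imageWidth, imageHeight, 1);
    recorder.setFormat("flv");
    recorder.setSampleRate(sampleAudioRateInHz);
    recorder.setFrameRate(frameRate);
}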
-
Video players questions
31 May 2015, by yayuj
Given that FFmpeg is the leading multimedia framework and that most video/audio players use it, I'm wondering some things about audio/video players that use FFmpeg as an intermediary.
I'm studying and I want to know how audio/video players work, and I have some questions.
I was reading the ffplay source code and saw that ffplay handles subtitle streams. I tried an mkv file with a subtitle in it and it didn't work; I tried arguments such as -sst, but nothing happened. I was also reading about subtitles and how video files (or should I say containers?) use them. I saw that there are two ways of putting in a subtitle: hardsubs and softsubs. Roughly speaking, a hardsub is burned in and becomes part of the video, while a softsub travels as a separate subtitle stream (I might be wrong; please correct me).
-
The question is: how do players handle this? When the subtitle is part of the video there is nothing to do, since the video stream itself shows the subtitle, but what about softsubs? How are they handled? (I have heard about text subs as well.) How does the subtitle appear on screen, and how can fonts, size, and colors be configured without re-encoding everything?
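For what it's worth, two plain ffmpeg invocations illustrate the split (a sketch; it assumes an ffmpeg build with libass, which the subtitles filter needs, and a file whose first subtitle stream is text-based):
# Softsub: the subtitle is its own stream; extract it without touching the video
ffmpeg -i movie.mkv -map 0:s:0 subs.srt

# Hardsub: burn the subtitle stream into the pixels (this re-encodes the video)
ffmpeg -i movie.mkv -vf "subtitles=movie.mkv" -c:a copy burned.mp4
A player like ffplay does the softsub work at playback time instead: it decodes the subtitle stream in parallel with the video and blends the rendered text or bitmaps over each frame, which is why fonts, size, and colors can change without any re-encoding.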
-
I was studying the source code of some video players; some or most of them use OpenGL to render the frames, and others use a canvas of sorts (such as Qt's QWidget). Which approach is the most common, and which one is faster and better? OpenGL with shaders and so on? Handling YUV or RGB? How does that work?
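For reference, the work a YUV-to-RGB fragment shader does is just a small per-pixel affine transform; one common full-range BT.601 form (HD content typically uses the slightly different BT.709 coefficients) is:
$$
\begin{aligned}
R &= Y + 1.402\,(V - 128)\\
G &= Y - 0.344136\,(U - 128) - 0.714136\,(V - 128)\\
B &= Y + 1.772\,(U - 128)
\end{aligned}
$$
Doing this on the GPU is one reason players favor OpenGL: the decoder's yuv420p planes are uploaded as textures and converted per pixel in the shader, instead of a CPU-side sws_scale pass feeding an RGB widget.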
-
It might be a dumb question, but in what format does an AVFrame come? For example, when we want to save frames as images, first we grab the frame and then we convert it; from which format are we converting? Does it change according to the video codec, or is it always the same?
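A short JavaCV sketch of the usual answer (an illustration with a made-up input name: the decoded pixel format is a property of the stream, reported by the demuxer/decoder rather than fixed by the API; for most consumer video it is yuv420p, and the RGB conversion happens inside the converter):
import java.awt.image.BufferedImage;
import java.io.File;
import javax.imageio.ImageIO;
import org.bytedeco.javacv.FFmpegFrameGrabber;
import org.bytedeco.javacv.Frame;
import org.bytedeco.javacv.Java2DFrameConverter;

public class FrameDump {
    public static void main(String[] args) throws Exception {
        FFmpegFrameGrabber grabber = new FFmpegFrameGrabber("input.mkv");
        grabber.start();
        // An AVPixelFormat id such as AV_PIX_FMT_YUV420P; it depends on the
        // stream (codec and encoder choices), not on one fixed frame format.
        System.out.println("decoded pixel format id: " + grabber.getPixelFormat());
        Frame frame = grabber.grabImage();                             // first video frame
        BufferedImage img = new Java2DFrameConverter().convert(frame); // YUV -> RGB happens here
        ImageIO.write(img, "png", new File("frame0.png"));
        grabber.stop();
        grabber.release();
    }
}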
-
Most of the videos I have been trying to handle use yuv420p. I tried to save the frames as PNG, and I need to convert to RGB first. I did a test with several players: I paused them at the same frame, took screenshots, and compared. The video players show the frame more colorfully, while with ffplay, which uses SDL (OpenGL), the colors (quality) of the frame seem really low. What might it be? What do they do? Is it shaders (or a kind of magic? haha).
Well, I think that is it for now. I hope you can help me with this.
If this isn't the correct place, please let me know where is; I haven't found a better fit among the Stack Exchange communities.