
Media (21)
-
1,000,000
27 September 2011
Updated: September 2011
Language: English
Type: Audio
-
Demon Seed
26 September 2011
Updated: September 2011
Language: English
Type: Audio
-
The Four of Us are Dying
26 September 2011
Updated: September 2011
Language: English
Type: Audio
-
Corona Radiata
26 September 2011
Updated: September 2011
Language: English
Type: Audio
-
Lights in the Sky
26 September 2011
Updated: September 2011
Language: English
Type: Audio
-
Head Down
26 September 2011
Updated: September 2011
Language: English
Type: Audio
Other articles (65)
-
Add notes and captions to images
7 February 2011. To be able to add notes and captions to images, the first step is to install the "Légendes" plugin.
Once the plugin is activated, you can configure it in the configuration area to change the rights for creating, editing and deleting notes. By default, only site administrators can add notes to images.
Changes when adding a media item
When adding a media item of type "image", a new button appears above the preview (...)
-
Submit bugs and patches
13 April 2011. Unfortunately, software is never perfect.
If you think you have found a bug, report it using our ticket system. Please help us fix it by providing the following information: the browser you are using, including its exact version; as precise an explanation of the problem as possible; if possible, the steps that led to the problem; and a link to the site / page in question.
If you think you have fixed the bug, file a ticket and attach a corrective patch to it.
You may also (...)
-
MediaSPIP 0.1 Beta version
25 April 2011. MediaSPIP 0.1 beta is the first version of MediaSPIP declared "usable".
The zip file provided here contains only the sources of MediaSPIP in its standalone version.
To get a working installation, you must manually install all software dependencies on the server.
If you want to use this archive for an installation in "farm mode", you will also need to carry out other manual (...)
On other sites (8739)
-
FFmpeg Video Streaming on Android app
18 April 2017, by Johnny Roger. I’m trying to stream my laptop webcam to my Android smartphone.
So I set up ffserver this way:
HTTPPort 1234
RTSPPort 1235
<Feed feed2.ffm>
File /tmp/feed2.ffm
FileMaxSize 2M
ACL allow 127.0.0.1
</Feed>
<Stream test1.sdp>
Feed feed2.ffm
Format rtp
Noaudio
VideoCodec libx264
AVOptionVideo flags +global_header
AVOptionVideo me_range 16
AVOptionVideo qdiff 4
AVOptionVideo qmin 10
AVOptionVideo qmax 51
ACL allow 192.168.0.0 192.168.255.255
</Stream>
and I used the following ffmpeg command:
ffmpeg -i /dev/video0 -vcodec libx264 -tune zerolatency -crf 18 http://localhost:1234/feed2.ffm
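Before involving the phone at all, a quick sanity check of what ffserver is actually publishing can be done on the laptop itself with ffplay (assuming it is installed alongside ffmpeg, and using the stream name from the config above):
ffplay rtsp://localhost:1235/test1.sdp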
In my Android Studio project I set the required permission in the manifest file.
This is my MainActivity.java:
package com.example.johnny.ffmpeg;
import android.app.ProgressDialog;
import android.content.Context;
import android.media.MediaPlayer;
import android.os.Bundle;
import android.app.Activity;
import android.support.v7.app.AppCompatActivity;
import android.util.Log;
import android.view.SurfaceView;
import android.view.View;
import android.net.Uri;
import android.widget.Button;
import android.widget.ImageButton;
import android.widget.ImageView;
import android.widget.MediaController;
import android.widget.Toast;
import android.widget.VideoView;
public class MainActivity extends AppCompatActivity implements View.OnClickListener {

    ProgressDialog mDialog;
    VideoView videoView;
    ImageView btnPlayPause;
    MediaController mediaController;
    // RTSP stream published by ffserver on the laptop
    String videoURL = "rtsp://192.168.1.100:1235/test1.sdp";

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);

        videoView = (VideoView) findViewById(R.id.videoView);
        mediaController = new MediaController(this);
        mediaController.setAnchorView(videoView);

        btnPlayPause = (ImageButton) findViewById(R.id.btn_play_pause);
        btnPlayPause.setOnClickListener(this);
    }

    @Override
    public void onClick(View v) {
        // Show a progress dialog while the stream is being prepared
        mDialog = new ProgressDialog(MainActivity.this);
        mDialog.setMessage("Please wait...");
        mDialog.setCanceledOnTouchOutside(false);
        mDialog.show();

        Uri uri = Uri.parse(videoURL);
        videoView.setVideoURI(uri);

        try {
            if (!videoView.isPlaying()) {
                videoView.setMediaController(mediaController);
                videoView.requestFocus();
                videoView.setOnCompletionListener(new MediaPlayer.OnCompletionListener() {
                    @Override
                    public void onCompletion(MediaPlayer mp) {
                        btnPlayPause.setImageResource(R.drawable.ic_aplay);
                    }
                });
            } else {
                videoView.pause();
                btnPlayPause.setImageResource(R.drawable.ic_pause);
            }
        } catch (Exception ex) {
            Context context = getApplicationContext();
            String text = ex.toString();
            int duration = Toast.LENGTH_SHORT;
            Toast toast = Toast.makeText(context, text, duration);
            toast.show();
        }

        videoView.requestFocus();
        videoView.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
            @Override
            public void onPrepared(MediaPlayer mp) {
                // Stream is ready: dismiss the dialog and start playback
                mDialog.dismiss();
                mp.setLooping(true);
                videoView.start();
                btnPlayPause.setImageResource(R.drawable.ic_pause);
            }
        });
        videoView.setOnErrorListener(new MediaPlayer.OnErrorListener() {
            @Override
            public boolean onError(MediaPlayer mp, int what, int extra) {
                Log.d("video", "setOnErrorListener ");
                return true;
            }
        });
    }
}
and this is the corresponding activity_main.xml file:
<RelativeLayout>
    <VideoView />
    <TextView />
    <TextView />
</RelativeLayout>
I tested the stream in VLC and it works well (even if with a delay; it’s not real time).
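(For reference, the VLC check simply amounts to opening the RTSP URL directly, along the lines of:
vlc rtsp://192.168.1.100:1235/test1.sdp
using the same address the app uses.)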
The app instead gives me the message "can’t play this video" and the following error:
V/MediaPlayer: message received msg=100, ext1=1, ext2=-38
E/MediaPlayer: Error (1,-38)
D/VideoView: Error: 1,-38
D/video: setOnErrorListener
E/MediaPlayer: error (1, -38)
V/MediaPlayer: callback application
V/MediaPlayer: back from callback
E/MediaPlayer: Error (1,-38)
D/VideoView: Error: 1,-38
D/video: setOnErrorListener
E/ViewRootImpl: sendUserActionEvent() mView == null
I saw that the problem could be the format of the video, but I also tried with an mp4 file:
ffmpeg -i test.mp4 -vcodec libx264 -tune zerolatency -crf 18 http://localhost:1234/feed2.ffm
and the result was the same. How can I fix this and make my app work properly?
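As a side note, one way to see exactly which codec, profile and pixel format the phone is being asked to decode is to probe the published stream from another machine on the network (a diagnostic sketch, assuming the ffmpeg tools are available there):
ffprobe rtsp://192.168.1.100:1235/test1.sdp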
-
WebRTC predictions for 2016
17 February 2016, by silvia. I wrote these predictions in the first week of January and meant to publish them as encouragement to think about where WebRTC still needs some work. I’d like to be able to compare the state of WebRTC in the browser a year from now. Therefore, without further ado, here are my thoughts.
WebRTC Browser support
I’m quite optimistic when it comes to browser support for WebRTC. We have seen Edge bring in initial support last year and Apple looking to hire engineers to implement WebRTC. My prediction is that we will see the following developments in 2016:
- Edge will become interoperable with Chrome and Firefox, i.e. it will publish VP8/VP9 and H.264/H.265 support
- Firefox of course continues to support both VP8/VP9 and H.264/H.265
- Chrome will follow the spec and implement H.264/H.265 support (to add to their already existing VP8/VP9 support)
- Safari will enter the WebRTC space but only with H.264/H.265 support
Codec Observations
With Edge and Safari entering the WebRTC space, there will be a larger focus on H.264/H.265. It will help with creating interoperability between the browsers.
However, since there are so many flavours of H.264/H.265, I expect that when different browsers are used at different endpoints, we will get poor quality video calls because of having to negotiate a common denominator. Certainly, baseline will work interoperably, but better encoding quality and lower bandwidth will only be achieved if all endpoints use the same browser.
Thus, we will get to the funny situation where we buy ourselves interoperability at the cost of video quality and bandwidth. I’d call that a “degree of interoperability” and not the best possible outcome.
I’m going to go out on a limb and say that at this stage, Google is going to strongly consider improving the case of VP8/VP9 by improving its bandwidth adaptability: I think they will buy themselves some SVC capability and make VP9 the best quality codec for live video conferencing. Thus, when Safari eventually follows the standard and also implements VP8/VP9 support, the interoperability win of H.264/H.265 will turn out to be only temporary, overshadowed by the vastly better video quality of VP9.
The Enterprise Boundary
Like all video conferencing technology, WebRTC is having a hard time dealing with the corporate boundary : firewalls and proxies get in the way of setting up video connections from within an enterprise to people outside.
The telco world has come up with the concept of SBCs (session border controllers). SBCs come packed with functionality to deal with security, signalling protocol translation, Quality of Service policing, regulatory requirements, statistics, billing, and even media services like transcoding.
SBCs are complete overkill for a world where a large number of Web applications simply want to add a WebRTC feature – probably mostly to provide a video or audio customer support service, but it could be a live training session with call-in, or an interest group conference call.
We cannot install a custom SBC solution for every WebRTC service provider in every enterprise. That’s like saying we need a custom Web proxy for every Web server. It doesn’t scale.
Cloud services thrive on their ability to sell directly to an individual in an organisation on their credit card without that individual having to ask their IT department to put special rules in place. WebRTC will not make progress in the corporate environment unless this is fixed.
We need a solution that allows all WebRTC services to get through an enterprise firewall and enterprise proxy. I think the WebRTC standards have done pretty well with firewalls and connecting to a TURN server on port 443 will do the trick most of the time. But enterprise proxies are the next frontier.
What it takes is some kind of media packet forwarding service that sits on the firewall or in a proxy and allows WebRTC media packets through – maybe with some configuration that is necessary in the browsers or the Web app to add this service as another type of TURN server.
I don’t have a full understanding of the problems involved, but I think such a solution is vital before WebRTC can go mainstream. I expect that this year we will see some clever people coming up with a solution for this and a new type of product will be born and rolled out to enterprises around the world.
Summary
So these are my predictions. In summary, they address the key areas where I think WebRTC still has to make progress : interoperability between browsers, video quality at low bitrates, and the enterprise boundary. I’m really curious to see where we stand with these a year from now.
—
It’s worth mentioning Philipp Hancke’s tweet reply to my post:
https://datatracker.ietf.org/doc/draft-ietf-rtcweb-return/ … — we saw some clever people come up with a solution already. Now it needs to be implemented
The post WebRTC predictions for 2016 first appeared on ginger’s thoughts.
-
How can I capture audio AND video simultaneously with ffmpeg from a USB capture device
22 October 2018, by oban. I’m capturing video by means of a USB Terratec Grabster AV350 (which is based on the em2860 chip).
I can’t get the audio to play back. If I play the captured video with VLC or with ffplay I get only 3 seconds of sound and then silence for the rest of the video...
During the capture I don’t get any errors. At the end it reports the size of the captured video and audio....
I’m using this ffmpeg command:
ffmpeg -f alsa -ac 2 -i hw:3 -f video4linux2 -i /dev/video0 -acodec ac3 -ab 128k -vcodec mpeg4 -b 6000k -r 25 test5.avi
The log is:
[alsa @ 0x9bcd420]Estimating duration from bitrate, this may be inaccurate
Input #0, alsa, from 'hw:3':
Duration: N/A, start: 69930.998994, bitrate: N/A
Stream #0.0: Audio: pcm_s16le, 44100 Hz, 2 channels, s16, 1411 kb/s
[video4linux2 @ 0x9bf5d30]Estimating duration from bitrate, this may be inaccurate
Input #1, video4linux2, from '/dev/video0':
Duration: N/A, start: 1307111377.654173, bitrate: -2147483 kb/s
Stream #1.0: Video: rawvideo, yuyv422, 720x576, -2147483 kb/s, 1000k tbr, 1000k tbn, 1000k tbc
[ac3 @ 0x9bf9590]No channel layout specified. The encoder will guess the layout, but it might be incorrect.
Output #0, avi, to 'test5.avi':
Metadata:
ISFT : Lavf52.64.2
Stream #0.0: Video: mpeg4, yuv420p, 720x576, q=2-31, 6000 kb/s, 25 tbn, 25 tbc
Stream #0.1: Audio: ac3, 44100 Hz, stereo, s16, 128 kb/s
Stream mapping:
Stream #1.0 -> #0.0
Stream #0.0 -> #0.1
Press [q] to stop encoding
frame= 1283 fps= 25 q=2.3 Lsize= 38677kB time=51.32 bitrate=6173.9kbits/s
**video:37755kB audio:846kB** global headers:0kB muxing overhead 0.198922%
If I reduce the command to capture only audio, then the audio file plays back successfully:
ffmpeg -f alsa -ac 2 -i hw:3,0 -acodec ac3 -ab 128k test5.avi
[alsa @ 0x8ede420]Estimating duration from bitrate, this may be inaccurate
Input #0, alsa, from 'hw:3,0':
Duration: N/A, start: 70395.998935, bitrate: N/A
Stream #0.0: Audio: pcm_s16le, 44100 Hz, 2 channels, s16, 1411 kb/s
[ac3 @ 0x8eebac0]No channel layout specified. The encoder will guess the layout, but it might be incorrect.
Output #0, avi, to 'test5.avi':
Metadata:
ISFT : Lavf52.64.2
Stream #0.0: Audio: ac3, 44100 Hz, stereo, s16, 128 kb/s
Stream mapping:
Stream #0.0 -> #0.0
Press [q] to stop encoding
size= 227kB time=13.62 bitrate= 136.8kbits/s
**video:0kB audio:213kB** global headers:0kB muxing overhead 6.902375%
If I run the command to capture only video, then VLC or ffplay can play the video successfully:
ffmpeg -f video4linux2 -i /dev/video0 -vcodec mpeg4 -b 12000k -r 25 test5.avi
[video4linux2 @ 0x91d6420]Estimating duration from bitrate, this may be inaccurate
Input #0, video4linux2, from '/dev/video0':
Duration: N/A, start: 1307112044.025687, bitrate: -2147483 kb/s
Stream #0.0: Video: rawvideo, yuyv422, 720x576, -2147483 kb/s, 1000k tbr, 1000k tbn, 1000k tbc
Output #0, avi, to 'test5.avi':
Metadata:
ISFT : Lavf52.64.2
Stream #0.0: Video: mpeg4, yuv420p, 720x576, q=2-31, 12000 kb/s, 25 tbn, 25 tbc
Stream mapping:
Stream #0.0 -> #0.0
Press [q] to stop encoding
frame= 388 fps= 25 q=2.0 Lsize= 12963kB time=15.52 bitrate=6842.5kbits/s
**video:12949kB audio:0kB** global headers:0kB muxing overhead 0.114584%
A strange behaviour I noticed is that once I have tried capturing video and audio together, I cannot capture audio any more afterwards, unless I unplug the AV350 first.
The G350 is located at card 3:
htpc@htpc-01:/proc/asound/G350/pcm0c$ more info
card: 3
device: 0
subdevice: 0
stream: CAPTURE
id: USB Audio
name: USB Audio
subname: subdevice #0
class: 0
subclass: 0
subdevices_count: 1
subdevices_avail: 1
The OS is Linux 2.6.38-8-generic with the Ubuntu Natty Narwhal release.
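For reference, the same card numbering can be cross-checked with ALSA’s own listing of capture devices:
arecord -l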
Any help on how to tackle this issue would be great ....
Thanks!