
Other articles (67)
-
Help translate it
10 April 2011
You can help us improve the wording used in the software, or translate it into any new language so that it can reach new linguistic communities.
This is done through SPIP's translation interface, where all of MediaSPIP's language modules are available. You simply need to subscribe to the translators' mailing list to ask for more information.
At present, MediaSPIP is only available in French and (...) -
Authorizations overridden by plugins
27 April 2010, by Mediaspip core
autoriser_auteur_modifier(), so that visitors can edit their own information on the authors page -
Publishing on MédiaSpip
13 June 2013
Can I post content from an iPad tablet?
Yes, if your MédiaSpip installation is at version 0.2 or later. If necessary, contact the administrator of your MédiaSpip to find out.
On other sites (10257)
-
Read h264 stream from an IP camera
13 July 2015, by João Neves
Currently, I am trying to use OpenCV to read video from my Canon VB-H710F camera.
For this purpose I tried two different solutions:
SOLUTION 1: Read the stream from the RTSP address
VideoCapture cam("rtsp://root:camera@10.0.4.127/stream/profile1=u");
while (true)
    cam >> frame;
In this case I am using OpenCV to read directly from a stream encoded in H.264 (profile1); however, this yields the same problem reported here: http://answers.opencv.org/question/34012/ip-camera-h264-error-while-decoding/
As suggested in that question, I tried disabling FFmpeg support in the OpenCV installation, which solved the H.264 decoding errors but raised another problem: when accessing the stream with OpenCV backed by GStreamer, there is always a large delay. With this solution I achieve 15 FPS, but with a delay of 5 seconds, which is not acceptable for a real-time application.
SOLUTION 2: Read the frames from an HTTP address
while (true) {
    startTime = System.currentTimeMillis();
    URL url = new URL("http://[IP]/-wvhttp-01-/image.cgi");
    URLConnection con = url.openConnection();
    BufferedImage image = ImageIO.read(con.getInputStream());
    showImage(image);
    estimatedTime = System.currentTimeMillis() - startTime;
    System.out.println(estimatedTime);
    Thread.sleep(5);
}
This strategy simply grabs each frame from the URL that the camera provides. The code is in Java, but the results are the same in C++ with the curl library.
This solution avoids the delay of the first one; however, it takes a little more than 100 ms to grab each frame, which means I can only achieve about 10 FPS on average.
I would like to know how I can read the video using C++ or another library developed in C++.
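One thing worth checking before abandoning SOLUTION 1: recent OpenCV builds (3.4 and later) let you pass extra options to the FFmpeg backend through the OPENCV_FFMPEG_CAPTURE_OPTIONS environment variable, as "key;value" pairs separated by "|". Forcing RTSP over TCP often removes the H.264 "error while decoding" artifacts caused by UDP packet loss, without rebuilding OpenCV or switching to GStreamer. This is a sketch, not a guaranteed fix; the camera URL is the one from the question.

```shell
# Tell OpenCV's FFmpeg backend to carry RTSP over TCP instead of UDP.
# The variable holds "key;value" pairs, separated by '|' if there are
# several; it must be set before VideoCapture opens the stream.
export OPENCV_FFMPEG_CAPTURE_OPTIONS="rtsp_transport;tcp"
echo "$OPENCV_FFMPEG_CAPTURE_OPTIONS"

# The capture itself is then opened exactly as in SOLUTION 1:
#   VideoCapture cam("rtsp://root:camera@10.0.4.127/stream/profile1=u");
```

TCP trades a little latency for reliable delivery, which is usually the right trade-off when corrupted frames are the problem.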
-
How do I keep black areas black with ffmpeg ?
11 January 2018, by Aurelius Schnitzler
When encoding GoPro videos with ffmpeg using
ffmpeg -i Goprovideo.mp4 -pix_fmt yuv420p -vf scale=1920:-1,crop=1920:1080:0:362 Goprovideo-out.mp4
I noticed that the videos not only get much smaller, but in some cases the black areas also lose intensity, so what was black is now grey. Not to an extreme, but slightly noticeable.
How do I keep black areas black with ffmpeg ?
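A likely cause is a full-range vs. limited-range mismatch: some cameras flag their footage as full-range YUV (black at Y=0), and if the re-encode drops or misinterprets that flag, players treat Y=0 as below-black limited range and the blacks wash out to grey. A hedged sketch of a fix, assuming the GoPro source really is full-range, is to tell the scaler the ranges explicitly and tag the output:

```shell
# Convert full-range input to properly tagged limited-range output:
# in_range/out_range make the scale filter do the range conversion,
# and -color_range tv writes the matching flag into the output stream.
ffmpeg -i Goprovideo.mp4 \
    -vf "scale=1920:-1:in_range=full:out_range=limited,crop=1920:1080:0:362" \
    -pix_fmt yuv420p -color_range tv Goprovideo-out.mp4
```

Check the source first with `ffprobe Goprovideo.mp4` and look at the reported color_range; if it is already "tv" (limited), this is not the cause and the ranges above should not be forced.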
-
How to convert raw Android camera data into standard MPEG-4 with ffmpeg?
12 October 2014, by trololo
I can receive raw camera data:
fos = new FileOutputStream(new File(Environment.getExternalStorageDirectory().getPath() + "/video_raw.raw"));
_.mCamera.setPreviewCallback(new Camera.PreviewCallback() {
    public synchronized void onPreviewFrame(byte[] data, Camera camera) {
        try {
            if (fos != null)
                fos.write(data); // I can save the data to a file:
                // Environment.getExternalStorageDirectory().getPath() + "/video_raw.raw"
        } catch (IOException e1) {
            e1.printStackTrace();
        }
        try {
            camera.addCallbackBuffer(data);
        } catch (Exception e) {
            Log.e("CameraTest", "addCallbackBuffer error");
            return;
        }
        return;
    }
});
and I am trying to use this library to manipulate the video/audio data:
File dir = new File(_.mediaFile).getParentFile();
FfmpegController fc = new FfmpegController(this, dir);

Clip clip_in = new Clip(Environment.getExternalStorageDirectory().getPath() + "/video_raw.raw");
clip_in.height = 480;
clip_in.width = 720;
clip_in.videoCodec = "rawvideo";
clip_in.videoFilter = "rawvideo";

Clip clip_out = new Clip(Environment.getExternalStorageDirectory().getPath() + "/video15a.mp4");
// put flags in clip
clip_out.videoFps = "30";
clip_out.width = 480;
clip_out.height = 320;
clip_out.videoCodec = "libx264";
clip_out.audioCodec = "copy";

fc.processVideo(clip_in, clip_out, false, new ShellUtils.ShellCallback() {
    @Override
    public void shellOut(String shellLine) {
        System.out.println("MIX> " + shellLine);
    }

    @Override
    public void processComplete(int exitValue) {
        if (exitValue != 0) {
            System.err.println("concat non-zero exit: " + exitValue);
            Log.d("ffmpeg", "Compilation error. FFmpeg failed");
            Toast.makeText(AMain.this, "result: ffmpeg failed", Toast.LENGTH_LONG).show();
        } else {
        }
    }
});
Link to the project:
https://github.com/guardianproject/android-ffmpeg-java
This code works correctly for any video file created beforehand (by other apps), but it does not work in my case. How do I fix it?
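The likely problem is that a raw preview dump has no container header, so FFmpeg cannot guess the frame geometry, pixel format, or frame rate: they must all be given as input options before -i. Android's default preview format is NV21 unless setPreviewFormat() changed it. As a hedged command-line sketch of what the wrapper would need to generate (the dimensions and rate below mirror the clip_in settings in the question and are assumptions about the actual capture settings):

```shell
# Describe the headerless raw file explicitly, then encode to H.264/MP4.
# -f rawvideo   : treat the input as raw frames, no demuxing
# -pix_fmt nv21 : Android's default preview pixel format (assumption)
# -s / -r       : frame size and rate, which the raw file cannot carry
ffmpeg -f rawvideo -pix_fmt nv21 -s 720x480 -r 30 \
    -i video_raw.raw -c:v libx264 -pix_fmt yuv420p video15a.mp4
```

Note also that the question's code sets clip_in.videoFilter = "rawvideo", which names a demuxer, not a filter, and clip_out.audioCodec = "copy" even though the raw dump contains no audio stream; both would make the generated command fail.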