
Media (1)
- The Pirate Bay from Belgium
1 April 2013, by
Updated: April 2013
Language: French
Type: Image
Other articles (69)
- Supporting all media types
13 April 2011, by
Unlike most software and media-sharing platforms, MediaSPIP aims to manage as many different media types as possible. The following are just a few examples from an ever-expanding list of supported formats: images: png, gif, jpg, bmp and more; audio: MP3, Ogg, Wav and more; video: AVI, MP4, OGV, mpg, mov, wmv and more; text, code and other data: OpenOffice, Microsoft Office (Word, PowerPoint, Excel), web (HTML, CSS), LaTeX, Google Earth and (...)
- Prerequisites for installation
31 January 2010, by
Preamble
The purpose of this article is not to detail how to install these pieces of software, but rather to provide information about their specific configuration.
First of all, SPIPMotion, like MediaSPIP, is designed to run on Debian-type Linux distributions and their derivatives (Ubuntu...). The documentation on this site therefore refers to these distributions. It is also possible to use it on other Linux distributions, but no guarantee of correct operation can be given.
It (...)
- Emballe Médias: a simple way to put documents online
29 October 2010, by
The emballe médias plugin was developed primarily for the mediaSPIP distribution, but it is also used in other related projects, such as géodiversité. Required and compatible plugins
To work, this plugin requires the following plugins to be installed: CFG, Saisies, SPIP Bonux, Diogène, swfupload, jqueryui.
Other plugins can be used alongside it to extend its capabilities: Ancres douces, Légendes, photo_infos, spipmotion (...)
On other sites (8588)
- Android recording video with overlay view [way 2]
2 March 2016, by t0m
I am building an Android app that captures video with overlay views. I have tried two approaches (1 and 2):
1. Via SurfaceView and JavaCV with FFmpeg.
2. Via OpenCV and JavaCV with FFmpeg.
3. For API 21+, possibly via MediaProjection.
(The question is split in two because of Stack Overflow's length limit.)
ad 1. Via SurfaceView and JavaCV with FFmpeg: (see the companion question)
ad 2. Via OpenCV and JavaCV with FFmpeg:
OpenCVCameraActivity.java:
import android.app.Activity;
import android.hardware.Camera;
import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;
import android.os.Bundle;
import android.os.Environment;
import android.util.Log;
import android.view.Menu;
import android.view.MenuItem;
import android.view.MotionEvent;
import android.view.SubMenu;
import android.view.SurfaceView;
import android.view.View;
import android.view.WindowManager;
import android.widget.Toast;
import org.bytedeco.javacv.FFmpegFrameRecorder;
import org.bytedeco.javacv.Frame;
import org.opencv.android.BaseLoaderCallback;
import org.opencv.android.CameraBridgeViewBase;
import org.opencv.android.LoaderCallbackInterface;
import org.opencv.android.OpenCVLoader;
import org.opencv.core.Mat;
import java.io.File;
import java.nio.ByteBuffer;
import java.nio.ShortBuffer;
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.List;
import java.util.ListIterator;
@SuppressWarnings("ALL")
public class OpenCVCameraActivity extends Activity implements
CameraBridgeViewBase.CvCameraViewListener2,
View.OnTouchListener {
//name of activity, for DEBUGGING
private static final String TAG = OpenCVCameraActivity.class.getSimpleName();
private OpenCVCameraPreview mOpenCvCameraView;
private List<Camera.Size> mResolutionList;
private MenuItem[] mEffectMenuItems;
private SubMenu mColorEffectsMenu;
private MenuItem[] mResolutionMenuItems;
private SubMenu mResolutionMenu;
private static long frameCounter = 0;
long startTime = 0;
private Mat edgesMat;
boolean recording = false;
private int sampleAudioRateInHz = 44100;
private int imageWidth = 1280;
private int imageHeight = 720;
private int frameRate = 30;
private Frame yuvImage = null;
private File ffmpeg_link;
private FFmpegFrameRecorder recorder;
/*audio data getting thread */
private AudioRecord audioRecord;
private AudioRecordRunnable audioRecordRunnable;
private Thread audioThread;
volatile boolean runAudioThread = true;
ShortBuffer[] samples;
private BaseLoaderCallback mLoaderCallback = new BaseLoaderCallback(this) {
@Override
public void onManagerConnected(int status) {
switch (status) {
case LoaderCallbackInterface.SUCCESS:
Log.i(TAG, "OpenCV loaded successfully");
mOpenCvCameraView.enableView();
mOpenCvCameraView.setOnTouchListener(OpenCVCameraActivity.this);
break;
default:
super.onManagerConnected(status);
break;
}
}
};
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
if(Static.DEBUG) Log.i(TAG, "onCreate()");
getWindow().addFlags(WindowManager.LayoutParams.FLAG_KEEP_SCREEN_ON);
Thread.setDefaultUncaughtExceptionHandler(uncaughtExceptionHandler);
try {
setContentView(R.layout.activity_opencv);
mOpenCvCameraView = (OpenCVCameraPreview) findViewById(R.id.openCVCameraPreview);
mOpenCvCameraView.setVisibility(SurfaceView.VISIBLE);
mOpenCvCameraView.setCvCameraViewListener(this);
mOpenCvCameraView.enableFpsMeter();
ffmpeg_link = new File(Environment.getExternalStorageDirectory(), "stream.mp4");
} catch (Exception e){
e.printStackTrace();
}
}
private Thread.UncaughtExceptionHandler uncaughtExceptionHandler =
new Thread.UncaughtExceptionHandler() {
public void uncaughtException(Thread thread, Throwable ex) {
if(Static.DEBUG) Log.e(TAG, "Uncaught exception", ex);
}
};
@Override
protected void onRestart() {
if (Static.DEBUG) Log.i(TAG, "onRestart()");
super.onRestart();
}
@Override
protected void onStart() {
if (Static.DEBUG) Log.i(TAG, "onStart()");
super.onStart();
}
@Override
protected void onResume() {
if (Static.DEBUG) Log.i(TAG, "onResume()");
super.onResume();
if (!OpenCVLoader.initDebug()) {
Log.i(TAG, "Internal OpenCV library not found. Using OpenCV Manager for initialization");
OpenCVLoader.initAsync(OpenCVLoader.OPENCV_VERSION_3_1_0, this, mLoaderCallback);
} else {
Log.i(TAG, "OpenCV library found inside package. Using it!");
mLoaderCallback.onManagerConnected(LoaderCallbackInterface.SUCCESS);
}
}
@Override
public boolean onCreateOptionsMenu(Menu menu) {
if (Static.DEBUG) Log.i(TAG, "onCreateOptionsMenu()");
super.onCreateOptionsMenu(menu);
List<String> effects = mOpenCvCameraView.getEffectList();
if (effects == null) {
Log.e(TAG, "Color effects are not supported by device!");
return true;
}
mColorEffectsMenu = menu.addSubMenu("Color Effect");
mEffectMenuItems = new MenuItem[effects.size()];
int idx = 0;
ListIterator<String> effectItr = effects.listIterator();
while(effectItr.hasNext()) {
String element = effectItr.next();
mEffectMenuItems[idx] = mColorEffectsMenu.add(1, idx, Menu.NONE, element);
idx++;
}
mResolutionMenu = menu.addSubMenu("Resolution");
mResolutionList = mOpenCvCameraView.getResolutionList();
mResolutionMenuItems = new MenuItem[mResolutionList.size()];
ListIterator<Camera.Size> resolutionItr = mResolutionList.listIterator();
idx = 0;
while(resolutionItr.hasNext()) {
Camera.Size element = resolutionItr.next();
mResolutionMenuItems[idx] = mResolutionMenu.add(2, idx, Menu.NONE,
element.width + "x" + element.height);
idx++;
}
return true;
}
@Override
protected void onPause() {
if (Static.DEBUG) Log.i(TAG, "onPause()");
super.onPause();
if (mOpenCvCameraView != null)
mOpenCvCameraView.disableView();
}
@Override
protected void onStop() {
if (Static.DEBUG) Log.i(TAG, "onStop()");
super.onStop();
}
@Override
protected void onDestroy() {
if (Static.DEBUG) Log.i(TAG, "onDestroy()");
super.onDestroy();
if (mOpenCvCameraView != null)
mOpenCvCameraView.disableView();
}
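// Called for every camera preview frame; while recording, each frame is also
// handed to the FFmpeg recorder.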
@Override
public Mat onCameraFrame(CameraBridgeViewBase.CvCameraViewFrame inputFrame) {
++frameCounter;
//Log.i(TAG, "Frame number: "+frameCounter);
final Mat rgba = inputFrame.rgba();
//Core.flip(rgba, rgba, 1);
/*if(Static.DEBUG) Log.i(TAG,"rgba.total(): "+rgba.total());
if(Static.DEBUG) Log.i(TAG,"rgba.channels(): " +rgba.channels());*/
byte[] data = new byte[(int) (rgba.total() * rgba.channels())];
rgba.get(0, 0, data);
//if(Static.DEBUG) Log.i(TAG,"return_buff: "+return_buff.length);
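// Until audio capture has actually started, keep resetting startTime so that
// video timestamps end up measured from the moment audio recording begins.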
if (audioRecord == null || audioRecord.getRecordingState() != AudioRecord.RECORDSTATE_RECORDING) {
startTime = System.currentTimeMillis();
return rgba;
}
// get video data
if (yuvImage != null && recording) {
ByteBuffer b = (ByteBuffer)yuvImage.image[0].position(0);
b.put(data);
try {
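// FFmpegFrameRecorder timestamps are in microseconds, hence the factor
// of 1000 applied to the elapsed wall-clock milliseconds.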
long t = 1000 * (System.currentTimeMillis() - startTime);
if(Static.DEBUG) Log.i(TAG,"Writing Frame on timestamp: "+t);
if (t > recorder.getTimestamp()) {
recorder.setTimestamp(t);
}
recorder.record(yuvImage);
} catch (FFmpegFrameRecorder.Exception e) {
if(Static.DEBUG) Log.i(TAG,e.getMessage());
e.printStackTrace();
}
}
return rgba;
}
@Override
public void onCameraViewStarted(int width, int height) {
edgesMat = new Mat();
}
@Override
public void onCameraViewStopped() {
if (edgesMat != null)
edgesMat.release();
edgesMat = null;
}
@Override
public boolean onOptionsItemSelected(MenuItem item) {
Log.i(TAG, "called onOptionsItemSelected; selected item: " + item);
if (item.getGroupId() == 1)
{
mOpenCvCameraView.setEffect((String) item.getTitle());
Toast.makeText(this, mOpenCvCameraView.getEffect(), Toast.LENGTH_SHORT).show();
} else if (item.getGroupId() == 2) {
int id = item.getItemId();
Camera.Size resolution = mResolutionList.get(id);
mOpenCvCameraView.setResolution(resolution);
resolution = mOpenCvCameraView.getResolution();
String caption = resolution.width + "x" + resolution.height;
Toast.makeText(this, caption, Toast.LENGTH_SHORT).show();
}
return true;
}
@Override
public boolean onTouch(View v, MotionEvent event) {
Log.i(TAG,"onTouch event");
SimpleDateFormat sdf = new SimpleDateFormat("yyyy-MM-dd_HH-mm-ss");
String currentDateandTime = sdf.format(new Date());
String fileName = Environment.getExternalStorageDirectory().getPath() +
"/sample_picture_" + currentDateandTime + ".jpg";
mOpenCvCameraView.takePicture(fileName);
Toast.makeText(this, fileName + " saved", Toast.LENGTH_SHORT).show();
return false;
}
/**
* Click to ImageButton to start recording.
*/
public void onClickBtnStartRecord2(View v) {
if (Static.DEBUG) Log.i(TAG, "onClickBtnStartRecord()");
if(!recording)
startRecording();
else
stopRecording();
}
private void startRecording() {
if (Static.DEBUG) Log.i(TAG, "startRecording()");
initRecorder();
try {
recorder.start();
startTime = System.currentTimeMillis();
recording = true;
audioThread.start();
if (Static.DEBUG) Log.i(TAG, "startRecording() success");
} catch(FFmpegFrameRecorder.Exception e) {
e.printStackTrace();
}
}
private void stopRecording() {
if (Static.DEBUG) Log.i(TAG, "stopRecording()");
runAudioThread = false;
try {
audioThread.join();
} catch(InterruptedException e) {
e.printStackTrace();
}
audioRecordRunnable = null;
audioThread = null;
if (Static.DEBUG) Log.i(TAG, "stopRecording() 2");
if(recorder != null && recording) {
recording = false;
try {
recorder.stop();
recorder.release();
Log.i(TAG, "Finishing recording, calling stop and release on recorder");
} catch(FFmpegFrameRecorder.Exception e) {
e.printStackTrace();
}
recorder = null;
}
}
//---------------------------------------
// initialize ffmpeg_recorder
//---------------------------------------
private void initRecorder() {
Log.i(TAG, "init recorder");
try {
if (yuvImage == null) {
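// Despite its name, this Frame holds 4-channel RGBA pixels copied from
// the OpenCV Mat, not YUV data.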
yuvImage = new Frame(imageWidth, imageHeight, Frame.DEPTH_UBYTE, 4);
Log.i(TAG, "create yuvImage");
}
Log.i(TAG, "ffmpeg_url: " + ffmpeg_link.getAbsolutePath());
//Log.i(TAG, "ffmpeg_url: " + ffmpeg_link.exists());
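// The fourth constructor argument is the number of audio channels (1 = mono).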
recorder = new FFmpegFrameRecorder(ffmpeg_link, imageWidth, imageHeight, 1);
recorder.setFormat("mp4");
recorder.setSampleRate(sampleAudioRateInHz);
// Set in the surface changed method
recorder.setFrameRate(frameRate);
audioRecordRunnable = new AudioRecordRunnable();
audioThread = new Thread(audioRecordRunnable);
runAudioThread = true;
Log.i(TAG, "recorder initialize success");
} catch (Exception e){
e.printStackTrace();
}
}
//---------------------------------------------
// audio thread, gets and encodes audio data
//---------------------------------------------
class AudioRecordRunnable implements Runnable {
@Override
public void run() {
android.os.Process.setThreadPriority(android.os.Process.THREAD_PRIORITY_URGENT_AUDIO);
// Audio
int bufferSize;
ShortBuffer audioData;
int bufferReadResult;
bufferSize = AudioRecord.getMinBufferSize(sampleAudioRateInHz,
AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
audioRecord = new AudioRecord(MediaRecorder.AudioSource.MIC, sampleAudioRateInHz,
AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, bufferSize);
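// getMinBufferSize() returns a size in bytes while allocate() takes a count of
// shorts, so this buffer is twice the minimum size; a harmless over-allocation.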
audioData = ShortBuffer.allocate(bufferSize);
Log.d(TAG, "audioRecord.startRecording()");
audioRecord.startRecording();
// ffmpeg_audio encoding loop
while(runAudioThread) {
//Log.v(TAG,"recording? " + recording);
bufferReadResult = audioRecord.read(audioData.array(), 0, audioData.capacity());
if(bufferReadResult > 0) {
// Guarded by the check above: a negative (error) read result would make limit() throw.
audioData.limit(bufferReadResult);
Log.v(TAG, "bufferReadResult: " + bufferReadResult);
// If "recording" isn't true when start this thread, it never get's set according to this if statement...!!!
// Why? Good question...
if(recording) {
try {
recorder.recordSamples(audioData);
//Log.v(TAG,"recording " + 1024*i + " to " + 1024*i+1024);
} catch(FFmpegFrameRecorder.Exception e) {
Log.v(TAG, e.getMessage());
e.printStackTrace();
}
}
}
}
Log.v(TAG, "AudioThread Finished, release audioRecord");
// encoding finish, release recorder
if(audioRecord != null) {
audioRecord.stop();
audioRecord.release();
audioRecord = null;
Log.v(TAG, "audioRecord released");
}
}
}
}
OpenCVCameraPreview.java:
import android.content.Context;
import android.hardware.Camera;
import android.util.AttributeSet;
import android.util.Log;
import org.opencv.android.JavaCameraView;
import java.io.FileOutputStream;
import java.util.List;
public class OpenCVCameraPreview extends JavaCameraView implements Camera.PictureCallback {
private static final String TAG = OpenCVCameraPreview.class.getSimpleName();
private String mPictureFileName;
public OpenCVCameraPreview(Context context, AttributeSet attrs) {
super(context, attrs);
}
public List<String> getEffectList() {
return mCamera.getParameters().getSupportedColorEffects();
}
public boolean isEffectSupported() {
return (mCamera.getParameters().getColorEffect() != null);
}
public String getEffect() {
return mCamera.getParameters().getColorEffect();
}
public void setEffect(String effect) {
Camera.Parameters params = mCamera.getParameters();
params.setColorEffect(effect);
mCamera.setParameters(params);
}
public List<Camera.Size> getResolutionList() {
return mCamera.getParameters().getSupportedPreviewSizes();
}
public void setResolution(Camera.Size resolution) {
disconnectCamera();
mMaxHeight = resolution.height;
mMaxWidth = resolution.width;
connectCamera(getWidth(), getHeight());
}
public Camera.Size getResolution() {
return mCamera.getParameters().getPreviewSize();
}
public void takePicture(final String fileName) {
Log.i(TAG, "Taking picture");
this.mPictureFileName = fileName;
// Postview and jpeg are sent in the same buffers if the queue is not empty when performing a capture.
// Clear up buffers to avoid mCamera.takePicture to be stuck because of a memory issue
mCamera.setPreviewCallback(null);
// PictureCallback is implemented by the current class
mCamera.takePicture(null, null, this);
}
@Override
public void onPictureTaken(byte[] data, Camera camera) {
Log.i(TAG, "Saving a bitmap to file");
// The camera preview was automatically stopped. Start it again.
mCamera.startPreview();
mCamera.setPreviewCallback(this);
// Write the image in a file (in jpeg format)
try {
FileOutputStream fos = new FileOutputStream(mPictureFileName);
fos.write(data);
fos.close();
} catch (java.io.IOException e) {
Log.e("PictureDemo", "Exception in photoCallback", e);
}
}
}
activity_opencv.xml:
<?xml version="1.0" encoding="utf-8"?>
<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent"
    android:layout_height="match_parent">

    <ImageButton
        android:id="@+id/btnStartRecord2"
        android:layout_width="70dp"
        android:layout_height="70dp"
        android:scaleType="fitXY"
        android:src="@drawable/record_icon"
        android:background="@null"
        android:contentDescription="@string/btnStartRecord"
        android:onClick="onClickBtnStartRecord2"
        android:layout_centerVertical="true"
        android:layout_alignParentRight="true"
        android:layout_alignParentEnd="true"/>

</RelativeLayout>

The overlay views work on screen, but the recorded video does not contain them, and recording through the onCameraFrame method is very slow.
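One likely cause for the missing overlays: views laid over the camera preview are composited by Android's window manager and never enter the camera pipeline, so onCameraFrame only ever sees raw sensor frames. A minimal sketch of one workaround is to burn the overlay into the Mat before it is handed to the recorder; the helper class, text, and coordinates below are hypothetical placeholders, not part of the question's code:

import org.opencv.core.Mat;
import org.opencv.core.Point;
import org.opencv.core.Scalar;
import org.opencv.imgproc.Imgproc;

public final class OverlayRenderer {
    private OverlayRenderer() {}

    // Draws a simple text overlay directly into the RGBA frame so that it
    // becomes part of the recorded pixels. Call from onCameraFrame() before
    // recorder.record().
    public static Mat burnIn(Mat rgba) {
        Imgproc.putText(
                rgba,
                "REC",                        // hypothetical overlay text
                new Point(40, 80),            // bottom-left corner of the text
                Imgproc.FONT_HERSHEY_SIMPLEX, // font face
                2.0,                          // font scale
                new Scalar(255, 0, 0, 255),   // red; the Mat is RGBA
                4);                           // stroke thickness
        return rgba;
    }
}

For arbitrary Views the same idea applies: render the View into a Bitmap once (or whenever it changes), convert it to a Mat, and blend it over each frame. That extra per-frame copy is also a plausible contributor to the reported slowness, since onCameraFrame already copies every RGBA frame into a byte array before encoding.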
- Anomalie #3453 (New): the "change" button for article redirection
19 May 2015, by Eric Camus
Hello,
There is a problem with the "onkeypress" action on the article-redirection field: the "change" button does not appear when the URL is pasted with the mouse or with the CTRL+V shortcut.
In short, why have this behaviour only on this block, when the other blocks above display the button directly?
For my part, I would be in favour of removing this buggy action and doing as the blocks above do: always display the button. In the file "prive/formulaires/rediriger_article.html":
Change lines 15-16 to: <input type="text" class="text" value="#ENV{#GET{name},'http://'}" />
Change former line 22 to:
- Safari on Yosemite doesn't get total media duration from init file of video generated using MP4Box
30 April 2015, by sonam
I am trying to play HTML5 video using Media Source Extensions. I generate DASH fragments using MP4Box: an initialization fragment and the corresponding media fragments.
I create a MediaSource and append the init fragment to it. Once that is done, every browser that supports Media Source Extensions (Chrome, Firefox Developer Edition, Chrome on Android, IE11 on Windows 8) returns the correct media duration. But the same does not work in Safari on Yosemite, even though the spec says Media Source is supported there: in Safari the media duration is only updated as data fragments are appended progressively. How can I make it work in Safari? Is there something missing from the init fragment, and how do I add it?
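For reference, the append flow described above looks roughly like this. A minimal browser-side sketch in TypeScript; the codec string and file names are assumptions, not values taken from the question:

const video = document.querySelector('video') as HTMLVideoElement;
const mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener('sourceopen', async () => {
  // The codec string must match what MP4Box actually produced (assumed here).
  const sb = mediaSource.addSourceBuffer('video/mp4; codecs="avc1.42E01E"');

  // Append the init fragment first; conforming browsers pick the total
  // duration up from its movie header, which is the step that appears to
  // fail in Safari on Yosemite.
  const init = await fetch('init.mp4').then(r => r.arrayBuffer());
  sb.addEventListener('updateend', () => {
    console.log('duration after init append:', mediaSource.duration);
  }, { once: true });
  sb.appendBuffer(init);
});

One avenue worth checking is whether the init segment's mvhd box carries a zero duration, which is common for fragmented output; if so, explicitly assigning mediaSource.duration after sourceopen is a possible workaround, since the duration attribute is writable.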