
Media (1)
-
Rennes Emotion Map 2010-11
19 October 2011, by
Updated: July 2013
Language: French
Type: Text
Other articles (87)
-
Improving the base version
13 September 2013
A nicer multiple select
The Chosen plugin improves the usability of multiple-select form fields; compare the two images below.
To use it, activate the Chosen plugin (Site configuration > Plugin management), then configure it (Templates > Chosen): enable Chosen on the public site and specify which form elements to enhance, for example select[multiple] for multiple-select lists (...) -
Emballe médias: what is it for?
4 February 2011, by
This plugin is designed to manage sites that publish documents of all types.
It creates "media" items. A "media" is an article in the SPIP sense, created automatically when a document is uploaded, whether audio, video, image or text; only one document can be linked to a given "media" article; -
The plugin: managing mutualisation
2 March 2010, by
The mutualisation management plugin lets you manage the various mediaspip channels from a master site. Its goal is to provide a pure-SPIP solution to replace the previous one.
Basic installation
Install the SPIP files on the server.
Then add the "mutualisation" plugin at the root of the site, as described here.
Customise the central mes_options.php file as you wish. As an example, here is the one used by the mediaspip.net platform:
<?php (...)
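The three steps above can be sketched as shell commands. Every path here is purely illustrative; the actual plugin source is the one described in the linked instructions.

```shell
# 1. Install the SPIP files on the server (hypothetical docroot)
rsync -a spip/ /var/www/mediaspip/

# 2. Add the "mutualisation" plugin at the root of the site
cp -r mutualisation/ /var/www/mediaspip/mutualisation/

# 3. Customise the central options file
$EDITOR /var/www/mediaspip/mes_options.php
```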
On other sites (7367)
-
Video too fast FFmpeg
22 November 2012, by Spamdark
I am having an issue with ffmpeg again. I'm a newbie with ffmpeg, and I can't find a good, up-to-date tutorial...
This time, when I play a video with ffmpeg, it plays too fast: ffmpeg is ignoring the FPS. I don't want to handle that with a thread sleep, because the videos have different FPS values.
I created a thread; you can find the loop here:
AVPacket framepacket;
while (av_read_frame(formatContext, &framepacket) >= 0) {
    pausecontrol.lock();
    // Is it a video or an audio frame?
    if (framepacket.stream_index == gotVideoCodec) {
        int framereaded;
        // Video? Ok
        avcodec_decode_video2(videoCodecContext, videoFrame, &framereaded, &framepacket);
        // Yeah, did we get it?
        if (framereaded && doit) {
            AVRational millisecondbase = {1, 1000};
            int f_number = framepacket.dts;
            int f_time = av_rescale_q(framepacket.dts,
                                      formatContext->streams[gotVideoCodec]->time_base,
                                      millisecondbase);
            currentTime = f_time;
            currentFrameNumber = f_number;
            int stWidth = videoCodecContext->width;
            int stHeight = videoCodecContext->height;
            SwsContext *ctx = sws_getContext(stWidth, stHeight, videoCodecContext->pix_fmt,
                                             stWidth, stHeight, PIX_FMT_RGB24,
                                             SWS_BICUBIC, NULL, NULL, NULL);
            if (ctx != 0) {
                sws_scale(ctx, videoFrame->data, videoFrame->linesize, 0,
                          videoCodecContext->height,
                          videoFrameRGB->data, videoFrameRGB->linesize);
                QImage framecapsule = QImage(stWidth, stHeight, QImage::Format_RGB888);
                // Copy row by row (the '<' of this loop was eaten by the page's
                // HTML renderer; reconstructed from context):
                for (int y = 0; y < stHeight; y++)
                    memcpy(framecapsule.scanLine(y),
                           videoFrameRGB->data[0] + y * videoFrameRGB->linesize[0],
                           stWidth * 3);
                emit newFrameReady(framecapsule);
                sws_freeContext(ctx);
            }
        }
    }
    if (framepacket.stream_index == gotAudioCodec) {
        // Audio? Ok
    }
    pausecontrol.unlock();
    av_free_packet(&framepacket);
}
Any idea?
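Reading with av_read_frame and decoding in a tight loop will always run as fast as the disk allows; players normally delay each frame until its presentation time (here, f_time in milliseconds) is due, relative to a wall-clock start. The logic is language-neutral; a minimal sketch in Java with illustrative names:

```java
// Sketch: pace decoded frames by timestamp instead of presenting immediately.
// All names are illustrative; f_time from the loop above would be ptsMs here.
class FramePacer {
    private long playbackStartMs = -1;  // wall-clock time when playback began
    private long firstPtsMs;            // timestamp of the first frame, in ms

    /** Milliseconds to wait before presenting the frame whose timestamp is ptsMs. */
    long delayFor(long ptsMs, long nowMs) {
        if (playbackStartMs < 0) {
            playbackStartMs = nowMs;
            firstPtsMs = ptsMs;
        }
        long dueAt = playbackStartMs + (ptsMs - firstPtsMs);
        return Math.max(0, dueAt - nowMs);
    }
}
```

A fixed sleep would indeed break across files with different FPS, but this delay is derived from each stream's own time_base via av_rescale_q, so it adapts to any frame rate automatically.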
-
ffmpeg + ffserver : "Broken ffmpeg default settings detected"
18 October 2012, by Chris Nolet
I'm just trying to connect ffmpeg to ffserver and stream rawvideo.
I keep getting the error:
broken ffmpeg default settings detected
from libx264, and then:
Error while opening encoder for output stream #0:0 - maybe incorrect parameters such as bit_rate, rate, width or height
from ffmpeg, before it exits. I'm launching ffmpeg with the command:
ffmpeg -f x11grab -s 320x480 -r 10 -i :0.0 -tune zerolatency http://localhost:8090/feed1.ffm
My ffserver.conf file looks like this:
Port 8090
BindAddress 0.0.0.0
MaxHTTPConnections 2000
MaxClients 1000
MaxBandwidth 1000
CustomLog -
NoDaemon
<feed>
ACL allow 127.0.0.1
</feed>
<stream>
Feed feed1.ffm
Format asf
NoAudio
VideoBitRate 128
VideoBufferSize 400
VideoFrameRate 24
VideoSize 320x480
VideoGopSize 12
VideoQMin 1
VideoQMax 31
VideoCodec libx264
</stream>
<stream>
Format status
</stream>

And the full output is:
ffmpeg version N-45614-g364c60b Copyright (c) 2000-2012 the FFmpeg developers
built on Oct 17 2012 04:34:04 with Apple clang version 4.1 (tags/Apple/clang-421.11.65) (based on LLVM 3.1svn)
configuration: --enable-shared --enable-libx264 --enable-libmp3lame --enable-x11grab --enable-gpl --enable-version3 --enable-nonfree --enable-hardcoded-tables --cc=/usr/bin/clang --host-cflags='-Os -w -pipe -march=native -Qunused-arguments -mmacosx-version-min=10.7' --extra-cflags='-x objective-c' --extra-ldflags='-framework Foundation -framework Cocoa -framework CoreServices -framework ApplicationServices -lobjc'
libavutil 51. 76.100 / 51. 76.100
libavcodec 54. 66.100 / 54. 66.100
libavformat 54. 32.101 / 54. 32.101
libavdevice 54. 3.100 / 54. 3.100
libavfilter 3. 19.103 / 3. 19.103
libswscale 2. 1.101 / 2. 1.101
libswresample 0. 16.100 / 0. 16.100
libpostproc 52. 1.100 / 52. 1.100
[x11grab @ 0x7f87dc01e200] device: :0.0 -> display: :0.0 x: 0 y: 0 width: 320 height: 480
[x11grab @ 0x7f87dc01e200] Estimating duration from bitrate, this may be inaccurate
Input #0, x11grab, from ':0.0':
Duration: N/A, start: 1350517708.386699, bitrate: 49152 kb/s
Stream #0:0: Video: rawvideo (BGRA / 0x41524742), bgra, 320x480, 49152 kb/s, 10 tbr, 1000k tbn, 10 tbc
[tcp @ 0x7f87dc804120] TCP connection to localhost:8090 failed: Connection refused
[tcp @ 0x7f87dc804b20] TCP connection to localhost:8090 failed: Connection refused
[libx264 @ 0x7f87dd801000] broken ffmpeg default settings detected
[libx264 @ 0x7f87dd801000] use an encoding preset (e.g. -vpre medium)
[libx264 @ 0x7f87dd801000] preset usage: -vpre <speed> -vpre <profile>
[libx264 @ 0x7f87dd801000] speed presets are listed in x264 --help
[libx264 @ 0x7f87dd801000] profile is optional; x264 defaults to high
Output #0, ffm, to 'http://localhost:8090/feed1.ffm':
Metadata:
creation_time : now
Stream #0:0: Video: h264, yuv420p, 160x128, q=2-31, 128 kb/s, 1000k tbn, 10 tbc
Stream mapping:
Stream #0:0 -> #0:0 (rawvideo -> libx264)
Error while opening encoder for output stream #0:0 - maybe incorrect parameters such as bit_rate, rate, width or height
Any help much appreciated :)
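The libx264 lines are the actual diagnosis: 2012-era libx264 rejects ffmpeg's legacy encoder defaults unless an encoding preset is chosen, exactly as the log suggests with -vpre. On the plain ffmpeg CLI this is the -preset option; a sketch of the adjusted command (note that with an ffm feed, ffserver's stream settings may still override the size and bitrate):

```shell
# Illustrative: give libx264 a speed preset so it accepts the settings
ffmpeg -f x11grab -s 320x480 -r 10 -i :0.0 \
       -vcodec libx264 -preset ultrafast -tune zerolatency \
       http://localhost:8090/feed1.ffm
```

The two "Connection refused" lines earlier in the log are a separate issue: they suggest ffserver was not yet reachable on port 8090 when ffmpeg first tried to connect.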
-
Android FFmpeg Video Recording Delete Last Recorded Part
17 April 2015, by user3587194
I'm trying to do exactly what this picture shows.
Anyway, how can I delete part of a video? The code I was testing is on github.
It uses a progress bar: as you record, the bar advances, and the recording is kept in separate segments. What confuses me is figuring out where and how to grab each segment, so I can decide whether or not to delete it.
@Override
public void onPreviewFrame(byte[] data, Camera camera) {
long frameTimeStamp = 0L;
if (mAudioTimestamp == 0L && firstTime > 0L)
frameTimeStamp = 1000L * (System.currentTimeMillis() - firstTime);
else if (mLastAudioTimestamp == mAudioTimestamp)
frameTimeStamp = mAudioTimestamp + frameTime;
else {
long l2 = (System.nanoTime() - mAudioTimeRecorded) / 1000L;
frameTimeStamp = l2 + mAudioTimestamp;
mLastAudioTimestamp = mAudioTimestamp;
}
synchronized (mVideoRecordLock) {
//Log.e("recorder", "mVideoRecordLock " + mVideoRecordLock);
if (recording && rec && lastSavedframe != null && lastSavedframe.getFrameBytesData() != null && yuvIplImage != null) {
if (isFirstFrame) {
isFirstFrame = false;
firstData = data;
}
totalTime = System.currentTimeMillis() - firstTime - pausedTime - ((long) (1.0 / (double) frameRate) * 1000);
if (lastSavedframe != null && !deleteEnabled) {
deleteEnabled = true;
deleteBtn.setVisibility(View.VISIBLE);
cancelBtn.setVisibility(View.GONE);
}
if (!nextEnabled && totalTime >= recordingChangeTime) {
Log.e("recording", "totalTime >= recordingChangeTime " + totalTime + " " + recordingChangeTime);
nextEnabled = true;
nextBtn.setVisibility(View.VISIBLE);
}
if (nextEnabled && totalTime >= recordingMinimumTime) {
mHandler.sendEmptyMessage(5);
}
if (currentRecorderState == RecorderState.PRESS && totalTime >= recordingChangeTime) {
currentRecorderState = RecorderState.LOOSEN;
mHandler.sendEmptyMessage(2);
}
mVideoTimestamp += frameTime;
if (lastSavedframe.getTimeStamp() > mVideoTimestamp)
mVideoTimestamp = lastSavedframe.getTimeStamp();
try {
yuvIplImage.getByteBuffer().put(lastSavedframe.getFrameBytesData());
videoRecorder.setTimestamp(lastSavedframe.getTimeStamp());
videoRecorder.record(yuvIplImage);
} catch (com.googlecode.javacv.FrameRecorder.Exception e) {
e.printStackTrace();
}
}
byte[] tempData = rotateYUV420Degree90(data, previewWidth, previewHeight);
if (cameraSelection == 1)
tempData = rotateYUV420Degree270(data, previewWidth, previewHeight);
lastSavedframe = new SavedFrames(tempData, frameTimeStamp);
//Log.e("recorder", "lastSavedframe " + lastSavedframe);
}
}
}
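As for where to "grab" each segment: one common pattern in progress-bar recorders is to write every press-and-hold to its own temp file and keep the file paths in LIFO order, so deleting the last part is just popping the newest path and removing that file. A sketch with illustrative names (not taken from the github project in question):

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Sketch: record each press-and-hold to its own file and track the paths in
// LIFO order, so "delete last recorded part" is a stack pop.
class SegmentStack {
    private final Deque<String> segments = new ArrayDeque<>();

    /** Call when a new segment starts recording. */
    void startSegment(String path) {
        segments.push(path);
    }

    /** Drop the most recently recorded segment; returns its path, or null if none. */
    String deleteLast() {
        return segments.poll();
    }

    int count() {
        return segments.size();
    }
}
```

On "delete", remove the returned file from disk and shorten the progress bar by that segment's duration; only on "done" are the remaining files joined (the Util class below already has a concatenateMultipleFiles method intended for that step).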
public class Util {
public static ContentValues videoContentValues = null;
public static String getRecordingTimeFromMillis(long millis) {
String strRecordingTime = null;
int seconds = (int) (millis / 1000);
int minutes = seconds / 60;
int hours = minutes / 60;
if (hours >= 0 && hours < 10)
strRecordingTime = "0" + hours + ":";
else
strRecordingTime = hours + ":";
if (hours > 0)
minutes = minutes % 60;
if (minutes >= 0 && minutes < 10)
strRecordingTime += "0" + minutes + ":";
else
strRecordingTime += minutes + ":";
seconds = seconds % 60;
if (seconds >= 0 && seconds < 10)
strRecordingTime += "0" + seconds ;
else
strRecordingTime += seconds ;
return strRecordingTime;
}
public static int determineDisplayOrientation(Activity activity, int defaultCameraId) {
int displayOrientation = 0;
if (Build.VERSION.SDK_INT > Build.VERSION_CODES.FROYO) {
CameraInfo cameraInfo = new CameraInfo();
Camera.getCameraInfo(defaultCameraId, cameraInfo);
int degrees = getRotationAngle(activity);
if (cameraInfo.facing == Camera.CameraInfo.CAMERA_FACING_FRONT) {
displayOrientation = (cameraInfo.orientation + degrees) % 360;
displayOrientation = (360 - displayOrientation) % 360;
} else {
displayOrientation = (cameraInfo.orientation - degrees + 360) % 360;
}
}
return displayOrientation;
}
public static int getRotationAngle(Activity activity) {
int rotation = activity.getWindowManager().getDefaultDisplay().getRotation();
int degrees = 0;
switch (rotation) {
case Surface.ROTATION_0:
degrees = 0;
break;
case Surface.ROTATION_90:
degrees = 90;
break;
case Surface.ROTATION_180:
degrees = 180;
break;
case Surface.ROTATION_270:
degrees = 270;
break;
}
return degrees;
}
public static int getRotationAngle(int rotation) {
int degrees = 0;
switch (rotation) {
case Surface.ROTATION_0:
degrees = 0;
break;
case Surface.ROTATION_90:
degrees = 90;
break;
case Surface.ROTATION_180:
degrees = 180;
break;
case Surface.ROTATION_270:
degrees = 270;
break;
}
return degrees;
}
public static String createImagePath(Context context){
long dateTaken = System.currentTimeMillis();
String title = Constants.FILE_START_NAME + dateTaken;
String filename = title + Constants.IMAGE_EXTENSION;
String dirPath = Environment.getExternalStorageDirectory()+"/Android/data/" + context.getPackageName()+"/video";
File file = new File(dirPath);
if(!file.exists() || !file.isDirectory())
file.mkdirs();
String filePath = dirPath + "/" + filename;
return filePath;
}
public static String createFinalPath(Context context) {
Log.e("util", "createFinalPath");
long dateTaken = System.currentTimeMillis();
String title = Constants.FILE_START_NAME + dateTaken;
String filename = title + Constants.VIDEO_EXTENSION;
String filePath = genrateFilePath(context, String.valueOf(dateTaken), true, null);
ContentValues values = new ContentValues(7);
values.put(Video.Media.TITLE, title);
values.put(Video.Media.DISPLAY_NAME, filename);
values.put(Video.Media.DATE_TAKEN, dateTaken);
values.put(Video.Media.MIME_TYPE, "video/3gpp");
values.put(Video.Media.DATA, filePath);
videoContentValues = values;
Log.e("util", "filePath " + filePath);
return filePath;
}
public static void deleteTempVideo(Context context) {
final String filePath = Environment.getExternalStorageDirectory() + "/Android/data/" + context.getPackageName() + "/video";
new Thread(new Runnable() {
@Override
public void run() {
File file = new File(filePath);
if (file != null && file.isDirectory()) {
Log.e("util", "file.isDirectory() " + file.isDirectory());
for (File file2 : file.listFiles()) {
Log.e("util", "file.listFiles() " + file.listFiles());
file2.delete();
}
}
}
}).start();
}
private static String genrateFilePath(Context context,String uniqueId, boolean isFinalPath, File tempFolderPath) {
String fileName = Constants.FILE_START_NAME + uniqueId + Constants.VIDEO_EXTENSION;
String dirPath = Environment.getExternalStorageDirectory() + "/Android/data/" + context.getPackageName() + "/video";
if (isFinalPath) {
File file = new File(dirPath);
if (!file.exists() || !file.isDirectory())
file.mkdirs();
} else
dirPath = tempFolderPath.getAbsolutePath();
String filePath = dirPath + "/" + fileName;
return filePath;
}
public static String createTempPath(Context context, File tempFolderPath ) {
long dateTaken = System.currentTimeMillis();
String filePath = genrateFilePath(context,String.valueOf(dateTaken), false, tempFolderPath);
return filePath;
}
public static File getTempFolderPath() {
File tempFolder = new File(Constants.TEMP_FOLDER_PATH +"_" +System.currentTimeMillis());
return tempFolder;
}
public static List getResolutionList(Camera camera) {
Parameters parameters = camera.getParameters();
List previewSizes = parameters.getSupportedPreviewSizes();
return previewSizes;
}
public static RecorderParameters getRecorderParameter(int currentResolution) {
RecorderParameters parameters = new RecorderParameters();
if (currentResolution == Constants.RESOLUTION_HIGH_VALUE) {
parameters.setAudioBitrate(128000);
parameters.setVideoQuality(0);
} else if (currentResolution == Constants.RESOLUTION_MEDIUM_VALUE) {
parameters.setAudioBitrate(128000);
parameters.setVideoQuality(5);
} else if (currentResolution == Constants.RESOLUTION_LOW_VALUE) {
parameters.setAudioBitrate(96000);
parameters.setVideoQuality(20);
}
return parameters;
}
public static int calculateMargin(int previewWidth, int screenWidth) {
int margin = 0;
if (previewWidth <= Constants.RESOLUTION_LOW) {
margin = (int) (screenWidth*0.12);
} else if (previewWidth > Constants.RESOLUTION_LOW && previewWidth <= Constants.RESOLUTION_MEDIUM) {
margin = (int) (screenWidth*0.08);
} else if (previewWidth > Constants.RESOLUTION_MEDIUM && previewWidth <= Constants.RESOLUTION_HIGH) {
margin = (int) (screenWidth*0.08);
}
return margin;
}
public static int setSelectedResolution(int previewHeight) {
int selectedResolution = 0;
if(previewHeight <= Constants.RESOLUTION_LOW) {
selectedResolution = 0;
} else if (previewHeight > Constants.RESOLUTION_LOW && previewHeight <= Constants.RESOLUTION_MEDIUM) {
selectedResolution = 1;
} else if (previewHeight > Constants.RESOLUTION_MEDIUM && previewHeight <= Constants.RESOLUTION_HIGH) {
selectedResolution = 2;
}
return selectedResolution;
}
public static class ResolutionComparator implements Comparator {
@Override
public int compare(Camera.Size size1, Camera.Size size2) {
if(size1.height != size2.height)
return size1.height -size2.height;
else
return size1.width - size2.width;
}
}
public static void concatenateMultipleFiles(String inpath, String outpath)
{
File Folder = new File(inpath);
File files[];
files = Folder.listFiles();
if(files.length>0)
{
for(int i = 0; i < files.length; i++) {
// ... (the loop body is lost in the original post: the page's HTML renderer
// swallowed everything after the '<'; only the fragment "libencoding.so" survives)
}
}
}
private static HashMap getMetaData()
{
HashMap localHashMap = new HashMap();
localHashMap.put("creation_time", new SimpleDateFormat("yyyy_MM_dd_HH_mm_ss_SSSZ").format(new Date()));
return localHashMap;
}
public static int getTimeStampInNsFromSampleCounted(int paramInt) {
// At 44.1 kHz, samples / 0.0441 = samples * (1e6 / 44100), so despite the
// method name this returns microseconds, not nanoseconds.
return (int)(paramInt / 0.0441D);
}
/*public static void saveReceivedFrame(SavedFrames frame) {
File cachePath = new File(frame.getCachePath());
BufferedOutputStream bos;
try {
bos = new BufferedOutputStream(new FileOutputStream(cachePath));
if (bos != null) {
bos.write(frame.getFrameBytesData());
bos.flush();
bos.close();
}
} catch (FileNotFoundException e) {
e.printStackTrace();
cachePath = null;
} catch (IOException e) {
e.printStackTrace();
cachePath = null;
}
}*/
public static Toast showToast(Context context, String textMessage, int timeDuration) {
if (null == context) {
return null;
}
textMessage = (null == textMessage ? "Oops! " : textMessage.trim());
Toast t = Toast.makeText(context, textMessage, timeDuration);
t.show();
return t;
}
public static void showDialog(Context context, String title, String content, int type, final Handler handler) {
final Dialog dialog = new Dialog(context, R.style.Dialog_loading);
dialog.setCancelable(true);
LayoutInflater inflater = LayoutInflater.from(context);
View view = inflater.inflate(R.layout.global_dialog_tpl, null);
Button confirmButton = (Button) view.findViewById(R.id.setting_account_bind_confirm);
Button cancelButton = (Button) view.findViewById(R.id.setting_account_bind_cancel);
TextView dialogTitle = (TextView) view.findViewById(R.id.global_dialog_title);
View line_hori_center = view.findViewById(R.id.line_hori_center);
confirmButton.setVisibility(View.GONE);
line_hori_center.setVisibility(View.GONE);
TextView textView = (TextView) view.findViewById(R.id.setting_account_bind_text);
Window dialogWindow = dialog.getWindow();
WindowManager.LayoutParams lp = dialogWindow.getAttributes();
lp.width = (int) (context.getResources().getDisplayMetrics().density*288);
dialogWindow.setAttributes(lp);
if(type != 1 && type != 2){
type = 1;
}
dialogTitle.setText(title);
textView.setText(content);
if(type == 1 || type == 2){
confirmButton.setVisibility(View.VISIBLE);
confirmButton.setOnClickListener(new OnClickListener(){
@Override
public void onClick(View v){
if(handler != null){
Message msg = handler.obtainMessage();
msg.what = 1;
handler.sendMessage(msg);
}
dialog.dismiss();
}
});
}
// Cancel button handler
if(type == 2){
cancelButton.setVisibility(View.VISIBLE);
line_hori_center.setVisibility(View.VISIBLE);
cancelButton.setOnClickListener(new OnClickListener(){
@Override
public void onClick(View v){
if(handler != null){
Message msg = handler.obtainMessage();
msg.what = 0;
handler.sendMessage(msg);
}
dialog.dismiss();
}
});
}
dialog.addContentView(view, new LayoutParams(LayoutParams.MATCH_PARENT, LayoutParams.MATCH_PARENT));
dialog.setCancelable(true); // close on the back key
dialog.setCanceledOnTouchOutside(true); // close when tapping outside
dialog.show();
}
public IplImage getFrame(String filePath) {
Log.e("util", "getFrame" + filePath);
CvCapture capture = cvCreateFileCapture(filePath);
Log.e("util", "capture " + capture);
IplImage image = cvQueryFrame(capture);
Log.e("util", "image " + image);
return image;
}