
Media (91)
-
Richard Stallman and free software
19 October 2011, by
Updated: May 2013
Language: French
Type: Text
-
Stereo master soundtrack
17 October 2011, by
Updated: October 2011
Language: English
Type: Audio
-
Elephants Dream - Cover of the soundtrack
17 October 2011, by
Updated: October 2011
Language: English
Type: Image
-
#7 Ambience
16 October 2011, by
Updated: June 2015
Language: English
Type: Audio
-
#6 Teaser Music
16 October 2011, by
Updated: February 2013
Language: English
Type: Audio
-
#5 End Title
16 October 2011, by
Updated: February 2013
Language: English
Type: Audio
Other articles (112)
-
Automatic MediaSPIP installation script
25 April 2011, by
To work around installation difficulties caused mainly by server-side software dependencies, an all-in-one bash installation script was created to make this step easier on a server running a compatible Linux distribution.
To use it, you need SSH access to your server and a "root" account, which is required to install the dependencies. Contact your hosting provider if you do not have these.
The documentation on using the installation script (...) -
Requesting the creation of a channel
12 March 2010, by
Depending on the platform's configuration, the user may have two different ways of requesting the creation of a channel: the first at registration time, the second after registration, by filling in a request form.
Both methods ask for the same information and work in roughly the same way: the future user fills in a series of form fields that first of all give the administrators information about (...) -
Automatic backup of SPIP channels
1 April 2010, by
When setting up an open platform, it is important for hosts to have fairly regular backups to guard against any potential problem.
This task relies on two SPIP plugins: Saveauto, which makes regular backups of the database as a MySQL dump (usable in phpMyAdmin), and mes_fichiers_2, which creates a zip archive of the site's important data (the documents, the elements (...)
On other sites (12397)
-
Android FFmpeg Video Recording Delete Last Recorded Part
17 April 2015, by user3587194
I'm trying to do exactly what this picture shows.
Anyway, how can I delete part of a video? The code I was testing is on GitHub.
It uses a progress bar that moves while you record, and keeps the recordings in separate segments. What is confusing to me is figuring out where and how to grab each segment so I can decide whether or not to delete it.
@Override
public void onPreviewFrame(byte[] data, Camera camera) {
long frameTimeStamp = 0L;
if (mAudioTimestamp == 0L && firstTime > 0L)
frameTimeStamp = 1000L * (System.currentTimeMillis() - firstTime);
else if (mLastAudioTimestamp == mAudioTimestamp)
frameTimeStamp = mAudioTimestamp + frameTime;
else {
long l2 = (System.nanoTime() - mAudioTimeRecorded) / 1000L;
frameTimeStamp = l2 + mAudioTimestamp;
mLastAudioTimestamp = mAudioTimestamp;
}
synchronized (mVideoRecordLock) {
//Log.e("recorder", "mVideoRecordLock " + mVideoRecordLock);
if (recording && rec && lastSavedframe != null && lastSavedframe.getFrameBytesData() != null && yuvIplImage != null) {
if (isFirstFrame) {
isFirstFrame = false;
firstData = data;
}
totalTime = System.currentTimeMillis() - firstTime - pausedTime - ((long) (1.0 / (double) frameRate) * 1000);
if (lastSavedframe != null && !deleteEnabled) {
deleteEnabled = true;
deleteBtn.setVisibility(View.VISIBLE);
cancelBtn.setVisibility(View.GONE);
}
if (!nextEnabled && totalTime >= recordingChangeTime) {
Log.e("recording", "totalTime >= recordingChangeTime " + totalTime + " " + recordingChangeTime);
nextEnabled = true;
nextBtn.setVisibility(View.VISIBLE);
}
if (nextEnabled && totalTime >= recordingMinimumTime) {
mHandler.sendEmptyMessage(5);
}
if (currentRecorderState == RecorderState.PRESS && totalTime >= recordingChangeTime) {
currentRecorderState = RecorderState.LOOSEN;
mHandler.sendEmptyMessage(2);
}
mVideoTimestamp += frameTime;
if (lastSavedframe.getTimeStamp() > mVideoTimestamp)
mVideoTimestamp = lastSavedframe.getTimeStamp();
try {
yuvIplImage.getByteBuffer().put(lastSavedframe.getFrameBytesData());
videoRecorder.setTimestamp(lastSavedframe.getTimeStamp());
videoRecorder.record(yuvIplImage);
} catch (com.googlecode.javacv.FrameRecorder.Exception e) {
e.printStackTrace();
}
}
byte[] tempData = rotateYUV420Degree90(data, previewWidth, previewHeight);
if (cameraSelection == 1)
tempData = rotateYUV420Degree270(data, previewWidth, previewHeight);
lastSavedframe = new SavedFrames(tempData, frameTimeStamp);
//Log.e("recorder", "lastSavedframe " + lastSavedframe);
}
}
}
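The question above asks where to grab each segment in order to delete it. One possible approach, sketched below and not taken from the linked project (the class name and file layout are hypothetical), is to write every press-and-hold into its own temporary file and keep the files on a stack, so that "delete last recorded part" amounts to deleting the newest file:

```java
import java.io.File;
import java.io.IOException;
import java.util.ArrayDeque;
import java.util.Deque;

// Hypothetical helper, not part of the project above: one temp file
// per press-and-hold, tracked on a stack so the newest segment can
// be discarded independently of the others.
public class SegmentStack {
    private final Deque<File> segments = new ArrayDeque<>();

    // Call when a new press starts: create and remember a new segment file.
    public File startSegment(File tempDir) throws IOException {
        File segment = File.createTempFile("segment_", ".mp4", tempDir);
        segments.push(segment);
        return segment;
    }

    // Call when the delete button is confirmed: drop the newest segment.
    public boolean deleteLastSegment() {
        File last = segments.poll();
        return last != null && last.delete();
    }

    public int segmentCount() {
        return segments.size();
    }
}
```

In the recorder above, `startSegment` would be called roughly where `isFirstFrame` is set, and `deleteLastSegment` from the handler behind `deleteBtn`; the remaining segment files can then be concatenated into the final video.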
public class Util {
public static ContentValues videoContentValues = null;
public static String getRecordingTimeFromMillis(long millis) {
String strRecordingTime = null;
int seconds = (int) (millis / 1000);
int minutes = seconds / 60;
int hours = minutes / 60;
if (hours >= 0 && hours < 10)
strRecordingTime = "0" + hours + ":";
else
strRecordingTime = hours + ":";
if (hours > 0)
minutes = minutes % 60;
if (minutes >= 0 && minutes < 10)
strRecordingTime += "0" + minutes + ":";
else
strRecordingTime += minutes + ":";
seconds = seconds % 60;
if (seconds >= 0 && seconds < 10)
strRecordingTime += "0" + seconds ;
else
strRecordingTime += seconds ;
return strRecordingTime;
}
public static int determineDisplayOrientation(Activity activity, int defaultCameraId) {
int displayOrientation = 0;
if (Build.VERSION.SDK_INT > Build.VERSION_CODES.FROYO) {
CameraInfo cameraInfo = new CameraInfo();
Camera.getCameraInfo(defaultCameraId, cameraInfo);
int degrees = getRotationAngle(activity);
if (cameraInfo.facing == Camera.CameraInfo.CAMERA_FACING_FRONT) {
displayOrientation = (cameraInfo.orientation + degrees) % 360;
displayOrientation = (360 - displayOrientation) % 360;
} else {
displayOrientation = (cameraInfo.orientation - degrees + 360) % 360;
}
}
return displayOrientation;
}
public static int getRotationAngle(Activity activity) {
int rotation = activity.getWindowManager().getDefaultDisplay().getRotation();
int degrees = 0;
switch (rotation) {
case Surface.ROTATION_0:
degrees = 0;
break;
case Surface.ROTATION_90:
degrees = 90;
break;
case Surface.ROTATION_180:
degrees = 180;
break;
case Surface.ROTATION_270:
degrees = 270;
break;
}
return degrees;
}
public static int getRotationAngle(int rotation) {
int degrees = 0;
switch (rotation) {
case Surface.ROTATION_0:
degrees = 0;
break;
case Surface.ROTATION_90:
degrees = 90;
break;
case Surface.ROTATION_180:
degrees = 180;
break;
case Surface.ROTATION_270:
degrees = 270;
break;
}
return degrees;
}
public static String createImagePath(Context context){
long dateTaken = System.currentTimeMillis();
String title = Constants.FILE_START_NAME + dateTaken;
String filename = title + Constants.IMAGE_EXTENSION;
String dirPath = Environment.getExternalStorageDirectory()+"/Android/data/" + context.getPackageName()+"/video";
File file = new File(dirPath);
if(!file.exists() || !file.isDirectory())
file.mkdirs();
String filePath = dirPath + "/" + filename;
return filePath;
}
public static String createFinalPath(Context context) {
Log.e("util", "createFinalPath");
long dateTaken = System.currentTimeMillis();
String title = Constants.FILE_START_NAME + dateTaken;
String filename = title + Constants.VIDEO_EXTENSION;
String filePath = genrateFilePath(context, String.valueOf(dateTaken), true, null);
ContentValues values = new ContentValues(7);
values.put(Video.Media.TITLE, title);
values.put(Video.Media.DISPLAY_NAME, filename);
values.put(Video.Media.DATE_TAKEN, dateTaken);
values.put(Video.Media.MIME_TYPE, "video/3gpp");
values.put(Video.Media.DATA, filePath);
videoContentValues = values;
Log.e("util", "filePath " + filePath);
return filePath;
}
public static void deleteTempVideo(Context context) {
final String filePath = Environment.getExternalStorageDirectory() + "/Android/data/" + context.getPackageName() + "/video";
new Thread(new Runnable() {
@Override
public void run() {
File file = new File(filePath);
if (file != null && file.isDirectory()) {
Log.e("util", "file.isDirectory() " + file.isDirectory());
for (File file2 : file.listFiles()) {
Log.e("util", "file.listFiles() " + file.listFiles());
file2.delete();
}
}
}
}).start();
}
private static String genrateFilePath(Context context,String uniqueId, boolean isFinalPath, File tempFolderPath) {
String fileName = Constants.FILE_START_NAME + uniqueId + Constants.VIDEO_EXTENSION;
String dirPath = Environment.getExternalStorageDirectory() + "/Android/data/" + context.getPackageName() + "/video";
if (isFinalPath) {
File file = new File(dirPath);
if (!file.exists() || !file.isDirectory())
file.mkdirs();
} else
dirPath = tempFolderPath.getAbsolutePath();
String filePath = dirPath + "/" + fileName;
return filePath;
}
public static String createTempPath(Context context, File tempFolderPath ) {
long dateTaken = System.currentTimeMillis();
String filePath = genrateFilePath(context,String.valueOf(dateTaken), false, tempFolderPath);
return filePath;
}
public static File getTempFolderPath() {
File tempFolder = new File(Constants.TEMP_FOLDER_PATH +"_" +System.currentTimeMillis());
return tempFolder;
}
public static List getResolutionList(Camera camera) {
Parameters parameters = camera.getParameters();
List previewSizes = parameters.getSupportedPreviewSizes();
return previewSizes;
}
public static RecorderParameters getRecorderParameter(int currentResolution) {
RecorderParameters parameters = new RecorderParameters();
if (currentResolution == Constants.RESOLUTION_HIGH_VALUE) {
parameters.setAudioBitrate(128000);
parameters.setVideoQuality(0);
} else if (currentResolution == Constants.RESOLUTION_MEDIUM_VALUE) {
parameters.setAudioBitrate(128000);
parameters.setVideoQuality(5);
} else if (currentResolution == Constants.RESOLUTION_LOW_VALUE) {
parameters.setAudioBitrate(96000);
parameters.setVideoQuality(20);
}
return parameters;
}
public static int calculateMargin(int previewWidth, int screenWidth) {
int margin = 0;
if (previewWidth <= Constants.RESOLUTION_LOW) {
margin = (int) (screenWidth*0.12);
} else if (previewWidth > Constants.RESOLUTION_LOW && previewWidth <= Constants.RESOLUTION_MEDIUM) {
margin = (int) (screenWidth*0.08);
} else if (previewWidth > Constants.RESOLUTION_MEDIUM && previewWidth <= Constants.RESOLUTION_HIGH) {
margin = (int) (screenWidth*0.08);
}
return margin;
}
public static int setSelectedResolution(int previewHeight) {
int selectedResolution = 0;
if(previewHeight <= Constants.RESOLUTION_LOW) {
selectedResolution = 0;
} else if (previewHeight > Constants.RESOLUTION_LOW && previewHeight <= Constants.RESOLUTION_MEDIUM) {
selectedResolution = 1;
} else if (previewHeight > Constants.RESOLUTION_MEDIUM && previewHeight <= Constants.RESOLUTION_HIGH) {
selectedResolution = 2;
}
return selectedResolution;
}
public static class ResolutionComparator implements Comparator {
@Override
public int compare(Camera.Size size1, Camera.Size size2) {
if(size1.height != size2.height)
return size1.height -size2.height;
else
return size1.width - size2.width;
}
}
public static void concatenateMultipleFiles(String inpath, String outpath)
{
File Folder = new File(inpath);
File files[];
files = Folder.listFiles();
if(files.length>0)
{
for(int i = 0;ilibencoding.so";
}
private static HashMap getMetaData()
{
HashMap localHashMap = new HashMap();
localHashMap.put("creation_time", new SimpleDateFormat("yyyy_MM_dd_HH_mm_ss_SSSZ").format(new Date()));
return localHashMap;
}
public static int getTimeStampInNsFromSampleCounted(int paramInt) {
return (int)(paramInt / 0.0441D);
}
/*public static void saveReceivedFrame(SavedFrames frame) {
File cachePath = new File(frame.getCachePath());
BufferedOutputStream bos;
try {
bos = new BufferedOutputStream(new FileOutputStream(cachePath));
if (bos != null) {
bos.write(frame.getFrameBytesData());
bos.flush();
bos.close();
}
} catch (FileNotFoundException e) {
e.printStackTrace();
cachePath = null;
} catch (IOException e) {
e.printStackTrace();
cachePath = null;
}
}*/
public static Toast showToast(Context context, String textMessage, int timeDuration) {
if (null == context) {
return null;
}
textMessage = (null == textMessage ? "Oops! " : textMessage.trim());
Toast t = Toast.makeText(context, textMessage, timeDuration);
t.show();
return t;
}
public static void showDialog(Context context, String title, String content, int type, final Handler handler) {
final Dialog dialog = new Dialog(context, R.style.Dialog_loading);
dialog.setCancelable(true);
LayoutInflater inflater = LayoutInflater.from(context);
View view = inflater.inflate(R.layout.global_dialog_tpl, null);
Button confirmButton = (Button) view.findViewById(R.id.setting_account_bind_confirm);
Button cancelButton = (Button) view.findViewById(R.id.setting_account_bind_cancel);
TextView dialogTitle = (TextView) view.findViewById(R.id.global_dialog_title);
View line_hori_center = view.findViewById(R.id.line_hori_center);
confirmButton.setVisibility(View.GONE);
line_hori_center.setVisibility(View.GONE);
TextView textView = (TextView) view.findViewById(R.id.setting_account_bind_text);
Window dialogWindow = dialog.getWindow();
WindowManager.LayoutParams lp = dialogWindow.getAttributes();
lp.width = (int) (context.getResources().getDisplayMetrics().density*288);
dialogWindow.setAttributes(lp);
if(type != 1 && type != 2){
type = 1;
}
dialogTitle.setText(title);
textView.setText(content);
if(type == 1 || type == 2){
confirmButton.setVisibility(View.VISIBLE);
confirmButton.setOnClickListener(new OnClickListener(){
@Override
public void onClick(View v){
if(handler != null){
Message msg = handler.obtainMessage();
msg.what = 1;
handler.sendMessage(msg);
}
dialog.dismiss();
}
});
}
// 取消按钮事件
if(type == 2){
cancelButton.setVisibility(View.VISIBLE);
line_hori_center.setVisibility(View.VISIBLE);
cancelButton.setOnClickListener(new OnClickListener(){
@Override
public void onClick(View v){
if(handler != null){
Message msg = handler.obtainMessage();
msg.what = 0;
handler.sendMessage(msg);
}
dialog.dismiss();
}
});
}
dialog.addContentView(view, new LayoutParams(LayoutParams.MATCH_PARENT, LayoutParams.MATCH_PARENT));
dialog.setCancelable(true);// 点击返回键关闭
dialog.setCanceledOnTouchOutside(true);// 点击外部关闭
dialog.show();
}
public IplImage getFrame(String filePath) {
Log.e("util", "getFrame" + filePath);
CvCapture capture = cvCreateFileCapture(filePath);
Log.e("util", "capture " + capture);
IplImage image = cvQueryFrame(capture);
Log.e("util", "image " + image);
return image;
} -
Send info from ffmpeg to text file & then parsing into a new file
26 October 2023, by Preston
I am trying to scan a directory of .mkv files, output the volume information to out.txt, and then parse out.txt for max_volume and send it to log.txt using


for %i in (*.mkv) do ffmpeg -i "%i" -af "volumedetect" -vn -sn -dn -f 1>> out.txt 2>&1 | findstr max_volume out.txt > log.txt



but I keep getting the error(s):


Missing argument for option 'f'.
Error splitting the argument list: Invalid argument



when it runs the ffmpeg command.


I do get the out.txt file, but it doesn't contain complete information for each file that's processed. Also, the log.txt file is always 0 bytes, I assume because out.txt never gets to the point of generating the max_volume information for each file before erroring.


I know it's got to be something simple but I can't see it. Can anyone shed a bit of light on what I might be doing wrong here ? Thanks !


-
Horizontally show two videos side by side and export to one video
15 June 2015, by Michael Liberman
What I am trying to achieve:
Programmatically (using some kind of script) combine two different videos into one video where they are shown side by side and the sound from both is played simultaneously. I found at least two ways to display two videos side by side, but each one has a different problem that makes it hard to complete the process:
-
Avisynth and avs2avi:
Avisynth really makes the process easy. With a simple Avisynth script like this you should be able to play two videos side by side:
video1=AVISource("d:\file1.avi")
video2=AVISource("d:\file2.avi")
StackHorizontal(video1,video2)
The creation of a new output file can be done with avs2avi.
The problem here is that I am not able to play even a single avi file with Avisynth, because I get an "AviSource couldn't locate a decompressor for fourcc xvid" message. I googled for solutions to that problem but nothing helped. GSpot says that I have all the needed codecs, and it seems there is nothing I can do. Because I cannot get real videos to run, I don't know whether the final video plays the sound from both videos. Avisynth installed correctly and I am able to run the following script: StackHorizontal(version, version). -
ffmpeg
Works like a charm with two videos and the final file, BUT I only get the sound from the first video, not both. I found out that the solution to add the sound from the second video would be to use libfaac on the command line, like this:
ffmpeg.exe -i video1.mp4 -vf "[in] scale=iw/2:ih/2, pad=2*iw:ih [left]; movie=video2.mp4, scale=iw/3:ih/3 [right]; [left][right] overlay=main_w/2:0 [out]" -c:a libfaac -b:v 768k Output.mp4
but I always get an error that the encoder libfaac cannot be found. I downloaded libfaac.dll, but still no result.
Is there a solution to any of these problems? Is there another way to programmatically make one video out of two videos played side by side? Thanks in advance.
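For reference (not from the original post): with a current ffmpeg build, both problems described above can be sidestepped with the `hstack` and `amix` filters, using the built-in AAC encoder instead of libfaac. A sketch, assuming two same-height inputs that both have audio:

```
ffmpeg -i video1.mp4 -i video2.mp4 \
  -filter_complex "[0:v][1:v]hstack=inputs=2[v];[0:a][1:a]amix=inputs=2[a]" \
  -map "[v]" -map "[a]" -c:v libx264 -c:a aac output.mp4
```

If the inputs differ in height, each video stream would need a `scale` step before `hstack`.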
-