
Media (1)
-
Carte de Schillerkiez
13 May 2011, by
Updated: September 2011
Language: English
Type: Text
Other articles (104)
-
MediaSPIP 0.1 Beta version
25 April 2011, by — MediaSPIP 0.1 beta is the first version of MediaSPIP proclaimed as "usable".
The zip file provided here only contains the sources of MediaSPIP in its standalone version.
To get a working installation, you must manually install all the software dependencies on the server.
If you want to use this archive for an installation in "farm mode", you will also need to carry out other manual (...)
-
Websites made with MediaSPIP
2 May 2011, by — This page lists some websites based on MediaSPIP.
-
Creating farms of unique websites
13 April 2011, by — MediaSPIP platforms can be installed as a farm, with a single "core" hosted on a dedicated server and used by multiple websites.
This allows (among other things): implementation costs to be shared between several different projects / individuals; rapid deployment of multiple unique sites; creation of groups of like-minded sites, making it possible to browse media in a more controlled and selective environment than the major "open" (...)
On other sites (14069)
-
SegmentedIndexBox (SIDX) not generated when using WEBM over DASH
11 July 2014, by Flock Dawson — I'm trying to get the Industry Format DASH player to work with WebM audio/video files. However, I keep running into the same error, and Google doesn't seem to give much help.
To start with, I created different streams of the same file (different resolutions and bitrates) using this tutorial: https://developer.mozilla.org/en-US/docs/Web/HTML/DASH_Adaptive_Streaming_for_HTML_5_Video
Then, I downloaded the Industry Format DASH player (http://dashif.org/software/) and pointed it at the DASH manifest I created. When I try to play the video in Chrome, I get the following log:
Parsing complete: ( xml2json: 3ms, objectiron: 2ms, total: 0.005s) dash.all.js:3
Manifest has loaded. dash.all.js:3
MediaSource is open! dash.all.js:3
Event {clipboardData: undefined, path: NodeList[0], cancelBubble: false, returnValue: true, srcElement: MediaSource…}
dash.all.js:3
Video codec: video/webm;codecs="vp8" dash.all.js:3
No text tracks. dash.all.js:3
Audio codec: audio/webm;codecs="vorbis" dash.all.js:3
Duration successfully set to: 27.2 dash.all.js:3
Perform SIDX load: https://*****/street_orig_125k_final.webm dash.all.js:3
Perform SIDX load: https://*****/street_audio_final.webm dash.all.js:3
Uncaught RangeError: Offset is outside the bounds of the DataView
From this log, I gathered that the manifest is fetched and processed correctly, but something goes wrong when trying to process the SIDX (SegmentIndexBox). I tried another (third-party) source, which works perfectly. I analyzed the response returned by the server when fetching the SIDX, and when converted to a readable representation, the text 'Dsidx' can be found in it. So I analyzed the WebM file I provide (hexdump and grep), but I cannot find such a SIDX. My conclusion is that the SIDX is never added to the WebM file.
From the tutorial I used, I guess the generation of the SIDX is handled by the samplemuxer command, which does not offer any additional parameters. Does anyone have more experience with generating this SIDX?
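One point worth checking: WebM has no MP4-style SIDX box at all. As I understand it, the WebM DASH guidelines use the Matroska Cues element (EBML ID 0x1C53BB75) as the segment index, and that is what a DASH player's "SIDX load" step actually parses for WebM content. So the thing to grep for in the file is the Cues ID, not "sidx". A rough diagnostic sketch (a plain byte scan, not a real EBML parser, so treat a hit only as a hint):

```python
# Rough diagnostic: report whether a WebM file appears to contain a Cues
# element. WebM/Matroska identifies Cues by the 4-byte EBML ID 0x1C53BB75;
# a DASH-ready WebM file needs it as its segment index (the role the SIDX
# box plays in MP4).

CUES_ID = bytes([0x1C, 0x53, 0xBB, 0x75])

def has_cues(data: bytes) -> bool:
    """Return True if the Cues EBML ID appears anywhere in the buffer."""
    return CUES_ID in data

def file_has_cues(path: str) -> bool:
    """Scan a file on disk for the Cues EBML ID."""
    with open(path, "rb") as f:
        return has_cues(f.read())
```

If the Cues element is missing, remuxing without re-encoding may add it; on ffmpeg builds where the webm muxer exposes a `dash` option (check `ffmpeg -h muxer=webm` on your build), something like `ffmpeg -i in.webm -c copy -f webm -dash 1 out.webm` is the usual route.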
-
Framerate live streaming webm with dash and ffmpeg
14 May 2016, by Jones — I am streaming live video with ffmpeg and dash.js using these instructions. It works well, except that the video plays back at too high a framerate. No framerate is specified in the manifest.
Creating the Chunks:
SET VP9_LIVE_PARAMS=-speed 6 -threads 8 -static-thresh 0 -max-intra-rate 300 -deadline realtime -lag-in-frames 0 -error-resilient 1
ffmpeg -re -r 25 -i tcp://localhost:8891 ^
-map 0:0 ^
-pix_fmt yuv420p ^
-c:v libvpx-vp9 ^
-s 800x600 -keyint_min 25 -g 25 %VP9_LIVE_PARAMS% ^
-f webm_chunk ^
-header "webm_live/glass_360.hdr" ^
-chunk_start_index 1 ^
webm_live\glass_360_%%d.chk
Creating the Manifest:
ffmpeg ^
-f webm_dash_manifest -live 1 ^
-r 25 ^
-i webm_live/glass_360.hdr ^
-c copy ^
-map 0 ^
-r 25 ^
-framerate 25 ^
-f webm_dash_manifest -live 1 ^
-adaptation_sets "id=0,streams=0" ^
-chunk_start_index 1 ^
-chunk_duration_ms 1000 ^
-time_shift_buffer_depth 7200 ^
-minimum_update_period 7200 ^
webm_live/glass_live_manifest.mpd
Manifest:
<?xml version="1.0" encoding="UTF-8"?>
<MPD xmlns="urn:mpeg:DASH:schema:MPD:2011" type="dynamic" minBufferTime="PT1S" profiles="urn:mpeg:dash:profile:isoff-live:2011" availabilityStartTime="2016-03-30T13:02:53Z" timeShiftBufferDepth="PT7200S" minimumUpdatePeriod="PT7200S">
  <Period start="PT0S">
    <AdaptationSet mimeType="video/webm" codecs="vp9" bitstreamSwitching="true" subsegmentAlignment="true" subsegmentStartsWithSAP="1">
      <ContentComponent type="video"/>
      <SegmentTemplate timescale="1000" duration="1000" media="glass_$RepresentationID$_$Number$.chk" startNumber="1" initialization="glass_$RepresentationID$.hdr"/>
      <Representation bandwidth="1000000" width="800" height="600" codecs="vp9" mimeType="video/webm" startWithSAP="1"/>
    </AdaptationSet>
  </Period>
</MPD>
Any ideas how to fix this?
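For reference, MPEG-DASH lets the MPD declare the frame rate explicitly via the `frameRate` attribute on an AdaptationSet or Representation; when the muxer does not emit it, it can be added by hand and a conforming player should respect it. A hand-edited sketch of the Representation element for the 25 fps stream above (attribute casing follows the DASH schema; the other values are copied from the generated manifest):

```xml
<Representation bandwidth="1000000" width="800" height="600"
                frameRate="25" codecs="vp9" mimeType="video/webm"
                startWithSAP="1"/>
```

This only tells the player the nominal rate; if the chunks themselves carry wrong timestamps, the fix has to happen at muxing time instead.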
-
ffmpeg permission denied in Android Studio?
16 April 2023, by Rakesh Saini — I want to merge an audio track into a video using FFmpeg from the dependency below, but I am getting a permission denied error.


Cannot run program "/data/user/0/com.example.mytestapp/files/ffmpeg": error=13, Permission denied



The dependency I am using:


implementation 'com.writingminds:FFmpegAndroid:0.3.2'



I am using the following code to merge an audio track into the video. I added the dependency and provided the path of the video, and I also placed the ffmpeg binary into the assets folder.


package com.example.mytestapp.Activity;

import androidx.annotation.NonNull;
import androidx.appcompat.app.AppCompatActivity;

import android.Manifest;
import android.content.Context;
import android.content.pm.PackageManager;
import android.content.res.AssetManager;
import android.os.Build;
import android.os.Bundle;
import android.os.Environment;
import android.util.Log;
import android.widget.Toast;

import com.example.mytestapp.R;
import com.github.hiteshsondhi88.libffmpeg.ExecuteBinaryResponseHandler;
import com.github.hiteshsondhi88.libffmpeg.FFmpeg;
import com.github.hiteshsondhi88.libffmpeg.exceptions.FFmpegCommandAlreadyRunningException;

import java.io.File;
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;

public class MainActivity3 extends AppCompatActivity {
    Context context;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main3);
    }

    protected Boolean doInBackground() {
        String ffmpegPath = getApplicationContext().getFilesDir().getAbsolutePath() + "/ffmpeg";
        Log.e("called ", ffmpegPath);

        // Note: the path must not start with a stray space, or ffmpeg will not find the file.
        String vPath = "/storage/emulated/0/DCIM/Camera/VID_20230325_083306.mp4";
        String aPath = Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_DOWNLOADS) + "/Seesekadil.mp3";
        String oPath = "/storage/emulated/0/DCIM/202304_15-223653rangeela.mp4";
        String[] cmd = {"-i", vPath, "-i", aPath, "-shortest", oPath};
        try {
            FFmpeg.getInstance(getApplicationContext()).execute(cmd, new ExecuteBinaryResponseHandler() {
                @Override
                public void onStart() {
                    Log.e("Started", "yes");
                }

                @Override
                public void onProgress(String message) {
                    // do nothing
                }

                @Override
                public void onFailure(String message) {
                    Log.e("Failed ", "yes");
                }

                @Override
                public void onSuccess(String message) {
                    Log.e("Success ", "yes");
                }

                @Override
                public void onFinish() {
                    // do nothing
                }
            });
        } catch (FFmpegCommandAlreadyRunningException e) {
            e.printStackTrace();
        }
        return null;
    }

    @Override
    protected void onResume() {
        super.onResume();
        checkPermission();
        File file = new File(getApplicationContext().getFilesDir(), "ffmpeg");
        if (!file.canExecute()) {
            file.setExecutable(true);
        }
        doInBackground();
    }

    private void checkPermission() {
        if (Build.VERSION.SDK_INT < Build.VERSION_CODES.M) {
            return;
        }
        // Request the permissions if they have not been granted yet.
        if (checkSelfPermission(Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED ||
                checkSelfPermission(Manifest.permission.RECORD_AUDIO) != PackageManager.PERMISSION_GRANTED ||
                checkSelfPermission(Manifest.permission.WRITE_EXTERNAL_STORAGE) != PackageManager.PERMISSION_GRANTED) {
            requestPermissions(new String[]{Manifest.permission.CAMERA, Manifest.permission.RECORD_AUDIO, Manifest.permission.WRITE_EXTERNAL_STORAGE}, 1);
        }
    }

    @Override
    public void onRequestPermissionsResult(int requestCode, @NonNull String[] permissions, @NonNull int[] grantResults) {
        super.onRequestPermissionsResult(requestCode, permissions, grantResults);
        switch (requestCode) {
            case 1:
                if (grantResults.length > 0 && grantResults[0] == PackageManager.PERMISSION_GRANTED) {
                    Toast.makeText(MainActivity3.this, "permission has been granted.", Toast.LENGTH_SHORT).show();
                } else {
                    Toast.makeText(MainActivity3.this, "[WARN] permission is not granted.", Toast.LENGTH_SHORT).show();
                }
                break;
        }
    }
}
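Two things commonly produce error=13 here. First, the binary copied from assets may simply never have been marked executable. Second, and harder to work around: on Android 10 and later, with targetSdkVersion 29 or higher, the system refuses to exec() files in the app's writable data directory at all, so chmod cannot help; native binaries are expected to ship as extracted .so files in the app's native-library directory (the usual fix being a maintained wrapper such as ffmpeg-kit rather than the long-unmaintained writingminds artifact). The permission half can be checked with plain java.io, independent of Android — a minimal sketch, assuming the binary has already been copied into filesDir (class and method names here are illustrative, not from the question):

```java
import java.io.File;
import java.io.IOException;

public class ExecPermission {
    /**
     * Mark a copied binary as executable for the owner and report whether
     * the filesystem actually accepted the change (canExecute() re-checks).
     */
    public static boolean makeExecutable(File bin) {
        if (!bin.exists()) {
            return false;
        }
        if (!bin.canExecute()) {
            // ownerOnly = true: equivalent of chmod u+x, enough for exec()
            bin.setExecutable(true, /* ownerOnly = */ true);
        }
        return bin.canExecute();
    }

    public static void main(String[] args) throws IOException {
        File tmp = File.createTempFile("ffmpeg-test", ".bin");
        tmp.deleteOnExit();
        System.out.println(makeExecutable(tmp));
    }
}
```

If makeExecutable() returns true and exec() still fails with error=13 on a modern device, the targetSdk 29 restriction above is the likely culprit, not the file mode.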