
Media (1)
-
Collections - Quick creation form
19 February 2013
Updated: February 2013
Language: French
Type: Image
Other articles (78)
-
The user profile
12 April 2011. Each user has a profile page allowing them to edit their personal information. In the default top-of-page menu, a menu item is created automatically when MediaSPIP is initialized, visible only when the visitor is logged in to the site.
The user can edit their profile from their author page; a "Modifier votre profil" link in the navigation is (...) -
Configuring language support
15 November 2010. Accessing the configuration and adding supported languages
To configure support for new languages, go to the "Administrer" section of the site.
From there, the navigation menu gives access to a "Gestion des langues" section where support for new languages can be enabled.
Each newly added language can be disabled as long as no object has been created in that language. Once one has, the language becomes greyed out in the configuration and (...) -
Publishing on MédiaSpip
13 June 2013. Can I post content from an iPad tablet?
Yes, if your MediaSPIP installation is at version 0.2 or later. If needed, contact the administrator of your MédiaSpip to find out.
On other sites (15412)
-
ffmpeg c/c++ get frame count or timestamp and fps
23 June 2016, by broschb. I am using FFmpeg to decode a video file in C. I am struggling to get either the index of the frame I am currently decoding or the timestamp of that frame. I have read numerous posts that show how to calculate an estimated frame number from the fps and the frame timestamp, but I am not able to get either of those values.
What I need: the fps of the video file, and the timestamp or frame number (not calculated) of the current frame.
What I have: I am able to get the duration of the video using
pFormatCtx->duration/AV_TIME_BASE
I am currently counting frames as I process them to keep a running frame count, but this is not going to work long term. I can get the total frame count for the file using
pFormatCtx->streams[currentStream->videoStream]->nb_frames
I have read that this may not work for all streams, although it has worked for every stream I have tried.
I have tried using the time_base.num and time_base.den values and packet.pts, but I can't make sense of the values I am getting from them, so I may just need to understand better what those values are.
Does anyone know of resources that show examples of how to get these values?
-
java.lang.ExceptionInInitializerError in FFmpeg Test
22 January 2013, by Mehul Ranpara. I have an Android demo called FFmpegTest which I downloaded from https://github.com/appunite/AndroidFFmpeg. I am using the FFmpeg library file that ships with this demo in FFmpegTest. Now when I run it, it gives java.lang.ExceptionInInitializerError. Does anyone have an idea about this? Please share it with me. Thanks in advance.
This is my VideoActivity.java:
public class VideoActivity extends Activity implements OnClickListener,FFmpegListener, OnSeekBarChangeListener, OnItemSelectedListener
{
private FFmpegPlayer mpegPlayer;
private static boolean isSurfaceView = true;
protected boolean mPlay = false;
private View controlsView;
private View loadingView;
private SeekBar seekBar;
public View videoView;
private Button playPauseButton;
private boolean mTracking = false;
private View streamsView;
private Spinner languageSpinner;
private int languageSpinnerSelectedPosition = 0;
private Spinner subtitleSpinner;
private int subtitleSpinnerSelectedPosition = 0;
private StreamAdapter languageAdapter;
private StreamAdapter subtitleAdapter;
private FFmpegStreamInfo audioStream = null;
private FFmpegStreamInfo subtitleStream = null;
@Override
public void onCreate(Bundle savedInstanceState)
{
this.getWindow().requestFeature(Window.FEATURE_NO_TITLE);
getWindow().setFormat(PixelFormat.RGB_565);
getWindow().clearFlags(WindowManager.LayoutParams.FLAG_DITHER);
super.onCreate(savedInstanceState);
this.getWindow().addFlags(WindowManager.LayoutParams.FLAG_FULLSCREEN);
this.getWindow().clearFlags(WindowManager.LayoutParams.FLAG_FORCE_NOT_FULLSCREEN);
this.getWindow().setBackgroundDrawable(null);
this.setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_LANDSCAPE);
if (isSurfaceView)
VideoActivity.this.setContentView(R.layout.video_surfaceview);
else
VideoActivity.this.setContentView(R.layout.video_view);
seekBar = (SeekBar) this.findViewById(R.id.seek_bar);
seekBar.setOnSeekBarChangeListener(this);
playPauseButton = (Button) this.findViewById(R.id.play_pause);
playPauseButton.setOnClickListener(this);
controlsView = this.findViewById(R.id.controls);
streamsView = this.findViewById(R.id.streams);
loadingView = this.findViewById(R.id.loading_view);
languageSpinner = (Spinner) this.findViewById(R.id.language_spinner);
subtitleSpinner = (Spinner) this.findViewById(R.id.subtitle_spinner);
languageAdapter = new StreamAdapter(this, android.R.layout.simple_spinner_item, new ArrayList<FFmpegStreamInfo>(), StreamAdapterType.AUDIO);
languageAdapter.setDropDownViewResource(android.R.layout.simple_spinner_dropdown_item);
languageSpinner.setAdapter(languageAdapter);
languageSpinner.setOnItemSelectedListener(this);
subtitleAdapter = new StreamAdapter(this, android.R.layout.simple_spinner_item, new ArrayList<FFmpegStreamInfo>(), StreamAdapterType.SUBTITLE);
subtitleAdapter.setDropDownViewResource(android.R.layout.simple_spinner_dropdown_item);
subtitleSpinner.setAdapter(subtitleAdapter);
subtitleSpinner.setOnItemSelectedListener(this);
try
{
videoView = this.findViewById(R.id.video_view);
//VideoActivity.this.mpegPlayer = new FFmpegPlayer((FFmpegDisplay)videoView, VideoActivity.this);
this.mpegPlayer = new FFmpegPlayer((FFmpegDisplay)videoView, this);
this.mpegPlayer.setMpegListener(this);
setDataSource();
//Too
}
catch (Exception e)
{
e.printStackTrace();
}
}
private static enum StreamAdapterType
{
AUDIO, SUBTITLE
};
private static class StreamAdapter extends ArrayAdapter<FFmpegStreamInfo>
{
private final StreamAdapterType adapterType;
public StreamAdapter(Context context, int textViewResourceId, List<FFmpegStreamInfo> objects, StreamAdapterType adapterType)
{
super(context, textViewResourceId, objects);
this.adapterType = adapterType;
}
@Override
public View getView(int position, View convertView, ViewGroup parent)
{
TextView view = (TextView) super.getView(position, convertView, parent);
FFmpegStreamInfo item = getItem(position);
Locale locale = item.getLanguage();
String formatter;
if (StreamAdapterType.AUDIO.equals(adapterType))
{
formatter = getContext().getString(R.string.language_spinner_drop_down);
}
else
{
formatter = getContext().getString(R.string.subtitle_spinner_drop_down);
}
String languageName = locale == null ? getContext().getString(R.string.unknown) : locale.getDisplayLanguage();
String text = String.format(formatter, languageName);
view.setText(text);
return view;
}
@Override
public View getDropDownView(int position, View convertView, ViewGroup parent)
{
CheckedTextView view = (CheckedTextView) super.getDropDownView(position, convertView, parent);
FFmpegStreamInfo item = getItem(position);
Locale locale = item.getLanguage();
String languageName = locale == null ? getContext().getString(R.string.unknown) : locale.getDisplayLanguage();
view.setText(languageName);
return view;
}
}
@Override
protected void onPause() {
super.onPause();
};
@Override
protected void onResume() {
super.onResume();
}
@Override
protected void onDestroy()
{
super.onDestroy();
this.mpegPlayer.setMpegListener(null);
this.mpegPlayer.stop();
stop();
}
private void setDataSource()
{
Log.d("url", "checin");
HashMap<String, String> params = new HashMap<String, String>();
// set font for ass
File assFont = new File(Environment.getExternalStorageDirectory(),"Roboto.ttf");
params.put("ass_default_font_path", assFont.getAbsolutePath());
Intent intent = getIntent();
String url = intent.getStringExtra(AppConstants.VIDEO_PLAY_ACTION_EXTRA_URL);
Log.d("url", url);
if (url == null)
{
throw new IllegalArgumentException(String.format("\"%s\" did not provided", AppConstants.VIDEO_PLAY_ACTION_EXTRA_URL));
}
if (intent.hasExtra(AppConstants.VIDEO_PLAY_ACTION_EXTRA_ENCRYPTION_KEY))
{
params.put("aeskey", intent.getStringExtra(AppConstants.VIDEO_PLAY_ACTION_EXTRA_ENCRYPTION_KEY));
}
this.playPauseButton.setBackgroundResource(android.R.drawable.ic_media_play);
this.playPauseButton.setEnabled(true);
mPlay = false;
mpegPlayer.setDataSource(url, params, null, audioStream,subtitleStream);
}
@Override
public void onClick(View v)
{
int viewId = v.getId();
switch (viewId)
{
case R.id.play_pause:
resumePause();
return;
default:
throw new RuntimeException();
}
}
@Override
public void onFFUpdateTime(int currentTimeS, int videoDurationS, boolean isFinished)
{
if (!mTracking)
{
seekBar.setMax(videoDurationS);
seekBar.setProgress(currentTimeS);
}
if (isFinished)
{
new AlertDialog.Builder(this)
.setTitle(R.string.dialog_end_of_video_title)
.setMessage(R.string.dialog_end_of_video_message)
.setCancelable(true).show();
}
}
@Override
public void onFFDataSourceLoaded(FFmpegError err, FFmpegStreamInfo[] streams)
{
if (err != null) {
String format = getResources().getString(
R.string.main_could_not_open_stream);
String message = String.format(format, err.getMessage());
Builder builder = new AlertDialog.Builder(VideoActivity.this);
builder.setTitle(R.string.app_name)
.setMessage(message)
.setOnCancelListener(
new DialogInterface.OnCancelListener() {
@Override
public void onCancel(DialogInterface dialog) {
VideoActivity.this.finish();
}
}).show();
return;
}
playPauseButton.setBackgroundResource(android.R.drawable.ic_media_play);
playPauseButton.setEnabled(true);
this.controlsView.setVisibility(View.VISIBLE);
this.streamsView.setVisibility(View.VISIBLE);
this.loadingView.setVisibility(View.GONE);
this.videoView.setVisibility(View.VISIBLE);
languageAdapter.clear();
subtitleAdapter.clear();
for (FFmpegStreamInfo streamInfo : streams) {
CodecType mediaType = streamInfo.getMediaType();
if (FFmpegStreamInfo.CodecType.AUDIO.equals(mediaType)) {
languageAdapter.add(streamInfo);
} else if (FFmpegStreamInfo.CodecType.SUBTITLE.equals(mediaType)) {
subtitleAdapter.add(streamInfo);
}
}
}
private void displaySystemMenu(boolean visible) {
if (Build.VERSION.SDK_INT >= 14) {
displaySystemMenu14(visible);
} else if (Build.VERSION.SDK_INT >= 11) {
displaySystemMenu11(visible);
}
}
@SuppressWarnings("deprecation")
@TargetApi(11)
private void displaySystemMenu11(boolean visible) {
if (visible) {
this.videoView.setSystemUiVisibility(View.STATUS_BAR_VISIBLE);
} else {
this.videoView.setSystemUiVisibility(View.STATUS_BAR_HIDDEN);
}
}
@TargetApi(14)
private void displaySystemMenu14(boolean visible) {
if (visible) {
this.videoView.setSystemUiVisibility(View.SYSTEM_UI_FLAG_VISIBLE);
} else {
this.videoView
.setSystemUiVisibility(View.SYSTEM_UI_FLAG_LOW_PROFILE);
}
}
public void resumePause() {
this.playPauseButton.setEnabled(false);
if (mPlay) {
mpegPlayer.pause();
} else {
mpegPlayer.resume();
displaySystemMenu(true);
}
mPlay = !mPlay;
}
@Override
public void onFFResume(NotPlayingException result) {
this.playPauseButton
.setBackgroundResource(android.R.drawable.ic_media_pause);
this.playPauseButton.setEnabled(true);
displaySystemMenu(false);
mPlay = true;
}
@Override
public void onFFPause(NotPlayingException err) {
this.playPauseButton
.setBackgroundResource(android.R.drawable.ic_media_play);
this.playPauseButton.setEnabled(true);
mPlay = false;
}
private void stop() {
this.controlsView.setVisibility(View.GONE);
this.streamsView.setVisibility(View.GONE);
this.loadingView.setVisibility(View.VISIBLE);
this.videoView.setVisibility(View.INVISIBLE);
}
@Override
public void onFFStop() {
}
@Override
public void onFFSeeked(NotPlayingException result)
{
// if (result != null)
// throw new RuntimeException(result);
}
@Override
public void onProgressChanged(SeekBar seekBar, int progress,boolean fromUser)
{
if (fromUser)
{
mpegPlayer.seek(progress);
}
}
@Override
public void onStartTrackingTouch(SeekBar seekBar) {
mTracking = true;
}
@Override
public void onStopTrackingTouch(SeekBar seekBar) {
mTracking = false;
}
private void setDataSourceAndResumeState() {
int progress = seekBar.getProgress();
setDataSource();
mpegPlayer.seek(progress);
mpegPlayer.resume();
}
@Override
public void onItemSelected(AdapterView<?> parentView,
View selectedItemView, int position, long id) {
FFmpegStreamInfo streamInfo = (FFmpegStreamInfo) parentView
.getItemAtPosition(position);
if (parentView == languageSpinner) {
if (languageSpinnerSelectedPosition != position) {
languageSpinnerSelectedPosition = position;
audioStream = streamInfo;
setDataSourceAndResumeState();
}
} else if (parentView == subtitleSpinner) {
if (subtitleSpinnerSelectedPosition != position) {
subtitleSpinnerSelectedPosition = position;
subtitleStream = streamInfo;
setDataSourceAndResumeState();
}
} else {
throw new RuntimeException();
}
}
@Override
public void onNothingSelected(AdapterView<?> parentView) {
// if (parentView == languageSpinner) {
// audioStream = null;
// } else if (parentView == subtitleSpinner) {
// subtitleStream = null;
// } else {
// throw new RuntimeException();
// }
// play();
}
}
and this is my logcat:
01-22 15:29:15.302: E/AndroidRuntime(1560): FATAL EXCEPTION: main
01-22 15:29:15.302: E/AndroidRuntime(1560): java.lang.ExceptionInInitializerError
01-22 15:29:15.302: E/AndroidRuntime(1560): at java.lang.Class.newInstanceImpl(Native Method)
01-22 15:29:15.302: E/AndroidRuntime(1560): at java.lang.Class.newInstance(Class.java:1319)
01-22 15:29:15.302: E/AndroidRuntime(1560): at android.app.Instrumentation.newActivity(Instrumentation.java:1023)
01-22 15:29:15.302: E/AndroidRuntime(1560): at android.app.ActivityThread.performLaunchActivity(ActivityThread.java:1871)
01-22 15:29:15.302: E/AndroidRuntime(1560): at android.app.ActivityThread.handleLaunchActivity(ActivityThread.java:1981)
01-22 15:29:15.302: E/AndroidRuntime(1560): at android.app.ActivityThread.access$600(ActivityThread.java:123)
01-22 15:29:15.302: E/AndroidRuntime(1560): at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1147)
01-22 15:29:15.302: E/AndroidRuntime(1560): at android.os.Handler.dispatchMessage(Handler.java:99)
01-22 15:29:15.302: E/AndroidRuntime(1560): at android.os.Looper.loop(Looper.java:137)
01-22 15:29:15.302: E/AndroidRuntime(1560): at android.app.ActivityThread.main(ActivityThread.java:4424)
01-22 15:29:15.302: E/AndroidRuntime(1560): at java.lang.reflect.Method.invokeNative(Native Method)
01-22 15:29:15.302: E/AndroidRuntime(1560): at java.lang.reflect.Method.invoke(Method.java:511)
01-22 15:29:15.302: E/AndroidRuntime(1560): at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:784)
01-22 15:29:15.302: E/AndroidRuntime(1560): at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:551)
01-22 15:29:15.302: E/AndroidRuntime(1560): at dalvik.system.NativeStart.main(Native Method)
01-22 15:29:15.302: E/AndroidRuntime(1560): Caused by: java.lang.UnsatisfiedLinkError: Couldn't load ffmpeg: findLibrary returned null
01-22 15:29:15.302: E/AndroidRuntime(1560): at java.lang.Runtime.loadLibrary(Runtime.java:365)
01-22 15:29:15.302: E/AndroidRuntime(1560): at java.lang.System.loadLibrary(System.java:535)
01-22 15:29:15.302: E/AndroidRuntime(1560): at roman10.ffmpegTest.VideoBrowser.<clinit>(VideoBrowser.java:37)
01-22 15:29:15.302: E/AndroidRuntime(1560): ... 15 more
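The root cause is the nested UnsatisfiedLinkError ("Couldn't load ffmpeg: findLibrary returned null"): the static initializer of VideoBrowser calls System.loadLibrary("ffmpeg"), the native library cannot be found for the device, and the VM wraps the error in ExceptionInInitializerError. A minimal sketch of loading the library defensively so the underlying error surfaces (the class and method names here are hypothetical, not part of the demo):

```java
// Illustrative sketch: load a native library defensively instead of in a
// bare static initializer, so UnsatisfiedLinkError is reported directly
// rather than surfacing as ExceptionInInitializerError.
public class NativeLoader {
    /** Tries to load a native library by name; returns true on success. */
    public static boolean tryLoad(String libName) {
        try {
            System.loadLibrary(libName);
            return true;
        } catch (UnsatisfiedLinkError e) {
            // On Android this usually means the .so was not packaged for
            // this device's ABI (e.g. no libs/armeabi-v7a/libffmpeg.so).
            System.err.println("Could not load " + libName + ": " + e.getMessage());
            return false;
        }
    }

    public static void main(String[] args) {
        // "ffmpeg" is the library the demo loads; off-device this fails.
        System.out.println(tryLoad("ffmpeg") ? "loaded" : "not loaded");
    }
}
```

On a device, a false result here usually means the compiled libffmpeg.so for the device's ABI was never built or packaged into the APK, which is the first thing to check with this demo.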
-
Best way to put video from RTSP to HTML page?
10 July 2015, by Max. I have a URL to an RTSP stream with authentication, in the H264 - MPEG-4 part 10 codec.
The link looks like this: rtsp://login:pass@xxx.xxx.xxx.xxx:557. Is there a way to show the video in an HTML page?
Does it require a decoding server like ffmpeg or VLC?
Or can we put it on the page directly?
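Browsers cannot play RTSP directly, so some server-side repackaging step is generally required. One common approach is to remux the stream into HLS with ffmpeg and serve the playlist to an HTML5 player such as hls.js. A sketch, with the URL and credentials being the placeholders from the question and the output path an assumption:

```shell
# Remux an RTSP stream into HLS segments for an HTML5 player.
# -rtsp_transport tcp avoids packet loss common with UDP RTSP.
ffmpeg -rtsp_transport tcp \
       -i "rtsp://login:pass@xxx.xxx.xxx.xxx:557" \
       -c:v copy -c:a aac \
       -f hls -hls_time 4 -hls_list_size 6 -hls_flags delete_segments \
       /var/www/html/live/stream.m3u8
```

Since the source is already H.264, `-c:v copy` avoids re-encoding the video; only the audio is transcoded to AAC, which HLS players expect.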