
Media (1)
-
The Slip - Artworks
26 September 2011, by
Updated: September 2011
Language: English
Type: Text
Other articles (69)
-
Managing creation and editing rights for objects
8 February 2011
By default, many features are restricted to administrators, but the minimum status required to use them can be configured independently, notably: writing content on the site, editable through the form template management; adding notes to articles; adding captions and annotations to images;
-
(De)Activating features (plugins)
18 February 2011
To manage the addition and removal of extra features (plugins), MediaSPIP uses SVP as of version 0.2.
SVP makes it easy to activate plugins from the MediaSPIP configuration area.
To access it, simply go to the configuration area and open the "Plugin management" page.
MediaSPIP ships by default with the full set of so-called "compatible" plugins, which have been tested and integrated so as to work perfectly with each (...) -
Activating visitor registration
12 April 2011
It is also possible to enable visitor registration, which lets anyone open an account on the channel in question, for example in the context of open projects.
To do so, simply go to the site configuration area and choose the "User management" submenu. The first form shown corresponds to this feature.
By default, on initialization MediaSPIP created a menu item in the top menu of the page leading (...)
On other sites (10338)
-
Generate MPEG-DASH segments when requested [closed]
16 September 2024, by John Smith
Up-front disclaimer: this question is almost identical to this one: Generate single MPEG-Dash segment with ffmpeg, but ffmpeg has been updated over time and the answer provided by Coumeu is no longer accurate.


Using ffmpeg, I am attempting to "lazily create" the segments needed for MPEG DASH playback. I have a static manifest (MPD) which includes SegmentTemplates for 1 video and 1 audio stream.


A segment is generated only when the endpoint given in the SegmentTemplate for an init or media segment is actually requested. In other words, nothing is processed up front.


I am creating the init segment using the following movflags:


+frag_keyframe+faststart+skip_trailer 



I am creating the media segments using the following movflags:


+frag_keyframe+default_base_moof+delay_moov+skip_trailer+dash+global_sidx



and the following additional options for either segment type:


-map_metadata -1 -copyts -start_at_zero -map 0:${streamIndex}
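
For reference, the flags above can be assembled into one on-demand invocation per media segment, along these lines. This is only a sketch, not the question's actual command: input.mp4, N (the segment index), and D (the segment duration in seconds) are placeholders.

```
# Hypothetical on-demand media-segment command: cut segment N of duration D
# and emit a fragmented-MP4 chunk to stdout.
ffmpeg -ss "$((N * D))" -t "$D" -i input.mp4 \
  -map 0:${streamIndex} -c copy \
  -map_metadata -1 -copyts -start_at_zero \
  -movflags +frag_keyframe+default_base_moof+delay_moov+skip_trailer+dash+global_sidx \
  -f mp4 pipe:1
```

Since each such invocation starts its own output timeline at zero, one knob worth experimenting with is ffmpeg's -output_ts_offset output option (shifting each segment by N * D), though whether it fixes this exact case is untested here.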



This results in playable video with the correct duration, but it seems to switch fragments quite quickly, on average one fragment per second. When I omit the delay_moov flag, the video also starts glitching, as if it were stepping over I-frames.


My observations when I inspect the boxes:


- earliest_presentation_time (sidx box) is 0 for all segments
- base_media_decode_time (moof->traf->tfdt box) is 0 for all segments

I'm looking for either the correct command line to create segments on the fly, or hacky ways to get it done if ffmpeg doesn't provide it out of the box. Even pointers on what to look for would be helpful, because my knowledge of MP4 and MPEG-DASH is running out.
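
As a cross-check for the tfdt observation above: in a compliant stream, the base_media_decode_time of segment N should equal the summed decoded duration of segments 0..N-1, expressed in track-timescale units, rather than being 0 everywhere. A minimal sketch, assuming constant-duration segments (the 4-second duration and 90 kHz timescale are made up for illustration):

```python
# Expected base_media_decode_time (tfdt) for each media segment, assuming a
# constant segment duration; values are in the track's timescale units.
def expected_tfdt(segment_index: int, segment_duration_s: float, timescale: int) -> int:
    """tfdt of segment N = decoded duration of all earlier segments."""
    return round(segment_index * segment_duration_s * timescale)

# With hypothetical 4 s segments and a 90 kHz timescale:
print([expected_tfdt(n, 4.0, 90000) for n in range(4)])
# → [0, 360000, 720000, 1080000]
```

Comparing these expected values against what a box inspector reports for each generated segment shows directly whether the per-segment timestamps are being re-based to zero.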


-
Serving a live stream with ffmpeg via the RTMP protocol
28 January 2019, by Yusufu
I have been trying to serve my RTMP stream on the web, struggling with video.js.
I can serve a static video, or a live stream of an Android screen, with ffmpeg via this command: ffmpeg -i video7.mp4 -c:v libx264 -g 25 -preset fast -b:v 4096k -c:a libfdk_aac -ar 44100 -f flv rtmp://127.0.0.1/media_server/video.flv
So I can connect to it via ffplay; this part works like a charm.
ffplay rtmp://127.0.0.1/media_server/video.flv
But I couldn't watch it in a web page. My HTML file is copied from the official video.js example.
- Question: does being able to watch via ffplay over RTMP mean my nginx RTMP module is working well?
-
I have been serving the HTML via http-server, and RTMP on rtmp://127.0.0.1; could the missing port be causing the problem?
-
Any other video player suggestions? I have already tried HLS and it works, but in my case creating an m3u8 manifest file is not desirable: since I am live-streaming a mobile screen recording, new .ts files get created but the m3u8 file is not updated, so HLS is not for me, I guess?
-
What else can I try as a protocol instead of RTMP?
I can share my screen-recording and ffmpeg commands if that helps. Thanks.
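
If the blocker for HLS was only the stale playlist: ffmpeg's HLS muxer can maintain a rolling m3u8 itself during a live encode. A sketch, reusing the RTMP input from above; the output directory is hypothetical:

```
ffmpeg -i rtmp://127.0.0.1/media_server/video.flv \
  -c:v libx264 -g 50 -c:a aac \
  -f hls -hls_time 2 -hls_list_size 6 \
  -hls_flags delete_segments+append_list \
  /var/www/stream/live.m3u8
```

Here -hls_list_size and -hls_flags delete_segments keep the playlist updated and pruned as new .ts files appear, which is the behaviour the question says was missing.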
-
Youtube Watch Me Android application closing unexpectedly
18 September 2015, by Kichu
I built an Android application from https://github.com/youtube/yt-watchme.
When I try to create an event using the "CREATE LIVE EVENT" button, it throws the following error. I think it is happening due to a camera permission issue.
ERROR:
09-17 11:43:53.582 32383-32383/com.google.android.apps.watchme E/AndroidRuntime﹕ FATAL EXCEPTION: main
Process: com.google.android.apps.watchme, PID: 32383
java.lang.NoSuchMethodError: com.google.android.apps.watchme.StreamerActivity.checkSelfPermission
at com.google.android.apps.watchme.StreamerActivity.startStreaming(StreamerActivity.java:174)
at com.google.android.apps.watchme.StreamerActivity.access$200(StreamerActivity.java:46)
at com.google.android.apps.watchme.StreamerActivity$1.onServiceConnected(StreamerActivity.java:63)
at android.app.LoadedApk$ServiceDispatcher.doConnected(LoadedApk.java:1110)
at android.app.LoadedApk$ServiceDispatcher$RunConnection.run(LoadedApk.java:1127)
at android.os.Handler.handleCallback(Handler.java:733)
at android.os.Handler.dispatchMessage(Handler.java:95)
at android.os.Looper.loop(Looper.java:136)
at android.app.ActivityThread.main(ActivityThread.java:5097)
at java.lang.reflect.Method.invokeNative(Native Method)
at java.lang.reflect.Method.invoke(Method.java:515)
at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:785)
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:601)
at dalvik.system.NativeStart.main(Native Method)
Can anyone please suggest how I can solve this issue?
Thanks in advance.
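
The NoSuchMethodError at StreamerActivity.startStreaming (line 174) is consistent with calling Activity.checkSelfPermission(), which only exists from API 23, on an older device. A hedged sketch of the usual fix is to route the check through the support library the file already uses:

```java
// In StreamerActivity.startStreaming(), replace the direct calls
//     int hasCamPermission = checkSelfPermission(cameraPermission);
// with the support-library helper (import android.support.v4.content.ContextCompat).
// ContextCompat.checkSelfPermission delegates to Context.checkPermission(),
// which exists on all API levels, unlike Activity.checkSelfPermission (API 23+).
int hasCamPermission = ContextCompat.checkSelfPermission(this, cameraPermission);
int hasMicPermission = ContextCompat.checkSelfPermission(this, microphonePermission);
```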
Code Update :
Manifest file XML
<manifest package="com.google.android.apps.watchme">
<application>
<activity>
<action></action>
<category></category>
</activity>
<activity></activity>
<service></service>
</application>
</manifest>
Stream Activity File
/*
* Copyright (c) 2014 Google Inc.
*
* Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except
* in compliance with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software distributed under the License
* is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express
* or implied. See the License for the specific language governing permissions and limitations under
* the License.
*/
package com.google.android.apps.watchme;
import android.Manifest;
import android.app.Activity;
import android.content.ComponentName;
import android.content.Context;
import android.content.Intent;
import android.content.ServiceConnection;
import android.content.pm.PackageManager;
import android.hardware.Camera;
import android.os.Bundle;
import android.os.IBinder;
import android.os.PowerManager;
import android.support.design.widget.Snackbar;
import android.support.v4.app.ActivityCompat;
import android.util.Log;
import android.view.View;
import android.view.View.OnClickListener;
import android.widget.ToggleButton;
import com.google.android.apps.watchme.util.Utils;
import com.google.android.apps.watchme.util.YouTubeApi;
import java.util.ArrayList;
import java.util.List;
/**
* @author Ibrahim Ulukaya <ulukaya@google.com>
* <p></p>
* StreamerActivity class which previews the camera and streams via StreamerService.
*/
public class StreamerActivity extends Activity {
// CONSTANTS
// TODO: Stop hardcoding this and read values from the camera's supported sizes.
public static final int CAMERA_WIDTH = 640;
public static final int CAMERA_HEIGHT = 480;
private static final int REQUEST_CAMERA_MICROPHONE = 0;
// Member variables
private StreamerService streamerService;
private ServiceConnection streamerConnection = new ServiceConnection() {
@Override
public void onServiceConnected(ComponentName className, IBinder service) {
Log.d(MainActivity.APP_NAME, "onServiceConnected");
streamerService = ((StreamerService.LocalBinder) service).getService();
restoreStateFromService();
startStreaming();
}
@Override
public void onServiceDisconnected(ComponentName className) {
Log.e(MainActivity.APP_NAME, "onServiceDisconnected");
// This should never happen, because our service runs in the same process.
streamerService = null;
}
};
private PowerManager.WakeLock wakeLock;
private Preview preview;
private String rtmpUrl;
private String broadcastId;
@Override
public void onCreate(Bundle savedInstanceState) {
Log.d(MainActivity.APP_NAME, "onCreate");
super.onCreate(savedInstanceState);
broadcastId = getIntent().getStringExtra(YouTubeApi.BROADCAST_ID_KEY);
//Log.v(MainActivity.APP_NAME, broadcastId);
rtmpUrl = getIntent().getStringExtra(YouTubeApi.RTMP_URL_KEY);
if (rtmpUrl == null) {
Log.w(MainActivity.APP_NAME, "No RTMP URL was passed in; bailing.");
finish();
}
Log.i(MainActivity.APP_NAME, String.format("Got RTMP URL '%s' from calling activity.", rtmpUrl));
setContentView(R.layout.streamer);
preview = (Preview) findViewById(R.id.surfaceViewPreview);
if (!bindService(new Intent(this, StreamerService.class), streamerConnection,
BIND_AUTO_CREATE | BIND_DEBUG_UNBIND)) {
Log.e(MainActivity.APP_NAME, "Failed to bind StreamerService!");
}
final ToggleButton toggleButton = (ToggleButton) findViewById(R.id.toggleBroadcasting);
toggleButton.setOnClickListener(new OnClickListener() {
@Override
public void onClick(View v) {
if (toggleButton.isChecked()) {
streamerService.startStreaming(rtmpUrl);
} else {
streamerService.stopStreaming();
}
}
});
}
@Override
protected void onResume() {
Log.d(MainActivity.APP_NAME, "onResume");
super.onResume();
if (streamerService != null) {
restoreStateFromService();
}
}
@Override
protected void onPause() {
Log.d(MainActivity.APP_NAME, "onPause");
super.onPause();
if (preview != null) {
preview.setCamera(null);
}
if (streamerService != null) {
streamerService.releaseCamera();
}
}
@Override
protected void onDestroy() {
Log.d(MainActivity.APP_NAME, "onDestroy");
super.onDestroy();
if (streamerConnection != null) {
unbindService(streamerConnection);
}
stopStreaming();
if (streamerService != null) {
streamerService.releaseCamera();
}
}
private void restoreStateFromService() {
preview.setCamera(Utils.getCamera(Camera.CameraInfo.CAMERA_FACING_FRONT));
}
private void startStreaming() {
Log.d(MainActivity.APP_NAME, "startStreaming");
PowerManager pm = (PowerManager) getSystemService(Context.POWER_SERVICE);
wakeLock = pm.newWakeLock(PowerManager.SCREEN_DIM_WAKE_LOCK, this.getClass().getName());
wakeLock.acquire();
if (!streamerService.isStreaming()) {
String cameraPermission = Manifest.permission.CAMERA;
String microphonePermission = Manifest.permission.RECORD_AUDIO;
int hasCamPermission = checkSelfPermission(cameraPermission);
int hasMicPermission = checkSelfPermission(microphonePermission);
List<String> permissions = new ArrayList<String>();
if (hasCamPermission != PackageManager.PERMISSION_GRANTED) {
permissions.add(cameraPermission);
if (ActivityCompat.shouldShowRequestPermissionRationale(this,
Manifest.permission.CAMERA)) {
// Provide rationale in Snackbar to request permission
Snackbar.make(preview, R.string.permission_camera_rationale,
Snackbar.LENGTH_INDEFINITE).show();
} else {
// Explain in Snackbar to turn on permission in settings
Snackbar.make(preview, R.string.permission_camera_explain,
Snackbar.LENGTH_INDEFINITE).show();
}
}
if (hasMicPermission != PackageManager.PERMISSION_GRANTED) {
permissions.add(microphonePermission);
if (ActivityCompat.shouldShowRequestPermissionRationale(this,
Manifest.permission.RECORD_AUDIO)) {
// Provide rationale in Snackbar to request permission
Snackbar.make(preview, R.string.permission_microphone_rationale,
Snackbar.LENGTH_INDEFINITE).show();
} else {
// Explain in Snackbar to turn on permission in settings
Snackbar.make(preview, R.string.permission_microphone_explain,
Snackbar.LENGTH_INDEFINITE).show();
}
}
if (!permissions.isEmpty()) {
String[] params = permissions.toArray(new String[permissions.size()]);
ActivityCompat.requestPermissions(this, params, REQUEST_CAMERA_MICROPHONE);
} else {
// We already have permission, so handle as normal
streamerService.startStreaming(rtmpUrl);
}
}
}
/**
* Callback received when a permissions request has been completed.
*/
@Override
public void onRequestPermissionsResult(int requestCode,
String permissions[], int[] grantResults) {
switch (requestCode) {
case REQUEST_CAMERA_MICROPHONE: {
Log.i(MainActivity.APP_NAME, "Received response for camera with mic permissions request.");
// We have requested multiple permissions for contacts, so all of them need to be
// checked.
if (Utils.verifyPermissions(grantResults)) {
// permissions were granted, yay! do the
// streamer task you need to do.
streamerService.startStreaming(rtmpUrl);
} else {
Log.i(MainActivity.APP_NAME, "Camera with mic permissions were NOT granted.");
Snackbar.make(preview, R.string.permissions_not_granted,
Snackbar.LENGTH_SHORT)
.show();
}
break;
}
// other 'switch' lines to check for other
// permissions this app might request
}
return;
}
private void stopStreaming() {
Log.d(MainActivity.APP_NAME, "stopStreaming");
if (wakeLock != null) {
wakeLock.release();
wakeLock = null;
}
if (streamerService.isStreaming()) {
streamerService.stopStreaming();
}
}
public void endEvent(View view) {
Intent data = new Intent();
data.putExtra(YouTubeApi.BROADCAST_ID_KEY, broadcastId);
if (getParent() == null) {
setResult(Activity.RESULT_OK, data);
} else {
getParent().setResult(Activity.RESULT_OK, data);
}
finish();
}
}
Update ERROR CODE:
08:57:14.447 18829-18829/com.google.android.apps.watchme E/AndroidRuntime﹕ FATAL EXCEPTION: main
Process: com.google.android.apps.watchme, PID: 18829
java.lang.UnsatisfiedLinkError: Couldn't load ffmpeg from loader dalvik.system.PathClassLoader[DexPathList[[zip file "/data/app/com.google.android.apps.watchme-1.apk"],nativeLibraryDirectories=[/data/app-lib/com.google.android.apps.watchme-1, /vendor/lib, /system/lib]]]: findLibrary returned null
at java.lang.Runtime.loadLibrary(Runtime.java:358)
at java.lang.System.loadLibrary(System.java:526)
at com.google.android.apps.watchme.Ffmpeg.<clinit>(Ffmpeg.java:26)
at com.google.android.apps.watchme.VideoStreamingConnection.open(VideoStreamingConnection.java:71)
at com.google.android.apps.watchme.StreamerService.startStreaming(StreamerService.java:80)
at com.google.android.apps.watchme.StreamerActivity.startStreaming(StreamerActivity.java:212)
at com.google.android.apps.watchme.StreamerActivity.access$200(StreamerActivity.java:47)
at com.google.android.apps.watchme.StreamerActivity$1.onServiceConnected(StreamerActivity.java:64)
at android.app.LoadedApk$ServiceDispatcher.doConnected(LoadedApk.java:1110)
at android.app.LoadedApk$ServiceDispatcher$RunConnection.run(LoadedApk.java:1127)
at android.os.Handler.handleCallback(Handler.java:733)
at android.os.Handler.dispatchMessage(Handler.java:95)
at android.os.Looper.loop(Looper.java:136)
at android.app.ActivityThread.main(ActivityThread.java:5097)
at java.lang.reflect.Method.invokeNative(Native Method)
at java.lang.reflect.Method.invoke(Method.java:515)
at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:785)
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:601)
at dalvik.system.NativeStart.main(Native Method)
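
The updated error is a different problem: System.loadLibrary("ffmpeg") (Ffmpeg.java line 26) found no libffmpeg.so in the APK for any of the device's library directories ("findLibrary returned null"), meaning the native library was never built or packaged. If the project's NDK/ffmpeg build step was skipped, the Java side still compiles but fails at runtime exactly like this. With a Gradle build the usual layout (the ABI names below are only the common examples) is:

```
app/src/main/jniLibs/
├── armeabi-v7a/libffmpeg.so
└── x86/libffmpeg.so
```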