
Media (2)
-
SPIP - plugins - embed code - Exemple
2 September 2013
Updated: September 2013
Language: French
Type: Image
-
Publishing an image simply
13 April 2011
Updated: February 2012
Language: French
Type: Video
Other articles (105)
-
List of compatible distributions
26 April 2011
The table below lists the Linux distributions compatible with the automated installation script of MediaSPIP.

Distribution name    Version name            Version number
Debian               Squeeze                 6.x.x
Debian               Wheezy                  7.x.x
Debian               Jessie                  8.x.x
Ubuntu               The Precise Pangolin    12.04 LTS
Ubuntu               The Trusty Tahr         14.04

If you want to help us improve this list, you can provide us access to a machine whose distribution is not mentioned above, or send the necessary fixes to add (...)
-
Publishing on MediaSPIP
13 June 2013
Can I post content from an iPad tablet?
Yes, if your MediaSPIP installation is at version 0.2 or higher. If necessary, contact the administrator of your MediaSPIP to find out.
-
Encoding and processing into web-friendly formats
13 April 2011
MediaSPIP automatically converts uploaded files to internet-compatible formats.
Video files are encoded in Ogv and WebM (supported by HTML5) and MP4 (supported by Flash).
Audio files are encoded in Ogg (supported by HTML5) and MP3 (supported by Flash).
Where possible, text is analyzed in order to retrieve the data needed for indexing by search engines, and then exported as a series of image files.
All uploaded files are stored online in their original format, so you can (...)
On other sites (10738)
-
avcodec/nvenc: only unregister input resources when absolutely needed
24 April 2019, by Timo Rothenpieler
This reverts nvenc to old behaviour, which in some super rare edge cases
performs better.
The implication of this is that any potential API user who relies on nvenc cleaning up every frame's device resources after it's done using them will have to change their usage pattern.

That should not be a problem, since pretty much every normal usage pattern automatically implies that surfaces are reused from a common pool, since constant re-allocation is also very expensive.
How to decrypt hls video content
16 May 2019, by SHAH MD MONIRUL ISLAM
My requirement is to play encrypted HLS video files from local storage on Android. I have used NanoHTTPD to create and run the local server, and from there I am serving the .ts and .m3u8 files. To play this video, ExoPlayer needs a key to decrypt the files, so I made a URL: http://localhost:4990/dataKey. Here is my local server class:
import android.os.Environment;
import android.util.Log;

import java.io.File;
import java.io.FileInputStream;
import java.io.FileNotFoundException;

import fi.iki.elonen.NanoHTTPD;

public class LocalStreamingServer extends NanoHTTPD {

    public LocalStreamingServer(int port) {
        super(port);
    }

    @Override
    public Response serve(IHTTPSession session) {
        Log.e("req", session.getUri());

        // ExoPlayer asks this URI for the decryption key.
        if (session.getUri().equalsIgnoreCase("/dataKey")) {
            return newFixedLengthResponse(Response.Status.OK, "txt", "what is the key?");
        }

        // Variant playlists from local storage.
        if (session.getUri().contains("m3u8")) {
            return serveFile(Environment.getExternalStorageDirectory().toString()
                    + "/s3" + session.getUri(), "m3u8");
        }

        // Encrypted media segments.
        if (session.getUri().endsWith("ts")) {
            return serveFile(Environment.getExternalStorageDirectory().toString()
                    + "/s3" + session.getUri(), "ts");
        }

        // Everything else gets the master playlist.
        return serveFile(Environment.getExternalStorageDirectory().toString()
                + "/s3/master.m3u8", "m3u8");
    }

    private Response serveFile(String path, String mimeType) {
        File f = new File(path);
        FileInputStream fis = null;
        try {
            fis = new FileInputStream(f);
        } catch (FileNotFoundException e) {
            Log.e("req", "file not found: " + path, e);
        }
        return newFixedLengthResponse(Response.Status.OK, mimeType, fis, f.length());
    }
}

I have transcoded the video using ffmpeg. I need to know which data or key has to be returned when the dataKey URL is called. I encrypted the video with this key and IV:

key=617D8A125A284DF48E3C6B1866348A3F
IV=5ff82ce11c7e73dcdf7e73cacd0ef98

I cannot work out which of them needs to be returned from the dataKey URL; neither of them works.
ExoPlayer reports the error message java.security.InvalidKeyException: Unsupported key size. Can anyone help me with this?
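For reference, the key resource that an HLS playlist points to is expected to contain the 16 raw bytes (octets) of the AES-128 key (RFC 8216), not its 32-character hex spelling and not the IV: the IV either appears in the #EXT-X-KEY tag as an IV=0x... attribute or defaults to the segment's media sequence number. A key response whose byte count is not a valid AES key length is exactly the kind of thing that surfaces as java.security.InvalidKeyException: Unsupported key size. Below is a minimal sketch, in plain C, of turning the hex key above into those 16 raw bytes; the output name dataKey.bin is made up, and the same hex-to-bytes conversion could just as well be done inside the Java /dataKey branch and returned directly (ideally with an application/octet-stream MIME type):

#include <stdio.h>
#include <string.h>

/* Convert the 32-character hex key that was given to ffmpeg into the 16 raw
 * bytes that the key URI (/dataKey) is expected to return. "dataKey.bin" is
 * only an illustrative output name. */
static int hex_to_bytes(const char *hex, unsigned char *out, size_t out_len)
{
    for (size_t i = 0; i < out_len; i++) {
        unsigned int byte;
        if (sscanf(hex + 2 * i, "%2x", &byte) != 1)
            return -1;
        out[i] = (unsigned char)byte;
    }
    return 0;
}

int main(void)
{
    const char *hex_key = "617D8A125A284DF48E3C6B1866348A3F"; /* key used at encryption time */
    unsigned char key[16];
    FILE *f;

    if (strlen(hex_key) != 2 * sizeof(key) || hex_to_bytes(hex_key, key, sizeof(key)) != 0) {
        fprintf(stderr, "bad hex key\n");
        return 1;
    }

    /* The body served at the key URI must be exactly these 16 octets. */
    f = fopen("dataKey.bin", "wb");
    if (!f || fwrite(key, 1, sizeof(key), f) != sizeof(key)) {
        fprintf(stderr, "could not write key file\n");
        return 1;
    }
    fclose(f);
    return 0;
}

The playlist then ties things together with a line along the lines of #EXT-X-KEY:METHOD=AES-128,URI="http://localhost:4990/dataKey",IV=0x<the IV passed to ffmpeg>; the IV stays in the playlist and is never served from the key URI.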
-
FFmpeg - get and play audio data from frames
22 April 2019, by AscendCode
I have a project that uses FFmpeg to play video. I am able to step through the video's frames and display the images obtained from them, but I don't know how to get the audio data from those frames and play it.
Does anyone have experience with FFmpeg and audio processing?
Please help.
Thanks in advance.
// Here is a piece of the code, written in Objective-C
FFDecoder.m
- (id)initWithVideo:(NSString *)moviePath {
    self = [super init];
    if (self == nil) {
        return nil;
    }

    pFormatCtx = avformat_alloc_context();
    AVCodec *pCodec = NULL;

    // Open the media file and read its stream information.
    if (avformat_open_input(&pFormatCtx, [moviePath cStringUsingEncoding:NSUTF8StringEncoding], NULL, NULL) != 0) {
        av_log(NULL, AV_LOG_ERROR, "Couldn't open file\n");
        goto initError;
    }
    if (avformat_find_stream_info(pFormatCtx, NULL) < 0) {
        avformat_close_input(&pFormatCtx);
        av_log(NULL, AV_LOG_ERROR, "Couldn't find stream information\n");
        goto initError;
    }
    av_dump_format(pFormatCtx, 0, [moviePath.lastPathComponent cStringUsingEncoding:NSUTF8StringEncoding], false);

    // Pick the first video stream; no audio stream is selected here.
    videoStream = -1;
    for (int i = 0; i < pFormatCtx->nb_streams; ++i) {
        if (pFormatCtx->streams[i]->codec->codec_type == AVMEDIA_TYPE_VIDEO && videoStream < 0) {
            videoStream = i;
        }
    }
    if (videoStream == -1) {
        av_log(NULL, AV_LOG_ERROR, "Didn't find a video stream");
        goto initError;
    }

    // Open a decoder for the video stream.
    pCodecCtx = pFormatCtx->streams[videoStream]->codec;
    pCodec = avcodec_find_decoder(pCodecCtx->codec_id);
    if (!pCodec) {
        av_log(NULL, AV_LOG_ERROR, "unsupported codec\n");
        goto initError;
    }
    if (avcodec_open2(pCodecCtx, pCodec, NULL) < 0) {
        av_log(NULL, AV_LOG_ERROR, "Can't open video decoder\n");
        goto initError;
    }

    pFrame = av_frame_alloc();
    outputWidth = pCodecCtx->width;
    outputHeight = pCodecCtx->height;
    return self;

initError:
    return nil;
}
- (void)seekTime:(double)seconds {
    AVRational timeBase = pFormatCtx->streams[videoStream]->time_base;
    int64_t targetFrame = (int64_t)((double)timeBase.den / timeBase.num * seconds);
    avformat_seek_file(pFormatCtx, videoStream, targetFrame, targetFrame, targetFrame, AVSEEK_FLAG_FRAME);
    avcodec_flush_buffers(pCodecCtx);
}
- (BOOL)stepFrame {
    int frameFinished = 0;
    while (!frameFinished && av_read_frame(pFormatCtx, &pPacket) >= 0) {
        if (pPacket.stream_index == videoStream) {
            // Decode a video frame. Packets belonging to any audio stream are
            // read and discarded by this loop, so no audio is ever decoded.
            avcodec_decode_video2(pCodecCtx, pFrame, &frameFinished, &pPacket);
        }
    }
    return frameFinished != 0;
}

And here is the code that updates the view controller's UI, but I don't know how to play the audio:
- (IBAction)playButtonAction:(id)sender {
    [[self playButton] setEnabled:NO];
    [video seekTime:0.0];
    double preferredFramesPerSecond = 30.0f;
    [NSTimer scheduledTimerWithTimeInterval:1.0 / preferredFramesPerSecond
                                     target:self
                                   selector:@selector(displayNextFrame:)
                                   userInfo:nil
                                    repeats:YES];
}

#define LERP(A,B,C) ((A)*(1.0-C)+(B)*C)

- (void)displayNextFrame:(NSTimer *)timer {
    NSTimeInterval startTime = [NSDate timeIntervalSinceReferenceDate];
    if (![video stepFrame]) {
        [timer invalidate];
        [_playButton setEnabled:YES];
        [video closeAudio];
        return;
    }
    _imageView.image = video.currentImage;
    float frameTime = 1.0 / ([NSDate timeIntervalSinceReferenceDate] - startTime);
    if (lastFrameTime < 0) {
        lastFrameTime = frameTime;
    } else {
        lastFrameTime = LERP(frameTime, lastFrameTime, 0.8);
    }
    [_label setText:[NSString stringWithFormat:@"%.0f", lastFrameTime]];
}