Media (91)
-
MediaSPIP Simple: future default graphical theme?
26 September 2013, by
Updated: October 2013
Language: French
Type: Video
-
with chosen
13 September 2013, by
Updated: September 2013
Language: French
Type: Image
-
without chosen
13 September 2013, by
Updated: September 2013
Language: French
Type: Image
-
chosen config
13 September 2013, by
Updated: September 2013
Language: French
Type: Image
-
SPIP - plugins - embed code - Example
2 September 2013, by
Updated: September 2013
Language: French
Type: Image
-
GetID3 - File information block
9 April 2013, by
Updated: May 2013
Language: French
Type: Image
Other articles (112)
-
Websites made with MediaSPIP
2 May 2011, by
This page lists some websites based on MediaSPIP.
-
Websites built with MediaSPIP
2 May 2011, by
This page presents some of the sites running on MediaSPIP.
You can of course add yours using the form at the bottom of the page.
-
Creating farms of unique websites
13 April 2011, by
MediaSPIP platforms can be installed as a farm, with a single "core" hosted on a dedicated server and used by multiple websites.
This allows (among other things): implementation costs to be shared between several different projects / individuals, rapid deployment of multiple unique sites, and creation of groups of like-minded sites, making it possible to browse media in a more controlled and selective environment than the major "open" (...)
On other sites (7935)
-
How to generate a video output file path every 5 seconds and write the data in RTMP format on iOS? [on hold]
16 July 2015, by Sandeep Joshi
- (void)segmentRecording:(NSTimer *)timer {
    if (!shouldBeRecording) {
        [timer invalidate];
    }
    // Swap the active writer/encoders for the queued ones so the next segment
    // starts immediately while the finished one is closed in the background.
    AVAssetWriter *tempAssetWriter = self.assetWriter;
    AVAssetWriterInput *tempAudioEncoder = self.audioEncoder;
    AVAssetWriterInput *tempVideoEncoder = self.videoEncoder;
    self.assetWriter = queuedAssetWriter;
    self.audioEncoder = queuedAudioEncoder;
    self.videoEncoder = queuedVideoEncoder;
    NSLog(@"Switching encoders");

    dispatch_async(segmentingQueue, ^{
        if (tempAssetWriter.status == AVAssetWriterStatusWriting) {
            @try {
                [tempAudioEncoder markAsFinished];
                [tempVideoEncoder markAsFinished];
                [tempAssetWriter finishWritingWithCompletionHandler:^{
                    if (tempAssetWriter.status == AVAssetWriterStatusFailed) {
                        [self showError:tempAssetWriter.error];
                    } else {
                        [self uploadLocalURL:tempAssetWriter.outputURL];
                    }
                }];
            }
            @catch (NSException *exception) {
                NSLog(@"Caught exception: %@", [exception description]);
                //[BugSenseController logException:exception withExtraData:nil];
            }
        }
        // Prepare the writer/encoders for the segment after the next one.
        self.segmentCount++;
        if (self.readyToRecordAudio && self.readyToRecordVideo) {
            NSError *error = nil;
            self.queuedAssetWriter = [[AVAssetWriter alloc] initWithURL:[OWUtilities urlForRecordingSegmentCount:segmentCount basePath:self.basePath] fileType:(NSString *)kUTTypeMPEG4 error:&error];
            if (error) {
                [self showError:error];
            }
            self.queuedVideoEncoder = [self setupVideoEncoderWithAssetWriter:self.queuedAssetWriter formatDescription:videoFormatDescription bitsPerSecond:videoBPS];
            self.queuedAudioEncoder = [self setupAudioEncoderWithAssetWriter:self.queuedAssetWriter formatDescription:audioFormatDescription bitsPerSecond:audioBPS];
            //NSLog(@"Encoder switch finished");
        }
    });
}
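For context, the "every 5 sec" part of the question would normally be driven by a repeating timer that fires this method. A minimal sketch under that assumption; the startSegmentTimer method and the segmentTimer property are illustrative names, not part of the original code:

- (void)startSegmentTimer {
    // Fire segmentRecording: every 5 seconds on the current run loop.
    self.segmentTimer = [NSTimer scheduledTimerWithTimeInterval:5.0
                                                         target:self
                                                       selector:@selector(segmentRecording:)
                                                       userInfo:nil
                                                        repeats:YES];
}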
- (void)uploadLocalURL:(NSURL *)url {
    NSLog(@"upload local url: %@", url);
    NSString *inputPath = [url path];
    NSString *outputPath = [inputPath stringByReplacingOccurrencesOfString:@".mp4" withString:@".ts"];
    NSString *outputFileName = [outputPath lastPathComponent];
    NSDictionary *options = @{kFFmpegOutputFormatKey: @"mpegts"};
    NSLog(@"%@ conversion...", outputFileName);
    // Remux the finished MP4 segment to MPEG-TS, then push it over RTMP.
    [ffmpegWrapper convertInputPath:[url path] outputPath:outputPath options:options progressBlock:nil completionBlock:^(BOOL success, NSError *error) {
        if (success) {
            if (!isRtmpConnected) {
                isRtmpConnected = [rtmp openWithURL:HOST_URL enableWrite:YES];
            }
            isRtmpConnected = [rtmp isConnected];
            if (isRtmpConnected) {
                // Loads the whole converted segment into memory (note that outputPath
                // is a plain filesystem path, not a URL string with a scheme).
                NSData *video = [NSData dataWithContentsOfURL:[NSURL URLWithString:outputPath]];
                NSUInteger length = [video length];
                NSUInteger chunkSize = 1024 * 5;
                NSUInteger offset = 0;
                NSLog(@"original video length: %lu \n chunkSize : %lu", length, chunkSize);
                // Split the video into small chunks to publish to the media server
                do {
                    NSUInteger thisChunkSize = length - offset > chunkSize ? chunkSize : length - offset;
                    NSData *chunk = [NSData dataWithBytesNoCopy:(char *)[video bytes] + offset
                                                         length:thisChunkSize
                                                   freeWhenDone:NO];
                    offset += thisChunkSize;
                    // Write the new chunk to the RTMP server
                    NSLog(@"%lu", (unsigned long)[rtmp write:chunk]);
                    sleep(1);
                } while (offset < length);
            } else {
                [rtmp close];
            }
        } else {
            NSLog(@"conversion error: %@", error.userInfo);
        }
    }];
}

I use this code for live streaming, sending the data with an RTMP wrapper.
The data is not written to the socket properly, because a different output file is generated every 5 seconds. Is this the proper approach?
I have no idea how to get the NSData in the proper way.
Please help me.
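One way to avoid building one large NSData per segment (not from the original post, just a sketch, assuming the same rtmp wrapper instance is available; streamTSFileAtPath: is a hypothetical helper name) is to read the converted .ts file in chunks with NSFileHandle and write each chunk as it is read:

- (void)streamTSFileAtPath:(NSString *)tsPath {
    // Hypothetical helper, not part of the original code.
    NSFileHandle *handle = [NSFileHandle fileHandleForReadingAtPath:tsPath];
    if (!handle) {
        NSLog(@"could not open %@", tsPath);
        return;
    }
    NSUInteger chunkSize = 1024 * 5;   // same 5 KB chunks as in the question
    NSData *chunk;
    do {
        // readDataOfLength: returns an empty NSData at end of file.
        chunk = [handle readDataOfLength:chunkSize];
        if ([chunk length] > 0) {
            NSLog(@"%lu", (unsigned long)[rtmp write:chunk]);   // same RTMP wrapper as above
        }
    } while ([chunk length] > 0);
    [handle closeFile];
}

This keeps memory usage bounded by chunkSize instead of by the full segment length.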
-
Compiling ffmpeg for iOS and gas-preprocessor.pl
16 May 2017, by user500
I want to compile ffmpeg for iOS. I have done it a few times before, but now I am on a clean, new Mavericks install, and on configure I always get:
Configured with: --prefix=/Applications/Xcode.app/Contents/Developer/usr --with-gxx-include-dir=/usr/include/c++/4.2.1
Configured with: --prefix=/Applications/Xcode.app/Contents/Developer/usr --with-gxx-include-dir=/usr/include/c++/4.2.1
GNU assembler not found, install gas-preprocessor
If you think configure made a mistake, make sure you are using the latest
version from Git. If the latest version fails, report the problem to the
ffmpeg-user@ffmpeg.org mailing list or IRC #ffmpeg on irc.freenode.net.
Include the log file "config.log" produced by configure as this will help
solving the problem.

I have the current Xcode installed, as well as brews, and the current gas-preprocessor.pl (https://github.com/yuvi/gas-preprocessor) in /usr/bin and also in /usr/local/bin.
Running
perl /usr/bin/gas-preprocessor.pl gcc
I get
Unrecognized input filetype at /usr/bin/gas-preprocessor.pl line 33.
This config works:
./configure \
--extra-cflags='-arch arm64 -mios-version-min=7.0 -mthumb' \
--extra-ldflags='-arch arm64 -mios-version-min=7.0' \
--enable-cross-compile \
--arch=arm64 \
--target-os=darwin \
--cc=/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/clang \
--sysroot=/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS7.0.sdk \
--prefix=arm64 \
--disable-doc \
--disable-shared \
--disable-everything \
--enable-static \
--enable-pic \
--disable-muxers \
--enable-muxer=flv \
--disable-demuxers \
--enable-demuxer=h264 \
--enable-demuxer=pcm_s16le \
--disable-devices \
--disable-parsers \
--enable-parser=h264 \
--disable-encoders \
--enable-encoder=aac \
--disable-decoders \
--enable-decoder=h264 \
--enable-decoder=pcm_s16le \
--disable-protocols \
--enable-protocol=rtmp \
--disable-filters \
--disable-bsfs
This config throws the error above (GNU assembler not found, install gas-preprocessor):
./configure \
--cpu=cortex-a8 \
--extra-cflags='-arch armv7 -mios-version-min=7.0 -mthumb' \
--extra-ldflags='-arch armv7 -mios-version-min=7.0' \
--enable-cross-compile \
--arch=armv7 \
--target-os=darwin \
--cc=/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/clang \
--sysroot=/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS7.0.sdk \
--prefix=armv7 \
--disable-doc \
--disable-shared \
--disable-everything \
--enable-static \
--enable-pic \
--disable-muxers \
--enable-muxer=flv \
--disable-demuxers \
--enable-demuxer=h264 \
--enable-demuxer=pcm_s16le \
--disable-devices \
--disable-parsers \
--enable-parser=h264 \
--disable-encoders \
--enable-encoder=aac \
--disable-decoders \
--enable-decoder=h264 \
--enable-decoder=pcm_s16le \
--disable-protocols \
--enable-protocol=rtmp \
--disable-filters \
--disable-bsfs
-
iOS: Low frames per second (fps) at VGA resolution
26 July 2014, by Bhuvan Balasubramanian
I'm facing an issue broadcasting video from one iPhone to another iPhone.
The issue is that when I view my friend's live video on my iPhone, the frame rate is very low (about 12 fps). Video quality and audio look fine; the only problem is the fps.
I don't know where I need to configure or change the code to convert from variable fps to constant fps, and also to increase the fps to 24/30.
The resolution I use for broadcasting:
RESOLUTION_VGA, // 480x640px (landscape) & 640x480px (portrait)
I'm using the following libraries for streaming:
- MediaLibiOS - link
- Ffmpeg-2.2.1
- CommLibiOS
- libx264-r2409
Wowza is the media server, and the iOS target version is 7.0.
Please help!
Thanks in advance.
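For reference, when the capture side is built directly on AVFoundation and the AVCaptureDevice is reachable from application code, a constant frame rate is usually obtained by pinning the device's minimum and maximum frame durations to the same value (available from iOS 7). A minimal sketch under that assumption; the function name LockCaptureFrameRate is illustrative, and if MediaLibiOS owns the capture pipeline the equivalent setting would have to be exposed by that library:

#import <AVFoundation/AVFoundation.h>

// Illustrative helper: lock the capture device to a fixed frame rate (e.g. 30 fps).
static void LockCaptureFrameRate(AVCaptureDevice *device, int32_t fps) {
    NSError *error = nil;
    if ([device lockForConfiguration:&error]) {
        // Equal min and max frame durations give a constant frame rate.
        device.activeVideoMinFrameDuration = CMTimeMake(1, fps);
        device.activeVideoMaxFrameDuration = CMTimeMake(1, fps);
        [device unlockForConfiguration];
    } else {
        NSLog(@"Could not lock device for configuration: %@", error);
    }
}

A call such as LockCaptureFrameRate([AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo], 30); after the session is configured would then request a steady 30 fps, provided the active format supports that rate.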