
Media (2)
-
SPIP - plugins - embed code - Example
2 September 2013, by
Updated: September 2013
Language: French
Type: Image
-
Publishing an image simply
13 April 2011, by
Updated: February 2012
Language: French
Type: Video
Other articles (99)
-
MediaSPIP 0.1 Beta version
25 April 2011, by
MediaSPIP 0.1 beta is the first version of MediaSPIP proclaimed as "usable".
The zip file provided here only contains the sources of MediaSPIP in its standalone version.
To get a working installation, you must manually install all software dependencies on the server.
If you want to use this archive for an installation in "farm mode", you will also need to proceed with other manual (...) -
HTML5 audio and video support
13 April 2011, by
MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
For older browsers, the Flowplayer Flash fallback is used.
MediaSPIP allows for media playback on major mobile platforms with the above (...) -
APPENDIX: The plugins used specifically for the farm
5 March 2010, by
The central/master site of the farm needs several additional plugins, compared with the channel sites, in order to work properly: the Gestion de la mutualisation plugin; the inscription3 plugin, to manage registrations and requests to create a shared-hosting instance as soon as users sign up; the verifier plugin, which provides a field-validation API (used by inscription3); the champs extras v2 plugin, required by inscription3 (...)
On other sites (7302)
-
MediaPlayer miscalculates audio duration. getCurrentPosition() goes back when audio reaches the end
26 March 2017, by Kacy
I’m streaming .ogg files from my server. The MediaPlayer plays the song to completion, but when it reaches the end of the song, getCurrentPosition() returns a time less than the actual duration of the song. For one song it’s only off by a second, but for another song it’s off by 13 seconds.
I have a SeekBar that’s updated every second by calling getCurrentPosition(), and once the bar reaches the end, it actually jumps back a few seconds. And I can no longer use seekTo() into the segment of the song that’s past the miscalculated duration, because I receive the error: Attempt to seek to past end of file.
Chrome’s HTML video element simply displays the songs with the incorrect durations. Other tools, like ffmpeg or a Windows 10 app called PowerMediaPlayer, both calculate the songs’ durations perfectly.
I believe this problem is related to this answer, but the solution assumes ffmpeg also miscalculates the duration, which is not the case here. Using the -vn flag when converting audio files to ogg didn’t change anything.
What are my options? Is there a way to get the MediaPlayer to calculate the correct duration?
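A minimal, purely cosmetic workaround sketch, assuming the immediate goal is just to stop the bar from moving backwards near the end of playback: keep the displayed progress monotonic and snap it to the end on completion. The class and field names here are illustrative, not from the post, and this does not fix the underlying Vorbis duration metadata.

import android.media.MediaPlayer;
import android.os.Handler;
import android.os.Looper;
import android.widget.SeekBar;

public class ProgressUpdater {
    private final MediaPlayer mediaPlayer; // assumed already prepared and playing
    private final SeekBar seekBar;
    private final Handler handler = new Handler(Looper.getMainLooper());
    private int lastShown = 0;             // highest position displayed so far

    public ProgressUpdater(MediaPlayer player, SeekBar bar) {
        this.mediaPlayer = player;
        this.seekBar = bar;
        bar.setMax(player.getDuration());
        // On completion, snap to the end instead of trusting the last
        // (possibly shortened) getCurrentPosition() value.
        player.setOnCompletionListener(new MediaPlayer.OnCompletionListener() {
            @Override
            public void onCompletion(MediaPlayer mp) {
                seekBar.setProgress(seekBar.getMax());
            }
        });
    }

    public void start() {
        handler.post(new Runnable() {
            @Override
            public void run() {
                // Never let the displayed progress move backwards.
                lastShown = Math.max(lastShown, mediaPlayer.getCurrentPosition());
                seekBar.setProgress(Math.min(lastShown, seekBar.getMax()));
                handler.postDelayed(this, 1000); // update once per second, as in the post
            }
        });
    }
}

This only masks the symptom in the UI; the misreported duration itself still comes from the container metadata or the player's estimation of it.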
Update:
Converting to mp3 has no issues (although I used audioBitrate('96k') instead of audioQuality(2)), but I require a solution that gets ogg files working. Below is how I’m using ffmpeg to perform the conversion. Another thing worth noting is that when I used 'vorbis' instead of 'libvorbis' for the codec, the durations were off by less than a second, if at all. However, I must use 'libvorbis' because 'vorbis' is experimental and results in significantly inconsistent bitrates across different files given the same audio quality parameter. (And it ignores audioBitrate() entirely.)
ffmpeg( filepath )
.toFormat( 'ogg' )
.audioCodec( 'libvorbis' )
.audioQuality( 2 )
.output( destination )
.run();
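As an additional diagnostic (my assumption, not something suggested in the post), it may help to compare what Android's MediaMetadataRetriever reads from the same .ogg URL with what MediaPlayer.getDuration() reports for it; if the retriever agrees with ffmpeg but the player does not, the misreporting can be narrowed down to the player's Vorbis duration handling. A rough sketch:

import android.media.MediaMetadataRetriever;
import java.util.HashMap;

public class DurationCheck {
    // Returns the duration in milliseconds that the metadata extractor
    // reports for the given URL, or -1 if it cannot be determined.
    public static long retrieverDurationMs(String url) {
        MediaMetadataRetriever retriever = new MediaMetadataRetriever();
        try {
            retriever.setDataSource(url, new HashMap<String, String>()); // network source, no extra headers
            String ms = retriever.extractMetadata(MediaMetadataRetriever.METADATA_KEY_DURATION);
            return ms != null ? Long.parseLong(ms) : -1;
        } catch (Exception e) {
            return -1; // unreachable URL, unsupported stream, etc.
        } finally {
            try { retriever.release(); } catch (Exception ignored) { }
        }
    }
}

Logging this value next to mediaPlayer.getDuration() for the problematic files would show whether the two framework paths even agree.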
-
Live video broadcasting in Android using front and back facing cameras [on hold]
1 December 2016, by Jun Kim
This is the first time I am posting a question on Stack Overflow.
I am currently developing an Android application. My goal is to make an app that captures and records video and broadcasts it to other mobile devices as well as web browsers, like Facebook Live. (Nowadays, well-known apps like YouTube, Facebook, Twitch, and Periscope have this function.)
Right now, I am just researching which technologies and approaches to use to develop this app. For the past five weeks I have been reading a lot of blogs and documentation about FFmpeg, different types of streaming technologies, types of web servers, etc. I decided to use MPEG-DASH for my app and an nginx server (nginx-rtmp-module).
While searching further, I got stuck and confused about how I can capture and record video using the internal camera of an Android mobile device.
I was thinking that I could use MediaRecorder to capture and record (I could hardly find an example of using MediaRecorder to record video) and then use FFmpeg to encode the result with certain video and audio codecs into a container format and send it to the nginx server to broadcast to other devices. But I am not sure whether this is the right way to do it.
My question is: I would like to know the whole process (in detail, if possible) of how I can record and broadcast to other devices. The whole process of recording video with the camera in the user’s mobile device and sending the frames to the server for broadcast to other devices is difficult for me to imagine. Can anyone elaborate, or suggest another way I can reach my goal?
Any help would be appreciated.
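Purely as an illustrative sketch of the capture step the poster mentions, and not a full answer: the (since-deprecated) android.hardware.Camera plus MediaRecorder combination that was current at the time can record camera video to a local MP4 file roughly as below. Everything here is an assumption for illustration (class names, formats, codecs), and the FFmpeg/nginx (RTMP or MPEG-DASH) publishing side is left out entirely.

import android.hardware.Camera;
import android.media.MediaRecorder;
import android.view.Surface;

public class CameraCapture {
    private MediaRecorder recorder;

    // camera: an opened Camera (front or back); previewSurface: the UI preview surface;
    // outputPath: local file the recording is written to.
    public void startRecording(Camera camera, Surface previewSurface, String outputPath) throws Exception {
        camera.unlock();                       // hand the camera over to MediaRecorder
        recorder = new MediaRecorder();
        recorder.setCamera(camera);
        recorder.setAudioSource(MediaRecorder.AudioSource.CAMCORDER);
        recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
        recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
        recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
        recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
        recorder.setOutputFile(outputPath);    // a live setup would instead feed segments to the packager/server
        recorder.setPreviewDisplay(previewSurface);
        recorder.prepare();
        recorder.start();
    }

    public void stopRecording(Camera camera) {
        recorder.stop();
        recorder.release();
        camera.lock();                         // give the camera back to the app
    }
}

Getting from such a local recording to a live MPEG-DASH or RTMP stream is the part the question is really about and is not covered by this sketch.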
-
Use a static library to build a shared library: undefined reference
26 May 2020, by Leo
I want to use Android Studio's ndk-build to build a shared library on Windows with the ffmpeg libraries, but I get an undefined reference error. Please tell me what I am doing wrong.



ndkVersion = 21.0.6113669
compileSdkVersion 29



I also tried building with GCC, and the same error happened.
I also tried building with CMake using the ffmpeg source code, and the same error happened.



I tried building a .a file from some simple C code with gcc and using it from other C code; that worked, but it does not work with the ffmpeg .a files, and I don't know why.



This is my Android.mk:



LOCAL_PATH := $(call my-dir)
 include $(CLEAR_VARS)
 LOCAL_MODULE := avcodec
 LOCAL_SRC_FILES := lib/libavcodec.dll.a
 include $(PREBUILT_STATIC_LIBRARY)

 include $(CLEAR_VARS)
 LOCAL_MODULE := avformat
 LOCAL_SRC_FILES := lib/libavformat.dll.a
 include $(PREBUILT_STATIC_LIBRARY)

 ...

 include $(CLEAR_VARS)
 LOCAL_MODULE := avutil
 LOCAL_SRC_FILES := lib/libavutil.dll.a
 include $(PREBUILT_STATIC_LIBRARY)

 include $(CLEAR_VARS)
 LOCAL_MODULE := libTestDll
 LOCAL_SRC_FILES := TestDll.cpp
 LOCAL_C_MODULE := avcodec avformat swscale avutil avfilter avdevice

 include $(BUILD_SHARED_LIBRARY)




This is my main .cpp:



#include 
#include 

#define __STDC_CONSTANT_MACROS

#include <string>

extern"C"
{
#include "libavformat/avformat.h"
#include "libavutil/mathematics.h"
#include "libavutil/time.h"
#include "libavutil/pixfmt.h"
#include "libavcodec/avcodec.h"
#include "libavformat/avformat.h"
#include "libswscale/swscale.h"
#include "libavutil/imgutils.h"
}

using namespace std;

char output[99999];
int frameCount = 0;
uint8_t* buf;
int bufLength;
uint8_t* p_global_bgr_buffer;
int num_bytes;
int wrap;
bool EndReceFrame = false;
SwsContext* swsContext;
AVFrame* pFrame = av_frame_alloc();
AVCodecContext* pCodecContext;
AVPixelFormat dstFmt = AV_PIX_FMT_RGB24;

static void BGRpgm_save(AVFrame* frame)
{
 int linesize[8] = { frame->linesize[0] * 3 };

 uint8_t* bgr_buffer[8] = { p_global_bgr_buffer };

 sws_scale(swsContext, frame->data, frame->linesize, 0, frame->height, bgr_buffer, linesize);

 memcpy(buf, bgr_buffer[0], linesize[0] * frame->height);

 bufLength = linesize[0] * frame->height;

 frameCount++;

}
...




Error Message



[x86] SharedLibrary : libTestDll.so
jni/TestDll.cpp:123: error: undefined reference to 'avformat_open_input'
jni/TestDll.cpp:127: error: undefined reference to 'avformat_find_stream_info'
jni/TestDll.cpp:138: error: undefined reference to 'av_dump_format'
jni/TestDll.cpp:144: error: undefined reference to 'avcodec_find_decoder'
jni/TestDll.cpp:146: error: undefined reference to 'avcodec_alloc_context3'