
Media (91)
-
Géodiversité
9 September 2011
Updated: August 2018
Language: French
Type: Text
-
USGS Real-time Earthquakes
8 September 2011
Updated: September 2011
Language: French
Type: Text
-
SWFUpload Process
6 September 2011
Updated: September 2011
Language: French
Type: Text
-
La conservation du net art au musée. Les stratégies à l’œuvre
26 May 2011
Updated: July 2013
Language: French
Type: Text
-
Podcasting Legal Guide
16 May 2011
Updated: May 2011
Language: English
Type: Text
-
Creative Commons informational flyer
16 May 2011
Updated: July 2013
Language: English
Type: Text
Other articles (40)
-
Websites made with MediaSPIP
2 May 2011
This page lists some websites based on MediaSPIP.
-
Creating farms of unique websites
13 April 2011
MediaSPIP platforms can be installed as a farm, with a single "core" hosted on a dedicated server and used by multiple websites.
This allows, among other things: implementation costs to be shared between several different projects or individuals; rapid deployment of multiple unique sites; and the creation of groups of like-minded sites, making it possible to browse media in a more controlled and selective environment than the major "open" (...)
-
Other interesting software
13 April 2011
We don't claim to be the only ones doing what we do, and certainly don't claim to be the best either. What we do, we simply try to do well, and to keep getting better.
The following list covers software that is more or less similar to MediaSPIP, or that does more or less what MediaSPIP tries to do, in one way or another.
We don't know them and we haven't tried them, but you can take a peek.
Videopress
Website: http://videopress.com/
License: GNU/GPL v2
Source code: (...)
On other sites (6300)
-
ffmpeg ndk multiple definition libavcodec.a(golomb.o)
4 December 2014, by WLGfx
I'm trying to link ffmpeg as static libraries with the Android NDK, but I'm getting 'multiple definition' errors, as shown below. I've also included my build script, which runs through everything just fine, but when I come to using the libraries in Eclipse with the ADT plugin I can't get anywhere.
From this it looks like it wants something to do with VLC. I don't want anything to do with VLC, just ffmpeg for video streaming. Everything works fine with shared libraries, but I'm after a very tiny player because I'm restricted on space on the device.
EDIT: Also 'log2_tab.o' has multiple definitions.
error: jni/libs\libavcodec.a(golomb.o): multiple definition of 'ff_golomb_vlc_len' Ffplayer C/C++ Problem
error: jni/libs\libavcodec.a(golomb.o): multiple definition of 'ff_interleaved_dirac_golomb_vlc_code' Ffplayer C/C++ Problem
error: jni/libs\libavcodec.a(golomb.o): multiple definition of 'ff_interleaved_golomb_vlc_len' Ffplayer C/C++ Problem
error: jni/libs\libavcodec.a(golomb.o): multiple definition of 'ff_interleaved_se_golomb_vlc_code' Ffplayer C/C++ Problem
error: jni/libs\libavcodec.a(golomb.o): multiple definition of 'ff_interleaved_ue_golomb_vlc_code' Ffplayer C/C++ Problem
error: jni/libs\libavcodec.a(golomb.o): multiple definition of 'ff_se_golomb_vlc_code' Ffplayer C/C++ Problem
error: jni/libs\libavcodec.a(golomb.o): multiple definition of 'ff_ue_golomb_len' Ffplayer C/C++ Problem
error: jni/libs\libavcodec.a(golomb.o): multiple definition of 'ff_ue_golomb_vlc_code' Ffplayer C/C++ Problem
error: jni/libs\libavcodec.a(log2_tab.o): multiple definition of 'ff_log2_tab' Ffplayer C/C++ Problem
error: jni/libs\libavformat.a(log2_tab.o): multiple definition of 'ff_log2_tab' Ffplayer C/C++ Problem
jni/libs\libavformat.a(golomb_tab.o): previous definition here Ffplayer C/C++ Problem
jni/libs\libavutil.a(log2_tab.o): previous definition here Ffplayer C/C++ Problem
make.exe: *** [obj/local/armeabi-v7a-hard/libffplayer.so] Error 1 Ffplayer C/C++ Problem
Using the latest branch of ffmpeg (2.4.3), this is my build script for Android (using toolchain 8, because it's old hardware I'm working with) and wanting the NEON hardware support:
export ANDROID_NDK=/home/carl/dev/ndk
export TOOLCHAIN=/home/carl/temp/ffmpeg
export SYSROOT=$TOOLCHAIN/sysroot/
$ANDROID_NDK/build/tools/make-standalone-toolchain.sh \
--platform=android-8 --install-dir=$TOOLCHAIN
export PATH=$TOOLCHAIN/bin:$PATH
export CC=arm-linux-androideabi-gcc
export LD=arm-linux-androideabi-ld
export AR=arm-linux-androideabi-ar
CFLAGS="-O3 -Wall -mthumb -pipe -fpic -fasm \
-finline-limit=300 -ffast-math \
-fstrict-aliasing -Werror=strict-aliasing \
-fmodulo-sched -fmodulo-sched-allow-regmoves \
-Werror=implicit-function-declaration \
-Wno-psabi -Wa,--noexecstack"
# -D__ARM_ARCH_5__ -D__ARM_ARCH_5E__ \
# -D__ARM_ARCH_5T__ -D__ARM_ARCH_5TE__ \
# -DANDROID -DNDEBUG"
EXTRA_CFLAGS="-march=armv7-a -mfpu=neon \
-mfloat-abi=softfp -mvectorize-with-neon-quad \
-DHAVE_ISNAN -DHAVE_ISINF
-std=c99"
EXTRA_LDFLAGS="-Wl,--fix-cortex-a8"
FFMPEG_FLAGS="--prefix=/home/dev/ffmpeg/build \
--target-os=linux \
--arch=arm \
--enable-cross-compile \
--cross-prefix=arm-linux-androideabi- \
--enable-shared \
--enable-static \
--enable-small \
--disable-symver \
--disable-doc \
--disable-ffplay \
--disable-ffmpeg \
--disable-ffprobe \
--disable-ffserver \
--disable-avdevice \
--disable-avfilter \
--disable-encoders \
--disable-muxers \
--disable-demuxers \
--disable-filters \
--disable-devices \
--disable-decoders \
--enable-decoder=mjpeg \
--enable-decoder=mp1 \
--enable-decoder=mp2 \
--enable-decoder=mp3 \
--enable-decoder=mpeg1_vdpau \
--enable-decoder=mpeg1video \
--enable-decoder=mpeg2video \
--enable-decoder=mpeg4 \
--enable-decoder=mpeg4_vdpau \
--enable-decoder=mpegvideo \
--enable-decoder=mpeg_xvmc \
--enable-decoder=h261 \
--enable-decoder=h263 \
--enable-decoder=h263i \
--enable-decoder=h263p \
--enable-hwaccel=h263_vaapi \
--enable-hwaccel=h263_vdpau \
--enable-hwaccel=mpeg1_vdpau \
--enable-hwaccel=mpeg1_xvmc \
--enable-hwaccel=mpeg2_dxva2 \
--enable-hwaccel=mpeg2_vaapi \
--enable-hwaccel=mpeg2_vdpau \
--enable-hwaccel=mpeg2_xvmc \
--enable-hwaccel=mpeg4_vaapi \
--enable-hwaccel=mpeg4_vdpau \
--enable-demuxer=aac \
--enable-demuxer=ac3 \
--enable-demuxer=h261 \
--enable-demuxer=h263 \
--enable-demuxer=pcm_s16be \
--enable-demuxer=pcm_s16le \
--enable-demuxer=pcm_s8 \
--enable-demuxer=mpegps \
--enable-demuxer=mpegts \
--enable-demuxer=mpegtsraw \
--enable-demuxer=mpegvideo \
--enable-demuxer=rtp \
--enable-demuxer=rtsp \
--enable-parser=aac \
--enable-parser=mpegvideo \
--enable-parser=ac3 \
--enable-parser=h261 \
--enable-parser=h263 \
--enable-parser=mjpeg \
--enable-parser=mpeg4video \
--enable-parser=mpegaudio \
--enable-protocol=rtp \
--enable-protocol=file \
--enable-protocol=ftp \
--enable-protocol=tcp \
--enable-protocol=http \
--enable-protocol=udp \
--enable-protocol=pipe \
--enable-protocol=unix \
--enable-network \
--disable-swscale \
--enable-asm \
--enable-memalign-hack \
--disable-golomb \
--enable-stripping \
--enable-pthreads \
--disable-symver \
--enable-version3"
./configure $FFMPEG_FLAGS --extra-cflags="$CFLAGS $EXTRA_CFLAGS" \
--extra-ldflags="$EXTRA_LDFLAGS"
make clean
echo "Project now cleaned"
make -j4
echo "Stripping multiple references from libraries"
arm-linux-androideabi-ar d libavcodec.a log2_tab.o
arm-linux-androideabi-ar d libavutil.a log2_tab.o
echo "Done..."And this is the Android.mk file which works fine.
LOCAL_PATH := $(call my-dir)
include $(CLEAR_VARS)
LOCAL_MODULE := avutil
LOCAL_SRC_FILES := libs\libavutil.a
include $(PREBUILT_STATIC_LIBRARY)
include $(CLEAR_VARS)
LOCAL_MODULE := avformat
LOCAL_SRC_FILES := libs\libavformat.a
include $(PREBUILT_STATIC_LIBRARY)
include $(CLEAR_VARS)
LOCAL_MODULE := avcodec
LOCAL_SRC_FILES := libs\libavcodec.a
include $(PREBUILT_STATIC_LIBRARY)
include $(CLEAR_VARS)
LOCAL_MODULE := ffplayer
LOCAL_SRC_FILES := ffplayer.cpp
LOCAL_C_INCLUDES := C:\DEV\ffmpeg\
LOCAL_LDLIBS += -llog -ljnigraphics -lGLESv2 -ldl
LOCAL_LDLIBS += -lstdc++ -lc
LOCAL_LDLIBS += -lz -lm
LOCAL_WHOLE_STATIC_LIBRARIES += libavutil libavformat libavcodec
include $(BUILD_SHARED_LIBRARY)
If anybody can spot what's wrong with this, it would be much appreciated.
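For what it's worth, here is a sketch of a possible workaround (an assumption, not part of the original question): the linker output above shows the same table objects present in more than one static archive, so the two ar d calls at the end of the build script can be extended until each object survives in only one archive. The archive and object names below are copied from the error messages and may differ between ffmpeg versions; the stripped copies also have to be the ones actually placed in jni/libs before the NDK build runs.

# Sketch: keep ff_log2_tab only in libavutil.a and the golomb tables only in
# libavcodec.a, so every symbol is defined exactly once across the archives.
arm-linux-androideabi-ar d libavcodec.a log2_tab.o
arm-linux-androideabi-ar d libavformat.a log2_tab.o golomb_tab.o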
-
C++/OpenCV - VideoCapture doesn't work but it worked before with exactly the same code
5 March 2015, by Damià Obrador
I'm trying to write a shot boundary detection algorithm in C++ using OpenCV. First of all, I have to say that I have no experience working with OpenCV.
I have been improving the following code over the last two weeks (and I am still at it), and everything seemed to work correctly; not in terms of perfect shot detection, but every line of code did what was expected of it.
#include <cstdlib>
#include <opencv2/opencv.hpp>
#include <opencv2/core/core.hpp>
#include <opencv2/video/background_segm.hpp>
#include <opencv2/highgui/highgui.hpp>
#include <iostream>
#include <opencv2/imgproc/imgproc.hpp>
#include <vector> // the header name was lost in the paste; vector is needed below
#include <istream>
#include <fstream>
using namespace cv;
int main(){
VideoCapture capture("vdevendetta.mp4");
if ( !capture.isOpened() )
{
std::cout << "Cannot open the video file";
return -1;
}
int numFrames=capture.get(CV_CAP_PROP_FRAME_COUNT);
bool SB_counter=false;
Mat currentFrame;
Mat frame;
Mat fore;
Mat back;
vector<Mat> frames;
FileStorage file;
FileStorage file2;
Mat prev_Y;
Mat channels[3];
for(int i=0;i<numFrames;i++) // loop header reconstructed; the original line was mangled when pasted
{
if(SB_counter==false) // condition reconstructed from the if/else structure below
{
capture>>currentFrame;
frame=currentFrame.clone();
cvtColor(frame,frame,CV_BGR2YUV);
split( frame, channels );
prev_Y=channels[0];
SB_counter=false;
}
else
{
capture>>currentFrame;
frame=currentFrame.clone();
Mat channels[3];
cvtColor(frame,frame,CV_BGR2YUV);
split( frame, channels );
Mat curr_Y=channels[0];
channels[0]=prev_Y;
prev_Y=curr_Y;
merge(channels,3,frame);
cvtColor(frame,frame,CV_Luv2BGR);
cvtColor(frame,frame,CV_BGR2GRAY);
frames.push_back(frame);
}
}
vector<double> MAFDs;
vector<double> MAFD;
vector<double> aux;
double min;
double min2;
// The body of this loop was mangled when the question was pasted; only the
// skeleton below is recoverable (a threshold of 14 and a "De" debug print).
for(int j=1;j<numFrames;j++)
{
// ... (the per-frame MAFD computation that belongs here was lost in the paste) ...
if(MAFD[j]>14) // left-hand side of this comparison was also lost; MAFD[j] is an assumption
{
std::cout<<"De"<<j<<std::endl;
}
}
/*
std::ofstream fout("MAFD.txt");
if(fout.is_open()==true)
{
//file opened successfully so we are here
std::cout << "File Opened successfully!!!. Writing data from array to file" << std::endl;
for(int i = 0; MAFD[i] != '\0'; i++)
{
fout << MAFD[i]; //writing ith character of array in the file
}
std::cout << "Array data successfully saved into the file test.txt" << std::endl;
}
else //file could not be opened
{
std::cout << "File could not be opened." << std::endl;
}*/
return 0;
}
Three days ago I had a strange problem. The line:
VideoCapture capture("vdevendetta.mp4");
stopped working. I spent many hours looking for a solution, but nothing seemed to fix it. After reading everything related to OpenCV, VideoCapture and ffmpeg that I could find on the internet, I decided to reinstall everything, taking care of each detail, but it still didn't work. Finally I solved it by changing the line to this one:
VideoCapture capture("/home/damia/Documentos/Universitat/ARA/PAEAV/workspace_cpp/SBD/src/vdevendetta.mp4");
I did not know why this solved it, because the video is in the same directory as the program, but I kept working since everything seemed correct.
Today I had the same problem again with the new line.
The terminal doesn't show any error message (except "Cannot open the video file", obviously), but the code doesn't work anymore. I'm using Ubuntu 14.04 LTS, OpenCV 2.4.9 and Eclipse.
I think that if someone tests the code it will run correctly, but I would like to know if there is something I am not taking into account.
Thank you very much.
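Not part of the original question, but a minimal diagnostic sketch for this kind of symptom (a relative path failing while the absolute path works): print the working directory the program is actually launched from and probe the file before handing it to VideoCapture. The file name is the one from the question; everything else is an assumption.

// Minimal diagnostic sketch (assumption, not from the original post).
// When a relative path fails but an absolute one works, the usual culprit is
// the working directory the IDE launches the program from.
#include <iostream>
#include <fstream>
#include <string>
#include <unistd.h>                      // getcwd (POSIX; the question targets Ubuntu)
#include <opencv2/highgui/highgui.hpp>   // VideoCapture in OpenCV 2.4.x

int main()
{
    char cwd[4096];
    if (getcwd(cwd, sizeof(cwd)) != NULL)
        std::cout << "Working directory: " << cwd << std::endl;

    const std::string path = "vdevendetta.mp4";   // file name taken from the question
    std::ifstream probe(path.c_str());
    if (!probe.good())
    {
        std::cout << "File not found relative to the working directory: " << path << std::endl;
        return -1;
    }

    cv::VideoCapture capture(path);
    if (!capture.isOpened())
    {
        std::cout << "File exists but VideoCapture cannot open it (codec/ffmpeg build issue?)" << std::endl;
        return -1;
    }
    std::cout << "Opened OK, " << capture.get(CV_CAP_PROP_FRAME_COUNT) << " frames" << std::endl;
    return 0;
}

If the printed directory is the Eclipse project root rather than the src/ folder containing the video, that would explain why only the absolute path works.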
-
Android FFMPEG - Low FPS & File Size is massive
2 May 2018, by Alexzander Flores
I am new to Android app development and I have been asked to make a video splitter app. I am trying to use FFMPEG, but the library size is massive and makes the .APK file 140MB. How can I solve this? Similar apps are around 15MB in size.
Also, the framerate starts at 30 FPS and drops to around 2.2 FPS over time when trying to split a 30-second-long video into two parts. How can I solve this? This is my code currently:
package splicer.com.splicer;
import android.Manifest;
import android.content.Context;
import android.content.Intent;
import android.content.pm.PackageManager;
import android.database.Cursor;
import android.media.MediaMetadataRetriever;
import android.media.MediaScannerConnection;
import android.net.Uri;
import android.provider.MediaStore;
import android.support.v4.app.ActivityCompat;
import android.support.v4.content.ContextCompat;
import android.support.v7.app.AppCompatActivity;
import android.os.Bundle;
import android.text.method.ScrollingMovementMethod;
import android.view.View;
import android.widget.Button;
import android.widget.TextView;
import com.github.hiteshsondhi88.libffmpeg.ExecuteBinaryResponseHandler;
import com.github.hiteshsondhi88.libffmpeg.FFmpeg;
import com.github.hiteshsondhi88.libffmpeg.LoadBinaryResponseHandler;
import com.github.hiteshsondhi88.libffmpeg.exceptions.FFmpegNotSupportedException;
public class MainActivity extends AppCompatActivity {
private Button button;
private TextView textView;
private FFmpeg ffmpeg;
static {
System.loadLibrary("native-lib");
}
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
ffmpeg = FFmpeg.getInstance(getApplicationContext());
try {
ffmpeg.loadBinary(new LoadBinaryResponseHandler() {
@Override
public void onStart() {}
@Override
public void onFailure() {}
@Override
public void onSuccess() {}
@Override
public void onFinish() {}
});
} catch(FFmpegNotSupportedException e) {
e.printStackTrace();
}
textView = (TextView) findViewById(R.id.textView);
textView.setY(200);
textView.setHeight(700);
textView.setMovementMethod(new ScrollingMovementMethod());
button = (Button) findViewById(R.id.button);
button.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View view) {
openGallery();
}
});
}
/**
* A native method that is implemented by the 'native-lib' native library,
* which is packaged with this application.
*/
public native String stringFromJNI();
public void openGallery() {
if(ContextCompat.checkSelfPermission(this, Manifest.permission.READ_EXTERNAL_STORAGE) != PackageManager.PERMISSION_GRANTED) {
ActivityCompat.requestPermissions(this, new String [] {Manifest.permission.READ_EXTERNAL_STORAGE}, 0);
}
if(ContextCompat.checkSelfPermission(this, Manifest.permission.WRITE_EXTERNAL_STORAGE) != PackageManager.PERMISSION_GRANTED) {
ActivityCompat.requestPermissions(this, new String [] {Manifest.permission.WRITE_EXTERNAL_STORAGE}, 0);
}
Intent gallery = new Intent(Intent.ACTION_PICK, MediaStore.Video.Media.EXTERNAL_CONTENT_URI);
startActivityForResult(gallery, 100);
}
public String getRealPathFromURI(Context context, Uri contentUri) {
Cursor cursor = null;
try {
String[] proj = { MediaStore.Images.Media.DATA };
cursor = context.getContentResolver().query(contentUri, proj, null, null, null);
int column_index = cursor.getColumnIndexOrThrow(MediaStore.Images.Media.DATA);
cursor.moveToFirst();
return cursor.getString(column_index);
} finally {
if (cursor != null) {
cursor.close();
}
}
}
@Override
protected void onActivityResult(int requestCode, int resultCode, final Intent intent) {
super.onActivityResult(requestCode, resultCode, intent);
if(resultCode == RESULT_OK && requestCode == 100) {
MediaMetadataRetriever retriever = new MediaMetadataRetriever();
try {
retriever.setDataSource(getBaseContext(), intent.getData());
String time = retriever.extractMetadata(MediaMetadataRetriever.METADATA_KEY_DURATION);
long splitCount = Long.valueOf(time) / 1000 / 15;
if(splitCount > 1) {
final String path = getRealPathFromURI(getBaseContext(), Uri.parse(intent.getData().toString()));
for(int a = 0, start = 0; a < splitCount; ++a, start += 15) {
// I am only testing with .mp4s atm, this will change before production
final String targetPath = path.replace(".mp4", "_" + (a + 1) + ".mp4");
ffmpeg.execute(new String [] {
"-r",
"1",
"-i",
path,
"-ss",
String.valueOf(start),
"-t",
String.valueOf(start + 15),
"-r",
"24",
targetPath
}, new ExecuteBinaryResponseHandler() {
@Override
public void onStart() {}
@Override
public void onProgress(String message) {
textView.setText("onProcess: " + message);
}
@Override
public void onFailure(String message) {
textView.setText("onFailure: " + message + " --- " + path);
}
@Override
public void onSuccess(String message) {
textView.setText("onSuccess:" + message);
MediaScannerConnection.scanFile(getBaseContext(),
new String [] { targetPath }, null,
new MediaScannerConnection.OnScanCompletedListener() {
public void onScanCompleted(String path, Uri uri) {}
});
}
@Override
public void onFinish() {}
});
}
}
} catch(Exception e) {
e.printStackTrace();
} finally {
retriever.release();
}
}
}
}
I don't believe everything here is as optimal as it could be, but I'm just trying to prove the concept at the moment. Any help in the right direction would be amazing, thank you!
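A sketch of a different way to build the ffmpeg argument list (an assumption, not from the original post). Putting -ss before -i makes ffmpeg seek in the input rather than decode from the start, -t takes a duration (so passing start + 15 makes each piece longer than the last), and -c copy copies the streams without re-encoding, which removes most of the processing time and the framerate drop; the trade-off is that cut points snap to keyframes. The -r 1 placed before -i in the original command is also worth dropping, since as an input option it forces the input to be treated as 1 frame per second. The helper below is hypothetical; inputPath, outputPath and startSeconds mirror the path, targetPath and start variables used in the question.

// Sketch only (assumption, not from the original post): arguments for a
// stream-copy split, intended to replace the array passed to ffmpeg.execute above.
private static String[] buildSplitArgs(String inputPath, String outputPath, int startSeconds) {
    return new String[] {
        "-ss", String.valueOf(startSeconds), // seek before -i: fast input seeking
        "-i", inputPath,
        "-t", "15",                          // -t is a duration, not an end timestamp
        "-c", "copy",                        // copy audio/video streams, no re-encode
        outputPath
    };
}

It would be called as ffmpeg.execute(buildSplitArgs(path, targetPath, start), handler) in place of the execute call in the loop above. The APK size is a separate issue; the usual approaches are shipping a trimmed ffmpeg build (only the demuxers, decoders and protocols actually needed, much like the configure flags in the first question above) and splitting the APK per ABI.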