
Other articles (77)
-
Supporting all media types
13 April 2011
Unlike most software and media-sharing platforms, MediaSPIP aims to manage as many different media types as possible. The following are just a few examples from an ever-expanding list of supported formats:
images: png, gif, jpg, bmp and more
audio: MP3, Ogg, Wav and more
video: AVI, MP4, OGV, mpg, mov, wmv and more
text, code and other data: OpenOffice, Microsoft Office (Word, PowerPoint, Excel), web (html, CSS), LaTeX, Google Earth and (...)
-
User profiles
12 April 2011
Each user has a profile page where they can edit their personal information. In the default top-of-page menu, a menu item is automatically created when MediaSPIP is initialized, visible only if the visitor is logged in to the site.
The user can edit their profile from their author page; a link in the navigation, "Modifier votre profil" (Edit your profile), is (...)
-
No talk of market, cloud, etc.
10 April 2011
The vocabulary used on this site tries to avoid any reference to the fads that flourish merrily
across web 2.0 and in the companies that live off them.
You are therefore invited to banish the use of the terms "Brand", "Cloud", "Market", etc.
Our motivation is above all to create a simple tool, accessible to everyone, that encourages
the sharing of creations on the Internet and lets authors keep as much autonomy as possible.
No "Gold or Premium contract" is therefore planned, no (...)
On other sites (9139)
-
Using and building the FFmpeg library in Android Studio (CMake)
8 September 2017, by y3k00000
As the title says, I'm trying to use the FFmpeg source code as a library in Android Studio on Ubuntu Linux, but I'm running into trouble at the compilation stage. I think I must be missing some compile option, but I have no idea what; can somebody lend a hand?
My code & settings are below.
A simple auto-generated .cpp:
#include <jni.h>   // the JNI header this file needs; its name was lost in the original post
#include <string>
#include "libavcodec/aacenc.h"
extern "C"
JNIEXPORT jstring JNICALL
Java_y3k_testffmpegnative_MainActivity_stringFromJNI(
JNIEnv *env,
jobject /* this */) {
std::string hello = "Hello from C++";
return env->NewStringUTF(hello.c_str());
}
AACEncContext * aacEncContext; // Just trying by adding this.
</string>I’ve tried these Gradle setting and it didn’t help :
android {
....
externalNativeBuild {
cmake {
arguments "-DANDROID_TOOLCHAIN=clang"
cFlags "-std=c99"
cppFlags "-frtti","-fexceptions"
}
}
}
....
CMakeLists.txt
(Android Studio generated)
....
include_directories( /home/y3k/ffmpeg )
Error message while compiling:
/home/y3k/ffmpeg/libavutil/float_dsp.h
Error:(164, 50) error: expected ')'
Information:(164, 30) to match this '('
Error:(164, 50) error: expected ')'
Information:(164, 30) note: to match this '('
/home/y3k/ffmpeg/libavutil/fixed_dsp.h
Error:(153, 44) error: expected ')'
Information:(153, 30) to match this '('
Error:(153, 44) error: expected ')'
Information:(153, 30) note: to match this '('
/home/y3k/ffmpeg/libavcodec/mpeg4audio.h
Error:(44, 8) error: unknown type name 'av_export'
Error:(44, 18) error: expected unqualified-id
Error:(44, 8) error: unknown type name 'av_export'
Error:(44, 18) error: expected unqualified-id
/home/y3k/ffmpeg/libavcodec/aac.h
Error:(294, 21) error: expected member name or ';' after declaration specifiers
Error:(294, 21) error: expected member name or ';' after declaration specifiers
FFmpeg config.h was generated by this script:
NDK=/home/y3k/Android/Sdk/ndk-bundle
SYSROOT=$NDK/platforms/android-23/arch-arm/
TOOLCHAIN=$NDK/toolchains/arm-linux-androideabi-4.9/prebuilt/linux-x86_64
function build_one
{
./configure \
--prefix=$PREFIX \
--disable-shared \
--enable-static \
--disable-doc \
--disable-ffmpeg \
--disable-ffplay \
--disable-ffprobe \
--disable-ffserver \
--disable-avdevice \
--disable-doc \
--disable-symver \
--cross-prefix=$TOOLCHAIN/bin/arm-linux-androideabi- \
--target-os=linux \
--arch=arm \
--enable-cross-compile \
--sysroot=$SYSROOT \
--extra-cflags="-Os -fpic $ADDI_CFLAGS" \
--extra-ldflags="$ADDI_LDFLAGS" \
$ADDITIONAL_CONFIGURE_FLAG
make clean
make
make install
}
CPU=arm
PREFIX=$(pwd)/android/$CPU
ADDI_CFLAGS="-marm"
build_one
By tracing the errors I found these lines:
In float_dsp.h and fixed_dsp.h:
void (*butterflies_float)(float *av_restrict v1, float *av_restrict v2, int len);
av_restrict is defined in config.h as:
#define av_restrict restrict
In aac.h:
....
struct AACContext {
    AVClass *class;
....
In mpeg4audio.h:
extern av_export const int avpriv_mpeg4audio_sample_rates[16];
So I’m guessing it’s because the compiler misrecognized the C code as C++, I tried adding these :
arguments "-DANDROID_TOOLCHAIN=clang"
cFlags "-std=c99"to my build.gradle but didn’t help. Having no idea where to move on. :(
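(For reference, a minimal sketch of what the JNI file could look like if only FFmpeg's public headers are included: the errors above all come from internal headers (aacenc.h pulls in aac.h, mpeg4audio.h and the *_dsp.h files), which declare a struct member named class and rely on restrict, and are not meant to be compiled as C++. The avcodec_configuration() call is only there to prove the headers and linkage work, and the sketch assumes the prebuilt static libraries are also linked in CMakeLists.txt; neither detail is from the original post.)

#include <jni.h>
#include <string>

// FFmpeg's headers are plain C, so wrap them for C++ linkage and stick to
// the public API; internal headers such as aacenc.h are not C++-safe.
extern "C" {
#include "libavcodec/avcodec.h"   // assumes /home/y3k/ffmpeg is on the include path
}

extern "C"
JNIEXPORT jstring JNICALL
Java_y3k_testffmpegnative_MainActivity_stringFromJNI(
        JNIEnv *env,
        jobject /* this */) {
    // return FFmpeg's configure line instead of a fixed string,
    // just to confirm the library is actually reachable
    std::string hello = avcodec_configuration();
    return env->NewStringUTF(hello.c_str());
}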
I’m using the latest stable version of Android Studio on Ubuntu desktop. Please don’t be hesitate to ask if any extra information is required.
Also I’ve been tried edit my native-lib.cpp into .c (Surely with the code contents changed), but wasn’t working either.
Appreciate for any help.
-
ffmpeg Exception: Working Directory: null Environment: null
13 December 2016, by Dylan
While creating an Android app, I am trying to crop a video using the ffmpeg library and store it in the app directory. I created the command:
String command = "ffmpeg -i /storage/emulated/0/DMC/diamondVideo.mp4 -vf crop=471:592:162:462 -c:a copy /storage/sdcard0/DMC/diamondVideoCropped.mp4";
String[] cmd = command.toString().split(" ");
After that I pass this command to the AsyncTask:
new CropVideoTask().execute(cmd);
And in my AsyncTask I execute this command:
private class CropVideoTask extends AsyncTask<String[], Void, Void> {   // type parameters restored; they were stripped by the post's formatting
protected Void doInBackground(String[]... cmd) {
FFmpeg ffmpeg = FFmpeg.getInstance(getContext());
try {
ffmpeg.execute(cmd[0], new ExecuteBinaryResponseHandler() {
@Override
public void onStart() {}
@Override
public void onProgress(String message) {}
@Override
public void onFailure(String message) {}
@Override
public void onSuccess(String message) {}
@Override
public void onFinish() {}
});
} catch (FFmpegCommandAlreadyRunningException e) {
// Handle if FFmpeg is already running
}
return null;
    }
}
After the code enters the onStart method, it throws an exception:
12-13 15:43:02.832 28941-32324/com.example.dmc E/FFmpeg: Exception while trying to run: [Ljava.lang.String;@42851b38
java.io.IOException: Error running exec(). Command: [/data/data/com.studioidan.dmc/files/ffmpeg, ffmpeg, -i, /storage/sdcard0/DMC/diamondVideo.mp4, -vf, crop=471:592:162:462, -c:a, copy, /storage/sdcard0/DMC/diamondVideoCropped.mp4] Working Directory: null Environment: null
at java.lang.ProcessManager.exec(ProcessManager.java:211)
at java.lang.Runtime.exec(Runtime.java:168)
at java.lang.Runtime.exec(Runtime.java:123)
at com.github.hiteshsondhi88.libffmpeg.ShellCommand.run(ShellCommand.java:10)
at com.github.hiteshsondhi88.libffmpeg.FFmpegExecuteAsyncTask.doInBackground(FFmpegExecuteAsyncTask.java:38)
at com.github.hiteshsondhi88.libffmpeg.FFmpegExecuteAsyncTask.doInBackground(FFmpegExecuteAsyncTask.java:10)
at android.os.AsyncTask$2.call(AsyncTask.java:287)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:305)
at java.util.concurrent.FutureTask.run(FutureTask.java:137)
at android.os.AsyncTask$SerialExecutor$1.run(AsyncTask.java:230)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1076)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:569)
at java.lang.Thread.run(Thread.java:856)
Caused by: java.io.IOException: No such file or directory
at java.lang.ProcessManager.exec(Native Method)
at java.lang.ProcessManager.exec(ProcessManager.java:209)
at java.lang.Runtime.exec(Runtime.java:168)
at java.lang.Runtime.exec(Runtime.java:123)
at com.github.hiteshsondhi88.libffmpeg.ShellCommand.run(ShellCommand.java:10)
at com.github.hiteshsondhi88.libffmpeg.FFmpegExecuteAsyncTask.doInBackground(FFmpegExecuteAsyncTask.java:38)
at com.github.hiteshsondhi88.libffmpeg.FFmpegExecuteAsyncTask.doInBackground(FFmpegExecuteAsyncTask.java:10)
at android.os.AsyncTask$2.call(AsyncTask.java:287)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:305)
at java.util.concurrent.FutureTask.run(FutureTask.java:137)
at android.os.AsyncTask$SerialExecutor$1.run(AsyncTask.java:230)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1076)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:569)
at java.lang.Thread.run(Thread.java:856)
Manifest.xml contains the permission.
Did I miss something?
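(A minimal sketch, assuming the underlying "No such file or directory" means the wrapper's bundled ffmpeg binary was never extracted: with the hiteshsondhi88 libffmpeg library, loadBinary() must complete successfully once before execute() is called, and the command array should not start with an "ffmpeg" token, since the library prepends the binary path itself, as the logged command shows. The handler bodies are placeholders, not code from the original post.)

FFmpeg ffmpeg = FFmpeg.getInstance(getContext());
try {
    // Extract the bundled ffmpeg binary before any execute() call.
    ffmpeg.loadBinary(new LoadBinaryResponseHandler() {
        @Override
        public void onFailure() {
            // the device's ABI is not supported by the bundled binary
        }

        @Override
        public void onSuccess() {
            // binary is in place; it is now safe to call execute()
        }
    });
} catch (FFmpegNotSupportedException e) {
    // handle unsupported architecture
}

// Note: no leading "ffmpeg" token; the library prepends the binary path itself.
String[] cmd = {
        "-i", "/storage/emulated/0/DMC/diamondVideo.mp4",
        "-vf", "crop=471:592:162:462",
        "-c:a", "copy",
        "/storage/sdcard0/DMC/diamondVideoCropped.mp4"
};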
-
How to send an ffmpeg AVPacket through WebRTC (using libdatachannel)
29 March 2022, by mike
I'm encoding a video frame with the ffmpeg libraries, generating an AVPacket with compressed data.

Thanks to some recent advice here on S/O, I am trying to send that frame over a network using the WebRTC library libdatachannel, specifically by adapting the example here:

https://github.com/paullouisageneau/libdatachannel/tree/master/examples/streamer


I am seeing problems inside h264rtppacketizer.cpp (part of the library, not the example) which are almost certainly to do with how I'm providing the sample data. (I don't think this is anything to do with libdatachannel specifically; it will be an issue with what I'm sending.)

The example code reads each encoded frame from a file, and populates a sample by setting the contents of the sample to the contents of the file:

sample = *reinterpret_cast<std::vector<std::byte> *>(&fileContents);


sample is just a std::vector<std::byte>.


I have naively copied the contents of the AVPacket->data pointer into the sample vector:

sample.resize(pkt->size);
memcpy(sample.data(), pkt->data, pkt->size * sizeof(std::byte)); 



but the packetizer is falling over when trying to get length values out of that data. Specifically, in the following code, the first iteration gets a length of 1, but the second, looking at index 5, gives 1119887324. This is way too big for my data, which is only 3526 bytes (the whole frame is a single colour, so it is likely to be small once encoded):


while (index < message->size()) {
    assert(index + 4 < message->size());
    auto lengthPtr = (uint32_t *)(message->data() + index);
    uint32_t length = ntohl(*lengthPtr);
    auto naluStartIndex = index + 4;
    auto naluEndIndex = naluStartIndex + length;
    assert(naluEndIndex <= message->size());

    auto begin = message->begin() + naluStartIndex;
    auto end = message->begin() + naluEndIndex;
    nalus->push_back(std::make_shared<NalUnit>(begin, end));
    index = naluEndIndex;
}


Here is a dump of uint32_t length = ntohl(*lengthPtr); for the first few elements of the message (*lengthPtr in parentheses):

[2022-03-29 15:12:01.182] [info] index 0: 1 (16777216)
[2022-03-29 15:12:01.183] [info] index 1: 359 (1728118784)
[2022-03-29 15:12:01.184] [info] index 2: 91970 (1114046720)
[2022-03-29 15:12:01.186] [info] index 3: 23544512 (3225577217)
[2022-03-29 15:12:01.186] [info] index 4: 1732427807 (532693607)
[2022-03-29 15:12:01.187] [info] index 5: 1119887324 (3693068354)
[2022-03-29 15:12:01.188] [info] index 6: 3223313413 (98312128)
[2022-03-29 15:12:01.188] [info] index 7: 534512896 (384031)
[2022-03-29 15:12:01.188] [info] index 8: 3691315291 (1526728156)
[2022-03-29 15:12:01.189] [info] index 9: 83909537 (2707095557)
[2022-03-29 15:12:01.189] [info] index 10: 6004992 (10574592)
[2022-03-29 15:12:01.190] [info] index 11: 1537277952 (41307)
[2022-03-29 15:12:01.190] [info] index 12: 2701131779 (50331809)
[2022-03-29 15:12:01.192] [info] index 13: 768 (196608)



(I know I should post a complete sample; I am working on it.)
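(Those numbers look a lot like the packetizer reading H.264 Annex B start codes as 4-byte big-endian lengths: 00 00 00 01 at the front of the packet would read as the length 1 seen at index 0. Below is a minimal sketch, assuming the encoder emits Annex B output, of repacking pkt->data into the length-prefixed layout the loop above expects; annexbToLengthPrefixed is a hypothetical helper, not part of libdatachannel or the example. Depending on the libdatachannel version, the packetizer may instead accept a setting for which NAL-unit separator to expect.)

// Hypothetical helper: rewrite Annex B start codes (00 00 01 / 00 00 00 01)
// into 4-byte big-endian NAL-unit length prefixes.
#include <cstddef>
#include <cstdint>
#include <vector>

static std::vector<std::byte> annexbToLengthPrefixed(const uint8_t *data, size_t size) {
    auto isStartCode = [&](size_t pos, size_t &scLen) {
        if (pos + 3 <= size && data[pos] == 0 && data[pos + 1] == 0 && data[pos + 2] == 1) { scLen = 3; return true; }
        if (pos + 4 <= size && data[pos] == 0 && data[pos + 1] == 0 && data[pos + 2] == 0 && data[pos + 3] == 1) { scLen = 4; return true; }
        return false;
    };

    std::vector<std::byte> out;
    size_t i = 0, scLen = 0;
    while (i < size && isStartCode(i, scLen)) {
        size_t naluStart = i + scLen;
        size_t next = naluStart, ignored = 0;
        while (next < size && !isStartCode(next, ignored))
            ++next;                                          // next start code or end of buffer
        uint32_t length = static_cast<uint32_t>(next - naluStart);
        out.push_back(std::byte((length >> 24) & 0xFF));     // big-endian length prefix
        out.push_back(std::byte((length >> 16) & 0xFF));
        out.push_back(std::byte((length >> 8) & 0xFF));
        out.push_back(std::byte(length & 0xFF));
        for (size_t j = naluStart; j < next; ++j)
            out.push_back(std::byte(data[j]));               // NAL unit payload unchanged
        i = next;
    }
    return out;
}

// usage (sketch): sample = annexbToLengthPrefixed(pkt->data, pkt->size);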


- I am fairly sure I am just missing something basic. E.g. am I supposed to do something with the AVPacket side_data? Does AVPacket have or miss some header info?
- If I just fwrite the pkt->data for a single frame to disk, I can read the codec information with ffprobe:






Input #0, h264, from 'encodedOut.h264':
Duration: N/A, bitrate: N/A
Stream #0:0: Video: h264 (Constrained Baseline), yuv420p(progressive), 1280x720, 30 tbr, 1200k tbn



- whereas the same for the example input files (again a single frame) gives the following:


[h264 @ 000001c88d1135c0] Format h264 detected only with low score of 1, misdetection possible!
[h264 @ 000001c88f337400] missing picture in access unit with size 85306
[extract_extradata @ 000001c88d11ee40] No start code is found.
sample-0.h264: Invalid data found when processing input



-