
Media (21)
-
1,000,000
27 September 2011
Updated: September 2011
Language: English
Type: Audio
-
Demon Seed
26 September 2011
Updated: September 2011
Language: English
Type: Audio
-
The Four of Us are Dying
26 September 2011
Updated: September 2011
Language: English
Type: Audio
-
Corona Radiata
26 September 2011
Updated: September 2011
Language: English
Type: Audio
-
Lights in the Sky
26 September 2011
Updated: September 2011
Language: English
Type: Audio
-
Head Down
26 September 2011
Updated: September 2011
Language: English
Type: Audio
Other articles (33)
-
Support audio et vidéo HTML5
10 April 2011
MediaSPIP uses the HTML5 video and audio tags to play multimedia documents, taking advantage of the latest W3C innovations supported by modern browsers.
For older browsers, the Flowplayer flash player is used.
The HTML5 player was created specifically for MediaSPIP: its appearance is fully customisable to match a chosen theme.
These technologies make it possible to deliver video and sound both to conventional computers (...) -
HTML5 audio and video support
13 April 2011
MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
The HTML5 player used has been created specifically for MediaSPIP and can easily be adapted to fit in with a specific theme.
For older browsers the Flowplayer flash fallback is used.
MediaSPIP allows for media playback on major mobile platforms with the above (...) -
De l’upload à la vidéo finale [version standalone]
31 January 2010
The path of an audio or video document through SPIPMotion is divided into three distinct stages.
Upload and retrieval of information about the source video
First, a SPIP article must be created and the "source" video document attached to it.
When this document is attached to the article, two extra actions are carried out on top of the normal behaviour: the technical information about the file's audio and video streams is retrieved, and a thumbnail is generated by extracting a (...)
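
As a rough illustration of the "retrieve the technical information of the audio and video streams" step described above, here is a minimal libavformat sketch. This is not SPIPMotion's actual code, just an assumed stand-alone probe; the input file name is a placeholder.

// Minimal probe sketch (not SPIPMotion code): open a media file with
// libavformat and print basic information about its audio and video streams.
extern "C" {
#include <libavformat/avformat.h>
}
#include <cstdio>

int main(int argc, char **argv)
{
    const char *filename = argc > 1 ? argv[1] : "source.mp4"; // placeholder

    AVFormatContext *fmt_ctx = nullptr;
    if (avformat_open_input(&fmt_ctx, filename, nullptr, nullptr) < 0)
        return 1;
    if (avformat_find_stream_info(fmt_ctx, nullptr) < 0) {
        avformat_close_input(&fmt_ctx);
        return 1;
    }

    for (unsigned i = 0; i < fmt_ctx->nb_streams; i++) {
        const AVCodecParameters *par = fmt_ctx->streams[i]->codecpar;
        if (par->codec_type == AVMEDIA_TYPE_VIDEO)
            std::printf("stream %u: video %dx%d\n", i, par->width, par->height);
        else if (par->codec_type == AVMEDIA_TYPE_AUDIO)
            std::printf("stream %u: audio %d Hz\n", i, par->sample_rate);
    }

    avformat_close_input(&fmt_ctx);
    return 0;
}
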
On other sites (7014)
-
How to encode HDR10+ (or Dolby Vision) videos to HDR video? (in FFmpeg)
18 January 2023, by HONGSM
I downloaded HDR10+ (or Dolby Vision) videos from YouTube.


Now I want to check the quality difference.

Although I don't have a good enough hardware display (monitor),
I want to convert the HDR10+ (Dolby Vision) video to an HDR video.
(I heard that Dolby Vision has better colors than plain HDR video.)

Can you share the command?
Thanks in advance.

I searched for ffmpeg commands and found some that convert HDR video to SDR video,
but I could not find an ffmpeg command that converts an HDR10+ video to an HDR video.
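
A sketch of one possible command (not from the original post): re-encode the HDR10+ source to plain HDR10 with libx265, keeping the BT.2020 primaries and PQ transfer signalling; the HDR10+ dynamic metadata is simply dropped. The mastering-display and max-cll values below are placeholders and should be taken from the source (for example via ffprobe):

ffmpeg -i input_hdr10plus.mkv \
  -c:v libx265 -preset slow -crf 18 -pix_fmt yuv420p10le \
  -color_primaries bt2020 -color_trc smpte2084 -colorspace bt2020nc \
  -x265-params "hdr10=1:colorprim=bt2020:transfer=smpte2084:colormatrix=bt2020nc:master-display=G(13250,34500)B(7500,3000)R(34000,16000)WP(15635,16450)L(10000000,1):max-cll=1000,400" \
  -c:a copy output_hdr10.mkv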


-
Android player, FFMPEG, JNI
14 September 2022, by Nikita Pavlenko
Can I use std::async with JNI, or should I go through std::thread or pthread_create instead? I'm writing an Android player with my C++ lib that uses std::async for demuxing, video and audio. I saw that every call to CallVoidMethod (of course with attach and detach of the thread) creates a new thread (the thread id seen in the Java methods keeps increasing). And after some time a big error occurs (see the per-thread JNIEnv sketch after the log below):

A/e.androidplaye: java_vm_ext.cc:594] JNI DETECTED ERROR IN APPLICATION: thread Thread[2,tid=3420,Native,Thread*=0xb400007bd4470c00,peer=0x12cc1510,"Thread-8687"] using JNIEnv* from thread Thread[3,tid=3419,Native,Thread*=0xb400007aadc2e400,peer=0x12d00000,"Thread-8688"]
 java_vm_ext.cc:594] in call to CallVoidMethodV
A/e.androidplaye: runtime.cc:675] Runtime aborting...
 runtime.cc:675] Dumping all threads without mutator lock held
 runtime.cc:675] All threads:
 runtime.cc:675] DALVIK THREADS (23):
 runtime.cc:675] "Thread-8687" prio=10 tid=2 Runnable
 runtime.cc:675] | group="" sCount=0 ucsCount=0 flags=0 obj=0x12d80000 self=0xb400007bd4470c00
 runtime.cc:675] | sysTid=3420 nice=-10 cgrp=default sched=0/0 handle=0x7aa4166cb0
 runtime.cc:675] | state=R schedstat=( 6040359820 215025421 9416 ) utm=498 stm=105 core=1 HZ=100
 runtime.cc:675] | stack=0x7aa406f000-0x7aa4071000 stackSize=991KB
 runtime.cc:675] | held mutexes= "abort lock" "mutator lock"(shared held)
 runtime.cc:675] native: #00 pc 00000000005376cc /apex/com.android.art/lib64/libart.so (art::DumpNativeStack(std::__1::basic_ostream >&, int, BacktraceMap*, char const*, art::ArtMethod*, void*, bool)+128) (BuildId: 56e704c544e6c624201be2ab4933e853)
 runtime.cc:675] native: #01 pc 00000000006f0e24 /apex/com.android.art/lib64/libart.so (art::Thread::DumpStack(std::__1::basic_ostream >&, bool, BacktraceMap*, bool) const+236) (BuildId: 56e704c544e6c624201be2ab4933e853)
 runtime.cc:675] native: #02 pc 00000000006fe6b0 /apex/com.android.art/lib64/libart.so (art::DumpCheckpoint::Run(art::Thread*)+208) (BuildId: 56e704c544e6c624201be2ab4933e853)
 runtime.cc:675] native: #03 pc 0000000000364174 /apex/com.android.art/lib64/libart.so (art::ThreadList::RunCheckpoint(art::Closure*, art::Closure*)+440) (BuildId: 56e704c544e6c624201be2ab4933e853)
 runtime.cc:675] native: #04 pc 00000000006fce50 /apex/com.android.art/lib64/libart.so (art::ThreadList::Dump(std::__1::basic_ostream >&, bool)+280) (BuildId: 56e704c544e6c624201be2ab4933e853)
 runtime.cc:675] native: #05 pc 00000000006d7ec4 /apex/com.android.art/lib64/libart.so (art::AbortState::Dump(std::__1::basic_ostream >&) const+212) (BuildId: 56e704c544e6c624201be2ab4933e853)
 runtime.cc:675] native: #06 pc 00000000006d2b24 /apex/com.android.art/lib64/libart.so (art::Runtime::Abort(char const*)+1016) (BuildId: 56e704c544e6c624201be2ab4933e853)
 runtime.cc:675] native: #07 pc 0000000000016ea8 /apex/com.android.art/lib64/libbase.so (android::base::SetAborter(std::__1::function<void>&&)::$_3::__invoke(char const*)+80) (BuildId: b77c57f68a484ed93d5a7eda59d83bf9)
 runtime.cc:675] native: #08 pc 0000000000016450 /apex/com.android.art/lib64/libbase.so (android::base::LogMessage::~LogMessage()+352) (BuildId: b77c57f68a484ed93d5a7eda59d83bf9)
 runtime.cc:675] native: #09 pc 0000000000445224 /apex/com.android.art/lib64/libart.so (art::JavaVMExt::JniAbort(char const*, char const*)+1612) (BuildId: 56e704c544e6c624201be2ab4933e853)
 runtime.cc:675] native: #10 pc 0000000000329258 /apex/com.android.art/lib64/libart.so (art::JavaVMExt::JniAbortV(char const*, char const*, std::__va_list)+108) (BuildId: 56e704c544e6c624201be2ab4933e853)
 runtime.cc:675] native: #11 pc 000000000048d59c /apex/com.android.art/lib64/libart.so (art::(anonymous namespace)::ScopedCheck::AbortF(char const*, ...) (.__uniq.99033978352804627313491551960229047428)+144) (BuildId: 56e704c544e6c624201be2ab4933e853)
 runtime.cc:675] native: #12 pc 0000000000453ad0 /apex/com.android.art/lib64/libart.so (art::(anonymous namespace)::ScopedCheck::Check(art::ScopedObjectAccess&, bool, char const*, art::(anonymous namespace)::JniValueType*) (.__uniq.99033978352804627313491551960229047428)+4648) (BuildId: 56e704c544e6c624201be2ab4933e853)
 runtime.cc:675] native: #13 pc 0000000000475e2c /apex/com.android.art/lib64/libart.so (art::(anonymous namespace)::CheckJNI::CheckCallArgs(art::ScopedObjectAccess&, art::(anonymous namespace)::ScopedCheck&, _JNIEnv*, _jobject*, _jclass*, _jmethodID*, art::InvokeType, art::(anonymous namespace)::VarArgs const*) (.__uniq.99033978352804627313491551960229047428)+76) (BuildId: 56e704c544e6c624201be2ab4933e853)
A/e.androidplaye: runtime.cc:675] native: #14 pc 0000000000474d4c /apex/com.android.art/lib64/libart.so (art::(anonymous namespace)::CheckJNI::CallMethodV(char const*, _JNIEnv*, _jobject*, _jclass*, _jmethodID*, std::__va_list, art::Primitive::Type, art::InvokeType) (.__uniq.99033978352804627313491551960229047428)+268) (BuildId: 56e704c544e6c624201be2ab4933e853)
 runtime.cc:675] native: #15 pc 00000000005ca9ec /apex/com.android.art/lib64/libart.so (art::(anonymous namespace)::CheckJNI::CallVoidMethodV(_JNIEnv*, _jobject*, _jmethodID*, std::__va_list) (.__uniq.99033978352804627313491551960229047428.llvm.5591279935177935698)+72) (BuildId: 56e704c544e6c624201be2ab4933e853)
 runtime.cc:675] native: #16 pc 0000000000024ad4 /data/app/~~kWvaZj7JOt9YyFawweYvtw==/com.example.androidplayer-_Vqw2zfb57LpijMAPW6U4w==/base.apk (offset 174000) (_JNIEnv::CallVoidMethod(_jobject*, _jmethodID*, ...)+172) (BuildId: 2f8cba1fdcec13c03a15bf3de18a4ddd1c91325d)
 runtime.cc:675] native: #17 pc 000000000002173c /data/app/~~kWvaZj7JOt9YyFawweYvtw==/com.example.androidplayer-_Vqw2zfb57LpijMAPW6U4w==/base.apk (offset 174000) (ffmpeg::AndroidPlayer::handleAudioFrame(ffmpeg::Frame const&)+184) (BuildId: 2f8cba1fdcec13c03a15bf3de18a4ddd1c91325d)
 runtime.cc:675] native: #18 pc 0000000000028488 /data/app/~~kWvaZj7JOt9YyFawweYvtw==/com.example.androidplayer-_Vqw2zfb57LpijMAPW6U4w==/base.apk (offset 174000) (???) (BuildId: 2f8cba1fdcec13c03a15bf3de18a4ddd1c91325d)
 runtime.cc:675] native: #19 pc 00000000000283c0 /data/app/~~kWvaZj7JOt9YyFawweYvtw==/com.example.androidplayer-_Vqw2zfb57LpijMAPW6U4w==/base.apk (offset 174000) (???) (BuildId: 2f8cba1fdcec13c03a15bf3de18a4ddd1c91325d)
 runtime.cc:675] native: #20 pc 000000000002835c /data/app/~~kWvaZj7JOt9YyFawweYvtw==/com.example.androidplayer-_Vqw2zfb57LpijMAPW6U4w==/base.apk (offset 174000) (std::__ndk1::__bind_return<void> >, std::__ndk1::tuple, __is_valid_bind_return<void> >, std::__ndk1::tuple >::value>::type std::__ndk1::__bind<void> const&>::operator()(ffmpeg::Frame const&)+76) (BuildId: 2f8cba1fdcec13c03a15bf3de18a4ddd1c91325d)
 runtime.cc:675] native: #21 pc 00000000000282ec /data/app/~~kWvaZj7JOt9YyFawweYvtw==/com.example.androidplayer-_Vqw2zfb57LpijMAPW6U4w==/base.apk (offset 174000) (???) (BuildId: 2f8cba1fdcec13c03a15bf3de18a4ddd1c91325d)
 runtime.cc:675] native: #22 pc 0000000000028288 /data/app/~~kWvaZj7JOt9YyFawweYvtw==/com.example.androidplayer-_Vqw2zfb57LpijMAPW6U4w==/base.apk (offset 174000) (void std::__ndk1::__invoke_void_return_wrapper<void>::__call const&>&, ffmpeg::Frame const&>(std::__ndk1::__bind<void> const&>&, ffmpeg::Frame const&)+48) (BuildId: 2f8cba1fdcec13c03a15bf3de18a4ddd1c91325d)
 runtime.cc:675] native: #23 pc 0000000000028234 /data/app/~~kWvaZj7JOt9YyFawweYvtw==/com.example.androidplayer-_Vqw2zfb57LpijMAPW6U4w==/base.apk (offset 174000) (???) (BuildId: 2f8cba1fdcec13c03a15bf3de18a4ddd1c91325d)
A/e.androidplaye: runtime.cc:675] native: #24 pc 000000000002725c /data/app/~~kWvaZj7JOt9YyFawweYvtw==/com.example.androidplayer-_Vqw2zfb57LpijMAPW6U4w==/base.apk (offset 174000) (std::__ndk1::__function::__func const&>, std::__ndk1::allocator const&> >, void (ffmpeg::Frame const&)>::operator()(ffmpeg::Frame const&)+48) (BuildId: 2f8cba1fdcec13c03a15bf3de18a4ddd1c91325d)
 runtime.cc:675] native: #25 pc 000000000004039c /data/app/~~kWvaZj7JOt9YyFawweYvtw==/com.example.androidplayer-_Vqw2zfb57LpijMAPW6U4w==/base.apk (offset 380000) (???) (BuildId: 9a9d303cb440eed7364800d0ead0962a65175480)
 runtime.cc:675] native: #26 pc 000000000003b1f8 /data/app/~~kWvaZj7JOt9YyFawweYvtw==/com.example.androidplayer-_Vqw2zfb57LpijMAPW6U4w==/base.apk (offset 380000) (std::__ndk1::function<void>::operator()(ffmpeg::Frame const&) const+44) (BuildId: 9a9d303cb440eed7364800d0ead0962a65175480)
 runtime.cc:675] native: #27 pc 0000000000039dd4 /data/app/~~kWvaZj7JOt9YyFawweYvtw==/com.example.androidplayer-_Vqw2zfb57LpijMAPW6U4w==/base.apk (offset 380000) (ffmpeg::Player::playAudio()+164) (BuildId: 9a9d303cb440eed7364800d0ead0962a65175480)
 runtime.cc:675] native: #28 pc 000000000003db90 /data/app/~~kWvaZj7JOt9YyFawweYvtw==/com.example.androidplayer-_Vqw2zfb57LpijMAPW6U4w==/base.apk (offset 380000) (???) (BuildId: 9a9d303cb440eed7364800d0ead0962a65175480)
 runtime.cc:675] native: #29 pc 000000000003db1c /data/app/~~kWvaZj7JOt9YyFawweYvtw==/com.example.androidplayer-_Vqw2zfb57LpijMAPW6U4w==/base.apk (offset 380000) (???) (BuildId: 9a9d303cb440eed7364800d0ead0962a65175480)
 runtime.cc:675] native: #30 pc 000000000003dacc /data/app/~~kWvaZj7JOt9YyFawweYvtw==/com.example.androidplayer-_Vqw2zfb57LpijMAPW6U4w==/base.apk (offset 380000) (???) (BuildId: 9a9d303cb440eed7364800d0ead0962a65175480)
 runtime.cc:675] native: #31 pc 000000000003da10 /data/app/~~kWvaZj7JOt9YyFawweYvtw==/com.example.androidplayer-_Vqw2zfb57LpijMAPW6U4w==/base.apk (offset 380000) (???) (BuildId: 9a9d303cb440eed7364800d0ead0962a65175480)
 runtime.cc:675] native: #32 pc 000000000003e6cc /data/app/~~kWvaZj7JOt9YyFawweYvtw==/com.example.androidplayer-_Vqw2zfb57LpijMAPW6U4w==/base.apk (offset 380000) (???) (BuildId: 9a9d303cb440eed7364800d0ead0962a65175480)
 runtime.cc:675] native: #33 pc 000000000003e5e0 /data/app/~~kWvaZj7JOt9YyFawweYvtw==/com.example.androidplayer-_Vqw2zfb57LpijMAPW6U4w==/base.apk (offset 380000) (???) (BuildId: 9a9d303cb440eed7364800d0ead0962a65175480)
 runtime.cc:675] native: #34 pc 000000000003df50 /data/app/~~kWvaZj7JOt9YyFawweYvtw==/com.example.androidplayer-_Vqw2zfb57LpijMAPW6U4w==/base.apk (offset 380000) (???) (BuildId: 9a9d303cb440eed7364800d0ead0962a65175480)
 runtime.cc:675] native: #35 pc 00000000000efb14 /apex/com.android.runtime/lib64/bionic/libc.so (__pthread_start(void*)+264) (BuildId: 94065bf91428f6ae9fb310c478171302)
 runtime.cc:675] native: #36 pc 000000000008c35c /apex/com.android.runtime/lib64/bionic/libc.so (__start_thread+68) (BuildId: 94065bf91428f6ae9fb310c478171302)
 runtime.cc:675] (no managed stack frames)
 runtime.cc:675] 
 runtime.cc:675] "main" prio=10 tid=1 Native
 runtime.cc:675] | group="" sCount=1 ucsCount=0 flags=1 obj=0x71915d18 self=0xb400007bd4472800
 runtime.cc:675] | sysTid=3241 nice=-10 cgrp=default sched=0/0 handle=0x7bd5ac64f8
 runtime.cc:675] | state=S schedstat=( 20358355629 2398850818 13099 ) utm=1560 stm=474 core=5 HZ=100
 runtime.cc:675] | stack=0x7ff2616000-0x7ff2618000 stackSize=8188KB
 runtime.cc:675] | held mutexes=
 runtime.cc:675] native: #00 pc 0000000000086f90 /apex/com.android.runtime/lib64/bionic/libc.so (syscall+32) (BuildId: 94065bf91428f6ae9fb310c478171302)
A/e.androidplaye: runtime.cc:675] native: #01 pc 000000000047cc80 /apex/com.android.art/lib64/libart.so (art::ConditionVariable::WaitHoldingLocks(art::Thread*)+140) (BuildId: 56e704c544e6c624201be2ab4933e853)
 runtime.cc:675] native: #02 pc 00000000005c8cac /apex/com.android.art/lib64/libart.so (art::(anonymous namespace)::CheckJNI::FindClass(_JNIEnv*, char const*) (.__uniq.99033978352804627313491551960229047428.llvm.5591279935177935698)+1096) (BuildId: 56e704c544e6c624201be2ab4933e853)
 runtime.cc:675] native: #03 pc 000000000011aae0 /system/lib64/libandroid_runtime.so (android::NativeDisplayEventReceiver::dispatchVsync(long, android::PhysicalDisplayId, unsigned int, android::VsyncEventData)+80) (BuildId: f73b7e7fab15dc5b96680181e111fb8f)
 runtime.cc:675] native: #04 pc 00000000000af3f0 /system/lib64/libgui.so (android::DisplayEventDispatcher::handleEvent(int, int, void*)+204) (BuildId: 3f01a4a30b5fa3a4804b8361ddccc4a3)
 runtime.cc:675] native: #05 pc 0000000000018184 /system/lib64/libutils.so (android::Looper::pollInner(int)+916) (BuildId: 16796d84bdcf185b2112267dbd820c19)
 runtime.cc:675] native: #06 pc 0000000000017d84 /system/lib64/libutils.so (android::Looper::pollOnce(int, int*, int*, void**)+116) (BuildId: 16796d84bdcf185b2112267dbd820c19)
 runtime.cc:675] native: #07 pc 0000000000154668 /system/lib64/libandroid_runtime.so (android::android_os_MessageQueue_nativePollOnce(_JNIEnv*, _jobject*, long, int)+48) (BuildId: f73b7e7fab15dc5b96680181e111fb8f)
 runtime.cc:675] at android.os.MessageQueue.nativePollOnce(Native method)
 runtime.cc:675] at android.os.MessageQueue.next(MessageQueue.java:337)
 runtime.cc:675] at android.os.Looper.loopOnce(Looper.java:168)
 runtime.cc:675] at android.os.Looper.loop(Looper.java:299)
 runtime.cc:675] at android.app.ActivityThread.main(ActivityThread.java:8250)
 runtime.cc:675] at java.lang.reflect.Method.invoke(Native method)
 runtime.cc:675] at com.android.internal.os.RuntimeInit$MethodAndArgsCaller.run(RuntimeInit.java:556)
 runtime.cc:675] at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:1045)
 runtime.cc:675] 
 runtime.cc:675] "Signal Catcher" prio=10 tid=4 WaitingInMainSignalCatcherLoop
 runtime.cc:675] | group="" sCount=1 ucsCount=0 flags=1 obj=0x13340270 self=0xb400007b1f611800
 runtime.cc:675] | sysTid=3247 nice=-20 cgrp=default sched=0/0 handle=0x7b209aecb0
 runtime.cc:675] | state=S schedstat=( 300521 0 2 ) utm=0 stm=0 core=6 HZ=100
 runtime.cc:675] | stack=0x7b208b7000-0x7b208b9000 stackSize=991KB
 runtime.cc:675] | held mutexes=
 runtime.cc:675] native: #00 pc 00000000000db77c /apex/com.android.runtime/lib64/bionic/libc.so (__rt_sigtimedwait+12) (BuildId: 94065bf91428f6ae9fb310c478171302)
 runtime.cc:675] native: #01 pc 000000000009a1b8 /apex/com.android.runtime/lib64/bionic/libc.so (sigwait64+92) (BuildId: 94065bf91428f6ae9fb310c478171302)
 runtime.cc:675] native: #02 pc 000000000057420c /apex/com.android.art/lib64/libart.so (art::SignalCatcher::WaitForSignal(art::Thread*, art::SignalSet&)+108) (BuildId: 56e704c544e6c624201be2ab4933e853)
 runtime.cc:675] native: #03 pc 0000000000573e1c /apex/com.android.art/lib64/libart.so (art::SignalCatcher::Run(void*)+228) (BuildId: 56e704c544e6c624201be2ab4933e853)
 runtime.cc:675] native: #04 pc 00000000000efb14 /apex/com.android.runtime/lib64/bionic/libc.so (__pthread_start(void*)+264) (BuildId: 94065bf91428f6ae9fb310c478171302)
 runtime.cc:675] native: #05 pc 000000000008c35c /apex/com.android.runtime/lib64/bionic/libc.so (__start_thread+68) (BuildId: 94065bf91428f6ae9fb310c478171302)
 runtime.cc:675] (no managed stack frames)
 runtime.cc:675] 
 runtime.cc:675] "perfetto_hprof_listener" prio=10 tid=7 Native (still starting up)
 runtime.cc:675] | group="" sCount=1 ucsCount=0 flags=1 obj=0x0 self=0xb400007b138a8800
 runtime.cc:675] | sysTid=3248 nice=-20 cgrp=default sched=0/0 handle=0x7b208b0cb0
 runtime.cc:675] | state=S schedstat=( 175052 0 3 ) utm=0 stm=0 core=6 HZ=100
A/e.androidplaye: runtime.cc:675] | stack=0x7b207b9000-0x7b207bb000 stackSize=991KB
 runtime.cc:675] | held mutexes=
 runtime.cc:675] native: #00 pc 00000000000daad8 /apex/com.android.runtime/lib64/bionic/libc.so (read+8) (BuildId: 94065bf91428f6ae9fb310c478171302)
 runtime.cc:675] native: #01 pc 000000000001d840 /apex/com.android.art/lib64/libperfetto_hprof.so (void* std::__1::__thread_proxy >, ArtPlugin_Initialize::$_34> >(void*)+260) (BuildId: d60314cb99f035c98d20ab286d52fc6d)
 runtime.cc:675] native: #02 pc 00000000000efb14 /apex/com.android.runtime/lib64/bionic/libc.so (__pthread_start(void*)+264) (BuildId: 94065bf91428f6ae9fb310c478171302)
 runtime.cc:675] native: #03 pc 000000000008c35c /apex/com.android.runtime/lib64/bionic/libc.so (__start_thread+68) (BuildId: 94065bf91428f6ae9fb310c478171302)
 runtime.cc:675] (no managed stack frames)
 runtime.cc:675] 
 runtime.cc:675] "ADB-JDWP Connection Control Thread" prio=10 tid=8 WaitingInMainDebuggerLoop
 runtime.cc:675] | group="" sCount=1 ucsCount=0 flags=1 obj=0x133402e8 self=0xb400007b1f64d800
 runtime.cc:675] | sysTid=3249 nice=-20 cgrp=default sched=0/0 handle=0x7b207b2cb0
 runtime.cc:675] | state=S schedstat=( 935055 0 13 ) utm=0 stm=0 core=4 HZ=100
 runtime.cc:675] | stack=0x7b206bb000-0x7b206bd000 stackSize=991KB
 runtime.cc:675] | held mutexes=
 runtime.cc:675] native: #00 pc 00000000000dbe3c /apex/com.android.runtime/lib64/bionic/libc.so (__ppoll+12) (BuildId: 94065bf91428f6ae9fb310c478171302)
 runtime.cc:675] native: #01 pc 00000000000973b4 /apex/com.android.runtime/lib64/bionic/libc.so (poll+96) (BuildId: 94065bf91428f6ae9fb310c478171302)
 runtime.cc:675] native: #02 pc 00000000000099e4 /apex/com.android.art/lib64/libadbconnection.so (adbconnection::AdbConnectionState::RunPollLoop(art::Thread*)+724) (BuildId: c4c03b03ed50414e876ae23cd04eae7e)
 runtime.cc:675] native: #03 pc 00000000000080ac /apex/com.android.art/lib64/libadbconnection.so (adbconnection::CallbackFunction(void*)+1320) (BuildId: c4c03b03ed50414e876ae23cd04eae7e)
 runtime.cc:675] native: #04 pc 00000000000efb14 /apex/com.android.runtime/lib64/bionic/libc.so (__pthread_start(void*)+264) (BuildId: 94065bf91428f6ae9fb310c478171302)
 runtime.cc:675] native: #05 pc 000000000008c35c /apex/com.android.runtime/lib64/bionic/libc.so (__start_thread+68) (BuildId: 94065bf91428f6ae9fb310c478171302)
 runtime.cc:675] (no managed stack frames)
 runtime.cc:675] 
 runtime.cc:675] "Jit thread pool worker thread 0" prio=5 tid=9 Native
 runtime.cc:675] | group="" sCount=1 ucsCount=0 flags=1 obj=0x13340360 self=0xb400007b138c4000
 runtime.cc:675] | sysTid=3250 nice=9 cgrp=default sched=0/0 handle=0x7b206b4cb0
 runtime.cc:675] | state=S schedstat=( 1384418878 710723029 1368 ) utm=118 stm=20 core=1 HZ=100
 runtime.cc:675] | stack=0x7b205b5000-0x7b205b7000 stackSize=1023KB
 runtime.cc:675] | held mutexes=
 runtime.cc:675] native: #00 pc 0000000000086f90 /apex/com.android.runtime/lib64/bionic/libc.so (syscall+32) (BuildId: 94065bf91428f6ae9fb310c478171302)
 runtime.cc:675] native: #01 pc 000000000047cc80 /apex/com.android.art/lib64/libart.so (art::ConditionVariable::WaitHoldingLocks(art::Thread*)+140) (BuildId: 56e704c544e6c624201be2ab4933e853)
 runtime.cc:675] native: #02 pc 000000000047cb18 /apex/com.android.art/lib64/libart.so (art::ThreadPool::GetTask(art::Thread*)+120) (BuildId: 56e704c544e6c624201be2ab4933e853)
 runtime.cc:675] native: #03 pc 0000000000619810 /apex/com.android.art/lib64/libart.so (art::ThreadPoolWorker::Run()+136) (BuildId: 56e704c544e6c624201be2ab4933e853)
 runtime.cc:675] native: #04 pc 00000000006196f0 /apex/com.android.art/lib64/libart.so (art::ThreadPoolWorker::Callback(void*)+160) (BuildId: 56e704c544e6c624201be2ab4933e853)
A/e.androidplaye: runtime.cc:675] native: #05 pc 00000000000efb14 /apex/com.android.runtime/lib64/bionic/libc.so (__pthread_start(void*)+264) (BuildId: 94065bf91428f6ae9fb310c478171302)
 runtime.cc:675] native: #06 pc 000000000008c35c /apex/com.android.runtime/lib64/bionic/libc.so (__start_thread+68) (BuildId: 94065bf91428f6ae9fb310c478171302)
 runtime.cc:675] (no managed stack frames)
 runtime.cc:675] 
 runtime.cc:675] "HeapTaskDaemon" prio=5 tid=10 Native
 runtime.cc:675] | group="" sCount=1 ucsCount=0 flags=1 obj=0x13340d60 self=0xb400007b138d5800
 runtime.cc:675] | sysTid=3256 nice=4 cgrp=default sched=0/0 handle=0x7b205aecb0
 runtime.cc:675] | state=S schedstat=( 12936736517 2503386925 3722 ) utm=1074 stm=219 core=4 HZ=100
 runtime.cc:675] | stack=0x7b204ab000-0x7b204ad000 stackSize=1039KB
 runtime.cc:675] | held mutexes=
 runtime.cc:675] native: #00 pc 0000000000086f90 /apex/com.android.runtime/lib64/bionic/libc.so (syscall+32) (BuildId: 94065bf91428f6ae9fb310c478171302)
 runtime.cc:675] native: #01 pc 000000000047cc80 /apex/com.android.art/lib64/libart.so (art::ConditionVariable::WaitHoldingLocks(art::Thread*)+140) (BuildId: 56e704c544e6c624201be2ab4933e853)
 runtime.cc:675] native: #02 pc 000000000056bbb8 /apex/com.android.art/lib64/libart.so (art::gc::Heap::TrimIndirectReferenceTables(art::Thread*)+1364) (BuildId: 56e704c544e6c624201be2ab4933e853)
 runtime.cc:675] native: #03 pc 000000000056b468 /apex/com.android.art/lib64/libart.so (art::gc::Heap::Trim(art::Thread*)+68) (BuildId: 56e704c544e6c624201be2ab4933e853)
 runtime.cc:675] native: #04 pc 000000000056b3cc /apex/com.android.art/lib64/libart.so (art::gc::Heap::HeapTrimTask::Run(art::Thread*)+32) (BuildId: 56e704c544e6c624201be2ab4933e853)
 runtime.cc:675] native: #05 pc 000000000046ce28 /apex/com.android.art/lib64/libart.so (art::gc::TaskProcessor::RunAllTasks(art::Thread*)+56) (BuildId: 56e704c544e6c624201be2ab4933e853)
 runtime.cc:675] at dalvik.system.VMRuntime.runHeapTasks(Native method)
 runtime.cc:675] at java.lang.Daemons$HeapTaskDaemon.runInternal(Daemons.java:609)
 runtime.cc:675] at java.lang.Daemons$Daemon.run(Daemons.java:140)
 runtime.cc:675] at java.lang.Thread.run(Thread.java:1012)
 runtime.cc:675] 
 runtime.cc:675] "ReferenceQueueDaemon" prio=5 tid=11 Waiting
 runtime.cc:675] | group="" sCount=1 ucsCount=0 flags=1 obj=0x133403d8 self=0xb400007b138d7400
 runtime.cc:675] | sysTid=3259 nice=4 cgrp=default sched=0/0 handle=0x7b204a4cb0
 runtime.cc:675] | state=S schedstat=( 2230226662 207087138 1228 ) utm=26 stm=196 core=4 HZ=100
 runtime.cc:675] | stack=0x7b203a1000-0x7b203a3000 stackSize=1039KB
 runtime.cc:675] | held mutexes=
 runtime.cc:675] native: #00 pc 0000000000086f90 /apex/com.android.runtime/lib64/bionic/libc.so (syscall+32) (BuildId: 94065bf91428f6ae9fb310c478171302)
 runtime.cc:675] native: #01 pc 000000000047cc80 /apex/com.android.art/lib64/libart.so (art::ConditionVariable::WaitHoldingLocks(art::Thread*)+140) (BuildId: 56e704c544e6c624201be2ab4933e853)
 runtime.cc:675] native: #02 pc 0000000000486588 /apex/com.android.art/lib64/libart.so (art::Monitor::Wait(art::Thread*, art::ObjPtr, long, int, bool, art::ThreadState)+2520) (BuildId: 56e704c544e6c624201be2ab4933e853)
 runtime.cc:675] at java.lang.Object.wait(Native method)
 runtime.cc:675] - waiting on <0x0483c441> (a java.lang.Class)
 runtime.cc:675] at java.lang.Object.wait(Object.java:442)
 runtime.cc:675] at java.lang.Object.wait(Object.java:568)
 runtime.cc:675] at java.lang.Daemons$ReferenceQueueDaemon.runInternal(Daemons.java:232)
 runtime.cc:675] - locked <0x0483c441> (a java.lang.Class)
 runtime.cc:675] at java.lang.Daemons$Daemon.run(Daemons.java:140)
 runtime.cc:675] at java.lang.Thread.run(Thread.java:1012)
 runtime.cc:675] 
...(I left this part out because it is so big)
W/e.androidplaye: Suspending all threads took: 10.935ms
A/e.androidplaye: runtime.cc:683] JNI DETECTED ERROR IN APPLICATION: thread Thread[2,tid=3420,Native,Thread*=0xb400007bd4470c00,peer=0x12cc1510,"Thread-8687"] using JNIEnv* from thread Thread[3,tid=3419,Native,Thread*=0xb400007aadc2e400,peer=0x12d00000,"Thread-8688"]
 runtime.cc:683] in call to CallVoidMethodV
A/libc: Fatal signal 6 (SIGABRT), code -1 (SI_QUEUE) in tid 3420 (Thread-8687), pid 3241 (e.androidplayer)
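
The log above complains that a JNIEnv* obtained on one thread is being used from another. A JNIEnv* is only valid on the thread it was obtained for, so each worker (whether it comes from std::async, std::thread or pthread_create) has to fetch its own env from the shared JavaVM, attaching itself if necessary. A minimal sketch of that pattern follows; the names around the example call are hypothetical, not taken from the question's code.

// Sketch of the usual per-thread JNIEnv pattern (hypothetical helper names).
#include <jni.h>

static JavaVM *g_vm = nullptr;

extern "C" JNIEXPORT jint JNICALL JNI_OnLoad(JavaVM *vm, void *)
{
    g_vm = vm;               // the JavaVM may be shared across threads
    return JNI_VERSION_1_6;
}

// Return an env valid on the *calling* thread, attaching it if necessary.
static JNIEnv *env_for_this_thread(bool *attached_here)
{
    JNIEnv *env = nullptr;
    *attached_here = false;
    if (g_vm->GetEnv(reinterpret_cast<void **>(&env), JNI_VERSION_1_6) == JNI_EDETACHED) {
        if (g_vm->AttachCurrentThread(&env, nullptr) != JNI_OK)
            return nullptr;
        *attached_here = true;   // caller must DetachCurrentThread() later
    }
    return env;
}

// Inside a worker (for example an audio-frame callback):
//   bool attached = false;
//   if (JNIEnv *env = env_for_this_thread(&attached)) {
//       env->CallVoidMethod(global_obj, method_id /*, args */);
//       // global_obj must be a NewGlobalRef; never cache a JNIEnv* itself.
//   }
//   if (attached) g_vm->DetachCurrentThread();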


-
C++ ffmpeg Specified pixel format is invalid or not supported
4 September 2022, by Turgut
So I have a program that reads an OpenGL window and encodes the read data as a video. Through a series of experiments I have learned that the bit format of my GLFW window is 8:8:8, as returned by glfwGetVideoMode(monitor). So I use this call to read the window:

glReadPixels(0, 0, gl_width, gl_height, GL_RGBA, GL_UNSIGNED_BYTE, (GLvoid*) Buffer);


and I simply encode it in the AV_PIX_FMT_YUV420P format.

Under normal circumstances this method works just fine. However, when I actually run the program, the output I get, compared to what I can see in the GLFW window, is really low resolution and a bit pixelated.


Here is what my GLFW window looks like:

[screenshot of the GLFW window]

This is what I want it to look like. It looks just fine in the OpenGL window, and I encode it directly without altering Buffer.

And here is what the encoded result, test.mp4, looks like when I play it with mplayer or similar software:

[screenshot of the blurry, pixelated encoded result]
It's a lot more blurry and pixelated compared to the GLFW window. After some experimentation, and following an answer to another question I asked, I used
avcodec_find_best_pix_fmt_of_list((*codec)->pix_fmts, AV_PIX_FMT_RGBA, 1, &ret)
and it returned 13, which led me to believe that AV_PIX_FMT_YUVJ422P is the best option for this conversion if I want to avoid a blurry/pixelated result. However, no matter which format I pass, every single one gives an error except AV_PIX_FMT_YUV420P. The error is:

[mpeg4 @ 0x558e74f47900] Specified pixel format yuvj422p is invalid or not supported

I have no idea why this is happening, as the format is bound to a define and it changes throughout the entire program when I change the define.
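
A likely explanation, offered here as an assumption rather than something stated in the question: the mpeg4 encoder named in the error only advertises yuv420p in its pix_fmts list, so any other STREAM_PIX_FMT is rejected at avcodec_open2() time no matter what looks best for RGBA input. A small sketch for checking what a given encoder actually supports:

// Sketch: print the pixel formats an encoder advertises, so STREAM_PIX_FMT can
// be validated against them before avcodec_open2() is called.
extern "C" {
#include <libavcodec/avcodec.h>
#include <libavutil/pixdesc.h>
}
#include <cstdio>

static void print_supported_pix_fmts(const AVCodec *codec)
{
    if (!codec || !codec->pix_fmts) {
        std::printf("encoder does not restrict pixel formats\n");
        return;
    }
    for (const enum AVPixelFormat *p = codec->pix_fmts; *p != AV_PIX_FMT_NONE; ++p)
        std::printf("%s supports %s\n", codec->name, av_get_pix_fmt_name(*p));
}

// e.g. print_supported_pix_fmts(avcodec_find_encoder(AV_CODEC_ID_MPEG4));
// A yuv422p-capable encoder (such as libx264, if it was compiled in) would have
// to be chosen explicitly instead of the container's default video codec.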


Here is my encoder so far (I have trimmed some parts):


video_encoder.cpp:


int video_encoder::write_frame(AVFormatContext *fmt_ctx, AVCodecContext *c,
 AVStream *st, AVFrame *frame, AVPacket *pkt)
 {
 int ret;
 // Conditional jump or move depends on uninitialised value
 // Use of uninitialised value of size 8
 // send the frame to the encoder

 // Error is about c.
 ret = avcodec_send_frame(c, frame);

 if (ret < 0) {
 std::cout << "Error sending a frame to the encoder: " << ret << std::endl;
 exit(1);
 }

 while (ret >= 0) {
 ret = avcodec_receive_packet(c, pkt);

 if (ret == AVERROR(EAGAIN) || ret == AVERROR_EOF)
 break;
 else if (ret < 0) {
 std::cout << "Error encoding a frame: " << ret << std::endl;
 exit(1);
 }

 /* rescale output packet timestamp values from codec to stream timebase */
 av_packet_rescale_ts(pkt, c->time_base, st->time_base);
 pkt->stream_index = st->index;

 /* Write the compressed frame to the media file. */
 //log_packet(fmt_ctx, pkt);

 //std::cout << "Packet: " << pkt << std::endl;

 ret = av_interleaved_write_frame(fmt_ctx, pkt);

 /* pkt is now blank (av_interleaved_write_frame() takes ownership of
 * its contents and resets pkt), so that no unreferencing is necessary.
 * This would be different if one used av_write_frame(). */
 if (ret < 0) {
 std::cout << "Error while writing output packet: " << ret << std::endl;
 exit(1);
 }
 }

 return ret == AVERROR_EOF ? 1 : 0;
}


/* Add an output stream. */
void video_encoder::add_stream(OutputStream *ost, AVFormatContext *oc,
 const AVCodec **codec,
 enum AVCodecID codec_id)
{
 AVCodecContext *c;
 int i;

 /* find the encoder */
 *codec = avcodec_find_encoder(codec_id);
 
 if (!(*codec)) {
 fprintf(stderr, "Could not find encoder for '%s'\n",
 avcodec_get_name(codec_id));
 exit(1);
 }

 ost->tmp_pkt = av_packet_alloc();

 if (!ost->tmp_pkt) {
 fprintf(stderr, "Could not allocate AVPacket\n");
 exit(1);
 }

 ost->st = avformat_new_stream(oc, NULL);
 if (!ost->st) {
 fprintf(stderr, "Could not allocate stream\n");
 exit(1);
 }
 ost->st->id = oc->nb_streams-1;
 c = avcodec_alloc_context3(*codec);
 if (!c) {
 fprintf(stderr, "Could not alloc an encoding context\n");
 exit(1);
 }
 ost->enc = c;


 switch ((*codec)->type) {
 case AVMEDIA_TYPE_AUDIO:
 ...
 case AVMEDIA_TYPE_VIDEO:
 c->codec_id = codec_id;

 c->bit_rate = 10000;
 /* Resolution must be a multiple of two. */
 c->width = width;
 c->height = height;
 /* timebase: This is the fundamental unit of time (in seconds) in terms
 * of which frame timestamps are represented. For fixed-fps content,
 * timebase should be 1/framerate and timestamp increments should be
 * identical to 1. */
 ost->st->time_base = (AVRational){ 1, STREAM_FRAME_RATE }; // *frame_rate
 c->time_base = ost->st->time_base;

 c->gop_size = 7; /* emit one intra frame every twelve frames at most */
 c->pix_fmt = STREAM_PIX_FMT;
 //if (c->codec_id == AV_CODEC_ID_MPEG2VIDEO) 
 // c->max_b_frames = 2;
 if (c->codec_id == AV_CODEC_ID_MPEG1VIDEO) {
 /* Needed to avoid using macroblocks in which some coeffs overflow.
 * This does not happen with normal video, it just happens here as
 * the motion of the chroma plane does not match the luma plane. */
 c->mb_decision = 2;
 }

 if ((*codec)->pix_fmts){
 //c->pix_fmt = (*codec)->pix_fmts[0];
 std::cout << "NEW FORMAT : " << c->pix_fmt << std::endl;
 }

 int ret;
 avcodec_find_best_pix_fmt_of_list((*codec)->pix_fmts, AV_PIX_FMT_RGBA, 1, &ret);
 std::cout << "Desired format is: " << ret << std::endl;
 break;
 }
 

 /* Some formats want stream headers to be separate. */
 if (oc->oformat->flags & AVFMT_GLOBALHEADER)
 c->flags |= AV_CODEC_FLAG_GLOBAL_HEADER;
}

/**************************************************************/
/* video output */

AVFrame* video_encoder::alloc_picture(enum AVPixelFormat pix_fmt, int width, int height)
{
 AVFrame *picture;
 int ret;

 picture = av_frame_alloc();
 if (!picture)
 return NULL;

 picture->format = pix_fmt;
 picture->width = width;
 picture->height = height;

 /* allocate the buffers for the frame data */
 ret = av_frame_get_buffer(picture, 0);
 if (ret < 0) {
 fprintf(stderr, "Could not allocate frame data.\n");
 exit(1);
 }

 return picture;
}

void video_encoder::open_video(AVFormatContext *oc, const AVCodec *codec,
 OutputStream *ost, AVDictionary *opt_arg)
{
 int ret;
 AVCodecContext *c = ost->enc;
 AVDictionary *opt = NULL;
 av_dict_copy(&opt, opt_arg, 0);
 /* open the codec */
 ret = avcodec_open2(c, codec, &opt);
 av_dict_free(&opt);
 if (ret < 0) {
 fprintf(stderr, "Could not open video codec: %s\n", ret);
 exit(1);
 }

 /* allocate and init a re-usable frame */
 ost->frame = alloc_picture(c->pix_fmt, c->width, c->height);
 if (!ost->frame) {
 fprintf(stderr, "Could not allocate video frame\n");
 exit(1);
 }

 /* If the output format is not YUV420P, then a temporary YUV420P
 * picture is needed too. It is then converted to the required
 * output format. */
 ost->tmp_frame = NULL;


 /* copy the stream parameters to the muxer */
 ret = avcodec_parameters_from_context(ost->st->codecpar, c);
 if (ret < 0) {
 fprintf(stderr, "Could not copy the stream parameters\n");
 exit(1);
 }
}


void video_encoder::set_frame_yuv_from_rgb(AVFrame *frame, struct SwsContext *sws_context) {
 const int in_linesize[1] = { 4 * width };
 //uint8_t* dest[4] = { rgb_data, NULL, NULL, NULL };

 
 sws_context = sws_getContext(
 width, height, AV_PIX_FMT_RGBA,
 width, height, STREAM_PIX_FMT,
 SCALE_FLAGS, 0, 0, 0);

 sws_scale(sws_context, (const uint8_t * const *)&rgb_data, in_linesize, 0,
 height, frame->data, frame->linesize);
}

AVFrame* video_encoder::get_video_frame(OutputStream *ost)
{
 AVCodecContext *c = ost->enc;
 /* check if we want to generate more frames */
 if (av_compare_ts(ost->next_pts, c->time_base,
 (float) STREAM_DURATION / 1000, (AVRational){ 1, 1 }) > 0)
 return NULL;

 /* when we pass a frame to the encoder, it may keep a reference to it
 * internally; make sure we do not overwrite it here */
 if (av_frame_make_writable(ost->frame) < 0)
 exit(1);

 
 set_frame_yuv_from_rgb(ost->frame, ost->sws_ctx);
 
 
 ost->frame->pts = ost->next_pts++;

 return ost->frame;
}

/*
 * encode one video frame and send it to the muxer
 * return 1 when encoding is finished, 0 otherwise
 */
int video_encoder::write_video_frame(AVFormatContext *oc, OutputStream *ost)
{
 return write_frame(oc, ost->enc, ost->st, get_video_frame(ost), ost->tmp_pkt);
}

void video_encoder::close_stream(AVFormatContext *oc, OutputStream *ost)
{
 avcodec_free_context(&ost->enc);
 av_frame_free(&ost->frame);
 av_frame_free(&ost->tmp_frame);
 av_packet_free(&ost->tmp_pkt);
 //sws_freeContext(ost->sws_ctx);
 //swr_free(&ost->swr_ctx);
}

/**************************************************************/
/* media file output */

void video_encoder::set_encode_framebuffer(uint8_t* data, bool audio_only)
{
 rgb_data = data;
}

video_encoder::~video_encoder()
{
 av_write_trailer(enc_inf.oc);

 /* Close each codec. */
 if (enc_inf.have_video)
 close_stream(enc_inf.oc, &enc_inf.video_st);

 if (!(enc_inf.fmt->flags & AVFMT_NOFILE))
 /* Close the output file. */
 avio_closep(&enc_inf.oc->pb);

 /* free the stream */
 avformat_free_context(enc_inf.oc);
 std::cout << "Done, closing." << std::endl; 
}

bool video_encoder::encode_one_frame()
{
 if (enc_inf.encode_video || enc_inf.encode_audio) {
 /* select the stream to encode */
 if (enc_inf.encode_video &&
 (!enc_inf.encode_audio || av_compare_ts(enc_inf.video_st.next_pts, enc_inf.video_st.enc->time_base,
 enc_inf.audio_st.next_pts, enc_inf.audio_st.enc->time_base) <= 0)) {
 enc_inf.encode_video = !write_video_frame(enc_inf.oc, &enc_inf.video_st);
 return true;
 } 
 }
 return false;
}


video_encoder::video_encoder(int w, int h, float fps, unsigned int duration) 
 :width(w), height(h), STREAM_FRAME_RATE(fps), STREAM_DURATION(duration)
{
 //std::filesystem::create_directory("media"); 

 //std::string as_str = "./output/" + std::string(getenv ("OUTPUT_UUID")) + ".mp4";
 std::string as_str = "./output/video.mp4";

 char* filename = const_cast<char*>(as_str.c_str());
 enc_inf.video_st = (struct OutputStream) { 0 };
 enc_inf.audio_st = (struct OutputStream) { 0 };
 enc_inf.video_st.next_pts = 1; 
 enc_inf.audio_st.next_pts = 1;
 enc_inf.encode_audio = 0;
 enc_inf.encode_video = 0;
 int ret;
 int i;
 //rgb_data = (uint8_t*)malloc( 48 * sizeof(uint8_t) );

 /* allocate the output media context */
 avformat_alloc_output_context2(&enc_inf.oc, NULL, NULL, filename);

 if (!enc_inf.oc) {
 //VI_ERROR("Could not deduce output format from file extension: using MPEG.\n");
 avformat_alloc_output_context2(&enc_inf.oc, NULL, "mpeg", filename);
 }
 if (!enc_inf.oc)
 std::cout << "FAILED" << std::endl;
 //return 1;

 enc_inf.fmt = enc_inf.oc->oformat;

 /* Add the audio and video streams using the default format codecs
 * and initialize the codecs. */
 if (enc_inf.fmt->video_codec != AV_CODEC_ID_NONE) {
 add_stream(&enc_inf.video_st, enc_inf.oc, &video_codec, enc_inf.fmt->video_codec);
 enc_inf.have_video = 1;
 enc_inf.encode_video = 1;
 }

 /* Now that all the parameters are set, we can open the audio and
 * video codecs and allocate the necessary encode buffers. */
 if (enc_inf.have_video)
 open_video(enc_inf.oc, video_codec, &enc_inf.video_st, opt);

 /* open the output file, if needed */
 if (!(enc_inf.fmt->flags & AVFMT_NOFILE)) {
 ret = avio_open(&enc_inf.oc->pb, filename, AVIO_FLAG_WRITE);
 if (ret < 0) {
 //VI_ERROR("Could not open '%s': %s\n", filename, ret);
 //return 1;
 }
 }

 /* Write the stream header, if any. */
 ret = avformat_write_header(enc_inf.oc, &opt);
 if (ret < 0) {
 VI_ERROR("Error occurred when opening output file:");
 //return 1;
 }
 
 //return 0;
}



video_encoder.h:


#define STREAM_PIX_FMT AV_PIX_FMT_YUV420P /* default pix_fmt */

#define SCALE_FLAGS SWS_SPLINE

/* The output bit rate in bit/s */
#define OUTPUT_BIT_RATE 96000
/* The number of output channels */
#define OUTPUT_CHANNELS 2

typedef struct OutputStream {
 AVStream *st;
 AVCodecContext *enc;

 /* pts of the next frame that will be generated */
 int64_t next_pts;
 int samples_count;

 AVFrame *frame;
 AVFrame *tmp_frame;

 AVPacket *tmp_pkt;

 float t, tincr, tincr2;

 struct SwsContext *sws_ctx;
 struct SwrContext *swr_ctx;
} OutputStream;
 typedef struct {
 OutputStream video_st, audio_st;
 const AVOutputFormat *fmt;
 AVFormatContext *oc;
 int have_video, have_audio, encode_video, encode_audio;
 std::string name;
 } encode_info;



Again, changing STREAM_PIX_FMT to anything other than AV_PIX_FMT_YUV420P causes the program to give that error.

What is the cause of this and how can I fix it? Also, am I on the right track for fixing the pixelation problem? I'm using Ubuntu.
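
Not part of the original question, but two sketched changes that follow from the code above: create the SwsContext once and reuse it (set_frame_yuv_from_rgb currently allocates, and leaks, a new one per frame), and take c->pix_fmt from the encoder's own list, using the return value of avcodec_find_best_pix_fmt_of_list rather than its loss pointer. A bit rate of 10000 (10 kbit/s) is also far too low and is a plausible cause of the blocky output on its own.

// Sketch only, written against the structures shown in the question
// (OutputStream, width, height, rgb_data, STREAM_PIX_FMT, SCALE_FLAGS).

void video_encoder::set_frame_yuv_from_rgb(AVFrame *frame, OutputStream *ost)
{
    const int in_linesize[1] = { 4 * width };   // tightly packed RGBA rows

    // sws_getCachedContext() allocates the context on first use and reuses it
    // on later calls instead of leaking a fresh context every frame.
    ost->sws_ctx = sws_getCachedContext(ost->sws_ctx,
                                        width, height, AV_PIX_FMT_RGBA,
                                        width, height, STREAM_PIX_FMT,
                                        SCALE_FLAGS, nullptr, nullptr, nullptr);

    sws_scale(ost->sws_ctx, (const uint8_t *const *)&rgb_data, in_linesize,
              0, height, frame->data, frame->linesize);
}

// In add_stream(), pick a format the encoder actually advertises and give it a
// realistic bit rate; avcodec_find_best_pix_fmt_of_list() *returns* the chosen
// format (its last argument only receives the conversion-loss flags):
//
//   c->pix_fmt = STREAM_PIX_FMT;
//   if ((*codec)->pix_fmts) {
//       int loss = 0;
//       c->pix_fmt = avcodec_find_best_pix_fmt_of_list((*codec)->pix_fmts,
//                                                      AV_PIX_FMT_RGBA, 0, &loss);
//   }
//   c->bit_rate = 4000000;   // e.g. 4 Mbit/s instead of 10000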