
Media (2)
-
Example of action buttons for a collaborative collection
27 February 2013, by
Updated: March 2013
Language: French
Type: Image
-
Example of action buttons for a personal collection
27 February 2013, by
Updated: February 2013
Language: English
Type: Image
Other articles (112)
-
Customising by adding a logo, banner or background image
5 September 2013, by
Some themes support three customisation elements: adding a logo; adding a banner; adding a background image.
-
Websites made with MediaSPIP
2 May 2011, by
This page lists some websites based on MediaSPIP.
-
Creating farms of unique websites
13 April 2011, by
MediaSPIP platforms can be installed as a farm, with a single "core" hosted on a dedicated server and used by multiple websites.
This allows (among other things): implementation costs to be shared between several different projects/individuals; rapid deployment of multiple unique sites; creation of groups of like-minded sites, making it possible to browse media in a more controlled and selective environment than the major "open" (...)
On other sites (15805)
-
Encoding Android Camera Frames h263/h264 [closed]
29 December 2015, by Another Guy
I want to implement a video chatting app on Android. I created a SurfaceView, got the camera preview, and captured each frame as a byte[] using the Camera.PreviewCallback.onPreviewFrame() method. Now I want to encode that byte[] into an H.263 frame and pass it over the network, receive frames of the same type over the network, decode them into a byte[] (or a displayable format), and show them in a View. I want to do this at a rate of at least 15 FPS. I am asking for help with the encoding/decoding part.
For that purpose, which library would be suitable for me? Where should I start? Is my starting point wrong? Is there any support in the SDK, or would I have to use the NDK? I have heard about the FFmpeg/x264 libraries, but I could not figure out how to use them for my purpose. It would be highly appreciated if you could name a library, reference, learning path, link, or code snippet. Can you suggest a workflow I should follow? What if I want to use H.264 encoding? I just want to get it done; whether the library is GPL or LGPL does not matter much.
I am a beginner in Android development. Any kind of assistance would be highly appreciated. Thanks in advance.
-
How to check if OpenCV VideoCapture uses multithreading or multiprocessing internally when connecting to a camera [closed]
20 December 2024, by Chen
I'm using OpenCV's cv::VideoCapture in C++ (or Python) to connect to a camera stream. I would like to know whether OpenCV internally uses multithreading or multiprocessing when opening and capturing frames from the camera.


Specifically, I'm curious about:

- Whether OpenCV spawns new threads or processes when using cv::VideoCapture.
- If FFmpeg is enabled as the backend, does it use multithreading or multiprocessing under the hood?
- How can I verify the threading or process behavior during camera streaming?

What I Tried:
I checked the current process using psutil (in Python) and printed the number of threads:


import cv2
import psutil

cap = cv2.VideoCapture("rtsp://camera_stream", cv2.CAP_FFMPEG)
process = psutil.Process()

print(f"Process ID: {process.pid}, Name: {process.name()}")
print(f"Thread count: {process.num_threads()}")

for thread in process.threads():
    print(f"Thread ID: {thread.id}, CPU Time: {thread.user_time + thread.system_time}")

cap.release()



- Does VideoCapture use multithreading or multiprocessing internally?
- Is there a reliable way to verify whether the underlying backend (e.g., FFmpeg) has enabled multithreading?
- Are there specific settings in OpenCV or FFmpeg that can control this behavior?



-
Undefined symbol despite libraries being linked
17 September 2020, by Areopag
Currently I'm trying to write a JNI function using FFmpeg features, but when I run my Java executable after a clean compile I get the following error:


/usr/lib/jvm/java-1.11.0-openjdk-amd64/bin/java: symbol lookup error: /lib/libmediaserv_ffmpeg.so: undefined symbol: _Z19avformat_open_inputPP15AVFormatContextPKcP13AVInputFormatPP12AVDictionary



I can't figure out why FFmpeg is not linked correctly.


mediaserv_ffmpeg.cpp:


#include <iostream>
#include <libavformat/avformat.h>
#include <libavutil/avutil.h>

JNIEXPORT jobject JNICALL Java_mediaserv_ffmpeg_getMetadata
  (JNIEnv* env, jclass cls, jstring path) { ... }


What I've tried so far is inspecting libmediaserv_ffmpeg.so:


nm libmediaserv_ffmpeg.so --format=sysv | grep avformat
_Z19avformat_open_inputPP15AVFormatContextPKcP13AVInputFormatPP12AVDictionary| | U | NOTYPE| | |*UND*
_Z20avformat_close_inputPP15AVFormatContext| | U | NOTYPE| | |*UND*
_Z25avformat_find_stream_infoP15AVFormatContextPP12AVDictionary| | U | NOTYPE| | |*UND*



Which, I think, is just another way of saying that there are undefined symbols in the .so file.


The commands used to build the lib (taken from make VERBOSE=1):

/usr/bin/c++ -Dmediaserv_ffmpeg_EXPORTS -I/usr/lib/jvm/java-11-openjdk-amd64/include -I/usr/lib/jvm/java-11-openjdk-amd64/include/linux -I/mnt/g/Workspace/mediaserv/third-party/ffmpeg -fPIC -std=gnu++1z -o CMakeFiles/mediaserv_ffmpeg.dir/mediaserv_ffmpeg.cpp.o -c /mnt/g/Workspace/mediaserv/third-party/mediaserv_ffmpeg.cpp
/usr/bin/c++ -fPIC -shared -Wl,-soname,libmediaserv_ffmpeg.so -o /mnt/g/Workspace/mediaserv/build/libs/libmediaserv_ffmpeg.so CMakeFiles/mediaserv_ffmpeg.dir/mediaserv_ffmpeg.cpp.o -Wl,-rpath,/usr/lib/jvm/java-11-openjdk-amd64/lib:/usr/lib/jvm/java-11-openjdk-amd64/lib/server:/mnt/g/Workspace/mediaserv/third-party/ffmpeg/libavutil:/mnt/g/Workspace/mediaserv/third-party/ffmpeg/libavformat /usr/lib/jvm/java-11-openjdk-amd64/lib/libjawt.so /usr/lib/jvm/java-11-openjdk-amd64/lib/server/libjvm.so ../ffmpeg/libavutil/libavutil.so ../ffmpeg/libavformat/libavformat.so



I've already looked for errors using the linker's trace option; it lists which libraries it tries to link / is linking, but the result stays the same.


/usr/bin/ld: mode elf_x86_64
/usr/lib/gcc/x86_64-linux-gnu/7/../../../x86_64-linux-gnu/crti.o
/usr/lib/gcc/x86_64-linux-gnu/7/crtbeginS.o
CMakeFiles/mediaserv_ffmpeg.dir/mediaserv_ffmpeg.cpp.o
/usr/lib/jvm/java-11-openjdk-amd64/lib/libjawt.so
/usr/lib/jvm/java-11-openjdk-amd64/lib/server/libjvm.so
../ffmpeg/libavutil/libavutil.so
../ffmpeg/libavformat/libavformat.so
-lstdc++ (/usr/lib/gcc/x86_64-linux-gnu/7/libstdc++.so)
/lib/x86_64-linux-gnu/libm.so.6
/lib/x86_64-linux-gnu/libmvec.so.1
libgcc_s.so.1 (/usr/lib/gcc/x86_64-linux-gnu/7/libgcc_s.so.1)
/lib/x86_64-linux-gnu/libc.so.6
/lib/x86_64-linux-gnu/ld-linux-x86-64.so.2
/lib/x86_64-linux-gnu/ld-linux-x86-64.so.2
libgcc_s.so.1 (/usr/lib/gcc/x86_64-linux-gnu/7/libgcc_s.so.1)
/usr/lib/gcc/x86_64-linux-gnu/7/crtendS.o
/usr/lib/gcc/x86_64-linux-gnu/7/../../../x86_64-linux-gnu/crtn.o



CMakeLists.txt:


cmake_minimum_required(VERSION 3.17)
project(mediaserv_ffmpeg)

set(CMAKE_CXX_STANDARD 17)
set(BUILD_SHARED_LIBS ON)
add_library(${PROJECT_NAME} SHARED mediaserv_ffmpeg.cpp mediaserv_ffmpeg.h)

find_package(JNI REQUIRED)

find_path(AVUTIL_INCLUDE_DIR libavutil/avutil.h HINTS ffmpeg REQUIRED)
find_library(AVUTIL_LIBRARY avutil HINTS ${PROJECT_SOURCE_DIR}/ffmpeg/libavutil PATHS REQUIRED)

find_path(AVFORMAT_INCLUDE_DIR libavformat/avformat.h HINTS ffmpeg REQUIRED)
find_library(AVFORMAT_LIBRARY avformat HINTS ${PROJECT_SOURCE_DIR}/ffmpeg/libavformat REQUIRED)

target_include_directories(${PROJECT_NAME} PRIVATE ${JNI_INCLUDE_DIRS} ${AVUTIL_INCLUDE_DIR} ${AVFORMAT_INCLUDE_DIR})

target_link_libraries(${PROJECT_NAME} ${JNI_LIBRARIES} ${AVUTIL_LIBRARY} ${AVFORMAT_LIBRARY})



Neither the FFmpeg from the official Ubuntu bionic repository nor a self-compiled version built with configure --enable-shared works for me.

Where is my mistake, or what else could I inspect?


Thanks in advance.