
Other articles (106)
-
Use, discuss, criticize
13 April 2011
Talk to people directly involved in MediaSPIP’s development, or to people around you who could use MediaSPIP to share, enhance or develop their creative projects.
The bigger the community, the more MediaSPIP’s potential will be explored and the faster the software will evolve.
A discussion list is available for all exchanges between users.
-
Websites made with MediaSPIP
2 May 2011
This page lists some websites based on MediaSPIP.
-
Adding user-specific information and other behavior changes related to authors
12 April 2011
The simplest way to add information to authors is to install the Inscription3 plugin. It also makes it possible to modify certain user-related behaviors (refer to its documentation for more information).
It is also possible to add fields to authors by installing the plugins champs extras 2 and Interface pour champs extras.
On other sites (10119)
-
OpenCV VideoCapture not streaming RTSP link and returns "no frame!"
6 September 2023, by Asadullah Naeem
I am trying to stream my HikVision IP camera through Python. I am using
cv2.VideoCapture("rtsp_link")
which works fine on my laptop, but when I run the same Python script with the same OpenCV and FFmpeg versions on other machines, it gives me the following error:

Error:


[h264 @ 000002124c7f9a40] missing picture in access unit with size 47
[h264 @ 000002124c7f9a40] no frame!



So far I have tried to run this script on 5 computers, but it gives the same error. I am using the following Python script; my OpenCV version is 4.6.0.66 and my FFmpeg version is 2022-06-20-git-56419428a8-essentials_build-www.gyan.dev:

Python script:


import cv2

# RTSP stream URL
rtsp_url = "rtsp://username:password@ip_address:port/Streaming/Channels/501"

# Open the RTSP stream
cap = cv2.VideoCapture(rtsp_url)

# Check if the stream was successfully opened
if not cap.isOpened():
    print("Failed to open RTSP stream.")
    exit()

# Read and display frames from the stream
while True:
    # Read a frame from the stream
    ret, frame = cap.read()

    # Check if the frame was successfully read
    if not ret:
        print("Failed to read frame from RTSP stream.")
        break

    # Display the frame
    cv2.imshow("RTSP Stream", frame)

    # Exit if 'q' is pressed
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

# Release the resources
cap.release()
cv2.destroyAllWindows()




Update:


The code runs on my laptop over both Wi-Fi and mobile internet (4G), but on the other devices the RTSP link is reachable only over mobile internet (4G).
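
Not part of the original question, but a hedged suggestion that often helps with this exact symptom: "missing picture in access unit" / "no frame!" errors that appear only on some networks frequently point at RTSP packets being lost over UDP, and forcing OpenCV's FFmpeg backend to use TCP transport is a common workaround. A minimal sketch (same placeholder URL as above):

import os
import cv2

# Ask the FFmpeg backend to use TCP for RTSP instead of the UDP default;
# this environment variable must be set before VideoCapture is created.
os.environ["OPENCV_FFMPEG_CAPTURE_OPTIONS"] = "rtsp_transport;tcp"

rtsp_url = "rtsp://username:password@ip_address:port/Streaming/Channels/501"
cap = cv2.VideoCapture(rtsp_url, cv2.CAP_FFMPEG)

If the stream then opens over Wi-Fi as well, the difference between the networks was most likely UDP packet loss or a firewall blocking the RTP ports.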


-
ffmpeg configure always returns "not found" when using sysroot and cross-prefix options - how to use them correctly?
5 August 2014, by user2212461
I am trying to build ffmpeg with a cross-compiler. When setting the configure flags I get the following errors:

WARNING: Compiler does not indicate floating-point ABI, guessing soft.
build_it.sh: 29: build_it.sh: --sysroot=/home/android/android-ndk-r9/platforms/android-9/arch-arm/: not found
build_it.sh: 32: build_it.sh.sh: --cross-prefix=/home/android/android-ndk-r9/toolchains/arm-linux-androideabi-4.8/prebuilt/linux-x86/bin/arm-linux-androideabi-: not found

The error message "not found" sounds like the paths don't exist, but they are all correct and do exist. How can I resolve this error?

build_it.sh:
NDK=/home/android/android-ndk-r9
SYSROOT=$NDK/platforms/android-9/arch-arm/
TOOLCHAIN=$NDK/toolchains/arm-linux-androideabi-4.8/prebuilt/linux-x86
CPU=arm
PREFIX=test
ADDI_CFLAGS="-marm"
platform="android-9"
./configure \
--disable-asm \
--enable-shared \
--target-os=linux \
--arch=arm \
--enable-cross-compile \
--disable-static \
--sysroot=$SYSROOT \
--extra-cflags="-Os -fpic $ADDI_CFLAGS" \
--extra-ldflags="$ADDI_LDFLAGS"
--cross-prefix=$TOOLCHAIN/bin/arm-linux-androideabi- \
--prefix=$PREFIX \
##configure fails here
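
Not part of the original post, but one observation that may explain the symptom: the shell is executing --sysroot=... and --cross-prefix=... as stand-alone commands (hence "not found"), which happens when the line before each of them in the actual file is missing its trailing line-continuation backslash, or has stray whitespace after it. In the script as pasted, the --extra-ldflags line is indeed missing its trailing \, and the --prefix line carries a stray one before the comment. A corrected tail of the invocation might look like this (same variables as above):

./configure \
--disable-asm \
--enable-shared \
--target-os=linux \
--arch=arm \
--enable-cross-compile \
--disable-static \
--sysroot=$SYSROOT \
--extra-cflags="-Os -fpic $ADDI_CFLAGS" \
--extra-ldflags="$ADDI_LDFLAGS" \
--cross-prefix=$TOOLCHAIN/bin/arm-linux-androideabi- \
--prefix=$PREFIX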
... -
How do you properly free a BitStreamFilter (bsf) without getting a double free error?
19 January 2021, by Alexis Wilke
I'm trying to write a C++ class handling MP4 movies via ffmpeg.


First I created a couple of deleter functions to use with std::unique_ptr<> so that things get released even on exceptions.

However, I get a double free when I try to free the BitStreamFilter object, yet the documentation clearly says that each av_bsf_alloc() must be paired with an av_bsf_free() call.



@param ctx
a pointer into which the pointer to the newly-allocated context
will be written. It must be freed with av_bsf_free() after the
filtering is done.



Note: emphasis mine.


However, when I call avformat_close_input() I get a double free error, even though I haven't used the two contexts for anything!? I'm thinking that there may be a packet that both allocate and both try to free. But since these two contexts are not directly connected, I really don't understand how they end up freeing something twice.

Below is code which reproduces the error (at least on an amd64 platform). Once compiled, you can just execute it. Make sure to specify a filename, as in:


./ffmpeg_demuxer_test mymovie.mp4



I used the following command to compile & link the code:


$ /usr/bin/c++ -std=c++17 -DDEBUG -D_DEBUG -D_GLIBCXX_ASSERTIONS -g -O0 \
 -fsanitize=address -fsanitize=enum -fsanitize=unreachable \
 -o ffmpeg_demuxer_test ffmpeg_demuxer_test.cpp \
 -lavformat -lavcodec



Notice the
-fsanitize=...
options to capture errors such as a double free error.

Here is the code :


extern "C" {
#include <libavformat></libavformat>avformat.h>
#include <libavformat></libavformat>avio.h>
#include <libavcodec></libavcodec>avcodec.h>
}
#include <iostream>
#include <memory>


void ffmpeg_demuxer_avformat_context_free(AVFormatContext * context)
{
 if(context != nullptr) avformat_close_input(&context);
}

void ffmpeg_demuxer_av_bsf_free(AVBSFContext * context)
{
 if(context != nullptr) av_bsf_free(&context);
}


int main(int argc, char * argv [])
{
 if(argc != 2)
 {
 std::cout << "Usage: " << argv[0] << " movie.mp4" << std::endl;
 exit(1);
 }

 // init the AV libraries
 //
 av_register_all();
 avformat_network_init();

 // allocate the AVFormatContext
 //
 AVFormatContext * format_context(nullptr);
 int const r1(avformat_open_input(
 &format_context
 , argv[1]
 , nullptr // input format
 , nullptr)); // options
 if(r1 != 0
 || format_context == nullptr)
 {
 throw std::bad_alloc();
 }

 auto f_format_context = std::unique_ptr<
 AVFormatContext
 , decltype(&ffmpeg_demuxer_avformat_context_free)>(
 format_context
 , &ffmpeg_demuxer_avformat_context_free);


 // now allocate a stream
 //
 if(avformat_find_stream_info(f_format_context.get(), nullptr) < 0)
 {
 throw std::runtime_error("ffmpeg: Could not find stream info");
 }

 auto f_video_stream_index = av_find_best_stream(
 f_format_context.get()
 , AVMEDIA_TYPE_VIDEO
 , -1 // wanted stream (any)
 , -1 // related stream (none)
 , nullptr // AVCodec *
 , 0); // flags
 if(f_video_stream_index < 0)
 {
 throw std::runtime_error("ffmpeg: Could not find stream in input file");
 }
 if(static_cast<unsigned int="int">(f_video_stream_index) >= f_format_context->nb_streams)
 {
 throw std::range_error("ffmpeg: Stream index out of range");
 }

 auto f_stream = f_format_context->streams[f_video_stream_index];

 auto f_video_codec = f_stream->codecpar->codec_id;

 int f_bit_depth(0);
 switch(f_stream->codecpar->format)
 {
 case AV_PIX_FMT_YUV420P10LE:
 f_bit_depth = 10;
 break;

 case AV_PIX_FMT_YUV420P12LE:
 f_bit_depth = 12;
 break;

 default:
 f_bit_depth = 8;
 break;

 }

 bool f_mp4_h264 = f_video_codec == AV_CODEC_ID_H264 && (
 strcmp(f_format_context->iformat->long_name, "QuickTime / MOV") == 0
 || strcmp(f_format_context->iformat->long_name, "FLV (Flash Video)") == 0
 || strcmp(f_format_context->iformat->long_name, "Matroska / WebM") == 0
 );

 if(f_mp4_h264)
 {
 AVBitStreamFilter const * bsf_stream_filter(av_bsf_get_by_name("h264_mp4toannexb"));
 if(bsf_stream_filter == nullptr)
 {
 throw std::runtime_error("av_bsf_get_by_name(\"h264_mp4toannexb\") failed");
 }
 AVBSFContext * bsf_context(nullptr);
 int const r2(av_bsf_alloc(bsf_stream_filter, &bsf_context));
 if(r2 < 0
 || bsf_context == nullptr)
 {
 throw std::bad_alloc();
 }
 auto f_bsf_context = std::unique_ptr<
 AVBSFContext
 , decltype(&ffmpeg_demuxer_av_bsf_free)>(
 bsf_context
 , &ffmpeg_demuxer_av_bsf_free);
 f_bsf_context->par_in = f_stream->codecpar;
 if(av_bsf_init(f_bsf_context.get()) < 0)
 {
 throw std::runtime_error("av_bsf_init() failed");
 }
 }

 return 0;
}
</unsigned></memory></iostream>


So, am I misreading the documentation or misusing one of the free/close functions? I don't think that the AVBSFContext itself gets released when I close the AVFormatContext, but I may be mistaken.

I won't put "C" as a tag since it will get removed, even though this applies to C. It's not because I use C++ that I get a double free (i.e. it happens in the ffmpeg C library).
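
Not part of the original question, but a hedged observation on the likely cause: av_bsf_alloc() already allocates bsf_context->par_in, so the line f_bsf_context->par_in = f_stream->codecpar; leaks that allocation and makes the AVBSFContext and the AVFormatContext share the same AVCodecParameters, which both av_bsf_free() and avformat_close_input() then try to free. Copying the parameters instead keeps the ownership separate; a minimal sketch under that assumption:

// Instead of aliasing the stream's codec parameters:
//     f_bsf_context->par_in = f_stream->codecpar;  // shared pointer, freed twice
// copy them into the buffer that av_bsf_alloc() allocated:
if(avcodec_parameters_copy(f_bsf_context->par_in, f_stream->codecpar) < 0)
{
    throw std::runtime_error("avcodec_parameters_copy() failed");
}
if(av_bsf_init(f_bsf_context.get()) < 0)
{
    throw std::runtime_error("av_bsf_init() failed");
}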