
Media (1)
-
The pirate bay depuis la Belgique
1 April 2013
Updated: April 2013
Language: French
Type: Image
Other articles (80)
-
Configuring language support
15 November 2010
Accessing the configuration and adding supported languages
To configure support for new languages, you need to go to the "Administrer" section of the site.
From there, in the navigation menu, you can reach a "Gestion des langues" section that lets you enable support for new languages.
Each newly added language remains deactivatable as long as no object has been created in that language. In that case, it becomes greyed out in the configuration and (...)
-
User profiles
12 April 2011
Each user has a profile page that lets them edit their personal information. In the default top-of-page menu, a menu item is automatically created when MediaSPIP is initialised, visible only when the visitor is logged in to the site.
The user can edit their profile from their author page; a "Modifier votre profil" link in the navigation is (...)
-
Contribute to translation
13 April 2011
You can help us improve the language used in the software interface to make MediaSPIP more accessible and user-friendly. You can also translate the interface into other languages, which allows MediaSPIP to spread to new linguistic communities.
To do this, we use the translation interface of SPIP, where all the language modules of MediaSPIP are available. Just subscribe to the mailing list and request further information on translation.
MediaSPIP is currently available in French and English (...)
On other sites (7157)
-
HTTP Livestreaming with ffmpeg
25 August 2016, by Hugo14453
Some context: I have an MKV file; I am attempting to stream it to http://localhost:8090/test.flv as an FLV file.
The stream begins and then immediately ends.
The command I am using is:
sudo ffmpeg -re -i input.mkv -c:v libx264 -maxrate 1000k -bufsize 2000k -an -bsf:v h264_mp4toannexb -g 50 http://localhost:8090/test.flv
A breakdown of what I believe these options do, in case this post becomes useful for someone else (a recap of the full command follows the list):
sudo
Run as root
ffmpeg
The FFmpeg command-line tool
-re
Stream in real-time
-i input.mkv
Input option and path to input file
-c:v libx264
Use codec libx264 for conversion
-maxrate 1000k -bufsize 2000k
No idea, some options for conversion, seems to help
-an -bsf:v h264_mp4toannexb
Audio options I think, not sure really. Also seems to help
-g 50
Still no idea, maybe the frame rate?
http://localhost:8090/test.flv
Output using http protocol to localhost on port 8090 as a file called test.flv
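For reference, here is the same command again with an explicit output format and a hedged reading of each option (my summary, not verified against this exact setup; input.mkv and the URL are simply the values used above, and -f flv is an addition to make the container explicit):
# -re                      read the input at its native frame rate (real-time pacing)
# -c:v libx264             encode the video stream with libx264
# -maxrate / -bufsize      rate-control ceiling and VBV buffer size for the encoder
# -an                      drop audio entirely
# -bsf:v h264_mp4toannexb  rewrite the H.264 stream from MP4 length-prefixed framing to Annex B
# -g 50                    GOP size (keyframe interval), not the frame rate
# -f flv                   force the FLV muxer for the HTTP output
ffmpeg -re -i input.mkv -c:v libx264 -maxrate 1000k -bufsize 2000k \
  -an -bsf:v h264_mp4toannexb -g 50 -f flv http://localhost:8090/test.flv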
Anyway the actual issue I have is that it begins to stream for about a second and then immediately ends.
The ffmpeg command output is:
ffmpeg version N-80901-gfebc862 Copyright (c) 2000-2016 the FFmpeg developers
built with gcc 4.8 (Ubuntu 4.8.4-2ubuntu1~14.04.3)
configuration: --extra-libs=-ldl --prefix=/opt/ffmpeg --mandir=/usr/share/man --enable-avresample --disable-debug --enable-nonfree --enable-gpl --enable-version3 --enable-libopencore-amrnb --enable-libopencore-amrwb --disable-decoder=amrnb --disable-decoder=amrwb --enable-libpulse --enable-libfreetype --enable-gnutls --enable-libx264 --enable-libx265 --enable-libfdk-aac --enable-libvorbis --enable-libmp3lame --enable-libopus --enable-libvpx --enable-libspeex --enable-libass --enable-avisynth --enable-libsoxr --enable-libxvid --enable-libvidstab
libavutil 55. 28.100 / 55. 28.100
libavcodec 57. 48.101 / 57. 48.101
libavformat 57. 41.100 / 57. 41.100
libavdevice 57. 0.102 / 57. 0.102
libavfilter 6. 47.100 / 6. 47.100
libavresample 3. 0. 0 / 3. 0. 0
libswscale 4. 1.100 / 4. 1.100
libswresample 2. 1.100 / 2. 1.100
libpostproc 54. 0.100 / 54. 0.100
Input #0, matroska,webm, from 'input.mkv':
Metadata:
encoder : libebml v1.3.0 + libmatroska v1.4.0
creation_time : 1970-01-01 00:00:02
Duration: 00:01:32.26, start: 0.000000, bitrate: 4432 kb/s
Stream #0:0(eng): Video: h264 (High 10), yuv420p10le, 1920x1080 [SAR 1:1 DAR 16:9], 23.98 fps, 23.98 tbr, 1k tbn, 47.95 tbc (default)
Stream #0:1(nor): Audio: flac, 48000 Hz, stereo, s16 (default)
[libx264 @ 0x2e1c380] using SAR=1/1
[libx264 @ 0x2e1c380] using cpu capabilities: MMX2 SSE2Fast SSSE3 SSE4.2 AVX
[libx264 @ 0x2e1c380] profile High, level 4.0
[libx264 @ 0x2e1c380] 264 - core 148 r2643 5c65704 - H.264/MPEG-4 AVC codec - Copyleft 2003-2015 - http://www.videolan.org/x264.html - options: cabac=1 ref=3 deblock=1:0:0 analyse=0x3:0x113 me=hex subme=7 psy=1 psy_rd=1.00:0.00 mixed_ref=1 me_range=16 chroma_me=1 trellis=1 8x8dct=1 cqm=0 deadzone=21,11 fast_pskip=1 chroma_qp_offset=-2 threads=1 lookahead_threads=1 sliced_threads=0 nr=0 decimate=1 interlaced=0 bluray_compat=0 constrained_intra=0 bframes=3 b_pyramid=2 b_adapt=1 b_bias=0 direct=1 weightb=1 open_gop=0 weightp=2 keyint=50 keyint_min=5 scenecut=40 intra_refresh=0 rc_lookahead=40 rc=crf mbtree=1 crf=23.0 qcomp=0.60 qpmin=0 qpmax=69 qpstep=4 vbv_maxrate=1000 vbv_bufsize=2000 crf_max=0.0 nal_hrd=none filler=0 ip_ratio=1.40 aq=1:1.00
[flv @ 0x2e3f0a0] Using AVStream.codec to pass codec parameters to muxers is deprecated, use AVStream.codecpar instead.
Output #0, flv, to 'http://localhost:8090/test.flv':
Metadata:
encoder : Lavf57.41.100
Stream #0:0(eng): Video: h264 (libx264) ([7][0][0][0] / 0x0007), yuv420p, 1920x1080 [SAR 1:1 DAR 16:9], q=-1--1, 23.98 fps, 1k tbn, 23.98 tbc (default)
Metadata:
encoder : Lavc57.48.101 libx264
Side data:
cpb: bitrate max/min/avg: 1000000/0/0 buffer size: 2000000 vbv_delay: -1
Stream mapping:
Stream #0:0 -> #0:0 (h264 (native) -> h264 (libx264))
Press [q] to stop, [?] for help
Killed 26 fps= 26 q=0.0 size= 0kB time=00:00:00.00 bitrate=N/A speed= 0x
The ffserver outputs:
Sat Aug 20 12:40:11 2016 File '/test.flv' not found
Sat Aug 20 12:40:11 2016 [SERVER IP] - - [POST] "/test.flv HTTP/1.1" 404 189
The config file is:
#Sample ffserver configuration file
# Port on which the server is listening. You must select a different
# port from your standard HTTP web server if it is running on the same
# computer.
Port 8090
# Address on which the server is bound. Only useful if you have
# several network interfaces.
BindAddress 0.0.0.0
# Number of simultaneous HTTP connections that can be handled. It has
# to be defined *before* the MaxClients parameter, since it defines the
# MaxClients maximum limit.
MaxHTTPConnections 2000
# Number of simultaneous requests that can be handled. Since FFServer
# is very fast, it is more likely that you will want to leave this high
# and use MaxBandwidth, below.
MaxClients 1000
# This the maximum amount of kbit/sec that you are prepared to
# consume when streaming to clients.
MaxBandwidth 1000
# Access log file (uses standard Apache log file format)
# '-' is the standard output.
CustomLog -
# Suppress that if you want to launch ffserver as a daemon.
#NoDaemon
##################################################################
# Definition of the live feeds. Each live feed contains one video
# and/or audio sequence coming from an ffmpeg encoder or another
# ffserver. This sequence may be encoded simultaneously with several
# codecs at several resolutions.
<Feed feed1.ffm>
ACL allow 192.168.0.0 192.168.255.255
# You must use 'ffmpeg' to send a live feed to ffserver. In this
# example, you can type:
#
#ffmpeg http://localhost:8090/test.ffm
# ffserver can also do time shifting. It means that it can stream any
# previously recorded live stream. The request should contain:
# "http://xxxx?date=[YYYY-MM-DDT][[HH:]MM:]SS[.m...]".You must specify
# a path where the feed is stored on disk. You also specify the
# maximum size of the feed, where zero means unlimited. Default:
# File=/tmp/feed_name.ffm FileMaxSize=5M
File /tmp/feed1.ffm
FileMaxSize 200m
# You could specify
# ReadOnlyFile /saved/specialvideo.ffm
# This marks the file as readonly and it will not be deleted or updated.
# Specify launch in order to start ffmpeg automatically.
# First ffmpeg must be defined with an appropriate path if needed,
# after that options can follow, but avoid adding the http:// field
#Launch ffmpeg
# Only allow connections from localhost to the feed.
ACL allow 127.0.0.1
</Feed>
##################################################################
# Now you can define each stream which will be generated from the
# original audio and video stream. Each format has a filename (here
# 'test1.mpg'). FFServer will send this stream when answering a
# request containing this filename.
<Stream test1.mpg>
# coming from live feed 'feed1'
Feed feed1.ffm
# Format of the stream : you can choose among:
# mpeg : MPEG-1 multiplexed video and audio
# mpegvideo : only MPEG-1 video
# mp2 : MPEG-2 audio (use AudioCodec to select layer 2 and 3 codec)
# ogg : Ogg format (Vorbis audio codec)
# rm : RealNetworks-compatible stream. Multiplexed audio and video.
# ra : RealNetworks-compatible stream. Audio only.
# mpjpeg : Multipart JPEG (works with Netscape without any plugin)
# jpeg : Generate a single JPEG image.
# asf : ASF compatible streaming (Windows Media Player format).
# swf : Macromedia Flash compatible stream
# avi : AVI format (MPEG-4 video, MPEG audio sound)
Format mpeg
# Bitrate for the audio stream. Codecs usually support only a few
# different bitrates.
AudioBitRate 32
# Number of audio channels: 1 = mono, 2 = stereo
AudioChannels 2
# Sampling frequency for audio. When using low bitrates, you should
# lower this frequency to 22050 or 11025. The supported frequencies
# depend on the selected audio codec.
AudioSampleRate 44100
# Bitrate for the video stream
VideoBitRate 64
# Ratecontrol buffer size
VideoBufferSize 40
# Number of frames per second
VideoFrameRate 3
# Size of the video frame: WxH (default: 160x128)
# The following abbreviations are defined: sqcif, qcif, cif, 4cif, qqvga,
# qvga, vga, svga, xga, uxga, qxga, sxga, qsxga, hsxga, wvga, wxga, wsxga,
# wuxga, woxga, wqsxga, wquxga, whsxga, whuxga, cga, ega, hd480, hd720,
# hd1080
VideoSize hd1080
# Transmit only intra frames (useful for low bitrates, but kills frame rate).
#VideoIntraOnly
# If non-intra only, an intra frame is transmitted every VideoGopSize
# frames. Video synchronization can only begin at an intra frame.
VideoGopSize 12
# More MPEG-4 parameters
# VideoHighQuality
# Video4MotionVector
# Choose your codecs:
#AudioCodec mp2
#VideoCodec mpeg1video
# Suppress audio
#NoAudio
# Suppress video
#NoVideo
#VideoQMin 3
#VideoQMax 31
# Set this to the number of seconds backwards in time to start. Note that
# most players will buffer 5-10 seconds of video, and also you need to allow
# for a keyframe to appear in the data stream.
#Preroll 15
# ACL:
# You can allow ranges of addresses (or single addresses)
ACL ALLOW localhost
# You can deny ranges of addresses (or single addresses)
#ACL DENY <first address> <last address>
# You can repeat the ACL allow/deny as often as you like. It is on a per
# stream basis. The first match defines the action. If there are no matches,
# then the default is the inverse of the last ACL statement.
#
# Thus 'ACL allow localhost' only allows access from localhost.
# 'ACL deny 1.0.0.0 1.255.255.255' would deny the whole of network 1 and
# allow everybody else.
</Stream>
##################################################################
# Example streams
# Multipart JPEG
#<stream>
#Feed feed1.ffm
#Format mpjpeg
#VideoFrameRate 2
#VideoIntraOnly
#NoAudio
#Strict -1
#</stream>
# Single JPEG
#<stream>
#Feed feed1.ffm
#Format jpeg
#VideoFrameRate 2
#VideoIntraOnly
##VideoSize 352x240
#NoAudio
#Strict -1
#</stream>
# Flash
#<stream>
#Feed feed1.ffm
#Format swf
#VideoFrameRate 2
#VideoIntraOnly
#NoAudio
#</stream>
# ASF compatible
<stream>
Feed feed1.ffm
Format asf
VideoFrameRate 15
VideoSize 352x240
VideoBitRate 256
VideoBufferSize 40
VideoGopSize 30
AudioBitRate 64
StartSendOnKey
</stream>
# MP3 audio
#<stream>
#Feed feed1.ffm
#Format mp2
#AudioCodec mp3
#AudioBitRate 64
#AudioChannels 1
#AudioSampleRate 44100
#NoVideo
#</stream>
# Ogg Vorbis audio
#<stream>
#Feed feed1.ffm
#Title "Stream title"
#AudioBitRate 64
#AudioChannels 2
#AudioSampleRate 44100
#NoVideo
#</stream>
# Real with audio only at 32 kbits
#<stream>
#Feed feed1.ffm
#Format rm
#AudioBitRate 32
#NoVideo
#NoAudio
#</stream>
# Real with audio and video at 64 kbits
#<stream>
#Feed feed1.ffm
#Format rm
#AudioBitRate 32
#VideoBitRate 128
#VideoFrameRate 25
#VideoGopSize 25
#NoAudio
#</stream>
##################################################################
# A stream coming from a file: you only need to set the input
# filename and optionally a new format. Supported conversions:
# AVI -> ASF
#<stream>
#File "/usr/local/httpd/htdocs/tlive.rm"
#NoAudio
#</stream>
#<stream>
#File "/usr/local/httpd/htdocs/test.asf"
#NoAudio
#Author "Me"
#Copyright "Super MegaCorp"
#Title "Test stream from disk"
#Comment "Test comment"
#</stream>
##################################################################
# RTSP examples
#
# You can access this stream with the RTSP URL:
# rtsp://localhost:5454/test1-rtsp.mpg
#
# A non-standard RTSP redirector is also created. Its URL is:
# http://localhost:8090/test1-rtsp.rtsp
#<stream>
#Format rtp
#File "/usr/local/httpd/htdocs/test1.mpg"
#</stream>
# Transcode an incoming live feed to another live feed,
# using libx264 and video presets
#<stream>
#Format rtp
#Feed feed1.ffm
#VideoCodec libx264
#VideoFrameRate 24
#VideoBitRate 100
#VideoSize 480x272
#AVPresetVideo default
#AVPresetVideo baseline
#AVOptionVideo flags +global_header
#
#AudioCodec libfaac
#AudioBitRate 32
#AudioChannels 2
#AudioSampleRate 22050
#AVOptionAudio flags +global_header
#</stream>
##################################################################
# SDP/multicast examples
#
# If you want to send your stream in multicast, you must set the
# multicast address with MulticastAddress. The port and the TTL can
# also be set.
#
# An SDP file is automatically generated by ffserver by adding the
# 'sdp' extension to the stream name (here
# http://localhost:8090/test1-sdp.sdp). You should usually give this
# file to your player to play the stream.
#
# The 'NoLoop' option can be used to avoid looping when the stream is
# terminated.
#<stream>
#Format rtp
#File "/usr/local/httpd/htdocs/test1.mpg"
#MulticastAddress 224.124.0.1
#MulticastPort 5000
#MulticastTTL 16
#NoLoop
#</stream>
##################################################################
# Special streams
# Server status
<stream>
Format status
# Only allow local people to get the status
ACL allow localhost
ACL allow 192.168.0.0 192.168.255.255
#FaviconURL http://pond1.gladstonefamily.net:8080/favicon.ico
</stream>
# Redirect index.html to the appropriate site
<Redirect index.html>
URL http://www.ffmpeg.org/
</Redirect>
#http://www.ffmpeg.org/
Any help is greatly appreciated; I will do my best to draw a picture of the best answer based on their username.
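A sketch rather than a verified fix, for anyone landing here: the ffserver log above returns 404 for /test.flv, and ffserver normally expects the encoder to publish to the feed URL declared in its <Feed> section, while players request the names declared in the <Stream> sections. Assuming the feed above is reachable as feed1.ffm (inferred from the "File /tmp/feed1.ffm" line), the publishing command would look roughly like:
# publish the source to the feed; ffserver then encodes it per its <Stream> sections
ffmpeg -re -i input.mkv http://localhost:8090/feed1.ffm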
-
FFmpeg dumping an RTSP stream works on Windows but not on Ubuntu
9 August 2016, by cuentafalsa7
I have a camera with an RTSP server. When I try to save the stream using FFmpeg, it works on Windows; on Ubuntu it doesn't.
The command on Windows is:
ffmpeg.exe -i "rtsp://192.168.1.10:1236/?videoapi=mc&h264=1000-20-1280-960" -r 20 test.mp4
In Linux :
ffmpeg -i "rtsp://192.168.1.10:1236/?videoapi=mc&h264=1000-20-1280-960" -r 20 test.mp4
The output in Linux is :
$ ffmpeg -i "rtsp://192.168.1.10:1236/?videoapi=mr&h264=1000-20-1280-960" -r 20 test.mp4
ffmpeg version 2.8.6-1ubuntu2 Copyright (c) 2000-2016 the FFmpeg developers
built with gcc 5.3.1 (Ubuntu 5.3.1-11ubuntu1) 20160311
configuration: --prefix=/usr --extra-version=1ubuntu2 --build-suffix=-ffmpeg --toolchain=hardened --libdir=/usr/lib/x86_64-linux-gnu --incdir=/usr/include/x86_64-linux-gnu --cc=cc --cxx=g++ --enable-gpl --enable-shared --disable-stripping --disable-decoder=libopenjpeg --disable-decoder=libschroedinger --enable-avresample --enable-avisynth --enable-gnutls --enable-ladspa --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libmodplug --enable-libmp3lame --enable-libopenjpeg --enable-libopus --enable-libpulse --enable-librtmp --enable-libschroedinger --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx265 --enable-libxvid --enable-libzvbi --enable-openal --enable-opengl --enable-x11grab --enable-libdc1394 --enable-libiec61883 --enable-libzmq --enable-frei0r --enable-libx264 --enable-libopencv
libavutil 54. 31.100 / 54. 31.100
libavcodec 56. 60.100 / 56. 60.100
libavformat 56. 40.101 / 56. 40.101
libavdevice 56. 4.100 / 56. 4.100
libavfilter 5. 40.101 / 5. 40.101
libavresample 2. 1. 0 / 2. 1. 0
libswscale 3. 1.101 / 3. 1.101
libswresample 1. 2.101 / 1. 2.101
libpostproc 53. 3.100 / 53. 3.100
[rtsp @ 0x1f18340] UDP timeout, retrying with TCP
[rtsp @ 0x1f18340] Nonmatching transport in server reply
[rtsp @ 0x1f18340] Could not find codec parameters for stream 0 (Video: h264, none): unspecified size
Consider increasing the value for the 'analyzeduration' and 'probesize' options
rtsp://192.168.1.10:1236/?videoapi=mr&h264=1000-20-1280-960: could not find codec parameters
Input #0, rtsp, from 'rtsp://192.168.1.10:1236/?videoapi=mr&h264=1000-20-1280-960':
Metadata:
title : Unnamed
comment : N/A
Duration: N/A, bitrate: N/A
Stream #0:0: Video: h264, none, 90k tbr, 90k tbn, 180k tbc
Output #0, mp4, to 'test.mp4':
Output file #0 does not contain any stream
The ffmpeg.exe output is:
>ffmpeg.exe
ffmpeg version N-81300-gce2217b Copyright (c) 2000-2016 the FFmpeg developers
built with gcc 5.4.0 (GCC)
configuration: --enable-gpl --enable-version3 --disable-w32threads --enable-dxva2 --enable-libmfx --enable-nvenc --enable-avisynth --enable-bzlib --enable-lib
ebur128 --enable-fontconfig --enable-frei0r --enable-gnutls --enable-iconv --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libfree
type --enable-libgme --enable-libgsm --enable-libilbc --enable-libmodplug --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-lib
openjpeg --enable-libopus --enable-librtmp --enable-libschroedinger --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libtheora --enable-libtwolame
--enable-libvidstab --enable-libvo-amrwbenc --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx264 --enable-libx265 --enable-
libxavs --enable-libxvid --enable-libzimg --enable-lzma --enable-decklink --enable-zlib
libavutil 55. 28.100 / 55. 28.100
libavcodec 57. 51.100 / 57. 51.100
libavformat 57. 46.100 / 57. 46.100
libavdevice 57. 0.102 / 57. 0.102
libavfilter 6. 50.100 / 6. 50.100
libswscale 4. 1.100 / 4. 1.100
libswresample 2. 1.100 / 2. 1.100
libpostproc 54. 0.100 / 54. 0.100
Hyper fast Audio and Video encoder
usage: ffmpeg [options] [[infile options] -i infile]... {[outfile options] outfile}...
Use -h to get full help or, even better, run 'man ffmpeg'
The ffmpeg output in Ubuntu is:
$ ffmpeg
ffmpeg version 2.8.6-1ubuntu2 Copyright (c) 2000-2016 the FFmpeg developers
built with gcc 5.3.1 (Ubuntu 5.3.1-11ubuntu1) 20160311
configuration: --prefix=/usr --extra-version=1ubuntu2 --build-suffix=-ffmpeg --toolchain=hardened --libdir=/usr/lib/x86_64-linux-gnu --incdir=/usr/include/x86_64-linux-gnu --cc=cc --cxx=g++ --enable-gpl --enable-shared --disable-stripping --disable-decoder=libopenjpeg --disable-decoder=libschroedinger --enable-avresample --enable-avisynth --enable-gnutls --enable-ladspa --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libmodplug --enable-libmp3lame --enable-libopenjpeg --enable-libopus --enable-libpulse --enable-librtmp --enable-libschroedinger --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx265 --enable-libxvid --enable-libzvbi --enable-openal --enable-opengl --enable-x11grab --enable-libdc1394 --enable-libiec61883 --enable-libzmq --enable-frei0r --enable-libx264 --enable-libopencv
libavutil 54. 31.100 / 54. 31.100
libavcodec 56. 60.100 / 56. 60.100
libavformat 56. 40.101 / 56. 40.101
libavdevice 56. 4.100 / 56. 4.100
libavfilter 5. 40.101 / 5. 40.101
libavresample 2. 1. 0 / 2. 1. 0
libswscale 3. 1.101 / 3. 1.101
libswresample 1. 2.101 / 1. 2.101
libpostproc 53. 3.100 / 53. 3.100
Hyper fast Audio and Video encoder
usage: ffmpeg [options] [[infile options] -i infile]... {[outfile options] outfile}...
Use -h to get full help or, even better, run 'man ffmpeg'
I'm using Windows 7 64-bit and Ubuntu 16.04.
I have tried increasing the suggested values, but that didn't work either.
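For what it is worth, a sketch of the variant suggested by the Linux log (not verified against this camera): since it prints "UDP timeout, retrying with TCP" and then "Nonmatching transport in server reply", forcing TCP transport explicitly, together with larger probe values, is the obvious next thing to try:
ffmpeg -rtsp_transport tcp -analyzeduration 10000000 -probesize 10000000 \
  -i "rtsp://192.168.1.10:1236/?videoapi=mc&h264=1000-20-1280-960" -r 20 test.mp4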
Any ideas, other than switching to Windows?
-
av_read_frame function in ffmpeg in android always returning packet.stream_index as 0
31 July 2013, by droidmad
I am using the standard code pasted below (ref: http://dranger.com/ffmpeg/) to use ffmpeg on Android via the NDK. My code works fine on Ubuntu 10.04 with the gcc compiler, but I am facing an issue on Android: av_read_frame(pFormatCtx, &packet) always returns packet.stream_index=0. I have tested my code with various RTSP URLs and get the same behaviour in all cases. I do not have any linking or compiling issues; everything seems to work fine except this. I have been trying to solve this for the last 2 days but I am stuck badly. Please point me in the right direction.
#include <jni.h>
#include <android/log.h>
#include <libavcodec/avcodec.h>
#include <libavformat/avformat.h>
#include <libswscale/swscale.h>
#include <stdio.h> /* this include was garbled in the original post; stdio.h assumed */
#define DEBUG_TAG "mydebug_ndk"
jint Java_com_example_tut2_MainActivity_myfunc(JNIEnv * env, jobject this,jstring myjurl) {
AVFormatContext *pFormatCtx = NULL;
int i, videoStream;
AVCodecContext *pCodecCtx = NULL;
AVCodec *pCodec = NULL;
AVFrame *pFrame = NULL;
AVPacket packet;
int frameFinished;
AVDictionary *optionsDict = NULL;
struct SwsContext *sws_ctx = NULL;
jboolean isCopy;
const char * mycurl = (*env)->GetStringUTFChars(env, myjurl, &isCopy);
__android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "NDK:p2: [%s]", mycurl);
// Register all formats and codecs
av_register_all();
avformat_network_init();
// Open video file
__android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "NDK:before_open");
if(avformat_open_input(&pFormatCtx, mycurl, NULL, NULL)!=0)
return -1;
__android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "start: %d\t%d\n",pFormatCtx->raw_packet_buffer_remaining_size,pFormatCtx->max_index_size);
(*env)->ReleaseStringUTFChars(env, myjurl, mycurl);
__android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "NDK:before_stream");
// Retrieve stream information
if(avformat_find_stream_info(pFormatCtx, NULL)<0)
return -1;
__android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "NDK:after_stream");
// Find the first video stream
videoStream=-1;
for(i=0; i<pFormatCtx->nb_streams; i++)
if(pFormatCtx->streams[i]->codec->codec_type==AVMEDIA_TYPE_VIDEO) {
videoStream=i;
break;
}
if(videoStream==-1)
return -1; // Didn't find a video stream
__android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "NDK:after_videostream");
// Get a pointer to the codec context for the video stream
pCodecCtx=pFormatCtx->streams[videoStream]->codec;
__android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "NDK:after_codec_context");
// Find the decoder for the video stream
pCodec=avcodec_find_decoder(pCodecCtx->codec_id);
__android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "NDK:after_decoder");
if(pCodec==NULL)
return -1;
__android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "NDK:found_decoder");
// Open codec
if(avcodec_open2(pCodecCtx, pCodec, &optionsDict)<0)
return -1;
// Allocate video frame
pFrame=avcodec_alloc_frame();
sws_ctx = sws_getContext(pCodecCtx->width,pCodecCtx->height,pCodecCtx->pix_fmt,pCodecCtx->width,
pCodecCtx->height,PIX_FMT_YUV420P,SWS_BILINEAR,NULL,NULL,NULL);
__android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "NDK:before_while");
int count=0;
while(av_read_frame(pFormatCtx, &packet)>=0) {
__android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "entered while: %d %d %d\n", packet.duration,packet.stream_index,packet.size);
if(packet.stream_index==videoStream) {
// Decode video frame
//break;
avcodec_decode_video2(pCodecCtx, pFrame, &frameFinished,&packet);
__android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "NDK:in_while");
if(frameFinished) {
__android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "NDK:gng_out_of_while");
break;
}
}
// Free the packet that was allocated by av_read_frame
av_free_packet(&packet);
if(++count>1000)
return -2; //infinite while loop
}
__android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "NDK:after_while");
// Free the YUV frame
av_free(pFrame);
// Close the codec
avcodec_close(pCodecCtx);
// Close the video file
avformat_close_input(&pFormatCtx);
return 0;
}
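As a quick cross-check outside the NDK build (a sketch; the URL below is only a placeholder for whichever RTSP source is being tested), ffprobe on a desktop machine lists every stream the source exposes together with its index, which makes it easy to confirm whether a separate video stream is ever reported:
ffprobe -rtsp_transport tcp -show_streams "rtsp://192.168.1.10:554/stream1"  # placeholder URL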