
Other articles (65)
-
Participate in its translation
10 April 2011: You can help us improve the wording used in the software, or translate it into any new language so that it can reach new linguistic communities.
To do so, use SPIP's translation interface, where all of MediaSPIP's language modules are available. Simply subscribe to the translators' mailing list to ask for more information.
At present, MediaSPIP is only available in French and (...) -
Encoding and processing into web-friendly formats
13 April 2011: MediaSPIP automatically converts uploaded files to internet-compatible formats.
Video files are encoded to MP4, Ogv and WebM for HTML5 playback, and to MP4 for Flash playback (a rough ffmpeg illustration of these conversions follows this article list).
Audio files are encoded to MP3 and Ogg for HTML5 playback, and to MP3 for Flash playback.
Where possible, text is analysed to extract the data needed by search engines, and is then exported as a series of image files.
All uploaded files are stored online in their original format, so you can (...) -
The statuses of mutualisation instances
13 March 2010: For reasons of general compatibility between the mutualisation-management plugin and SPIP's original functions, instance statuses are the same as for any other object (articles, etc.); only their names in the interface differ slightly.
The possible statuses are: prepa (requested), which corresponds to an instance requested by a user (if the site has already been created in the past, it is switched to disabled mode), and publie (validated), which corresponds to an instance validated by a (...)
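As a concrete illustration of the conversions described in the encoding article above, the commands below show the kind of encodings involved. This is only a command-line sketch of the idea, not MediaSPIP's actual pipeline, and the file names are placeholders:
ffmpeg -i upload.mov -c:v libx264 -c:a aac video.mp4
ffmpeg -i upload.mov -c:v libtheora -c:a libvorbis video.ogv
ffmpeg -i upload.mov -c:v libvpx -c:a libvorbis video.webm
ffmpeg -i upload.wav -c:a libmp3lame audio.mp3
ffmpeg -i upload.wav -c:a libvorbis audio.ogg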
On other sites (14217)
-
Unable to merge videos in Android using JavaCV ("Sample Description" Error)
15 December 2015, by San: I am creating a video from images via FFmpeg and I am able to get the video from the images. I am also using JavaCV to merge two videos, and I can join videos with JavaCV without any issue, provided both videos were actually recorded with the mobile camera.
The issue I'm facing:
I am not able to merge the video generated from the images via FFmpeg with the video the user has chosen, which will usually be a video recorded with the mobile camera rather than one generated from images.
Code:
Code to generate a video from images:
FFmpegFrameRecorder recorder = new FFmpegFrameRecorder(path + "/" + "dec16.mp4", 800, 400);
try {
    recorder.setVideoCodec(avcodec.AV_CODEC_ID_MPEG4);
    //recorder.setVideoCodec(avcodec.AV_CODEC_ID_H264);
    recorder.setVideoCodecName("H264");
    recorder.setVideoOption("preset", "ultrafast");
    recorder.setFormat("mp4");
    recorder.setFrameRate(frameRate);
    recorder.setVideoBitrate(60);
    recorder.setPixelFormat(avutil.AV_PIX_FMT_YUV420P);
    startTime = System.currentTimeMillis();
    recorder.start();
    // The body of this loop was garbled in the original post; the lines below follow the
    // usual JavaCV pattern: advance the recorder timestamp, then record the frame.
    for (int j = 0; j < images.length; j++) {   // loop bound lost in the original; 'images' is assumed
        // the original presumably converted images[j] into 'image' here
        long t = 1000 * (System.currentTimeMillis() - startTime);   // timestamp in microseconds
        if (t > recorder.getTimestamp()) {
            recorder.setTimestamp(t);
            recorder.record(image);
        }
    }
    recorder.stop();
} catch (Exception e) {
    e.printStackTrace();
}
Code to merge videos:
// mp4parser classes used here: Movie, Track, MovieCreator, AppendTrack, DefaultMp4Builder, IsoFile
// (the corresponding imports were omitted in the original post).
int count = file_path.size();
System.out.println("final_joined_list size " + file_path.size());
if (file_path.size() != 1) {
    try {
        Movie[] inMovies = new Movie[count];
        mediaStorageDir = new File(
                Environment.getExternalStorageDirectory() + "/Pictures");
        for (int i = 0; i < count; i++) {
            File file = new File(file_path.get(i));
            System.out.println("fileeeeeeeeeeeeeeee " + file);
            System.out.println("file exists!!!!!!!!!!");
            FileInputStream fis = new FileInputStream(file);
            FileChannel fc = fis.getChannel();
            inMovies[i] = MovieCreator.build(fc);
            fis.close();
            fc.close();
        }
        List<Track> videoTracks = new LinkedList<Track>();
        List<Track> audioTracks = new LinkedList<Track>();
        Log.d("Movies length", "isss " + inMovies.length);
        if (inMovies.length != 0) {
            for (Movie m : inMovies) {
                for (Track t : m.getTracks()) {
                    if (t.getHandler().equals("soun")) {
                        audioTracks.add(t);
                    }
                    if (t.getHandler().equals("vide")) {
                        videoTracks.add(t);
                    }
                    if (t.getHandler().equals("")) {
                    }
                }
            }
        }
        Movie result = new Movie();
        System.out.println("audio and videoo tracks : "
                + audioTracks.size() + " , " + videoTracks.size());
        if (audioTracks.size() > 0) {
            result.addTrack(new AppendTrack(audioTracks
                    .toArray(new Track[audioTracks.size()])));
        }
        if (videoTracks.size() > 0) {
            result.addTrack(new AppendTrack(videoTracks
                    .toArray(new Track[videoTracks.size()])));
        }
        IsoFile out = null;
        try {
            out = (IsoFile) new DefaultMp4Builder().build(result);
        } catch (Exception e) {
            e.printStackTrace();
        }
        long timestamp = new Date().getTime();
        String timestampS = "" + timestamp;
        File storagePath = new File(mediaStorageDir + File.separator);
        storagePath.mkdirs();
        File myMovie = new File(storagePath, String.format("%s.mp4", timestampS));
        FileOutputStream fos = new FileOutputStream(myMovie);
        FileChannel fco = fos.getChannel();
        fco.position(0);
        out.getBox(fco);
        fco.close();
        fos.close();
    } catch (FileNotFoundException e) {
        e.printStackTrace();
    } catch (IOException e) {
        e.printStackTrace();
    }
    String mFileName = Environment.getExternalStorageDirectory()
            .getAbsolutePath();
    // mFileName += "/output.mp4";
    File sdCardRoot = Environment.getExternalStorageDirectory();
    File yourDir = new File(mediaStorageDir + File.separator);
    for (File f : yourDir.listFiles()) {
        if (f.isFile())
            name = f.getName();
        // make something with the name
    }
    mFileName = mediaStorageDir.getPath() + File.separator + "output-%s.mp4";
    System.out.println("final filename : "
            + mediaStorageDir.getPath() + File.separator
            + "output-%s.mp4" + " names of files : " + name);
    single_video = false;
    return name;
} else {
    single_video = true;
    name = file_path.get(0);
    return name;
}
Error:
The error I get when trying to merge the image-generated video with a normal video is:
12-15 12:26:06.155 26022-26111/? W/System.err﹕ java.io.IOException: Cannot append com.googlecode.mp4parser.authoring.Mp4TrackImpl@45417c38 to com.googlecode.mp4parser.authoring.Mp4TrackImpl@44ffac60 since their Sample Description Boxes differ
12-15 12:26:06.155 26022-26111/? W/System.err﹕ at com.googlecode.mp4parser.authoring.tracks.AppendTrack.<init>(AppendTrack.java:48)
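A quick way to see what actually differs between the two inputs is to dump the video-stream parameters of both files. A minimal sketch, assuming ffprobe is available and using camera_clip.mp4 as a placeholder for the recorded video:
ffprobe -v error -select_streams v:0 -show_entries stream=codec_name,profile,level,width,height,pix_fmt,avg_frame_rate -of default=noprint_wrappers=1 dec16.mp4
ffprobe -v error -select_streams v:0 -show_entries stream=codec_name,profile,level,width,height,pix_fmt,avg_frame_rate -of default=noprint_wrappers=1 camera_clip.mp4
Most of these fields end up in the track's sample description, so any mismatch here typically explains the error above.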
Fix that I tried:
Searching on Google suggested changing the codec in JavaCV from avcodec.AV_CODEC_ID_MPEG4 to avcodec.AV_CODEC_ID_H264. But when I do that, I can no longer generate the video from the images, and the following error is thrown:
12-15 12:26:05.840 26022-26089/? W/linker﹕ libavcodec.so has text relocations. This is wasting memory and is a security risk. Please fix.
12-15 12:26:05.975 26022-26089/? W/System.err﹕ com.googlecode.javacv.FrameRecorder$Exception: avcodec_open2() error -1: Could not open video codec.
12-15 12:26:05.975 26022-26089/? W/System.err﹕ at com.googlecode.javacv.FFmpegFrameRecorder.startUnsafe(FFmpegFrameRecorder.java:492)
12-15 12:26:05.975 26022-26089/? W/System.err﹕ at com.googlecode.javacv.FFmpegFrameRecorder.start(FFmpegFrameRecorder.java:267)
What I need:
Creating the video from images is unavoidable, and that video will definitely have to be merged with other videos that may use any codec or format. So I need a way to merge any kind of videos irrespective of their codecs or other parameters. I am trying to keep it simple by just using the JARs and .so files, and I don't want to drive myself crazy with a full-scale build of the FFmpeg library. That said, I am ready to look into that library if there is no other way to achieve this, but a solid resource with almost-working code would be much appreciated. Cheers.
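Since AppendTrack only joins tracks whose sample descriptions match, one possible approach is to re-encode every input to a single common set of parameters first and then concatenate the results. A rough sketch of that idea on the command line, where camera_clip.mp4 is a placeholder, the 800x400/yuv420p values mirror the recorder settings above, 25 fps is an assumption, and -an drops audio because the image-generated clip has none:
ffmpeg -i dec16.mp4 -c:v libx264 -preset ultrafast -pix_fmt yuv420p -vf scale=800:400 -r 25 -an norm_0.mp4
ffmpeg -i camera_clip.mp4 -c:v libx264 -preset ultrafast -pix_fmt yuv420p -vf scale=800:400 -r 25 -an norm_1.mp4
printf "file 'norm_0.mp4'\nfile 'norm_1.mp4'\n" > list.txt
ffmpeg -f concat -safe 0 -i list.txt -c copy merged.mp4
The same normalisation could, in principle, be done on-device with JavaCV's FFmpegFrameGrabber and FFmpegFrameRecorder if shipping an ffmpeg binary is not an option.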
Update:
I looked through the issues on OpenCV's GitHub but didn't find anything solid there.
OpenCV Issues
-
ffmpeg-mp4box-mpeg dash plays only few segments
30 October 2015, by Idris: I need help debugging the DASH segment files.
The input was an MP4 with the details below. It was recorded with a video camera; the camera output was MKV, and we converted it to MP4 after editing the audio in Adobe.
- Size: 7.51 GB
- Frame rate: 25 frames/second
- Data rate: 25326 kbps
- Total bitrate: 25525 kbps
Converted this to another MP4 with these commands:
ffmpeg -i "input.mp4" -s 1280x720 -c:v libx264 -b:v 750k -bf 2 -g 75 -sc_threshold 0 -an video_1280x720_750k.mp4
ffmpeg -i "input.mp4" -c:a aac -strict experimental -b:a 96k -ar 32000 -vn audio_96k.mp4
The output video has:
- fps: 25
- Data rate: 761 kbps
- Bitrate: 761 kbps
Then created the segmented DASH output with MP4Box:
MP4Box -dash 10000 -frag 10000 -rap -segment-name video_0_1280000\segment_ video_1280x720_750k.mp4
MP4Box -dash 3000 -frag 10000 -rap -segment-name audio_0_96000\segment_ audio_96k.mp4
The generated MPD was validated online and it is fine.
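Two things stand out in the commands above: the video is cut into 10-second segments while the audio uses 3-second ones, and the resulting init segments signal avc3 (in-band parameter sets), which some players of that era handled less reliably than avc1. A variant worth trying, purely as a sketch (the output name and profile are assumptions, not from the original post), dashes both streams in one call with matching segment lengths and disables bitstream switching so the video should be signalled as avc1:
MP4Box -dash 10000 -frag 10000 -rap -bs-switching no -profile live -out manifest.mpd video_1280x720_750k.mp4 audio_96k.mp4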
UPDATE! The MPD file is included below.
<?xml version="1.0"?>
<MPD xmlns="urn:mpeg:dash:schema:mpd:2011" minBufferTime="PT1.500S" type="static" mediaPresentationDuration="PT0H2M0.000S" maxSegmentDuration="PT0H0M10.000S" profiles="urn:mpeg:dash:profile:full:2011">
 <ProgramInformation moreInformationURL="http://gpac.sourceforge.net">
 </ProgramInformation>
 <Period duration="PT0H2M0.000S">
  <AdaptationSet segmentAlignment="true" lang="eng">
   <Representation mimeType="audio/mp4" codecs="mp4a.40.2" audioSamplingRate="32000" startWithSAP="1" bandwidth="98434">
    <AudioChannelConfiguration schemeIdUri="urn:mpeg:dash:23003:3:audio_channel_configuration:2011" value="2"/>
    <SegmentList timescale="32000" duration="319999">
     <Initialization sourceURL="audio_0_96000/segment_init.mp4"/>
     <SegmentURL media="audio_0_96000/segment_1.m4s"/>
     <SegmentURL media="audio_0_96000/segment_2.m4s"/>
     <SegmentURL media="audio_0_96000/segment_3.m4s"/>
     <SegmentURL media="audio_0_96000/segment_4.m4s"/>
     <SegmentURL media="audio_0_96000/segment_5.m4s"/>
     <SegmentURL media="audio_0_96000/segment_6.m4s"/>
     <SegmentURL media="audio_0_96000/segment_7.m4s"/>
     <SegmentURL media="audio_0_96000/segment_8.m4s"/>
     <SegmentURL media="audio_0_96000/segment_9.m4s"/>
     <SegmentURL media="audio_0_96000/segment_10.m4s"/>
     <SegmentURL media="audio_0_96000/segment_11.m4s"/>
     <SegmentURL media="audio_0_96000/segment_12.m4s"/>
     <SegmentURL media="audio_0_96000/segment_13.m4s"/>
    </SegmentList>
   </Representation>
  </AdaptationSet>
  <AdaptationSet segmentAlignment="true" maxWidth="1280" maxHeight="720" maxFrameRate="25" par="16:9" lang="eng">
   <Representation mimeType="video/mp4" codecs="avc3.64001f" width="1280" height="720" frameRate="25" sar="1:1" startWithSAP="1" bandwidth="764668">
    <SegmentList timescale="12800" duration="125866">
     <Initialization sourceURL="video_0_1280000/segment_init.mp4"/>
     <SegmentURL media="video_0_1280000/segment_1.m4s"/>
     <SegmentURL media="video_0_1280000/segment_2.m4s"/>
     <SegmentURL media="video_0_1280000/segment_3.m4s"/>
     <SegmentURL media="video_0_1280000/segment_4.m4s"/>
     <SegmentURL media="video_0_1280000/segment_5.m4s"/>
     <SegmentURL media="video_0_1280000/segment_6.m4s"/>
     <SegmentURL media="video_0_1280000/segment_7.m4s"/>
     <SegmentURL media="video_0_1280000/segment_8.m4s"/>
     <SegmentURL media="video_0_1280000/segment_9.m4s"/>
     <SegmentURL media="video_0_1280000/segment_10.m4s"/>
     <SegmentURL media="video_0_1280000/segment_11.m4s"/>
     <SegmentURL media="video_0_1280000/segment_12.m4s"/>
     <SegmentURL media="video_0_1280000/segment_13.m4s"/>
    </SegmentList>
   </Representation>
  </AdaptationSet>
 </Period>
</MPD>
I played the video through dash.js. I believe it plays only the initial segment and then errors out with MEDIA_ERR_DECODE or MEDIA_ERR_SRC_NOT_SUPPORTED, or with a message saying the start was not found.
Through Chrome's debugging tools I can see that at least 4 segments load correctly, so I am not sure what is going on.
Any help in debugging this is really appreciated. I can't tell whether the problem is with the file, ffmpeg, MP4Box, or Chrome.
Output from the Chrome debugging console:
[dash.js 1.5.1] new MediaPlayer instance has been created
dash.all.js:11 Playback initiated!
dash.all.js:11 Parsing complete: ( xml2json: 5ms, objectiron: 10ms, total: 0.015s)
dash.all.js:11 Manifest has been refreshed at Mon Oct 26 2015 10:19:22 GMT-0400 (Eastern Daylight Time)[1445869162092]
dash.all.js:11 SegmentTimeline detected using calculated Live Edge Time
dash.all.js:11 MediaSource is open!
dash.all.js:11 [object Event]
dash.all.js:11 Duration successfully set to: 120
dash.all.js:11 Added 0 inline events
dash.all.js:11 video codec: video/mp4;codecs="avc3.64001f"
dash.all.js:11 [video] stop
dash.all.js:11 audio codec: audio/mp4;codecs="mp4a.40.2"
dash.all.js:11 [audio] stop
dash.all.js:11 No text data.
dash.all.js:11 No fragmentedText data.
dash.all.js:11 No muxed data.
dash.all.js:11 [video] start
dash.all.js:11 [video] Getting the request for time: 0
dash.all.js:11 [video] Index for time 0 is 0
dash.all.js:11 [video] SegmentList: 0 / 120
dash.all.js:11 [audio] start
dash.all.js:11 [audio] Getting the request for time: 0
dash.all.js:11 [audio] Index for time 0 is 0
dash.all.js:11 [audio] SegmentList: 0 / 120
dash.all.js:11 [video] Getting the request for time: 9.83328125
dash.all.js:11 [video] Index for time 9.83328125 is 0
dash.all.js:11 [video] SegmentList: 0 / 120
dash.all.js:11 [video] SegmentList: 9.83328125 / 120
dash.all.js:11 [audio] Getting the request for time: 9.99996875
dash.all.js:11 [audio] Index for time 9.99996875 is 0
dash.all.js:11 [audio] SegmentList: 0 / 120
dash.all.js:11 [audio] SegmentList: 9.99996875 / 120
dash.all.js:11 loaded audio:Media Segment:0 (200, 20ms, 6ms)
dash.all.js:11 loaded video:Media Segment:0 (200, 153ms, 43ms)
dash.all.js:11 loaded video:Initialization Segment:NaN (200, 0ms, 32ms)
dash.all.js:11 [video] Initialization finished loading
dash.all.js:11 loaded audio:Initialization Segment:NaN (200, 0ms, 34ms)
dash.all.js:11 [audio] Initialization finished loading
dash.all.js:11 [video] Getting the request for time: 19.6665625
dash.all.js:11 [video] Index for time 19.6665625 is 1
dash.all.js:11 [video] SegmentList: 9.83328125 / 120
dash.all.js:11 [video] SegmentList: 19.6665625 / 120
dash.all.js:11 [audio] Getting the request for time: 19.9999375
dash.all.js:11 [audio] Index for time 19.9999375 is 1
dash.all.js:11 [audio] SegmentList: 9.99996875 / 120
dash.all.js:11 [audio] SegmentList: 19.9999375 / 120
dash.all.js:11 [video] Stalling Buffer
dash.all.js:11 [video] Waiting for more buffer before starting playback.
dash.all.js:11 [video] Getting the request for time: 0
dash.all.js:11 [video] Index for time 0 is 0
dash.all.js:11 [audio] Stalling Buffer
dash.all.js:11 [audio] Waiting for more buffer before starting playback.
dash.all.js:11 [audio] Getting the request for time: 0
dash.all.js:11 [audio] Index for time 0 is 0
dash.all.js:11 <video> loadedmetadata
dash.all.js:11 Starting playback at offset: 0
dash.all.js:11 [video] Getting the request for time: 29.499843750000004
dash.all.js:11 [video] Index for time 29.499843750000004 is 2
dash.all.js:11 [video] SegmentList: 19.6665625 / 120
dash.all.js:11 [video] SegmentList: 29.499843750000004 / 120
dash.all.js:11 [video] Got enough buffer to start.
dash.all.js:11 [video] seek: 0
dash.all.js:11 [audio] Getting the request for time: 29.999906250000002
dash.all.js:11 [audio] Index for time 29.999906250000002 is 2
dash.all.js:11 [audio] SegmentList: 19.9999375 / 120
dash.all.js:11 [audio] SegmentList: 29.999906250000002 / 120
dash.all.js:11 [audio] Got enough buffer to start.
dash.all.js:11 [audio] seek: 0
dash.all.js:11 loaded audio:Media Segment:9.99996875 (200, 67ms, 24ms)
dash.all.js:11 loaded video:Media Segment:9.83328125 (200, 71ms, 31ms)
dash.all.js:11 [audio] Buffered Range: 0.032 - 9.984
dash.all.js:11 [audio] Getting the request for time: 0
dash.all.js:11 [audio] Index for time 0 is 0
dash.all.js:11 Start Event Controller
dash.all.js:11 [audio] Buffered Range: 0.032 - 19.999968
dash.all.js:11 [audio] Getting the request for time: 0
dash.all.js:11 [audio] Index for time 0 is 0
dash.all.js:11 <video> play
dash.all.js:11 [video] start
dash.all.js:11 [video] Getting the request for time: 0
dash.all.js:11 [video] Index for time 0 is 0
dash.all.js:11 [video] SegmentList: 0 / 120
dash.all.js:11 [video] SegmentList: 9.83328125 / 120
dash.all.js:11 [video] SegmentList: 19.6665625 / 120
dash.all.js:11 [audio] start
dash.all.js:11 <video> playing
dash.all.js:11 [video] Buffered Range: 0 - 9
dash.all.js:11 [video] Getting the request for time: 0
dash.all.js:11 [video] Index for time 0 is 0
dash.all.js:11 Do seek: 0.032
dash.all.js:11 <video> seek
dash.all.js:11 [video] Getting the request for time: 29.499843750000004
dash.all.js:11 [video] Index for time 29.499843750000004 is 2
dash.all.js:11 [video] SegmentList: 19.6665625 / 120
dash.all.js:11 [video] SegmentList: 29.499843750000004 / 120
dash.all.js:11 [video] seek: 0.032
dash.all.js:11 [audio] seek: 0.032
dash.all.js:11 [video] Getting the request for time: 9
dash.all.js:11 [video] Index for time 9 is 0
dash.all.js:11 [video] SegmentList: 0 / 120
dash.all.js:11 [video] SegmentList: 9.83328125 / 120
dash.all.js:11 [video] SegmentList: 19.6665625 / 120
dash.all.js:11 [video] SegmentList: 29.499843750000004 / 120
dash.all.js:11 [video] Buffered Range: 0 - 18
dash.all.js:11 [video] Getting the request for time: 0
dash.all.js:11 [video] Index for time 0 is 0
dash.all.js:11 loaded video:Media Segment:19.6665625 (200, 42ms, 33ms)
dash.all.js:11 <video> seeked
dash.all.js:11 Start Event Controller
dash.all.js:11 <video> playing
dash.all.js:11 [video] Buffered Range: 0 - 28
dash.all.js:11 [video] Getting the request for time: 0
dash.all.js:11 [video] Index for time 0 is 0
dash.all.js:11 [audio] Getting the request for time: 19.999968
dash.all.js:11 [audio] Index for time 19.999968 is 1
dash.all.js:11 [audio] SegmentList: 9.99996875 / 120
dash.all.js:11 [audio] SegmentList: 19.9999375 / 120
dash.all.js:11 [audio] Getting the request for time: 29.999906250000002
dash.all.js:11 [audio] Index for time 29.999906250000002 is 2
dash.all.js:11 [audio] SegmentList: 19.9999375 / 120
dash.all.js:11 [audio] SegmentList: 29.999906250000002 / 120
dash.all.js:11 loaded audio:Media Segment:19.9999375 (200, 102ms, 2ms)
dash.all.js:11 [audio] Buffered Range: 0.032 - 29.983968
dash.all.js:11 [audio] Getting the request for time: 0
dash.all.js:11 [audio] Index for time 0 is 0
dash.all.js:11 loaded audio:Media Segment:29.999906250000002 (200, 26ms, 2ms)
dash.all.js:11 [audio] Buffered Range: 0.032 - 39.999968
dash.all.js:11 [audio] Getting the request for time: 0
dash.all.js:11 [audio] Index for time 0 is 0
dash.all.js:11 loaded video:Media Segment:29.499843750000004 (200, 47ms, 7ms)
dash.all.js:11 Video Element Error: MEDIA_ERR_DECODE
dash.all.js:11 [video] stop
dash.all.js:11 [audio] stop
dash.all.js:11 Video Element Error: MEDIA_ERR_SRC_NOT_SUPPORTED
dash.all.js:11 <video> play