
Media (0)


No media matching your criteria is available on this site.

Other articles (111)

  • MediaSPIP v0.2

    21 June 2013, by

    MediaSPIP 0.2 is the first stable version of MediaSPIP.
    Its official release date is 21 June 2013 and it is announced here.
    The zip file provided here contains only the sources of MediaSPIP in its standalone version.
    As with the previous version, all of the software dependencies must be installed manually on the server.
    If you want to use this archive for a farm-mode installation, you will also need to make other changes (...)

  • Making files available

    14 April 2011, by

    By default, when it is initialized, MediaSPIP does not let visitors download files, whether they are the originals or the result of their transformation or encoding. It only lets them be viewed.
    However, it is possible and easy to give visitors access to these documents in various forms.
    All of this is handled in the template configuration page. You need to go to the channel's administration area and choose in the navigation (...)

  • MediaSPIP version 0.1 Beta

    16 April 2011, by

    MediaSPIP 0.1 beta is the first version of MediaSPIP declared "usable".
    The zip file provided here contains only the sources of MediaSPIP in its standalone version.
    For a working installation, all of the software dependencies must be installed manually on the server.
    If you want to use this archive for a farm-mode installation, you will also need to make other changes (...)

On other sites (7258)

  • Is it possible to merge video files with FFmpeg interleaved rather than concatenated?

    11 June 2019, by TudorT

    I am trying to create a video file that has the frames from two source mp4 video files interleaved like this:

    I1 I2 P1 P2 I1 I2 P1 P2 I1 I2 P1 P2 ...

    where I = intra, P = predicted, and the numbers indicate which source file they come from.

    Is this possible with FFmpeg commands? If not, and I use the ffplay code from read_thread instead, how do I put the content returned by av_read_frame into a proper video file?

    Thanks
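
    A hedged sketch of one direction the question points at (not from the original post; all setup is omitted and the helper name is made up): a remux loop that alternates av_read_frame() between two already-opened single-video-stream inputs and writes every packet, rescaled, into one output stream via av_interleaved_write_frame(). This only yields a decodable file if both sources share identical codec parameters, and the actual I/P pattern still comes from how each source was encoded, so re-encoding through FFmpeg's interleave filter is usually the more practical route.

    #include <libavcodec/avcodec.h>
    #include <libavformat/avformat.h>

    static int interleave_two(AVFormatContext *in[2], AVFormatContext *out,
                              AVStream *outStream)
    {
        AVPacket *pkt = av_packet_alloc();
        int src = 0, done[2] = { 0, 0 };

        if (!pkt)
            return AVERROR(ENOMEM);
        while (!done[0] || !done[1]) {
            if (!done[src] && av_read_frame(in[src], pkt) >= 0) {
                /* convert the packet timestamps from the source time base
                 * to the output stream's time base */
                av_packet_rescale_ts(pkt,
                                     in[src]->streams[pkt->stream_index]->time_base,
                                     outStream->time_base);
                pkt->stream_index = outStream->index;
                av_interleaved_write_frame(out, pkt); /* takes ownership of pkt's data */
            } else {
                done[src] = 1;                        /* this source is exhausted */
            }
            src = 1 - src;                            /* alternate between the two sources */
        }
        av_packet_free(&pkt);
        return 0;
    }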

  • Why are no dts/pts values written to my mp4 container?

    14 May 2019, by Kiamur

    Based on my (self-answered) question here, Muxing AVPackets into mp4 file - revisited, I have to ask: what could be the reason that no pts/dts values are written to the resulting mp4 container?

    I examined the container file with the tool MediaInfo.
    I observe that only the very first frame has a pts value in the container. After that, pts is no longer shown in the mp4 file at all, while dts is shown but is always zero.

    This is the output from MediaInfo for the first 3 frames:

    0000A2   slice_layer_without_partitioning (IDR) - 0 (0x0) - Frame 0 - slice_type I - frame_num 0 - DTS 00:00:00.000 - PTS 00:00:00.017 (141867 bytes)
    0000A2    Header (5 bytes)
    0000A2     zero_byte:                          0 (0x00)
    0000A3     start_code_prefix_one_3bytes:       1 (0x000001)
    0000A6     nal_ref_idc:                        3 (0x3) - (2 bits)
    0000A6     nal_unit_type:                      5 (0x05) - (5 bits)
    0000A7    slice_header (3 bytes)
    0000A7     first_mb_in_slice:                  0 (0x0)
    0000A7     slice_type:                         7 (0x07) - I
    0000A8     pic_parameter_set_id:               0 (0x0)
    0000A8     frame_num:                          0 (0x0)
    0000A8     idr_pic_id:                         0 (0x0)
    0000A8     no_output_of_prior_pics_flag:       No
    0000A8     long_term_reference_flag:           No
    0000A9     slice_qp_delta:                     -5 (0xFFFFFFFB)
    0000AA     disable_deblocking_filter_idc:      0 (0x0)
    0000AA     slice_alpha_c0_offset_div2:         0 (0x0)
    0000AA     slice_beta_offset_div2:             0 (0x0)
    0000AA    slice_data (141856 bytes)
    0000AA     (ToDo):                             (Data)
    022ACD   slice_layer_without_partitioning (IDR) - 0 (0x0) - Frame 0 - slice_type I - frame_num 0 - DTS 00:00:00.000 - PTS 00:00:00.017 - first_mb_in_slice 8040 (2248 bytes)
    022ACD    Header (5 bytes)
    022ACD     zero_byte:                          0 (0x00)
    022ACE     start_code_prefix_one_3bytes:       1 (0x000001)
    022AD1     nal_ref_idc:                        3 (0x3) - (2 bits)
    022AD1     nal_unit_type:                      5 (0x05) - (5 bits)
    022AD2    slice_header (6 bytes)
    022AD2     first_mb_in_slice:                  8040 (0x001F68)
    022AD5     slice_type:                         7 (0x07) - I
    022AD6     pic_parameter_set_id:               0 (0x0)
    022AD6     frame_num:                          0 (0x0)
    022AD6     idr_pic_id:                         0 (0x0)
    022AD6     no_output_of_prior_pics_flag:       No
    022AD6     long_term_reference_flag:           No
    022AD7     slice_qp_delta:                     -5 (0xFFFFFFFB)
    022AD8     disable_deblocking_filter_idc:      0 (0x0)
    022AD8     slice_alpha_c0_offset_div2:         0 (0x0)
    022AD8     slice_beta_offset_div2:             0 (0x0)
    022AD8    slice_data (2237 bytes)
    022AD8     (ToDo):                             (Data)
    023395  1 (36212 bytes)
    023395   slice_layer_without_partitioning (non-IDR) - 2 (0x2) - Frame 1 - slice_type P - frame_num 1 - DTS 00:00:00.000 (36017 bytes)
    023395    Header (5 bytes)
    023395     zero_byte:                          0 (0x00)
    023396     start_code_prefix_one_3bytes:       1 (0x000001)
    023399     nal_ref_idc:                        3 (0x3) - (2 bits)
    023399     nal_unit_type:                      1 (0x01) - (5 bits)
    02339A    slice_header (3 bytes)
    02339A     first_mb_in_slice:                  0 (0x0)
    02339A     slice_type:                         5 (0x5) - P
    02339A     pic_parameter_set_id:               0 (0x0)
    02339A     frame_num:                          1 (0x1)
    02339B     num_ref_idx_active_override_flag (0 bytes)
    02339B      num_ref_idx_active_override_flag:  Yes
    02339B      num_ref_idx_l0_active_minus1:      0 (0x0)
    02339B     ref_pic_list_modification_flag_l0:  No
    02339B     adaptive_ref_pic_marking_mode_flag: No
    02339C     cabac_init_idc:                     0 (0x0)
    02339C     slice_qp_delta:                     -3 (0xFFFFFFFD)
    02339C     disable_deblocking_filter_idc:      0 (0x0)
    02339C     slice_alpha_c0_offset_div2:         0 (0x0)
    02339D     slice_beta_offset_div2:             0 (0x0)
    02339D    slice_data (36012 bytes)
    02339D     (ToDo):                             (Data)
    02C046   slice_layer_without_partitioning (non-IDR) - 2 (0x2) - Frame 1 - slice_type P - frame_num 1 - DTS 00:00:00.000 - first_mb_in_slice 8040 (195 bytes)
    02C046    Header (5 bytes)
    02C046     zero_byte:                          0 (0x00)
    02C047     start_code_prefix_one_3bytes:       1 (0x000001)
    02C04A     nal_ref_idc:                        3 (0x3) - (2 bits)
    02C04A     nal_unit_type:                      1 (0x01) - (5 bits)
    02C04B    slice_header (6 bytes)
    02C04B     first_mb_in_slice:                  8040 (0x001F68)
    02C04E     slice_type:                         5 (0x5) - P
    02C04E     pic_parameter_set_id:               0 (0x0)
    02C04E     frame_num:                          1 (0x1)
    02C04F     num_ref_idx_active_override_flag (0 bytes)
    02C04F      num_ref_idx_active_override_flag:  Yes
    02C04F      num_ref_idx_l0_active_minus1:      0 (0x0)
    02C04F     ref_pic_list_modification_flag_l0:  No
    02C04F     adaptive_ref_pic_marking_mode_flag: No
    02C050     cabac_init_idc:                     0 (0x0)
    02C050     slice_qp_delta:                     -3 (0xFFFFFFFD)
    02C050     disable_deblocking_filter_idc:      0 (0x0)
    02C050     slice_alpha_c0_offset_div2:         0 (0x0)
    02C051     slice_beta_offset_div2:             0 (0x0)
    02C051    slice_data (190 bytes)
    02C051     (ToDo):                             (Data)
    02C109  1 (26280 bytes)
    02C109   slice_layer_without_partitioning (non-IDR) - 4 (0x4) - Frame 2 - slice_type P - frame_num 2 - DTS 00:00:00.000 (26157 bytes)
    02C109    Header (5 bytes)
    02C109     zero_byte:                          0 (0x00)
    02C10A     start_code_prefix_one_3bytes:       1 (0x000001)
    02C10D     nal_ref_idc:                        3 (0x3) - (2 bits)
    02C10D     nal_unit_type:                      1 (0x01) - (5 bits)
    02C10E    slice_header (3 bytes)
    02C10E     first_mb_in_slice:                  0 (0x0)
    02C10E     slice_type:                         5 (0x5) - P
    02C10E     pic_parameter_set_id:               0 (0x0)
    02C10E     frame_num:                          2 (0x2)
    02C10F     num_ref_idx_active_override_flag (0 bytes)
    02C10F      num_ref_idx_active_override_flag:  Yes
    02C10F      num_ref_idx_l0_active_minus1:      0 (0x0)
    02C10F     ref_pic_list_modification_flag_l0:  No
    02C10F     adaptive_ref_pic_marking_mode_flag: No
    02C110     cabac_init_idc:                     0 (0x0)
    02C110     slice_qp_delta:                     -2 (0xFFFFFFFE)
    02C110     disable_deblocking_filter_idc:      0 (0x0)
    02C110     slice_alpha_c0_offset_div2:         0 (0x0)
    02C111     slice_beta_offset_div2:             0 (0x0)
    02C111    slice_data (26152 bytes)
    02C111     (ToDo):                             (Data)
    032736   slice_layer_without_partitioning (non-IDR) - 4 (0x4) - Frame 2 - slice_type P - frame_num 2 - DTS 00:00:00.000 - first_mb_in_slice 8040 (123 bytes)
    032736    Header (5 bytes)
    032736     zero_byte:                          0 (0x00)
    032737     start_code_prefix_one_3bytes:       1 (0x000001)
    03273A     nal_ref_idc:                        3 (0x3) - (2 bits)
    03273A     nal_unit_type:                      1 (0x01) - (5 bits)
    03273B    slice_header (6 bytes)
    03273B     first_mb_in_slice:                  8040 (0x001F68)
    03273E     slice_type:                         5 (0x5) - P
    03273E     pic_parameter_set_id:               0 (0x0)
    03273E     frame_num:                          2 (0x2)
    03273F     num_ref_idx_active_override_flag (0 bytes)
    03273F      num_ref_idx_active_override_flag:  Yes
    03273F      num_ref_idx_l0_active_minus1:      0 (0x0)
    03273F     ref_pic_list_modification_flag_l0:  No
    03273F     adaptive_ref_pic_marking_mode_flag: No
    032740     cabac_init_idc:                     0 (0x0)
    032740     slice_qp_delta:                     -2 (0xFFFFFFFE)
    032740     disable_deblocking_filter_idc:      0 (0x0)
    032740     slice_alpha_c0_offset_div2:         0 (0x0)
    032741     slice_beta_offset_div2:             0 (0x0)
    032741    slice_data (118 bytes)
    032741     (ToDo):                             (Data)
    0327B1  1 (21125 bytes)

    It goes on like that, even though I do set pts and dts. My values may well be wrong already (I do some calculations like (1 / framerate) * FrameNumber), but I would expect at least some numbers for pts and dts when I set the corresponding fields in the avPacket structure and write it to the file via av_interleaved_write_frame(outFmtCtx, &avPacket);.

    What could be wrong here?
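
    For reference, a hedged sketch of how per-packet timestamps are commonly filled in before muxing (an assumption about the usual pattern, not the poster's code; the helper name and parameters are made up). The key point is that pts/dts must be expressed in ticks of the output stream's time_base, not in seconds, which is what (1 / framerate) * FrameNumber produces.

    #include <libavformat/avformat.h>
    #include <libavutil/mathematics.h>
    #include <libavutil/rational.h>

    static int write_frame_packet(AVFormatContext *outFmtCtx, AVStream *outStream,
                                  AVPacket *avPacket, int64_t frameNumber,
                                  AVRational frameRate)
    {
        AVRational frameDuration = av_inv_q(frameRate);  /* e.g. 1/25 for 25 fps */

        /* frameNumber ticks of 1/frameRate, rescaled into the stream's time base */
        avPacket->pts = av_rescale_q(frameNumber, frameDuration, outStream->time_base);
        avPacket->dts = avPacket->pts;                   /* only valid without B-frames */
        avPacket->duration = av_rescale_q(1, frameDuration, outStream->time_base);
        avPacket->stream_index = outStream->index;

        return av_interleaved_write_frame(outFmtCtx, avPacket);
    }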

    Edit:

    (please see the comments below for the download link to my test data and source file)
    One thing that bugs me: if I compare the MediaInfo output for my file with that of the file generated by muxing.c, the header of the muxing.c output already reports the file duration as 9960 ms, whereas mine reports only 40 ms.

    muxing.c also calls avformat_write_header before even a single frame is written. I suppose the header gets updated when either av_interleaved_write_frame or av_write_trailer is called, but I do not understand the mechanics behind it at all.
    Maybe somebody can enlighten me with some background information of any kind.
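
    As background, a hedged sketch of the order muxing.c follows (an assumption restating the standard libavformat muxing sequence, not the poster's code): for a plain, non-fragmented MP4 the moov box, which carries the durations MediaInfo reports, is only produced by av_write_trailer(), computed from the timestamps of the packets muxed in between, so the header written up front is essentially a placeholder.

    #include <libavformat/avformat.h>

    static int mux_all(AVFormatContext *outFmtCtx, AVPacket *packets, int nbPackets)
    {
        int ret = avformat_write_header(outFmtCtx, NULL); /* ftyp, start of mdat */
        if (ret < 0)
            return ret;

        for (int i = 0; i < nbPackets; i++) {
            ret = av_interleaved_write_frame(outFmtCtx, &packets[i]);
            if (ret < 0)
                return ret;
        }

        return av_write_trailer(outFmtCtx);              /* moov box with the durations */
    }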

    Additionally, I think it may be necessary to extract the SPS and PPS from my raw data (preceding the I slice) and pass them as extradata for the avformat_write_header call. But I cannot figure out whether I have to do that at all and, if so, how to do it.
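
    A hedged sketch of the extradata part (an assumption, not the poster's code; the function and buffer names are made up): for H.264 in MP4 the muxer generally needs the SPS and PPS as codec extradata before avformat_write_header so it can build the avcC box.

    #include <string.h>
    #include <libavcodec/avcodec.h>
    #include <libavformat/avformat.h>

    /* sps/pps are hypothetical Annex B NAL units extracted from the raw stream,
     * start codes included. */
    static int set_h264_extradata(AVStream *st,
                                  const uint8_t *sps, int sps_size,
                                  const uint8_t *pps, int pps_size)
    {
        int size = sps_size + pps_size;
        uint8_t *extra = av_mallocz(size + AV_INPUT_BUFFER_PADDING_SIZE);

        if (!extra)
            return AVERROR(ENOMEM);
        memcpy(extra, sps, sps_size);
        memcpy(extra + sps_size, pps, pps_size);
        st->codecpar->extradata      = extra;
        st->codecpar->extradata_size = size;
        return 0;   /* call this before avformat_write_header() */
    }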

  • Make video frames from a livestream identifiable across multiple clients

    23 September 2016, by mschwaig

    I need to distribute a video stream from a live source to several clients with the additional requirement that each frame is identifiable across all clients.

    I have already done research into the topic, and I have arrived at a possible solution that I can share. My solution seems suboptimal and this is my first experience of working with video streams, so I want to see if somebody knows a better way.

    The reason why I need to be able to identify specific frames within the video stream is that the streaming clients need to be able to talk about the time differences between events each of them identifies in their video stream.

    A little clarifying example

    I want to enable the following interaction:

    • Two client applications Dewey and Stevie connect to the streaming server
    • Dewey displays the stream and Stevie saves it to disk
    • Dewey identifies a specific video frame that is of interest to Stevie, so he wants to tell Stevie about it
    • Dewey extracts some identifying information from the video frame and sends it to Stevie
    • Stevie uses the identifying information to extract the same frame from the copy of the livestream he is currently saving

    Dewey cannot send the frame to Stevie directly, because Malcolm and Reese also want to tell him about specific video frames and Stevie is interested in the time difference between their findings.

    Suggested solution

    The solution that I found was using ffserver to broadcast an RTP stream and using the timestamps from the RTCP packets to identify frames. These timestamps are normally used to synchronize audio and video, not to provide a shared timeline across several clients, which is why I am skeptical that this is the best way to solve my problem.

    It also seems beneficial to have frame numbers, i.e. an increasing frame counter, instead of arbitrary timestamps that increase by some possibly varying offset, because for my application I also have to reference neighboring frames, and it seems easier to compute time differences from frame numbers than the other way around.
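
    A hedged sketch of the frame-number idea (an assumption, not from the post; names are made up): with the usual 90 kHz RTP clock for video and a fixed frame rate, each client can turn an RTP timestamp into a frame index relative to a reference RTP timestamp that all clients agree on (for example one taken from the RTCP reports mentioned above), which gives the shared, increasing counter described here.

    #include <stdint.h>

    /* Unsigned 32-bit subtraction handles RTP timestamp wrap-around. */
    static int64_t frame_index_from_rtp(uint32_t rtp_ts, uint32_t reference_rtp_ts,
                                        int fps, int rtp_clock_rate /* typically 90000 */)
    {
        uint32_t delta = rtp_ts - reference_rtp_ts;
        return (int64_t)delta * fps / rtp_clock_rate;
    }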