Media (91)

Other articles (78)

  • Other interesting software

    12 April 2011

    We do not claim to be the only ones doing what we do... and we certainly do not claim to be the best either... What we do, we simply try to do well, and better and better...
    The following list corresponds to software that more or less tries to do what MediaSPIP does, or that MediaSPIP more or less tries to do likewise; it hardly matters which way round...
    We do not know them and have not tried them, but you may want to take a look at them.
    Videopress
    Website: (...)

  • MediaSPIP Player: potential problems

    22 February 2011

    The player does not work in Internet Explorer
    In Internet Explorer (8 and 7 at least), the plugin uses the Flowplayer Flash player to play video and audio. If the player does not seem to work, the cause may be the configuration of Apache's mod_deflate.
    If the configuration of that Apache module contains a line resembling the following, try removing it or commenting it out to see whether the player then works correctly (a purely illustrative example of such a directive is sketched after this list): (...)

  • Enhancing it visually

    10 April 2011

    MediaSPIP is based on a system of themes and templates ("squelettes"). Templates define where information is placed on the page, defining a specific use of the platform, while themes define the overall graphic design.
    Anyone can contribute a new graphic theme or a template and make it available to the community.
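
    As a point of reference for the MediaSPIP Player item above (whose exact line is truncated in the excerpt), mod_deflate is usually enabled in Apache with directives of roughly the following shape; the directives and MIME types shown here are illustrative and not taken from the article:

     # Illustrative mod_deflate configuration (not the article's exact line)
     SetOutputFilter DEFLATE
     AddOutputFilterByType DEFLATE text/html text/plain text/xml text/css application/javascript

    Commenting out such a line, as the article suggests, disables compression so you can check whether it interferes with the Flash player.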

On other sites (4757)

  • Unable to stream file onto localhost - ffmpeg

    18 October 2013, by trueblue

    I am new to ffmpeg/ffserver. I am trying to stream a local file named Trial onto localhost using ffserver. I want to play the file in a browser as http://localhost:8090/feed1.ffm
    I am executing the command below in Ubuntu (Trial is an MPEG-TS file):

     ffmpeg -i Trial http://localhost:8090/feed1.ffm

    Upon executing the above command, I get the error below:

    FFmpeg version SVN-r0.5.9-4:0.5.9-0ubuntu0.10.04.3, Copyright (c) 2000-2009 Fabrice Bellard, et al.
     configuration: --extra-version=4:0.5.9-0ubuntu0.10.04.3 --prefix=/usr --enable-avfilter --enable-avfilter-lavf --enable-vdpau --enable-bzlib --enable-libgsm --enable-libschroedinger --enable-libspeex --enable-libtheora --enable-libvorbis --enable-pthreads --enable-zlib --disable-stripping --disable-vhook --enable-runtime-cpudetect --enable-gpl --enable-postproc --enable-swscale --enable-x11grab --enable-libdc1394 --enable-shared --disable-static
     libavutil     49.15. 0 / 49.15. 0
     libavcodec    52.20. 1 / 52.20. 1
     libavformat   52.31. 0 / 52.31. 0
     libavdevice   52. 1. 0 / 52. 1. 0
     libavfilter    0. 4. 0 /  0. 4. 0
     libswscale     0. 7. 1 /  0. 7. 1
     libpostproc   51. 2. 0 / 51. 2. 0
     built on Jan 24 2013 19:42:59, gcc: 4.4.3

    Seems stream 0 codec frame rate differs from container frame rate: 119.88 (120000/1001) -> 59.94 (60000/1001)
    Input #0, mpegts, from 'Trial':
     Duration: 00:00:04.22, start: 0.177633, bitrate: 40368 kb/s
     Program 2
       Stream #0.0[0x21]: Video: mpeg2video, yuv420p, 1280x720 [PAR 1:1 DAR 16:9], 45000 kb/s, 59.94 tbr, 90k tbn, 119.88 tbc
    Output #0, ffm, to 'http://localhost:8090/feed1.ffm':
       Stream #0.0: Video: flv, yuv420p, 352x288, q=1-5, 100 kb/s, 1000k tbn, 15 tbc
       Stream #0.1: Audio: mp2, 44100 Hz, mono, s16, 32 kb/s
       Stream #0.2: Video: mpeg1video, yuv420p, 160x128, q=3-31, 64 kb/s, 1000k tbn, 3 tbc
       Stream #0.3: Audio: mp2, 22050 Hz, mono, s16, 64 kb/s
       Stream #0.4: Video: msmpeg4, yuv420p, 352x240, q=3-31, 256 kb/s, 1000k tbn, 15 tbc
    Could not find input stream matching output stream #0.1

    My ffserver.conf file goes like this:

    # Port on which the server is listening. You must select a different
    # port from your standard HTTP web server if it is running on the same
    # computer.
    Port 8090

    # Address on which the server is bound. Only useful if you have
    # several network interfaces.
    BindAddress 0.0.0.0

    # Number of simultaneous HTTP connections that can be handled. It has
    # to be defined *before* the MaxClients parameter, since it defines the
    # MaxClients maximum limit.
    MaxHTTPConnections 2000

    # Number of simultaneous requests that can be handled. Since FFServer
    # is very fast, it is more likely that you will want to leave this high
    # and use MaxBandwidth, below.
    MaxClients 1000

    # This is the maximum amount of kbit/sec that you are prepared to
    # consume when streaming to clients.
    MaxBandwidth 1000

    # Access log file (uses standard Apache log file format)
    # '-' is the standard output.
    CustomLog -

    # Suppress that if you want to launch ffserver as a daemon.
    NoDaemon


    ##################################################################
    # Definition of the live feeds. Each live feed contains one video
    # and/or audio sequence coming from an ffmpeg encoder or another
    # ffserver. This sequence may be encoded simultaneously with several
    # codecs at several resolutions.

    <Feed feed1.ffm>

    # You must use 'ffmpeg' to send a live feed to ffserver. In this
    # example, you can type:
    #
    # ffmpeg http://localhost:8090/feed1.ffm

    # ffserver can also do time shifting. It means that it can stream any
    # previously recorded live stream. The request should contain:
    # "http://xxxx?date=[YYYY-MM-DDT][[HH:]MM:]SS[.m...]".You must specify
    # a path where the feed is stored on disk. You also specify the
    # maximum size of the feed, where zero means unlimited. Default:
    # File=/tmp/feed_name.ffm FileMaxSize=5M
    File /tmp/feed1.ffm
    FileMaxSize 5M

    # You could specify
    # ReadOnlyFile /saved/specialvideo.ffm
    # This marks the file as readonly and it will not be deleted or updated.

    # Specify launch in order to start ffmpeg automatically.
    # First ffmpeg must be defined with an appropriate path if needed,
    # after that options can follow, but avoid adding the http:// field
    #Launch ffmpeg

    # Only allow connections from localhost to the feed.
    ACL allow 127.0.0.1

    </Feed>



    <stream>
    Feed feed1.ffm
    Format swf
    VideoCodec flv
    VideoFrameRate 15
    VideoBufferSize 80000
    VideoBitRate 100
    VideoQMin 1
    VideoQMax 5
    VideoSize 352x288
    PreRoll 0
    Noaudio
    </stream>

    ##################################################################
    # Now you can define each stream which will be generated from the
    # original audio and video stream. Each format has a filename (here
    # 'test1.mpg'). FFServer will send this stream when answering a
    # request containing this filename.

    <stream>

    # coming from live feed 'feed1'
    Feed feed1.ffm

    # Format of the stream : you can choose among:
    # mpeg       : MPEG-1 multiplexed video and audio
    # mpegvideo  : only MPEG-1 video
    # mp2        : MPEG-2 audio (use AudioCodec to select layer 2 and 3 codec)
    # ogg        : Ogg format (Vorbis audio codec)
    # rm         : RealNetworks-compatible stream. Multiplexed audio and video.
    # ra         : RealNetworks-compatible stream. Audio only.
    # mpjpeg     : Multipart JPEG (works with Netscape without any plugin)
    # jpeg       : Generate a single JPEG image.
    # asf        : ASF compatible streaming (Windows Media Player format).
    # swf        : Macromedia Flash compatible stream
    # avi        : AVI format (MPEG-4 video, MPEG audio sound)
    Format mpeg

    # Bitrate for the audio stream. Codecs usually support only a few
    # different bitrates.
    AudioBitRate 32

    # Number of audio channels: 1 = mono, 2 = stereo
    AudioChannels 1

    # Sampling frequency for audio. When using low bitrates, you should
    # lower this frequency to 22050 or 11025. The supported frequencies
    # depend on the selected audio codec.
    AudioSampleRate 44100

    # Bitrate for the video stream
    VideoBitRate 64


    # Ratecontrol buffer size
    VideoBufferSize 40

    # Number of frames per second
    VideoFrameRate 3

    # Size of the video frame: WxH (default: 160x128)
    # The following abbreviations are defined: sqcif, qcif, cif, 4cif, qqvga,
    # qvga, vga, svga, xga, uxga, qxga, sxga, qsxga, hsxga, wvga, wxga, wsxga,
    # wuxga, woxga, wqsxga, wquxga, whsxga, whuxga, cga, ega, hd480, hd720,
    # hd1080
    VideoSize 160x128

    # Transmit only intra frames (useful for low bitrates, but kills frame rate).
    #VideoIntraOnly

    # If non-intra only, an intra frame is transmitted every VideoGopSize
    # frames. Video synchronization can only begin at an intra frame.
    VideoGopSize 12

    # More MPEG-4 parameters
    # VideoHighQuality
    # Video4MotionVector

    # Choose your codecs:
    #AudioCodec mp2
    #VideoCodec mpeg1video

    # Suppress audio
    #NoAudio

    # Suppress video
    #NoVideo

    #VideoQMin 3
    #VideoQMax 31

    # Set this to the number of seconds backwards in time to start. Note that
    # most players will buffer 5-10 seconds of video, and also you need to allow
    # for a keyframe to appear in the data stream.
    #Preroll 15

    # ACL:

    # You can allow ranges of addresses (or single addresses)
    #ACL ALLOW <first address> <last address>

    # You can deny ranges of addresses (or single addresses)
    #ACL DENY <first address> <last address>

    # You can repeat the ACL allow/deny as often as you like. It is on a per
    # stream basis. The first match defines the action. If there are no matches,
    # then the default is the inverse of the last ACL statement.
    #
    # Thus 'ACL allow localhost' only allows access from localhost.
    # 'ACL deny 1.0.0.0 1.255.255.255' would deny the whole of network 1 and
    # allow everybody else.

    </stream>


    ##################################################################
    # Example streams


    # Multipart JPEG

    #<stream>
    #Feed feed1.ffm
    #Format mpjpeg
    #VideoFrameRate 2
    #VideoIntraOnly
    #NoAudio
    #Strict -1
    #</stream>


    # Single JPEG

    #<stream>
    #Feed feed1.ffm
    #Format jpeg
    #VideoFrameRate 2
    #VideoIntraOnly
    ##VideoSize 352x240
    #NoAudio
    #Strict -1
    #</stream>



    # Flash

    #<stream>
    #Feed feed1.ffm
    #Format swf
    #VideoFrameRate 2
    #VideoIntraOnly
    #NoAudio
    #</stream>


    # ASF compatible

    <stream>
    Feed feed1.ffm
    Format asf
    VideoFrameRate 15
    VideoSize 352x240
    VideoBitRate 256
    VideoBufferSize 40
    VideoGopSize 30
    AudioBitRate 64
    StartSendOnKey
    </stream>


    # MP3 audio

    #<stream>
    #Feed feed1.ffm
    #Format mp2
    #AudioCodec mp3
    #AudioBitRate 64
    #AudioChannels 1
    #AudioSampleRate 44100
    #NoVideo
    #</stream>


    # Ogg Vorbis audio

    #<stream>
    #Feed feed1.ffm
    #Title "Stream title"
    #AudioBitRate 64
    #AudioChannels 2
    #AudioSampleRate 44100
    #NoVideo
    #</stream>


    # Real with audio only at 32 kbits

    #<stream>
    #Feed feed1.ffm
    #Format rm
    #AudioBitRate 32
    #NoVideo
    #NoAudio
    #</stream>


    # Real with audio and video at 64 kbits

    #<stream>
    #Feed feed1.ffm
    #Format rm
    #AudioBitRate 32
    #VideoBitRate 128
    #VideoFrameRate 25
    #VideoGopSize 25
    #NoAudio
    #</stream>


    ##################################################################
    # A stream coming from a file: you only need to set the input
    # filename and optionally a new format. Supported conversions:
    #    AVI -> ASF

    #<stream>
    #File "/usr/local/httpd/htdocs/tlive.rm"
    #NoAudio
    #</stream>

    #<stream>
    #File "/usr/local/httpd/htdocs/test.asf"
    #NoAudio
    #Author "Me"
    #Copyright "Super MegaCorp"
    #Title "Test stream from disk"
    #Comment "Test comment"
    #</stream>


    ##################################################################
    # RTSP examples
    #
    # You can access this stream with the RTSP URL:
    #   rtsp://localhost:5454/test1-rtsp.mpg
    #
    # A non-standard RTSP redirector is also created. Its URL is:
    #   http://localhost:8090/test1-rtsp.rtsp

    #<stream>
    #Format rtp
    #File "/usr/local/httpd/htdocs/test1.mpg"
    #</stream>


    ##################################################################
    # SDP/multicast examples
    #
    # If you want to send your stream in multicast, you must set the
    # multicast address with MulticastAddress. The port and the TTL can
    # also be set.
    #
    # An SDP file is automatically generated by ffserver by adding the
    # 'sdp' extension to the stream name (here
    # http://localhost:8090/test1-sdp.sdp). You should usually give this
    # file to your player to play the stream.
    #
    # The 'NoLoop' option can be used to avoid looping when the stream is
    # terminated.

    #<stream>
    #Format rtp
    #File "/usr/local/httpd/htdocs/test1.mpg"
    #MulticastAddress 224.124.0.1
    #MulticastPort 5000
    #MulticastTTL 16
    #NoLoop
    #</stream>


    ##################################################################
    # Special streams

    # Server status

    <stream>
    Format status

    # Only allow local people to get the status
    ACL allow localhost
    ACL allow 192.168.0.0 192.168.255.255

    #FaviconURL http://pond1.gladstonefamily.net:8080/favicon.ico
    </stream>


    # Redirect index.html to the appropriate site

    <redirect>
    URL http://www.ffmpeg.org/
    </redirect>

    Could anyone please tell me whether I am missing something, or whether I need to change my ffserver.conf file? I have referred to many websites, but I am still unable to fix it. Thanks in advance.
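
    For reference, a minimal sketch of the workflow the question describes, in the usual order of operations (the configuration file path is illustrative; the rest is taken from the question):

     # 1. Start ffserver with the configuration file shown above
     ffserver -f /etc/ffserver.conf

     # 2. Push the local file into the feed defined in that configuration
     ffmpeg -i Trial http://localhost:8090/feed1.ffm

    With ffserver, feed1.ffm is the URL that ffmpeg pushes to, while players normally request one of the <Stream> URLs defined in the configuration.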

  • Android FFmpegPlayer Streaming Service onClick notification

    8 October 2013, by agony

    I have a MainActivity class that displays the list of streams available for my project, and a StreamingActivity class where the streaming is done.

    If the user selects an item from the list, it starts the StreamingActivity and begins playing the stream.
    I'm having trouble keeping the music streaming when the user taps the notification, and returning to the StreamingActivity class after the user has pressed the home button or after the activity has gone through onDestroy().

    I'm using FFmpegPlayer for my project because it needs to play mms:// live streams for a local FM station.

    Here's my code:

    public class StreamingActivity extends BaseActivity  implements ActionBar.TabListener,
    PlayerControlListener, IMediaPlayerServiceClient {


    private StatefulMediaPlayer mMediaPlayer;
    private FFmpegService mService;
    private boolean mBound;

    public static final String TAG = "StationActivity";

    private static Bundle mSavedInstanceState;

    private static PlayerFragment mPlayerFragment;
    private static DJListFragment mDjListFragment;

    private SectionsPagerAdapter mSectionsPagerAdapter;
    private ViewPager mViewPager;

    private String stream = "";
    private String fhz = "";
    private String page = "0";

    private Dialog shareDialog;
       private ProgressDialog dialog;

    private boolean isStreaming;


    /*************************************************************************************************************/

    @Override
    public void onCreate(Bundle savedInstanceState){
       super.onCreate(savedInstanceState);
       setContentView(R.layout.activity_station);

       Bundle bundle = getIntent().getExtras();
       if(bundle !=null){
           fhz = bundle.getString("fhz");
           stream = bundle.getString("stream");
       }

       Log.d(TAG, "page: " + page + " fhz: " + fhz + " stream: " + stream + " isStreaming: " + isStreaming);

       getSupportActionBar().setTitle("Radio \n" + fhz);
       getSupportActionBar().setDisplayHomeAsUpEnabled(true);
       getSupportActionBar().setNavigationMode(ActionBar.NAVIGATION_MODE_TABS);

       mPlayerFragment = (PlayerFragment) Fragment.instantiate(this, PlayerFragment.class.getName(), null);
       mDjListFragment = (DJListFragment) Fragment.instantiate(this, DJListFragment.class.getName(), null);

       mSectionsPagerAdapter = new SectionsPagerAdapter(getSupportFragmentManager());

       mViewPager = (ViewPager) findViewById(R.id.pager);
       mViewPager.setAdapter(mSectionsPagerAdapter);
       mViewPager.setCurrentItem(Integer.parseInt(page));

       mSavedInstanceState = savedInstanceState;

       Tab playingTab = getSupportActionBar().newTab();
       playingTab.setText(getString(R.string.playing_label));
       playingTab.setTabListener(this);

       Tab djTab = getSupportActionBar().newTab();
       djTab.setText(getString(R.string.dj_label));
       djTab.setTabListener(this);

       getSupportActionBar().addTab(playingTab);
       getSupportActionBar().addTab(djTab);

       // When swiping between different sections, select the corresponding
       // tab. We can also use ActionBar.Tab#select() to do this if we have
       // a reference to the Tab.
       mViewPager.setOnPageChangeListener(new ViewPager.SimpleOnPageChangeListener() {
           @Override
           public void onPageSelected(int position) {
                StreamingActivity.this.getSupportActionBar().setSelectedNavigationItem(position);
           }
       });

       if (mSavedInstanceState != null) {
           getSupportActionBar().setSelectedNavigationItem(mSavedInstanceState.getInt("tab", 0));
       }

       dialog = new ProgressDialog(this);

       bindToService();

       UriBean.getInstance().setStream(stream);
       Log.d(TAG ,"stream: " + UriBean.getInstance().getStream());

    }

    /********************************************************************************************************/

    public class SectionsPagerAdapter extends FragmentPagerAdapter {
       public SectionsPagerAdapter(FragmentManager fm) {
           super(fm);
       }

       @Override
       public Fragment getItem(int position) {
           if (position == 0) {
               return mPlayerFragment;
           } else {
               return mDjListFragment;
           }
       }

       @Override
       public int getCount() {
           return 2;
       }
    }

    @Override
    public void onTabSelected(Tab tab, FragmentTransaction ft) {
       // When the given tab is selected, switch to the corresponding page in the ViewPager.
       mViewPager.setCurrentItem(tab.getPosition());
    }

    @Override
    public void onTabUnselected(Tab tab, FragmentTransaction ft) { }

    @Override
    public void onTabReselected(Tab tab, FragmentTransaction ft) { }

    /********************************************************************************************************/

    public void showLoadingDialog() {
       dialog.setMessage("Buffering...");
       dialog.show();
    }

    public void dismissLoadingDialog() {
       dialog.dismiss();
    }

    /********************************************************************************************************/

    /**
    * Binds to the instance of MediaPlayerService. If no instance of MediaPlayerService exists, it first starts
    * a new instance of the service.
    */
    public void bindToService() {
       Intent intent = new Intent(this, FFmpegService.class);

       if (Util.isFFmpegServiceRunning(getApplicationContext())){
           // Bind to Service
           Log.i(TAG, "bindService");
           bindService(intent, mConnection, Context.BIND_AUTO_CREATE);
       } else {
           //start service and bind to it
           Log.i(TAG, "startService &amp; bindService");
           startService(intent);
           bindService(intent, mConnection, Context.BIND_AUTO_CREATE);
       }
    }


    /**
    * Defines callbacks for service binding, passed to bindService()
    */
    private ServiceConnection mConnection = new ServiceConnection() {
       @Override
       public void onServiceConnected(ComponentName className, IBinder serviceBinder) {
           Log.d(TAG,"service connected");

           //bound with Service. get Service instance
           MediaPlayerBinder binder = (FFmpegService.MediaPlayerBinder) serviceBinder;
           mService = binder.getService();

           //send this instance to the service, so it can make callbacks on this instance as a client
            mService.setClient(StreamingActivity.this);
           mBound = true;

           Log.d(TAG, "isPlaying === SERVICE: " + mService.isPlaying());

           //if

           startStreaming();
       }

       @Override
       public void onServiceDisconnected(ComponentName arg0) {
           mBound = false;
           mService = null;
       }
    };

    /********************************************************************************************************/

    @Override
    public void onPlayerPlayStop() {
       Log.d(TAG, "onPlayerPlayStop");

       Log.v(TAG, "isStreaming: " + isStreaming);
       Log.v(TAG, "mBound:  " + mBound);

       if (mBound) {
           Log.d(TAG, "bound.............");
           mMediaPlayer = mService.getMediaPlayer();
           //pressed pause ->pause
           if (!PlayerFragment.play.isChecked()) {
               if (mMediaPlayer.isStarted()) {
                   Log.d(TAG, "pause");
                   mService.pauseMediaPlayer();
               }
           } else { //pressed play
               // STOPPED, CREATED, EMPTY, -> initialize
               if (mMediaPlayer.isStopped() || mMediaPlayer.isCreated() || mMediaPlayer.isEmpty()) {
                   startStreaming();
               } else if (mMediaPlayer.isPrepared() || mMediaPlayer.isPaused()) { //prepared, paused -> resume play
                   Log.d(TAG, "start");
                   mService.startMediaPlayer();
               }
           }

           Log.d(TAG, "isPlaying === SERVICE: " + mService.isPlaying());
       }
    }

    /********************************************************************************************************/

    @Override
    public void onDownload() {
       Toast.makeText(this, "Not yet available...", Toast.LENGTH_SHORT).show();
    }

    @Override
    public void onComment() {
       FragmentManager fm = getSupportFragmentManager();
       DialogFragment newFragment = MyAlertDialogFragment.newInstance();
       newFragment.show(fm, "comment_dialog");
    }

    @Override
    public void onShare() {
       showShareDialog();
    }

    /********************************************************************************************************/

    private void startStreaming() {
       Log.d(TAG, "@startLoading");
       boolean isNetworkFound = Util.checkConnectivity(getApplicationContext());
       if(isNetworkFound) {
           Log.d(TAG, "network found");
           mService.initializePlayer(stream);
           isStreaming = true;
       } else {
           Toast.makeText(getApplicationContext(), "No internet connection found...", Toast.LENGTH_SHORT).show();
       }

       Log.d(TAG, "isStreaming: " + isStreaming);
       Log.d(TAG, "isPlaying === SERVICE: " + mService.isPlaying());
    }

    @Override
    public void onInitializePlayerStart() {
       showLoadingDialog();
    }

    @Override
    public void onInitializePlayerSuccess() {
       dismissLoadingDialog();
       PlayerFragment.play.setChecked(true);


       Log.d(TAG, "isPlaying === SERVICE: " + mService.isPlaying());
    }

    @Override
    public void onError() {
       Toast.makeText(getApplicationContext(), "Not connected to the server...", Toast.LENGTH_SHORT).show();
    }

       @Override
    public void onDestroy() {
       Log.d(TAG, "onDestroy");
       super.onDestroy();
       uiHelper.onDestroy();

       Log.d(TAG, "isPlaying === SERVICE: " + mService.isPlaying());
       if (mBound) {
           mService.unRegister();
           unbindService(mConnection);
           mBound = false;
       }

       Log.d(TAG, "service: " + Util.isFFmpegServiceRunning(getApplicationContext()));
    }

    @Override
    public void onStop(){
       Log.d(TAG, "onStop");
       super.onStop();
    }

    /*******************************************************************************************************/

    @Override
    public boolean onOptionsItemSelected(MenuItem item) {
       int itemId = item.getItemId();
       switch (itemId){
       case android.R.id.home:
           onBackPressed();
           break;
       default:
           break;
       }    
       return true;
    }

    @Override
    public boolean onKeyDown(int keyCode, KeyEvent event) {
       Log.d(TAG, "@onKeyDown");
        if (keyCode == KeyEvent.KEYCODE_BACK && event.getRepeatCount() == 0){
           //this.moveTaskToBack(true);
           onBackPressed();
           return true;
       }
       return super.onKeyDown(keyCode, event);
    }
    }






    public class FFmpegService  extends Service implements IMediaPlayerThreadClient {

    private FFmpegPlayerThread mMediaPlayerThread       = new FFmpegPlayerThread(this);
    private final Binder mBinder                        = new MediaPlayerBinder();
    private IMediaPlayerServiceClient mClient;
    //private StreamStation mCurrentStation;

    private boolean mIsSupposedToBePlaying = false;

    private boolean isPausedInCall = false;
    private PhoneStateListener phoneStateListener;
    private TelephonyManager telephonyManager;

    @Override
    public void onCreate(){
       mMediaPlayerThread.start();
    }

    /**
    * A class for clients binding to this service. The client will be passed an object of this class
    * via its onServiceConnected(ComponentName, IBinder) callback.
    */
    public class MediaPlayerBinder extends Binder {
       /**
        * Returns the instance of this service for a client to make method calls on it.
        * @return the instance of this service.
        */
       public FFmpegService getService() {
           return FFmpegService.this;
       }
    }

    /**
    * Returns the contained StatefulMediaPlayer
    * @return
    */
    public StatefulMediaPlayer getMediaPlayer() {
       return mMediaPlayerThread.getMediaPlayer();
    }

    public boolean isPlaying() {
       return mIsSupposedToBePlaying;
    }

    @Override
    public IBinder onBind(Intent arg0) {
       return mBinder;
    }

    @Override
    public int onStartCommand(Intent intent, int flags, int startId) {

       telephonyManager = (TelephonyManager) getSystemService(Context.TELEPHONY_SERVICE);
       phoneStateListener = new PhoneStateListener() {
           @Override
           public void onCallStateChanged(int state, String incomingNumber) {
               // String stateString = "N/A";
               Log.v("FFmpegService", "Starting CallStateChange");
               switch (state) {
               case TelephonyManager.CALL_STATE_OFFHOOK:
               case TelephonyManager.CALL_STATE_RINGING:
                   if (mMediaPlayerThread != null) {
                       pauseMediaPlayer();
                       isPausedInCall = true;
                   }
                   break;
               case TelephonyManager.CALL_STATE_IDLE:
                   // Phone idle. Start playing.
                   if (mMediaPlayerThread != null) {
                       if (isPausedInCall) {
                           isPausedInCall = false;
                           startMediaPlayer();
                       }
                   }
                   break;
               }
           }
       };

       // Register the listener with the telephony manager
       telephonyManager.listen(phoneStateListener, PhoneStateListener.LISTEN_CALL_STATE);

       return START_STICKY;
    }

    /**
    * Sets the client using this service.
    * @param client The client of this service, which implements the IMediaPlayerServiceClient interface
    */
    public void setClient(IMediaPlayerServiceClient client) {
       this.mClient = client;
    }


    public void initializePlayer(final String station) {
       //mCurrentStation = station;
       mMediaPlayerThread.initializePlayer(station);
    }

    public void startMediaPlayer() {

       Intent notificationIntent = new Intent(getApplicationContext(), StreamingActivity.class);
       //notificationIntent.putExtra("page", "0");
       //notificationIntent.putExtra("isPlaying", isPlaying());
       notificationIntent.addFlags(Intent.FLAG_ACTIVITY_SINGLE_TOP | Intent.FLAG_ACTIVITY_CLEAR_TOP);
       PendingIntent contentIntent = PendingIntent.getActivity(getApplicationContext(), 0 , notificationIntent , PendingIntent.FLAG_UPDATE_CURRENT);

       NotificationCompat.Builder mBuilder = new NotificationCompat.Builder(this)
               .setSmallIcon(R.drawable.ic_launcher)
               .setContentTitle("You are listening to Radio...")
               .setContentText("test!!!")
               .setContentIntent(contentIntent);

       startForeground(1, mBuilder.build());

       NotificationManager notificationManager = (NotificationManager) getSystemService(Context.NOTIFICATION_SERVICE);
       notificationManager.notify(1, mBuilder.build());

       mIsSupposedToBePlaying = true;
       mMediaPlayerThread.startMediaPlayer();
    }

    public void dismissNotification(Context context) {
       String ns = Context.NOTIFICATION_SERVICE;
       NotificationManager mNotificationManager = (NotificationManager) getSystemService(ns);
       mNotificationManager.cancel(1);
    }

    /**
    * Pauses playback
    */
    public void pauseMediaPlayer() {
       Log.d("MediaPlayerService","pauseMediaPlayer() called");
       mMediaPlayerThread.pauseMediaPlayer();
       stopForeground(true);
       mIsSupposedToBePlaying = false;
       dismissNotification(this);
    }
    /**
    * Stops playback
    */
    public void stopMediaPlayer() {
       stopForeground(true);
       mMediaPlayerThread.stopMediaPlayer();

       mIsSupposedToBePlaying = false;
       dismissNotification(this);
    }

    public void resetMediaPlayer() {
       mIsSupposedToBePlaying = false;
       stopForeground(true);
       mMediaPlayerThread.resetMediaPlayer();
       dismissNotification(this);
    }

    @Override
    public void onError() {
       mIsSupposedToBePlaying = false;
       mClient.onError();
       dismissNotification(this);
    }

    @Override
    public void onInitializePlayerStart() {
       mClient.onInitializePlayerStart();
    }

    @Override
    public void onInitializePlayerSuccess() {
       startMediaPlayer();
       mClient.onInitializePlayerSuccess();
       mIsSupposedToBePlaying = true;
    }

    public void unRegister() {
       this.mClient = null;
       mIsSupposedToBePlaying = false;
       dismissNotification(this);
    }

    }

    Hoping someone can help me here...
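
    One commonly used pattern for the "tap the notification and come back to the playing screen" part is to make the activity single-top and handle the re-delivered intent. The sketch below is only an illustration built around the names already used in the question (BaseActivity, mService); the manifest attribute it mentions is an assumption, not something shown in the question:

     import android.content.Intent;

     // Assumption: StreamingActivity is declared in AndroidManifest.xml with
     //   android:launchMode="singleTop"
     // so the PendingIntent built in FFmpegService.startMediaPlayer() is delivered
     // to the already-running instance instead of recreating the activity.
     public class StreamingActivity extends BaseActivity {
         @Override
         protected void onNewIntent(Intent intent) {
             super.onNewIntent(intent);
             setIntent(intent);
             // The foreground service keeps playing; only the UI needs refreshing here,
             // e.g. by re-checking mService.isPlaying() once the service is re-bound.
         }
     }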

  • Android - Error while executing Runtime.getRuntime().exec() - Environment null - ffmpeg

    18 February 2015, by Chaitanya Chandurkar

    I have compiled the ffmpeg library on Ubuntu 11.10 and ported the compiled files to Android.
    After compiling I got libffmpeg.so successfully, and it loads on Android successfully.

    I am doing this on Ubuntu 11.10 with the Eclipse Android emulator.

    I have created a small test application which acts as a command prompt: it accepts a command from the user and displays the result (for testing ffmpeg commands).

    When I run simple commands like "ls" or "ls -l" it works perfectly, but when I simply type "cd mnt" or "ffmpeg" as the command and try to run it, I get warnings in Logcat saying:

    08-26 16:44:52.553: W/System.err(5961): java.io.IOException: Error running exec(). Command: [ffmpeg] Working Directory: null Environment: null
    08-26 16:44:52.573: W/System.err(5961):     at java.lang.ProcessManager.exec(ProcessManager.java:211)
    08-26 16:44:52.573: W/System.err(5961):     at java.lang.Runtime.exec(Runtime.java:168)
    08-26 16:44:52.573: W/System.err(5961):     at java.lang.Runtime.exec(Runtime.java:241)
    08-26 16:44:52.583: W/System.err(5961):     at java.lang.Runtime.exec(Runtime.java:184)
    08-26 16:44:52.593: W/System.err(5961):     at ch.ffmpeg.reversit.MainActivity.Execute(MainActivity.java:61)
    08-26 16:44:52.593: W/System.err(5961):     at ch.ffmpeg.reversit.MainActivity$1.onClick(MainActivity.java:46)
    08-26 16:44:52.593: W/System.err(5961):     at android.view.View.performClick(View.java:3480)
    08-26 16:44:52.593: W/System.err(5961):     at android.view.View$PerformClick.run(View.java:13983)
    08-26 16:44:52.603: W/System.err(5961):     at android.os.Handler.handleCallback(Handler.java:605)
    08-26 16:44:52.603: W/System.err(5961):     at android.os.Handler.dispatchMessage(Handler.java:92)
    08-26 16:44:52.603: W/System.err(5961):     at android.os.Looper.loop(Looper.java:137)
    08-26 16:44:52.614: W/System.err(5961):     at android.app.ActivityThread.main(ActivityThread.java:4340)
    08-26 16:44:52.624: W/System.err(5961):     at java.lang.reflect.Method.invokeNative(Native Method)
    08-26 16:44:52.624: W/System.err(5961):     at java.lang.reflect.Method.invoke(Method.java:511)
    08-26 16:44:52.634: W/System.err(5961):     at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:784)
    08-26 16:44:52.634: W/System.err(5961):     at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:551)
    08-26 16:44:52.644: W/System.err(5961):     at dalvik.system.NativeStart.main(Native Method)
    08-26 16:44:52.644: W/System.err(5961): Caused by: java.io.IOException: Permission denied
    08-26 16:44:52.674: W/System.err(5961):     at java.lang.ProcessManager.exec(Native Method)
    08-26 16:44:52.674: W/System.err(5961):     at java.lang.ProcessManager.exec(ProcessManager.java:209)
    08-26 16:44:52.684: W/System.err(5961):     ... 16 more

    Here is my code:

    // imports omitted
    public class MainActivity extends Activity {
       String com;
       Process process;
       EditText command;
       Button run;
       RelativeLayout main_layout;

       static {
        System.loadLibrary("ffmpeg");
       }

       /** Called when the activity is first created. */
       @Override
       public void onCreate(Bundle savedInstanceState) {
           super.onCreate(savedInstanceState);
           setContentView(R.layout.activity_main);

          //find view
          command=(EditText)findViewById(R.id.command);
          run=(Button)findViewById(R.id.run);

           run.setOnClickListener(new OnClickListener() {
               public void onClick(View v) {
                   com=command.getText().toString();
                   try {
                       Execute();
                   } catch (IOException e) {
                       // TODO Auto-generated catch block
                       e.printStackTrace();
                   } catch (InterruptedException e) {
                       // TODO Auto-generated catch block
                       e.printStackTrace();
                   }

               }
           });

       }

       public void Execute() throws IOException, InterruptedException{
           process=Runtime.getRuntime().exec(com);
           // process = pb.command(com).redirectErrorStream(true).start();

           if(process!=null)
           ShowOutput();
           else
           Toast.makeText(getBaseContext(),"Null Process",Toast.LENGTH_LONG).show();
       }

       public void ShowOutput() throws IOException, InterruptedException{
           String s,text="",errors="";
           BufferedReader stdInput = new BufferedReader(new
                   InputStreamReader(process.getInputStream()));

              BufferedReader stdError = new BufferedReader(new
                   InputStreamReader(process.getErrorStream()));


              TextView output=(TextView)findViewById(R.id.output);
              TextView error=(TextView)findViewById(R.id.error);

               while ((s = stdInput.readLine()) != null) {
                      text+=s.toString()+"\n";
                      System.out.println("Error: "+s);
                  }

            output.setText(text);
            text="";
              // read any errors from the attempted command
              System.out.println("Here is the standard error of the command (if any):\n");
              while ((s = stdError.readLine()) != null) {
                  text+=s.toString()+"\n";
                  System.out.println("Error: "+s);
              }

              error.setText(text);

              error.setMovementMethod(new ScrollingMovementMethod());
              output.setMovementMethod(new ScrollingMovementMethod());

              stdInput.close();
              stdError.close();

              process.waitFor();
              process.getOutputStream().close();
              process.getInputStream().close();
              process.getErrorStream().close();
              process.destroy();


       }

    }

    I even tried process = pb.command(com).redirectErrorStream(true).start(); for execution. It gives me the same error, but this time the environment was [ANDROID_SOCKET_Zygot] and so on.

    EDIT 1:
    I use OpenJDK on Ubuntu.

    Help me out !!
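
    For what it's worth, a minimal sketch of how Runtime.exec() is normally given a real executable; the path below is purely hypothetical and not taken from the question:

     // Runtime.exec() can only launch actual executable files: shell built-ins such
     // as "cd" are not standalone programs, and a shared library like libffmpeg.so
     // loaded with System.loadLibrary() cannot be started this way either.
     // Hypothetical path: assumes a standalone ffmpeg executable has been placed in
     // the app's private files directory and marked executable.
     String ffmpegPath = getFilesDir().getAbsolutePath() + "/ffmpeg";
     Process p = Runtime.getRuntime().exec(new String[] { ffmpegPath, "-version" });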