Search Results

Search found 29 results on 2 pages for 'mediarecorder'.

Page 1/2 | 1 2  | Next Page >

  • Problems with MediaRecorder class setting audio source - setAudioSource() - unsupported parameter

    - by arakn0
    Hello everybody, I'm new to Android development and I have the following question/problem. I'm playing around with the MediaRecorder class to record just audio from the microphone. I'm following the steps indicated on the official site: http://developer.android.com/reference/android/media/MediaRecorder.html So I have a method that initializes and configures the MediaRecorder object in order to start recording. Here is the code: this.mr = new MediaRecorder(); this.mr.setAudioSource(MediaRecorder.AudioSource.MIC); this.mr.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP); this.mr.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB); this.mr.setOutputFile(this.path + this.fileName); try { this.mr.prepare(); } catch (IllegalStateException e) { Log.d("Syso", e.toString()); e.printStackTrace(); } catch (IOException e) { Log.d("Syso", e.toString()); e.printStackTrace(); } When I execute this code in the emulator, logcat shows that setAudioSource(MediaRecorder.AudioSource.MIC) logs the following error (with the tag audio_input) when it is called: ERROR/audio_input(34): unsupported parameter: x-pvmf/media-input-node/cap-config-interface;valtype=key_specific_value ERROR/audio_input(34): VerifyAndSetParameter failed Then, when prepare() is called, I get another error: ERROR/PVOMXEncNode(34): PVMFOMXEncNode-Audio_AMRNB::DoPrepare(): Got Component OMX.PV.amrencnb handle If I start recording by calling start(), I get lots of messages saying: AudioFlinger(34): RecordThread: buffer overflow Then, after stop and release, I can see that a file has been created, but it doesn't seem to have been recorded properly. Anyway, if I try this on a real device I can record with no problems, but I CAN'T play what I just recorded. I guess the key is in the errors mentioned above. How can I fix them? Any suggestions or help? Thanks in advance!
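
    Two things are worth separating here. The emulator errors come from the old OpenCore/PV layer and the emulator's slow audio path (hence the buffer overflows), so they are likely emulator-only noise. On a real device, an unplayable file most often means stop() was never reached before release(), so the 3GP container is never finalized. A minimal sketch of the full lifecycle, assuming a writable path and the RECORD_AUDIO / WRITE_EXTERNAL_STORAGE permissions (the file name and log tag are illustrative):

        MediaRecorder mr = new MediaRecorder();
        mr.setAudioSource(MediaRecorder.AudioSource.MIC);
        mr.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
        mr.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
        File out = new File(Environment.getExternalStorageDirectory(), "test.3gp");
        mr.setOutputFile(out.getAbsolutePath());
        try {
            mr.prepare();
            mr.start();
            // ... record for a while ...
            mr.stop();          // finalizes the 3GP header; skip this and the file won't play
        } catch (IOException e) {
            Log.e("Rec", "recording failed", e);
        } finally {
            mr.release();       // always free the native recorder
        }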

    Read the article

  • android mediarecorder exception

    - by Arutha
    I try to use the MediaRecorder class to record a video, but I get an exception: 'failed to get Camera parameters. Prepare failed.' Here's my code: camera = Camera.open(); recorder = new MediaRecorder(); recorder.setCamera(camera); recorder.setVideoSource(VideoSource.CAMERA); recorder.setPreviewDisplay(m_holder.getSurface()); recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP); recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB); recorder.setVideoEncoder(MediaRecorder.VideoEncoder.DEFAULT); recorder.setMaxDuration(10000); recorder.setOutputFile(file.getPath()); recorder.prepare(); Any ideas?
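
    Per the Camera documentation, a camera opened by your app stays locked to your process, and Camera.unlock() (API 5+) must be called before handing it to MediaRecorder.setCamera(); skipping that step is a classic cause of "failed to get Camera parameters". Note also that this code calls setAudioEncoder() without ever calling setAudioSource(). A sketch of the adjusted opening sequence (the remaining calls stay as in the question):

        camera = Camera.open();
        camera.unlock();                 // let the media server take ownership of the camera
        recorder = new MediaRecorder();
        recorder.setCamera(camera);
        recorder.setAudioSource(MediaRecorder.AudioSource.MIC);  // required before setAudioEncoder()
        recorder.setVideoSource(VideoSource.CAMERA);
        // ... setPreviewDisplay(), setOutputFormat(), encoders, output file, prepare() ...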

    Read the article

  • [android] MediaRecorder prepare() causes segfault

    - by dwilde1
    Folks, I have a situation where my MediaRecorder instance causes a segfault. I'm working with an HTC Hero, Android 1.5 APIs. I've tried all variations, including 3gpp and H.263 and reducing the video resolution to 320x240. What am I missing? The state machine causes 4 MediaPlayer beeps and then turns on the video camera. Here's the pertinent source: UPDATE: ADDING SURFACE CREATE INFO. I have rebooted the device based on a previous answer to a similar question. UPDATE 2: I seem to be following the MediaRecorder state machine perfectly, and if I trap out the MR code, the blank surface displays perfectly and everything else functions perfectly. I can record videos manually and play them back via MediaPlayer in my code, so there should be nothing wrong with the underlying code. I've copied sample code for the surface and surfaceHolder setup. I've looked at the MR instance in the Debug perspective in Eclipse and see that all (known) variables seem to be instantiated correctly. The setter calls are all now implemented in the exact order specified in the state diagram. // in activity class definition protected MediaPlayer mPlayer; protected MediaRecorder mRecorder; protected boolean inCapture = false; protected int phaseCapture = 0; protected int durCapturePhase = INF; protected SurfaceView surface; protected SurfaceHolder surfaceHolder; // in onCreate() // panelPreview is an empty LinearLayout surface = new SurfaceView(getApplicationContext()); surfaceHolder = surface.getHolder(); surfaceHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS); panelPreview.addView(surface); // in timer handler runnable if (mRecorder == null) mRecorder = new MediaRecorder(); mRecorder.setAudioSource(MediaRecorder.AudioSource.MIC); mRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA); mRecorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP); mRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB); mRecorder.setOutputFile(path + "/" + vlip); mRecorder.setVideoSize(320, 240); mRecorder.setVideoFrameRate(15); mRecorder.setPreviewDisplay(surfaceHolder.getSurface()); panelPreview.setVisibility(LinearLayout.VISIBLE); mRecorder.prepare(); mRecorder.start(); Here is a complete log trace for the process run and crash: I/ActivityManager( 80): Start proc com.ejf.convince.jenplus for activity com.ejf.convince.jenplus/.JenPLUS: pid=17738 uid=10075 gids={1006, 3003} I/jdwp (17738): received file descriptor 10 from ADB W/System.err(17738): Can't dispatch DDM chunk 46454154: no handler defined W/System.err(17738): Can't dispatch DDM chunk 4d505251: no handler defined I/WindowManager( 80): Screen status=true, current orientation=-1, SensorEnabled=false I/WindowManager( 80): needSensorRunningLp, mCurrentAppOrientation =-1 I/WindowManager( 80): Enabling listeners W/ActivityThread(17738): Application com.ejf.convince.jenplus is waiting for the debugger on port 8100... I/System.out(17738): Sending WAIT chunk I/dalvikvm(17738): Debugger is active I/AlertDialog( 80): [onCreate] auto launch SIP. I/WindowManager( 80): onOrientationChanged, rotation changed to 0 I/System.out(17738): Debugger has connected I/System.out(17738): waiting for debugger to settle... (repeated 12 times) I/System.out(17738): debugger has settled (1370) I/ActivityManager( 80): Displayed activity com.ejf.convince.jenplus/.JenPLUS: 5186 ms I/OpenCore( 2696): [Hank debug] LN 289 FN CreateNode I/AudioHardwareMSM72XX( 2696): AUDIO_START: start kernel pcm_out driver. W/AudioFlinger( 2696): write blocked for 96 msecs I/PlayerDriver( 2696): CIQ 1625 sendEvent state=5 I/OpenCore( 2696): [Hank debug] LN 289 FN CreateNode I/PlayerDriver( 2696): CIQ 1625 sendEvent state=5 I/OpenCore( 2696): [Hank debug] LN 289 FN CreateNode I/PlayerDriver( 2696): CIQ 1625 sendEvent state=5 I/OpenCore( 2696): [Hank debug] LN 289 FN CreateNode I/PlayerDriver( 2696): CIQ 1625 sendEvent state=5 W/AuthorDriver( 2696): Intended width(640) exceeds the max allowed width(352). Max width is used instead. W/AuthorDriver( 2696): Intended height(480) exceeds the max allowed height(288). Max height is used instead. I/AudioHardwareMSM72XX( 2696): AudioHardware pcm playback is going to standby. I/DEBUG (16094): *** *** *** *** *** *** *** *** *** *** *** *** *** *** *** *** I/DEBUG (16094): Build fingerprint: 'sprint/htc_heroc/heroc/heroc: 1.5/CUPCAKE/85027:user/release-keys' I/DEBUG (16094): pid: 17738, tid: 17738 com.ejf.convince.jenplus Thanks in advance! -- Don Wilde http://www.ConvinceProject.com
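
    One plausible cause of this native crash on 1.5-era devices: the SurfaceView is built in code and handed to setPreviewDisplay() from a timer runnable, but a surface added to a layout is only backed by a real buffer once surfaceCreated() has fired, and preparing against a not-yet-created surface can take down the media server. A hedged sketch of deferring the recorder setup (startRecorder() is a hypothetical helper wrapping the setter sequence above):

        surfaceHolder.addCallback(new SurfaceHolder.Callback() {
            public void surfaceCreated(SurfaceHolder holder) {
                startRecorder(holder.getSurface());  // safe: the surface now actually exists
            }
            public void surfaceChanged(SurfaceHolder holder, int fmt, int w, int h) { }
            public void surfaceDestroyed(SurfaceHolder holder) { }
        });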

    Read the article

  • MediaRecorder prepare() causes segfault

    - by dwilde1
    Folks, I have a situation where my MediaRecorder instance causes a segfault. I'm working with an HTC Hero, Android 1.5 APIs. I've tried all variations, including 3gpp and H.263 and reducing the video resolution to 320x240. What am I missing? The state machine causes 4 MediaPlayer beeps and then turns on the video camera. Here's the pertinent source: UPDATE: ADDING SURFACE CREATE INFO. I have rebooted the device based on a previous answer to a similar question. UPDATE 2: I seem to be following the MediaRecorder state machine perfectly, and if I trap out the MR code, the blank surface displays perfectly and everything else functions perfectly. I can record videos manually and play them back via MediaPlayer in my code, so there should be nothing wrong with the underlying code. I've copied sample code for the surface and surfaceHolder setup. I've looked at the MR instance in the Debug perspective in Eclipse and see that all (known) variables seem to be instantiated correctly. The setter calls are all now implemented in the exact order specified in the state diagram. UPDATE 3: I've tried all permission combinations: CAMERA + RECORD_AUDIO+RECORD_VIDEO, CAMERA only, RECORD_AUDIO+RECORD_VIDEO. This is driving me bats! :))) // in activity class definition protected MediaPlayer mPlayer; protected MediaRecorder mRecorder; protected boolean inCapture = false; protected int phaseCapture = 0; protected int durCapturePhase = INF; protected SurfaceView surface; protected SurfaceHolder surfaceHolder; // in onCreate() // panelPreview is an empty LinearLayout surface = new SurfaceView(getApplicationContext()); surfaceHolder = surface.getHolder(); surfaceHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS); panelPreview.addView(surface); // in timer handler runnable if (mRecorder == null) mRecorder = new MediaRecorder(); mRecorder.setAudioSource(MediaRecorder.AudioSource.MIC); mRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA); mRecorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP); mRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB); mRecorder.setOutputFile(path + "/" + vlip); mRecorder.setVideoSize(320, 240); mRecorder.setVideoFrameRate(15); mRecorder.setPreviewDisplay(surfaceHolder.getSurface()); panelPreview.setVisibility(LinearLayout.VISIBLE); mRecorder.prepare(); mRecorder.start(); Here is a complete log trace for the process run and crash: I/ActivityManager( 80): Start proc com.ejf.convince.jenplus for activity com.ejf.convince.jenplus/.JenPLUS: pid=17738 uid=10075 gids={1006, 3003} I/jdwp (17738): received file descriptor 10 from ADB W/System.err(17738): Can't dispatch DDM chunk 46454154: no handler defined W/System.err(17738): Can't dispatch DDM chunk 4d505251: no handler defined I/WindowManager( 80): Screen status=true, current orientation=-1, SensorEnabled=false I/WindowManager( 80): needSensorRunningLp, mCurrentAppOrientation =-1 I/WindowManager( 80): Enabling listeners W/ActivityThread(17738): Application com.ejf.convince.jenplus is waiting for the debugger on port 8100... I/System.out(17738): Sending WAIT chunk I/dalvikvm(17738): Debugger is active I/AlertDialog( 80): [onCreate] auto launch SIP. I/WindowManager( 80): onOrientationChanged, rotation changed to 0 I/System.out(17738): Debugger has connected I/System.out(17738): waiting for debugger to settle... (repeated 12 times) I/System.out(17738): debugger has settled (1370) I/ActivityManager( 80): Displayed activity com.ejf.convince.jenplus/.JenPLUS: 5186 ms I/OpenCore( 2696): [Hank debug] LN 289 FN CreateNode I/AudioHardwareMSM72XX( 2696): AUDIO_START: start kernel pcm_out driver. W/AudioFlinger( 2696): write blocked for 96 msecs I/PlayerDriver( 2696): CIQ 1625 sendEvent state=5 I/OpenCore( 2696): [Hank debug] LN 289 FN CreateNode I/PlayerDriver( 2696): CIQ 1625 sendEvent state=5 I/OpenCore( 2696): [Hank debug] LN 289 FN CreateNode I/PlayerDriver( 2696): CIQ 1625 sendEvent state=5 I/OpenCore( 2696): [Hank debug] LN 289 FN CreateNode I/PlayerDriver( 2696): CIQ 1625 sendEvent state=5 W/AuthorDriver( 2696): Intended width(640) exceeds the max allowed width(352). Max width is used instead. W/AuthorDriver( 2696): Intended height(480) exceeds the max allowed height(288). Max height is used instead. I/AudioHardwareMSM72XX( 2696): AudioHardware pcm playback is going to standby. I/DEBUG (16094): *** *** *** *** *** *** *** *** *** *** *** *** *** *** *** *** I/DEBUG (16094): Build fingerprint: 'sprint/htc_heroc/heroc/heroc: 1.5/CUPCAKE/85027:user/release-keys' I/DEBUG (16094): pid: 17738, tid: 17738 com.ejf.convince.jenplus Thanks in advance! -- Don Wilde http://www.ConvinceProject.com

    Read the article

  • Starting to make progress (was: MediaRecorder prepare() causes segfault)

    - by dwilde1
    Folks, I have a situation where my MediaRecorder instance causes a segfault. I'm working with an HTC Hero, Android 1.5 APIs. I've tried all variations, including 3gpp and H.263 and reducing the video resolution to 320x240. What am I missing? The state machine causes 4 MediaPlayer beeps and then turns on the video camera. Here's the pertinent source: UPDATE: ADDING SURFACE CREATE INFO. I have rebooted the device based on a previous answer to a similar question. UPDATE 2: I seem to be following the MediaRecorder state machine perfectly, and if I trap out the MR code, the blank surface displays perfectly and everything else functions perfectly. I can record videos manually and play them back via MediaPlayer in my code, so there should be nothing wrong with the underlying code. I've copied sample code for the surface and surfaceHolder setup. I've looked at the MR instance in the Debug perspective in Eclipse and see that all (known) variables seem to be instantiated correctly. The setter calls are all now implemented in the exact order specified in the state diagram. UPDATE 3: I've tried all permission combinations: CAMERA + RECORD_AUDIO+RECORD_VIDEO, CAMERA only, RECORD_AUDIO+RECORD_VIDEO. This is driving me bats! :))) UPDATE 4: starting to work... but with puzzling results. Based on info in bug #5050, I spaced everything out. I have now gotten the recorder to actually save a snippet of video (a whole 2160 bytes!), and I did it by spacing the view visibility, prepare() and start() w.a.a.a.a.a.y out (like several hundred milliseconds for each step). I think what happens is that either bringing the surface VISIBLE has delayed processing or else the start() steps on the prepare() operation before it is complete. What is now happening, however, is that my simple timer tickdown counter is getting clobbered. Is it now that the preview and save operations are causing my main process thread to become unavailable? I'm recording only 10fps at 176x144. Referencing the above code, I've added a timer tickdown after setPreviewDisplay(), prepare() and start(). As I say, it now functions to some degree, but the results still have anomalies. // in activity class definition protected MediaPlayer mPlayer; protected MediaRecorder mRecorder; protected boolean inCapture = false; protected int phaseCapture = 0; protected int durCapturePhase = INF; protected SurfaceView surface; protected SurfaceHolder surfaceHolder; // in onCreate() // panelPreview is an empty LinearLayout surface = new SurfaceView(getApplicationContext()); surfaceHolder = surface.getHolder(); surfaceHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS); panelPreview.addView(surface); // in timer handler runnable if (mRecorder == null) mRecorder = new MediaRecorder(); mRecorder.setAudioSource(MediaRecorder.AudioSource.MIC); mRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA); mRecorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP); mRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB); mRecorder.setOutputFile(path + "/" + vlip); mRecorder.setVideoSize(320, 240); mRecorder.setVideoFrameRate(15); mRecorder.setPreviewDisplay(surfaceHolder.getSurface()); panelPreview.setVisibility(LinearLayout.VISIBLE); mRecorder.prepare(); mRecorder.start(); Here is a complete log trace for the process run and crash: I/ActivityManager( 80): Start proc com.ejf.convince.jenplus for activity com.ejf.convince.jenplus/.JenPLUS: pid=17738 uid=10075 gids={1006, 3003} I/jdwp (17738): received file descriptor 10 from ADB W/System.err(17738): Can't dispatch DDM chunk 46454154: no handler defined W/System.err(17738): Can't dispatch DDM chunk 4d505251: no handler defined I/WindowManager( 80): Screen status=true, current orientation=-1, SensorEnabled=false I/WindowManager( 80): needSensorRunningLp, mCurrentAppOrientation =-1 I/WindowManager( 80): Enabling listeners W/ActivityThread(17738): Application com.ejf.convince.jenplus is waiting for the debugger on port 8100... I/System.out(17738): Sending WAIT chunk I/dalvikvm(17738): Debugger is active I/AlertDialog( 80): [onCreate] auto launch SIP. I/WindowManager( 80): onOrientationChanged, rotation changed to 0 I/System.out(17738): Debugger has connected I/System.out(17738): waiting for debugger to settle... (repeated 12 times) I/System.out(17738): debugger has settled (1370) I/ActivityManager( 80): Displayed activity com.ejf.convince.jenplus/.JenPLUS: 5186 ms I/OpenCore( 2696): [Hank debug] LN 289 FN CreateNode I/AudioHardwareMSM72XX( 2696): AUDIO_START: start kernel pcm_out driver. W/AudioFlinger( 2696): write blocked for 96 msecs I/PlayerDriver( 2696): CIQ 1625 sendEvent state=5 I/OpenCore( 2696): [Hank debug] LN 289 FN CreateNode I/PlayerDriver( 2696): CIQ 1625 sendEvent state=5 I/OpenCore( 2696): [Hank debug] LN 289 FN CreateNode I/PlayerDriver( 2696): CIQ 1625 sendEvent state=5 I/OpenCore( 2696): [Hank debug] LN 289 FN CreateNode I/PlayerDriver( 2696): CIQ 1625 sendEvent state=5 W/AuthorDriver( 2696): Intended width(640) exceeds the max allowed width(352). Max width is used instead. W/AuthorDriver( 2696): Intended height(480) exceeds the max allowed height(288). Max height is used instead. I/AudioHardwareMSM72XX( 2696): AudioHardware pcm playback is going to standby. I/DEBUG (16094): *** *** *** *** *** *** *** *** *** *** *** *** *** *** *** *** I/DEBUG (16094): Build fingerprint: 'sprint/htc_heroc/heroc/heroc: 1.5/CUPCAKE/85027:user/release-keys' I/DEBUG (16094): pid: 17738, tid: 17738 com.ejf.convince.jenplus Thanks in advance! -- Don Wilde http://www.ConvinceProject.com
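
    The fixed sleeps in UPDATE 4 suggest start() was racing prepare() and the preview becoming visible. Rather than blocking, the same spacing can be expressed as chained Handler callbacks on the main thread; the 300 ms delays below are assumptions standing in for the "several hundred milliseconds" the author found by trial:

        final Handler h = new Handler();
        panelPreview.setVisibility(LinearLayout.VISIBLE);
        h.postDelayed(new Runnable() {
            public void run() {
                try {
                    mRecorder.prepare();
                } catch (Exception e) { Log.e("Rec", "prepare failed", e); return; }
                h.postDelayed(new Runnable() {
                    public void run() { mRecorder.start(); }
                }, 300);   // assumed settle time between prepare() and start()
            }
        }, 300);           // assumed settle time after showing the preview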

    Read the article

  • Video Recording Not Working in ICS

    - by Nirav Ranpara
    I have implemented code to record video on an Android phone. This code works in 2.2 and 2.3, but not in ICS; when I checked in ICS, the code does not work. Here I've posted the code and XML file. videorecord.java import java.io.File; import java.io.IOException; import android.app.Activity; import android.app.AlertDialog; import android.content.Context; import android.content.DialogInterface; import android.content.Intent; import android.content.SharedPreferences; import android.hardware.Camera; import android.media.CamcorderProfile; import android.media.MediaRecorder; import android.os.Bundle; import android.os.CountDownTimer; import android.os.Environment; import android.util.Log; import android.view.Display; import android.view.KeyEvent; import android.view.SurfaceHolder; import android.view.SurfaceView; import android.view.View; import android.widget.EditText; import android.widget.FrameLayout; import android.widget.ImageView; import android.widget.LinearLayout; import android.widget.TextView; import android.widget.Toast; public class videorecord extends Activity{ SharedPreferences.Editor pre; String filename; CountDownTimer t; private Camera myCamera; private MyCameraSurfaceView myCameraSurfaceView; private MediaRecorder mediaRecorder; Integer cnt=0; LinearLayout myButton; TextView myButton1; SurfaceHolder surfaceHolder; boolean recording; private TextView txtcount; private ImageView btnplay; @Override public void onCreate(Bundle savedInstanceState) { super.onCreate(savedInstanceState); recording = false; setContentView(R.layout.videorecord); init(); myCamera = getCameraInstance(); if(myCamera == null){ } myCameraSurfaceView = new MyCameraSurfaceView(this, myCamera); FrameLayout myCameraPreview = (FrameLayout)findViewById(R.id.videoview); Display display = getWindowManager().getDefaultDisplay(); int width = display.getWidth(); int height = display.getHeight(); myCameraSurfaceView.setLayoutParams(new LinearLayout.LayoutParams(width, height-60)); myCameraPreview.addView(myCameraSurfaceView); myButton = (LinearLayout)findViewById(R.id.mybutton); btnplay.setOnClickListener(myButtonOnClickListener); } private void init() { txtcount = (TextView) findViewById(R.id.txtcounter); //myButton1 = (TextView) findViewById(R.id.mybutton1); btnplay = (ImageView)findViewById(R.id.btnplay); t = new CountDownTimer( Long.MAX_VALUE , 1000) { @Override public void onTick(long millisUntilFinished) { cnt++; String time = new Integer(cnt).toString(); long millis = cnt; int seconds = (int) (millis / 60); int minutes = seconds / 60; seconds = seconds % 60; txtcount.setText(String.format("%d:%02d:%02d", minutes, seconds,millis)); } @Override public void onFinish() { } }; } @Override public boolean onKeyDown(int keyCode, KeyEvent event) { if ((keyCode == KeyEvent.KEYCODE_BACK)) { if(recording) { new AlertDialog.Builder(videorecord.this).setTitle("Do you want to save Video ?") .setPositiveButton("OK", new DialogInterface.OnClickListener() { public void onClick(DialogInterface dialog, int which) { filename(); //finish(); } }).setNegativeButton("Cancel", new DialogInterface.OnClickListener() { public void onClick(DialogInterface dialog, int which) { // TODO Auto-generated method stub } }).show(); } else { if ((keyCode == KeyEvent.KEYCODE_BACK)) { //Intent homeIntent= new Intent(Intent.ACTION_MAIN); //homeIntent.addCategory(Intent.CATEGORY_HOME); //homeIntent.setFlags(Intent.FLAG_ACTIVITY_CLEAR_TOP); //startActivity(homeIntent); //this.finishActivity(1); finish(); } //moveTaskToBack(true); // finish(); return
super.onKeyDown(keyCode, event); } } else { // Toast.makeText(getApplicationContext(), "asd", Toast.LENGTH_LONG).show(); android.os.Process.killProcess(android.os.Process.myPid()) ; } return super.onKeyDown(keyCode, event); } ImageView.OnClickListener myButtonOnClickListener = new ImageView.OnClickListener(){ public void onClick(View v) { if(recording){ Log.e("Record error", "error in recording ."); mediaRecorder.stop(); t.cancel(); filename(); releaseMediaRecorder(); }else{ releaseCamera(); Log.e("Record Stop error", "error in recording ."); // if(!prepareMediaRecorder()){ prepareMediaRecorder(); finish(); } mediaRecorder.start(); recording = true; // myButton1.setText("STOP Recording"); // btnplay.setImageResource(android.R.drawable.ic_media_pause); btnplay.setImageResource(R.drawable.stoprec); t.start(); } }}; private Camera getCameraInstance(){ Camera c = null; try { c = Camera.open(); } catch (Exception e){ } return c; } private void filename() { AlertDialog.Builder alert = new AlertDialog.Builder(this); alert.setTitle("Save Video"); alert.setMessage("Enter File Name"); final EditText input = new EditText(this); alert.setView(input); alert.setPositiveButton("Ok", new DialogInterface.OnClickListener() { public void onClick(DialogInterface dialog, int whichButton) { if(input.getText().length()>=1) { filename = input.getText().toString(); File sdcard = new File(Environment.getExternalStorageDirectory() + "/VideoRecord"); File from = new File(sdcard,"null.mp4"); File to = new File(sdcard,filename+".mp4"); from.renameTo(to); SharedPreferences sp = videorecord.this.getSharedPreferences("data", MODE_WORLD_WRITEABLE); pre = sp.edit(); pre.clear(); pre.commit(); pre.putString("lastvideo", filename+".mp4"); pre.commit(); //btnplay.setImageResource(android.R.drawable.ic_media_play); btnplay.setImageResource(R.drawable.startrec); // Intent intent = new Intent(videorecord.this,StopVidoWatch_Activity.class); // startActivity(intent); Intent myIntent = new Intent(getApplicationContext(), StopVidoWatch_Activity.class).setFlags(Intent.FLAG_ACTIVITY_CLEAR_TOP); startActivity(myIntent); } else { filename(); } } }); alert.setNegativeButton("Cancel", new DialogInterface.OnClickListener() { public void onClick(DialogInterface dialog, int whichButton) { // Intent intent = new Intent(videorecord.this,StopVidoWatch_Activity.class); // startActivity(intent); File file = new File(Environment.getExternalStorageDirectory() + "/VideoRecord/null.mp4"); //boolean deleted = file.delete(); file.delete(); finish(); } }); alert.show(); } private boolean prepareMediaRecorder(){ myCamera = getCameraInstance(); mediaRecorder = new MediaRecorder(); myCamera.unlock(); mediaRecorder.setCamera(myCamera); mediaRecorder.setAudioSource(MediaRecorder.AudioSource.CAMCORDER); mediaRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA); mediaRecorder.setProfile(CamcorderProfile.get(CamcorderProfile.QUALITY_HIGH)); File folder = new File(Environment.getExternalStorageDirectory() + "/VideoRecord"); boolean success = false; if (!folder.exists()) { success = folder.mkdir(); } if (!success) { } else { } mediaRecorder.setOutputFile("/sdcard/VideoRecord/"+filename+".mp4"); mediaRecorder.setMaxDuration(60000); mediaRecorder.setMaxFileSize(5000000); Display display = getWindowManager().getDefaultDisplay(); int width = display.getHeight(); int height = display.getWidth(); String s = new String(); s= s.valueOf(width); String s1 = new String(); s1= s1.valueOf(height); // Toast.makeText(videorecord.this, "Width : " + s , 
Toast.LENGTH_LONG).show(); // Toast.makeText(videorecord.this, "Height : " + s1 , Toast.LENGTH_LONG).show(); mediaRecorder.setVideoSize(height, width); mediaRecorder.setPreviewDisplay(myCameraSurfaceView.getHolder().getSurface()); try { mediaRecorder.prepare(); } catch (IllegalStateException e) { releaseMediaRecorder(); return false; } catch (IOException e) { releaseMediaRecorder(); return false; } return true; } @Override protected void onPause() { super.onPause(); releaseMediaRecorder(); releaseCamera(); } private void releaseMediaRecorder() { if (mediaRecorder != null) { mediaRecorder.reset(); mediaRecorder.release(); mediaRecorder = null; myCamera.lock(); } } private void releaseCamera(){ if (myCamera != null){ myCamera.release(); myCamera = null; } } public class MyCameraSurfaceView extends SurfaceView implements SurfaceHolder.Callback{ private SurfaceHolder mHolder; private Camera mCamera; public MyCameraSurfaceView(Context context, Camera camera) { super(context); mCamera = camera; mHolder = getHolder(); mHolder.addCallback(this); mHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS); } public void surfaceChanged(SurfaceHolder holder, int format, int weight, int height) { if (mHolder.getSurface() == null){ return; } try { mCamera.stopPreview(); } catch (Exception e){ } try { mCamera.setPreviewDisplay(mHolder); mCamera.startPreview(); } catch (Exception e){ } } public void surfaceCreated(SurfaceHolder holder) { try { mCamera.setPreviewDisplay(holder); mCamera.startPreview(); } catch (IOException e) { } } public void surfaceDestroyed(SurfaceHolder holder) { } } } videorecord.xml <?xml version="1.0" encoding="utf-8"?> <LinearLayout xmlns:android="http://schemas.android.com/apk/res/android" android:orientation="vertical" android:layout_width="fill_parent" android:layout_height="fill_parent" > <FrameLayout android:layout_width="fill_parent" android:layout_height="fill_parent" > <FrameLayout android:id="@+id/videoview" android:layout_width="fill_parent" android:layout_height="fill_parent"></FrameLayout> <LinearLayout android:id="@+id/mybutton" android:layout_width="fill_parent" android:layout_marginBottom="0dip" android:layout_height="wrap_content" android:orientation="horizontal" android:layout_weight="0" > <!-- <TextView android:text="START Recording" android:id="@+id/mybutton1" android:layout_height="wrap_content" android:layout_width="wrap_content" style="@style/savestyle" android:layout_weight="1" android:gravity="left" > </TextView> --> <ImageView android:layout_height="wrap_content" android:id="@+id/btnplay" android:padding="5dip" android:background="#A0000000" android:textColor="#ffffffff" android:layout_width="wrap_content" android:src="@drawable/startrec" /> </LinearLayout> <TextView android:text="00:00:00" android:id="@+id/txtcounter" android:layout_width="wrap_content" android:layout_height="wrap_content" android:layout_gravity="right|bottom" android:padding="5dip" android:background="#A0000000" android:textColor="#ffffffff" /> </FrameLayout> <RelativeLayout android:layout_width="fill_parent" android:layout_height="fill_parent" android:background="@color/bgcolor" > <LinearLayout android:layout_above="@+id/mybutton" android:orientation="horizontal" android:layout_width="fill_parent" android:layout_height="fill_parent" > </LinearLayout> </RelativeLayout> </LinearLayout>
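
    A plausible ICS-specific culprit: prepareMediaRecorder() applies CamcorderProfile.QUALITY_HIGH and then overrides its dimensions with the display size via setVideoSize(height, width); a tablet's display resolution is generally not a supported camera video size, and 4.0 rejects unsupported sizes where 2.2/2.3 were lenient. A hedged sketch of letting the profile keep its own dimensions:

        // In prepareMediaRecorder():
        CamcorderProfile profile = CamcorderProfile.get(CamcorderProfile.QUALITY_HIGH);
        mediaRecorder.setProfile(profile);   // already selects a supported width/height
        // ...and drop the later override:
        // mediaRecorder.setVideoSize(height, width);  // display size, likely unsupported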

    Read the article

  • Get Object from memory using memory address

    - by Hamza Karmouda
    I want to know how to get an object back from memory, in my case a MediaRecorder. Here's my class: MyMic class: public class MyMic { MediaRecorder recorder2; File file; private Context c; public MyMic(Context context){ this.c=context; } private void stopRecord() throws IOException { recorder2.stop(); recorder2.reset(); recorder2.release(); } private void startRecord() { recorder2= new MediaRecorder(); recorder2.setAudioSource(MediaRecorder.AudioSource.MIC); recorder2.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP); recorder2.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB); recorder2.setOutputFile(file.getPath()); try { recorder2.prepare(); recorder2.start(); } catch (IllegalStateException e) { e.printStackTrace(); } catch (IOException e) { e.printStackTrace(); } } } My receiver class: public class MyReceiver extends BroadcastReceiver { private Context c; private MyMic myMic; @Override public void onReceive(Context context, Intent intent) { this.c=context; myMic = new MyMic(c); if(my condition = true){ myMic.startRecord(); }else myMic.stopRecord(); } } So when I call startRecord() it creates a new MediaRecorder, but when my class is instantiated a second time I can't retrieve my object. Can I retrieve my MediaRecorder from its address?
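
    No: there is no supported way to re-find a Java object by its address. A new MyReceiver (and, with this code, a new MyMic) is created for every broadcast, so the earlier MediaRecorder reference is simply lost. The usual fix is to park the recorder somewhere that outlives the receiver, for example a process-wide singleton; a minimal sketch (RecorderHolder is a made-up name):

        public class RecorderHolder {
            private static MyMic instance;
            public static synchronized MyMic get(Context c) {
                if (instance == null) {
                    instance = new MyMic(c.getApplicationContext());
                }
                return instance;
            }
        }

        // in MyReceiver.onReceive():
        MyMic myMic = RecorderHolder.get(context);   // same instance across broadcasts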

    Read the article

  • Record audio via MediaRecorder

    - by Isuru Madusanka
    I am trying to record audio with MediaRecorder, and I get an error. I've tried changing everything and nothing works. For the last two hours I've been trying to find the error; I used the Log class too, and I found that the error occurs when recorder.start() is called. What could be the problem? public class AudioRecorderActivity extends Activity { MediaRecorder recorder; File audioFile = null; private static final String TAG = "AudioRecorderActivity"; private View startButton; private View stopButton; /** Called when the activity is first created. */ @Override public void onCreate(Bundle savedInstanceState) { super.onCreate(savedInstanceState); startButton = findViewById(R.id.start); stopButton = findViewById(R.id.stop); setContentView(R.layout.main); } public void startRecording(View view) throws IOException{ startButton.setEnabled(false); stopButton.setEnabled(true); File sampleDir = Environment.getExternalStorageDirectory(); try{ audioFile = File.createTempFile("sound", ".3gp", sampleDir); }catch(IOException e){ Toast.makeText(getApplicationContext(), "SD Card Access Error", Toast.LENGTH_LONG).show(); Log.e(TAG, "Sdcard access error"); return; } recorder = new MediaRecorder(); recorder.setAudioSource(MediaRecorder.AudioSource.MIC); recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP); recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB); recorder.setAudioEncodingBitRate(16); recorder.setAudioSamplingRate(44100); recorder.setOutputFile(audioFile.getAbsolutePath()); recorder.prepare(); recorder.start(); } public void stopRecording(View view){ startButton.setEnabled(true); stopButton.setEnabled(false); recorder.stop(); recorder.release(); addRecordingToMediaLibrary(); } protected void addRecordingToMediaLibrary(){ ContentValues values = new ContentValues(4); long current = System.currentTimeMillis(); values.put(MediaStore.Audio.Media.TITLE, "audio" + audioFile.getName()); values.put(MediaStore.Audio.Media.DATE_ADDED, (int)(current/1000)); values.put(MediaStore.Audio.Media.MIME_TYPE, "audio/3gpp"); values.put(MediaStore.Audio.Media.DATA, audioFile.getAbsolutePath()); ContentResolver contentResolver = getContentResolver(); Uri base = MediaStore.Audio.Media.EXTERNAL_CONTENT_URI; Uri newUri = contentResolver.insert(base, values); sendBroadcast(new Intent(Intent.ACTION_MEDIA_SCANNER_SCAN_FILE, newUri)); Toast.makeText(this, "Added File" + newUri, Toast.LENGTH_LONG).show(); } } And here is the XML layout. <?xml version="1.0" encoding="utf-8"?> <RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android" android:id="@+id/RelativeLayout1" android:layout_width="fill_parent" android:layout_height="fill_parent" android:orientation="vertical" > <Button android:id="@+id/start" android:layout_width="wrap_content" android:layout_height="wrap_content" android:layout_alignParentTop="true" android:layout_centerHorizontal="true" android:layout_marginTop="146dp" android:onClick="startRecording" android:text="Start Recording" /> <Button android:id="@+id/stop" android:layout_width="wrap_content" android:layout_height="wrap_content" android:layout_alignLeft="@+id/start" android:layout_below="@+id/start" android:layout_marginTop="41dp" android:enabled="false" android:onClick="stopRecording" android:text="Stop Recording" /> </RelativeLayout> And I added the permissions to the AndroidManifest file.
<?xml version="1.0" encoding="utf-8"?> <manifest xmlns:android="http://schemas.android.com/apk/res/android" package="in.isuru.audiorecorder" android:versionCode="1" android:versionName="1.0" > <uses-sdk android:minSdkVersion="8" /> <application android:icon="@drawable/ic_launcher" android:label="@string/app_name" > <activity android:name=".AudioRecorderActivity" android:label="@string/app_name" > <intent-filter> <action android:name="android.intent.action.MAIN" /> <category android:name="android.intent.category.LAUNCHER" /> </intent-filter> </activity> </application> <uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE"/> <uses-permission android:name="android.permission.RECORD_AUDIO" /> </manifest> I need to record high quality audio. Thanks!
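
    Two details in this code stand out. findViewById() runs before setContentView(), so startButton and stopButton are null (the buttons still work only because the layout wires android:onClick directly). More likely to break start() itself: setAudioEncodingBitRate(16) requests 16 bits per second, and 44100 Hz does not fit AMR-NB, which is an 8 kHz codec topping out at 12.2 kbit/s. A hedged sketch of saner values; for genuinely high-quality audio, AAC (MediaRecorder.AudioEncoder.AAC, API 10+) with MPEG_4 output would be the better fit:

        setContentView(R.layout.main);            // inflate first...
        startButton = findViewById(R.id.start);   // ...then look the views up
        stopButton = findViewById(R.id.stop);
        // ...
        recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
        recorder.setAudioEncodingBitRate(12200);  // AMR-NB's highest rate, in bits per second
        recorder.setAudioSamplingRate(8000);      // AMR-NB is defined at 8 kHz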

    Read the article

  • How do I use MediaRecorder to record video without causing a segmentation fault?

    - by rabidsnail
    I'm trying to use android.media.MediaRecorder to record video, and no matter what I do the android runtime segmentation faults when I call prepare(). Here's an example: public void onCreate(Bundle savedInstanceState) { Log.i("video test", "making recorder"); MediaRecorder recorder = new MediaRecorder(); contentResolver = getContentResolver(); try { super.onCreate(savedInstanceState); Log.i("video test", "--------------START----------------"); SurfaceView target_view = new SurfaceView(this); Log.i("video test", "making surface"); Surface target = target_view.getHolder().getSurface(); Log.i("video test", target.toString()); Log.i("video test", "new recorder"); recorder = new MediaRecorder(); Log.i("video test", "set display"); recorder.setPreviewDisplay(target); Log.i("video test", "pushing surface"); setContentView(target_view); Log.i("video test", "set audio source"); recorder.setAudioSource(MediaRecorder.AudioSource.MIC); Log.i("video test", "set video source"); recorder.setVideoSource(MediaRecorder.VideoSource.DEFAULT); Log.i("video test", "set output format"); recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP); Log.i("video test", "set audio encoder"); recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB); Log.i("video test", "set video encoder"); recorder.setVideoEncoder(MediaRecorder.VideoEncoder.MPEG_4_SP); Log.i("video test", "set max duration"); recorder.setMaxDuration(3600); Log.i("video test", "set on info listener"); recorder.setOnInfoListener(new listener()); Log.i("video test", "set video size"); recorder.setVideoSize(320, 240); Log.i("video test", "set video frame rate"); recorder.setVideoFrameRate(15); Log.i("video test", "set output file"); recorder.setOutputFile(get_path(this, "foo.3gp")); Log.i("video test", "prepare"); recorder.prepare(); Log.i("video test", "start"); recorder.start(); Log.i("video test", "sleep"); Thread.sleep(3600); Log.i("video test", "stop"); recorder.stop(); Log.i("video test", "release"); recorder.release(); Log.i("video test", "-----------------SUCCESS------------------"); finish(); } catch (Exception e) { Log.i("video test", e.toString()); recorder.reset(); recorder.release(); Log.i("video tets", "-------------------FAIL-------------------"); finish(); } } public static String get_path (Context context, String fname) { String path = context.getFileStreamPath("foo").getParentFile().getAbsolutePath(); String res = path+"/"+fname; Log.i("video test", "path: "+res); return res; } class listener implements MediaRecorder.OnInfoListener { public void onInfo(MediaRecorder recorder, int what, int extra) { Log.i("video test", "Video Info: "+what+", "+extra); } }
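
    Two timing details are easy to miss here. setMaxDuration() and Thread.sleep() both take milliseconds, so 3600 is 3.6 seconds, not an hour, and sleeping inside onCreate() blocks the main thread, so the preview surface never even gets a chance to be created before prepare() runs. A sketch that stops via a posted callback instead of sleeping (keeps the question's 3600 ms figure; assumes recorder is a field or effectively final):

        recorder.start();
        new Handler().postDelayed(new Runnable() {
            public void run() {
                recorder.stop();      // runs ~3.6 s later without freezing the UI
                recorder.release();
                finish();
            }
        }, 3600);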

    Read the article

  • Android - What is MediaRecorder's maximum maxFileSize?

    - by andy_spoo
    Android - What is the maximum file size that setMaxFileSize can be set to with respect to Android's MediaRecorder? I know it's somewhere between 4147483650 and 5147483650. Why is there a limit in the first place? I'm recording onto an SD card, detecting the size of the card's free space before we run. "ERROR/AuthorDriver(31): setParameter(max-filesize = 7270309850) failed with result -5" "ERROR/AuthorDriver(31): Ln 903 handleSetParameters("max-filesize=7270309850") error" "ERROR/AndroidRuntime(409): java.lang.RuntimeException: setMaxFileSize failed."
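
    The reported bracket contains 2^32 = 4294967296, which suggests the native author driver stores the value in a 32-bit field and rejects anything larger (the -5 in the log). That interpretation is an assumption rather than documented behavior, so a defensive clamp is the safest move; freeSpaceOnSdCard() below is a hypothetical helper:

        long requested = freeSpaceOnSdCard();   // however you measure the card's free space
        long cap = 4294967295L;                 // assumed 32-bit ceiling in the native layer
        recorder.setMaxFileSize(Math.min(requested, cap));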

    Read the article

  • How can I send a live video stream to a remote server from my phone?

    - by poc
    Hello, I have a problem streaming video to a server in real time from my phone. That is, I want my phone to act as an IP camera, so the server can watch live video from my phone. I have googled many, many solutions, but none of them solves my problem. I use MediaRecorder to record; it can save the video file to the SD card correctly. Then I referred to this page and used the following method: skt = new Socket(InetAddress.getByName(hostname),port); pfd =ParcelFileDescriptor.fromSocket(skt); mediaRecorder.setOutputFile(pfd.getFileDescriptor()); Now it seems I can send the video stream while recording. However, I wrote a receiver-side program to receive the video stream from Android, and it doesn't work. Is there an error somewhere? I can receive the file, but I cannot open the video file. I guess the problem may be caused by the file format? Here is an outline of my code. On the Android side: Socket skt = new Socket(hostIP,port); ParcelFileDescriptor pfd =ParcelFileDescriptor.fromSocket(skt); .... .... mediaRecorder.setAudioSource(MediaRecorder.AudioSource.MIC); mediaRecorder.setVideoSource(MediaRecorder.VideoSource.DEFAULT); mediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4); mediaRecorder.setOutputFile(pfd.getFileDescriptor()); ..... mediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.DEFAULT); mediaRecorder.setVideoEncoder(MediaRecorder.VideoEncoder.MPEG_4_SP); ..... mediaRecorder.start(); On the receiver side (my Acer notebook): // anyway, I don't think the file extension will have any effect File video = new File (strDate+".3gpp"); FileOutputStream fos; try { fos = new FileOutputStream(video); byte[] data = new byte[1024]; int count =-1; while( (count = fin.read(data,0,1024) ) !=-1) { fos.write(data,0,count); fos.flush(); } fos.close(); fin.close(); I've been confused for a long time... Thanks in advance.
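
    The file-format guess is probably right, but the deeper issue is the container: MPEG-4/3GPP headers are finalized when stop() is called, and the writer seeks backwards to patch them. A socket descriptor cannot seek, so what arrives is a headerless fragment that players reject. A quick sanity check is to record to a real (seekable) file and send it only after stop(); sketch below. For genuinely live video you would need streamable packaging such as RTP instead.

        File tmp = new File(Environment.getExternalStorageDirectory(), "clip.3gp");
        mediaRecorder.setOutputFile(tmp.getAbsolutePath());
        // ... record ...
        mediaRecorder.stop();                    // the container header is written here
        FileInputStream in = new FileInputStream(tmp);
        OutputStream netOut = skt.getOutputStream();
        byte[] buf = new byte[1024];
        int n;
        while ((n = in.read(buf)) != -1) {
            netOut.write(buf, 0, n);             // the receiver now gets a playable file
        }
        in.close();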

    Read the article

  • MediaRecorder Prepare Failed

    - by AndyTang
    Hi, I'm new here. I have been trying to create a video capture app using the Android emulator without much success. As far as I know, and looking through all the samples and code on the internet (this site and others), I must still be missing a step. I've tried using the sample near the end of this thread made by JonPro: http://www.anddev.org/viewtopic.php?p=24723#24723 and I've tried making my own, but the media recorder would always fail at the prepare stage with the most unhelpful message of 'prepare failed'. I have no clue what I am missing. I seem to have the correct permissions, and an SD card is mounted according to the emulator. Should I be using an Android SDK version other than 2.1? Even though the code in that forum claims to work, I figured out that this line was missing: recorder.setCamera(camera); But still no joy, as the log shows 'Failed to get camera(0x16b70) parameters' when prepare() is called. It still doesn't make sense, as the preview is okay, but there's no recording! Any help or suggestions will be appreciated.

    Read the article

  • Why do the usable sizes differ

    - by Raigex
    Again, I don't really know how to phrase the question, so I will explain. I have a video recorder application. I open my camera with cameraRecorder = Camera.open(1); //(this is the front facing camera) And get the camera parameters and all supported preview sizes: Camera.Parameters tmpParams = cameraRecorder.getParameters(); List<Camera.Size> tmpList = tmpParams.getSupportedPreviewSizes(); One of the preview sizes on the Galaxy Tab 10.1 running ICS (4.0.4) is 800x600, but when I try to set the video size on my MediaRecorder with mediaRecorder.setVideoSize(800, 600); I get this error: 12-19 17:27:55.035: E/CameraSource(110): Video dimension (800x600) is unsupported 12-19 17:27:55.035: E/StagefrightRecorder(110): cameraSource do not init 12-19 17:27:55.035: E/StagefrightRecorder(110): setupCameraSource failed. (-19) 12-19 17:27:55.035: E/StagefrightRecorder(110): setupMediaSource is failed. (-19) 12-19 17:27:55.035: E/StagefrightRecorder(110): setupMPEG4Recording is failed. (-19) 12-19 17:27:55.035: E/MediaRecorder(30119): start failed: -19 Does anyone know why this discrepancy might exist? (I know one of the supported record sizes is 1280x720, but that is too big for me.)
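
    Preview sizes and recordable sizes are separate capability lists. Since API 11, Camera.Parameters exposes getSupportedVideoSizes(), and 800x600 evidently sits only in the preview list on this tablet. A sketch of consulting the right list before calling setVideoSize() (per the docs, a null return means video sizes coincide with preview sizes):

        Camera.Parameters p = cameraRecorder.getParameters();
        List<Camera.Size> videoSizes = p.getSupportedVideoSizes();   // API 11+
        if (videoSizes == null) {
            videoSizes = p.getSupportedPreviewSizes();  // null => same as preview sizes
        }
        for (Camera.Size s : videoSizes) {
            Log.d("Sizes", "recordable: " + s.width + "x" + s.height);
        }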

    Read the article

  • Android: Using MediaRecorder to crop an existing audio file?

    - by user141146
    Hi, I'd like to take an existing mp3 file located on an SD card and arbitrarily crop it (e.g. crop from 0:12 to 1:14 in a 3-minute song). The only class I've seen that seems remotely relevant for this is the MediaRecorder class. My 'hope' would be to "record" an existing file like this: MediaRecorder recorder = new MediaRecorder(); recorder.setAudioSource(###some magical way of specifying an existing file??###); But this obviously doesn't work (setAudioSource() takes an int and seems to default to the phone's microphone). Is there a class or an approach that can be used to crop audio on the phone itself? Thanks!!
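
    Correct: MediaRecorder only captures live sources, so cropping has to happen on the file itself. For a constant-bitrate mp3, a rough cut can be made by copying the byte range matching the time window; the sketch below assumes CBR and ignores ID3 tags and frame boundaries (decoders usually resync, but the edges may click):

        void cropMp3(File src, File dst, int startSec, int endSec, int bitrateBps) throws IOException {
            long bytesPerSec = bitrateBps / 8;           // e.g. 128000 bps -> 16000 bytes/s
            long from = startSec * bytesPerSec;
            long len = (endSec - startSec) * bytesPerSec;
            RandomAccessFile in = new RandomAccessFile(src, "r");
            FileOutputStream out = new FileOutputStream(dst);
            in.seek(from);
            byte[] buf = new byte[4096];
            long copied = 0;
            int n;
            while (copied < len
                    && (n = in.read(buf, 0, (int) Math.min(buf.length, len - copied))) != -1) {
                out.write(buf, 0, n);
                copied += n;
            }
            out.close();
            in.close();
        }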

    Read the article

  • Android - Audio recorder FileNotFound

    - by david
    Hi, I'm trying to record audio: this.recorder = new android.media.MediaRecorder(); this.recorder.setAudioSource(android.media.MediaRecorder.AudioSource.MIC); this.recorder.setOutputFormat(android.media.MediaRecorder.OutputFormat.DEFAULT); this.recorder.setAudioEncoder(android.media.MediaRecorder.AudioEncoder.DEFAULT); this.recorder.setOutputFile("pruebaAudioRecorder.mp4"); this.recorder.prepare(); this.recorder.start(); but when I call the prepare method it throws a FileNotFound exception. Should I create the file before calling prepare()? Something like new File(...)? If so, what should the file path be? Thanks a lot.
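
    MediaRecorder creates the output file itself, so no File object is needed beforehand; the catch is that a bare relative name like "pruebaAudioRecorder.mp4" resolves against "/", which the app cannot write, hence the FileNotFoundException. Pass an absolute, writable path instead; a sketch (assumes WRITE_EXTERNAL_STORAGE, or use getFilesDir() for app-private storage):

        File out = new File(Environment.getExternalStorageDirectory(), "pruebaAudioRecorder.mp4");
        this.recorder.setOutputFile(out.getAbsolutePath());   // e.g. /sdcard/pruebaAudioRecorder.mp4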

    Read the article

  • Android stream to Wowza

    - by Curtis Kiu
    I feel very confused about streaming from Android to Wowza. I am building a cross-platform video conference over RTMP, but Android doesn't speak RTMP, so I need to find another way to do it. Upstreaming: I found an open-source app called spydroid-ipcamera. It uses RTP, sending UDP packets to the computer, and the stream opens in VLC using the following SDP: v=0 s=Unnamed m=video 5006 RTP/AVP 96 a=rtpmap:96 H264/90000 a=fmtp:96 packetization-mode=1;profile-level-id=420016;sprop-parameter-sets=Z0IAFukBQHsg,aM4BDyA=; But it doesn't work. Then I followed the Wowza tutorial, streamed to it, and played it back in VLC. That works! I wrote it up in http://code.google.com/p/spydroid-ipcamera/issues/detail?id=2 However, when I want to add audio to the packets, it fails. I changed the code in http://code.google.com/p/spydroid-ipcamera/source/browse/trunk/src/net/mkp/spydroid/CameraStreamer.java to: mr.setAudioSource(MediaRecorder.AudioSource.MIC); mr.setVideoSource(MediaRecorder.VideoSource.CAMERA); mr.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4); mr.setVideoFrameRate(20); mr.setVideoSize(640, 480); mr.setAudioEncoder(MediaRecorder.AudioEncoder.AAC); mr.setVideoEncoder(MediaRecorder.VideoEncoder.H264); mr.setPreviewDisplay(holder.getSurface()); Then I thought the problem must be in the SDP, but I don't know how to deal with SDP. I am streaming H.264/AAC in MP4. Second, I don't understand SDP, so how can I build the video-conference upstreaming part using this app? Android ----(UDP port 5006)----> PC (SDP file), and then Wowza reads the SDP file ------> VLC. I think this way the system cannot handle more than one client, since the SDP can only hold one port. Any ideas, or will it actually not work? Also, Wowza needs the stream to be set up before we stream to it, so does that mean I should not follow this approach? Sorry, my English is poor; I hope you guys understand.

    Read the article

  • Android: videocamera, limit length of videos taken

    - by AP257
    I'm working in Android and starting the video camera activity using ACTION_VIDEO_CAPTURE. Is there any way I can limit the length (in time) of the videos the user can take? I think this is possible if you use MediaRecorder, but I don't really fancy doing that since it's so much more complicated than using the simple ACTION_VIDEO_CAPTURE. Current code: Intent videoCaptureIntent = new Intent(MediaStore.ACTION_VIDEO_CAPTURE); startActivityForResult(videoCaptureIntent,1); If it's not possible, does anyone know whether I could set a timer (TimerTask?) in Java and then show a Toast message after a certain length of time warning the user that they need to stop filming? (I'm a Java newbie, so I don't know if this is exactly what I need.)
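
    The capture intent does accept a duration cap: MediaStore.EXTRA_DURATION_LIMIT, in seconds. Whether it is honored is up to the device's camera app, so treat the TimerTask idea as a fallback rather than the first resort. A sketch with an assumed 30-second limit:

        Intent videoCaptureIntent = new Intent(MediaStore.ACTION_VIDEO_CAPTURE);
        videoCaptureIntent.putExtra(MediaStore.EXTRA_DURATION_LIMIT, 30);  // seconds
        startActivityForResult(videoCaptureIntent, 1);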

    Read the article

  • AudioRecord doesn't work for Motorola Milestone

    - by hcpl
    I'm having this problem only on the Motorola Milestone. Code: // init recorder recordInstance = new AudioRecord(MediaRecorder.AudioSource.MIC, 8000, AudioFormat.CHANNEL_CONFIGURATION_MONO, AudioFormat.ENCODING_PCM_16BIT, 8000); recordInstance.startRecording(); //more code here recordInstance.stop(); The error information I have (I can't find more for the moment, since I don't have a Milestone myself for debugging): Uncaught handler: thread main exiting due to uncaught exception java.lang.IllegalStateException: stop() called on an uninitialized AudioRecord. at android.media.AudioRecord.stop(AudioRecord.java:516) Apparently I'm not the only one with this problem. Some very similar threads I found (without solutions): http://groups.google.com/group/android-developers/browse_thread/thread/6dd24aeb484b2e40 http://androidcommunity.com/forums/f2/problem-using-audiorecord-in-motorola-milestone-30935/ http://community.developer.motorola.com/t5/Android-App-Development-for/Problem-using-AudioRecord-on-Milestone-device/m-p/3889 http://www.fring.com/forums/showthread.php?t=16194
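
    The stack trace says the AudioRecord never left its uninitialized state, so stop() (and presumably startRecording()) is being called on a dead object; on the Milestone this parameter set, or the fixed 8000-byte buffer, is apparently rejected. Sizing the buffer from getMinBufferSize() and checking getState() before use avoids the crash and surfaces the real failure; a sketch:

        int minBuf = AudioRecord.getMinBufferSize(8000,
                AudioFormat.CHANNEL_CONFIGURATION_MONO, AudioFormat.ENCODING_PCM_16BIT);
        AudioRecord rec = new AudioRecord(MediaRecorder.AudioSource.MIC, 8000,
                AudioFormat.CHANNEL_CONFIGURATION_MONO, AudioFormat.ENCODING_PCM_16BIT,
                Math.max(minBuf, 8000));            // never below the device's minimum
        if (rec.getState() == AudioRecord.STATE_INITIALIZED) {
            rec.startRecording();
            // ... read samples ...
            rec.stop();
        } else {
            Log.e("Audio", "AudioRecord init failed; minBuf=" + minBuf);
        }
        rec.release();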

    Read the article

  • How could I make the SurfaceView and Button have a different orientation in an Activity

    - by ???
    I have a simple program for video recording. I use a SurfaceView to show the preview screen, and I also want to put some buttons on the screen. I put all the components mentioned above into an XML file, "ipcam.xml". I use MediaRecorder, SurfaceView, and SurfaceHolder to build this program. Because the preview screen does not orient correctly when I rotate my phone, I use "setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_LANDSCAPE);", but the buttons in the program are then always in the LANDSCAPE direction: they will not change direction automatically when I rotate my phone, even though the preview is OK! So is there any way to solve this problem? Thank you all very much in advance.
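
    With the activity pinned to landscape, the framework will never re-lay-out the views, so the buttons have to be counter-rotated by hand. One hedged approach is an OrientationEventListener that snaps the physical angle to the nearest quarter turn and rotates only the buttons; setRotation() needs API 11+, and recordButton below stands in for whatever buttons ipcam.xml declares:

        OrientationEventListener listener = new OrientationEventListener(this) {
            @Override
            public void onOrientationChanged(int degrees) {
                if (degrees == ORIENTATION_UNKNOWN) return;
                int quarter = ((degrees + 45) / 90 % 4) * 90;  // snap to 0/90/180/270
                recordButton.setRotation(-quarter);            // preview surface untouched
            }
        };
        listener.enable();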

    Read the article

  • Getting camera preview data without using a preview callback

    - by velocipedestrian
    I have an app that does some processing to camera preview frames before displaying them to the user. I'm currently using preview callbacks to access the image data, but the problem I have is that the onPreviewFrame() function stops getting called if you start recording video using a MediaRecorder, and I want the processing to continue when video is being recorded. I've tried doing the following: public static Bitmap convertViewToBitmap(View view) { Bitmap bitmap = Bitmap.createBitmap(view.getWidth(),view.getHeight(), Bitmap.Config.ARGB_8888); view.draw(new Canvas(bitmap)); return bitmap; } to convert the preview surface to a bitmap, but when I pass the preview SurfaceView to the function it returns an all-black bitmap (it works when I test it on normal views though). Is there any other way I can access the image data if preview callbacks are not available?

    Read the article

  • Microphone input

    - by George
    I'm trying to build a gadget that detects pistol shots using Android. It's part of a training aid for pistol shooters that tells how the shots are distributed in time, and I use an HTC Tattoo for testing. I use the MediaRecorder and its getMaxAmplitude method to get the highest amplitude during the last 1/100 s, but it does not work as expected; speech gives me values from getMaxAmplitude in the range from 0 to about 25000, while the pistol shots (or shouting!) only reach about 15000. With a sampling frequency of 8 kHz there should be some samples with a considerably higher level. Does anyone know how these things work? Are there filters that are applied before registering the max amplitude? If so, are they hardware or software? Thanks, /George
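
    getMaxAmplitude() sits at the end of the platform's recording chain, and on many handsets that path applies automatic gain control and limiting, which flattens a short impulse like a shot more than sustained speech; that explanation is a strong suspicion rather than documented fact. Reading raw PCM with AudioRecord sidesteps the recorder pipeline and lets you compute the peak yourself; a sketch:

        int bufSize = AudioRecord.getMinBufferSize(8000,
                AudioFormat.CHANNEL_CONFIGURATION_MONO, AudioFormat.ENCODING_PCM_16BIT);
        AudioRecord rec = new AudioRecord(MediaRecorder.AudioSource.MIC, 8000,
                AudioFormat.CHANNEL_CONFIGURATION_MONO, AudioFormat.ENCODING_PCM_16BIT, bufSize);
        short[] samples = new short[80];            // 10 ms at 8 kHz
        rec.startRecording();
        int read = rec.read(samples, 0, samples.length);
        int max = 0;
        for (int i = 0; i < read; i++) {
            max = Math.max(max, Math.abs(samples[i]));  // your own peak measurement
        }
        rec.stop();
        rec.release();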

    Read the article

  • Problem using AudioRecord with 8-bit encoding in Android

    - by maxsap
    Hello, I have made an application that records from the phone's microphone using AudioRecord and 16-bit encoding, and I am able to play back the recording. For compatibility reasons I need to use 8-bit encoding, but when I try to run the same program using that encoding I keep getting an "Invalid Audio Format" error. My code is: int bufferSize = AudioRecord.getMinBufferSize(11025, AudioFormat.CHANNEL_CONFIGURATION_MONO, AudioFormat.ENCODING_PCM_8BIT); AudioRecord recordInstance = new AudioRecord( MediaRecorder.AudioSource.MIC, 11025, AudioFormat.CHANNEL_CONFIGURATION_MONO, AudioFormat.ENCODING_PCM_8BIT, bufferSize); Does anyone know what the problem is? According to the documentation, AudioRecord is capable of 8-bit encoding. Thanks in advance, maxsap.
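
    Many devices implement only 16-bit capture even though the API defines ENCODING_PCM_8BIT, so the constructor rejecting the format is most likely a hardware/driver limitation rather than a bug in this code. If the output must be 8-bit, one workaround is to record 16-bit and downconvert; a sketch producing unsigned 8-bit (the WAV convention):

        // Keep the high byte of each signed 16-bit sample, re-centered around 128.
        byte[] to8Bit(short[] pcm16, int len) {
            byte[] pcm8 = new byte[len];
            for (int i = 0; i < len; i++) {
                pcm8[i] = (byte) ((pcm16[i] >> 8) + 128);
            }
            return pcm8;
        }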

    Read the article

  • How to continuously send data without blocking?

    - by Donal Rafferty
    I am trying to send RTP audio data from my Android application. I can currently send one RTP packet with the code below, and I also have another class that extends Thread that listens for and receives RTP packets. My question is: how do I continuously send my updated buffer through the packet payload without blocking the receiving thread? public void run() { isRecording = true; android.os.Process.setThreadPriority (android.os.Process.THREAD_PRIORITY_URGENT_AUDIO); int buffersize = AudioRecord.getMinBufferSize(8000, AudioFormat.CHANNEL_CONFIGURATION_MONO, AudioFormat.ENCODING_PCM_16BIT); Log.d("BUFFERSIZE","Buffer size = " + buffersize); arec = new AudioRecord(MediaRecorder.AudioSource.MIC, 8000, AudioFormat.CHANNEL_CONFIGURATION_MONO, AudioFormat.ENCODING_PCM_16BIT, buffersize); short[] readBuffer = new short[80]; byte[] buffer = new byte[160]; arec.startRecording(); while(arec.getRecordingState() == AudioRecord.RECORDSTATE_RECORDING){ int frames = arec.read(readBuffer, 0, 80); @SuppressWarnings("unused") int lengthInBytes = codec.encode(readBuffer, 0, buffer, frames); RtpPacket rtpPacket = new RtpPacket(); rtpPacket.setV(2); rtpPacket.setX(0); rtpPacket.setM(0); rtpPacket.setPT(0); rtpPacket.setSSRC(123342345); rtpPacket.setPayload(buffer, 160); try { rtpSession2.sendRtpPacket(rtpPacket); } catch (UnknownHostException e) { // TODO Auto-generated catch block e.printStackTrace(); } catch (RtpException e) { // TODO Auto-generated catch block e.printStackTrace(); } catch (IOException e) { // TODO Auto-generated catch block e.printStackTrace(); } } } So when I send on one device and receive on another I get decent audio, but when I send and receive on both I get broken sound, like it's taking turns sending and receiving audio. I have a feeling it could be to do with the while loop? It could be looping around in there and not letting anything else run?
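
    One way to keep the microphone read and the network send from gating each other is a producer/consumer split: the record loop only reads and enqueues, and a dedicated sender thread drains the queue. A minimal sketch with a LinkedBlockingQueue (the RtpPacket header setup from the question is elided to a comment):

        final BlockingQueue<byte[]> outbox = new LinkedBlockingQueue<byte[]>();

        // Producer: inside the existing record loop, replace the send with a hand-off.
        byte[] payload = new byte[160];             // fresh array per packet
        codec.encode(readBuffer, 0, payload, frames);
        outbox.offer(payload);                      // never blocks the audio read

        // Consumer: a separate thread owns the RTP session.
        new Thread(new Runnable() {
            public void run() {
                try {
                    while (true) {
                        byte[] p = outbox.take();   // waits for work
                        RtpPacket pkt = new RtpPacket();
                        // ... set V/X/M/PT/SSRC as in the question ...
                        pkt.setPayload(p, p.length);
                        rtpSession2.sendRtpPacket(pkt);
                    }
                } catch (Exception e) {
                    Log.e("RTP", "sender stopped", e);
                }
            }
        }).start();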

    Read the article

1 2  | Next Page >