Search Results

Search found 16362 results on 655 pages for 'audio interface'.

Page 30/655

  • Streaming Audio over UDP to Android

    - by Mr. Pig
    Is it possible to have Android (perhaps via MediaPlayer or a different existing class) accept media streams over UDP? I've successfully had MediaPlayer connect to an HTTP stream (as well as to static files hosted on an HTTP server), but I'm wondering how one would go about accepting a stream from a UDP source. I've seen this, and I suppose a similar solution (where I download the stream via an independent UDP socket and then move the data to a memory buffer that I then pass to MediaPlayer) is an option, but I'm curious whether a method already exists in the SDK, and if it does not, what other options I have. Thanks

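    As far as I know, nothing in the SDK plays directly from a raw UDP socket, although MediaPlayer does accept rtsp:// URLs, which carry the media over RTP/UDP. Below is a minimal sketch of the buffering workaround the question describes; the class name, port, and file path are illustrative, not from the SDK:

        import java.io.FileOutputStream;
        import java.net.DatagramPacket;
        import java.net.DatagramSocket;

        // Hypothetical helper: spool raw UDP payloads to a local file that a
        // MediaPlayer can be pointed at once enough data has arrived.
        public class UdpSpooler extends Thread {

            private final int port;
            private final String outPath;
            private volatile boolean running = true;

            public UdpSpooler(int port, String outPath) {
                this.port = port;
                this.outPath = outPath;
            }

            public void run() {
                DatagramSocket socket = null;
                FileOutputStream out = null;
                try {
                    socket = new DatagramSocket(port);
                    out = new FileOutputStream(outPath);
                    byte[] buf = new byte[2048];
                    while (running) {
                        DatagramPacket packet = new DatagramPacket(buf, buf.length);
                        socket.receive(packet);                // block for the next datagram
                        out.write(buf, 0, packet.getLength()); // append the payload to the spool file
                    }
                } catch (Exception e) {
                    e.printStackTrace();
                } finally {
                    try { if (out != null) out.close(); } catch (Exception ignored) {}
                    if (socket != null) socket.close();
                }
            }

            public void shutdown() { running = false; }
        }

    Once enough data has been spooled, MediaPlayer.setDataSource(outPath) can pick the file up; for a genuinely live feed, handing MediaPlayer an rtsp:// URL avoids the spooling entirely.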

  • UIViewController programmatically vs Interface Builder

    - by alexey
    I have a custom UIViewController and a corresponding view in a nib file. The view is added to the UIWindow directly:

        [window addSubview:customViewController.view];

    The sizes of the window and the view are the defaults (480x320 and 460x320, respectively). When I create CustomViewController inside the nib file and check "Resize View From NIB" in IB's Attributes tab, everything works just fine. But when I create CustomViewController programmatically with an initWithNibName: message, the view is not positioned on the window correctly: there is an empty stripe at the bottom, 20px high. I see it's because of the status bar offset; IB handles that with "Resize View From NIB". How do I emulate that programmatically?

  • Audio recording and playback in Silverlight

    - by Ramesh
    I have a Silverlight 4 application that records the user's voice through the mic. Now, as soon as the recording is completed, I need to play the recorded voice back to the user before posting it to the server. Is it at all possible to play it back to the user without getting into format conversions etc.? Any ideas are welcome. Thanks!

  • Image not showing in UIImageView in Interface Builder / iPhone

    - by dbonneville
    I have a UIView with a UIImageView dragged onto the view. All of a sudden, for all my xibs, the image no longer shows up: there is a blue X. However, when the project builds, the image is there. At one point, I deleted and regenerated all my images and moved some into a subfolder in Xcode. Normally, when you go to select an image for a UIImageView, IB allows you to pick from any image in the project. But I can't see any of the images I put in that folder in the dropdown anymore. All I see in the dropdown in the Inspector is the one image I want, which is also the one that is not showing up. And as I said, if I build for the device or simulator, it all works; some cache or something is screwed up somewhere. Everything builds with no errors or warnings, and I have cleared the caches and rebuilt. But I still can't see any other images, and IB still thinks it's missing the image that is clearly selected in the dropdown. So how do I get Xcode and IB back on track, seeing the assets they should properly see in the XIBs?

  • Core-audio - constructing an AudioBufferList struct (Q about c struct definition)

    - by mustISignUp
    The definition of AudioBufferList looks weird to me… I guess my C is not so good:

        struct AudioBufferList {
            UInt32 mNumberBuffers;
            AudioBuffer mBuffers[kVariableLengthArray];
        };
        typedef struct AudioBufferList AudioBufferList;

    Why AudioBuffer mBuffers[kVariableLengthArray]; and not AudioBuffer *mBuffers;? kVariableLengthArray appears to be == 1. Eh? I think I have it working, but would appreciate it if anyone could set me straight.

  • Android - Getting audio to play through earpiece

    - by Donal Rafferty
    I currently have code that records from the device's mic using the AudioRecord class and then plays it back out using the AudioTrack class. My problem is that when I play it back, it plays via the speakerphone. I want it to play out via the earpiece on the device. Here is my code:

        public class LoopProg extends Activity {

            boolean isRecording; // currently not used
            AudioManager am;
            int count = 0;

            /** Called when the activity is first created. */
            @Override
            public void onCreate(Bundle savedInstanceState) {
                super.onCreate(savedInstanceState);
                setContentView(R.layout.main);
                am = (AudioManager) getSystemService(Context.AUDIO_SERVICE);
                am.setMicrophoneMute(true);
                while (count <= 1000000) {
                    Record record = new Record();
                    record.run();
                    count++;
                    Log.d("COUNT", "Count is : " + count);
                }
            }

            public class Record extends Thread {

                static final int bufferSize = 200000;
                final short[] buffer = new short[bufferSize];
                short[] readBuffer = new short[bufferSize];

                public void run() {
                    isRecording = true;
                    android.os.Process.setThreadPriority(android.os.Process.THREAD_PRIORITY_URGENT_AUDIO);

                    int buffersize = AudioRecord.getMinBufferSize(11025,
                            AudioFormat.CHANNEL_CONFIGURATION_MONO,
                            AudioFormat.ENCODING_PCM_16BIT);

                    AudioRecord arec = new AudioRecord(MediaRecorder.AudioSource.MIC,
                            11025, AudioFormat.CHANNEL_CONFIGURATION_MONO,
                            AudioFormat.ENCODING_PCM_16BIT, buffersize);

                    AudioTrack atrack = new AudioTrack(AudioManager.STREAM_MUSIC,
                            11025, AudioFormat.CHANNEL_CONFIGURATION_MONO,
                            AudioFormat.ENCODING_PCM_16BIT, buffersize,
                            AudioTrack.MODE_STREAM);

                    am.setRouting(AudioManager.MODE_NORMAL, 1, AudioManager.STREAM_MUSIC);
                    int ok = am.getRouting(AudioManager.ROUTE_EARPIECE);
                    Log.d("ROUTING", "getRouting = " + ok);

                    setVolumeControlStream(AudioManager.STREAM_VOICE_CALL);

                    // am.setSpeakerphoneOn(true);
                    Log.d("SPEAKERPHONE", "Is speakerphone on? : " + am.isSpeakerphoneOn());
                    am.setSpeakerphoneOn(false);
                    Log.d("SPEAKERPHONE", "Is speakerphone on? : " + am.isSpeakerphoneOn());

                    atrack.setPlaybackRate(11025);

                    byte[] buffer = new byte[buffersize];
                    arec.startRecording();
                    atrack.play();

                    while (isRecording) {
                        arec.read(buffer, 0, buffersize);
                        atrack.write(buffer, 0, buffer.length);
                    }

                    arec.stop();
                    atrack.stop();
                    isRecording = false;
                }
            }
        }

    As you can see from the code, I have tried using the AudioManager class and its methods, including the deprecated setRouting method, and nothing works: the setSpeakerphoneOn method seems to have no effect at all, and neither does the routing call. Has anyone got any ideas on how to get playback via the earpiece instead of the speakerphone?

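    A hedged sketch of the approach that usually replaces the deprecated setRouting(): put the AudioManager into in-call mode and play on the voice-call stream, which most handsets route to the earpiece. This needs the MODIFY_AUDIO_SETTINGS permission, and the behaviour is device-dependent; the sample rate and formats below just mirror the question:

        // Sketch, not from the original post: route playback to the earpiece.
        // Call from an Activity; requires android.permission.MODIFY_AUDIO_SETTINGS.
        private AudioTrack createEarpieceTrack() {
            AudioManager am = (AudioManager) getSystemService(Context.AUDIO_SERVICE);
            am.setMode(AudioManager.MODE_IN_CALL); // many devices route audio to the earpiece in this mode
            am.setSpeakerphoneOn(false);

            int buffersize = AudioRecord.getMinBufferSize(11025,
                    AudioFormat.CHANNEL_CONFIGURATION_MONO,
                    AudioFormat.ENCODING_PCM_16BIT);

            return new AudioTrack(
                    AudioManager.STREAM_VOICE_CALL, // voice-call stream, not STREAM_MUSIC
                    11025,
                    AudioFormat.CHANNEL_CONFIGURATION_MONO,
                    AudioFormat.ENCODING_PCM_16BIT,
                    buffersize,
                    AudioTrack.MODE_STREAM);
        }
        // Write PCM to the returned track as before, and restore
        // am.setMode(AudioManager.MODE_NORMAL) when playback finishes.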

  • Fluent interface design and code smell

    - by Jiho Han
        public class StepClause {
            public NamedStepClause Action1() {}
            public NamedStepClause Action2() {}
        }

        public class NamedStepClause : StepClause {
            public StepClause Step(string name) {}
        }

    Basically, I want to be able to do something like this:

        var workflow = new Workflow().Configure()
            .Action1()
            .Step("abc").Action2()
            .Action2()
            .Step("def").Action1();

    So, some "steps" are named and some are not. The thing I do not like is that StepClause has knowledge of its derived class, NamedStepClause. I tried a couple of things to make this sit better with me. I tried to move things out to interfaces, but then the problem just moved from the concrete classes to the interfaces: INamedStepClause still needs to derive from IStepClause, and IStepClause needs to return INamedStepClause to be able to call Step(). I could also make Step() part of a completely separate type. Then we do not have this problem, and we'd have:

        var workflow = new Workflow().Configure()
            .Step().Action1()
            .Step("abc").Action2()
            .Step().Action2()
            .Step("def").Action1();

    which is OK, but I'd like to make the step-naming optional if possible. I found this other post on SO here, which looks interesting and promising. What are your opinions? Is the original solution completely unacceptable, or is it workable? By the way, those action methods will take predicates and functors, and I don't think I want to take an additional parameter for naming the step there. The point of it all, for me, is to define these action methods in one place and one place only. So the solutions from the referenced link, using generics and extension methods, seem to be the best approaches so far.

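    For illustration, a Java transliteration of the generics idea from that link, sketched with invented names: a self-typed base declares the actions once, so the base type never references a derived type, and step-naming stays optional:

        // Sketch only: actions are declared once, on a self-typed interface,
        // and still return the concrete fluent type.
        interface StepActions<SELF extends StepActions<SELF>> {
            SELF self();
            default SELF action1() { /* record action 1 */ return self(); }
            default SELF action2() { /* record action 2 */ return self(); }
        }

        final class Workflow implements StepActions<Workflow> {
            @Override public Workflow self() { return this; }

            // Naming is optional: both overloads return the same fluent type.
            Workflow step() { return this; }
            Workflow step(String name) { /* record the step name */ return this; }
        }

        class Demo {
            public static void main(String[] args) {
                new Workflow()
                        .action1()
                        .step("abc").action2()
                        .action2()
                        .step("def").action1();
            }
        }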

  • Finding out estimated duration of a stream using Core Audio

    - by Reflog
    I am streaming an MP3 over the network using custom feeding code, not AVAudioPlayer (which only works with URLs), via APIs like AudioFileStreamOpen etc. Is there any way to estimate the length of the stream? I know that I can get the 'elapsed' time using:

        if (AudioQueueGetCurrentTime(queue.audioQueue, NULL, &t, &b) < 0)
            return 0;
        return t.mSampleTime / dataFormat.mSampleRate;

    But what about the total duration, to drive a progress bar? Is that possible?

  • How to get the actual type of a derived class from its parent interface

    - by Tarik
    Hello people, let's say we have a code portion like this:

        IProduct product = ProductCreator.CreateProduct(); // factory method we have here
        SellThisProduct(product);
        //...
        private void SellThisProduct(IProduct product)
        {
            //..do something here
        }
        //...
        internal class Soda : IProduct {}
        internal class Book : IProduct {}

    How can I infer which product is actually passed into the SellThisProduct() method? I think if I say GetType() or something, it will probably return the IProduct type. Thanks...

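    For what it's worth, GetType() returns the runtime type of the instance (Soda or Book), not IProduct; C#'s is/as operators do conditional checks against it. A small Java analogue of the same idea, with hypothetical names mirroring the post:

        interface IProduct {}
        class Soda implements IProduct {}
        class Book implements IProduct {}

        public class RuntimeTypeDemo {
            public static void main(String[] args) {
                IProduct product = new Soda();

                // getClass() ignores the static type of the reference and reports
                // the class of the actual object, like GetType() in C#.
                System.out.println(product.getClass().getSimpleName()); // prints "Soda"

                // instanceof (C#: "is") checks the runtime type conditionally.
                if (product instanceof Soda) {
                    System.out.println("Selling a soda");
                }
            }
        }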

  • Interface Builder: Resize View From NIB

    - by alexey
    I have a custom UIViewController and a corresponding view in a nib file. The view is added to the UIWindow directly:

        [window addSubview:customViewController.view];

    The sizes of the window and the view are the defaults (480x320 and 460x320, respectively). When I create CustomViewController inside the nib file and check "Resize View From NIB" in IB's Attributes tab, everything works just fine. But when I create CustomViewController programmatically with an initWithNibName: message, the view is not positioned on the window correctly: there is an empty stripe at the bottom, 20px high. I see it's because of the status bar offset; IB handles that with "Resize View From NIB". How do I emulate that programmatically? It seems that IB uses some custom subclass of UIViewController. So the question: how is "Resize View From NIB" implemented there?

  • Can't change UITableViewCell size in Interface Builder

    - by Ondrej
    Does anyone have the same problem as I do? I've upgraded to the iPhone SDK 3.2 and I am unable to resize a UITableViewCell object in my XIB file. (Usually I've done it by just resizing the view, but now the cell keeps the same size and there is just a grey area around it.) By the way, I've tried to reinstall twice, including one deep reinstall.

  • How to change to a grouped table view in Xcode without using Interface Builder

    - by Dave
    I have a table that I created within Xcode, so there is no nib file in this case. I want to make my table use the 'Grouped' style, but I'm not sure how. I think it has something to do with the method below; the problem is I'm not really sure how to call it. I do understand how methods work, I'm just not too sure where to start with this one:

        - (id)initWithFrame:(CGRect)frame style:(UITableViewStyle)UITableViewStyleGrouped

    So could someone tell me how to call it? The problem is that it's not a method I wrote but a built-in one, so I could put that line into my header file, but how would I use it in my implementation file? Thanks guys,

  • iPhone Orientation Relayout From Single Column to Double Column

    - by kkrizka
    I am trying to create a UIView in Interface Builder that shows the user two boxes containing some text. This UIView should support both landscape and portrait orientations. When in portrait orientation, the two boxes should be centered horizontally and sit under each other, like in the picture below. But when in landscape orientation, it should show the two boxes centered vertically and side by side, like in the picture below that. Is this possible using only the autosizing options (or any other IB options), or do I have to relayout the view in code on orientation-change events? I would prefer using only IB. I tried locking the top and left margins of the top box and locking the bottom and right margins of the bottom box, but the problem is that for this to work I would also need to shrink the two boxes as the device rotates from portrait to landscape, because otherwise they would overlap.

  • Processing an audio WAV file with C

    - by sa125
    Hi - I'm working on scaling the amplitude of a wav file by some decimal factor. I'm trying to wrap my head around how to read and re-write the file in a memory-efficient way, while also tackling the nuances of the language (I'm new to C). The file can be in either an 8- or 16-bit format. The way I thought of doing this is to first read the header data into a pre-defined struct, and then process the actual data in a loop, where I read a chunk into a buffer, do whatever is needed to it, and then write it to the output.

        #include <stdio.h>
        #include <stdlib.h>

        typedef struct header {
            char chunk_id[4];
            int chunk_size;
            char format[4];
            char subchunk1_id[4];
            int subchunk1_size;
            short int audio_format;
            short int num_channels;
            int sample_rate;
            int byte_rate;
            short int block_align;
            short int bits_per_sample;
            short int extra_param_size;
            char subchunk2_id[4];
            int subchunk2_size;
        } header;

        typedef struct header* header_p;

        void scale_wav_file(char * input, float factor, int is_8bit) {
            FILE * infile = fopen(input, "rb");
            FILE * outfile = fopen("outfile.wav", "wb");
            int BUFSIZE = 4000, i, MAX_8BIT_AMP = 255, MAX_16BIT_AMP = 32678;

            // used for processing an 8-bit file
            unsigned char inbuff8[BUFSIZE], outbuff8[BUFSIZE];
            // used for processing a 16-bit file
            short int inbuff16[BUFSIZE], outbuff16[BUFSIZE];

            // header_p points to a header struct that contains the file's metadata fields
            header_p meta = (header_p)malloc(sizeof(header));

            if (infile) {
                // read and write header data
                fread(meta, 1, sizeof(header), infile);
                fwrite(meta, 1, sizeof(meta), outfile);

                while (!feof(infile)) {
                    if (is_8bit) {
                        fread(inbuff8, 1, BUFSIZE, infile);
                    } else {
                        fread(inbuff16, 1, BUFSIZE, infile);
                    }

                    // scale amplitude for 8/16 bits
                    for (i = 0; i < BUFSIZE; ++i) {
                        if (is_8bit) {
                            outbuff8[i] = factor * inbuff8[i];
                            if ((int)outbuff8[i] > MAX_8BIT_AMP) {
                                outbuff8[i] = MAX_8BIT_AMP;
                            }
                        } else {
                            outbuff16[i] = factor * inbuff16[i];
                            if ((int)outbuff16[i] > MAX_16BIT_AMP) {
                                outbuff16[i] = MAX_16BIT_AMP;
                            } else if ((int)outbuff16[i] < -MAX_16BIT_AMP) {
                                outbuff16[i] = -MAX_16BIT_AMP;
                            }
                        }
                    }

                    // write to the output file for 8/16 bit
                    if (is_8bit) {
                        fwrite(outbuff8, 1, BUFSIZE, outfile);
                    } else {
                        fwrite(outbuff16, 1, BUFSIZE, outfile);
                    }
                }
            }

            // cleanup
            if (infile) { fclose(infile); }
            if (outfile) { fclose(outfile); }
            if (meta) { free(meta); }
        }

        int main (int argc, char const *argv[]) {
            char infile[] = "file.wav";
            float factor = 0.5;
            scale_wav_file(infile, factor, 0);
            return 0;
        }

    I'm getting differing file sizes at the end (by 1k or so, for a 40Mb file), and I suspect this is because I write an entire buffer to the output even if the file terminated before filling the whole buffer. Also, the output file is messed up: it won't play or open. So I'm probably doing the whole thing wrong. Any tips on where I'm messing up will be great. Thanks!

  • iPhone SDK: Interface Builder label font, only shows when editing label

    - by Nic Hubbard
    I have tried this on a few installations of the 3.1.3 SDK. When I add a label to my view, I would like to change the font to something like Futura. I know how to change the font, but for some reason the change does not show: ONLY when I edit the label by double-clicking do I see my new font, and that is the only time I get to see it. Why does this happen? How can I change the font of my labels and have it show up? Why would I care to have the font changed only while editing the label?!

  • Audio processing using Java

    - by Sukhhhh
    We have a requirement to convert .wav files to .mp3, and we are currently using the "Tritonus" library to do that. The concern with that library is that it requires "installation" of some "dll" files on the class path. I am wondering: are there any APIs that allow this processing without a local installation? The other question is: will having the files in MP3 format make it easier to join them into a single file than having .wav files?

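    For the decoding half, the JDK's own javax.sound.sampled API reads WAV data with nothing to install; only the MP3 encoding step still needs a third-party (ideally pure-Java) encoder. A minimal sketch, with an illustrative file name:

        import java.io.ByteArrayOutputStream;
        import java.io.File;
        import javax.sound.sampled.AudioFormat;
        import javax.sound.sampled.AudioInputStream;
        import javax.sound.sampled.AudioSystem;

        public class WavToPcm {
            public static void main(String[] args) throws Exception {
                AudioInputStream in = AudioSystem.getAudioInputStream(new File("input.wav"));
                AudioFormat fmt = in.getFormat();
                System.out.println(fmt.getSampleRate() + " Hz, "
                        + fmt.getChannels() + " channel(s), "
                        + fmt.getSampleSizeInBits() + "-bit");

                // Drain the raw PCM frames; this is what an MP3 encoder would consume.
                ByteArrayOutputStream pcm = new ByteArrayOutputStream();
                byte[] buf = new byte[4096];
                int n;
                while ((n = in.read(buf)) != -1) {
                    pcm.write(buf, 0, n);
                }
                in.close();
                System.out.println("Read " + pcm.size() + " bytes of PCM");
            }
        }

    On joining: WAV files cannot simply be concatenated, because the RIFF header encodes the data length, whereas MP3 frames can often be appended directly, so MP3 is usually the easier format to join (gapless results still need care with encoder padding).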

  • Play multiple audio files using AVAudioPlayer

    - by inScript09
    Hi all, I am planning on releasing 10 of my song recordings for free, bundled in an iPhone app. They are not available on the web or iTunes or anywhere else as of now. I am new to the iPhone SDK (latest), as you can imagine, so I have been going through the developer documentation, various forums, and Stack Overflow to learn. Apple's avTouch sample application was a great start, but I want my app to play all 10 tracks one by one. All the songs are added to the resources folder and are named track1, track2, ... track10. In the avTouch app code I can see the following two parts, which are where I think I need to make changes to achieve what I am looking for. But I am lost:

        // Load the array with the sample file
        NSURL *fileURL = [[NSURL alloc] initFileURLWithPath:
            [[NSBundle mainBundle] pathForResource:@"sample" ofType:@"m4a"]];

        - (void)audioPlayerDidFinishPlaying:(AVAudioPlayer *)player successfully:(BOOL)flag
        {
            if (flag == NO)
                NSLog(@"Playback finished unsuccessfully");
            [player setCurrentTime:0.];
            [self updateViewForPlayerState];
        }

    Can anyone please help me with: 1. how to load the array with all 10 tracks from the resources folder; and 2. how, when I hit play, the player should start the first track, then start the 2nd track when the 1st ends, and so on for the remaining tracks. Thank You

  • Set an Interface Builder created element's state programmatically

    - by mvexel
    I have a couple of UISwitch elements in a view controller that is presented modally in my iPhone app. I set up the view in IB. I want these UISwitch elements to reflect the current values in my [NSUserDefaults standardUserDefaults], where I store the appropriate BOOLs. I thought this would do the trick for setting the switches to the right state, but no:

        - (void)viewWillAppear:(BOOL)animated {
            NSUserDefaults *defaults = [NSUserDefaults standardUserDefaults];
            [backgroundSwitch setOn:[defaults boolForKey:kTrackInBackgroundKey] animated:NO];
            [batterySaveSwitch setOn:[defaults boolForKey:kBatterySaveKey] animated:NO];
        }

    The backgroundSwitch and batterySaveSwitch are declared as properties for the view controller class. They are not initialized; that does not seem to make a difference. I did check the values coming out of the NSUserDefaults dictionary. The method is being called at the right time.

  • Align UITextFields one below the other in Interface Builder

    - by Dave
    How do I align two text fields one below the other in a toolbar, and display a button on the left (or right) side, vertically centered against the two fields? Please see this image to know what I am talking about: http://developer.apple.com/iphone/library/documentation/userexperience/conceptual/mobilehig/art/ui_textfields.jpg

  • Interface Builder warnings

    - by Biranchi
    Hi all, I am getting a warning while building my source code:

        /* com.apple.ibtool.document.warnings */
        /Users/biranchi/Desktop/Hotlist v2.0/Classes/HLCheckinViewController.xib:6: warning: The separator style "Single Line Etched" is not supported on iPhone OS versions prior to 3.2.

    What is this warning due to? Thanks

  • BlackBerry: implement an audio player

    - by Prasad
    Hi, I am developing an application that lets users listen to songs online, and I used the BlackBerry Player and Manager APIs. My application works fine and I can play songs. Now I want to add more controls to it. For example, I want to pause and resume songs, mute the sound, control the volume, display the playback progress, and display the current time position of the song. I started researching that, and I tried to do it with a PlayerListener, but unfortunately I keep getting an IllegalStateException, so I can't go ahead with that research. As a help, can someone please tell me how I can implement the above kinds of controls for a player? I'd appreciate it if someone could post sample code. Further, I will put my playback source code here:

        public void run() {
            try {
                p = Manager.createPlayer(requestedSong + SystemSettings.strNetwork);
                p.setLoopCount(1);
                p.start();
            } catch (IOException ioe) {
            } catch (MediaException me) {
            }
        }

        public void run() {
            try {
                p = Manager.createPlayer(strSongURL);
                p.setLoopCount(1);
                p.start();
            } catch (IOException ioe) {
            } catch (MediaException me) {
            }
        }

    Thank you very much. Prasad

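    A hedged sketch of the likely fix: MMAPI throws IllegalStateException if getControl() is called while the Player is still UNREALIZED, so call realize() first and only then fetch the controls. The URL and volume level here are illustrative:

        import javax.microedition.media.Manager;
        import javax.microedition.media.Player;
        import javax.microedition.media.PlayerListener;
        import javax.microedition.media.control.VolumeControl;

        public final class PlaybackHelper {

            public static Player playWithControls(String url) throws Exception {
                Player p = Manager.createPlayer(url);

                // Lifecycle events (STARTED, STOPPED, END_OF_MEDIA, ...) can
                // drive a progress display.
                p.addPlayerListener(new PlayerListener() {
                    public void playerUpdate(Player player, String event, Object data) {
                        // update the UI from here
                    }
                });

                p.realize(); // controls are only available once the player is REALIZED

                VolumeControl vol = (VolumeControl) p.getControl("VolumeControl");
                if (vol != null) {
                    vol.setLevel(50);     // volume on a 0-100 scale
                    // vol.setMute(true); // mute
                }

                p.start(); // pause later with p.stop(), resume with p.start()
                // Position: p.getMediaTime(); total length: p.getDuration(), both in microseconds.
                return p;
            }
        }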
