Search Results

Search found 5071 results on 203 pages for 'audio zoom'.


  • What do (Clip) and DataLine.Info represent?

    - by user528050
    I got this code from one of my friends:

        import java.io.*;
        import javax.sound.sampled.*;

        public class xx {
            public static void main(String args[]) {
                try {
                    File f = new File("mm.wav");
                    AudioInputStream a = AudioSystem.getAudioInputStream(f);
                    AudioFormat au = a.getFormat();
                    DataLine.Info di = new DataLine.Info(Clip.class, au);
                    Clip c = (Clip) AudioSystem.getLine(di);
                    c.open(a);
                    c.start();
                } catch (Exception e) {
                    System.out.println("Exception caught ");
                }
            }
        }

    But I don't understand what this line means: Clip c = (Clip) AudioSystem.getLine(di); What does the (Clip) represent? And my second problem: what is DataLine, is it an interface, and what is the meaning of the expression DataLine.Info?
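
    What follows is a minimal sketch of what those two constructs do, assuming only the standard javax.sound.sampled API already used above: DataLine.Info describes the kind of line being requested (a Clip that can handle the given AudioFormat), AudioSystem.getLine() returns it typed as the generic Line interface, and the (Clip) cast narrows it back to the Clip interface that was asked for. The isLineSupported guard and the drain call are additions for illustration, not part of the original code.

        import java.io.File;
        import javax.sound.sampled.*;

        public class ClipInfoSketch {
            public static void main(String[] args) throws Exception {
                AudioInputStream in = AudioSystem.getAudioInputStream(new File("mm.wav"));
                AudioFormat format = in.getFormat();

                // DataLine.Info bundles "what kind of line" (Clip.class)
                // with "what format it must handle" (format).
                DataLine.Info info = new DataLine.Info(Clip.class, format);

                // Optional guard: ask whether such a line is available at all.
                if (!AudioSystem.isLineSupported(info)) {
                    System.err.println("No Clip available for " + format);
                    return;
                }

                // getLine() is declared to return Line; the cast narrows it to Clip,
                // which is safe because the Info object requested Clip.class.
                Clip clip = (Clip) AudioSystem.getLine(info);
                clip.open(in);
                clip.start();
                clip.drain();  // wait for playback to finish before exiting
                clip.close();
            }
        }

    DataLine itself is indeed an interface (a Line that carries audio data); Clip extends it, and DataLine.Info is the nested class used to request such a line from AudioSystem.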

    Read the article

  • UITableView with Shake and play.

    - by avural79
    Hi all. I am trying to make a musical app for iPhone. The app is simple: there are a couple of musical note samples (CAF files). When the user taps predefined positions on a UIView (like strings), the app plays the note sample and adds a string value describing the note to an NSMutableArray. The played notes are listed in a table. Now I want to add a shake-and-play mode to the app: when the user shakes the iPhone, the recorded notes play back from the first record to the last and loop again. Also, if the user shakes the iPhone harder, the notes should play faster. How can I do that? Any idea? Thanks
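
    The shake-to-play idea itself is platform-independent: sample the accelerometer, compare the magnitude of the acceleration vector against a threshold to detect a shake, and map how far the magnitude exceeds the threshold onto the playback tempo. Below is a rough sketch of that logic in Java; every class and method name in it is invented for the illustration, and on iOS the same idea would live in a CMMotionManager or UIAccelerometerDelegate callback with real playback calls instead of the print statement.

        import java.util.List;

        // Illustration of shake-to-play logic; only the thresholding idea carries over.
        public class ShakePlayer {
            private static final double SHAKE_THRESHOLD = 2.0;   // in g-units, tuned by experiment
            private static final double BASE_INTERVAL_MS = 500;  // gap between notes at a gentle shake

            private final List<String> recordedNotes;

            public ShakePlayer(List<String> recordedNotes) {
                this.recordedNotes = recordedNotes;
            }

            /** Called with each accelerometer sample (x, y, z in g-units). */
            public void onAccelerometerSample(double x, double y, double z) {
                double magnitude = Math.sqrt(x * x + y * y + z * z);
                if (magnitude > SHAKE_THRESHOLD) {
                    // Harder shake -> larger magnitude -> shorter gap -> faster playback.
                    double intervalMs = BASE_INTERVAL_MS / (magnitude / SHAKE_THRESHOLD);
                    playLoop(intervalMs);
                }
            }

            private void playLoop(double intervalMs) {
                // Play recordedNotes from first to last, then loop again,
                // waiting intervalMs between notes; actual playback is platform-specific.
                for (String note : recordedNotes) {
                    System.out.println("play " + note + " then wait " + (long) intervalMs + " ms");
                }
            }
        }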

    Read the article

  • Preinitialize BackgroundAudioPlayer in WP7?

    - by kgrevehagen
    When I am using the BackgroundAudioPlayer in my Windows Phone 7 application, it takes a long time to load the first time I want to play a song. Is there any way of pre-initializing the BackgroundAudioPlayer before playing the first track, so that when I start playing, it starts right away? I have googled it, but no luck. I am just using BackgroundAudioPlayer.Instance when I, for example, want to play, pause, or stop an audio track. Is there something else I could do to fix this? Thanks in advance!

    Read the article

  • Drawing a waveform in C#

    - by user488792
    Hi! I want to be able to display a waveform in C#, along with some simple features such as zooming and selection. I already have the data as a short[] of amplitude values. However, I am an amateur when it comes to hand-coding a GUI. I have already found a possible helper class, WaveFormClass, that may help me achieve this, but as a backup I want to learn how to do it manually. So may I ask for some methods, and possibly some links, that will help? Thanks!
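
    The usual rendering approach is language-agnostic: for each horizontal pixel, take the slice of samples that falls under that pixel and draw a vertical line from the slice's minimum to its maximum; zooming just changes how many samples map onto one pixel, and a selection is a highlighted range of columns. Here is a sketch of that reduction step in Java rather than C# (the class and method names are invented for the illustration; the idea ports directly to a short[] in C#).

        // Min/max reduction of raw samples into one (min, max) pair per pixel column.
        public class WaveformReducer {

            /** Returns columns[pixel][0] = min and columns[pixel][1] = max. */
            public static short[][] reduce(short[] samples, int widthInPixels) {
                short[][] columns = new short[widthInPixels][2];
                double samplesPerPixel = (double) samples.length / widthInPixels;

                for (int px = 0; px < widthInPixels; px++) {
                    int start = (int) (px * samplesPerPixel);
                    int end = Math.min(samples.length, (int) ((px + 1) * samplesPerPixel) + 1);
                    short min = Short.MAX_VALUE, max = Short.MIN_VALUE;
                    for (int i = start; i < end; i++) {
                        if (samples[i] < min) min = samples[i];
                        if (samples[i] > max) max = samples[i];
                    }
                    columns[px][0] = min;
                    columns[px][1] = max;
                }
                return columns;
            }
        }

    Drawing then becomes one vertical line per column, and zooming in or out is just recomputing the reduction with a different samples-per-pixel ratio over the visible range.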

    Read the article

  • Getting Frequency Components with FFT

    - by ruhig brauner
    So I was able to solve my last problem, but I have already stumbled upon the next one. I want to make a simple spectrogram, but in order to do so I want to understand how FFT libraries work and what they actually calculate and return. (FFT and signal processing is the number one topic I will get into as soon as I have time, but right now I only have time for some programming exercises in the evening. ;) ) Here I have just summarized the most important parts:

        int framesPerSecond;
        int samplesPerSecond;
        int samplesPerCycle; // right now i want to refresh the spectogram every
        DoubleFFT_1D fft;
        WAVReader audioIn;
        double audioL[], audioR[];
        double fftL[], fftR[];

        .....

        framesPerSecond = 30;
        audioIn = new WAVReader("Strobe.wav");
        int samplesPerSecond = (int) audioIn.GetSampleRate();
        samplesPerCycle = (int) (audioIn.GetSampleRate() / framesPerSecond);
        audioL = new double[samplesPerCycle * 2];
        audioR = new double[samplesPerCycle * 2];
        fftL = new double[samplesPerCycle];
        fftR = new double[samplesPerCycle];
        for (int i = 0; i < samplesPerCycle; i++) { // don't even know why,...
            fftL[i] = 0;
            fftR[i] = 0;
        }
        fft = new DoubleFFT_1D(samplesPerCycle);

        .....

        for (int i = 0; i < samplesPerCycle; i++) {
            audioIn.GetStereoSamples(temp);
            audioL[i] = temp[0];
            audioR[i] = temp[1];
        }
        fft.realForwardFull(audioL); // still stereo
        fft.realForwardFull(audioR);
        System.out.println("Check");
        for (int i = 0; i < samplesPerCycle; i++) {
            // storing the magnitude in the fftL/R arrays
            fftL[i] = Math.sqrt(audioL[2*i]*audioL[2*i] + audioL[2*i+1]*audioL[2*i+1]);
            fftR[i] = Math.sqrt(audioR[2*i]*audioR[2*i] + audioR[2*i+1]*audioR[2*i+1]);
        }

    So the question is: if I want to know what frequencies are in the sampled signal, how do I calculate them? (When I print the fftL / fftR arrays, I get some exponential forms at both ends of the array.) Thx :)
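
    Assuming the snippet uses the JTransforms DoubleFFT_1D (which is where that class name and realForwardFull come from): transforming N real samples yields N complex bins, and bin i corresponds to the frequency i * sampleRate / N in Hz. Because the input is real, the spectrum is conjugate-symmetric, so only bins 0 through N/2 carry independent information; the mirror image of the low bins at the top of the array is what shows up as matching shapes at both ends of fftL and fftR. A small helper for the mapping:

        // Maps FFT bin indices to frequencies for a real-input transform of length n.
        // Only bins 0..n/2 are independent; the upper half mirrors them.
        public class FftBins {

            /** Frequency in Hz of bin i for an FFT of n samples taken at sampleRate Hz. */
            public static double binFrequency(int i, int n, double sampleRate) {
                return i * sampleRate / n;
            }

            public static void main(String[] args) {
                int n = 1470;               // e.g. 44100 Hz / 30 frames per second
                double sampleRate = 44100;
                for (int i = 0; i <= n / 2; i += n / 8) {
                    System.out.printf("bin %4d -> %8.1f Hz%n", i, binFrequency(i, n, sampleRate));
                }
                // Bin 0 is the DC component; bin n/2 is the Nyquist frequency, sampleRate / 2.
            }
        }

    For a spectrogram it is therefore enough to plot the magnitudes of bins 0..N/2 per analysis window and discard the mirrored upper half.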

    Read the article

  • Sound not playing when I press the button, and how to fix overlapping sounds

    - by alfredjunco
    The code is giving me an error, "Unused variable 'path'", and when I press a button there is no sound playing. How do I fix this? The playOnce: declaration is in the .h file:

        - (void)playOnce:(NSString *)aSound;

        - (IBAction)beatButton50 {
            [self playOnce:@"racecars"];
        }

        - (void)playOnce:(NSString *)aSound {
            NSString *path = [[NSBundle mainBundle] pathForResource:aSound ofType:@"caf"];
            if ([theAudio isPlaying]) {
                [theAudio stop];
            }
        }

        - (void)playLooped:(NSString *)aSound {
            NSString *path = [[NSBundle mainBundle] pathForResource:aSound ofType:@"caf"];
            if (!theAudio) {
                theAudio = [[AVAudioPlayer alloc] initWithContentsOfURL:[NSURL fileURLWithPath:path] error:NULL];
            }
            [theAudio setDelegate:self];
            // loop indefinitely
            [theAudio setNumberOfLoops:-1];
            [theAudio setVolume:1.0];
            [theAudio play];
        }

        - (void)stopAudio {
            [theAudio stop];
            [theAudio setCurrentTime:0];
        }

    Read the article

  • Uninterrupted mp3 play on a website?

    - by Kevin
    The client is requesting a single track to be heard across the website. Generally I advise against it, but they insist. So, what is the most straightforward way of having a Flash player embedded in a site so that when a user goes to another page there isn't a gap or interruption? I am thinking an iframe is required. I am using a Flash player that has auto-resume, but that only solves picking up where you left off in the song before going to another page. I tried searching SO for an answer.

    Read the article

  • Google Maps v3: Enforcing min. zoom level when using fitBounds

    - by chris
    I'm drawing a series of markers on a map (using v3 of the Maps API). In v2, I had the following code:

        bounds = new GLatLngBounds();
        // ... loop through and put markers on the map ...
        bounds.extend(point);
        // ... end loop ...
        map.setCenter(bounds.getCenter());
        var level = map.getBoundsZoomLevel(bounds);
        if (level == 1)
            level = 5;
        map.setZoom(level > 6 ? 6 : level);

    And that worked fine to ensure that there was always an appropriate level of detail displayed on the map. I'm trying to duplicate this functionality in v3, but setZoom and fitBounds don't seem to be cooperating:

        // ... loop through and put markers on the map ...
        var ll = new google.maps.LatLng(p.lat, p.lng);
        bounds.extend(ll);
        // ... end loop ...
        var zoom = map.getZoom();
        map.setZoom(zoom > 6 ? 6 : zoom);
        map.fitBounds(bounds);

    I've tried different permutations (moving the fitBounds before the setZoom, for example), but nothing I do with setZoom seems to affect the map. Am I missing something? Is there a way to do this?

    Read the article

  • How do I reset the scale/zoom of a web app on an orientation change on the iPhone?

    - by Elisabeth
    I'm having the same problem that a couple of others have had with getting the correct behavior in a web app on an orientation change, and there doesn't seem to be an obvious solution; I've seen this question asked a couple of times on Stack Overflow and no one has yet been able to answer it. When I start the app in portrait mode, it works fine. Then I rotate into landscape and it's scaled up. To get it to scale correctly for landscape mode I have to double tap on something twice: first to zoom all the way in (the normal double-tap behavior) and again to zoom all the way out (again, the normal double-tap behavior). When it zooms out, it zooms out to the correct NEW scale for landscape mode. Switching back to portrait seems to work more consistently; that is, it handles the zoom so that the scale is correct when the orientation changes back to portrait. I am trying to figure out whether this is a bug, or whether it is something that can be fixed with JavaScript. With the viewport meta content, I am setting the initial-scale to 1.0 and I am NOT setting minimum or maximum scale (nor do I want to). I am setting the width to device-width. Any ideas? I know a lot of people would be grateful to have a solution, as it seems to be a persistent problem. Thank you!

    Read the article

  • how do I zoom to normal text size in vs2008?

    - by gerryLowry
    I cannot find any help on this issue in the VS2008 help, Google, or SO. Scenario: I'm looking at a source file in VS2008 SP1, on Windows 2003 Server SP2 Standard Edition at 1280x1024. The irrelevant name of this file is index.aspx. What is relevant is that the file has only 65 lines of code. The print is unreadably small, less than 4 point. It uses less than a third of the VS2008 text window vertically and less than a quarter of it horizontally. It's not just index.aspx; e.g. another file with 142 lines only fills about 3/4 of the text window vertically and less than a quarter horizontally. Possible cause: probably, but not certainly, I triggered the equivalent of the zoom in/zoom out one finds in products like Microsoft Word. However, I've explored many VS2008 toolbars and other customization options, and unfortunately I cannot find a way to get myself out of this mess. Window, Reset Window Layout has no effect on the text size; my tiny text size did not change. QUESTION: how do I zoom VS2008 text in and out, and back to normal size? Thank you. Regards ~~ Gerry (Lowry)

    Read the article

  • CMake: Mac OS X: ld: unknown option: -soname

    - by Alex Ivasyuv
    When I try to build my app with CMake on Mac OS X, I get the following error:

        Linking CXX shared library libsml.so
        ld: unknown option: -soname
        collect2: ld returned 1 exit status
        make[2]: *** [libsml.so] Error 1
        make[1]: *** [CMakeFiles/sml.dir/all] Error 2
        make: *** [all] Error 2

    This is strange, as Mac uses the .dylib extension instead of .so. Here is my CMakeLists.txt:

        cmake_minimum_required(VERSION 2.6)
        PROJECT (SilentMedia)

        SET(SourcePath src/libsml)

        IF (DEFINED OSS)
          SET(OSS_src
            ${SourcePath}/Media/Audio/SoundSystem/OSS/DSP/DSP.cpp
            ${SourcePath}/Media/Audio/SoundSystem/OSS/Mixer/Mixer.cpp
          )
        ENDIF(DEFINED OSS)

        IF (DEFINED ALSA)
          SET(ALSA_src
            ${SourcePath}/Media/Audio/SoundSystem/ALSA/DSP/DSP.cpp
            ${SourcePath}/Media/Audio/SoundSystem/ALSA/Mixer/Mixer.cpp
          )
        ENDIF(DEFINED ALSA)

        SET(SilentMedia_src
          ${SourcePath}/Utils/Base64/Base64.cpp
          ${SourcePath}/Utils/String/String.cpp
          ${SourcePath}/Utils/Random/Random.cpp
          ${SourcePath}/Media/Container/FileLoader.cpp
          ${SourcePath}/Media/Container/OGG/OGG.cpp
          ${SourcePath}/Media/PlayList/XSPF/XSPF.cpp
          ${SourcePath}/Media/PlayList/XSPF/libXSPF.cpp
          ${SourcePath}/Media/PlayList/PlayList.cpp
          ${OSS_src}
          ${ALSA_src}
          ${SourcePath}/Media/Audio/Audio.cpp
          ${SourcePath}/Media/Audio/AudioInfo.cpp
          ${SourcePath}/Media/Audio/AudioProxy.cpp
          ${SourcePath}/Media/Audio/SoundSystem/SoundSystem.cpp
          ${SourcePath}/Media/Audio/SoundSystem/libao/AO.cpp
          ${SourcePath}/Media/Audio/Codec/WAV/WAV.cpp
          ${SourcePath}/Media/Audio/Codec/Vorbis/Vorbis.cpp
          ${SourcePath}/Media/Audio/Codec/WavPack/WavPack.cpp
          ${SourcePath}/Media/Audio/Codec/FLAC/FLAC.cpp
        )

        SET(SilentMedia_LINKED_LIBRARY
          sml
          vorbisfile
          FLAC++
          wavpack
          ao
          #asound
          boost_thread-mt
          boost_filesystem-mt
          xspf
          gtest
        )

        INCLUDE_DIRECTORIES(
          /usr/include
          /usr/local/include
          /usr/include/c++/4.4
          /Users/alex/Downloads/boost_1_45_0
          ${SilentMedia_SOURCE_DIR}/src
          ${SilentMedia_SOURCE_DIR}/${SourcePath}
        )

        #link_directories(
        #  /usr/lib
        #  /usr/local/lib
        #  /Users/alex/Downloads/boost_1_45_0/stage/lib
        #)

        IF(LibraryType STREQUAL "static")
          ADD_LIBRARY(sml-static STATIC ${SilentMedia_src})
          # rename library from libsml-static.a => libsml.a
          SET_TARGET_PROPERTIES(sml-static PROPERTIES OUTPUT_NAME "sml")
          SET_TARGET_PROPERTIES(sml-static PROPERTIES CLEAN_DIRECT_OUTPUT 1)
        ELSEIF(LibraryType STREQUAL "shared")
          ADD_LIBRARY(sml SHARED ${SilentMedia_src})
          # change compile optimization/debug flags
          # -Werror -pedantic
          IF(BuildType STREQUAL "Debug")
            SET_TARGET_PROPERTIES(sml PROPERTIES COMPILE_FLAGS "-pipe -Wall -W -ggdb")
          ELSEIF(BuildType STREQUAL "Release")
            SET_TARGET_PROPERTIES(sml PROPERTIES COMPILE_FLAGS "-pipe -Wall -W -O3 -fomit-frame-pointer")
          ENDIF()
          SET_TARGET_PROPERTIES(sml PROPERTIES CLEAN_DIRECT_OUTPUT 1)
        ENDIF()

        ### TEST ###
        IF(Test STREQUAL "true")
          ADD_EXECUTABLE (bin/TestXSPF ${SourcePath}/Test/Media/PlayLists/XSPF/TestXSPF.cpp)
          TARGET_LINK_LIBRARIES (bin/TestXSPF ${SilentMedia_LINKED_LIBRARY})
          ADD_EXECUTABLE (bin/test1 ${SourcePath}/Test/test.cpp)
          TARGET_LINK_LIBRARIES (bin/test1 ${SilentMedia_LINKED_LIBRARY})
          ADD_EXECUTABLE (bin/TestFileLoader ${SourcePath}/Test/Media/Container/FileLoader/TestFileLoader.cpp)
          TARGET_LINK_LIBRARIES (bin/TestFileLoader ${SilentMedia_LINKED_LIBRARY})
          ADD_EXECUTABLE (bin/testMixer ${SourcePath}/Test/testMixer.cpp)
          TARGET_LINK_LIBRARIES (bin/testMixer ${SilentMedia_LINKED_LIBRARY})
        ENDIF (Test STREQUAL "true")
        ### TEST ###

        ADD_CUSTOM_TARGET(doc COMMAND doxygen ${SilentMedia_SOURCE_DIR}/doc/Doxyfile)

    There was no error on Linux. Build process:

        cmake -D BuildType=Debug -D LibraryType=shared .
        make

    I found that an incorrect command is generated in CMakeFiles/sml.dir/link.txt. But why, when the goal of CMake is cross-platform builds? How can I fix it?

    Read the article

  • How to sync audio files with Logitech Media Server on Mac OS?

    - by Abhishek
    I want to customize the Logitech Media Server (web interface on localhost) so that N DIFFERENT audio files start to play at the same time on N Wi-Fi receivers, each file on a different receiver. Currently, the server will only sync one track to N receivers. Is this possible, given that Logitech Media Server is open source? How can I do this? Can you point me to sample code?

    Read the article

  • Ask HTG: Dealing with Windows 8 CP Expiry, Nintendo DS Save Backups, Jumbled Audio Tracks in Windows Media Player

    - by Jason Fitzpatrick
    Once a week we round up some great reader questions and share the answers with everyone. This week we’re looking at what to do when Windows 8 Consumer Preview expires, backing up your Nintendo DS saves, and how to sort out jumbled audio tracks in Windows Media Player movies.

    Read the article

  • How to record both audio sources, where music is playing and my microphone is in use?

    - by YumYumYum
    I have music playing and I have the microphone open; the microphone is already in use by another application. In that case, how can I record both that music and the microphone audio to a file? (If possible, with the command line.) Follow-up:

        $ rec new-file.wav

        Input File     : 'default' (alsa)
        Channels       : 2
        Sample Rate    : 48000
        Precision      : 16-bit
        Sample Encoding: 16-bit Signed Integer PCM

        In:0.00% 00:00:25.94 [00:00:00.00] Out:1.24M [ | ] Clip:0
        ^C
        $ sox -d new-file.wav

    Read the article

  • Join mp4 files in linux

    - by Jose Armando
    I want to join two MP4 files to create a single one. The video streams are encoded in H.264 and the audio in AAC. I cannot re-encode the videos to another format for computational reasons. Also, I cannot use any GUI programs; all processing must be performed with Linux command-line utilities. FFmpeg cannot do this for MPEG-4 files, so instead I used MP4Box, e.g.:

        MP4Box -add video1.mp4 -cat video2.mp4 newvideo.mp4

    Unfortunately the audio gets all mixed up. I thought the problem was that the audio was in AAC, so I transcoded it to MP3 and used MP4Box again. In this case the audio is fine for the first half of newvideo.mp4 (corresponding to video1.mp4), but then there is no audio and I cannot seek in the video either. My next thought was that the audio and video streams had some small discrepancies in their lengths that I should fix, so for each input video I split the video and audio streams and then joined them with the -shortest option in ffmpeg. Thus for the first video I ran:

        avconv -y -i video1.mp4 -c copy -map 0:0 videostream1.mp4
        avconv -y -i video1.mp4 -c copy -map 0:1 audiostream1.m4a
        avconv -y -i videostream1.mp4 -i audiostream1.m4a -c copy -shortest video1_aligned.mp4

    I did the same for the second video and then used MP4Box as before. Unfortunately this didn't work either. The only success I had was when I joined the video streams separately (i.e. videostream1.mp4 and videostream2.mp4) and the audio streams (i.e. audiostream1.m4a and audiostream2.m4a), and then joined the resulting video and audio in a final file. However, the synchronization is lost for the second half of the video. Concretely, there is a 1 s delay between audio and video. Any suggestions are really welcome.

    Read the article

  • How to upload binary (audio) data from a Flash AS3 client to .NET server (WCF/REST/HTTP/?)?

    - by Bobby
    Simply stated: I'm trying to record audio in a browser and get that data back up to the server. I originally tried to capture, encode, and upload the audio using Silverlight, but because of the lack of suitable client-side encoding options, I'm now giving Flash a shot (Flash has baked-in support for encoding to Speex). I think I've figured out how to capture and encode the audio... but now what was easy in Silverlight is the challenge in Flash. My server side is .NET (MVC 2); I'm open to receiving the audio in whatever manner is best: REST, WCF... So that's my question: how could one upload binary data from Flash to a .NET server-side endpoint? If the answer is WCF, then how would one set up the client-side proxies to communicate with the service? If the answer is REST or an HTTP POST, then how would one construct this HTTP request and pass along the data? I've been reading up on AS3, but am new to Flash dev... Thanks for any help!

    Read the article

  • MPlayer does not work

    - by Soham Pal
    Using the xubuntu desktop, on Ubuntu Raring updated from Quantal. MPlayer never really worked. No video, no audio, nothing. I really can't be any more helpful, so here's the log:

        petey@home-pc:~$ mplayer "/home/petey/Downloads/Polar Bear Cafe (480p)[HorribleSubs]/[HorribleSubs] Polar Bear Cafe - 01 [480p].mkv"
        MPlayer SVN-r35984-4.7 (C) 2000-2013 MPlayer Team

        Playing /home/petey/Downloads/Polar Bear Cafe (480p)[HorribleSubs]/[HorribleSubs] Polar Bear Cafe - 01 [480p].mkv.
        libavformat version 55.0.100 (internal)
        libavformat file format detected.
        [lavf] stream 0: video (h264), -vid 0
        [lavf] stream 1: audio (aac), -aid 0
        [lavf] stream 2: subtitle (ass), -sid 0
        VIDEO:  [H264]  848x480  0bpp  23.810 fps    0.0 kbps ( 0.0 kbyte/s)
        Clip info:
         creation_time: 2012-04-05 21:36:10
        Load subtitles in /home/petey/Downloads/Polar Bear Cafe (480p)[HorribleSubs]/
        Can't open /dev/fb0: Permission denied
        [fbdev2] Can't open /dev/fb0: Permission denied
        VO: [v4l2] No such file or directory
        vo_cvidix: No vidix driver name provided, probing available ones (-v option for details)!
        [cyberblade] Error occurred during pci scan: Operation not permitted
        [mach64] Error occurred during pci scan: Operation not permitted
        [mga] Error occurred during pci scan: Operation not permitted
        [mga] Error occurred during pci scan: Operation not permitted
        [nvidia_vid] Error occurred during pci scan: Operation not permitted
        [pm3] Error occurred during pci scan: Operation not permitted
        [radeon] Error occurred during pci scan: Operation not permitted
        [rage128] Error occurred during pci scan: Operation not permitted
        [s3_vid] Error occurred during pci scan: Operation not permitted
        [SiS] Error occurred during pci scan: Operation not permitted
        [unichrome] Error occurred during pci scan: Operation not permitted
        [VO_SUB_VIDIX] Couldn't find working VIDIX driver.
        ==========================================================================
        Opening video decoder: [ffmpeg] FFmpeg's libavcodec codec family
        libavcodec version 55.0.100 (internal)
        Selected video codec: [ffh264] vfm: ffmpeg (FFmpeg H.264)
        ==========================================================================
        ==========================================================================
        Opening audio decoder: [ffmpeg] FFmpeg/libavcodec audio decoders
        AUDIO: 44100 Hz, 2 ch, floatle, 0.0 kbit/0.00% (ratio: 0->352800)
        Selected audio codec: [ffaac] afm: ffmpeg (FFmpeg AAC (MPEG-2/MPEG-4 Audio))
        ==========================================================================
        [AO OSS] audio_setup: Can't open audio device /dev/dsp: No such file or directory
        DVB card number must be between 1 and 4
        AO: [null] 44100Hz 2ch floatle (4 bytes per sample)
        Starting playback...
        Movie-Aspect is 1.78:1 - prescaling to correct movie aspect.
        VO: [null] 848x480 => 854x480 Planar YV12
        A:   4.7 V:   4.7 A-V:  0.002 ct:  0.083   0/  0 22%  0%  0.5% 0 0
        MPlayer interrupted by signal 2 in module: sleep_timer
        A:   4.7 V:   4.7 A-V:  0.001 ct:  0.083   0/  0 21%  0%  0.5% 0 0
        Exiting... (Quit)

    Read the article

  • alsa - sound issues on ubuntu 12.04

    - by tam_ubuuser
    I have a Sony E series laptop with an HDMI port. At this stage I have tested my sound card, which provides audio out on my laptop, i.e. I could hear songs. My laptop has two sound cards: an AMD 5450 and an Intel HDA (alsamixer shows that one as S/PDIF). I decided to connect the HDMI output to my new HD TV, but I could get only visuals on the TV, NO AUDIO OUTPUT (the HDMI cable works fine with Windows 7). I couldn't switch the output to the other card (I don't know how to do that), so I decided to update ALSA and ran the following in a terminal:

        sudo apt-add-repository ppa:ubuntu-audio-dev/alsa-daily
        sudo apt-get update
        sudo apt-get install alsa-hda-dkms

    Then, strangely, there was no login sound and no audio output on my laptop at all. I then started the sound troubleshooting procedure from the official Ubuntu site, from step 1, and my speaker icon disappeared from the taskbar. Obviously, aplay -l reported that no sound cards were detected. So I implemented step 4 from that guide, which lists all the audio hardware devices in my laptop:

        *-multimedia UNCLAIMED
             description: Audio device
             product: Cedar HDMI Audio [Radeon HD 5400/6300 Series]
             vendor: Hynix Semiconductor (Hyundai Electronics)
             physical id: 0.1
             bus info: pci@0000:01:00.1
             version: 00
             width: 64 bits
             clock: 33MHz
             capabilities: pm pciexpress msi bus_master cap_list
             configuration: latency=0
             resources: memory:f0040000-f0043fff
        *-multimedia UNCLAIMED
             description: Audio device
             product: 5 Series/3400 Series Chipset High Definition Audio
             vendor: Intel Corporation
             physical id: 1b
             bus info: pci@0000:00:1b.0
             version: 05
             width: 64 bits
             clock: 33MHz
             capabilities: pm msi pciexpress bus_master cap_list
             configuration: latency=0
             resources: memory:f5e00000-f5e03fff

    That command displayed the names of the two cards, but I still get nothing useful from aplay -l, so I think ALSA cannot detect my sound cards. Is there a solution to this problem? It would be even better if ALSA could route output through multiple sound cards. How should I install and configure ALSA so that it detects the HDMI connection as soon as I connect my HD TV? Is it possible for ALSA and PulseAudio 2.0 to co-exist, and if so, how?

    Read the article

  • Why do some MP3s return application/octet-stream from mime_content_type?

    - by robertdd
    Why, for some MP3 files, does calling mime_content_type($mp3_file_path) return application/octet-stream? I have this:

        if (!empty($_FILES)) {
            $tempFile = $_FILES['Filedata']['tmp_name'];
            $image = getimagesize($tempFile);
            $mp3_mimes = array('audio/mpeg', 'audio/x-mpeg', 'audio/mp3', 'audio/x-mp3', 'audio/mpeg3',
                               'audio/x-mpeg3', 'audio/mpg', 'audio/x-mpg', 'audio/x-mpegaudio');
            if (in_array(mime_content_type($tempFile), $mp3_mimes)) {
                echo json_encode("mp3");
            } elseif ($image['mime'] == 'image/jpeg') {
                echo json_encode("jpg");
            } else {
                echo json_encode("error");
            }
        }
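
    A likely explanation, offered as an assumption rather than a diagnosis of this particular server: mime_content_type relies on the system's magic database, and an MP3 that begins with a bare MPEG audio frame (no ID3v2 tag) may not match any magic entry, so the generic application/octet-stream comes back. Sniffing the first bytes yourself is one workaround; below is a sketch of that check in Java rather than PHP (the class and method names are invented for the illustration; in PHP the same test can be done with fread and ord).

        import java.io.FileInputStream;
        import java.io.IOException;

        // Rough MP3 sniffing by header bytes: either an ID3v2 tag ("ID3") or an
        // MPEG audio frame sync (11 set bits: 0xFF followed by a byte >= 0xE0).
        public class Mp3Sniffer {

            public static boolean looksLikeMp3(String path) throws IOException {
                byte[] header = new byte[3];
                try (FileInputStream in = new FileInputStream(path)) {
                    if (in.read(header) < 3) {
                        return false;
                    }
                }
                boolean hasId3Tag = header[0] == 'I' && header[1] == 'D' && header[2] == '3';
                boolean hasFrameSync = (header[0] & 0xFF) == 0xFF && (header[1] & 0xE0) == 0xE0;
                return hasId3Tag || hasFrameSync;
            }

            public static void main(String[] args) throws IOException {
                System.out.println(looksLikeMp3(args[0]) ? "mp3" : "not mp3");
            }
        }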

    Read the article

  • Where can I buy a Stereo audio to 3.5mm adapter?

    - by iftrue
    I need a stereo (1/4", 6.35 mm) to PC audio (3.5 mm) adapter, and I'd like it to have an inch or two of cable so that yanking the connector doesn't break the audio port the 3.5 mm plug is plugged into. I used to own one of these, but I lost the adapter. Where can I buy something like this online? I can only find solid adapters or 25-foot cables.

    Read the article

  • How do I programmatically determine the current zoom level of a browser window?

    - by Mihai Fonoage
    Hi, I want to find out the zoom level of what is being displayed in a browser window, based on the JavaScript window object's properties (http://www.javascriptkit.com/jsref/window.shtml), to which I have access. I just can't seem to find the right mathematical formula for the zoom based on the inner width, page offset, etc. I found a solution, but it uses the document.body.getBoundingClientRect call, which does not return anything in my case and for which I can't tell if there's a suitable replacement among the window properties. I am using Safari. Thank you, Mihai

    Read the article

  • How can I render a custom view in a UIScrollView at varying zoom levels without distorting the view?

    - by eczarny
    The scenario: I have a custom view (a subclass of UIView) that draws a game board. To enable the ability to zoom into, and pan around, the board I added my view as a subview of UIScrollView. This kind of works, but the game board is being rendered incorrectly. Everything is kind of fuzzy, and nothing looks right. The question: How can I force my view to be redrawn correctly at varying scales? I'm providing my view with the current scale and sending it a setNeedsDisplay message after the scroll view is done zooming in/out, but the game board is still being rendered incorrectly. My view should be redrawing the game board depending on the zoom level, but this isn't happening. Does the scroll view perform a generic transformation on subviews? Is there a way to disable this behavior?

    Read the article

  • Android - How to launch a Google Maps intent in an Android app with a certain location, zoom level and marker

    - by umirza47
    The map intent is not working with a specific zoom level as well as a custom marker:

        float lat = 40.714728f;
        float lng = -73.998672f;
        String maplLabel = "ABC Label";
        final Intent intent = new Intent(android.content.Intent.ACTION_VIEW,
                Uri.parse("geo:0,0?q=" + lat + "," + lng + "&z=16 (" + maplLabel + ")"));
        startActivity(intent);

    Does anybody know what is wrong, or how to do this? I want to show a map of a certain (lat, lng) with a custom label marker at a specific zoom level.
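
    A sketch of how the geo: URI is usually assembled, offered as an assumption about the intended behaviour rather than a guaranteed fix: the label has to sit inside the q parameter, in parentheses directly after the coordinates, and it should be URL-encoded; z carries the requested zoom, although the Maps app has historically honoured it only loosely when q is present. The class and method names below are invented for the illustration.

        import android.app.Activity;
        import android.content.Intent;
        import android.net.Uri;
        import java.util.Locale;

        public class MapLauncher {

            /** Opens a maps app centred on (lat, lng) with a labelled pin and a requested zoom. */
            public static void showLabelledLocation(Activity activity, float lat, float lng,
                                                    String label, int zoom) {
                // Label goes inside q, in parentheses right after the coordinates,
                // and is URL-encoded; z carries the requested zoom level.
                String uri = String.format(Locale.US, "geo:%f,%f?q=%f,%f(%s)&z=%d",
                        lat, lng, lat, lng, Uri.encode(label), zoom);
                activity.startActivity(new Intent(Intent.ACTION_VIEW, Uri.parse(uri)));
            }
        }

    Called from an Activity as showLabelledLocation(this, 40.714728f, -73.998672f, "ABC Label", 16); if the marker disappears whenever z is present, a common fallback is to drop z and let the maps app choose a zoom around the query point.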

    Read the article
