Search Results

Search found 5274 results on 211 pages for 'stream operators'.

  • ASP.NET Download All Files as Zip

    - by Ronnie Overby
    I have a folder on my web server that has hundreds of mp3 files in it. I would like to provide the option for a user to download a zipped archive of every mp3 in the directory from a web page. I want to compress the files programmatically only when needed. Because the zip file will be quite large, I am thinking that I will need to send the zip file to the response stream as it is being zipped, for performance reasons. Is this possible? How can I do it?
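    One way to avoid building the archive on disk first is to write zip entries straight into the response stream as each file is read. Below is a minimal sketch of that idea, assuming .NET 4.5's System.IO.Compression ZipArchive is available (on earlier framework versions a library such as DotNetZip or SharpZipLib fills the same role); the folder path, file name, and handler wiring are illustrative:

        // Sketch: stream a zip of every .mp3 in a folder directly to the response,
        // writing each entry as it is produced instead of buffering the whole archive.
        using System.IO;
        using System.IO.Compression;
        using System.Web;

        public class ZipDownloadHandler : IHttpHandler
        {
            public bool IsReusable { get { return false; } }

            public void ProcessRequest(HttpContext context)
            {
                context.Response.ContentType = "application/zip";
                context.Response.AddHeader("Content-Disposition", "attachment; filename=archive.zip");
                context.Response.BufferOutput = false; // flush as we go

                string folder = context.Server.MapPath("~/mp3"); // hypothetical folder
                using (var zip = new ZipArchive(context.Response.OutputStream, ZipArchiveMode.Create, true))
                {
                    foreach (string path in Directory.GetFiles(folder, "*.mp3"))
                    {
                        // MP3 is already compressed, so store rather than deflate each entry.
                        ZipArchiveEntry entry = zip.CreateEntry(Path.GetFileName(path), CompressionLevel.NoCompression);
                        using (Stream entryStream = entry.Open())
                        using (FileStream file = File.OpenRead(path))
                        {
                            file.CopyTo(entryStream);
                        }
                    }
                }
            }
        }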

  • Thread safe lockfree mutual ByteArray queue

    - by user313421
    A byte stream should be transferred, and there is one producer thread and one consumer thread. The producer is faster than the consumer most of the time, and I need enough buffered data for the QoS of my application. I have read about this problem, and there are solutions like a shared buffer, the .NET PipeStream class, and so on. This class is going to be instantiated many times on the server, so I need an optimized solution. Is it a good idea to use a Queue of byte arrays? If yes, I'll use an optimization algorithm to guess the queue size and each byte array's capacity, and in theory that fits my case. If not, what's the best approach? Please let me know if there's a good lock-free, thread-safe implementation of a byte-array queue in C# or VB. Thanks in advance.
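    One candidate worth evaluating is .NET 4's ConcurrentQueue<T> used with byte[] chunks; it is thread-safe and internally lock-free, and a thin wrapper can track how many bytes are buffered for the QoS logic. The sketch below is only an illustration of that idea (the class shape, the copy-in-Write choice, and the member names are assumptions, not a tuned implementation):

        // Sketch: a producer/consumer byte-chunk queue built on ConcurrentQueue<byte[]>.
        // Assumes .NET 4+. Chunk copying and the bookkeeping strategy are illustrative.
        using System;
        using System.Collections.Concurrent;
        using System.Threading;

        public class ByteChunkQueue
        {
            private readonly ConcurrentQueue<byte[]> _chunks = new ConcurrentQueue<byte[]>();
            private long _buffered; // total bytes currently queued

            public long BufferedBytes { get { return Interlocked.Read(ref _buffered); } }

            // Producer side: enqueue a copy of the incoming data.
            public void Write(byte[] data, int offset, int count)
            {
                var chunk = new byte[count];
                Buffer.BlockCopy(data, offset, chunk, 0, count);
                _chunks.Enqueue(chunk);
                Interlocked.Add(ref _buffered, count);
            }

            // Consumer side: try to dequeue the next chunk; returns false if the queue is empty.
            public bool TryRead(out byte[] chunk)
            {
                if (_chunks.TryDequeue(out chunk))
                {
                    Interlocked.Add(ref _buffered, -chunk.Length);
                    return true;
                }
                return false;
            }
        }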

  • Video/Audio frame as input to OpenCore

    - by Vinay
    I am not able to use MediaPlayer/VideoView to make RTSP work in Android, so I have written my own client to interact with the RTSP server, and that part works: I am able to get the video/audio frames from the RTSP server (MySpace) in Android. Now I want to play those frames. I have searched the OpenCore APIs for a way to play them, but didn't find any. My investigation so far: there is a class, PlayerDriver.c, which creates two sinks, one audio and one video (handleSetVideoSurface and handleSetAudioSink), and two objects of type PVPlayerDataSinkPVMFNode are created. I suspect this class has a way to take the stream as input, but I cannot find the definition of the class. Can you suggest which class I need to look into?

  • How to walk through two files simultaneously in Perl?

    - by Alex Reynolds
    I have two text files that contain columnar data of the variety position-value. Here is an example of the first file (file A): 100 1 101 1 102 0 103 2 104 1 ... Here is an example of the second file (B): 20 0 21 0 ... 100 2 101 1 192 3 193 1 ... Instead of reading one of the two files into a hash table, which is prohibitive due to memory constraints, what I would like to do is walk through two files simultaneously, in a stepwise fashion. What this means is that I would like to stream through lines of either A or B and compare position values. If the two positions are equal, then I perform a calculation on the values associated with that position. Otherwise, if the positions are not equal, I move through lines of file A or file B until the positions are equal (when I again perform my calculation) or I reach EOF of both files. Is there a way to do this in Perl?
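    Since the heart of the question is the merge-walk pattern rather than any particular syntax, here is a minimal sketch of that loop (written in C# to match the rest of this page; the file names are placeholders and the per-position "calculation" is just a print). The same structure translates line for line to Perl with two filehandles and a pair of readline calls.

        // Sketch of the merge-walk: advance whichever file is behind, and only
        // compute when the two position columns match. Assumes both files are
        // sorted by position, as in the samples above.
        using System;
        using System.IO;

        class MergeWalk
        {
            static void Main()
            {
                using (var a = new StreamReader("fileA.txt"))   // hypothetical paths
                using (var b = new StreamReader("fileB.txt"))
                {
                    string lineA = a.ReadLine(), lineB = b.ReadLine();
                    while (lineA != null && lineB != null)
                    {
                        string[] fa = lineA.Split(' ');
                        string[] fb = lineB.Split(' ');
                        long posA = long.Parse(fa[0]), posB = long.Parse(fb[0]);

                        if (posA == posB)
                        {
                            // positions match: do the per-position calculation here
                            Console.WriteLine("{0}: {1} {2}", posA, fa[1], fb[1]);
                            lineA = a.ReadLine();
                            lineB = b.ReadLine();
                        }
                        else if (posA < posB) lineA = a.ReadLine(); // A is behind, advance A
                        else lineB = b.ReadLine();                  // B is behind, advance B
                    }
                }
            }
        }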

  • AS3 Pass FlashVars to loaded swf

    - by Robin
    Hi, I have an A.swf which loads B.swf onto a movieclip and needs to pass it some FlashVars. When loading B.swf from HTML, I can pass FlashVars fine. When loading from A.swf, I get Error #2044: Unhandled ioError: text=Error #2032: Stream Error. URL: file:. The code in A.swf is:

        var request:URLRequest = new URLRequest("B.swf");
        var variables:URLVariables = new URLVariables();
        variables.xml = "test.xml"; // This line causes the error 2044, else B.swf loads fine with FlashVars
        request.data = variables;
        loader.load(request);

    In B.swf, the FlashVars are read like this, which works fine when loaded from the HTML side:

        this.loaderInfo.parameters.xml

  • extract digg data by digg api

    - by vamsivanka
    I am trying to extract Digg data for a user using this URL, "http://services.digg.com/user/vamsivanka/diggs?count=25&appkey=34asd56asdf789as87df65s4fas6", and the web response is throwing an error: "The remote server returned an error: (403) Forbidden." Please let me know what I am doing wrong. Here is the code:

        public static XmlTextReader CreateWebRequest(string url)
        {
            HttpWebRequest webRequest = (HttpWebRequest)WebRequest.Create(url);
            webRequest.UserAgent = ".NET Framework digg Test Client";
            webRequest.Credentials = System.Net.CredentialCache.DefaultCredentials;
            webRequest.Accept = "text/xml";
            HttpWebResponse webResponse = (HttpWebResponse)webRequest.GetResponse();
            System.IO.Stream responseStream = webResponse.GetResponseStream();
            XmlTextReader reader = new XmlTextReader(responseStream);
            return reader;
        }
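    Since the 403 comes back from Digg's servers (often because the service rejects the appkey or the request headers), a useful next step is to read the error body the API returns along with the 403, which generally states the reason. A small sketch of doing that around the same request code (names are illustrative):

        // Sketch: catch the WebException thrown by GetResponse() and read the body the
        // service sends with the 403, which usually explains why it rejected the request.
        using System.IO;
        using System.Net;

        public static class DiggDebug
        {
            public static string FetchOrExplain(string url)
            {
                var webRequest = (HttpWebRequest)WebRequest.Create(url);
                webRequest.UserAgent = ".NET Framework digg Test Client";
                webRequest.Accept = "text/xml";
                try
                {
                    using (var response = (HttpWebResponse)webRequest.GetResponse())
                    using (var reader = new StreamReader(response.GetResponseStream()))
                    {
                        return reader.ReadToEnd();          // success: the XML payload
                    }
                }
                catch (WebException ex)
                {
                    if (ex.Response == null) throw;         // no HTTP response at all
                    using (var reader = new StreamReader(ex.Response.GetResponseStream()))
                    {
                        return reader.ReadToEnd();          // the API's explanation of the 403
                    }
                }
            }
        }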

  • Reducing Code Repetition: Calling functions with slightly different signatures

    - by Brian
    Suppose I have two functions which look like this:

        public static void myFunction1(int a, int b, int c, string d)
        {
            //dostuff
            someoneelsesfunction(c, d);
            //dostuff2
        }

        public static void myFunction2(int a, int b, int c, Stream d)
        {
            //dostuff
            someoneelsesfunction(c, d);
            //dostuff2
        }

    What would be a good way to avoid the repeated dostuff? Ideas I've thought of, but don't like: I could make d an object and cast at runtime based on its type, but this strikes me as not being ideal; it removes a type check which was previously happening at compile time. I could also write a private helper that takes an object and expose both signatures as public functions. Or I could replace dostuff and dostuff2 with delegates or function calls or something.
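    The delegate idea can be kept fully type-safe: both public overloads stay strongly typed, and only the shared body moves into one private method that receives the middle call as an Action. A rough sketch of that shape (the method names and empty bodies are placeholders for the question's dostuff/someoneelsesfunction):

        // Sketch: the shared pre/post work lives in DoStuffCore once; each overload
        // supplies its own strongly typed call to someone else's function as a closure.
        using System;
        using System.IO;

        public static class Helpers
        {
            public static void MyFunction(int a, int b, int c, string d)
            {
                DoStuffCore(a, b, () => SomeoneElsesFunction(c, d));
            }

            public static void MyFunction(int a, int b, int c, Stream d)
            {
                DoStuffCore(a, b, () => SomeoneElsesFunction(c, d));
            }

            private static void DoStuffCore(int a, int b, Action middle)
            {
                // dostuff
                middle();
                // dostuff2
            }

            // Stand-ins for the overloads the question calls someoneelsesfunction.
            private static void SomeoneElsesFunction(int c, string d) { }
            private static void SomeoneElsesFunction(int c, Stream d) { }
        }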

  • How to get the default audio format of a TTS Engine

    - by Itslava
    In Microsoft TTS 5.1 or newer, the documentation for the SpVoice.AudioOutputStream property says: "The AudioOutputStream property gets and sets the current audio stream object used by the voice. Setting the voice's AudioOutputStream property may cause its audio output format to be automatically changed to match the text-to-speech (TTS) engine's preferred audio output format. If the voice's AllowAudioOutputFormatChangesOnNextSet property is True, the format change takes place; if False, the format remains unchanged. In order to set the AudioOutputStream property of a voice to a specific format, its AllowOutputFormatChangesOnNextSet should be False." This means an engine always has a preferred audio output format. So how can I get it? I have not found any interface that exposes that attribute.
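    One possible way to observe the engine's preferred format follows from the documentation quoted above: with AllowAudioOutputFormatChangesOnNextSet left True, hand the voice a throwaway stream and then read back the format the engine switched it to. The sketch below assumes the SAPI COM automation objects (a reference to the Microsoft Speech Object Library, i.e. SpeechLib interop) and should be treated as an untested illustration of that idea:

        // Sketch: let the engine retarget a scratch SpMemoryStream, then inspect the
        // format it chose. Assumes a COM reference to the Microsoft Speech Object Library.
        using System;
        using SpeechLib;

        class ShowPreferredFormat
        {
            static void Main()
            {
                SpVoice voice = new SpVoice();
                voice.AllowAudioOutputFormatChangesOnNextSet = true;

                // The engine is free to rewrite this stream's format to its preferred one.
                ISpeechBaseStream scratch = (ISpeechBaseStream)new SpMemoryStream();
                voice.AudioOutputStream = scratch;

                SpeechAudioFormatType preferred = voice.AudioOutputStream.Format.Type;
                Console.WriteLine("Engine-preferred format: " + preferred);
            }
        }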

  • Comparison of music data

    - by Christian P.
    Hey, I am looking for theory, algorithms, and similar material on how to compare music. More specifically, I am looking into how to dupe-check music tracks that have different bitrates or perhaps slightly different variations (radio vs. album version) but otherwise sound the same. Use cases for this include services such as Grooveshark, YouTube, etc., where they get a lot of duplicate tracks. I am also interested in text comparisons (Britney Spers vs. Britney Spears, how far they deviate, etc.), although this is secondary and I already have some sources to go on in this area. I am mostly interested in codec-agnostic comparison techniques and algorithms (assuming a "raw" stream), but codec-specific resources are appreciated. I am aware of projects such as musicbrainz.org, but have not investigated them further, and would be interested to know whether such projects could be of help in this endeavor.

  • Distributed datastore

    - by Julien Genestoux
    We're trying to add some kind of persistence to our app. The app generates about 250 entries per second. Each of these entries belongs to one of 2M files. For each file, we want to keep the last 10 entries, so we can look them up later. The way our client application works: it gets a stream of all the data, fetches the right file (GET), adds the new content, and saves the file back (PUT). We're looking for an efficient way to store this data that can scale horizontally, as the amount of data we're getting is doubling every few weeks. We initially looked at S3. It works fine, but becomes very expensive very fast ($1000 monthly just in PUT operations!). We then gave Riak a shot, but it seems we can't get more than 60 writes/sec on each node, which is very, very slow. Any other solution out there?

  • Bitbanging a PIO on Coldfire/ucLinux

    - by G Forty
    Here's the problem: I need to program some hardware via 2 pins of the PIO (1 clock, 1 data). Timing constraints are tight - 10ms clock cycle time. All this, of course, whilst I maintain very high-level services (CAN bus, TCP/IP). The downstream unit also ACKs by asserting a PIO pin, configured as an input, high, so this loop has to both read and write. I need to send 16 bits in the serial stream. Is there an established way to do this sort of thing, or should I simply get the hardware guys to add a PIC or some such? I'd much prefer to avoid exotics like RTAI extensions at this stage. I did once see a reference to user-mode IO which implied a possible interrupt-driven driver, but I lost track of it. Any pointers welcomed.

  • How to get netstream bytesLoaded and bytesTotal from streaming .mp4?

    - by Amy
    I have a Flex 3 app that uses a NetStream and a Video object to stream .mp4 movies. I want to use the bytesLoaded and bytesTotal properties of the NetStream to display buffering information. I would also like to get information about the number of frames that are dropped, if possible. When I've tested with .flv I'm able to get this information without a problem, but it doesn't seem to work with .mp4. Is it possible to get this information when streaming .mp4? Is there some configuration that I'm missing to make things work the same for .mp4 as for .flv? Thanks!

  • How to return 304 status with FileResult in ASP.NET MVC RC1

    - by Maysam
    As you may know, we have a new ActionResult called FileResult in the RC1 version of ASP.NET MVC. Using it, your action methods can return an image to the browser dynamically, something like this:

        public ActionResult DisplayPhoto(int id)
        {
            Photo photo = GetPhotoFromDatabase(id);
            return File(photo.Content, photo.ContentType);
        }

    In the HTML we can then use something like this:

        <img src="http://mysite.com/controller/DisplayPhoto/657">

    Since the image is returned dynamically, we need a way to cache the returned stream so that we don't have to read the image from the database again. I guess we can do it with something like Response.StatusCode = 304, which tells the browser that it already has the image in its cache, but I'm not sure. I just don't know what to return from my action method after setting StatusCode to 304. Should I return null or something?
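    One way this is commonly handled, sketched below rather than prescribed: honor the browser's If-Modified-Since header, answer 304 with an EmptyResult when the cached copy is current, and otherwise send the file together with a Last-Modified validator. The Photo type's LastModified property and the one-second tolerance are assumptions for illustration:

        // Sketch: conditional GET handling for a dynamically served image in ASP.NET MVC.
        using System;
        using System.Web;
        using System.Web.Mvc;

        public class PhotoController : Controller
        {
            public ActionResult DisplayPhoto(int id)
            {
                Photo photo = GetPhotoFromDatabase(id);

                // If the browser's copy is still current, send headers only and stop.
                string header = Request.Headers["If-Modified-Since"];
                DateTime since;
                if (header != null && DateTime.TryParse(header, out since)
                    && photo.LastModified <= since.ToUniversalTime().AddSeconds(1))
                {
                    Response.StatusCode = 304;
                    Response.SuppressContent = true;   // no body for a 304
                    return new EmptyResult();          // nothing else needs to be returned
                }

                // Otherwise send the image with validators so the next request can be conditional.
                Response.Cache.SetCacheability(HttpCacheability.Public);
                Response.Cache.SetLastModified(photo.LastModified);
                return File(photo.Content, photo.ContentType);
            }

            // Hypothetical stand-ins for the question's Photo type and database call.
            private Photo GetPhotoFromDatabase(int id)
            {
                return new Photo { Content = new byte[0], ContentType = "image/jpeg", LastModified = DateTime.UtcNow };
            }

            private class Photo
            {
                public byte[] Content { get; set; }
                public string ContentType { get; set; }
                public DateTime LastModified { get; set; }
            }
        }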

  • How to negotiate red5 connection parameters for streaming with JAVA

    - by baba
    Hi, I have been creating a thin browser client (in Java) that sends an RTMP stream to a specified Red5 instance. I also use RTMP Researcher to monitor the traffic and events that occur between the client and the server. Here is what I notice: there is obviously a map of options being exchanged between the Red5 instance and the client. You can see it here (direct link: http://img716.imageshack.us/img716/661/newbitmapimagelb.png). What I am wondering is whether there is a programmatic way to obtain this map on the client side and maybe change some of the parameters, or just examine them. Edit: I am connecting like this: connect(host, port, app, callback). I assume I am sending some default parameters along, because the other connect methods also take an optionsMap as an argument. I was wondering what possible values could be put in such an optionsMap and where to obtain a list of them.

  • imap_open() says "invalid remote specification" and fails to connect

    - by Kristopher Ives
    When I try to use imap_open I get the following error:

        Warning: imap_open() [function.imap-open]: Couldn't open stream {mail.domain.com:110/pop3/novalidate-cert/} in /path/to/mailbox.php on line 5
        Can't open mailbox {mail.domain.com:110/pop3/novalidate-cert/}: invalid remote specification

    My phpinfo says that I have: IMAP c-Client Version 2007e, SSL Support enabled, Kerberos Support enabled. On another server that gives the same phpinfo for IMAP it works, although that version is 2006. PHP says it was compiled with the following settings: './configure' '--disable-path-info-check' '--enable-exif' '--enable-fastcgi' '--enable-ftp' '--enable-gd-native-ttf' '--enable-libxml' '--enable-mbstring' '--enable-pdo=shared' '--enable-soap' '--enable-sockets' '--enable-zip' '--prefix=/usr' '--with-bz2' '--with-curl=/opt/curlssl/' '--with-freetype-dir=/usr' '--with-gd' '--with-gettext' '--with-imap=/opt/php_with_imap_client/' '--with-imap-ssl=/usr' '--with-jpeg-dir=/usr' '--with-kerberos' '--with-libexpat-dir=/usr' '--with-libxml-dir=/opt/xml2' '--with-libxml-dir=/opt/xml2/' '--with-mysql=/usr' '--with-mysql-sock=/var/lib/mysql/mysql.sock' '--with-mysqli=/usr/bin/mysql_config' '--with-openssl=/usr' '--with-openssl-dir=/usr' '--with-pdo-mysql=shared' '--with-pdo-sqlite=shared' '--with-pgsql=/usr' '--with-png-dir=/usr' '--with-sqlite=shared' '--with-ttf' '--with-xpm-dir=/usr' '--with-zlib' '--with-zlib-dir=/usr'

  • Streaming a non-PCM WAV file to a Silverlight application

    - by Satumba
    Hi, I would like to allow users to play back recorded WAV files that are stored on a server in a Silverlight application acting as the client. I saw that there is a way to play a WAV file in Silverlight (here), but when I tried to implement it, I got an error playing the file because it is not in PCM format but encoded. The files that I'm trying to play are encoded with a special encoder, so I thought that the only way is to decode the WAV file on the server and stream it back to the client. The limitation is that the decoding should happen in real time, because it is not reasonable to convert all the existing WAV files up front. Is it possible to do this? Which streamer can I use? (Can Windows Media Services help here?) Does anybody have experience with such a scenario? I'd appreciate your help.

  • How to impose access control on Flash player streaming through RTMP?

    - by MobiHunterz
    Hi, I'm using Icecast and I'm streaming an AAC/HE-AACv2 audio/video file through a Flash player and on iPhone; the two are separate. When I submit the URL in Winamp to stream it, it asks for a username/password, but when I use it with the Flash player, it just starts streaming. My situation is this: I want to use the same streaming (RTMP) for both the website and the iPhone app, and I want to impose access control on my Flash player so that only authenticated users can see the streamed video; everyone else should simply be refused playback. Right now my provider does not support access control on the Flash player, but I need it. Can you tell me how I can do this? Any kind of help will be appreciated. Thanks.

  • Unable to make 2 parallel TCP requests to the same TCP Client

    - by soldieraman
    Error: "Unable to read data from the transport connection: A blocking operation was interrupted by a call to WSACancelBlockingCall." Situation: there is a TCP server, and my web application connects to it using the code below:

        TcpClientInfo = new TcpClient();
        _result = TcpClientInfo.BeginConnect(<serverAddress>, <portNumber>, null, null);
        bool success = _result.AsyncWaitHandle.WaitOne(20000, true);
        if (!success)
        {
            TcpClientInfo.Close();
            throw new Exception("Connection Timeout: Failed to establish connection.");
        }
        NetworkStreamInfo = TcpClientInfo.GetStream();
        NetworkStreamInfo.ReadTimeout = 20000;

    Two users use the same application from two different locations to access information from this server at the SAME TIME. The server takes around 2 seconds to reply. Both connect, but one of the users gets the error above when trying to read data from the stream. How can I resolve this issue? Is there a better way of connecting to the server? And if it's a server issue, how should the server handle requests to avoid this problem?
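    One likely culprit, offered as an assumption rather than a diagnosis: TcpClientInfo and NetworkStreamInfo look like shared (static or application-scoped) fields, so two simultaneous requests end up interleaving reads on the same NetworkStream, and one of them gets aborted. A minimal sketch of keeping the connection per request instead (the method shape and buffer handling are illustrative):

        // Sketch: a dedicated TcpClient/NetworkStream per request, so concurrent users
        // never share a connection. Real code would loop on Read until the reply is complete.
        using System;
        using System.Net.Sockets;

        public static class ServerGateway
        {
            public static byte[] Query(string serverAddress, int portNumber, byte[] requestBytes)
            {
                using (var client = new TcpClient())
                {
                    IAsyncResult result = client.BeginConnect(serverAddress, portNumber, null, null);
                    if (!result.AsyncWaitHandle.WaitOne(20000, true))
                    {
                        client.Close();
                        throw new Exception("Connection Timeout: Failed to establish connection.");
                    }
                    client.EndConnect(result);

                    using (NetworkStream stream = client.GetStream())
                    {
                        stream.ReadTimeout = 20000;
                        stream.Write(requestBytes, 0, requestBytes.Length);

                        var buffer = new byte[4096];
                        int read = stream.Read(buffer, 0, buffer.Length); // single read for brevity
                        var response = new byte[read];
                        Buffer.BlockCopy(buffer, 0, response, 0, read);
                        return response;
                    }
                }
            }
        }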

  • What does the LAME text do in an MP3 file?

    - by Dims
    I see here, http://en.wikipedia.org/wiki/MP3, that an MP3 file consists of MP3 headers interleaved with MP3 data, and that an MP3 header consists of only a few bytes. But here is a dump of my MP3 file with the ID3 tag cut off: the header is highlighted in blue, and you can see that the text "LAME3.96" is highlighted in green. What is it doing there? Is it part of the MP3 elementary stream, or is it part of some header I didn't account for?

  • ruby-gstreamer doesn't send EOS message

    - by Cheba
    I've managed to make it play sound, but it never gets an EOS message, so the script never exits.

        require 'gst'

        main_loop = GLib::MainLoop.new

        pipeline = Gst::Pipeline.new "audio-player"
        source = Gst::ElementFactory.make "filesrc", "file-source"
        source.location = "/usr/share/sounds/gnome/default/alerts/bark.ogg"
        decoder = Gst::ElementFactory.make "decodebin", "decoder"
        conv = Gst::ElementFactory.make "audioconvert", "converter"
        sink = Gst::ElementFactory.make "alsasink", "output"

        pipeline.add source, decoder, conv, sink
        source >> decoder
        conv >> sink

        decoder.signal_connect "pad-added" do |element, pad, data|
          pad >> conv['sink']
        end

        pipeline.bus.add_watch do |bus, message|
          puts "Message: #{message.inspect}"
          case message.type
          when Gst::Message::Type::ERROR
            puts message.structure["debug"]
            main_loop.quit
          when Gst::Message::Type::EOS
            puts 'End of stream'
            main_loop.quit
          end
        end

        pipeline.play

        begin
          puts 'Running main loop'
          main_loop.run
        ensure
          puts 'Shutting down main loop'
          pipeline.stop
        end

  • How do you measure latency in low-latency environments?

    - by Ajaxx
    Here's the setup: your system is receiving a stream of data that contains discrete messages (usually between 32 and 128 bytes per message). As part of your processing pipeline, each message passes through two physically separate applications, which exchange the data using a low-latency approach (such as messaging over UDP or RDMA), and finally reaches a client via the same mechanism. Assuming you can inject yourself at any level, including wire-protocol analysis, what tools and/or techniques would you use to measure the latency of your system? As part of this, I'm assuming that every message delivered to the system results in a corresponding (though not equivalent) message being pushed through the system and delivered to the client. The only tool that I've seen on the market like this is TS-Associates TipOff. I'm sure that with the right access you could probably measure the same information using a wire-analysis tool (a la Wireshark) and the right dissectors, but is this the right approach, or are there any commodity solutions that I can use?
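    One low-tech technique that complements wire capture, shown as a sketch only: stamp each message with a high-resolution tick count where it enters the pipeline and compute the delta where it leaves. This only gives trustworthy numbers when both measurement points share a clock (same host) or are tightly synchronized, and carrying the stamp inside the message is an assumption about your protocol:

        // Sketch: high-resolution latency stamps using Stopwatch ticks.
        using System;
        using System.Diagnostics;

        public static class LatencyProbe
        {
            // Call at the ingress hop and carry the returned ticks inside the message.
            public static long Stamp()
            {
                return Stopwatch.GetTimestamp();
            }

            // Call at the egress hop with the carried stamp; returns one-way latency in microseconds.
            public static double ElapsedMicroseconds(long stampTicks)
            {
                long delta = Stopwatch.GetTimestamp() - stampTicks;
                return delta * 1000000.0 / Stopwatch.Frequency;
            }
        }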

  • Serializing response from JSP and converting them to C# objects

    - by SARAVAN
    I have a Silverlight web application. From this app I am making a call to .jsp pages using the WebClient class. The JSP returns a response in the following format:

        { "results":[{"Value":"1","Name":"Advertising"},
                     {"Value":"2","Name":"Automotive Expenses"},
                     {"Value":"3","Name":"Business Miscellaneous"}] }

    The response is assigned to my Stream object. I have a C# class, CategoryType:

        public class CategoryType
        {
            public string Value { get; set; }
            public string Name { get; set; }
        }

    My aim is to convert the responses into a Collection<CategoryType> and use it in my C# code. As of now I am trying to use DataContractJsonSerializer, but I am not sure if there is an easier and more efficient way to do this. Any help would be appreciated.
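    The DataContractJsonSerializer route does handle this shape; the main thing it needs is a small wrapper type for the outer "results" property. A minimal sketch under that assumption (the wrapper and method names are illustrative; in Silverlight the serializer lives in System.ServiceModel.Web):

        // Sketch: deserialize the JSP response stream into CategoryType objects.
        using System.Collections.Generic;
        using System.Collections.ObjectModel;
        using System.IO;
        using System.Runtime.Serialization;
        using System.Runtime.Serialization.Json;

        [DataContract]
        public class CategoryType
        {
            [DataMember] public string Value { get; set; }
            [DataMember] public string Name { get; set; }
        }

        [DataContract]
        public class CategoryResponse   // hypothetical wrapper for the outer { "results": [...] } object
        {
            [DataMember(Name = "results")]
            public List<CategoryType> Results { get; set; }
        }

        public static class CategoryParser
        {
            // responseStream is the Stream handed back by the WebClient call.
            public static Collection<CategoryType> Parse(Stream responseStream)
            {
                var serializer = new DataContractJsonSerializer(typeof(CategoryResponse));
                var response = (CategoryResponse)serializer.ReadObject(responseStream);
                return new Collection<CategoryType>(response.Results);
            }
        }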

  • RTSP streaming and save into mp4 file using VLC

    - by Vivek Navadia
    Hello all, let's say I have an RTSP URL (rtsp://192.168.0.17/mpeg4). A live camera is set up on the machine, which relays live video. I am streaming it using VLC player and saving it to an mp4 file somewhere (i.e. c:\temp.mp4). Now I open another VLC player instance and open this file (c:\temp.mp4), but because it is in use and the live stream is still being saved to it, it will not play. If I stop the streaming and then play temp.mp4, it plays the streamed (saved) video. My requirement is that VLC player should keep streaming and saving into temp.mp4 continuously, and at the same time that file should be playable in any standard player. Is it possible, with any VLC option, to do both of these things simultaneously? Thanks, Vivek

  • PHP getimagesize with ampersand in string creates errors

    - by RobHardgood
    I'm using the getimagesize function in PHP, and the path string contains an ampersand, which is otherwise fine. The page gives me errors where getimagesize() is called. Looking at the source code, though, I see the ampersand is being passed through as &amp; rather than just &. I presume this is causing the errors, because PHP doesn't need the ampersand converted to the HTML entity in order to find the path, right? Here is the error:

        Warning: getimagesize(image.php?name=username&pic=picture) [function.getimagesize]: failed to open stream: No such file or directory

  • How do I remove (or apply) transparency on a gdk-pixbuf?

    - by Andrew Stacey
    I have a C++ program in which a GdkPixbuf is created. I want to output it as an image, so I call gdk_pixbuf_save_to_stream(pixbuf, stream, type, NULL, &err, NULL). This works fine when "type" is png or tiff, but with jpeg or bmp it just produces a black square. The original pixbuf consists of black-on-transparent (and gdk_pixbuf_get_has_alpha returns true), so I'm guessing that the problem is with the alpha mask. GdkPixbuf has a function to add an alpha channel, but I can't see one that removes it again, or (which might be as good) inverts it. Is there a simple way to get the jpeg and bmp formats to work properly? (I should say that I'm very new to proper programming like this.)
