Search Results

Search found 18933 results on 758 pages for 'dynamic programming'.


  • How is GUID pronounced?

    - by Roberto Sebestyen
    Is it pronounced "Gewid", or is it pronounced "G.U.I.D." by spelling out the letters? It seems to be used inconsistently. What is the proper pronunciation? The same goes for SQL: more people seem to say "S.Q.L." than "Sequel".

  • How is a referencing environment generally implemented for closures?

    - by Alexandr Kurilin
    Let's say I have a statically/lexically scoped language with deep binding and I create a closure. The closure will consist of the statements I want executed plus the so-called referencing environment, or, to quote this post, the collection of variables which can be used. What does this referencing environment actually look like implementation-wise?

    I was recently reading about Objective-C's implementation of blocks, and the author suggests that behind the scenes you get a copy of all of the variables on the stack and also of all the references to heap objects. The explanation claims that you get a "snapshot" of the referencing environment at the point in time of the closure's creation. Is that more or less what happens, or did I misread that? Is anything done to "freeze" a separate copy of the heap objects, or is it safe to assume that if they get modified between the closure's creation and its execution, the closure will no longer be operating on the original version of the object? If there is indeed copying going on, are there memory usage considerations in situations where one might want to create plenty of closures and store them somewhere?

    I think that misunderstanding some of these concepts might lead to tricky issues like the ones Eric Lippert mentions in this blog post. It's interesting because you'd think it wouldn't make sense to keep a reference to a value type that might be gone by the time the closure is called, but I'm guessing that in C# the compiler figures out that the variable is needed later and puts it on the heap instead. It seems that in most memory-managed languages everything is a reference, and thus Objective-C is in a somewhat unique situation in having to deal with copying what's on the stack.
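
    As a side note, the value-versus-reference distinction described above is easy to see in C++ lambdas (used here purely as an illustration, not as a statement about Objective-C's internals): capturing by value snapshots the stack variable, while a heap object reached through a copied pointer remains shared.

        #include <iostream>
        #include <memory>

        int main() {
            int counter = 10;                          // a stack variable
            auto heapObj = std::make_shared<int>(42);  // a heap object

            // Capture by value: counter and the shared_ptr are copied *now*.
            auto closure = [counter, heapObj]() {
                std::cout << counter << " " << *heapObj << "\n";
            };

            counter = 99;   // the closure keeps its snapshot of the old value
            *heapObj = 7;   // only the pointer was copied, so the closure sees this

            closure();      // prints "10 7"
        }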

  • How to change internal buffer size of DataInputStream

    - by Gaks
    I'm using this kind of code for my TCP/IP connection:

        sock = new Socket(host, port);
        sock.setKeepAlive(true);
        din = new DataInputStream(sock.getInputStream());
        dout = new DataOutputStream(sock.getOutputStream());

    Then, in a separate thread, I check din.available() to see if there are incoming packets to read. The problem is that if a packet bigger than 2048 bytes arrives, din.available() returns 2048 anyway, just as if there were a 2048-byte internal buffer. I can't read those 2048 bytes when I know it's not the full packet my application is waiting for. If I don't read them, however, everything gets stuck at 2048 bytes and nothing more is ever received. Can I enlarge the buffer size of DataInputStream somehow? The socket receive buffer is 16384, as returned by sock.getReceiveBufferSize(), so it's not the socket limiting me to 2048 bytes. If there is no way to increase the DataInputStream buffer size, I guess the only option is to declare my own buffer and read everything from the DataInputStream into that buffer? Regards
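
    One way around the limit, along the lines of the asker's own closing suggestion, is to stop polling available() and accumulate bytes into your own buffer until a full packet has arrived. A minimal sketch, assuming the application already knows the expected packet length from its protocol:

        import java.io.ByteArrayOutputStream;
        import java.io.DataInputStream;
        import java.io.IOException;

        public class PacketAccumulator {
            // Keep reading into our own buffer until expectedLength bytes have arrived,
            // instead of trusting din.available(), which may be capped by internal buffers.
            public static byte[] readPacket(DataInputStream din, int expectedLength) throws IOException {
                ByteArrayOutputStream packet = new ByteArrayOutputStream(expectedLength);
                byte[] chunk = new byte[4096];
                while (packet.size() < expectedLength) {
                    int read = din.read(chunk, 0, Math.min(chunk.length, expectedLength - packet.size()));
                    if (read == -1) {
                        throw new IOException("Connection closed before the full packet arrived");
                    }
                    packet.write(chunk, 0, read);
                }
                return packet.toByteArray();
            }
        }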

  • What does a Linux device need to be seen by HAL?

    - by Jaime Soriano
    I'm trying to learn about device drivers in the Linux kernel, and for that I've created three modules:

    - A bus type
    - A device driver
    - A fake device that does nothing for now and is only registered

    Everything works fine: I can load the bus, the driver, and the module that creates the device. Everything appears in sysfs, including the link between the device and the device driver that indicates they are bound. And when the driver and device are loaded, I can see with udevadm monitor that some events are also generated:

        KERNEL[1275564332.144997] add /module/bustest_driver (module)
        KERNEL[1275564332.145289] add /bus/bustest/drivers/bustest_example (drivers)
        UDEV [1275564332.157428] add /module/bustest_driver (module)
        UDEV [1275564332.157483] add /bus/bustest/drivers/bustest_example (drivers)
        KERNEL[1275564337.656650] add /module/bustest_device (module)
        KERNEL[1275564337.656817] add /devices/bustest_device (bustest)
        UDEV [1275564337.658294] add /module/bustest_device (module)
        UDEV [1275564337.664707] add /devices/bustest_device (bustest)

    But after all that, the device doesn't appear in HAL. What else does a device need in order to be seen by HAL?

  • C++ Program Flow: Sockets in an Object and the Main Function

    - by jfm429
    I have a rather tricky problem regarding C++ program flow using sockets. Basically what I have is this: a simple command-line socket server program that listens on a socket and accepts one connection at a time. When that connection is lost, it opens up for further connections. The socket communication system is contained in a class. The class is fully capable of receiving connections and mirroring the received data back to the client. However, the class uses UNIX sockets, which are not object-oriented. My problem is that my main() function has one line: the one that creates an instance of that object. The object then initializes and waits, but as soon as a connection is gained, the object's initialization function returns, and when that happens, the program quits. How do I somehow wait until this object is deleted before the program quits? Summary:

    - main() creates an instance of the object
    - The object listens
    - A connection is received
    - The object's initialization function returns
    - main() exits (!)

    What I want is for main() to somehow delay until that object is finished with what it's doing (i.e. it will delete itself) before it quits. Any thoughts?
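
    One common shape for this, sketched below with hypothetical names (Server, run(), stop()), is to keep the constructor cheap and put the accept/serve loop into a blocking run() method that main() calls; main() then cannot fall off the end until the loop decides to stop:

        // Sketch only; the socket details stay inside the class as before.
        class Server {
        public:
            explicit Server(int port) : port_(port) { /* create and bind the listening socket */ }

            void run() {
                while (running_) {
                    // accept() one client, mirror its data until it disconnects,
                    // then loop back and wait for the next connection
                }
            }

            void stop() { running_ = false; }

        private:
            int  port_;
            bool running_ = true;
        };

        int main() {
            Server server(8080);  // construction only sets things up
            server.run();         // blocks here for the lifetime of the server
            return 0;             // reached only after run() decides to stop
        }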

  • Vim (terminal) - copy to X clipboard and paste while suspended

    - by gmatt
    I have vimx installed, so I can copy from vimx to the X clipboard using "+y and the like, which works well as long as I keep the current vimx running. However, I would also love to be able to switch to the currently running shell with ctrl-z and paste what I copied from vim into the shell. Does anyone know how to do this? As soon as I suspend vim with ctrl-z, the X clipboard becomes empty until I bring vim back to the foreground again.

  • .NET / WPF Alternative

    - by eWolf
    I know the .NET framework and WPF pretty well, but I think the whole thing has become too bloated, especially for small apps, as the .NET Framework 3.5 now weighs 197 MB. I am looking for a language/framework/library that provides functionality similar to that of WPF (animations, gradients, and so on) and the .NET framework (of course not everything, but the basic features), that is faster and more lightweight than the .NET framework, and that produces smaller and faster applications than ones built on .NET. Do you have any suggestions?

  • Why does Java force a User-Agent header through simple Socket IO?

    - by Zombies
    I am using nothing but raw Socket IO. There isn't a single HttpURLConnection nor any HTTP client library in my project. When I run it through Wireshark I see something very revealing:

        GET / HTTP/1.1
        User-Agent: Java/1.6.0_15
        Host: www.google.com
        Accept: text/html, image/gif, image/jpeg, *; q=.2, */*; q=.2
        Connection: keep-alive

    Here is the crazy part: I never put ANY of that in my original request. My original request was:

        "GET http://www.google.com/ HTTP/1.1\r\n" +
        "Host: www.google.com\r\n" +
        "User-Agent: Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.1.8) Gecko/20100214 Ubuntu/9.10 (karmic) Firefox/3.5.8\r\n" +
        "Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8\r\n" +
        "Accept-Language: en-us,en;q=0.5\r\n" +
        "Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7\r\n" +
        "Keep-Alive: 300\r\n" +
        "\r\n";

    I am using the default Sun JVM.
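
    For reference, bytes written straight to a Socket's OutputStream go out exactly as written; the JVM adds no HTTP headers at that level, so a capture like the one above usually means an HttpURLConnection (or something similar) is making a request somewhere else. A minimal raw-socket sketch:

        import java.io.OutputStream;
        import java.net.Socket;

        public class RawHttpRequest {
            public static void main(String[] args) throws Exception {
                Socket sock = new Socket("www.google.com", 80);
                try {
                    String request =
                        "GET / HTTP/1.1\r\n" +
                        "Host: www.google.com\r\n" +
                        "User-Agent: Mozilla/5.0 (custom)\r\n" +
                        "Connection: close\r\n" +
                        "\r\n";
                    OutputStream out = sock.getOutputStream();
                    out.write(request.getBytes("US-ASCII"));  // sent byte-for-byte, nothing appended
                    out.flush();
                    // ... read the response from sock.getInputStream() ...
                } finally {
                    sock.close();
                }
            }
        }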

  • For what applications is Forth best suited?

    - by namin
    I am intrigued by stack-based languages like Forth. Are there situations where Forth is the best tool for the job or is it just an intellectual and historical curiosity? What about derivative languages like Factor or Joy? Which of these languages would you recommend learning? And for what purpose (apart from mind expansion)?

  • Using sys/socket.h functions on Windows

    - by BSchlinker
    Hello, I'm attempting to use the socket.h functions on Windows. Essentially, I'm currently looking at the sample code at http://beej.us/guide/bgnet/output/html/multipage/clientserver.html#datagram. I understand that socket.h is a Unix header -- is there any way I can easily emulate that environment while compiling this sample code? Does a different IDE / compiler change anything? Otherwise, I imagine I need to use a virtualized Linux environment, which may be best anyway, as the code will most likely be running in a UNIX environment. Thanks.
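
    One common route (a sketch, not a drop-in fix for the Beej sample) is Winsock, which mirrors most of the BSD socket calls behind winsock2.h once WSAStartup has been called, with the program linked against ws2_32:

        #include <winsock2.h>
        #include <stdio.h>

        #pragma comment(lib, "ws2_32.lib")  /* MSVC: pull in the Winsock library */

        int main(void) {
            WSADATA wsa;
            SOCKET s;

            if (WSAStartup(MAKEWORD(2, 2), &wsa) != 0) {  /* required before any socket call */
                fprintf(stderr, "WSAStartup failed\n");
                return 1;
            }

            s = socket(AF_INET, SOCK_DGRAM, 0);  /* same shape as the BSD call */
            if (s == INVALID_SOCKET) {
                fprintf(stderr, "socket() failed: %d\n", WSAGetLastError());
                WSACleanup();
                return 1;
            }

            closesocket(s);  /* close() becomes closesocket() on Windows */
            WSACleanup();
            return 0;
        }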

  • F# - core benefits

    - by David Neale
    Since the release of VS 2010 I've seen F# more strongly advertised by Microsoft. What are the core benefits of using this language? What problems does it most naturally lend itself to? What is the learning curve like?

  • Packet drops and splits in UDP tunnel

    - by sr-dusad
    Hi guys! I am currently working on a video conferencing project. For this I am using pwnat for NAT traversal; pwnat is based on UDP tunneling, and I am using a TCP connection for data transmission. My problem is that when I send a packet, it does not arrive properly at the destination side: sometimes the packet is dropped, and many times it is broken (split) into pieces. Please help me: how can I send and receive a packet in a single piece, so that I can draw the image properly and play the sound? Any kind of help will be appreciated. Thanks in advance.
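
    The splitting itself is expected: TCP is a byte stream and preserves no message boundaries, so the receiver has to reassemble its own "packets". The usual remedy is to frame each message, for example with a length prefix. The question doesn't name a language, so here is the idea sketched in Java:

        import java.io.DataInputStream;
        import java.io.DataOutputStream;
        import java.io.IOException;

        public final class Framing {
            // Sender: prefix every message with its length in bytes.
            public static void sendMessage(DataOutputStream out, byte[] message) throws IOException {
                out.writeInt(message.length);
                out.write(message);
                out.flush();
            }

            // Receiver: read the length, then block until the whole message has arrived.
            public static byte[] receiveMessage(DataInputStream in) throws IOException {
                int length = in.readInt();
                byte[] message = new byte[length];
                in.readFully(message);  // reassembles however TCP split the bytes
                return message;
            }
        }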

  • How can I define a verb in J that applies a different verb alternately to each atom in a list?

    - by Gregory Higley
    Imagine I've defined the following name in J:

        m =: >: i. 2 4 5

    This looks like the following:

         1  2  3  4  5
         6  7  8  9 10
        11 12 13 14 15
        16 17 18 19 20

        21 22 23 24 25
        26 27 28 29 30
        31 32 33 34 35
        36 37 38 39 40

    I want to create a monadic verb of rank 1 that applies to each list in this list of lists. It will double (+:) or add 1 (>:) to each alternate item in the list. If we were to apply this verb to the first row, we'd get 2 3 6 5 10. It's fairly easy to get a list of booleans which alternates with each item, e.g., 0 1 $~ {: $ m gives us 0 1 0 1 0. I thought, aha! I'll use something like +:`>: @. followed by some expression, but I could never quite get it to work. Any suggestions?

    UPDATE: The following appears to work, but perhaps it can be refactored into something more elegant by a J pro:

        poop =: monad define
        (($ y) $ 0 1 $~ {: $ y) ((]+:)`(]>:) @. [)"0 y
        )
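
    For comparison only (this is Haskell, not J): the "apply two verbs alternately" idea can be expressed by zipping the list against a cycled list of functions, which reproduces the 2 3 6 5 10 expected for the first row:

        -- Pair each element with an alternating function and apply it.
        alternate :: [Int] -> [Int]
        alternate = zipWith ($) (cycle [(* 2), (+ 1)])

        main :: IO ()
        main = print (alternate [1 .. 5])  -- [2,3,6,5,10]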

  • Active Directory logonCount is 0, though the user has logged in

    - by Arun
    For a user in Active Directory, the properties hold values for lastlogontime and lastlogontimestamp, but logonCount is 0. I have only one domain controller in that domain. From searching around, I found that a logonCount value of 0 indicates that the value is unknown, but I am totally confused about why it is unknown. Is this an issue with AD?

  • How to replace for-loops with a functional statement in C#?

    - by Lernkurve
    A colleague once said that God kills a kitten every time I write a for-loop. When asked how to avoid for-loops, his answer was to use a functional language. However, if you are stuck with a non-functional language, say C#, what techniques are there to avoid for-loops or to get rid of them by refactoring? With lambda expressions and LINQ perhaps? If so, how?

    Questions

    So the question boils down to:

    - Why are for-loops bad? Or, in what context are for-loops to be avoided, and why?
    - Can you provide C# code examples of how it looks before, i.e. with a loop, and afterwards without a loop?
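
    As one possible before/after of the kind being asked for (an illustration, not the only refactoring): the loop builds a filtered, transformed list imperatively, while the LINQ version states the same thing declaratively.

        using System;
        using System.Collections.Generic;
        using System.Linq;

        class LoopVersusLinq
        {
            static void Main()
            {
                var numbers = new List<int> { 1, 2, 3, 4, 5, 6 };

                // Before: an explicit loop with mutation.
                var evenSquaresLoop = new List<int>();
                foreach (var n in numbers)
                {
                    if (n % 2 == 0)
                        evenSquaresLoop.Add(n * n);
                }

                // After: the same result as a declarative LINQ pipeline.
                var evenSquaresLinq = numbers
                    .Where(n => n % 2 == 0)
                    .Select(n => n * n)
                    .ToList();

                Console.WriteLine(string.Join(", ", evenSquaresLinq));  // 4, 16, 36
            }
        }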

  • How to get an internship with a low GPA?

    - by Jason Baker
    A lot of changed majors and some other mitigating circumstances have left me with a pretty low GPA. My GPA in the last couple of semesters hasn't been stellar, but my grades have gotten a LOT better. I want to start putting in some resumes to get a good internship this summer. I do think I have decent experience for someone at my level, but I see my GPA as a pretty big potential stumbling block. Is there anything I can do to help my chances of getting a good internship? (For the record, the mitigating circumstances aren't something I'd feel comfortable discussing with a potential employer. I'd prefer getting a job by proving my merit, not by making excuses.)
