Search Results

Search found 7985 results on 320 pages for 'multi byte'.


  • Object to Network serialization - with an existing protocol

    - by cpf
    I'm writing a client for a server program written in C++. As is not unusual, the entire networking protocol is in a format where packets can easily be memcpy'd into and out of a C++ structure (a 1-byte packet code, then a different layout per packet type). I could do the same thing in C#, but is there an easier way, especially considering that a lot of the data is fixed-length char arrays that I want to work with as strings? Or should I just suck it up and convert types as needed? I've looked at the ISerializable interface, but it doesn't look as low-level as is required.

    Read the article

  • Characters other than 0x00-0x7F are not shown when converting from "ISO-8859-1" to "UTF-8"

    - by Mike.Huang
    I need to get a string from a browser's URL request and then create a text image from the requested text. I know the default encoding for Java network transmission is "ISO-8859-1", and it works fine for all characters defined in "ISO-8859-1". But when the request contains multi-byte Unicode characters (e.g. Chinese), I need to re-decode the string from "ISO-8859-1" to "UTF-8". My code looks like:

        String result = new String(requestString.getBytes("ISO-8859-1"), "UTF-8");

    That works for the multi-byte text, but now some of the ISO-8859-1 characters are no longer shown, namely those in the 0x80-0xFF range; i.e. every character except 0x00-0x7F is lost after converting from "ISO-8859-1" to "UTF-8". Is there another method that solves this?
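
    A minimal sketch of one possible fix (not from the original post): re-decode as UTF-8 only when the underlying bytes actually form valid UTF-8, and keep the Latin-1 string otherwise. The class and method names here are made up for illustration.

        import java.nio.ByteBuffer;
        import java.nio.charset.CharacterCodingException;
        import java.nio.charset.CharsetDecoder;
        import java.nio.charset.CodingErrorAction;
        import java.nio.charset.StandardCharsets;

        final class CharsetFallback {
            // Re-decode a Latin-1-decoded string as UTF-8, keeping the original
            // string when its bytes are not a valid UTF-8 sequence.
            static String reinterpret(String latin1Decoded) {
                byte[] raw = latin1Decoded.getBytes(StandardCharsets.ISO_8859_1);
                CharsetDecoder utf8 = StandardCharsets.UTF_8.newDecoder()
                        .onMalformedInput(CodingErrorAction.REPORT)
                        .onUnmappableCharacter(CodingErrorAction.REPORT);
                try {
                    return utf8.decode(ByteBuffer.wrap(raw)).toString();
                } catch (CharacterCodingException e) {
                    return latin1Decoded; // genuine Latin-1 text: leave it untouched
                }
            }
        }

    The strict decoder (CodingErrorAction.REPORT) is what makes the fallback safe: the default decoder would silently substitute malformed input rather than signal it.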

    Read the article

  • Python - Finding unicode/ascii problems

    - by user330739
    Hi all, I am using csv.reader to pull in info from a very long sheet. I am doing work on that data set, and then I am using the xlwt package to give me a workable Excel file. However, I get this error: UnicodeDecodeError: 'ascii' codec can't decode byte 0x92 in position 34: ordinal not in range(128). My question to you all is: how can I find exactly where that error is in my data set? Also, is there some code I can write that will look through my data set and find out where the issues lie (because some data sets run without the above error and others have problems)?

    Read the article

  • Is there a concise way to create an InputSupplier for an InputStream in Google Guava?

    - by Fabian Steeg
    There are a few factory methods in Google Guava for creating InputSuppliers, e.g. from a byte[]:

        ByteStreams.newInputStreamSupplier(bytes);

    Or from a File:

        Files.newInputStreamSupplier(file);

    Is there a similar way to create an InputSupplier for a given InputStream? That is, a way that's more concise than an anonymous class:

        new InputSupplier<InputStream>() {
            public InputStream getInput() throws IOException {
                return inputStream;
            }
        };

    Background: I'd like to use InputStreams with e.g. Files.copy(...) or ByteStreams.equal(...).
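
    Guava itself doesn't appear to offer such a factory, so one option is to hoist the anonymous class into a small reusable utility. A sketch, assuming Guava's com.google.common.io.InputSupplier; the class and method names are invented:

        import com.google.common.io.InputSupplier;
        import java.io.IOException;
        import java.io.InputStream;

        final class Suppliers2 {
            // Wraps an already-open stream. Note that getInput() hands back the
            // same stream on every call, so the resulting supplier is single-use.
            static InputSupplier<InputStream> of(final InputStream in) {
                return new InputSupplier<InputStream>() {
                    public InputStream getInput() throws IOException {
                        return in;
                    }
                };
            }
        }

    With that, ByteStreams.equal(Suppliers2.of(a), Suppliers2.of(b)) compiles, with the caveat that each such supplier can only be consumed once.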

    Read the article

  • With Maven2, how would I specify a custom directory to which a dependency should be copied?

    - by Benny
    Basically, I have a multi-module project consisting of 5 different modules. One of the modules is kind of the parent module to the other 4, meaning the other 4 need to be built before the 5th, so you could say that each of the 4 modules is a dependency of the 5th. Thus, I've made dependency entries for each of the modules in the 5th module's pom.xml. However, when I build the project, I don't want those 4 dependencies copied to the "lib" directory of the 5th module. I'd like to specify the directory into which each of them should be placed explicitly. Is there any way to do this with Maven2? Thanks for your help, B.J.

    Read the article

  • OpenMP vs OpenCL for computer vision

    - by user1235711
    I am creating a computer vision application that detects objects via a web camera, and I am currently focusing on its performance. My problem is in the part of the application that generates the XML cascade file from the Haar training data: it is very slow and takes about 6 days. To get around this problem I decided to use multiprocessing to minimize the total time to generate the Haar training XML file. I found two candidate solutions: OpenCL, and OpenMP combined with OpenMPI. Now I'm confused about which one to use. I read that OpenCL lets you use multiple CPUs and the GPU, but only on the same machine. Is that so? On the other hand, OpenMP is for multiprocessing on one machine, and with OpenMPI we can use multiple CPUs over the network, but OpenMP has no GPU support. Can you please suggest the pros and cons of using either of these libraries?

    Read the article

  • How should my team decide between 3-tier and 2-tier architectures?

    - by j0rd4n
    My team is discussing the future direction of our projects. Half the team believes in a pure 3-tier architecture while the other half favors a 2-tier architecture.
    Project assumptions:
    - Enterprise business applications
    - Business logic needed between user and database
    - Data validation necessary
    - Service-oriented (prefer RESTful services)
    - Multi-year maintenance plan
    - Support for hundreds of users
    The 3-tier team favors:
    - Persistence layer <== Domain layer <== UI layer
    - A service boundary at least between the persistence layer and the domain layer; the domain layer might have a service boundary within it as well
    - Translations between each layer (clean DTO separation)
    - Hand-rolled persistence unless we can find creative yet elegant automation
    The 2-tier team favors:
    - Entity Framework + WCF Data Service layer <== UI layer
    - Business logic kept in WCF Data Service interceptors
    - Minimal translation between layers - favor faster coding
    So that's the high-level argument. What considerations should we take into account? What experiences have you had with either approach?

    Read the article

  • Can in-memory SQLite databases scale with concurrency?

    - by Kent Boogaart
    In order to prevent a SQLite in-memory database from being cleaned up, one must use the same connection to access the database. However, using the same connection causes SQLite to synchronize access to the database. Thus, if I have many threads performing reads against an in-memory database, it is slower on a multi-core machine than the exact same code running against a file-backed database. Is there any way to get the best of both worlds? That is, an in-memory database that permits multiple, concurrent calls to the database?

    Read the article

  • How can I get Perl to detect the bad UTF-8 sequences?

    - by gorilla
    I'm running Perl 5.10.0 and Postgres 8.4.3, and I'm inserting strings into a database through DBIx::Class. These strings should be in UTF-8, and therefore my database is running in UTF-8. Unfortunately some of these strings are bad, containing malformed UTF-8, so when I run it I get an exception: DBI Exception: DBD::Pg::st execute failed: ERROR: invalid byte sequence for encoding "UTF8": 0xb5. I thought I could simply ignore the invalid ones and worry about the malformed UTF-8 later, so this code should flag and skip the bad titles:

        if (not utf8::valid($title)) {
            $title = "Invalid UTF-8";
        }
        $data->title($title);
        $data->update();

    However, Perl seems to think the strings are valid, yet the exception is still thrown. How can I get Perl to detect the bad UTF-8?

    Read the article

  • UTF-8 conversion doesn't always work

    - by Marco Piccinni
    I searched other questions before posting and didn't find anything similar. I have to scrape various UTF-8 web pages that contain text like "Oggi è una bellissima giornata"; the problem is with the character "è". I extract the text with JTidy and an XPath query expression, and I convert it with:

        byte[] content = filteredEncodedString.getBytes("utf-8");
        String result = new String(content, "utf-8");

    where filteredEncodedString contains the text "Oggi è una bellissima giornata". This procedure works on most of the web pages analyzed so far, but in some cases it doesn't produce a valid UTF-8 string. The page encoding is always the same and the text is similar. Any ideas about the problem? Thanks, Marco
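
    Worth noting: encoding a String to UTF-8 bytes and immediately decoding those same bytes as UTF-8 is an identity operation, so the snippet above cannot repair text that was already decoded with the wrong charset; the charset has to be applied where the raw bytes first become characters. A minimal sketch of that idea (the fetchUtf8 helper and its use of java.net.URL are assumptions, since the post doesn't show how the page is fetched):

        import java.io.BufferedReader;
        import java.io.IOException;
        import java.io.InputStreamReader;
        import java.net.URL;
        import java.nio.charset.StandardCharsets;

        final class PageFetcher {
            // Decode the page as UTF-8 at the point where the bytes are read,
            // instead of round-tripping an already-built String.
            static String fetchUtf8(String address) throws IOException {
                StringBuilder page = new StringBuilder();
                try (BufferedReader in = new BufferedReader(new InputStreamReader(
                        new URL(address).openStream(), StandardCharsets.UTF_8))) {
                    String line;
                    while ((line = in.readLine()) != null) {
                        page.append(line).append('\n');
                    }
                }
                return page.toString();
            }
        }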

    Read the article

  • How to receive a datastream from a device on your computer, in C#

    - by WebDevHobo
    I plan to build a small audio-recorder app in C#. My laptop has a built-in microphone that's always active, so I want to use that as an early-stage test. I would simply start recording, save the file as a .wav, or even use the LAME DLL to turn it into an MP3. The problem is, I don't know how to access that microphone. Do I use a library that can detect the device, or do I just catch a stream of bytes from the port the device is on? I don't have any experience with receiving data from connected devices. I suppose I'll need to read all the data into a byte array and then serialize it into a WAV file, but I'm not sure. Can I get some pointers on this subject?

    Read the article

  • What would be a good "CMS" for me to use?

    - by Tim Geerts
    Hey, I'm looking for some sort of CMS to use as a "documentation" system. I'm not sure which systems would suit my needs best, so I thought I'd come here and type up my requirements so you can help me narrow down the options. One important note: I'm not looking for a system to store documents (Word, PDF, whatever), but rather one where I can type the documentation text into some sort of post (like a blog). Requirements:
    - Multi-language support
    - Tagging
    - Decent search support (tags, groupings, categories)
    - Version control of posts/articles
    - Possibility of exporting posts to a PDF file
    - Multi-user support (usergroup X can only see these posts, usergroup Y can see others, etc.)
    I know these are some strange requirements when combined, and I reckon most of you would say I'd have to develop something like this in-house rather than find a decent working product out there (open source if possible). Nonetheless, I thought I'd at least ask the opinion of y'all. Regards, Tim

    Read the article

  • CryptGenRandom to generate an ASP.NET session id

    - by DoDo
    Hi! Does anyone have a working example of using the CryptGenRandom function to generate a session id (I need to use it in my IIS module)?

        HCRYPTPROV hCryptProv;
        BYTE pbData[16];
        if (CryptAcquireContext(&hCryptProv, NULL, NULL, PROV_RSA_FULL, CRYPT_VERIFYCONTEXT))
        {
            if (CryptGenRandom(hCryptProv, 8, pbData))
            {
                // The original built a std::string from pbData with no length,
                // which reads past the 8 random bytes until it happens to hit a
                // NUL; pass the byte count explicitly instead.
                std::string s((const char*) pbData, 8);
                printf("%s", s.c_str());
            }
            else
            {
                MyHandleError("Error during CryptGenRandom.");
            }
        }
        else
        {
            MyHandleError("Error during CryptAcquireContext!\n");
        }

    I tried this code, but it's not working quite right (I got it from MSDN), and this example doesn't work for me either: http://www.codeproject.com/KB/security/plaintextsessionkey.aspx . So if anyone knows how to generate a session id using this API, please let me know. Thanks anyway!

    Read the article

  • How to pass an event as a parameter in C#

    - by Jerry Liu
    I'm writing a unit test for a multi-threaded application, where I need to wait until a specific event is triggered so that I know the async operation is done. E.g. when I call repository.add(something), I wait for the AfterChange event before doing any assertions. So I wrote a utility function for that:

        public static void SyncAction(EventHandler event_, Action action_)
        {
            var signal = new object();
            EventHandler callback = null;
            callback = new EventHandler((s, e) =>
            {
                lock (signal)
                {
                    Monitor.Pulse(signal);
                }
                event_ -= callback;
            });
            event_ += callback;
            lock (signal)
            {
                action_();
                Assert.IsTrue(Monitor.Wait(signal, 10000));
            }
        }

    However, the compiler prevents passing the event out of its class. Is there a way to achieve this?

    Read the article

  • Maximum number of bytes that can be sent on a TCP connection

    - by iamrohitbanga
    I initially assumed that since TCP has a 32-bit sequence number field, and each byte sent on a TCP connection is labeled with a unique sequence number, the maximum number of bytes that can be sent on a TCP connection is about 2^32-1 or 2^32-2 (which?). But now I think that since TCP is a sliding-window protocol, the wraparound of sequence numbers during the connection should not affect the maximum number of bytes that can be sent, as long as the old packet carrying the same sequence number is no longer in the network when the wraparound occurs (i.e. the number is reused only after 2*MSL). What is the correct answer?

    Read the article

  • How do I store an image's pixels in matrix form?

    - by Rajendra Bhole
    Hi, I'm developing an application in which I want to store the pixelized image in matrix format. The code is as follows:

        struct pixel {
            //unsigned char r, g, b, a;
            Byte r, g, b;
            int count;
        };

        - (NSInteger) processImage1: (UIImage*) image {
            // Allocate a buffer big enough to hold all the pixels
            struct pixel* pixels = (struct pixel*) calloc(1, image.size.width * image.size.height * sizeof(struct pixel));
            if (pixels != nil) {
                // Create a new bitmap
                CGContextRef context = CGBitmapContextCreate(
                    (void*) pixels,
                    image.size.width,
                    image.size.height,
                    8,
                    image.size.width * 4,
                    CGImageGetColorSpace(image.CGImage),
                    kCGImageAlphaPremultipliedLast
                );
                NSLog(@"1=%d, 2=%d, 3=%d",
                      CGImageGetBitsPerComponent(image.CGImage),
                      CGImageGetBitsPerPixel(image.CGImage),
                      CGImageGetBytesPerRow(image.CGImage));
                if (context != NULL) {
                    // Draw the image in the bitmap
                    CGContextDrawImage(context, CGRectMake(0.0f, 0.0f, image.size.width, image.size.height), image.CGImage);
                    NSUInteger numberOfPixels = image.size.width * image.size.height;

    I'm confused about how to initialize the 2-D matrix in which to store the pixel data.

    Read the article

  • C++ : size of int, long, etc...

    - by Jérôme
    I'm looking for detailed information regarding the sizes of the basic C++ types. I know it depends on the architecture (16-bit, 32-bit, 64-bit) and the compiler. But are there any standards? I'm using Visual Studio 2008 on a 32-bit architecture. Here is what I get:

        char   : 1 byte
        short  : 2 bytes
        int    : 4 bytes
        long   : 4 bytes
        float  : 4 bytes
        double : 8 bytes

    I tried to find, without much success, reliable information stating the sizes of char, short, int, long, double, and float (and other types I haven't thought of) under different architectures and compilers.

    Read the article

  • How to get the Host OS (operating system) parameter of a ZIP archive in C#

    - by bao
    Look: I have ZIP archives prepared on different OSes: Mac, Linux, Windows. On Windows the file names are encoded in DOS CP866; on Mac and Linux, in UTF-8. I need to know (in code) on which OS the zip file was prepared in order to decode the file names correctly. There is a Host OS parameter in the "central directory structure" of a zip file (see http://www.fileformat.info/format/zip/corion.htm ). How do I read that 1-byte Host OS parameter at offset 0005h in C#?

    Read the article

  • Using C#, how to convert ISO-8859-1 encoded text files that contain Latin-1 accented characters to UTF-8

    - by Tim
    I am being sent text files saved in ISO-8859-1 format that contain accented characters from the Latin-1 range (as well as normal ASCII a-z, etc.). How do I convert these files to UTF-8 using C# so that the single-byte accented characters in ISO-8859-1 become valid UTF-8 characters? I have tried using a StreamReader with ASCIIEncoding, and then converting the ASCII string to UTF-8 by instantiating an ASCII encoding and a UTF-8 encoding and then using Encoding.Convert(ascii, utf8, ascii.GetBytes(asciiString)), but the accented characters are rendered as question marks. What step am I missing? Thanks

    Read the article

  • Java String to SHA1

    - by AeroDroid
    I'm trying to make a simple String-to-SHA1 converter in Java and this is what I've got:

        public static String toSHA1(byte[] convertme) {
            MessageDigest md = null;
            try {
                md = MessageDigest.getInstance("SHA-1");
            } catch (NoSuchAlgorithmException e) {
                e.printStackTrace();
            }
            return new String(md.digest(convertme));
        }

    When I pass it toSHA1("password".getBytes()), I get "[?a?????%l?3~??.". I know it's probably a simple encoding fix like UTF-8, but could someone tell me what I should do to get what I want, which is "5baa61e4c9b93f3f0682250b6cf8331b7ee68fd8"? Or am I doing this completely wrong? Thanks a lot!
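
    The expected output is the digest rendered as hexadecimal, not the raw digest bytes reinterpreted as text in the platform default charset (which is what new String(md.digest(...)) does). A sketch of the hex-encoding variant; the method name toSHA1Hex is invented:

        import java.security.MessageDigest;
        import java.security.NoSuchAlgorithmException;

        public static String toSHA1Hex(byte[] convertme) throws NoSuchAlgorithmException {
            byte[] digest = MessageDigest.getInstance("SHA-1").digest(convertme);
            StringBuilder hex = new StringBuilder(digest.length * 2);
            for (byte b : digest) {
                hex.append(String.format("%02x", b)); // two lowercase hex digits per byte
            }
            return hex.toString();
        }

    toSHA1Hex("password".getBytes("UTF-8")) then yields the expected "5baa61e4c9b93f3f0682250b6cf8331b7ee68fd8".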

    Read the article

  • What platforms have something other than 8-bit char?

    - by Craig McQueen
    Every now and then, someone on SO points out that char (aka 'byte') isn't necessarily 8 bits. It seems that 8-bit char is almost universal. I would have thought that for mainstream platforms, it is necessary to have an 8-bit char to ensure its viability in the marketplace. Both now and historically, what platforms use a char that is not 8 bits, and why would they differ from the "normal" 8 bits? When writing code, and thinking about cross-platform support (e.g. for general-use libraries), what sort of consideration is it worth giving to platforms with non-8-bit char? In the past I've come across some Analog Devices DSPs for which char is 16 bits. DSPs are a bit of a niche architecture I suppose. (Then again, at the time hand-coded assembler easily beat what the available C compilers could do, so I didn't really get much experience with C on that platform.)

    Read the article

  • Me As Child Type In General Function

    - by Steven
    I have a MustInherit parent class with two child classes that inherit from it. How can I use (or cast) Me in a parent-class function as the child type of that instance? EDIT: My actual goal is to be able to serialize (BinaryFormatter.Serialize(Stream, Object)) either of my child classes. However, repeating the code in each child seems wrong. EDIT2: This is my Serialize function. Where should I implement it? Copying and pasting it into each child doesn't seem right, but casting the parent to a child doesn't seem right either.

        Public Function Serialize() As Byte()
            Dim bFmt As New BinaryFormatter()
            Dim mStr As New MemoryStream()
            bFmt.Serialize(mStr, Me)
            Return mStr.ToArray()
        End Function

    Read the article

  • Why hasn't functional programming taken over yet?

    - by pankrax
    I've read some texts about declarative/functional programming (languages), tried out Haskell, and written one myself. From what I've seen, functional programming has several advantages over the classical imperative style:
    - Stateless programs; no side effects
    - Concurrency; plays extremely nicely with the rising multi-core technology
    - Programs are usually shorter and in some cases easier to read
    - Productivity goes up (example: Erlang)
    - Imperative programming is a very old paradigm (as far as I know) and possibly not suitable for the 21st century
    Why are companies using, or programs written in, functional languages still so "rare"? Why, when looking at the advantages of functional programming, are we still using imperative programming languages? Maybe it was too early for it in 1990, but today?

    Read the article

  • CRT not initialized

    - by jfhs
    I'm trying to compile a project with MSVC 2010. Compilation succeeds, but when I run the app it fails with a "CRT not initialized" error. It is a console application, so I tried specifying mainCRTStartup as the entry point, but that didn't help. The other projects in the same solution don't have this problem. The difference I see is that the failing one uses Boost (v1.38.0, if that matters). The runtime library is Multi-threaded DLL. The linker command line is:

        /OUT:"D:\temp\ghost\Release\ghost.exe" /INCREMENTAL:NO /NOLOGO /LIBPATH:"..\zlib\lib" /LIBPATH:"..\mysql\lib\opt" /LIBPATH:"..\boost\lib" "ws2_32.lib" "winmm.lib" "zdll.lib" "StormLibRAS.lib" "kernel32.lib" "user32.lib" "gdi32.lib" "winspool.lib" "comdlg32.lib" "advapi32.lib" "shell32.lib" "ole32.lib" "oleaut32.lib" "uuid.lib" "odbc32.lib" "odbccp32.lib" "D:\temp\ghost\bncsutil\vc8_build\Release\BNCSutil.lib" /MANIFEST /ManifestFile:"Release\ghost.exe.intermediate.manifest" /ALLOWISOLATION /MANIFESTUAC:"level='asInvoker' uiAccess='false'" /DEBUG /PDB:"D:\temp\ghost\Release\ghost.pdb" /SUBSYSTEM:CONSOLE /OPT:REF /OPT:ICF /PGD:"D:\temp\ghost\Release\ghost.pgd" /LTCG /TLBID:1 /ENTRY:"mainCRTStartup" /DYNAMICBASE /NXCOMPAT /MACHINE:X86 /ERRORREPORT:QUEUE

    Read the article
