Search Results

Search found 3443 results on 138 pages for 'byte'.

  • Using C#, how to convert ISO-8859-1 encoded text files that contain Latin-1 accented characters to UTF-8

    - by Tim
    I am being sent text files saved in ISO-8859-1 format that contain accented characters from the Latin-1 range (as well as normal ASCII a-z etc.). How can I convert these files to UTF-8 using C# so that the single-byte accented characters in ISO-8859-1 become valid UTF-8 characters? I have tried to use a StreamReader with ASCIIEncoding, and then converting the ASCII string to UTF-8 by instantiating an ASCII encoding and a UTF-8 encoding and then using Encoding.Convert(ascii, utf8, ascii.GetBytes(asciiString)), but the accented characters are being rendered as question marks. What step am I missing? Thanks
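
    The question marks are the giveaway: .NET's ASCIIEncoding replaces every byte above 127 with '?', so the accented characters are lost before the conversion even starts. The fix is to decode with a Latin-1 decoder and then re-encode as UTF-8. The question is about C#, but as a hedged sketch of the same round trip, here it is in Java; the file names are placeholders:

        import java.io.*;
        import java.nio.charset.StandardCharsets;

        public class Latin1ToUtf8 {
            public static void main(String[] args) throws IOException {
                // Decode as ISO-8859-1 (every byte maps to a character, so accents survive),
                // then write the same characters back out encoded as UTF-8.
                try (Reader in = new InputStreamReader(
                             new FileInputStream("input.txt"), StandardCharsets.ISO_8859_1);
                     Writer out = new OutputStreamWriter(
                             new FileOutputStream("output.txt"), StandardCharsets.UTF_8)) {
                    char[] buf = new char[8192];
                    int n;
                    while ((n = in.read(buf)) != -1) {
                        out.write(buf, 0, n);
                    }
                }
            }
        }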

  • How do I write raw binary data in Python?

    - by Chris B.
    I've got a Python program that stores and writes data to a file. The data is raw binary data, stored internally as str. I'm writing it out through a utf-8 codec. However, I get UnicodeDecodeError: 'charmap' codec can't decode byte 0x8d in position 25: character maps to <undefined> in the cp1252.py file. This looks to me like Python is trying to interpret the data using the default code page. But it doesn't have a default code page. That's why I'm using str, not unicode. I guess my questions are: How do I represent raw binary data in memory, in Python? When I'm writing raw binary data out through a codec, how do I encode/unencode it?

  • What platforms have something other than 8-bit char?

    - by Craig McQueen
    Every now and then, someone on SO points out that char (aka 'byte') isn't necessarily 8 bits. It seems that 8-bit char is almost universal. I would have thought that for mainstream platforms, it is necessary to have an 8-bit char to ensure its viability in the marketplace. Both now and historically, what platforms use a char that is not 8 bits, and why would they differ from the "normal" 8 bits? When writing code, and thinking about cross-platform support (e.g. for general-use libraries), what sort of consideration is it worth giving to platforms with non-8-bit char? In the past I've come across some Analog Devices DSPs for which char is 16 bits. DSPs are a bit of a niche architecture I suppose. (Then again, at the time hand-coded assembler easily beat what the available C compilers could do, so I didn't really get much experience with C on that platform.)

  • Quick way to do data lookup in PHP

    - by Ghostrider
    I have a data table with 600,000 records that is around 25 megabytes in size. It is indexed by a 4-byte key. Is there a way to find a row in such a dataset quickly with PHP without resorting to MySQL? The website in question is mostly static, with minor PHP code and no database dependencies, and therefore fast. I would like to add this data without having to use MySQL if possible. In C++ I would memory-map the file and do a binary search in it. Is there a way to do something similar in PHP?
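
    The memory-map-and-binary-search idea translates to any language as long as the file is sorted by key and the records have a fixed size. The question asks for PHP, but here is a hedged sketch of the record layout and search logic in Java; the record size and the big-endian key encoding are assumptions:

        import java.io.IOException;
        import java.io.RandomAccessFile;

        public class FixedRecordLookup {
            static final int RECORD_SIZE = 44;   // assumed: 4-byte key + 40-byte payload

            // Binary search over a file of fixed-size records sorted by a big-endian 4-byte key.
            static long findRecordOffset(RandomAccessFile file, int key) throws IOException {
                long lo = 0, hi = file.length() / RECORD_SIZE - 1;
                while (lo <= hi) {
                    long mid = (lo + hi) >>> 1;
                    file.seek(mid * RECORD_SIZE);
                    int midKey = file.readInt();          // reads the 4-byte key
                    if (midKey == key) return mid * RECORD_SIZE;
                    if (midKey < key) lo = mid + 1;
                    else hi = mid - 1;
                }
                return -1;   // key not present
            }
        }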

  • Android Out of memory regarding png image

    - by turtleboy
    I have a jpg image in my app that shows correctly. In my listview i'd like to make the image more transparent so it is easier to see the text. I changed the image to a png format and altered it's opacity in GIMP. Now that the new image is in the app drawable folder. Im getting the following error. why? 09-28 09:24:07.560: I/global(20140): call socket shutdown, tmpsocket=Socket[address=/178.250.50.40,port=80,localPort=35172] 09-28 09:24:07.570: I/global(20140): call socket shutdown, tmpsocket=Socket[address=/212.169.27.217,port=84,localPort=55656] 09-28 09:24:07.690: D/dalvikvm(20140): GC_FOR_ALLOC freed 113K, 4% free 38592K/39907K, paused 32ms 09-28 09:24:07.690: I/dalvikvm-heap(20140): Forcing collection of SoftReferences for 28072816-byte allocation 09-28 09:24:07.740: D/dalvikvm(20140): GC_BEFORE_OOM freed 9K, 4% free 38582K/39907K, paused 43ms 09-28 09:24:07.740: E/dalvikvm-heap(20140): Out of memory on a 28072816-byte allocation. 09-28 09:24:07.740: I/dalvikvm(20140): "main" prio=5 tid=1 RUNNABLE 09-28 09:24:07.740: I/dalvikvm(20140): | group="main" sCount=0 dsCount=0 obj=0x40a57490 self=0x1b6e9a8 09-28 09:24:07.740: I/dalvikvm(20140): | sysTid=20140 nice=0 sched=0/0 cgrp=default handle=1074361640 09-28 09:24:07.740: I/dalvikvm(20140): | schedstat=( 2289118000 760844000 2121 ) utm=195 stm=33 core=1 09-28 09:24:07.740: I/dalvikvm(20140): at android.graphics.BitmapFactory.nativeDecodeAsset(Native Method) 09-28 09:24:07.740: I/dalvikvm(20140): at android.graphics.BitmapFactory.decodeResourceStream(BitmapFactory.java:486) 09-28 09:24:07.740: I/dalvikvm(20140): at android.graphics.drawable.Drawable.createFromResourceStream(Drawable.java:773) 09-28 09:24:07.740: I/dalvikvm(20140): at android.content.res.Resources.loadDrawable(Resources.java:2042) 09-28 09:24:07.740: I/dalvikvm(20140): at android.content.res.TypedArray.getDrawable(TypedArray.java:601) 09-28 09:24:07.740: I/dalvikvm(20140): at android.view.View.<init>(View.java:2812) 09-28 09:24:07.740: I/dalvikvm(20140): at android.view.ViewGroup.<init>(ViewGroup.java:410) 09-28 09:24:07.740: I/dalvikvm(20140): at android.widget.LinearLayout.<init>(LinearLayout.java:174) 09-28 09:24:07.740: I/dalvikvm(20140): at android.widget.LinearLayout.<init>(LinearLayout.java:170) 09-28 09:24:07.740: I/dalvikvm(20140): at java.lang.reflect.Constructor.constructNative(Native Method) 09-28 09:24:07.740: I/dalvikvm(20140): at java.lang.reflect.Constructor.newInstance(Constructor.java:417) 09-28 09:24:07.740: I/dalvikvm(20140): at android.view.LayoutInflater.createView(LayoutInflater.java:586) 09-28 09:24:07.740: I/dalvikvm(20140): at com.android.internal.policy.impl.PhoneLayoutInflater.onCreateView(PhoneLayoutInflater.java:56) 09-28 09:24:07.740: I/dalvikvm(20140): at android.view.LayoutInflater.onCreateView(LayoutInflater.java:653) 09-28 09:24:07.740: I/dalvikvm(20140): at android.view.LayoutInflater.createViewFromTag(LayoutInflater.java:678) 09-28 09:24:07.740: I/dalvikvm(20140): at android.view.LayoutInflater.inflate(LayoutInflater.java:466) 09-28 09:24:07.740: I/dalvikvm(20140): at android.view.LayoutInflater.inflate(LayoutInflater.java:396) 09-28 09:24:07.740: I/dalvikvm(20140): at android.view.LayoutInflater.inflate(LayoutInflater.java:352) 09-28 09:24:07.740: I/dalvikvm(20140): at com.android.internal.policy.impl.PhoneWindow.setContentView(PhoneWindow.java:278) 09-28 09:24:07.740: I/dalvikvm(20140): at android.app.Activity.setContentView(Activity.java:1897) 09-28 09:24:07.740: I/dalvikvm(20140): at 
com.carefreegroup.ShowMoreDetails.onCreate(ShowMoreDetails.java:26) 09-28 09:24:07.740: I/dalvikvm(20140): at android.app.Activity.performCreate(Activity.java:4543) 09-28 09:24:07.740: I/dalvikvm(20140): at android.app.Instrumentation.callActivityOnCreate(Instrumentation.java:1071) 09-28 09:24:07.740: I/dalvikvm(20140): at android.app.ActivityThread.performLaunchActivity(ActivityThread.java:2181) 09-28 09:24:07.740: I/dalvikvm(20140): at android.app.ActivityThread.handleLaunchActivity(ActivityThread.java:2260) 09-28 09:24:07.740: I/dalvikvm(20140): at android.app.ActivityThread.access$600(ActivityThread.java:139) 09-28 09:24:07.740: I/dalvikvm(20140): at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1277) 09-28 09:24:07.740: I/dalvikvm(20140): at android.os.Handler.dispatchMessage(Handler.java:99) 09-28 09:24:07.740: I/dalvikvm(20140): at android.os.Looper.loop(Looper.java:156) 09-28 09:24:07.740: I/dalvikvm(20140): at android.app.ActivityThread.main(ActivityThread.java:5045) 09-28 09:24:07.740: I/dalvikvm(20140): at java.lang.reflect.Method.invokeNative(Native Method) 09-28 09:24:07.740: I/dalvikvm(20140): at java.lang.reflect.Method.invoke(Method.java:511) 09-28 09:24:07.740: I/dalvikvm(20140): at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:784) 09-28 09:24:07.740: I/dalvikvm(20140): at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:551) 09-28 09:24:07.740: I/dalvikvm(20140): at dalvik.system.NativeStart.main(Native Method) 09-28 09:24:07.740: E/dalvikvm(20140): Out of memory: Heap Size=46115KB, Allocated=38582KB, Limit=65536KB 09-28 09:24:07.740: E/dalvikvm(20140): Extra info: Footprint=39907KB, Allowed Footprint=46115KB, Trimmed=892KB 09-28 09:24:07.740: E/Bitmap_JNI(20140): Create Bitmap Failed. 09-28 09:24:07.740: A/libc(20140): Fatal signal 11 (SIGSEGV) at 0x00000004 (code=1) 09-28 09:24:09.750: I/dalvikvm(20367): Turning on JNI app bug workarounds for target SDK version 10... 09-28 09:24:09.940: D/dalvikvm(20367): GC_CONCURRENT freed 864K, 21% free 3797K/4771K, paused 2ms+2ms thanks. [update] @Override protected void onCreate(Bundle savedInstanceState) { super.onCreate(savedInstanceState); setContentView(R.layout.showmoredetailslayout); actualCallTime = (TextView)findViewById(R.id.actualcalltime); doubleUp = (TextView)findViewById(R.id.doubleupcallid); needName = (TextView)findViewById(R.id.needname); needNameLabel = (TextView)findViewById(R.id.neednamelabel); getRotaDetails = (Button)findViewById(R.id.buttongetrotadetails); intent = this.getIntent(); String actualTimeIn = intent.getStringExtra("actTimeIn"); String actualTimeOut = intent.getStringExtra("actTimeOut"); String doubleUpValue = intent.getStringExtra("doubleUpValue"); String needNameWithCommas = intent.getStringExtra("needNameWithCommas"); callID = intent.getStringExtra("callID"); String[] needs = needNameWithCommas.split(","); actualCallTime.setText("This call was completed at " + actualTimeIn + " -" + actualTimeOut); if( ! 
doubleUpValue.equalsIgnoreCase("") || doubleUpValue.equalsIgnoreCase("]")){ doubleUp.setText("This call was not a double up "); }else{ doubleUp.setText("This call was a double up " + doubleUpValue); } needNameLabel.setText("Purpose of Call: "); for (int i = 0; i < needs.length; i++){ needName.append( needs[i] + "\n"); } getRotaDetails.setOnClickListener(new OnClickListener() { @Override public void onClick(View v) { Intent intent = new Intent(ShowMoreDetails.this, GetRotaDetails.class); intent.putExtra("callIDExtra", callID); startActivity(intent); } }); } }
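
    The log shows the decode dying on a 28,072,816-byte allocation, meaning the decoded bitmap alone needs about 28 MB against a 64 MB heap limit. Whether or not it is the right fix for this particular layout, the usual mitigation is to decode a downsampled bitmap with BitmapFactory.Options.inSampleSize. A hedged sketch, assuming the image is loaded in code rather than referenced directly from the layout XML; the class name and target size are placeholders:

        import android.content.res.Resources;
        import android.graphics.Bitmap;
        import android.graphics.BitmapFactory;

        public final class Bitmaps {
            // Decode a drawable at roughly reqWidth x reqHeight instead of full resolution.
            public static Bitmap decodeSampled(Resources res, int resId, int reqWidth, int reqHeight) {
                BitmapFactory.Options opts = new BitmapFactory.Options();
                opts.inJustDecodeBounds = true;          // first pass: read dimensions only
                BitmapFactory.decodeResource(res, resId, opts);

                int inSampleSize = 1;
                while (opts.outWidth / (inSampleSize * 2) >= reqWidth
                        && opts.outHeight / (inSampleSize * 2) >= reqHeight) {
                    inSampleSize *= 2;                   // halve until it fits the target
                }

                opts.inJustDecodeBounds = false;
                opts.inSampleSize = inSampleSize;        // second pass: decode downsampled
                return BitmapFactory.decodeResource(res, resId, opts);
            }
        }

    The result would then be set on an ImageView (or drawn as a background) in place of the full-resolution drawable reference; the drawable id passed in is whatever resource currently triggers the allocation.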

  • How to convert from integer to unsigned char in C, given integers larger than 256?

    - by Alf_InPogform
    As part of my CS course I've been given some functions to use. One of these functions takes a pointer to unsigned chars to write some data to a file (I have to use this function, so I can't just write my own purpose-built function that works differently, BTW). I need to write an array of integers whose values can be up to 4095 using this function (which only takes unsigned chars). However, am I right in thinking that an unsigned char can only have a max value of 256 because it is 1 byte long? Do I therefore need to use 4 unsigned chars for every integer? But casting doesn't seem to work with larger values for the integer. Does anyone have any idea how best to convert an array of integers to unsigned chars?
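
    Since every value fits in 12 bits, each integer can be split across two unsigned chars with shifts and masks and reassembled the same way; a plain cast only keeps the lowest 8 bits, which is why larger values break. The course code is in C, but the technique is language-neutral; here is a hedged sketch in Java, where the & 0xFF mask plays the role of the unsigned char:

        // Pack ints in the range [0, 4095] into two bytes each (big-endian) and unpack them again.
        public class TwelveBitPacking {
            static byte[] pack(int[] values) {
                byte[] out = new byte[values.length * 2];
                for (int i = 0; i < values.length; i++) {
                    out[2 * i]     = (byte) ((values[i] >> 8) & 0xFF);   // high byte (top 4 bits used)
                    out[2 * i + 1] = (byte) (values[i] & 0xFF);          // low 8 bits
                }
                return out;
            }

            static int[] unpack(byte[] bytes) {
                int[] out = new int[bytes.length / 2];
                for (int i = 0; i < out.length; i++) {
                    out[i] = ((bytes[2 * i] & 0xFF) << 8) | (bytes[2 * i + 1] & 0xFF);
                }
                return out;
            }
        }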

  • KeyListener problem

    - by rgksugan
    In my application I am using a JPanel to which I want to add a key listener. I did that, but it does not work. Is it because I am using a SwingWorker to update the contents of the panel every second? Here is my code to update the panel:

        RenderedImage image = ImageIO.read(new ByteArrayInputStream((byte[]) get()));
        Graphics graphics = remote.rdpanel.getGraphics();
        if (graphics != null) {
            Image readyImage = new ImageIcon(UtilityFunctions.convertRenderedImage(image)).getImage();
            graphics.drawImage(readyImage, 0, 0, remote.rdpanel.getWidth(), remote.rdpanel.getHeight(), null);
        }
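
    One common cause, independent of the SwingWorker, is that a KeyListener only fires while its component has keyboard focus, and a JPanel is not focusable by default. A frequently used alternative is a key binding, which works as long as the window is focused. A hedged sketch; the F5 key and the action name are made up:

        import java.awt.event.ActionEvent;
        import javax.swing.AbstractAction;
        import javax.swing.JComponent;
        import javax.swing.JPanel;
        import javax.swing.KeyStroke;

        public final class PanelKeys {
            // Register a binding on the panel so the action fires even while the
            // SwingWorker keeps repainting and focus sits on another component.
            public static void installKeyBinding(JPanel panel) {
                panel.getInputMap(JComponent.WHEN_IN_FOCUSED_WINDOW)
                     .put(KeyStroke.getKeyStroke("F5"), "refresh");
                panel.getActionMap().put("refresh", new AbstractAction() {
                    @Override
                    public void actionPerformed(ActionEvent e) {
                        System.out.println("F5 pressed");   // replace with the real handling
                    }
                });
            }
        }

    It would be called once after the panel is created, for example PanelKeys.installKeyBinding(remote.rdpanel).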

  • In Java, which is better: three arrays of booleans or one array of bytes?

    - by joe_shmoe
    I know the question sounds silly, but consider this: I have an array of items and a labelling algorithm. At any point each item is in one of three states. The current version holds these states in a byte array, where 0, 1 and 2 represent the three states. Alternatively, I could have three arrays of boolean, one for each state. Which is better (i.e. consumes less memory) depends on how the JVM (Sun's version) stores the arrays: is a boolean represented by 1 bit? (P.S. Don't start with all that "this is not the way OO/Java works" stuff. I know, but here performance comes first, and the algorithm is simple and perfectly readable even in this form.) Thanks a lot
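
    On HotSpot, each element of a boolean array occupies a whole byte, not a bit, so three boolean arrays cost roughly three times as much as one byte array of the same length. A hedged sketch of a rough way to confirm this on a given JVM; it is an informal measurement, not a precise benchmark:

        public class StateArrayMemory {
            private static long usedMemory() {
                Runtime rt = Runtime.getRuntime();
                System.gc();
                return rt.totalMemory() - rt.freeMemory();
            }

            public static void main(String[] args) {
                final int n = 10_000_000;

                long before = usedMemory();
                byte[] states = new byte[n];                       // single array, values 0/1/2
                long afterByteArray = usedMemory();

                boolean[] a = new boolean[n];
                boolean[] b = new boolean[n];
                boolean[] c = new boolean[n];
                long afterBooleanArrays = usedMemory();

                System.out.println("byte[n]        ~ " + (afterByteArray - before) + " bytes");
                System.out.println("3 x boolean[n] ~ " + (afterBooleanArrays - afterByteArray) + " bytes");
                // Touch the arrays so they cannot be collected before the measurement.
                System.out.println(states.length + a.length + b.length + c.length);
            }
        }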

  • Map derived class as an independent one with FNH's Automap

    - by Anton Gogolev
    Hi! Basically, I have an ImageMetadata class and an Image class, which derives from ImageMetadata. Image adds one property, byte[] Content, which actually contains the binary data. What I want to do is map these two classes onto one table, but I absolutely do not need NHibernate's inheritance support to kick in. I want to tailor FNH Automap to produce something like:

        <class name="ImageMetadata" ...>
            <property name="Name" ... />
            < ... />
        <class name="Image" ...>
            <property name="Name" ... />
            <property name="Content" ... />
            < ... />

    Is this at all possible?

  • Screen capture code produces black bitmap

    - by wadetandy
    I need to add the ability to take a screenshot of the entire screen, not just the current window. The following code produces a bmp file with the correct dimensions, but the image is completely black. What am I doing wrong? void CaptureScreen(LPCTSTR lpszFilePathName) { BITMAPFILEHEADER bmfHeader; BITMAPINFO *pbminfo; HBITMAP hBmp; FILE *oFile; HDC screen; HDC memDC; int sHeight; int sWidth; LPBYTE pBuff; BITMAP bmp; WORD cClrBits; RECT rcClient; screen = GetDC(0); memDC = CreateCompatibleDC(screen); sHeight = GetDeviceCaps(screen, VERTRES); sWidth = GetDeviceCaps(screen, HORZRES); //GetObject(screen, sizeof(BITMAP), &bmp); hBmp = CreateCompatibleBitmap ( screen, sWidth, sHeight ); // Retrieve the bitmap color format, width, and height. GetObject(hBmp, sizeof(BITMAP), (LPSTR)&bmp) ; // Convert the color format to a count of bits. cClrBits = (WORD)(bmp.bmPlanes * bmp.bmBitsPixel); if (cClrBits == 1) cClrBits = 1; else if (cClrBits bmiHeader.biSize = sizeof(BITMAPINFOHEADER); pbminfo-bmiHeader.biWidth = bmp.bmWidth; pbminfo-bmiHeader.biHeight = bmp.bmHeight; pbminfo-bmiHeader.biPlanes = bmp.bmPlanes; pbminfo-bmiHeader.biBitCount = bmp.bmBitsPixel; if (cClrBits bmiHeader.biClrUsed = (1bmiHeader.biCompression = BI_RGB; // Compute the number of bytes in the array of color // indices and store the result in biSizeImage. // The width must be DWORD aligned unless the bitmap is RLE // compressed. pbminfo-bmiHeader.biSizeImage = ((pbminfo-bmiHeader.biWidth * cClrBits +31) & ~31) /8 * pbminfo-bmiHeader.biHeight; // Set biClrImportant to 0, indicating that all of the // device colors are important. pbminfo-bmiHeader.biClrImportant = 0; CreateBMPFile(lpszFilePathName, pbminfo, hBmp, memDC); } void CreateBMPFile(LPTSTR pszFile, PBITMAPINFO pbi, HBITMAP hBMP, HDC hDC) { HANDLE hf; // file handle BITMAPFILEHEADER hdr; // bitmap file-header PBITMAPINFOHEADER pbih; // bitmap info-header LPBYTE lpBits; // memory pointer DWORD dwTotal; // total count of bytes DWORD cb; // incremental count of bytes BYTE *hp; // byte pointer DWORD dwTmp; int lines; pbih = (PBITMAPINFOHEADER) pbi; lpBits = (LPBYTE) GlobalAlloc(GMEM_FIXED, pbih-biSizeImage); // Retrieve the color table (RGBQUAD array) and the bits // (array of palette indices) from the DIB. lines = GetDIBits(hDC, hBMP, 0, (WORD) pbih-biHeight, lpBits, pbi, DIB_RGB_COLORS); // Create the .BMP file. hf = CreateFile(pszFile, GENERIC_READ | GENERIC_WRITE, (DWORD) 0, NULL, CREATE_ALWAYS, FILE_ATTRIBUTE_NORMAL, (HANDLE) NULL); hdr.bfType = 0x4d42; // 0x42 = "B" 0x4d = "M" // Compute the size of the entire file. hdr.bfSize = (DWORD) (sizeof(BITMAPFILEHEADER) + pbih-biSize + pbih-biClrUsed * sizeof(RGBQUAD) + pbih-biSizeImage); hdr.bfReserved1 = 0; hdr.bfReserved2 = 0; // Compute the offset to the array of color indices. hdr.bfOffBits = (DWORD) sizeof(BITMAPFILEHEADER) + pbih-biSize + pbih-biClrUsed * sizeof (RGBQUAD); // Copy the BITMAPFILEHEADER into the .BMP file. WriteFile(hf, (LPVOID) &hdr, sizeof(BITMAPFILEHEADER), (LPDWORD) &dwTmp, NULL); // Copy the BITMAPINFOHEADER and RGBQUAD array into the file. WriteFile(hf, (LPVOID) pbih, sizeof(BITMAPINFOHEADER) + pbih-biClrUsed * sizeof (RGBQUAD), (LPDWORD) &dwTmp, ( NULL)); // Copy the array of color indices into the .BMP file. dwTotal = cb = pbih-biSizeImage; hp = lpBits; WriteFile(hf, (LPSTR) hp, (int) cb, (LPDWORD) &dwTmp,NULL); // Close the .BMP file. CloseHandle(hf); // Free memory. GlobalFree((HGLOBAL)lpBits); }
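
    The goal here is a capture of the entire screen saved to a file. As a point of comparison only, not a fix for the GDI code above, the same goal takes a few lines in Java with java.awt.Robot; a hedged sketch that writes the capture to screen.png:

        import java.awt.Rectangle;
        import java.awt.Robot;
        import java.awt.Toolkit;
        import java.awt.image.BufferedImage;
        import java.io.File;
        import javax.imageio.ImageIO;

        public class FullScreenShot {
            public static void main(String[] args) throws Exception {
                // Grab the whole primary screen and write it out as a PNG.
                Rectangle screen = new Rectangle(Toolkit.getDefaultToolkit().getScreenSize());
                BufferedImage capture = new Robot().createScreenCapture(screen);
                ImageIO.write(capture, "png", new File("screen.png"));
            }
        }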

  • F# powerpack and distribution

    - by rwallace
    I need arbitrary-precision rational numbers, which I'm given to understand are available in the F# PowerPack. My question is about the mechanics of distribution; my program needs to be able to compile and run on at least Windows/.NET and Linux/Mono, since I have potential users on both platforms. As I understand it, the best procedure is: download the PowerPack .zip, not the installer; copy the DLL into my program directory; copy the accompanying license file into my program directory, to make sure everything is above board; declare references and go ahead and use the functions I need; ship the above files along with my source and binary, and since the DLL contains bytecode, it will work fine on any platform. Is this the correct procedure? Am I missing anything?

  • Visual Studio 2010 compilation general error c1010070

    - by user1747455
    I wrote a little "hello world" program to test my computer, but when I compile the program there's an error:

        ------ Build started: Project: hi, Configuration: Debug Win32 ------
        Build started 15/10/2012 22:36:48.
        InitializeBuildStatus:
          Touching "Debug\hi.unsuccessfulbuild".
        Link:
          hi.vcxproj -> D:\MSVS\hi\Debug\hi.exe
        Manifest:
          Debug\hi.exe.intermediate.manifest : general error c1010070: Failed to load and parse the manifest. {q~ 0H
        Build FAILED.
        Time Elapsed 00:00:02.34
        ========== Build: 0 succeeded, 1 failed, 0 up-to-date, 0 skipped ==========

    The creepiest thing, I think, is that line of "{q~ 0H" garbage. I wonder if there's any way to solve this. Please help. Many thanks.

    Edit: I tried changing the character set to "Multi-Byte" and turning "Embed Manifest" (under "Manifest Tool") off, but that still didn't solve the error.

  • What's causing "Unable to retrieve native address from ByteBuffer object"?

    - by r0u1i
    As a very novice Java programmer, I probably should not mess with this kind of thing. Unfortunately, I'm using a library which has a method that accepts a ByteBuffer object and throws when I try to use it:

        Exception in thread "main" java.lang.NullPointerException: Unable to retrieve native address from ByteBuffer object

    Is it because I'm not using a direct buffer?

    Edit: There's not a lot of my code here. The library I'm using is jNetPcap, and I'm trying to dump a packet to a file. My code takes an existing packet and extracts a ByteBuffer out of it:

        byte[] bytes = m_packet.getByteArray(0, m_packet.size());
        ByteBuffer buffer = ByteBuffer.wrap(bytes);

    Then it calls one of the dump methods of jNetPcap that takes a ByteBuffer.
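
    ByteBuffer.wrap always returns a heap (non-direct) buffer, and a heap buffer has no native memory address for native code to read from; libraries that hand buffers down to native code generally need a direct buffer. A hedged sketch of the substitution, assuming the jNetPcap dump method accepts any ByteBuffer; these lines are a drop-in for the two lines above:

        // Copy the packet bytes into a direct buffer so native code can obtain its address.
        byte[] bytes = m_packet.getByteArray(0, m_packet.size());
        ByteBuffer buffer = ByteBuffer.allocateDirect(bytes.length);
        buffer.put(bytes);
        buffer.flip();   // reset position to 0 so the dump method reads from the start
        // then pass `buffer` to the jNetPcap dump call as before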

  • Does Postgresql varchar count using unicode character length or ASCII character length?

    - by bennylope
    I tried importing a database dump from a SQL file, and the insert failed when inserting the string Mér into a field defined as varying(3). I didn't capture the exact error, but it pointed to that specific value and the varying(3) constraint. Since I considered this unimportant to what I was doing at the time, I just changed the value to Mer, it worked, and I moved on. Does a varying field's limit take into account the length of the byte string rather than the character count? What really boggles my mind is that this was dumped from another PostgreSQL database, so it doesn't make sense how the constraint could have allowed the value to be written in the first place.

  • Taking screen shot of a SurfaceView in android

    - by Mostafa Imran
    I am using the following method to take a screenshot of a particular view, which is a SurfaceView:

        public void takeScreenShot(View surface_view) {
            // create bitmap screen capture
            Bitmap bitmap;
            View v1 = surface_view;
            v1.setDrawingCacheEnabled(true);
            bitmap = Bitmap.createBitmap(v1.getDrawingCache());
            v1.setDrawingCacheEnabled(false);
            ByteArrayOutputStream bos = new ByteArrayOutputStream();
            bitmap.compress(CompressFormat.PNG, 0, bos);
            byte[] imageData = bos.toByteArray();
        }

    The problem is it's giving me the whole activity screen image, but I need to take a screenshot of that particular view. I tried other ways, but those give me a black screen as the screenshot, and some posts say it requires a rooted device. Can anyone help me, please? I really need a solution.
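
    A SurfaceView is rendered on a separate surface behind the view hierarchy, so the drawing cache only captures the ordinary views, which is why these attempts come back black or show the whole activity instead. On API 24 and later, PixelCopy can copy the surface contents into a Bitmap without root. A hedged sketch:

        import android.graphics.Bitmap;
        import android.os.Handler;
        import android.os.Looper;
        import android.view.PixelCopy;
        import android.view.SurfaceView;
        import java.io.ByteArrayOutputStream;

        public final class SurfaceCapture {
            // Asynchronously copy the SurfaceView's surface into a Bitmap (API 24+).
            public static void capture(SurfaceView surfaceView) {
                final Bitmap bitmap = Bitmap.createBitmap(
                        surfaceView.getWidth(), surfaceView.getHeight(), Bitmap.Config.ARGB_8888);
                PixelCopy.request(surfaceView, bitmap, copyResult -> {
                    if (copyResult == PixelCopy.SUCCESS) {
                        ByteArrayOutputStream bos = new ByteArrayOutputStream();
                        bitmap.compress(Bitmap.CompressFormat.PNG, 100, bos);
                        byte[] imageData = bos.toByteArray();   // use the PNG bytes as needed
                    }
                }, new Handler(Looper.getMainLooper()));
            }
        }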

  • Appropriate data structure for a buffer of the packets

    - by psihodelia
    How do I implement a buffer of packets where each packet is of the form:

        typedef struct {
            int32 IP;   // 4-byte IP address
            int16 ID;   // unique sequence id
        } t_Packet;

    What would be the most appropriate data structure which: (1) can hold at least 8000 such packets (fast insert and delete operations); (2) allows very fast filtering by IP address, so that only packets with a given IP are selected; (3) allows a very fast find operation using ID as a key; (4) allows a very fast (2) followed by (3) within the filtered results? RAM size matters, so no huge lookup table can be used.
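
    The question is about C, but the shape of one common answer, a two-level index keyed by IP and then by ID plus a flat index keyed by ID alone, is easy to show in Java. A hedged sketch; hash maps stand in for whatever hash tables would be used in C, and for roughly 8000 packets the memory overhead stays modest:

        import java.util.HashMap;
        import java.util.Map;

        public class PacketBuffer {
            // Stand-in for the C struct t_Packet.
            public record Packet(int ip, short id) {}

            private final Map<Integer, Map<Short, Packet>> byIp = new HashMap<>();
            private final Map<Short, Packet> byId = new HashMap<>();

            public void insert(Packet p) {                                   // (1) insert
                byIp.computeIfAbsent(p.ip(), k -> new HashMap<>()).put(p.id(), p);
                byId.put(p.id(), p);
            }

            public void delete(Packet p) {                                   // (1) delete
                Map<Short, Packet> bucket = byIp.get(p.ip());
                if (bucket != null) bucket.remove(p.id());
                byId.remove(p.id());
            }

            public Map<Short, Packet> filterByIp(int ip) {                   // (2)
                return byIp.getOrDefault(ip, Map.of());
            }

            public Packet findById(short id) {                               // (3)
                return byId.get(id);
            }

            public Packet findByIpAndId(int ip, short id) {                  // (4)
                return filterByIp(ip).get(id);
            }
        }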

  • Are C++ Reads and Writes of an int atomic

    - by theschmitzer
    I have two threads, one updating an int and one reading it. The value is a statistic where the order of the read and write is irrelevant. My question is: do I need to synchronize access to this multi-byte value anyway? Or, put another way, can part of the write complete, get interrupted, and then the read happen? For example, suppose value = 0x0000FFFF and it is incremented to 0x00010000. Is there a moment where the value looks like 0x0001FFFF that I should be worried about? Certainly, the larger the type, the more likely something like this becomes. I've always synchronized these kinds of accesses, but I was curious what the community thought.

  • Loading xml with encoding UTF 16 using XDocument

    - by Sangram
    Hi, I am trying to read an XML document using XDocument, but I am getting an error when the XML has <?xml version="1.0" encoding="utf-16"?>. When I remove the encoding declaration manually, it works perfectly. The error I get is "There is no Unicode byte order mark. Cannot switch to Unicode." I tried searching and landed up here: Why does C# XmlDocument.LoadXml(string) fail when an XML header is included? But that could not solve my problem. My code:

        XDocument xdoc = XDocument.Load(path);

    Any suggestions? Thank you.

  • How can chunks be allocated in a node.js stream in object mode all at once?

    - by Quentin Engles
    I can see how buffers and strings can be sent as chunks, but I'm having a problem thinking about how streams are dealt with when working in object mode. Say I have a byte stream from an HTTP request message. I want to take that message, parse it, and then transform it into one big object. I already know how to parse the message. What I'm wondering is: if the message is big, so it arrives in many chunks, but I want to produce one object as the output, how can I make sure the data event waits for the whole thing? Is this just a matter of not calling the push method until the chunked data has finished being sent? That would then restrict the stream's output to a smaller object, which I think I'm fine with for now. As an added condition, the larger data will be reduced in size after the transform.

  • Code sample with buffer overflow (gets method). Why does it not behave as expected?

    - by citronas
    This is an extract from a C program that should demonstrate a buffer overflow:

        void foo() {
            char arr[8];
            printf(" enter bla bla bla");
            gets(arr);
            printf(" you entered %s\n", arr);
        }

    The question was "How many input chars can a user enter at most without creating a buffer overflow?" My initial answer was 8, because the char array is 8 bytes long. Although I was pretty certain my answer was correct, I tried a higher number of chars and found that the limit of chars I can enter before I get a segmentation fault is 11. (I'm running this on Ubuntu in VirtualBox.) So my question is: why is it possible to enter 11 chars into that 8-byte array?

  • C# - How to convert 10 bytes to unsigned long

    - by Justin
    Hey, I have 10 bytes - 4 bytes of low order, 4 bytes of high order, 2 bytes of highest order - that I need to convert to an unsigned long. I've tried a couple different methods but neither of them worked.

    Try #1:

        var id = BitConverter.ToUInt64(buffer, 0);

    Try #2:

        var id = GetID(buffer, 0);

        long GetID(byte[] buffer, int startIndex)
        {
            var lowOrderUnitId = BitConverter.ToUInt32(buffer, startIndex);
            var highOrderUnitId = BitConverter.ToUInt32(buffer, startIndex + 4);
            var highestOrderUnitId = BitConverter.ToUInt16(buffer, startIndex + 8);
            return lowOrderUnitId + (highOrderUnitId * 100000000) + (highestOrderUnitId * 10000000000000000);
        }

    Any help would be appreciated, thanks!

  • highlight text in 2 textboxes at the same time

    - by user1907736
    I am trying to create a packet analyzer for an online game using C#, and I am new to C#. I have two RichTextBoxes: one shows the packet in bytes and the other shows the packet in ANSI. Here is what I want to achieve: (1) when I select (highlight) data in the byte text box, I want the corresponding data in the ANSI text box to also be highlighted (and vice versa); (2) when I change data in one of the textboxes, I want the corresponding data in the other textbox to also be changed. How do I do this?

  • Loading an image line by line in Swing

    - by user1046017
    I want to show an image as it is downloading. I have the URL, and I am trying to get the image parts like this:

        InputStream openStream = url.openStream();
        DataInputStream dis = new DataInputStream(new BufferedInputStream(openStream));
        ByteArrayOutputStream os = new ByteArrayOutputStream();
        while ((s = dis.read(b)) != -1) {
            os.write(b, 0, s);
            support.firePropertyChange("stream", null, os);
        }

    This way, any listener gets the stream and creates an image, like this:

        if ("stream".equals(evt.getPropertyName())) {
            try {
                ByteArrayOutputStream stream = (ByteArrayOutputStream) evt.getNewValue();
                byte[] byteArray = stream.toByteArray();
                stream.flush();
                Image createImage = Toolkit.getDefaultToolkit().createImage(byteArray);
                this.getContentPane().add(new JLabel(new ImageIcon(createImage)));
            } catch (IOException ex) {
                Logger.getLogger(ImageTest.class.getName()).log(Level.SEVERE, null, ex);
            }
        }

    However, I am getting a "Premature end of JPEG file / sun.awt.image.ImageFormatException: JPEG datastream contains no image" error. The image is in JPEG format. Is there any library or method known to do something like this?
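
    The exception comes from decoding the bytes while the JPEG is still incomplete: every firePropertyChange hands the listener a partial stream, and the decoder chokes on it. If true progressive rendering is not essential, a simpler route is to finish the download first and decode exactly once. A hedged sketch; the buffer size is arbitrary:

        import java.awt.image.BufferedImage;
        import java.io.ByteArrayInputStream;
        import java.io.ByteArrayOutputStream;
        import java.io.IOException;
        import java.io.InputStream;
        import java.net.URL;
        import javax.imageio.ImageIO;
        import javax.swing.ImageIcon;
        import javax.swing.JLabel;

        public final class ImageDownload {
            // Download the whole image, then decode a single complete JPEG.
            public static JLabel downloadAsLabel(URL url) throws IOException {
                ByteArrayOutputStream os = new ByteArrayOutputStream();
                try (InputStream in = url.openStream()) {
                    byte[] buf = new byte[8192];
                    int n;
                    while ((n = in.read(buf)) != -1) {
                        os.write(buf, 0, n);
                        // progress (bytes so far) could be reported here without decoding yet
                    }
                }
                BufferedImage image = ImageIO.read(new ByteArrayInputStream(os.toByteArray()));
                return new JLabel(new ImageIcon(image));
            }
        }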

  • How to encode and decode chinese characters?

    - by melaos
    I've tried googling around but wasn't able to find out what charset the text below belongs to:

        具有éœé›»ç”¢ç”Ÿè£ç½®ä¹‹å½±åƒè¼¸å…¥è£ç½®

    But by putting <meta http-equiv="Content-Type" Content="text/html; charset=utf-8"> in an HTML file and keeping that string in it, I was able to view the Chinese characters properly, which are: ??????????????? So my questions are: what tools can I use to detect the character set of this text? And how do I convert/encode/decode it properly in C#?

    Update: added some test code.

        [TestMethod]
        public void TestMethod1()
        {
            string encodedText = "具有éœé›»ç”¢ç”Ÿè£ç½®ä¹‹å½±åƒè¼¸å…¥è£ç½®";
            Encoding encoder = new UTF8Encoding();
            byte[] postBytes = encoder.GetBytes(encodedText);
            postBytes = UTF8Encoding.Convert(Encoding.UTF8, Encoding.Unicode, postBytes);
            string decodedText = Encoding.Unicode.GetString(postBytes);
            Assert.AreNotEqual(encodedText, decodedText);
        }

    Thanks
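
    The garbled run is what UTF-8 bytes of Chinese text look like after being decoded once with a single-byte codepage such as Windows-1252. If that single wrong decode is the only damage, the usual repair is to reverse it: re-encode the garbled string with the wrong codepage, then decode the resulting bytes as UTF-8. A hedged sketch in Java; the C# equivalent would use Encoding.GetEncoding("windows-1252") and Encoding.UTF8.GetString:

        import java.nio.charset.Charset;
        import java.nio.charset.StandardCharsets;

        public class MojibakeRepair {
            public static void main(String[] args) {
                String garbled = "...";   // paste the garbled string from the question here
                // Undo the wrong decode: recover the original bytes via Windows-1252,
                // then decode those bytes as the UTF-8 they really are.
                byte[] originalBytes = garbled.getBytes(Charset.forName("windows-1252"));
                String repaired = new String(originalBytes, StandardCharsets.UTF_8);
                System.out.println(repaired);
            }
        }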

  • C++ USB Programming

    - by JB_SO
    Hi, I am new to hardware programming (especially USB), so please bear with me and my questions. I am using C++ and I need to send/receive some data (a byte array) to/from a USB port on a microprocessor board. Now, I have done some serial port programming before, and I know that for a serial port you have to open the port, set it up, perform I/O and finally close the port. I am guessing that using a USB port is not as simple as what I mentioned above. I do know that I want to use the standard Microsoft drivers and standard Windows I/O calls to accomplish this, since I believe there are no vendor drivers for the microprocessor board for me to work with. If somebody can point me in the right direction as to the steps needed to "talk" to a USB port (open, setup, I/O) via standard Windows I/O calls, I would truly and greatly appreciate it. Thank you so much!
