Search Results

Search found 3449 results on 138 pages for 'tranquil byte'.


  • Multi-threaded downloader in C# question

    - by blez
    Currently I have a multi-threaded downloader class that uses HttpWebRequest/Response. All works fine, it's super fast, BUT... the problem is that the data needs to be streamed to another app while it's downloading. That means it must be streamed in the right order: the first chunk first, then the next in the queue. Currently my downloader class is sync and Download() returns byte[]. In my async multi-threaded class I make, for example, a list with 4 empty elements (for slots) and pass each slot's index to a thread that calls the Download() function. That simulates synchronization, but it's not what I need. How should I do the queue thing, to make sure the data is streamed as soon as the first chunk starts downloading?
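
    A common way to get in-order delivery out of parallel downloads is a slot-per-chunk future scheme: start every chunk concurrently, but consume the results strictly by index, so the consumer only ever blocks on the next chunk it needs. A minimal sketch of the idea, shown in Java for concreteness (the download and sink callbacks are hypothetical stand-ins for the asker's Download() and the downstream app; in C# the same shape falls out of Task<byte[]>):

        import java.util.ArrayList;
        import java.util.List;
        import java.util.concurrent.CompletableFuture;
        import java.util.concurrent.ExecutionException;
        import java.util.concurrent.ExecutorService;
        import java.util.function.Consumer;
        import java.util.function.IntFunction;

        public class OrderedChunkStreamer {
            // Download chunks in parallel but hand them downstream strictly in
            // order: kick off one future per chunk, then consume 0..n-1 in turn.
            public static void streamInOrder(int chunkCount,
                                             ExecutorService pool,
                                             IntFunction<byte[]> download,
                                             Consumer<byte[]> sink)
                    throws ExecutionException, InterruptedException {
                List<CompletableFuture<byte[]>> slots = new ArrayList<>();
                for (int i = 0; i < chunkCount; i++) {
                    final int idx = i;
                    slots.add(CompletableFuture.supplyAsync(() -> download.apply(idx), pool));
                }
                // Blocks only on the next chunk it needs; later chunks keep
                // downloading in the background while earlier ones are streamed.
                for (CompletableFuture<byte[]> slot : slots) {
                    sink.accept(slot.get());
                }
            }
        }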

    Read the article

  • C# string equality operator returns false, but I'm pretty sure it should be true... What?!

    - by Daniel Schaffer
    I'm trying to write a unit test for a piece of code that generates a large amount of text. I've run into an issue where the "expected" and "actual" strings appear to be equal, but Assert.AreEqual throws, and both the equality operator and Equals() return false. The result of GetHashCode() is different for both values as well. However, putting both strings into text files and comparing with DiffMerge tells me they're the same. Additionally, using Encoding.ASCII.GetBytes() on both values and then using SequenceEquals to compare the resulting byte arrays returns true. The values are 34KB each, so I'll hold off putting them here for now. Any ideas? I'm completely stumped.
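
    One thing worth knowing here: Encoding.ASCII.GetBytes() folds every character above 0x7F to the same '?' byte, so two strings that differ only in non-ASCII characters (say, U+2013 vs. U+2014) still produce identical byte arrays, which would explain every symptom above. A quick probe is to report the first differing code unit; a sketch of that loop (written in Java, but it ports to C# line for line):

        static void firstDifference(String expected, String actual) {
            int n = Math.min(expected.length(), actual.length());
            for (int i = 0; i < n; i++) {
                if (expected.charAt(i) != actual.charAt(i)) {
                    System.out.printf("first difference at index %d: U+%04X vs U+%04X%n",
                            i, (int) expected.charAt(i), (int) actual.charAt(i));
                    return;
                }
            }
            if (expected.length() != actual.length())
                System.out.printf("equal prefix of %d chars; lengths differ: %d vs %d%n",
                        n, expected.length(), actual.length());
        }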

    Read the article

  • How to receive a datastream from a device on your computer, in C#

    - by WebDevHobo
    I plan to build a small audio-recorder app in C#. My laptop has a built-in microphone that's always active, so I want to use that as an early-stage test. I would simply start recording, save the file as a .wav, or even use the LAME dll to turn it into an MP3. The problem is, I don't know how to contact that microphone. Do I use a library that can detect the device, or do I just catch a stream of bytes from the port that the device is on? I don't have any experience with receiving data from connected devices. I suppose I'll need to read all the data into a byte array and then serialize that into a WAV file, but I'm not sure. Can I get some pointers on this subject?
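
    For comparison, here is what that flow looks like with Java's built-in sound API: open the default capture device for a PCM format, record for a fixed time, and let the file writer take care of the WAV header. This is only a sketch of the general shape (the asker's C# world would use something like waveIn or the NAudio library instead):

        import javax.sound.sampled.*;
        import java.io.File;

        public class MicDump {
            public static void main(String[] args) throws Exception {
                AudioFormat fmt = new AudioFormat(44100f, 16, 1, true, false); // 16-bit mono PCM
                TargetDataLine line = AudioSystem.getTargetDataLine(fmt);      // default microphone
                line.open(fmt);
                line.start();
                new Thread(() -> {                // stop capturing after ~5 seconds
                    try { Thread.sleep(5000); } catch (InterruptedException ignored) {}
                    line.stop();
                    line.close();                 // makes the write() below return
                }).start();
                // AudioSystem.write produces the WAV header; no manual serializing needed.
                AudioSystem.write(new AudioInputStream(line),
                                  AudioFileFormat.Type.WAVE, new File("capture.wav"));
            }
        }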

    Read the article

  • Me As Child Type In General Function

    - by Steven
    I have a MustInherit Parent class with two Child classes which inherit from the Parent. How can I use (or cast) Me in a Parent function as the child type of that instance? EDIT: My actual goal is to be able to serialize (BinaryFormatter.Serialize(Stream, Object)) either of my child classes. However, "repeating the code" in each child "seems" wrong. EDIT2: This is my Serialize function. Where should I implement this function? Copying and pasting to each child doesn't seem right, but casting the parent to a child doesn't seem right either.

        Public Function Serialize() As Byte()
            Dim bFmt As New BinaryFormatter()
            Dim mStr As New MemoryStream()
            bFmt.Serialize(mStr, Me)
            Return mStr.ToArray()
        End Function
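
    For what it's worth, runtime polymorphism already does the heavy lifting here: inside an instance method, Me is the concrete child object, and BinaryFormatter.Serialize(Stream, Object) serializes the runtime type, so the function can live once in the MustInherit parent with no casting at all. A hedged sketch of the same pattern in Java, with ObjectOutputStream standing in for BinaryFormatter:

        import java.io.*;

        abstract class Base implements Serializable {
            // Defined once in the abstract parent: 'this' is already the concrete
            // subclass at runtime, so the child's fields get serialized.
            byte[] serialize() throws IOException {
                ByteArrayOutputStream buf = new ByteArrayOutputStream();
                try (ObjectOutputStream out = new ObjectOutputStream(buf)) {
                    out.writeObject(this);
                }
                return buf.toByteArray();
            }
        }

        class Child extends Base {
            int value = 42;   // serialized even though serialize() lives in Base
        }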

    Read the article

  • in java, which is better - three arrays of booleans or 1 array of bytes?

    - by joe_shmoe
    I know the question sounds silly, but consider this: I have an array of items and a labelling algorithm. At any point an item is in one of three states. The current version holds these states in a byte array, where 0, 1 and 2 represent the three states. Alternatively, I could have three arrays of booleans, one for each state. Which is better (consumes less memory) depends on how the JVM (Sun's version) stores the arrays: is a boolean represented by 1 bit? (P.S. Don't start with all that "this is not the way OO/Java works"; I know, but here performance comes first. Plus, the algorithm is simple and perfectly readable even in this form.) Thanks a lot
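
    On mainstream JVMs (HotSpot included) a boolean[] element occupies a whole byte, not a bit, so three boolean arrays cost roughly three bytes per item against one byte for the single byte array. If memory is really the driver, three states only need two bits per item; a sketch of that packing:

        public class PackedStates {
            private final byte[] bits;   // four items per byte, two bits each (states 0..2)

            public PackedStates(int n) { bits = new byte[(n + 3) / 4]; }

            public int get(int i) {
                return (bits[i >> 2] >> ((i & 3) << 1)) & 0b11;
            }

            public void set(int i, int state) {   // state in 0..2
                int shift = (i & 3) << 1;
                bits[i >> 2] = (byte) ((bits[i >> 2] & ~(0b11 << shift)) | (state << shift));
            }
        }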

    Read the article

  • What platforms have something other than 8-bit char?

    - by Craig McQueen
    Every now and then, someone on SO points out that char (aka 'byte') isn't necessarily 8 bits. It seems that 8-bit char is almost universal. I would have thought that for mainstream platforms, it is necessary to have an 8-bit char to ensure its viability in the marketplace. Both now and historically, what platforms use a char that is not 8 bits, and why would they differ from the "normal" 8 bits? When writing code, and thinking about cross-platform support (e.g. for general-use libraries), what sort of consideration is it worth giving to platforms with non-8-bit char? In the past I've come across some Analog Devices DSPs for which char is 16 bits. DSPs are a bit of a niche architecture I suppose. (Then again, at the time hand-coded assembler easily beat what the available C compilers could do, so I didn't really get much experience with C on that platform.)

    Read the article

  • KeyListener problem

    - by rgksugan
    In my application I am using a JPanel to which I want to add a key listener. I did, but it does not work. Is it because I am using a SwingWorker to update the contents of the panel every second? Here is my code to update the panel:

        RenderedImage image = ImageIO.read(new ByteArrayInputStream((byte[]) get()));
        Graphics graphics = remote.rdpanel.getGraphics();
        if (graphics != null) {
            Image readyImage = new ImageIcon(UtilityFunctions.convertRenderedImage(image)).getImage();
            graphics.drawImage(readyImage, 0, 0, remote.rdpanel.getWidth(), remote.rdpanel.getHeight(), null);
        }
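
    A KeyListener on a JPanel only fires while the panel itself owns keyboard focus, which child components and refocusing during repaints easily break; Swing's key-binding API is the usual focus-proof alternative. A minimal, self-contained sketch (the F5/"refresh" action is hypothetical, not the asker's real handler):

        import javax.swing.*;
        import java.awt.event.ActionEvent;

        public class KeyBindingDemo {
            public static void main(String[] args) {
                JPanel panel = new JPanel();
                // WHEN_IN_FOCUSED_WINDOW bindings fire as long as the window has
                // focus, so a SwingWorker repainting the panel cannot starve them.
                panel.getInputMap(JComponent.WHEN_IN_FOCUSED_WINDOW)
                     .put(KeyStroke.getKeyStroke("F5"), "refresh");
                panel.getActionMap().put("refresh", new AbstractAction() {
                    @Override public void actionPerformed(ActionEvent e) {
                        System.out.println("F5 pressed");
                    }
                });
                JFrame f = new JFrame("demo");
                f.add(panel);
                f.setSize(200, 200);
                f.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
                f.setVisible(true);
            }
        }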

    Read the article

  • Map derived class as an independent one with FNH's Automap

    - by Anton Gogolev
    Hi! Basically, I have an ImageMetadata class and an Image class, which derives from ImageMetadata. Image adds one property, byte[] Content, which actually contains the binary data. What I want to do is map these two classes onto one table, but I absolutely do not need NHibernate's inheritance support to kick in. I want to tailor FNH Automap to produce something like:

        <class name="ImageMetadata" ...>
          <property name="Name" ... />
          ...
        </class>

        <class name="Image" ...>
          <property name="Name" ... />
          <property name="Content" ... />
          ...
        </class>

    Is this at all possible?

    Read the article

  • How do I write raw binary data in Python?

    - by Chris B.
    I've got a Python program that stores and writes data to a file. The data is raw binary data, stored internally as str. I'm writing it out through a utf-8 codec. However, I get UnicodeDecodeError: 'charmap' codec can't decode byte 0x8d in position 25: character maps to <undefined> in the cp1252.py file. This looks to me like Python is trying to interpret the data using the default code page. But it doesn't have a default code page. That's why I'm using str, not unicode. I guess my questions are: How do I represent raw binary data in memory, in Python? When I'm writing raw binary data out through a codec, how do I encode/unencode it?

    Read the article

  • Are C++ Reads and Writes of an int atomic

    - by theschmitzer
    I have two threads, one updating an int and one reading it. This value is a statistic where the order of the read and write is irrelevant. My question is, do I need to synchronize access to this multi-byte value anyway? Or, put another way, can part of the write be complete and get interrupted, and then the read happen? For example, think of value = 0x0000FFFF, incremented to 0x00010000. Is there a moment where the value looks like 0x0001FFFF that I should be worried about? Certainly the larger the type, the more likely something like this is. I've always synchronized these types of accesses, but was curious what the community thought.
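
    As a point of comparison, Java pins this down explicitly: the memory model guarantees that 32-bit reads and writes never tear, while visibility across threads and read-modify-write atomicity still require volatile or java.util.concurrent.atomic. A sketch of that distinction:

        import java.util.concurrent.atomic.AtomicInteger;

        public class StatCounter {
            // int reads/writes are atomic (no torn 0x0001FFFF-style values),
            // but without volatile the reader may never see the update.
            private volatile int stat;
            private final AtomicInteger hits = new AtomicInteger();

            void update()     { stat = stat + 1; }          // atomic read, atomic write,
                                                            // but NOT atomic as a whole
            void safeUpdate() { hits.incrementAndGet(); }   // atomic read-modify-write
            int  read()       { return stat; }              // always a consistent value
        }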

    Read the article

  • Java String to SHA1

    - by AeroDroid
    I'm trying to make a simple String to SHA1 converter in Java and this is what I've got...

        public static String toSHA1(byte[] convertme) {
            MessageDigest md = null;
            try {
                md = MessageDigest.getInstance("SHA-1");
            } catch (NoSuchAlgorithmException e) {
                e.printStackTrace();
            }
            return new String(md.digest(convertme));
        }

    When I pass it toSHA1("password".getBytes()), I get "[?a?????%l?3~??." I know it's probably a simple encoding fix like UTF-8, but could someone tell me what I should do to get what I want, which is "5baa61e4c9b93f3f0682250b6cf8331b7ee68fd8"? Or am I doing this completely wrong? Thanks a lot!
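
    The digest comes back as 20 raw bytes, and new String() then tries to decode those bytes as text, hence the garbage; no charset choice will fix that. Rendering each byte as two hex digits gives the expected "5baa61e4..." form. A sketch:

        import java.security.MessageDigest;
        import java.security.NoSuchAlgorithmException;

        public static String toSHA1Hex(byte[] convertme) throws NoSuchAlgorithmException {
            byte[] digest = MessageDigest.getInstance("SHA-1").digest(convertme);
            StringBuilder hex = new StringBuilder(digest.length * 2);
            for (byte b : digest) {
                hex.append(String.format("%02x", b));   // two lowercase hex digits per byte
            }
            return hex.toString();
        }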

    Read the article

  • F# powerpack and distribution

    - by rwallace
    I need arbitrary-precision rational numbers, which I'm given to understand are available in the F# PowerPack. My question is about the mechanics of distribution; my program needs to be able to compile and run both on Windows/.NET and Linux/Mono at least, since I have potential users on both platforms. As I understand it, the best procedure is:

        1. Download the PowerPack .zip, not the installer.
        2. Copy the DLL into my program directory.
        3. Copy the accompanying license file into my program directory, to make sure everything is above board.
        4. Declare references and go ahead and use the functions I need.
        5. Ship the above files along with my source and binary; since the DLL uses bytecode, it will work fine on any platform.

    Is this the correct procedure? Am I missing anything?

    Read the article

  • C++ USB Programming

    - by JB_SO
    Hi, I am new to hardware programming (especially USB), so please bear with me and my questions. I am using C++ and I need to send/receive some data (a byte array) to/from a USB port on a microprocessor board. Now, I have done some serial port programming before, and I know that for a serial port you have to open the port, set it up, perform I/O and finally close the port. I am guessing that using a USB port is not as simple as that. I do know that I want to use Microsoft's standard drivers and standard Windows I/O commands to accomplish this, since I believe there are no drivers for the microprocessor board for me to interact with. If somebody can point me in the right direction as to the steps needed to "talk" to a USB port (open, setup, I/O) via standard Windows I/O commands, I would truly and greatly appreciate it. Thank you so much!!

    Read the article

  • Screen capture code produces black bitmap

    - by wadetandy
    I need to add the ability to take a screenshot of the entire screen, not just the current window. The following code produces a bmp file with the correct dimensions, but the image is completely black. What am I doing wrong?

        void CaptureScreen(LPCTSTR lpszFilePathName)
        {
            BITMAPFILEHEADER bmfHeader;
            BITMAPINFO *pbminfo;
            HBITMAP hBmp;
            FILE *oFile;
            HDC screen;
            HDC memDC;
            int sHeight;
            int sWidth;
            LPBYTE pBuff;
            BITMAP bmp;
            WORD cClrBits;
            RECT rcClient;

            screen = GetDC(0);
            memDC = CreateCompatibleDC(screen);
            sHeight = GetDeviceCaps(screen, VERTRES);
            sWidth = GetDeviceCaps(screen, HORZRES);
            //GetObject(screen, sizeof(BITMAP), &bmp);
            hBmp = CreateCompatibleBitmap(screen, sWidth, sHeight);

            // Retrieve the bitmap color format, width, and height.
            GetObject(hBmp, sizeof(BITMAP), (LPSTR)&bmp);

            // Convert the color format to a count of bits.
            cClrBits = (WORD)(bmp.bmPlanes * bmp.bmBitsPixel);
            if (cClrBits == 1)
                cClrBits = 1;
            else if (cClrBits <= 4)
                cClrBits = 4;
            else if (cClrBits <= 8)
                cClrBits = 8;
            else if (cClrBits <= 16)
                cClrBits = 16;
            else if (cClrBits <= 24)
                cClrBits = 24;
            else
                cClrBits = 32;

            // Allocate memory for the BITMAPINFO structure (plus a color
            // table, if one is needed for this format).
            if (cClrBits < 24)
                pbminfo = (BITMAPINFO *) LocalAlloc(LPTR,
                              sizeof(BITMAPINFOHEADER) + sizeof(RGBQUAD) * (1 << cClrBits));
            else
                pbminfo = (BITMAPINFO *) LocalAlloc(LPTR, sizeof(BITMAPINFOHEADER));

            pbminfo->bmiHeader.biSize = sizeof(BITMAPINFOHEADER);
            pbminfo->bmiHeader.biWidth = bmp.bmWidth;
            pbminfo->bmiHeader.biHeight = bmp.bmHeight;
            pbminfo->bmiHeader.biPlanes = bmp.bmPlanes;
            pbminfo->bmiHeader.biBitCount = bmp.bmBitsPixel;
            if (cClrBits < 24)
                pbminfo->bmiHeader.biClrUsed = (1 << cClrBits);
            pbminfo->bmiHeader.biCompression = BI_RGB;

            // Compute the number of bytes in the array of color
            // indices and store the result in biSizeImage.
            // The width must be DWORD aligned unless the bitmap is RLE
            // compressed.
            pbminfo->bmiHeader.biSizeImage =
                ((pbminfo->bmiHeader.biWidth * cClrBits + 31) & ~31) / 8
                * pbminfo->bmiHeader.biHeight;

            // Set biClrImportant to 0, indicating that all of the
            // device colors are important.
            pbminfo->bmiHeader.biClrImportant = 0;

            CreateBMPFile(lpszFilePathName, pbminfo, hBmp, memDC);
        }

        void CreateBMPFile(LPTSTR pszFile, PBITMAPINFO pbi, HBITMAP hBMP, HDC hDC)
        {
            HANDLE hf;                // file handle
            BITMAPFILEHEADER hdr;     // bitmap file-header
            PBITMAPINFOHEADER pbih;   // bitmap info-header
            LPBYTE lpBits;            // memory pointer
            DWORD dwTotal;            // total count of bytes
            DWORD cb;                 // incremental count of bytes
            BYTE *hp;                 // byte pointer
            DWORD dwTmp;
            int lines;

            pbih = (PBITMAPINFOHEADER) pbi;
            lpBits = (LPBYTE) GlobalAlloc(GMEM_FIXED, pbih->biSizeImage);

            // Retrieve the color table (RGBQUAD array) and the bits
            // (array of palette indices) from the DIB.
            lines = GetDIBits(hDC, hBMP, 0, (WORD) pbih->biHeight, lpBits, pbi,
                              DIB_RGB_COLORS);

            // Create the .BMP file.
            hf = CreateFile(pszFile, GENERIC_READ | GENERIC_WRITE, (DWORD) 0, NULL,
                            CREATE_ALWAYS, FILE_ATTRIBUTE_NORMAL, (HANDLE) NULL);
            hdr.bfType = 0x4d42;   // 0x42 = "B", 0x4d = "M"

            // Compute the size of the entire file.
            hdr.bfSize = (DWORD) (sizeof(BITMAPFILEHEADER) + pbih->biSize
                                  + pbih->biClrUsed * sizeof(RGBQUAD) + pbih->biSizeImage);
            hdr.bfReserved1 = 0;
            hdr.bfReserved2 = 0;

            // Compute the offset to the array of color indices.
            hdr.bfOffBits = (DWORD) sizeof(BITMAPFILEHEADER) + pbih->biSize
                            + pbih->biClrUsed * sizeof(RGBQUAD);

            // Copy the BITMAPFILEHEADER into the .BMP file.
            WriteFile(hf, (LPVOID) &hdr, sizeof(BITMAPFILEHEADER),
                      (LPDWORD) &dwTmp, NULL);

            // Copy the BITMAPINFOHEADER and RGBQUAD array into the file.
            WriteFile(hf, (LPVOID) pbih,
                      sizeof(BITMAPINFOHEADER) + pbih->biClrUsed * sizeof(RGBQUAD),
                      (LPDWORD) &dwTmp, NULL);

            // Copy the array of color indices into the .BMP file.
            dwTotal = cb = pbih->biSizeImage;
            hp = lpBits;
            WriteFile(hf, (LPSTR) hp, (int) cb, (LPDWORD) &dwTmp, NULL);

            // Close the .BMP file.
            CloseHandle(hf);

            // Free memory.
            GlobalFree((HGLOBAL) lpBits);
        }

    Read the article

  • How to convert from integer to unsigned char in C, given integers larger than 256?

    - by Alf_InPogform
    As part of my CS course I've been given some functions to use. One of these functions takes a pointer to unsigned chars to write some data to a file (I have to use this function, so I can't just make my own purpose-built function that works differently, BTW). I need to write an array of integers whose values can be up to 4095 using this function (that only takes unsigned chars). However, am I right in thinking that an unsigned char can only have a max value of 256 because it is 1 byte long? I therefore need to use 4 unsigned chars for every integer? But casting doesn't seem to work with larger values for the integer. Does anyone have any idea how best to convert an array of integers to unsigned chars?
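
    Values up to 4095 occupy 12 bits, so two unsigned chars per integer are enough: shift and mask to split, shift and OR to rejoin (plain casting only keeps the low 8 bits, which is why it appears not to work). A sketch of the round trip, in Java for concreteness; the same shifts work on unsigned char in C, and the byte order is an arbitrary choice both sides must agree on:

        public class TwelveBitCodec {
            static byte[] pack(int[] values) {
                byte[] out = new byte[values.length * 2];
                for (int i = 0; i < values.length; i++) {
                    out[2 * i]     = (byte) (values[i] & 0xFF);         // low 8 bits
                    out[2 * i + 1] = (byte) ((values[i] >> 8) & 0xFF);  // high 4 bits
                }
                return out;
            }

            static int[] unpack(byte[] bytes) {
                int[] out = new int[bytes.length / 2];
                for (int i = 0; i < out.length; i++) {
                    out[i] = (bytes[2 * i] & 0xFF) | ((bytes[2 * i + 1] & 0xFF) << 8);
                }
                return out;
            }
        }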

    Read the article

  • Android Out of memory regarding png image

    - by turtleboy
    I have a jpg image in my app that shows correctly. In my ListView I'd like to make the image more transparent so it is easier to see the text. I changed the image to PNG format and altered its opacity in GIMP. Now that the new image is in the app's drawable folder, I'm getting the following error. Why?

        09-28 09:24:07.560: I/global(20140): call socket shutdown, tmpsocket=Socket[address=/178.250.50.40,port=80,localPort=35172]
        09-28 09:24:07.570: I/global(20140): call socket shutdown, tmpsocket=Socket[address=/212.169.27.217,port=84,localPort=55656]
        09-28 09:24:07.690: D/dalvikvm(20140): GC_FOR_ALLOC freed 113K, 4% free 38592K/39907K, paused 32ms
        09-28 09:24:07.690: I/dalvikvm-heap(20140): Forcing collection of SoftReferences for 28072816-byte allocation
        09-28 09:24:07.740: D/dalvikvm(20140): GC_BEFORE_OOM freed 9K, 4% free 38582K/39907K, paused 43ms
        09-28 09:24:07.740: E/dalvikvm-heap(20140): Out of memory on a 28072816-byte allocation.
        09-28 09:24:07.740: I/dalvikvm(20140): "main" prio=5 tid=1 RUNNABLE
        09-28 09:24:07.740: I/dalvikvm(20140): | group="main" sCount=0 dsCount=0 obj=0x40a57490 self=0x1b6e9a8
        09-28 09:24:07.740: I/dalvikvm(20140): | sysTid=20140 nice=0 sched=0/0 cgrp=default handle=1074361640
        09-28 09:24:07.740: I/dalvikvm(20140): | schedstat=( 2289118000 760844000 2121 ) utm=195 stm=33 core=1
        09-28 09:24:07.740: I/dalvikvm(20140): at android.graphics.BitmapFactory.nativeDecodeAsset(Native Method)
        09-28 09:24:07.740: I/dalvikvm(20140): at android.graphics.BitmapFactory.decodeResourceStream(BitmapFactory.java:486)
        09-28 09:24:07.740: I/dalvikvm(20140): at android.graphics.drawable.Drawable.createFromResourceStream(Drawable.java:773)
        09-28 09:24:07.740: I/dalvikvm(20140): at android.content.res.Resources.loadDrawable(Resources.java:2042)
        09-28 09:24:07.740: I/dalvikvm(20140): at android.content.res.TypedArray.getDrawable(TypedArray.java:601)
        09-28 09:24:07.740: I/dalvikvm(20140): at android.view.View.<init>(View.java:2812)
        09-28 09:24:07.740: I/dalvikvm(20140): at android.view.ViewGroup.<init>(ViewGroup.java:410)
        09-28 09:24:07.740: I/dalvikvm(20140): at android.widget.LinearLayout.<init>(LinearLayout.java:174)
        09-28 09:24:07.740: I/dalvikvm(20140): at android.widget.LinearLayout.<init>(LinearLayout.java:170)
        09-28 09:24:07.740: I/dalvikvm(20140): at java.lang.reflect.Constructor.constructNative(Native Method)
        09-28 09:24:07.740: I/dalvikvm(20140): at java.lang.reflect.Constructor.newInstance(Constructor.java:417)
        09-28 09:24:07.740: I/dalvikvm(20140): at android.view.LayoutInflater.createView(LayoutInflater.java:586)
        09-28 09:24:07.740: I/dalvikvm(20140): at com.android.internal.policy.impl.PhoneLayoutInflater.onCreateView(PhoneLayoutInflater.java:56)
        09-28 09:24:07.740: I/dalvikvm(20140): at android.view.LayoutInflater.onCreateView(LayoutInflater.java:653)
        09-28 09:24:07.740: I/dalvikvm(20140): at android.view.LayoutInflater.createViewFromTag(LayoutInflater.java:678)
        09-28 09:24:07.740: I/dalvikvm(20140): at android.view.LayoutInflater.inflate(LayoutInflater.java:466)
        09-28 09:24:07.740: I/dalvikvm(20140): at android.view.LayoutInflater.inflate(LayoutInflater.java:396)
        09-28 09:24:07.740: I/dalvikvm(20140): at android.view.LayoutInflater.inflate(LayoutInflater.java:352)
        09-28 09:24:07.740: I/dalvikvm(20140): at com.android.internal.policy.impl.PhoneWindow.setContentView(PhoneWindow.java:278)
        09-28 09:24:07.740: I/dalvikvm(20140): at android.app.Activity.setContentView(Activity.java:1897)
        09-28 09:24:07.740: I/dalvikvm(20140): at com.carefreegroup.ShowMoreDetails.onCreate(ShowMoreDetails.java:26)
        09-28 09:24:07.740: I/dalvikvm(20140): at android.app.Activity.performCreate(Activity.java:4543)
        09-28 09:24:07.740: I/dalvikvm(20140): at android.app.Instrumentation.callActivityOnCreate(Instrumentation.java:1071)
        09-28 09:24:07.740: I/dalvikvm(20140): at android.app.ActivityThread.performLaunchActivity(ActivityThread.java:2181)
        09-28 09:24:07.740: I/dalvikvm(20140): at android.app.ActivityThread.handleLaunchActivity(ActivityThread.java:2260)
        09-28 09:24:07.740: I/dalvikvm(20140): at android.app.ActivityThread.access$600(ActivityThread.java:139)
        09-28 09:24:07.740: I/dalvikvm(20140): at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1277)
        09-28 09:24:07.740: I/dalvikvm(20140): at android.os.Handler.dispatchMessage(Handler.java:99)
        09-28 09:24:07.740: I/dalvikvm(20140): at android.os.Looper.loop(Looper.java:156)
        09-28 09:24:07.740: I/dalvikvm(20140): at android.app.ActivityThread.main(ActivityThread.java:5045)
        09-28 09:24:07.740: I/dalvikvm(20140): at java.lang.reflect.Method.invokeNative(Native Method)
        09-28 09:24:07.740: I/dalvikvm(20140): at java.lang.reflect.Method.invoke(Method.java:511)
        09-28 09:24:07.740: I/dalvikvm(20140): at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:784)
        09-28 09:24:07.740: I/dalvikvm(20140): at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:551)
        09-28 09:24:07.740: I/dalvikvm(20140): at dalvik.system.NativeStart.main(Native Method)
        09-28 09:24:07.740: E/dalvikvm(20140): Out of memory: Heap Size=46115KB, Allocated=38582KB, Limit=65536KB
        09-28 09:24:07.740: E/dalvikvm(20140): Extra info: Footprint=39907KB, Allowed Footprint=46115KB, Trimmed=892KB
        09-28 09:24:07.740: E/Bitmap_JNI(20140): Create Bitmap Failed.
        09-28 09:24:07.740: A/libc(20140): Fatal signal 11 (SIGSEGV) at 0x00000004 (code=1)
        09-28 09:24:09.750: I/dalvikvm(20367): Turning on JNI app bug workarounds for target SDK version 10...
        09-28 09:24:09.940: D/dalvikvm(20367): GC_CONCURRENT freed 864K, 21% free 3797K/4771K, paused 2ms+2ms

    thanks.

    [update]

        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            setContentView(R.layout.showmoredetailslayout);
            actualCallTime = (TextView) findViewById(R.id.actualcalltime);
            doubleUp = (TextView) findViewById(R.id.doubleupcallid);
            needName = (TextView) findViewById(R.id.needname);
            needNameLabel = (TextView) findViewById(R.id.neednamelabel);
            getRotaDetails = (Button) findViewById(R.id.buttongetrotadetails);
            intent = this.getIntent();
            String actualTimeIn = intent.getStringExtra("actTimeIn");
            String actualTimeOut = intent.getStringExtra("actTimeOut");
            String doubleUpValue = intent.getStringExtra("doubleUpValue");
            String needNameWithCommas = intent.getStringExtra("needNameWithCommas");
            callID = intent.getStringExtra("callID");
            String[] needs = needNameWithCommas.split(",");
            actualCallTime.setText("This call was completed at " + actualTimeIn + " -" + actualTimeOut);
            if (!doubleUpValue.equalsIgnoreCase("") || doubleUpValue.equalsIgnoreCase("]")) {
                doubleUp.setText("This call was not a double up ");
            } else {
                doubleUp.setText("This call was a double up " + doubleUpValue);
            }
            needNameLabel.setText("Purpose of Call: ");
            for (int i = 0; i < needs.length; i++) {
                needName.append(needs[i] + "\n");
            }
            getRotaDetails.setOnClickListener(new OnClickListener() {
                @Override
                public void onClick(View v) {
                    Intent intent = new Intent(ShowMoreDetails.this, GetRotaDetails.class);
                    intent.putExtra("callIDExtra", callID);
                    startActivity(intent);
                }
            });
        }
        }
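
    The log shows the decode failing on a single 28 MB allocation, i.e. the PNG's raw pixel size, not its file size, is what matters. The usual mitigation is to subsample at decode time with BitmapFactory.Options; a hedged sketch (resId stands in for whatever drawable the layout loads):

        import android.content.res.Resources;
        import android.graphics.Bitmap;
        import android.graphics.BitmapFactory;

        static Bitmap decodeSampled(Resources res, int resId, int sampleSize) {
            BitmapFactory.Options opts = new BitmapFactory.Options();
            opts.inJustDecodeBounds = true;    // first pass: dimensions only, no pixels
            BitmapFactory.decodeResource(res, resId, opts);
            opts.inJustDecodeBounds = false;
            opts.inSampleSize = sampleSize;    // e.g. 4 -> 1/16 of the pixel memory
            return BitmapFactory.decodeResource(res, resId, opts);
        }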

    Read the article

  • Visual Studio 2010 compilation general error c1010070

    - by user1747455
    I wrote a little "hello world" program to test my computer, but when I compile the program there's an error:

        ------ Build started: Project: hi, Configuration: Debug Win32 ------
        Build started 15/10/2012 22:36:48.
        InitializeBuildStatus:
          Touching "Debug\hi.unsuccessfulbuild".
        Link:
          hi.vcxproj -> D:\MSVS\hi\Debug\hi.exe
        Manifest:
          Debug\hi.exe.intermediate.manifest : general error c1010070: Failed to load and parse the manifest. {q~ 0H
        Build FAILED.
        Time Elapsed 00:00:02.34
        ========== Build: 0 succeeded, 1 failed, 0 up-to-date, 0 skipped ==========

    The creepiest thing, I think, is that line of "{q~ 0H"... I wonder if there's any way to solve this. Please help. Many thanks.

    Edit: I tried changing the character set to "multi-byte" and turning "embed manifest" (from "Manifest Tool") off, but it still can't solve the error.

    Read the article

  • Loading xml with encoding UTF 16 using XDocument

    - by Sangram
    Hi, I am trying to read an XML document using XDocument, but I am getting an error when the XML has:

        <?xml version="1.0" encoding="utf-16"?>

    When I remove the encoding declaration manually, it works perfectly. I am getting the error "There is no Unicode byte order mark. Cannot switch to Unicode." I tried searching and I landed up here: Why does C# XmlDocument.LoadXml(string) fail when an XML header is included? But I could not solve my problem. My code:

        XDocument xdoc = XDocument.Load(path);

    Any suggestions? Thank you.
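
    That error usually means the prolog claims utf-16 but the bytes on disk are something else (often utf-8 with no BOM), so the parser refuses to switch. The generic fix is to decode the bytes yourself and hand the parser characters, at which point the declared encoding stops mattering; in C# that is XDocument.Load over a StreamReader opened with the file's real encoding, and the same idea looks like this in Java (the UTF-8 guess is exactly that, an assumption):

        import javax.xml.parsers.DocumentBuilderFactory;
        import org.w3c.dom.Document;
        import org.xml.sax.InputSource;
        import java.io.*;
        import java.nio.charset.StandardCharsets;

        static Document parseIgnoringProlog(File f) throws Exception {
            // Supplying a character stream means the parser never has to
            // verify the byte-level encoding declared in the prolog.
            Reader chars = new InputStreamReader(new FileInputStream(f), StandardCharsets.UTF_8);
            return DocumentBuilderFactory.newInstance()
                    .newDocumentBuilder()
                    .parse(new InputSource(chars));
        }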

    Read the article

  • Appropriate data structure for a buffer of the packets

    - by psihodelia
    How do I implement a buffer of packets where each packet is of the form:

        typedef struct {
            int32 IP;   // 4-byte IP address
            int16 ID;   // unique sequence id
        } t_Packet;

    What would be the most appropriate data structure which: (1) allows collecting at least 8000 such packets (fast insert and delete operations); (2) allows very fast filtering by IP address, so that only packets with a given IP are selected; (3) allows very fast find operations using the ID as a key; (4) allows very fast (2), then (3) within the filtered results? RAM size does matter, e.g. no huge lookup table is possible.
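
    One workable layout, sketched in Java for concreteness: a two-level hash index, IP to (ID to packet), plus a flat ID index. Inserts and deletes are O(1), filtering by IP hands back one inner map, and a find-by-ID inside that filter is a single lookup; memory stays proportional to the ~8000 live packets, with no big lookup table:

        import java.util.*;

        public class PacketBuffer {
            public record Packet(int ip, short id) {}

            private final Map<Integer, Map<Short, Packet>> byIp = new HashMap<>();
            private final Map<Short, Packet> byId = new HashMap<>();

            public void insert(Packet p) {
                byIp.computeIfAbsent(p.ip(), k -> new HashMap<>()).put(p.id(), p);
                byId.put(p.id(), p);
            }

            public void delete(Packet p) {
                Map<Short, Packet> bucket = byIp.get(p.ip());
                if (bucket != null) {
                    bucket.remove(p.id());
                    if (bucket.isEmpty()) byIp.remove(p.ip());
                }
                byId.remove(p.id());
            }

            public Collection<Packet> filterByIp(int ip) {   // requirement (2)
                return byIp.getOrDefault(ip, Map.of()).values();
            }

            public Packet findById(short id) {               // requirement (3)
                return byId.get(id);
            }

            public Packet findInIp(int ip, short id) {       // requirement (4): (2) then (3)
                return byIp.getOrDefault(ip, Map.of()).get(id);
            }
        }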

    Read the article

  • Code sample with buffer overflow (gets method). Why does it not behave as expected?

    - by citronas
    This is an extract from a C program that should demonstrate a buffer overflow:

        void foo() {
            char arr[8];
            printf(" enter bla bla bla");
            gets(arr);
            printf(" you entered %s\n", arr);
        }

    The question was "How many input chars can a user maximally enter without creating a buffer overflow?" My initial answer was 8, because the char array is 8 bytes long. Although I was pretty certain my answer was correct, I tried a higher number of chars and found that the limit of chars I can enter before I get a segmentation fault is 11. (I'm running this on a VirtualBox Ubuntu.) So my question is: why is it possible to enter 11 chars into that 8-byte array?

    Read the article

  • Using typedefs (or #defines) on built in types - any sensible reason?

    - by jb
    Well, I'm doing some Java - C integration, and throughout the C library weird type mappings are used (there are more of them ;)):

        #define CHAR  char    /* 8 bit signed int */
        #define SHORT short   /* 16 bit signed int */
        #define INT   int     /* "natural" length signed int */
        #define LONG  long    /* 32 bit signed int */

        typedef unsigned char  BYTE;    /* 8 bit unsigned int */
        typedef unsigned char  UCHAR;   /* 8 bit unsigned int */
        typedef unsigned short USHORT;  /* 16 bit unsigned int */
        typedef unsigned int   UINT;    /* "natural" length unsigned int */

    Is there any legitimate reason not to use them? It's not like char is going to be redefined anytime soon. I can think of:

        - Writing platform/compiler-portable code (the size of a type is underspecified in C/C++).
        - Saving space and time on embedded systems: if you loop over an array shorter than 255 on an 8-bit microprocessor, writing for (uint8_t ii = 0; ii < len; ii++) will give a measurable speedup.

    Read the article

  • using .NET how to convert iso8859-1 encoded text files that contain Latin-1 accented characters to utf-8

    - by Tim
    I am being sent text files saved in iso8859-1 format that contain accented characters from the Latin-1 range (as well as normal ASCII a-z, etc.). How do I convert these files to utf-8 using C# so that the single-byte accented characters in iso8859-1 become valid utf-8 characters? I have tried to use a StreamReader with ASCIIEncoding, and then converting the ASCII string to UTF-8 by instantiating an ASCII encoding and a UTF-8 encoding and then using Encoding.Convert(ascii, utf8, ascii.GetBytes(asciiString)), but the accented characters are being rendered as question marks. What step am I missing?
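
    The missing step is the decode: reading the file through an ASCII decoder destroys the accented bytes before any conversion can happen, so the bytes have to be decoded as ISO-8859-1 and only then re-encoded as UTF-8. A sketch of that round trip (shown in Java; the C# analogue is opening the StreamReader with Encoding.GetEncoding("iso-8859-1") instead of ASCII and writing through a UTF-8 StreamWriter):

        import java.io.*;
        import java.nio.charset.StandardCharsets;

        static void transcode(File in, File out) throws IOException {
            try (Reader r = new InputStreamReader(new FileInputStream(in),
                                                  StandardCharsets.ISO_8859_1);
                 Writer w = new OutputStreamWriter(new FileOutputStream(out),
                                                   StandardCharsets.UTF_8)) {
                r.transferTo(w);   // copies characters, re-encoding them on the way out
            }
        }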

    Read the article

  • What's causing "Unable to retrieve native address from ByteBuffer object"?

    - by r0u1i
    As a very novice Java programmer, I probably should not mess with this kind of thing. Unfortunately, I'm using a library which has a method that accepts a ByteBuffer object and throws when I try to use it:

        Exception in thread "main" java.lang.NullPointerException: Unable to retrieve native address from ByteBuffer object

    Is it because I'm using a non-direct buffer? edit: There's not a lot of my code there. The library I'm using is jNetPcap, and I'm trying to dump a packet to file. My code takes an existing packet and extracts a ByteBuffer out of it:

        byte[] bytes = m_packet.getByteArray(0, m_packet.size());
        ByteBuffer buffer = ByteBuffer.wrap(bytes);

    Then it calls one of the dump methods of jNetPcap that takes a ByteBuffer.
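
    And indeed that is the usual cause: ByteBuffer.wrap() returns a heap buffer, which has no native address, while a native-backed library needs a direct buffer. A sketch of the usual conversion:

        import java.nio.ByteBuffer;

        static ByteBuffer toDirect(byte[] bytes) {
            ByteBuffer direct = ByteBuffer.allocateDirect(bytes.length); // has a native address
            direct.put(bytes);
            direct.flip();   // reset position so the native consumer reads from the start
            return direct;
        }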

    Read the article

  • utf-8 conversion doesn't always work

    - by Marco Piccinni
    I searched other questions before typing here and I didn't find anything similar. I have to scrape different utf-8 webpages which contain text like "Oggi è una bellissima giornata"; the problem is with the character "è". I extract this text with JTidy and an XPath query expression and I convert it with:

        byte[] content = filteredEncodedString.getBytes("utf-8");
        String result = new String(content, "utf-8");

    where filteredEncodedString contains the text "Oggi è una bellissima giornata". This procedure works on most webpages analyzed so far, but in some cases it doesn't extract a utf-8 string. The page encoding is always the same and the text is similar. Any ideas about the problem? thanks Marco
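
    One detail that stands out: getBytes("utf-8") followed by new String(..., "utf-8") is a round trip that returns an equal string, so it cannot repair anything; if "è" arrives corrupted, it was mis-decoded earlier, and the charset has to be applied at the point where bytes first become characters. A sketch of decoding a fetched page exactly once (the URL-fetching part is illustrative, not the asker's JTidy pipeline):

        import java.io.*;
        import java.net.URL;
        import java.nio.charset.StandardCharsets;

        static String fetchUtf8(String url) throws IOException {
            try (InputStream in = new URL(url).openStream();
                 ByteArrayOutputStream buf = new ByteArrayOutputStream()) {
                in.transferTo(buf);
                return buf.toString(StandardCharsets.UTF_8);   // decode once, correctly
            }
        }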

    Read the article

  • How do I store the image's pixels in matrix form?

    - by Rajendra Bhole
    Hi, I am developing an application in which I want to store the pixelized image in matrix format. The code is as follows:

        struct pixel {
            //unsigned char r, g, b, a;
            Byte r, g, b;
            int count;
        };

        - (NSInteger)processImage1:(UIImage *)image {
            // Allocate a buffer big enough to hold all the pixels
            struct pixel *pixels = (struct pixel *) calloc(1,
                image.size.width * image.size.height * sizeof(struct pixel));
            if (pixels != nil) {
                // Create a new bitmap
                CGContextRef context = CGBitmapContextCreate(
                    (void *) pixels,
                    image.size.width,
                    image.size.height,
                    8,
                    image.size.width * 4,
                    CGImageGetColorSpace(image.CGImage),
                    kCGImageAlphaPremultipliedLast);
                NSLog(@"1=%d, 2=%d, 3=%d", CGImageGetBitsPerComponent(image),
                      CGImageGetBitsPerPixel(image), CGImageGetBytesPerRow(image));
                if (context != NULL) {
                    // Draw the image in the bitmap
                    CGContextDrawImage(context,
                        CGRectMake(0.0f, 0.0f, image.size.width, image.size.height),
                        image.CGImage);
                    NSUInteger numberOfPixels = image.size.width * image.size.height;

    I am confused about how to initialize the 2-D matrix in which to store the pixel data.
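
    A note that may help with the actual question: bitmap contexts hand back one flat, row-major buffer, so a "2-D matrix" is normally just index arithmetic over it, with (x, y) living at y * width + x. A sketch of that view (Java for concreteness; the same arithmetic applies to the struct pixel buffer above):

        public class PixelMatrix {
            public static final class Pixel { byte r, g, b; int count; }

            private final Pixel[] data;   // flat, row-major storage
            private final int width;

            public PixelMatrix(int width, int height) {
                this.width = width;
                data = new Pixel[width * height];
                for (int i = 0; i < data.length; i++) data[i] = new Pixel();
            }

            // Row y starts at offset y * width, so (x, y) is data[y * width + x].
            public Pixel at(int x, int y) {
                return data[y * width + x];
            }
        }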

    Read the article
