Search Results

Search found 46178 results on 1848 pages for 'java home'.


  • How can I make Swig correctly wrap a char* buffer that is modified in C as a Java Something-or-other

    - by Ukko
    I am trying to wrap some legacy code for use in Java and I was quite happy to see that Swig was able to handle the header file and it generated a great wrapper that almost works. Now I am looking for the deep magic that will make it really work. In C I have a function that looks like this:

        DLL_IMPORT int DustyVoodoo(char *buff, int len, char *curse);

    The integer returned by this function is an error code in case it fails. The arguments are:

        buff: a character buffer
        len: the length of the data in the buffer
        curse: another character buffer that contains the result of calling DustyVoodoo

    So, you can see where this is going: the result is actually coming back via the third argument. Also, len is confusing, since it may be the length of both buffers; they are always allocated with the same size in the calling code, but given what DustyVoodoo does I don't think they need to be the same. To be safe, both buffers should be the same size in practice, say 512 chars. The C code generated for the binding is as follows:

        SWIGEXPORT jint JNICALL Java_pemapiJNI_DustyVoodoo(JNIEnv *jenv, jclass jcls, jstring jarg1, jint jarg2, jstring jarg3) {
          jint jresult = 0 ;
          char *arg1 = (char *) 0 ;
          int arg2 ;
          char *arg3 = (char *) 0 ;
          int result;
          (void)jenv;
          (void)jcls;
          arg1 = 0;
          if (jarg1) {
            arg1 = (char *)(*jenv)->GetStringUTFChars(jenv, jarg1, 0);
            if (!arg1) return 0;
          }
          arg2 = (int)jarg2;
          arg3 = 0;
          if (jarg3) {
            arg3 = (char *)(*jenv)->GetStringUTFChars(jenv, jarg3, 0);
            if (!arg3) return 0;
          }
          result = (int)PemnEncrypt(arg1,arg2,arg3);
          jresult = (jint)result;
          if (arg1) (*jenv)->ReleaseStringUTFChars(jenv, jarg1, (const char *)arg1);
          if (arg3) (*jenv)->ReleaseStringUTFChars(jenv, jarg3, (const char *)arg3);
          return jresult;
        }

    It is correct for what it does; however, it misses the fact that curse is not just an input: it is altered by the function and should be returned as an output. It also does not know that the Java Strings are really buffers and should be backed by a suitably sized array. I think that Swig can do the right thing here, I just can't figure out from the documentation how to tell Swig what it needs to know. Any typemap masters in the house?

    Read the article

  • How can JVM arguments be passed to apps started through java webstart in MacOS?

    - by siva
    We have an application which is triggered from the browser. This application consumes around 800 MB of memory. This works perfectly when invoked from any browser on Windows. The same application, when triggered from Mac OS, throws an out-of-memory exception, which occurs when the application is short of memory. Is there any way to increase the memory allocated to apps running in a Mac OS environment? Also, please let me know how JVM arguments can be passed to apps started through Java Web Start on Mac OS.

    Read the article

  • How do I read Unicode characters from an MS Access 2007 database through Java?

    - by Peter
    In Java, I have written a program that reads a UTF8 text file. The text file contains a SQL query of the SELECT kind. The program then executes the query on the Microsoft Access 2007 database and writes all fields of the first row to a UTF8 text file. The problem I have is when a row is returned that contains Unicode characters, such as "?". These characters show up as "?" in the text file. I know that the text files are read and written correctly, because a dummy UTF8 character ("?") is read from the text file containing the SQL query and written to the text file containing the resulting row. The UTF8 character looks correct when the written text file is opened in Notepad, so the reading and writing of the text files are not part of the problem. This is how I connect to the database and how I execute the SQL query:

        ---- START CODE
        Connection c = DriverManager.getConnection("jdbc:odbc:Driver={Microsoft Access Driver (*.mdb, *.accdb)};DBQ=C:/database.accdb;Pwd=temp");
        ResultSet r = c.createStatement().executeQuery(sql);
        ---- END CODE

    I have tried adding a charSet property to the Connection, but it makes no difference:

        ---- START CODE
        Properties p = new Properties();
        p.put("charSet", "utf-8");
        p.put("lc_ctype", "utf-8");
        p.put("encoding", "utf-8");
        Connection c = DriverManager.getConnection("...", p);
        ---- END CODE

    I tried "utf8"/"UTF8"/"UTF-8", with no difference. If I enter "UTF-16" I get the following exception: "java.lang.IllegalArgumentException: Illegal replacement". I have been searching around for hours with no results and now turn my hope to you. Please help! I also accept workaround suggestions. =) What I want to be able to do is to make a Unicode query (for example one that searches for posts that contain the "?" character) and to have results with Unicode characters received and saved correctly. Thank you!
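
    For reference, a workaround sometimes suggested for the JDBC-ODBC bridge is to pull the raw column bytes and decode them explicitly rather than relying on the bridge's own String conversion. The sketch below assumes a hypothetical table and column and assumes the driver hands back UTF-8 bytes, which may not hold; if the bridge has already replaced the characters with "?" before they reach Java, no decoding on the Java side will bring them back.

        import java.sql.Connection;
        import java.sql.DriverManager;
        import java.sql.ResultSet;

        public class AccessUnicodeRead {
            public static void main(String[] args) throws Exception {
                // Same JDBC-ODBC bridge connection string as above.
                Connection c = DriverManager.getConnection(
                        "jdbc:odbc:Driver={Microsoft Access Driver (*.mdb, *.accdb)};"
                        + "DBQ=C:/database.accdb;Pwd=temp");
                ResultSet r = c.createStatement().executeQuery(
                        "SELECT Name FROM People"); // hypothetical query
                while (r.next()) {
                    // Take the raw bytes and decode them explicitly instead of relying
                    // on the bridge's String conversion. Whether "UTF-8" is right depends
                    // on what the ODBC driver actually delivers (often the ANSI code page,
                    // e.g. "windows-1252"), so both spellings may need to be tried.
                    byte[] raw = r.getBytes(1);
                    String value = (raw == null) ? null : new String(raw, "UTF-8");
                    System.out.println(value);
                }
                c.close();
            }
        }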

    Read the article

  • Get rid of redundant cast to javax.xml.bind.JAXBElement<java.lang.Boolean> warning from CXF-generated code

    - by Binary255
    I generate some code using CXF from a WSDL file. When compiling the code with version "1.6.0_16" and the flag -Xlint, I get the following warning:

        warning: [cast] redundant cast to javax.xml.bind.JAXBElement<java.lang.Boolean>
        [javac] this.r = ((JAXBElement<Boolean> ) value);

    What does the warning mean, and should I be worried? As I have generated, not written, the code, what can I do to get rid of this specific warning?
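
    For what it's worth, the warning only means that the expression already has the target type, so the cast changes nothing; it is noise rather than a correctness problem. For generated code that gets regenerated, the usual options are to compile those sources with the lint category disabled (-Xlint:-cast) or, where you do control the source, to suppress it. A minimal sketch of what javac is flagging, with hypothetical class and field names:

        import javax.xml.bind.JAXBElement;

        public class Example {
            private JAXBElement<Boolean> r;

            @SuppressWarnings("cast") // silences the -Xlint:cast category for this method
            public void setR(JAXBElement<Boolean> value) {
                // Redundant: value already has type JAXBElement<Boolean>.
                this.r = ((JAXBElement<Boolean>) value);
            }
        }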

    Read the article

  • How do I send email over SMTP with SSL using Java client?

    - by Ido
    I need to send email over SMTP with SSL using a Java client. I'm not sure how to do that. If I have my server certificate installed on my Windows machine, how do I use it? If I want it to work on a non-Windows machine, do I need to get the certificates in a different way? BTW: If the SMTP server that I use is using SSL, can I be sure that it will send the mail to the recipient using SSL?
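
    For reference, a minimal sketch of sending mail over SMTPS with JavaMail (javax.mail 1.4.x); the host, port, addresses and credentials are placeholders. Note that the JVM does not consult the Windows certificate store: it validates the server certificate against its own truststore (lib/security/cacerts by default), so a self-signed server certificate would typically be imported there with keytool on whichever machine runs the client.

        import java.util.Properties;
        import javax.mail.Message;
        import javax.mail.PasswordAuthentication;
        import javax.mail.Session;
        import javax.mail.Transport;
        import javax.mail.internet.InternetAddress;
        import javax.mail.internet.MimeMessage;

        public class SslMailSender {
            public static void main(String[] args) throws Exception {
                Properties props = new Properties();
                props.put("mail.smtp.host", "smtp.example.com"); // placeholder host
                props.put("mail.smtp.port", "465");              // common SMTPS port
                props.put("mail.smtp.auth", "true");
                props.put("mail.smtp.ssl.enable", "true");       // SSL from the first byte

                Session session = Session.getInstance(props, new javax.mail.Authenticator() {
                    protected PasswordAuthentication getPasswordAuthentication() {
                        // Placeholder credentials.
                        return new PasswordAuthentication("user@example.com", "secret");
                    }
                });

                Message msg = new MimeMessage(session);
                msg.setFrom(new InternetAddress("user@example.com"));
                msg.setRecipient(Message.RecipientType.TO, new InternetAddress("to@example.com"));
                msg.setSubject("Test over SSL");
                msg.setText("Hello over SMTPS");
                Transport.send(msg);
            }
        }

    Also worth keeping in mind: SSL here only protects the hop between this client and that SMTP server; whether the server relays the message onward to the recipient's server over an encrypted connection is outside the client's control.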

    Read the article

  • Which SHA-256 is correct? The Java SHA-256 digest or the Linux command-line tool?

    - by Peter Tillemans
    When I calculate in Java an SHA-256 of a string with the following method I get:

        5e884898da2847151d0e56f8dc6292773603dd6aabbdd62a11ef721d1542d8

    On the commandline I do:

        echo "password" | sha256sum

    and get:

        5e884898da28047151d0e56f8dc6292773603d0d6aabbdd62a11ef721d1542d8

    If we compare these more closely I find 2 subtle differences:

        5e884898da2847151d0e56f8dc6292773603dd6aabbdd62a11ef721d1542d8
        5e884898da28047151d0e56f8dc6292773603d0d6aabbdd62a11ef721d1542d8

    or:

        5e884898da28 47151d0e56f8dc6292773603d d6aabbdd62a11ef721d1542d8
        5e884898da28 0 47151d0e56f8dc6292773603d 0 d6aabbdd62a11ef721d1542d8

    Which of the 2 is correct here?
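
    For reference, the pattern of the two differences (a missing "0" in each spot) is the classic symptom of building the hex string with Integer.toHexString, which drops the leading zero for bytes below 0x10. A minimal sketch of a digest-to-hex conversion that pads every byte to two digits follows; it assumes the input is the literal string "password" encoded as UTF-8. It is also worth checking that both sides hash exactly the same bytes, since echo appends a trailing newline unless invoked as echo -n.

        import java.security.MessageDigest;

        public class Sha256Hex {
            public static void main(String[] args) throws Exception {
                MessageDigest md = MessageDigest.getInstance("SHA-256");
                byte[] digest = md.digest("password".getBytes("UTF-8"));

                StringBuilder hex = new StringBuilder();
                for (byte b : digest) {
                    // %02x pads to two digits; Integer.toHexString(b & 0xff) would drop
                    // the leading zero for small bytes, producing the shorter string above.
                    hex.append(String.format("%02x", b));
                }
                System.out.println(hex); // always 64 hex characters
            }
        }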

    Read the article

  • How do I use xStream to output Java Objects with List properties?

    - by Zachary Spencer
    Hello, I am trying to output some Java objects as JSON. They have List properties which I want to be formatted as:

        { "People" : [ { "Name" : "Bob" } , { "Name" : "Jim" } ] }

    However, I cannot figure out how to do this with XStream. It always outputs as:

        { "Person" : { "Name" : "Bob" }, "Person" : { "Name" : "Bob" }

    Is there a way to fix this? I've put together some sample code with a unit test on GitHub if you need something more concrete to play with: http://gist.github.com/371358 Thanks!
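
    For reference, the configuration usually experimented with for this is an alias plus addImplicitCollection together with the Jettison-backed JSON driver (which needs the jettison jar on the classpath). The classes below are hypothetical stand-ins for the ones in the gist, and the exact JSON shape produced varies with the XStream version, so treat this as a starting point rather than a guaranteed fix.

        import com.thoughtworks.xstream.XStream;
        import com.thoughtworks.xstream.io.json.JettisonMappedXmlDriver;

        import java.util.ArrayList;
        import java.util.List;

        public class XStreamJsonSketch {
            // Hypothetical stand-ins for the classes in the question.
            static class Person {
                String Name; // capitalised to match the desired JSON key
                Person(String name) { this.Name = name; }
            }
            static class People {
                List<Person> People = new ArrayList<Person>();
            }

            public static void main(String[] args) {
                XStream xstream = new XStream(new JettisonMappedXmlDriver());
                xstream.alias("People", People.class);
                xstream.alias("Person", Person.class);
                // Serialize the list field without a wrapping element; with the Jettison
                // driver, repeated entries are what gets mapped to a JSON array.
                xstream.addImplicitCollection(People.class, "People");

                People p = new People();
                p.People.add(new Person("Bob"));
                p.People.add(new Person("Jim"));
                System.out.println(xstream.toXML(p)); // toXML also drives the JSON writers
            }
        }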

    Read the article

  • How can we stop a running java process through windows cmd?

    - by Wing C. Chen
    I am a newbie in cmd, so please allow me to ask a stupid question: how can we stop a running Java process through the Windows cmd? For example, if we start Jetty (a mini web server) with the following cmd:

        start javaw -jar start.jar

    How do we find the process and stop it later? Obviously the following cmd does not work:

        stop javaw -jar start.jar

    Read the article

  • How can a 1Gb Java heap on a 64bit machine use 3Gb of VIRT space?

    - by Graeme Moss
    I run the same process on a 32bit machine as on a 64bit machine with the same memory VM settings (-Xms1024m -Xmx1024m) and similar VM version (1.6.0_05 vs 1.6.0_16). However the virtual space used by the 64bit machine (as shown in top under "VIRT") is almost three times as big as that in 32bit! I know 64bit VMs will use a little more memory for the larger references, but how can it be three times as big? Am I reading VIRT in top incorrectly? Full data shown below, showing top and then the result of jmap -heap, first for 64bit, then for 32bit. Note the VIRT for 64bit is 3319m, for 32bit it is 1220m.

        * 64bit *
        PID   USER  PR NI VIRT  RES  SHR S %CPU %MEM TIME+   COMMAND
        22534 agent 20 0  3319m 163m 14m S 4.7  2.0  0:04.28 java

        $ jmap -heap 22534
        Attaching to process ID 22534, please wait...
        Debugger attached successfully.
        Server compiler detected.
        JVM version is 10.0-b19
        using thread-local object allocation.
        Parallel GC with 4 thread(s)

        Heap Configuration:
          MinHeapFreeRatio = 40
          MaxHeapFreeRatio = 70
          MaxHeapSize      = 1073741824 (1024.0MB)
          NewSize          = 2686976 (2.5625MB)
          MaxNewSize       = -65536 (-0.0625MB)
          OldSize          = 5439488 (5.1875MB)
          NewRatio         = 2
          SurvivorRatio    = 8
          PermSize         = 21757952 (20.75MB)
          MaxPermSize      = 88080384 (84.0MB)

        Heap Usage:
        PS Young Generation
        Eden Space:
          capacity = 268500992 (256.0625MB)
          used     = 247066968 (235.62142181396484MB)
          free     = 21434024 (20.441078186035156MB)
          92.01715277089181% used
        From Space:
          capacity = 44695552 (42.625MB)
          used     = 0 (0.0MB)
          free     = 44695552 (42.625MB)
          0.0% used
        To Space:
          capacity = 44695552 (42.625MB)
          used     = 0 (0.0MB)
          free     = 44695552 (42.625MB)
          0.0% used
        PS Old Generation
          capacity = 715849728 (682.6875MB)
          used     = 0 (0.0MB)
          free     = 715849728 (682.6875MB)
          0.0% used
        PS Perm Generation
          capacity = 21757952 (20.75MB)
          used     = 16153928 (15.405586242675781MB)
          free     = 5604024 (5.344413757324219MB)
          74.24378912132907% used

        * 32bit *
        PID   USER  PR NI VIRT  RES  SHR S %CPU %MEM TIME+   COMMAND
        30168 agent 20 0  1220m 175m 12m S 0.0  2.2  0:13.43 java

        $ jmap -heap 30168
        Attaching to process ID 30168, please wait...
        Debugger attached successfully.
        Server compiler detected.
        JVM version is 14.2-b01
        using thread-local object allocation.
        Parallel GC with 8 thread(s)

        Heap Configuration:
          MinHeapFreeRatio = 40
          MaxHeapFreeRatio = 70
          MaxHeapSize      = 1073741824 (1024.0MB)
          NewSize          = 1048576 (1.0MB)
          MaxNewSize       = 4294901760 (4095.9375MB)
          OldSize          = 4194304 (4.0MB)
          NewRatio         = 8
          SurvivorRatio    = 8
          PermSize         = 16777216 (16.0MB)
          MaxPermSize      = 67108864 (64.0MB)

        Heap Usage:
        PS Young Generation
        Eden Space:
          capacity = 89522176 (85.375MB)
          used     = 80626352 (76.89128112792969MB)
          free     = 8895824 (8.483718872070312MB)
          90.0629940005033% used
        From Space:
          capacity = 14876672 (14.1875MB)
          used     = 14876216 (14.187065124511719MB)
          free     = 456 (4.3487548828125E-4MB)
          99.99693479832048% used
        To Space:
          capacity = 14876672 (14.1875MB)
          used     = 0 (0.0MB)
          free     = 14876672 (14.1875MB)
          0.0% used
        PS Old Generation
          capacity = 954466304 (910.25MB)
          used     = 10598496 (10.107513427734375MB)
          free     = 943867808 (900.1424865722656MB)
          1.1104107034039412% used
        PS Perm Generation
          capacity = 16777216 (16.0MB)
          used     = 11366448 (10.839889526367188MB)
          free     = 5410768 (5.1601104736328125MB)
          67.74930953979492% used
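
    As a side note, when comparing those numbers it can help to print what the JVM itself considers committed, since top's VIRT also counts thread stacks, the code cache, mapped files and address space that is merely reserved (and, on 64bit, larger pointers and larger pre-reserved regions), not just the Java heap. A small sketch using only the standard java.lang.management API:

        import java.lang.management.ManagementFactory;
        import java.lang.management.MemoryMXBean;
        import java.lang.management.MemoryUsage;

        public class JvmMemoryReport {
            public static void main(String[] args) {
                MemoryMXBean mem = ManagementFactory.getMemoryMXBean();
                MemoryUsage heap = mem.getHeapMemoryUsage();
                MemoryUsage nonHeap = mem.getNonHeapMemoryUsage();

                // committed = memory the JVM has actually claimed; max = the -Xmx style
                // ceiling (may be -1 if undefined). Neither figure includes thread stacks,
                // JNI allocations or reserved-but-untouched address space, which is why
                // top's VIRT can be much larger than the heap.
                System.out.printf("heap     used=%dM committed=%dM max=%dM%n",
                        heap.getUsed() >> 20, heap.getCommitted() >> 20, heap.getMax() >> 20);
                System.out.printf("non-heap used=%dM committed=%dM max=%dM%n",
                        nonHeap.getUsed() >> 20, nonHeap.getCommitted() >> 20, nonHeap.getMax() >> 20);
            }
        }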

    Read the article
