Search Results

Search found 38328 results on 1534 pages for 'write xml'.

Page 305 of 1534

  • Disk performance below expectations

    - by paulH
    This is a follow-up to a previous question that I asked (Two servers with inconsistent disk speed). I have a PowerEdge R510 server with a PERC H700 integrated RAID controller (call this Server B) that was built using eight disks with 3Gb/s bandwidth, which I was comparing with an almost identical server (call this Server A) that was built using four disks with 6Gb/s bandwidth. Server A had much better I/O rates than Server B.
    Once I discovered the difference in the disks, I had Server A rebuilt with faster 6Gb/s disks. Unfortunately this resulted in no increase in disk performance. Expecting that there must be some other configuration difference between the servers, we took the 6Gb/s disks out of Server A and put them in Server B. This also resulted in no increase in performance. We now have two identical servers built, with the exception that one uses six 6Gb/s disks and the other eight 3Gb/s disks, and the I/O rates of the disks are pretty much identical. This suggests that there is some bottleneck other than the disks, but I cannot understand how Server B originally had better I/O that has subsequently been 'lost'.
    Comparative I/O figures, as measured by SQLIO, are below. The same parameters were used for each test; it's not the actual numbers that are significant but rather the variation between systems. In each case D: is a 2-disk RAID 1 volume and E: is a 4-disk RAID 10 volume (apart from the original Server A, where E: was a 2-disk RAID 0 volume).
    Server A (original setup, 6Gb/s disks): D: read 63 MB/s, write 170 MB/s; E: read 68 MB/s, write 320 MB/s
    Server B (original setup, 3Gb/s disks): D: read 52 MB/s, write 88 MB/s; E: read 112 MB/s, write 130 MB/s
    Server A (new setup, 3Gb/s disks): D: read 55 MB/s, write 85 MB/s; E: read 67 MB/s, write 180 MB/s
    Server B (new setup, 6Gb/s disks): D: read 61 MB/s, write 95 MB/s; E: read 69 MB/s, write 180 MB/s
    Can anybody suggest what is going on here? The drives in use are as follows: Dell Seagate F617N ST3300657SS 300GB 15K RPM SAS; Dell Hitachi HUS156030VLS600 300GB 3.5 inch 15000rpm 6Gb SAS; Hitachi HUS153030VLS300 300GB Server SAS; Dell ST3146855SS Seagate 3.5 inch 146GB 15K SAS.

    Read the article

  • some clarification on accept field in http request

    - by Salvador Dali
    Can anyone enlighten me on the following question: what do the different entries in the Accept field of an HTTP request mean? I understand the basics - through Accept the client tells the server what kinds of content it is prepared to receive - so for example: Accept:text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8 This way the client tells the server that it can understand the three following formats: text/html, application/xhtml+xml, application/xml. But can someone tell me what these q values mean, and what */* stands for? Also, if I have any flaws in my understanding - please tell me.
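
    A short illustration of how those q values are typically interpreted (an illustrative sketch in Python, not the behaviour of any particular server): each media type may carry a quality factor q between 0 and 1, defaulting to 1.0 when omitted, and the server is expected to prefer the acceptable type with the highest q; the */* entry is a wildcard covering everything else.

      # Illustrative parser for the Accept header quoted above.
      accept = "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8"

      def parse_accept(header):
          prefs = []
          for item in header.split(","):
              parts = [p.strip() for p in item.split(";")]
              media_type, q = parts[0], 1.0        # q defaults to 1.0 when omitted
              for p in parts[1:]:
                  if p.startswith("q="):
                      q = float(p[2:])
              prefs.append((media_type, q))
          # Higher q first: the server should serve the best-ranked type it can produce.
          return sorted(prefs, key=lambda pair: pair[1], reverse=True)

      for media_type, q in parse_accept(accept):
          print(media_type, q)
      # text/html 1.0
      # application/xhtml+xml 1.0
      # application/xml 0.9
      # */* 0.8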

    Read the article

  • What is some good lossless video codec for recording gameplay?

    - by Don Salva
    I'm an avid gamer and I like to record my gameplay. Usually I've been using Fraps to do it, however I'm thinking of switching to Dxtory as it allows writing to multiple HDDs at once. Say I have 3 HDDs with the following write speeds: HDD1 with 50 MB/s, HDD2 with 22 MB/s and HDD3 with 45 MB/s. Combined write speed would be 117 MB/s. Dxtory lets you utilize all 3 HDDs at once while recording your gameplay. Using this formula:
    RGB24/YUV24: Width x Height x 3 x fps = bitrate (bytes/sec)
    YUV420: Width x Height x 3 / 2 x fps = bitrate (bytes/sec)
    YUV410: Width x Height x 9 / 8 x fps = bitrate (bytes/sec)
    recording in the YUV420 colorspace at 1920x1080 with 30 fps, I'd need about 95 MB/s of write speed. Dxtory is good because it allows me to play at a constant 60 fps while recording at 30 fps. Fraps does not (even though they say it does); once you start recording with Fraps, the game's fps drops. So I'm looking for a codec that doesn't need a very high write speed (bitrate) yet records in good (lossless) quality. Dxtory comes with its own codec, the Dxtory codec, which allows me some experimentation. Fraps has its own codec which I can use in Dxtory to experiment with. I also came across http://lags.leetcode.net/codec.html . Are there more lossless codecs out there (besides Fraps' and Dxtory's) which are good for what I want to do?
    Edit: To clarify, yes, I'm aware a lossless codec always has "good" quality. But that's not what I'm looking for. Let me take the Fraps codec and the Dxtory codec to clarify what I'm looking for. When I record with the Dxtory codec in the RGB colorspace at 1920x1080 with a target of 30 fps, I can play the game at 60 fps BUT I'm recording at 10-15 fps; that's because RGB with Dxtory needs much, much more write speed than my HDD can handle. When recording with the Dxtory codec in the YUV410 colorspace at 1920x1080 with a target of 30 fps, I can play at 60 fps and record at 30 fps; again, that's because YUV410 in Dxtory's codec takes much, much less write speed than RGB. When recording with the Fraps codec in ??? (I don't know the colorspace Fraps records in, I guess YUV420), I can play at 60 fps and record at 30 fps. What I'm looking for is a lossless codec that can record in YUV420 (or even RGB??) which does not exceed a write speed (or bitrate if you will) of 100 MB/s at 1920x1080 - in other words, which will allow me to record at a constant 30 fps. Obviously the best solution would be to buy an SSD, but that's not what I'm after.
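
    As a quick sanity check of the formula quoted above (a sketch; the 1920x1080 and 30 fps figures are the ones from the question, and the division by 1,000,000 treats MB as decimal megabytes):

      # Estimated raw write speed for lossless capture at 1920x1080, 30 fps.
      width, height, fps = 1920, 1080, 30

      bytes_per_sec = {
          "RGB24/YUV24": width * height * 3 * fps,       # 3 bytes per pixel
          "YUV420": width * height * 3 // 2 * fps,       # 1.5 bytes per pixel
          "YUV410": width * height * 9 // 8 * fps,       # 1.125 bytes per pixel
      }

      for colorspace, rate in bytes_per_sec.items():
          print(colorspace, round(rate / 1_000_000, 1), "MB/s")
      # RGB24/YUV24 186.6 MB/s
      # YUV420 93.3 MB/s   (the "about 95 MB/s" figure mentioned above)
      # YUV410 70.0 MB/s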

    Read the article

  • vsftpd: ECONNREFUSED with "allow_writeable_chroot=YES"

    - by heinob
    I am stuck setting up vsftpd. When I leave the ftp user's home directory without write permission I can log in and all is fine, despite the fact that I cannot write (of course). When I add write permission I get something like: cannot change to directory with write permissions if user is chrooted. Then I added allow_writeable_chroot=YES to vsftpd.conf. But now I get: ECONNREFUSED - Connection refused by server. I am lost. What am I doing wrong?

    Read the article

  • Scanstate.exe Migration Error

    - by user1438147
    I'm trying to create a migration store on a server using ScanState, and this is the error I receive:
    C:\USMTscanstate.exe \(Server_Name)\migration\mystore /f:migdocs.xml /f: migapp.xml /v:13 /f:scan.log
    Failed. An error occurred processing the command line.
    scanstate.exe \svdataitfl1\migration\mystore ##ERROR## -- /f:migdocs.xml /f migapp.xml /v:13 /f:scan.log
    Undefined or incomplete command line option
    ScanState return code: 11
    I can't seem to find the answer to this... need some help.

    Read the article

  • e2fsck extremely slow, although enough memory exists

    - by kaefert
    I've got this external USB disk:
    kaefert@blechmobil:~$ lsusb -s 2:3 Bus 002 Device 003: ID 0bc2:3320 Seagate RSS LLC
    As can be seen in this dmesg output, there is some problem that prevents that disk from being mounted:
    kaefert@blechmobil:~$ dmesg ... [ 113.084079] usb 2-1: new high-speed USB device number 3 using ehci_hcd [ 113.217783] usb 2-1: New USB device found, idVendor=0bc2, idProduct=3320 [ 113.217787] usb 2-1: New USB device strings: Mfr=2, Product=3, SerialNumber=1 [ 113.217790] usb 2-1: Product: Expansion Desk [ 113.217792] usb 2-1: Manufacturer: Seagate [ 113.217794] usb 2-1: SerialNumber: NA4J4N6K [ 113.435404] usbcore: registered new interface driver uas [ 113.455315] Initializing USB Mass Storage driver... [ 113.468051] scsi5 : usb-storage 2-1:1.0 [ 113.468180] usbcore: registered new interface driver usb-storage [ 113.468182] USB Mass Storage support registered. [ 114.473105] scsi 5:0:0:0: Direct-Access Seagate Expansion Desk 070B PQ: 0 ANSI: 6 [ 114.474342] sd 5:0:0:0: [sdb] 732566645 4096-byte logical blocks: (3.00 TB/2.72 TiB) [ 114.475089] sd 5:0:0:0: [sdb] Write Protect is off [ 114.475092] sd 5:0:0:0: [sdb] Mode Sense: 43 00 00 00 [ 114.475959] sd 5:0:0:0: [sdb] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA [ 114.477093] sd 5:0:0:0: [sdb] 732566645 4096-byte logical blocks: (3.00 TB/2.72 TiB) [ 114.501649] sdb: sdb1 [ 114.502717] sd 5:0:0:0: [sdb] 732566645 4096-byte logical blocks: (3.00 TB/2.72 TiB) [ 114.504354] sd 5:0:0:0: [sdb] Attached SCSI disk [ 116.804408] EXT4-fs (sdb1): ext4_check_descriptors: Checksum for group 3976 failed (47397!=61519) [ 116.804413] EXT4-fs (sdb1): group descriptors corrupted! ...
    So I went and fired up my favorite partition manager - gparted - and told it to verify and repair the partition sdb1. This made gparted call e2fsck (version 1.42.4 (12-Jun-2012)): e2fsck -f -y -v /dev/sdb1
    Although gparted called e2fsck with the "-v" option, sadly it doesn't show me the output of my e2fsck process (bug report https://bugzilla.gnome.org/show_bug.cgi?id=467925 ). I started this whole thing on Sunday (2012-11-04_2200) evening, so about 48 hours ago; this is what htop says about it now (2012-11-06-1900):
    PID USER PRI NI VIRT RES SHR S CPU% MEM% TIME+ Command 3704 root 39 19 1560M 1166M 768 R 98.0 19.5 42h56:43 e2fsck -f -y -v /dev/sdb1
    Now I found a few posts on the internet that discuss e2fsck running slow, for example http://gparted-forum.surf4.info/viewtopic.php?id=13613 where they write that it's a good idea to check whether the disk is just that slow because maybe it's damaged, and I think these outputs tell me that this is not the case here:
    kaefert@blechmobil:~$ sudo hdparm -tT /dev/sdb /dev/sdb: Timing cached reads: 3562 MB in 2.00 seconds = 1783.29 MB/sec Timing buffered disk reads: 82 MB in 3.01 seconds = 27.26 MB/sec
    kaefert@blechmobil:~$ sudo hdparm /dev/sdb /dev/sdb: multcount = 0 (off) readonly = 0 (off) readahead = 256 (on) geometry = 364801/255/63, sectors = 5860533160, start = 0
    However, although I can read quickly from that disk, this disk speed doesn't seem to be used by e2fsck, considering tools like gkrellm or iotop or this:
    kaefert@blechmobil:~$ iostat -x Linux 3.2.0-2-amd64 (blechmobil) 2012-11-06 _x86_64_ (2 CPU) avg-cpu: %user %nice %system %iowait %steal %idle 14,24 47,81 14,63 0,95 0,00 22,37 Device: rrqm/s wrqm/s r/s w/s rkB/s wkB/s avgrq-sz avgqu-sz await r_await w_await svctm %util sda 0,59 8,29 2,42 5,14 43,17 160,17 53,75 0,30 39,80 8,72 54,42 3,95 2,99 sdb 137,54 5,48 9,23 0,20 587,07 22,73 129,35 0,07 7,70 7,51 16,18 2,17 2,04
    Now I researched a little bit on how to find out what e2fsck is doing with all that processor time, and I found the tool strace, which gives me this:
    kaefert@blechmobil:~$ sudo strace -p3704 lseek(4, 41026998272, SEEK_SET) = 41026998272 write(4, "\212\354K[_\361\3nl\212\245\352\255jR\303\354\312Yv\334p\253r\217\265\3567\325\257\3766"..., 4096) = 4096 lseek(4, 48404766720, SEEK_SET) = 48404766720 read(4, "\7t\260\366\346\337\304\210\33\267j\35\377'\31f\372\252\ffU\317.y\211\360\36\240c\30`\34"..., 4096) = 4096 lseek(4, 41027002368, SEEK_SET) = 41027002368 write(4, "\232]7Ws\321\352\t\1@[+5\263\334\276{\343zZx\352\21\316`1\271[\202\350R`"..., 4096) = 4096 lseek(4, 48404770816, SEEK_SET) = 48404770816 read(4, "\17\362r\230\327\25\346//\210H\v\311\3237\323K\304\306\361a\223\311\324\272?\213\tq \370\24"..., 4096) = 4096 lseek(4, 41027006464, SEEK_SET) = 41027006464 write(4, "\367yy>x\216?=\324Z\305\351\376&\25\244\210\271\22\306}\276\237\370(\214\205G\262\360\257#"..., 4096) = 4096 lseek(4, 48404774912, SEEK_SET) = 48404774912 read(4, "\365\25\0\21|T\0\21}3t_\272\373\222k\r\177\303\1\201\261\221$\261B\232\3142\21U\316"..., 4096) = 4096 ^CProcess 3704 detached
    There are around 16 of these lines every second, so 4 read and 4 write operations every second, which I don't consider to be a lot. And finally, my question: will this process ever finish? If those numbers from lseek (48404774912) represent bytes, that would be something like 45 gigabytes; with this being a 3 terabyte disk, that would give me 134 days to go, if the speed stays constant and e2fsck scans the disk like this completely and only once. Do you have some advice for me?
    I have most of the data on that disk elsewhere, but I've put a lot of hours into sorting and merging it onto this disk, so I would prefer to get this disk up and running again without formatting it anew. I don't think that the hardware is damaged, since the disk is only a few months old and since I can't see any I/O errors in the dmesg output.
    UPDATE: I just looked at the strace output again (2012-11-06_2300), now it looks like this:
    lseek(4, 1419860611072, SEEK_SET) = 1419860611072 read(4, "3#\f\2447\335\0\22A\355\374\276j\204'\207|\217V|\23\245[\7VP\251\242\276\207\317:"..., 4096) = 4096 lseek(4, 43018145792, SEEK_SET) = 43018145792 write(4, "]\206\231\342Y\204-2I\362\242\344\6R\205\361\324\177\265\317C\334V\324\260\334\275t=\10F."..., 4096) = 4096 lseek(4, 1419860615168, SEEK_SET) = 1419860615168 read(4, "\262\305\314Y\367\37x\326\245\226\226\320N\333$s\34\204\311\222\7\315\236\336\300TK\337\264\236\211n"..., 4096) = 4096 lseek(4, 43018149888, SEEK_SET) = 43018149888 write(4, "\271\224m\311\224\25!I\376\16;\377\0\223H\25Yd\201Y\342\r\203\271\24eG<\202{\373V"..., 4096) = 4096 lseek(4, 1419860619264, SEEK_SET) = 1419860619264 read(4, ";d\360\177\n\346\253\210\222|\250\352T\335M\33\260\320\261\7g\222P\344H?t\240\20\2548\310"..., 4096) = 4096 lseek(4, 43018153984, SEEK_SET) = 43018153984 write(4, "\360\252j\317\310\251G\227\335{\214`\341\267\31Y\202\360\v\374\307oq\3063\217Z\223\313\36D\211"..., 4096) = 4096
    So the numbers in the lseek lines before the reads, like 1419860611072, are already a lot bigger, standing for 1.29 terabytes if those numbers are bytes, so it doesn't seem to be linear progress on a big scale; maybe there are only some areas that need work, with big gaps in between them.
    UPDATE 2: Okay, big disappointment, the numbers are back to very small again (2012-11-07_0720):
    lseek(4, 52174548992, SEEK_SET) = 52174548992 read(4, "\374\312\22\\\325\215\213\23\0357U\222\246\370v^f(\312|f\212\362\343\375\373\342\4\204mU6"..., 4096) = 4096 lseek(4, 46603526144, SEEK_SET) = 46603526144 write(4, "\370\261\223\227\23?\4\4\217\264\320_Am\246CQ\313^\203U\253\274\204\277\2564n\227\177\267\343"..., 4096) = 4096
    So either e2fsck goes over the data multiple times, or it just hops back and forth multiple times. Or my assumption that those numbers are bytes is wrong.
    UPDATE 3: Since it's mentioned here http://forums.fedoraforum.org/showthread.php?t=282125&page=2 that you can run testdisk while e2fsck is running, I tried that, though not with a lot of success. When asking testdisk to display the data of my partition, this is what I get:
    TestDisk 6.13, Data Recovery Utility, November 2011 Christophe GRENIER <[email protected]> http://www.cgsecurity.org 1 P Linux 0 4 5 45600 40 8 732566272 Can't open filesystem. Filesystem seems damaged.
    And this is what strace currently gives me (2012-11-07_1030):
    lseek(4, 212460343296, SEEK_SET) = 212460343296 read(4, "\315Mb\265v\377Gn \24\f\205EHh\2349~\330\273\203\3375\206\10\r3=W\210\372\352"..., 4096) = 4096 lseek(4, 47347830784, SEEK_SET) = 47347830784 write(4, "]\204\223\300I\357\4\26\33+\243\312G\230\250\371*m2U\t_\215\265J \252\342Pm\360D"..., 4096) = 4096
    (times are in CET)
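
    For reference, a small sketch of the back-of-the-envelope estimate made above (it assumes the lseek offsets are byte positions and that e2fsck sweeps the disk once at a constant rate - an assumption the later updates cast doubt on; the question arrives at roughly 134 days with slightly different elapsed-time figures):

      # Naive time-to-completion estimate from an lseek() offset.
      offset_bytes = 48_404_774_912      # offset seen in the strace output
      disk_bytes = 3 * 10**12            # nominal 3 TB disk
      elapsed_hours = 48                 # roughly how long e2fsck had been running

      fraction_done = offset_bytes / disk_bytes
      total_days = elapsed_hours / fraction_done / 24
      print(round(fraction_done * 100, 2), "% done, ~", round(total_days), "days at this rate")
      # 1.61 % done, ~ 124 days at this rate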

    Read the article

  • What's the difference between one-dash and two-dashes for command prompt parameters?

    - by Pacerier
    I was wondering why it is that some programs require their command prompt parameters to have two dashes in front, whereas some (most) only require one dash. For example, most programs look like this: relaxer -dtd toc.xml toc_gr.xml toc_jp.xml whereas some programs look like this: xmllint --valid toc.xml --noout What's the reason that some require two dashes instead of one? Doesn't it make sense for everyone to stick to one standard (i.e. a single dash will do)?
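
    For what it's worth, the usual (though not universal) convention is the GNU style: single-dash options are single letters that can be bundled together, while double-dash options spell out a whole word; tools like relaxer follow the older style of single-dash word options instead. A minimal Python sketch of a program that accepts both forms (the option names here are illustrative, not taken from either tool):

      # argparse follows the GNU convention: -v is a short option,
      # --verbose / --noout are long options.
      import argparse

      parser = argparse.ArgumentParser(prog="example")
      parser.add_argument("-v", "--verbose", action="store_true",
                          help="one flag with both a short and a long form")
      parser.add_argument("--noout", action="store_true",
                          help="long-only option")
      parser.add_argument("files", nargs="*")

      args = parser.parse_args(["-v", "--noout", "toc.xml"])
      print(args.verbose, args.noout, args.files)   # True True ['toc.xml']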

    Read the article

  • SSD performance

    - by Tom
    I recently upgraded to a Kingston HyperX 120GB SSD. When I run CrystalDiskMark my scores look really slow. My motherboard (a Gigabyte socket 775 board) does not have an option for AHCI in the BIOS; I'm wondering if that's the issue. The scores were: Seq read 233, write 176.8; 512K read 224, write 175.8; 4K read 25, write 80; 4K read 23, write 102 (all MB/s). This drive is rated at over 500 MB/s. Any help or input would be greatly appreciated.

    Read the article

  • How can I tell Notepad++ to always use a particular language for a particular file extension?

    - by MatrixFrog
    I've associated .xul with Notepad++ so if I double-click on a .xul file, it will open in Notepad++. But Notepad++ doesn't know that XUL is just a particular type of XML, so I then have to manually click on Language > XML to get XML syntax highlighting. Is there a way that I can tell it: "every time you open a file with the extension .xul, automatically switch to the XML language"?

    Read the article

  • WSDLException : An error occurred trying to resolve schema referenced at ...

    - by Stefano
    Hello, I'm trying to generate a proxy class from a local wsdl file with Eclipse Galileo and Axis2 1.4 on Windows XP. My problem is that I get an error due to a schema imported inside the wsdl. The line that troubles me is: <xsd:import namespace="http://www.w3.org/2005/05/xmlmime" schemaLocation="http://www.w3.org/2005/05/xmlmime"/>
    I've tried to run the following wsdl2java command: wsdl2java.bat -uri SOAService.wsdl -o D:\temp p test -d xmlbeans -a -s -ns2p -uw and I get the following exception:
    Exception in thread "main" org.apache.axis2.wsdl.codegen.CodeGenerationException: Error parsing WSDL at org.apache.axis2.wsdl.codegen.CodeGenerationEngine.(CodeGenerationEngine.java:156) at org.apache.axis2.wsdl.WSDL2Code.main(WSDL2Code.java:35) at org.apache.axis2.wsdl.WSDL2Java.main(WSDL2Java.java:24) Caused by: javax.wsdl.WSDLException: WSDLException (at /wsdl:definitions/wsdl:types/xsd:schema): faultCode=OTHER_ERROR: An error occurred trying to resolve schema referenced at 'http://www.w3.org/2005/05/xmlmime', relative to 'file:/D:/Programmi/axis2-1.4/bin/SOAService.wsdl'.: java.net.ConnectException: Connection timed out: connect at com.ibm.wsdl.xml.WSDLReaderImpl.parseSchema(Unknown Source) at com.ibm.wsdl.xml.WSDLReaderImpl.parseSchema(Unknown Source) at com.ibm.wsdl.xml.WSDLReaderImpl.parseTypes(Unknown Source) at com.ibm.wsdl.xml.WSDLReaderImpl.parseDefinitions(Unknown Source) at com.ibm.wsdl.xml.WSDLReaderImpl.readWSDL(Unknown Source) at com.ibm.wsdl.xml.WSDLReaderImpl.readWSDL(Unknown Source) at com.ibm.wsdl.xml.WSDLReaderImpl.readWSDL(Unknown Source) at com.ibm.wsdl.xml.WSDLReaderImpl.readWSDL(Unknown Source) at com.ibm.wsdl.xml.WSDLReaderImpl.readWSDL(Unknown Source) at org.apache.axis2.wsdl.codegen.CodeGenerationEngine.readInTheWSDLFile(CodeGenerationEngine.java:288) at org.apache.axis2.wsdl.codegen.CodeGenerationEngine.(CodeGenerationEngine.java:111) ... 2 more Caused by: java.net.ConnectException: Connection timed out: connect at java.net.PlainSocketImpl.socketConnect(Native Method) at java.net.PlainSocketImpl.doConnect(PlainSocketImpl.java:333) at java.net.PlainSocketImpl.connectToAddress(PlainSocketImpl.java:195) at java.net.PlainSocketImpl.connect(PlainSocketImpl.java:182) at java.net.Socket.connect(Socket.java:520) at java.net.Socket.connect(Socket.java:470) at sun.net.NetworkClient.doConnect(NetworkClient.java:157) at sun.net.www.http.HttpClient.openServer(HttpClient.java:388) at sun.net.www.http.HttpClient.openServer(HttpClient.java:523) at sun.net.www.http.HttpClient.(HttpClient.java:231) at sun.net.www.http.HttpClient.New(HttpClient.java:304) at sun.net.www.http.HttpClient.New(HttpClient.java:321) at sun.net.www.protocol.http.HttpURLConnection.getNewHttpClient(HttpURLConnection.java:813) at sun.net.www.protocol.http.HttpURLConnection.plainConnect(HttpURLConnection.java:765) at sun.net.www.protocol.http.HttpURLConnection.connect(HttpURLConnection.java:690) at sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:934) at java.net.URL.openStream(URL.java:1007) at com.ibm.wsdl.util.StringUtils.getContentAsInputStream(Unknown Source)
    I suspect it's due to the system proxy, which prevents the wsdl2java tool from retrieving the xsd; in fact I can download the file from the browser without problems. Is there an option to specify a proxy to wsdl2java, or has someone resolved this issue?
    For the moment I've downloaded the xsd, added it to the project and changed the wsdl to include the local file (instead of the remote one), but I'd prefer to avoid this because the file is a third-party service wsdl. Thank you in advance for any hint. Stefano

    Read the article

  • Including configuration files while compiling a Flex application with MXMLC

    - by Daniel
    Hello there, I'm using: - Flex SDK 3.5.0 - Parsley 2.2.2. - Flash Builder 4 Down in my src folder (which is configured as part of the source path in the Flash Builder), I have a logging.xml which I configure via Parsley: FlexLoggingXmlSupport.initialize(); XmlContextBuilder.build("com/company/product/util/log/logging.xml"); When I run my application through Flash Builder, the XmlContentBuilder seems to locate the logging.xml (the implementation is a regular URLLoader one). When I compile my application using MXMLC (whether in Ant or command-line), and then run the swf, I get the following error: Cause(0): Error loading com/company/product/util/log/logging.xml: Error in URLLoader - cause: Error #2032: Stream Error. URL: file:///C|/workspace/folder01/product/target/com/company/product/util/log/logging.xml - cause: Error #2032: Stream Error. URL: file:///C|/workspace/folder01/product/target/com/company/product/util/log/logging.xml Here is the MXMLC tag in Ant: <mxmlc file="${product.src.dir}/com/company/product/view/Main.mxml" output="${product.target.dir}/${product.release.filename}" keep-generated-actionscript="false"> <load-config filename="${FLEX_HOME}/frameworks/flex-config.xml" /> <!-- source paths --> <source-path path-element="${FLEX_HOME}/frameworks" /> <compiler.source-path path-element="${product.src.dir}" /> <compiler.source-path path-element="${product.locale.dir}/{locale}" /> <compiler.library-path dir="${product.basedir}" append="true"> <include name="libs" /> </compiler.library-path> <warnings>false</warnings> <debug>false</debug> </mxmlc> And here is the command line: \mxmlc.exe -output "C:\temp\Rap.swf" -load-config "C:\Program Files\Adobe\Adobe Flash Builder 4 Plug-in\sdks\3.5.0\frameworks\flex-config.xml" -source-path "C:\Program Files\Adobe\Adobe Flash Builder 4 Plug-in\sdks\3.5.0\frameworks" C:\workspace\folder01\product\src C:\workspace\folder01\product\locale\en_US -library-path+=C:\workspace\folder01\product\libs -file-specs C:\workspace\folder01\product\src\com\company\product\view\main.mxml Now perhaps I don't get this correctly, but as far as I understand the SWF should be compiled with all of the resources in the paths I give MXMLC as source-paths. For some reason it seems that the XML file is not compiled into the SWF, hence the relative path of the XmlContentBuilder isn't located successfully. I could not find any argument to provide the MXMLC with that might solve this. I tried using the -dump-config option with the Flash Builder's compiler, then giving that configuration to MXMLC, but it didn't work either. I tried providing the XmlContentBuilder with an absolute path. That worked fine when I compiled with MXMLC via Ant, but still didn't work when I used MXMLC in the command-line... I'd be happy to be enlightened here, regarding all subjects - using MXMLC, accessing resources with relative paths, configuring logging in Parsley, etc. Many thanks in advance, Daniel

    Read the article

  • Cannot create a row of size 8074 which is greater than the allowable maximum row size of 8060.

    - by Lieven Cardoen
    I have already asked a question about this, but the problem keeps on hitting me ;-) I have two tables that are identical. I want to add an xml column. In the first table this is no problem, but in the second table I get the SqlException in the title. However, apart from the data in them, they are the same. So, can I get the SqlException because of data in the table? I have also tried to store the field off page with EXEC sp_tableoption 'dbo.PackageSessionNodesFinished', 'large value types out of row', 1 but without any success. The same SqlException keeps coming.
    First table: PackageSessionNodes CREATE TABLE [dbo].[PackageSessionNodes]( [PackageSessionNodeId] [int] IDENTITY(1,1) NOT NULL, [PackageSessionId] [int] NOT NULL, [TreeNodeId] [int] NOT NULL, [Duration] [int] NULL, [Score] [float] NOT NULL, [ScoreMax] [float] NOT NULL, [Interactions] [xml] NOT NULL, [BrainTeaser] [bit] NULL, [DateCreated] [datetime] NULL, [CompletionStatus] [int] NOT NULL, [ReducedScore] [float] NOT NULL, [ReducedScoreMax] [float] NOT NULL, [ContentInteractions] [xml] NOT NULL, CONSTRAINT [PK_PackageSessionNodes] PRIMARY KEY CLUSTERED ( [PackageSessionNodeId] ASC )WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY] ) ON [PRIMARY]
    Second table: PackageSessionNodesFinished CREATE TABLE [dbo].[PackageSessionNodesFinished]( [PackageSessionNodeFinishedId] [int] IDENTITY(1,1) NOT NULL, [PackageSessionId] [int] NOT NULL, [TreeNodeId] [int] NOT NULL, [Duration] [int] NULL, [Score] [float] NOT NULL, [ScoreMax] [float] NOT NULL, [Interactions] [xml] NOT NULL, [BrainTeaser] [bit] NULL, [DateCreated] [datetime] NULL, [CompletionStatus] [int] NOT NULL, [ReducedScore] [float] NOT NULL, [ReducedScoreMax] [float] NOT NULL, [ContentInteractions] [xml] NULL, CONSTRAINT [PK_PackageSessionNodesFinished] PRIMARY KEY CLUSTERED ( [PackageSessionNodeFinishedId] ASC )WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY] ) ON [PRIMARY]
    First script I tried to run (the first two ALTER TABLE statements work fine, the third crashes with the SqlException): ALTER TABLE dbo.PackageSessionNodes ADD ContentInteractions xml NOT NULL CONSTRAINT DF_PackageSessionNodes_ContentInteractions DEFAULT (('<contentinteractions/>')); ALTER TABLE dbo.PackageSessionNodes DROP CONSTRAINT DF_PackageSessionNodes_ContentInteractions ALTER TABLE dbo.PackageSessionNodesFinished ADD ContentInteractions xml NOT NULL CONSTRAINT DF_PackageSessionNodesFinished_ContentInteractions DEFAULT (('<contentinteractions/>')); ALTER TABLE dbo.PackageSessionNodesFinished DROP CONSTRAINT DF_PackageSessionNodesFinished_ContentInteractions
    Second script I tried to run, with the same result as the previous script: EXEC sp_tableoption 'dbo.PackageSessionNodes', 'large value types out of row', 1 ALTER TABLE dbo.PackageSessionNodes ADD ContentInteractions xml NOT NULL CONSTRAINT DF_PackageSessionNodes_ContentInteractions DEFAULT (('<contentinteractions/>')); ALTER TABLE dbo.PackageSessionNodes DROP CONSTRAINT DF_PackageSessionNodes_ContentInteractions EXEC sp_tableoption 'dbo.PackageSessionNodesFinished', 'large value types out of row', 1 ALTER TABLE dbo.PackageSessionNodesFinished ADD ContentInteractions xml NOT NULL CONSTRAINT DF_PackageSessionNodesFinished_ContentInteractions DEFAULT (('<contentinteractions/>')); ALTER TABLE dbo.PackageSessionNodesFinished DROP CONSTRAINT DF_PackageSessionNodesFinished_ContentInteractions
    Now, in PackageSessionNodes there are 234 records, in PackageSessionNodesFinished there are 4256946 records. I would really appreciate some help here as I'm stuck.

    Read the article

  • Ruby SerialPorts Gem

    - by Seth Archer
    I'm using the Ruby SerialPort gem to interact with hardware. When I write a byte array to the hardware using a program called "Serial Port Monitor", the hardware responds correctly. However, when I write the same byte array using Ruby it doesn't work unless I do a read request just before the write request. The device doesn't respond correctly with this: sp = SerialPort.new(args) sp.write [200.chr, 30.chr, 6.chr, 5.chr, 1.chr, 2.chr, 0.chr, 244.chr] But it does if I add a read request before the write, like this: sp = SerialPort.new(args) sp.read sp.write [200.chr, 30.chr, 6.chr, 5.chr, 1.chr, 2.chr, 0.chr, 244.chr] This works, but I'm at a loss as to why. I should also add that the first snippet does work occasionally - maybe 1/10 of the time.

    Read the article

  • Optimized .htaccess???

    - by StackOverflowNewbie
    I'd appreciate some feedback on the compression and caching configuration below. Trying to come up with a general purpose, optimized compression and caching configuration. If possible: Note your PageSpeed and YSlow grades Add configuration to your .htaccess Clear your cache Note your PageSpeed and YSlow grades to see if there are any improvements (or degradations) NOTE: Make sure you have appropriate modules loaded. Any feedback is much appreciated. Thanks. # JavaScript MIME type issues: # 1. Apache uses "application/javascript": http://svn.apache.org/repos/asf/httpd/httpd/branches/1.3.x/conf/mime.types # 2. IIS uses "application/x-javascript": http://technet.microsoft.com/en-us/library/bb742440.aspx # 3. SVG specification says it is text/ecmascript: http://www.w3.org/TR/2001/REC-SVG-20010904/script.html#ScriptElement # 4. HTML specification says it is text/javascript: http://www.w3.org/TR/1999/REC-html401-19991224/interact/scripts.html#h-18.2.2.2 # 5. "text/ecmascript" and "text/javascript" are considered obsolete: http://www.rfc-editor.org/rfc/rfc4329.txt #------------------------------------------------------------------------------- # Compression #------------------------------------------------------------------------------- <IfModule mod_deflate.c> AddOutputFilterByType DEFLATE application/xhtml+xml AddOutputFilterByType DEFLATE application/xml AddOutputFilterByType DEFLATE application/atom+xml AddOutputFilterByType DEFLATE text/css AddOutputFilterByType DEFLATE text/html AddOutputFilterByType DEFLATE text/plain AddOutputFilterByType DEFLATE text/xml # The following MIME types are in the process of registration AddOutputFilterByType DEFLATE application/xslt+xml AddOutputFilterByType DEFLATE image/svg+xml # The following MIME types are NOT registered AddOutputFilterByType DEFLATE application/mathml+xml AddOutputFilterByType DEFLATE application/rss+xml # Deal with JavaScript MIME type issues AddOutputFilterByType DEFLATE application/javascript AddOutputFilterByType DEFLATE application/x-javascript AddOutputFilterByType DEFLATE text/ecmascript AddOutputFilterByType DEFLATE text/javascript </IfModule> #------------------------------------------------------------------------------- # Expires header #------------------------------------------------------------------------------- <IfModule mod_expires.c> # 1. Set Expires to a minimum of 1 month, and preferably up to 1 year, in the future # (but not more than 1 year as that would violate the RFC guidelines) # 2. 
Use "Expires" over "Cache-Control: max-age" because it is more widely accepted ExpiresActive on ExpiresByType application/pdf "access plus 1 year" ExpiresByType application/x-shockwave-flash "access plus 1 year" ExpiresByType image/bmp "access plus 1 year" ExpiresByType image/gif "access plus 1 year" ExpiresByType image/jpeg "access plus 1 year" ExpiresByType image/png "access plus 1 year" ExpiresByType image/svg+xml "access plus 1 year" ExpiresByType image/tiff "access plus 1 year" ExpiresByType image/x-icon "access plus 1 year" ExpiresByType text/css "access plus 1 year" ExpiresByType video/x-flv "access plus 1 year" # Deal with JavaScript MIME type issues ExpiresByType application/javascript "access plus 1 year" ExpiresByType application/x-javascript "access plus 1 year" ExpiresByType text/ecmascript "access plus 1 year" ExpiresByType text/javascript "access plus 1 year" # Probably better to explicitly declare MIME types than to have a blanket rule for expiration # Uncomment below if you disagree #ExpiresDefault "access plus 1 year" </IfModule> #------------------------------------------------------------------------------- # Caching #------------------------------------------------------------------------------- <IfModule mod_headers.c> <FilesMatch "\.(bmp|css|flv|gif|ico|jpg|jpeg|js|pdf|png|svg|swf|tif|tiff)$"> Header add Cache-Control "public" Header unset ETag Header unset Last-Modified FileETag none </FilesMatch> </IfModule>

    Read the article

  • C# SerialPort - Problems mixing ports with different baud rates.

    - by GrandAdmiral
    Greetings, I have two devices that I would like to connect over a serial interface, but they have incompatible connections. To get around this problem, I connected them both to my PC and I'm working on a C# program that will route traffic on COM port X to COM port Y and vice versa. The program connects to two COM ports. In the data received event handler, I read in incoming data and write it to the other COM port. To do this, I have the following code: private void HandleDataReceived(SerialPort inPort, SerialPort outPort) { byte[] data = new byte[1]; while (inPort.BytesToRead > 0) { // Read the data data[0] = (byte)inPort.ReadByte(); // Write the data if (outPort.IsOpen) { outPort.Write(data, 0, 1); } } } That code worked fine as long as the outgoing COM port operated at a higher baud rate than the incoming COM port. If the incoming COM port was faster than the outgoing COM port, I started missing data. I had to correct the code like this: private void HandleDataReceived(SerialPort inPort, SerialPort outPort) { byte[] data = new byte[1]; while (inPort.BytesToRead > 0) { // Read the data data[0] = (byte)inPort.ReadByte(); // Write the data if (outPort.IsOpen) { outPort.Write(data, 0, 1); while (outPort.BytesToWrite > 0); //<-- Change to fix problem } } } I don't understand why I need that fix. I'm new to C# (this is my first program), so I'm wondering if there is something I am missing. The SerialPort defaults to a 2048 byte write buffer and my commands are less than ten bytes. The write buffer should have the ability to buffer the data until it can be written to a slower COM port. In summary, I'm receiving data on COM X and writing the data to COM Y. COM X is connected at a faster baud rate than COM Y. Why doesn't the buffering in the write buffer handle this difference? Why does it seem that I need to wait for the write buffer to drain to avoid losing data? Thanks!
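
    For what it's worth, the forwarding loop itself - read whatever has arrived on one port, write it to the other - looks roughly like the sketch below when expressed in Python with the pyserial library (port names and baud rates are illustrative; this is a re-statement of the routing idea, not the original C# code, and it does not settle the buffering question asked above):

      # Forward bytes between two serial ports running at different baud rates.
      import serial  # pyserial

      fast_port = serial.Serial("COM3", baudrate=115200, timeout=0)
      slow_port = serial.Serial("COM4", baudrate=9600)

      while True:
          waiting = fast_port.in_waiting      # bytes already buffered by the driver
          if waiting:
              data = fast_port.read(waiting)
              slow_port.write(data)           # hand the data to the outgoing buffer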

    Read the article

  • Hibernate exception

    - by Mark
    Hi all, im new to hibernate! i have followed the netbeans tutorial on creating a hibernate enabled application. after sucessfully creating a database in mysql workbench i reversed engineered the pojos etc and then tried to run a simple query(from Course) and got the following org.hibernate.MappingException: An association from the table coursemodule refers to an unmapped class: DAL.Module at org.hibernate.cfg.Configuration.secondPassCompileForeignKeys(Configuration.java:1252) at org.hibernate.cfg.Configuration.secondPassCompile(Configuration.java:1170) at org.hibernate.cfg.AnnotationConfiguration.secondPassCompile(AnnotationConfiguration.java:324) at org.hibernate.cfg.Configuration.buildSessionFactory(Configuration.java:1286) at org.hibernate.cfg.AnnotationConfiguration.buildSessionFactory(AnnotationConfiguration.java:859) heres the generated class for Course package DAL; // Generated 02-May-2010 16:41:16 by Hibernate Tools 3.2.1.GA import java.util.HashSet; import java.util.Set; /** * Course generated by hbm2java */ public class Course implements java.io.Serializable { private int id; private String name; private Set<Module> modules = new HashSet<Module>(0); public Course() { } public Course(int id, String name) { this.id = id; this.name = name; } public Course(int id, String name, Set<Module> modules) { this.id = id; this.name = name; this.modules = modules; } public int getId() { return this.id; } public void setId(int id) { this.id = id; } public String getName() { return this.name; } public void setName(String name) { this.name = name; } public Set<Module> getModules() { return this.modules; } public void setModules(Set<Module> modules) { this.modules = modules; } } and its config file course.hbm.xml <?xml version="1.0"?> <!DOCTYPE hibernate-mapping PUBLIC "-//Hibernate/Hibernate Mapping DTD 3.0//EN" "http://hibernate.sourceforge.net/hibernate-mapping-3.0.dtd"> <!-- Generated 02-May-2010 16:41:16 by Hibernate Tools 3.2.1.GA --> <hibernate-mapping> <class name="DAL.Course" table="course" catalog="walkthrough"> <id name="id" type="int"> <column name="id" /> <generator class="assigned" /> </id> <property name="name" type="string"> <column name="name" not-null="true" /> </property> <set name="modules" inverse="false" table="coursemodule"> <key> <column name="courseId" not-null="true" unique="true" /> </key> <many-to-many entity-name="DAL.Module"> <column name="moduleId" not-null="true" unique="true" /> </many-to-many> </set> </class> </hibernate-mapping> hibernate.reveng.xml <?xml version="1.0" encoding="UTF-8"?> <!DOCTYPE hibernate-reverse-engineering PUBLIC "-//Hibernate/Hibernate Reverse Engineering DTD 3.0//EN" "http://hibernate.sourceforge.net/hibernate-reverse-engineering-3.0.dtd"> <hibernate-reverse-engineering> <schema-selection match-catalog="Walkthrough"/> <table-filter match-name="walkthrough"/> <table-filter match-name="course"/> <table-filter match-name="module"/> <table-filter match-name="studentmodule"/> <table-filter match-name="attendee"/> <table-filter match-name="student"/> <table-filter match-name="coursemodule"/> <table-filter match-name="session"/> <table-filter match-name="test"/> </hibernate-reverse-engineering> hibernate.cfg.xml <?xml version="1.0" encoding="UTF-8"?> <!DOCTYPE hibernate-configuration PUBLIC "-//Hibernate/Hibernate Configuration DTD 3.0//EN" "http://hibernate.sourceforge.net/hibernate-configuration-3.0.dtd"> <hibernate-configuration> <session-factory> <property name="hibernate.dialect">org.hibernate.dialect.MySQLDialect</property> <property 
name="hibernate.connection.driver_class">com.mysql.jdbc.Driver</property> <property name="hibernate.connection.url">jdbc:mysql://localhost:3306/Walkthrough</property> <property name="hibernate.connection.username">root</property> <property name="hibernate.connection.password">password</property> <property name="hibernate.show_sql">true</property> <property name="hibernate.current_session_context_class">thread</property> <mapping resource="DAL/Student.hbm.xml"/> <mapping resource="DAL/Walkthrough.hbm.xml"/> <mapping resource="DAL/Test.hbm.xml"/> <mapping resource="DAL/Module.hbm.xml"/> <mapping resource="DAL/Session.hbm.xml"/> <mapping resource="DAL/Course.hbm.xml"/> </session-factory> </hibernate-configuration> any ideas on why im getting this exception? ps. test is just a table with an id in it and is not related to anything. running "from Test" works

    Read the article

  • Flex ANT tasks can't find my assets

    - by lach
    I'm attempting to compile my Flex project with an ANT build script. One of my MXML components references an external XML data file, like this: <mx:XML id="treeData" source="assets/data/help.xml" /> When I build the project using Flex Builder, it compiles fine. However, when I try to compile it using ANT, I get the following error: Error: Problem finding external XML: assets/data/help.xml How come ANT isn't finding the XML file? Apparently it knows the source path otherwise it would not have found the component to begin with. I added the source path to the target anyway, but it doesn't seem to have made any difference: <source-path path-element="${SRC}" /> Any ideas?

    Read the article

  • ASP Web Service Not Working

    - by BlitzPackage
    Good people, Hello. Our webservice is not working. We should be receiving information via an HTTP POST. However, nothing is working. Below are the code files. Let me know what you think. Thanks in advance for any help or information you can provide. (By the way, some information (e.g. class names, connection strings, etc...) has been removed or changed in order to hide any sensitive information. Imports System.Web.Mail Imports System.Data Imports System.Data.SqlClient Imports System.IO Partial Class hbcertification Inherits System.Web.UI.Page Public strBody As String = "" Public sqlInsertStr As String = "" Public errStr As String = "" Public txn_id, first_name, last_name, address_street, address_city, address_state, address_zip, address_country, address_phone, payer_email, Price, key, invoice, payment_date, mc_fee, buyer_ip As String Dim myConn As New SqlConnection(ConfigurationSettings.AppSettings("ConnectionInfo")) '******************************************************************************************* Protected Sub Page_Load(ByVal sender As Object, ByVal e As System.EventArgs) Handles Me.Load strBody += "Test email sent.Customer name: " & Request("first_name") & " " & Request("last_name") strBody += "Reg Key: " & Request("key") & "Transaction ID: " & Request("txn_id") & "Tran Type: " & Request("txn_type") updateFile(Server.MapPath("log.txt"), strBody) txn_id = Request("txn_id") first_name = Request("first_name") last_name = Request("last_name") address_street = Request("address_street") address_city = Request("address_city") address_state = Request("address_state") address_zip = Request("address_zip") address_country = Request("address_country") address_phone = Request("address_phone") payer_email = Request("payer_email") Price = Request("Price") key = Request("key") invoice = Request("invoice") payment_date = Request("payment_date") mc_fee = Request("mc_fee") buyer_ip = Request("buyer_ip") If Request("first_name") "" And Request("last_name") "" Then SendMail("[email protected]", "[email protected]", strBody, "Software Order Notification", "[email protected]") Else Response.Write("Email not sent. 
Name missing.") End If Dim sItem As String Response.Write("") If Request.Form("dosubmit") = "1" Then Response.Write("FORM VALS:") For Each sItem In Request.Form Response.Write("" & sItem & " - [" & Request.Form(sItem) & "]") Next sqlInsertStr += "insert into aspnet_MorrisCustomerInfo (TransactionID,FirstName,LastName,AddressStreet,AddressCity,AddressState,AddressZip,AddressCountry,AddressPhone,PayerEmail,Price,AuthenticationCode,InvoiceID,PurchaseDate,PaypalFee,PurchaseIPAddress) values ('" & SQLSafe(txn_id) & "','" & SQLSafe(first_name) & "','" & SQLSafe(last_name) & "','" & SQLSafe(address_street) & "','" & SQLSafe(address_city) & "','" & SQLSafe(address_state) & "','" & SQLSafe(address_zip) & "','" & SQLSafe(address_country) & "','" & SQLSafe(address_phone) & "','" & SQLSafe(payer_email) & "','" & SQLSafe(Price) & "','" & SQLSafe(key) & "','" & SQLSafe(invoice) & "','" & SQLSafe(payment_date) & "','" & SQLSafe(mc_fee) & "','" & SQLSafe(buyer_ip) & "')" runMyQuery(sqlInsertStr, False) End If Response.Write("sqlInsertStr is: " & sqlInsertStr) Response.Write("") End Sub '******************************************************************************************* Sub SendMail(ByVal strEmailAddress, ByVal strEmailAddress_cc, ByVal Email_Body, ByVal Email_Subject, ByVal Email_From) If Request.ServerVariables("server_name") "localhost" Then Try Dim resumeEmail As New MailMessage resumeEmail.To = strEmailAddress resumeEmail.Cc = strEmailAddress_cc resumeEmail.From = Email_From resumeEmail.Subject = Email_Subject resumeEmail.Priority = MailPriority.High 'resumeEmail.BodyFormat = MailFormat.Html resumeEmail.BodyFormat = MailFormat.Html resumeEmail.Body = Email_Body 'System.Web.Mail.SmtpMail.SmtpServer = "morris.com" System.Web.Mail.SmtpMail.SmtpServer = "relay-hosting.secureserver.net" System.Web.Mail.SmtpMail.Send(resumeEmail) Response.Write("Email sent.") Catch exc As Exception Response.Write("MAIL ERROR OCCURRED" & exc.ToString() & "From: " & Email_From) End Try Else Response.Write("TEST RESPONSE" & strBody & "") End If End Sub End Function End Class Process Data Asp.Net Configuration option in Visual Studio. A full list of settings and comments can be found in machine.config.comments usually located in \Windows\Microsoft.Net\Framework\v2.x\Config -- section enables configuration of the security authentication mode used by ASP.NET to identify an incoming user. -- section enables configuration of what to do if/when an unhandled error occurs during the execution of a request. Specifically, it enables developers to configure html error pages to be displayed in place of a error stack trace. --

    Read the article

  • Exporting info from PS script to csv

    - by George
    Hi, this is a PowerShell/AD/Exchange question. I'm running a script against a number of users to check some of their attributes, but I'm having trouble getting this to output to CSV. The script runs well and does exactly what I need it to do, and the output on screen is fine; I'm just having trouble getting it to export directly to CSV. The input is a comma-separated txt file of usernames (e.g. "username1,username2,username3"). I've experimented with creating custom PS objects, adding to them and then exporting those, but it's not working... Any suggestions gratefully received. Thanks, George
    $array = Get-Content $InputPath
    #split the comma delimited string into an array
    $arrayb = $array.Split(",");
    foreach ($User in $arrayb)
    {
    #find group membership
    Write-Host "AD group membership for $User"
    Get-QADMemberOf $User
    #get mailbox info
    Write-Host "Mailbox info for $User"
    Get-Mailbox $User | select ServerName, Database, EmailAddresses, PrimarySmtpAddress, WindowsEmailAddress
    #get profile details
    Write-Host "Home drive info for $User"
    Get-QADUser $User | select HomeDirectory,HomeDrive
    #add space between users
    Write-Host ""
    Write-Host "******************************************************"
    }
    Write-Host "End Script"

    Read the article

  • Strategy for storing and displaying form dropdown data for provinces, states, prefixes?

    - by meder
    I'm currently migrating from a class that stores lists of countries, states and provinces in the form of arrays to using Zend's locale data in the form of ldml xml files. These ldml files provide localised lists of countries, currencies and languages - so I'm not exactly sure where I should store US states, Canadian provinces and prefixes. I was thinking of possibly just creating a generic xml file and storing it in the same directory as the ldml files, but I have doubts because it wouldn't really be localised, as I'd store it in English. Should I go with storing it in a generic xml file, or possibly update each of the locale files (e.g. en.xml) and append to them? The latter is probably not worth the work, which is why I'm swaying towards just a general.xml or dropdown-data.xml. As for generating dropdown options, I suppose I could just grab all US states, append the Canadian provinces to the array, and append an 'Other' option to that - does this seem like the right way to go about it?

    Read the article

  • simple regex to splice out text in ruby

    - by user141146
    I'm using Ruby and I want to splice out a piece of a string that matches a regex (I think this is relatively easy, but I'm having difficulty). I have several thousand strings that look like this (to varying degrees): my_string = "adfa <b>weru</b> orua fklajdfqwieru ofaslkdfj alrjeowur woer woeriuwe <img src=\"/images/abcde_111-222-333/111-222-333.xml/blahblahblah.jpg\" />" I would like to splice out the /111-222-333.xml part (the value changes from string to string, but suffice it to say that I want to remove the piece between two forward slashes that contains something.xml). My hope was to find a match like this: my_match = my_string.match(/\/.+?\.xml\//) but this actually captures "/b> orua fklajdfqwieru ofaslkdfj alrjeowur woer woeriuwe <img src=\"/images/abcde_111-222-333/111-222-333.xml/" I assumed that .+? would match what I am looking for, but it seems like it starts with the first forward slash that it finds (even though it's non-greedy) and then expands forward to the ".xml". Any thoughts on what I'm doing wrong? Thanks!!
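
    A quick illustration of the leftmost-match behaviour that is probably at play here: a non-greedy quantifier only minimises the length of the match, it does not move the starting point, so the engine still anchors on the first "/" it can match from. The sketch below is in Python, but the same two patterns behave the same way in Ruby; the character-class variant is just one common way to keep the match between a single pair of slashes, not necessarily what the original poster settled on.

      import re

      s = ('adfa <b>weru</b> orua fklajdfqwieru ofaslkdfj alrjeowur woer woeriuwe '
           '<img src="/images/abcde_111-222-333/111-222-333.xml/blahblahblah.jpg" />')

      # Non-greedy: starts at the leftmost "/" (inside </b>) and stops at the first ".xml/".
      print(re.search(r'/.+?\.xml/', s).group())
      # -> /b> orua ... <img src="/images/abcde_111-222-333/111-222-333.xml/

      # Forbidding "/" inside the match keeps it between one pair of slashes.
      print(re.search(r'/[^/]+\.xml/', s).group())
      # -> /111-222-333.xml/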

    Read the article

  • XSLT, JSTL and JSF

    - by Paulo S.
    I have a xml file which I want to transform in a jsf code page. To do that I've created a xsl file. xml: <?xml version='1.0' encoding='ISO-8859-1'?> <questionario xmlns:xsi='http://www.w3.org/2001/XMLSchema-instance' xsi:noNamespaceSchemaLocation='Schema2.xsd'> <componente nome='input'> <id>input1</id> </componente> <componente nome='input'> <id>input2</id> </componente> </questionario> code: <%@ taglib uri="http://java.sun.com/jstl/core_rt" prefix="c" %> <%@ taglib uri="http://java.sun.com/jsp/jstl/xml" prefix="x" %> <c:set var="xml" value="${questionarioXSLBean.xml}"/> <c:set var="xsl"> <xsl:stylesheet xmlns:xsl="http://www.w3.org/1999/XSL/Transform" version="2.0" xmlns:f="http://java.sun.com/jsf/core" xmlns:h="http://java.sun.com/jsf/html" exclude-result-prefixes="f h"> <xsl:template match="/"> <xsl:for-each select="questionario/componente"> <xsl:if test="attribute::nome = 'input'"> <xsl:variable name="id"> <xsl:value-of select="id" /> </xsl:variable> <h:inputText id="{$id}"/> </xsl:if> </xsl:for-each> </xsl:template> </xsl:stylesheet> </c:set> <x:transform xml="${xml}" xslt="${xsl}" /> The problem is that nothing is shown in my screen because the generated code for <h:inputText id="input1"/> is <h:inputText id="input_1" xmlns:h="http://java.sun.com/jsf/html"/> how can I replace the xmlns:h="http://java.sun.com/jsf/html" or suppress it. Thanks! Update: Let me clarify what I want to do. I want to generate a jsf page dynamically depending on the attributes of a xml file, for instance, 2 input texts, 3 check boxes, etc. To make the transformation to jsf I thought in two approaches, one using jstl and another using xslt. The problem with the former is that I couldn't integrate jstl with jsf code (to set jsf components attributes using jstl variables) and with the last approach, I'm facing the problem described above.I wouldn't like to create components in java (UIComponents). Any suggestions?

    Read the article

  • python sax error "junk after document element"

    - by user293487
    Hi, I use Python SAX to parse an xml file. The xml file is actually a combination of multiple xml files. It looks like this: <row name="abc" age="40" body="blalalala..." creationdate="03/10/10" /> <row name="bcd" age="50" body="blalalala..." creationdate="03/10/09" /> My Python code is below. It shows a "junk after document element" error. Any good ideas on how to solve this problem? Thanks. from xml.sax.handler import ContentHandler from xml.sax import make_parser,SAXException import sys class PostHandler (ContentHandler): def __init__(self): self.find = 0 self.buffer = '' self.mapping={} def startElement(self,name,attrs): if name == 'row': self.find = 1 self.body = attrs["body"] print attrs["body"] def character(self,data): if self.find==1: self.buffer+=data def endElement(self,name): if self.find == 1: self.mapping[self.body] = self.buffer print self.mapping parser = make_parser() handler = PostHandler() parser.setContentHandler(handler) try: parser.parse(open("2.xml")) except SAXException:
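
    The file shown above has several top-level <row .../> elements, so it is not a single well-formed XML document, which is what typically triggers "junk after document element". One common workaround - an assumption about what is needed here, not something stated in the question - is to wrap the whole file in a synthetic root element before parsing. A minimal Python 3 sketch:

      # Wrap a file containing multiple top-level <row/> elements in a synthetic
      # root so SAX sees one well-formed document.
      import xml.sax
      from xml.sax.handler import ContentHandler

      class RowHandler(ContentHandler):
          def startElement(self, name, attrs):
              if name == "row":
                  print(attrs["name"], attrs["body"])

      with open("2.xml", "rb") as f:          # "2.xml" is the file from the question
          raw = f.read()

      xml.sax.parseString(b"<rows>" + raw + b"</rows>", RowHandler())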

    Read the article

  • How to embed a progressbar into a HTML form?

    - by Noah Brainey
    I have this code below and want it to show the progress of a form submission of a file upload. I want it to work on my website visit it through this IP (24.148.156.217). So if you saw the website I want the progress bar to be displayed when the user fills in the information and then hits the submit button. Then the progress bar displays with the time until it's finished. <style> <!-- .hide { position:absolute; visibility:hidden; } .show { position:absolute; visibility:visible; } --> </style> <SCRIPT LANGUAGE="JavaScript"> //Progress Bar script- by Todd King ([email protected]) //Modified by JavaScript Kit for NS6, ability to specify duration //Visit JavaScript Kit (http://javascriptkit.com) for script var duration=3 // Specify duration of progress bar in seconds var _progressWidth = 50; // Display width of progress bar. var _progressBar = "|||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||" var _progressEnd = 5; var _progressAt = 0; // Create and display the progress dialog. // end: The number of steps to completion function ProgressCreate(end) { // Initialize state variables _progressEnd = end; _progressAt = 0; // Move layer to center of window to show if (document.all) { // Internet Explorer progress.className = 'show'; progress.style.left = (document.body.clientWidth/2) - (progress.offsetWidth/2); progress.style.top = document.body.scrollTop+(document.body.clientHeight/2) - (progress.offsetHeight/2); } else if (document.layers) { // Netscape document.progress.visibility = true; document.progress.left = (window.innerWidth/2) - 100+"px"; document.progress.top = pageYOffset+(window.innerHeight/2) - 40+"px"; } else if (document.getElementById) { // Netscape 6+ document.getElementById("progress").className = 'show'; document.getElementById("progress").style.left = (window.innerWidth/2)- 100+"px"; document.getElementById("progress").style.top = pageYOffset+(window.innerHeight/2) - 40+"px"; } ProgressUpdate(); // Initialize bar } // Hide the progress layer function ProgressDestroy() { // Move off screen to hide if (document.all) { // Internet Explorer progress.className = 'hide'; } else if (document.layers) { // Netscape document.progress.visibility = false; } else if (document.getElementById) { // Netscape 6+ document.getElementById("progress").className = 'hide'; } } // Increment the progress dialog one step function ProgressStepIt() { _progressAt++; if(_progressAt > _progressEnd) _progressAt = _progressAt % _progressEnd; ProgressUpdate(); } // Update the progress dialog with the current state function ProgressUpdate() { var n = (_progressWidth / _progressEnd) * _progressAt; if (document.all) { // Internet Explorer var bar = dialog.bar; } else if (document.layers) { // Netscape var bar = document.layers["progress"].document.forms["dialog"].bar; n = n * 0.55; // characters are larger } else if (document.getElementById){ var bar=document.getElementById("bar") } var temp = _progressBar.substring(0, n); bar.value = temp; } // Demonstrate a use of the progress dialog. 
function Demo() { ProgressCreate(10); window.setTimeout("Click()", 100); } function Click() { if(_progressAt >= _progressEnd) { ProgressDestroy(); return; } ProgressStepIt(); window.setTimeout("Click()", (duration-1)*1000/10); } function CallJS(jsStr) { //v2.0 return eval(jsStr) } </script> <SCRIPT LANGUAGE="JavaScript"> // Create layer for progress dialog document.write("<span id=\"progress\" class=\"hide\">"); document.write("<FORM name=dialog id=dialog>"); document.write("<TABLE border=2 bgcolor=\"#FFFFCC\">"); document.write("<TR><TD ALIGN=\"center\">"); document.write("Progress<BR>"); document.write("<input type=text name=\"bar\" id=\"bar\" size=\"" + _progressWidth/2 + "\""); if(document.all||document.getElementById) // Microsoft, NS6 document.write(" bar.style=\"color:navy;\">"); else // Netscape document.write(">"); document.write("</TD></TR>"); document.write("</TABLE>"); document.write("</FORM>"); document.write("</span>"); ProgressDestroy(); // Hides </script> <form name="form1" method="post"> <center> <input type="button" name="Demo" value="Display progress" onClick="CallJS('Demo()')"> </center> </form> <a href="javascript:CallJS('Demo()')">Text link example</a>

    Read the article

  • visual studio localhost server can't locate file

    - by mhenk
    I have a very simple web project: just one htm file with some JavaScript that opens a file, test.xml. The xml file is in the same folder as the htm file (and it is part of the project), but when I start the page (F5 or Ctrl-F5) it can't find the xml. It starts as http://localhost:50586/main.htm When I do a folder listing, the test.xml file is right there. Opening the page directly in Firefox works fine and the script can read and extract data from the xml. How can I convince the development server that the xml is indeed there? Or better still: how can I turn the development server off entirely? For my purposes simply opening the page in my browser is all I need, and that should happen from within VS of course.

    Read the article
