Search Results

Search found 33433 results on 1338 pages for 'java annotations'.


  • WebLogic Server technical update materials (Japanese-language post; original title lost to encoding corruption)

    - by ???02
    The body of this Japanese-language post was lost to character-encoding corruption; only scattered product names survive. Those fragments indicate a set of WebLogic Server 10.3.5 / Database 11g technical materials that appear to cover: a WebLogic Server overview and the 10gR3 and 11gR1 updates; installation and configuration topics (JRF, JDBC, RAC); WebLogic Server JDBC data sources (configuration, monitoring and tuning, as of 11gR1 (10.3.3)); GridLink for RAC connectivity to Oracle RAC; Enterprise Grid Messaging, the Java Messaging Service (JMS) implementation in Oracle WebLogic Server, including Oracle Advanced Queuing and Oracle RAC integration; web services, including the JAX-WS and JAX-RPC stacks in WebLogic Server 10.3.x; Java EE and JVM topics; and clustering and session management, including HTTP session replication and EJB clustering.

    Read the article

  • Troubleshooting with JRockit Flight Recorder (Japanese-language post; original title lost to encoding corruption) | WebLogic Channel

    - by ???02
    The body of this Japanese-language article was lost to character-encoding corruption; only scattered product and feature names survive. Those fragments indicate a discussion of JRockit Flight Recorder, which became available with WebLogic Server Enterprise Edition 11g R1 Patch Set 2 (July 2010) as the successor to JRockit Runtime Analyzer. The tool continuously records events from the OS and CPU, the JRockit JVM (including garbage collection and JIT compilation) and the Java application, plus WebLogic Server events for EJB, JDBC, web services and JTA, at an overhead of roughly 3%. Recordings are analyzed in the Oracle JRockit Mission Control GUI and used to diagnose problems such as OutOfMemoryError, GC pauses, Thread Parked waits and slow JDBC SQL, and to verify SLA (Service Level Agreement) targets. The article closes by pointing to a 45-minute session video on the JRockit JVM and JRockit Flight Recorder, available in WMV and MP4 formats.

    Read the article

  • kSOAP2 and SOAPAction on Android

    - by Matze
    I am trying to access a web service with kSOAP2 on an Android phone. I think the connection is being established, but the server won't answer my request since I'm not providing a SOAPAction header, which seems to be required in SOAP version 1.1 (please correct me if I'm wrong here). I have to use 1.1 because the server does not support version 1.2. The concrete fault returned in the response looks like this:

        faultactor  null
        faultcode   "S:Server" (id=830064966432)
        faultstring "String index out of range: -11" (id=830064966736)

    The error generated on the server (I'm running it on localhost) looks like this:

        4.05.2010 20:20:29 com.sun.xml.internal.ws.transport.http.HttpAdapter fixQuotesAroundSoapAction
        WARNUNG: Received WS-I BP non-conformant Unquoted SoapAction HTTP header: http://server.contextlayer.bscwi.de/createContext
        24.05.2010 20:20:29 com.sun.xml.internal.ws.server.sei.EndpointMethodHandler invoke
        SCHWERWIEGEND: String index out of range: -11
        java.lang.StringIndexOutOfBoundsException: String index out of range: -11
            at java.lang.String.substring(Unknown Source)
            at de.bscwi.contextlayer.xml.XmlValidator.isValid(XmlValidator.java:41)
            at de.bscwi.contextlayer.server.ContextWS.createContext(ContextWS.java:45)
            at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
            at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
            at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
            at java.lang.reflect.Method.invoke(Unknown Source)
            at com.sun.xml.internal.ws.api.server.InstanceResolver$1.invoke(Unknown Source)
            at com.sun.xml.internal.ws.server.InvokerTube$2.invoke(Unknown Source)
            at com.sun.xml.internal.ws.server.sei.EndpointMethodHandler.invoke(Unknown Source)
            at com.sun.xml.internal.ws.server.sei.SEIInvokerTube.processRequest(Unknown Source)
            at com.sun.xml.internal.ws.api.pipe.Fiber.__doRun(Unknown Source)
            at com.sun.xml.internal.ws.api.pipe.Fiber._doRun(Unknown Source)
            at com.sun.xml.internal.ws.api.pipe.Fiber.doRun(Unknown Source)
            at com.sun.xml.internal.ws.api.pipe.Fiber.runSync(Unknown Source)
            at com.sun.xml.internal.ws.server.WSEndpointImpl$2.process(Unknown Source)
            at com.sun.xml.internal.ws.transport.http.HttpAdapter$HttpToolkit.handle(Unknown Source)
            at com.sun.xml.internal.ws.transport.http.HttpAdapter.handle(Unknown Source)
            at com.sun.xml.internal.ws.transport.http.server.WSHttpHandler.handleExchange(Unknown Source)
            at com.sun.xml.internal.ws.transport.http.server.WSHttpHandler.handle(Unknown Source)
            at com.sun.net.httpserver.Filter$Chain.doFilter(Unknown Source)
            at sun.net.httpserver.AuthFilter.doFilter(Unknown Source)
            at com.sun.net.httpserver.Filter$Chain.doFilter(Unknown Source)
            at sun.net.httpserver.ServerImpl$Exchange$LinkHandler.handle(Unknown Source)
            at com.sun.net.httpserver.Filter$Chain.doFilter(Unknown Source)
            at sun.net.httpserver.ServerImpl$Exchange.run(Unknown Source)
            at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(Unknown Source)
            at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
            at java.lang.Thread.run(Unknown Source)

    The relevant part of the WSDL (at least that's what I think) looks like this:

        <operation name="createContext">
          <soap:operation soapAction=""/>
          <input>
            <soap:body use="literal" namespace="http://server.contextlayer.bscwi.de/"/>
          </input>
          <output>
            <soap:body use="literal" namespace="http://server.contextlayer.bscwi.de/"/>
          </output>
        </operation>

    In my code I'm adding a header, but it seems like I'm doing it wrong:

        private static final String SOAP_ACTION = "";
        //...
        SoapSerializationEnvelope soapEnvelope = new SoapSerializationEnvelope(SoapEnvelope.VER11);
        soapEnvelope.setOutputSoapObject(Request);
        AndroidHttpTransport aht = new AndroidHttpTransport(URL);
        //...
        aht.call(SOAP_ACTION, soapEnvelope);
        SoapPrimitive resultString = (SoapPrimitive) soapEnvelope.getResponse();

    Any advice would be great since I'm running out of ideas. Thanks, folks!

    Read the article

  • Encoding in Scene Builder

    - by Agafonova Victoria
    I generate an FXML file with Scene Builder, and I need it to contain some Cyrillic text. When I edit this file with Scene Builder I can see normal Cyrillic letters (screen 1). After compiling and running my program with this FXML file, I see not Cyrillic letters but some artifacts (screen 2). However, as you can see on screen 3, the XML declaration says the encoding is UTF-8, while the editor shows that the file is actually saved as ANSI. I've tried to open it with other editors (default Eclipse and Sublime Text 2) and they showed the wrong encoding too (screens 4 and 5).

    At first I tried to convert it from ANSI to UTF-8 (with Notepad++). After that, Eclipse and Sublime Text 2 started displaying the Cyrillic letters as they should be. But Scene Builder gave an error when I tried to open the converted file:

        Error loading file C:\eclipse\workspace\equification\src\main\java\ru\igs\ava\equification\test.fxml.
        C:\eclipse\workspace\equification\src\main\java\ru\igs\ava\equification\test.fxml:1: ParseError at [row,col]:[1,1]
        Message: Content is not allowed in prolog.

    And running the program gave me an error:

        ??? 08, 2012 8:11:03 PM javafx.fxml.FXMLLoader logException
        SEVERE: javax.xml.stream.XMLStreamException: ParseError at [row,col]:[1,1]
        Message: Content is not allowed in prolog.
         /C:/eclipse/workspace/equification/target/classes/ru/igs/ava/equification/test.fxml:1
            at javafx.fxml.FXMLLoader.load(Unknown Source)
            at javafx.fxml.FXMLLoader.load(Unknown Source)
            at javafx.fxml.FXMLLoader.load(Unknown Source)
            at javafx.fxml.FXMLLoader.load(Unknown Source)
            at javafx.fxml.FXMLLoader.load(Unknown Source)
            at javafx.fxml.FXMLLoader.load(Unknown Source)
            at ru.igs.ava.equification.EquificationFX.start(EquificationFX.java:22)
            at com.sun.javafx.application.LauncherImpl$5.run(Unknown Source)
            at com.sun.javafx.application.PlatformImpl$4.run(Unknown Source)
            at com.sun.javafx.application.PlatformImpl$3.run(Unknown Source)
            at com.sun.glass.ui.win.WinApplication._runLoop(Native Method)
            at com.sun.glass.ui.win.WinApplication.access$100(Unknown Source)
            at com.sun.glass.ui.win.WinApplication$2$1.run(Unknown Source)
            at java.lang.Thread.run(Unknown Source)
        Exception in Application start method
        Exception in thread "main" java.lang.RuntimeException: Exception in Application start method
            at com.sun.javafx.application.LauncherImpl.launchApplication1(Unknown Source)
            at com.sun.javafx.application.LauncherImpl.access$000(Unknown Source)
            at com.sun.javafx.application.LauncherImpl$1.run(Unknown Source)
            at java.lang.Thread.run(Unknown Source)
        Caused by: javafx.fxml.LoadException: javax.xml.stream.XMLStreamException: ParseError at [row,col]:[1,1]
        Message: Content is not allowed in prolog.
            at javafx.fxml.FXMLLoader.load(Unknown Source)
            at javafx.fxml.FXMLLoader.load(Unknown Source)
            at javafx.fxml.FXMLLoader.load(Unknown Source)
            at javafx.fxml.FXMLLoader.load(Unknown Source)
            at javafx.fxml.FXMLLoader.load(Unknown Source)
            at javafx.fxml.FXMLLoader.load(Unknown Source)
            at ru.igs.ava.equification.EquificationFX.start(EquificationFX.java:22)
            at com.sun.javafx.application.LauncherImpl$5.run(Unknown Source)
            at com.sun.javafx.application.PlatformImpl$4.run(Unknown Source)
            at com.sun.javafx.application.PlatformImpl$3.run(Unknown Source)
            at com.sun.glass.ui.win.WinApplication._runLoop(Native Method)
            at com.sun.glass.ui.win.WinApplication.access$100(Unknown Source)
            at com.sun.glass.ui.win.WinApplication$2$1.run(Unknown Source)
            ... 1 more
        Caused by: javax.xml.stream.XMLStreamException: ParseError at [row,col]:[1,1]
        Message: Content is not allowed in prolog.
            at com.sun.org.apache.xerces.internal.impl.XMLStreamReaderImpl.next(Unknown Source)
            at javax.xml.stream.util.StreamReaderDelegate.next(Unknown Source)
            ... 14 more

    So I converted it back to ANSI and, with the file in ANSI, changed its "artifacted" text to Cyrillic letters manually. Now I can see normal text when I run my program, but when I open this fixed file in Scene Builder, Scene Builder shows me "artifacted" text again (screen 7). So, how can I fix this situation?
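    A very common cause of "Content is not allowed in prolog" is a byte order mark written before the XML declaration when a file is converted to UTF-8 (Notepad++ distinguishes "UTF-8" from "UTF-8 without BOM"). The sketch below is only an illustrative workaround under that assumption: it forces FXMLLoader to read the document as UTF-8 instead of the platform default. The resource path and class name are taken from the stack trace above; the rest is assumed.

        import java.nio.charset.Charset;

        import javafx.application.Application;
        import javafx.fxml.FXMLLoader;
        import javafx.fxml.JavaFXBuilderFactory;
        import javafx.scene.Parent;
        import javafx.scene.Scene;
        import javafx.stage.Stage;

        public class EquificationFX extends Application {

            @Override
            public void start(Stage stage) throws Exception {
                // Read the FXML as UTF-8 regardless of the platform default encoding.
                FXMLLoader loader = new FXMLLoader(
                        getClass().getResource("test.fxml"),  // path taken from the question
                        null,                                 // no resource bundle
                        new JavaFXBuilderFactory(),
                        null,                                 // default controller factory
                        Charset.forName("UTF-8"));
                Parent root = (Parent) loader.load();
                stage.setScene(new Scene(root));
                stage.show();
            }

            public static void main(String[] args) {
                launch(args);
            }
        }

    Note that this only helps if the file on disk really is UTF-8 without a BOM; the file itself still has to be saved consistently in one encoding so that Scene Builder, the editors and the runtime all agree.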

    Read the article

  • Is it possible to scan Entities in jar files using JPA and hibernate

    - by user1260109
    I have the following situation:

    Project A - contains a few entities and is independent.
    Project B - contains a few entities and is independent.
    Project C - contains a few entities and depends on Project A and Project B.

    I am using Maven to manage dependencies and builds. When I test Project A and Project B, everything goes through fine. Each of them has a persistence.xml and a separate persistence context. When I run Project C, it does not map any of the entities from the other projects. I have tried auto-detection and specified the jar-file attribute, but nothing seems to work. It gives me a MappingException saying "unknown entity" and won't persist or read the entities from Projects A or B. I have posted the three persistence.xml files here. I also tried using the class attribute and using the same persistence context, but it just won't find the classes. Any help is really appreciated. Thanks in advance!

        <persistence xmlns="http://java.sun.com/xml/ns/persistence"
                     xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
                     xsi:schemaLocation="http://java.sun.com/xml/ns/persistence http://java.sun.com/xml/ns/persistence/persistence_1_0.xsd"
                     version="1.0">
          <persistence-unit name="A" transaction-type="RESOURCE_LOCAL">
            <provider>org.hibernate.ejb.HibernatePersistence</provider>
            <properties>
              <property name="hibernate.dialect" value="org.hibernate.dialect.Oracle9Dialect"/>
              <property name="hibernate.connection.driver_class" value="oracle.jdbc.OracleDriver"/>
              <property name="hibernate.show_sql" value="true"/>
              <property name="hibernate.connection.username" value="username"/>
              <property name="hibernate.connection.password" value="password"/>
              <property name="hibernate.connection.url" value="jdbc:oracle:thin:@webdev.epi.web:1521/webdev.world"/>
              <property name="hibernate.max_fetch_depth" value="3"/>
              <property name="hibernate.archive.autodetection" value="class"/>
            </properties>
          </persistence-unit>
        </persistence>

        <persistence xmlns="http://java.sun.com/xml/ns/persistence"
                     xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
                     xsi:schemaLocation="http://java.sun.com/xml/ns/persistence http://java.sun.com/xml/ns/persistence/persistence_1_0.xsd"
                     version="1.0">
          <persistence-unit name="B" transaction-type="RESOURCE_LOCAL">
            <provider>org.hibernate.ejb.HibernatePersistence</provider>
            <properties>
              <property name="hibernate.dialect" value="org.hibernate.dialect.Oracle9Dialect"/>
              <property name="hibernate.connection.driver_class" value="oracle.jdbc.OracleDriver"/>
              <property name="hibernate.show_sql" value="true"/>
              <property name="hibernate.connection.username" value="username"/>
              <property name="hibernate.connection.password" value="password"/>
              <property name="hibernate.connection.url" value="jdbc:oracle:thin:@webdev.epi.web:1521/webdev.world"/>
              <property name="hibernate.max_fetch_depth" value="3"/>
              <property name="hibernate.archive.autodetection" value="class"/>
            </properties>
          </persistence-unit>
        </persistence>

        <persistence xmlns="http://java.sun.com/xml/ns/persistence"
                     xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
                     xsi:schemaLocation="http://java.sun.com/xml/ns/persistence http://java.sun.com/xml/ns/persistence/persistence_1_0.xsd"
                     version="1.0">
          <persistence-unit name="C" transaction-type="RESOURCE_LOCAL">
            <provider>org.hibernate.ejb.HibernatePersistence</provider>
            <jar-file>A-0.0.1-SNAPSHOT.jar</jar-file>
            <jar-file>B-0.0.1-SNAPSHOT.jar</jar-file>
            <properties>
              <property name="hibernate.dialect" value="org.hibernate.dialect.Oracle9Dialect"/>
              <property name="hibernate.connection.driver_class" value="oracle.jdbc.OracleDriver"/>
              <property name="hibernate.show_sql" value="true"/>
              <property name="hibernate.connection.username" value="username"/>
              <property name="hibernate.connection.password" value="password"/>
              <property name="hibernate.connection.url" value="jdbc:oracle:thin:@webdev.epi.web:1521/webdev.world"/>
              <property name="hibernate.max_fetch_depth" value="3"/>
              <property name="hibernate.archive.autodetection" value="class"/>
            </properties>
          </persistence-unit>
        </persistence>
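    When debugging this kind of setup, it can help to confirm programmatically which entities a persistence unit actually picked up. A minimal sketch, where the entity name SomeEntityFromA is a placeholder for one of Project A's real entities:

        import javax.persistence.EntityManager;
        import javax.persistence.EntityManagerFactory;
        import javax.persistence.Persistence;

        public class EntityScanCheck {
            public static void main(String[] args) {
                // Boot persistence unit "C" the same way the application would.
                EntityManagerFactory emf = Persistence.createEntityManagerFactory("C");
                EntityManager em = emf.createEntityManager();
                try {
                    // "SomeEntityFromA" is a placeholder for an entity defined in Project A's jar.
                    // If the jar was scanned, this query compiles and runs; otherwise Hibernate
                    // fails here with the same "unknown entity" / not-mapped error.
                    em.createQuery("select e from SomeEntityFromA e").setMaxResults(1).getResultList();
                    System.out.println("Entities from the jar are mapped in unit C");
                } finally {
                    em.close();
                    emf.close();
                }
            }
        }

    Also worth noting: the jar-file value is resolved relative to the root of the persistence unit that contains persistence.xml, so a bare file name like A-0.0.1-SNAPSHOT.jar only resolves if that jar sits next to it; otherwise a relative path, or listing the entities explicitly with class elements, is needed.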

    Read the article

  • Problems with Update Manager

    - by user65965
    Whenever I try to update with Update Manager I get the following errors:

        W:Failed to fetch http://archive.ubuntu.com/ubuntu/dists/precise/Release Unable to find expected entry 'commercial/source/Sources' in Release file (Wrong sources.list entry or malformed file)
        W:Failed to fetch http://archive.ubuntu.com/ubuntu/dists/precise-updates/Release Unable to find expected entry 'commercial/source/Sources' in Release file (Wrong sources.list entry or malformed file)
        W:Failed to fetch http://archive.ubuntu.com/ubuntu/dists/precise-backports/Release Unable to find expected entry 'commercial/source/Sources' in Release file (Wrong sources.list entry or malformed file)
        W:Failed to fetch http://archive.ubuntu.com/ubuntu/dists/precise-security/Release Unable to find expected entry 'commercial/source/Sources' in Release file (Wrong sources.list entry or malformed file)
        W:Failed to fetch http://ppa.launchpad.net/iefremov/ppa/ubuntu/dists/precise/main/source/Sources 404 Not Found
        W:Failed to fetch http://ppa.launchpad.net/iefremov/ppa/ubuntu/dists/precise/main/binary-amd64/Packages 404 Not Found
        W:Failed to fetch http://ppa.launchpad.net/iefremov/ppa/ubuntu/dists/precise/main/binary-i386/Packages 404 Not Found
        E:Some index files failed to download. They have been ignored, or old ones used instead.

    Any help would be much appreciated, thank you.

    Thank you very much Eliah. I'm still pretty new to Ubuntu. Here's the output I got from the terminal:

        No LSB modules are available.
        Distributor ID: Ubuntu
        Description:    Ubuntu 12.04 LTS
        Release:        12.04
        Codename:       precise

        # See http://help.ubuntu.com/community/UpgradeNotes for how to upgrade to
        # newer versions of the distribution.
        deb http://archive.ubuntu.com/ubuntu precise main restricted commercial
        deb-src http://archive.ubuntu.com/ubuntu precise restricted main commercial multiverse universe #Added by software-properties
        ## Major bug fix updates produced after the final release of the
        ## distribution.
        deb http://archive.ubuntu.com/ubuntu precise-updates main restricted commercial
        deb-src http://archive.ubuntu.com/ubuntu precise-updates restricted main commercial multiverse universe #Added by software-properties
        ## N.B. software from this repository is ENTIRELY UNSUPPORTED by the Ubuntu
        ## team. Also, please note that software in universe WILL NOT receive any
        ## review or updates from the Ubuntu security team.
        deb http://archive.ubuntu.com/ubuntu precise universe
        deb http://archive.ubuntu.com/ubuntu precise-updates universe
        ## N.B. software from this repository is ENTIRELY UNSUPPORTED by the Ubuntu
        ## team, and may not be under a free licence. Please satisfy yourself as to
        ## your rights to use the software. Also, please note that software in
        ## multiverse WILL NOT receive any review or updates from the Ubuntu
        ## security team.
        deb http://archive.ubuntu.com/ubuntu precise multiverse
        deb http://archive.ubuntu.com/ubuntu precise-updates multiverse
        ## N.B. software from this repository may not have been tested as
        ## extensively as that contained in the main release, although it includes
        ## newer versions of some applications which may provide useful features.
        ## Also, please note that software in backports WILL NOT receive any review
        ## or updates from the Ubuntu security team.
        deb http://archive.ubuntu.com/ubuntu precise-backports main restricted universe multiverse commercial
        deb-src http://archive.ubuntu.com/ubuntu precise-backports main restricted universe multiverse commercial #Added by software-properties
        deb http://archive.ubuntu.com/ubuntu precise-security main restricted commercial
        deb-src http://archive.ubuntu.com/ubuntu precise-security restricted main commercial multiverse universe #Added by software-properties
        deb http://archive.ubuntu.com/ubuntu precise-security universe
        deb http://archive.ubuntu.com/ubuntu precise-security multiverse
        ## Uncomment the following two lines to add software from Canonical's
        ## 'partner' repository.
        ## This software is not part of Ubuntu, but is offered by Canonical and the
        ## respective vendors as a service to Ubuntu users.
        deb http://archive.canonical.com/ubuntu oneiric partner
        deb-src http://archive.canonical.com/ubuntu precise partner
        ## This software is not part of Ubuntu, but is offered by third-party
        ## developers who want to ship their latest software.
        deb http://extras.ubuntu.com/ubuntu precise main
        deb-src http://extras.ubuntu.com/ubuntu precise main
        ## This is a 3rd party script to install and update Oracle Java
        deb http://www.duinsoft.nl/pkg debs all
        ## Sun-Java6-JRE
        deb http://security.ubuntu.com/ubuntu hardy-security main multiverse

        ** /etc/apt/sources.list.d/askubuntu-tools-ppa-precise.list:
        deb http://ppa.launchpad.net/askubuntu-tools/ppa/ubuntu precise main
        deb-src http://ppa.launchpad.net/askubuntu-tools/ppa/ubuntu precise main

        ** /etc/apt/sources.list.d/askubuntu-tools-ppa-precise.list.save:
        deb http://ppa.launchpad.net/askubuntu-tools/ppa/ubuntu precise main
        deb-src http://ppa.launchpad.net/askubuntu-tools/ppa/ubuntu precise main

        ** /etc/apt/sources.list.d/effie-jayx-turpial-oneiric.list:
        deb http://ppa.launchpad.net/effie-jayx/turpial/ubuntu precise main # disabled on upgrade to precise
        deb-src http://ppa.launchpad.net/effie-jayx/turpial/ubuntu precise main # disabled on upgrade to precise

        ** /etc/apt/sources.list.d/effie-jayx-turpial-oneiric.list.distUpgrade:
        deb http://ppa.launchpad.net/effie-jayx/turpial/ubuntu oneiric main
        deb-src http://ppa.launchpad.net/effie-jayx/turpial/ubuntu oneiric main

        ** /etc/apt/sources.list.d/effie-jayx-turpial-oneiric.list.save:
        deb http://ppa.launchpad.net/effie-jayx/turpial/ubuntu precise main # disabled on upgrade to precise
        deb-src http://ppa.launchpad.net/effie-jayx/turpial/ubuntu precise main # disabled on upgrade to precise

        ** /etc/apt/sources.list.d/getdeb.list:
        # deb http://archive.getdeb.net/ubuntu oneiric-getdeb apps # disabled on upgrade to precise

        ** /etc/apt/sources.list.d/getdeb.list.distUpgrade:
        deb http://archive.getdeb.net/ubuntu oneiric-getdeb apps

        ** /etc/apt/sources.list.d/getdeb.list.save:
        # deb http://archive.getdeb.net/ubuntu oneiric-getdeb apps # disabled on upgrade to precise

        ** /etc/apt/sources.list.d/hotot-team-ppa-oneiric.list:
        deb http://ppa.launchpad.net/hotot-team/ppa/ubuntu precise main # disabled on upgrade to precise
        deb-src http://ppa.launchpad.net/hotot-team/ppa/ubuntu precise main # disabled on upgrade to precise

        ** /etc/apt/sources.list.d/hotot-team-ppa-oneiric.list.distUpgrade:
        deb http://ppa.launchpad.net/hotot-team/ppa/ubuntu oneiric main
        deb-src http://ppa.launchpad.net/hotot-team/ppa/ubuntu oneiric main

        ** /etc/apt/sources.list.d/hotot-team-ppa-oneiric.list.save:
        deb http://ppa.launchpad.net/hotot-team/ppa/ubuntu precise main # disabled on upgrade to precise
        deb-src http://ppa.launchpad.net/hotot-team/ppa/ubuntu precise main # disabled on upgrade to precise

        ** /etc/apt/sources.list.d/iefremov-ppa-precise.list:
        deb http://ppa.launchpad.net/iefremov/ppa/ubuntu precise main
        deb-src http://ppa.launchpad.net/iefremov/ppa/ubuntu precise main

        ** /etc/apt/sources.list.d/iefremov-ppa-precise.list.save:
        deb http://ppa.launchpad.net/iefremov/ppa/ubuntu precise main
        deb-src http://ppa.launchpad.net/iefremov/ppa/ubuntu precise main

        ** /etc/apt/sources.list.d/jockey.list:
        deb http://www.openprinting.org/download/printdriver/debian/ lsb3.2 main-nonfree # disabled on upgrade to precise

        ** /etc/apt/sources.list.d/jockey.list.distUpgrade:
        deb http://www.openprinting.org/download/printdriver/debian/ lsb3.2 main-nonfree

        ** /etc/apt/sources.list.d/jockey.list.save:
        deb http://www.openprinting.org/download/printdriver/debian/ lsb3.2 main-nonfree # disabled on upgrade to precise

        ** /etc/apt/sources.list.d/plexydesk-plexydesk-dailybuild-precise.list:
        deb http://ppa.launchpad.net/plexydesk/plexydesk-dailybuild/ubuntu precise main
        deb-src http://ppa.launchpad.net/plexydesk/plexydesk-dailybuild/ubuntu precise main

        ** /etc/apt/sources.list.d/plexydesk-plexydesk-dailybuild-precise.list.save:
        deb http://ppa.launchpad.net/plexydesk/plexydesk-dailybuild/ubuntu precise main
        deb-src http://ppa.launchpad.net/plexydesk/plexydesk-dailybuild/ubuntu precise main

        ** /etc/apt/sources.list.d/precise-partner.list:
        deb http://archive.canonical.com/ubuntu precise partner #Added by software-center

        ** /etc/apt/sources.list.d/precise-partner.list.save:
        deb http://archive.canonical.com/ubuntu precise partner #Added by software-center

        ** /etc/apt/sources.list.d/private-ppa.launchpad.net_commercial-ppa-uploaders_crossover-pro_ubuntu.list:
        # deb https://justin-dormandy:[email protected]/commercial-ppa-uploaders/crossover-pro/ubuntu precise main #Added by software-center disabled on upgrade to precise

        ** /etc/apt/sources.list.d/private-ppa.launchpad.net_commercial-ppa-uploaders_crossover-pro_ubuntu.list.distUpgrade:
        cat: /etc/apt/sources.list.d/private-ppa.launchpad.net_commercial-ppa-uploaders_crossover-pro_ubuntu.list.distUpgrade: Permission denied

        ** /etc/apt/sources.list.d/private-ppa.launchpad.net_commercial-ppa-uploaders_crossover-pro_ubuntu.list.save:
        cat: /etc/apt/sources.list.d/private-ppa.launchpad.net_commercial-ppa-uploaders_crossover-pro_ubuntu.list.save: Permission denied

        ** /etc/apt/sources.list.d/screenlets-ppa-precise.list:
        deb http://ppa.launchpad.net/screenlets/ppa/ubuntu precise main
        deb-src http://ppa.launchpad.net/screenlets/ppa/ubuntu precise main

        ** /etc/apt/sources.list.d/screenlets-ppa-precise.list.save:
        deb http://ppa.launchpad.net/screenlets/ppa/ubuntu precise main
        deb-src http://ppa.launchpad.net/screenlets/ppa/ubuntu precise main

        ** /etc/apt/sources.list.d/webupd8team-java-precise.list:
        deb http://ppa.launchpad.net/webupd8team/java/ubuntu precise main
        deb-src http://ppa.launchpad.net/webupd8team/java/ubuntu precise main

        ** /etc/apt/sources.list.d/webupd8team-java-precise.list.save:
        deb http://ppa.launchpad.net/webupd8team/java/ubuntu precise main
        deb-src http://ppa.launchpad.net/webupd8team/java/ubuntu precise main

    Read the article

  • SQL – Migrate Database from SQL Server to NuoDB – A Quick Tutorial

    - by Pinal Dave
    Data is growing exponentially, and every organization with growing data is thinking about the next big innovation in the world of Big Data. Big data is indeed in the future of every organization at some point. Just like every other next big thing, big data has its own challenges and issues. The biggest challenge associated with big data is finding a platform that supports the scalability and growth of the data.

    If you are a regular reader of this blog, you must be familiar with NuoDB. I have been working with NuoDB for a while, and their recent release is the best thus far. NuoDB is an elastically scalable SQL database that can run on local host, datacenter and cloud-based resources. A key feature of the product is that it does not require sharding (read more here). Last week, I was able to install NuoDB in less than 90 seconds and explored their Explorer and Admin sections. You can read about my experiences in these posts:

    SQL - Step by Step Guide to Download and Install NuoDB - Getting Started with NuoDB
    SQL - Quick Start with Admin Sections of NuoDB - Manage NuoDB Database
    SQL - Quick Start with Explorer Sections of NuoDB - Query NuoDB Database

    Many SQL Authority readers have been following me in my journey to evaluate NuoDB. One of the frequently asked questions I've received from you is whether there is any way to migrate data from SQL Server to NuoDB. The fact is that there is indeed a way to do so, and NuoDB provides a fantastic tool which can help users do it. NuoDB Migrator is a command line utility that supports the migration of Microsoft SQL Server, MySQL, Oracle, and PostgreSQL schemas and data to NuoDB. The migration to NuoDB is a three-step process:

    1. NuoDB Migrator generates a schema for a target NuoDB database.
    2. It dumps data from the source database.
    3. It loads data into the target NuoDB database.

    Let's see how we can migrate our data from SQL Server to NuoDB using this simple three-step approach. But before we do that, we will create a sample database in MSSQL, and later we will migrate that same database to NuoDB.

    Setup Step 1: Build sample data

        CREATE DATABASE [Test];
        CREATE TABLE [Department](
            [DepartmentID] [smallint] NOT NULL,
            [Name] VARCHAR(100) NOT NULL,
            [GroupName] VARCHAR(100) NOT NULL,
            [ModifiedDate] [datetime] NOT NULL,
            CONSTRAINT [PK_Department_DepartmentID] PRIMARY KEY CLUSTERED ([DepartmentID] ASC)
        ) ON [PRIMARY];
        INSERT INTO Department
        SELECT * FROM AdventureWorks2012.HumanResources.Department;

    Note that I am using the SQL Server AdventureWorks database to build this sample table, but you can build the sample table any way you prefer.

    Setup Step 2: Install 64-bit Java

    Before you can begin the migration process to NuoDB, make sure you have 64-bit Java installed on your computer. This is because the NuoDB Migrator tool is built in Java. You can download 64-bit Java for Windows, Mac OS X, or Linux from the following link: http://java.com/en/download/manual.jsp. One more thing to remember is to make sure that your environment settings point to your JAVA_HOME directory, or else the tool will not work. Here is how you can do it: go to My Computer >> Right Click >> Select Properties >> Click on Advanced System Settings >> Click on Environment Variables >> Click on New and enter the following values (make sure you enter your own Java installation directory in the Variable Value field):

        Variable Name:  JAVA_HOME
        Variable Value: C:\Program Files\Java\jre7

    Setup Step 3: Install a JDBC driver for SQL Server

    There are two JDBC drivers available for SQL Server. Select the one you prefer to use by following one of the two links below:

    Microsoft JDBC Driver
    jTDS JDBC Driver

    In this example we will be using the jTDS JDBC driver. Once you download the driver, move it to your NuoDB installation folder. In my case, I have moved the JAR file of the driver into the C:\Program Files\NuoDB\tools\migrator\jar folder, as this is my NuoDB installation directory. Now we are all set to start the three-step migration process from SQL Server to NuoDB.

    Migration Step 1: NuoDB Schema Generation

    Here is the command I use to generate a schema of my SQL Server database in NuoDB. First I go to the folder C:\Program Files\NuoDB\tools\migrator\bin and execute the nuodb-migrator.bat file. Note that my database name is 'test'. Additionally, my username and password are also 'test'. You can see that my SQL Server database is running on localhost on port 1433, and the schema of the table is 'dbo'.

        nuodb-migrator schema --source.driver=net.sourceforge.jtds.jdbc.Driver --source.url=jdbc:jtds:sqlserver://localhost:1433/ --source.username=test --source.password=test --source.catalog=test --source.schema=dbo --output.path=/tmp/schema.sql

    The above script will generate a schema of all my SQL Server tables and put it in the file C:\tmp\schema.sql. You can open the schema.sql file and execute it directly in your NuoDB instance. You can follow the link here to see how you can execute a SQL script in NuoDB. Please note that if you have not yet created the schema in the NuoDB database, you should create it before executing this step.

    Migration Step 2: Generate the Dump File of the Data

    Once you have recreated your schema in NuoDB from SQL Server, the next step is very easy. Here we create a CSV-format dump file, which will contain all the data from all the tables of the SQL Server database. The command to do so is very similar to the command above. Be aware that this step may take a bit of time depending on your database size.

        nuodb-migrator dump --source.driver=net.sourceforge.jtds.jdbc.Driver --source.url=jdbc:jtds:sqlserver://localhost:1433/ --source.username=test --source.password=test --source.catalog=test --source.schema=dbo --output.type=csv --output.path=/tmp/dump.cat

    Once the above command has executed successfully, you can find your CSV file in the C:\tmp\ folder. However, you do not have to do anything with it manually. The third and final step will take care of completing the migration process.

    Migration Step 3: Load the Data into NuoDB

    After building the schema and taking a dump of the data, the very next step is essential and crucial: it takes the CSV file and loads it into the NuoDB database.

        nuodb-migrator load --target.url=jdbc:com.nuodb://localhost:48004/mytest --target.schema=dbo --target.username=test --target.password=test --input.path=/tmp/dump.cat

    Please note that in the above script we are now targeting the NuoDB database, which we have already created with the name "MyTest". If the database does not exist, create it manually before executing the above script. I have kept the username and password as "test", but please make sure that you create a more secure password for your database for security reasons.

    Voila! You're Done

    That's it. You are done. It took 3 setup steps and 3 migration steps to migrate your SQL Server database to NuoDB. You can now start exploring the database and build excellent, scale-out applications.
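    Once the load step finishes, a quick way to sanity-check the migrated data is a plain JDBC count query against the new NuoDB database. This is only a sketch: the driver class name is an assumption (use whatever class ships in your NuoDB JDBC jar), while the URL, schema, table and credentials match the load command above.

        import java.sql.Connection;
        import java.sql.DriverManager;
        import java.sql.ResultSet;
        import java.sql.Statement;

        public class NuoDbSmokeTest {
            public static void main(String[] args) throws Exception {
                // Assumed driver class name; adjust to the class in your NuoDB JDBC jar.
                Class.forName("com.nuodb.jdbc.Driver");
                // Same connection URL, schema and credentials as the migration load step.
                try (Connection con = DriverManager.getConnection(
                             "jdbc:com.nuodb://localhost:48004/mytest", "test", "test");
                     Statement st = con.createStatement();
                     ResultSet rs = st.executeQuery("SELECT COUNT(*) FROM dbo.Department")) {
                    if (rs.next()) {
                        System.out.println("Rows migrated into Department: " + rs.getInt(1));
                    }
                }
            }
        }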
    In this blog post, I have done my best to come up with a simple and easy process which you can follow to migrate your app from SQL Server to NuoDB.

    Download NuoDB

    I strongly encourage you to download NuoDB and go through my 3-step migration tutorial from SQL Server to NuoDB. Additionally, here are two very important blog posts from NuoDB CTO Seth Proctor on the concept of Administrative Domains. NuoDB has this concept of an Administrative Domain, which is a collection of hosts that can run one or multiple databases. Each database has its own TEs and SMs, but all are managed within the Admin Console for that particular domain.

    http://www.nuodb.com/techblog/2013/03/11/getting-started-provisioning-a-domain/
    http://www.nuodb.com/techblog/2013/03/14/getting-started-running-a-database/

    Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: Big Data, PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology Tagged: NuoDB

    Read the article

  • How to create a simple adf dashboard application with EJB 3.0

    - by Rodrigues, Raphael
    In this month's Oracle Magazine, Frank Nimphius wrote a very good article about an Oracle ADF Faces dashboard application that supports persistent user personalization. You can read the entire article by clicking here. The idea in that article is to extend the dashboard application. My idea here is to create a similar dashboard application, but instead of the ADF BC model layer I intend to use EJB 3.0. There is just one small trick involved, and I'll show it to you. I'm using the usual Oracle HR schema. The steps are:

    1. Create an ADF Fusion Application with EJB as the model layer.

    2. Generate the entities from tables (I'm using Departments and Employees only).

    3. Create a new session bean. I called it HRSessionEJB.

    4. Create a new method like this:

        public List getAllDepartmentsHavingEmployees() {
            JpaEntityManager jpaEntityManager = (JpaEntityManager) em.getDelegate();
            Query query = jpaEntityManager.createNamedQuery("Departments.allDepartmentsHavingEmployees");
            JavaBeanResult.setQueryResultClass(query, AggregatedDepartment.class);
            return query.getResultList();
        }

    5. In the Departments entity, create a new native query annotation:

        @Entity
        @NamedQueries({
            @NamedQuery(name = "Departments.findAll", query = "select o from Departments o")
        })
        @NamedNativeQueries({
            @NamedNativeQuery(name = "Departments.allDepartmentsHavingEmployees", query = "select e.department_id, d.department_name, sum(e.salary), avg(e.salary), max(e.salary), min(e.salary) from departments d, employees e where d.department_id = e.department_id group by e.department_id, d.department_name")})
        public class Departments implements Serializable {...}

    6. Create a new POJO called AggregatedDepartment:

        package oramag.sample.dashboard.model;

        import java.io.Serializable;
        import java.math.BigDecimal;

        public class AggregatedDepartment implements Serializable {

            @SuppressWarnings("compatibility:5167698678781240729")
            private static final long serialVersionUID = 1L;

            private BigDecimal departmentId;
            private String departmentName;
            private BigDecimal sum;
            private BigDecimal avg;
            private BigDecimal max;
            private BigDecimal min;

            public AggregatedDepartment() {
                super();
            }

            public AggregatedDepartment(BigDecimal departmentId, String departmentName, BigDecimal sum,
                                        BigDecimal avg, BigDecimal max, BigDecimal min) {
                super();
                this.departmentId = departmentId;
                this.departmentName = departmentName;
                this.sum = sum;
                this.avg = avg;
                this.max = max;
                this.min = min;
            }

            public void setDepartmentId(BigDecimal departmentId) { this.departmentId = departmentId; }
            public BigDecimal getDepartmentId() { return departmentId; }
            public void setDepartmentName(String departmentName) { this.departmentName = departmentName; }
            public String getDepartmentName() { return departmentName; }
            public void setSum(BigDecimal sum) { this.sum = sum; }
            public BigDecimal getSum() { return sum; }
            public void setAvg(BigDecimal avg) { this.avg = avg; }
            public BigDecimal getAvg() { return avg; }
            public void setMax(BigDecimal max) { this.max = max; }
            public BigDecimal getMax() { return max; }
            public void setMin(BigDecimal min) { this.min = min; }
            public BigDecimal getMin() { return min; }
        }

    7. Create the utility class JavaBeanResult. The purpose of this class is to configure a native SQL query to return POJOs in a single line of code. Credits: http://onpersistence.blogspot.com.br/2010/07/eclipselink-jpa-native-constructor.html

        package oramag.sample.dashboard.model.util;

        /*******************************************************************************
         * Copyright (c) 2010 Oracle. All rights reserved.
         * This program and the accompanying materials are made available under the
         * terms of the Eclipse Public License v1.0 and Eclipse Distribution License v. 1.0
         * which accompanies this distribution.
         * The Eclipse Public License is available at http://www.eclipse.org/legal/epl-v10.html
         * and the Eclipse Distribution License is available at
         * http://www.eclipse.org/org/documents/edl-v10.php.
         *
         * @author shsmith
         ******************************************************************************/

        import java.lang.reflect.Constructor;
        import java.lang.reflect.InvocationTargetException;
        import java.util.ArrayList;
        import java.util.List;

        import javax.persistence.Query;

        import org.eclipse.persistence.exceptions.ConversionException;
        import org.eclipse.persistence.internal.helper.ConversionManager;
        import org.eclipse.persistence.internal.sessions.AbstractRecord;
        import org.eclipse.persistence.internal.sessions.AbstractSession;
        import org.eclipse.persistence.jpa.JpaHelper;
        import org.eclipse.persistence.queries.DatabaseQuery;
        import org.eclipse.persistence.queries.QueryRedirector;
        import org.eclipse.persistence.sessions.Record;
        import org.eclipse.persistence.sessions.Session;

        /***
         * This class is a simple query redirector that intercepts the result of a
         * native query and builds an instance of the specified JavaBean class from each
         * result row. The order of the selected columns must match the JavaBean class
         * constructor arguments order.
         *
         * To configure a JavaBeanResult on a native SQL query use:
         *   JavaBeanResult.setQueryResultClass(query, SomeBeanClass.class);
         * where query is either a JPA SQL Query or native EclipseLink DatabaseQuery.
         *
         * @author shsmith
         */
        public final class JavaBeanResult implements QueryRedirector {

            private static final long serialVersionUID = 3025874987115503731L;

            protected Class resultClass;

            public static void setQueryResultClass(Query query, Class resultClass) {
                JavaBeanResult javaBeanResult = new JavaBeanResult(resultClass);
                DatabaseQuery databaseQuery = JpaHelper.getDatabaseQuery(query);
                databaseQuery.setRedirector(javaBeanResult);
            }

            public static void setQueryResultClass(DatabaseQuery query, Class resultClass) {
                JavaBeanResult javaBeanResult = new JavaBeanResult(resultClass);
                query.setRedirector(javaBeanResult);
            }

            protected JavaBeanResult(Class resultClass) {
                this.resultClass = resultClass;
            }

            @SuppressWarnings("unchecked")
            public Object invokeQuery(DatabaseQuery query, Record arguments, Session session) {
                List results = new ArrayList();
                try {
                    Constructor[] constructors = resultClass.getDeclaredConstructors();
                    Constructor javaBeanClassConstructor = null; // (Constructor) resultClass.getDeclaredConstructors()[0];
                    Class[] constructorParameterTypes = null;    // javaBeanClassConstructor.getParameterTypes();
                    List<Object[]> rows = (List<Object[]>) query.execute(
                            (AbstractSession) session, (AbstractRecord) arguments);
                    for (Object[] columns : rows) {
                        boolean found = false;
                        for (Constructor constructor : constructors) {
                            javaBeanClassConstructor = constructor;
                            constructorParameterTypes = javaBeanClassConstructor.getParameterTypes();
                            if (columns.length == constructorParameterTypes.length) {
                                found = true;
                                break;
                            }
                            // if (columns.length != constructorParameterTypes.length) {
                            //     throw new ColumnParameterNumberMismatchException(resultClass);
                            // }
                        }
                        if (!found)
                            throw new ColumnParameterNumberMismatchException(resultClass);
                        Object[] constructorArgs = new Object[constructorParameterTypes.length];
                        for (int j = 0; j < columns.length; j++) {
                            Object columnValue = columns[j];
                            Class parameterType = constructorParameterTypes[j];
                            // convert the column value to the correct type--if possible
                            constructorArgs[j] = ConversionManager.getDefaultManager()
                                    .convertObject(columnValue, parameterType);
                        }
                        results.add(javaBeanClassConstructor.newInstance(constructorArgs));
                    }
                } catch (ConversionException e) {
                    throw new ColumnParameterMismatchException(e);
                } catch (IllegalArgumentException e) {
                    throw new ColumnParameterMismatchException(e);
                } catch (InstantiationException e) {
                    throw new ColumnParameterMismatchException(e);
                } catch (IllegalAccessException e) {
                    throw new ColumnParameterMismatchException(e);
                } catch (InvocationTargetException e) {
                    throw new ColumnParameterMismatchException(e);
                }
                return results;
            }

            public final class ColumnParameterMismatchException extends RuntimeException {
                private static final long serialVersionUID = 4752000720859502868L;

                public ColumnParameterMismatchException(Throwable t) {
                    super("Exception while processing query results-ensure column order matches constructor parameter order", t);
                }
            }

            public final class ColumnParameterNumberMismatchException extends RuntimeException {
                private static final long serialVersionUID = 1776794744797667755L;

                public ColumnParameterNumberMismatchException(Class clazz) {
                    super("Number of selected columns does not match number of constructor arguments for: " + clazz.getName());
                }
            }
        }

    8. Create the data control and a JSF or JSPX page.

    9. Drag allDepartmentsHavingEmployees from the data control and drop it onto your page.

    10. Choose Graph > Type: Bar (Normal) > any layout.

    11. In the wizard screen, for the Bars label add sum, avg, max and min; for the X Axis label add departmentName; then click OK.

    12. Run the page; the result is shown below.

    You can download the workspace here. It was built using the latest JDeveloper version, 11.1.2.2.

    Read the article

  • How to Recover From a Virus Infection: 3 Things You Need to Do

    - by Chris Hoffman
    If your computer becomes infected with a virus or another piece of malware, removing the malware from your computer is only the first step. There's more you need to do to ensure you're secure. Note that not every antivirus alert is an actual infection. If your antivirus program catches a virus before it ever gets a chance to run on your computer, you're safe. If it catches the malware later, you have a bigger problem.

    Change Your Passwords

    You've probably used your computer to log into your email, online banking websites, and other important accounts. Assuming you had malware on your computer, the malware could have logged your passwords and uploaded them to a malicious third party. With just your email account, the third party could reset your passwords on other websites and gain access to almost any of your online accounts.

    To prevent this, you'll want to change the passwords for your important accounts — email, online banking, and whatever other important accounts you've logged into from the infected computer. You should probably use another computer that you know is clean to change the passwords, just to be safe. When changing your passwords, consider using a password manager to keep track of strong, unique passwords and two-factor authentication to prevent people from logging into your important accounts even if they know your password. This will help protect you in the future.

    Ensure the Malware Is Actually Removed

    Once malware gets access to your computer and starts running, it has the ability to do many more nasty things to your computer. For example, some malware may install rootkit software and attempt to hide itself from the system. Many types of Trojans also "open the floodgates" after they're running, downloading many different types of malware from malicious web servers to the local system.

    In other words, if your computer was infected, you'll want to take extra precautions. You shouldn't assume it's clean just because your antivirus removed what it found. It's probably a good idea to scan your computer with multiple antivirus products to ensure maximum detection. You may also want to run a bootable antivirus program, which runs outside of Windows. Such bootable antivirus programs will be able to detect rootkits that hide themselves from Windows and even the software running within Windows. avast! offers the ability to quickly create a bootable CD or USB drive for scanning, as do many other antivirus programs.

    You may also want to reinstall Windows (or use the Refresh feature on Windows 8) to get your computer back to a clean state. This is more time-consuming, especially if you don't have good backups and can't get back up and running quickly, but this is the only way you can have 100% confidence that your Windows system isn't infected. It's all a matter of how paranoid you want to be.

    Figure Out How the Malware Arrived

    If your computer became infected, the malware must have arrived somehow. You'll want to examine your computer's security and your habits to prevent more malware from slipping through in the same way. Windows is complex. For example, there are over 50 different types of potentially dangerous file extensions that can contain malware to keep track of. We've tried to cover many of the most important security practices you should be following, but here are some of the more important questions to ask:

    Are you using an antivirus? – If you don't have an antivirus installed, you should. If you have Microsoft Security Essentials (known as Windows Defender on Windows 8), you may want to switch to a different antivirus like the free version of avast!. Microsoft's antivirus product has been doing very poorly in tests.

    Do you have Java installed? – Java is a huge source of security problems. The majority of computers on the Internet have an out-of-date, vulnerable version of Java installed, which would allow malicious websites to install malware on your computer. If you have Java installed, uninstall it. If you actually need Java for something (like Minecraft), at least disable the Java browser plugin. If you're not sure whether you need Java, you probably don't.

    Are any browser plugins out-of-date? – Visit Mozilla's Plugin Check website (yes, it also works in other browsers, not just Firefox) and see if you have any critically vulnerable plugins installed. If you do, ensure you update them — or uninstall them. You probably don't need older plugins like QuickTime or RealPlayer installed on your computer, although Flash is still widely used.

    Are your web browser and operating system set to automatically update? – You should be installing updates for Windows via Windows Update when they appear. Modern web browsers are set to automatically update, so they should be fine — unless you went out of your way to disable automatic updates. Using out-of-date web browsers and Windows versions is dangerous.

    Are you being careful about what you run? – Watch out when downloading software to ensure you don't accidentally click sketchy advertisements and download harmful software. Avoid pirated software that may be full of malware. Don't run programs from email attachments. Be careful about what you run and where you get it from in general.

    If you can't figure out how the malware arrived because everything looks okay, there's not much more you can do. Just try to follow proper security practices. You may also want to keep an extra-close eye on your credit card statement for a while if you did any online shopping recently. As so much malware is now related to organized crime, credit card numbers are a popular target.

    Read the article

  • Axis2 web service behaves differently when tested with a web service client vs. a local test class

    - by Stefano
    Hello, I need to update a facade over some web service proxy classes for a third-party web service and expose it as a service. This is for two reasons: to maintain the same interface for all applications that need to use the system (it is being migrated and there are a few differences in the third-party web service, such as method names), and to expose a simplified interface.

    The third party has provided me with a manual and some pregenerated proxy classes for their service (the Java files say they were generated with Axis2 1.4). I've used NetBeans 6.8 and the Axis2 plugin to create an Axis2 service. This service contains the proxy classes and the facade class, which instantiates the web service proxy and calls its methods; the facade class is exposed as a service. I've used Axis2 1.4 (at the beginning, and later 1.5) and Tomcat 6.0.

    The first test I did was to call the facade methods from inside the project itself, and it worked. Then I created a new project with a JAX-WS web service client to call my class deployed on Axis2. At this point two strange things happened:

    1. The third-party proxy class appeared on the Axis2 services page as if it were a new service (if I try to get its WSDL, Axis raises an error). E.g. the proxy interface is named WebServicesAPI (the Stub is the concrete class), and after the first call to my service I find a new "WebServicesAPI1272968932531_1" service inside Axis. The call obviously fails.

    2. I began to sniff SOAP messages with Wireshark and found that the messages created when using the proxy classes directly from my facade test class differ from the messages created after being deployed on Axis. I noticed they differ in the presence of the SOAP header in the failing message.

    Any help would be greatly appreciated: maybe I messed something up, or there might be some incompatibility or version mismatch? Below I've added the signature of the third-party proxy, its implementation and the different SOAP messages.

        /*
         * WebServicesAPI.java
         * This file was auto-generated from WSDL
         * by the Apache Axis2 version: 1.4 Built on : Apr 26, 2008 (06:24:30 EDT)
         */
        package com.ibm.eci.wsapi;

        public interface WebServicesAPI {

            public com.ibm.eci.wsapi.ArrayOfstring getWorkItemHistory(
                java.lang.String stateKey, java.lang.String logonID, com.ibm.eci.wsapi.RepoItemHandle workItemHandle)
                throws java.rmi.RemoteException, com.ibm.eci.wsapi.ExceptionException0;

            // ...etc

    The concrete class is:

        /**
         * WebServicesAPIStub.java
         *
         * This file was auto-generated from WSDL
         * by the Apache Axis2 version: 1.4 Built on : Apr 26, 2008 (06:24:30 EDT)
         */
        package com.ibm.eci.wsapi;

        /*
         * WebServicesAPIStub java implementation
         */
        public class WebServicesAPIStub extends org.apache.axis2.client.Stub implements WebServicesAPI {

            protected org.apache.axis2.description.AxisOperation[] _operations;

            // ...

            public com.ibm.eci.wsapi.ArrayOfstring getWorkItemHistory(
                java.lang.String stateKey297, java.lang.String logonID298, com.ibm.eci.wsapi.RepoItemHandle workItemHandle299)
                throws java.rmi.RemoteException, com.ibm.eci.wsapi.ExceptionException0 {

                org.apache.axis2.context.MessageContext _messageContext = null;
                try {
                    org.apache.axis2.client.OperationClient _operationClient =
                        _serviceClient.createClient(_operations[0].getName());
                    _operationClient.getOptions().setAction("\"\"");
                    _operationClient.getOptions().setExceptionToBeThrownOnSOAPFault(true);
                    addPropertyToOperationClient(_operationClient,
                        org.apache.axis2.description.WSDL2Constants.ATTR_WHTTP_QUERY_PARAMETER_SEPARATOR, "&");

                    // create a message context
                    _messageContext = new org.apache.axis2.context.MessageContext();

                    // create SOAP envelope with that payload
                    org.apache.axiom.soap.SOAPEnvelope env = null;
                    com.ibm.eci.wsapi.GetWorkItemHistoryE dummyWrappedType = null;
                    env = toEnvelope(getFactory(_operationClient.getOptions().getSoapVersionURI()),
                        stateKey297, logonID298, workItemHandle299, dummyWrappedType,
                        optimizeContent(new javax.xml.namespace.QName("http://wsapi.eci.ibm.com", "getWorkItemHistory")));

                    // adding SOAP soap_headers
                    _serviceClient.addHeadersToEnvelope(env);

                    // set the message context with that soap envelope
                    _messageContext.setEnvelope(env);

                    // add the message contxt to the operation client
                    _operationClient.addMessageContext(_messageContext);

                    // execute the operation client
                    _operationClient.execute(true);

                    org.apache.axis2.context.MessageContext _returnMessageContext =
                        _operationClient.getMessageContext(org.apache.axis2.wsdl.WSDLConstants.MESSAGE_LABEL_IN_VALUE);
                    org.apache.axiom.soap.SOAPEnvelope _returnEnv = _returnMessageContext.getEnvelope();

                    java.lang.Object object = fromOM(_returnEnv.getBody().getFirstElement(),
                        com.ibm.eci.wsapi.GetWorkItemHistoryResponseE.class,
                        getEnvelopeNamespaces(_returnEnv));

                    return getGetWorkItemHistoryResponse_return((com.ibm.eci.wsapi.GetWorkItemHistoryResponseE) object);
                    // ...

    The failing SOAP message (generated by the JAX-WS client call to the Axis-deployed service) is:

        POST /vbr_wsapi/services/WebServicesAPI.Endpoint HTTP/1.1
        Content-Type: text/xml; charset=UTF-8
        SOAPAction: ""
        User-Agent: Axis2
        Host: n0611049:9083
        Transfer-Encoding: chunked

        <?xml version='1.0' encoding='UTF-8'?>
        <soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/">
          <soapenv:Header xmlns:wsa="http://www.w3.org/2005/08/addressing">
            <wsa:To>http://n0611049:9083/vbr_wsapi/services/WebServicesAPI.Endpoint</wsa:To>
            <wsa:MessageID>urn:uuid:A31AD99897F9045E981272964443982</wsa:MessageID>
            <wsa:Action>""</wsa:Action>
          </soapenv:Header>
          <soapenv:Body>
            <ns1:initializeProps xmlns:ns1="http://wsapi.eci.ibm.com">
              <props><val>client.locale=it_IT</val></props>
            </ns1:initializeProps>
          </soapenv:Body>
        </soapenv:Envelope>

        HTTP/1.1 500 Internal Server Error
        Content-Type: text/xml; charset=UTF-8
        Content-Language: en-US
        Transfer-Encoding: chunked
        Connection: Close
        Date: Tue, 04 May 2010 09:16:15 GMT
        Server: WebSphere Application Server/7.0

        <?xml version='1.0' encoding='UTF-8'?>
        <soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/">
          <soapenv:Header xmlns:wsa="http://www.w3.org/2005/08/addressing">
            <wsa:Action>http://www.w3.org/2005/08/addressing/fault</wsa:Action>
            <wsa:RelatesTo>urn:uuid:A31AD99897F9045E981272964443982</wsa:RelatesTo>
            <wsa:FaultDetail>
              <wsa:ProblemAction>
                <wsa:Action>""</wsa:Action>
              </wsa:ProblemAction>
            </wsa:FaultDetail>
          </soapenv:Header>
          <soapenv:Body>
            <soapenv:Fault xmlns:wsa="http://www.w3.org/2005/08/addressing">
              <faultcode>wsa:ActionNotSupported</faultcode>
              <faultstring>The [action] cannot be processed at the receiver.</faultstring>
              <detail />
            </soapenv:Fault>
          </soapenv:Body>
        </soapenv:Envelope>

    The successful call (generated by my local test class, not yet deployed to Axis):

        POST /vbr_wsapi/services/WebServicesAPI.Endpoint HTTP/1.1
        Content-Type: text/xml; charset=UTF-8
        SOAPAction: ""
        User-Agent: Axis2
        Host: n0611049:9083
        Transfer-Encoding: chunked

        <?xml version='1.0' encoding='UTF-8'?>
        <soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/">
          <soapenv:Body>
            <ns1:initializeProps xmlns:ns1="http://wsapi.eci.ibm.com">
              <props><val>client.locale=it_IT</val></props>
            </ns1:initializeProps>
          </soapenv:Body>
        </soapenv:Envelope>

        HTTP/1.1 200 OK
        Content-Type: text/xml; charset=UTF-8
        Content-Language: en-US
        Transfer-Encoding: chunked
        Date: Tue, 04 May 2010 09:40:03 GMT
        Server: WebSphere Application Server/7.0

        <?xml version='1.0' encoding='UTF-8'?>
        <soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/">
          <soapenv:Body>
            <dlwmin:initializePropsResponse xmlns:dlwmin="http://wsapi.eci.ibm.com">
              <return>e0e40cc51ceb0adf96c582bb6e047b3d0f</return>
            </dlwmin:initializePropsResponse>
          </soapenv:Body>
        </soapenv:Envelope>

        POST /vbr_wsapi/services/WebServicesAPI.Endpoint HTTP/1.1
        Content-Type: text/xml; charset=UTF-8
        SOAPAction: ""
        User-Agent: Axis2
        Host: n0611049:9083
        Transfer-Encoding: chunked

        <?xml version='1.0' encoding='UTF-8'?>
        <soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/">
          <soapenv:Body>
            <ns1:logon xmlns:ns1="http://wsapi.eci.ibm.com">
              <stateKey>e0e40cc51ceb0adf96c582bb6e047b3d0f</stateKey>
              <systemID>----</systemID>
              <authBundle>
                <password>-----</password>
                <sealed>false</sealed>
                <username>---</username>
              </authBundle>
            </ns1:logon>
          </soapenv:Body>
        </soapenv:Envelope>

        HTTP/1.1 200 OK
        Content-Type: text/xml; charset=UTF-8
        Content-Language: en-US
        Transfer-Encoding: chunked
        Date: Tue, 04 May 2010 09:40:21 GMT
Server: WebSphere Application Server/7.0 <?xml version='1.0' encoding='UTF-8'?> <soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"> <soapenv:Body> <dlwmin:logonResponse xmlns:dlwmin="http://wsapi.eci.ibm.com"> <return>e0e40cc51ceb0adf96c582bb6e047b3d10</return> </dlwmin:logonResponse> </soapenv:Body> </soapenv:Envelope> ... goes on with other calls
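    The fault header in the failing trace shows the receiver rejecting the WS-Addressing headers (the empty wsa:Action is answered with wsa:ActionNotSupported), and those headers only appear once the facade runs inside the Axis2 web app, where the addressing module is engaged by default. A minimal sketch of one workaround, assuming the generated stub has the usual single-endpoint constructor and the standard _getServiceClient() accessor that Axis2 stubs expose; the class and method names below are only illustrative:

    import org.apache.axis2.addressing.AddressingConstants;
    import org.apache.axis2.client.Options;
    import org.apache.axis2.client.ServiceClient;
    import com.ibm.eci.wsapi.WebServicesAPIStub;

    public class StubFactory {

        // Builds the third-party stub with WS-Addressing disabled for outgoing
        // messages, so no <wsa:To>/<wsa:MessageID>/<wsa:Action> headers are added.
        public static WebServicesAPIStub createWithoutAddressing(String endpoint)
                throws Exception {
            WebServicesAPIStub stub = new WebServicesAPIStub(endpoint);
            ServiceClient client = stub._getServiceClient();
            Options options = client.getOptions();
            options.setProperty(
                    AddressingConstants.DISABLE_ADDRESSING_FOR_OUT_MESSAGES,
                    Boolean.TRUE);
            return stub;
        }
    }

    Alternatively, the addressing module can be removed or disengaged in the server-side axis2.xml so that deployed and local runs build the same envelope.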

    Read the article

  • xslt cookbook example not working

    - by Liza dawson
    Hi I am working on this from xslt cookbook type my.xml <?xml version="1.0" encoding="UTF-8"?> <people> <person name="Al Zehtooney" age="33" sex="m" smoker="no"/> <person name="Brad York" age="38" sex="m" smoker="yes"/> <person name="Charles Xavier" age="32" sex="m" smoker="no"/> <person name="David Williams" age="33" sex="m" smoker="no"/> <person name="Edward Ulster" age="33" sex="m" smoker="yes"/> <person name="Frank Townsend" age="35" sex="m" smoker="no"/> <person name="Greg Sutter" age="40" sex="m" smoker="no"/> <person name="Harry Rogers" age="37" sex="m" smoker="no"/> <person name="John Quincy" age="43" sex="m" smoker="yes"/> <person name="Kent Peterson" age="31" sex="m" smoker="no"/> <person name="Larry Newell" age="23" sex="m" smoker="no"/> <person name="Max Milton" age="22" sex="m" smoker="no"/> <person name="Norman Lamagna" age="30" sex="m" smoker="no"/> <person name="Ollie Kensington" age="44" sex="m" smoker="no"/> <person name="John Frank" age="24" sex="m" smoker="no"/> <person name="Mary Williams" age="33" sex="f" smoker="no"/> <person name="Jane Frank" age="38" sex="f" smoker="yes"/> <person name="Jo Peterson" age="32" sex="f" smoker="no"/> <person name="Angie Frost" age="33" sex="f" smoker="no"/> <person name="Betty Bates" age="33" sex="f" smoker="no"/> <person name="Connie Date" age="35" sex="f" smoker="no"/> <person name="Donna Finster" age="20" sex="f" smoker="no"/> <person name="Esther Gates" age="37" sex="f" smoker="no"/> <person name="Fanny Hill" age="33" sex="f" smoker="yes"/> <person name="Geta Iota" age="27" sex="f" smoker="no"/> <person name="Hillary Johnson" age="22" sex="f" smoker="no"/> <person name="Ingrid Kent" age="21" sex="f" smoker="no"/> <person name="Jill Larson" age="20" sex="f" smoker="no"/> <person name="Kim Mulrooney" age="41" sex="f" smoker="no"/> <person name="Lisa Nevins" age="21" sex="f" smoker="no"/> </people> type generic-attr-to-csv.xslt <xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform" xmlns:csv="http://www.ora.com/XSLTCookbook/namespaces/csv"> <xsl:param name="delimiter" select=" ',' "/> <xsl:output method="text" /> <xsl:strip-space elements="*"/> <xsl:template match="/"> <xsl:for-each select="$columns"> <xsl:value-of select="@name"/> <xsl:if test="position( ) != last( )"> <xsl:value-of select="$delimiter/> </xsl:if> </xsl:for-each> <xsl:text>&#xa;</xsl:text> <xsl:apply-templates/> </xsl:template> <xsl:template match="/*/*"> <xsl:variable name="row" select="."/> <xsl:for-each select="$columns"> <xsl:apply-templates select="$row/@*[local-name(.)=current( )/@attr]" mode="csv:map-value"/> <xsl:if test="position( ) != last( )"> <xsl:value-of select="$delimiter"/> </xsl:if> </xsl:for-each> <xsl:text>&#xa;</xsl:text> </xsl:template> <xsl:template match="@*" mode="map-value"> <xsl:value-of select="."/> </xsl:template> </xsl:stylesheet> type my.xsl <xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform" xmlns:csv="http://www.ora.com/XSLTCookbook/namespaces/csv"> <xsl:import href="generic-attr-to-csv.xslt"/> <!--Defines the mapping from attributes to columns --> <xsl:variable name="columns" select="document('')/*/csv:column"/> <csv:column name="Name" attr="name"/> <csv:column name="Age" attr="age"/> <csv:column name="Gender" attr="sex"/> <csv:column name="Smoker" attr="smoker"/> <!-- Handle custom attribute mappings --> <xsl:template match="@sex" mode="csv:map-value"> <xsl:choose> <xsl:when test=".='m'">male</xsl:when> <xsl:when test=".='f'">female</xsl:when> <xsl:otherwise>error</xsl:otherwise> 
</xsl:choose> </xsl:template> </xsl:stylesheet> using the apache xalan parser D:\Test>java org.apache.xalan.xslt.Process -in my.xml -xsl my.xsl -out my.csv [Fatal Error] generic-attr-to-csv.xslt:15:6: The value of attribute "select" associated with an element type "xsl:v alue-of" must not contain the '<' character. file:///D:/Test/generic-attr-to-csv.xslt; Line #15; Column #6; org.xml.sax.SAXParseException: The value of attribut e "select" associated with an element type "xsl:value-of" must not contain the '<' character. java.lang.NullPointerException at org.apache.xalan.transformer.TransformerImpl.createSerializationHandler(TransformerImpl.java:1171) at org.apache.xalan.transformer.TransformerImpl.createSerializationHandler(TransformerImpl.java:1060) at org.apache.xalan.transformer.TransformerImpl.transform(TransformerImpl.java:1268) at org.apache.xalan.transformer.TransformerImpl.transform(TransformerImpl.java:1251) at org.apache.xalan.xslt.Process.main(Process.java:1048) Exception in thread "main" java.lang.RuntimeException at org.apache.xalan.xslt.Process.doExit(Process.java:1155) at org.apache.xalan.xslt.Process.main(Process.java:1128) Any ideas what am i doing wrong
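    The fatal error reported by Xalan points at line 15 of generic-attr-to-csv.xslt, where the select attribute is never closed: <xsl:value-of select="$delimiter/> is missing the closing quote, so the XML parser keeps reading the attribute value until it reaches the '<' of the next tag. Adding the quote (select="$delimiter") should clear the parse error and the NullPointerException that follows from it. As a side note, here is a small, hypothetical Java sketch that compiles the stylesheet with plain JAXP and prints the exact error location, handy for catching this kind of slip before invoking Xalan from the command line:

    import javax.xml.transform.Templates;
    import javax.xml.transform.TransformerConfigurationException;
    import javax.xml.transform.TransformerFactory;
    import javax.xml.transform.stream.StreamSource;

    public class CheckStylesheet {
        public static void main(String[] args) {
            // my.xsl imports generic-attr-to-csv.xslt, so compiling it parses both files.
            StreamSource xslt = new StreamSource("my.xsl");
            try {
                Templates compiled = TransformerFactory.newInstance().newTemplates(xslt);
                System.out.println("Stylesheet compiled OK: " + compiled);
            } catch (TransformerConfigurationException e) {
                // Reports the message together with the system id, line and column.
                System.err.println("Stylesheet error: " + e.getMessageAndLocation());
            }
        }
    }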

    Read the article

  • Reason for Null pointer Exception

    - by Rahul Varma
    Hi, I cant figure out why my program is showing null pointer exception. Plz help me...Here's the program... public class MusicListActivity extends Activity { List<HashMap<String, String>> songNodeDet = new ArrayList<HashMap<String,String>>(); HashMap<?,?>[] songNodeWeb; XMLRPCClient client; String logInSess; ArrayList<String> paths=new ArrayList<String>(); public ListAdapter adapter ; Object[] websongListObject; List<SongsList> SngList=new ArrayList<SongsList>(); Runnable r; ProgressDialog p; ListView lv; String s; @Override public void onCreate(Bundle si){ super.onCreate(si); setContentView(R.layout.openadiuofile); lv=(ListView)findViewById(R.id.list1); r=new Runnable(){ public void run(){ try{ getSongs(); } catch (MalformedURLException e) { // TODO Auto-generated catch block e.printStackTrace(); } catch (XMLRPCException e) { // TODO Auto-generated catch block e.printStackTrace(); } } }; Thread t=new Thread(r,"background"); t.start(); Log.e("***","process over"); } @Override protected void onResume() { // TODO Auto-generated method stub super.onResume(); } private Runnable returnRes = new Runnable() { @Override public void run() { Log.d("handler","handler"); removeDialog(0); p.dismiss(); list(); } }; public void list() { Log.d("#####","#####"); LayoutInflater inflater=getLayoutInflater(); String[] from={}; int[] n={}; adapter=new SongsAdapter(getApplicationContext(),songNodeDet,R.layout.row,from,n,inflater); lv.setAdapter(adapter);} private Handler handler = new Handler() { public void handleMessage(Message msg){ Log.d("*****","handler"); removeDialog(0); p.dismiss(); } }; public void webObjectList(Object[] imgListObj,String logInSess) throws XMLRPCException{ songNodeWeb = new HashMap<?,?>[imgListObj.length]; if(imgListObj!=null){ Log.e("completed","completed"); for(int i=0;i<imgListObj.length;i++){ //imgListObj.length songNodeWeb[i]=(HashMap<?,?>)imgListObj[i]; String nodeid=(String) songNodeWeb[i].get("nid"); break; Log.e("img",i+"completed"); HashMap<String,String> nData=new HashMap<String,String>(); nData.put("nid",nodeid); Object nodeget=client.call("node.get",logInSess,nodeid); HashMap<?,?> imgNode=(HashMap<?,?>)nodeget; String titleName=(String) imgNode.get("titles"); String movieName=(String) imgNode.get("album"); String singerName=(String) imgNode.get("artist"); nData.put("titles", titleName); nData.put("album", movieName); nData.put("artist", singerName); Object[] imgObject=(Object[])imgNode.get("field_image"); HashMap<?,?>[] imgDetails=new HashMap<?,?>[imgObject.length]; imgDetails[0]=(HashMap<?, ?>)imgObject[0]; String path=(String) imgDetails[0].get("filepath"); if(path.contains(" ")){ path=path.replace(" ", "%20"); } String imgPath="http://www.gorinka.com/"+path; paths.add(imgPath); nData.put("path", imgPath); Log.e("my path",path); String mime=(String)imgDetails[0].get("filemime"); nData.put("mime", mime); SongsList songs=new SongsList(titleName,movieName,singerName,imgPath,imgPath); SngList.add(i,songs); songNodeDet.add(i,nData); } Log.e("paths values",paths.toString()); // return imgNodeDet; handler.sendEmptyMessage(0); } } public void getSongs() throws MalformedURLException, XMLRPCException { String ur="http://www.gorinka.com/?q=services/xmlrpc"; URL u=new URL(ur); client = new XMLRPCClient(u); //Connecting to the website HashMap<?, ?> siteConn =(HashMap<?, ?>) client.call("system.connect"); // Getting initial sessio id String initSess=(String)siteConn.get("sessid"); //Login to the site using session id HashMap<?, ?> logInConn =(HashMap<?, ?>) 
client.call("user.login",initSess,"prakash","stellentsoft2009"); //Getting Login sessid logInSess=(String)logInConn.get("sessid"); websongListObject =(Object[]) client.call("nodetype.get",logInSess,""); webObjectList(websongListObject,logInSess); Log.d("webObjectList","webObjectList"); runOnUiThread(returnRes); } } Here's the Adapter associated... public class SongsAdapter extends SimpleAdapter{ static List<HashMap<String,String>> songsList; Context context; LayoutInflater inflater; public SongsAdapter(Context context,List<HashMap<String,String>> imgListWeb,int layout,String[] from,int[] to,LayoutInflater inflater) { super(context,songsList,layout,from,to); this.songsList=songsList; this.context=context; this.inflater=inflater; // TODO Auto-generated constructor stub } @Override public View getView(int postition,View convertView,ViewGroup parent)throws java.lang.OutOfMemoryError{ try { View v = ((LayoutInflater) inflater).inflate(R.layout.row,null); ImageView images=(ImageView)v.findViewById(R.id.image); TextView tvTitle=(TextView)v.findViewById(R.id.text1); TextView tvAlbum=(TextView)v.findViewById(R.id.text2); TextView tvArtist=(TextView)v.findViewById(R.id.text3); HashMap<String,String> songsHash=songsList.get(postition); String path=songsHash.get("path"); String title=songsHash.get("title"); String album=songsHash.get("album"); String artist=songsHash.get("artist"); String imgPath=path; final ImageView imageView = (ImageView) v.findViewById(R.id.image); AsyncImageLoaderv asyncImageLoader=new AsyncImageLoaderv(); Bitmap cachedImage = asyncImageLoader.loadDrawable(imgPath, new AsyncImageLoaderv.ImageCallback() { public void imageLoaded(Bitmap imageDrawable, String imageUrl) { imageView.setImageBitmap(imageDrawable); } }); imageView.setImageBitmap(cachedImage); tvTitle.setText(title); tvAlbum.setText(album); tvArtist.setText(artist); return v; } catch(Exception e){ Log.e("error",e.toString()); } return null; } public static Bitmap loadImageFromUrl(String url) { InputStream inputStream;Bitmap b; try { inputStream = (InputStream) new URL(url).getContent(); BitmapFactory.Options bpo= new BitmapFactory.Options(); bpo.inSampleSize=2; b=BitmapFactory.decodeStream(inputStream, null,bpo ); return b; } catch (IOException e) { throw new RuntimeException(e); } } } Here is what logcat is showing... 
    04-23 16:02:02.211: ERROR/completed(1450): completed
    04-23 16:02:02.211: ERROR/paths values(1450): []
    04-23 16:02:02.211: DEBUG/*****(1450): handler
    04-23 16:02:02.211: DEBUG/AndroidRuntime(1450): Shutting down VM
    04-23 16:02:02.211: WARN/dalvikvm(1450): threadid=3: thread exiting with uncaught exception (group=0x4001aa28)
    04-23 16:02:02.222: ERROR/AndroidRuntime(1450): Uncaught handler: thread main exiting due to uncaught exception
    04-23 16:02:02.241: DEBUG/webObjectList(1450): webObjectList
    04-23 16:02:02.252: ERROR/AndroidRuntime(1450): java.lang.NullPointerException
    04-23 16:02:02.252: ERROR/AndroidRuntime(1450): at com.stellent.gorinka.MusicListActivity$2.handleMessage(MusicListActivity.java:81)
    04-23 16:02:02.252: ERROR/AndroidRuntime(1450): at android.os.Handler.dispatchMessage(Handler.java:99)
    04-23 16:02:02.252: ERROR/AndroidRuntime(1450): at android.os.Looper.loop(Looper.java:123)
    04-23 16:02:02.252: ERROR/AndroidRuntime(1450): at android.app.ActivityThread.main(ActivityThread.java:4203)
    04-23 16:02:02.252: ERROR/AndroidRuntime(1450): at java.lang.reflect.Method.invokeNative(Native Method)
    04-23 16:02:02.252: ERROR/AndroidRuntime(1450): at java.lang.reflect.Method.invoke(Method.java:521)
    04-23 16:02:02.252: ERROR/AndroidRuntime(1450): at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:791)
    04-23 16:02:02.252: ERROR/AndroidRuntime(1450): at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:549)
    04-23 16:02:02.252: ERROR/AndroidRuntime(1450): at dalvik.system.NativeStart.main(Native Method)
    I have declared the getter and setter methods in a separate class named SongsList. Please help me determine the problem...
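    The trace ends in the anonymous Handler (MusicListActivity$2.handleMessage, line 81), which calls p.dismiss() even though the ProgressDialog field p is never assigned anywhere in the code shown, so p is still null when the handler runs. A second problem is the adapter constructor, which assigns the static field to itself (this.songsList = songsList) instead of keeping the imgListWeb parameter, and passes that same null field to the SimpleAdapter super constructor. A hedged sketch of the two spots, written as fragments of the classes above (assuming imgListWeb really is the list the adapter should hold):

    // In SongsAdapter: keep the list that was passed in, not the null static field.
    public SongsAdapter(Context context, List<HashMap<String, String>> imgListWeb,
            int layout, String[] from, int[] to, LayoutInflater inflater) {
        super(context, imgListWeb, layout, from, to);
        SongsAdapter.songsList = imgListWeb;   // was: this.songsList = songsList;
        this.context = context;
        this.inflater = inflater;
    }

    // In MusicListActivity: only dismiss the progress dialog if one was created and shown.
    private final Handler handler = new Handler() {
        @Override
        public void handleMessage(Message msg) {
            Log.d("*****", "handler");
            removeDialog(0);
            if (p != null && p.isShowing()) {
                p.dismiss();
            }
        }
    };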

    Read the article

  • Referencing CDI producer method result in h:selectOneMenu

    - by user953217
    I have a named session scoped bean CustomerRegistration which has a named producer method getNewCustomer which returns a Customer object. There is also CustomerListProducer class which produces all customers as list from the database. On the selectCustomer.xhtml page the user is then able to select one of the customers and submit the selection to the application which then simply prints out the last name of the selected customer. Now this only works when I reference the selected customer on the facelets page via #{customerRegistration.newCustomer}. When I simply use #{newCustomer} then the output for the last name is null whenever I submit the form. What's going on here? Is this the expected behavior as according to chapter 7.1 Restriction upon bean instantion of JSR-299 spec? It says: ... However, if the application directly instantiates a bean class, instead of letting the container perform instantiation, the resulting instance is not managed by the container and is not a contextual instance as defined by Section 6.5.2, “Contextual instance of a bean”. Furthermore, the capabilities listed in Section 2.1, “Functionality provided by the container to the bean” will not be available to that particular instance. In a deployed application, it is the container that is responsible for instantiating beans and initializing their dependencies. ... Here's the code: Customer.java: @javax.persistence.Entity @Veto public class Customer implements Serializable, Entity { private static final long serialVersionUID = 122193054725297662L; @Column(name = "first_name") private String firstName; @Column(name = "last_name") private String lastName; @Id @GeneratedValue() private Long id; public String getFirstName() { return firstName; } public void setFirstName(String firstName) { this.firstName = firstName; } public String getLastName() { return lastName; } public void setLastName(String lastName) { this.lastName = lastName; } @Override public String toString() { return firstName + ", " + lastName; } @Override public Long getId() { return this.id; } } CustomerListProducer.java: @SessionScoped public class CustomerListProducer implements Serializable { @Inject private EntityManager em; private List<Customer> customers; @Inject @Category("helloworld_as7") Logger log; // @Named provides access the return value via the EL variable name // "members" in the UI (e.g., // Facelets or JSP view) @Produces @Named public List<Customer> getCustomers() { return customers; } public void onCustomerListChanged( @Observes(notifyObserver = Reception.IF_EXISTS) final Customer customer) { // retrieveAllCustomersOrderedByName(); log.info(customer.toString()); } @PostConstruct public void retrieveAllCustomersOrderedByName() { CriteriaBuilder cb = em.getCriteriaBuilder(); CriteriaQuery<Customer> criteria = cb.createQuery(Customer.class); Root<Customer> customer = criteria.from(Customer.class); // Swap criteria statements if you would like to try out type-safe // criteria queries, a new // feature in JPA 2.0 // criteria.select(member).orderBy(cb.asc(member.get(Member_.name))); criteria.select(customer).orderBy(cb.asc(customer.get("lastName"))); customers = em.createQuery(criteria).getResultList(); } } CustomerRegistration.java: @Named @SessionScoped public class CustomerRegistration implements Serializable { @Inject @Category("helloworld_as7") private Logger log; private Customer newCustomer; @Produces @Named public Customer getNewCustomer() { return newCustomer; } public void selected() { log.info("Customer " + 
newCustomer.getLastName() + " ausgewählt."); } @PostConstruct public void initNewCustomer() { newCustomer = new Customer(); } public void setNewCustomer(Customer newCustomer) { this.newCustomer = newCustomer; } } not working selectCustomer.xhtml: <?xml version="1.0" encoding="UTF-8"?> <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd"> <html xmlns="http://www.w3.org/1999/xhtml" xmlns:f="http://java.sun.com/jsf/core" xmlns:h="http://java.sun.com/jsf/html" xmlns:ui="http://java.sun.com/jsf/facelets"> <h:head> <title>Auswahl</title> </h:head> <h:body> <h:form> <h:selectOneMenu value="#{newCustomer}" converter="customerConverter"> <f:selectItems value="#{customers}" var="current" itemLabel="#{current.firstName}, #{current.lastName}" /> </h:selectOneMenu> <h:panelGroup id="auswahl"> <h:outputText value="#{newCustomer.lastName}" /> </h:panelGroup> <h:commandButton value="Klick" action="#{customerRegistration.selected}" /> </h:form> </h:body> </html> working selectCustomer.xhtml: <?xml version="1.0" encoding="UTF-8"?> <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd"> <html xmlns="http://www.w3.org/1999/xhtml" xmlns:f="http://java.sun.com/jsf/core" xmlns:h="http://java.sun.com/jsf/html" xmlns:ui="http://java.sun.com/jsf/facelets"> <h:head> <title>Auswahl</title> </h:head> <h:body> <h:form> <h:selectOneMenu value="#{customerRegistration.newCustomer}" converter="customerConverter"> <f:selectItems value="#{customers}" var="current" itemLabel="#{current.firstName}, #{current.lastName}" /> </h:selectOneMenu> <h:panelGroup id="auswahl"> <h:outputText value="#{newCustomer.lastName}" /> </h:panelGroup> <h:commandButton value="Klick" action="#{customerRegistration.selected}" /> </h:form> </h:body> </html> CustomerConverter.java: @SessionScoped @FacesConverter("customerConverter") public class CustomerConverter implements Converter, Serializable { private static final long serialVersionUID = -6093400626095413322L; @Inject EntityManager entityManager; @Override public Object getAsObject(FacesContext context, UIComponent component, String value) { Long id = Long.valueOf(value); return entityManager.find(Customer.class, id); } @Override public String getAsString(FacesContext context, UIComponent component, Object value) { return ((Customer) value).getId().toString(); } }

    Read the article

  • Using R to Analyze G1GC Log Files

    - by user12620111
N)\\b",e:hljs.IMMEDIATE_RE,r:1},{cN:"identifier",b:"[a-zA-Z.][a-zA-Z0-9._]*\\b",e:hljs.IMMEDIATE_RE,r:0},{cN:"operator",b:"|=||   Using R to Analyze G1GC Log Files   Using R to Analyze G1GC Log Files Introduction Working in Oracle Platform Integration gives an engineer opportunities to work on a wide array of technologies. My team’s goal is to make Oracle applications run best on the Solaris/SPARC platform. When looking for bottlenecks in a modern applications, one needs to be aware of not only how the CPUs and operating system are executing, but also network, storage, and in some cases, the Java Virtual Machine. I was recently presented with about 1.5 GB of Java Garbage First Garbage Collector log file data. If you’re not familiar with the subject, you might want to review Garbage First Garbage Collector Tuning by Monica Beckwith. The customer had been running Java HotSpot 1.6.0_31 to host a web application server. I was told that the Solaris/SPARC server was running a Java process launched using a commmand line that included the following flags: -d64 -Xms9g -Xmx9g -XX:+UseG1GC -XX:MaxGCPauseMillis=200 -XX:InitiatingHeapOccupancyPercent=80 -XX:PermSize=256m -XX:MaxPermSize=256m -XX:+PrintGC -XX:+PrintGCTimeStamps -XX:+PrintHeapAtGC -XX:+PrintGCDateStamps -XX:+PrintFlagsFinal -XX:+DisableExplicitGC -XX:+UnlockExperimentalVMOptions -XX:ParallelGCThreads=8 Several sources on the internet indicate that if I were to print out the 1.5 GB of log files, it would require enough paper to fill the bed of a pick up truck. Of course, it would be fruitless to try to scan the log files by hand. Tools will be required to summarize the contents of the log files. Others have encountered large Java garbage collection log files. There are existing tools to analyze the log files: IBM’s GC toolkit The chewiebug GCViewer gchisto HPjmeter Instead of using one of the other tools listed, I decide to parse the log files with standard Unix tools, and analyze the data with R. Data Cleansing The log files arrived in two different formats. I guess that the difference is that one set of log files was generated using a more verbose option, maybe -XX:+PrintHeapAtGC, and the other set of log files was generated without that option. Format 1 In some of the log files, the log files with the less verbose format, a single trace, i.e. the report of a singe garbage collection event, looks like this: {Heap before GC invocations=12280 (full 61): garbage-first heap total 9437184K, used 7499918K [0xfffffffd00000000, 0xffffffff40000000, 0xffffffff40000000) region size 4096K, 1 young (4096K), 0 survivors (0K) compacting perm gen total 262144K, used 144077K [0xffffffff40000000, 0xffffffff50000000, 0xffffffff50000000) the space 262144K, 54% used [0xffffffff40000000, 0xffffffff48cb3758, 0xffffffff48cb3800, 0xffffffff50000000) No shared spaces configured. 2014-05-14T07:24:00.988-0700: 60586.353: [GC pause (young) 7324M->7320M(9216M), 0.1567265 secs] Heap after GC invocations=12281 (full 61): garbage-first heap total 9437184K, used 7496533K [0xfffffffd00000000, 0xffffffff40000000, 0xffffffff40000000) region size 4096K, 0 young (0K), 0 survivors (0K) compacting perm gen total 262144K, used 144077K [0xffffffff40000000, 0xffffffff50000000, 0xffffffff50000000) the space 262144K, 54% used [0xffffffff40000000, 0xffffffff48cb3758, 0xffffffff48cb3800, 0xffffffff50000000) No shared spaces configured. 
} A simple grep can be used to extract a summary: $ grep "\[ GC pause (young" g1gc.log 2014-05-13T13:24:35.091-0700: 3.109: [GC pause (young) 20M->5029K(9216M), 0.0146328 secs] 2014-05-13T13:24:35.440-0700: 3.459: [GC pause (young) 9125K->6077K(9216M), 0.0086723 secs] 2014-05-13T13:24:37.581-0700: 5.599: [GC pause (young) 25M->8470K(9216M), 0.0203820 secs] 2014-05-13T13:24:42.686-0700: 10.704: [GC pause (young) 44M->15M(9216M), 0.0288848 secs] 2014-05-13T13:24:48.941-0700: 16.958: [GC pause (young) 51M->20M(9216M), 0.0491244 secs] 2014-05-13T13:24:56.049-0700: 24.066: [GC pause (young) 92M->26M(9216M), 0.0525368 secs] 2014-05-13T13:25:34.368-0700: 62.383: [GC pause (young) 602M->68M(9216M), 0.1721173 secs] But that format wasn't easily read into R, so I needed to be a bit more tricky. I used the following Unix command to create a summary file that was easy for R to read. $ echo "SecondsSinceLaunch BeforeSize AfterSize TotalSize RealTime" $ grep "\[GC pause (young" g1gc.log | grep -v mark | sed -e 's/[A-SU-z\(\),]/ /g' -e 's/->/ /' -e 's/: / /g' | more SecondsSinceLaunch BeforeSize AfterSize TotalSize RealTime 2014-05-13T13:24:35.091-0700 3.109 20 5029 9216 0.0146328 2014-05-13T13:24:35.440-0700 3.459 9125 6077 9216 0.0086723 2014-05-13T13:24:37.581-0700 5.599 25 8470 9216 0.0203820 2014-05-13T13:24:42.686-0700 10.704 44 15 9216 0.0288848 2014-05-13T13:24:48.941-0700 16.958 51 20 9216 0.0491244 2014-05-13T13:24:56.049-0700 24.066 92 26 9216 0.0525368 2014-05-13T13:25:34.368-0700 62.383 602 68 9216 0.1721173 Format 2 In some of the log files, the log files with the more verbose format, a single trace, i.e. the report of a singe garbage collection event, was more complicated than Format 1. Here is a text file with an example of a single G1GC trace in the second format. As you can see, it is quite complicated. It is nice that there is so much information available, but the level of detail can be overwhelming. I wrote this awk script (download) to summarize each trace on a single line. #!/usr/bin/env awk -f BEGIN { printf("SecondsSinceLaunch IncrementalCount FullCount UserTime SysTime RealTime BeforeSize AfterSize TotalSize\n") } ###################### # Save count data from lines that are at the start of each G1GC trace. 
# Each trace starts out like this: # {Heap before GC invocations=14 (full 0): # garbage-first heap total 9437184K, used 325496K [0xfffffffd00000000, 0xffffffff40000000, 0xffffffff40000000) ###################### /{Heap.*full/{ gsub ( "\\)" , "" ); nf=split($0,a,"="); split(a[2],b," "); getline; if ( match($0, "first") ) { G1GC=1; IncrementalCount=b[1]; FullCount=substr( b[3], 1, length(b[3])-1 ); } else { G1GC=0; } } ###################### # Pull out time stamps that are in lines with this format: # 2014-05-12T14:02:06.025-0700: 94.312: [GC pause (young), 0.08870154 secs] ###################### /GC pause/ { DateTime=$1; SecondsSinceLaunch=substr($2, 1, length($2)-1); } ###################### # Heap sizes are in lines that look like this: # [ 4842M->4838M(9216M)] ###################### /\[ .*]$/ { gsub ( "\\[" , "" ); gsub ( "\ \]" , "" ); gsub ( "->" , " " ); gsub ( "\\( " , " " ); gsub ( "\ \)" , " " ); split($0,a," "); if ( split(a[1],b,"M") > 1 ) {BeforeSize=b[1]*1024;} if ( split(a[1],b,"K") > 1 ) {BeforeSize=b[1];} if ( split(a[2],b,"M") > 1 ) {AfterSize=b[1]*1024;} if ( split(a[2],b,"K") > 1 ) {AfterSize=b[1];} if ( split(a[3],b,"M") > 1 ) {TotalSize=b[1]*1024;} if ( split(a[3],b,"K") > 1 ) {TotalSize=b[1];} } ###################### # Emit an output line when you find input that looks like this: # [Times: user=1.41 sys=0.08, real=0.24 secs] ###################### /\[Times/ { if (G1GC==1) { gsub ( "," , "" ); split($2,a,"="); UserTime=a[2]; split($3,a,"="); SysTime=a[2]; split($4,a,"="); RealTime=a[2]; print DateTime,SecondsSinceLaunch,IncrementalCount,FullCount,UserTime,SysTime,RealTime,BeforeSize,AfterSize,TotalSize; G1GC=0; } } The resulting summary is about 25X smaller that the original file, but still difficult for a human to digest. SecondsSinceLaunch IncrementalCount FullCount UserTime SysTime RealTime BeforeSize AfterSize TotalSize ... 2014-05-12T18:36:34.669-0700: 3985.744 561 0 0.57 0.06 0.16 1724416 1720320 9437184 2014-05-12T18:36:34.839-0700: 3985.914 562 0 0.51 0.06 0.19 1724416 1720320 9437184 2014-05-12T18:36:35.069-0700: 3986.144 563 0 0.60 0.04 0.27 1724416 1721344 9437184 2014-05-12T18:36:35.354-0700: 3986.429 564 0 0.33 0.04 0.09 1725440 1722368 9437184 2014-05-12T18:36:35.545-0700: 3986.620 565 0 0.58 0.04 0.17 1726464 1722368 9437184 2014-05-12T18:36:35.726-0700: 3986.801 566 0 0.43 0.05 0.12 1726464 1722368 9437184 2014-05-12T18:36:35.856-0700: 3986.930 567 0 0.30 0.04 0.07 1726464 1723392 9437184 2014-05-12T18:36:35.947-0700: 3987.023 568 0 0.61 0.04 0.26 1727488 1723392 9437184 2014-05-12T18:36:36.228-0700: 3987.302 569 0 0.46 0.04 0.16 1731584 1724416 9437184 Reading the Data into R Once the GC log data had been cleansed, either by processing the first format with the shell script, or by processing the second format with the awk script, it was easy to read the data into R. g1gc.df = read.csv("summary.txt", row.names = NULL, stringsAsFactors=FALSE,sep="") str(g1gc.df) ## 'data.frame': 8307 obs. of 10 variables: ## $ row.names : chr "2014-05-12T14:00:32.868-0700:" "2014-05-12T14:00:33.179-0700:" "2014-05-12T14:00:33.677-0700:" "2014-05-12T14:00:35.538-0700:" ... ## $ SecondsSinceLaunch: num 1.16 1.47 1.97 3.83 6.1 ... ## $ IncrementalCount : int 0 1 2 3 4 5 6 7 8 9 ... ## $ FullCount : int 0 0 0 0 0 0 0 0 0 0 ... ## $ UserTime : num 0.11 0.05 0.04 0.21 0.08 0.26 0.31 0.33 0.34 0.56 ... ## $ SysTime : num 0.04 0.01 0.01 0.05 0.01 0.06 0.07 0.06 0.07 0.09 ... ## $ RealTime : num 0.02 0.02 0.01 0.04 0.02 0.04 0.05 0.04 0.04 0.06 ... 
## $ BeforeSize : int 8192 5496 5768 22528 24576 43008 34816 53248 55296 93184 ... ## $ AfterSize : int 1400 1672 2557 4907 7072 14336 16384 18432 19456 21504 ... ## $ TotalSize : int 9437184 9437184 9437184 9437184 9437184 9437184 9437184 9437184 9437184 9437184 ... head(g1gc.df) ## row.names SecondsSinceLaunch IncrementalCount ## 1 2014-05-12T14:00:32.868-0700: 1.161 0 ## 2 2014-05-12T14:00:33.179-0700: 1.472 1 ## 3 2014-05-12T14:00:33.677-0700: 1.969 2 ## 4 2014-05-12T14:00:35.538-0700: 3.830 3 ## 5 2014-05-12T14:00:37.811-0700: 6.103 4 ## 6 2014-05-12T14:00:41.428-0700: 9.720 5 ## FullCount UserTime SysTime RealTime BeforeSize AfterSize TotalSize ## 1 0 0.11 0.04 0.02 8192 1400 9437184 ## 2 0 0.05 0.01 0.02 5496 1672 9437184 ## 3 0 0.04 0.01 0.01 5768 2557 9437184 ## 4 0 0.21 0.05 0.04 22528 4907 9437184 ## 5 0 0.08 0.01 0.02 24576 7072 9437184 ## 6 0 0.26 0.06 0.04 43008 14336 9437184 Basic Statistics Once the data has been read into R, simple statistics are very easy to generate. All of the numbers from high school statistics are available via simple commands. For example, generate a summary of every column: summary(g1gc.df) ## row.names SecondsSinceLaunch IncrementalCount FullCount ## Length:8307 Min. : 1 Min. : 0 Min. : 0.0 ## Class :character 1st Qu.: 9977 1st Qu.:2048 1st Qu.: 0.0 ## Mode :character Median :12855 Median :4136 Median : 12.0 ## Mean :12527 Mean :4156 Mean : 31.6 ## 3rd Qu.:15758 3rd Qu.:6262 3rd Qu.: 61.0 ## Max. :55484 Max. :8391 Max. :113.0 ## UserTime SysTime RealTime BeforeSize ## Min. :0.040 Min. :0.0000 Min. : 0.0 Min. : 5476 ## 1st Qu.:0.470 1st Qu.:0.0300 1st Qu.: 0.1 1st Qu.:5137920 ## Median :0.620 Median :0.0300 Median : 0.1 Median :6574080 ## Mean :0.751 Mean :0.0355 Mean : 0.3 Mean :5841855 ## 3rd Qu.:0.920 3rd Qu.:0.0400 3rd Qu.: 0.2 3rd Qu.:7084032 ## Max. :3.370 Max. :1.5600 Max. :488.1 Max. :8696832 ## AfterSize TotalSize ## Min. : 1380 Min. :9437184 ## 1st Qu.:5002752 1st Qu.:9437184 ## Median :6559744 Median :9437184 ## Mean :5785454 Mean :9437184 ## 3rd Qu.:7054336 3rd Qu.:9437184 ## Max. :8482816 Max. :9437184 Q: What is the total amount of User CPU time spent in garbage collection? sum(g1gc.df$UserTime) ## [1] 6236 As you can see, less than two hours of CPU time was spent in garbage collection. Is that too much? To find the percentage of time spent in garbage collection, divide the number above by total_elapsed_time*CPU_count. In this case, there are a lot of CPU’s and it turns out the the overall amount of CPU time spent in garbage collection isn’t a problem when viewed in isolation. When calculating rates, i.e. events per unit time, you need to ask yourself if the rate is homogenous across the time period in the log file. Does the log file include spikes of high activity that should be separately analyzed? Averaging in data from nights and weekends with data from business hours may alias problems. If you have a reason to suspect that the garbage collection rates include peaks and valleys that need independent analysis, see the “Time Series” section, below. Q: How much garbage is collected on each pass? The amount of heap space that is recovered per GC pass is surprisingly low: At least one collection didn’t recover any data. (“Min.=0”) 25% of the passes recovered 3MB or less. (“1st Qu.=3072”) Half of the GC passes recovered 4MB or less. (“Median=4096”) The average amount recovered was 56MB. (“Mean=56390”) 75% of the passes recovered 36MB or less. (“3rd Qu.=36860”) At least one pass recovered 2GB. 
(“Max.=2121000”) g1gc.df$Delta = g1gc.df$BeforeSize - g1gc.df$AfterSize summary(g1gc.df$Delta) ## Min. 1st Qu. Median Mean 3rd Qu. Max. ## 0 3070 4100 56400 36900 2120000 Q: What is the maximum User CPU time for a single collection? The worst garbage collection (“Max.”) is many standard deviations away from the mean. The data appears to be right skewed. summary(g1gc.df$UserTime) ## Min. 1st Qu. Median Mean 3rd Qu. Max. ## 0.040 0.470 0.620 0.751 0.920 3.370 sd(g1gc.df$UserTime) ## [1] 0.3966 Basic Graphics Once the data is in R, it is trivial to plot the data with formats including dot plots, line charts, bar charts (simple, stacked, grouped), pie charts, boxplots, scatter plots histograms, and kernel density plots. Histogram of User CPU Time per Collection I don't think that this graph requires any explanation. hist(g1gc.df$UserTime, main="User CPU Time per Collection", xlab="Seconds", ylab="Frequency") Box plot to identify outliers When the initial data is viewed with a box plot, you can see the one crazy outlier in the real time per GC. Save this data point for future analysis and drop the outlier so that it’s not throwing off our statistics. Now the box plot shows many outliers, which will be examined later, using times series analysis. Notice that the scale of the x-axis changes drastically once the crazy outlier is removed. par(mfrow=c(2,1)) boxplot(g1gc.df$UserTime,g1gc.df$SysTime,g1gc.df$RealTime, main="Box Plot of Time per GC\n(dominated by a crazy outlier)", names=c("usr","sys","elapsed"), xlab="Seconds per GC", ylab="Time (Seconds)", horizontal = TRUE, outcol="red") crazy.outlier.df=g1gc.df[g1gc.df$RealTime > 400,] g1gc.df=g1gc.df[g1gc.df$RealTime < 400,] boxplot(g1gc.df$UserTime,g1gc.df$SysTime,g1gc.df$RealTime, main="Box Plot of Time per GC\n(crazy outlier excluded)", names=c("usr","sys","elapsed"), xlab="Seconds per GC", ylab="Time (Seconds)", horizontal = TRUE, outcol="red") box(which = "outer", lty = "solid") Here is the crazy outlier for future analysis: crazy.outlier.df ## row.names SecondsSinceLaunch IncrementalCount ## 8233 2014-05-12T23:15:43.903-0700: 20741 8316 ## FullCount UserTime SysTime RealTime BeforeSize AfterSize TotalSize ## 8233 112 0.55 0.42 488.1 8381440 8235008 9437184 ## Delta ## 8233 146432 R Time Series Data To analyze the garbage collection as a time series, I’ll use Z’s Ordered Observations (zoo). 
“zoo is the creator for an S3 class of indexed totally ordered observations which includes irregular time series.” require(zoo) ## Loading required package: zoo ## ## Attaching package: 'zoo' ## ## The following objects are masked from 'package:base': ## ## as.Date, as.Date.numeric head(g1gc.df[,1]) ## [1] "2014-05-12T14:00:32.868-0700:" "2014-05-12T14:00:33.179-0700:" ## [3] "2014-05-12T14:00:33.677-0700:" "2014-05-12T14:00:35.538-0700:" ## [5] "2014-05-12T14:00:37.811-0700:" "2014-05-12T14:00:41.428-0700:" options("digits.secs"=3) times=as.POSIXct( g1gc.df[,1], format="%Y-%m-%dT%H:%M:%OS%z:") g1gc.z = zoo(g1gc.df[,-c(1)], order.by=times) head(g1gc.z) ## SecondsSinceLaunch IncrementalCount FullCount ## 2014-05-12 17:00:32.868 1.161 0 0 ## 2014-05-12 17:00:33.178 1.472 1 0 ## 2014-05-12 17:00:33.677 1.969 2 0 ## 2014-05-12 17:00:35.538 3.830 3 0 ## 2014-05-12 17:00:37.811 6.103 4 0 ## 2014-05-12 17:00:41.427 9.720 5 0 ## UserTime SysTime RealTime BeforeSize AfterSize ## 2014-05-12 17:00:32.868 0.11 0.04 0.02 8192 1400 ## 2014-05-12 17:00:33.178 0.05 0.01 0.02 5496 1672 ## 2014-05-12 17:00:33.677 0.04 0.01 0.01 5768 2557 ## 2014-05-12 17:00:35.538 0.21 0.05 0.04 22528 4907 ## 2014-05-12 17:00:37.811 0.08 0.01 0.02 24576 7072 ## 2014-05-12 17:00:41.427 0.26 0.06 0.04 43008 14336 ## TotalSize Delta ## 2014-05-12 17:00:32.868 9437184 6792 ## 2014-05-12 17:00:33.178 9437184 3824 ## 2014-05-12 17:00:33.677 9437184 3211 ## 2014-05-12 17:00:35.538 9437184 17621 ## 2014-05-12 17:00:37.811 9437184 17504 ## 2014-05-12 17:00:41.427 9437184 28672 Example of Two Benchmark Runs in One Log File The data in the following graph is from a different log file, not the one of primary interest to this article. I’m including this image because it is an example of idle periods followed by busy periods. It would be uninteresting to average the rate of garbage collection over the entire log file period. More interesting would be the rate of garbage collect in the two busy periods. Are they the same or different? Your production data may be similar, for example, bursts when employees return from lunch and idle times on weekend evenings, etc. Once the data is in an R Time Series, you can analyze isolated time windows. Clipping the Time Series data Flashing back to our test case… Viewing the data as a time series is interesting. You can see that the work intensive time period is between 9:00 PM and 3:00 AM. Lets clip the data to the interesting period:     par(mfrow=c(2,1)) plot(g1gc.z$UserTime, type="h", main="User Time per GC\nTime: Complete Log File", xlab="Time of Day", ylab="CPU Seconds per GC", col="#1b9e77") clipped.g1gc.z=window(g1gc.z, start=as.POSIXct("2014-05-12 21:00:00"), end=as.POSIXct("2014-05-13 03:00:00")) plot(clipped.g1gc.z$UserTime, type="h", main="User Time per GC\nTime: Limited to Benchmark Execution", xlab="Time of Day", ylab="CPU Seconds per GC", col="#1b9e77") box(which = "outer", lty = "solid") Cumulative Incremental and Full GC count Here is the cumulative incremental and full GC count. When the line is very steep, it indicates that the GCs are repeating very quickly. Notice that the scale on the Y axis is different for full vs. incremental. plot(clipped.g1gc.z[,c(2:3)], main="Cumulative Incremental and Full GC count", xlab="Time of Day", col="#1b9e77") GC Analysis of Benchmark Execution using Time Series data In the following series of 3 graphs: The “After Size” show the amount of heap space in use after each garbage collection. Many Java objects are still referenced, i.e. 
    alive, during each garbage collection. This may indicate that the application has a memory leak, or may indicate that the application has a very large memory footprint. Typically, an application's memory footprint plateaus in the early stage of execution. One would expect this graph to have a flat top. The steep decline in the heap space may indicate that the application crashed after 2:00. The second graph shows that the outliers in real execution time, discussed above, occur near 2:00, when the Java heap seems to be quite full. The third graph shows that Full GCs are infrequent during the first few hours of execution. The rate of Full GCs (the slope of the cumulative Full GC line) changes near midnight.   plot(clipped.g1gc.z[,c("AfterSize","RealTime","FullCount")], xlab="Time of Day", col=c("#1b9e77","red","#1b9e77"))

    GC Analysis of heap recovered

    Each GC trace includes the amount of heap space in use before and after the individual GC event. During garbage collection, unreferenced objects are identified, the space holding the unreferenced objects is freed, and thus, the difference in before and after usage indicates how much space has been freed. The following box plot and bar chart both demonstrate the same point - the amount of heap space freed per garbage collection is surprisingly low. par(mfrow=c(2,1)) boxplot(as.vector(clipped.g1gc.z$Delta), main="Amount of Heap Recovered per GC Pass", xlab="Size in KB", horizontal = TRUE, col="red") hist(as.vector(clipped.g1gc.z$Delta), main="Amount of Heap Recovered per GC Pass", xlab="Size in KB", breaks=100, col="red") box(which = "outer", lty = "solid") This graph is the most interesting. The dark blue area shows how much heap is occupied by referenced Java objects. This represents memory that holds live data. The red fringe at the top shows how much data was recovered after each garbage collection. barplot(clipped.g1gc.z[,c("AfterSize","Delta")], col=c("#7570b3","#e7298a"), xlab="Time of Day", border=NA) legend("topleft", c("Live Objects","Heap Recovered on GC"), fill=c("#7570b3","#e7298a")) box(which = "outer", lty = "solid")

    When I discuss the data in the log files with the customer, I will ask for an explanation for the large amount of referenced data resident in the Java heap. There are two possibilities:
    1. There is a memory leak and the amount of space required to hold referenced objects will continue to grow, limited only by the maximum heap size. After the maximum heap size is reached, the JVM will throw an "Out of Memory" exception every time that the application tries to allocate a new object. If this is the case, the application needs to be debugged to identify why old objects are referenced when they are no longer needed.
    2. The application has a legitimate requirement to keep a large amount of data in memory. The customer may want to further increase the maximum heap size. Another possible solution would be to partition the application across multiple cluster nodes, where each node has responsibility for managing a unique subset of the data.

    Conclusion

    In conclusion, R is a very powerful tool for the analysis of Java garbage collection log files. The primary difficulty is data cleansing so that information can be read into an R data frame. Once the data has been read into R, a rich set of tools may be used for thorough evaluation.
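    For readers who would rather stay in Java than use the grep/sed pipeline shown in the Data Cleansing section, the following is a minimal, hypothetical sketch that produces the same kind of Format 1 summary (unlike the sed one-liner, it normalizes heap sizes to KB, the same convention the awk script uses for Format 2):

    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Paths;
    import java.util.regex.Matcher;
    import java.util.regex.Pattern;

    public class G1GCFormat1Summary {

        // Matches lines such as:
        // 2014-05-13T13:24:35.091-0700: 3.109: [GC pause (young) 20M->5029K(9216M), 0.0146328 secs]
        private static final Pattern PAUSE = Pattern.compile(
                "^(\\S+): (\\d+\\.\\d+): \\[GC pause \\(young\\) "
                + "(\\d+)([MK])->(\\d+)([MK])\\((\\d+)([MK])\\), (\\d+\\.\\d+) secs\\]");

        private static long toKb(String value, String unit) {
            long v = Long.parseLong(value);
            return "M".equals(unit) ? v * 1024 : v;
        }

        public static void main(String[] args) throws IOException {
            System.out.println("SecondsSinceLaunch BeforeSize AfterSize TotalSize RealTime");
            Files.lines(Paths.get(args[0])).forEach(line -> {
                Matcher m = PAUSE.matcher(line);
                if (m.find()) {
                    // Date stamp, seconds since launch, before/after/total heap (KB), pause time.
                    System.out.println(m.group(1) + " " + m.group(2) + " "
                            + toKb(m.group(3), m.group(4)) + " "
                            + toKb(m.group(5), m.group(6)) + " "
                            + toKb(m.group(7), m.group(8)) + " "
                            + m.group(9));
                }
            });
        }
    }

    Run it as java G1GCFormat1Summary g1gc.log > summary.txt and feed the result to the same read.csv() call shown above.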

    Read the article

  • ReSharper - Possible Null Assignment when using Microsoft.Contracts

    - by HVS
    Is there any way to indicate to ReSharper that a null reference won't occur because of Design-by-Contract Requires checking? For example, the following code will raise the warning (Possible 'null' assignment to entity marked with 'NotNull' attribute) in ReSharper on lines 7 and 8: private Dictionary<string, string> _Lookup = new Dictionary<string, string>(); public void Foo(string s) { Contract.Requires(!String.IsNullOrEmpty(s)); if (_Lookup.ContainsKey(s)) _Lookup.Remove(s); } What is really odd is that if you remove the Contract.Requires(...) line, the ReSharper message goes away. Update I found the solution through ExternalAnnotations which was also mentioned by Mike below. Here's an example of how to do it for a function in Microsoft.Contracts: Create a directory called Microsoft.Contracts under the ExternalAnnotations ReSharper directory. Next, Create a file called Microsoft.Contracts.xml and populate like so: <assembly name="Microsoft.Contracts"> <member name="M:System.Diagnostics.Contracts.Contract.Requires(System.Boolean)"> <attribute ctor="M:JetBrains.Annotations.AssertionMethodAttribute.#ctor"/> <parameter name="condition"> <attribute ctor="M:JetBrains.Annotations.AssertionConditionAttribute.#ctor(JetBrains.Annotations.AssertionConditionType)"> <argument>0</argument> </attribute> </parameter> </member> </assembly> Restart Visual Studio, and the message goes away!

    Read the article

  • Removing and then adding MKAnnotations to a MapView is leaving me with a blank MapView

    - by Chris McCall
    A followup to this quesiton: http://stackoverflow.com/questions/2944581/forcing-reload-of-view-when-returning-from-flipsideview-of-utility-application When returning from the flipside view, I'm calling NSArray *annotations = [NSArray arrayWithArray:[mapView annotations]]; [mapView removeAnnotations:annotations]; To remove all the pins from the map (let me know if this isn't the best way to do this). Then: for(Hole *hole in fetchedObjects) { double latitude = [hole.Latitude doubleValue]; cord.latitude = latitude; double longitude = [hole.Longitude doubleValue]; cord.longitude = longitude; WellPlaceMark *placemark = [[WellPlaceMark alloc] initWithCoordinate:cord withWellType:[NSString stringWithString: hole.WellType]]; [mapView addAnnotation:placemark]; } Plus: - (MKAnnotationView *) mapView:(MKMapView *)mapView viewForAnnotation:(id <MKAnnotation>) annotation{ MKPinAnnotationView *annView=[[MKPinAnnotationView alloc] initWithAnnotation:annotation reuseIdentifier:@"currentloc"]; if ([annotation isKindOfClass:[MKUserLocation class]]) return nil; if([annotation isKindOfClass:[WellPlaceMark class]]) { ... } annView.animatesDrop=FALSE; return annView; } All of this code seems to be called in the debugger, but when it's finished, I'm presented with a blank map. I've tried zooming out and the pins are nowhere. The map loads up fine on the first try, but once I call removeAnnotations, they never return.

    Read the article

  • MKMapView loading all annotation views at once (including those that are outside the current rect)

    - by jmans
    Has anyone else run into this problem? Here's the code: - (MKAnnotationView *)mapView:(MKMapView *)mapView viewForAnnotation:(WWMapAnnotation *)annotation { // Only return an Annotation view for the placemarks. Ignore for the current location--the iPhone SDK will place a blue ball there. NSLog(@"Request for annotation view"); if ([annotation isKindOfClass:[WWMapAnnotation class]]){ MKPinAnnotationView *browse_map_annot_view = (MKPinAnnotationView *)[mapView dequeueReusableAnnotationViewWithIdentifier:@"BrowseMapAnnot"]; if (!browse_map_annot_view) { browse_map_annot_view = [[[MKPinAnnotationView alloc] initWithAnnotation:annotation reuseIdentifier:@"BrowseMapAnnot"] autorelease]; NSLog(@"Creating new annotation view"); } else { NSLog(@"Recycling annotation view"); browse_map_annot_view.annotation = annotation; } ... As soon as the view is displayed, I get 2009-08-05 13:12:03.332 xxx[24308:20b] Request for annotation view 2009-08-05 13:12:03.333 xxx[24308:20b] Creating new annotation view 2009-08-05 13:12:03.333 xxx[24308:20b] Request for annotation view 2009-08-05 13:12:03.333 xxx[24308:20b] Creating new annotation view and on and on, for every annotation (~60) I've added. The map (correctly) only displays the two annotations in the current rect. I am setting the region in viewDidLoad: if (center_point.latitude == 0) { center_point.latitude = 35.785098; center_point.longitude = -78.669899; } if (map_span.latitudeDelta == 0) { map_span.latitudeDelta = .001; map_span.longitudeDelta = .001; } map_region.center = center_point; map_region.span = map_span; NSLog(@"Setting initial map center and region"); [browse_map_view setRegion:map_region animated:NO]; The log entry for the region being set is printed to the console before any annotation views are requested. The problem here is that since all of the annotations are being requested at once, [mapView dequeueReusableAnnotationViewWithIdentifier] does nothing, since there are unique MKAnnotationViews for every annotation on the map. This is leading to memory problems for me. One possible issue is that these annotations are clustered in a pretty small space (~1 mile radius). Although the map is zoomed in pretty tight in viewDidLoad (latitude and longitude delta .001), it still loads all of the annotation views at once. Thanks...

    Read the article

  • Google App Engine: Unit testing concurrent access to memcache

    - by Phuong Nguyen de ManCity fan
    Would you guys show me a way to simulate concurrent access to memcache on Google App Engine? I'm trying with LocalServiceTestHelpers and threads but don't have any luck. Every time I try to access Memcache within a thread, I get this error:

        ApiProxy$CallNotFoundException: The API package 'memcache' or call 'Increment()' was not found

    I guess that the testing library of the GAE SDK tries to mimic the real environment and thus sets up the environment for only one thread (the thread running the test), which cannot be seen by the other threads. Here is a piece of code that can reproduce the problem:

        package org.seamoo.cache.memcacheImpl;

        import org.testng.Assert;
        import org.testng.annotations.AfterMethod;
        import org.testng.annotations.BeforeMethod;
        import org.testng.annotations.Test;

        import com.google.appengine.api.memcache.MemcacheService;
        import com.google.appengine.api.memcache.MemcacheServiceFactory;
        import com.google.appengine.tools.development.testing.LocalMemcacheServiceTestConfig;
        import com.google.appengine.tools.development.testing.LocalServiceTestHelper;

        public class MemcacheTest {
            LocalServiceTestHelper helper;

            public MemcacheTest() {
                LocalMemcacheServiceTestConfig memcacheConfig = new LocalMemcacheServiceTestConfig();
                helper = new LocalServiceTestHelper(memcacheConfig);
            }

            @BeforeMethod
            public void setUp() {
                helper.setUp();
            }

            /**
             * @see LocalServiceTest#tearDown()
             */
            @AfterMethod
            public void tearDown() {
                helper.tearDown();
            }

            @Test
            public void memcacheConcurrentAccess() throws InterruptedException {
                final MemcacheService service = MemcacheServiceFactory.getMemcacheService();
                Runnable runner = new Runnable() {
                    @Override
                    public void run() {
                        service.increment("test-key", 1L, 1L);
                        try {
                            Thread.sleep(200L);
                        } catch (InterruptedException e) {
                            e.printStackTrace();
                        }
                        service.increment("test-key", 1L, 1L);
                    }
                };
                Thread t1 = new Thread(runner);
                Thread t2 = new Thread(runner);
                t1.start();
                t2.start();
                while (t1.isAlive()) {
                    Thread.sleep(100L);
                }
                Assert.assertEquals((Long) (service.get("test-key")), new Long(4L));
            }
        }
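
    For what it's worth, one plausible workaround (an assumption based on LocalServiceTestHelper binding the API environment to the calling thread only) is to capture the ApiProxy environment on the test thread and re-bind it inside each worker thread before it touches Memcache. A minimal sketch, with an illustrative class name:

        import com.google.appengine.api.memcache.MemcacheService;
        import com.google.appengine.api.memcache.MemcacheServiceFactory;
        import com.google.apphosting.api.ApiProxy;

        public class ThreadedMemcacheSketch {
            public static void runConcurrently() throws InterruptedException {
                // Assumes LocalServiceTestHelper.setUp() has already run on this (test) thread.
                final MemcacheService service = MemcacheServiceFactory.getMemcacheService();
                // Capture the environment that setUp() bound to the test thread.
                final ApiProxy.Environment env = ApiProxy.getCurrentEnvironment();

                Runnable runner = new Runnable() {
                    @Override
                    public void run() {
                        // Re-bind the captured environment so the local memcache stub
                        // is visible from this worker thread as well.
                        ApiProxy.setEnvironmentForCurrentThread(env);
                        service.increment("test-key", 1L, 1L);
                    }
                };

                Thread t1 = new Thread(runner);
                Thread t2 = new Thread(runner);
                t1.start();
                t2.start();
                t1.join();
                t2.join();
            }
        }

    With the environment propagated this way, both worker threads should resolve the same local memcache stub that the test thread set up.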

    Read the article

  • iPhone: Remove annotation from MKMapView which is in another view

    - by Nic Hubbard
    I have two views. The first is an MKMapView with some annotations. Clicking a UIButton pushes a second view on the stack. This has a UITableView with a list of annotations which correspond to the map annotations. So, when you click the delete button, how can I call my MKMapView which is in another view, so that I can remove the annotation? My MKMapView is declared in my app delegate, as well as in my current class. I am trying to use the following, but it is not working:

        RideAppDelegate *appDelegate = (RideAppDelegate *)[[UIApplication sharedApplication] delegate];

        Annotation *ano;
        CLLocationCoordinate2D anoPoint;
        anoPoint.latitude = [[eventToDelete valueForKey:@"latitude"] doubleValue];
        anoPoint.longitude = [[eventToDelete valueForKey:@"longitude"] doubleValue];

        ano = [[[Annotation alloc] init] autorelease];
        ano.coordinate = anoPoint;

        [appDelegate.ridesMap removeAnnotation:ano];
        [appDelegate release];

    I must be trying to access the MKMapView of my other view incorrectly?

    Read the article

  • MKAnnotations are being made successfully, however they sometimes fail to render on MKMapView

    - by jtkendall
    I'm working on an iPhone app using the 3.1.3 SDK. My app finds the user's current location, displays it on an MKMapView, and then finds nearby locations and renders them as MKAnnotations. My code is working; however, sometimes the nearby annotations do not appear on the map. They are still being made, as I see the correct data in the console (from NSLog, which runs just after the annotations are made). When it fails is completely random: it could be the 5th time I've hit "Build and Run" for the day, or the 500th. It doesn't appear to have any pattern and doesn't throw any type of error; it just doesn't add the annotations to the MapView. This is the method called for each nearby location to add the MKAnnotation:

        - (void)addPinsWithLocation:(NSDictionary *)spot {
            CLLocationCoordinate2D location;
            location.longitude = [[spot objectForKey:@"spot_longitude"] doubleValue];
            location.latitude = [[spot objectForKey:@"spot_latitude"] doubleValue];

            PlaceMarks *placemark = [[PlaceMarks alloc] initWithCoordinate:location
                                                                     title:[spot objectForKey:@"spot_name"]
                                                                  subtitle:@""];
            NSLog(@"Adding Pin for Location: '%@' at %f, %f", [spot objectForKey:@"spot_name"], location.latitude, location.longitude);
            [mapView addAnnotation:placemark];
        }

    Any ideas on how to get MKAnnotations to always show?

    Read the article

  • iPhone: Use a view as a transparent overlay with closing button

    - by axooh
    I've got a map with three bar buttons for showing different markers on the map. If I click a bar button, the specific markers are shown on the map, which already works great. Now I would like to show a transparent overlay (popup window) with the description of the markers after I click a bar button, with a button to close the overlay again and show the markers (which are set in the background). The function of the bar button:

        - (IBAction)routeTwo:(id)sender {
            // The code for the overlay
            // ...

            // remove any annotations that exist
            [map removeAnnotations:map.annotations];

            // Add any annotations which belongs to route 2
            [map addAnnotation:[self.mapAnnotations objectAtIndex:2]];
            [map addAnnotation:[self.mapAnnotations objectAtIndex:3]];
            [map addAnnotation:[self.mapAnnotations objectAtIndex:4]];
            [map addAnnotation:[self.mapAnnotations objectAtIndex:5]];
        }

    I tried the following possibilities:

    1. Using a modal view controller

        RouteDescriptionViewController *routeDescriptionView = [[RouteDescriptionViewController alloc] init];
        [self presentModalViewController:routeDescriptionView animated:YES];
        [routeDescriptionView release];

    Works great, but the problem is: the map view in the background is not visible anymore (configuring alpha values of the modal view doesn't change anything).

    2. Adding RouteDescriptionView as a subview

        RouteDescriptionViewController *routeDescriptionView = [[RouteDescriptionViewController alloc] init];
        [self.view addSubview:routeDescriptionView.view];
        [routeDescriptionView release];

    Works great as well, but the problem here is: I can't configure a close button on the subview to close/remove the subview (RouteDescriptionView).

    3. Using UIAlertView

    Would work as expected, but the UIAlert is not really customizable and therefore not suitable for my needs. Any ideas how to achieve this?

    Read the article

  • Play! 1.2.5 with mongodb | Model Validation not happening

    - by TGV
    I have a simple User model whose fields are annotated with Play validation annotations and Morphia annotations like below:

        import play.data.validation.*;
        import play.modules.morphia.Model;
        import com.google.code.morphia.annotations.*;

        @Entity
        public class User extends Model {
            @Id
            @Indexed(name = "USERID", unique = true)
            public ObjectId userId;

            @Required
            public String userName;

            @Email
            @Indexed(name = "USEREMAIL", unique = true)
            @Required
            public String userEmail;
        }

    Now I have a service which has a createNewUser method responsible for persisting the data. I have used the Morphia plugin for the DAO support. But the problem is that the User document gets persisted in MongoDB even if userName or userEmail is null. Also, the @Email validation does not happen.

        // Below code is in app/controllers/Application.java
        User a = new User();
        a.userName = "user1";
        // calling bean to create user; userService is in app/service/UserService
        userService.createNewUser(a);

    It does not work even after adding @Valid and a Validation.hasErrors() check. The code below is in app/service/UserService:

        public void createNewUser(@Valid User user) {
            if (Validation.hasErrors()) {
                System.out.println("has errors");
            } else {
                userDao.save(user);
            }
        }
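
    One likely explanation (an assumption, not confirmed in the question): in Play 1.x only controllers are enhanced by the framework, so @Valid on a plain service method such as createNewUser() is never processed, which would explain why invalid documents still get saved. A minimal sketch of validating in the controller action instead; the action name, error handling, and service wiring are illustrative, not taken from the question:

        // Sketch only: app/controllers/Application.java
        import play.data.validation.Valid;
        import play.mvc.Controller;

        public class Application extends Controller {

            // Assumption: the service is instantiated directly; use your own wiring.
            static UserService userService = new UserService();

            public static void createUser(@Valid User user) {
                // Because this is a controller action, Play evaluates the @Required/@Email
                // constraints on 'user' during binding and records any errors.
                if (validation.hasErrors()) {
                    // Invalid data: report it instead of persisting.
                    flash.error("Please correct the user details");
                    // redirect or re-render the form here
                } else {
                    userService.createNewUser(user);
                }
            }
        }

    If the check must stay inside the service, the same constraints can also be run manually through Play's Validation API before calling userDao.save(user).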

    Read the article

  • Welcome Stephen Chin and James Weaver to Oracle!

    - by arungupta
    Stephen Chin and James Weaver, the two JavaFX "rockstar" speakers from the community, are joining Oracle's Java Evangelist Team. Both of them co-authored the recently released book Pro JavaFX 2 and are well known for their passion for promoting JavaFX. This shows Oracle's continued commitment to Java and JavaFX. Jim blogs at javafxpert.com and can be reached at @JavaFXpert. Steve blogs at steveonjava.com and can be reached at @steveonjava. You'll have an opportunity to meet and engage with them at different community-facing activities. Welcome, Stephen and James, to Oracle!

    Read the article

  • ODI and OBIEE 11g Integration

    - by David Allan
    Here we will see some of the connectivity options to OBIEE 11g using the JDBC driver. You'll see, based upon some connection properties, how the physical or presentation layers can be utilized. In the integrator's guide for OBIEE 11g you will find a brief statement indicating that there actually is a JDBC driver for OBIEE. In OBIEE 11g it's now possible to connect directly to the physical layer; Venkat has an informative post on this topic. In ODI 11g the Oracle BI technology is shipped with the product, along with KMs for reverse engineering and for using OBIEE models as a data source.

    When you install OBIEE 11g, a lightweight demonstration application is preinstalled on the server; when you open this in the BI Administration tool, you see the regular three-panel view within the administration tool. To interrogate this system via JDBC (just like ODI does using the KMs) we need a couple of things: the JDBC driver from OBIEE 11g, a Java client program, and the credentials. In my Java client program I want to connect to the OBIEE system; when I connect I can interrogate what the JDBC driver presents for the metadata. The metadata projected via the JDBC connection's DatabaseMetaData changes depending on whether the property NQ_SESSION.SELECTPHYSICAL is set when the Java client connects.

    Let's use the sample app to illustrate. I have a Java client program here that will print out the tables in the DatabaseMetaData; it will also output the catalog and schema. For example, if I execute without any special JDBC properties as follows:

        java -classpath .;%BIHOMEDIR%\clients\bijdbc.jar meta_jdbc oracle.bi.jdbc.AnaJdbcDriver jdbc:oraclebi://localhost:9703/ weblogic mypass

    then I get the following returned, representing the presentation layer (the sample I used is XML, and has no schema):

        Catalog              Schema   Table
        Sample Sales Lite    null     Base Facts
        Sample Sales Lite    null     Calculated Facts
        …
        Sample Targets Lite  null     Base Facts
        …

    Now if I execute with the only difference being the JDBC property NQ_SESSION.SELECTPHYSICAL with the value Yes, then I see a different set of values representing the physical layer in OBIEE:

        java -classpath .;%BIHOMEDIR%\clients\bijdbc.jar meta_jdbc oracle.bi.jdbc.AnaJdbcDriver jdbc:oraclebi://localhost:9703/ weblogic mypass NQ_SESSION.SELECTPHYSICAL=Yes

    The following is returned:

        Catalog                Schema   Table
        Sample App Lite Data   null     D01 Time Day Grain
        Sample App Lite Data   null     F10 Revenue Facts (Order grain)
        …
        System DB (Update me)
        …

    If this were a database system such as Oracle, the catalog value would be the OBIEE database name and the schema would be the Oracle database schema. Other systems which have a real catalog structure, such as SQL Server, would use their catalog value. It's this 'Catalog' and 'Schema' value that is important when integrating OBIEE with ODI. For the demonstration application in OBIEE 11g, the following illustration shows how the information from OBIEE is related via the JDBC driver through to ODI. In the XML example above, within ODI's physical schema definition on the right, we leave the schema blank since the XML data source has no schema. When I did this at first, I left the default value that ODI places in the Schema field, which was '<Undefined>' (like the image below), but this string is actually used in the RKM, so it ended up not finding any tables in this schema! Entering an empty string resolved this. Below we see a regular Oracle database example that has the database, schema, and physical table structure, and how this is defined in ODI.

    Remember back to the physical versus presentation layer usage when we passed the special property: to do this in ODI, the data server has a properties panel where you can define key/value pairs. So if you want to select physical objects from the OBIEE server, you must set this property. An additional change in ODI 11g is the OBIEE connection pool support; this has been implemented via a 'Connection Pool' flex field for the Oracle BI data server. Here you set the connection pool name from the OBIEE system that you specifically want to use; this is used by the Oracle BI to Oracle (DBLINK) LKM, so if you are using that LKM you must set this flex field. Hopefully this gives a useful insight into some of the mechanics of how this hangs together.
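
    As a companion to the command lines above, here is a minimal JDBC sketch of such a client. The driver class, URL, credentials and the NQ_SESSION.SELECTPHYSICAL property are taken from the examples above; treating the session variable as a plain connection property is an assumption on my part, and the class name is illustrative:

        import java.sql.Connection;
        import java.sql.DatabaseMetaData;
        import java.sql.DriverManager;
        import java.sql.ResultSet;
        import java.util.Properties;

        public class ObieeMetadataSketch {
            public static void main(String[] args) throws Exception {
                // Driver shipped in %BIHOMEDIR%\clients\bijdbc.jar
                Class.forName("oracle.bi.jdbc.AnaJdbcDriver");

                Properties props = new Properties();
                props.setProperty("user", "weblogic");
                props.setProperty("password", "mypass");
                // Omit this property to see the presentation layer instead of the physical layer.
                props.setProperty("NQ_SESSION.SELECTPHYSICAL", "Yes");

                Connection conn = DriverManager.getConnection("jdbc:oraclebi://localhost:9703/", props);
                try {
                    DatabaseMetaData md = conn.getMetaData();
                    ResultSet tables = md.getTables(null, null, "%", null);
                    while (tables.next()) {
                        // Catalog / Schema / Table, as in the listings above.
                        System.out.println(tables.getString("TABLE_CAT") + "\t"
                                + tables.getString("TABLE_SCHEM") + "\t"
                                + tables.getString("TABLE_NAME"));
                    }
                } finally {
                    conn.close();
                }
            }
        }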

    Read the article

  • ADF and Oracle E-Business Suite Integration Series Index

    - by Juan Camilo Ruiz
    I'm creating this entry to keep one page that lists all the past and future entries in the series on integrating ADF with Oracle E-Business Suite; from here you can access all the articles and reference information that resides in other places too. This is also the one link I can reference while presenting on this topic. Here is the list of individual entries from the series:

    - ADF and Oracle E-Business Suite Integration Series: Displaying Read-Only EBS data on ADF
    - ADF and Oracle E-Business Suite Integration Series: Displaying Read-Only EBS data on iPad
    - Using the Oracle E-Business Suite SDK for Java on ADF Applications
    - Securing ADF Applications Using the Oracle E-Business Suite SDK JAAS Implementation
    - Debugging ADF Security in JDeveloper 11g
    - Adding a Role to a Responsibility for Use with the Oracle E-Business Suite SDK for Java JAAS Implementation
    - Embedding ADF UI Components into OAF regions

    Bonus Material: Webcast Replays

    - Using Oracle ADF with Oracle E-Business Suite: The Full Integration View
    - Best Practices for Using Oracle E-Business Suite SDK for Java with Oracle ADF

    Documents

    - FAQ for Integration of Oracle E-Business Suite and Oracle Application Development Framework (ADF) Applications (Doc ID 1296491.1)

    Read the article

< Previous Page | 1008 1009 1010 1011 1012 1013 1014 1015 1016 1017 1018 1019  | Next Page >