Search Results

Search found 2322 results on 93 pages for 'jar'.


  • server.sh gives me the following error: Exception in thread "main" java.lang.NoClassDefFoundError:

    - by NickNaik
    I am trying to start Program D from the terminal. Running "sh server.sh" gives me the following error:

      Starting Alicebot Program D.
      Exception in thread "main" java.lang.NoClassDefFoundError: org/alicebot/server/net/AliceServer
      Caused by: java.lang.ClassNotFoundException: org.alicebot.server.net.AliceServer
          at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
          at java.security.AccessController.doPrivileged(Native Method)
          at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
          at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
          at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
          at java.lang.ClassLoader.loadClass(ClassLoader.java:247)

    My server.sh file:

      ALICE_HOME=.
      SERVLET_LIB=lib/servlet.jar
      ALICE_LIB=lib/aliceserver.jar
      JS_LIB=lib/js.jar
      # Set SQL_LIB to the location of your database driver.
      SQL_LIB=lib/mysql_comp.jar
      # These are for Jetty; you will want to change these if you are using a different http server.
      HTTP_SERVER_LIBS=lib/org.mortbay.jetty.jar:lib/javax.xml.jaxp.jar:lib/org.apache.crimson.jar
      PROGRAMD_CLASSPATH=$SERVLET_LIB:$ALICE_LIB:$JS_LIB:$SQL_LIB:$HTTP_SERVER_LIBS
      java -classpath $PROGRAMD_CLASSPATH -Xms64m -Xmx64m org.alicebot.server.net.AliceServer $1
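
    A minimal diagnostic sketch (not from the original post; the class name is taken from the error above, and ClasspathProbe is a hypothetical helper): run it with the same -classpath the script builds to see whether the JVM can actually load AliceServer from those jars, which separates a wrong path from a missing jar.

      // ClasspathProbe.java -- run as: java -classpath $PROGRAMD_CLASSPATH ClasspathProbe
      public class ClasspathProbe {
          public static void main(String[] args) {
              // Print the classpath the JVM actually received
              System.out.println("java.class.path = " + System.getProperty("java.class.path"));
              try {
                  Class<?> c = Class.forName("org.alicebot.server.net.AliceServer");
                  java.security.CodeSource src = c.getProtectionDomain().getCodeSource();
                  System.out.println("Found " + c.getName()
                          + (src != null ? " in " + src.getLocation() : ""));
              } catch (ClassNotFoundException e) {
                  System.out.println("AliceServer is not on this classpath; check that lib/aliceserver.jar exists");
              }
          }
      }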

    Read the article

  • java-maven2: How to include a jar as a dependency in the POM so that I can access its test classes

    - by flavour-of-bru
    Hi, I have a set of functional jars (more than 3) that test my source code. These jars contain only test classes and assisting asserter classes. I am creating a new performance jar that would import all the functional tests from these jars so that they can all be run together. But when I include them as test dependencies in the POM of the current jar, all I can see are the classes from src/main/java. How can I include these functional jars as dependencies so that I can also reference the classes in src/test/java? In other words, how do I reference the test classes in other jars, and what kind of dependency should I declare? Thanks for your support.

    Read the article

  • How to reference/link another project in grails workspace without using jar files?

    - by Ivan Alagenchev
    I have a Grails website that references a core Java application. I have succeeded in adding a .jar dependency to that project; however, the Java project lives in the same workspace as my Grails project, and I would ultimately like to reference that project directly. I don't want the added step of building a new jar file, cleaning, and updating my dependencies every time the Java project changes. I added the Java project to my Grails project's "Java Build Path" and at first everything seemed to work fine, but when I run grails compile, the compiler fails to resolve all imports that point to the Java project. I am using SpringSource Tool Suite as my IDE.

    Read the article

  • Why is the java -Xmx option not working?

    - by Zenofo
    On my Ubuntu 11.10 VPS, before I run the jar file:

      # free -m
                   total       used       free     shared    buffers     cached
      Mem:           256          5        250          0          0          0
      -/+ buffers/cache:          5        250
      Swap:            0          0          0

    I then run a jar file that should be limited to a maximum of 32 MB of memory:

      java -Xms8m -Xmx32m -jar ./my.jar

    Now the memory state is as follows:

      # free -m
                   total       used       free     shared    buffers     cached
      Mem:           256        155        100          0          0          0
      -/+ buffers/cache:        155        100
      Swap:            0          0          0

    This jar is occupying about 150 MB of memory, and I can't run any other java command:

      # java -version
      Error occurred during initialization of VM
      Could not reserve enough space for object heap
      Could not create the Java virtual machine.
      # java -Xmx8m -version
      Error occurred during initialization of VM
      Could not reserve enough space for object heap
      Could not create the Java virtual machine.

    Why does the -Xmx parameter not take effect? How can I limit the memory this jar file uses?
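
    For what it's worth, -Xmx caps only the Java object heap; the JVM process also needs memory for thread stacks, the permanent generation, JIT-compiled code and native libraries, so the resident size reported by free will always be larger than -Xmx. A minimal sketch (not from the post; HeapInfo is a hypothetical class name) that prints the heap limit the running JVM actually adopted:

      // HeapInfo.java -- run as: java -Xms8m -Xmx32m HeapInfo
      public class HeapInfo {
          public static void main(String[] args) {
              Runtime rt = Runtime.getRuntime();
              long mb = 1024 * 1024;
              // maxMemory() reflects the -Xmx cap; the process RSS seen in free is larger
              System.out.println("max heap   = " + rt.maxMemory() / mb + " MB");
              System.out.println("total heap = " + rt.totalMemory() / mb + " MB");
              System.out.println("free heap  = " + rt.freeMemory() / mb + " MB");
          }
      }

    If the reported max heap matches the flag, the option is taking effect, and the extra usage shown by free is JVM overhead rather than heap.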

    Read the article

  • How do I make subsonic (media server) work with SSL?

    - by John Baber
    The roughly out-of-the-box setup as a regular user works fine (meaning the site appears at http://myserver.com:4040). From ps aux:

      java -Xmx100m -Dsubsonic.home=/var/subsonic -Dsubsonic.host=0.0.0.0 -Dsubsonic.port=4040 -Dsubsonic.httpsPort=0 -Dsubsonic.contextPath=/ -Dsubsonic.defaultMusicFolder=/var/music -Dsubsonic.defaultPodcastFolder=/var/music/Podcast -Dsubsonic.defaultPlaylistFolder=/var/playlists -Djava.awt.headless=true -verbose:gc -jar subsonic-booter-jar-with-dependencies.jar

    But just giving it an https port:

      java -Xmx100m -Dsubsonic.home=/var/subsonic -Dsubsonic.host=0.0.0.0 -Dsubsonic.port=4040 -Dsubsonic.httpsPort=6060 -Dsubsonic.contextPath=/ -Dsubsonic.defaultMusicFolder=/var/music -Dsubsonic.defaultPodcastFolder=/var/music/Podcast -Dsubsonic.defaultPlaylistFolder=/var/playlists -Djava.awt.headless=true -verbose:gc -jar subsonic-booter-jar-with-dependencies.jar

    makes http://myserver.com:4040 respond with:

      HTTP ERROR: 404 NOT_FOUND RequestURI=/index.view Powered by jetty://

    and https://myserver.com:6060 say "Unable to connect". The only change I'm making is in /etc/default/subsonic:

      # SUBSONIC_ARGS="--port=80 --https-port=443 --max-memory=120"
      SUBSONIC_ARGS="--max-memory=100 --https-port=6060"

    followed by sudo service subsonic restart (this is Ubuntu Oneiric).

    Read the article

  • How can I run a jar on a domain server?

    - by acuddlyheadcrab
    Sorry, I'm not very experienced with web development, so I don't know the correct names for everything. I only have FTP access to my server, so I'm hoping that's all I'll need. I could do this easily on a Windows computer via .bat files (I know how to use .jar files), but, like I said, I only have FTP access, so I don't know exactly what to do. Would this involve a script? Could I just use a .bat file somehow? Or maybe PHP?

    Read the article

  • Failed to resolve artifact. Missing: ---------- 1) org.codehaus.mojo:gwt-maven-plugin:jar:1.3-SNAPSHOT

    - by karim
    i want to use the addon vaadin Timeline, so i have to make "gwt-maven-plugin 3.1" as i know ,my pom.xml is the following : <?xml version="1.0"?> <project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd"> <modelVersion>4.0.0</modelVersion> <groupId>life</groupId> <artifactId>life</artifactId> <packaging>war</packaging> <name>life Portlet</name> <version>0.0.1-SNAPSHOT</version> <url>http://maven.apache.org</url> <properties> <vaadin-widgets-dir>src/main/webapp/VAADIN/widgetsets</vaadin-widgets-dir> </properties> <build> <plugins> <plugin> <groupId>com.liferay.maven.plugins</groupId> <artifactId>liferay-maven-plugin</artifactId> <version>6.1.0</version> <configuration> <autoDeployDir>${liferay.auto.deploy.dir}</autoDeployDir> <liferayVersion>6.1.0</liferayVersion> <pluginType>portlet</pluginType> </configuration> </plugin> <plugin> <artifactId>maven-compiler-plugin</artifactId> <configuration> <encoding>UTF-8</encoding> <source>1.5</source> <target>1.5</target> </configuration> </plugin> <plugin> <groupId>com.vaadin</groupId> <artifactId>vaadin-maven-plugin</artifactId> <version>1.0.1</version> </plugin> <!-- Compiles your custom GWT components with the GWT compiler --> <plugin> <groupId>org.codehaus.mojo</groupId> <artifactId>gwt-maven-plugin</artifactId> <version>2.1.0-1</version> <configuration> <!-- if you don't specify any modules, the plugin will find them --> <!--modules> .. </modules --> <webappDirectory>${project.build.directory}/${project.build.finalName}/VAADIN/widgetsets</webappDirectory> <extraJvmArgs>-Xmx512M -Xss1024k</extraJvmArgs> <runTarget>clean</runTarget> <hostedWebapp>${project.build.directory}/${project.build.finalName}</hostedWebapp> <noServer>true</noServer> <port>8080</port> <soyc>false</soyc> </configuration> <executions> <execution> <goals> <goal>resources</goal> <goal>compile</goal> </goals> </execution> </executions> </plugin> <!-- Updates Vaadin 6.2+ widgetset definitions based on project dependencies --> <plugin> <groupId>com.vaadin</groupId> <artifactId>vaadin-maven-plugin</artifactId> <version>1.0.1</version> <executions> <execution> <configuration> <!-- if you don't specify any modules, the plugin will find them --> <!-- <modules> <module>${package}.gwt.MyWidgetSet</module> </modules> --> </configuration> <goals> <goal>update-widgetset</goal> </goals> </execution> </executions> </plugin> </plugins> <pluginManagement> <plugins> <!--This plugin's configuration is used to store Eclipse m2e settings only. It has no influence on the Maven build itself. 
--> <plugin> <groupId>org.eclipse.m2e</groupId> <artifactId>lifecycle-mapping</artifactId> <version>1.0.0</version> <configuration> <lifecycleMappingMetadata> <pluginExecutions> <pluginExecution> <pluginExecutionFilter> <groupId> org.codehaus.mojo </groupId> <artifactId> gwt-maven-plugin </artifactId> <versionRange> [2.1.0-1,) </versionRange> <goals> <goal>resources</goal> </goals> </pluginExecutionFilter> <action> <ignore></ignore> </action> </pluginExecution> <pluginExecution> <pluginExecutionFilter> <groupId>com.vaadin</groupId> <artifactId> vaadin-maven-plugin </artifactId> <versionRange> [1.0.1,) </versionRange> <goals> <goal> update-widgetset </goal> </goals> </pluginExecutionFilter> <action> <ignore></ignore> </action> </pluginExecution> </pluginExecutions> </lifecycleMappingMetadata> </configuration> </plugin> </plugins> </pluginManagement> </build> <dependencies> <dependency> <groupId>com.liferay.portal</groupId> <artifactId>portal-service</artifactId> <version>6.1.0</version> <scope>provided</scope> </dependency> <dependency> <groupId>com.liferay.portal</groupId> <artifactId>util-bridges</artifactId> <version>6.1.0</version> <scope>provided</scope> </dependency> <dependency> <groupId>org.vaadin.addons</groupId> <artifactId>vaadin-timeline-agpl-3.0</artifactId> <version>1.2.4</version> </dependency> <dependency> <groupId>com.liferay.portal</groupId> <artifactId>util-taglib</artifactId> <version>6.1.0</version> <scope>provided</scope> </dependency> <dependency> <groupId>com.liferay.portal</groupId> <artifactId>util-java</artifactId> <version>6.1.0</version> <scope>provided</scope> </dependency> <dependency> <groupId>javax.portlet</groupId> <artifactId>portlet-api</artifactId> <version>2.0</version> <scope>provided</scope> </dependency> <dependency> <groupId>javax.servlet</groupId> <artifactId>servlet-api</artifactId> <version>2.4</version> <scope>provided</scope> </dependency> <dependency> <groupId>javax.servlet.jsp</groupId> <artifactId>jsp-api</artifactId> <version>2.0</version> <scope>provided</scope> </dependency> <!-- sqx --> <dependency> <groupId>javax.activation</groupId> <artifactId>activation</artifactId> <version>1.1.1</version> <scope>provided</scope> </dependency> <dependency> <groupId>antlr</groupId> <artifactId>antlr</artifactId> <version>2.7.6</version> <scope>provided</scope> </dependency> <dependency> <groupId>aopalliance</groupId> <artifactId>aopalliance</artifactId> <version>1.0</version> <scope>provided</scope> </dependency> <dependency> <groupId>asm</groupId> <artifactId>asm</artifactId> <version>1.5.3</version> <scope>provided</scope> </dependency> <dependency> <groupId>asm</groupId> <artifactId>asm-attrs</artifactId> <version>1.5.3</version> <scope>provided</scope> </dependency> <dependency> <groupId>org.aspectj</groupId> <artifactId>aspectjrt</artifactId> <version>1.6.8</version> <scope>provided</scope> </dependency> <dependency> <groupId>org.aspectj</groupId> <artifactId>aspectjweaver</artifactId> <version>1.6.8</version> <scope>provided</scope> </dependency> <dependency> <groupId>bsh</groupId> <artifactId>bsh</artifactId> <version>1.3.0</version> <scope>provided</scope> </dependency> <dependency> <groupId>cglib</groupId> <artifactId>cglib</artifactId> <version>2.1_3</version> <scope>provided</scope> </dependency> <dependency> <groupId>commons-collections</groupId> <artifactId>commons-collections</artifactId> <version>3.1</version> <scope>provided</scope> </dependency> <dependency> <groupId>commons-dbcp</groupId> <artifactId>commons-dbcp</artifactId> 
<version>1.3</version> </dependency> <dependency> <groupId>commons-logging</groupId> <artifactId>commons-logging</artifactId> <version>1.1</version> </dependency> <dependency> <groupId>commons-pool</groupId> <artifactId>commons-pool</artifactId> <version>1.5.3</version> </dependency> <dependency> <groupId>dom4j</groupId> <artifactId>dom4j</artifactId> <version>1.6.1</version> </dependency> <dependency> <groupId>net.sf.ehcache</groupId> <artifactId>ehcache</artifactId> <version>1.2.3</version> </dependency> <dependency> <groupId>org.hibernate</groupId> <artifactId>hibernate-core</artifactId> <version>3.3.1.GA</version> </dependency> <dependency> <groupId>hsqldb</groupId> <artifactId>hsqldb</artifactId> <version>1.8.0.10</version> </dependency> <!-- <dependency> <groupId>jboss</groupId> <artifactId>jboss-backport-concurrent</artifactId> <version>2.1.0.GA</version> </dependency> --> <dependency> <groupId>org.slf4j</groupId> <artifactId>slf4j-parent</artifactId> <version>1.5.0</version> </dependency> <dependency> <groupId>javax.jcr</groupId> <artifactId>jcr</artifactId> <version>1.0</version> </dependency> <!-- <dependency> <groupId>javax.sql</groupId> <artifactId>jdbc-stdext</artifactId> <version>2.0</version> </dependency> --> <dependency> <groupId>jdom</groupId> <artifactId>jdom</artifactId> <version>1.0</version> </dependency> <dependency> <groupId>javax.transaction</groupId> <artifactId>jta</artifactId> <version>1.1</version> </dependency> <dependency> <groupId>log4j</groupId> <artifactId>log4j</artifactId> <version>1.2.14</version> </dependency> <dependency> <groupId>javax.mail</groupId> <artifactId>mail</artifactId> <version>1.4.3</version> </dependency> <dependency> <groupId>com.sun.portal.portletcontainer</groupId> <artifactId>container</artifactId> <version>1.1-m4</version> </dependency> <dependency> <groupId>postgresql</groupId> <artifactId>postgresql</artifactId> <version>8.4-702.jdbc3</version> </dependency> <!-- sl4j-api-1.5.0 manquante --> <dependency> <groupId>org.slf4j</groupId> <artifactId>slf4j-parent</artifactId> <version>1.5.0</version> </dependency> <dependency> <groupId>org.slf4j</groupId> <artifactId>slf4j-log4j12</artifactId> <version>1.5.0</version> </dependency> <dependency> <groupId>org.springframework</groupId> <artifactId>spring-aop</artifactId> <version>2.5.6</version> </dependency> <dependency> <groupId>org.springframework</groupId> <artifactId>spring-aspects</artifactId> <version>3.0.3.RELEASE</version> </dependency> <dependency> <groupId>org.springframework</groupId> <artifactId>spring-beans</artifactId> <version>2.5.6</version> </dependency> <dependency> <groupId>org.springframework</groupId> <artifactId>spring-context</artifactId> <version>2.5.6</version> </dependency> <dependency> <groupId>org.springframework</groupId> <artifactId>spring-context-support</artifactId> <version>2.5.6</version> </dependency> <dependency> <groupId>org.springframework</groupId> <artifactId>spring-core</artifactId> <version>2.5.6</version> </dependency> <dependency> <groupId>org.springframework</groupId> <artifactId>spring-jdbc</artifactId> <version>2.5.6</version> </dependency> <dependency> <groupId>org.springframework</groupId> <artifactId>spring-jms</artifactId> <version>2.5.6</version> </dependency> <dependency> <groupId>javax.jms</groupId> <artifactId>jms</artifactId> <version>1.1</version> <scope>compile</scope> </dependency> <!-- <dependency> <groupId>org.springmodules</groupId> <artifactId>spring-modules-jbpm31</artifactId> <version>0.9</version> <scope>provided</scope> 
</dependency> --> <dependency> <groupId>org.springframework</groupId> <artifactId>spring-orm</artifactId> <version>2.5.6</version> </dependency> <dependency> <groupId>org.springframework.ws</groupId> <artifactId>spring-oxm</artifactId> <version>1.5.0</version> <scope>provided</scope> </dependency> <dependency> <groupId>org.springframework.security</groupId> <artifactId>spring-security-core</artifactId> <version>2.0.4</version> </dependency> <dependency> <groupId>org.springframework</groupId> <artifactId>spring-tx</artifactId> <version>2.5.6</version> </dependency> <dependency> <groupId>org.springframework</groupId> <artifactId>spring-web</artifactId> <version>2.5.6</version> </dependency> <dependency> <groupId>org.springframework</groupId> <artifactId>spring-webmvc-portlet</artifactId> <version>2.5</version> </dependency> <dependency> <groupId>com.atomikos</groupId> <artifactId>transactions-hibernate3</artifactId> <version>3.6.4</version> </dependency> <dependency> <groupId>com.atomikos</groupId> <artifactId>transactions-osgi</artifactId> <version>3.7.0</version> </dependency> <dependency> <groupId>com.vaadin</groupId> <artifactId>vaadin</artifactId> <version>6.7.0</version> </dependency> <dependency> <groupId>com.thoughtworks.xstream</groupId> <artifactId>xstream</artifactId> <version>1.3.1</version> </dependency> <!-- this is the dependency to the "jar"-subproject --> <dependency> <groupId>org.codehaus.plexus</groupId> <artifactId>plexus-utils</artifactId> <version>1.5.9</version> </dependency> <dependency> <groupId>com.google.gwt</groupId> <artifactId>gwt-user</artifactId> <version>2.1.1</version> <scope>provided</scope> </dependency> </dependencies> <!-- Define our plugin repositories --> <pluginRepositories> <pluginRepository> <id>Codehaus</id> <name>Codehaus Maven Plugin Repository</name> <url>http://repository.codehaus.org/org/codehaus/mojo</url> <snapshots> <enabled>true</enabled> </snapshots> </pluginRepository> <pluginRepository> <id>codehaus-snapshots</id> <url>[http://nexus.codehaus.org/snapshots]</url> <snapshots> <enabled>true</enabled> </snapshots> <releases> <enabled>false</enabled> </releases> </pluginRepository> </pluginRepositories> <repositories> <repository> <id>vaadin-addons</id> <url>http://maven.vaadin.com/vaadin-addons</url> </repository> <repository> <id>demoiselle.sourceforge.net</id> <name>Demoiselle Maven Repository</name> <url>http://demoiselle.sourceforge.net/repository/release</url> </repository> </repositories> AND when i do "clean install" to build my mvn , the console show me this taken : [INFO] Unable to find resource 'org.codehaus.mojo:gwt-maven-plugin:jar:1.3-SNAPSHOT' in repository demoiselle.sourceforge.net (http://demoiselle.sourceforge.net/repository/release) [INFO] ------------------------------------------------------------------------ [ERROR] BUILD ERROR [INFO] ------------------------------------------------------------------------ [INFO] Failed to resolve artifact. Missing: ---------- 1) org.codehaus.mojo:gwt-maven-plugin:jar:1.3-SNAPSHOT Try downloading the file manually from the project website. 
Then, install it using the command: mvn install:install-file -DgroupId=org.codehaus.mojo -DartifactId=gwt-maven-plugin - Dversion=1.3-SNAPSHOT -Dpackaging=jar -Dfile=/path/to/file Alternatively, if you host your own repository you can deploy the file there: mvn deploy:deploy-file -DgroupId=org.codehaus.mojo -DartifactId=gwt-maven-plugin -Dversion=1.3-SNAPSHOT -Dpackaging=jar -Dfile=/path/to/file -Durl=[url] -DrepositoryId=[id] Path to dependency: 1) com.vaadin:vaadin-maven-plugin:maven-plugin:1.0.1 2) org.codehaus.mojo:gwt-maven-plugin:jar:1.3-SNAPSHOT ---------- 1 required artifact is missing. for artifact: com.vaadin:vaadin-maven-plugin:maven-plugin:1.0.1 from the specified remote repositories: demoiselle.sourceforge.net (http://demoiselle.sourceforge.net/repository/release), central (http://repo1.maven.org/maven2), Codehaus (http://repository.codehaus.org/org/codehaus/mojo), codehaus-snapshots ([http://nexus.codehaus.org/snapshots]), vaadin-snapshots (http://oss.sonatype.org/content/repositories/vaadin-snapshots/), vaadin-releases (http://oss.sonatype.org/content/repositories/vaadin-releases/), vaadin-addons (http://maven.vaadin.com/vaadin-addons) your help will be welcome thank you a lot !!! :)))

    Read the article

  • Cannot get script to run at startup (tried all the simple answers)

    - by Carey Head
    I have Ubuntu Desktop 12.04 LTS running great on an older Acer desktop. I want to use this machine as an in-home server for hosting Minecraft. The command to start the Minecraft server is

      java -Xmx1024M -Xms1024M -jar minecraft_server.jar nogui

    and that works great when I cd into the correct directory and run it. I created a script to do this:

      #!/bin/bash
      cd /home/myuser/minecraft-server1
      java -Xmx1024M -Xms1024M -jar minecraft_server.jar nogui &
      cd /home/myuser/minecraft-server2
      java -Xmx1024M -Xms1024M -jar minecraft_server.jar nogui &
      exit 0

    I made this .sh file executable, and it too runs great when I start it manually from the terminal. The problem I'm having is getting it to execute at startup. My user account on this machine is set to auto-login. I have tried the following:

    1. Adding sh /home/myuser/myscript.sh to "Startup Applications" (nothing happens on reboot).
    2. Adding the same line to /etc/rc.local (nothing happens on reboot). I even tested this one by running /etc/rc.local from the terminal, and it executed fine, just not at boot/auto-login.
    3. Adding the lines from the script directly to rc.local (nothing happens on reboot).

    I can't help but think there's something I'm missing. The script runs great when started manually, but it will not run at boot/auto-login. Many thanks in advance.

    Read the article

  • How to restore "working folders" upon restart?

    - by Smiles in a Jar
    I am not sure I am wording the problem correctly, but I will try my best. I am a newbie to Ubuntu and I am using Ubuntu 11.10. On a daily basis I have a set of 5-10 folders which I need to refer to for my work. I am looking for a way to create a "workspace" of these folders so that after a restart, a single click opens all the folders in the workspace in different tabs. The option I currently plan to use is to create links to the folders, then select all of them and open them in different tabs. I was wondering if there is a cleaner option already provided in Ubuntu.

    Read the article

  • Import Data from Excel sheet to DB Table through OAF page

    - by PRajkumar
    1. Create a New Workspace and Project File > New > General > Workspace Configured for Oracle Applications File Name – PrajkumarImportxlsDemo   Automatically a new OA Project will also be created   Project Name -- ImportxlsDemo Default Package -- prajkumar.oracle.apps.fnd.importxlsdemo   2. Add JAR file jxl-2.6.3.jar to Apache Library Download jxl-2.6.3.jar from following link – http://www.findjar.com/jar/net.sourceforge.jexcelapi/jars/jxl-2.6.jar.html   Steps to add jxl.jar file in Local Machine Right Click on ImportxlsDemo > Project Properties > Libraries > Add jar/Directory and browse to directory where jxl-2.6.3.jar has been downloaded and select the JAR file            Steps to add jxl.jar file at EBS middle tier On your EBS middile tier copy jxl.jar at $FND_TOP/java/3rdparty/standalone Add $FND_TOP/java/3rdparty/standalone\jxl.jar to custom classpath in Jser.properties file which is at $IAS_ORACLE_HOME/Apache/Jserv/etc wrapper.classpath=/U01/oracle/dev/devappl/fnd/11.5.0/java/3rdparty/stdalone/jxl.jar Bounce Apache Server   3. Create a New Application Module (AM) Right Click on ImportxlsDemo > New > ADF Business Components > Application Module Name -- ImportxlsAM Package -- prajkumar.oracle.apps.fnd.importxlsdemo.server   Check Application Module Class: ImportxlsAMImpl Generate JavaFile(s)   4. Create Test Table in which we will insert data from excel CREATE TABLE xx_import_excel_data_demo (    -- --------------------      -- Data Columns      -- --------------------      column1                 VARCHAR2(100),      column2                 VARCHAR2(100),      column3                 VARCHAR2(100),      column4                 VARCHAR2(100),      column5                 VARCHAR2(100),      -- --------------------      -- Who Columns      -- --------------------      last_update_date   DATE         NOT NULL,      last_updated_by    NUMBER   NOT NULL,      creation_date         DATE         NOT NULL,      created_by             NUMBER    NOT NULL,      last_update_login  NUMBER );   5. Create a New Entity Object (EO) Right click on ImportxlsDemo > New > ADF Business Components > Entity Object Name – ImportxlsEO Package -- prajkumar.oracle.apps.fnd.importxlsdemo.schema.server Database Objects -- XX_IMPORT_EXCEL_DATA_DEMO   Note – By default ROWID will be the primary key if we will not make any column to be primary key Check the Accessors, Create Method, Validation Method and Remove Method   6. Create a New View Object (VO) Right click on ImportxlsDemo > New > ADF Business Components > View Object Name -- ImportxlsVO Package -- prajkumar.oracle.apps.fnd.importxlsdemo.server   In Step2 in Entity Page select ImportxlsEO and shuttle it to selected list In Step3 in Attributes Window select all columns and shuttle them to selected list   In Java page Uncheck Generate Java file for View Object Class: ImportxlsVOImpl Select Generate Java File for View Row Class: ImportxlsVORowImpl -> Generate Java File -> Accessors   7. Add Your View Object to Root UI Application Module Right click on ImportxlsAM > Edit ImportxlsAM > Data Model > Select ImportxlsVO and shuttle to Data Model list   8. Create a New Page Right click on ImportxlsDemo > New > Web Tier > OA Components > Page Name -- ImportxlsPG Package -- prajkumar.oracle.apps.fnd.importxlsdemo.webui   9. Select the ImportxlsPG and go to the strcuture pane where a default region has been created   10. 
Select region1 and set the following properties:   Attribute Property ID PageLayoutRN AM Definition prajkumar.oracle.apps.fnd.importxlsdemo.server.ImportxlsAM Window Title Import Data From Excel through OAF Page Demo Window Title Import Data From Excel through OAF Page Demo   11. Create messageComponentLayout Region Under Page Layout Region Right click PageLayoutRN > New > Region   Attribute Property ID MainRN Item Style messageComponentLayout   12. Create a New Item messageFileUpload Bean under MainRN Right click on MainRN > New > messageFileUpload Set Following Properties for New Item --   Attribute Property ID MessageFileUpload Item Style messageFileUpload   13. Create a New Item Submit Button Bean under MainRN Right click on MainRN > New > messageLayout Set Following Properties for messageLayout --   Attribute Property ID ButtonLayout   Right Click on ButtonLayout > New > Item   Attribute Property ID Go Item Style submitButton Attribute Set /oracle/apps/fnd/attributesets/Buttons/Go   14. Create Controller for page ImportxlsPG Right Click on PageLayoutRN > Set New Controller Package Name: prajkumar.oracle.apps.fnd.importxlsdemo.webui Class Name: ImportxlsCO   Write Following Code in ImportxlsCO in processFormRequest import oracle.apps.fnd.framework.OAApplicationModule; import oracle.apps.fnd.framework.OAException; import java.io.Serializable; import oracle.apps.fnd.framework.webui.OAControllerImpl; import oracle.apps.fnd.framework.webui.OAPageContext; import oracle.apps.fnd.framework.webui.beans.OAWebBean; import oracle.cabo.ui.data.DataObject; import oracle.jbo.domain.BlobDomain; public void processFormRequest(OAPageContext pageContext, OAWebBean webBean) {  super.processFormRequest(pageContext, webBean);  if (pageContext.getParameter("Go") != null)  {   DataObject fileUploadData = (DataObject)pageContext.getNamedDataObject("MessageFileUpload");   String fileName = null;                 try   {    fileName = (String)fileUploadData.selectValue(null, "UPLOAD_FILE_NAME");   }   catch(NullPointerException ex)   {    throw new OAException("Please Select a File to Upload", OAException.ERROR);   }   BlobDomain uploadedByteStream = (BlobDomain)fileUploadData.selectValue(null, fileName);   try   {    OAApplicationModule oaapplicationmodule = pageContext.getRootApplicationModule();    Serializable aserializable2[] = {uploadedByteStream};    Class aclass2[] = {BlobDomain.class };    oaapplicationmodule.invokeMethod("ReadExcel", aserializable2,aclass2);   }   catch (Exception ex)   {    throw new OAException(ex.toString(), OAException.ERROR);   }  } }     Write Following Code in ImportxlsAMImpl.java import java.io.IOException; import java.io.InputStream; import jxl.Cell; import jxl.CellType; import jxl.Sheet; import jxl.Workbook; import jxl.read.biff.BiffException; import oracle.apps.fnd.framework.server.OAApplicationModuleImpl; import oracle.jbo.Row; import oracle.apps.fnd.framework.OAViewObject; import oracle.apps.fnd.framework.server.OAViewObjectImpl; import oracle.jbo.domain.BlobDomain; public void createRecord(String[] excel_data) {   OAViewObject vo = (OAViewObject)getImportxlsVO1();            if (!vo.isPreparedForExecution())    {   vo.executeQuery();      }                      Row row = vo.createRow();  try  {   for (int i=0; i < excel_data.length; i++)   {    row.setAttribute("Column" +(i+1) ,excel_data[i]);   }  }  catch(Exception e)  {   System.out.println(e.getMessage());   }  vo.insertRow(row);  getTransaction().commit(); }      public void ReadExcel(BlobDomain fileData) throws 
IOException {  String[] excel_data  = new String[5];  InputStream inputWorkbook = fileData.getInputStream();  Workbook w;          try  {   w = Workbook.getWorkbook(inputWorkbook);                       // Get the first sheet   Sheet sheet = w.getSheet(0);                       for (int i = 0; i < sheet.getRows(); i++)   {    for (int j = 0; j < sheet.getColumns(); j++)    {     Cell cell = sheet.getCell(j, i);     CellType type = cell.getType();     if (cell.getType() == CellType.LABEL)     {      System.out.println("I got a label " + cell.getContents());      excel_data[j] = cell.getContents();     }     if (cell.getType() == CellType.NUMBER)     {        System.out.println("I got a number " + cell.getContents());      excel_data[j] = cell.getContents();     }    }    createRecord(excel_data);   }  }              catch (BiffException e)  {   e.printStackTrace();  } }   15. Congratulation you have successfully finished. Run Your page and Test Your Work   Consider Excel PRAJ_TEST.xls with following data --       Lets Try to import this data into DB Table --          

    Read the article

  • java.sql.SQLException: OALL8 is in an inconsistent state on WebLogic 9

    - by user179056
    Hello, we are getting the error "java.sql.SQLException: OALL8 is in an inconsistent state" when executing our web app on WebLogic 9. The JDK used is 1.5 and the database is Oracle 10.2g. We have swapped the Oracle driver ojdbc14.jar for ojdbc5.jar, and we have also added orai18n.jar. We have ensured that this change of jar is made in the web app library as well as in the other WebLogic server classpaths where ojdbc14.jar existed. The problem persists. Any pointers would help. Regards, Sameer
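
    One thing worth checking in this situation (not from the original post) is which driver the server actually loaded at runtime, since a stray ojdbc14.jar left earlier on one of WebLogic's classpaths can shadow the newly added ojdbc5.jar. A small sketch using plain JDBC metadata; the class name, URL, and credentials below are placeholders:

      import java.sql.Connection;
      import java.sql.DatabaseMetaData;
      import java.sql.DriverManager;

      public class DriverCheck {
          public static void main(String[] args) throws Exception {
              // placeholder connection details -- replace with the real ones
              Connection con = DriverManager.getConnection(
                      "jdbc:oracle:thin:@dbhost:1521:ORCL", "user", "password");
              DatabaseMetaData md = con.getMetaData();
              // If this still reports the old ojdbc14 driver, another copy is shadowing ojdbc5.jar
              System.out.println("Driver:   " + md.getDriverName() + " " + md.getDriverVersion());
              System.out.println("Database: " + md.getDatabaseProductVersion());
              con.close();
          }
      }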

    Read the article

  • JBoss EJB Bean not bound

    - by portoalet
    Hi, I have the following error Exception in thread "main" javax.naming.NameNotFoundException: CounterBean not bound trying to access an EJB JAR CounterBean.jar deployed on JBoss5 from a client application outside the Application Server. From the Jboss log, it looks like it does not have a global JNDI name? Is this ok? What have I done wrong? JBoss log: 13:50:39,669 INFO [JBossASKernel] Created KernelDeployment for: Counter.jar 13:50:39,672 INFO [JBossASKernel] installing bean: jboss.j2ee:jar=Counter.jar,name=CounterBean,service=EJB3 13:50:39,672 INFO [JBossASKernel] with dependencies: 13:50:39,672 INFO [JBossASKernel] and demands: 13:50:39,673 INFO [JBossASKernel] partition:partitionName=DefaultPartition; Required: Described 13:50:39,673 INFO [JBossASKernel] jboss.ejb:service=EJBTimerService; Required: Described 13:50:39,673 INFO [JBossASKernel] and supplies: 13:50:39,673 INFO [JBossASKernel] jndi:CounterBean 13:50:39,673 INFO [JBossASKernel] Added bean(jboss.j2ee:jar=Counter.jar,name=CounterBean,service=EJB3) to KernelDeployment of: Counte r.jar 13:50:39,712 INFO [SessionSpecContainer] Starting jboss.j2ee:jar=Counter.jar,name=CounterBean,service=EJB3 13:50:39,727 INFO [EJBContainer] STARTED EJB: com.don.CounterBean ejbName: CounterBean 13:50:39,732 INFO [JndiSessionRegistrarBase] Binding the following Entries in Global JNDI: The client code is: public static void main(String[] args) throws NamingException, InterruptedException { InitialContext ctx = new InitialContext(); Counter s = (Counter)ctx.lookup("CounterBean/remote"); for(int i = 0; i < 100; i++ ) { s.printCount(i); Thread.sleep(1000); } } Error message: java -Djava.naming.provider.url=jnp://123.123.123.123:1099 -Djava.naming.factory.initial=org.jnp.interfaces.NamingContextFactory com.don.Client Exception in thread "main" javax.naming.NameNotFoundException: CounterBean not bound at org.jnp.server.NamingServer.getBinding(NamingServer.java:771) at org.jnp.server.NamingServer.getBinding(NamingServer.java:779) at org.jnp.server.NamingServer.getObject(NamingServer.java:785) at org.jnp.server.NamingServer.lookup(NamingServer.java:396) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25) at java.lang.reflect.Method.invoke(Method.java:597) at sun.rmi.server.UnicastServerRef.dispatch(UnicastServerRef.java:305) at sun.rmi.transport.Transport$1.run(Transport.java:159) at java.security.AccessController.doPrivileged(Native Method) at sun.rmi.transport.Transport.serviceCall(Transport.java:155) at sun.rmi.transport.tcp.TCPTransport.handleMessages(TCPTransport.java:535) at sun.rmi.transport.tcp.TCPTransport$ConnectionHandler.run0(TCPTransport.java:790) at sun.rmi.transport.tcp.TCPTransport$ConnectionHandler.run(TCPTransport.java:649) at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908) at java.lang.Thread.run(Thread.java:619) at sun.rmi.transport.StreamRemoteCall.exceptionReceivedFromServer(StreamRemoteCall.java:255) at sun.rmi.transport.StreamRemoteCall.executeCall(StreamRemoteCall.java:233) at sun.rmi.server.UnicastRef.invoke(UnicastRef.java:142) at org.jnp.server.NamingServer_Stub.lookup(Unknown Source) at org.jnp.interfaces.NamingContext.lookup(NamingContext.java:726) at org.jnp.interfaces.NamingContext.lookup(NamingContext.java:686) at 
javax.naming.InitialContext.lookup(InitialContext.java:392) at com.don.Client.main(Client.java:10)
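
    A small diagnostic sketch (not part of the original post) that lists what is actually bound in the server's global JNDI tree from the same standalone client: if "CounterBean" never appears, the bean was not registered under a global name; if it appears under a different path (for example an EAR-qualified one), the lookup string just needs to match it. The provider URL is a placeholder and JndiDump is a hypothetical class name.

      import java.util.Properties;
      import javax.naming.Context;
      import javax.naming.InitialContext;
      import javax.naming.NameClassPair;
      import javax.naming.NamingEnumeration;

      public class JndiDump {
          public static void main(String[] args) throws Exception {
              Properties env = new Properties();
              env.put(Context.INITIAL_CONTEXT_FACTORY, "org.jnp.interfaces.NamingContextFactory");
              env.put(Context.PROVIDER_URL, "jnp://123.123.123.123:1099"); // placeholder host
              InitialContext ctx = new InitialContext(env);
              // List the names bound directly under the root of the JNDI tree
              NamingEnumeration<NameClassPair> names = ctx.list("");
              while (names.hasMore()) {
                  NameClassPair pair = names.next();
                  System.out.println(pair.getName() + " -> " + pair.getClassName());
              }
          }
      }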

    Read the article

  • Why am I getting a ClassNotFoundException when I have properly imported said class and am looking at it in its directory?

    - by Strider
    This is my javac compile command:

      javac -cp "C:\java\code\j3D\j3dcore.jar;C:\java\code\j3D\j3dutils.jar;C:\java\code\j3D\vecmath.jar" Simple.java

    It compiles with no problems. The three jar files (j3dcore, j3dutils, and vecmath) are the essential jars for my program, or at least I am led to believe so according to this official tutorial on J3D. For the record, I ripped this code almost line for line from this PDF file, and the jar files are correctly located in the referenced locations. When I run my Simple program (java Simple) I am greeted with:

      Exception in thread "main" java.lang.NoClassDefFoundError: javax/media/j3d/Canvas3D
      Caused by: java.lang.ClassNotFoundException: javax.media.j3d.Canvas3D

    Currently I am staring directly at the Canvas3D.class that sits inside j3dcore.jar under javax\media\j3d\. Here is the source code:

      // First Java 3D program
      import java.applet.Applet;
      import java.awt.BorderLayout;
      import java.awt.Frame;
      import java.awt.event.*;
      import com.sun.j3d.utils.applet.MainFrame;
      import com.sun.j3d.utils.universe.*;
      import com.sun.j3d.utils.geometry.ColorCube;
      import javax.media.j3d.*;
      import javax.vecmath.*;
      import java.awt.GraphicsConfiguration;

      public class Simple extends Applet {

          public Simple() {
              setLayout(new BorderLayout());
              GraphicsConfiguration config = SimpleUniverse.getPreferredConfiguration();
              Canvas3D canvas3D = new Canvas3D(config);
              add("Center", canvas3D);
              BranchGroup scene = createSceneGraph();
              scene.compile();
              // SimpleUniverse is a Convenience Utility class
              SimpleUniverse simpleU = new SimpleUniverse(canvas3D);
              // This moves the ViewPlatform back a bit so the
              // objects in the scene can be viewed.
              simpleU.getViewingPlatform().setNominalViewingTransform();
              simpleU.addBranchGraph(scene);
          } // end of HelloJava3Da (constructor)

          public BranchGroup createSceneGraph() {
              // Create the root of the branch graph
              BranchGroup objRoot = new BranchGroup();
              // Create a simple shape leaf node, add it to the scene graph.
              // ColorCube is a Convenience Utility class
              objRoot.addChild(new ColorCube(0.4));
              return objRoot;
          }

          public static void main(String args[]) {
              Simple world = new Simple();
          }
      }

    Did I import correctly? Did I incorrectly reference my jar files in my javac command? If I can clearly see Canvas3D inside its correct directory, why can't Java find it? The first folder in both j3dcore.jar and vecmath.jar is "javax". Is the compiler getting confused? If so, how do I specify where to find that exact class when referencing it in my source code?
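
    The usual culprit with this symptom (compiles fine, NoClassDefFoundError at runtime) is that the jars were only passed to javac, not to java. A minimal illustration, not from the original post: the runtime classpath must contain the three jars plus the directory holding Simple.class, and a tiny probe like the hypothetical class below makes a missing entry obvious.

      // Run Simple with the same jars on the runtime classpath, plus "." for Simple.class, e.g.:
      //   java -cp "C:\java\code\j3D\j3dcore.jar;C:\java\code\j3D\j3dutils.jar;C:\java\code\j3D\vecmath.jar;." Simple
      // ShowClasspath.java -- prints each entry the JVM actually sees
      public class ShowClasspath {
          public static void main(String[] args) {
              String cp = System.getProperty("java.class.path");
              for (String entry : cp.split(java.io.File.pathSeparator)) {
                  System.out.println(entry);
              }
          }
      }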

    Read the article

  • Problem compiling hive with ant

    - by conandor
    I compiling with Solaris 10 SPARC, jdk 1.6 from Sun, Ant 1.7.1 from OpenCSW. I have no problem running hadoop 0.17.2.1 However, I have problem compiling/integrating hive with the error 'cannot find symbol', although I followed the tutorial. I have the hive source code from SVN exactly from tutorial. How can I know the hive version I compiling and how can I compile against hadoop 0.17.2.1? Please advice. Thank you. -bash-3.00$ export PATH=/usr/jdk/instances/jdk1.6.0/bin:/usr/bin:/opt/csw/bin:/opt/webstack/bin -bash-3.00$ export JAVA_HOME=/usr/jdk/instances/jdk1.6.0 -bash-3.00$ export HADOOP=/export/home/mywork/hadoop-0.17.2.1/bin/hadoop -bash-3.00$ /opt/csw/bin/ant package -Dhadoop.version=0.17.2.1 Buildfile: build.xml jar: create-dirs: compile-ant-tasks: create-dirs: init: compile: [echo] Compiling: anttasks deploy-ant-tasks: create-dirs: init: compile: [echo] Compiling: anttasks jar: init: compile: ivy-init-dirs: ivy-download: [get] Getting: http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0/ivy-2.1.0.jar [get] To: /export/home/mywork/hive/build/ivy/lib/ivy-2.1.0.jar [get] Not modified - so not downloaded ivy-probe-antlib: ivy-init-antlib: ivy-init: ivy-retrieve-hadoop-source: [ivy:retrieve] :: Ivy 2.1.0 - 20090925235825 :: http://ant.apache.org/ivy/ :: [ivy:retrieve] :: loading settings :: file = /export/home/mywork/hive/ivy/ivysettings.xml [ivy:retrieve] :: resolving dependencies :: org.apache.hadoop.hive#shims;working@kaili [ivy:retrieve] confs: [default] [ivy:retrieve] found hadoop#core;0.17.2.1 in hadoop-source [ivy:retrieve] found hadoop#core;0.18.3 in hadoop-source [ivy:retrieve] found hadoop#core;0.19.0 in hadoop-source [ivy:retrieve] found hadoop#core;0.20.0 in hadoop-source [ivy:retrieve] :: resolution report :: resolve 25878ms :: artifacts dl 37ms --------------------------------------------------------------------- | | modules || artifacts | | conf | number| search|dwnlded|evicted|| number|dwnlded| --------------------------------------------------------------------- | default | 4 | 0 | 0 | 0 || 4 | 0 | --------------------------------------------------------------------- [ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#shims [ivy:retrieve] confs: [default] [ivy:retrieve] 0 artifacts copied, 4 already retrieved (0kB/82ms) install-hadoopcore-internal: build_shims: [echo] Compiling shims against hadoop 0.17.2.1 (/export/home/mywork/hive/build/hadoopcore/hadoop-0.17.2.1) ivy-init-dirs: ivy-download: [get] Getting: http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0/ivy-2.1.0.jar [get] To: /export/home/mywork/hive/build/ivy/lib/ivy-2.1.0.jar [get] Not modified - so not downloaded ivy-probe-antlib: ivy-init-antlib: ivy-init: ivy-retrieve-hadoop-source: [ivy:retrieve] :: Ivy 2.1.0 - 20090925235825 :: http://ant.apache.org/ivy/ :: [ivy:retrieve] :: loading settings :: file = /export/home/mywork/hive/ivy/ivysettings.xml [ivy:retrieve] :: resolving dependencies :: org.apache.hadoop.hive#shims;working@kaili [ivy:retrieve] confs: [default] [ivy:retrieve] found hadoop#core;0.17.2.1 in hadoop-source [ivy:retrieve] found hadoop#core;0.18.3 in hadoop-source [ivy:retrieve] found hadoop#core;0.19.0 in hadoop-source [ivy:retrieve] found hadoop#core;0.20.0 in hadoop-source [ivy:retrieve] :: resolution report :: resolve 12041ms :: artifacts dl 30ms --------------------------------------------------------------------- | | modules || artifacts | | conf | number| search|dwnlded|evicted|| number|dwnlded| --------------------------------------------------------------------- | default 
| 4 | 0 | 0 | 0 || 4 | 0 | --------------------------------------------------------------------- [ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#shims [ivy:retrieve] confs: [default] [ivy:retrieve] 0 artifacts copied, 4 already retrieved (0kB/39ms) install-hadoopcore-internal: build_shims: [echo] Compiling shims against hadoop 0.18.3 (/export/home/mywork/hive/build/hadoopcore/hadoop-0.18.3) ivy-init-dirs: ivy-download: [get] Getting: http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0/ivy-2.1.0.jar [get] To: /export/home/mywork/hive/build/ivy/lib/ivy-2.1.0.jar [get] Not modified - so not downloaded ivy-probe-antlib: ivy-init-antlib: ivy-init: ivy-retrieve-hadoop-source: [ivy:retrieve] :: Ivy 2.1.0 - 20090925235825 :: http://ant.apache.org/ivy/ :: [ivy:retrieve] :: loading settings :: file = /export/home/mywork/hive/ivy/ivysettings.xml [ivy:retrieve] :: resolving dependencies :: org.apache.hadoop.hive#shims;working@kaili [ivy:retrieve] confs: [default] [ivy:retrieve] found hadoop#core;0.17.2.1 in hadoop-source [ivy:retrieve] found hadoop#core;0.18.3 in hadoop-source [ivy:retrieve] found hadoop#core;0.19.0 in hadoop-source [ivy:retrieve] found hadoop#core;0.20.0 in hadoop-source [ivy:retrieve] :: resolution report :: resolve 11107ms :: artifacts dl 36ms --------------------------------------------------------------------- | | modules || artifacts | | conf | number| search|dwnlded|evicted|| number|dwnlded| --------------------------------------------------------------------- | default | 4 | 0 | 0 | 0 || 4 | 0 | --------------------------------------------------------------------- [ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#shims [ivy:retrieve] confs: [default] [ivy:retrieve] 0 artifacts copied, 4 already retrieved (0kB/49ms) install-hadoopcore-internal: build_shims: [echo] Compiling shims against hadoop 0.19.0 (/export/home/mywork/hive/build/hadoopcore/hadoop-0.19.0) ivy-init-dirs: ivy-download: [get] Getting: http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0/ivy-2.1.0.jar [get] To: /export/home/mywork/hive/build/ivy/lib/ivy-2.1.0.jar [get] Not modified - so not downloaded ivy-probe-antlib: ivy-init-antlib: ivy-init: ivy-retrieve-hadoop-source: [ivy:retrieve] :: Ivy 2.1.0 - 20090925235825 :: http://ant.apache.org/ivy/ :: [ivy:retrieve] :: loading settings :: file = /export/home/mywork/hive/ivy/ivysettings.xml [ivy:retrieve] :: resolving dependencies :: org.apache.hadoop.hive#shims;working@kaili [ivy:retrieve] confs: [default] [ivy:retrieve] found hadoop#core;0.17.2.1 in hadoop-source [ivy:retrieve] found hadoop#core;0.18.3 in hadoop-source [ivy:retrieve] found hadoop#core;0.19.0 in hadoop-source [ivy:retrieve] found hadoop#core;0.20.0 in hadoop-source [ivy:retrieve] :: resolution report :: resolve 9969ms :: artifacts dl 33ms --------------------------------------------------------------------- | | modules || artifacts | | conf | number| search|dwnlded|evicted|| number|dwnlded| --------------------------------------------------------------------- | default | 4 | 0 | 0 | 0 || 4 | 0 | --------------------------------------------------------------------- [ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#shims [ivy:retrieve] confs: [default] [ivy:retrieve] 0 artifacts copied, 4 already retrieved (0kB/57ms) install-hadoopcore-internal: build_shims: [echo] Compiling shims against hadoop 0.20.0 (/export/home/mywork/hive/build/hadoopcore/hadoop-0.20.0) jar: [echo] Jar: shims create-dirs: compile-ant-tasks: create-dirs: init: compile: [echo] Compiling: anttasks 
deploy-ant-tasks: create-dirs: init: compile: [echo] Compiling: anttasks jar: init: install-hadoopcore: install-hadoopcore-default: ivy-init-dirs: ivy-download: [get] Getting: http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0/ivy-2.1.0.jar [get] To: /export/home/mywork/hive/build/ivy/lib/ivy-2.1.0.jar [get] Not modified - so not downloaded ivy-probe-antlib: ivy-init-antlib: ivy-init: ivy-retrieve-hadoop-source: [ivy:retrieve] :: Ivy 2.1.0 - 20090925235825 :: http://ant.apache.org/ivy/ :: [ivy:retrieve] :: loading settings :: file = /export/home/mywork/hive/ivy/ivysettings.xml [ivy:retrieve] :: resolving dependencies :: org.apache.hadoop.hive#common;working@kaili [ivy:retrieve] confs: [default] [ivy:retrieve] found hadoop#core;0.20.0 in hadoop-source [ivy:retrieve] :: resolution report :: resolve 4864ms :: artifacts dl 13ms --------------------------------------------------------------------- | | modules || artifacts | | conf | number| search|dwnlded|evicted|| number|dwnlded| --------------------------------------------------------------------- | default | 1 | 0 | 0 | 0 || 1 | 0 | --------------------------------------------------------------------- [ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#common [ivy:retrieve] confs: [default] [ivy:retrieve] 0 artifacts copied, 1 already retrieved (0kB/52ms) install-hadoopcore-internal: setup: compile: [echo] Compiling: common jar: [echo] Jar: common create-dirs: compile-ant-tasks: create-dirs: init: compile: [echo] Compiling: anttasks deploy-ant-tasks: create-dirs: init: compile: [echo] Compiling: anttasks jar: init: dynamic-serde: compile: [echo] Compiling: hive [javac] Compiling 167 source files to /export/home/mywork/hive/build/serde/classes [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/objectinspector/ObjectInspectorFactory.java:30: cannot find symbol [javac] symbol : class PrimitiveObjectInspectorFactory [javac] location: package org.apache.hadoop.hive.serde2.objectinspector.primitive [javac] import org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorFactory; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/objectinspector/ObjectInspectorFactory.java:31: cannot find symbol [javac] symbol : class PrimitiveObjectInspectorUtils [javac] location: package org.apache.hadoop.hive.serde2.objectinspector.primitive [javac] import org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorUtils; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/MetadataTypedColumnsetSerDe.java:31: cannot find symbol [javac] symbol : class MetadataListStructObjectInspector [javac] location: package org.apache.hadoop.hive.serde2.objectinspector [javac] import org.apache.hadoop.hive.serde2.objectinspector.MetadataListStructObjectInspector; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/SerDeUtils.java:33: cannot find symbol [javac] symbol : class BooleanObjectInspector [javac] location: package org.apache.hadoop.hive.serde2.objectinspector.primitive [javac] import org.apache.hadoop.hive.serde2.objectinspector.primitive.BooleanObjectInspector; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/SerDeUtils.java:35: cannot find symbol [javac] symbol : class DoubleObjectInspector [javac] location: package org.apache.hadoop.hive.serde2.objectinspector.primitive [javac] import 
org.apache.hadoop.hive.serde2.objectinspector.primitive.DoubleObjectInspector; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/SerDeUtils.java:36: cannot find symbol [javac] symbol : class FloatObjectInspector [javac] location: package org.apache.hadoop.hive.serde2.objectinspector.primitive [javac] import org.apache.hadoop.hive.serde2.objectinspector.primitive.FloatObjectInspector; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/SerDeUtils.java:39: cannot find symbol [javac] symbol : class ShortObjectInspector [javac] location: package org.apache.hadoop.hive.serde2.objectinspector.primitive [javac] import org.apache.hadoop.hive.serde2.objectinspector.primitive.ShortObjectInspector; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/SerDeUtils.java:40: cannot find symbol [javac] symbol : class StringObjectInspector [javac] location: package org.apache.hadoop.hive.serde2.objectinspector.primitive [javac] import org.apache.hadoop.hive.serde2.objectinspector.primitive.StringObjectInspector; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/binarysortable/BinarySortableSerDe.java:44: cannot find symbol [javac] symbol : class BooleanObjectInspector [javac] location: package org.apache.hadoop.hive.serde2.objectinspector.primitive [javac] import org.apache.hadoop.hive.serde2.objectinspector.primitive.BooleanObjectInspector; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/binarysortable/BinarySortableSerDe.java:46: cannot find symbol [javac] symbol : class DoubleObjectInspector [javac] location: package org.apache.hadoop.hive.serde2.objectinspector.primitive [javac] import org.apache.hadoop.hive.serde2.objectinspector.primitive.DoubleObjectInspector; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/binarysortable/BinarySortableSerDe.java:47: cannot find symbol [javac] symbol : class FloatObjectInspector [javac] location: package org.apache.hadoop.hive.serde2.objectinspector.primitive [javac] import org.apache.hadoop.hive.serde2.objectinspector.primitive.FloatObjectInspector; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/binarysortable/BinarySortableSerDe.java:50: cannot find symbol [javac] symbol : class ShortObjectInspector [javac] location: package org.apache.hadoop.hive.serde2.objectinspector.primitive [javac] import org.apache.hadoop.hive.serde2.objectinspector.primitive.ShortObjectInspector; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/binarysortable/BinarySortableSerDe.java:51: cannot find symbol [javac] symbol : class StringObjectInspector [javac] location: package org.apache.hadoop.hive.serde2.objectinspector.primitive [javac] import org.apache.hadoop.hive.serde2.objectinspector.primitive.StringObjectInspector; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/lazy/LazySimpleSerDe.java:43: cannot find symbol [javac] symbol : class PrimitiveObjectInspectorFactory [javac] location: package org.apache.hadoop.hive.serde2.objectinspector.primitive [javac] import org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorFactory; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/columnar/ColumnarSerDe.java:41: cannot find symbol [javac] symbol : class PrimitiveObjectInspectorFactory 
    [javac]   location: package org.apache.hadoop.hive.serde2.objectinspector.primitive
    [javac]   import org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorFactory;
    [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/lazy/LazyStruct.java:26: cannot find symbol: class LazySimpleStructObjectInspector
    [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/lazy/LazyStruct.java:39: cannot find symbol: class LazySimpleStructObjectInspector
    [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/lazy/LazyStruct.java:68: cannot find symbol: class LazySimpleStructObjectInspector
    [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type/DynamicSerDe.java:36: cannot find symbol: class PrimitiveObjectInspectorFactory
    [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type/DynamicSerDe.java:37: cannot find symbol: class PrimitiveObjectInspectorUtils
    [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type/DynamicSerDeTypeString.java:23: cannot find symbol: class StringObjectInspector
    [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type/DynamicSerDeTypei16.java:23: cannot find symbol: class ShortObjectInspector
    [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type/DynamicSerDeTypeDouble.java:23: cannot find symbol: class DoubleObjectInspector
    [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type/DynamicSerDeTypeBool.java:23: cannot find symbol: class BooleanObjectInspector
    [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/lazy/LazyBoolean.java:20: package org.apache.hadoop.hive.serde2.lazy.objectinspector.primitive does not exist (import LazyBooleanObjectInspector)
    [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/lazy/LazyBoolean.java:37: cannot find symbol: class LazyBooleanObjectInspector
    [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/lazy/LazyBoolean.java:39: cannot find symbol: class LazyBooleanObjectInspector
    [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/lazy/LazyByte.java:21: package org.apache.hadoop.hive.serde2.lazy.objectinspector.primitive does not exist (import LazyByteObjectInspector)
    [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/lazy/LazyByte.java:37: cannot find symbol: class LazyByteObjectInspector
    [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/lazy/LazyByte.java:39: cannot find symbol: class LazyByteObjectInspector
    [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/lazy/LazyDouble.java:23: package org.apache.hadoop.hive.serde2.lazy.objectinspector.primitive does not exist (import LazyDoubleObjectInspector)
    [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/lazy/LazyDouble.java:31: cannot find symbol: class LazyDoubleObjectInspector
    [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/lazy/LazyDouble.java:33: cannot find symbol: class LazyDoubleObjectInspector
    [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/lazy/LazyFactory.java:25: cannot find symbol: class LazyObjectInspectorFactory
    [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/lazy/LazyFactory.java:26: cannot find symbol: class LazySimpleStructObjectInspector
    [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/lazy/LazyFactory.java:27: package org.apache.hadoop.hive.serde2.lazy.objectinspector.primitive does not exist (import LazyBooleanObjectInspector)
    [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/lazy/LazyFactory.java:28: package org.apache.hadoop.hive.serde2.lazy.objectinspector.primitive does not exist (import LazyByteObjectInspector)
    [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/lazy/LazyFactory.java:29: package org.apache.hadoop.hive.serde2.lazy.objectinspector.primitive does not exist (import LazyDoubleObjectInspector)
    [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/lazy/LazyFactory.java:30: package org.apache.hadoop.hive.serde2.lazy.objectinspector.primitive does not exist (import LazyFloatObjectInspector)
    [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/lazy/LazyFactory.java:31: package org.apache.hadoop.hive.serde2.lazy.objectinspector.primitive does not exist (import LazyIntObjectInspector)
    [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/lazy/LazyFactory.java:32: package org.apache.hadoop.hive.serde2.lazy.objectinspector.primitive does not exist (import LazyLongObjectInspector)
    [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/lazy/LazyFactory.java:33: package org.apache.hadoop.hive.serde2.lazy.objectinspector.primitive does not exist (import LazyPrimitiveObjectInspectorFactory)
    [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/lazy/LazyFactory.java:34: package org.apache.hadoop.hive.serde2.lazy.objectinspector.primitive does not exist (import LazyShortObjectInspector)
    [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/lazy/LazyFactory.java:35: package org.apache.hadoop.hive.serde2.lazy.objectinspector.primitive does not exist (import LazyStringObjectInspector)
    [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/lazy/LazyFloat.java:

    Read the article

  • Parse.com REST API in Java (NOT Android)

    - by Orange Peel
    I am trying to use the Parse.com REST API in Java. I have gone through the 4 solutions given here https://parse.com/docs/api_libraries and have selected Parse4J. After importing the source into Netbeans, along with importing the following libraries: org.slf4j:slf4j-api:jar:1.6.1 org.apache.httpcomponents:httpclient:jar:4.3.2 org.apache.httpcomponents:httpcore:jar:4.3.1 org.json:json:jar:20131018 commons-codec:commons-codec:jar:1.9 junit:junit:jar:4.11 ch.qos.logback:logback-classic:jar:0.9.28 ch.qos.logback:logback-core:jar:0.9.28 I ran the example code from https://github.com/thiagolocatelli/parse4j Parse.initialize(APP_ID, APP_REST_API_ID); // I replaced these with mine ParseObject gameScore = new ParseObject("GameScore"); gameScore.put("score", 1337); gameScore.put("playerName", "Sean Plott"); gameScore.put("cheatMode", false); gameScore.save(); And I got that it was missing org.apache.commons.logging, so I downloaded that and imported it. Then I ran the code again and got Exception in thread "main" java.lang.NoSuchMethodError: org.slf4j.spi.LocationAwareLogger.log(Lorg/slf4j/Marker;Ljava/lang/String;ILjava/lang/String;Ljava/lang/Throwable;)V at org.apache.commons.logging.impl.SLF4JLocationAwareLog.debug(SLF4JLocationAwareLog.java:120) at org.apache.http.client.protocol.RequestAddCookies.process(RequestAddCookies.java:122) at org.apache.http.protocol.ImmutableHttpProcessor.process(ImmutableHttpProcessor.java:131) at org.apache.http.impl.execchain.ProtocolExec.execute(ProtocolExec.java:193) at org.apache.http.impl.execchain.RetryExec.execute(RetryExec.java:85) at org.apache.http.impl.execchain.RedirectExec.execute(RedirectExec.java:108) at org.apache.http.impl.client.InternalHttpClient.doExecute(InternalHttpClient.java:186) at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:82) at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:106) at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:57) at org.parse4j.command.ParseCommand.perform(ParseCommand.java:44) at org.parse4j.ParseObject.save(ParseObject.java:450) I could probably fix this with another import, but I suppose then something else would pop up. I tried the other libraries with similar results, missing a bunch of libraries. Has anyone actually used REST API successfully in Java? If so, I would be grateful if you shared which library/s you used and anything else required to get it going successfully. I am using Netbeans. Thanks.
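    One way to separate the dependency problem from Parse itself is to hit the REST endpoint with nothing but the JDK and confirm that the keys work. The snippet below is only a hedged sketch: the https://api.parse.com/1/classes/... URL and the X-Parse-* header names are taken from Parse's REST documentation, the keys are placeholders, and the JSON mirrors the GameScore example above.

        import java.io.OutputStream;
        import java.net.HttpURLConnection;
        import java.net.URL;
        import java.nio.charset.StandardCharsets;

        public class ParseRestSketch {
            public static void main(String[] args) throws Exception {
                // Placeholder credentials; replace with your own application keys.
                String appId = "YOUR_APP_ID";
                String restKey = "YOUR_REST_API_KEY";

                URL url = new URL("https://api.parse.com/1/classes/GameScore");
                HttpURLConnection con = (HttpURLConnection) url.openConnection();
                con.setRequestMethod("POST");
                con.setRequestProperty("X-Parse-Application-Id", appId);
                con.setRequestProperty("X-Parse-REST-API-Key", restKey);
                con.setRequestProperty("Content-Type", "application/json");
                con.setDoOutput(true);

                String body = "{\"score\":1337,\"playerName\":\"Sean Plott\",\"cheatMode\":false}";
                try (OutputStream out = con.getOutputStream()) {
                    out.write(body.getBytes(StandardCharsets.UTF_8));
                }
                // 201 Created indicates the object was stored.
                System.out.println("HTTP " + con.getResponseCode());
            }
        }

    If this plain call succeeds, the problem is confined to the logging jars rather than to Parse: a NoSuchMethodError on LocationAwareLogger.log usually means slf4j-api and the adapter/backend jars on the classpath come from incompatible releases, so aligning their versions is the first thing to check.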

    Read the article

  • Oracle JDBC intermittent Connection Issue

    - by Lipska
    I am experiencing a very strange problem. This is a very simple use of JDBC connecting to an Oracle database. OS: Ubuntu Java Version: 1.5.0_16-b02 1.6.0_17-b04 Database: Oracle 11g Release 11.1.0.6.0 When I make use of the jar file JODBC14.jar it connects to the database every time. When I make use of the jar file JODBC5.jar it connects sometimes and other times it throws an error (shown below). If I recompile with Java 6 and use JODBC6.jar I get the same results as JODBC5.jar. I need specific features in JODBC5.jar that are not available in JODBC14.jar. Any ideas? Error Connecting to oracle java.sql.SQLException: Io exception: Connection reset at oracle.jdbc.driver.SQLStateMapping.newSQLException(SQLStateMapping.java:74) at oracle.jdbc.driver.DatabaseError.newSQLException(DatabaseError.java:110) at oracle.jdbc.driver.DatabaseError.throwSqlException(DatabaseError.java:171) at oracle.jdbc.driver.DatabaseError.throwSqlException(DatabaseError.java:227) at oracle.jdbc.driver.DatabaseError.throwSqlException(DatabaseError.java:494) at oracle.jdbc.driver.T4CConnection.logon(T4CConnection.java:411) at oracle.jdbc.driver.PhysicalConnection.<init>(PhysicalConnection.java:490) at oracle.jdbc.driver.T4CConnection.<init>(T4CConnection.java:202) at oracle.jdbc.driver.T4CDriverExtension.getConnection(T4CDriverExtension.java:33) at oracle.jdbc.driver.OracleDriver.connect(OracleDriver.java:474) at java.sql.DriverManager.getConnection(DriverManager.java:525) at java.sql.DriverManager.getConnection(DriverManager.java:171) at TestConnect.main(TestConnect.java:13) Code Below is the code I am using: import java.io.*; import java.sql.*; public class TestConnect { public static void main(String[] args) { try { System.out.println("Connecting to oracle"); Connection con=null; Class.forName("oracle.jdbc.driver.OracleDriver"); con=DriverManager.getConnection( "jdbc:oracle:thin:@172.16.48.100:1535:sample", "JOHN", "90009000"); System.out.println("Connected to oracle"); con.close(); System.out.println("Goodbye"); } catch(Exception e){e.printStackTrace();} } }
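    A frequently reported cause of exactly this intermittent "Io exception: Connection reset" on Linux is that the 11g thin driver (the ojdbc5/ojdbc6 generation) seeds java.security.SecureRandom from /dev/random during logon and stalls or resets when the machine is short on entropy; the older 14 driver does not, which would explain why only the newer jars misbehave. Assuming that is the cause here (it is only an assumption), the usual workaround is to point the JVM at /dev/urandom, normally with -Djava.security.egd=file:/dev/./urandom on the command line. A sketch that sets the same property programmatically before connecting:

        import java.sql.Connection;
        import java.sql.DriverManager;

        public class TestConnectUrandom {
            public static void main(String[] args) throws Exception {
                // Assumption: the resets come from SecureRandom blocking on /dev/random.
                // Setting the property here only works if nothing has used SecureRandom yet;
                // the command-line flag is the more reliable way to apply it.
                System.setProperty("java.security.egd", "file:/dev/./urandom");

                Class.forName("oracle.jdbc.driver.OracleDriver");
                Connection con = DriverManager.getConnection(
                        "jdbc:oracle:thin:@172.16.48.100:1535:sample", "JOHN", "90009000");
                System.out.println("Connected: " + !con.isClosed());
                con.close();
            }
        }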

    Read the article

  • Trying to use Rhino, getEngineByName("JavaScript") returns null in OpenJDK 7

    - by Yuval
    When I run the following piece of code, the engine variable is set to null when I'm using OepnJDK 7 (java-7-openjdk-i386). import javax.script.ScriptEngine; import javax.script.ScriptEngineManager; import javax.script.ScriptException; public class TestRhino { /** * @param args */ public static void main(String[] args) { // TODO Auto-generated method stub ScriptEngineManager factory = new ScriptEngineManager(); ScriptEngine engine = factory.getEngineByName("JavaScript"); try { System.out.println(engine.eval("1+1")); } catch (ScriptException e) { // TODO Auto-generated catch block e.printStackTrace(); } } } It runs fine with java-6-openjdk and Oracle's jre1.7.0. Any idea why? I'm using Ubuntu 11.10. All JVMs are installed under /usr/lib/jvm. I noticed OpenJDK 7 has a different directory structure. Perhaps something is not installed right? $ locate rhino.jar /usr/lib/jvm/java-6-openjdk/jre/lib/rhino.jar /usr/lib/jvm/java-7-openjdk-common/jre/lib/rhino.jar /usr/lib/jvm/java-7-openjdk-i386/jre/lib/rhino.jar Edit Since ScriptEngineManager uses a ServiceProvider to find the available script engines, I snooped around resources.jar's META-INF/services. I noticed that in OpenJDK 6, resources.jar has a META-INF/services/javax.script.ScriptEngineFactory entry which is missing from OpenJDK 7. Any idea why? I suspect this is a bug? Here is the contents of that entry (from OpenJDK 6): #script engines supported com.sun.script.javascript.RhinoScriptEngineFactory #javascript Another edit Apparently, according to this thread, the code simply isn't there, perhaps because of merging issues between Sun and Mozilla code. I still don't understand why it was present in OpenJDK 6 and not 7. The class com.sun.script.javascript.RhinoScriptEngineFactory exists in 6's rt.jar but not in 7's. If it was not meant to be included, why is there a OpenJDK 7 rhino.jar then; and why is the source still in the OpenJDK source tree (here)?
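    A quick way to see what the ServiceProvider mechanism actually registered in a given JRE is to list the factories instead of asking for one by name. This is just a diagnostic sketch using the standard javax.script API; on an OpenJDK 7 build that is missing the Rhino engine it will simply print nothing for JavaScript.

        import javax.script.ScriptEngineFactory;
        import javax.script.ScriptEngineManager;

        public class ListEngines {
            public static void main(String[] args) {
                ScriptEngineManager mgr = new ScriptEngineManager();
                // Prints every engine the current JRE registered via META-INF/services.
                for (ScriptEngineFactory f : mgr.getEngineFactories()) {
                    System.out.println(f.getEngineName() + " " + f.getEngineVersion()
                            + " -> names " + f.getNames());
                }
            }
        }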

    Read the article

  • What is the likely cause of the Android runtime exception "No suitable Log implementation" related to logging?

    - by M.Bearden
    I am creating an Android app that includes a third party jar. That third party jar utilizes internal logging that is failing to initialize when I run the app, with this error: "org.apache.commons.logging.LogConfigurationException: No suitable Log implementation". The 3rd party jar appears to be using org.apache.commons.logging and to depend on log4j, specifically log4j-1.2.14.jar. I have packaged the log4j jar into the Android app. The third party jar was packaged with a log4j.xml configuration file, which I have tried packaging into the app as an XML resource (and also as a raw resource). The "No suitable Log implementation" error message is not very descriptive, and I have no immediate familiarity with Java logging. So I am looking for likely causes of the problem (what class or configuration resources might I be missing?) or for some debugging technique that will result in a different error message that is more explicit about the problem. I do not have access to source code for the 3rd party jar. Here is the exception stack trace. When I run the app, I get the following exception as soon as one of the third party jar classes attempts to initialize its internal logging. DEBUG/AndroidRuntime(15694): Shutting down VM WARN/dalvikvm(15694): threadid=3: thread exiting with uncaught exception (group=0x4001b180) ERROR/AndroidRuntime(15694): Uncaught handler: thread main exiting due to uncaught exception ERROR/AndroidRuntime(15694): java.lang.ExceptionInInitializerError ERROR/AndroidRuntime(15694): Caused by: org.apache.commons.logging.LogConfigurationException: No suitable Log implementation ERROR/AndroidRuntime(15694): at org.apache.commons.logging.impl.LogFactoryImpl.discoverLogImplementation(LogFactoryImpl.java:842) ERROR/AndroidRuntime(15694): at org.apache.commons.logging.impl.LogFactoryImpl.newInstance(LogFactoryImpl.java:601) ERROR/AndroidRuntime(15694): at org.apache.commons.logging.impl.LogFactoryImpl.getInstance(LogFactoryImpl.java:333) ERROR/AndroidRuntime(15694): at org.apache.commons.logging.impl.LogFactoryImpl.getInstance(LogFactoryImpl.java:307) ERROR/AndroidRuntime(15694): at org.apache.commons.logging.LogFactory.getLog(LogFactory.java:645) ERROR/AndroidRuntime(15694): at org.apache.commons.configuration.ConfigurationFactory.<clinit>(ConfigurationFactory.java:77)
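    One workaround worth trying, on the assumption that it is the discovery step that fails rather than the logging calls themselves: commons-logging lets you name the Log implementation explicitly through a system property, and SimpleLog ships inside commons-logging itself, so no log4j wiring is needed. The sketch below (the class name is made up) has to run before anything from the third-party jar touches LogFactory, for example at the start of Application.onCreate().

        import org.apache.commons.logging.Log;
        import org.apache.commons.logging.LogFactory;

        public class LoggingBootstrap {
            public static void init() {
                // Assumption: forcing the bundled SimpleLog sidesteps the failed discovery.
                // SimpleLog writes to System.err, which is good enough for diagnosis on Android.
                System.setProperty("org.apache.commons.logging.Log",
                        "org.apache.commons.logging.impl.SimpleLog");

                Log log = LogFactory.getLog(LoggingBootstrap.class);
                log.info("commons-logging initialised with SimpleLog");
            }
        }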

    Read the article

  • How do I write a J2EE/EJB Singleton?

    - by Bears will eat you
    A day ago my application was one EAR, containing one WAR, one EJB JAR, and a couple of utility JAR files. I had a POJO singleton class in one of those utility files, it worked, and all was well with the world: EAR |--- WAR |--- EJB JAR |--- Util 1 JAR |--- Util 2 JAR |--- etc. Then I created a second WAR and found out (the hard way) that each WAR has its own ClassLoader, so each WAR sees a different singleton, and things break down from there. This is not so good. EAR |--- WAR 1 |--- WAR 2 |--- EJB JAR |--- Util 1 JAR |--- Util 2 JAR |--- etc. So, I'm looking for a way to create a Java singleton object that will work across WARs (across ClassLoaders?). The @Singleton EJB annotation seemed pretty promising until I found that JBoss 5.1 doesn't seem to support that annotation (which was added as part of EJB 3.1). Did I miss something - can I use @Singleton with JBoss 5.1? Upgrading to JBoss AS 6 is not an option right now. Alternately, I'd be just as happy to not have to use EJB to implement my singleton. What else can I do to solve this problem? Basically, I need a semi-application-wide* hook into a whole bunch of other objects, like various cached data, and app config info. As a last resort, I've already considered merging my two WARs into one, but that would be pretty hellish. *Meaning: available basically anywhere above a certain layer; for now, mostly in my WARs - the View and Controller (in a loose sense).
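    For reference, this is roughly the shape the EJB 3.1 version takes on a container that supports it (GlassFish 3, JBoss AS 6 and later); treat it as a target sketch rather than a fix for JBoss 5.1, where the annotation is not supported. The bean and field names are invented.

        import javax.ejb.ConcurrencyManagement;
        import javax.ejb.ConcurrencyManagementType;
        import javax.ejb.Lock;
        import javax.ejb.LockType;
        import javax.ejb.Singleton;
        import javax.ejb.Startup;

        @Singleton
        @Startup
        @ConcurrencyManagement(ConcurrencyManagementType.CONTAINER)
        public class AppConfigBean {

            private String cachedValue;

            // Writers take an exclusive lock, readers share one.
            @Lock(LockType.WRITE)
            public void setCachedValue(String value) {
                this.cachedValue = value;
            }

            @Lock(LockType.READ)
            public String getCachedValue() {
                return cachedValue;
            }
        }

    On 5.1 itself, the usual fallbacks are a JBoss-specific @Service MBean or binding one shared instance into JNDI at startup, so that both WARs look it up through the same server-wide context instead of through their own class loaders.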

    Read the article

  • Strategies for dealing with Circular references caused by JPA relationships?

    - by ams
    I am trying to partition my application into modules by feature, packaged into separate jars such as feature-a.jar, feature-b.jar, etc. Individual feature jars such as feature-a.jar should contain all the code for feature a, including JPA entities, business logic, REST APIs, unit tests, integration tests, etc. The problem I am running into is that bi-directional relationships between JPA entities cause circular references between the jar files. For example, the Customer entity should be in customer.jar and Order should be in order.jar, but Customer references Order and Order references Customer, making it hard to split them into separate jars / Eclipse projects. Options I see for dealing with the circular dependencies in JPA entities: Option 1: Put all the entities into one jar / one project Option 2: Don't map certain bi-directional relationships to avoid circular dependencies across projects. Questions: What rules / principles have you used to decide when to do bi-directional mapping vs. not? Have you been able to break JPA entities into their own projects / jars by feature? If so, how did you avoid the circular dependency issues?
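    As an illustration of Option 2, here is a hedged sketch of keeping only the owning side of the relationship in the dependent module, reusing the Customer/Order names from the question; customer.jar then never needs to see order.jar, and the reverse navigation becomes a JPQL query instead of a mapped collection. The two classes live in separate source files and jars, as the comments indicate.

        import javax.persistence.Entity;
        import javax.persistence.GeneratedValue;
        import javax.persistence.Id;
        import javax.persistence.JoinColumn;
        import javax.persistence.ManyToOne;
        import javax.persistence.Table;

        // --- customer.jar: Customer.java (no reference to Order anywhere) ---
        @Entity
        public class Customer {
            @Id @GeneratedValue
            private Long id;
            private String name;
        }

        // --- order.jar: Order.java (order.jar depends on customer.jar, not the other way round) ---
        @Entity
        @Table(name = "ORDERS") // ORDER is a reserved word in most databases
        public class Order {
            @Id @GeneratedValue
            private Long id;

            // Owning, unidirectional side; the reverse navigation becomes a query:
            //   select o from Order o where o.customer = :customer
            @ManyToOne(optional = false)
            @JoinColumn(name = "CUSTOMER_ID")
            private Customer customer;
        }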

    Read the article

  • Eclipse Ant Builder problem

    - by styx777
    I made a custom Ant script to automatically create a jar file each time I do a build. This is what it looks like: <?xml version="1.0" encoding="UTF-8"?> <project name="TestProj" basedir="." default="jar"> <property name="dist" value="dist" /> <property name="build" value="bin/test/testproj" /> <target name="jar"> <jar destfile="${dist}/TestProj.jar"> <manifest> <attribute name="Main-Class" value="test.testproj.TestProj" /> </manifest> <fileset dir="${build}" /> </jar> </target> </project> I added it by right-clicking my project > Properties > Builders, clicking New > Ant Builder, and then specifying the location of the above xml file. However, when I run it with: java -jar TestProj.jar I get a NoClassDefFoundError: test/testproj/TestProj. I'm using Eclipse on Ubuntu. TestProj is the name of the class and it's in package test.testproj. I'm pretty sure there's something wrong with the manifest, and probably the location of the xml file as well, but I'm not sure how to fix this. Any ideas?
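    The manifest itself looks fine; judging from the build file, the more likely culprit is the fileset: dir="bin/test/testproj" copies the class files into the root of the jar, so the archive contains TestProj.class rather than test/testproj/TestProj.class, and java -jar then cannot resolve the Main-Class. Pointing the fileset at bin (or using basedir="bin" on the jar task) should fix it. A small sketch for checking what actually ended up inside the jar (the path is an assumption):

        import java.util.Enumeration;
        import java.util.jar.Attributes;
        import java.util.jar.JarEntry;
        import java.util.jar.JarFile;

        public class InspectJar {
            public static void main(String[] args) throws Exception {
                // Point this at the jar the Ant builder produced.
                JarFile jar = new JarFile("dist/TestProj.jar");
                Attributes main = jar.getManifest().getMainAttributes();
                System.out.println("Main-Class: " + main.getValue("Main-Class"));
                // For java -jar to work, an entry named test/testproj/TestProj.class must appear below.
                Enumeration<JarEntry> entries = jar.entries();
                while (entries.hasMoreElements()) {
                    System.out.println(entries.nextElement().getName());
                }
                jar.close();
            }
        }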

    Read the article

  • make target is never determined up to date

    - by Michael
    Cygwin make always reprocesses the $(chrome_jar_file) target after the first successful build, so I never get the "up to date" message and always see the commands for $(chrome_jar_file) executing. However, it happens only on Windows 7; on Windows XP, once it is built it stays intact and there are no further rebuilds. I narrowed the issue down to one prerequisite - $(jar_target_dir). Here is part of the code # The location where the JAR file will be created. jar_target_dir := $(build_dir)/chrome # The main chrome JAR file. chrome_jar_file := $(jar_target_dir)/$(extension_name).jar # The root of the JAR sources. jar_source_root := chrome # The sources for the JAR file. jar_sources := bla #... some files, doesn't matter jar_sources_no_dir := $(subst $(jar_source_root)/,,$(jar_sources)) $(chrome_jar_file): $(jar_sources) $(jar_target_dir) @echo "Creating chrome JAR file." @cd $(jar_source_root); $(ZIP) ../$(chrome_jar_file) $(jar_sources_no_dir) @echo "Creating chrome JAR file. Done!" $(jar_target_dir): $(build_dir) echo "Creating jar target dir..." if [ ! -x $(jar_target_dir) ]; \ then \ mkdir $(jar_target_dir); \ fi $(build_dir): @if [ ! -x $(build_dir) ]; \ then \ mkdir $(build_dir); \ fi So if I just remove $(jar_target_dir) from the $(chrome_jar_file) rule, it works fine.

    Read the article
