Search Results

Search found 18361 results on 735 pages for 'hibernate search'.

Page 17/735 | < Previous Page | 13 14 15 16 17 18 19 20 21 22 23 24  | Next Page >

  • Combining search button on google map and search API

    - by cheesebunz
    Basically, I have the Google search API, which returns addresses based on the user's query, and a Maps API that pans the map to the place typed into a textbox. However, each of them has its own textbox and button. I can't work out how to change the button IDs so that ONE button does both jobs: fetch the address, and have the map move to the selected location.

    Read the article

  • rails solr search limit total search results / get fixed number of results

    - by kLeos
    I'm trying to perform a search, order the results randomly, and return only a fixed number of results, not all matches (something like limit(2)). I've tried using the Solr param 'rows', but that doesn't seem to do anything:

        @featured_articles = Article.search do
          with(:is_featured, true)
          order_by :random
          adjust_solr_params do |params|
            params[:rows] = 2
          end
        end

    @featured_articles.total should be 2, but it returns more than 2. How can I get a randomized, fixed number of results?

    Read the article

  • Is Google Analytics Part Of Google's Search Engine Algorithm

    - by ub3rst4r
    I was wondering if anyone knows whether Google uses the data it receives from Google Analytics to help determine a website's SERP (search engine results page) position. For example, if 1000 users from Canada visit my website and only 100 users from the USA do, does that mean my website will be ranked higher on Google.ca and lower on Google.com? And if a website is using Google Analytics, will it be ranked higher for its organic search keywords?

    Read the article

  • No search data in Google Analytics or Webmasters

    - by cjk
    I have a domain that has been registered in Google Webmaster Tools and running Google Analytics for over 4 months. I get lots of Analytics data, but no information on Google searches in Webmaster Tools, nor under Queries in Search Engine Optimisation in Analytics, even though I am getting keywords for traffic coming to my site from search engines. I have a test sub-domain with the same setup (except not HTTPS) that is getting some of this information through, even with much less data and far fewer visits. What could be wrong that stops me getting this information?

    Read the article

  • How do search engines segment against locale?

    - by Hope I Helped
    Assume I run a website with multiple language versions. If I had a Spanish section, it should be included in the Spanish-language segments of the search engines, such as Google Spain, Google Peru, Google El Salvador, etc., and excluded from the others. Likewise, even though the website would have content in Chinese, multilingual countries such as Singapore should be served content in their main language (English in this case). What is the best approach to ensure the appropriate language is associated with the various geographically segmented search engines?

    Read the article

  • Kickoff and Krunner to search with less than 3 chars in search field?

    - by Benjamin
    Both KRunner and Kickoff in Kubuntu only return a list of matching items after the user has entered at least three characters in the search field. Experience with Synapse shows that returning results from the first character onwards is fast and efficient. I would like Kickoff and KRunner to behave similarly, returning a list of items as soon as the first character is entered in their search fields. How can I achieve that?

    Read the article

  • Include latest searches in search engines index

    - by drcelus
    My websites generally include a page with the latest (user-entered) searches. I know it's not good security practice to allow this, since you can end up surfacing undesired content. On the other hand, it boosts the number of pages indexed, since every new search can provide a link on Google, and people can find you via related keywords that you are not otherwise using on your web pages. What is the rationale behind including or excluding these results from the search engine index?

    Read the article

  • LazyInitializationException when adding to a list that is held within an entity class using hibernate

    - by molleman
    Right, so I am working with Hibernate, Gilead and GWT to persist data about users and files for a website. My users have a list of file locations, and I am using annotations to map my classes to the database. I am getting an org.hibernate.LazyInitializationException when I try to add file locations to the list that is held in the user class. The method below is overridden from an external file-upload servlet class that I am using; it is called when a file has been uploaded. user1 is loaded from the database elsewhere. The exception occurs at user1.getFileLocations().add(fileLocation); and I don't really understand it at all. Any help would be great. The stack trace of the error is below.

        public String executeAction(HttpServletRequest request, List<FileItem> sessionFiles) throws UploadActionException {
            for (FileItem item : sessionFiles) {
                if (false == item.isFormField()) {
                    try {
                        YFUser user1 = (YFUser) getSession().getAttribute(SESSION_USER);
                        // This is the location where a file will be stored
                        String fileLocationString = "/Users/Stefano/Desktop/UploadedFiles/" + user1.getUsername();
                        File fl = new File(fileLocationString);
                        fl.mkdir();
                        // so here i will create the file container for my uploaded file
                        File file = File.createTempFile("upload-", ".bin", fl);
                        // this is where the file is written to disk
                        item.write(file);
                        // the FileLocation object is then created
                        FileLocation fileLocation = new FileLocation();
                        fileLocation.setLocation(fileLocationString);
                        // test
                        System.out.println("file path = " + file.getPath());
                        user1.getFileLocations().add(fileLocation);
                        // the line above is where the exception occurs
                    } catch (Exception e) {
                        throw new UploadActionException(e.getMessage());
                    }
                }
                removeSessionFileItems(request);
            }
            return null;
        }

    This is the class file for a Your Files user:

        @Entity
        @Table(name = "yf_user_table")
        public class YFUser implements Serializable, ILightEntity {

            @Id
            @GeneratedValue(strategy = GenerationType.AUTO)
            @Column(name = "user_id", nullable = false)
            private int userId;

            @Column(name = "username")
            private String username;

            @Column(name = "password")
            private String password;

            @Column(name = "email")
            private String email;

            @ManyToMany(cascade = CascadeType.ALL)
            @JoinTable(name = "USER_FILELOCATION",
                    joinColumns = { @JoinColumn(name = "user_id") },
                    inverseJoinColumns = { @JoinColumn(name = "locationId") })
            private List<FileLocation> fileLocations = new ArrayList<FileLocation>();

            public YFUser() {
            }

            public int getUserId() { return userId; }
            private void setUserId(int userId) { this.userId = userId; }

            public String getUsername() { return username; }
            public void setUsername(String username) { this.username = username; }

            public String getPassword() { return password; }
            public void setPassword(String password) { this.password = password; }

            public String getEmail() { return email; }
            public void setEmail(String email) { this.email = email; }

            public List<FileLocation> getFileLocations() {
                if (fileLocations == null) {
                    fileLocations = new ArrayList<FileLocation>();
                }
                return fileLocations;
            }

            public void setFileLocations(List<FileLocation> fileLocations) {
                this.fileLocations = fileLocations;
            }

            /* public void addFileLocation(FileLocation location) {
                fileLocations.add(location);
            } */

            @Override
            public void addProxyInformation(String property, Object proxyInfo) {
                // TODO Auto-generated method stub
            }

            @Override
            public String getDebugString() {
                // TODO Auto-generated method stub
                return null;
            }

            @Override
            public Object getProxyInformation(String property) {
                // TODO Auto-generated method stub
                return null;
            }

            @Override
            public boolean isInitialized(String property) {
                // TODO Auto-generated method stub
                return false;
            }

            @Override
            public void removeProxyInformation(String property) {
                // TODO Auto-generated method stub
            }

            @Override
            public void setInitialized(String property, boolean initialised) {
                // TODO Auto-generated method stub
            }

            @Override
            public Object getValue() {
                // TODO Auto-generated method stub
                return null;
            }
        }

        @Entity
        @Table(name = "fileLocationTable")
        public class FileLocation implements Serializable {

            @Id
            @GeneratedValue(strategy = GenerationType.AUTO)
            @Column(name = "locationId", updatable = false, nullable = false)
            private int ieId;

            @Column(name = "location")
            private String location;

            public FileLocation() {
            }

            public int getIeId() { return ieId; }
            private void setIeId(int ieId) { this.ieId = ieId; }

            public String getLocation() { return location; }
            public void setLocation(String location) { this.location = location; }
        }

    The stack trace:

        Apr 2, 2010 11:33:12 PM org.hibernate.LazyInitializationException <init>
        SEVERE: failed to lazily initialize a collection of role: com.example.client.YFUser.fileLocations, no session or session was closed
        org.hibernate.LazyInitializationException: failed to lazily initialize a collection of role: com.example.client.YFUser.fileLocations, no session or session was closed at org.hibernate.collection.AbstractPersistentCollection.throwLazyInitializationException(AbstractPersistentCollection.java:380) at org.hibernate.collection.AbstractPersistentCollection.throwLazyInitializationExceptionIfNotConnected(AbstractPersistentCollection.java:372) at org.hibernate.collection.AbstractPersistentCollection.initialize(AbstractPersistentCollection.java:365) at org.hibernate.collection.AbstractPersistentCollection.write(AbstractPersistentCollection.java:205) at org.hibernate.collection.PersistentBag.add(PersistentBag.java:297) at com.example.server.TestServiceImpl.saveFileLocation(TestServiceImpl.java:132) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25) at java.lang.reflect.Method.invoke(Method.java:597) at net.sf.gilead.gwt.PersistentRemoteService.processCall(PersistentRemoteService.java:174) at com.google.gwt.user.server.rpc.RemoteServiceServlet.processPost(RemoteServiceServlet.java:224) at com.google.gwt.user.server.rpc.AbstractRemoteServiceServlet.doPost(AbstractRemoteServiceServlet.java:62) at javax.servlet.http.HttpServlet.service(HttpServlet.java:713) at javax.servlet.http.HttpServlet.service(HttpServlet.java:806) at org.mortbay.jetty.servlet.ServletHolder.handle(ServletHolder.java:487) at org.mortbay.jetty.servlet.ServletHandler.handle(ServletHandler.java:362) at org.mortbay.jetty.security.SecurityHandler.handle(SecurityHandler.java:216) at org.mortbay.jetty.servlet.SessionHandler.handle(SessionHandler.java:181) at org.mortbay.jetty.handler.ContextHandler.handle(ContextHandler.java:729) at org.mortbay.jetty.webapp.WebAppContext.handle(WebAppContext.java:405) at org.mortbay.jetty.handler.HandlerWrapper.handle(HandlerWrapper.java:152) at org.mortbay.jetty.handler.RequestLogHandler.handle(RequestLogHandler.java:49) at org.mortbay.jetty.handler.HandlerWrapper.handle(HandlerWrapper.java:152) at org.mortbay.jetty.Server.handle(Server.java:324) at org.mortbay.jetty.HttpConnection.handleRequest(HttpConnection.java:505) at org.mortbay.jetty.HttpConnection$RequestHandler.content(HttpConnection.java:843) at org.mortbay.jetty.HttpParser.parseNext(HttpParser.java:647) at org.mortbay.jetty.HttpParser.parseAvailable(HttpParser.java:211) at org.mortbay.jetty.HttpConnection.handle(HttpConnection.java:380) at org.mortbay.io.nio.SelectChannelEndPoint.run(SelectChannelEndPoint.java:396) at org.mortbay.thread.QueuedThreadPool$PoolThread.run(QueuedThreadPool.java:488)
        Apr 2, 2010 11:33:12 PM net.sf.gilead.core.PersistentBeanManager clonePojo
        INFO: Third party instance, not cloned : org.hibernate.LazyInitializationException: failed to lazily initialize a collection of role: com.example.client.YFUser.fileLocations, no session or session was closed
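
    For illustration, the fileLocations collection is lazy by default, so it can only be initialized while the owning Hibernate Session is open. Below is a hedged sketch (not specific to Gilead or GWT) of re-attaching the user and touching the collection inside an open session and transaction; the sessionFactory wiring is an assumption.

        import org.hibernate.Hibernate;
        import org.hibernate.Session;
        import org.hibernate.SessionFactory;
        import org.hibernate.Transaction;

        public class FileLocationSaver {
            // Re-attaches the user and initializes the lazy collection inside an open
            // session before adding to it; how the SessionFactory is obtained is assumed.
            public void addLocation(SessionFactory sessionFactory, int userId, String path) {
                Session session = sessionFactory.openSession();
                Transaction tx = session.beginTransaction();
                try {
                    YFUser user = (YFUser) session.get(YFUser.class, userId);
                    Hibernate.initialize(user.getFileLocations()); // safe while the session is open

                    FileLocation location = new FileLocation();
                    location.setLocation(path);
                    user.getFileLocations().add(location);

                    session.saveOrUpdate(user);
                    tx.commit();
                } catch (RuntimeException e) {
                    tx.rollback();
                    throw e;
                } finally {
                    session.close();
                }
            }
        }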

    Read the article

  • SessionFactory in Hibernate

    - by komal
    Hi, I am using hibernate-2.1 and the "net.sf.hibernate.SessionFactory" class in my Spring project. Now I have switched to Spring 2.5.6.A, which uses Hibernate 3, and I cannot find the "net.sf.hibernate" package in it. But I did find a SessionFactory class in the package "org.springframework.orm.toplink". Are the two classes, "net.sf.hibernate.SessionFactory" from hibernate-2.1 and "org.springframework.orm.toplink.SessionFactory", the same? Can I replace the first with the second one? Thanks, Komal
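
    For context, in Hibernate 3 the SessionFactory interface lives in the org.hibernate package (it replaced net.sf.hibernate.SessionFactory), while org.springframework.orm.toplink.SessionFactory belongs to Spring's TopLink support and is unrelated. A minimal sketch of building the Hibernate 3 factory, assuming a hibernate.cfg.xml on the classpath:

        import org.hibernate.Session;
        import org.hibernate.SessionFactory;   // Hibernate 3 replacement for net.sf.hibernate.SessionFactory
        import org.hibernate.cfg.Configuration;

        public class HibernateUtil {
            // Builds the factory once from hibernate.cfg.xml on the classpath (assumed to exist).
            private static final SessionFactory SESSION_FACTORY =
                    new Configuration().configure().buildSessionFactory();

            public static Session openSession() {
                return SESSION_FACTORY.openSession();
            }
        }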

    Read the article

  • Hibernate annotations cascading doesn't work

    - by user304309
    Hi all, I've decided to switch from hbm.xml mappings to annotations with Hibernate. I had this in my hbm.xml:

        <hibernate-mapping package="by.sokol.jpr.data">
          <class name="Licence">
            <id name="licenceId">
              <generator class="native" />
            </id>
            <many-to-one name="user" lazy="false" cascade="save-update" column="usr"/>
          </class>
        </hibernate-mapping>

    And changed it to:

        @Entity
        public class Licence {
            @Id
            @GeneratedValue(strategy = GenerationType.AUTO)
            private int licenceId;

            @ManyToOne(targetEntity = User.class, fetch = FetchType.EAGER, cascade = CascadeType.ALL)
            @Cascade(value = { org.hibernate.annotations.CascadeType.SAVE_UPDATE })
            private User user;
        }

    Now Hibernate doesn't save the user when the Licence is saved. I really need help!
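
    For context, the Hibernate-specific SAVE_UPDATE cascade is applied on Session.saveOrUpdate() calls, so a minimal sketch of exercising the mapping above might look like this; the sessionFactory and the setUser() accessor are assumptions:

        import org.hibernate.Session;
        import org.hibernate.SessionFactory;
        import org.hibernate.Transaction;

        public class LicenceSaver {
            // With cascade save-update on Licence.user, saving the Licence should also
            // insert the transient User in the same flush.
            public void saveNewLicence(SessionFactory sessionFactory) {
                Session session = sessionFactory.openSession();
                Transaction tx = session.beginTransaction();
                try {
                    User user = new User();          // transient, not yet saved
                    Licence licence = new Licence();
                    licence.setUser(user);           // setter assumed

                    session.saveOrUpdate(licence);   // SAVE_UPDATE cascade targets this operation
                    tx.commit();
                } catch (RuntimeException e) {
                    tx.rollback();
                    throw e;
                } finally {
                    session.close();
                }
            }
        }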

    Read the article

  • spring jboss ehcache

    - by boyd4715
    I am trying to configure my application to make use of Ehcache. I am using Spring 2.5.6, JBoss 5.1.0 GA and its embedded version of Hibernate, along with ehcache-core 2.3.1. I have the following configuration:

        <property name="hibernateProperties">
          <props>
            <prop key="hibernate.dialect">org.hibernate.dialect.MySQLDialect</prop>
            <prop key="hibernate.hbm2ddl.auto">update</prop>
            <prop key="hibernate.show_sql">true</prop>
            <prop key="hibernate.jdbc.batch_size">20</prop>
            <prop key="hibernate.cache.provider_class">net.sf.ehcache.hibernate.SingletonEhCacheProvider</prop>
            <prop key="net.sf.ehcache.configurationResourceName">ehcache.xml</prop>
            <prop key="hibernate.cache.use_second_level_cache">true</prop>
            <prop key="hibernate.cache.use_structured_entries">true</prop>
            <prop key="hibernate.cache.use_query_cache">true</prop>
            <prop key="hibernate.generate_statistics">true</prop>
            <!-- prop key="hibernate.cache.use_second_level_cache">true</prop>
            <prop key="hibernate.cache.region.factory_class">net.sf.ehcache.hibernate.EhCacheRegionFactory</prop>
            <prop key="hibernate.cache.provider_class">net.sf.ehcache.hibernate.SingletonEhCacheProvider</prop>
            <prop key="hibernate.cache.use_second_level_cache">true</prop>
            <prop key="hibernate.cache.use_query_cache">true</prop-->
          </props>
        </property>

    This is my ehcache.xml, which is located on my classpath:

        <defaultCache eternal="false" overflowToDisk="false" maxElementsInMemory="50000"
            timeToIdleSeconds="30" timeToLiveSeconds="6000" memoryStoreEvictionPolicy="LRU" />

        <cache name="com.model.SystemProperty" maxElementsInMemory="5000" eternal="true"
            overflowToDisk="false" memoryStoreEvictionPolicy="LFU" />

    I have added the following to my domain object:

        @Cache(usage = CacheConcurrencyStrategy.READ_WRITE,
               region = "vsg.ecotrak.admin.store.domain.Store", include = "non-lazy")

    When I start up the server, it gets stuck. Here is the output:

        13:17:09,000 INFO [SettingsFactory] Second-level cache: enabled
        13:17:09,000 INFO [SettingsFactory] Query cache: enabled
        13:17:09,016 INFO [SettingsFactory] Cache region factory : org.hibernate.cache.impl.bridge.RegionFactoryCacheProviderBridge
        13:17:09,017 INFO [RegionFactoryCacheProviderBridge] Cache provider: net.sf.ehcache.hibernate.SingletonEhCacheProvider

    Any idea as to why this is happening? I am running on Windows 7 64-bit, if that matters. When I downgraded the ehcache jar to 1.2.3, the server started normally.
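
    For reference, the region given in @Cache has to match a <cache name="..."> element in ehcache.xml; when no named cache matches, Ehcache falls back to the defaultCache settings and logs a warning. A minimal sketch of an annotated entity for the cache declared above; the entity's fields are assumptions:

        import javax.persistence.Entity;
        import javax.persistence.Id;
        import org.hibernate.annotations.Cache;
        import org.hibernate.annotations.CacheConcurrencyStrategy;

        @Entity
        @Cache(usage = CacheConcurrencyStrategy.READ_WRITE,
               region = "com.model.SystemProperty")   // matches the <cache name="..."> entry in ehcache.xml
        public class SystemProperty {

            @Id
            private Long id;

            private String name;
            private String propValue;

            // getters and setters omitted for brevity
        }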

    Read the article

  • Enabling Hibernate second-level cache with JPA on JBoss 4.2

    - by Peter Hilton
    What are the steps required to enable Hibernate's second-level cache when using the Java Persistence API (annotated entities)? How do I check that it's working? I'm using JBoss 4.2.2.GA. From the Hibernate documentation, it seems that I need to enable the cache and specify a cache provider in persistence.xml, like:

        <property name="hibernate.cache.use_second_level_cache" value="true" />
        <property name="hibernate.cache.provider_class" value="org.hibernate.cache.HashtableCacheProvider" />

    What else is required? Do I need to add @Cache annotations to my JPA entities? How can I tell if the cache is working? I have tried accessing cache statistics after running a query, but Statistics.getSecondLevelCacheStatistics returns null, perhaps because I don't know what 'region' name to use.
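
    What usually completes the picture (a hedged sketch, not an exhaustive recipe): mark the entities with Hibernate's @Cache annotation, turn on hibernate.generate_statistics, and query the statistics under the default region name, which is the fully qualified entity class name. The Country entity and the getDelegate() cast to a Hibernate Session are assumptions:

        import javax.persistence.Entity;
        import javax.persistence.EntityManager;
        import javax.persistence.Id;
        import org.hibernate.Session;
        import org.hibernate.annotations.Cache;
        import org.hibernate.annotations.CacheConcurrencyStrategy;
        import org.hibernate.stat.SecondLevelCacheStatistics;
        import org.hibernate.stat.Statistics;

        @Entity
        @Cache(usage = CacheConcurrencyStrategy.READ_WRITE)
        class Country {
            @Id
            private Long id;
            private String name;
        }

        class CacheCheck {
            // Requires hibernate.generate_statistics=true; the default region name
            // is the fully qualified entity class name.
            static void printStats(EntityManager em) {
                Session session = (Session) em.getDelegate();
                Statistics stats = session.getSessionFactory().getStatistics();
                SecondLevelCacheStatistics cacheStats =
                        stats.getSecondLevelCacheStatistics(Country.class.getName());
                System.out.println("puts=" + cacheStats.getPutCount()
                        + " hits=" + cacheStats.getHitCount()
                        + " misses=" + cacheStats.getMissCount());
            }
        }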

    Read the article

  • Terracotta With Hibernate and EHCache

    - by Joe Biron
    Head swimming with the product-name soup at http://www.terracotta.org. I need someone to help clarify what I need. Background: the app has some "legacy" persistence code that does not use Hibernate but has a home-grown cache implementation. New entities are Hibernate-enabled. What I want: to use Terracotta for Hibernate's second-level cache. I think I then want to slide out the home-grown cache impl and slide in Ehcache (very similar semantically to the home-grown version), and obviously I want Terracotta to back that Ehcache as well. Confused with: will I be telling Hibernate that Ehcache is its cache provider, and then configuring Ehcache to use Terracotta? So: (hibernate | legacy-persistence) - ehcache - terracotta. Am I on the right track? Forgive the newb question, but the terracotta.org site really confuses me since so much of it is trying to sell me the commercial varieties.

    Read the article

  • Weaknesses of Hibernate

    - by Sinuhe
    I would like to know the weak points of Hibernate 3. This is not intended to be a thread against Hibernate; I think it will be very useful knowledge for deciding whether Hibernate is the best option for a project, or for estimating its time. A weakness can be:

    - A bug
    - A case where JDBC or PL/SQL is better
    - Performance issues
    - ...

    It would also be useful to know some solutions for those problems, better ORMs or techniques, or whether it will be corrected in Hibernate 4. For example, AFAIK Hibernate will have very bad performance compared to JDBC when updating 10000 rows with this query:

        update A set state=3 where state=2
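
    As an aside, the specific example above is usually addressed inside Hibernate with an HQL bulk update, which issues a single UPDATE statement instead of loading and flushing each entity. A hedged sketch, reusing the entity and column names from the example; the sessionFactory is an assumption:

        import org.hibernate.Session;
        import org.hibernate.SessionFactory;
        import org.hibernate.Transaction;

        public class BulkStateUpdate {
            // Executes one UPDATE in the database rather than dirty-checking 10000 entities.
            public static int promote(SessionFactory sessionFactory) {
                Session session = sessionFactory.openSession();
                Transaction tx = session.beginTransaction();
                try {
                    int updated = session.createQuery("update A set state = 3 where state = 2")
                                         .executeUpdate();
                    tx.commit();
                    return updated;
                } catch (RuntimeException e) {
                    tx.rollback();
                    throw e;
                } finally {
                    session.close();
                }
            }
        }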

    Read the article

  • Problem updating blob with hibernate?

    - by johnsmith51
    Hi, I am having a problem updating a blob with Hibernate. My model has these getters/setters for Hibernate, i.e. internally I deal with byte[], so the getter/setter converts the byte[] to a Blob. I can create the initial object without problems, but if I try to change the content of the blob, the database column is not updated. I do not get any error message; everything looks fine, except that the database is not updated.

        /** do not use, for hibernate only */
        public Blob getLogoBinaryBlob() {
            if (logoBinary == null) {
                return null;
            }
            return Hibernate.createBlob(logoBinary);
        }

        /** do not use, for hibernate only */
        public void setLogoBinaryBlob(Blob logoBinaryBlob) {
            ByteArrayOutputStream baos = new ByteArrayOutputStream();
            try {
                logoBinary = toByteArrayImpl(logoBinaryBlob, baos);
            } catch (Exception e) {
            }
        }
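
    One commonly suggested alternative (a hedged sketch, not necessarily the fix for the symptom above) is to map the byte[] directly with @Lob and let Hibernate handle the Blob conversion, so the property takes part in normal dirty checking. The entity and column names are assumptions:

        import javax.persistence.Column;
        import javax.persistence.Entity;
        import javax.persistence.Id;
        import javax.persistence.Lob;

        @Entity
        public class Company {                 // hypothetical owning entity

            @Id
            private Long id;

            @Lob
            @Column(name = "logo_binary")
            private byte[] logoBinary;         // Hibernate handles the Blob conversion

            public byte[] getLogoBinary() { return logoBinary; }
            public void setLogoBinary(byte[] logoBinary) { this.logoBinary = logoBinary; }
        }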

    Read the article

  • Avoid implicit conversion from date to timestamp for selects with Oracle using Hibernate

    - by sapporo
    I'm using Hibernate 3.2.7.GA criteria queries to select rows from an Oracle Enterprise Edition 10.2.0.4.0 database, filtering by a timestamp field. The field in question is of type java.util.Date in Java and DATE in Oracle. It turns out that the field gets mapped to java.sql.Timestamp, and Oracle converts all rows to TIMESTAMP before comparing to the passed-in value, bypassing the index and thereby ruining performance. One solution would be to use Hibernate's sqlRestriction() along with Oracle's TO_DATE function. That would fix performance, but requires rewriting the application code (lots of queries). So is there a more elegant solution? Since Hibernate already does type mapping, could it be configured to do the right thing? Update: the problem occurs in a variety of configurations, but here's one specific example:

    - Oracle Enterprise Edition 10.2.0.4.0
    - Oracle JDBC Driver 11.1.0.7.0
    - Hibernate 3.2.7.GA
    - Hibernate's Oracle10gDialect
    - Java 1.6.0_16
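
    For illustration, the sqlRestriction() workaround mentioned above could look roughly like this with the Criteria API; the Shipment entity and CREATED_DATE column are assumptions, and {alias} is replaced by Hibernate with the root entity's SQL alias:

        import java.text.SimpleDateFormat;
        import java.util.Date;
        import java.util.List;
        import org.hibernate.Criteria;
        import org.hibernate.Hibernate;
        import org.hibernate.Session;
        import org.hibernate.criterion.Restrictions;

        public class ShipmentDao {
            // Compares against a DATE value instead of a bound Timestamp, so Oracle does not
            // promote the indexed DATE column to TIMESTAMP before comparing.
            public List findCreatedOnOrAfter(Session session, Date cutoff) {
                String day = new SimpleDateFormat("yyyy-MM-dd").format(cutoff);
                Criteria criteria = session.createCriteria(Shipment.class)
                    .add(Restrictions.sqlRestriction(
                        "{alias}.CREATED_DATE >= TO_DATE(?, 'YYYY-MM-DD')",
                        day, Hibernate.STRING));
                return criteria.list();
            }
        }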

    Read the article

  • Using Hibernate with Dynamic Eclipse Plug-ins

    - by AlbertoPL
    I have classes that are named exactly the same across the different plug-ins I use for my application, and I'd like to be able to configure them properly with Hibernate. The problem is that Hibernate appears to resolve a class's package name dynamically when it looks the class up during mapping. With one plug-in this scheme works, but across multiple plug-ins it does not: Hibernate seems to get confused when dealing with Hibernate configuration files spread across multiple plug-ins. Is this because each plug-in has its own class loader? What is the best way to proceed to make this work with the existing plug-ins and Hibernate?

    Read the article

  • Hibernate Lazy init exception in spring scheduled job

    - by Noam Nevo
    I have a Spring scheduled job (@Scheduled) that sends emails from my system according to a list of recipients in the DB. The scheduled method invokes a method declared on another interface, and that interface method is annotated with @Transactional. Now, when I invoke the scheduled method manually, it works perfectly. But when the method is invoked by the Spring scheduler, I get the LazyInitializationException in the class implementing that interface. What am I doing wrong? The code is below. The scheduled method:

        @Component
        public class ScheduledReportsSender {

            public static final int MAX_RETIRES = 3;
            public static final long HALF_HOUR = 1000 * 60 * 30;

            @Autowired
            IScheduledReportDAO scheduledReportDAO;
            @Autowired
            IDataService dataService;
            @Autowired
            IErrorService errorService;

            @Scheduled(cron = "0 0 3 ? * *") // every day at 2:10AM
            public void runDailyReports() {
                // get all daily reports
                List<ScheduledReport> scheduledReports = scheduledReportDAO.getDaily();
                sendScheduledReports(scheduledReports);
            }

            private void sendScheduledReports(List<ScheduledReport> scheduledReports) {
                if (scheduledReports.size() < 1) {
                    return;
                }
                // check if data flow ended its process by checking the report_last_updated table in dwh
                int reportTimeId = scheduledReportDAO.getReportTimeId();
                String todayTimeId = DateUtils.getTimeid(DateUtils.getTodayDate());
                int yesterdayTimeId = Integer.parseInt(DateUtils.addDaysSafe(todayTimeId, -1));
                int counter = 0;
                // wait for time id to update from the daily flow
                while (reportTimeId != yesterdayTimeId && counter < MAX_RETIRES) {
                    errorService.logException("Daily report sender, data not ready. Will try again in one hour.", null, null, null);
                    try {
                        Thread.sleep(HALF_HOUR);
                    } catch (InterruptedException ignore) {}
                    reportTimeId = scheduledReportDAO.getReportTimeId();
                    counter++;
                }
                if (counter == MAX_RETIRES) {
                    MarketplaceServiceException mse = new MarketplaceServiceException();
                    mse.setMessage("Data flow not done for today, reports are not sent.");
                    throw mse;
                }
                // get updated timeid
                updateTimeId();
                for (ScheduledReport scheduledReport : scheduledReports) {
                    dataService.generateScheduledReport(scheduledReport);
                }
            }
        }

    The invoked interface:

        public interface IDataService {
            @Transactional
            public void generateScheduledReport(ScheduledReport scheduledReport);
        }

    The implementation (up to the line of the exception):

        @Service
        public class DataService implements IDataService {

            public void generateScheduledReport(ScheduledReport scheduledReport) {
                // if no recipients or no export type - return
                if (scheduledReport.getRecipients() == null
                        || scheduledReport.getRecipients().size() == 0
                        || scheduledReport.getExportType() == null) {
                    return;
                }
            }
        }

    Stack trace:

        ERROR: 2012-09-01 03:30:00,365 [Scheduler-15] LazyInitializationException.<init>(42) | failed to lazily initialize a collection of role: com.x.model.scheduledReports.ScheduledReport.recipients, no session or session was closed
        org.hibernate.LazyInitializationException: failed to lazily initialize a collection of role: com.x.model.scheduledReports.ScheduledReport.recipients, no session or session was closed at org.hibernate.collection.AbstractPersistentCollection.throwLazyInitializationException(AbstractPersistentCollection.java:383) at org.hibernate.collection.AbstractPersistentCollection.throwLazyInitializationExceptionIfNotConnected(AbstractPersistentCollection.java:375) at org.hibernate.collection.AbstractPersistentCollection.readSize(AbstractPersistentCollection.java:122) at org.hibernate.collection.PersistentBag.size(PersistentBag.java:248) at com.x.service.DataService.generateScheduledReport(DataService.java:219) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:616) at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:309) at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:183) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:150) at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:110) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172) at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:202) at $Proxy208.generateScheduledReport(Unknown Source) at com.x.scheduledJobs.ScheduledReportsSender.sendScheduledReports(ScheduledReportsSender.java:85) at com.x.scheduledJobs.ScheduledReportsSender.runDailyReports(ScheduledReportsSender.java:38) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:616) at org.springframework.util.MethodInvoker.invoke(MethodInvoker.java:273) at org.springframework.scheduling.support.MethodInvokingRunnable.run(MethodInvokingRunnable.java:65) at org.springframework.scheduling.support.DelegatingErrorHandlingRunnable.run(DelegatingErrorHandlingRunnable.java:51) at org.springframework.scheduling.concurrent.ReschedulingRunnable.run(ReschedulingRunnable.java:81) at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471) at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334) at java.util.concurrent.FutureTask.run(FutureTask.java:166) at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$101(ScheduledThreadPoolExecutor.java:165) at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:266) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1110) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:603) at java.lang.Thread.run(Thread.java:636)

        ERROR: 2012-09-01 03:30:00,366 [Scheduler-15] MethodInvokingRunnable.run(68) | Invocation of method 'runDailyReports' on target class [class com.x.scheduledJobs.ScheduledReportsSender] failed
        org.hibernate.LazyInitializationException: failed to lazily initialize a collection of role: com.x.model.scheduledReports.ScheduledReport.recipients, no session or session was closed at org.hibernate.collection.AbstractPersistentCollection.throwLazyInitializationException(AbstractPersistentCollection.java:383) at org.hibernate.collection.AbstractPersistentCollection.throwLazyInitializationExceptionIfNotConnected(AbstractPersistentCollection.java:375) at org.hibernate.collection.AbstractPersistentCollection.readSize(AbstractPersistentCollection.java:122) at org.hibernate.collection.PersistentBag.size(PersistentBag.java:248) at com.x.service.DataService.generateScheduledReport(DataService.java:219) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:616) at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:309) at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:183) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:150) at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:110) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172) at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:202) at $Proxy208.generateScheduledReport(Unknown Source) at com.x.scheduledJobs.ScheduledReportsSender.sendScheduledReports(ScheduledReportsSender.java:85) at com.x.scheduledJobs.ScheduledReportsSender.runDailyReports(ScheduledReportsSender.java:38) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:616) at org.springframework.util.MethodInvoker.invoke(MethodInvoker.java:273) at org.springframework.scheduling.support.MethodInvokingRunnable.run(MethodInvokingRunnable.java:65) at org.springframework.scheduling.support.DelegatingErrorHandlingRunnable.run(DelegatingErrorHandlingRunnable.java:51) at org.springframework.scheduling.concurrent.ReschedulingRunnable.run(ReschedulingRunnable.java:81) at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471) at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334) at java.util.concurrent.FutureTask.run(FutureTask.java:166) at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$101(ScheduledThreadPoolExecutor.java:165) at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:266) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1110) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:603) at java.lang.Thread.run(Thread.java:636)
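
    For context, one common reading of this situation: the ScheduledReport instances are loaded in the DAO's transaction and are detached by the time generateScheduledReport runs in its own transaction, so the lazy recipients collection can no longer be initialized (a web request may keep a session open via open-session-in-view, which would explain why the manual invocation works). A hedged sketch of re-loading the entity inside the transactional method; the sessionFactory wiring and the getId() accessor are assumptions:

        @Service
        public class DataService implements IDataService {

            @Autowired
            private SessionFactory sessionFactory;

            @Transactional   // annotating the concrete method as well makes the intent explicit
            public void generateScheduledReport(ScheduledReport scheduledReport) {
                // Re-associate the detached entity with the current session so its
                // lazy 'recipients' collection can be initialized inside this transaction.
                ScheduledReport attached = (ScheduledReport) sessionFactory
                        .getCurrentSession()
                        .get(ScheduledReport.class, scheduledReport.getId());   // getId() assumed

                if (attached == null || attached.getRecipients() == null
                        || attached.getRecipients().isEmpty()
                        || attached.getExportType() == null) {
                    return;
                }
                // ... build and send the report ...
            }
        }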

    Read the article

  • Hibernate not using schema and catalog name in id generation with strategy increment

    - by Ben
    Hi, I am using the Hibernate increment strategy to create the IDs on my entities:

        @GenericGenerator(name = "increment-strategy", strategy = "increment")
        @Id
        @GeneratedValue(generator = "increment-strategy")
        @Column(name = "HDR_ID", unique = true, nullable = false)
        public int getHdrId() {
            return this.hdrId;
        }

    The entity has the following table annotation:

        @Table(name = "PORDER.PUB.PO_HEADER", schema = "UVOSi", catalog = "VIRT_UVOS")

    Please note I have two datasources. When I try to insert an entity, Hibernate creates the following SQL statement:

        select max(hdr_id) from PORDER.PUB.PO_HEADER

    which causes the following error: "Group specified is ambiguous, resubmit the query by fully qualifying group name." When I create a query by hand with entityManager.createQuery(), Hibernate uses the fully qualified name

        select XXX from VIRT_UVOS.UVOSi.PORDER.PUB.PO_HEADER

    and that works fine. So how do I get Hibernate to use the fully qualified name in the ID autogeneration? Btw, I am using Hibernate 3.2 and Seam 2.2 running on JBoss 4.2.3. Regards, Immo

    Read the article

  • disable hibernate logging in console

    - by ganiOz
    Hi, my log4j.properties looks like this:

        log4j.rootCategory=DEBUG, A1
        log4j.appender.A1=org.apache.log4j.RollingFileAppender
        log4j.appender.A1.File=InteroperabilityFatal.log
        log4j.appender.A1.MaxFileSize=1000KB
        log4j.appender.A1.MaxBackupIndex=1000
        log4j.appender.A1.layout=org.apache.log4j.PatternLayout
        log4j.appender.A1.layout.ConversionPattern=%p %t %c - %m%n
        log4j.appender.A1.Threshold=FATAL
        log4j.appender.A1.Append=true
        log4j.logger.org.hibernate=FATAL
        log4j.logger.org.hibernate.sql=FATAL
        log4j.logger.org.hibernate.hql=error

    I want only fatal logs in the file and nothing in the console, but Hibernate is logging all its info to the console. Can someone please let me know a way to stop this? I tried it in Eclipse and from an executable jar file, and Hibernate keeps logging to the console either way. Thanks in advance for the help.

    Read the article

  • Hibernate doesn't generate cascade

    - by Shervin
    Hi. I have set hibernate.hbm2ddl.auto to create so that Hibernate creates the tables in MySQL for me. However, Hibernate doesn't seem to add cascade clauses to the references in the generated tables. Cascading does work when I, for instance, delete a row and have a delete cascade as a Hibernate annotation. So I guess that means Hibernate reads the annotation at runtime and performs the cascading itself? Is that normal behavior? For instance:

        @Entity
        class Report {
            @OneToOne(cascade = CascadeType.ALL)
            public File getPdf() {
                return pdf;
            }
        }

    Here I have set cascade to ALL. However, running show create table Report gives:

        Report | CREATE TABLE `Report` (
          `id` bigint(20) NOT NULL AUTO_INCREMENT,
          `pdf_id` bigint(20) DEFAULT NULL,
          PRIMARY KEY (`id`),
          KEY `FK91B14154FDE6543A` (`pdf_id`),
          CONSTRAINT `FK91B14154FDE6543A` FOREIGN KEY (`pdf_id`) REFERENCES `File` (`id`)
        ) ENGINE=InnoDB DEFAULT CHARSET=utf8 |

    It doesn't say anything about cascading other than the foreign key. In my opinion, it should have added ON DELETE CASCADE / ON UPDATE CASCADE.
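
    For what it's worth, JPA cascade settings are applied by Hibernate at runtime and never show up in the exported DDL; when a database-level cascade is wanted in the generated schema, Hibernate's @OnDelete annotation is the usual route, and it is documented for collection mappings. A hedged sketch on a hypothetical one-to-many (whether it is honored on a one-to-one varies by Hibernate version):

        import java.util.List;
        import javax.persistence.Entity;
        import javax.persistence.Id;
        import javax.persistence.ManyToOne;
        import javax.persistence.OneToMany;
        import org.hibernate.annotations.OnDelete;
        import org.hibernate.annotations.OnDeleteAction;

        @Entity
        class Invoice {
            @Id
            private Long id;

            // Schema export adds ON DELETE CASCADE to the foreign key on InvoiceLine,
            // so deleting an Invoice row removes its lines at the database level.
            @OneToMany(mappedBy = "invoice")
            @OnDelete(action = OnDeleteAction.CASCADE)
            private List<InvoiceLine> lines;
        }

        @Entity
        class InvoiceLine {
            @Id
            private Long id;

            @ManyToOne
            private Invoice invoice;
        }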

    Read the article

  • How to code a 'Next in Results' within search results in PHP

    - by thebluefox
    Right, a bit of a head-scratcher, although I've got a feeling there's an obvious answer and I'm just not seeing the wood for the trees. Basically, I'm using Solr as the search engine for my site, bringing back 15 results per page. When you click on a result, you get a detail page that has a "Next in Results" link on it, which obviously forwards you to the next result. What's the best way of doing this? I've come up with a few solutions, but they're either impractical or just don't work. I could store all the IDs in a session array, then grab the one after the current one and put that in the link. But with possibly hundreds or thousands of results, the memory that array would need and the performance hit of dealing with it aren't practical. I could take the same approach and put it into the DB, but I'd still have to deal with a potentially huge array when I grab them out of the DB. Or I could do the search again, returning only the IDs, and grab the one after the one we're currently looking at. I think this could be the best option? Although it does seem kind of messy, namely when I have to select the ID that's on a different 'page' (i.e. the 16th, 31st, etc. result). Unless I pass through where it was in the results and select from there, but that still doesn't seem like the right way to do it. I'm really sorry if this is just complete nonsense; any help is massively appreciated as always. Cheers guys!

    Read the article

  • How to create an entity with a composite primary key containing a generated value.

    - by David
    Using Hibernate + annotations, I'm trying to do the following: two entities, Entity1 and Entity2. Entity1 has a simple generated-value primary key. Entity2's primary key is composed of a simple generated value plus the id of Entity1 (with a many-to-one relationship). Unfortunately, I can't make it work. Here is an excerpt of the code:

        @Entity
        public class Entity1 {
            @Id
            @GeneratedValue
            private Long id;
            private String name;
            ...
        }

        @Entity
        public class Entity2 {
            @EmbeddedId
            private Entity2PK pk = new Entity2PK();
            private String miscData;
            ...
        }

        @Embeddable
        public class Entity2PK implements Serializable {
            @GeneratedValue
            private Long id;
            @ManyToOne
            private Entity1 entity;
        }

        void test() {
            Entity1 e1 = new Entity1();
            e1.setName("nameE1");
            Entity2 e2 = new Entity2();
            e2.setEntity1(e1);
            e2.setMiscData("test");
            Transaction transaction = session.getTransaction();
            try {
                transaction.begin();
                session.save(e1);
                session.save(e2);
                transaction.commit();
            } catch (Exception e) {
                transaction.rollback();
            } finally {
                session.close();
            }
        }

    When I run the test method, I get the following errors:

        Hibernate: insert into Entity1 (id, name) values (null, ?)
        Hibernate: call identity()
        Hibernate: insert into Entity2 (miscData, entity_id, id) values (?, ?, ?)
        07-Jun-2010 10:51:11 org.hibernate.util.JDBCExceptionReporter logExceptions
        WARNING: SQL Error: 0, SQLState: null
        07-Jun-2010 10:51:11 org.hibernate.util.JDBCExceptionReporter logExceptions
        SEVERE: failed batch
        07-Jun-2010 10:51:11 org.hibernate.event.def.AbstractFlushingEventListener performExecutions
        SEVERE: Could not synchronize database state with session
        org.hibernate.exception.GenericJDBCException: Could not execute JDBC batch update at org.hibernate.exception.SQLStateConverter.handledNonSpecificException(SQLStateConverter.java:103) at org.hibernate.exception.SQLStateConverter.convert(SQLStateConverter.java:91) at org.hibernate.exception.JDBCExceptionHelper.convert(JDBCExceptionHelper.java:43) at org.hibernate.jdbc.AbstractBatcher.executeBatch(AbstractBatcher.java:254) at org.hibernate.engine.ActionQueue.executeActions(ActionQueue.java:266) at org.hibernate.engine.ActionQueue.executeActions(ActionQueue.java:167) at org.hibernate.event.def.AbstractFlushingEventListener.performExecutions(AbstractFlushingEventListener.java:298) at org.hibernate.event.def.DefaultFlushEventListener.onFlush(DefaultFlushEventListener.java:27) at org.hibernate.impl.SessionImpl.flush(SessionImpl.java:1001) at org.hibernate.impl.SessionImpl.managedFlush(SessionImpl.java:339) at org.hibernate.transaction.JDBCTransaction.commit(JDBCTransaction.java:106) at test.App.main(App.java:32)
        Caused by: java.sql.BatchUpdateException: failed batch at org.hsqldb.jdbc.jdbcStatement.executeBatch(Unknown Source) at org.hsqldb.jdbc.jdbcPreparedStatement.executeBatch(Unknown Source) at org.hibernate.jdbc.BatchingBatcher.doExecuteBatch(BatchingBatcher.java:48) at org.hibernate.jdbc.AbstractBatcher.executeBatch(AbstractBatcher.java:247) ... 8 more

    Note that I use HSQLDB. Any ideas about what is wrong?
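
    For context, JPA does not support @GeneratedValue on an attribute inside an @EmbeddedId, which is the usual explanation for failures like this one. A common workaround (a hedged sketch, not the only option) is to give Entity2 a plain surrogate generated key and keep the link to Entity1 as an ordinary association:

        import javax.persistence.Entity;
        import javax.persistence.GeneratedValue;
        import javax.persistence.Id;
        import javax.persistence.JoinColumn;
        import javax.persistence.ManyToOne;

        @Entity
        public class Entity2 {

            @Id
            @GeneratedValue
            private Long id;                 // surrogate key generated by the database

            @ManyToOne(optional = false)
            @JoinColumn(name = "entity1_id")
            private Entity1 entity1;         // formerly part of the composite key, now a plain FK

            private String miscData;

            // getters and setters omitted
        }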

    Read the article

  • Hibernate object equality checking

    - by Sujee
    As far as I understand (correct me if I am wrong), Hibernate uses object identity (reference equality) to check whether two objects are the same. When Hibernate identifies that more than one object is attached to the same DB record, it throws the following exception: "a different object with the same identifier value was already associated with the session". My question is: does Hibernate use the equals() method to check object equality (the default equals() compares object references)? If it does, will an overridden equals() method change Hibernate's behavior? Note: my question is not about the issues of implementing equals() or hashCode() in a Hibernate-persisted object. Thank you.
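
    For illustration, the exception quoted above (Hibernate's NonUniqueObjectException) comes from the session's first-level cache, which is keyed by identifier and compares instances by reference, so an overridden equals() does not change it. A hedged sketch of a situation that triggers it; the Item entity and its accessors are assumptions:

        import org.hibernate.Session;
        import org.hibernate.SessionFactory;
        import org.hibernate.Transaction;

        public class NonUniqueObjectDemo {
            // The identity map already holds one instance for id 42, so updating a
            // second instance with the same id fails regardless of Item.equals().
            public void demo(SessionFactory sessionFactory) {
                Session session = sessionFactory.openSession();
                Transaction tx = session.beginTransaction();
                try {
                    Item managed = (Item) session.get(Item.class, 42L);  // instance now in the session
                    System.out.println(managed.getName());

                    Item detached = new Item();   // second instance carrying the same identifier
                    detached.setId(42L);
                    detached.setName("renamed");

                    session.update(detached);     // throws NonUniqueObjectException:
                                                  // "a different object with the same identifier
                                                  //  value was already associated with the session"
                    tx.commit();
                } finally {
                    session.close();
                }
            }
        }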

    Read the article
