Search Results

Search found 22897 results on 916 pages for 'query processing'.


  • Nested queries in Arel

    - by Schrockwell
    I am attempting to nest SELECT queries in Arel and/or Active Record in Rails 3 to generate the following SQL statement: SELECT sorted.* FROM (SELECT * FROM points ORDER BY points.timestamp DESC) AS sorted GROUP BY sorted.client_id An alias for the subquery can be created by doing points = Table(:points) sorted = points.order('timestamp DESC').alias but then I'm stuck as to how to pass it into the parent query (short of calling #to_sql, which sounds pretty ugly). How do you use a SELECT statement as a sub-query in Arel (or Active Record) to accomplish the above? Maybe there's an altogether different way to accomplish this query that doesn't use nested queries?
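
    A possible plain-SQL alternative (a sketch only: table and column names are taken from the question, and it assumes each client_id has a single newest timestamp) is to join against the per-client maximum timestamp instead of nesting ORDER BY inside GROUP BY:

        -- newest point per client_id, without the ORDER BY + GROUP BY trick
        SELECT p.*
        FROM points AS p
        JOIN (SELECT client_id, MAX(timestamp) AS max_ts
              FROM points
              GROUP BY client_id) AS latest
          ON latest.client_id = p.client_id
         AND latest.max_ts = p.timestamp;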

    Read the article

  • Rows dropping when I try to join data from two tables

    - by blcArmadillo
    I have a fairly simple query I'm trying to write. If I run the following query: SELECT parts.id, parts.type_id FROM parts WHERE parts.type_id=1 OR parts.type_id=2 OR parts.type_id=4 ORDER BY parts.type_id; I get all the rows I expect to be returned. Now when I try to grab the parent_unit from another table with the following query, six rows suddenly drop out of the result: SELECT parts.id, parts.type_id, sp.parent_unit FROM parts, serialized_parts sp WHERE (parts.type_id=1 OR parts.type_id=2 OR parts.type_id=4) AND sp.parts_id = parts.id ORDER BY parts.type_id In the past I've never really dealt with ORs in my queries, so maybe I'm just doing it wrong. That said, I'm guessing it's just a simple mistake. Let me know if you need sample data and I'll post some. Thanks.
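
    If the six missing rows are parts that have no matching row in serialized_parts, the implicit inner join is what drops them. A hedged sketch of the same query written with a LEFT JOIN (table and column names taken from the question) keeps every part and returns NULL for parent_unit where nothing matches:

        -- keep all parts of the wanted types, even without a serialized_parts row
        SELECT p.id, p.type_id, sp.parent_unit
        FROM parts AS p
        LEFT JOIN serialized_parts AS sp ON sp.parts_id = p.id
        WHERE p.type_id IN (1, 2, 4)
        ORDER BY p.type_id;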

    Read the article

  • IEnumerable<> to IList<>

    - by nachid
    I am using Linq to query my database and returning a generic IList. Whatever I tried, I couldn't convert an IQueryable to an IList. Here is my code. I cannot write it any simpler than this and I don't understand why it is not working. public IList<IRegion> GetRegionList(string countryCode) { var query = from c in Database.RegionDataSource where (c.CountryCode == countryCode) orderby c.Name select new {c.RegionCode, c.RegionName}; return query.Cast<IRegion>().ToList(); } This returns a list with the right number of items, but they are all empty. Please help, I have been blocked on this for a couple of days now.

    Read the article

  • NHibernate fatal error

    - by Afif Lamloumi
    i have an error ( System.InvalidCastException: Unable to cast object of type 'AccountProxy' to type 'System.String'.) when i did this code i mapped the tables( Account,AccountString,EventData,...) of the the database opengts ( open source) i have this error when i called a function from EventData.cs IQuery query = session.CreateQuery("FROM Eventdata"); IList pets = query.List(); return pets; the Stack Trace: [InvalidCastException: Impossible d'effectuer un cast d'un objet de type 'AccountProxy' en type 'System.String'.] (Object , Object[] , SetterCallback ) +431 NHibernate.Bytecode.Lightweight.AccessOptimizer.SetPropertyValues(Object target, Object[] values) +20 NHibernate.Tuple.Component.PocoComponentTuplizer.SetPropertyValues(Object component, Object[] values) +49 NHibernate.Type.ComponentType.SetPropertyValues(Object component, Object[] values, EntityMode entityMode) +34 NHibernate.Type.ComponentType.ResolveIdentifier(Object value, ISessionImplementor session, Object owner) +150 NHibernate.Type.ComponentType.NullSafeGet(IDataReader rs, String[] names, ISessionImplementor session, Object owner) +42 NHibernate.Loader.Loader.GetKeyFromResultSet(Int32 i, IEntityPersister persister, Object id, IDataReader rs, ISessionImplementor session) +93 NHibernate.Loader.Loader.GetRowFromResultSet(IDataReader resultSet, ISessionImplementor session, QueryParameters queryParameters, LockMode[] lockModeArray, EntityKey optionalObjectKey, IList hydratedObjects, EntityKey[] keys, Boolean returnProxies) +92 NHibernate.Loader.Loader.DoQuery(ISessionImplementor session, QueryParameters queryParameters, Boolean returnProxies) +675 NHibernate.Loader.Loader.DoQueryAndInitializeNonLazyCollections(ISessionImplementor session, QueryParameters queryParameters, Boolean returnProxies) +129 NHibernate.Loader.Loader.DoList(ISessionImplementor session, QueryParameters queryParameters) +116 [GenericADOException: could not execute query [ select eventdata0_.deviceID as deviceID5_, eventdata0_.timestamp as timestamp5_, eventdata0_.statusCode as statusCode5_, eventdata0_.accountID as accountID5_, eventdata0_.latitude as latitude5_, eventdata0_.longitude as longitude5_, eventdata0_.gpsAge as gpsAge5_, eventdata0_.speedKPH as speedKPH5_, eventdata0_.heading as heading5_, eventdata0_.altitude as altitude5_, eventdata0_.transportID as transpo11_5_, eventdata0_.inputMask as inputMask5_, eventdata0_.outputMask as outputMask5_, eventdata0_.address as address5_, eventdata0_.DataSource as DataSource5_, eventdata0_.rawdata as rawdata5_, eventdata0_.distanceKM as distanceKM5_, eventdata0_.odometerKM as odometerKM5_, eventdata0_.geozoneIndex as geozone19_5_, eventdata0_.geozoneID as geozoneID5_, eventdata0_.creationTime as creatio21_5_ from eventdata eventdata0_ ] [SQL: select eventdata0_.deviceID as deviceID5_, eventdata0_.timestamp as timestamp5_, eventdata0_.statusCode as statusCode5_, eventdata0_.accountID as accountID5_, eventdata0_.latitude as latitude5_, eventdata0_.longitude as longitude5_, eventdata0_.gpsAge as gpsAge5_, eventdata0_.speedKPH as speedKPH5_, eventdata0_.heading as heading5_, eventdata0_.altitude as altitude5_, eventdata0_.transportID as transpo11_5_, eventdata0_.inputMask as inputMask5_, eventdata0_.outputMask as outputMask5_, eventdata0_.address as address5_, eventdata0_.DataSource as DataSource5_, eventdata0_.rawdata as rawdata5_, eventdata0_.distanceKM as distanceKM5_, eventdata0_.odometerKM as odometerKM5_, eventdata0_.geozoneIndex as geozone19_5_, eventdata0_.geozoneID as geozoneID5_, 
eventdata0_.creationTime as creatio21_5_ from eventdata eventdata0_]] NHibernate.Loader.Loader.DoList(ISessionImplementor session, QueryParameters queryParameters) +213 NHibernate.Loader.Loader.ListIgnoreQueryCache(ISessionImplementor session, QueryParameters queryParameters) +18 NHibernate.Loader.Loader.List(ISessionImplementor session, QueryParameters queryParameters, ISet`1 querySpaces, IType[] resultTypes) +79 NHibernate.Hql.Ast.ANTLR.Loader.QueryLoader.List(ISessionImplementor session, QueryParameters queryParameters) +51 NHibernate.Hql.Ast.ANTLR.QueryTranslatorImpl.List(ISessionImplementor session, QueryParameters queryParameters) +231 NHibernate.Engine.Query.HQLQueryPlan.PerformList(QueryParameters queryParameters, ISessionImplementor session, IList results) +369 NHibernate.Impl.SessionImpl.List(String query, QueryParameters queryParameters, IList results) +317 NHibernate.Impl.SessionImpl.List(String query, QueryParameters parameters) +282 NHibernate.Impl.QueryImpl.List() +163 DATA1.EventdataExtensions.GetEventdata() in C:\Users\HP\Desktop\our_project\DATA1\Queries\Eventdata.cs:33 MvcApplication7.Controllers.HistoriqueController.Index() in C:\Users\HP\Desktop\our_project\MvcApplication7\Controllers\HistoriqueController.cs:17 lambda_method(Closure , ControllerBase , Object[] ) +62 System.Web.Mvc.ActionMethodDispatcher.Execute(ControllerBase controller, Object[] parameters) +17 System.Web.Mvc.ReflectedActionDescriptor.Execute(ControllerContext controllerContext, IDictionary`2 parameters) +208 System.Web.Mvc.ControllerActionInvoker.InvokeActionMethod(ControllerContext controllerContext, ActionDescriptor actionDescriptor, IDictionary`2 parameters) +27 System.Web.Mvc.<>c__DisplayClass15.<InvokeActionMethodWithFilters>b__12() +55 System.Web.Mvc.ControllerActionInvoker.InvokeActionMethodFilter(IActionFilter filter, ActionExecutingContext preContext, Func`1 continuation) +263 System.Web.Mvc.<>c__DisplayClass17.<InvokeActionMethodWithFilters>b__14() +19 System.Web.Mvc.ControllerActionInvoker.InvokeActionMethodWithFilters(ControllerContext controllerContext, IList`1 filters, ActionDescriptor actionDescriptor, IDictionary`2 parameters) +191 System.Web.Mvc.ControllerActionInvoker.InvokeAction(ControllerContext controllerContext, String actionName) +343 System.Web.Mvc.Controller.ExecuteCore() +116 System.Web.Mvc.ControllerBase.Execute(RequestContext requestContext) +97 System.Web.Mvc.ControllerBase.System.Web.Mvc.IController.Execute(RequestContext requestContext) +10 System.Web.Mvc.<>c__DisplayClassb.<BeginProcessRequest>b__5() +37 System.Web.Mvc.Async.<>c__DisplayClass1.<MakeVoidDelegate>b__0() +21 System.Web.Mvc.Async.<>c__DisplayClass8`1.<BeginSynchronous>b__7(IAsyncResult _) +12 System.Web.Mvc.Async.WrappedAsyncResult`1.End() +62 System.Web.Mvc.<>c__DisplayClasse.<EndProcessRequest>b__d() +50 System.Web.Mvc.SecurityUtil.<GetCallInAppTrustThunk>b__0(Action f) +7 System.Web.Mvc.SecurityUtil.ProcessInApplicationTrust(Action action) +22 System.Web.Mvc.MvcHandler.EndProcessRequest(IAsyncResult asyncResult) +60 System.Web.Mvc.MvcHandler.System.Web.IHttpAsyncHandler.EndProcessRequest(IAsyncResult result) +9 System.Web.CallHandlerExecutionStep.System.Web.HttpApplication.IExecutionStep.Execute() +8841105 System.Web.HttpApplication.ExecuteStep(IExecutionStep step, Boolean& completedSynchronously) +184 Any suggestions? 
How can I correct this error? Data entity class (excerpt from a comment): public class MyClass { public virtual string DeviceID { get; set; } public virtual int Timestamp { get; set; } public virtual string Account { get; set; } public virtual int StatusCode { get; set; } public virtual double Latitude { get; set; } public virtual double Longitude { get; set; } public virtual int GpsAge { get; set; } public virtual double SpeedKPH { get; set; } public virtual double Heading { get; set; } public override bool Equals(object obj) { return true; } public override int GetHashCode() { return 0; } }

    Read the article

  • Can a PL/pgSQL function contain a dynamic subquery?

    - by morpheous
    I am writing a PL/pgSQL function. The function has input parameters which specify (indirectly) which tables to read filtering information from. The function embeds business logic which allows it to select data from different tables based on the input arguments. The function dynamically builds a subquery which returns filtering data which is then used to run the main query. My questions are: Is it 'legal' to use a dynamic subquery in a PL/pgSQL function? I can't see why not - but this question is related to the next one. AFAIK, PL/pgSQL functions are cached or precompiled by the query engine. How does having a function that generates dynamic subqueries impact the work of the query engine?
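
    For reference, a minimal sketch of the pattern being described (hypothetical table and column names; format() needs PostgreSQL 9.1 or later). Statements run through EXECUTE are planned when they are executed rather than reusing the function's cached plan, which is the main cost of going dynamic:

        -- PL/pgSQL function that builds its filtering subquery at run time
        CREATE OR REPLACE FUNCTION filtered_orders(filter_table text)
        RETURNS SETOF orders AS $$
        BEGIN
            RETURN QUERY EXECUTE format(
                'SELECT o.* FROM orders o
                  WHERE o.customer_id IN (SELECT customer_id FROM %I)',
                filter_table);
        END;
        $$ LANGUAGE plpgsql;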

    Read the article

  • Count Distinct With IF in MySQL?

    - by user1600801
    I need to do a query with count distinct and IF, but the results are always 0. What I need to do is count the distinct users from a table in different months, using IF. My individual query, for one month, is this: SELECT COUNT(DISTINCT(idUsers)) AS num_usuarios FROM table01 WHERE date1='201207' But I need to get the results by different months in the same query. What I'm trying to do is this: SELECT IF(date1=(201207), count(distinct(idUsers)), 0) as user30, IF(fecha1=(201206), count(distinct(idUsers)), 0) as user60, IF(fecha1=(201205), count(distinct(idUsers)), 0) as user90, IF(fecha1=(201204), count(distinct(idUsers)), 0) as user120, IF(fecha1=(201203), count(distinct(idUsers)), 0) as user150 FROM table01 But all the results are always 0.
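
    One possible rewrite (a sketch using the question's table and column names, with date1 throughout where the original mixes date1 and fecha1) is to move each month's condition inside the aggregate, so rows from other months become NULL and are ignored by COUNT(DISTINCT ...):

        -- distinct users per month, in a single pass over the table
        SELECT
            COUNT(DISTINCT CASE WHEN date1 = '201207' THEN idUsers END) AS user30,
            COUNT(DISTINCT CASE WHEN date1 = '201206' THEN idUsers END) AS user60,
            COUNT(DISTINCT CASE WHEN date1 = '201205' THEN idUsers END) AS user90,
            COUNT(DISTINCT CASE WHEN date1 = '201204' THEN idUsers END) AS user120,
            COUNT(DISTINCT CASE WHEN date1 = '201203' THEN idUsers END) AS user150
        FROM table01;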

    Read the article

  • Entity framework and many to many queries unusable?

    - by John Landheer
    I'm trying EF out and I do a lot of filtering based on many to many relationships. For instance I have persons, locations and a personlocation table to link the two. I also have a role and personrole table. EDIT: Tables: Person (personid, name) Personlocation (personid, locationid) Location (locationid, description) Personrole (personid, roleid) Role (roleid, description) EF will give me persons, roles and location entities. EDIT: Since EF will NOT generate the personlocation and personrole entity types, they cannot be used in the query. How do I create a query to give me all the persons of a given location with a given role? In SQL the query would be select p.* from persons as p join personlocations as pl on p.personid=pl.personid join locations as l on pl.locationid=l.locationid join personroles as pr on p.personid=pr.personid join roles as r on pr.roleid=r.roleid where r.description='Student' and l.description='Amsterdam' I've looked, but I can't seem to find a simple solution.

    Read the article

  • How to execute an update via SQLQuery in Hibernate

    - by Udo Fholl
    Hi, I need to update a joined sub-class. Since Hibernate doesn't allow updating joined sub-classes in HQL or a named query, I want to do it via SQL. I also can't use a SQL named query because updates via named queries are not supported in Hibernate. So I decided to use a SQLQuery. But Hibernate complains about not calling addScalar(). Do updates return the number of rows affected, and what is that column named? Are there any other ways to do an update on a joined sub-class in Hibernate? Thanks in advance!

    Read the article

  • strange SqlAlchemy update behaviour

    - by Max
    I'm new to SqlAlchemy and Elixir, so I've started from the tutorial and tried to create a table, insert a record, and then update it as follows: #'elixir_test.py' from elixir import * metadata.bind = "postgresql://myuser:mypwd@localhost:5432/dbname" metadata.bind.echo = True class Movie(Entity): title = Field(Unicode(30)) year = Field(Integer) description = Field(UnicodeText) def __repr__(self): return '<Movie "%s" (%d)>' % (self.title, self.year) and in another file in the same directory: from elixir_test import * setup_all() #create table create_all() Movie(title=u"Blade Runner", year=1982) #add record session.commit() #get records Movie.query.all() #trying to update record and commit changes, BUT... movie = Movie.query.first() movie.year = 1983 session.commit() #now we have two records in our table, one #with year=1982 and one with year=1983 Movie.query.all() What did I miss?

    Read the article

  • Eager loading vs. many queries with PHP, SQLite

    - by Mike
    I have an application that has an n+1 query problem, but when I implemented a way to load the data eagerly, I found absolutely no performance gain. I do use an identity map, so objects are only created once. Here's a benchmark of ~3000 objects. first query + first object creation: 0.00636100769043 sec. memory usage: 190008 bytes iterate through all objects (queries + objects creation): 1.98003697395 sec. memory usage: 7717116 bytes And here's one when I use eager loading. query: 0.0881109237671 sec. memory usage: 6948004 bytes object creation: 1.91053009033 sec. memory usage: 12650368 bytes iterate through all objects: 1.96605396271 sec. memory usage: 12686836 bytes So my questions are Is SQLite just magically lightning fast when it comes to small queries? (I'm used to working with MySQL.) Does this just seem wrong to anyone? Shouldn't eager loading have given much better performance?
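
    For context, the two query shapes being compared look roughly like this (hypothetical parent/child tables; a sketch, not the application's actual schema). SQLite's per-query overhead is tiny because there is no network round trip, which is one reason the n+1 version is not as slow as it would be against a remote MySQL server:

        -- n+1 loading: one query per parent object, repeated ~3000 times
        SELECT * FROM children WHERE parent_id = ?;

        -- eager loading: everything in one statement
        SELECT p.*, c.*
        FROM parents AS p
        LEFT JOIN children AS c ON c.parent_id = p.id;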

    Read the article

  • SQL: Need to SUM on results that meet a HAVING statement

    - by Wasauce
    I have a table where we record per-user values like money_spent, money_spent_on_candy and the date. So the columns in this table (let's call it MoneyTable) would be: UserId Money_Spent Money_Spent_On_Candy Date My goal is to SUM the total amount of money_spent -- but only for those users who have spent more than 10% of their total money for the date range on candy. What would that query be? I know how to select the users that have this -- and then I can output the data and sum that by hand, but I would like to do this in one single query. Here is the query to pull the sum of spend per user for only the users that have spent more than 10% of their money on candy: SELECT UserId, SUM(Money_Spent), SUM(Money_Spent_On_Candy) / SUM(Money_Spent) AS PercentCandySpend FROM MoneyTable WHERE DATE >= '2010-01-01' HAVING PercentCandySpend > 0.1;
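
    One way to fold this into a single statement (a sketch using the question's column names) is to group per user first -- note the posted query is missing its GROUP BY UserId -- keep only the users whose candy share exceeds 10% via HAVING, and then SUM the qualifying users' spend in an outer query:

        -- total spend across only the users whose candy share is above 10%
        SELECT SUM(user_spend) AS total_spend
        FROM (
            SELECT UserId,
                   SUM(Money_Spent) AS user_spend
            FROM MoneyTable
            WHERE Date >= '2010-01-01'
            GROUP BY UserId
            HAVING SUM(Money_Spent_On_Candy) / SUM(Money_Spent) > 0.1
        ) AS candy_users;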

    Read the article

  • Using sub filters/queries in Google App Engine

    - by fredrik
    Hi, I'm trying to figure out how to sub-query a query that uses a filter. From what I've figured out so far, using .filter() changes the original query, which means a second .filter() would also have to match the first filter. I would like to make something like this: modules = data.Modules.all().filter('page = ', page.key()) modules.filter('name = ', 'Test') modules.filter('name = ', 'Test2') I can't get the "Test2" filter to work. The only solution I have at the moment is to make all new queries. data.Modules.all().filter('page = ', page.key()).filter('name = ', "Test").get() data.Modules.all().filter('page = ', page.key()).filter('name = ', "Test2").get() Or write the same as GQL. But to me that seems like a clumsy way to go. I've looked at using ancestors, but I don't quite understand it and honestly don't know if that's the way to go. Any ideas? ..fredrik

    Read the article

  • Rails eager loading

    - by Dimitar Vouldjeff
    Hi, I have a Test model, which has_many questions, and Question, which has_many answers... When I make a query for a Test with :include => [:questions, {:questions => :answers}] ActiveRecord makes two more queries to fetch the questions and then to fetch the answers - it doesn't join them! When I do the query with :joins ActiveRecord makes the query, but later when I need the Test.questions or Test.questions.answers ActiveRecord again makes those two extra queries! And later when I enumerate the questions or answers in the log I see other queries for each object, but they have a Cache tag... Is this normal?

    Read the article

  • Common usecases and techniques when integrating a 3rd party application with Oracle Sales Cloud

    - by asantaga
    Over the last year or so I've see a lot of partners migrating and integrate their applications with Oracle Sales Cloud. Interestingly I'd say 60% of the partners use the same set of design patterns over and over again. Most of the time I see that they want to embed their application into Oracle Sales Cloud, within a tab usually, perhaps click on a link to their application (passing some piece of data + credentials) and then within their application update sales cloud again using webservices. Here are some examples of the different use-cases I've seen , and how partners are embedding their applications into Sales Cloud, NB : The following examples use the "Desktop" User Interface rather than the Newer "Simplified User Interface", I'll update the sample application soon but the integration patterns are precisely the same Use Case 1 :  Navigator "Link out" to third party application This is an example of where the developer has added a link to the global navigator and this links out to the 3rd Party Application. Typically one doesn't pass any contextual data with the exception of perhaps user credentials, or better still JWT Token. Techniques Used   Adding Link to Menu Item Using JWT Token in Sales Cloud Use Case 2 : Application Embedded within the Sales Cloud Dashboard Within the Oracle Sales Cloud application there is a tab called "Sales", within this tab its possible to embed a SubTab and embed a iFrame pointing to your application. To do this the developer simply needs to edit the page in customization mode, add the tab and then add the iFrame, simples! The developer can pass credentials/JWT Token and some other pieces of data but not object data (ie the current OpportunityID etc)  Techniques Used Adding a page to the dashboard  Using JWT Token in Sales Cloud  Use Case 3 : Embedding a Tab and Context Linking out from a Sales Cloud object to the 3rd party application In this usecase the developer embeds two components into Oracle Sales Cloud. The first is a SubTab showing summary data to the user (a quote in our case) and then secondly a hyperlink, (although it could be a button) which when clicked navigates the user to the 3rd party application. In this case the developer almost always passes context specific data (i.e. the opportunityId) and a security token (username password combo or JWT Token). The third party application usually takes the data, perhaps queries more data using the Sales Cloud SOAP/WebService interface and then displays the resulting mashup to the user for further processing. When the user has finished their work in the 3rd party application they normally navigate back to Oracle Sales Cloud using what's called a "DeepLink", ie taking them back to the object [opportunity in our case] they came from. This image visually shows a "Happy Path" a user may follow, and combines linking out to an application , webservice calls and deep linking back to Sales Cloud. Techniques Used Extending a SalesCloud application with a custom button Using JWT Token in Sales Cloud Extending Oracle Sales Cloud [Opportnity] with a custom tab exposing External Content Retrieving Data from Oracle Sales cloud using WebServices Coding some groovy script to generate the URLs required (Doc 1571200.1 on MyOracle Support) DeepLinking to specific Oracle Sales Cloud Pages (Doc 1516151.1 on My Oracle Support) Use-Case 4 :  Server Side processing/synchronization This usecase focuses on the Server Side processing of data, in this case synchronizing data. 
Here the 3rd party application is running on a "timer", e.g. cron or similar, and when triggered it queries data from Oracle Sales Cloud, then it queries data from the 3rd party application, determines the deltas and then inserts the data where required. Specifically here we are calling Oracle Sales Cloud using SOAP/WebServices and the 3rd party application is being communicated to using the REST API, for Oracle Sales Cloud one would use standard JAX-WS WebService calls and for REST one would use the JAX-RS api and perhap the Jackson api for managing JSON objects.. This is a very common use case and one which specifically lends itself to using the Oracle Java Cloud Service as the ideal application server where to host the mediator between the two applications.  Techniques Used Using JWT Token in Sales Cloud Integrating with the Oracle Java Cloud Service Retrieving Data from Oracle Sales cloud using WebServices General Resources The above is just a small set of techniques and use-cases which are used today. There are plenty of other sources of documentation and resources available on the internet but to get you started here are a few of my favourite places  Sales Cloud General Documentation Sales Cloud Customize Tab is useful for general customization of Sales Cloud Sales Cloud Integration Tab focuses on the 3rd party integration techniques  Official Oracle Fusion Developer Relations Blog Official Oracle Fusion Developer Relations YouTube Channel Enjoy integrating! 

    Read the article

  • Imaging: Paper Paper Everywhere, but None Should be in Sight

    - by Kellsey Ruppel
    Author: Vikrant Korde, Technical Architect, Aurionpro's Oracle Implementation Services team My wedding photos are stored in several empty shoeboxes. Yes...I got married before digital photography was mainstream...which means I'm old. But my parents are really old. They have shoeboxes filled with vacation photos on slides (I doubt many of you have even seen a home slide projector...and I hope you never do!). Neither me nor my parents should have shoeboxes filled with any form of photographs whatsoever. They should obviously live in the digital world...with no physical versions in sight (other than a few framed on our walls). Businesses grapple with similar challenges. But instead of shoeboxes, they have file cabinets and warehouses jam packed with paper invoices, legal documents, human resource files, material safety data sheets, incident reports, and the list goes on and on. In fact, regulatory and compliance rules govern many industries, requiring that this paperwork is available for any number of years. It's a real challenge...especially trying to find archived documents quickly and many times with no backup. Which brings us to a set of technologies called Image Process Management (or simply Imaging or Image Processing) that are transforming these antiquated, paper-based processes. Oracle's WebCenter Content Imaging solution is a combination of their WebCenter suite, which offers a robust set of content and document management features, and their Business Process Management (BPM) suite, which helps to automate business processes through the definition of workflows and business rules. Overall, the solution provides an enterprise-class platform for end-to-end management of document images within transactional business processes. It's a solution that provides all of the capabilities needed - from document capture and recognition, to imaging and workflow - to effectively transform your ‘shoeboxes’ of files into digitally managed assets that comply with strict industry regulations. The terminology can be quite overwhelming if you're new to the space, so we've provided a summary of the primary components of the solution below, along with a short description of the two paths that can be executed to load images of scanned documents into Oracle's WebCenter suite. WebCenter Imaging (WCI): the electronic document repository that provides security, annotations, and search capabilities, and is the primary user interface for managing work items in the imaging solution SOA & BPM Suites (workflow): provide business process management capabilities, including human tasks, workflow management, service integration, and all other standard SOA features. 
It's interesting to note that there a number of 'jumpstart' processes available to help accelerate the integration of business applications, such as the accounts payable invoice processing solution for E-Business Suite that facilitates the processing of large volumes of invoices WebCenter Enterprise Capture (WEC): expedites the capture process of paper documents to digital images, offering high volume scanning and importing from email, and allows for flexible indexing options WebCenter Forms Recognition (WFR): automatically recognizes, categorizes, and extracts information from paper documents with greatly reduced human intervention WebCenter Content: the backend content server that provides versioning, security, and content storage There are two paths that can be executed to send data from WebCenter Capture to WebCenter Imaging, both of which are described below: 1. Direct Flow - This is the simplest and quickest way to push an image scanned from WebCenter Enterprise Capture (WEC) to WebCenter Imaging (WCI), using the bare minimum metadata. The WEC activities are defined below: The paper document is scanned (or imported from email). The scanned image is indexed using a predefined indexing profile. The image is committed directly into the process flow 2. WFR (WebCenter Forms Recognition) Flow - This is the more complex process, during which data is extracted from the image using a series of operations including Optical Character Recognition (OCR), Classification, Extraction, and Export. This process creates three files (Tiff, XML, and TXT), which are fed to the WCI Input Agent (the high speed import/filing module). The WCI Input Agent directory is a standard ingestion method for adding content to WebCenter Imaging, the process for doing so is described below: WEC commits the batch using the respective commit profile. A TIFF file is created, passing data through the file name by including values separated by "_" (underscores). WFR completes OCR, classification, extraction, export, and pulls the data from the image. In addition to the TIFF file, which contains the document image, an XML file containing the extracted data, and a TXT file containing the metadata that will be filled in WCI, are also created. All three files are exported to WCI's Input agent directory. Based on previously defined "input masks", the WCI Input Agent will pick up the seeding file (often the TXT file). Finally, the TIFF file is pushed in UCM and a unique web-viewable URL is created. Based on the mapping data read from the TXT file, a new record is created in the WCI application.  Although these processes may seem complex, each Oracle component works seamlessly together to achieve a high performing and scalable platform. The solution has been field tested at some of the largest enterprises in the world and has transformed millions and millions of paper-based documents to more easily manageable digital assets. For more information on how an Imaging solution can help your business, please contact [email protected] (for U.S. West inquiries) or [email protected] (for U.S. East inquiries). About the Author: Vikrant is a Technical Architect in Aurionpro's Oracle Implementation Services team, where he delivers WebCenter-based Content and Imaging solutions to Fortune 1000 clients. With more than twelve years of experience designing, developing, and implementing Java-based software solutions, Vikrant was one of the founding members of Aurionpro's WebCenter-based offshore delivery team. He can be reached at [email protected].

    Read the article

  • Trying to install TeamViewer on Ubuntu 12.04

    - by Teknikk
    I recently got Ubuntu installed on my server, I wanted to install TeamViewer so i could easy manage the virtual machines, However, I get errors when installing it from App store?, And I also get errors, but more detailed on the terminal. Error output: tek@tek-G53SW:~/Download$ sudo dpkg -i ipts teamviewer_linux_x64.deb dpkg: error processing ipts (--install): cannot access archive: No such file or directory (Reading database ... 142115 files and directories currently installed.) Preparing to replace teamviewer7 7.0.9360 (using teamviewer_linux_x64.deb) ... Unpacking replacement teamviewer7 ... dpkg: dependency problems prevent configuration of teamviewer7: teamviewer7 depends on libc6-i386 (>= 2.7); however: Package libc6-i386 is not installed. teamviewer7 depends on lib32asound2; however: Package lib32asound2 is not installed. teamviewer7 depends on lib32z1; however: Package lib32z1 is not installed. teamviewer7 depends on ia32-libs; however: Package ia32-libs is not installed. dpkg: error processing teamviewer7 (--install): dependency problems - leaving unconfigured Errors were encountered while processing: ipts teamviewer7 I tried to install it manually, but with no luck, I heard some others has this problems. I am running Ubuntu 12.04 x64. Error @ sudo apt-get install libc6-i386 lib32asound2 lib32z1 ia32-libs : tek@tek-G53SW:~/Download$ sudo apt-get install libc6-i386 lib32asound2 lib32z1 ia32-libs Reading package lists... Done Building dependency tree Reading state information... Done You might want to run 'apt-get -f install' to correct these: The following packages have unmet dependencies: ia32-libs : Depends: ia32-libs-multiarch E: Unmet dependencies. Try 'apt-get -f install' with no packages (or specify a solution). tek@tek-G53SW:~/Download$ More errors tek@tek-G53SW:~/Download$ sudo apt-get -f install [sudo] password for tek: Reading package lists... Done Building dependency tree Reading state information... Done Correcting dependencies... Done The following packages will be REMOVED: teamviewer7 0 upgraded, 0 newly installed, 1 to remove and 0 not upgraded. 1 not fully installed or removed. After this operation, 81.9 MB disk space will be freed. Do you want to continue [Y/n]? y (Reading database ... 142441 files and directories currently installed.) Removing teamviewer7 ... tek@tek-G53SW:~/Download$ sudo apt-get install libc6-i386 lib32asound2 lib32z1 ia32-libs Reading package lists... Done Building dependency tree Reading state information... Done lib32z1 is already the newest version. libc6-i386 is already the newest version. lib32asound2 is already the newest version. Some packages could not be installed. This may mean that you have requested an impossible situation or if you are using the unstable distribution that some required packages have not yet been created or been moved out of Incoming. The following information may help to resolve the situation: The following packages have unmet dependencies: ia32-libs : Depends: ia32-libs-multiarch E: Unable to correct problems, you have held broken packages. tek@tek-G53SW:~/Download$ sudo apt-get install ia32-libs-multiarch Reading package lists... Done Building dependency tree Reading state information... Done Some packages could not be installed. This may mean that you have requested an impossible situation or if you are using the unstable distribution that some required packages have not yet been created or been moved out of Incoming. 
The following information may help to resolve the situation: The following packages have unmet dependencies: ia32-libs-multiarch:i386 : Depends: gstreamer0.10-plugins-good:i386 but it is not going to be installed Depends: gtk2-engines:i386 but it is not going to be installed Depends: gtk2-engines-murrine:i386 but it is not going to be installed Depends: gtk2-engines-pixbuf:i386 but it is not going to be installed Depends: gtk2-engines-oxygen:i386 but it is not going to be installed Depends: ibus-gtk:i386 but it is not going to be installed Depends: libcanberra-gtk-module:i386 but it is not going to be installed Depends: libcups2:i386 but it is not going to be installed Depends: libcupsimage2:i386 but it is not going to be installed Depends: libfontconfig1:i386 but it is not going to be installed Depends: libgail-common:i386 but it is not going to be installed Depends: libgphoto2-2:i386 but it is not going to be installed Depends: libgtk2.0-0:i386 but it is not going to be installed Depends: libnss3:i386 but it is not going to be installed Depends: libqt4-opengl:i386 but it is not going to be installed Depends: libqt4-qt3support:i386 but it is not going to be installed Depends: libqt4-scripttools:i386 but it is not going to be installed Depends: libqt4-svg:i386 but it is not going to be installed Depends: libqtgui4:i386 but it is not going to be installed Depends: libqtwebkit4:i386 but it is not going to be installed Depends: librsvg2-common:i386 but it is not going to be installed Depends: libsane:i386 but it is not going to be installed E: Unable to correct problems, you have held broken packages. tek@tek-G53SW:~/Download$

    Read the article

  • Issue Calculating from Rows and Columns(Summing two columns with the third of a different row)

    - by vstsdev
    With reference to my previous question, Adding columns resulting from GROUP BY clause: SELECT AcctId,Date, Sum(CASE WHEN DC = 'C' THEN TrnAmt ELSE 0 END) AS C, Sum(CASE WHEN DC = 'D' THEN TrnAmt ELSE 0 END) AS D FROM Table1 where AcctId = '51' GROUP BY AcctId,Date ORDER BY AcctId,Date I executed the above query and got my desired result: AcctId Date C D 51 2012-12-04 15000 0 51 2012-12-05 150000 160596 51 2012-12-06 600 0 Now I have another operation to do on the same query, i.e. I need the result to be like this: AcctId Date Result 51 2012-12-04 (15000-0)-> 15000 51 2012-12-05 (150000-160596) + (15000->The first value) 4404 51 2012-12-06 600-0 +(4404 ->The calculated 2nd value) 5004 Is it possible with the same query?
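
    If the database supports window functions (SQL Server 2012+, PostgreSQL, Oracle, MySQL 8.0+), one sketch is to wrap the existing query and take a running SUM of C - D ordered by date; for the sample rows this reproduces 15000, 4404 and 5004:

        -- running total of credits minus debits per account
        SELECT AcctId, Date,
               SUM(C - D) OVER (PARTITION BY AcctId ORDER BY Date) AS Result
        FROM (
            SELECT AcctId, Date,
                   SUM(CASE WHEN DC = 'C' THEN TrnAmt ELSE 0 END) AS C,
                   SUM(CASE WHEN DC = 'D' THEN TrnAmt ELSE 0 END) AS D
            FROM Table1
            WHERE AcctId = '51'
            GROUP BY AcctId, Date
        ) AS daily
        ORDER BY AcctId, Date;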

    Read the article

  • Graph API - Get events by owner/creator

    - by jwynveen
    Is there a way with the Facebook Graph API to get a list of all events created by a single profile? Our client creates a bunch of events and we want to pull a list of them all. I said that they would just have to make sure they set themselves to be attending the event, because then I can easily pull the list of events that profileId is attending, but I'm curious if there's another way. Maybe an FQL query? They look to require a query on the primary key though. And what would that FQL query look like if that's the way to do it??

    Read the article

  • NSStream sockets missing data

    - by Chris T.
    I am trying to pull some sample data from FreeDB as a proof of concept, but I am having a tough time retrieving all of the data off the incoming stream (I am only getting the last bits for the final query listed here (if handshakeCode = 3) I think this may be something with the threading on the main runloop, but I am not sure. Odd thing is when the buffer size is larger than 1-2 bytes (which works as expected), I seem to be losing access to the data programmatically (the totalOutput variable on the first set of data is incomplete). I set up a packet capture, and it looks like those 1024 bytes are coming across the wire, but the app just isn't working with it. It looks like the next event is coming through and basically taking over. I tried using an NSLock to no avail as well. If I drop the buffer size down to 1 or 2, things seem to be reading just fine. This is probably obvious to someone who does this all the time, but this is my first foray into this with something I am familiar with, technology wise in other languages / platforms. The following code will show you what is happening. Run with the buffer set to 1024, and you will see a short final string, but once you set it to 1, you will see the amount of data I was expecting (I was even expecting it to be split, so that's not a big worry) #import <Foundation/Foundation.h> #import <Cocoa/Cocoa.h> //STACK OVERFLOW CODE: @interface stackoverflow : NSObject <NSStreamDelegate> { NSInputStream *iStream; NSOutputStream *oStream; int handshakeCode; NSString *selectedDiscId; NSString *selectedGenre; } -(void)getMatchesFromFreeDB; -(void)sendToOutputStream:(NSString*)command; @end @implementation stackoverflow -(void)getMatchesFromFreeDB { NSHost *host = [NSHost hostWithName:@"freedb.freedb.org"]; [NSStream getStreamsToHost:host port:8880 inputStream:&iStream outputStream:&oStream]; [iStream retain]; [oStream retain]; [iStream setDelegate:self]; [oStream setDelegate:self]; [iStream scheduleInRunLoop:[NSRunLoop currentRunLoop] forMode:NSDefaultRunLoopMode]; [oStream scheduleInRunLoop:[NSRunLoop currentRunLoop] forMode:NSDefaultRunLoopMode]; [iStream open]; [oStream open]; handshakeCode = 0; //not done any processing } -(void)stream:(NSStream *)aStream handleEvent:(NSStreamEvent)eventCode { switch(eventCode) { case NSStreamEventOpenCompleted: { NSLog(@"Stream open completed"); break; } case NSStreamEventHasBytesAvailable: { NSLog(@"Stream has bytes available"); if (aStream == iStream) { NSMutableString *totalOutput = [NSMutableString stringWithString:@""]; //read data uint8_t buffer[1024]; int len; while ([iStream hasBytesAvailable]) { len = [iStream read:buffer maxLength:sizeof(buffer)]; if (len 0) { NSString *output = [[NSString alloc] initWithBytes:buffer length:len encoding:NSUTF8StringEncoding]; //this could have also been put into an NSData object if (nil != output) { //append to the total output [totalOutput appendString:output]; } } } NSLog(@"OUTPUT , %i:\n\n%@", [totalOutput lengthOfBytesUsingEncoding:NSUTF8StringEncoding], totalOutput); NSArray *outputComponents = [totalOutput componentsSeparatedByString:@" "]; //Attempt to get handshake code, since we haven't done it yet: if (handshakeCode == 1) { //we are just getting the sign-on banner: //let's move on: handshakeCode = 2; } else if (handshakeCode == 2) { handshakeCode = [[outputComponents objectAtIndex:0] intValue]; if (handshakeCode == 200) { NSLog(@"---Handshake OK %i", handshakeCode); NSMutableString *query = [NSMutableString stringWithString:@"cddb query f3114b11 17 225 19915 
36489 54850 69425 87025 103948 123242 136075 152817 178335 192850 211677 235104 262090 284882 308658 4430\n"]; handshakeCode = 3; [self sendToOutputStream:query]; } } else if (handshakeCode == 3) { //now, we are reading out the matches: if ([[outputComponents objectAtIndex:0] intValue] == 200) //found exact match: { NSLog(@"Found exact match"); selectedGenre = [outputComponents objectAtIndex:1] ; selectedDiscId = [outputComponents objectAtIndex:2]; if (selectedGenre && selectedDiscId) { //send off the request to get the entry: NSString *query = [NSString stringWithFormat:@"cddb read %@ %@\n", selectedGenre, selectedDiscId]; [self sendToOutputStream:query]; handshakeCode = 4; } } } } break; } case NSStreamEventEndEncountered: { NSLog(@"Stream event end encountered"); break; } case NSStreamEventErrorOccurred: { NSLog(@"Stream error occurred"); break; } case NSStreamEventHasSpaceAvailable: { NSLog(@"Stream has space available"); if (aStream == oStream) { if (handshakeCode == 0) { handshakeCode = 1; [self sendToOutputStream:@"cddb hello stackoverflow localhost.localdomain test .01BETA\n"]; } } break; } } } -(void)sendToOutputStream:(NSString*)command { const uint8_t *rawCommand = (const uint8_t *)[command UTF8String]; [oStream write:rawCommand maxLength:strlen(rawCommand)]; NSLog(@"Sent command: %@",command); } @end int main (int argc, const char * argv[]) { NSAutoreleasePool * pool = [[NSAutoreleasePool alloc] init]; stackoverflow *test = [[stackoverflow alloc] init]; [test getMatchesFromFreeDB]; NSRunLoop *runLoop = [NSRunLoop currentRunLoop]; [runLoop run]; [pool drain]; return 0; } Any help is much appreciated! Thanks

    Read the article

  • Disabling scoring in Lucene(.NET)

    - by user72185
    Hi, When searching, is there a way to disable scoring for any query? The scenario is that the user refines his query by trying different combinations of words, phrases etc., and needs realtime (well, reasonably fast at least) responses on the number of hits. Search time slows down a lot when there are millions of hits due to scoring, but the user really doesn't care about all these documents. As soon as he sees there are 1M+ hits he will start adding additional words to the query. A "Sort by relevance" option would allow him to do this quickly, while turning scoring back on when the number of hits is reasonable. Is this possible? I'm using Lucene.NET 2.9.2 but AFAIK it is identical to the Java version.

    Read the article

  • How to avoid SQLiteException locking errors

    - by TheArchedOne
    I'm developing an Android app. It has multiple threads reading from and writing to the Android SQLite db. I am receiving the following error: SQLiteException: error code 5: database is locked. I understand that SQLite locks the entire db on inserting/updating, but these errors only seem to happen when inserting/updating while I'm running a select query. The select query returns a cursor which is being left open quite a while (a few seconds sometimes) while I iterate over it. If the select query is not running, I never get the locks. I'm surprised that the select could be locking the db... is this possible, or is something else going on? What's the best way to avoid such locks? Thanks TAO
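
    One thing worth trying at the SQL level (a sketch only; WAL needs SQLite 3.7.0+, i.e. Android 3.0 and later, where SQLiteDatabase.enableWriteAheadLogging() does the same job) is write-ahead logging, which lets a reader keep its cursor open while another connection commits, combined with keeping each write inside a short transaction:

        -- let readers and one writer proceed concurrently
        PRAGMA journal_mode = WAL;

        -- group writes so the write lock is held only briefly (hypothetical table)
        BEGIN IMMEDIATE;
        UPDATE items SET status = 'done' WHERE id = 42;
        COMMIT;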

    Read the article

  • Fetch data from multiple MySQL tables

    - by Jon McIntosh
    My two tables look like this: TABLE1 TABLE2 +--------------------+ +--------------------+ |field1|field2|field3| and |field2|field4|field5| +--------------------+ +--------------------+ I am already running a SELECT query for TABLE1, and assorting all of the data into variables: $query = "SELECT * FROM TABLE1 WHERE field2 = 2"; $result = mysql_query($query); $num_rows = mysql_num_rows($result); if((!is_bool($result) || $result) && $num_rows) { while($row = mysql_fetch_array($result)) { $field1 = $row['field1']; $field2 = $row['field2']; $field3 = $row['field3']; } } What I want to do is get the data from 'field4' on TABLE2 and add it to my variables. I would want to get field4 WHERE field2 = 2
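
    A hedged sketch of a single query that returns field4 alongside the TABLE1 columns (it assumes field2 is the shared key, as the table layouts suggest); swap JOIN for LEFT JOIN if TABLE2 might have no matching row:

        -- pull field4 from TABLE2 together with the TABLE1 columns in one round trip
        SELECT t1.field1, t1.field2, t1.field3, t2.field4
        FROM TABLE1 AS t1
        JOIN TABLE2 AS t2 ON t2.field2 = t1.field2
        WHERE t1.field2 = 2;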

    Read the article

  • Problem with order by in LINQ

    - by vikitor
    Hi, I'm passing from the controller an array generated by the following code: public ActionResult GetClasses(bool ajax, string kingdom) { int _kingdom = _taxon.getKingdom(kingdom); var query = (from c in vwAnimalsTaxon.All() orderby c.ClaName select new { taxRecID = c.ClaRecID, taxName = c.ClaName }).Distinct(); return Json(query, JsonRequestBehavior.AllowGet); } The query list should be ordered, but it doesn't work: I get the names of the classes in the wrong order in the array, because I've seen while debugging that the names are not ordered. The view is just a dropdown box loaded automatically, so I'm almost sure the problem is with the action. Do you see anything wrong? Am I missing something?

    Read the article

  • Detect if a table contains a column in Android/sqlite

    - by sandis
    So I have an app on the market, and with an update I want to add some columns to the database. No problems so far. But I want to detect if the database in use is missing these columns, and add them if this is the case. I need this to be done dynamically and not just after the update to the new version, because the application is supposed to still be able to import older databases. Normally I would be able to use the PRAGMA query, but I'm not sure how to do this with Android. I can't use execSQL since it is a query, and I can't figure out how to use PRAGMA with the query() function. Of course I could just catch exceptions and then add the column, or always add the columns to each table before I start to work with it, but that is not a neat solution. Cheers,
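
    On the SQL side this amounts to two statements (mytable and new_col are placeholders). PRAGMA table_info returns one row per existing column, so it can be issued through SQLiteDatabase.rawQuery() and the resulting Cursor checked for the column name before deciding to ALTER:

        -- one row per column: cid, name, type, notnull, dflt_value, pk
        PRAGMA table_info(mytable);

        -- run only when the column was absent from the result above
        ALTER TABLE mytable ADD COLUMN new_col TEXT;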

    Read the article

  • How to implement a left outer join in the Entity Framework.

    - by user206736
    I have the following SQL query:- select distinct * from dbo.Profiles profiles left join ProfileSettings pSet on pSet.ProfileKey = profiles.ProfileKey left join PlatformIdentities pId on pId.ProfileKey = profiles.Profilekey I need to convert it to a LinqToEntities expression. I have tried the following:- from profiles in _dbContext.ProfileSet let leftOuter = (from pSet in _dbContext.ProfileSettingSet select new { pSet.isInternal }).FirstOrDefault() select new { profiles.ProfileKey, Internal = leftOuter.isInternal, profiles.FirstName, profiles.LastName, profiles.EmailAddress, profiles.DateCreated, profiles.LastLoggedIn, }; The above query works fine because I haven't considered the third table "PlatformIdentities". Single left outer join works with what I have done above. How do I include PlatformIdentities (the 3rd table) ? I basically want to translate the SQL query I specified at the beginning of this post (which gives me exactly what I need) in to LinqToEntities. Thanks

    Read the article
