Search Results

Search found 8323 results on 333 pages for 'execute'.


  • XML Return from an Oracle Stored Procedure

    - by Tequila Jinx
    Unfortunately most of my DB experience has been with MSSQL, which tends to hold your hand a lot more than Oracle. What I'm trying to do is fairly trivial in T-SQL; however, PL/SQL is giving me a headache. I have the following procedure:

      CREATE OR REPLACE PROCEDURE USPX_GetUserbyID (USERID USERS.USERID%TYPE, USERRECORD OUT XMLTYPE) AS
      BEGIN
        SELECT XMLELEMENT("user"
                 , XMLATTRIBUTES(u.USERID AS "userid", u.companyid AS "companyid", u.usertype AS "usertype", u.status AS "status", u.personid AS "personid")
                 , XMLFOREST(p.FIRSTNAME AS "firstname", p.LASTNAME AS "lastname", p.EMAIL AS "email", p.PHONE AS "phone", p.PHONEEXTENSION AS "extension")
                 , XMLELEMENT("roles", (SELECT XMLAGG(XMLELEMENT("role", r.ROLETYPE))
                                        FROM USER_ROLES r
                                        WHERE r.USERID = USERID AND r.ISACTIVE = 1))
                 , XMLELEMENT("watches", (SELECT XMLAGG(XMLELEMENT("watch", XMLATTRIBUTES(w.WATCHID AS "id", w.TICKETID AS "ticket")))
                                          FROM USER_WATCHES w
                                          WHERE w.USERID = USERID AND w.ISACTIVE = 1))
               ) AS "RESULT"
        INTO USERRECORD
        FROM USERS u
        LEFT JOIN PEOPLE p ON p.PERSONID = u.PERSONID
        WHERE u.USERID = USERID;
      END USPX_GetUserbyID;

    When executed, it should return an XML document with the following structure:

      <user userid="" companyid="" usertype="" status="" personid="">
        <firstname />
        <lastname />
        <email />
        <phone />
        <extension />
        <roles>
          <role />
        </roles>
        <watches>
          <watch id="" ticket="" />
        </watches>
      </user>

    When I execute the query itself, replacing the USERID parameter with a string and removing the INTO clause, the query runs fine and returns the expected structure. However, when the procedure attempts to execute the query, passing the result of the XMLELEMENT function into the USERRECORD output parameter, I get the following exception:

      Error report:
      ORA-01422: exact fetch returns more than requested number of rows
      ORA-06512: at "USPX_GETUSERBYID", line 4
      ORA-06512: at line 3
      01422. 00000 - "exact fetch returns more than requested number of rows"
      *Cause:    The number specified in exact fetch is less than the rows returned.
      *Action:   Rewrite the query or change number of rows requested

    I'm baffled trying to nail this down, and unfortunately my google-fu hasn't helped. I've found plenty of Oracle SQL/XML examples, but none that deal with XML returns from a procedure. Note: I know that an alternate method of retrieving XML using DBMS methods exists; however, it's my understanding that that functionality is deprecated in favor of SQL/XML.

    Read the article

  • Removing rows in asp.net

    - by user279521
    I am attempting to remove rows on my asp.net page using the following code:

      try
      {
          Table t = (Table)Page.FindControl("Panel1").FindControl("tbl");
          foreach (TableRow tr in t.Rows)
          {
              t.Rows.Remove(tr);
          }
      }
      catch (Exception e)
      {
          lblErrorMessage.Text = "Error - RemoveDynControls - " + e.Message;
      }

    However, when the loop comes around for the second time, I get the error "Collection was modified; enumeration operation may not execute." Any ideas regarding what is causing the error message?
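
    That exception is the standard .NET guard against mutating a collection while a foreach enumerator is still walking it. A minimal sketch of two common workarounds, reusing the control names from the question:

      // Option 1: remove by index, walking backwards so removals don't shift the rows not yet visited.
      Table t = (Table)Page.FindControl("Panel1").FindControl("tbl");
      for (int i = t.Rows.Count - 1; i >= 0; i--)
      {
          t.Rows.RemoveAt(i);
      }

      // Option 2: if every row is to go anyway, Clear() avoids enumerating at all.
      t.Rows.Clear();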

    Read the article

  • Getting the following error while creating a Maven archetype

    - by munna
    D:\Training\workspace\vppsource>mvn archetype:generate -B -DarchetypeGroupId=org.appfuse.archetypes -DarchetypeArtifactId=appfuse-modular-struts-archetype -DarchetypeVersion=2.1.0-M1 -DgroupId=com.vmware -DartifactId=vpp

      [INFO] Scanning for projects...
      [INFO] Searching repository for plugin with prefix: 'archetype'.
      [INFO] ------------------------------------------------------------------------
      [INFO] Building Maven Default Project
      [INFO]    task-segment: [archetype:generate] (aggregator-style)
      [INFO] ------------------------------------------------------------------------
      [INFO] Preparing archetype:generate
      [INFO] No goals needed for project - skipping
      [INFO] [archetype:generate {execution: default-cli}]
      [INFO] Generating project in Batch mode
      [WARNING] Error reading archetype catalog http://repo1.maven.org/maven2
      org.apache.maven.wagon.TransferFailedException: Error transferring file: Connection timed out: connect
          at org.apache.maven.wagon.providers.http.LightweightHttpWagon.fillInputData(LightweightHttpWagon.java:143)
          at org.apache.maven.wagon.StreamWagon.getInputStream(StreamWagon.java:116)
          at org.apache.maven.wagon.StreamWagon.getIfNewer(StreamWagon.java:88)
          at org.apache.maven.wagon.StreamWagon.get(StreamWagon.java:61)
          at org.apache.maven.archetype.source.RemoteCatalogArchetypeDataSource.getArchetypeCatalog(RemoteCatalogArchetypeDataSource.java:97)
          at org.apache.maven.archetype.DefaultArchetypeManager.getRemoteCatalog(DefaultArchetypeManager.java:195)
          at org.apache.maven.archetype.DefaultArchetypeManager.getRemoteCatalog(DefaultArchetypeManager.java:184)
          at org.apache.maven.archetype.ui.DefaultArchetypeSelector.getArchetypesByCatalog(DefaultArchetypeSelector.java:278)
          at org.apache.maven.archetype.ui.DefaultArchetypeSelector.selectArchetype(DefaultArchetypeSelector.java:69)
          at org.apache.maven.archetype.mojos.CreateProjectFromArchetypeMojo.execute(CreateProjectFromArchetypeMojo.java:186)
          at org.apache.maven.plugin.DefaultPluginManager.executeMojo(DefaultPluginManager.java:490)
          at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeGoals(DefaultLifecycleExecutor.java:694)
          at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeStandaloneGoal(DefaultLifecycleExecutor.java:569)
          at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeGoal(DefaultLifecycleExecutor.java:539)
          at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeGoalAndHandleFailures(DefaultLifecycleExecutor.java:387)
          at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeTaskSegments(DefaultLifecycleExecutor.java:284)
          at org.apache.maven.lifecycle.DefaultLifecycleExecutor.execute(DefaultLifecycleExecutor.java:180)
          at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:328)
          at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:138)
          at org.apache.maven.cli.MavenCli.main(MavenCli.java:362)
          at org.apache.maven.cli.compat.CompatibleMain.main(CompatibleMain.java:60)
          at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
          at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
          at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
          at java.lang.reflect.Method.invoke(Method.java:597)
          at org.codehaus.classworlds.Launcher.launchEnhanced(Launcher.java:315)
          at org.codehaus.classworlds.Launcher.launch(Launcher.java:255)
          at org.codehaus.classworlds.Launcher.mainWithExitCode(Launcher.java:430)
          at org.codehaus.classworlds.Launcher.main(Launcher.java:375)
      Caused by: java.net.ConnectException: Connection timed out: connect
          at java.net.PlainSocketImpl.socketConnect(Native Method)
          at java.net.PlainSocketImpl.doConnect(PlainSocketImpl.java:333)
          at java.net.PlainSocketImpl.connectToAddress(PlainSocketImpl.java:195)
          at java.net.PlainSocketImpl.connect(PlainSocketImpl.java:182)
          at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:366)
          at java.net.Socket.connect(Socket.java:529)
          at java.net.Socket.connect(Socket.java:478)
          at sun.net.NetworkClient.doConnect(NetworkClient.java:163)
          at sun.net.www.http.HttpClient.openServer(HttpClient.java:394)
          at sun.net.www.http.HttpClient.openServer(HttpClient.java:529)
          at sun.net.www.http.HttpClient.<init>(HttpClient.java:233)
          at sun.net.www.http.HttpClient.New(HttpClient.java:306)
          at sun.net.www.http.HttpClient.New(HttpClient.java:323)
          at sun.net.www.protocol.http.HttpURLConnection.getNewHttpClient(HttpURLConnection.java:860)
          at sun.net.www.protocol.http.HttpURLConnection.plainConnect(HttpURLConnection.java:801)
          at sun.net.www.protocol.http.HttpURLConnection.connect(HttpURLConnection.java:726)
          at sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1049)
          at java.net.HttpURLConnection.getResponseCode(HttpURLConnection.java:373)
          at org.apache.maven.wagon.providers.http.LightweightHttpWagon.fillInputData(LightweightHttpWagon.java:115)
          ... 28 more
      [WARNING] No archetype found in Remote catalog. Defaulting to internal Catalog
      [INFO] ------------------------------------------------------------------------
      [ERROR] BUILD FAILURE
      [INFO] ------------------------------------------------------------------------
      [INFO]
      [INFO] ------------------------------------------------------------------------
      [INFO] For more information, run Maven with the -e switch
      [INFO] ------------------------------------------------------------------------
      [INFO] Total time: 46 seconds
      [INFO] Finished at: Wed Jun 09 16:11:07 IST 2010
      [INFO] Final Memory: 11M/28M
      [INFO] ------------------------------------------------------------------------

    Read the article

  • Saxon XSLT-Transformation: How to change serialization of an empty tag from <x/> to <x></x>?

    - by Ben
    Hello folks! I do an XSLT transformation using Saxon HE 9.2, with the output later being unmarshalled by Castor 1.3.1. The whole thing runs on Java (JDK 6). My XSLT transformation looks like this:

      <xsl:transform version="1.0"
          xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
          xmlns:ns="http://my/own/custom/namespace/for/the/target/document">
        <xsl:output method="xml" encoding="UTF-8" indent="no" />
        <xsl:template match="/">
          <ns:item>
            <ns:property name="id">
              <xsl:value-of select="/some/complicated/xpath" />
            </ns:property>
            <!-- ... more ... -->
        </xsl:template>

    So the thing is: if the XPath expression /some/complicated/xpath evaluates to an empty sequence, the Saxon serializer writes <ns:property/> instead of <ns:property></ns:property>. This, however, confuses the Castor unmarshaller, which is next in the pipeline and which unmarshals the output of the transformation into instances of XSD-generated Java code. So my question is: how can I tell the Saxon serializer not to output empty elements as standalone (self-closing) tags? Here is approximately what I am currently doing to execute the transformation:

      import net.sf.saxon.s9api.*;
      import javax.xml.transform.*;
      import javax.xml.transform.sax.SAXSource;
      // ...

      // create the processor first, since the document builder below needs it
      Processor processor = new Processor(false);

      // read data
      XMLReader xmlReader = XMLReaderFactory.createXMLReader();
      // ... there is some more setting up the xmlReader here ...
      InputStream xsltStream = new FileInputStream(xsltFile);
      InputStream inputStream = new FileInputStream(inputFile);
      Source xsltSource = new SAXSource(xmlReader, new InputSource(xsltStream));
      Source inputSource = new SAXSource(xmlReader, new InputSource(inputStream));
      XdmNode input = processor.newDocumentBuilder().build(inputSource);

      // initialize transformation configuration
      XsltCompiler compiler = processor.newXsltCompiler();
      compiler.setErrorListener(this);
      XsltExecutable executable = compiler.compile(xsltSource);
      Serializer serializer = new Serializer();
      serializer.setOutputProperty(Serializer.Property.METHOD, "xml");
      serializer.setOutputProperty(Serializer.Property.INDENT, "no");
      serializer.setOutputStream(output);

      // execute transformation
      XsltTransformer transformer = executable.load();
      transformer.setInitialContextNode(input);
      transformer.setErrorListener(this);
      transformer.setDestination(serializer);
      transformer.setSchemaValidationMode(ValidationMode.STRIP);
      transformer.transform();

    I'd appreciate any hint pointing in the direction of a solution. :-) In case of any unclarity I'd be happy to give more details. Nightly greetings from Germany, Benjamin

    Read the article

  • Writing a ReSharper plugin that modifies the TextControl

    - by Ajaxx
    ReSharper claims to eat its own dogfood; specifically, JetBrains claim that many of the features of ReSharper are written on top of R# (the OpenAPI). I'm writing a simple plugin to fix up the comments in the current document or selection. When this plugin is run, it throws an exception as follows:

      Document can be modified inside a command scope only

    I've researched the error and can't find anything to help with this, so I'm hoping that possibly you've written a plugin to accomplish this. If not, I hope the snippet is enough to help others get their own plugins underway.

      using System;
      using System.IO;
      using System.Windows.Forms;
      using JetBrains.ActionManagement;
      using JetBrains.DocumentModel;
      using JetBrains.IDE;
      using JetBrains.TextControl;
      using JetBrains.Util;

      namespace TinkerToys.Actions
      {
          [ActionHandler("TinkerToys.RewriteComment")]
          public class RewriteCommentAction : IActionHandler
          {
              #region Implementation of IActionHandler

              /// <summary>
              /// Updates action visual presentation. If presentation.Enabled is set to false, Execute
              /// will not be called.
              /// </summary>
              /// <param name="context">DataContext</param>
              /// <param name="presentation">presentation to update</param>
              /// <param name="nextUpdate">delegate to call</param>
              public bool Update(IDataContext context, ActionPresentation presentation, DelegateUpdate nextUpdate)
              {
                  ITextControl textControl = context.GetData(DataConstants.TEXT_CONTROL);
                  return textControl != null;
              }

              /// <summary>
              /// Executes action. Called after Update has set ActionPresentation.Enabled to true.
              /// </summary>
              /// <param name="context">DataContext</param>
              /// <param name="nextExecute">delegate to call</param>
              public void Execute(IDataContext context, DelegateExecute nextExecute)
              {
                  ITextControl textControl = context.GetData(DataConstants.TEXT_CONTROL);
                  if (textControl != null)
                  {
                      TextRange textSelectRange;
                      ISelectionModel textSelectionModel = textControl.SelectionModel;
                      if ((textSelectionModel != null) && textSelectionModel.HasSelection())
                      {
                          textSelectRange = textSelectionModel.Range;
                      }
                      else
                      {
                          textSelectRange = new TextRange(0, textControl.Document.GetTextLength());
                      }

                      IDocument textDocument = textControl.Document;
                      String textSelection = textDocument.GetText(textSelectRange);
                      if (textSelection != null)
                      {
                          StringReader sReader = new StringReader(textSelection);
                          StringWriter sWriter = new StringWriter();
                          Converter.Convert(sReader, sWriter);
                          textSelection = sWriter.ToString();
                          textDocument.ReplaceText(textSelectRange, textSelection);
                      }
                  }
              }

              #endregion
          }
      }

    So what is this command scope it wants so badly? I had some additional logging in this prior to posting, so I'm absolutely certain that both the range and the text are valid. In addition, the error seems to indicate that I'm missing some scope that I've been, as yet, unable to find.

    Read the article

  • Error when sending an SMS

    - by deni
    I have a problem with this code: I get an error when I execute

      CommSetting.comm.SendMessage(pdu);

    The error is "Phone reports generic communication error or syntax error". Can you tell me how to solve this problem? Thanks. I'm developing in C# with VS 2008.

    Read the article

  • installing Python application with Python under windows

    - by mack369
    My application uses many Python libraries (Django, Twisted, xmlrpc). I cannot expect the end user to have Python installed with all the needed libraries. I've created a fancy installer for my application using Inno Setup, but I don't think it is a good solution to execute five other setup programs from my installer; it would be annoying for the user to click the "Next" button 15 times. Is there any way to do that quietly?

    Read the article

  • Help getting started with MEF

    - by SteveCl
    Hi, I was reading somewhere that with MEF I can simply drop a DLL into a directory and my application (with some MEF magic) will be able to read it and execute the code in it - hopefully only classes that implement an interface that I define? Can someone help me get going, with some links maybe, for my problem? I've looked through some of the docs, but nothing seems to be what I'm after, and it's tricky when I don't know exactly what to search on... Thx S
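
    That drop-a-DLL-in-a-folder scenario is what DirectoryCatalog covers. A minimal sketch, assuming .NET 4's System.ComponentModel.Composition; the names IPlugin, HelloPlugin and PluginHost are made up for illustration:

      using System;
      using System.Collections.Generic;
      using System.ComponentModel.Composition;
      using System.ComponentModel.Composition.Hosting;

      // Contract shared between the host and the plugin DLLs (usually kept in a common assembly).
      public interface IPlugin
      {
          void Run();
      }

      // Inside any DLL dropped into the plugins folder:
      [Export(typeof(IPlugin))]
      public class HelloPlugin : IPlugin
      {
          public void Run() { Console.WriteLine("Hello from a dropped-in DLL"); }
      }

      // Host side: scan the directory and compose everything that exports IPlugin.
      public class PluginHost
      {
          [ImportMany]
          public IEnumerable<IPlugin> Plugins { get; set; }

          public void LoadAndRun(string pluginDirectory)
          {
              using (var catalog = new DirectoryCatalog(pluginDirectory))
              using (var container = new CompositionContainer(catalog))
              {
                  container.ComposeParts(this);   // populates the [ImportMany] property
                  foreach (var plugin in Plugins)
                      plugin.Run();
              }
          }
      }

    Only classes that export the IPlugin contract get instantiated, so code in the DLL is not run merely by being present in the folder (beyond normal assembly loading).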

    Read the article

  • Multiple key binding in WPF

    - by nihi_l_ist
    How can I execute some command on, let's say, Ctrl+Shift+E? As I saw, we can write the following:

      KeyBinding kb = new KeyBinding(TestCommand, Key.E, ModifierKeys.Control);
      this.InputBindings.Add(kb);

    But how can I add more ModifierKeys or Keys?
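
    ModifierKeys is a [Flags] enum, so several modifiers can be OR'd together in the same binding. A small sketch, assuming TestCommand is the ICommand from the snippet above:

      // Ctrl+Shift+E via the (command, key, modifiers) constructor:
      var kb = new KeyBinding(TestCommand, Key.E, ModifierKeys.Control | ModifierKeys.Shift);
      this.InputBindings.Add(kb);

      // Equivalent, via an explicit KeyGesture:
      var kb2 = new KeyBinding(TestCommand, new KeyGesture(Key.E, ModifierKeys.Control | ModifierKeys.Shift));
      this.InputBindings.Add(kb2);

    True multi-keystroke chords (for example Ctrl+K followed by Ctrl+D) are a different matter: KeyGesture describes a single key plus modifiers, so chords need a custom InputGesture or manual key tracking.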

    Read the article

  • Best Practice: Legitimate Cross-Site Scripting

    - by Ryan
    While cross-site scripting is generally regarded as negative, I've run into several situations where it's necessary.

    I was recently working within the confines of a very limiting content management system. I needed to include database code within the page, but the hosting server didn't have anything usable available. I set up a couple of barebones scripts on my own server, originally thinking that I could use AJAX to import the contents of my scripts directly into the template of the CMS (thus retaining dynamic images, menu items, CSS, etc.). I was wrong. Due to the limitations of XMLHttpRequest objects, it's not possible to grab content from a different domain.

    So I thought "iFrame" - even though I'm not a fan of frames, I thought that I could create a frame that matched the width and height of the content, so that it would appear native. Again, I was blocked by cross-site scripting "protections." While I could indeed load a remote file into the iFrame, I couldn't execute JavaScript to modify its size on either the host page or inside the loaded page.

    In this particular scenario, I wasn't able to point a subdomain to my server. I also couldn't create a script on the CMS server that could proxy content from my server, so my last thought was to use a remote JavaScript. A remote JavaScript works. It breaks when the user has JavaScript disabled, which is a downside; but it works. The "problem" I was having with using a remote JavaScript was that I had to use the JS function document.write() to output any content. Any output that isn't JS causes script errors. In addition to using document.write() for every line, you also have to ensure that the content is escaped - or else you end up with more script errors.

    My solution was as follows: my script received a GET parameter ("page") and then looked for the file ({$page}.php), and read the contents into a variable. However, I had to use awkward buffering techniques in order to actually execute the included scripts (for things like database interaction), then strip the final content of all line break characters ("\n"), followed by escaping all required characters. The end result is that my original script (which outputs JavaScript) accesses seemingly "standard" scripts on my server and converts their standard output to JavaScript for displaying within the CMS template.

    While this solution works, it seems like there may be a better way to accomplish the same thing. What is the best way to make cross-site scripting work specifically for the purpose of including content from a completely different domain?
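
    Of the options listed, the server-side proxy is usually the cleanest when one can be deployed, since the browser then only ever talks to its own origin. A rough C# sketch of that idea as an ASP.NET generic handler; the remote base URL and the page whitelist are placeholders, not anything from the original setup:

      using System;
      using System.Net;
      using System.Web;

      // Requested as /Proxy.ashx?page=news - fetches the matching script output from the
      // remote server and relays it, so no cross-domain request happens in the browser.
      public class Proxy : IHttpHandler
      {
          private const string RemoteBase = "http://remote.example.com/scripts/";
          private static readonly string[] AllowedPages = { "news", "gallery" }; // never proxy arbitrary URLs

          public void ProcessRequest(HttpContext context)
          {
              string page = context.Request.QueryString["page"];
              if (Array.IndexOf(AllowedPages, page) < 0)
              {
                  context.Response.StatusCode = 400;
                  return;
              }

              using (var client = new WebClient())
              {
                  string html = client.DownloadString(RemoteBase + page + ".php");
                  context.Response.ContentType = "text/html";
                  context.Response.Write(html);
              }
          }

          public bool IsReusable { get { return true; } }
      }

    JSONP (emitting the content as a script that calls a named callback) is the other common workaround when no proxy is possible, and it degrades in exactly the same way as the document.write() approach when JavaScript is disabled.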

    Read the article

  • Active Perl Installation on Windows operating system

    - by pavun_cool
    Hi all. I have installed ActivePerl on my Windows OS, following the installation procedure from the URL below. After that, I tried to run "perl -v" on the command line, but it gives the following error:

      The system cannot execute the specified program

    What do I need to do to solve this issue? Can anyone help me out of this problem? Thanks.

    Read the article

  • MySQL query, 2 similar servers, 2 minute difference in execution times

    - by mr12086
    I had a similar question on Stack Overflow, but it seems to be more server/MySQL setup related than coding. The queries below all execute instantly on our development server, whereas on production they can take up to 2 minutes 20 seconds. The query execution time seems to be affected by how ambiguous the LIKE strings are: if they closely match a country that has few matches it will take less time, and if you use something like 'ge' for Germany it will take longer to execute. But this doesn't always work out like that; at times it's quite erratic. "Sending data" appears to be the culprit, but why, and what does that mean? Also, memory on production looks to be quite low (free memory)?

    Production:
      Intel Quad Xeon E3-1220 3.1GHz
      4GB DDR3
      2x 1TB SATA in RAID1
      Network speed 100Mb
      Ubuntu

    Development:
      Intel Core i3-2100, 2C/4T, 3.10GHz
      500 GB SATA - No RAID
      4GB DDR3

    UPDATE 2: mysqltuner output:

    [prod]
      -------- General Statistics --------------------------------------------------
      [--] Skipped version check for MySQLTuner script
      [OK] Currently running supported MySQL version 5.1.61-0ubuntu0.10.04.1
      [OK] Operating on 64-bit architecture
      -------- Storage Engine Statistics -------------------------------------------
      [--] Status: +Archive -BDB -Federated +InnoDB -ISAM -NDBCluster
      [--] Data in MyISAM tables: 103M (Tables: 180)
      [--] Data in InnoDB tables: 491M (Tables: 19)
      [!!] Total fragmented tables: 38
      -------- Security Recommendations -------------------------------------------
      [OK] All database users have passwords assigned
      -------- Performance Metrics -------------------------------------------------
      [--] Up for: 77d 4h 6m 1s (53M q [7.968 qps], 14M conn, TX: 87B, RX: 12B)
      [--] Reads / Writes: 98% / 2%
      [--] Total buffers: 58.0M global + 2.7M per thread (151 max threads)
      [OK] Maximum possible memory usage: 463.8M (11% of installed RAM)
      [OK] Slow queries: 0% (12K/53M)
      [OK] Highest usage of available connections: 22% (34/151)
      [OK] Key buffer size / total MyISAM indexes: 16.0M/10.6M
      [OK] Key buffer hit rate: 98.7% (162M cached / 2M reads)
      [OK] Query cache efficiency: 20.7% (7M cached / 36M selects)
      [!!] Query cache prunes per day: 3934
      [OK] Sorts requiring temporary tables: 1% (3K temp sorts / 230K sorts)
      [!!] Joins performed without indexes: 71068
      [OK] Temporary tables created on disk: 24% (3M on disk / 13M total)
      [OK] Thread cache hit rate: 99% (690 created / 14M connections)
      [!!] Table cache hit rate: 0% (64 open / 85M opened)
      [OK] Open file limit used: 12% (128/1K)
      [OK] Table locks acquired immediately: 99% (16M immediate / 16M locks)
      [!!] InnoDB data size / buffer pool: 491.9M/8.0M
      -------- Recommendations -----------------------------------------------------
      General recommendations:
        Run OPTIMIZE TABLE to defragment tables for better performance
        Enable the slow query log to troubleshoot bad queries
        Adjust your join queries to always utilize indexes
        Increase table_cache gradually to avoid file descriptor limits
      Variables to adjust:
        query_cache_size (> 16M)
        join_buffer_size (> 128.0K, or always use indexes with joins)
        table_cache (> 64)
        innodb_buffer_pool_size (>= 491M)

    [dev]
      -------- General Statistics --------------------------------------------------
      [--] Skipped version check for MySQLTuner script
      [OK] Currently running supported MySQL version 5.1.62-0ubuntu0.11.10.1
      [!!] Switch to 64-bit OS - MySQL cannot currently use all of your RAM
      -------- Storage Engine Statistics -------------------------------------------
      [--] Status: +Archive -BDB -Federated +InnoDB -ISAM -NDBCluster
      [--] Data in MyISAM tables: 185M (Tables: 632)
      [--] Data in InnoDB tables: 967M (Tables: 38)
      [!!] Total fragmented tables: 73
      -------- Security Recommendations -------------------------------------------
      [OK] All database users have passwords assigned
      -------- Performance Metrics -------------------------------------------------
      [--] Up for: 1d 2h 26m 9s (5K q [0.058 qps], 1K conn, TX: 4M, RX: 1M)
      [--] Reads / Writes: 99% / 1%
      [--] Total buffers: 58.0M global + 2.7M per thread (151 max threads)
      [OK] Maximum possible memory usage: 463.8M (11% of installed RAM)
      [OK] Slow queries: 0% (0/5K)
      [OK] Highest usage of available connections: 1% (2/151)
      [OK] Key buffer size / total MyISAM indexes: 16.0M/18.6M
      [OK] Key buffer hit rate: 99.9% (60K cached / 36 reads)
      [OK] Query cache efficiency: 44.5% (1K cached / 2K selects)
      [OK] Query cache prunes per day: 0
      [OK] Sorts requiring temporary tables: 0% (0 temp sorts / 44 sorts)
      [OK] Temporary tables created on disk: 24% (162 on disk / 666 total)
      [OK] Thread cache hit rate: 99% (2 created / 1K connections)
      [!!] Table cache hit rate: 1% (64 open / 4K opened)
      [OK] Open file limit used: 8% (88/1K)
      [OK] Table locks acquired immediately: 100% (1K immediate / 1K locks)
      [!!] InnoDB data size / buffer pool: 967.7M/8.0M
      -------- Recommendations -----------------------------------------------------
      General recommendations:
        Run OPTIMIZE TABLE to defragment tables for better performance
        Enable the slow query log to troubleshoot bad queries
        Increase table_cache gradually to avoid file descriptor limits
      Variables to adjust:
        table_cache (> 64)
        innodb_buffer_pool_size (>= 967M)

    UPDATE 1: When testing the queries listed here there is usually no more than one other query taking place, and usually none. Production is actually handling Apache requests that development gets very few of, as it's only myself and one other person who accesses it - could the 4GB of RAM be getting exhausted by using the single machine for both the Apache and MySQL servers?

    Production:
      sudo hdparm -tT /dev/sda
      /dev/sda:
       Timing cached reads:   24872 MB in 2.00 seconds = 12450.72 MB/sec
       Timing buffered disk reads:  368 MB in 3.00 seconds = 122.49 MB/sec
      sudo hdparm -tT /dev/sdb
      /dev/sdb:
       Timing cached reads:   24786 MB in 2.00 seconds = 12407.22 MB/sec
       Timing buffered disk reads:  350 MB in 3.00 seconds = 116.53 MB/sec
      Server version (MySQL + Ubuntu versions): 5.1.61-0ubuntu0.10.04.1

    Development:
      sudo hdparm -tT /dev/sda
      /dev/sda:
       Timing cached reads:   10632 MB in 2.00 seconds = 5319.40 MB/sec
       Timing buffered disk reads:  400 MB in 3.01 seconds = 132.85 MB/sec
      Server version (MySQL + Ubuntu versions): 5.1.62-0ubuntu0.11.10.1

    ORIGINAL DATA: This query is NOT the query in question but is related, so I'll post it.

      SELECT f.form_question_has_answer_id
      FROM form_question_has_answer f
      INNER JOIN project_company_has_user p ON f.form_question_has_answer_user_id = p.project_company_has_user_user_id
      INNER JOIN company c ON p.project_company_has_user_company_id = c.company_id
      INNER JOIN project p2 ON p.project_company_has_user_project_id = p2.project_id
      INNER JOIN user u ON p.project_company_has_user_user_id = u.user_id
      INNER JOIN form f2 ON p.project_company_has_user_project_id = f2.form_project_id
      WHERE (f2.form_template_name = 'custom'
             AND p.project_company_has_user_garbage_collection = 0
             AND p.project_company_has_user_project_id = '29')
        AND (LCASE(c.company_country) LIKE '%ge%' OR LCASE(c.company_country) LIKE '%abcde%')
        AND f.form_question_has_answer_form_id = '174'

    And the explain plan for the above query (run on both dev and production, producing the same plan):

      | id | select_type | table | type   | possible_keys | key | key_len | ref | rows | Extra |
      | 1  | SIMPLE      | p2    | const  | PRIMARY | PRIMARY | 4 | const | 1 | Using index |
      | 1  | SIMPLE      | f     | ref    | form_question_has_answer_form_id,form_question_has_answer_user_id | form_question_has_answer_form_id | 4 | const | 796 | Using where |
      | 1  | SIMPLE      | u     | eq_ref | PRIMARY | PRIMARY | 4 | new_klarents.f.form_question_has_answer_user_id | 1 | Using index |
      | 1  | SIMPLE      | p     | ref    | project_company_has_user_unique_key,project_company_has_user_user_id,project_company_has_user_company_id,project_company_has_user_project_id | project_company_has_user_user_id | 4 | new_klarents.f.form_question_has_answer_user_id | 1 | Using where |
      | 1  | SIMPLE      | f2    | ref    | form_project_id | form_project_id | 4 | const | 15 | Using where |
      | 1  | SIMPLE      | c     | eq_ref | PRIMARY | PRIMARY | 4 | new_klarents.p.project_company_has_user_company_id | 1 | Using where |

    This query takes 2 minutes ~20 seconds to execute. The query that is ACTUALLY being run on the server is this one:

      SELECT COUNT(*) AS num_results
      FROM (SELECT f.form_question_has_answer_id
            FROM form_question_has_answer f
            INNER JOIN project_company_has_user p ON f.form_question_has_answer_user_id = p.project_company_has_user_user_id
            INNER JOIN company c ON p.project_company_has_user_company_id = c.company_id
            INNER JOIN project p2 ON p.project_company_has_user_project_id = p2.project_id
            INNER JOIN user u ON p.project_company_has_user_user_id = u.user_id
            INNER JOIN form f2 ON p.project_company_has_user_project_id = f2.form_project_id
            WHERE (f2.form_template_name = 'custom'
                   AND p.project_company_has_user_garbage_collection = 0
                   AND p.project_company_has_user_project_id = '29')
              AND (LCASE(c.company_country) LIKE '%ge%' OR LCASE(c.company_country) LIKE '%abcde%')
              AND f.form_question_has_answer_form_id = '174'
            GROUP BY f.form_question_has_answer_id;) dctrn_count_query;

    With explain plans (again the same on dev and production):

      | id | select_type | table | type   | possible_keys | key | key_len | ref | rows | Extra |
      | 1  | PRIMARY     | NULL  | NULL   | NULL | NULL | NULL | NULL | NULL | Select tables optimized away |
      | 2  | DERIVED     | p2    | const  | PRIMARY | PRIMARY | 4 | | 1 | Using index |
      | 2  | DERIVED     | f     | ref    | form_question_has_answer_form_id,form_question_has_answer_user_id | form_question_has_answer_form_id | 4 | | 797 | Using where |
      | 2  | DERIVED     | p     | ref    | project_company_has_user_unique_key,project_company_has_user_user_id,project_company_has_user_company_id,project_company_has_user_project_id,project_company_has_user_garbage_collection | project_company_has_user_user_id | 4 | new_klarents.f.form_question_has_answer_user_id | 1 | Using where |
      | 2  | DERIVED     | f2    | ref    | form_project_id | form_project_id | 4 | | 15 | Using where |
      | 2  | DERIVED     | c     | eq_ref | PRIMARY | PRIMARY | 4 | new_klarents.p.project_company_has_user_company_id | 1 | Using where |
      | 2  | DERIVED     | u     | eq_ref | PRIMARY | PRIMARY | 4 | new_klarents.p.project_company_has_user_user_id | 1 | Using where; Using index |

    On the production server the information I have is as follows. Upon execution:

      +-------------+
      | num_results |
      +-------------+
      |           3 |
      +-------------+
      1 row in set (2 min 14.28 sec)

    Show profile:

      | Status                         | Duration   |
      | starting                       | 0.000016   |
      | checking query cache for query | 0.000057   |
      | Opening tables                 | 0.004388   |
      | System lock                    | 0.000003   |
      | Table lock                     | 0.000036   |
      | init                           | 0.000030   |
      | optimizing                     | 0.000016   |
      | statistics                     | 0.000111   |
      | preparing                      | 0.000022   |
      | executing                      | 0.000004   |
      | Sorting result                 | 0.000002   |
      | Sending data                   | 136.213836 |
      | end                            | 0.000007   |
      | query end                      | 0.000002   |
      | freeing items                  | 0.004273   |
      | storing result in query cache  | 0.000010   |
      | logging slow query             | 0.000001   |
      | logging slow query             | 0.000002   |
      | cleaning up                    | 0.000002   |

    On development the results are as follows:

      +-------------+
      | num_results |
      +-------------+
      |           3 |
      +-------------+
      1 row in set (0.08 sec)

    Again the profile for this query:

      | Status                         | Duration |
      | starting                       | 0.000022 |
      | checking query cache for query | 0.000148 |
      | Opening tables                 | 0.000025 |
      | System lock                    | 0.000008 |
      | Table lock                     | 0.000101 |
      | optimizing                     | 0.000035 |
      | statistics                     | 0.001019 |
      | preparing                      | 0.000047 |
      | executing                      | 0.000008 |
      | Sorting result                 | 0.000005 |
      | Sending data                   | 0.086565 |
      | init                           | 0.000015 |
      | optimizing                     | 0.000006 |
      | executing                      | 0.000020 |
      | end                            | 0.000004 |
      | query end                      | 0.000004 |
      | freeing items                  | 0.000028 |
      | storing result in query cache  | 0.000005 |
      | removing tmp table             | 0.000008 |
      | closing tables                 | 0.000008 |
      | logging slow query             | 0.000002 |
      | cleaning up                    | 0.000005 |

    If I remove the user and/or project inner joins, the query is reduced to 30s. Last bit of information I have: the MySQL server and Apache are on the same box; there is only one box for production.

    Production output from top, before and after:

      top - 15:43:25 up 78 days, 12:11,  4 users,  load average: 1.42, 0.99, 0.78
      Tasks: 162 total,   2 running, 160 sleeping,   0 stopped,   0 zombie
      Cpu(s):  0.1%us, 50.4%sy,  0.0%ni, 49.5%id,  0.0%wa,  0.0%hi,  0.0%si,  0.0%st
      Mem:   4037868k total,  3772580k used,   265288k free,   243704k buffers
      Swap:  3905528k total,   265384k used,  3640144k free,  1207944k cached

      top - 15:44:31 up 78 days, 12:13,  4 users,  load average: 1.94, 1.23, 0.87
      Tasks: 160 total,   2 running, 157 sleeping,   0 stopped,   1 zombie
      Cpu(s):  0.2%us, 50.6%sy,  0.0%ni, 49.3%id,  0.0%wa,  0.0%hi,  0.0%si,  0.0%st
      Mem:   4037868k total,  3834300k used,   203568k free,   243736k buffers
      Swap:  3905528k total,   265384k used,  3640144k free,  1207804k cached

    But this isn't a good representation of production's normal status, so here is a grab of it from today, outside of executing the queries:

      top - 11:04:58 up 79 days,  7:33,  4 users,  load average: 0.39, 0.58, 0.76
      Tasks: 156 total,   1 running, 155 sleeping,   0 stopped,   0 zombie
      Cpu(s):  3.3%us,  2.8%sy,  0.0%ni, 93.9%id,  0.0%wa,  0.0%hi,  0.0%si,  0.0%st
      Mem:   4037868k total,  3676136k used,   361732k free,   271480k buffers
      Swap:  3905528k total,   268736k used,  3636792k free,  1063432k cached

    Development (this one doesn't change during or after):

      top - 15:47:07 up 110 days, 22:11,  7 users,  load average: 0.17, 0.07, 0.06
      Tasks: 210 total,   2 running, 208 sleeping,   0 stopped,   0 zombie
      Cpu(s):  0.1%us,  0.2%sy,  0.0%ni, 99.7%id,  0.0%wa,  0.0%hi,  0.0%si,  0.0%st
      Mem:   4111972k total,  1821100k used,  2290872k free,   238860k buffers
      Swap:  4183036k total,    66472k used,  4116564k free,   921072k cached

    Read the article

  • Debugging utilities for Linux process hang issues?

    - by Niranjan
    I have a daemon process which does the configuration management; all the other processes should interact with this daemon for their functioning. But when I execute a large action, after a few hours the daemon process becomes unresponsive for 2 to 3 hours, and after those 2-3 hours it works normally again. What debugging utilities are there for Linux process hang issues? How do I find out at what point the Linux process hangs?

    Read the article

  • Google Bookmarks Accelerator for IE8

    - by MACHiNESMiTH
    To Editors: Sorry, I didn't realise the question thing till you pointed it out. Also, I didn't know about the appengine tag; I'm assuming it was selected by mistake from that JS autosuggestion box (it usually happens with me, I run a P3). My apologies for that too. I've edited it now; if it doesn't work, let me know.

    Hi all, I'm making (or at least trying to make) a Google Bookmarks accelerator for IE8, but I keep running into "Internet Explorer could not install this accelerator. There was a problem with the Accelerator's information." Does anyone know why this is happening? Here is the code I've made:

      <?xml version="1.0" encoding="UTF-8" ?>
      <os:openServiceDescription xmlns="http://www.microsoft.com/schemas/openservicedescription/1.0">
        <os:homepageUrl>http://www.google.com/bookmarks</os:homepageUrl>
        <os:display>
          <os:description>Add this page to GoogleBookmarks</os:description>
          <os:name>Add to GoogleBookmarks</os:name>
          <os:icon>http://www.google.com/favicon.ico</os:icon>
        </os:display>
        <os:activity category="Bookmarks">
          <os:activityAction context="document">
            <os:execute method="get" action="http://www.google.com/bookmarks/mark?op=add&output=popup">
              <os:parameter name="bkmk" value="{documentUrl}" />
              <os:parameter name="title" value="{documentTitle}" />
              <os:parameter name="annotation" value="{selection}" />
            </os:execute>
          </os:activityAction>
        </os:activity>
      </os:openServiceDescription>

    These are the references I used: the MSDN Developer's Guide, the MSDN Format Specification, and the unofficial Google API (used as a lookup so far). And yes, I know that I cannot send document variables between schemes (HTTP - HTTPS, for example) or between servers in different security zones (Intranet - Internet, etc.).

    Also, if this is the wrong place, what are the recommended sites and/or forums for XML-related posts? (Not really questions, but more like full-blown, detailed, rant-like posts.) Thanks

    Read the article

  • MySQL function: Rotate old entries to archive table

    - by confiq
    Hi, I'm looking for a function that will take rows older than X days and put them in an archive table... I was thinking of making it a function so it will be easier to execute. Something like:

      CREATE TABLE archive_NUMBER_OF_WEEK (...);
      INSERT INTO archive_NUMBER_OF_WEEK SELECT * FROM content WHERE DATE < X days;
      DELETE FROM content WHERE DATE < X days;
      RENAME TABLE content TO content_backup, content_temp TO content;

    Will post when I finish it :)

    Read the article

  • NHibernate (3.1.0.4000) NullReferenceException using Query<> and NHibernate Facility

    - by TigerShark
    I have a problem with NHibernate that I can't seem to find any solution for. In my project I have a simple entity (Batch), but whenever I try to run the following test, I get an exception. I've tried a couple of different ways to perform a similar query, but I get an almost identical exception for all of them (it differs only in which LINQ method is being executed). The first test:

      [Test]
      public void QueryLatestBatch()
      {
          using (var session = SessionManager.OpenSession())
          {
              var batch = session.Query<Batch>()
                  .FirstOrDefault();

              Assert.That(batch, Is.Not.Null);
          }
      }

    The exception:

      System.NullReferenceException : Object reference not set to an instance of an object.
      at NHibernate.Linq.NhQueryProvider.PrepareQuery(Expression expression, ref IQuery query, ref NhLinqExpression nhQuery)
      at NHibernate.Linq.NhQueryProvider.Execute(Expression expression)
      at System.Linq.Queryable.FirstOrDefault(IQueryable`1 source)

    The second test:

      [Test]
      public void QueryLatestBatch2()
      {
          using (var session = SessionManager.OpenSession())
          {
              var batch = session.Query<Batch>()
                  .OrderBy(x => x.Executed)
                  .Take(1)
                  .SingleOrDefault();

              Assert.That(batch, Is.Not.Null);
          }
      }

    The exception:

      System.NullReferenceException : Object reference not set to an instance of an object.
      at NHibernate.Linq.NhQueryProvider.PrepareQuery(Expression expression, ref IQuery query, ref NhLinqExpression nhQuery)
      at NHibernate.Linq.NhQueryProvider.Execute(Expression expression)
      at System.Linq.Queryable.SingleOrDefault(IQueryable`1 source)

    However, this one is passing (using QueryOver<>):

      [Test]
      public void QueryOverLatestBatch()
      {
          using (var session = SessionManager.OpenSession())
          {
              var batch = session.QueryOver<Batch>()
                  .OrderBy(x => x.Executed).Asc
                  .Take(1)
                  .SingleOrDefault();

              Assert.That(batch, Is.Not.Null);
              Assert.That(batch.Executed, Is.LessThan(DateTime.Now));
          }
      }

    Using the QueryOver<> API is not bad at all, but I'm just kind of baffled that the Query<> API isn't working, which is kind of sad, since the First() operation is very concise and our developers really enjoy LINQ. I really hope there is a solution to this, as it seems strange if these methods are failing such a simple test.

    EDIT: I'm using Oracle 11g; my mappings are done with FluentNHibernate, registered through Castle Windsor with the NHibernate Facility. As I wrote, the odd thing is that the query works perfectly with the QueryOver<> API, but not through LINQ.

    Read the article

  • Is it possible for a java.lang.Process to inherit the environment variables from another java.lang.Process?

    - by Eric Leung
    I am trying to use Groovy to do shell scripting on Unix, but I am not having any luck getting one process to retain the environment variables changed by another process. For example:

      def p1 = ["bash", "-c", "source /some/setEnv.sh"].execute()

    Now I would like a second process, p2, to inherit the environment variables that were set in p1. How can I do this? I don't see anything in java.lang.Process or its Groovy extension that would spit out the environment variables after the process has executed.

    Read the article

  • VS2010 Build Definitions - Creating a MSBuild task for JS/CSS Minification

    - by RPM1984
    Hi guys, we've recently upgraded from VS2008 to VS2010 (and hence from Web Deployment Projects to proper deployment templates). Alas, my Workflow skills aren't quite up to scratch. Previously, we used an MSBuild task to execute the Yahoo YUI JavaScript/CSS compression module to minify/compress JavaScript and CSS files. Has anyone managed to accomplish this task with Visual Studio 2010 / TFS 2010?
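
    For what it's worth, the minification can still live in an ordinary custom MSBuild task wired into the .csproj (for example in an AfterBuild target), so the TFS 2010 build workflow only has to invoke MSBuild as it already does. A rough sketch of such a task that shells out to the YUI Compressor jar; the jar path, argument format and property names are assumptions to adapt, not a drop-in implementation:

      using System.Diagnostics;
      using Microsoft.Build.Framework;
      using Microsoft.Build.Utilities;

      // Minifies each input file in place by running an external compressor over it.
      public class MinifyFiles : Task
      {
          [Required]
          public ITaskItem[] Files { get; set; }        // e.g. an @(JsAndCssFiles) item group

          [Required]
          public string CompressorJar { get; set; }     // path to the yuicompressor jar

          public override bool Execute()
          {
              foreach (var file in Files)
              {
                  string path = file.ItemSpec;
                  Log.LogMessage(MessageImportance.Normal, "Minifying {0}", path);

                  var psi = new ProcessStartInfo("java",
                      string.Format("-jar \"{0}\" -o \"{1}\" \"{1}\"", CompressorJar, path))
                  {
                      UseShellExecute = false,
                      CreateNoWindow = true
                  };

                  using (var process = Process.Start(psi))
                  {
                      process.WaitForExit();
                      if (process.ExitCode != 0)
                      {
                          Log.LogError("Minification failed for {0}", path);
                          return false;
                      }
                  }
              }
              return true;
          }
      }

    The compiled task is then registered with a <UsingTask> element and called from a target after compilation, which keeps the behaviour identical whether the build runs locally or on the TFS build agent.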

    Read the article

  • Long Running Stored Proc - Report Progress Using BackgroundWorker & Timer

    - by daveywc
    While a long running stored proc (RMR_Seek) is executing (called via a LINQ to SQL data context), I am trying to call another stored proc (RMR_GetLatestModelMessage) to check a table for the latest status message. The long running stored proc updates the table in question with status messages as it executes. I want to display the status message on a message panel to advise the user of the status of the execution of RMR_Seek. For various reasons it is not possible to determine how long RMR_Seek will take to execute, so a progress bar with percentage increments is not feasible.

    I thought I'd found the way to do it by calling the long running stored proc from a BackgroundWorker DoWork event handler. This worked fine and allowed me to update my message panel with some dummy status messages that were NOT obtained via RMR_GetLatestModelMessage while RMR_Seek was running. However, now that I have tried to implement this fully, by calling RMR_GetLatestModelMessage to obtain the status messages, I am running into problems that seem to be related to the mix of the BackgroundWorker and my System.Windows.Forms.Timer.

    An extract of the code I am using is below. I have tried many different ways around this, but each one seems to present its own set of problems. The code below is problematic in the DoWork event: the RMR_Seek stored proc gets called but does not execute properly, and it also seems to be inconsistent as to whether _IsCompleted gets set to true. I'm sure there is a better way to achieve what I am trying to do.

      private bool _IsCompleted;

      private void RunRevenueSeek()
      {
          if (_SelectedModel == null)
          {
              MessageBox.Show("Please select a model from the list and try again.", "Model Generation", MessageBoxButtons.OK, MessageBoxIcon.Information);
          }
          else
          {
              var bw = new BackgroundWorker();
              bw.DoWork += new DoWorkEventHandler(bw_DoWork);

              ProgressPanelControl.Visible = true;
              _IsCompleted = false;
              MessageTimer.Start();   // has an interval of 3000
              bw.RunWorkerAsync();
              ProgressLabelControl.Text = "Refreshing Data";
              this.Update();
              // ...more code goes here
          }
      }

      private void bw_DoWork(object sender, DoWorkEventArgs e)
      {
          using (var dc = new RevMdlrDataClassesDataContext())
          {
              dc.CommandTimeout = 300;
              dc.RMR_Seek(_SelectedModel.ModelSet_ID);
              _IsCompleted = true;
          }
      }

      private void MessageTimer_Tick(object sender, EventArgs e)
      {
          string message = "";

          if (_IsCompleted)
          {
              MessageTimer.Stop();
          }
          else
          {
              using (var dc = new RevMdlrDataClassesDataContext())
              {
                  dc.CommandTimeout = 300;
                  dc.RMR_GetLatestModelMessage(_SelectedModel.ModelSet_ID, ref message);
                  ProgressLabelControl.Text = message;
                  this.Update();
              }
          }
      }
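
    One thing worth trying, as a sketch rather than a definitive fix: drop the shared _IsCompleted flag and let RunWorkerCompleted (which is raised back on the UI thread) stop the timer and surface any exception thrown inside DoWork; exceptions there are otherwise swallowed until e.Error is inspected, which could explain RMR_Seek apparently "not executing properly". Reusing the names from the question:

      private void RunRevenueSeek()
      {
          if (_SelectedModel == null) return;   // validation message as in the original

          var bw = new BackgroundWorker();

          bw.DoWork += (s, e) =>
          {
              using (var dc = new RevMdlrDataClassesDataContext())
              {
                  dc.CommandTimeout = 300;
                  dc.RMR_Seek(_SelectedModel.ModelSet_ID);   // blocks only this worker thread
              }
          };

          bw.RunWorkerCompleted += (s, e) =>
          {
              MessageTimer.Stop();                           // back on the UI thread here
              ProgressPanelControl.Visible = false;
              if (e.Error != null)
                  ProgressLabelControl.Text = "RMR_Seek failed: " + e.Error.Message;
          };

          ProgressPanelControl.Visible = true;
          ProgressLabelControl.Text = "Refreshing Data";
          MessageTimer.Start();                              // Tick keeps polling RMR_GetLatestModelMessage
          bw.RunWorkerAsync();
      }

    The MessageTimer_Tick handler stays as in the question, minus the _IsCompleted branch.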

    Read the article
