Search Results

Search found 65754 results on 2631 pages for 'oracle application grid basics'.


  • How to avoid this very heavy query that slows down the application?

    - by Juan Paredes
    Hi, we have a web application running in a production environment, and at some point the client complained about how slow the application had become. When we checked what was going on with the application and the database, we discovered this "precious" query being executed by several users at the same time (thus inflicting an extremely high load on the database server): SELECT NULL AS table_cat, o.owner AS table_schem, o.object_name AS table_name, o.object_type AS table_type, NULL AS remarks FROM all_objects o WHERE o.owner LIKE :1 ESCAPE :"SYS_B_0" AND o.object_name LIKE :2 ESCAPE :"SYS_B_1" AND o.object_type IN(:"SYS_B_2", :"SYS_B_3") ORDER BY table_type, table_schem, table_name. Our application does not execute this query; I believe it is a Hibernate-internal query. I've found little information on why Hibernate issues this extremely heavy query, so any help in avoiding it is very much appreciated! The production environment information: Red Hat Enterprise Linux 5.3 (Tikanga), JDK 1.5, web container OC4J (within Oracle Application Server), Oracle Database 10.1.0.4, JDBC Driver for JDK 1.2 and 1.3, Hibernate version 3.2.6.ga, connection pool library C3P0 version 0.9.1. Thank you.
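
    A possible lead (an assumption, not something stated in the question): that ALL_OBJECTS query is what the Oracle JDBC driver runs for DatabaseMetaData.getTables(). Two common triggers are Hibernate's schema tool (hibernate.hbm2ddl.auto set to update or validate, which looks up table metadata for every mapped class) and c3p0's default connection test, which falls back to getTables() when no preferredTestQuery is configured. A minimal sketch of the c3p0 fix, assuming a standalone pool (the JDBC URL and credentials are placeholders; with a Hibernate-managed pool the same property can usually be supplied via a c3p0.properties file):

        import java.beans.PropertyVetoException;
        import com.mchange.v2.c3p0.ComboPooledDataSource;

        public class PoolConfig {
            public static ComboPooledDataSource newDataSource() throws PropertyVetoException {
                ComboPooledDataSource ds = new ComboPooledDataSource();
                ds.setDriverClass("oracle.jdbc.OracleDriver");
                ds.setJdbcUrl("jdbc:oracle:thin:@//dbhost:1521/ORCL"); // placeholder URL
                ds.setUser("app_user");                                // placeholder
                ds.setPassword("app_password");                       // placeholder
                // A cheap, constant-cost test query instead of the default
                // DatabaseMetaData.getTables() lookup (the ALL_OBJECTS query above):
                ds.setPreferredTestQuery("SELECT 1 FROM DUAL");
                return ds;
            }
        }

    If the query volume drops after this change, the connection test was the culprit; if not, checking the hbm2ddl.auto setting in hibernate.cfg.xml is the next place to look.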

    Read the article

  • Tips On Using The Service Contracts Import Program

    - by LuciaC
    Prior to release 12.1 there was no supported way to import contracts into the EBS Service Contracts application - there were no public APIs nor contract load programs provided. From release 12.1 onwards the 'Service Contracts Import Program' is provided to load service contracts into the application. The Service Contracts Import functionality is explained in How to Use the Service Contracts Import Program - Scope and Limitations (Doc ID 1057242.1). This note includes an attached document which explains the program architecture, shows the Entity Relationship Diagram, and details the interface table definitions. The Import program takes data from the interface tables listed below and populates the contracts schema tables:
        OKS_USAGE_COUNTERS_INTERFACE
        OKS_SALES_CREDITS_INTERFACE
        OKS_NOTES_INTERFACE
        OKS_LINES_INTERFACE
        OKS_HEADERS_INTERFACE
        OKS_COVERED_LEVELS_INTERFACE
    These interface tables must be loaded via a custom load program. The Service Contracts Import concurrent request is then submitted to create contracts from this legacy data. The parameters to run the Import program are:
        Parameter           Description
        Mode                Validate only, Import
        Batch Number        Batch_Id (unique id populated into the OKS_HEADERS_INTERFACE table)
        Number of Workers   Number of workers required (these are spawned as separate sub-requests)
        Commit size         Number of successfully processed contracts committed to the database
    The program spawns sub-requests for the import worker(s) and the 'Service Contracts Import Report'. The data is validated prior to import into the Contracts tables, and errors are reported in the Service Contracts Import Report program output file (Import Execution Report). Troubleshooting tips are provided in R12.1 - Common Service Contract Import Errors (Doc ID 762545.1); this document lists some, but not all, import errors. The document will be updated over time. Additional help is given in Debugging Tip for Service Contracts Import Errors (Doc ID 971426.1). After you successfully import contracts, you can purge the records from the interface tables by running the Service Contracts Import Purge concurrent program. Note that there is no supported way to mass delete data from the Contracts schema tables once they are populated, so data loaded by the Import program must be fully tested and verified before the program is run to load data into a Production system. A Service Contracts Import Test program has been provided which will take an existing contract in the application and load the interface tables using the data from that contract. This can be used as an example for guidance on how to load the interface tables. The Test program functionality is explained in How to Use the Service Contracts Test Import Program Provided in Release 12.1 (Doc ID 761209.1). Note that the Test program has some limitations which do not apply to the full Import program, and it is not a supported program; it is simply a testing tool.

    Read the article

  • Developing Schema Compare for Oracle (Part 3): Ghost Objects

    - by Simon Cooper
    In the previous blog post, I covered how we solved the problem of dependencies between objects and between schemas. However, that isn't the end of the issue. The dependencies algorithm I described works when you're querying live databases and you can get dependencies for a particular schema direct from the server, and that's all well and good. To throw a (rather large) spanner in the works, Schema Compare also has the concept of a snapshot, which is a read-only compressed XML representation of a selection of schemas that can be compared in the same way as a live database. This can be useful for keeping historical records or a baseline of a database schema, or for comparing a schema on a computer that doesn't have direct access to the database. So, how do snapshots interact with dependencies? Inter-database dependencies don't pose an issue, as we store the dependencies in the snapshot. However, comparing a snapshot to a live database with cross-schema dependencies does cause a problem: what if the live database has a dependency on an object that does not exist in the snapshot? Take a basic example schema, where you're only populating SchemaA:
        SOURCE:
        CREATE TABLE SchemaA.Table1 (Col1 NUMBER REFERENCES SchemaB.Table1(col1));
        CREATE TABLE SchemaB.Table1 (Col1 NUMBER PRIMARY KEY);
        TARGET (using snapshot):
        CREATE TABLE SchemaA.Table1 (Col1 VARCHAR2(100));
        CREATE TABLE SchemaB.Table1 (Col1 VARCHAR2(100));
    In this case, we want to generate a sync script to synchronize SchemaA.Table1 on the database represented by the snapshot. When taking a snapshot, database dependencies are followed, but because you're not comparing it to anything at the time, the comparison dependencies algorithm described in my last post cannot be used. So, as you only take a snapshot of SchemaA on the target database, SchemaB.Table1 will not be in the snapshot. If this snapshot is then used to compare against the above source schema, SchemaB.Table1 will be included in the source, but the object will not be found in the target snapshot. This is the same problem that was solved with comparison dependencies, but here we cannot use the comparison dependencies algorithm, as the snapshot has no information on SchemaB! We've now hit quite a big problem: we're trying to include SchemaB.Table1 in the target, but we simply do not know the status of this object on the database the snapshot was taken from; whether it exists in the database at all, whether it's the same as the target, whether it's different... What can we do about this sorry state of affairs? Well, not a lot, it would seem. We can't query the original database, as it may not be accessible, and we cannot assume any default state as it could be wrong and break the script (and we currently do not have a roll-back mechanism for failed synchronizations). The only way to fix this properly is for the user to go right back to the start and re-create the snapshot, explicitly including the schemas of these 'ghost' objects. So, the only thing we can do is flag up dependent ghost objects in the UI and ask the user what we should do with them: assume they don't exist, assume they're the same as the target, or specify a definition for them. Unfortunately, such functionality didn't make the cut for v1 of Schema Compare (as this is very much an edge case for a non-critical piece of functionality), so we simply flag the ghost objects up in the sync wizard as unsyncable, and let the user sort out what's going on and edit the sync script as appropriate.
    There are some things we do to alleviate this rather unhappy situation somewhat: if a user creates a snapshot from the source or target of a database comparison, we include all the objects registered from the database, not just the ones in the schemas originally selected for comparison. This includes any extra dependent objects registered through the comparison dependencies algorithm. If the user then compares the resulting snapshot against the same database they were comparing against when it was created, the extra dependencies will be included in the snapshot as required and everything will be good. Fortunately, this problem will come up quite rarely, and only when the user uses snapshots and tries to sync objects with unknown cross-schema dependencies. However, the solution is not an easy one, and it led to some difficult architecture and design decisions within the product. And all this pain follows from the simple decision to allow schema pre-filtering! Next: why adding a column to a table isn't as easy as you would think...

    Read the article

  • How to design a grid containing TextView in Android

    - by Cris
    Hello, I need to build an app for Android using this background image. Over every cell I have to draw a TextView, but I don't know how to do it with the different screens; I have the background image in three resolutions (240x320, 320x480, 480x800) but I don't know what kind of layout to use. I would use GridView, but I don't know whether I can work with different column sizes. Can anyone help me? Thanks in advance.
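
    One possible approach (a sketch only, not from the original question; the column count, layout file, and view id are assumptions) is to keep a single GridView and compute the column width from the actual screen width at run time, so the same code adapts to the 240x320, 320x480, and 480x800 backgrounds:

        import android.app.Activity;
        import android.os.Bundle;
        import android.util.DisplayMetrics;
        import android.widget.GridView;

        public class BoardActivity extends Activity {
            private static final int COLUMNS = 4; // assumed number of cells across the background

            @Override
            protected void onCreate(Bundle savedInstanceState) {
                super.onCreate(savedInstanceState);
                setContentView(R.layout.board);                           // hypothetical layout containing a GridView
                GridView grid = (GridView) findViewById(R.id.board_grid); // hypothetical id

                DisplayMetrics dm = getResources().getDisplayMetrics();
                int cellWidth = dm.widthPixels / COLUMNS;                 // scale cells to the current screen
                grid.setNumColumns(COLUMNS);
                grid.setColumnWidth(cellWidth);
                grid.setStretchMode(GridView.NO_STRETCH);
                // The adapter then supplies one TextView per cell (omitted here).
            }
        }

    The adapter's getView() would return a TextView sized to cellWidth, so the text lines up with the background cells regardless of resolution.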

    Read the article

  • WebCenter Customer Spotlight: College of American Pathologists

    - by me
    Author: Peter Reiser - Social Business Evangelist, Oracle WebCenter
    Solution Summary: College of American Pathologists goes live with Oracle WebCenter - Imaging, AP Invoice Automation, and EBS Managed Attachment with support for imaging content. The College of American Pathologists (CAP) is the leading organization of board-certified pathologists, serving more than 18,000 physician members; more than 7,000 laboratories are accredited by the CAP, and approximately 22,000 laboratories are enrolled in the College's proficiency testing programs. The business objective was to content-enable their Oracle E-Business Suite (EBS) enterprise application by combining the best of the Imaging and Managed Attachment functionality, providing a unique opportunity for the business to have unprecedented access to both structured and unstructured content from within their enterprise application. The solution improves customer service turnaround time, strengthens compliance, and improves maintenance and management of the technology infrastructure.
    Company Overview: The College of American Pathologists (CAP), celebrating 50 years as the gold standard in laboratory accreditation, is a medical society serving more than 17,000 physician members and the global laboratory community. It is the world's largest association composed exclusively of board-certified pathologists and is the worldwide leader in laboratory quality assurance. The College advocates accountable, high-quality, and cost-effective patient care. The more than 17,000 pathologist members of the College of American Pathologists represent board-certified pathologists and pathologists in training worldwide. More than 7,000 laboratories are accredited by the CAP, and approximately 23,000 laboratories are enrolled in the College's proficiency testing programs.
    Business Challenges: The CAP business objective was to content-enable their Oracle E-Business Suite (EBS) enterprise application by combining the best of the Imaging and Managed Attachment functionality, providing a unique opportunity for the business to have unprecedented access to both structured and unstructured content from within their enterprise application. The goals were to bring more flexibility to systems and programs in order to adapt quickly, get a 360-degree view of the customer, and reduce the cost of running the business.
    Solution Deployed: With the help of Oracle Consulting, the customer implemented Oracle WebCenter Content as the centralized E-Business Suite document repository. The solution makes it possible to capture, present, and manage all unstructured content (PDFs, word processing documents, scanned images, etc.) related to Oracle E-Business Suite transactions and exposes the related content through the familiar EBS user interface.
    Business Results: The CAP achieved the following benefits from the implemented solution. Managed Attachment solution: alignment with the strategic Oracle Fusion Middleware platform; integration with CAP's existing data capture capabilities; a single user interface, provided by the Managed Attachment solution, for all content; better compliance and improved collaboration. Accounts Payable invoice processing imaging solution: automated invoice management, eliminating dependency on paper materials and improving compliance, collaboration, and accuracy; a single repository to house and secure scanned invoices and all supplemental documents; greater management visibility of the invoice entry process.
    Additional Information: CAP OpenWorld Presentation, Oracle WebCenter Content, Oracle WebCenter Capture, Oracle WebCenter Imaging, Oracle Consulting

    Read the article

  • WinForms Application Hangs

    - by Ram
    Hi, I have an application (ABC) that I developed as a Windows application (.exe). It is by itself quite a big application referencing a lot of DLLs. However, there is now a requirement that this application (ABC) be part of an even larger application (XYZ). Hence, I had to change the project type of "ABC" from a Windows application to a class library, changing a few lines of code in the process. My problem is that, ever since I started using ABC as part of XYZ, the application started hanging if I didn't perform any operation on it for 10 to 15 minutes... I do not have any problems while running it as a separate application. Any reasons why this might occur? Any suggestions would be really appreciated... Thanks, Ram

    Read the article

  • Cloud Computing - Multiple Physical Computers, One Logical Computer

    - by Koobz
    I know that you can set up multiple virtual machines per physical computer. I'm wondering if it's possible to make multiple physical computers behave as one logical unit. Fundamentally, the way I imagine it working is that you can throw 10 computers into a facility one day. You've got one client that requires the equivalent of two computers' worth, and 100 others that eat up the remaining 8. As demands change you're just reallocating logical resources; maybe the two-computer client now requires a third physical system. You just add it to the cloud, and don't worry about sharding the database or migrating data over to a new server. Can it work this way? If yes, why would anyone ever do things like partition their database servers anymore? Just add more computing resources. You scale horizontally with the hardware, but your server appears to scale vertically. There's no need to modify your application's infrastructure to support multiple databases, etc.

    Read the article

  • Save the date! Manageability Partner Community Forum at Oracle OpenWorld - Oct. 1st

    - by Javier Puerta
    The Exadata & Manageability Partner Communities will be holding a Community Forum in San Francisco during Oracle OpenWorld. The session will take place on Monday, October 1st, from 4:00 to 6:00 pm local time. If you would like to present an experience from a customer project or a sales best practice in the Manageability or Quality & Testing areas, please contact [email protected] with a short description of your proposal.

    Read the article

  • Visual Studio 2010, Entity Framework, and Oracle

    - by Tobias Gunn
    While I was working on a Silverlight 4 demo I found out that Entity Framework is not supported directly through the .NET provider or ODP tools. In order to make them work you need to either write a wrapper of your own (I wouldn't chance it) or else use a provider like DataDirect or Quest's upcoming tool. So far, I've been very happy with the DataDirect tool (found here: http://www.datadirect.com/products/net/index.ssp). As I get a little farther along I'll post more on SL4, RIA, and EF.

    Read the article

  • ASP.NET GridView hyperlink: pass values to a new window

    - by srihari
    Hi guys, here is my question; please help me. I have a GridView with hyperlink fields. My requirement is: if I click the hyperlink of a particular row, I need to display that row's values on another page; on that page the user will edit the record values, and when he clicks the Update button I need to update the record values and go back to the previous home page. When I click the hyperlink in the first window, a new window should open without any URL bar, toolbar, or minimize/close buttons, like a modal popup (Ajax ModalPopupExtender), but I am unable to get that working. Please help me. The following code is default.aspx:

    <head runat="server">
        <title>PassGridviewRow values</title>
        <style type="text/css">
            #gvrecords tr.rowHover:hover { background-color: Yellow; font-family: Arial; }
        </style>
    </head>
    <body>
        <form id="form1" runat="server">
        <div>
            <asp:GridView runat="server" ID="gvrecords" AutoGenerateColumns="false"
                HeaderStyle-BackColor="#7779AF" HeaderStyle-ForeColor="White"
                DataKeyNames="UserId" RowStyle-CssClass="rowHover">
                <Columns>
                    <asp:TemplateField HeaderText="Change Password">
                        <ItemTemplate>
                            <a href='<%# "UpdateGridviewvalues.aspx?UserId=" + DataBinder.Eval(Container.DataItem,"UserId") %>'>
                                <%# Eval("UserName") %>
                            </a>
                        </ItemTemplate>
                    </asp:TemplateField>
                    <asp:BoundField DataField="FirstName" HeaderText="FirstName" />
                    <asp:BoundField DataField="LastName" HeaderText="LastName" />
                    <asp:BoundField DataField="Email" HeaderText="Email" />
                </Columns>
            </asp:GridView>
        </div>
        </form>
    </body>
    </html>

    The following is the default.aspx.cs code:

    using System;
    using System.Collections.Generic;
    using System.Data;
    using System.Data.SqlClient;
    using System.Linq;
    using System.Web;
    using System.Web.UI;
    using System.Web.UI.WebControls;

    public partial class _Default : System.Web.UI.Page
    {
        protected void Page_Load(object sender, EventArgs e)
        {
            if (!IsPostBack)
            {
                BindGridview();
            }
        }

        protected void BindGridview()
        {
            SqlConnection con = new SqlConnection("Data Source=.;Integrated Security=SSPI;Initial Catalog=testdb1");
            con.Open();
            SqlCommand cmd = new SqlCommand("select * from UserDetails", con);
            SqlDataAdapter da = new SqlDataAdapter(cmd);
            cmd.ExecuteNonQuery();
            con.Close();
            DataSet ds = new DataSet();
            da.Fill(ds);
            gvrecords.DataSource = ds;
            gvrecords.DataBind();
        }
    }

    This is UpdateGridviewvalues.aspx:

    <head runat="server">
        <title>Update Gridview Row Values</title>
        <script type="text/javascript">
            function Showalert(username) {
                alert(username + ' details updated successfully.');
                if (alert) {
                    window.location = 'Default.aspx';
                }
            }
        </script>
    </head>
    <body>
        <form id="form1" runat="server">
        <div>
            <table>
                <tr>
                    <td colspan="2" align="center"><b>Edit User Details</b></td>
                </tr>
                <tr>
                    <td>User Name:</td>
                    <td><asp:Label ID="lblUsername" runat="server" /></td>
                </tr>
                <tr>
                    <td>First Name:</td>
                    <td><asp:TextBox ID="txtfname" runat="server"></asp:TextBox></td>
                </tr>
                <tr>
                    <td>Last Name:</td>
                    <td><asp:TextBox ID="txtlname" runat="server"></asp:TextBox></td>
                </tr>
                <tr>
                    <td>Email:</td>
                    <td><asp:TextBox ID="txtemail" runat="server"></asp:TextBox></td>
                </tr>
                <tr>
                    <td></td>
                    <td>
                        <asp:Button ID="btnUpdate" runat="server" Text="Update" onclick="btnUpdate_Click" />
                        <asp:Button ID="btnCancel" runat="server" Text="Cancel" onclick="btnCancel_Click" />
                    </td>
                </tr>
            </table>
        </div>
        </form>
    </body>
    </html>

    And my UpdateGridviewvalues.aspx.cs code is as follows:

    using System;
    using System.Collections.Generic;
    using System.Data;
    using System.Data.SqlClient;
    using System.Linq;
    using System.Web;
    using System.Web.UI;
    using System.Web.UI.WebControls;
    public partial class UpdateGridviewvalues : System.Web.UI.Page
    {
        SqlConnection con = new SqlConnection("Data Source=.;Integrated Security=SSPI;Initial Catalog=testdb1");
        private int userid = 0;

        protected void Page_Load(object sender, EventArgs e)
        {
            userid = Convert.ToInt32(Request.QueryString["UserId"].ToString());
            if (!IsPostBack)
            {
                BindControlvalues();
            }
        }

        private void BindControlvalues()
        {
            con.Open();
            SqlCommand cmd = new SqlCommand("select * from UserDetails where UserId=" + userid, con);
            SqlDataAdapter da = new SqlDataAdapter(cmd);
            cmd.ExecuteNonQuery();
            con.Close();
            DataSet ds = new DataSet();
            da.Fill(ds);
            lblUsername.Text = ds.Tables[0].Rows[0][1].ToString();
            txtfname.Text = ds.Tables[0].Rows[0][2].ToString();
            txtlname.Text = ds.Tables[0].Rows[0][3].ToString();
            txtemail.Text = ds.Tables[0].Rows[0][4].ToString();
        }

        protected void btnUpdate_Click(object sender, EventArgs e)
        {
            con.Open();
            SqlCommand cmd = new SqlCommand("update UserDetails set FirstName='" + txtfname.Text + "',LastName='" + txtlname.Text + "',Email='" + txtemail.Text + "' where UserId=" + userid, con);
            SqlDataAdapter da = new SqlDataAdapter(cmd);
            int result = cmd.ExecuteNonQuery();
            con.Close();
            if (result == 1)
            {
                ScriptManager.RegisterStartupScript(this, this.GetType(), "ShowSuccess", "javascript:Showalert('" + lblUsername.Text + "')", true);
            }
        }

        protected void btnCancel_Click(object sender, EventArgs e)
        {
            Response.Redirect("~/Default.aspx");
        }
    }

    My requirement is that the new window should come up like a popup: a new window without a URL bar or close button. My database table is:

        ColumnName   DataType
        -------------------------------------------
        UserId       Int (set identity property = true)
        UserName     varchar(50)
        FirstName    varchar(50)
        LastName     varchar(50)
        Email        Varchar(50)

    Read the article

  • What happens when Oracle's Enterprise Single-Sign-On database goes down? [migrated]

    - by Unai
    We're working on setting up Oracle's Enterprise Single Sign-On with High Availability. At the moment every component provides HA except our database backend (i.e. we have just one instance). While conducting some kick-the-plug tests we learnt that the ESSO system works even with the database turned OFF. This was a nice surprise, but now we need to understand the implications of a database failure; sure, the sessions might be handled on the application servers and the policies might have been cached, but... for how long? How big is this cache? What is the role of the database? I would appreciate it if anyone shared her/his experience and/or pointed to documentation that covers this. Thank you so much.

    Read the article

  • Java application server: download SpringSource tc Server 2.0, an optimized Apache Tomcat...

    SpringSource announces the release of tc Server 2.0, a Tomcat distribution packaged and supported by SpringSource. The product comes in three editions. Standard Edition: a preconfigured, optimized Tomcat enhanced with administration and diagnostic features. Spring Edition: a Tomcat specifically designed for deploying Spring applications; by using an instrumented version of the Spring JARs, it provides very detailed monitoring of how the applications behave. Developer Edition: free of charge, a Tomcat edition dedicated to development, which for Spring applications (and Spring MVC in particular) provides information about how the application operates.

    Read the article

  • OpenWorld Day 1

    - by Antony Reynolds
    A Day in the Life of an OpenWorld Attendee, Part I. Lots of people are blogging insightfully about OpenWorld, so I thought I would provide some non-insightful remarks to buck the trend! With 50,000 attendees I didn't expect to bump into too many people I knew; boy, was I wrong! I walked into the registration area and was immediately hailed by a couple of customers I had worked with a few months ago. Moving to the employee registration area in a different hall, I bumped into a colleague from the UK who was also registering. As soon as I got my badge I bumped into a friend from Ireland! So maybe OpenWorld isn't so big after all! First port of call was Larry's keynote. As always, Larry was provocative and thought-provoking. His key points were announcing the Oracle Cloud offering in IaaS, PaaS and SaaS, pointing out that Fusion Apps are cloud-enabled, and finally announcing the 12c Database, making a big play of its new multi-tenancy features. His contention was that multi-tenancy will simplify cloud development and provide better security by providing DB-level isolation for applications and customers. The next day, Monday, was my first full day at OpenWorld. The first session I attended was on monitoring of OSB, a very interesting presentation on the benefits achieved by an Illinois-area telco, US Cellular. Great discussion of why they bought the SOA Management Packs and the benefits they are already seeing from their investment in terms of improved provisioning and time to market, as well as better performance insight and assistance with capacity planning. Craig Blitz provided a nice walkthrough of where Coherence has been and where it is going. Last night I attended the BOF on Managed File Transfer, where Dave Berry replayed Oracle's thoughts on providing dedicated Managed File Transfer as part of the 12c SOA release. Dave laid out the perceived requirements and solicited feedback from the audience on what, if anything, was missing. He also demoed an early version of the functionality that would simplify setting up MFT in SOA Suite and make tracking activity much easier. So much for Day 1. I also ran into scores of old friends and colleagues and had a pleasant dinner with my friend from Ireland, where I caught up on the latest news from Oracle UK. Not bad for Day 1!

    Read the article

  • Unable to start the web application from localhost:3000 (Ruby on Rails)

    - by vipin8169
    I had created a demo web application by executing rails tickets and then executed rails script/server to run it on localhost. Initially I was able to open the application in the browser by typing localhost:3000 in the address bar, but then I deleted the tickets folder from my hard disk. I then created the same folder again, but when I try to run it using the same command, rails script/server, it says: vverma@l-vverma:~/railsExp/tickets$ rails script/server create File exists - script/server. I tried deleting the script/server file, but I still couldn't open localhost:3000 in the browser. I tried rails s as well, and it gave the following output: http://paste.ubuntu.com/1317610/ But I was still unable to run the app on localhost:3000. What is the solution to this?

    Read the article

  • Closing the Gap: 2012 IOUG Enterprise Data Security Survey

    - by Troy Kitch
    The new survey from the Independent Oracle Users Group (IOUG), titled "Closing the Security Gap: 2012 IOUG Enterprise Data Security Survey," uncovers some interesting trends in IT security among IOUG members and offers recommendations for securing data stored in enterprise databases. "Despite growing threats and enterprise data security risks, organizations that implement appropriate detective, preventive, and administrative safeguards are seeing significant results," finds the report's author, Joseph McKendrick, analyst, Unisphere Research. Produced by Unisphere Research and underwritten by Oracle, the report is based on responses from 350 IOUG members representing a variety of job roles, organization sizes, and industry verticals. Key findings include:
    Corporate budgets increase, but trailing. Though corporate data security budgets are increasing this year, they still have room to grow to reach the previous year's spending. Additionally, more than half of respondents say their organizations still do not have, or are unaware of, data security plans to help address contingencies as they arise.
    Danger of unauthorized access. Less than a third of respondents encrypt data that is either stored or in motion, and at the same time, more than three-fifths say they send actual copies of enterprise production data to other sites inside and outside the enterprise.
    Privileged user misuse. Only about a third of respondents say they are able to prevent privileged users from abusing data, and most do not have, or are not aware of, ways to prevent access to sensitive data using spreadsheets or other ad hoc tools.
    Lack of consistent auditing. A majority of respondents actively collect native database audits, but there has not been an appreciable increase in the implementation of automated tools for comprehensive auditing and reporting across databases in the enterprise.
    IOUG Recommendations: The report's author finds that securing data requires not just the ability to monitor and detect suspicious activity, but also to prevent the activity in the first place. To achieve this comprehensive approach, the report recommends the following.
    Apply an enterprise-wide security strategy. Database security requires multiple layers of defense that include a combination of preventive, detective, and administrative data security controls.
    Get business buy-in and support. Data security only works if it is backed through executive support. The business needs to help determine what protection levels should be attached to data stored in enterprise databases.
    Provide training and education. Often, business users are not familiar with the risks associated with data security. Beyond IT solutions, what is needed is a well-engaged and knowledgeable organization to help make security a reality.
    Read the IOUG Data Security Survey Now.

    Read the article

  • Design review for application facing memory issues

    - by Mr Moose
    I apologise in advance for the length of this post, but I want to paint an accurate picture of the problems my app is facing and then pose some questions below; I am trying to address some self-inflicted design pain that is now leading to my application crashing due to out-of-memory errors. An abridged description of the problem domain is as follows:
    The application takes in a "dataset" that consists of numerous text files containing related data. An individual text file within the dataset usually contains approx 20 "headers" that contain metadata about the data it contains. It also contains a large tab-delimited section containing data that is related to data in one of the other text files contained within the dataset. The number of columns per file is very variable, from 2 to 256+ columns.
    The original application was written to allow users to load a dataset and map certain columns of each of the files, basically indicating key information on the files to show how they are related, as well as identify a few expected column names. Once this is done, a validation process takes place to enforce various rules and ensure that all the relationships between the files are valid. Once that is done, the data is imported into a SQL Server database. The database design is an EAV (Entity-Attribute-Value) model used to cater for the variable columns per file. I know EAV has its detractors, but in this case, I feel it was a reasonable choice given the disparate data and variable number of columns submitted in each dataset.
    The memory problem: Given that the combined size of all text files was at most about 5 megs, and in an effort to reduce the database transaction time, it was decided to read ALL the data from files into memory and then perform the following: perform all the validation whilst the data was in memory; relate it using an object model; start a DB transaction and write the key columns row by row, noting the Id of the written row (all tables in the database utilise identity columns), then apply the Id of the newly written row to all related data; once all related data had been updated with the key information to which it relates, write these records using SqlBulkCopy. Due to our EAV model, we essentially have x columns by y rows to write, where x can be 256+ and rows often run into the tens of thousands. Once all the data is written without error (this can take several minutes for large datasets), commit the transaction.
    The problem now comes from the fact that we are receiving individual files containing over 30 megs of data. In a dataset, we can receive any number of files. We've started seeing datasets of around 100 megs coming in, and I expect it is only going to get bigger from here on in. With files of this size, data can't even be read into memory without the app falling over, let alone be validated and imported. I anticipate having to modify large chunks of the code to allow validation to occur by parsing files line by line, and I am not exactly decided on how to handle the import and transactions.
    Potential improvements: I've wondered about using GUIDs to relate the data rather than relying on identity fields. This would allow data to be related prior to writing to the database. It would certainly increase the storage required, though, especially in an EAV design. Do you think this is a reasonable thing to try, or should I simply persist with identity fields (natural keys can't be trusted to be unique across all submitters)?
    Use of staging tables to get data into the database, and only performing the transaction to copy data from the staging area to the actual destination tables.
    Questions: For systems like this that import large quantities of data, how do you go about keeping transactions small? I've kept them as small as possible in the current design, but they are still active for several minutes and write hundreds of thousands of records in one transaction. Is there a better solution? The tab-delimited data section is read into a DataTable to be viewed in a grid. I don't need the full functionality of a DataTable, so I suspect it is overkill. Is there any way to turn off various features of DataTables to make them more lightweight? Are there any other obvious things you would do in this situation to minimise the memory footprint of the application described above? Thanks for your kind attention.
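
    As an illustration of the line-by-line validation mentioned above (a sketch only: the application in the post is .NET-based, so this Java fragment just shows the general streaming pattern, and the batch size and the single validation rule are made up):

        import java.io.BufferedReader;
        import java.io.IOException;
        import java.nio.file.Files;
        import java.nio.file.Path;
        import java.util.ArrayList;
        import java.util.List;

        public class StreamingValidator {
            private static final int BATCH_SIZE = 10_000; // assumed batch size

            public static void validateAndLoad(Path file) throws IOException {
                List<String[]> batch = new ArrayList<>();
                try (BufferedReader reader = Files.newBufferedReader(file)) {
                    String line;
                    while ((line = reader.readLine()) != null) {
                        String[] fields = line.split("\t", -1);  // tab-delimited data section
                        if (fields.length < 2) {                 // made-up validation rule
                            throw new IOException("Row has fewer than 2 columns: " + line);
                        }
                        batch.add(fields);
                        if (batch.size() == BATCH_SIZE) {
                            writeBatch(batch);                   // e.g. bulk copy into a staging table
                            batch.clear();                       // keep only one batch in memory
                        }
                    }
                }
                if (!batch.isEmpty()) {
                    writeBatch(batch);
                }
            }

            private static void writeBatch(List<String[]> rows) {
                // Placeholder: the real application would bulk-insert into a staging table here.
            }
        }

    Only one batch of rows is ever held in memory, so the footprint stays flat regardless of file size; the same pattern combines naturally with the staging-table idea above.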

    Read the article

  • Even More Storage Options in VDI 3.4.1

    - by mprove
    Oracle Virtual Desktop Infrastructure 3.4.1 has been released to complete the storage matrix below.
        Storage Type                 VirtualBox on Solaris    VirtualBox on Enterprise Linux
        Sun ZFS                      yes                      yes
        Sun ZFS (pool on Solaris)    yes                      yes
        iSCSI                        -                        new in VDI 3.4
        Network File System          new in VDI 3.4.1         new in VDI 3.4
        Local Storage                new in VDI 3.4.1         new in VDI 3.4

    Read the article

  • Solving Inbound Refinery PDF Conversion Issues, Part 1

    - by Kevin Smith
    Working with Inbound Refinery (IBR) and PDF Conversion can be very frustrating. When everything is working smoothly you kind of forget it is even there. Documents are checked into WebCenter Content (WCC), sent to IBR for conversion, converted to PDF, returned to WCC, and voilà, your Office documents have a nice PDF rendition available for viewing. Then a user checks in a bunch of password-protected Word files, the conversions fail, your IBR queue starts backing up, users start calling asking why their documents have not been released yet, and you spend a frustrating afternoon trying to recover and get things running properly again. Password-protected documents are one cause of PDF conversion failures, and I will cover those in a future blog post, but there are many other problems that can cause conversions to fail, especially when working with the WinNativeConverter and using the native applications, e.g. Word, to convert a document to PDF. There are other conversion options like PDFExportConverter, which uses Oracle OutsideIn to convert documents directly to PDF without the need for the native applications. However, to get the best fidelity to the original document the native applications must be used. Many customers have tried PDFExportConverter, but have stayed with the native applications for conversion since the conversion results from PDFExportConverter were not as good as when the native applications are used. One problem I ran into recently, which at least has an easy solution, is Word documents that display a Show Repairs dialog when the document is opened. If you open the problem document yourself you will see this dialog. This will cause the conversion to time out; any time the native application displays a dialog that requires user input, the conversion will time out. The solution is to add a BulletProofOnCorruption setting to the registry for the user running Word on the IBR server. See this support note from Microsoft for details. The support note says to set the registry key under HKEY_CURRENT_USER, but since we are running IBR as a service the correct location is under HKEY_USERS\.DEFAULT. Also, since in our environment we were using Office 2007, the correct registry key to use was: HKEY_USERS\.DEFAULT\Software\Microsoft\Office\11.0\Word\Options. Once you have done this, restart the IBR managed server and resubmit your problem document. It should now be converted successfully. For more details on IBR see the Oracle® WebCenter Content Administrator's Guide for Conversion.

    Read the article

  • How to change the cursor type

    - by Jessy
    This question is related to the previous post: http://stackoverflow.com/questions/2532936/how-to-save-file-and-read How can I change the cursor to "Hand" only when the mouse points at a grid cell that is not null (i.e., contains an image)? So far the cursor turns to "Hand" over all the grid cells (null or not null).

        public GUI() {
            ....
            JPanel pDraw = new JPanel();
            ....
            for (Component component : pDraw.getComponents()) {
                JLabel lbl = (JLabel) component;
                // add mouse listener to grid box which contains an image
                if (lbl.getIcon() != null)
                    lbl.addMouseListener(this);
            }
        }

        public void mouseEntered(MouseEvent e) {
            Cursor cursor = Cursor.getDefaultCursor();
            // change cursor appearance to HAND_CURSOR when the mouse points at an image
            cursor = Cursor.getPredefinedCursor(Cursor.HAND_CURSOR);
            setCursor(cursor);
        }
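
    One way to get the hand cursor only over the populated cells (a sketch, not from the original post) is to skip the mouse listener entirely and set the cursor on each JLabel itself; Swing then switches the cursor automatically while the pointer is over that component:

        import java.awt.Component;
        import java.awt.Cursor;
        import javax.swing.JLabel;
        import javax.swing.JPanel;

        public class CursorSetup {
            // Give only the labels that hold an image their own hand cursor;
            // empty cells keep the default cursor inherited from the panel.
            static void applyHandCursorToFilledCells(JPanel pDraw) {
                for (Component component : pDraw.getComponents()) {
                    JLabel lbl = (JLabel) component;
                    if (lbl.getIcon() != null) {
                        lbl.setCursor(Cursor.getPredefinedCursor(Cursor.HAND_CURSOR));
                    }
                }
            }
        }

    Because the cursor is a per-component property, no mouseEntered/mouseExited handling is needed and the surrounding frame keeps its normal cursor.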

    Read the article

  • ExtJS Data Grid Column renderer to have multiple values

    - by Podlsk
    Hi, I am wondering if it is possible in ExtJS to have several values of the data source available to the renderer of a column. For example, with the "Actions" column, the id is passed to the renderer; however, I require both the user_id and the id to be passed to the renderer. How may I do this?

        table_cols = [{
            header: "User ID", width: 30, sortable: true, dataIndex: 'user_id'
        }, {
            header: "Actions", width: 60, sortable: false, dataIndex: 'id',
            renderer: function(val) {
                // IF USER ID MEETS A CONSTRAINT PRINT THE ID
            }
        }];

    Thanks.

    Read the article

  • Great Expectations - Fusion HCM Highlights at OOW

    - by Kathryn Perry
    A guest post by Lisa Conley, Principal Product Strategy Manager, Fusion HCM, Oracle Applications Development. Oracle OpenWorld is just around the corner! There's always so much to see and do and learn at the conference, so I want to share some of the 'don't miss' Fusion HCM highlights with you. (Use this tool to search by session number to get a full description.) For starters, we have several customers who will be sharing their Fusion HCM implementation stories. We'll kick off these presentations with a customer panel at 12:15 on Monday in Moscone West 2005 (CON9420). You'll hear from Zillow, the Gerson Lehrman Group, UBS, and ConAgra about their experiences with our products. Oracle partners MarketSphere (CON8581) and eVerge (CON3800) have implemented Fusion HCM themselves and will talk about how they'll use their experiences to help customers with their implementations (both are in Moscone West 2006). Beth Correa, CEO of Official Payroll Advisor, will highlight her favorite things about Oracle Fusion HCM Payroll on Tuesday at 11:45 in Moscone West 2006 (CON6691). And you'll get to hear from customers again when they speak with Steve Miranda in his Oracle Applications: Strategic Directions and Recommendations session on Tuesday at 1:15 in Moscone West 2002/2004 (CON11434). To bring it all together for you, we've listed all your Fusion HCM opportunities to learn and interact in this Focus On Document. I am really looking forward to the sessions on Human Capital Management in the Cloud. The Oracle Cloud combines the multiple product offerings into a single environment that leverages a common technology infrastructure, enabling users to focus on their business - not the business of managing environments. On Tuesday at 10:15 in Moscone West 2002/2004, there is a General Session entitled the Future of Oracle HCM -- Strategy and Roadmap (GEN9505). This will touch on all product lines. Fusion HCM will be highlighted in Gretchen Alarcon's Oracle HCM: Overview, Strategy, Customer Experiences, and Roadmap session on Monday at 12:15 in Moscone West 2005 (CON9410). Also on Tuesday at 1:15 in Moscone West 2006 is a session focused on Talent Management and how you can try out these new products, co-existing with your current product set (CON9430). This is important in that you can test the waters before diving in. ConAgra will be sharing their experience in this session as well. And of course, if you want to have a personal demonstration, please come by the Oracle DEMOgrounds in West Exhibition Hall Level 1 or the Oracle Cloud Services Lounge at Moscone West Level 3, where our Oracle HCM Cloud Services experts will be ready to answer your questions. I hope you have a wonderful week in San Francisco.
    Lisa Conley
    Principal Product Strategy Manager, Fusion HCM
    Applications Development
    Oracle Corporation

    Read the article

  • What are the basics (best practices, etc.) of writing a good Flash application?

    - by donut
    I've been tasked with writing a Flash application for facilitating a Fantasy Football draft pick. Normally, my choice would be to write this application in JavaScript. Unfortunately, it will require the use of sockets, and since we don't have enough experience with Web Socket JS to be sure it would work in this situation, I need to learn Flash. Currently, I'm most proficient in JavaScript/HTML/CSS/PHP/MySQL and have never really used Flash in a meaningful way. As best I can, I want to start this project off "the right way". So, what are the good-to-know best practices and/or references/resources that will teach me the right way of doing things in Flash?

    Read the article

  • Networking Guidelines

    - by ACShorten
    One of the things I have noticed in my years in IT is how networking has changed. In the past, networking was pretty simple, with host names and name resolution (via DNS) being straightforward. Some sites still use this simple networking setup. These days, more complex name resolution, proxies, firewalls, demarcation, and virtualization can make networking more complex. This can cause issues when installing products with built-in networking that can frustrate even seasoned veterans. I have put together a few basic guidelines to hopefully help with product installation and getting a product to operate in a somewhat complex network setup. All the components of the product (including the infrastructure) need to communicate via a network (even if it is within a local machine/host). Ensure any host names referred to within configuration files are accessible via your networking setup. This may mean defining the hosts to the machines, to the DNS for name resolution, and even to your firewall to allow machines to communicate within your network. Make sure the ports used for any of the infrastructure are accessible (even through your firewall) and are unique within the host. Duplicate ports can cause the product to fail on startup as the port is already in use. If there are still issues, consider using localhost as your host name. I have used this in so many situations that I tend to use it now as a default anytime I install anything myself. Most Oracle products suggest using localhost when using dynamic host or dynamic IP addresses, and this is no different for the Oracle Utilities Application Framework. If you do use localhost, then installing a Loopback Adapter for the operating system is recommended to force networking to a minimum. Usually localhost resolves to 127.0.0.1. When using multiple network connections, especially in a virtualized environment, ensure the hosts and ports used are relevant for the network cards you have set up. A common issue is finding that the product is using a virtualized network card that is not set up for correct networking. If you are using the batch component, do not forget to ensure that the multicast protocol is enabled on your host and that the multicast address and port number specified are valid and accessible from all machines in the batch cluster (if clustering is used). The same advice applies if you are using unicast, where each host/port combination should be accessible. Hopefully these basic networking recommendations will help minimize any networking issues you might encounter.
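
    As a quick way to check the first two guidelines (a sketch only; the host name and port below are placeholders, not values from any product), a few lines of Java can confirm that a configured host name resolves and that a port is reachable:

        import java.io.IOException;
        import java.net.InetAddress;
        import java.net.InetSocketAddress;
        import java.net.Socket;
        import java.net.UnknownHostException;

        public class NetCheck {
            public static void main(String[] args) {
                String host = "appserver.example.com"; // placeholder host from a config file
                int port = 7001;                        // placeholder port

                try {
                    // Name resolution check (DNS or hosts file)
                    InetAddress addr = InetAddress.getByName(host);
                    System.out.println(host + " resolves to " + addr.getHostAddress());

                    // Port reachability check (listener up, firewall open)
                    try (Socket socket = new Socket()) {
                        socket.connect(new InetSocketAddress(addr, port), 5000);
                        System.out.println(host + ":" + port + " is reachable");
                    }
                } catch (UnknownHostException e) {
                    System.out.println("Cannot resolve host name: " + host);
                } catch (IOException e) {
                    System.out.println("Cannot connect to " + host + ":" + port + " - " + e.getMessage());
                }
            }
        }

    Running the same check from every machine in a cluster is a cheap way to catch name-resolution and firewall problems before they surface as product startup failures.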

    Read the article

  • Tap into MySQL's Amazing Performance Results with the Performance Tuning Course

    - by Antoinette O'Sullivan
    Want to leverage the high-speed load utilities, distinctive memory caches, full-text indexes, and other performance-enhancing mechanisms that MySQL offers to fuel today's critical business systems? The authentic MySQL Performance Tuning course, delivered in 4 days, teaches you to evaluate the MySQL architecture, learn to use the tools, configure the database for performance, tune application and SQL code, tune the server, examine the storage engines, assess the application architecture, and learn general tuning concepts. You can take this course in one of the following three ways: Training-on-Demand: access streaming video, instructor delivery of this course from your own desk, at your own pace. Book time for hands-on practice when it suits you. Live-Virtual Class: take this instructor-led class live from your own desk. With 700 events on the schedule you are sure to find a time and date to suit you! In-Class: travel to a classroom to take this class. A sample of events on the schedule is as follows:
        Location                   Date                Delivery Language
        Hamburg, Germany           22 October 2012     German
        Prague, Czech Republic     1 October 2012      Czech
        Warsaw, Poland             3 December 2012     Polish
        London, England            19 November 2012    English
        Rome, Italy                23 October 2012     Italian
        Lisbon, Portugal           6 November 2012     European Portuguese
        Aix en Provence, France    4 September 2012    French
        Strasbourg, France         16 October 2012     French
        Nieuwegein, Netherlands    26 November 2012    Dutch
        Madrid, Spain              17 December 2012    Spanish
        Mechelen, Belgium          1 October 2012      English
        Riga, Latvia               10 December 2012    Latvian
        Petaling Jaya, Malaysia    10 September 2012   English
        Edmonton, Canada           10 December 2012    English
        Vancouver, Canada          10 December 2012    English
        Ottawa, Canada             26 November 2012    English
        Toronto, Canada            26 November 2012    English
        Montreal, Canada           26 November 2012    English
        Mexico City, Mexico        10 September 2012   Spanish
        Sao Paolo, Brazil          26 November 2012    Brazilian Portuguese
        Tokyo, Japan               19 November 2012    Japanese
        Tokyo, Japan               19 November 2012    Japanese
    For further information on this class, or to register your interest in additional events, go to the Oracle University Portal: http://oracle.com/education/mysql
    Read the article
