Search Results

Search found 67192 results on 2688 pages for 'excel external data'.


  • Queued Loadtest to remove Concurrency issues using Shared Data Service in OpenScript

    - by stefan.thieme(at)oracle.com
    Queued Processing to remove Concurrency issues in Loadtest Scripts

    Some scripts act on information returned by the server, e.g. act on the first item in the returned list of pending tasks/actions. This may lead to concurrency issues if the virtual users simulated in a load test scenario are not synchronized in some way. As load test cases should be carried out in a comparable and straightforward manner, simply cancelling a transaction whenever a collision occurs is clearly not an option. With a growing number of virtual users, that approach would produce a high number of requests for the early steps of your transaction (e.g. login, retrieve list of action points, assign an action point to the virtual user), but the later steps would be visited rarely or not at all, depending on the application logic.

    A way to tackle this problem is to enqueue the virtual users in a Shared Data Service queue. Only the first virtual user in this queue is allowed to carry out the critical steps (retrieve list of action points, assign an action point to the virtual user) of the transaction at any one time. Once a virtual user has passed the critical path, it dequeues itself from the head of the queue and continues with its actions. In theory this allows virtual users to run all steps of the transaction that are not part of the critical path in parallel. In practice this is rarely the case; moreover, it does not allow adding more than N users to a transaction without causing delays from virtual users waiting in the queue, N being the total transaction time divided by the sum of the times of all critical steps in the transaction.

    This limit can be circumvented by allowing multiple queues to act on individual segments of the list of actions, e.g. a per-country filter, an "ends with 0..9" filter, etc. That would require additional handling of these queues, and of slots for the virtual users at the head of each queue, in order to maintain mutually exclusive access to the first element of the list returned by the server at any one time during the load test. Such improved handling of multiple queues and/or multiple slots is beyond the scope of this paper.

    Shared Data Services Pre-Requisites

    Start WebLogic Server to host Shared Data Services

    Make sure that your WebLogic Server is installed and started. Shared Data Services may not work if you installed only the minimal installation package for OpenScript. If you installed the default package including OLT and OTM, you can follow the instructions below to start and verify the WebLogic installation.

    To start the WebLogic Server deployed underneath Oracle Load Testing and/or Oracle Test Manager, go to your Start menu, Oracle Application Testing Suite, and select the Restart Oracle Application Testing Suite Application Service entry from the Tools submenu. To verify that the service has started, run the Microsoft Management Console for Services by selecting Run from the Start menu and entering services.msc. Look for the entry that reads Oracle Application Testing Suite Application Service; once its status has changed from Starting to Started, you can proceed to verify the login. Please note that this may take several minutes, up to 10 minutes depending on your CPU horsepower.

    Verify WebLogic Server user credentials
    Next, open the Oracle WebLogic Server Administration Console at http://localhost:8088/console. It may take a while until the application is deployed and started; a placeholder page may be displayed until the Administration Console has been deployed on the fly. Afterwards you can log in using the username oats and the password that you selected at install time for your Application Testing Suite administrative purposes.

    This brings up the Home page of your WebLogic Server; at this point you have already verified that you can log in with these credentials. If you want to check the details, navigate to Security Realms, myrealm, Users and Groups tab. Here you could add users to your WebLogic Server to be used in the later steps. Details of the Groups required for such a custom user to work exceed this quick overview and should be chosen with the WebLogic Server Administration Guide in mind.

    Shared Data Services pre-requisites for load testing

    OpenScript Preferences have to be set to enable Encryption and to provide a default Shared Data Service Connection for Playback. These are the pre-requisites for load testing with Shared Data Services. Please note that the usage of Connection Parameters (an individual directive in the script) for Shared Data Services did not play back reliably in the current version 9.20.0370 of Oracle Load Testing (OLT), and encryption of credentials still seemed to be mandatory as well.

    General Encryption settings

    Select OpenScript Preferences from the View menu and navigate to the General, Encryption entry in the tree on the left. Select the Encrypt script data option from the list and enter the same password that you used for securing your WebLogic Server Administration Console.

    Enable global shared data access credentials

    Select OpenScript Preferences from the View menu and navigate to the Playback, Shared Data entry in the tree on the left. Enable the global shared data access credentials and enter the Address, User name and Password determined for the WebLogic Server hosting Shared Data Services. Please note that you may want to replace localhost in Address with the host's real name if you plan to run load tests with load test agents on remote systems.

    Queued Processing of Transactions

    Enable the Shared Data Services Module in Script Properties

    The Shared Data Services Module has to be enabled for each script that wants to employ the Shared Data Service queue functionality in OpenScript. It can be enabled by selecting Script Properties from the Script menu. On the Script Properties dialog, select the Modules section and check Shared Data to enable the Shared Data Service Module for your script.
    Checking the Shared Data Services option effectively adds a line to your script code that adds the sharedData ScriptService to your script class of IteratingVUserScript:

        @ScriptService oracle.oats.scripting.modules.sharedData.api.SharedDataService sharedData;

    Record your script

    Record your script as usual, then add the following pieces for queue handling: in the Initialize code block, before the first step and after the last step of your critical path, and in the Finalize code block. The Java code to be added at the individual locations is explained in the following sections in full detail.

    Create a Shared Data Queue in Initialize

    To create a Shared Data Queue, go to the Java view of your script and add the following statements to the initialize() code block:

        info("Create queueA with life time of 120 minutes");
        sharedData.createQueue("queueA", 120);

    This creates an instantiation of the Shared Data Queue object named queueA which is maintained for up to 120 minutes. If you want to use the code for multiple scripts, make sure to use a different queue name for each one here and in the subsequent steps. You may even consider using a dynamic queue name based on the filters of the result list being concurrently accessed.

    Prepare a unique id for each iteration

    In order to keep track of individual virtual users in our queue, we create a unique identifier from the virtual user id and the used username right after retrieving the next record from our databank file:

        getDatabank("Usernames").getNextDatabankRecord();
        getVariables().set("usernameValue1", "VU_{{@vuid}}_{{@iterationnum}}_{{db.Usernames.Username}}_{{@timestamp}}_{{@random(10000)}}");
        String usernameValue = getVariables().get("usernameValue1");
        info("Now running virtual user " + usernameValue);

    As you can see from the code block above, we set the OpenScript variable usernameValue1 to VU_{{@vuid}}_{{@iterationnum}}_{{db.Usernames.Username}}_{{@timestamp}}_{{@random(10000)}}, a concatenation of the virtual user id and the iteration number for general uniqueness, plus the username from our databank, the timestamp and a random number to make it further unique and to ease spotting of errors. Not all of these fields are actually required to make it really unique, but adding the queue name may also be considered, to help troubleshoot multiple queues. The value is then retrieved with the getVariables().get() method call and assigned to the usernameValue String used throughout the script.

    Please note that moving the getDatabank("Usernames").getNextDatabankRecord(); call to the initialize block was later considered, to remove concurrency between multiple virtual users running with the same user id and therefore accessing the same "My Inbox" in step 6. This effectively gives each virtual user its own user id from the databank file; make sure you have enough user ids to remove this second hurdle.

    Enqueue and attend the queue before the Critical Path

    To maintain the right order of virtual users being allowed into the critical path of the transaction, the following pseudo step has to be added in front of the first critical step.
    In the case of this example, that is right in front of the step where we retrieve the list of actions from which we select the first to be assigned to us.

        beginStep("[0] Waiting in the Queue", 0);
        {
            info("Enqueued virtual user " + usernameValue + " at the end of queueA");
            sharedData.offerLast("queueA", usernameValue);
            info("Wait until the user is the first in queueA");
            String queueValue1 = null;
            do {
                // we wait for at least 0.7 seconds before we check the head of the
                // queue. This is the time it takes one user to move through the
                // critical path, i.e. pass steps [5] Enter country and [6] Assign
                // to me
                Thread.sleep(700);
                queueValue1 = (String) sharedData.peekFirst("queueA");
                info("The first user in queueA is currently: '" + queueValue1 + "' " + queueValue1.getClass() + " length " + queueValue1.length());
                info("The current user is '" + usernameValue + "' " + usernameValue.getClass() + " length " + usernameValue.length() + ": indexOf " + usernameValue.indexOf(queueValue1) + " equals " + usernameValue.equals(queueValue1));
            } while (queueValue1.indexOf(usernameValue) < 0);
            info("Now the user is the first in queueA");
        }
        endStep();

    This enqueues the username at the tail of our queue. The script then waits for at least 700 milliseconds, the time it takes for one user to exit the critical path, and compares the head of our queue with its username. This last step is repeated while the two are not equal (indexOf less than zero). Once they are equal, indexOf yields a value of zero or larger and the virtual user performs the critical steps.

    Dequeue after the Critical Path

    After the virtual user has left the critical path and completed its last step, the following code block dequeues the virtual user. In the case of our example this is right after the action has actually been assigned to the virtual user. This allows the next virtual user to retrieve the list of actions still available and in turn make his selection/assignment.

        info("Get and remove the current user from the head of queueA");
        String pollValue1 = (String) sharedData.pollFirst("queueA");

    The current user is removed from the head of the queue. The next one will now be able to match his username against the head of the queue.

    Clear and destroy the queue at the finish

    When the script has completed, it should clear and destroy the queue. This code block can be put in the finish block of your script and/or in a separate script, in order to clear and remove the queue in case you have spotted an error or want to reset the queue for some reason.

        info("Clear queueA");
        sharedData.clearQueue("queueA");
        info("Destroy queueA");
        sharedData.destroyQueue("queueA");

    The users waiting in queueA are cleared and the queue is destroyed; any scripts still executing will be caught in a loop. I found it better to maintain a separate Reset Queue script which contains only the following code in the initialize() block. I call this script to make sure the queue is cleared in between multiple load test runs. It could even be added as the first script in a larger scenario, which would execute it only once at the very start of the load test and make sure the queues do not contain any stale entries.

        info("Create queueA with life time of 120 minutes");
        sharedData.createQueue("queueA", 120);
        info("Clear queueA");
        sharedData.clearQueue("queueA");

    This creates a Shared Data Queue instance named queueA and clears all entries from the queue.

    Monitoring the Queue

    While creating the scripts it was useful to monitor the contents, i.e. the current first user in the queue.
    The following code block makes sure the Shared Data Queue is accessible, in the initialize() block:

        info("Create queueA with life time of 120 minutes");
        sharedData.createQueue("queueA", 120);

    In the run() block the following code continuously monitors the first element of the queue and writes an informational message with the current username value to the Result window:

        info("Monitor the first users in queueA");
        String queueValue1 = null;
        do {
            queueValue1 = (String) sharedData.peekFirst("queueA");
            if (queueValue1 != null)
                info("The first user in queueA is currently: '" + queueValue1 + "' " + queueValue1.getClass() + " length " + queueValue1.length());
        } while (true);

    This script can be run from OpenScript in parallel to a load test performed by Oracle Load Testing. However, it is not recommended to run it in a production load test, as the performance impact is unknown. Accessing the queue's head with the peekFirst() method has been reported to take about 2 seconds response time by both OpenScript and OLT. It is advised to log a Service Request to see if this could be lowered in future releases of Application Testing Suite, as pollFirst(), and even offerLast() writing to the tail of the queue, usually returned after 0.1 seconds on average.

    Debugging the Queue

    While debugging the scripts, the following was useful to remove single entries from the head of the queue, i.e. the current first user. The following code block makes sure the Shared Data Queue is accessible, in the initialize() block:

        info("Create queueA with life time of 120 minutes");
        sharedData.createQueue("queueA", 120);

    In the run() block the following code removes the first element of the queue and writes an informational message with the removed username value to the Result window:

        info("Get and remove the current user from the head of queueA");
        String pollValue1 = (String) sharedData.pollFirst("queueA");
        info("The first user in queueA was currently: '" + pollValue1 + "' " + pollValue1.getClass() + " length " + pollValue1.length());

    References

    Oracle Functional Testing OpenScript User's Guide Version 9.20 [E15488-05], Chapter 17, Using the Shared Data Module
    http://download.oracle.com/otn/nt/apptesting/oats-docs-9.21.0030.zip

    Oracle Fusion Middleware Oracle WebLogic Server Administration Console Online Help 11g Release 1 (10.3.4) [E13952-04], Administration Console Online Help - Manage users and groups
    http://download.oracle.com/docs/cd/E17904_01/apirefs.1111/e13952/taskhelp/security/ManageUsersAndGroups.htm

    Read the article

  • New version of SQL Server Data Tools is now available

    - by jamiet
    If you don’t follow the SQL Server Data Tools (SSDT) blog then you may not know that two days ago an updated version of SSDT was released (and by SSDT I mean the database projects, not the SSIS/SSRS/SSAS stuff) along with a new version of the SSDT Power Tools. This release incorporates an updated version of the SQL Server Data-Tier Application Framework (aka DAC Framework, aka DacFX), which you can read about in Adam Mahood's blog post SQL Server Data-Tier Application Framework (September 2012) Available. DacFX is essentially all the gubbins that you need to extract and publish .dacpacs, and according to Adam's post it incorporates a new feature that I think is very interesting indeed:

        Extract DACPAC with data - Creates a database snapshot file (.dacpac) from a live SQL Server or Windows Azure SQL Database that contains data from user tables in addition to the database schema. These packages can be published to a new or existing SQL Server or Windows Azure SQL Database using the SqlPackage.exe Publish action. Data contained in the package replaces the existing data in the target database.

    In short, .dacpacs can now include data as well as schema. I'm very excited about this because one of my long-standing complaints about SSDT (and its many forebears) is that whilst it has great support for declarative development of schema, it does not provide anything similar for data; if you want to deploy data from your SSDT projects then you have to write post-deployment MERGE scripts. This new feature for .dacpacs does not change that situation yet, however it is a very important prerequisite, so I am hoping that a feature providing declaration of data (in addition to the declaration of schema that we have today) is going to light up in SSDT in the not too distant future.

    Read more about the latest SSDT, Power Tools and DacFX releases at:

        Now available: SQL Server Data Tools - September 2012 update! by Janet Yeilding
        New SSDT Power Tools! Now for both Visual Studio 2010 and Visual Studio 2012 by Sarah McDevitt
        SQL Server Data-Tier Application Framework (September 2012) Available by Adam Mahood

    @Jamiet
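    By way of illustration, extracting and then publishing a data-bearing .dacpac might look something like the SqlPackage.exe invocations below. Treat this as a sketch: the server and database names are placeholders, and the exact property name should be checked against the DacFX documentation for this release:

        SqlPackage.exe /Action:Extract /SourceServerName:(local) /SourceDatabaseName:AdventureWorks /TargetFile:AdventureWorks.dacpac /p:ExtractAllTableData=true
        SqlPackage.exe /Action:Publish /SourceFile:AdventureWorks.dacpac /TargetServerName:(local) /TargetDatabaseName:AdventureWorksCopy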

    Read the article

  • Not able to access external Hard disk

    - by Jash Jacob
    I have a 1TB external hard drive which I'm currently not able to access. When I open the external drive in Finder, it shows as empty. When I use "Get Info", the dialog box states it has about 300GB free. I tried to get into the external drive using Terminal, with no luck. Checking in Disk Utility, it showed that I have a large number of files but zero folders. I tried to "repair disk"; in the process the external drive got unmounted partway through. I then checked the drive on Windows. I was able to open almost all the folders, but I wasn't able to copy anything onto the external drive. One folder caused my Windows computer to hang, so I connected the drive back to my MacBook Pro and tried to access the drive through Terminal (this time it worked!) and then tried to delete the folder with the rm command, but got an "input/output error". What should I do to recover the files in that folder? How can I access my external drive on my Mac?

    Read the article

  • Importing data from many excel workbooks and sheets into a single workbook/table

    - by Max Rusalen
    Hi, I have 54 Excel files with three sheets each. Each sheet has a different number of data entries, but they are set out in an identical format, and I need to import the data from those sheets into a single workbook using VBA. Is there any way I can program it so I can build the loops to import the data, but without having to write in each workbook name for each loop/sheet? I think I can use the Call function, but I don't know how to make the loop code independent of the workbook name it applies to. Thank you so much in advance, Millie
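    For reference, the usual way to avoid hard-coding 54 workbook names is a Dir loop over the folder. Below is a rough sketch only; the folder path, the "Master" target sheet and the copy-everything approach are assumptions to adapt:

        Sub ImportAllWorkbooks()
            ' Sketch: open every .xls file in a folder and append each sheet's
            ' used range to the next free row of a master sheet.
            Dim folder As String, fileName As String
            Dim wb As Workbook, ws As Worksheet
            Dim target As Worksheet, nextRow As Long

            folder = "C:\Data\"                       ' assumed location of the 54 files
            Set target = ThisWorkbook.Worksheets("Master")
            nextRow = 1

            fileName = Dir(folder & "*.xls")
            Do While fileName <> ""
                Set wb = Workbooks.Open(folder & fileName, ReadOnly:=True)
                For Each ws In wb.Worksheets
                    ws.UsedRange.Copy target.Cells(nextRow, 1)
                    nextRow = nextRow + ws.UsedRange.Rows.Count
                Next ws
                wb.Close SaveChanges:=False
                fileName = Dir                        ' next file in the folder
            Loop
        End Sub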

    Read the article

  • How Oracle Data Integration Customers Differentiate Their Business in Competitive Markets

    - by Irem Radzik
    With data being a central force in driving innovation and competing effectively, data integration has become a key IT approach for removing silos and ensuring work with consistent and trusted data. Especially with the release of the 12c versions, Oracle Data Integrator and Oracle GoldenGate offer easy-to-use and high-performance solutions that help companies with their critical data initiatives, including big data analytics, moving to cloud architectures, modernizing and connecting transactional systems, and more.

    In a recent press release we announced the great momentum and analyst recognition Oracle Data Integration products have achieved in the data integration and replication market. In this press release we described some of the key new features of Oracle Data Integrator 12c and Oracle GoldenGate 12c. In addition, a few of our 4500+ customers explained how Oracle's data integration platform helped them achieve their business goals. In this blog post I would like to go over what these customers shared about their experience.

    Land O'Lakes is one of America's premier member-owned cooperatives, and offers an extensive line of agricultural supplies, as well as production and business services. Rich Bellefeuille, manager, ETL & data warehouse for Land O'Lakes, told us how GoldenGate helped them modernize their critical ERP system without impacting service, and how they are moving to new projects with Oracle Data Integrator 12c: "With Oracle GoldenGate 11g, we've been able to migrate our enterprise-wide implementation of Oracle's JD Edwards EnterpriseOne ERP system to a new database and application server platform with minimal downtime to our business. Using Oracle GoldenGate 11g we reduced database migration time from nearly 30 hours to less than 30 minutes. Given our quick success, we are considering expansion of our Oracle GoldenGate 12c footprint. We are also in the midst of deploying a solution leveraging Oracle Data Integrator 12c to manage our pricing data to handle orders more effectively and provide a better relationship with our clients. We feel we are gaining higher productivity and flexibility with Oracle's data integration products."

    ICON, a global provider of outsourced development services to the pharmaceutical, biotechnology and medical device industries, highlighted the competitive advantage that a solid data integration foundation brings. Diarmaid O'Reilly, enterprise data warehouse manager, ICON plc, said: "Oracle Data Integrator enables us to align clinical trials intelligence with the information needs of our sponsors. It helps differentiate ICON's services in an increasingly competitive drug-development industry." You can find more info on ICON's implementation here.

    A popular use case for Oracle GoldenGate's real-time data integration is offloading operational reporting from critical transaction processing systems. SolarWorld, one of the world's largest solar-technology producers and the largest U.S. solar panel manufacturer, implemented Oracle GoldenGate for real-time data integration of manufacturing data for fast analysis. Russ Toyama, U.S. senior database administrator for SolarWorld, told us that real-time data helps their operations and that GoldenGate's solution supports the high performance of their manufacturing systems: "We use Oracle GoldenGate for real-time data integration into our decision support system, which performs real-time analysis for manufacturing operations to continuously improve product quality, yield and efficiency. With reliable and low-impact data movement capabilities, Oracle GoldenGate also helps ensure that our critical manufacturing systems are stable and operate with high performance." You can watch the full interview with SolarWorld's Russ Toyama here.

    Starwood Hotels and Resorts is one of the many customers that found out how well Oracle Data Integration products work with Oracle Exadata. Gordon Light, senior director of information technology for Starwood Hotels, says they had a notable performance gain in loading the Oracle Exadata reporting environment: "We leverage Oracle GoldenGate to replicate data from our central reservations systems and other OLTP databases, significantly decreasing the overall ETL duration. Moving forward, we plan to use Oracle GoldenGate to help the company achieve near-real-time reporting." You can listen to the Starwood Hotels implementation story here.

    Many companies combine the power of Oracle GoldenGate with Oracle Data Integrator to have a single, integrated data integration platform for a variety of use cases across the enterprise. Ufone is another good example of that. The leading mobile communications service provider of Pakistan has improved customer service using timely customer data in its data warehouse. Atif Aslam, head of management information systems for Ufone, says: "Oracle Data Integrator and Oracle GoldenGate help us integrate information from various systems and provide up-to-date and real-time CRM data updates hourly, rather than daily. The applications have simplified data warehouse operations and allowed business users to make faster and better-informed decisions to protect revenue in the fast-moving Pakistani telecommunications market." You can read more about Ufone's use case here.

    In our Oracle Data Integration 12c launch webcast back in November we also heard from BT's CTO Surren Parthab about their use of GoldenGate for moving to a private cloud architecture. Surren also shared his perspectives on the Oracle Data Integrator 12c and Oracle GoldenGate 12c releases. You can watch the video here.

    These are only a few examples of leading companies that have made data integration and real-time data access a key part of their data governance and IT modernization initiatives. They have seen real improvements in how their businesses operate and differentiate in today's competitive markets. You can read about other customer examples in our ebook The Path to the Future, and access resources including white papers, data sheets, podcasts and more via our Oracle Data Integration resource kit.

    Read the article

  • How can I make results of a formula values that can be filtered or use vlookup with Excel

    - by Burt
    I am using various formulas to move and split data from various sources. The problem is that when the final results post to their destination, I still need to either run advanced filters or a VLOOKUP on those results, and I can't, because the cells contain formulas rather than values. As an example, if cell A1 shows a value of A127, the actual cell content is =RIGHT(A2,FIND(" ",A2&" ")-2). Everything I read says to copy and paste special values, but this doesn't work for me, as the idea is to have the formulas/macros run everything and eliminate cutting and pasting. In the case above I have a formula that pulls the info from a spreadsheet that is saved every week. Once it is pulled, part of it is cut out into another column. I then need to run a VLOOKUP on those results against data already contained on another tab.
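    For reference, if the aim is to stay macro-driven end to end, one option (a sketch; the sheet and range names are assumptions) is to have the macro overwrite the formulas with their current values in place, which leaves plain values that filters and VLOOKUP can work against:

        Sub FreezeFormulas()
            ' Sketch: replace the formulas in a result range with their values.
            With Worksheets("Results").Range("A1:A1000")   ' assumed destination range
                .Value = .Value
            End With
        End Sub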

    Read the article

  • How to Export data to Excel using LINQ to Entity?

    - by Rita
    Hi, I have data coming from an Entity Data Model table on my ASP.NET page. Now I have to export this data to Excel on a button click. If it were OLEDB, it would be straightforward, as shown here: http://csharp.net-informations.com/excel/csharp-excel-oledb-insert.htm. Here is my function to read data from the inquiries table:

        var model = from i in myEntity.Inquiries
                    where i.User_Id == 5
                    orderby i.TX_Id descending
                    select new
                    {
                        RequestID = i.TX_Id,
                        CustomerName = i.CustomerMaster.FirstName,
                        RequestDate = i.RequestDate,
                        Email = i.CustomerMaster.MS_Id,
                        DocDescription = i.Document.Description,
                        ProductName = i.Product.Name
                    };
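    The thread records no answer, but one rough approach is to materialize the query and write the rows out through the Excel interop assemblies. This is only a sketch: the column set is abbreviated, the file path is made up, and note that Microsoft advises against automating Excel from ASP.NET, so a writer library may be safer in production:

        // Sketch: write the LINQ query results ('model' from the question) to a workbook.
        // Assumes a reference to Microsoft.Office.Interop.Excel and Excel on the machine.
        var rows = model.ToList();
        var excel = new Microsoft.Office.Interop.Excel.Application();
        var workbook = excel.Workbooks.Add();
        var sheet = (Microsoft.Office.Interop.Excel.Worksheet)workbook.ActiveSheet;

        sheet.Cells[1, 1] = "RequestID";
        sheet.Cells[1, 2] = "CustomerName";
        int r = 2;
        foreach (var row in rows)
        {
            sheet.Cells[r, 1] = row.RequestID;
            sheet.Cells[r, 2] = row.CustomerName;
            r++;
        }
        workbook.SaveAs(@"C:\Temp\Inquiries.xls");
        excel.Quit();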

    Read the article

  • Properly clean up excel interop objects revisited: Wrapper objects

    - by chiccodoro
    Hi all,

        Excel 2007 Hangs When Closing via .NET
        How to properly clean up Excel interop objects in C#
        How to properly clean up interop objects in C#

    All of these struggle with the problem that C# does not release the Excel COM objects properly after using them. There are mainly two directions for working around this issue:

    1. Kill the Excel process when Excel is not used anymore.
    2. Take care to assign each COM object used explicitly to a variable, and Marshal.ReleaseComObject all of these.

    Some have stated that 2 is too tedious, and there is always some uncertainty whether you forgot to stick to this rule in some places in the code. Still, 1 seems dirty and dangerous to me; I could also imagine that killing processes is not allowed in an environment with restricted access. So I've been thinking about solving 2 by creating another proxy object model which mimics the Excel object model (for me, it would suffice to implement the objects I actually need). The principle would look as follows: each Excel interop class has a proxy which wraps an object of that class. The proxy releases the COM object in its destructor. The proxy mimics the interface of the interop class (maybe by inheriting it). Any method that usually returns another COM object returns a proxy instead; the other methods simply delegate their implementation to the inner COM object. This is a rough sketch of the code:

        public class Application : Microsoft.Office.Interop.Excel.Application
        {
            private Microsoft.Office.Interop.Excel.Application innerApplication =
                new Microsoft.Office.Interop.Excel.Application();

            ~Application()
            {
                Marshal.ReleaseComObject(innerApplication);
            }

            public Workbooks Workbooks
            {
                get { return new Workbooks(innerApplication.Workbooks); }
            }
        }

        public class Workbooks
        {
            private Microsoft.Office.Interop.Excel.Workbooks innerWorkbooks;

            Workbooks(Microsoft.Office.Interop.Excel.Workbooks innerWorkbooks)
            {
                this.innerWorkbooks = innerWorkbooks;
            }

            ~Workbooks()
            {
                Marshal.ReleaseComObject(innerWorkbooks);
            }
        }

    My questions to you in particular: Who finds this a bad idea, and why? Who finds this a great idea? If so, why hasn't anybody implemented/published such a model yet; just due to the effort, or am I missing a killer problem with the idea? Is it impossible/bad/dangerous to do the ReleaseComObject in the destructor? (I've only seen proposals to put it in a Dispose() rather than in a destructor; why?) If the approach makes sense, any suggestions to improve it?

    Read the article

  • Create Excel (.XLS and .XLSX) file from C#

    - by mistrmark
    What is the best tool for creating an Excel spreadsheet with C#? Ideally, I would like it to be open source so I don't have to add any third-party dependencies to my code, and I would like to avoid using Excel directly to create the file (via OLE Automation). The .CSV file solution is easy, and is the current way I am handling this, but I would like to control the output formats. EDIT: I am still looking at these to see the best alternative for my solution. Interop will work, but it requires Excel to be on the machine you are using. Also, the OLEDB method is intriguing, but may not yield much more than what I can achieve with CSV files. I will look more into the 2003 XML format, but that also puts an Excel 2003 requirement on the file. I am currently looking at a port of the PEAR (PHP library) Excel Writer that allows some pretty good XLS data and formatting, in the Excel 97 compatible format that all modern versions of Excel support. The PEAR Excel Writer is here: PEAR - Excel Writer
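    One open-source option along the lines the poster is after is NPOI, a .NET port of Apache POI that writes the Excel 97 binary format without Excel being installed. A minimal sketch (API details may vary between NPOI versions):

        using System.IO;
        using NPOI.HSSF.UserModel;   // .xls (BIFF) support

        var workbook = new HSSFWorkbook();
        var sheet = workbook.CreateSheet("Report");

        var header = sheet.CreateRow(0);
        header.CreateCell(0).SetCellValue("Name");
        header.CreateCell(1).SetCellValue("Amount");

        var row = sheet.CreateRow(1);
        row.CreateCell(0).SetCellValue("Widgets");
        row.CreateCell(1).SetCellValue(42.5);

        using (var stream = File.Create("report.xls"))
        {
            workbook.Write(stream);
        }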

    Read the article

  • Excel and Tab Delimited Files Question

    - by OneNerd
    I am encountering what I believe to be a strange issue with Excel (in this case Excel 2007, but maybe also Excel 2003; I don't have access to it as I write this). I can reliably convert some server data over into a tab-delimited format (I've been doing this for years) and then open it using Excel with no issue. However, if I have an html <table> inside one of the fields, Excel 2007 seems to think it should convert the table into rows and columns inside Excel (not what I want). As you might imagine, this throws off the entire spreadsheet. So the question is: is there any way to set up Excel NOT to do this (perhaps some setting in Excel that pertains to reading tab-delimited files), or am I missing something? Thanks.
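    A workaround worth trying, though it is an assumption rather than a confirmed fix from the thread, is to quote any field that contains tabs, newlines or quotes when generating the file, so that Excel treats the embedded HTML as one literal cell value; whether Excel honors the quoting should be verified against the actual data:

        // Sketch: escape a single field for a tab-delimited file destined for Excel.
        static string EscapeTsvField(string field)
        {
            bool needsQuoting = field.IndexOfAny(new[] { '\t', '\n', '\r', '"' }) >= 0;
            return needsQuoting
                ? "\"" + field.Replace("\"", "\"\"") + "\""   // double embedded quotes
                : field;
        }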

    Read the article

  • Change Border in Excel left,right,bottom and top

    - by Le Viet Hung
    First I changed the border color of my sheet to white, because I want a white sheet. Then I made some headers and want a border around them. The problem is that it draws borders between the values in the header, but the top and bottom edges are not visible. My code:

        xlWorkSheet5.Columns.Borders.Color = System.Drawing.ColorTranslator.ToOle(System.Drawing.Color.White); // color Sheet5 white, BusLoad
        xlWorkSheet5.Columns.NumberFormat = "@";
        Excel.Range rng = (Excel.Range)xlWorkSheet5.get_Range("A7", "J7");
        rng.RowHeight = 25.5;
        rng.BorderAround2(Excel.XlLineStyle.xlContinuous, Excel.XlBorderWeight.xlHairline,
            Excel.XlColorIndex.xlColorIndexAutomatic, Excel.XlColorIndex.xlColorIndexAutomatic);
        rng.Borders.LineStyle = Excel.XlLineStyle.xlContinuous;
        rng.Borders.Weight = 1d;
        rng.Font.Bold = true;
        rng.HorizontalAlignment = Microsoft.Office.Interop.Excel.XlHAlign.xlHAlignCenter;
        rng.Interior.Color = System.Drawing.ColorTranslator.ToOle(System.Drawing.Color.LightGray);
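    One possible cause (a guess, not a confirmed diagnosis) is that the later rng.Borders.LineStyle and rng.Borders.Weight assignments overwrite the hairline outline drawn by BorderAround2. Setting each outer edge explicitly avoids relying on the combined calls; a sketch:

        // Sketch: style each outer edge of the header range individually.
        Excel.XlBordersIndex[] edges =
        {
            Excel.XlBordersIndex.xlEdgeTop,
            Excel.XlBordersIndex.xlEdgeBottom,
            Excel.XlBordersIndex.xlEdgeLeft,
            Excel.XlBordersIndex.xlEdgeRight
        };
        foreach (Excel.XlBordersIndex edge in edges)
        {
            Excel.Border border = rng.Borders[edge];
            border.LineStyle = Excel.XlLineStyle.xlContinuous;
            border.Weight = Excel.XlBorderWeight.xlThin;
            border.Color = System.Drawing.ColorTranslator.ToOle(System.Drawing.Color.Black);
        }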

    Read the article

  • Saving a file in a CSV type in Excel always removes the BOM

    - by rickp
    I've been trying to find a reasonable solution/explanation (unsuccessfully) to find out why Excel defaults to removing the BOM when saving a file to the CSV type. Please forgive me if you find this a duplicate of this question. This handles reading CSV files with non-ASCII encoding, but it doesn't cover saving the file back out (which is where the biggest issue lies). Here is my current situation (which I'm going to gather is common among localized software dealing with Unicode characters and a CSV format): We export data to a CSV format using UTF-16LE, ensuring the BOM is set (0xFFFE). We validate after the file is generated with a Hex editor to ensure it was set correctly. Open the file in Excel (for this example we're exporting Japanese characters) and witness that Excel handles loading the file with the correct encoding. Attempts to save this file will prompt you with a warning message indicating that the file may contain features that may not be compatible with Unicode encoding, but asks if you'd like to save anyway. If you select the Save As dialog, it will immediately ask you to save the file as "Unicode Text" rather than CSV. If you select the "CSV" extension and save the file it removes the BOM (obviously along with all the Japanese characters). Why would this happen? Is there a solution to this problem, or is this a known 'bug'/limitation of Excel? Additionally (as a side issue) it appears that Excel, when loading UTF-16LE encoded CSV files, only uses TAB delimiters. Again, is this another known 'bug'/limitation of Excel?

    Read the article

  • Is there a limit on number of OLE objects that can be embedded in an excel sheet?

    - by Varun Mahajan
    I am adding OLE objects to an Excel sheet through .NET interop. However, after a number of calls, Excel stops allowing more objects to be added through code. Is there a limit, or am I doing something wrong?

        Dim Htmlshape As Microsoft.Office.Interop.Excel.Shape
        Htmlshape = xlWorkSheet.Shapes.AddOLEObject(, tmpFile, , True, strExplorerPath, 1, "")

    Running this code gives an error after, say, 1000 calls. So, am I crossing some limit here?
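    No documented limit on AddOLEObject turns up in the object model reference, so one thing worth ruling out (an assumption, not a confirmed diagnosis) is an accumulation of unreleased COM references; releasing each returned Shape may help:

        Dim Htmlshape As Microsoft.Office.Interop.Excel.Shape
        Htmlshape = xlWorkSheet.Shapes.AddOLEObject(, tmpFile, , True, strExplorerPath, 1, "")
        ' Release the returned COM reference once the shape is no longer needed,
        ' so references do not pile up over hundreds of AddOLEObject calls.
        System.Runtime.InteropServices.Marshal.ReleaseComObject(Htmlshape)
        Htmlshape = Nothing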

    Read the article

  • Excel 2010 Access to path is denied temp

    - by Chris Anderson
    I am using ExcelDataReader (http://exceldatareader.codeplex.com/) to read data from an Excel file; one or the other of the two readers below is created, depending on the file format:

        FileStream stream = File.Open(filePath, FileMode.Open, FileAccess.Read);

        // 1. Reading from a binary Excel file ('97-2003 format; *.xls)
        IExcelDataReader excelReader = ExcelReaderFactory.CreateBinaryReader(stream);

        // 2. Reading from an OpenXml Excel file (2007 format; *.xlsx)
        IExcelDataReader excelReader = ExcelReaderFactory.CreateOpenXmlReader(stream);

    This reads both the Excel 97-2003 format and the Excel 2007 format on my local machine, and also when we move it to our test server. However, when moved to production, it works for Excel 97-2003 files, but when I try to read 2007 files I receive the following error:

        Access to the path 'C:\Documents and Settings\PORTALS03\ASPNET\LOCALS~1\Temp\TMP_Z129388041687919815' is denied.

    How is it possible that the 97-2003 Excel files can be read but the 2007 files throw access denied?

    Read the article

  • VB.net Excel sorting

    - by Lora
    I am trying to convert a macro from VBA over to VB.NET, and I am getting a type mismatch error that I can't figure out. I am hoping someone here will be able to help me. This is the code:

        Sub SortRawData()
            Dim oSheet As Excel.Worksheet
            Dim oRange As Excel.Range
            Try
                oSheet = SetActiveSheet(mLocalDocument, "Sheet 1")
                oRange = mApplication.ActiveSheet.UsedRange
                oRange.Sort(Key1:=oRange("J2"), Order1:=Excel.XlSortOrder.xlAscending, _
                    Header:=Excel.XlYesNoGuess.xlYes, OrderCustom:=1, MatchCase:=False, _
                    Orientation:=Excel.XlSortOrientation.xlSortColumns, _
                    DataOption1:=Excel.XlSortDataOption.xlSortNormal, _
                    DataOption2:=Excel.XlSortDataOption.xlSortNormal, _
                    DataOption3:=Excel.XlSortDataOption.xlSortNormal)
            Catch ex As Exception
                ErrorHandler.HandleError(ex.Message, ex.Source, ex.StackTrace)
            End Try
        End Sub

    This is the code from the macro:

        Sub SortRawData(ByRef poRange As Range)
            Set poRange = Application.ActiveSheet.UsedRange
            poRange.Sort Key1:=Range("J2"), Order1:=xlAscending, _
                Header:=xlYes, OrderCustom:=1, MatchCase:=False, Orientation:= _
                xlTopToBottom, DataOption1:=xlSortNormal, DataOption2:=xlSortNormal, _
                DataOption3:=xlSortNormal
            poRange.Sort Key1:=Range("D2"), Order1:=xlAscending, _
                Key2:=Range("H2"), Order2:=xlAscending, _
                Key3:=Range("L2"), Order3:=xlAscending, _
                Header:=xlYes, OrderCustom:=1, MatchCase:=False, Orientation:= _
                xlTopToBottom, DataOption1:=xlSortNormal, DataOption2:=xlSortNormal, _
                DataOption3:=xlSortNormal
        End Sub

    Any help would be appreciated. Thanks!

    Read the article

  • Powershell interact with open Excel

    - by HKK
    To interact with Excel in PowerShell, it is common to start a new Excel instance as follows:

        $x = New-Object -ComObject Excel.Application

    Instead of that, I already have an open Excel process. (I find it as follows:)

        $excelprocess = Get-Process | Where-Object {$_.name -eq "excel"} |
            Sort-Object -Property "Starttime" -Descending |
            Select-Object -First 1

    Is there a way to interact with this specific Excel process from PowerShell?
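    One way to attach to an already-running Excel instead of starting a new one is Marshal.GetActiveObject, sketched below; note that it binds to whichever instance is registered in the COM Running Object Table, which is not necessarily the process selected above:

        # Attach to the running Excel instance registered in the Running Object Table.
        $excel = [Runtime.InteropServices.Marshal]::GetActiveObject("Excel.Application")
        $excel.Visible = $true
        $excel.Workbooks | ForEach-Object { $_.Name }   # list its open workbooks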

    Read the article

  • Form wont stay on top. How do I keep my excel form from hiding behind other windows after I browse

    - by ScottK
    I have a VB.NET program that opens an Excel workbook and runs a macro ("Report") in that workbook when a button is clicked:

        ' Workbook with macro and form
        xlWorkbook = xlApp.Workbooks.Open("W:Data\Excel Program.xls")
        ' Macro: opens a form from the workbook. I browse for my two .csv files
        ' and then click a button to run code that creates my reports.
        xlApp.Run("Report")
        ' form closes; show the Excel report after it's created
        xlApp.Visible = True

    After I browse for my first file and select it, so that its location is displayed in the text box, the Excel form hides behind any open windows. I want this form to stay on top. It is after this code executes that the form hides behind all other open windows:

        Private Sub btnBrowseFile1_Click()
            Dim fileName1 As String
            fileName1 = Application.GetOpenFilename("CSV file (*.csv), *.csv")
            If fileName1 <> "False" Then
                Me.txtFileName1.Text = fileName1
            End If
        End Sub

    EDIT: I still have no luck with this problem. When the Excel macro is opened from a VB program I have this hiding issue, but only after browsing for a file. Why does the focus leave the form and go to Windows after browsing a file? Does anyone have any suggestions?

    Read the article

  • Create multiple columns on one sheet from 1 column on another with logic

    - by user450252
    I am new to VBA and I am struggling to make this work. I need a macro that will process each cell/column on Sheet1 and put the results on Sheet2. I'm sure this is pretty easy for those who are more advanced with VB code. The sheet contains many columns. Any time we encounter a — or an empty cell, we populate the cell with -999 (see the example on Sheet2). For the first column, which contains "0–2", we create 2 new columns and populate them with 0 in the first column and 2 in the second column (see the example on Sheet2). If a value is found on only one side, we populate both sides with the same number. Each column from Sheet1 is processed this way. The table below represents my data set on Sheet1:

        A        B
        Column1  Column2
        Title    Title2
        0–2      0–4
        3        —
        —        5
        —        —
        10–23    11–29

    And the results should look like this on Sheet2:

        A        B        C        D
        Column1  Column1  Column2  Column2
        Title-A  Title-B  Title-A  Title-B
        0        2        0        4
        3        3        -999     -999
        -999     -999     5        5
        -999     -999     -999     -999
        10       23       11       29
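    The thread shows no answer, but a rough sketch of the kind of loop involved follows; the sheet names, the dash characters to normalize, and the header handling are all assumptions to adapt:

        Sub SplitRangeColumns()
            ' Sketch: for each Sheet1 column, write a pair of Sheet2 columns.
            ' "a-b" splits into a and b; a lone dash or empty cell becomes -999/-999;
            ' a single value is copied to both sides. Header rows need extra handling.
            Dim src As Worksheet, dst As Worksheet
            Dim r As Long, c As Long, outCol As Long
            Dim v As String, parts() As String

            Set src = Worksheets("Sheet1")
            Set dst = Worksheets("Sheet2")

            For c = 1 To src.UsedRange.Columns.Count
                outCol = (c - 1) * 2 + 1              ' each source column becomes two
                For r = 1 To src.UsedRange.Rows.Count
                    v = Trim(src.Cells(r, c).Text)
                    v = Replace(v, Chr(150), "-")     ' normalize en dash
                    v = Replace(v, Chr(151), "-")     ' normalize em dash
                    If v = "" Or v = "-" Then
                        dst.Cells(r, outCol).Value = -999
                        dst.Cells(r, outCol + 1).Value = -999
                    ElseIf InStr(v, "-") > 0 Then
                        parts = Split(v, "-")
                        dst.Cells(r, outCol).Value = parts(0)
                        dst.Cells(r, outCol + 1).Value = parts(1)
                    Else
                        dst.Cells(r, outCol).Value = v
                        dst.Cells(r, outCol + 1).Value = v
                    End If
                Next r
            Next c
        End Sub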

    Read the article

  • How to disable auto recover?

    - by user70010
    I've tried disabling the "Save AutoRecover information" option, but to no avail. The "Excel has recovered the following files" window still pops up at startup. Any other ideas? UPDATE: The auto-recover window is shown on each start after I terminate Excel during a debug session. All file entries Excel shows in the auto-recover window are dated to the year 1601 (the Windows FILETIME zero value). I checked the Excel auto-recover folder, and there are no actual files there.

    Read the article

  • Excel Data List Validation in Powershell

    - by idazuwaika
    Hi, I have a range named "STATE". I want to set data validation on range ("A1") so it only takes values from this range, using PowerShell. Below is what I have tried; it does not work, and I don't know what to put as the 4th and 5th parameters. The first 3 are Excel constants, equivalent to xlValidateList, xlValidAlertStop and xlBetween respectively.

        $ws.Range("A1").Validation.Add(3, 1, 1, "=STATE", 0)

    Please help. Thanks.
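    For reference, the Add call for a list rule only needs four arguments (Type, AlertStyle, Operator, Formula1); Formula2 applies to between/not-between rules, so it can simply be omitted. A sketch, assuming $ws already points at the worksheet, and deleting any prior rule first:

        $xlValidateList   = 3   # XlDVType.xlValidateList
        $xlValidAlertStop = 1   # XlDVAlertStyle.xlValidAlertStop
        $xlBetween        = 1   # XlFormatConditionOperator.xlBetween

        $validation = $ws.Range("A1").Validation
        $validation.Delete()    # remove any existing rule before adding a new one
        $validation.Add($xlValidateList, $xlValidAlertStop, $xlBetween, "=STATE")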

    Read the article

  • Data Security Through Structure, Procedures, Policies, and Governance

    Security Structure and Procedures

    One of the easiest ways to implement security is through the use of structure, in particular the structure in which data is stored. The preferred method for this is the use of User Roles; these Roles allow specific access to be granted based on the role a user plays in relation to the data they are manipulating. Typical data access actions are defined by the CRUD principle:

        Create new data
        Read existing data
        Update existing data
        Delete existing data

    Based on the actions assigned to a role, users can manipulate data as they need to perform daily business operations. An example of this can be seen in a hospital, where doctors are assigned Create, Read, Update, and Delete access to their patients' prescriptions, so that a doctor can prescribe new medications and adjust any existing prescriptions as necessary. A nurse, however, will only have Read access on a patient's prescriptions, so that they know which medicines to give to the patients. Notice that nurses do not have access to prescribe new prescriptions, or to update or delete existing ones, because only the patient's doctor has access to perform those actions.

    With User Roles comes responsibility: companies need to constantly monitor data access to ensure that each role has the most appropriate access level, so that users are not exposed to inappropriate data. This also protects against rogue employees gaining access to critical business information that could be destroyed, altered or stolen. It is important that all data access is monitored because of this threat.

    Security Governance

    Current data governance laws regarding security:

        Health Insurance Portability and Accountability Act (HIPAA)
        Sarbanes-Oxley Act
        Database Breach Notification Act

    The US Department of Health and Human Services defines HIPAA as a Privacy Rule. This legislation protects the privacy of individually identifiable health information. Currently, HIPAA sets the national standards for securing electronic protected health records. Additionally, its confidentiality provisions protect identifiable information being used to analyze patient safety events and improve patient safety.

    In 2002, in the wake of the Enron and WorldCom financial scandals, Senator Paul Sarbanes and Representative Michael Oxley led the creation of the Sarbanes-Oxley Act. This act, administered by the Securities and Exchange Commission (SEC), dramatically altered corporate financial practices and data governance, and it set specific deadlines for compliance. The Sarbanes-Oxley Act is not a set of standard business rules and does not specify how a company should retain its records; rather, it outlines which pieces of data are to be stored as well as the storage duration.

    The Database Breach Notification Act requires companies, in the event of a data breach involving personally identifiable information, to notify all California residents whose information was stored on the compromised system at the time of the event, according to Gregory Manter. He further explains that although this act is only California legislation, it affects "any person or business that conducts business in California, and that owns or licenses computerized data that includes personal information," regardless of where the compromised data is located. Any business that maintains even limited interactions with California residents will find itself subject to the Act's provisions.

    Security Policies

    All companies must work in accordance with the appropriate city, county, state, and federal laws. One way to ensure that a company is legally compliant is to enforce security policies that adhere to the appropriate legislation in the area or areas it serves. These types of policies need to be mandated by a company's Security Officer; in smaller companies, they need to come from executives, directors, and owners.

    Read the article

  • I will need a formula showing counts, totals and sub-totals for data set from different sheet

    - by Sapthagiri
    I am using Excel 2003. On Sheet1, I have a cell with a color value plus totals and sub-totals. On Sheet2, I have a data set with 3 columns (colors, dress, type). On Sheet1, I need a tabulation showing totals per color, with sub-totals per dress group (shirt, pants) split by type (full, half, tee). The table below represents my data set on Sheet2:

        Colors   Make      Dress  Type
        ------------------------------
        Red      Arrow     shirt  full
        Red      Levi      shirt  half
        blue     Rugger    Pant   full
        yellow   Wrangler  shirt  tee
        yellow   Rugger    Pant   half
        yellow   Arrow     shirt  tee
        yellow   Wrangler  Pant   half
        Green    Rugger    Pant   full
        Red      Levi      shirt  tee
        blue     Rugger    Pant   full
        blue     Arrow     shirt  full
        blue     Wrangler  Pant   half
        Green    Levi      shirt  full

    I will need a formula showing counts, totals and sub-totals on Sheet1 for the data set from Sheet2. The table below represents my expected output on Sheet1:

                 total  Shirt  Full  Half  Tees  Pants  Full  Shorts
        Red      10     8      4     3     1     2      1     1
        Blue
        Green
        Yellow

    Please note I am not looking for a pivot table solution.
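    No answer is recorded here, but for reference, Excel 2003 can build such counts with SUMPRODUCT (COUNTIFS only arrived in Excel 2007). The row ranges below are assumptions; each extra factor adds one more criterion:

        =SUMPRODUCT((Sheet2!$A$2:$A$14="Red")*1)
        =SUMPRODUCT((Sheet2!$A$2:$A$14="Red")*(Sheet2!$C$2:$C$14="shirt"))
        =SUMPRODUCT((Sheet2!$A$2:$A$14="Red")*(Sheet2!$C$2:$C$14="shirt")*(Sheet2!$D$2:$D$14="full"))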

    Read the article

  • Recover data from hard drive with partitions (but not most data) overwritten

    - by Macha
    I have a 500GB hard drive I've been keeping around to recover data from; I removed it from a failing NAS that got sort of... erratic at the end. I finally got rid of the NAS when a firmware update removed the partition table. Fast forward to a week ago, when I was building a new PC, and a mixup resulted in me placing the hard drive in question in the new PC and installing Windows XP on the first 100GB. I presume any data on that first 100GB is now gone, but for the rest of it, is there any way I can recover it at home, as professional data recovery is currently too expensive? I have a blank 1TB HDD that I can store any images of that hard drive on. The problem was definitely with the NAS and not the hard drive, as the hard drive took a successful install of Windows when mistakenly placed in the new PC, and there were capacitors in the NAS's circuitry that were clearly broken. The data I want to recover (in order of priority) is:

        High: some JPGs of family photos.
        Medium: some RAW files. (There are also JPG versions of all of these.)
        Low: some MP3s, AVIs and ISOs; I can re-rip most of these if need be, but it'd be handy not to have to.

    (I don't need a backup lecture, and if you can hold it in from nagging Jeff Atwood for it, you can hold it in from nagging me for it.) In short: the partition tables are gone and overwritten. The data is not overwritten, except for an amount equal to the size of a Windows XP SP3 installation.

    Read the article

  • Why Oracle Delivers More Value than IBM in Data Integration Solutions

    - by irem.radzik(at)oracle.com
    For data integration projects, IT organizations look for a robust but easy-to-use solution that simplifies enterprise data architecture while providing exceptional value, not one that adds complexity and cost. This is a major challenge today for customers who are using IBM InfoSphere products like DataStage or Change Data Capture, whereas Oracle consistently delivers higher value with its data integration products, Oracle Data Integrator and Oracle GoldenGate. There are many differentiators for Oracle's Data Integration offering in comparison to IBM's. Here are the top five:

        1. Lower cost of ownership
        2. Higher performance in both real-time and bulk data movement
        3. Ease of use and flexibility
        4. Reliability
        5. Complete, open, and integrated middleware offering

    Architectural differences between the products contribute a great deal to these differences. First of all, Oracle's ETL architecture does not require a middle-tier transformation server, something IBM does require. Not only does an additional transformation server cost more to manage, including energy costs, it also adds a performance bottleneck. In addition, IBM's data integration products are complex and often require lengthy professional services engagements to integrate. This translates to higher costs and delayed time to market.

    Then there's the reliability factor. Our customers choose Oracle GoldenGate over IBM's InfoSphere Change Data Capture product because Oracle GoldenGate is designed for mission-critical systems that require guaranteed data delivery and automatic recovery in case of process interruptions.

    On Thursday we will discuss these key differentiators in detail and provide examples of customers that chose Oracle over IBM in data integration projects. Join us on Thursday, Feb 10th at 11am PT to learn how Oracle delivers more value than IBM in data integration solutions.

    Read the article
