Search Results

Search found 22065 results on 883 pages for 'performance testing'.


  • Use-cases for node.js and c#

    - by Chase Florell
    I do quite a bit of ASP.NET work (C#, MVC), but most of it is typical web development. I build RESTful architectures using CRUD repositories. Most of my clients don't have a lot of advanced requirements within their applications. I'm now looking at node.js and its performance implications (I'm addicted to speed), but I haven't delved into it all that much. I'm wondering whether node.js can realistically replace my typical web development in C# and ASP.NET MVC (not rewriting existing apps, but when working on new ones), or whether node.js can complement an ASP.NET MVC app by adding some async goodness to the existing architecture. Are there use-cases for/against C# and node.js? Edit: I love ASP.NET MVC and am super excited about where it's going. I'm just trying to see if there are special use cases that would favor node.js.

    Read the article

  • Top Innovations for Sales Managers

    - by divya.malik
    Sales managers are always looking for ways to motivate their troops as well as make themselves more effective and productive. Here is a small X’mas present for those folks that are looking for some effective tips. Our friends at Selling Power magazine recently wrote an interesting blog post with the top 10 best practices for sales managers. Here we go:
    1. Harness social media
    2. Strategically align marketing campaigns with sales efforts
    3. Establish a customer-centric sales process
    4. Realize ROI with CRM
    5. Embrace online collaboration
    6. Improve accuracy in sales forecasting and pipeline metrics
    7. Coach for sales success
    8. Leverage mobile technology
    9. Focus on sales enablement
    10. Improve sales performance and compensation management
    We have a complete suite of sales applications to help increase sales revenues and sales productivity as well as to improve your sales execution. You can find more details here. For more details on the Selling Power blog post, click here. Happy Holidays to you and your family.

    Read the article

  • Welcome to new blog!! Agile.NAV

    - by ssmantha
    I am quite ecstatic to announce a new blog, of which I am a co-author: http://agilenav.wordpress.com. Agile.NAV brings together a vast amount of information about the work I did with my colleague on bringing Microsoft Dynamics NAV under the hood of Team Foundation Server. For the past couple of years we have been working on creating development tools (more on the integration side) for Microsoft Dynamics NAV, which include version control, an automated build system, and our new automated testing integration with Dynamics NAV 2013. To start off with, we got very good initial responses from distinguished members of the community like Luc van Vugt (see here). The idea is to drive a shift in mind-set for the Microsoft Dynamics NAV developer community. We share the same passion as people like Luc about creating software in a professional manner.

    Read the article

  • Česká obchodní banka, a.s. Upgrades to Oracle Database 11g On Time, On Budget and without Disrupting Business Operations

    - by jgelhaus
    You want the new features of the latest release, but upgrading a database is one of those things DBAs can "lose sleep" over. Česká obchodní banka, a.s. ("CSOB") needed to upgrade its production systems in the Czech Republic and Slovakia, which supported 90 key applications for its retail, corporate, internet, and ATM services, from Oracle Database 9i to Oracle Database 11g, with a simultaneous migration from Alpha processor/OpenVMS-based hardware to a Power7/AIX system. Oracle Consulting helped complete the upgrade on schedule and within budget, while meeting tight restrictions on downtime. Knowledge transfer from Oracle Consulting to the bank’s IT team has improved self-sufficiency in support and maintenance, and the technical and advisory services of Oracle Consulting Expert Services continue to optimize performance and availability while lowering cost of ownership. Read how CSOB maximized the value of its investment in Oracle Database technology with an upgrade to Oracle Database 11g.

    Read the article

  • Computacenter first partner to offer Oracle Exadata proof-of-concept environment for real-world test

    - by kimberly.billings
    Computacenter (http://www.computacenter.com/), Europe's leading independent provider of IT infrastructure services, recently announced that it is the first partner to offer an Oracle Exadata 'proof-of-concept' environment for real-world testing. This new center, combined with Computacenter's extensive database storage skills, will enable organisations to accurately test Oracle Exadata with their own workloads, clearly demonstrating the case for migration. For more information, read the press release. Are you planning to migrate to Oracle Exadata? Tell us about it!

    Read the article

  • Creating the Business Card Request InfoPath Form

    - by JKenderdine
    Business Card Request Demo Files

    Back in January I spoke at SharePoint Saturday Virginia Beach about InfoPath forms and Web Part deployment. Below is some of the information and details regarding the form I created for the session. There are many blogs and Microsoft articles on how to create a basic form, so I won't repeat that information here. This blog will just explain a few of the options I chose when creating the solutions for SPS Virginia Beach. The above link contains the zipped package files of the two InfoPath forms (no-code solution and coded solution), the list template for the Location list I used, and the PowerPoint deck. If you plan to use these templates, you will need to update the forms to work within your own environment (change data connections, code links, etc.). Also, you must have the SharePoint Enterprise version, with InfoPath Services configured, in order to use the web browser enabled forms.

    So what are the requirements for this template? Business Card Request Form Template design plan:
    - Gather user information and requirements for the card: pull in as much user information as possible, using data from the user profile web services as a data source
    - Show and hide fields as necessary for requirements
    - Create multiple views – one for those submitting the form and another view for the executive assistants placing the orders
    - Browser-based form integrated into a SharePoint team site
    - Submitted directly to a form library

    The base form was created using the blank template. The table and rows were added using the Insert tab and selecting Custom Table. The use of tables is a great way to make sure everything lines up. You do have to split the tables from time to time. If you’ve ever split cells and then tried to re-align one, only to find that you impacted the others, you know why. Here is what the base form looks like in InfoPath.

    Show and hide fields as necessary for requirements: You will notice I also used Sections within the form. These show or hide depending on the options selected or on whether or not fields are blank. This is a great way to prevent your users from feeling overwhelmed by a large form (it wouldn't apply to this one). Although not used here, you can also use various views with a tab interface. I'll show that in another post.

    Gather user information and requirements for the card: Utilizing rules, you can load data when the form initiates (Data tab, Form Load). Anything you can automate is always appreciated by the user, as that is data they don't have to enter, for example loading their user id or other user information on load. Always keep in mind, though, how much data you load and the method for loading that data (through rules, code, etc.). Both have an impact on form performance. The form will take longer to load if you bring in a ton of data from external sources. Laura Rogers has a great blog post on using the User Information List to load user information. If the user has logged into SharePoint, then this can be used quite effectively and without a huge performance hit. What I have found is that using the User Profile service via code-behind or the web service "GetUserProfileByName" (as above) can take more time to load the user data. Just food for thought. You must add the data connection in order for the above rules to work. You can reach it through the Data tab, Data Connections, or select the Manage Data Connections link which appears under the main data source. The data connections can be SharePoint lists or libraries, SQL data tables, XML files, etc.

    Create multiple views: You can also create multiple views for the users to enhance their experience. Once they've entered the information and submitted their request for business cards, they don't really need to see the main data input screen any more. They just need to view what they entered. From the Page Design tab, select New View and give the view a name. To review the existing views, click the down arrow under View. The ReviewView shows just what the user needs and nothing more. Once you have everything configured, the form should be tested within a test SharePoint environment before final deployment to production. This validates that you don't have any rules or code that could impact the server negatively.

    Submitted directly to a form library: You will need to know the form library that you will be submitting to when publishing the template. Configure the Submit data connection to connect to this library. There is already one configured in the sample, but it will need to be updated to your environment prior to publishing. The design template is different from the published template. While both have the .XSN extension, the published template contains all the "package" information for the form. The published form is what is loaded into Central Admin, not the design template.

    Browser-based form integrated into a SharePoint team site: In Central Admin, under General Settings, select Manage Form Templates. Upload the published form template and activate it to a site collection. Now it is available as a content type to select in the form library. Some documentation on publishing form templates: Technet – Manage administrator approved form templates.

    And that's all our base requirements. Hope this helps to give a good start.

    Read the article

  • Replace Broadcom "wl" driver with "b43"

    - by Laszlo Boros
    I'm using Ubuntu 10.04.4 LTS, and in my laptop there is a Broadcom BCM4312 wlan card. lspci output:

    04:00.0 Network controller: Broadcom Corporation BCM4312 802.11b/g (rev 01)
        Subsystem: Broadcom Corporation Device 04b5
        Flags: bus master, fast devsel, latency 0, IRQ 18
        Memory at f4500000 (64-bit, non-prefetchable) [size=16K]
        Capabilities: [40] Power Management version 3
        Capabilities: [58] Vendor Specific Information
        Capabilities: [e8] Message Signalled Interrupts: Mask- 64bit+ Queue=0/0 Enable-
        Capabilities: [d0] Express Endpoint, MSI 00
        Capabilities: [100] Advanced Error Reporting
        Capabilities: [13c] Virtual Channel
        Capabilities: [160] Device Serial Number 81-ac-1d-ff-ff-12-54-92
        Capabilities: [16c] Power Budgeting
        Kernel driver in use: wl
        Kernel modules: wl, ssb

    So as you can see, the current (and default) driver is wl, installed with jockey. But I have another Ubuntu-based distribution on my laptop (BackTrack Linux), which is also 10.04, and it has the b43 driver installed and the overall performance is MUCH better. So I would like to install it on this OS too, but even Google didn't help me. So my question is: how do I install the latest b43 driver on my Ubuntu?
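    For reference, the approach commonly described for switching this card to b43 looks roughly like the following (a sketch from memory, not verified on this exact machine; the package names are the ones I believe Lucid shipped, so double-check them before running):

    sudo apt-get remove --purge bcmwl-kernel-source                   # drop the proprietary wl driver
    sudo apt-get install b43-fwcutter firmware-b43-lpphy-installer   # firmware for the BCM4312 LP-PHY
    echo "blacklist wl" | sudo tee -a /etc/modprobe.d/blacklist.conf
    sudo modprobe b43                                                 # or simply reboot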

    Read the article

  • home-folder encryption: Does it work?

    - by jpaugh
    Back when Ubuntu first sported home folder encryption (what, around the time of Jaunty Jackalope?), I opted in. That caused me some grief when I decided to change my login password: I found that I couldn't decrypt my home folder anymore! In trying to fix this, I eventually muddled things to the point that using my old password didn't work anymore, either. That experience has left me very shy of using an encrypted home directory, never mind the performance hit of encryption. Has this feature become more "stable" since it came out? Does it break if you change your login password? Has your [more recent] experience been better? (Does it work in Natty Narwhal?)

    Read the article

  • Virtual Tour through the Oracle Universe

    - by A&C Redaktion
    The new "Oracle Hardware Virtual Tour" for iPhone and iPad is an animated journey of discovery through various Oracle products: you open enclosures, find the different components, and can view them, rotate them, and find out what they are good for. Among the products you can see and explore are Oracle Exadata, SPARC systems, Sun x86 systems, and Sun Blade and Sun Netra systems. They all claim to break performance records, to be easy to handle, to score with high availability, and to save costs. A playful feature, but one that partners can put to profitable use in customer conversations. The 3D apps run on the iPhone 3GS, the iPad 2, and newer devices.

    Read the article

  • Marketing texts for freelance programmers [closed]

    - by chiborg
    I'm a freelance developer and would like to set up a website that describes my services. When trying to come up with texts for the website I got a severe case of writer's block. I know that I'd like to describe what I do (websites, CMS, web-based applications), the different stages of projects (analysis, contract, prototype, testing, improvement, delivery, payment, etc.) and who the target audience is (owners of small to medium businesses). But I have a feeling that there are some rules/tips on how to write such texts and I don't know them - any pointers?

    Read the article

  • Use of Service Bus in a Pub-Sub Engine

    - by JoseK
    In one of our projects, we've built a Publisher-Subscriber engine on Oracle Service Bus. The functionality is that a series of events are published, and subscribers (JMS queues) receive these whenever a new event is published. We are now facing some technical issues, performance-wise, and hence an architectural review is underway. Now for my questions: architecturally, the ESB has to publish events into a DB, read from the DB which users wish to be notified, and then push the event onto their respective queues. There is a high amount of DB interaction, and the question is whether the ESB should have such a high amount of interaction with the DB in the first place, or whether some alternate component should have been responsible for doing this. Alternatively, is there any non-DB approach in which we can store the events and subscribers? Where else can this application data be held within the ESB context?

    Read the article

  • Where to find clients who are willing to pay top dollar for highly reliable code?

    - by Robin Green
    I'm looking to find clients who are willing to pay a premium above usual contractor rates, for software that is developed with advanced tools and techniques to eliminate certain classes of bugs. However, I have little experience of contracting, and relatively few contacts. It's important to state that the kind of tools and techniques I'm thinking of (e.g. formal verification) are used commercially extremely rarely, as far as I'm aware. There is kind of a continuum of approaches to higher reliability, with basic testing and basic static typing at one end and full-blown formal verification at the other, but the methods I'm thinking of are towards the latter end of the spectrum.

    Read the article

  • Updating a database connection password using a script

    - by Tim Dexter
    An interesting customer requirement that I thought was worth sharing today. Thanks to James for the requirement, Bryan for the proposed solution, and me for testing the solution and proving it works :0)

    A customer's implementation of Sarbanes-Oxley requires them to change all database account passwords every 90 days. Today this is scripted, leveraging shell scripts for most of their environments. But how can they manage the BI Publisher connections? Now, the customer is running 11g and therefore using WebLogic on the middle tier, which is the first clue to Bryan's proposed solution. To paraphrase and embellish Bryan's solution a little: why not use a JNDI connection from BIP to the database, then employ the WebLogic scripting engine to make updates to the JNDI data source as needed? BIP is completely uninvolved, and with a little 'timing' users will be completely unaware of the password updates, i.e. change the password when reports are not being executed. Perfect! James immediately tracked down the WLST script that could be used here, http://middlewaremagic.com/weblogic/?p=4261 (thanks Ravish). Now it was just a case of testing the theory. Some steps:

    1. Create the JNDI connection in WLS.
    2. Create the JNDI connection in BI Publisher, pointing to the WLS connection.
    3. Build new data models using, or re-point data sources to use, the JNDI connection.
    4. Create the WLST script to update the WLS JNDI password as needed.
    5. Test!

    Some details. Creating the JNDI connection in WebLogic is pretty straightforward:

    - Log into the console and look for Data Sources under the Services section of the home page and click it.
    - Click New >> Generic Datasource.
    - Give the connection a name. For the JNDI name, prefix it with 'jdbc/', so I have 'jdbc/localdb' - this name is important, you'll need it on the BIP side.
    - Select your db type - this will influence the drivers and information needed on the next page. Being a company man, I'm using an Oracle db. Click Next.
    - Select the driver of choice - there's lots, I know, and you can read about them; I just chose 'Oracle's Driver (Thin) for Instance connections; Versions 9.0.1 and later'. Click Next >> Next.
    - Fill out the db name (SID), server, port, username to connect and password >> Next.
    - Test the config to ensure you can connect >> Next.
    - Now you need to deploy the connection to your BI server: select it and click Next. You're done with the JNDI config.

    Creating the JNDI connection on the Publisher side is covered here. Just remember to use the connection name you created in WLS, e.g. 'jdbc/localdb'. Building the data models: not gonna tell you how to do this, go read the user guide :0) Suffice to say, it works.

    Creating the WLST script requires a little reading around the subject to understand the scripting engine and how to execute scripts; it's nicely covered here. However, with a bit of googlin' I found an even easier way of running the script:

    ${ServerHome}/common/bin/wlst.sh updatepwd.py

    where updatepwd.py is my script file; it can be in another directory. As part of the wlst.sh script your environment is set up for you, so it's very simple to execute. The nitty gritty: you need to take Ravish's script above and create a file with a .py extension. It's going to need some modification, as he explains on the web page, to make it work in your environment. I played around with it for a while but kept running into errors. The script as-is tries to loop through all of your connections and modify the user and passwords for each. Not quite what we are looking for; remember, our requirement is to just update the password for a given connection.
    I also found another issue with the script. WLS 10.x does not allow updates to passwords using clear text (i.e. un-encrypted text) while the server is in production mode. It's a bit much to set it back to development mode, bounce it, change the passwords, bounce again, then change back to production and bounce once more. After lots of messing about I finally came up with the following:

    #############################################################################
    #
    # Update password for JNDI connections
    #
    #############################################################################
    print("*** Trying to Connect.... *****")
    connect('weblogic','welcome1','t3://localhost:7001')
    print("*** Connected *****")
    edit()
    startEdit()
    print ("*** Encrypt the password ***")
    en = encrypt('hr')
    print "Encrypted pwd: ", en
    print ("*** Changing pwd for LocalDB ***")
    dsName = 'LocalDB'
    print 'Changing Password for DataSource ', dsName
    cd('/JDBCSystemResources/'+dsName+'/JDBCResource/'+dsName+'/JDBCDriverParams/'+dsName)
    set('PasswordEncrypted',en)
    save()
    activate()

    It's pretty simple, and you can expand on it to loop through the data sources and change each as needed (see the sketch below). I have hardcoded the password into the file, but you can pass it in as a parameter using the properties file method. I'm not going to get into the detail of that here, but it's covered with an example here. A couple of points to note:

    1. The change to the password requires a server bounce to get the changes picked up. You can add that to the shell script you will use to call the script above.
    2. The script above needs to be run from the MW_HOME\user_projects\domains\bifoundation_domain directory to get the encryption libraries set correctly.

    My command to run the whole script was:

    d:\oracle\bi_mw\wlserver_10.3\common\bin\wlst.cmd updatepwd.py

    where wlst.cmd is the scripting command line and updatepwd.py is my update password script above. I have not quite spoon-fed everything you need to make it a robust script, but at least you know you can do it, and you can work out the rest, I think :0)
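    As a rough illustration of that "loop through the data sources" idea, here is a minimal WLST sketch (the data source names are made-up examples, and it assumes the same connect()/edit()/startEdit() preamble and encrypted value 'en' as the script above):

    dsNames = ['LocalDB', 'SalesDB', 'HRDB']        # hypothetical data source names
    for dsName in dsNames:
        print 'Changing Password for DataSource ', dsName
        cd('/JDBCSystemResources/'+dsName+'/JDBCResource/'+dsName+'/JDBCDriverParams/'+dsName)
        set('PasswordEncrypted',en)                 # same encrypted value from encrypt()
    save()
    activate()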

    Read the article

  • How to Use a 64-bit Web Browser on Windows

    - by Chris Hoffman
    64-bit versions of Windows don’t use 64-bit browsers by default – they’re still in their infancy, although even Adobe Flash now supports 64-bit browsers. Using a 64-bit browser can offer significant performance benefits, according to some benchmarks. This article is for Windows users – 64-bit Linux distributions include 64-bit browsers, so you don’t have to do anything special on Linux.

    Read the article

  • 9/18 Live Webcast: Three Compelling Reasons to Upgrade to Oracle Database 11g

    - by jgelhaus
    Webcast: Three Compelling Reasons to Upgrade to Oracle Database 11g
    Date: Tuesday, September 18, 2012
    Time: 10 a.m. PT / 1 p.m. ET

    If you or your organization is still working with Oracle Database 10g or an even older version, now is the time to upgrade. Oracle Database 11g offers a wide variety of advantages to enhance your operation. Join us for this live Webcast and learn about what you’re missing: the business, operational, and technical benefits. With Oracle Database 11g, you can:
    - Upgrade with zero downtime
    - Improve application performance and database security
    - Reduce the amount of storage required
    - Save time and money
    Register today here.

    Read the article

  • Is it typical for a provider of a web services to also provide client libraries?

    - by HDave
    My company is building a corporate Java web app, and we are leaning towards using GWT-RPC as the client-server protocol for performance reasons. However, in the future we will need to provide an API for other enterprise systems to access our data as well. For this, we were thinking of a SOAP-based web service. In my experience it is common for commercial providers of enterprise web applications to provide client libraries (Java, .NET, C#, etc.). Is this generally the case? I ask because, if so, why bother using SOAP or REST or any standard web services protocol at all? Why not just create client libraries that communicate via GWT-RPC?

    Read the article

  • Incremental file system backups

    - by brunopereira81
    I use VirtualBox a lot for distro / application testing purposes. One of the features I simply love about it is virtual machine snapshots: it saves the state of a virtual machine and can restore it to its former glory if something you did went wrong, without any problems and without consuming all your hard disk space. On my live systems I know how to create a 1:1 image of the file system, but all the solutions I've found will create a new image of the complete file system. Are there any programs / file systems that are capable of taking a snapshot of the current file system and saving it to another location, but instead of making a complete new image each time, create incremental backups? To describe what I want simply: it should be like dd images of a file system, but instead of only full backups it would also create incremental ones. I am not looking for Clonezilla, etc. It should run within the system itself with no (or almost no) intervention from the user, but contain all the data of the file systems.

    Read the article

  • How can I reduce lagging with GUI/GPU stuff -- make Unity run smaller, quicker, faster?

    - by chris
    Finally installed Ubuntu 12.04 on my HP Pavilion 2000. I have all of my apps on and loaded and am happy thus far. ONE ISSUE -- I'm experiencing a small amount of GUI/GPU-style lagging when I go to open menus, move windows, etc. What settings can I disable to allow it to run sharply and quickly, even if it means sacrificing some of the graphics? I have already installed preload. I just want the OS to run sharply and quickly with menu refreshes, window moves, etc.; I do not mind sacrificing graphics. Someone mentioned to me that I have to install video drivers, but the system won't let me install the two that come up in System Settings under drivers. ALSO: I am driving a second 19" monitor -- would that make a difference performance-wise as well? Thanks in advance. Chris

    Read the article

  • What are the appropriate mount options for a shared NTFS partition on an SSD in a dual boot Ubuntu/Windows setup?

    - by Andreas Jonsson
    I have Ubuntu 13.10 and Windows 7 installed in dual boot on a single SSD. In addition they share an NTFS partition where I put all my data and documents. What is the optimal way to mount this NTFS partition in /etc/fstab (considering performance and minimizing wear of the SSD)? Similar questions have been asked, but I could not find answers to this particular scenario. As I understand it, the mount option 'discard' is not supported for NTFS and so should not be used (although it is recommended here). Another often quoted mount option is 'noatime'. I use it for my ext4 partitions. Does it apply to NTFS? My current /etc/fstab line is: UUID=XXXXXXXXXXXXXXXX /dos ntfs defaults,nls=utf8,uid=1000,gid=1000 0 0
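    For illustration, the variant I would try first (an assumption rather than a tested recommendation: noatime is a generic VFS-level option, so it should suppress access-time writes on an ntfs-3g mount just as it does on ext4):

    UUID=XXXXXXXXXXXXXXXX /dos ntfs defaults,noatime,nls=utf8,uid=1000,gid=1000 0 0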

    Read the article

  • Programming language features that help to catch bugs early

    - by Christian Neumanns
    Do you know any programming language features that help to detect bugs early in the software development process - ideally at compile-time or else as early as possible at run-time? Examples of well-known and effective bug-reducing features are:
    - Static typing and generic types: type incompatibility errors are detected by the compiler
    - Design by Contract (TM), also called Contract Programming: invalid values are quickly detected at runtime (through preconditions, postconditions and class invariants)
    - Unit testing
    I ask this question in the context of improving an object-oriented programming language (called Obix) which has been designed from the ground up to 'make it easy to quickly write reliable code'. Besides the features mentioned above this language also incorporates other Fail-fast features such as:
    - Objects are immutable by default
    - Void (null) values are not allowed by default
    The aim is to add more Fail-fast concepts to the language. If you know other features which help to write less error-prone code then please let us know. Thank you.
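    To make the Design by Contract / fail-fast idea concrete, here is a minimal sketch in plain Python (not Obix, and not any particular contract library - just the precondition concept):

    def require(condition, message):
        # Precondition check: fail immediately at the call boundary,
        # instead of letting a bad value corrupt state deeper in the program.
        if not condition:
            raise ValueError("Precondition failed: " + message)

    def withdraw(balance, amount):
        require(amount > 0, "amount must be positive")
        require(amount <= balance, "amount must not exceed balance")
        return balance - amount

    # withdraw(100, -5) raises immediately with a clear message,
    # rather than silently producing a balance of 105.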

    Read the article

  • 2 min video about SQL Compare

    - by CatherineRussell
    It is nice to start blogging again! I am working on a new project in a small company now. We do not have a full-time database admin, so I have to cover multiple roles: gathering requirements, writing docs and creating diagrams, designing the app, writing code, testing, and the DBA role. I am not a DBA, but I have to make day-to-day database changes: adding new columns and tables. Check out the 2 min video about SQL Compare. This tool saves time by automatically comparing and synchronizing database schemas; eliminates mistakes when migrating database changes from dev, to test, to production; speeds up the deployment of new database schema updates; generates T-SQL scripts to update one database to match the schema of another; finds and fixes errors caused by differences between databases; and keeps an accurate history of all previous database records. http://www.red-gate.com/products/SQL_Compare/index.htm

    Read the article

  • 2D Tile Map for Platformer, XML or SQLite?

    - by Stephen Tierney
    I'm developing a 2D platformer with some uni friends. We've based it upon the XNA Platformer Starter Kit, which uses .txt files to store the tile map. While this is simple, it does not give us enough control and flexibility over level design. I'm doing some research into whether to store level data in an XML file or in a database like SQLite. Which would be best for this situation? Does either have any drawbacks (performance, etc.) compared to the other?
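    For a sense of what the XML route might look like, here is a hypothetical (entirely made-up) level layout, not the starter kit's format, carrying both the tile grid and the per-level metadata that a flat .txt file can't easily hold:

    <level name="Forest-1" width="8" height="3" timeLimit="120">
      <layer name="tiles">
        <row>........</row>
        <row>..GG....</row>
        <row>########</row>
      </layer>
      <entities>
        <spawn type="Player" x="1" y="1"/>
        <spawn type="Gem" x="6" y="1"/>
      </entities>
    </level>

    SQLite would hold the same information as rows in tables (levels, tiles, entities), which makes ad-hoc querying easier but adds a runtime dependency; for a handful of hand-authored levels, neither option is likely to be a performance concern.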

    Read the article

  • Using Transaction Logging to Recover Post-Archived Essbase data

    - by Keith Rosenthal
    Data recovery is typically performed by restoring data from an archive.  Data added or removed since the last archive took place can also be recovered by enabling transaction logging in Essbase.  Transaction logging works by writing transactions to a log store.  The information in the log store can then be recovered by replaying the log store entries in sequence since the last archive took place.  The following information is recorded within a transaction log entry:
    - Sequence ID
    - Username
    - Start Time
    - End Time
    - Request Type
    A request type can be one of the following categories:
    - Calculations, including the default calculation as well as both server and client side calculations
    - Data loads, including data imports as well as data loaded using a load rule
    - Data clears as well as outline resets
    - Locking and sending data from SmartView and the Spreadsheet Add-In.  Changes from Planning web forms are also tracked since a lock and send operation occurs during this process.
    You can use the Display Transactions command in the EAS console or the query database MAXL command to view the transaction log entries.

    Enabling Transaction Logging

    Transaction logging can be enabled at the Essbase server, application or database level by adding the TRANSACTIONLOGLOCATION essbase.cfg setting.  The following is the TRANSACTIONLOGLOCATION syntax:

    TRANSACTIONLOGLOCATION [appname [dbname]] LOGLOCATION NATIVE ENABLE | DISABLE

    Note that you can have multiple TRANSACTIONLOGLOCATION entries in the essbase.cfg file.  For example:

    TRANSACTIONLOGLOCATION Hyperion/trlog NATIVE ENABLE
    TRANSACTIONLOGLOCATION Sample Hyperion/trlog NATIVE DISABLE

    The first statement will enable transaction logging for all Essbase applications, and the second statement will disable transaction logging for the Sample application.  As a result, transaction logging will be enabled for all applications except the Sample application. A location on a physical disk other than the disk where ARBORPATH or the disk files reside is recommended to optimize overall Essbase performance.

    Configuring Transaction Log Replay

    Although transaction log entries are stored based on the LOGLOCATION parameter of the TRANSACTIONLOGLOCATION essbase.cfg setting, copies of data load and rules files are stored in the ARBORPATH/app/appname/dbname/Replay directory to optimize the performance of replaying logged transactions.  The default is to archive client data loads, but this configuration setting can be used to archive server data loads (including SQL server data loads) or both client and server data loads. To change the type of data to be archived, add the TRANSACTIONLOGDATALOADARCHIVE configuration setting to the essbase.cfg file.  Note that you can have multiple TRANSACTIONLOGDATALOADARCHIVE entries in the essbase.cfg file to adjust settings for individual applications and databases.

    Replaying the Transaction Log and Transaction Log Security Considerations

    To replay the transactions, use either the Replay Transactions command in the EAS console or the alter database MAXL command using the replay transactions grammar.  Transactions can be replayed either after a specified log time or using a range of transaction sequence IDs. The default when replaying transactions is to use the security settings of the user who originally performed the transaction.  However, if that user no longer exists or that user's username was changed, the replay operation will fail.
    Instead of using the default security setting, add the REPLAYSECURITYOPTION essbase.cfg setting to use the security settings of the administrator who performs the replay operation.  REPLAYSECURITYOPTION 2 will explicitly use the security settings of the administrator performing the replay operation.  REPLAYSECURITYOPTION 3 will use the administrator security settings if the original user’s security settings cannot be used.

    Removing Transaction Logs and Archived Replay Data Load and Rules Files

    Transaction logs and archived replay data load and rules files are not automatically removed and are only removed manually.  Since these files can consume a considerable amount of space, the files should be removed on a periodic basis. The transaction logs should be removed one database at a time instead of all databases simultaneously.  The data load and rules files associated with the replayed transactions should be removed in chronological order from earliest to latest.  In addition, do not remove any data load and rules files with a timestamp later than the timestamp of the most recent archive file.

    Partitioned Database Considerations

    For partitioned databases, partition commands such as synchronization commands cannot be replayed.  When recovering data, the partition changes must be replayed manually and logged transactions must be replayed in the correct chronological order. If the partitioned database includes any @XREF commands in the calc script, the logged transactions must be selectively replayed in the correct chronological order between the source and target databases.

    References

    For additional information, please see the Oracle EPM System Backup and Recovery Guide.  For EPM 11.1.2.2, the link is http://docs.oracle.com/cd/E17236_01/epm.1112/epm_backup_recovery_1112200.pdf
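    As a rough sketch of the MAXL replay grammar described above (an approximation based on that description, with made-up names and ranges; the "after log-time" form works the same way with a timestamp - verify the exact syntax and time format in the Essbase Technical Reference):

    /* Replay logged transactions for Sample.Basic by a range of sequence IDs */
    alter database Sample.Basic replay transactions using sequence_id_range 1 to 20;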

    Read the article

  • Oracle Solaris 11 Developer Webinar Series

    - by Larry Wake
    This coming Tuesday, a new series of webcasts (not to be confused with a series of tubes) is kicking off, aimed at developers. Register today. Next week's session covers IPS and related topics:
    What: Modern Software Packaging for Enterprise Developers
    When: Tuesday, March 27, 9 AM Pacific
    Who: Eric Reid, Oracle Systems ISV Engineering
    We've got several more queued up -- here's the full schedule, with registration links for each one. Or, see the series overview, which includes a link to a "teaser" preview of all the sessions. All sessions are at 9 AM Pacific.
    - Modern Software Packaging for Enterprise Developers - March 27th - Eric Reid (Principal Software Engineer)
    - Simplify Your Development Environment with Zones, ZFS & More - April 10th - Eric Reid (Principal Software Engineer) and Stefan Schneider (Chief Technologist, ISV Engineering)
    - Managing Application Services – Using SMF Manifests in Solaris 11 - April 24th - Matthew Hosanee (Principal Software Engineer)
    - Optimize Your Applications on Oracle Solaris 11: The DTrace Advantage - May 8th - Angelo Rajadurai (Principal Software Engineer)
    - Maximize Application Performance and Reliability on Oracle Solaris 11 - May 22nd - Ikroop Dhillon (Principal Product Manager)
    - Writing Oracle Solaris 11 Device Drivers - June 6th - Bill Knoche (Principal Software Engineer)

    Read the article
