
  • BPM 11.1.1.5 for Apps: BPM for EBS Demo available

    - by JuergenKress
    For access to the Oracle demo systems, please visit OPN and talk to your Partner Expert.

    Demo Highlights - this demo showcases:
    - BPM integration with E-Business Suite
    - BPM Process Spaces, providing role-based dashboards and monitoring of EBS processes
    - Automated workflow generation and enforcement of business rules
    - Seamless integration with the E-Business Suite iExpense module using SOA
    - Worklist approvals via a mobile device

    Related material: Demo Architecture, Demo Collateral, OFM Demos Corner, DSS Offerings, Scheduling Demos on DSS, and DSS Support.

    For regular information on Oracle SOA Suite, become a member of the SOA & BPM Partner Community. For registration, please visit www.oracle.com/goto/emea/soa (OPN account required). If you need support with your account, please contact the Oracle Partner Business Center.

    Read the article

  • Even EAs Have Bad Days - It's Time to Reset

    - by Pat Shepherd
    I saw this article and thought I'd share it because even we EAs have bad days, and the seven points listed are a great way to hit the "reset" button. From Geoffrey James on INC.COM, here are seven ways to change your view of things when, say, you hit a frustration point coordinating stakeholders to agree on an approach (never happens, right?): Positive Thinking: 7 Easy Ways to Improve a Bad Day http://www.inc.com/geoffrey-james/positive-thinking-7-easy-ways-to-improve-a-bad-day.html

    To paraphrase:
    - You can decide (in an instant) to change patterns of the past
    - Believe in (or even visualize) good things happening, and they will
    - Keep a healthy perspective on the work-life / life-life continuum (what things REALLY matter in the big scheme of things)
    - Focus on the good (the laws of positive attraction apply)

    Read the article

  • Trigger Happy

    - by Tim Dexter
    It's been a while, I know; we'll say no more, OK? I'll just write... In the latest BIP 11.1.1.6 release, and if I'm really honest, the release before this (we'll call it dot 5 for brevity), the boys and gals in the engine room have been real busy enhancing BIP with some new functionality. Those of you that use the scheduling engine in OBIEE may already know and use the 'conditional scheduling' feature. This allows you to be more intelligent about what reports get run and sent to folks on a scheduled basis. You create a 'trigger' analysis (answer) that is executed at schedule time prior to the main report. When the schedule rolls around, the trigger is run; if it returns rows, the main report is run and delivered. If no rows are returned, the main report is not run. Useful, right? Your users are not bombarded with 20 reports in their inbox every week that they need to wade through; they get a handful that they know they need to look at. If you use conditional formatting in the report, they can find the anomalous data very quickly and move on with their day. You could even think of OBIEE as a virtual team member, scouring the data on your behalf 24/7 and letting you know when it's found an issue.

    BI Publisher, wanting the team t-shirt and the khaki pants, has followed suit. You can now set up 'triggers' for it to execute before it runs the main report. Just like its big brother, if the scheduled report trigger returns rows of data, it then executes the main report; otherwise, the report is skipped until the next schedule time rolls around. Sound familiar? BIP differs a little, in that you only need to construct a query to act as the trigger rather than a complete report. Let's assume we have a monthly wage-by-department report on a schedule. We only want to send the report to managers if their departmental wages reach or exceed a certain amount. The toughest part about this is coming up with the SQL to test the business rule you want to implement. For my example, it's not that tough:

        select d.department_name, sum(e.salary) as wage_total
        from employees e, departments d
        where d.department_id = e.department_id
        group by d.department_name
        having sum(e.salary) > 230000

    We're looking for departments where the wage cost is greater than 230,000 Dexter Dollars! With a bit of messing about, I found out you can parametrize the query, so users can set a value at schedule time if they need to. Creating the trigger is straightforward enough, and you can create multiple triggers for users to select at schedule time. Notice I also used a parameter in the query, :wamount, with a matching parameter in the tree on the left. You also don't need to return multiple columns; one is fine. The key is whether any rows are returned. You can build the rest of your report as usual.

    At scheduling time, the Schedule tab has a bit more on it. If your users want to set the trigger, they check the Use Trigger box. The page will then pop fields to pick the appropriate trigger, even a trigger on another data model if needed. Note it will also ask for the parameter value associated with the trigger. At this point you should note that the data model does not make a distinction between trigger and data model (extract) parameters, so users will see the parameters on both the General and Schedule tabs. If you need parameters for the trigger only, you can hide them from the report using the Parameters popup in the report designer; just un-check the 'Show' box. I have also tested the opposite case, where you do not want main report parameters seen in the trigger section: BIP handles that for you!

    Once the report hits its allotted schedule time, the trigger is executed. Based on the results, the report will either run or be 'skipped.' Now you have a smarter scheduler that will only deliver reports when folks need to see them and take action on the contents. More official info here for developers and here for users.
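
    If you want to dry-run a trigger query outside BIP before scheduling with it, the rows-or-skip decision is easy to mimic. Here is a small JDBC sketch with placeholder connection details; JDBC uses '?' for the bind where the BIP data model uses :wamount, and this is just an illustration of the decision logic, not a BIP API. It assumes the Oracle JDBC driver is on the classpath.

        import java.sql.Connection;
        import java.sql.DriverManager;
        import java.sql.PreparedStatement;
        import java.sql.ResultSet;

        public class TriggerDryRun {
            public static void main(String[] args) throws Exception {
                // Placeholder connection details -- point at your own HR schema.
                try (Connection conn = DriverManager.getConnection(
                        "jdbc:oracle:thin:@//localhost:1521/ORCL", "hr", "hr")) {
                    String triggerSql =
                        "select d.department_name, sum(e.salary) as wage_total "
                      + "from employees e, departments d "
                      + "where d.department_id = e.department_id "
                      + "group by d.department_name "
                      + "having sum(e.salary) > ?";   // :wamount in the BIP data model
                    try (PreparedStatement ps = conn.prepareStatement(triggerSql)) {
                        ps.setInt(1, 230000);          // value a user would set at schedule time
                        try (ResultSet rs = ps.executeQuery()) {
                            System.out.println(rs.next()
                                ? "Rows returned: the main report would run."
                                : "No rows: the report would be skipped.");
                        }
                    }
                }
            }
        }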

    Read the article

  • Translation and Localization Resources for UX Designers

    - by ultan o'broin
    Here is a handy list of translation- and localization-related resources for user experience professionals. Following these will help you design an easily translatable user experience. Most of the references here are for web pages or software. Fundamentally, remember your designs will be consumed globally, and never divorce the design process from the development or deployment effort that goes into bringing your designs to life in code. Ask yourself today: do you know how the text you are using in your designs is delivered to the customer, even in English?

    Key areas that UX designers always seem to fall foul of, in my space anyway, are:
    - Terminology that is impossible to translate (jargon, multiple modifiers, gerunds) or is used inconsistently
    - Poorly written, verbose text (really, just write well in English; no special considerations)
    - String construction (concatenation of parts assembled dynamically)
    - Composite widget positioning (my favourite)
    - Hard-coded fonts, small font sizes, or character formatting or casing that doesn't work globally
    - Format that is not separate from content
    - Restricted real estate not allowing for text expansion in translation
    - Forcing formatting with breaks, and hard-coding alphabetical sorting
    - Graphics that do not work in Bi-Di languages (because they indicate directionality and can't flip) or contain embedded text. The problems of culturally offensive icons are well known by now in the enterprise applications space, though there are some dangers, such as the use of flags to indicate language, for example.

    Resources:
    - Internationalization Techniques: Authoring HTML & CSS
    - Global By Design
    - Insert Title Here: Variables in Interface Language
    - Prose: Internationalisation

    Doc and help considerations I can deal with later.

    Read the article

  • SOA Suite 11g Database Growth Management

    - by JuergenKress
    The whitepaper "Oracle SOA Suite 11g Database Growth Management" has been written to highlight the need to implement an appropriate strategy to manage the growth of the SOA 11g database. The advice presented should facilitate better dialog between SOA and database administrators when planning database and host requirements.

    Whitepaper: Oracle SOA Suite 11g Database Growth Management
    Advisor Webcast: "Oracle SOA Suite 11g Database Growth Management", April 11th, 2012
    Author: Michael Bousamra
    Contributing Authors: Deepak Arora, Sai Sudarsan Pogaru

    For regular information on Oracle SOA Suite, become a member of the SOA Partner Community. For registration, please visit www.oracle.com/goto/emea/soa (OPN account required). If you need support with your account, please contact the Oracle Partner Business Center.

    Read the article

  • Conky to Monitor WLS Managed Servers

    - by John Graves
    I've been using a little utility on my Linux-based machines for years called conky. It can be used to monitor system resources, but I wanted to modify it to monitor my WebLogic managed servers too. Once conky is installed, you'll need to update the .conkyrc file. Here is a simple example. Basically, the important lines are these:

        - Admin (7001) ${if_empty ${exec /usr/sbin/lsof -i :7001 | grep LISTEN}}${color red}DOWN${color} ${else}${color green} UP ${color}(${tcp_portmon 7001 7001 count}) ${endif}
        - OSB (8011) ${if_empty ${exec /usr/sbin/lsof -i :8011 | grep LISTEN}}${color red}DOWN${color} ${else}${color green} UP ${color}(${tcp_portmon 8011 8011 count}) ${endif}
        - BAM (9001) ${if_empty ${exec /usr/sbin/lsof -i :9001 | grep LISTEN}}${color red}DOWN${color} ${else}${color green} UP ${color}(${tcp_portmon 9001 9001 count}) ${endif}
        - DB (1521) ${if_empty ${exec /usr/sbin/lsof -i :1521 | grep LISTEN}}${color red}DOWN${color} ${else}${color green} UP ${color}(${tcp_portmon 1521 1521 count}) ${endif}

    It uses lsof to find out if the ports are in use: if nothing is listening on a port, the ${if_empty} branch prints DOWN in red; otherwise it prints UP in green, along with a connection count from ${tcp_portmon}. Here is a video showing it in action.
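
    If you want the same up/down check outside conky, say from a cron job or a test harness, the lsof test boils down to "is anything listening on the port". A throwaway Java sketch of that idea; the host and port list are assumptions taken from the example above:

        import java.net.InetSocketAddress;
        import java.net.Socket;

        public class PortCheck {
            public static void main(String[] args) {
                int[] ports = {7001, 8011, 9001, 1521}; // Admin, OSB, BAM, DB from the example
                for (int port : ports) {
                    try (Socket s = new Socket()) {
                        // A successful connect means something is listening,
                        // like the lsof LISTEN check in the conky config.
                        s.connect(new InetSocketAddress("localhost", port), 1000);
                        System.out.println(port + " UP");
                    } catch (Exception e) {
                        System.out.println(port + " DOWN");
                    }
                }
            }
        }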

    Read the article

  • Latest Version Available for Download: APEX 4.2 Is Here!

    - by britta wolf
    APEX 4.2 has been available for download since October 12, 2012. Install it quickly... and then try out the new features right away, for example the simple, declarative creation of APEX applications for mobile devices, or HTML5 charts. There are further new features beyond that as well: the Excel upload for end users has been improved, for instance, and you can now place 200 (instead of 100) items on a page.

    Read the article

  • ISE (Germany) and Techdata Azlan: Exadata win over IBM at Immonet

    - by Javier Puerta
    Immonet, a subsidiary of German media company Axel Springer, provides cross-media real estate marketing via the Internet, newspapers, and other channels. The Immonet.de website is the number two German property portal, with approximately 1.8 million unique visitors per month and more than 950,000 current online offerings. Read here how ISE used Exadata to solve the performance problems that Immonet was experiencing.

    Read the article

  • Automatic Storage Management (ASM)

    - by jean-marc.gaudron(at)oracle.com
    Master Note for Automatic Storage Management (ASM) (Doc ID 1187723.1): This Master Note is intended to provide an index and references to the most frequently used My Oracle Support Notes with respect to Oracle Automatic Storage Management (ASM) environments. The note is subdivided into categories to allow for easy access and reference to notes that are applicable to your area of interest. It includes the following categories:
    - Automatic Storage Management (ASM) Concepts and Overview
    - Automatic Storage Management (ASM) Installation
    - Automatic Storage Management (ASM) Configuration
    - Automatic Storage Management (ASM) Administration
    - Automatic Storage Management (ASM) Migration and Upgrade
    - Automatic Storage Management (ASM) Monitoring
    - Automatic Storage Management (ASM) Troubleshooting and Debugging
    - Automatic Storage Management (ASM) Best Practices
    - Automatic Storage Management (ASM) Versions and Patches
    - ASMLIB
    - Database Machine, Exadata Storage Server and RAC
    - Documentation
    - Using My Oracle Support Effectively

    Read the article

  • ArchBeat Link-o-Rama for 2012-07-10

    - by Bob Rhubart
    Free Event Today: Virtual Developer Day: Oracle Fusion Development
    This free event—another in the ongoing series of OTN Virtual Developer Days—focuses on Oracle Fusion development, and features three session tracks plus hands-on labs. Agenda and session abstracts are available now so you can be ready for the live event when it kicks off today, July 10, 9am to 1pm PST / 12pm to 4pm EST / 1pm to 5pm BRT.

    Podcast: The Role of the Cloud Architect - Part 1/3
    In part one of this three-part conversation, cloud architects Ron Batra (AT&T) and James Baty (Oracle) talk about how cloud computing is driving the supply-chaining of IT and the "democratization of the activity of architecture."

    Middleware and Cloud Computing Book | Tom Laszewski
    Cloud migration expert Tom Laszewski describes Middleware and Cloud Computing by Frank Munz as "one of only a couple books that really discuss AWS and Oracle in depth."

    Cloud computing moves from fad to foundation | David Linthicum
    "When enterprises make cloud computing work, they view the application of the technology as a trade secret of sorts, so there are no press releases or white papers," says David Linthicum. "Indeed, if you see one presentation around a successful cloud computing case study, you can bet you're not hearing about 100 more."

    Oracle Real-Time Decisions: Combined Likelihood Models | Lukas Vermeer
    Lukas Vermeer concludes his extensive series of posts on decision models with a look at "an advanced approach to amalgamate models, taking us to a whole new level of predictive modeling and analytical insights; combination models predicting likelihoods using multiple child models."

    Running Oracle BPM 11g PS5 Worklist Task Flow and Human Task Form on Non-SOA Domain | Andrejus Baranovskis
    "With a standard setup, both the BPM worklist application and the Human task form run on the same SOA domain, where the BPM process is running," says Oracle ACE Director Andrejus Baranovskis. "While this works fine, this is not what we want in the development, test and production environment."

    BAM design pointers | Kavitha Srinivasan
    "When using EMS (Enterprise Message Source) as a BAM feed, the best practice is to use one EMS to write to one Data Object," says Oracle Fusion Middleware A-Team blogger Kavitha Srinivasan. "There is a possibility of collisions and duplicates when multiple EMS write to the same row of a DO at the same time."

    Changes in SOA Human Task Flow (Run-Time) for Fusion Applications | Jack Desai
    Oracle Fusion Middleware A-Team blogger Jack Desai shares a troubleshooting tip.

    Thought for the Day
    "A program which perfectly meets a lousy specification is a lousy program." — Cem Kaner
    Source: SoftwareQuotes.com

    Read the article

  • SOA online seminar by Griffiths Waite – adopt Fusion Applications patterns today

    - by Jürgen Kress
    Our SOA Specialized partner Griffiths Waite has developed a series of Oracle Fusion Middleware online seminars. Mark Simpson, Oracle ACE Director, gives an insight into the Oracle strategy: how Oracle is using Fusion Middleware to build Fusion Applications, and how you can profit from the Fusion Architecture in your own project, with examples of how customers can adopt use cases for Application Integration, Composite Application Portals, Application Modernization, and Business Process Management. If you are interested, make sure you watch the online seminar and take the SOA Maturity Assessment. For more information on SOA Specialization and the SOA Partner Community, please feel free to register at www.oracle.com/goto/emea/soa (OPN account required).

    Read the article

  • Virtualization and best hardware sharing scenario for me

    - by azera
    Hello, following this thread on Super User, I now want to start installing all my VMs on the hardware. As a reminder, I have a (powerful enough) server on which I want to install 3 OSes: a Debian (general dev testbed purposes), an IPCop (network control/firewall) and a FreeNAS (local network file sharing). I'm wondering which scenario would be best for me and whether I will be able to share the hardware to do what I want; either:
    a) install a hypervisor like the free VMware ESX and run all three VMs in it, or
    b) install Debian, with the other two running inside it under VirtualBox.
    My needs are:
    - the IPCop should handle all network traffic to the internet, meaning all traffic from my main computer but also all traffic from the other two VMs
    - the FreeNAS shares should be accessible from the other two VMs and from my main computer too
    - I don't really care about the Debian access; I only need to reach it from my main computer, not from the other VMs
    Will I need to install additional network cards for each VM, or can they all share the same one happily? (Right now I have two: one linking the server to my router [which only IPCop is going to use] and one linking it to my switch [which I would like all three to use].) As for hard drives, I was going to use one hard drive cut into 3 partitions to install all three OSes, then add the FreeNAS drives to that; is that correct? Thanks a lot to anyone who can help me; this is kind of a vast area and I'm not sure which way to go at all.

    Read the article

  • Tips On Using The Service Contracts Import Program

    - by LuciaC
    Prior to release 12.1 there was no supported way to import contracts into the EBS Service Contracts application - there were no public APIs nor contract load programs provided. From release 12.1 onwards, the 'Service Contracts Import Program' is provided to load service contracts into the application. The Service Contracts Import functionality is explained in How to Use the Service Contracts Import Program - Scope and Limitations (Doc ID 1057242.1). This note includes an attached document which explains the program architecture, shows the Entity Relationship Diagram and details the interface table definitions.

    The Import program takes data from the interface tables listed below and populates the contracts schema tables:
    - OKS_USAGE_COUNTERS_INTERFACE
    - OKS_SALES_CREDITS_INTERFACE
    - OKS_NOTES_INTERFACE
    - OKS_LINES_INTERFACE
    - OKS_HEADERS_INTERFACE
    - OKS_COVERED_LEVELS_INTERFACE

    These interface tables must be loaded via a custom load program (see the sketch below). The Service Contracts Import concurrent request is then submitted to create contracts from this legacy data. The parameters to run the Import program are:
    - Mode: Validate only, or Import
    - Batch Number: Batch_Id (unique id populated into the OKS_HEADERS_INTERFACE table)
    - Number of Workers: the number of workers required (these are spawned as separate sub-requests)
    - Commit size: the number of successfully processed contracts committed to the database at a time

    The program spawns sub-requests for the import worker(s) and the 'Service Contracts Import Report'. The data is validated prior to import into the Contracts tables, and errors are reported in the Service Contracts Import Report program output file (Import Execution Report). Troubleshooting tips are provided in R12.1 - Common Service Contract Import Errors (Doc ID 762545.1); this document lists some, but not all, import errors and will be updated over time. Additional help is given in Debugging Tip for Service Contracts Import Errors (Doc ID 971426.1).

    After you successfully import contracts, you can purge the records from the interface tables by running the Service Contracts Import Purge concurrent program. Note that there is no supported way to mass delete data from the Contracts schema tables once they are populated, so data loaded by the Import program must be fully tested and verified before the program is run to load data into a Production system.

    A Service Contracts Import Test program has been provided which will take an existing contract in the application and load the interface tables using the data from that contract. This can be used as an example for guidance on how to load the interface tables. The Test program functionality is explained in How to Use the Service Contracts Test Import Program Provided in Release 12.1 (Doc ID 761209.1). Note that the Test program has some limitations which do not apply to the full Import program and is not a supported program; it is simply a testing tool.
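
    For illustration only, a custom load program can be as small as a JDBC batch insert into the interface tables. The connection details and column names below are hypothetical placeholders; take the real column list from the interface table definitions attached to Doc ID 1057242.1.

        import java.sql.Connection;
        import java.sql.DriverManager;
        import java.sql.PreparedStatement;

        public class LoadContractInterface {
            public static void main(String[] args) throws Exception {
                // Placeholder connection details for the EBS database.
                try (Connection conn = DriverManager.getConnection(
                        "jdbc:oracle:thin:@//ebs-db:1521/PROD", "apps", "password")) {
                    conn.setAutoCommit(false);
                    // Hypothetical columns -- check the Doc ID 1057242.1 attachment
                    // for the real interface table definitions.
                    String sql = "insert into oks_headers_interface "
                               + "(batch_id, contract_number, status_code) values (?, ?, ?)";
                    try (PreparedStatement ps = conn.prepareStatement(sql)) {
                        ps.setInt(1, 1001);              // Batch_Id: passed later as the Batch Number parameter
                        ps.setString(2, "LEGACY-00001"); // legacy contract being migrated
                        ps.setString(3, "ENTERED");
                        ps.addBatch();
                        ps.executeBatch();
                    }
                    conn.commit(); // then submit the Import request with Batch Number = 1001
                }
            }
        }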

    Read the article

  • Retail CEO Interviews

    - by David Dorf
    Businessweek's 2012 Interview Issue has interviews with three retail CEOs that are worth a quick read. I copied some excerpts below, but please follow the links to the entire interviews.

    Ron Johnson, CEO, JCPenney
    Take me through your merchandising. One of the things I learned from Steve [Jobs]—Steve said three times in his life he had the chance to be part of the change of an interface. If you change the interface, you can dramatically change the entire experience of the product. For Steve, that was the mouse, the scroll wheel on the iPod, and then the [touch]screen. What we’re trying to do here is change the interface of retail. What we call that is the street, and you’re standing in the middle of it. When you walk into a store today, you’re overwhelmed by merchandise. There is a narrow aisle. Typically, it’s filled with product on tables and you’re overwhelmed with the noise of signs and promotions. Especially in the age of the Internet, the idea of going to a very large store and having so much abundance is actually not very appealing. The first thing you find here is you’re inspired. I have used the mannequins. The street is actually this new navigation path for a retail store. So if you come in here—you’ll notice that these aisles are 14 feet wide. These are wider than Nordstrom’s (JWN). Slide show of JCPenney store.

    Walter Robb, co-CEO, Whole Foods
    What did you learn from the recent recession about selling groceries? It was a lot of humble pie, because our sales experienced a drop that I have never seen in 32 years of retail. Customers left us in droves. We also learned that there were some very loyal customers who loved Whole Foods (WFM), people who said, “I like what you stand for. I like coming here. I like this experience.” That was very affirming. I think the realization was that we’ve got some customers, and we need to make sure we know who they are. So instead of chasing every customer out there, we started doing customer discussion groups. We were growing for growth’s sake, which is not a good strategy. We were chasing the rainbow. We cut the growth in half overnight and said, “All right, slow down. Let’s make sure we’re doing this better and more thoroughly and more thoughtfully.” This company is a mission-based company. This company started to change the world by bringing healthier food to the world. It’s not about the money, it’s about the impact, and this company is back on track as a result of those experiences. Video of Whole Foods store tour.

    Kay Krill, CEO, Ann Taylor
    You’ve worked in retail all your life. What drew you to it? I graduated from college, and I did not know what I wanted to do. Macy’s (M) came to campus to interview for their training program, and I thought, “Let me give it a try.” I got the job and fell in love with the industry. The president of Macy’s at the time said, “If you don’t wake up every morning dying to go to work, then retailing is not for you; it has to be in your blood.” It was in my blood. I love the fact that every day is different. You can get to be creative one day, financial the next day, marketing the next. I love going to stores. I love talking to associates. I love talking to clients. There’s not a predictable day.

    Read the article

  • Highlights from recent Yammer video

    - by Eric Jensen
    A few weeks back, Ryan Kennedy of Yammer gave a talk about Berkeley DB Java Edition. You can find it posted here on Alex Popescu's Blog, or go directly to the video post itself. It was full of useful nuggets of information, such as why they chose to use BDB JE, performance, and some tips & tricks at the end. At over 40 minutes, the video is quite long. Ryan is an entertaining speaker, so I suggest you watch all of it. But if you only have time for the highlights, here are some times you can sync to:

    - 06:18 - the Berkeley DB JE features that led Yammer to select it, including replication, auto leader election and failover, and configurable durability and consistency guarantees
    - 23:10 - system performance characteristics
    - 35:08 - tips and tricks for using Berkeley DB JE

    I know the Berkeley DB development team is very pleased that BDB JE is working out well for Yammer. We definitely encourage others out there to take note of this success, especially if your requirements are similar to Yammer's (which Ryan outlines at the beginning of his talk).
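
    If you have never tried BDB JE, the core API is small; here is a minimal open-and-put sketch under stated assumptions (the directory and database names are arbitrary, and the replication/HA features Ryan discusses use ReplicatedEnvironment rather than the plain Environment shown here):

        import java.io.File;

        import com.sleepycat.je.Database;
        import com.sleepycat.je.DatabaseConfig;
        import com.sleepycat.je.DatabaseEntry;
        import com.sleepycat.je.Environment;
        import com.sleepycat.je.EnvironmentConfig;

        public class BdbJeHello {
            public static void main(String[] args) throws Exception {
                File dir = new File("bdb-env");
                dir.mkdirs(); // the environment directory must exist

                EnvironmentConfig envConfig = new EnvironmentConfig();
                envConfig.setAllowCreate(true);
                envConfig.setTransactional(true); // durability/consistency knobs hang off this config
                Environment env = new Environment(dir, envConfig);

                DatabaseConfig dbConfig = new DatabaseConfig();
                dbConfig.setAllowCreate(true);
                dbConfig.setTransactional(true);
                Database db = env.openDatabase(null, "messages", dbConfig);

                // Keys and values are plain byte arrays wrapped in DatabaseEntry.
                db.put(null, new DatabaseEntry("msg:1".getBytes()),
                             new DatabaseEntry("hello yammer".getBytes()));

                db.close();
                env.close();
            }
        }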

    Read the article

  • Barcodes and Bugs

    - by Tim Dexter
    A great mail from Mike at Browning last week. He has been through the wringer getting his BIP barcoding sorted out, but he's now out of the woods. Here's the final result. By way of explanation, an excerpt from Mike's email:

    "This is an example of the GS1-128 carton shipping labels we are now producing with BIP in our web application for our vendors who drop ship products to our dealers. It produces 4 labels per printed page, in PDF format, on peel & stick label paper. Each label has a unique carton number, and a unique carton serial number in the SSCC-18 barcode. This example is for Cabelas (each customer has slightly different GS1-128 label format requirements – a custom template for each - a pain!). I am using custom java encoders I wrote for the UPC and SSCC-18 barcodes, and a standard encoder (code128b) for the ShipTo zip barcode. Is there any way yet to get around that SUPER ANNOYING bug when opening the rtf template in MS Word, where it replaces my xsl code text in the barcode fields with gibberish??? Every time I open it I have to re-enter all the xsl code. Not only to be able to read & edit it, but also to get it to work in BIP (BIP doesn't like the gibberish if I upload the template that has it)."

    Mike's last point, regarding the annoying bug in the template builder, is one that I have experienced occasionally. The development team have looked at it and found it to be an issue with MS Word and not a plugin problem. That's all well and good, but how can you get around it? Well, you can take advantage of the font mapping that BIP offers to get the barcodes into the PDF output. As many of you know, to get a barcode font to appear in the PDF output you need to employ the xdo.cfg file in the template builder config directory. You would normally have an entry such as this to map a barcode font so it renders in the PDF output when testing from the template builder plugin:

        <font family="Code 128" style="normal" weight="normal">
          <truetype path="C:\windows\fonts\128R00.TTF" />
        </font>

    Mike's issue is only present when the form field is highlighted with a barcode font; the other fields in the template are OK. What you can do to get around the issue is bend the config entry so you avoid using the barcode font in the template at all. Change the entry to something like:

        <font family="Calibri" style="normal" weight="normal">
          <truetype path="C:\windows\fonts\128R00.TTF" />
        </font>

    Note that we are mapping Calibri, a humanly readable and non-'erroring' font in the template, to the Code 128 barcode font. Where you used to highlight the field with the barcode font in MS Word, you now use the Calibri font instead. At run time, BIP will look up the Calibri font mapping and drop in the Code 128 font. Of course, Calibri is just an example; you need to pick a font that you are not going to use anywhere else in the layout.
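
    Mike's UPC and SSCC-18 encoders are his own code, but for anyone rolling their own, the SSCC-18 check digit is just the standard GS1 mod-10 scheme: weight the 17 data digits 3, 1, 3, 1, ... starting from the rightmost. A quick sketch; the 17 data digits in the example are made up:

        public class Sscc18CheckDigit {
            // Standard GS1 mod-10 check digit: the rightmost data digit gets
            // weight 3, then the weights alternate 3, 1, 3, 1, ... moving left.
            static int checkDigit(String digits17) {
                int sum = 0;
                for (int i = digits17.length() - 1, w = 3; i >= 0; i--, w = 4 - w) {
                    sum += (digits17.charAt(i) - '0') * w;
                }
                return (10 - sum % 10) % 10;
            }

            public static void main(String[] args) {
                String data = "00340123450000001"; // 17 data digits (extension digit + GS1 prefix + serial), made up
                System.out.println(data + checkDigit(data)); // prints the full 18-digit SSCC
            }
        }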

    Read the article

  • The Workshop Technique Handbook

    - by llowitz
    The #OUM method pack contains a plethora of information, but if you browse through the activities and tasks contained in OUM, you will see very few references to workshops. Consequently, I am often asked whether OUM supports a workshop-type approach. In general, OUM does not prescribe the manner in which tasks should be conducted, as many factors, such as culture, availability of resources, and the potential travel cost of attendees, can influence whether a workshop is appropriate in a given situation. Although workshops are not typically called out in OUM, OUM encourages the project manager to group the OUM tasks in a way that makes sense for the project.

    OUM considers a workshop to be a technique that can be applied to any OUM task or group of tasks. If a workshop is conducted, it is important to identify the OUM tasks that are executed during the workshop. For example, a "Requirements Gathering Workshop" is quite likely to include Gathering Business Requirements, Gathering Solution Requirements and perhaps Specifying Key Structure Definitions.

    Not only is a workshop approach to conducting the OUM tasks perfectly acceptable, OUM provides in-depth guidance on how to maximize the value of your workshops. I strongly encourage you to read the "Workshop Techniques Handbook" included in the OUM Manage Focus Area under Method Resources. The Workshop Techniques Handbook provides valuable information on a variety of workshop approaches and discusses the circumstances in which each type of workshop is most effective. Furthermore, it provides detailed information on how to structure a workshop and tips on facilitating it. You will find guidance on some popular workshop techniques such as brainstorming, setting objectives, and prioritizing, as well as more specialized techniques such as Value Chain Analysis, SWOT analysis, the Delphi Technique and much more.

    Workshops can and should be applied to any type of OUM project, whether that project falls within the Envision, Manage or Implementation focus areas. If you typically employ workshops to gather information, walk through a business process, develop a roadmap or validate your understanding with the client, by all means continue utilizing them to conduct the OUM tasks during your project, but first take the time to review the Workshop Techniques Handbook to refresh your knowledge and hone your skills.

    Read the article

  • WNA Configuration in OAM 11g

    - by P Patra
    Pre-Requisite: A Kerberos authentication scheme has to exist. This is usually a pre-configured OAM authentication scheme. It should have Authentication Level "2", Challenge Method "WNA", Challenge Direct URL "/oam/server" and Authentication Module "Kerberos". The default authentication scheme name is "KerberosScheme"; this name can be changed. The DNS name has to be resolvable on the OAM server, and the DNS names with referrals to AD have to be resolvable on the OAM server as well. Ensure nslookup works for the referrals.

    Pre-Install: The AD team produces the keytab file on the AD server by running the ktpass command. Provide the OAM hostname to the AD team, and receive from them the keytab file produced when running the ktpass command, plus the ktpass username and password. Copy the keytab file to a convenient location in the OAM install tree and rename the file if desired, for instance where the oam-policy.xml file resides, i.e. /fa_gai2_d/idm/admin/domains/idm-admin/IDMDomain/config/fmwconfig/keytab.kt

    Configure WNA Authentication on the OAM Server: Create a config file krb.conf and set the environment variable to the path to this file:

        KRB_CONFIG=/fa_gai2_d/idm/admin/domains/idm-admin/IDMDomain/config/fmwconfig/krb.conf

    The variable KRB_CONFIG has to be set in the profile of the user that the OAM java container (i.e. WebLogic Server) runs as, for example the "applmgr" user, so that this setting is available to the OAM server. In the krb.conf file specify:

        [libdefaults]
        default_realm = NOA.ABC.COM
        dns_lookup_realm = true
        dns_lookup_kdc = true
        ticket_lifetime = 24h
        forwardable = yes

        [realms]
        NOA.ABC.COM = {
          kdc = hub21.noa.abc.com:88
          admin_server = hub21.noa.abc.com:749
          default_domain = NOA.ABC.COM
        }

        [domain_realm]
        .abc.com = ABC.COM
        abc.com = ABC.COM
        .noa.abc.com = NOA.ABC.COM
        noa.abc.com = NOA.ABC.COM

    where hub21.noa.abc.com is the load-balanced DNS VIP name for the AD server and NOA.ABC.COM is the name of the domain.

    Create an authentication policy to WNA-protect the resource (i.e. EBSR12) and choose "KerberosScheme" as the authentication scheme. Log in to the OAM Console => Policy Configuration Tab => Browse Tab => Shared Components => Application Domains => IAM Suite => Authentication Policies => Create:

        Name: ABC WNA Auth Policy
        Authentication Scheme: KerberosScheme
        Failure URL: http://hcm.noa.abc.com/cgi-bin/welcome

    Edit the system configuration for Kerberos: System Configuration Tab => Access Manager Settings => expand Authentication Modules => expand Kerberos Authentication Module => double click on Kerberos, then:
    - Edit the "Key Tab File" textbox: /fa_gai2_d/idm/admin/domains/idm-admin/IDMDomain/config/fmwconfig/keytab.kt
    - Edit the "Principal" textbox: HTTP/[email protected]
    - Edit the "KRB Config File" textbox: /fa_gai2_d/idm/admin/domains/idm-admin/IDMDomain/config/fmwconfig/krb.conf
    - Click "Apply"

    In the script setting the environment for the WLS server where OAM is deployed, set the variable:

        KRB_CONFIG=/fa_gai2_d/idm/admin/domains/idm-admin/IDMDomain/config/fmwconfig/krb.conf

    Then restart the OAM server and the OAM server container (WebLogic Server).
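
    Not part of the OAM setup itself, but before wiring OAM up it can save time to verify that the keytab, principal and krb.conf agree, using a tiny JAAS login test with the JDK's Krb5LoginModule. A hedged sketch: the jaas.conf contents, paths and principal below are placeholders patterned on the example above.

        // Sanity-check the keytab/principal/krb.conf outside OAM (illustrative only).
        // Run with a jaas.conf like the one below, plus these JVM flags:
        //   java -Djava.security.krb5.conf=/path/to/krb.conf \
        //        -Djava.security.auth.login.config=jaas.conf KeytabCheck
        //
        //   KeytabCheck {
        //     com.sun.security.auth.module.Krb5LoginModule required
        //       useKeyTab=true
        //       keyTab="/fa_gai2_d/idm/admin/domains/idm-admin/IDMDomain/config/fmwconfig/keytab.kt"
        //       principal="HTTP/your-oam-host@NOA.ABC.COM"
        //       storeKey=true
        //       doNotPrompt=true;
        //   };

        import javax.security.auth.login.LoginContext;

        public class KeytabCheck {
            public static void main(String[] args) throws Exception {
                LoginContext lc = new LoginContext("KeytabCheck"); // entry name from jaas.conf
                lc.login(); // throws LoginException if the keytab, principal or KDC don't line up
                System.out.println("Authenticated: " + lc.getSubject().getPrincipals());
            }
        }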

    Read the article

  • Security Controls on data for P6 Analytics

    - by Jeffrey McDaniel
    The Star database and P6 Analytics calculate security based on P6 security, using OBS, global, project, cost, and resource security considerations. If there is some concern that users are not seeing expected data in P6 Analytics, here are some areas to review:

    1. Cost security is determined by the project-level security privileges - either View Project Costs/Financials or Edit EPS Financials. If a user expects to see costs, make sure one of these privileges is allocated.
    2. The user must have OBS access on a project, not at WBS level. WBS-level security is not supported, so make sure the user has OBS access at the project level.
    3. Resource access is determined by what is granted in P6; verify the resource access granted to the user there. Resource security is hierarchical, and project access will override resource access based on the way security policies are applied.
    4. Module access must be given to a P6 user for that user to come over into Star/P6 Analytics. Earlier versions of RDB had a report_user_flag on the Users table; this flag field is no longer used after P6 Reporting Database 2.1.
    5. For P6 Reporting Database versions 2.2 and higher, the Extended Schema Security service must be run to calculate all security. After any changes to privileges or security, this service must be rerun before any ETL.
    6. In P6 Analytics 2.0 or higher, a WebLogic user must exist that matches the P6 username. For example, user Tim must exist in both P6 and WebLogic for Tim to be able to log into P6 Analytics and access data based on P6 security. In earlier versions the username needed to exist in the RPD.
    7. Cache in OBI is another area that can make it seem a user isn't seeing the data they expect. While cache can be beneficial for performance in OBI, if it is outdated it can return older, stale data. Clearing or turning off cache when rerunning a query can determine whether the returned result set came from cache or from the database.

    Read the article

  • A Method for Reducing Contention and Overhead in Worker Queues for Multithreaded Java Applications

    - by Janice J. Heiss
    A java.net article, rich in practical resources, by IBM India Labs' Sathiskumar Palaniappan, Kavitha Varadarajan, and Jayashree Viswanathan, explores the challenge of writing code in a way that effectively makes use of the resources of modern multicore processors and multiprocessor servers. As the article states: "Many server applications, such as Web servers, application servers, database servers, file servers, and mail servers, maintain worker queues and thread pools to handle large numbers of short tasks that arrive from remote sources. In general, a 'worker queue' holds all the short tasks that need to be executed, and the threads in the thread pool retrieve the tasks from the worker queue and complete the tasks. Since multiple threads act on the worker queue, adding tasks to and deleting tasks from the worker queue needs to be synchronized, which introduces contention in the worker queue." The article goes on to explain ways that developers can reduce contention by maintaining one queue per thread. It also demonstrates a work-stealing technique that helps in effectively utilizing the CPU in multicore systems. Read the rest of the article here.
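
    For a concrete feel of the one-queue-per-thread idea, the JDK's own ForkJoinPool (Java 7+) implements exactly this work-stealing design: each worker thread has its own deque of tasks, and idle workers steal from the others instead of contending on a single shared queue. A minimal sketch, not code from the article:

        import java.util.concurrent.ForkJoinPool;
        import java.util.concurrent.RecursiveTask;

        public class WorkStealingDemo {
            // Sums a range by splitting it; forked subtasks go on the submitting
            // worker's own deque, and idle workers steal from the far end --
            // there is no single shared worker queue to contend on.
            static class SumTask extends RecursiveTask<Long> {
                final long lo, hi;
                SumTask(long lo, long hi) { this.lo = lo; this.hi = hi; }

                @Override
                protected Long compute() {
                    if (hi - lo <= 10_000) {            // small enough: compute directly
                        long sum = 0;
                        for (long i = lo; i < hi; i++) sum += i;
                        return sum;
                    }
                    long mid = (lo + hi) >>> 1;
                    SumTask left = new SumTask(lo, mid);
                    left.fork();                         // push onto this worker's deque
                    long right = new SumTask(mid, hi).compute();
                    return right + left.join();          // join lets other workers steal meanwhile
                }
            }

            public static void main(String[] args) {
                ForkJoinPool pool = new ForkJoinPool(); // defaults to one worker per core
                long total = pool.invoke(new SumTask(0, 10_000_000L));
                System.out.println("sum = " + total);
            }
        }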

    Read the article

  • ODI 11g - Dynamic and Flexible Code Generation

    - by David Allan
    ODI supports conditional branching at execution time in its code generation framework. This is a little used, little known, but very powerful capability - it lets one piece of template code behave dynamically based on, for example, a runtime variable's value. Generally, knowledge modules are free of any variable dependency; using variables within a knowledge module for this kind of dynamic capability is a valid use case, definitely in the highly specialized area. The example I will illustrate is much simpler - how to define a filter (based on a mapping here) that may or may not be included depending on whether a certain value is defined for a variable at runtime. I define a variable V_COND; if I set this variable's value to 1, then I will include the filter condition 'EMP.SAL > 1', otherwise I will just use '1=1' as the filter condition.

    I use ODI's substitution tags with a special tag, '<$', which is processed just prior to execution in the runtime code - this code is included in the ODI scenario code and is processed after variables are substituted (unlike the '<?' tag). So the lines below are not equal:

        <$ if ( "#V_COND".equals("1")  ) { $> EMP.SAL > 1 <$ } else { $> 1 = 1 <$ } $>

        <? if ( "#V_COND".equals("1")  ) { ?> EMP.SAL > 1 <? } else { ?> 1 = 1 <? } ?>

    When the '<?' code is evaluated, the code is executed without variable substitution (the "#V_COND" reference is still literal text), so we do not get the desired semantics; you must use the '<$' code. The jython (java) code in red is the conditional if statement that drives whether 'EMP.SAL > 1' or '1=1' is included in the generated code.

    For this illustration you need at least the ODI 11.1.1.6 release - with the vanilla 11.1.1.5 release it didn't work for me (maybe patches?). As I mentioned, normally KMs don't have dependencies on variables, since any users must then have these variables defined and so on, but it does afford a lot of runtime flexibility if such capabilities are required - something to keep in mind, definitely.

    Read the article

  • Yammer, Berkeley DB, and the 3rd Platform

    - by Eric Jensen
    If you read the news, you know that the latest high-profile social media acquisition was just confirmed: Microsoft has agreed to acquire Yammer for $1.2 billion. Personally, I believe that Yammer's amazing success can be mainly attributed to their wise decision to use Berkeley DB Java Edition as their backend data store. :-) I'm only kidding, of course. However, as Ryan Kennedy points out in the video I recently blogged about, BDB JE did provide the right feature set that allowed them to reliably grow their business, which in turn allowed them to focus on their core value add. As it turns out, their 'add' is quite valuable!

    This actually makes sense to me, a lot more sense than certain other recent social acquisitions, and here's why. Last year, IDC declared that we are entering a new computing era, the era of the "3rd Platform." In case you're curious, the first two were terminal computing and client/server computing, IIRC. Anyway, this third one is more complicated. This year, IDC refined the concept further; it now involves four distinct buzzwords: cloud, social, mobile, and big data.

    Yammer is a social media platform that runs in the cloud, designed to be used from mobile devices, and their approach, using Berkeley DB Java Edition with High Availability, qualifies as big data. This means that Yammer is sitting right smack in the center of IDC's new computing era. Another way to put it is: the folks at Yammer were prescient enough to predict where things were headed, and get there first. They chose Berkeley DB to handle their data. Maybe you should too!

    Read the article

  • Back from Istanbul - Presentations available for download

    - by Javier Puerta
    (Picture by Paul Thompson, 14-March-2012) On March 14-15th we celebrated our 2012 Exadata Partner Community EMEA Forum in Istanbul, Turkey. It was an intense two days, packed with great content and a lot of networking. Organizing it jointly with the Manageability Partner Forum allowed participants to benefit from the content of the Manageability sessions, a topic that is becoming key as we move to cloud architectures. During the sessions we listened to two thought-leaders in our industry, Ron Tolido from Capgemini and Julian Dontcheff from Accenture. We thank our Exadata partners - ISE (Germany), Inserve (Sweden), Fors (Russia), Linkplus (Turkey) and Sogeti - for sharing with the community their experiences in selling and implementing Exadata and Manageability projects. The slide decks used in the presentations are now available for download at the Exadata Partner Community Collaborative Workspace (for community members only - if you get an error message, please register for the Community first). I want to thank all who participated in the event, and I look forward to meeting again at next year's Forum.

    Read the article
