Search Results

Search found 28301 results on 1133 pages for 'external process'.


  • Enterprise Process Maps: A Process Picture worth a Million Words

    - by raul.goycoolea
    Getting Started with Business Transformations
    A well-known proverb states that "A picture is worth a thousand words." In relation to Business Process Management (BPM), a credible analyst might have a few questions. What if the picture was taken from some particular angle, like directly overhead? What if it was taken from only an inch away or a mile away? What if the photographer did not focus the camera correctly? Does the value of the picture depend on who is looking at it? Enterprise Process Maps are analogous in this sense of relative value. Every BPM project (holistic BPM kick-off, enterprise system implementation, Service-oriented Architecture, business process transformation, corporate performance management, etc.) should begin with a clear understanding of the business environment, from the biggest-picture representations down to the lowest level required or desired for the particular project type, scope and objectives. The Enterprise Process Map serves as an entry point for the process architecture and is defined as the single highest level of process mapping for an organization. It is constructed and evaluated during the Strategy Phase of the Business Process Management Lifecycle. (see Figure 1)
    Fig. 1: Business Process Management Lifecycle
    Many organizations view such maps as visual abstractions, constructed for the single purpose of process categorization. This, in turn, results in a lesser focus on the inherent intricacies of the Enterprise Process view, which are explored in the course of this paper. With the main focus of a large-scale process documentation effort usually underlying an ERP or other system implementation, it is common for the work to be driven by the desire to "get to the details," and to the type of modeling that will derive near-term tangible results. For instance, a project in American Pharmaceutical Company X is driven by the Director of IT. With 120+ systems in place, and a lack of standardized processes across the United States, he and the VP of IT have decided to embark on a long-term ERP implementation. At the forethought of both are questions such as: How does my application architecture map to the business? What are each application's functionalities, and where do the business processes utilize them? Where can we retire legacy systems? Well-developed BPM methodologies prescribe numerous model types to capture such information and allow for thorough analysis in these areas. Process-to-application maps, Event Driven Process Chains, etc. provide this level of detail and facilitate the completion of such project-specific questions. These models and such analysis are appropriately carried out at a relatively low level of process detail. (see Figure 2)
    Fig. 2: The Level Concept, Generic Process Hierarchy
    Some of the questions remaining are ones of documentation longevity, the continuation of BPM practice in the organization, process governance and ownership, process transparency and clarity in business process objectives and strategy.
    The Level Concept in Brief
    Figure 2 shows a generic, four-level process hierarchy depicting the breakdown of a "Process Area" into progressively more detailed process classifications.
    The number of levels and the names of these levels are flexible, and can be fit to the standards of the organization's chosen terminology or any other chosen reference model that makes logical sense for both short- and long-term process description. It is at Level 1 (in this case the Process Area level) that the Enterprise Process Map is created. This map and its contained objects become the foundation for a top-down approach to subsequent mapping, object relationship development, and analysis of the organization's processes and its supporting infrastructure. Additionally, this picture serves as a communication device, at an executive level, describing the design of the business in its service to a customer. It seems, then, imperative that the process development effort, and this map, start off on the right foot. Figuring out just what that right foot is, however, is critical and trend-setting in an evolving organization.
    Key Considerations
    Enterprise Process Maps are usually not as living and breathing as other process maps. Just as it would be an extremely difficult task to change the foundation of the Sears Tower or a city plan for the entire city of Chicago, the Enterprise Process view of an organization usually remains unchanged once developed (unless, of course, an organization is at a stage where it is capable of true, high-level process innovation). Regardless, the Enterprise Process Map is a key first step, and one that must be taken in a precise way. What makes this groundwork solid depends on not only the materials used to construct it (process areas), but also the layout plan and knowledge base of what will be built (the entire process architecture). It seems reasonable that care and consideration are required to create this critical high-level map... but what are the important factors? Does the process modeler need to worry about how many process areas there are? About who is looking at it? Should he only use the color pink because it's his boss' favorite color? Interestingly, and perhaps surprisingly, these are all valid considerations that may just require a bit of structure. Below are three key factors to consider when building an Enterprise Process Map: company strategic focus; process categorization (customer is core); and end-to-end versus functional processes.
    Company Strategic Focus
    As mentioned above, the Enterprise Process Map is created during the Strategy Phase of the Business Process Management Lifecycle. From the Oracle Business Process Management methodology for business transformation, it is apparent that business processes exist for the purpose of achieving the strategic objectives of an organization. In a prescribed, top-down approach to process development, it must be ensured that each process fulfills its objectives, and in an aggregated manner, drives fulfillment of the strategic objectives of the company, whether for particular business segments or in a broader sense. This is a crucial point, as the strategic messages of the company must therefore resound in its process maps, in particular one that spans the processes of the complete business: the Enterprise Process Map. One simple example from Company X is shown below (see Figure 3).
    Fig. 3: Company X Enterprise Process Map
    In reviewing Company X's Enterprise Process Map, one can immediately begin to understand the general strategic mindset of the organization. It shows that Company X is focused on its customers, defining 10 of its process areas as belonging to customer-focused categories.
    Additionally, the organization views these end-customer-oriented process areas as part of customer-fulfilling value chains, while support process areas do not provide as much contiguous value. However, by including both support and strategic process categorizations, it becomes apparent that all processes are considered vital to the success of the customer-oriented focus processes. Below is an example from Company Y (see Figure 4).
    Fig. 4: Company Y Enterprise Process Map
    Company Y, although also a customer-oriented company, sends a differently focused message with its depiction of the Enterprise Process Map. Along the top of the map is the company's product tree, overarching the process areas, which when executed deliver the products themselves. This indicates one strategic objective of excellence in product quality. Additionally, the view represents a less linear value chain, with strong overlaps of the various process areas. Marketing and quality management are seen as key support processes, as they span the process lifecycle. Often, companies may incorporate graphics, logos and symbols representing customers and suppliers, and other objects to truly send the strategic message to the business. Other times, Enterprise Process Maps may show high-level assignments of responsibility to organizational units, or the application types that support the process areas. It is possible that hundreds of formats and focuses can be applied to an Enterprise Process Map. What is of vital importance, however, is which formats and focuses are chosen to truly represent the direction of the company, and serve as a driver for focusing the business on the strategic objectives set forth in that regard.
    Process Categorization: Customer is Core
    In the previous two examples, processes were grouped using differing categories and techniques. Company X showed one support and three customer process categorizations using encompassing chevron objects; Company Y achieved a less distinct categorization using a gradual color scheme. Either way, and in general, modeling of the process areas becomes even more valuable and easily understood within the context of business categorization, be it strategic or otherwise. But how one categorizes processes is typically more complex than simply choosing object shapes and colors. Previously, it was stated that the ideal is a prescribed top-down approach to developing processes, to make certain linkages all the way back up to corporate strategy. But what about external influences? What forces push and pull corporate strategy? Industry maturity, product lifecycle, market profitability, competition, etc. can all drive the critical success factors of a particular business segment, or the company as a whole, in addition to previous corporate strategy. This may seem to be turning into a discussion of theory, but that is far from the case. In fact, in years of recent study and evolution of the way businesses operate, across industries and across the globe, one invariable has surfaced with such strength as to make it undeniable in the game plan of any strategy fit for survival. That constant is the customer. Many of a company's critical success factors, in any business segment, relate to the customer: customer retention, satisfaction, loyalty, etc. Businesses serve customers, and so do a business's processes, mapped or unmapped. The most effective way to categorize processes is in a manner that visualizes convergence to what is core for a company.
    It is the value chain, beginning with the customer in mind, and ending with the fulfillment of that customer, that becomes the core or the centerpiece of the Enterprise Process Map. (See Figure 5)
    Fig. 5: Company Z Enterprise Process Map
    Company Z has what may be viewed as several different perspectives or "cuts" baked into its Enterprise Process Map. It has divided its processes into three main categories (top, middle, and bottom) of Management Processes, the Core Value Chain and Supporting Processes. The Core category begins with Corporate Marketing (which contains the activities of beginning to engage customers) and ends with Customer Service Management. Within the value chain, this company has divided into the focus areas of its two primary business lines, Foods and Beverages. Does this mean that areas such as Strategy, Information Management or Project Management are not as important as those in the Core category? No! In some cases, though, depending on the organization's understanding of high-level BPM concepts, use of category names such as "Core," "Management" or "Support" can be a touchy subject. What is important to understand is that no matter the nomenclature chosen, the Core processes are those that drive directly to customer value, Support processes are those which make the Core processes possible to execute, and Management processes are those which steer and influence the Core. Some common terms for these three basic categorizations are Core, Customer Fulfillment, Customer Relationship Management, Governing, Controlling, Enabling, Support, etc.
    End-to-end versus Functional Processes
    Every high and low level of process: function, task, activity, process/work step (whatever an organization calls it), should add value to the flow of business in an organization. Suppose that within the process "Deliver Package," there is a documented task titled "Stop for ice cream." It doesn't take a process expert to deduce the room for improvement. Though stopping for ice cream may create gain for the one person performing it, it likely benefits neither the organization nor, more importantly, the customer. In most cases, "Stop for ice cream" wouldn't make it past the first pass of To-Be process development. What would make the cut, however, would be a flow of tasks that, each having their own value add, build up to greater and greater levels of process objective. In this case, those tasks would combine to achieve a status of "package delivered." Figure 6 shows a simple example: just as the package cannot be delivered (outcome of the process) without first being retrieved, loaded, and the travel destination reached (outcomes of the process steps), some higher level of process "Play Practical Joke" (e.g., main process or process area) cannot be completed until a package is delivered. It seems that isolated or functionally separated processes, such as "Deliver Package" (shown in Figure 6), are necessary, but are always part of a bigger value chain. Each of these individual processes must be analyzed within the context of that value chain in order to ensure successful end-to-end process performance. For example, this company's "Create Joke Package" process could be operating flawlessly and efficiently, but if a joke is never developed, it cannot be created, so the end-to-end process breaks.
    Fig. 6: End to End Process Construction
    That being recognized, it is clear that processes must be viewed as end-to-end, customer-to-customer, and in the context of company strategy.
    But as can also be seen from the previous example, these vital end-to-end processes cannot be built without the functionally oriented building blocks. Without one, the other cannot be had, or at least not in a complete and organized fashion. As it turns out, but not discussed in depth here, the process modeling effort, BPM organizational development, and comprehensive coverage cannot be fully realized without a semi-functional, process-oriented approach. Thus, an Enterprise Process Map should be concerned with both views: the building blocks, and access points to the business-critical end-to-end processes which they construct. Without the functional building blocks, all streams of work needed for any business transformation would be a lost mess of process disorganization. End-to-end views are essential for optimization in context, understanding customer impacts, baselining all project phases and aligning objectives. Including both views on an Enterprise Process Map allows management to understand the functional orientation of the company's processes, while still providing access to end-to-end processes, which are most valuable to them. (See Figures 7 and 8)
    Fig. 7: Simplified Enterprise Process Map with end-to-end Access Point
    The above examples show two unique ways to achieve a successful Enterprise Process Map. The first example is a simple map that shows a high-level set of process areas and a separate section with the end-to-end processes of concern for the organization. This particular map is filtered to show just one vital end-to-end process for a project-specific focus.
    Fig. 8: Detailed Enterprise Process Map showing connected Functional Processes
    The second example shows a more complex arrangement and categorization of functional processes (the names of the process areas have been removed). The end-to-end perspective is achieved at this level through the connections (interfaces at lower levels) between these functional process areas. An important point to note is that the organization of these two views of the Enterprise Process Map is dependent, in large part, on the orientation of its audience, and the complexity of the landscape at the highest level. If both are not apparent, the Enterprise Process Map is missing an opportunity to serve as a holistic, high-level view.
    Conclusion
    In the world of BPM, and specifically regarding Enterprise Process Maps, a picture can be worth as many words as the thought and effort that are put into it. Enterprise Process Maps alone cannot change an organization, but they serve more purposes than initially meet the eye, and therefore must be designed in a way that enables a BPM mindset, business process understanding and business transformation efforts. Every Enterprise Process Map will and should be different when looking across organizations. Its design will be driven by company strategy, a level of customer focus, and functional versus end-to-end orientations. This high-level description of the considerations of Enterprise Process Maps is not a prescriptive "how to" guide; a company attempting to create one may not have the practical BPM experience to truly explore its options or impacts on the coming work of business process transformation. The biggest takeaway is that process modeling, at all levels, is a science and an art, and art is open to interpretation. It is critical that the modeler of the highest level of process mapping be cognizant of the message he is delivering and the factors at hand.
Without sufficient focus on the design of the Enterprise Process Map, an entire BPM effort may suffer. For additional information please check: Oracle Business Process Management.

    Read the article

  • How does process priority influence a process

    - by Luis Alvarado - The Wolverine
    Assuming we have read the following question: Change niceness (priority) of a running process, and we know about root and non-root permissions: what actually happens when a running process (through renice) or a new process (through nice) gets its priority changed to a more positive/negative value than it previously had? Does it mean more memory is assigned to it? Does more CPU power go to that particular process? Does it reduce any wait time for resources for that process? What happens when the process priority changes?
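
    A minimal C sketch of what nice and renice actually adjust, assuming a Linux/POSIX system: the nice value is only a hint to the CPU scheduler (it weights how much CPU time the process gets when the CPU is contended); it does not assign extra memory or reserve other resources. The program below reads and lowers its own priority with the getpriority()/setpriority() calls that renice uses under the hood.
    ```c
    /* nicedemo.c - show and change this process's nice value (Linux/POSIX).
     * Build: cc -o nicedemo nicedemo.c
     */
    #include <stdio.h>
    #include <errno.h>
    #include <sys/resource.h>

    int main(void)
    {
        errno = 0;
        /* Current nice value of the calling process (-20 highest .. 19 lowest priority). */
        int before = getpriority(PRIO_PROCESS, 0);
        if (before == -1 && errno != 0) {
            perror("getpriority");
            return 1;
        }

        /* Lower our own priority to nice 10 (raising priority again needs root/CAP_SYS_NICE). */
        if (setpriority(PRIO_PROCESS, 0, 10) == -1) {
            perror("setpriority");
            return 1;
        }

        printf("nice value: %d -> %d (scheduling weight only, not memory)\n",
               before, getpriority(PRIO_PROCESS, 0));
        return 0;
    }
    ```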

    Read the article

  • Ubuntu 12.04 LTS , Dell d410 output to External Monitor/TV with docking

    - by Gck
    I am not able to see the video output on my external monitor/TV after installing 12.04 LTS. I had the same issue with the 11.x versions also. This issue does not happen on 10.04 LTS. This is my setup: Dell Latitude D410 with Intel Mobile 915GM/910GML Express controller, using the Dell docking station, connecting to my 42-inch TV with a DVI (from docking) to HDMI (on TV) cable. If I close the lid, sometimes I am able to see the video output on the TV, but the resolution makes the fonts so small I cannot even read them, and when I click the Displays option to change the resolution, the output goes off completely and is now shown on the laptop screen. One time I got the output on both screens, but I am not able to replicate that scenario even after trying the Fn+F8 key option on my laptop. Earlier, some time back, when I ran a version 11.x Ubuntu, I was able to get the output on both screens (basically mirrored output) with a large-font resolution on my TV using the monitor test tool as mentioned here: http://www.bigfatostrich.com/2011/05/solved-ubuntu-11-04-broken-external-display/ The problem with this approach is that I have to do it every time I restart the machine, and moreover the video output is present on both screens (mirrored), which is of no use. As I stated before, this does not happen with 10.04 LTS. With 10.04 LTS my laptop screen is off automatically and the output is seen on the TV/external display. Can anyone share any options that I can try? How can we achieve the behavior of 10.04 LTS with respect to the external display on 12.04 LTS? Thank you very much for your help in advance.

    Read the article

  • External display, windows are moving to external display

    - by hextler
    I use an external display with a Lenovo W520. I am running it with the following script:
    xrandr --output LVDS1 --auto --primary --output VIRTUAL --mode 1920x1200 --left-of LVDS1
    optirun screenclone -d :8 -x 1
    xrandr --output VIRTUAL --off
    but I have a number of problems. All windows are moving to the external display. If I interrupt this script I cannot run it a second time without restarting xorg. yakuake changes height and does not show its tabs. I cannot make the external display primary; in that case I cannot see the windows. If you know how to solve these issues please let me know. Thanks in advance!

    Read the article

  • Have local admin privileges on Windows XP, but getting "Error terminating process: Access is denied"

    - by Chris W. Rea
    On one of the Windows XP machines I use regularly, there is a process that starts up periodically. I'd like to be able to kill the process – sometimes – because it occasionally runs when I'm busy doing something machine-intensive. I've already tried dropping the process priority to "Idle" to mitigate the effects, but it isn't the CPU that's the problem. Rather, the process is very disk-intensive and no matter the process priority, it still causes significant disk thrashing when running, impacting everything else I'm doing at the time. Using Process Explorer, I can find the process, right-click, and choose Kill Process, but I always get the message "Error terminating process: Access is denied." This is not an operating system process, but third-party software. What might that process be doing to prevent itself from being terminated? How can I kill such a process? Is there a way for me to modify the process's security or access control list (ACL) somewhere, using Process Explorer or another tool, so that I can effectively kill it?

    Read the article

  • Internal drives vs USB-3 with external SSD or eSata with External SSD

    - by normstorm
    I have a need to carry VMware Virtual Machines with me for work. These are very large files (each VM is 20GB or more) and I carry around about 40 to 50 VMs to simulate different software configurations for different client needs. Key: they won't fit on the internal hard drive of my current laptop. I currently run the VMs from an external 7200 RPM 2.5" USB-2 drive. I keep copies of the VMs on other 5400 RPM external USB-2 drives. The VMs work from this drive, but they are slow, costing me much time and frustration. It can take upwards of 30 minutes just to make a copy of one of the VMs. They can take upwards of 10-15 minutes to fully launch and then they operate sluggishly. I am buying a new laptop (Core i7, 8GB RAM and other high-end specs). I intend to buy an SSD for the O/S volume (C:). This SSD will not be large enough to hold the VMs. I have always wanted a second internal hard drive to operate the VMs. To have two hard drives, though, I am finding that I will have to go to a 17" laptop, which would be bulky/heavy. I am instead considering purchasing a 15" laptop with either an eSATA port or USB-3 ports and then purchasing two external drives. One of the drives might be an external SSD (maybe OCZ brand) for operating the VMs and the other a 7200 RPM 1TB hard drive for carrying around the VMs not currently in use. The question is which options would give me the biggest bang for the buck and the weight:
    1) 2nd internal SSD hard drive. This would mean buying a 17" laptop with two drive "bays". The first bay would hold an SSD drive for the C: drive. I would leave the second bay empty from the manufacturer and then purchase/install an aftermarket SSD drive. This second SSD drive would have to be very large (256 GB), which would be expensive. I would still also need another external hard drive for carrying around the VMs not in use.
    2) 2nd internal hard drive - 7200 RPM. Again, a 17" laptop would be required, but there are models available with one SSD drive for the C: drive and a second 7200 RPM hard drive. The second drive could probably be large enough to hold the VMs in use as well as those not in use. But would it be fast enough to drive the VMs?
    3) USB-3 with external SSD. I could buy a 15" laptop with an SSD drive for the C: drive and a second hard drive for general files. I would operate the VMs from an external USB-3 SSD drive and have a third USB-3 external 7200 RPM drive for holding the VMs not in use.
    4) eSATA with external SSD. Ditto, just eSATA instead of USB-3.
    5) USB-3 with external 7200 RPM drive. Ditto, but the drive running the VMs would be a USB-3-attached 7200 RPM drive rather than SSD.
    6) eSATA with external 7200 RPM drive. Ditto, but the drive running the VMs would be an eSATA-attached 7200 RPM drive rather than SSD.
    Any thoughts on this and any creative solutions?

    Read the article

  • Problem with external monitor on my asus u36sd laptop

    - by Abonec
    To connect my laptop to an external monitor (for a dual-monitor configuration) I have to perform 7 weird steps:
    1. Suspend the OS (close the notebook for that).
    2. Connect the external monitor to the VGA output.
    3. Open the notebook and unsuspend the OS (at this moment the laptop screen is at its native resolution, but on the external monitor the resolution is lower than native, not the same as the laptop).
    4. Suspend the OS (close the notebook for that).
    5. Open the notebook and unsuspend the OS (at this moment the laptop screen has the same resolution as the external monitor, but the external monitor is still at a lower resolution than its native one).
    6. Suspend the OS (close the notebook for that).
    7. Unsuspend the OS (at this moment the laptop and the external monitor have the native resolutions they should have).
    I just open and close the lid of the laptop until the external monitor and laptop screen are at their native resolutions. Adjusting monitors in Displays does not give me the proper result. I have an ASUS U36SD with Optimus (disabled by acpi_call) with a 1366x768 screen, an external monitor with 1280x1024, and the latest Ubuntu 11.10. How can I get the laptop to work with the external monitor without these weird actions?

    Read the article

  • Why do we need fork to create new process

    - by user3671483
    In Unix, whenever we want to create a new process, we fork the current process, i.e. we create a new child process which is exactly the same as the parent process, and then we do the exec system call to replace the child process with a new process, i.e. we replace all the data for the child process with that for the new process. Why do we create a copy of the parent process in the first place and why don't we create a new process directly? I am new to Unix; please explain in layman's terms.
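
    For illustration, a minimal C sketch of the fork-then-exec pattern the question describes, assuming a POSIX system: fork() duplicates the calling process, the child then calls execlp() to replace its own image with a new program (here ls, chosen arbitrarily), while the parent keeps running and waits for it.
    ```c
    /* forkexec.c - the classic fork + exec + wait pattern (POSIX). */
    #include <stdio.h>
    #include <stdlib.h>
    #include <unistd.h>
    #include <sys/wait.h>

    int main(void)
    {
        pid_t pid = fork();              /* duplicate the current process */
        if (pid == -1) {
            perror("fork");
            return EXIT_FAILURE;
        }

        if (pid == 0) {
            /* Child: replace this copy of the parent with a new program. */
            execlp("ls", "ls", "-l", (char *)NULL);
            perror("execlp");            /* only reached if exec failed */
            _exit(127);
        }

        /* Parent: keeps its own data and simply waits for the child. */
        int status;
        waitpid(pid, &status, 0);
        printf("child %d exited with status %d\n", (int)pid, WEXITSTATUS(status));
        return EXIT_SUCCESS;
    }
    ```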

    Read the article

  • Node.js MMO - process and/or map division

    - by Gipsy King
    I am in the phase of designing an MMO browser-based game (certainly not massive, but all connected players are in the same universe), and I am struggling with finding a good solution to the problem of distributing players across processes. I'm using node.js with socket.io. I have read this helpful article, but I would like some advice since I am also concerned with different processes.
    Solution 1: Tie a process to a map location (like a map cell), and connect players to the process corresponding to their location. When a player performs an action, transmit it to all other players in this process. When a player moves away, he will eventually have to connect to another process (automatically).
    Pros: easier to implement.
    Cons: the map must be divided into zones; player reconnection when moving into a different zone is probably annoying; if one zone/process is always busy (has players in it), it doesn't really load-balance, unless I split the zone, which may not always be viable; there shouldn't be any visible borders.
    Solution 1b: Same as 1, but connect processes of bordering cells, so that players on the other side of the border are visible and such. Maybe even let them interact.
    Solution 2: Spawn processes on demand, unrelated to a location. Have one special process to keep track of all connected player handles, their location, and the process they're connected to. Then when a player performs an action, the process finds all other nearby players (from the special player-process-location tracking node), and instructs their matching processes to relay the action (see the sketch below).
    Pros: easy load balancing (spawn more processes); avoids player reconnecting / borders between zones.
    Cons: harder to implement and test; additional steps of finding players and relaying the event/action to another process; if the player-location-process tracking process fails, all the others fail too.
    I would like to hear if I'm missing something, or am completely off track.
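
    A minimal node.js sketch of the tracking process in Solution 2, with hypothetical names (registry, relayAction, RADIUS) and without the socket.io plumbing: it only shows the data structure mapping each player to a position and a worker process, and how an action would be fanned out to the workers that own nearby players.
    ```javascript
    // tracker.js - hypothetical sketch of Solution 2's player/location/process registry.
    const RADIUS = 50;                       // how far away "nearby" players can be

    // playerId -> { x, y, workerId }
    const registry = new Map();

    function updatePlayer(playerId, x, y, workerId) {
      registry.set(playerId, { x, y, workerId });
    }

    function removePlayer(playerId) {
      registry.delete(playerId);
    }

    // Find players near the actor and group them by the worker process that owns them,
    // so each worker receives a single relay message for all of its affected players.
    function relayAction(actorId, action) {
      const actor = registry.get(actorId);
      if (!actor) return;

      const byWorker = new Map();            // workerId -> [playerId, ...]
      for (const [playerId, p] of registry) {
        if (playerId === actorId) continue;
        const dx = p.x - actor.x, dy = p.y - actor.y;
        if (dx * dx + dy * dy <= RADIUS * RADIUS) {
          if (!byWorker.has(p.workerId)) byWorker.set(p.workerId, []);
          byWorker.get(p.workerId).push(playerId);
        }
      }

      for (const [workerId, playerIds] of byWorker) {
        // In the real system this would be an IPC/socket message to that worker,
        // which then emits the action to each listed player's connection.
        console.log(`relay to worker ${workerId}:`, { action, playerIds });
      }
    }

    module.exports = { updatePlayer, removePlayer, relayAction };
    ```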

    Read the article

  • Unable to boot ubuntu 11.10 from external usb drive

    - by user45006
    I'm new to Ubuntu (and actually all things Linux) as of this morning, so please excuse any stupid mistakes I may be making. I recently bought an external hard drive for my newly built PC (that is running windows 7 if it matters). I would like to install Ubuntu onto the external drive and boot from there. I downloaded Ubuntu 11.10 and made a bootable cd, unplugged my internal HDDs, plugged in the external drive, installed Ubuntu 11.10 on the external drive via the installer, and replugged in my internal HDDs. Then I set my bios boot order to: Boot from USB-HDD - Boot from Hard Disk - Boot from CD/DVD. Now when I restart I get the message "Starting Operating System..." (or something like that, I forget exactly what it says) that lingers on the screen for a moment and then windows starts. Any idea what the problem may be? ~Relevant info~ BIOS version: Award Software International, Inc. F2, 2/22/2011 Ubuntu Version: 11.10 External Hard Drive: Western Digital My Passport Essential 500GB Portable Hard Drive (Black) ~Things I've already tried~ 1) Unplug internal HDDs so that only external drive was plugged in via usb. Same thing happened only obviously my BIOS could not detect any hard drives besides the external one. When booting received error "Could not detect operating system" 2) Formatted external hard drive and re installed. It didn't make a difference, however interestingly when I booted from cd the ubuntu installer said it detected ubuntu 11.10 on the external hard drive. 3) Within BIOS I've messed around with every boot order combo I could think of both in the "Hard Disk boot order" screen and the "Boot order" screen. I'm a little confused of why there are two screens for this. 4) Held F12 during startup which opens (what I think is) the one time boot screen and it gave me the options "Hard Drive", "cd/dvd", "USB-FDD", "USB-cdrom", "USB-HDD", and "USB-something else I can't remember what it was". I tried all of them, but the same thing as before happened each time. ~References~ I noticed several people on askubuntu have tried to do something similar if not the exact same. In fact, I even found a post that pretty much outlines step by step exactly what I did... only their's worked. /jealous. Linky: Install Ubuntu or Kubuntu on a External USB Drive I'm willing to try a different version of ubuntu - it's not like my heart is set on 11.10, but it's a pain to open my case and unplug my internal hard drives so I'd prefer not to do this unless someone is reasonably confident it'll work. Thank you for all of your help in advance! I'm really looking forward to exploring Ubuntu!

    Read the article

  • SOA 10g Developing a Simple Hello World Process

    - by [email protected]
    Software & Hardware Needed: Intel Pentium D CPU 3 GHz, 2 GB RAM, Windows XP system (that's what I am using). You could use Linux as well, but please choose high-end RAM. 10g SOA Suite from Oracle(TM); read the installation documents at www.Oracle.com. JDeveloper 10.1.3.3; official documents at http://www.oracle.com/technology/products/ias/bpel/index.html. java -version: Java HotSpot(TM) Client VM (build 1.5.0_06-b05, mixed mode)
    BPEL Introduction - Developing a Simple Hello World Process (Synchronous BPEL Process)
    This exercise focuses on developing a synchronous process, which means you give input to the BPEL process and you get output immediately, with no waiting at all. The objective of this exercise is to give a name as input and have the process greet with "Hello" appended by that name; for example, if I give the input "James" the BPEL process returns "Hello James".
    1. Open Oracle JDeveloper, click on File -> New Application, give the name "JamesApp" (you can give your own name if it pleases you). Select the folder where you want to place the application. Click "OK".
    2. Right-click on "JamesApp" in the Application Navigator, select the New menu.
    3. Select "Projects" under "General" and "BPEL Process Project", click "OK". These steps remain the same for all BPEL projects.
    4. The Project Settings Wizard appears. Give the "Process Name" as "MyBPELProc" and the Namespace as http://xmlns.james.com/MyBPELProc, select the Template "Synchronous BPEL Process", click "Next".
    5. Accept the input and output schema names as they are, click "Finish".
    6. You should see the BPEL Process Designer; some folders such as Integration Content and Resources are created, along with a few more files.
    7. Assign Activity: allows assigning values to variables or copying values of one variable to another, and also doing some string manipulation or mathematical operations. In the component palette at the extreme right, select Process Activities from the drop-down, and drag and drop "Assign" between "receiveInput" and "replyOutput".
    8. You can right-click and edit the Assign activity and give it any suitable name, e.g. "AssignHello".
    9. Select the "Copy Operation" tab and create a "Copy Operation".
    10. In the From variables, click on the expression builder, select input under "input variable", click on insert into expression bar, and complete the concat syntax. Note: use "Ctrl+space bar" inside the expression window to auto-populate the expression as shown in the figure below. What we are actually doing here is concatenating the string "Hello " with the variable value received through the variable named "input". (A sketch of the completed expression appears after this walkthrough.)
    11. Observe that once the expression is completed, the "To Variable" is assigned to a variable named "result".
    12. Finally, the copy operation looks as below.
    13. It's time to deploy; start the SOA Suite.
    14. Establish a connection to the server from JDeveloper. This can be done by adding a New Application Server under Connections; give the server name, username and password and test the connection.
    15. Deploy "MyBPELProc" to the "default" domain.
    16. http://localhost:8080/ allows connecting to the SOA Suite web portal; click on "BPEL Control" and log in with the username "oc4jadmin" and whatever password you gave during installation.
    17. "MyBPELProc" is visible under "Deployed BPEL Processes" in the "Dashboard" tab; click on it.
    18. The Initiate tab opens to accept input; enter data such as "James" for the input and click on the "Post XML" button.
    19. Click on Visual Flow.
    20. Click on receiveInput: it shows "James" as the input received.
    21. Click on replyOutput: it shows "Hello James", so the BPEL process was successfully executed.
    22. It may be worth seeing all the instances created every time a BPEL process is executed by giving some inputs. The Purge All button allows you to delete all the unwanted previous instances of the BPEL process; don't worry, it won't delete the BPEL process itself :-)
    23. It may also be of some importance to understand the XSD file which holds the input & output variable names & data types.
    24. You could drag and drop variables as elements over the sequence in the designer, or directly edit the XML source file.
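
    For reference, the completed concat expression from step 10 typically looks like the XPath below. This is only a sketch: the variable, part and element names (inputVariable, payload, client:MyBPELProcProcessRequest/client:input) are the defaults generated by the JDeveloper wizard and may differ in your project, so build the expression with the expression builder rather than copying it verbatim.
    ```xpath
    concat('Hello ',
           bpws:getVariableData('inputVariable',
                                'payload',
                                '/client:MyBPELProcProcessRequest/client:input'))
    ```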

    Read the article

  • Start a Mapping or Process Flow from OWB Browser

    - by Dong Ruirong
    Basically, we start a Mapping or Process Flow from the Oracle Warehouse Builder (OWB) Design Client. But actually we can also start a Mapping or Process Flow from OWB Browser. This paper will introduce the Start Report first and then introduce how to start/rerun a Mapping or Process Flow from OWB Browser.
    Start Report
    Start Report is used to start an execution of a Mapping or Process Flow. So there are two kinds of Start Report: Mapping Start Report (see Figure 1) and Process Flow Start Report (see Figure 2). Start Report shows the Mapping or Process Flow identification properties, including latest deployment and latest execution, lists all execution parameters for the Mapping or Process Flow, which were specified by the latest deployment, and assigns parameter default values from the latest deployment specification. You can do a couple of things from Start Report:
    - Sort execution parameters on name, category. Table 1 lists all parameters of a Mapping; Table 2 lists all parameters of a Process Flow.
    - Change values of any input parameter where permitted. For some parameters, selection lists are provided. For example, the Mapping parameter Audit Level has a selection list.
    - Reset all parameter settings to their default values.
    - Apply basic validation to parameter values before starting an execution.
    - Start the Mapping or Process Flow, which means it is executed immediately.
    - Navigate to the Deployment Report for the latest deployment details of the Mapping or Process Flow.
    - Navigate to the Execution Job Report for the latest execution of the current Mapping or Process Flow.
    - Link to on-line help: Warehouse Report Page, Deployment Report, Execution Report, Execution Schedule Report and Execution Summary Report.
    Figure 1: Mapping Start Report
    Table 1: Execution parameters and default values for a Mapping (Category | Name | Mode | Input Value)
    System | Audit Level | In | Error Details
    System | Bulk Size | In | 1000
    System | Commit Frequency | In | 1000
    System | EXECUTE_RESUME_TASK | In | FALSE
    System | FORCE_RESUME_OPTION | In | FALSE
    System | Max No of Errors | In | 50
    System | NUMBER_OF_TIMES_TO_RETRY | In | 2
    System | Operating Mode | In | Set Based Fail Over to Row Based
    System | PARALLEL_LEVEL | In | 0
    System | Procedure Name | In | main
    System | Purge Group | In | WB
    Figure 2: Process Flow Start Report
    Table 2: Execution parameters and default values for a Process Flow (Category | Name | Mode | Input Value)
    System | EVAL_LOCATION | In |
    System | Item Key | In-Out |
    System | Item Type | In | PFPKG_1
    Start a Mapping or Process Flow
    To navigate to Start Report, it's better to log in to OWB Browser with the Control Center option; if not, after logging in to OWB Browser, go to Control Center first. Then you can follow the ways introduced in this section to navigate to Start Report. One more thing you need to pay attention to is that you are not allowed to deploy any Mappings and Process Flows from OWB Browser, as that is not supported. So it's necessary to deploy the Mappings and Process Flows first before starting them from OWB Browser. If you have deployed a Mapping or Process Flow but have not started it, please navigate from the Object Summary Report or the Deployment Schedule Report to the Start Report.
    1. Navigating from Object Summary Report to Start Report
    Open the Object Summary Report to see all deployed Mappings and Process Flows. Click the Mapping Name or Process Flow Name link to see its Deployment Report. Select the Start link in the Available Reports tab for the given Mapping or Process Flow to display a Start Report for the Mapping or Process Flow. The execution parameters have the default deployment-time settings.
Change any of the input parameter values as required. Click Start Execution button to execute the Mapping or Process Flow. 2. Navigating from Deployment Schedule Report to Start Report Open the Deployment Schedule Report to see deployment details of Mapping and Process Flow. Expand the project trees to find the deployed Mappings and Process Flows. Click the Mapping Name or Process Flow Name link to see its Deployment Report. Select the Start link in the Available Reports tab for the given Mapping or Process Flow to display a Start Report for the Mapping or Process Flow. The execution parameters have the default deployment-time settings. Change any of the input parameter values as required. Click Start Execution button to execute the Mapping or Process Flow. Re-run a Mapping or Process Flow If you have executed a Mapping or Process Flow, you can navigate from Object Summary Report, Deployment Schedule Report, Execution Summary Report or Execution Schedule Report to Start Report. 1. Navigating from the Execution Summary Report to Start Report Open the Execution Summary Report to see all execution jobs including Mapping jobs and Process Flow jobs. Click on the Mapping Name or Process Flow Name to see its Execution Report. Select the Start link in the Available Reports tab for the given Mapping or Process Flow to display a Start Report for the Mapping or Process Flow. The execution parameters have the default deployment-time settings. Change any of the input parameter values as required. Click Start Execution button to execute the Mapping or Process Flow. 2. Navigating from the Execution Schedule Report to Start Report Open the Execution Schedule Report to see list of all executions of Mapping and Process Flow. Click on the Mapping Name or Process Flow Name to see its Execution Report. Select the Start link in the Available Reports tab for the given Mapping or Process Flow to display a Start Report for the Mapping or Process Flow. The execution parameters have the default deployment-time settings. Change any of the input parameter values as required. Click Start Execution button to execute the Mapping or Process Flow. If the execution of a Mapping or Process Flow is successful, you will see this message from the Start Report: Start Execution request successful. (See Figure 3) Figure 3 Execution Result You can also confirm the execution of the Mapping or Process Flow by referring to Execution Report of the current Mapping or Process Flow by clicking the link in the Available Reports tab for the given Mapping or Process Flow. One new record of execution job details is added to Execution Report of the Mapping or Process Flow which shows the details of the execution such as Start Time, Elapsed Time, Status, the number of records selected, inserted, updated, deleted etc.

    Read the article

  • java Process stop entire process tree

    - by ages04
    I am using Java Runtime to run commands, including certain CVS commands. I use: process = runtime.exec ("cmd /C cvs..."); format for running the Process in Java I need to have the option of stopping it. For this I use the Java Process destroy method process.destroy(); However only the cmd is stopped not the cvs process. It continues to run as a separate process without the cmd process as the parent. There are many references to this on the internet, but I haven't found any satisfactory solution. Thanks
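
    A small Java sketch of one workaround, assuming the cvs executable is on the PATH: launching cvs directly (without the cmd /C wrapper) makes it the immediate child, so destroy() terminates cvs itself rather than just the shell. The comment also notes the Java 9+ ProcessHandle API, which can walk and destroy an entire descendant tree.
    ```java
    import java.io.IOException;

    public class KillCvs {
        public static void main(String[] args) throws IOException, InterruptedException {
            // Start cvs directly so destroy() hits the cvs process itself,
            // not an intermediate cmd.exe parent.
            Process process = new ProcessBuilder("cvs", "--version")
                    .inheritIO()
                    .start();

            Thread.sleep(2000);   // ... later, we decide to stop it ...
            process.destroy();

            // On Java 9+ you can also terminate any children the process spawned:
            // process.toHandle().descendants().forEach(ProcessHandle::destroy);

            process.waitFor();
            System.out.println("stopped");
        }
    }
    ```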

    Read the article

  • Is calling Process.Refresh() required for Process.HasExited

    - by Rekreativc
    Hello, I am interested in whether calling Process.Refresh() is mandatory when waiting for the process to terminate by checking the Process.HasExited property. I have a piece of code that works fine without the Process.Refresh() call; however, I am curious whether this is a coincidence. I can see that an MSDN example has the Process.Refresh() call... If it's not necessary, and Process.HasExited is the only property I need, are there any advantages to making the call to Process.Refresh()? If not, is there a reason it is in the MSDN example? Thank you for your answers.
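
    One way to sidestep the polling question entirely is to let the Process object signal termination instead of repeatedly checking HasExited. A minimal C# sketch, assuming a hypothetical child executable; WaitForExit() and the Exited event are standard Process members.
    ```csharp
    using System;
    using System.Diagnostics;

    class WaitDemo
    {
        static void Main()
        {
            var psi = new ProcessStartInfo("notepad.exe"); // hypothetical child process
            using (var process = Process.Start(psi))
            {
                // Option 1: get a callback when the process terminates.
                process.EnableRaisingEvents = true;
                process.Exited += (s, e) => Console.WriteLine("Exited event fired.");

                // Option 2: block until it terminates (no Refresh() needed for this).
                process.WaitForExit();
                Console.WriteLine($"Exit code: {process.ExitCode}");
            }
        }
    }
    ```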

    Read the article

  • Exit code of a process terminated with Process.Kill() , in C#

    - by Emil D
    In my C# application, I am creating a child process that can either terminate normally or start misbehaving, in which case I terminate it with a call to Process.Kill(). However, I would like to know if the process has exited normally. I know I can get the exit code of a terminated process, but what would be a normal exit code, and what would signify that the process was killed?
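
    A small C# sketch of one common approach, assuming a hypothetical child executable: record whether you called Kill() yourself, and treat the exit code as meaningful only for a normal exit (by convention 0 means success; a process terminated with Process.Kill() is reported with a non-zero code, typically -1, but relying on that exact value is fragile).
    ```csharp
    using System;
    using System.Diagnostics;

    class KillDemo
    {
        static void Main()
        {
            using (var child = Process.Start("misbehaving.exe")) // hypothetical child
            {
                bool killedByUs = false;

                if (!child.WaitForExit(30000))    // give it 30 seconds to finish normally
                {
                    killedByUs = true;
                    child.Kill();
                    child.WaitForExit();          // ensure ExitCode is available
                }

                if (killedByUs)
                    Console.WriteLine($"We killed it; exit code was {child.ExitCode}.");
                else if (child.ExitCode == 0)
                    Console.WriteLine("Normal, successful exit.");
                else
                    Console.WriteLine($"Exited on its own with error code {child.ExitCode}.");
            }
        }
    }
    ```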

    Read the article

  • Plex Media Server: Won't find media External Hard Drive

    - by grayed01
    I am attempting to turn an older PC into a home media server with Ubuntu 12.04 using Plex Media Server. I have a newer WD 2 TB external USB HD with all of my media on it. I can not for the life of me figure out why Plex will not recognize the files on this hard drive. It shows the external as there, Ubuntu shows the files and allows me to play them, view them, etc. Plex shows the name "External". But when I click it, it is 100% empty and shows nothing to add. I can access the files on the external through file sharing just fine, on my windows computers but would love to be able to use Plex for streaming with our Roku. I am fairly new to Ubuntu, I have used Plex with the same HD on Windows and it worked fine. I have read multiple articles on this and nothing seems to be doing the trick. How can I solve this?

    Read the article

  • Laptop screen blank after login when external monitor is not connected

    - by Ramon Suarez
    Ubuntu does not switch back automatically to only monitor present when booting after disconnecting external monitor. Here's a video showing what happens. I get to the login window and everything looks ok, then I type my password, the desktop image shows up and... everything goes blank. It does not happen when I just login as a guest. When possible I work with my laptop connected to an external screen via the VGA port. The problem comes when I boot the computer without that secondary screen connected: The login screen comes out ok. After login the screen goes black, but I can hear the login sound. If I hit ctr + alt + backwards-delete and login again sometimes it is fixed, but not all. If I log in as a different user everything is OK. Then I log in as my user and sometimes it works. To have a screen I have to plug a monitor. Although I have turned on the laptop display with that monitor on, if I reboot it goes blank again after login, even if I turn off the external monitor before turning off the computer. I've managed to get my screen back with my username after going into recovery mode, but only sometimes. Failsafe would not load after second screen asking me what I wanted to do (no mouse to click nor keyboard working). My computer is a LDLC Aurore BB1-i5 -8 -S1. Which is the configuration file that keeps the information about the monitors using Displays under lightgdm and where is it? I guess if I could edit it I may have a chance :) One of the things I tried following a solution in another post was removing my monitors.xml file, but it does not work and I don't know how to create a good one that I could use now. When doing DISPLAY=:0 xrandrI get: Screen 0: minimum 320 x 200, current 320 x 200, maximum 8192 x 8192 LVDS1 connected (normal left inverted right x axis y axis) 1366x768 60.0 + 1360x768 59.8 60.0 1024x768 60.0 800x600 60.3 56.2 640x480 59.9 VGA1 disconnected (normal left inverted right x axis y axis) HDMI1 disconnected (normal left inverted right x axis y axis) DP1 disconnected (normal left inverted right x axis y axis) This is the full dmesg after activating sudo xdiagnoseas Bryce sugested. (If you tell me the relevant parts I will paste them here) When conecting the external monitor, only the external will work, although I can see using Displays that the computer thinks that both are working. I've asked the question in Launchpad but have it keeps on expiring without any feedback. In my opinion Ubuntu should be able to detect automatically that there is no external monitor present and switch to the laptop monitor. There's a similar question here, but it does not apply to my case External monitor set as primary even when disconnected from laptop Update: For clarification, the problem happens only with my user and once I log in. I even get to see the screensaver for about a second, and then it goes blank. Tried Bryce's example (see his answer below), but it did not work. This is the info I get from tty1 with Display=:0 xrandr: – Ramon Suarez Jul 9 at 16:36 Screen 0: minimum 320 x 200, current 320 x 200, maximum 8192 x 8192 LVDS1 connected (normal left inverted right x axis y axis) 1366x768 60.0 + 1360x768 59.8 60.0 1024x768 60.0 800x600 60.3 56.2 640x480 59.9 VGA1 disconnected (normal left inverted right x axis y axis) HDMI1 disconnected (normal left inverted right x axis y axis) DP1 disconnected (normal left inverted right x axis y axis)

    Read the article

  • External monitor problems with Asus UX32VD in 12.04

    - by rsilva
    I've posted this video where I show one of the problems I'm facing with my UX32VD in regard to external monitors. As you can see, as soon as I plug in the external monitor the laptop screen goes black and the external monitor displays a single color (the color changes between red, green and blue). However, this does not always happen. For instance, if I reboot the laptop with the external monitor already plugged in, I get to use it, but not the laptop screen. In the Displays settings the laptop screen is not even listed. Other times it just works: the two screens, without any problems. I tried with two different external monitors and all these apparently random issues repeat; I even tried using an HDMI cable instead. Same random behavior. This question seems to cover one of these issues, however the answer did not help me. I was forced to use the 3.5.2 kernel in order to use the Intel graphics card, else the Gallium something was used. Also, I have Bumblebee installed.

    Read the article

  • External display resolution issue

    - by Steven
    I'm new to Ubuntu. I've 'dabbled' in the past but Windows was annoying me too much so I finally made the change. One thing I use my laptop for is outputting the screen onto an external monitor. In Windows this usually works fine (but when there is an issue, it's a pain to resolve and has crashed my computer on many occasions). In Ubuntu it's a very simple case of "plug and play" because it displays the content immediately with no major problems. However, I can only alter the resolution of the external display to 4:3 formats, whereas the external monitor is 16:9, so the picture is stretched. In Windows there's loads of resolutions to choose from. If I disable the laptop's screen, it still doesn't allow me to display the correct resolution externally (on VGA). Further to that, if I close the laptop it keeps the external screen as it is, but when I reopen the laptop, the laptop screen doesn't turn back on until I restart. If it's a video I am watching, I can use VLC to alter the aspect ratio so the video looks fine, but it's quite an annoying work around. I used terminal to find my graphics controller, which is "Intel Corporation Mobile 4 Series Chipset Integrated Graphics Controller [8086:2a42] (rev 07)" (not great, I know... the laptop was a free gift with my mobile!) Any idea of how I can get the resolution on the external monitor to be correct?

    Read the article

  • Is there a way to make sure a background process spawned by my program is killed when my process terminates

    - by Davy8
    Basically the child process runs indefinitely in the background until killed, and I want to clean it up when my program terminates for any reason, e.g. via the Task Manager. Currently I have a while (Process.GetProcessesByName("ParentProcess").Count() > 0) loop and exit if the parent process isn't running, but it seems pretty brittle, and if I wanted it to work under the debugger in Visual Studio I'd have to add "ParentProcess.vshost" or something. Is there any way to make sure that the child process ends without requiring the child process to know about the parent process? I'd prefer a solution in managed code, but if there isn't one I can PInvoke. Edit: Passing the PID seems like a more robust solution, but for curiosity's sake, what if the child process was not my code but some exe that I have no control over? Is there a way to safeguard against possibly creating orphaned child processes?
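
    A minimal C# sketch of the PID-passing approach mentioned in the edit, with hypothetical program names: the parent hands its own process ID to the child on the command line, and the child watches that process and exits when it disappears. This is only one option; it still requires the child to cooperate, so it does not cover arbitrary executables you don't control.
    ```csharp
    // Parent side: pass our own PID to the child we spawn.
    // Process.Start("child.exe", Process.GetCurrentProcess().Id.ToString());

    // Child side (child.exe, hypothetical): exit as soon as the parent goes away.
    using System;
    using System.Diagnostics;

    class Child
    {
        static void Main(string[] args)
        {
            int parentPid = int.Parse(args[0]);

            try
            {
                Process parent = Process.GetProcessById(parentPid);
                parent.EnableRaisingEvents = true;
                parent.Exited += (s, e) => Environment.Exit(0);   // parent died: clean up and go
            }
            catch (ArgumentException)
            {
                return;   // parent already gone before we attached
            }

            // ... the child's real work would run here ...
            Console.ReadLine();
        }
    }
    ```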

    Read the article

  • Dual-boot computer won't boot without external hard drive

    - by FrankP
    I have Ubuntu loaded on my external HDD. I tried to unplug the external drive so that I could run Windows as the default OS to boot when the computer turns on, but it gives me an error. I need to know how I can make it so that when my computer boots it stops saying Error: no such device: (a whole bunch of numbers and letters) and then grub rescue>_. If I plug the external HDD in and let Ubuntu run the boot process, then it gives me a list of OS's/HDDs to choose from, and Windows 7 is there. The only problem is that I want Windows to be my default OS, not the other way around. P.S. I have found that I dislike Ubuntu because I can't even figure out how to install the necessary programs to learn how to start writing Ruby on Rails. So installing it was a waste of my time, in my opinion. Now that I have it on the external hard drive, I will leave it installed though. I just don't want to have to keep that external drive plugged in to my computer all the time. Thank you a ton to whoever can help me! Thank you for the detailed instructions. I am doing my best to follow you, and it makes sense when I read it, but Rescatux is not doing what you said it would. None of the options you said would appear are there. On my screen there are 4 options when MBR runs; none look familiar, and when I picked the best possible option based on my educated guesses it said success. I tried to restart my computer and it said Please insert Windows recovery disc and hit enter. Problem being, I don't have the Windows recovery disc. I bought my computer from a local computer tech and he loads Windows on it for you. I have no time to run my computer over to him, as Sunday is my only day free. I think that I just wrecked my computer in the process of this attempted fix; Windows refuses to boot now WITH or WITHOUT the HDD. Please help, this is getting out of hand.

    Read the article

  • External hard drive issue

    - by Blind Fish
    I am running Ubuntu 12.04 and I have a 500GB external hard drive, formatted in ext4, which I have used for about a year to store batches of data that I am sifting through. About once a week I move a chunk of data from the external drive to my internal drive for processing. As I do that I delete the data from the external drive and empty the trash, thereby clearing up space on the external and also preventing myself from grabbing stuff that I've already sorted through. The vast majority of this is text files, but there are a few .jpegs and .mp4s thrown in, if that matters. Anyway, this has been working without a hitch for nearly a year. So today I plug in the drive and I have an odd issue. I was able to move folders from the external over to my internal drive with no problem, but when I tried to delete those files I was unable to do so. I kept getting the "unable to send file to trash, do you wish to delete permanently?" message. I clicked yes / delete all, but no files were actually deleted. It just sat there. Even worse, while the system was trying in vain to delete those files I was unable to move more files over. In short, I was stuck. I canceled the operation, unmounted and remounted the drive, and then I had an even bigger problem. I have a spreadsheet of all the different folders on there ( exactly 818 ) and yet when I opened the drive in Nautilus, it was only finding 512 folders. So I had 306 missing folders, but my free space was unchanged. Immediately I think that the drive may be corrupted. I went into Disk Utility and ran the check disk option. It said that the drive was NOT clean, but offered no remedy or option to repair. I went back into the drive in Nautilus and once again attempted to delete the files I had already moved. Same issue. I clicked on Details and it said that it was unable to create a trashing file I/O error. I've looked, and it's not a permissions issue. The drive and all folders that are in it all belong to me, and they are all read / write. I've started running badblocks on the drive, but it looks like that is going to take several hours to complete. Any ideas?

    Read the article

  • External JS usage in FBML - Cannot access external script

    - by santhakr
    Hi, I am trying to create a small Facebook app and associate it with a fan page in a tab. I am trying to include an external JavaScript file in my page and call a method on a button click event. Below is part of the code: <script language="Javascript" src="http://mysite.com/fb.js"></script> <input type="button" value="Click....." onClick="javascript:showDialog();" /> The content of fb.js is as below: function showDialog() { new Dialog().showMessage('Dialog', 'Button clickeed'); } When I load the tab in my fan page, it shows an error "Cannot allow external script", whereas when I load the canvas url [http://apps.facebook.com/...] directly and click on the button, it works [shows the dialog]. Does script include work only on the canvas and not on the profile page? I have another question though. Initially I had the script src as a relative path, but it errored out with the same error - "Cannot allow external script". Can't I use a relative path for the external scripts?

    Read the article

  • Laptop with Intel Graphics: external VGA monitor only gets signal on boot (no "hot plugging")

    - by iveand
    I am able to get an external VGA monitor (or projector) to work if I start my laptop with it connected. However, if I start the laptop with it disconnected there is no signal on the external. The Displays screen shows the external, and thinks that it is active, but there is no signal being sent to it. This has been a persistent problem since 10.04 (I am now on 12.04.... each upgrade hoping something is improved). I should note that even when it works (starting with display connected), Displays still says the monitor is "unknown" (but it sends the signal). For the correct resolution to display, I have had to add a few xrandr lines for my monitor to my .xprofile file... otherwise resolution is limited to default 1024x768. So, resolution issues can be worked around, but the main issue is that the external doesn't get a signal without starting the machine with it connected. I have tried: adding i915.modeset=1 to grub (also i965.modeset=1 since someone posted that this helped even though lshw shows i915) adding following ppa and doing a dist-upgrade: sudo add-apt-repository ppa:xorg-edgers/ppa Here are the details: Laptop: Toshiba Tecra M10 lspci listings for video: 00:02.0 VGA compatible controller [0300]: Intel Corporation Mobile 4 Series Chipset Integrated Graphics Controller [8086:2a42] (rev 07) sudo lshw -C video listing: *-display:0 description: VGA compatible controller product: Mobile 4 Series Chipset Integrated Graphics Controller vendor: Intel Corporation physical id: 2 bus info: pci@0000:00:02.0 version: 07 width: 64 bits clock: 33MHz capabilities: msi pm vga_controller bus_master cap_list rom configuration: driver=i915 latency=0 resources: irq:46 memory:ff400000-ff7fffff memory:e0000000-efffffff ioport:cff8(size=8) *-display:1 UNCLAIMED description: Display controller product: Mobile 4 Series Chipset Integrated Graphics Controller vendor: Intel Corporation physical id: 2.1 bus info: pci@0000:00:02.1 version: 07 width: 64 bits clock: 33MHz capabilities: pm bus_master cap_list configuration: latency=0 resources: memory:ffc00000-ffcfffff "System Info" shows my graphics as the following Mobile Intel® GM45 Express Chipset x86/MMX/SSE2

    Read the article

  • "The daemon is being inhibited" error message when mounting volumes on a partitioned external HD [closed]

    - by Todd
    I'm having a great deal of difficulty with an external hard drive. I'm currently running a dual boot system (XP Service Pack 3 and Ubuntu 11.04 Natty Narwahl) on a Dell Inspiron B120. I'm trying to set up a new 80 GB Hitachi external HD. Using GParted, I formatted the drive and set up the partitions. The partitioning scheme is as follows 10GB NTFS Primary, 2GB Linux-Swap Primary, 50GB FAT32 Primary, 12GB Unallocated. After applying those changes, I went into Disk Utility and the HD appears along with the correct partitions. When I try to mount the volumes for partitions 1 and 3, I get a pop-up stating: Error Mounting Volume An error occurred while performing an operation on "Home" (Partition 3 of HTS548080m9AT00): The daemon is being inhibited. When I try to to check the filesystem I get a pop-up stating: Error Checking filesystem on volume An error occurred while performing an operation on "Home" (Partition 3 of HTS548080m9AT00): The daemon is being inhibited. Throughout the time that I'm attempting to troubleshoot the problem, the external drive light is on and blinking. With my frustration hitting a boiling point, I try to shut down the drive and remove it so that I can plug in a different external HD that works PERFECTLY. However, when I try to shut down and safely remove the drive, I get a pop-up stating: Error Detaching Drive An error occurred while performing an operation on "80GB Hard Disk" (HTS548080m9AT00): The daemon is being inhibited. Can anyone tell me what I'm doing wrong? I'm a newbie and not that skilled with terminal commands, so please dumb it down for me if you request specific command output.

    Read the article
