Search Results

Search found 1106 results on 45 pages for 'accurate'.


  • Oracle's Global Single Schema

    - by david.butler(at)oracle.com
    Maximizing business process efficiencies in a heterogeneous environment is very difficult. The difficulty stems from the fact that the various applications across the Information Technology (IT) landscape employ different integration standards, different message passing strategies, and different workflow engines. Vendors such as Oracle and others are delivering tools to help IT organizations manage the complexities introduced by these differences. But the one remaining intractable problem impacting efficient operations is the fact that these applications have different definitions for the same business data.

    Business data is your business information codified for computer programs to use. A good data model will represent the way your organization does business. The computer applications your organization deploys to improve operational efficiency are built to operate on the business data organized into this schema. If the schema does not represent how you do business, the applications on that schema cannot provide the features you need to achieve the desired efficiencies. Business processes span these applications. Data problems break these processes, rendering them far less efficient than they need to be to achieve organizational goals. Thus, the expected return on the investment in these applications is never realized. The success of all business processes depends on the availability of accurate master data.

    Clearly, the solution to this problem is to consolidate all the master data an organization uses to run its business, then clean it up, augment it, govern it, and connect it back to the applications that need it. Until now, this obvious solution has been difficult to achieve because no one had defined a data model sufficiently broad, deep and flexible enough to support transaction processing on all key business entities and serve as a master superset to all other operational data models deployed in heterogeneous IT environments. Today, the situation has changed. Oracle has created an operational data model (aka schema) that can support accurate and consistent master data across heterogeneous IT systems. This is foundational for providing a way to consolidate and integrate master data without having to replace investments in existing applications. This Global Single Schema (GSS) represents a revolutionary breakthrough that allows for true master data consolidation.

    Oracle has deep knowledge of applications dating back to the early 1990s. It developed applications in the areas of Supply Chain Management (SCM), Product Lifecycle Management (PLM), Enterprise Resource Planning (ERP), Customer Relationship Management (CRM), Human Capital Management (HCM), Financials and Manufacturing. In addition, Oracle applications were delivered for key industries such as Communications, Financial Services, Retail, Public Sector, High Tech Manufacturing (HTM) and more. Expertise in all these areas drove requirements for GSS. The following figure illustrates Oracle's unique position that enabled the creation of the Global Single Schema. [Figure: GSS Requirements Gathering]

    GSS defines all the key business entities and attributes, including Customers, Contacts, Suppliers, Accounts, Products, Services, Materials, Employees, Installed Base, Sites, Assets, and Inventory, to name just a few. In addition, Oracle delivers GSS pre-integrated with a wide variety of operational applications.

    Business Process Automation

    E-Business is about maximizing operational efficiency. At the highest level, these 'operations' span all that you do as an organization. The following figure illustrates some of these high-level business processes. [Figure: Enterprise Business Processes]

    Supplies are procured. Assets are maintained. Materials are stored. Inventory is accumulated. Products and Services are engineered, produced and sold. Customers are serviced. And across this entire spectrum, Employees do the procuring, supporting, engineering, producing, selling and servicing. Not shown, but not to be overlooked, are the accounting and financial processes associated with all this procuring, manufacturing, and selling activity. Supporting all these applications is the master data. When this data is fragmented and inconsistent, the business processes fail and inefficiencies multiply. But imagine having all the data under these operational business processes in one place:

    · The same accurate and timely customer data will be provided to all your operational applications, from the call center to the point of sale.
    · The same accurate and timely supplier data will be provided to all your operational applications, from supply chain planning to procurement.
    · The same accurate and timely product information will be available to all your operational applications, from demand chain planning to marketing.

    You would have a single version of the truth about your assets, financial information, customers, suppliers, employees, products and services to support your business automation processes as they flow across your business applications. All company and partner personnel will access the same exact data entity across all your channels and across all your lines of business. Oracle's Global Single Schema enables this vision of a single version of the truth across the heterogeneous operational applications supporting the entire enterprise.

    Global Single Schema

    Oracle's Global Single Schema organizes hundreds of thousands of attributes into 165 major schema objects supporting over 180 business application modules. It is designed for international operations and extensibility. The schema is delivered with a full set of public Application Programming Interfaces (APIs) and an Integration Repository with modern Service Oriented Architecture interfaces to make data available as a service (DaaS) to business processes and enable operations in heterogeneous IT environments.

    · Key tables can be extended with unlimited numbers of additional attributes and attribute groups for maximum flexibility.
        o This enables model extensions that reflect business entities unique to your organization's operations.
    · The schema is multi-organization enabled, so data manipulation can be controlled along organizational boundaries.
    · It uses variable-byte Unicode to support over 31 languages.
    · The schema encodes flexible date and flexible address formats for easy localization.

    No matter how complex your business is, Oracle's Global Single Schema can hold your business objects and support your global operations. Oracle's Global Single Schema identifies and defines the business objects an enterprise needs within the context of its business operations. The interrelationships between the business objects are also contained within the GSS data model. Their presence expresses fundamental business rules for the interaction between business entities. The following figure illustrates some of these connections. [Figure: Interconnected Business Entities]

    Interconnected business processes require interconnected business data. No other MDM vendor has this capability. Everyone else has either one entity they can master or separate, disconnected models for various business entities. Higher-level integrations are made available, but that is a weak architectural alternative to data-level integration in this critically important aspect of Master Data Management.

    Read the article

  • 3G/Edge/GPRS IP addresses and geocoding

    - by LookitsPuck
    Hey all! So, we're looking to develop a mobile website. On this mobile website, we'd like to automatically populate a user's location (with proper fallback) based on their IP address. I'm aware of geocoding a location based on IP address (mapping to latitude and longitude and then getting the location with that information). However, I'm curious how accurate this information is. Are mobile devices assigned IPs when they utilize 3G, EDGE, and GPRS connections? I think so. If that is so, does it map to a relatively accurate location? It doesn't have to be spot on, but relatively accurate would be nice. Thanks! -Steve
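    For the lookup step itself, here is a minimal sketch in Python using the third-party geoip2 package and a MaxMind GeoLite2 City database file; both the package and the database path are assumptions for illustration, not something from the question:

        # ip_location.py - rough IP-to-location lookup (often only city- or region-level accurate).
        # Assumes: pip install geoip2, plus a downloaded GeoLite2-City.mmdb database file.
        import geoip2.database
        import geoip2.errors

        def locate(ip_address, db_path="GeoLite2-City.mmdb"):
            """Return (latitude, longitude, city) for an IP, or None if it is not in the database."""
            with geoip2.database.Reader(db_path) as reader:
                try:
                    response = reader.city(ip_address)
                except geoip2.errors.AddressNotFoundError:
                    return None
                return (response.location.latitude,
                        response.location.longitude,
                        response.city.name)

        if __name__ == "__main__":
            print(locate("128.101.101.101"))  # example public IP

    For carrier-assigned IPs (3G/EDGE/GPRS), the result can point at the carrier's gateway rather than the handset's actual area, which is one more reason to keep the fallback the poster mentions.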

    Read the article

  • Stopwatch vs. using System.DateTime.Now for timing events

    - by Randy Minder
    I wanted to track the performance of a piece of my application so I initially stored the start time using System.DateTime.Now and the end time also using System.DateTime.Now. The difference between the two was how long my code took to execute. I noticed though that the difference didn't appear to be accurate. So I tried using a Stopwatch object. This turned out to be much, much more accurate. Can anyone tell me why Stopwatch would be more accurate than calculating the difference between a start and end time using System.DateTime.Now? Thanks.
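    The same distinction exists on most platforms: subtracting two wall-clock readings is limited by the clock's tick resolution and can be skewed if the clock is adjusted mid-run, while a dedicated high-resolution monotonic timer is built for measuring intervals. A small Python sketch of the two styles, offered only as a language-neutral illustration of the idea (the printed durations are machine-dependent):

        # timing_comparison.py - wall-clock subtraction vs. a monotonic high-resolution timer
        import time
        from datetime import datetime

        def busy_work():
            return sum(i * i for i in range(100_000))

        # Wall-clock approach (analogous to subtracting two DateTime.Now values):
        # resolution is tied to the system clock tick, and the clock can be adjusted mid-run.
        start_wall = datetime.now()
        busy_work()
        wall_elapsed = (datetime.now() - start_wall).total_seconds()

        # Monotonic high-resolution timer (analogous in spirit to Stopwatch):
        # designed for interval measurement, unaffected by clock adjustments.
        start_perf = time.perf_counter()
        busy_work()
        perf_elapsed = time.perf_counter() - start_perf

        print(f"wall-clock delta:   {wall_elapsed:.6f} s")
        print(f"perf_counter delta: {perf_elapsed:.6f} s")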

    Read the article

  • Understanding how texCUBE works and writing cubemaps properly into a cube rendertarget

    - by cubrman
    My goal is to create accurate reflections, sampled from a dynamic cubemap, for specific 3D objects (mostly lights) in XNA 4.0. To sample the cubemap I compute the 3D reflection vector in a classic way: half3 ReflectionVec = reflect(-directionToCamera, Normal.rgb); I then use the vector to get the actual reflected color: half3 ReflectionCol = texCUBElod(ReflectionSampler, float4(ReflectionVec, 0)); The cubemap I am sampling from is a RenderTarget with 6 flat faces. So my question is: given the 3D world position of an arbitrary 3D object, how can I make sure that I get accurate reflections of this object when I re-render the cubemap? Should I build the ViewProjection matrix in a specific way? Or is there any other approach?
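    One commonly used approach, offered here only as a sketch and not as the definitive fix: render each cubemap face from the reflective object's world position with a 90-degree field of view, using per-face direction and up vectors. The NumPy illustration below shows how those six view matrices could be assembled; look_at is a hypothetical helper standing in for XNA's Matrix.CreateLookAt, and the up-vector convention should be double-checked against the API's cube-face layout.

        # cubemap_views.py - six per-face view matrices for a dynamic cubemap rendered
        # from the reflective object's world position (right-handed look-at, NumPy only).
        import numpy as np

        def look_at(eye, target, up):
            """Build a right-handed view matrix; a stand-in for Matrix.CreateLookAt."""
            forward = target - eye
            forward = forward / np.linalg.norm(forward)
            right = np.cross(forward, up)
            right = right / np.linalg.norm(right)
            true_up = np.cross(right, forward)
            view = np.eye(4)
            view[0, :3], view[1, :3], view[2, :3] = right, true_up, -forward
            view[:3, 3] = -view[:3, :3] @ eye
            return view

        # Face order +X, -X, +Y, -Y, +Z, -Z; up vectors follow the usual D3D-style
        # cube-face convention, but verify against your API's layout.
        FACE_DIRS = [
            (np.array([ 1, 0, 0]), np.array([0, 1,  0])),
            (np.array([-1, 0, 0]), np.array([0, 1,  0])),
            (np.array([ 0, 1, 0]), np.array([0, 0, -1])),
            (np.array([ 0,-1, 0]), np.array([0, 0,  1])),
            (np.array([ 0, 0, 1]), np.array([0, 1,  0])),
            (np.array([ 0, 0,-1]), np.array([0, 1,  0])),
        ]

        def cube_face_views(object_world_pos):
            """One view matrix per cube face, all centred on the reflective object."""
            eye = np.asarray(object_world_pos, dtype=float)
            return [look_at(eye, eye + fwd, up.astype(float)) for fwd, up in FACE_DIRS]

        # Pair each view with a 90-degree, square-aspect projection so the six frusta
        # tile the full sphere around the object with no gaps (projection not shown here).
        views = cube_face_views([2.0, 1.0, -3.0])   # hypothetical object position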

    Read the article

  • Cone of Uncertainty in classic and agile projects

    - by DigiMortal
    David Starr from Scrum.org gave an interesting session at TechEd Europe 2012 - Implementing Scrum Using Team Foundation Server 2012. One of the interesting things for me was how the Cone of Uncertainty looks in agile projects (or how agile methodologies distort the cone we know from waterfall projects). This posting illustrates two cones - one for the waterfall world and one for the agile world.

    Cone of Uncertainty

    The Cone of Uncertainty was introduced to the software development community by Steve McConnell, and it visualizes how accurate our estimates are over the project timeline. Here is the Cone of Uncertainty when we deal with waterfall and Big Design Up-Front (BDUF). [Figure: Cone of Uncertainty. Taken from the MSDN Library page Estimating.] The closer we are to the project end, the more accurate our estimates are. When the project ends we know exactly how much time every task took. As we can see, the cone is widest when we usually have to give our estimates - somewhere between Initial Project Concept and Requirements Complete. Don't ask me why Initial Project Concept is the stage where some companies give their best estimates - they just do it every time and don't learn a thing later. This cone is unavoidable in software development, but agile methodologies that try to make the software world better are also able to change the cone.

    Cone of Uncertainty in agile projects

    Agile methodologies usually try to avoid BDUF, waterfalls and other things that make all our mistakes highly expensive. Of course, we are not the only ones who make mistakes - don't forget our dear customers either. Agile methodologies treat development as creative work and focus on making it better. One main trick is to focus on small and short iterations. What does that mean? We are estimating functionalities that are easier for us to understand and implement, therefore our estimates are more accurate. As we move from a few big iterations to many small iterations, we also distort and slice the Cone of Uncertainty. This is how the cone looks when agile methodologies are used. [Figure: Cone of Uncertainty in agile projects.] We have more cones to live with, but they are way smaller. I don't have any numbers to put here because I didn't find any, but this "chart" should still give you the point: more, smaller iterations cause more, but way smaller, cones of uncertainty. We can handle these small uncertainties because the steps we take to complete small tasks are more predictable and don't often grow above our heads. One more note: consider that both charts given in this posting describe exactly the same phase of the same project - just the uncertainties are different.

    Read the article

  • Is there a (free) reliable place to get statistics from sites, more reliable than Alexa, Quantcast, Compete?

    - by S.gfx
    I mean, it seems there's no way. I am just asking in case someone knows of a recent new site that is more accurate. I am aware of Alexa's, Compete's and Quantcast's inaccuracies and/or their limited range of sites for getting stats. I also know about Website Grader perhaps being a little more accurate (although I'm not sure if that's the data I am after), and I've read that SEOmoz tools are reliable. I am still looking for a free solution, though - a 'reliable' Alexa - and not a place that depends on a toolbar installation, an easy-to-trick place, one with stats that are way off, or one covering a very limited range of sites. I am almost sure there's nothing new, but I wanted to be sure.

    Read the article

  • Leveraging Existing ERP Systems to Support Environmental Accounting and Reporting

    Organizations globally are faced with a complex set of emissions reporting requirements. Driven by country-specific regulatory mandates as well as stakeholder requests for voluntary reporting, companies are under pressure to provide consistent, transparent and accurate collection, measurement and reporting of energy usage and emissions data. In this podcast, you'll hear about how the new Oracle Environmental Accounting and Reporting solution extends the capabilities of Oracle E-Business Suite and JD Edwards Financials to enable organizations to track their greenhouse gas emissions and other environmental data against reduction targets, and to obtain accurate, repeatable and verifiable methodologies for greenhouse gas calculation in accordance with global standards and for both voluntary and legislated emissions reporting schemes.

    Read the article

  • How do I install the NVIDIA driver for a GeForce 6200?

    - by 2d4skt
    I have recently been trying to find a solution for this on the web but did not find anything useful or accurate for Ubuntu 11.10. I also consulted the NVIDIA help, but things there did not work for me. I installed the additional drivers from System Settings, but they are not fully compatible with my GeForce 6200. First I tried finding out how to stop the X server. I succeeded, but another problem was the nouveau kernel module. This is really frustrating. Can anybody tell me an accurate and authentic way to install the NVIDIA drivers?

    Read the article

  • How do you get Windows 7 to show time remaining in the battery meter?

    - by MrDaniel
    Running Microsoft Windows 7 Home Premium on an HP laptop. The system tray power meter never shows the time remaining; it only ever shows a percentage-remaining number, as pictured. The Windows help documentation on the battery meter seems to indicate that it should display a time-remaining estimate. Is this accurate?

    How accurate is the battery meter? The accuracy of what the battery meter reports - what percentage of a full charge remains and how long you can use your laptop before you must plug it in - depends on several factors. Most of these factors fall into the following two categories:

    - What you use the laptop for. Because some activities drain the battery faster than others (for example, watching a DVD consumes more power than reading and writing e-mail), alternating between activities that have significantly different power requirements changes the rate at which your laptop uses battery power. This can vary the estimate of how much battery charge remains.
    - Battery hardware and sensor circuitry. Newer, "smart" batteries are equipped with circuitry that calculates the measurements of charge remaining and reports the information to the battery meter. Older batteries use less sophisticated circuitry and might be less accurate.

    Read the article

  • stopwatch accuracy

    - by oo
    How accurate is System.Diagnostics.Stopwatch? I am trying to do some metrics for different code paths and I need it to be exact. Should I be using Stopwatch, or is there another solution that is more accurate? I have been told that Stopwatch sometimes gives incorrect information.

    Read the article

  • Location detecting techniques for IP addresses

    - by ilhan
    What are the location-detecting techniques for IP addresses? I know of:

    $_SERVER['HTTP_ACCEPT_LANGUAGE'] - not accurate, but mostly useful to detect location; for example, if an IP range's users set French in their browsers, then it means that this range belongs to France

    gethostbyaddr($_SERVER['REMOTE_ADDR']) - to look at the country-code top-level domain

    whois on gethostbyaddr($_SERVER['REMOTE_ADDR']) - sometimes

    $HTTP_USER_AGENT - Firefox's user agent string has a language code; not accurate, but it can mostly be used to detect the location

    But what about cities?
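    As a rough sketch of the Accept-Language heuristic listed above (written in Python rather than PHP, purely for illustration - the header reflects browser settings, not the user's physical location):

        # accept_language.py - crude country guess from an Accept-Language header.

        def guess_regions(accept_language):
            """Return region codes (e.g. 'FR') found in an Accept-Language header, in header order."""
            regions = []
            for part in accept_language.split(","):
                tag = part.split(";")[0].strip()        # drop any ';q=' weight
                pieces = tag.split("-")
                if len(pieces) > 1:                     # 'fr-FR' -> region 'FR'
                    regions.append(pieces[1].upper())
            return regions

        print(guess_regions("fr-FR,fr;q=0.9,en-US;q=0.8,en;q=0.7"))  # ['FR', 'US']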

    Read the article

  • Developing a Cost Model for Cloud Applications

    - by BuckWoody
    Note - please pay attention to the date of this post. As much as I attempt to make the information below accurate, the nature of distributed computing means that components, units and pricing will change over time. The definitive costs for Microsoft Windows Azure and SQL Azure are located here, and are more accurate than anything you will see in this post: http://www.microsoft.com/windowsazure/offers/

    When writing software that is run on a Platform-as-a-Service (PaaS) offering like Windows Azure / SQL Azure, one of the questions you must answer is how much the system will cost. I will not discuss the comparisons between on-premise costs (which are nigh impossible to calculate accurately) versus cloud costs, but instead focus on creating a general model for estimating costs for a given application. You should be aware that there are (at this writing) two billing mechanisms for Windows and SQL Azure: "Pay-as-you-go" or consumption, and "Subscription" or commitment. Conceptually, you can consider the former a pay-as-you-go cell phone plan, where you pay by the unit used (at a slightly higher rate), and the latter a standard cell phone plan where you commit to a contract and thus pay lower rates. In this post I'll stick with the pay-as-you-go mechanism for simplicity, which should be the maximum cost you would pay. From there you may be able to get a lower cost if you use the other mechanism. In any case, the model you create should hold.

    Developing a good cost model is essential. As a developer or architect, you'll most certainly be asked how much something will cost, and you need to have a reliable way to estimate that. Businesses and organizations have been used to paying for servers, software licenses, and other infrastructure as an up-front cost, and power, people to run the systems and so on as an ongoing (and sometimes not factored) cost. When presented with a new paradigm like distributed computing, they may not understand the true cost/value proposition, and that's where the architect and developer can guide the conversation to make a choice based on features of the application versus the true costs.

    The two big buckets of use-types for these applications are customer-based and steady-state. In the customer-based use type, each successful use of the program results in a sale or income for your organization. Perhaps you've written an application that provides the spot-price of foo, and your customer pays for the use of that application. In that case, once you've estimated your cost for a successful traversal of the application, you can build that into the price you charge the user. It's a standard restaurant model, where the price of the meal is determined by the cost of making it, plus any profit you can make. In the second use-type, the application will be used by a more-or-less constant number of processes or users and no direct revenue is attached to the system. A typical example is a customer-tracking system used by the employees within your company. In this case, the cost model is often created "in reverse" - meaning that you pilot the application, monitor the use (and costs) and that cost is held steady. This is where the comparison with an on-premise system becomes necessary, even though it is more difficult to estimate those on-premise true costs. For instance, do you know exactly how much the air conditioning costs because you have a team of system administrators? This may sound trivial, but that, along with the insurance for the building, the wiring, and every other part of the system, is in fact a cost to the business.

    There are three primary methods that I've been successful with in estimating the cost. None are perfect; all are demand-driven. The general process is to lay out a matrix of:

    - components
    - units
    - cost per unit

    and then multiply that times the usage of the system, based on which components you use in the program. That sounds a bit simplistic, but using those metrics in a calculation becomes more detailed. In all of the methods that follow, you need to know your application. The components for a PaaS include computing instances, storage, transactions, bandwidth and, in the case of SQL Azure, database size. In most cases, architects start with the first model and progress through the other methods to gain accuracy.

    Simple Estimation

    The simplest way to calculate costs is to architect the application (even UML or on-paper, no coding involved) and then estimate which of the components you'll use, and how much of each will be used. Microsoft provides two tools to do this. One is a simple slider application located here: http://www.microsoft.com/windowsazure/pricing-calculator/  The other is a tool you download to create a "Return on Investment" (ROI) spreadsheet, which has the advantage of leading you through various questions to estimate what you plan to use, located here: https://roianalyst.alinean.com/msft/AutoLogin.do?d=176318219048082115  You can also just create a spreadsheet yourself with a structure like this: Program Element | Azure Component | Unit of Measure | Cost Per Unit | Estimated Use of Component | Total Cost Per Component | Cumulative Cost. Of course, the consideration with this model is that it is difficult to predict a system that is not running or hasn't even been developed. Which brings us to the next model type.

    Measure and Project

    A more accurate model is to actually write the code for the application, using the Software Development Kit (SDK), which can run entirely disconnected from Azure. The code should be instrumented to estimate the use of the application components, logging to a local file on the development system. A series of unit and integration tests should be run, which will create load on the test system. You can use standard development concepts to track this usage, and even use Windows Performance Monitor counters. The best place to start with this method is to use the Windows Azure Diagnostics subsystem in your code, which you can read more about here: http://blogs.msdn.com/b/sumitm/archive/2009/11/18/introducing-windows-azure-diagnostics.aspx This set of APIs greatly simplifies tracking the application, and in fact you can use this information for more than just a cost model. After you have the tracking logs, you can plug the numbers into any of the tools above, which should give a representative cost or in some cases a unit cost. The consideration with this model is that the SDK fabric is not a one-to-one comparison with performance on the actual Windows Azure fabric. Those differences are usually small, but they do need to be considered. Also, you may not be able to accurately predict the load on the system, which might lead to an architectural change, which changes the model. This leads us to the next, most accurate method for a cost model.

    Sample and Estimate

    Using standard statistical and other predictive math, once the application is deployed you will get a bill each month from Microsoft for your Azure usage. The bill is quite detailed, and you can export the data from it to do analysis and, using methods like regression and so on, project out into the future what the costs will be. I normally advise that the architect also extrapolate a unit cost from those metrics as well. This is the information that should be reported back to the executives that pay the bills: the past cost, future projected costs, and unit cost "per click" or "per transaction", as your case warrants. The challenge here is in the model itself - statistical methods are not foolproof, and the size of the sample (in this case I recommend the entire population, not a smaller sample) is key.

    References and Tools

    Articles:
    http://blogs.msdn.com/b/patrick_butler_monterde/archive/2010/02/10/windows-azure-billing-overview.aspx
    http://technet.microsoft.com/en-us/magazine/gg213848.aspx
    http://blog.codingoutloud.com/2011/06/05/azure-faq-how-much-will-it-cost-me-to-run-my-application-on-windows-azure/
    http://blogs.msdn.com/b/johnalioto/archive/2010/08/25/10054193.aspx
    http://geekswithblogs.net/iupdateable/archive/2010/02/08/qampa-how-can-i-calculate-the-tco-and-roi-when.aspx

    Other Tools:
    http://cloud-assessment.com/
    http://communities.quest.com/community/cloud_tools
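    To make the "components x units x cost per unit" matrix concrete, here is a minimal Python sketch of the Simple Estimation step; every component name, unit price and usage figure in it is a made-up placeholder for illustration, not current Azure pricing:

        # cost_model.py - toy version of the component / unit / cost-per-unit matrix.
        # All prices and usage figures below are hypothetical placeholders, not real rates.

        COST_PER_UNIT = {
            "compute_hours":    0.12,  # $ per instance-hour (placeholder)
            "storage_gb":       0.15,  # $ per GB-month (placeholder)
            "transactions_10k": 0.01,  # $ per 10,000 storage transactions (placeholder)
            "bandwidth_gb":     0.10,  # $ per GB egress (placeholder)
            "sql_db_gb":        9.99,  # $ per database GB-month (placeholder)
        }

        def estimate_monthly_cost(estimated_usage):
            """Multiply estimated usage by unit cost; return per-component costs and the total."""
            line_items = {c: units * COST_PER_UNIT[c] for c, units in estimated_usage.items()}
            return line_items, sum(line_items.values())

        usage = {"compute_hours": 2 * 730, "storage_gb": 50, "transactions_10k": 120,
                 "bandwidth_gb": 80, "sql_db_gb": 5}
        items, total = estimate_monthly_cost(usage)
        for component, cost in items.items():
            print(f"{component:>16}: ${cost:8.2f}")
        print(f"{'total':>16}: ${total:8.2f}")

    Dividing the total by the expected number of successful traversals then gives the "per transaction" unit cost mentioned for the customer-based use type.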

    Read the article

  • Problem with running totals in jquery

    - by rshivers
    I'm having an issue trying to get an accurate running total for my calculations. When you enter numbers into the input field I get an accurate total for that line item, but the grand total comes out to a higher number. Note that this is a dynamic form and that the ids will change depending on how many form fields I have added to the form. Also, I have it set to make the calculations onKeyUp for each input field instead of a calculate button. The code that calculates a single item is this:

        function calcLineItem(id) {
            var id = $(id).attr("id");
            var Item1 = $("#Item1" + id).val();
            var Item2 = $("#Item2" + id).val();
            var Item3 = $("#Item3" + id).val();
            function calcTotal(Item1, Item2, Item3) {
                var total;
                total = Math.round((Item1 * Item2) * Item3);
                return total;
            }
            $("#total" + id).text(calcTotal(Item1, Item2, Item3));
            calcAllFields();
        }

    This will give me the total of this particular input field. The function at the end, calcAllFields(), is supposed to do the calculations for all items in my form to give me the grand total of all input fields:

        function calcAllFields(id) {
            var id = $(id).attr("id");
            $('#target1').text($("#total" + id).map(function() {
                var currentValue = parseFloat(document.getElementById("currentTotal").value);
                var newValue = parseFloat($("#total" + id).text());
                var newTotal = currentValue + newValue;
                document.getElementById("currentTotal").value = newTotal;
                return newTotal;
            }).get().join());
        }

    The variable currentTotal is getting its value from a hidden field on my form:

        <input type="hidden" id="currentTotal" value="0">

    As I enter numbers into a field, the calculation for that line will be accurate, but the grand total will be inaccurate because the value of currentTotal will continue to increment with every keystroke I make in the input field. Any ideas on how to avoid this from happening?

    Read the article

  • Exchange 2003 resource scheduling with mixed client versions

    - by Daniel Lucas
    We run Exchange 2003, but have a mix of Outlook 2003/2007/2010 in the environment. We have three rooms that need to be configured as resources. Some observations we've made with resource scheduling/booking are:

    - Outlook 2010 users have trouble with the native Exchange 2003 resource scheduling method and require direct booking to be configured via the registry
    - Outlook 2007 users are unable to use direct booking (is this accurate?)
    - Outlook 2003 users can only use the native Exchange 2003 resource scheduling method (is this accurate?)
    - Direct booking cannot be combined with the auto-accept agent

    What is the correct way to set up resource scheduling in a mixed environment like this? Thanks, Daniel

    Read the article

  • Why is my NTP controlled computer clock two minutes ahead?

    - by Martin Liversage
    The clock in my computer is configured to be synchronized using NTP. To verify this I have tried two NTP clients using various NTP servers. My computer and the NTP clients are in complete agreement about the current time even across a wide range of NTP servers. I also have a GPS and my national phone company provides an accurate clock available by calling a specific phone number. Both my GPS and the phone company agrees on the current time. However, my computer is almost precisely two minutes (or 1 minute and 59 seconds) ahead of what I believe to be the "real" current time where I live. Why is my computer two minutes ahead? I realize that synchronizing clocks using the internet may not be entirely accurate as there is latency, but two minutes is a very long time on the internet. Is NTP really two minutes ahead? I'm running Windows 7 and live in the time zone UTC+1, but I don't think that is important in understanding my problem.
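    One way to cross-check whether NTP itself agrees with the local clock is to ask an NTP server for the offset directly. A small Python sketch using the third-party ntplib package (the package and server name are assumptions for illustration, not something from the question):

        # ntp_offset.py - report how far the local clock is from an NTP server's time.
        # Requires: pip install ntplib
        import ntplib
        from datetime import datetime, timezone

        client = ntplib.NTPClient()
        response = client.request("pool.ntp.org", version=3)

        # offset is the estimated difference (in seconds) between this machine's clock and
        # the server's; an absolute value near 120 would match the two-minute discrepancy.
        print(f"offset from NTP server: {response.offset:+.3f} seconds")
        print("server time (UTC):", datetime.fromtimestamp(response.tx_time, tz=timezone.utc))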

    Read the article

  • Extract part of an image from a big image

    - by rajat
    I have 6 images, and each image has a certain section that I want to save as a separate image. The problem is that it has to be accurate, because I am doing some animation using the sub-images, so they should match exactly. So I want to accurately extract that part from each of the 6 images. I can't do it using an image editor in which I have to draw the bounding box myself, because it will not be accurate. Is there any program that lets me do this by defining a box using numerical values? PS: I don't want to write a MATLAB or OpenCV program for this.

    Read the article

  • Oracle Outsourced Repair Solution: The “Control Tower” for the Reverse Supply Chain

    - by John Murphy
    By Hannes Sandmeier, Vice President of cMRO and Depot Repair Development

    Smart businesses are increasing their focus on core competencies and aggressively cutting costs in their supply chains. Outsourcing repairs can enable a business to focus on what they do best and most profitably while delivering top-notch customer service through partners that specialize in reverse logistics and repair. A well managed "virtual service organization" can deliver fast turn times, lower costs and high customer satisfaction. A poorly managed partner network can deliver disaster for your business. Managing a virtual service organization requires accurate, real-time information and collaboration tools that enable smart, informed and immediate corrective action. To meet this need, Oracle has released the Oracle Outsourced Repair Solution to provide the "control tower" for managing outsourced reverse supply chain operations from customer complaint through remediation to partner claim settlement.

    The new solution provides real-time visibility to return status, location, turn time, discrepancies and partner performance. Additionally, its web portals allow partners and carriers to view assigned work, request parts, enter data, capture time and submit claims. Leveraging the combined power of Oracle E-Business Suite and Oracle E-Business Suite Extensions for Oracle Endeca, the Oracle Outsourced Repair Solution provides a comprehensive set of tools that range from quick online partner registration to partner claim reconciliation, from capturing parts and labor to Oracle Cost Management and Financials integration, and from part requisition to waste and hazmat controls. These tools empower service operations managers to:

    · Increase customer satisfaction: Ensure customers are satisfied by holding partners accountable for the speed and quality of repairs, and taking immediate corrective action when things go wrong
    · Reduce costs: Remove waste from the repair process using accurate job cost and cost breakdown data
    · Increase return velocity: Users have the tools to view all orders in flight and immediately know the current location, status, owner and contact point for repairs, so as to be able to remove bottlenecks, resolve discrepancies and manage escalations

    The Oracle Outsourced Repair Solution further demonstrates Oracle's commitment to helping supply chain professionals and service managers deliver high customer satisfaction at the lowest cost. For more information on the Oracle Outsourced Repair Solution, visit here.

    Read the article

  • Don’t miss the Oracle Webcast: Enabling Effective Decision Making with “One Source of the Truth” at BB&T

    - by Rob Reynolds
    Webcast Date: September 17th, 2012 - 9 a.m. PT / 12 p.m. ET

    BB&T Corporation (NYSE: BBT) is one of the largest financial services holding companies in the United States. One of their IT goals is to provide "one source of truth" to enable more effective decision making at the corporate and local level. By using Oracle's Hyperion Enterprise Planning Suite and Oracle Essbase, BB&T streamlined their planning and financial reporting processes. Large volumes of data were consolidated into a single reporting solution, giving stakeholders more timely and accurate information. By providing a central and automated collaboration tool, BB&T is able to prepare more accurate financial forecasts, rapidly consolidate large amounts of data, and make more informed decisions. Join us on September 17th for a live webcast to hear about BB&T's journey to achieve "One Source of Truth" and learn how Oracle's Hyperion Planning Suite and Oracle Essbase allow you to:

    - Adopt best practices like rolling forecasts and driver-based planning
    - Reduce the time and effort dedicated to the annual budget process
    - Remove forecasting uncertainty with predictive modeling capabilities
    - Rapidly analyze shifting market conditions with a powerful calculation engine
    - Prioritize resources effectively with complete visibility into all potential risks
    - Link strategy and execution with integrated strategic, financial and operational planning

    Register here.

    Read the article

  • SQL SERVER – T-SQL Scripts to Find Maximum between Two Numbers

    - by pinaldave
    There are plenty of things in life one can make simple, and I really believe in doing so. Yesterday I was traveling for a community-related activity. At the airport on my way back I met a SQL enthusiast. He asked me if there is any simple way to find the maximum of two numbers in SQL Server. I asked him what he really meant by "simple way" and requested that he demonstrate his code for finding the maximum of two numbers. Here is his code:

        DECLARE @Value1 DECIMAL(5,2) = 9.22
        DECLARE @Value2 DECIMAL(5,2) = 8.34
        SELECT (0.5 * ((@Value1 + @Value2) + ABS(@Value1 - @Value2))) AS MaxColumn
        GO

    I thought his logic was accurate, but the same script can be written another way. I quickly wrote the following code for him, which worked just fine for him. Here is my code:

        DECLARE @Value1 DECIMAL(5,2) = 9.22
        DECLARE @Value2 DECIMAL(5,2) = 8.34
        SELECT CASE WHEN @Value1 > @Value2 THEN @Value1 ELSE @Value2 END AS MaxColumn
        GO

    He agreed that my code is much simpler, but according to him there is some problem with my code which he apparently does not remember at this time. There are cases when his code will give accurate values and my code will not. I think his comment has value, but for the moment neither of us could come up with any valid reason. Do you think there is any scenario where his code will work and my suggested code will not?

    Reference: Pinal Dave (http://blog.SQLAuthority.com)

    Filed under: Pinal Dave, PostADay, SQL, SQL Authority, SQL Query, SQL Scripts, SQL Server, SQL Tips and Tricks, T SQL, Technology
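    Purely to illustrate the two approaches side by side (this is not an answer to the open question above), the same identity can be tried outside SQL Server. A small Python sketch using Decimal, which only loosely mirrors SQL Server's DECIMAL behaviour:

        # max_two_ways.py - the arithmetic identity max(a, b) = (a + b + |a - b|) / 2
        # versus a plain comparison, checked on a few values.
        from decimal import Decimal

        def max_arithmetic(a, b):
            return (a + b + abs(a - b)) / 2

        def max_compare(a, b):
            return a if a > b else b

        pairs = [(Decimal("9.22"), Decimal("8.34")),
                 (Decimal("-5.00"), Decimal("-2.50")),
                 (Decimal("3.30"), Decimal("3.30"))]

        for a, b in pairs:
            print(a, b, "->", max_arithmetic(a, b), max_compare(a, b))

        # Both agree on these inputs. The arithmetic version does extra intermediate work
        # (the sum a + b), which is where questions about type precision can arise in SQL,
        # while the comparison version never computes a value larger than its inputs.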

    Read the article

  • The Hot-Add Memory Hogs

    - by Andrew Clarke
    One of the more difficult tasks when virtualizing a server is to determine the amount of memory that the hypervisor should assign to the virtual machine. This requires accurate monitoring and, because of the consequences of setting the value too low, there is a great temptation to err on the side of over-provisioning. This results in fewer guest VMs; in fact, with more accurate memory provisioning, many virtual environments could support 30% more VMs. In order to achieve a better consolidation (aka VM density) ratio, Windows Server 2008 R2 SP1 has introduced what Microsoft calls 'Dynamic Memory'. This means that the start-up RAM assigned to guest virtual machines can be allowed to vary according to demand, changing dynamically while the VM is running, based on the workload of applications running inside. If demand outstrips supply, then memory can be rationed according to the 'memory weight' assigned to the guest VM. By this mechanism, memory becomes a shared resource that can be reallocated automatically as demand patterns vary. Unlike VMware's Memory Overcommit technology, the sum of all the memory allocations to each virtual machine will not exceed the total memory of the host computer.

    This is fine for applications that are self-regulating in their demands for memory, releasing memory back into the 'pool' when not under peak load. Other applications, however, such as SQL Server Standard and Enterprise, are by nature memory hogs under high workload; they can grab hot-add memory whilst running under load and then never release it. This requires more careful setting-up, and the SQLOS team have provided some guidelines for configuring SQL Server in virtual environments. Whereas VMware's Memory Overcommit is well-proven in a number of different configurations, Hyper-V's 'Dynamic Memory' is new. So far, the indications are that it will improve the business case for virtualizing, and it is probably a far more intuitive technology for the average IT professional to grasp. It is certainly worth testing to see whether it works for you.

    Read the article

  • Out-of-the-Box Integration Links Primavera Solutions with PeopleSoft Projects Applications

    - by Sylvie MacKenzie, PMP
    In a move that brings best-in-class enterprise project portfolio management to Oracle's PeopleSoft enterprise resource planning customers, Oracle announced the integration of Oracle's PeopleSoft projects applications and Oracle's Primavera P6 Enterprise Project Portfolio Management. The combination of PeopleSoft financial controls and Primavera portfolio management capabilities brings greater oversight of end-to-end processes to help organizations improve the planning and execution efforts needed to deliver projects on time and within budget. "As an organization with many high-value, project-driven initiatives, we are very pleased to see Oracle's investment in this important integration," says Janardhanan Sankar, senior vice president for technology and quality at ITC Infotech India Ltd.

    Oracle's PeopleSoft projects applications enable project-centric organizations and departments to establish core operational processes for full project lifecycle management across operations and finance. The integration with Primavera P6 Enterprise Project Portfolio Management means organizations can eliminate costly and difficult-to-maintain proprietary integrations. Organizations can also standardize on the Oracle technologies to:

    - Align back-office budgets and costs with project operations to help ensure accurate forecasting of costs, resources, and schedules
    - Provide an accurate single source of truth to financial managers and analysts using Oracle's PeopleSoft projects applications, and to project managers using Primavera P6 Enterprise Project Portfolio Management
    - Enhance project collaboration and execution by having all users utilize common solutions to communicate, plan, and deliver projects

    "By bringing together Oracle's PeopleSoft projects applications and Oracle's Primavera P6 Enterprise Project Portfolio Management, we are able to provide customers with the infrastructure they need to achieve a single source of truth on the projects they are managing," says Paco Aubrejuan, Oracle's group vice president and general manager, PeopleSoft. "This real-time visibility drives profitability, increases productivity, and improves operations." For more information, view the on-demand Webcast, "Bridging Business Processes for Optimal Portfolio Performance," or read about the new integration.

    Read the article
