Search Results

Search found 5335 results on 214 pages for 'agile processes'.


  • Low hanging fruit where "a sufficiently smart compiler" is needed to get us back to Moore's Law?

    - by jamie
    Paul Graham argues that: It would be great if a startup could give us something of the old Moore's Law back, by writing software that could make a large number of CPUs look to the developer like one very fast CPU. ... The most ambitious is to try to do it automatically: to write a compiler that will parallelize our code for us. There's a name for this compiler, the sufficiently smart compiler, and it is a byword for impossibility. But is it really impossible? Can someone provide a concrete example where a parallelizing compiler would solve a pain point? Web-apps don't appear to be a problem: just run a bunch of Node processes. Real-time raytracing isn't a problem: the programmers are writing multi-threaded, SIMD assembly language quite happily (indeed, some might complain if we make it easier!). The holy grail is to be able to accelerate any program, be it MySQL, Garage Band, or Quicken. I'm looking for a middle ground: is there a real-world problem that you have experienced where a "smart-enough" compiler would have provided a real benefit, i.e. one that someone would pay for?
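
    For context, this is the kind of rewrite such a compiler would have to discover and apply automatically. A sketch (not from the original post, using OpenMP as a stand-in for what programmers do by hand today):

        #include <vector>
        #include <cstddef>

        // Serial version: what most application code looks like today.
        double sum_serial(const std::vector<double>& v) {
            double total = 0.0;
            for (std::size_t i = 0; i < v.size(); ++i) total += v[i];
            return total;
        }

        // Hand-parallelized version: the "sufficiently smart compiler" would have to
        // prove this transformation is safe and perform it for us, across far messier code.
        double sum_parallel(const std::vector<double>& v) {
            double total = 0.0;
            #pragma omp parallel for reduction(+:total)
            for (long i = 0; i < static_cast<long>(v.size()); ++i) total += v[i];
            return total;
        }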

    Read the article

  • Pattern for Accessing MySQL connection

    - by Dipan Mehta
    We have a C++ application that accesses a MySQL database. There are several (about 5 or so) threads in the application (using the Boost library for threading), and each thread has a few objects, each of which accesses the database for its own purpose. It has a simple ORM kind of model, but that really is not an important factor here. There are three potential access patterns I can think of: There could be a single connection object per application or per thread, shared between all objects (or a group of them). The object needs to be thread safe and there will be contention, but MySQL will not be hit with too many connections. Every object could initiate a connection on its own. The database needs to take care of concurrency (which I think MySQL can) and the design could be much simpler. There are two possibilities here: a. each object keeps a persistent connection for its life, or b. each object initiates a connection as and when needed. To reduce the contention of case 1 without creating as many sockets as case 2, we can have group/set-based connections. So there could be more than one connection (say N), and each of these connections could be shared across M objects. Naturally, each of the patterns has a different resource cost and would work under different constraints and objectives. What criteria should I use to choose the pattern for my own application? What are some of the advantages and disadvantages of each of these patterns over the others? Is there any other pattern which is better? PS: I have been through these questions: mysql, one connection vs multiple and MySQL with multiple threads and processes. But they don't quite answer exactly what I am trying to ask.
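
    As an illustration of the third pattern, here is a minimal sketch of a pool of N connections shared by M objects (connection parameters are placeholders, error handling is omitted, and it uses std::mutex/std::condition_variable rather than the Boost equivalents):

        // Pattern 3 sketch: N shared MySQL connections, assuming libmysqlclient (mysql.h).
        #include <mysql/mysql.h>
        #include <condition_variable>
        #include <cstddef>
        #include <mutex>
        #include <vector>

        class ConnectionPool {
        public:
            explicit ConnectionPool(std::size_t n) {
                for (std::size_t i = 0; i < n; ++i) {
                    MYSQL* c = mysql_init(nullptr);
                    // host/user/password/schema below are placeholder values
                    mysql_real_connect(c, "localhost", "appuser", "secret", "appdb", 0, nullptr, 0);
                    free_.push_back(c);
                }
            }
            ~ConnectionPool() {
                for (MYSQL* c : free_) mysql_close(c);
            }
            MYSQL* acquire() {                    // blocks until one of the N connections is free
                std::unique_lock<std::mutex> lock(m_);
                cv_.wait(lock, [this] { return !free_.empty(); });
                MYSQL* c = free_.back();
                free_.pop_back();
                return c;
            }
            void release(MYSQL* c) {              // hand the connection back to the pool
                { std::lock_guard<std::mutex> lock(m_); free_.push_back(c); }
                cv_.notify_one();
            }
        private:
            std::mutex m_;
            std::condition_variable cv_;
            std::vector<MYSQL*> free_;
        };

    Each object calls acquire() before a query and release() afterwards, so contention is spread over N sockets instead of one, without opening a socket per object.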

    Read the article

  • BPM11g Launch - Spotlight on Innovation: A Unified Business Process Management Solution June 17th 2

    - by Jürgen Kress
    Spotlight on Innovation: A Unified Business Process Management Solution. Thursday, June 17th, 2010, 10 a.m. PT / 18:00 UK / 19:00 CET. Presented by: Hasan Rizvi, Senior Vice President, Oracle Product Management. Business Process Management (BPM) is essential for managing change and increasing business visibility, agility, and efficiency. To make the most of BPM, businesses today need to benefit from a new generation of process management solutions. Join Hasan Rizvi, Senior Vice President, Oracle Product Development, as he discusses Oracle’s innovations in the new BPM Suite 11g, which will define the next generation of process management. Discover how you can leverage this complete, open, and integrated BPM solution that delivers:
    - Management of all types of processes, including system, human, document, and decision-centric
    - A simplified path to achieving greater business visibility, agility, and efficiency
    - A unified process foundation that simplifies process management with a unified process engine and preintegration of process subsystems
    - User-centric design that simplifies process modeling and interaction
    - Social BPM interaction that provides social computing in the context of BPM to simplify and add richness to collaboration
    Register today for this live Webcast, another edition in a series introducing the next wave of products in Oracle Fusion Middleware 11g. Did you miss our BPM 11g webcast with Clemens Utschig-Utschig? The recorded version is now available! Here is your feedback:
    - First experience with BPMN 2.0 in Oracle BPM Studio 11g by Hajo Normann
    - Warum Oracle BPM Studio 11g? by Torsten Winterberg
    - Oracle BPM 11g, less is more by Léon Smiers
    - Oracle BPM 11g Integration with ADF and WebCenter Suite by Andrejus Baranovskis
    - Oracle BPM11g available! by Guido Schmutz
    Listen to more feedback here. If you are working on BPM 11g projects and you would like to attend a hands-on training session, please contact Jürgen Kress. Technorati Tags: BPM,BPMN2.0,SOA,Hasan Rizvi,SOA Partner Community

    Read the article

  • SQL SERVER – Using MAXDOP 1 for Single Processor Query – SQL in Sixty Seconds #008 – Video

    - by pinaldave
    Today’s SQL in Sixty Seconds video is inspired by my presentation at TechEd India 2012 on Speed up! – Parallel Processes and Unparalleled Performance. There are always special cases when it comes to SQL Server. There are always a few queries which give optimal performance when executed on a single processor, and there are always queries which give optimal performance when executed on multiple processors. I will be presenting how to identify such queries, as well as the best practices related to them. In this quick video I demonstrate how, if a query gives optimal performance when running on a single CPU, one can restrict it to a single CPU by using the hint OPTION (MAXDOP 1). More on Errors: Difference Temp Table and Table Variable – Effect of Transaction Effect of TRANSACTION on Local Variable – After ROLLBACK and After COMMIT Debate – Table Variables vs Temporary Tables – Quiz – Puzzle – 13 of 31 I encourage you to submit your ideas for SQL in Sixty Seconds. We will try to accommodate as many as we can. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: Database, Pinal Dave, PostADay, SQL, SQL Authority, SQL in Sixty Seconds, SQL Query, SQL Scripts, SQL Server, SQL Tips and Tricks, SQLServer, T SQL, Video
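
    As a quick illustration of the hint itself (the table and column names below are placeholders, not taken from the video), it is simply appended to the end of the statement:

        SELECT soh.CustomerID, SUM(soh.TotalDue) AS TotalSpend
        FROM Sales.SalesOrderHeader AS soh
        GROUP BY soh.CustomerID
        OPTION (MAXDOP 1);  -- restrict this query to a single CPU (serial plan)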

    Read the article

  • Lenovo ThinkPad L520 slows down when AC power adapter is plugged in

    - by Aamir
    I have a new laptop, a Lenovo ThinkPad L520 (7859-5BG), Core i5-2520M (2.5GHz) with 4GB RAM. Having installed Ubuntu 11.10 32-bit, while browsing with Chrome on GNOME Classic (no effects), I noticed 173% CPU usage by the Chrome browser process, and the system gradually became very slow. At this stage, as soon as I removed the power adapter, the system suddenly got faster (and stopped the lagging behavior) and CPU usage dropped to 48%! Observation 1: I was browsing with Chrome when my system seemed to be seriously lagging, so I killed Chrome to see if it got any faster. But there remained no difference. Notice that CPU usage was a bit strange here: it showed no high activity, but as soon as I clicked on Applications in the GNOME panel, CPU usage would shoot to 70, 80, 90 or 143%, etc., depending on how quickly I clicked back and forth. At this instant I removed the AC adapter from my laptop, and suddenly the system was fine. So I again clicked on the GNOME panel, and noticed that it now took only 7%, 12% or 13% at most, with the same kind of clicks in the application menu. Observation 2: At other times, with the AC adapter plugged in, top indicates four instances of Chromium taking 90%, 60%, 47% and 2% (for example), and once I take out the AC adapter the same processes suddenly take less CPU. Intermediate conclusions: What does this indicate? I cannot figure out any "other" process in top that is suddenly being triggered; it is the same process that hogs my CPU once AC power is plugged in! NOTE: the problem is now CONFIRMED, as I can repeat it whenever I have the power adapter plugged in! Can anyone tell me what exactly this indicates? What is wrong, is it some bug with power management or what?
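
    For what it's worth, one way to check whether CPU frequency scaling is part of the picture (a diagnostic sketch, not from the original report) is to compare the governor and current clock speed with the adapter plugged in and then unplugged:

        # Current frequency-scaling governor and clock speed for each core
        cat /sys/devices/system/cpu/cpu*/cpufreq/scaling_governor
        cat /sys/devices/system/cpu/cpu*/cpufreq/scaling_cur_freq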

    Read the article

  • Does concurrency inherently introduce "randomness" into a game?

    - by Jeff
    When a game is implemented with concurrency (as most games are), does this necessarily, by its very nature, introduce an element of randomness into the game that is outside of the players' control? Note that when I use the word "random", I'm not meaning to launch into a philosophical debate about the deterministic nature of the system. I understand that concurrency is deterministic in the sense that the operating system decides which processes to allow time on the CPU and in what order (or the JVM controls which Thread's turn it is to execute, etc). But my understanding of this is that there is no way to control or predict whether one thread's next command will execute before or after another. The reason I'm asking is because this seems like a fundamental difficulty for game development where a game is supposedly designed around a player's skill. Consider a game like League of Legends. Assume that two players are battling it out. It's a very close contest between the two and it's coming down to the wire -- so much so that whoever gets their last attack off will be the one to kill the other and win the game for their team. If the players are implemented using concurrency and the situation really was like this, is it essentially out of the players' hands at this point? Is the outcome of this match all up to whatever system is arbitrarily deciding which player's thread/process will execute next? If not, what am I misunderstanding about concurrency? If so, is there any way around this problem so that a game of skill can always be a game of skill, especially in those most crucial moments?
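
    As a minimal illustration (not from the original post) of why the ordering is outside the players' hands, here are two threads racing to land the same final attack; which message prints can differ from one run to the next because the scheduler decides who gets there first:

        #include <atomic>
        #include <iostream>
        #include <thread>

        int main() {
            std::atomic<bool> target_alive{true};

            auto attack = [&](const char* player) {
                // exchange() returns the previous value, so exactly one thread sees "true"
                if (target_alive.exchange(false)) {
                    std::cout << player << " lands the killing blow\n";
                }
            };

            std::thread p1(attack, "Player 1");
            std::thread p2(attack, "Player 2");
            p1.join();
            p2.join();
        }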

    Read the article

  • Tuning B2B Server Engine Threads in SOA Suite 11g

    - by Shub Lahiri, A-Team
    Background: B2B 11g has a number of parameters that can be tweaked to tune the engine for handling high volumes of messages. These parameters are also known as B2B server properties and are managed via the EM console. This note highlights one aspect of the tuning exercise and describes the different threads that can be configured to tune the performance of a B2B server. Symptoms: The most common indicator of a B2B engine in need of tuning is the constant build-up of messages in an internal JMS queue within the B2B server. It is called B2B_EVENT_QUEUE and can be monitored via the WebLogic server console. Whenever such behaviour is seen, it usually results in general degradation of performance. Remedy: There could be many contributing factors behind a B2B server's degradation of performance. However, one of the first places to tune the server from the out-of-the-box, default configuration is the number of internal engine threads allocated within the B2B server. Usually the default configuration for the B2B server engine threads is not suitable for high-volume messaging loads. So it is necessary to increase the counts for three types of such threads, by specifying the appropriate B2B server properties via the EM console, namely: Inbound - b2b.inboundThreadCount, Outbound - b2b.outboundThreadCount, Default - b2b.defaultThreadCount. The function of these threads is fairly self-explanatory. In other words, the inbound threads process the inbound messages that are coming into the B2B server from an external endpoint. Similarly, the outbound threads process the messages that are sent out from the B2B server. The default threads are responsible for certain B2B server-specific special tasks. In case the inbound and outbound thread counts are not specified, the default thread count also dictates the total number of inbound and outbound threads. As in any tuning exercise, the optimisation of these threads is usually reached via an iterative process. The best working combination of thread counts is directly related to the system infrastructure, traffic load and several other environmental factors.

    Read the article

  • Force apt to remove all emacs*

    - by wishi
    Hi! I have a bug-problem with the apt packages of emacs:

        >>Error occurred processing debian-ispell.el: File error (("Opening input file" "no such file or directory" "/usr/share/emacs23/site-lisp/dictionaries-common/debian-ispell.el"))
        >>Error occurred processing ispell.el: File error (("Opening input file" "no such file or directory" "/usr/share/emacs23/site-lisp/dictionaries-common/ispell.el"))
        >>Error occurred processing flyspell.el: File error (("Opening input file" "no such file or directory" "/usr/share/emacs23/site-lisp/dictionaries-common/flyspell.el"))
        emacs-install: /usr/lib/emacsen-common/packages/install/dictionaries-common emacs23 failed at /usr/lib/emacsen-common/emacs-install line 28, <TSORT> line 30.
        dpkg: error processing emacs23-lucid (--configure):
         subprocess installed post-installation script returned error exit status 1
        dpkg: dependency problems prevent configuration of emacs:
         emacs depends on emacs23 | emacs23-lucid | emacs23-nox; however:
          Package emacs23 is not installed.
          Package emacs23-lucid which provides emacs23 is not configured yet.
          Package emacs23-nox which provides emacs23 is not installed.
        Package emacs23-lucid is not configured yet.
        Package emacs23-nox is not installed.
        dpkg: error processing emacs (--configure):
         dependency problems - leaving unconfigured
        No apport report written because the error message indicates its a followup error from a previous failure.
        Errors were encountered while processing:
         emacs23-lucid
         emacs
        E: Sub-process /usr/bin/dpkg returned an error code (1)

    In fact I would be satisfied with just emacs23-nox and a couple of plugins from apt. But I can neither --purge, nor --purge reinstall, nor remove the packages; it always fails at this same error. I did some Google searching and found some suggestions on Launchpad like:

        sudo apt-get install --reinstall --purge emacsen-common

    But the result is the same... so I hope there is a way to tell apt to just remove everything related to emacs and start from scratch again? Thanks, Marius
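
    One approach that usually gets past a broken maintainer script (a sketch: the package names come from the error output above, and --force-all should be used with care) is to purge the half-configured packages directly with dpkg and then let apt repair whatever is left:

        # See everything emacs-related that dpkg currently knows about
        dpkg -l | grep -i emacs

        # Purge the broken packages directly, bypassing the failing emacsen-common scripts
        sudo dpkg --purge --force-all emacs emacs23-lucid emacs23-nox emacs23 emacsen-common

        # Let apt fix any remaining dependency state, then reinstall only what you want
        sudo apt-get -f install
        sudo apt-get install emacs23-nox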

    Read the article

  • A Multi-Channel Contact Center Can Reduce Total Cost of Ownership

    - by Tom Floodeen
    In order to remain competitive in today’s market, CRM customers need to provide a feature-rich, superior call center experience to their customers across all communication channels, while improving their service agents' productivity. They also require their call center to be deeply integrated with their CRM system, and they need to implement all this quickly, seamlessly, and without breaking the bank. Oracle’s Siebel Customer Relationship Management (CRM) is the world’s leading application suite for automated customer-facing operations for Sales and Marketing and for managing all aspects of providing service to customers. Oracle’s Contact On Demand (COD) is a world-class, carrier-grade, hosted multi-channel contact center solution that can be deployed in days without up-front capital expenditures or integration costs. Agents can work efficiently from anywhere in the world with 360-degree views into customer interactions and real-time business intelligence. Customers gain from rapid and personalized sales and service, while organizations can dramatically reduce costs and increase revenues. Oracle’s latest update of Siebel CRM now comes pre-integrated with Oracle’s Contact On Demand. This solution seamlessly runs a fully-functional contact center provided by a single vendor, significantly reducing your total cost of ownership. This solution supports Siebel 7.8 and higher for Voice, and Siebel 8.1 and higher for Voice and Siebel CRM Chat. The impressive feature list of Oracle’s COD solution includes a full-control CTI toolbar with Voice, Chat, and Click to Dial features. It also includes context-sensitive screens, automated desktops, built-in IVR, multidimensional routing, supervisor and quality monitoring, and instant provisioning. The solution also ships with an Extensible Web Services interface for implementing more complex business processes. Click here to learn how to reduce complexity and total cost of ownership of your contact center. Contact Ann Singh at [email protected] for additional information.

    Read the article

  • Adding Suggestions to the SharePoint 2010 Search Programmatically

    - by Ricardo Peres
    There are numerous pages that show how to do this with PowerShell, but I found none on how to do it with plain old C#, so here it goes! To abbreviate, I wanted to have SharePoint suggest the site collection user’s names after the first letters are filled in a search site’s search box. Here’s how I did it:

        //get the Search Service Application (replace with your own name)
        SearchServiceApplication searchApp = farm.Services.GetValue<SearchQueryAndSiteSettingsService>().Applications.GetValue<SearchServiceApplication>("Search Service Application") as SearchServiceApplication;

        Ranking ranking = new Ranking(searchApp);

        //replace EN-US with your language of choice
        LanguageResourcePhraseList suggestions = ranking.LanguageResources["EN-US"].QuerySuggestionsAlwaysSuggestList;

        foreach (SPUser user in rootWeb.Users)
        {
            suggestions.AddPhrase(user.Name, String.Empty);
        }

        //get the job that processes suggestions and run it
        SPJobDefinition job = SPFarm.Local.Services.OfType<SearchService>().SelectMany(x => x.JobDefinitions).Where(x => x.Name == "Prepare query suggestions").Single();
        job.RunNow();

    You may do this, for example, on a feature. Of course, feel free to change users for something else, all suggestions are treated as pure text.

    Read the article

  • How can I hard reset a USB device?

    - by Cory
    I have a USB device (a modem) that is really finicky. Sometimes it works fine, but other times it refuses to connect. The only solution I have found to fix it once it gets into a bad state is to physically unplug the device and plug it back in. However, I don't always have physical access to the machine it is plugged in on, so I'm looking for a way to do this through the command line. This post suggests running: sudo modprobe -w -r usb_storage; sudo modprobe usb_storage. However, I get an "unknown option -w" output. The slightly modified command sudo modprobe -r usb_storage fails with the message: FATAL: Module usb_storage is in use. If I try to kill -9 the processes marked [usb-storage] before running it, they refuse to die (I think because they are deeply tied to the kernel). Anyone know of a way to do this? NOTE: I cross-posted this on superuser.com as I didn't know which was more appropriate. I will delete and/or link whichever one is answered first.
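
    One way to force a re-enumeration without touching the hardware (a sketch: the device id "1-1" is a placeholder, and your device's bus-port id can be found with lsusb -t or by listing /sys/bus/usb/devices) is to unbind and rebind the device from the usb driver via sysfs:

        # List the bus-port ids currently bound to the usb driver
        ls /sys/bus/usb/drivers/usb/

        # "Unplug" and "replug" the device in software
        echo -n "1-1" | sudo tee /sys/bus/usb/drivers/usb/unbind
        sleep 2
        echo -n "1-1" | sudo tee /sys/bus/usb/drivers/usb/bind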

    Read the article

  • Shared Database Servers

    - by shivanshu.upadhyay
    As more enterprises consolidate their database environments to support private cloud initiatives, ISVs will have to deal with scenarios where they need to run on a shared, powerful database server like Exadata. Some ISVs are concerned about meeting SLAs for performance in a shared environment. Outside the virtualization world, there are capabilities of Oracle Database which can be used to prevent resource contention and guarantee SLAs. These capabilities are: 1) Instance Caging - This guarantees the CPU allocated, or limits the maximum number of CPUs (and so the number of Oracle processes) that a database instance can use simultaneously. With this feature, ISVs can be assured that their application is allocated adequate CPUs even if the database server is shared with other applications. 2) CPU Resource Allocation with Database Resource Manager - This allocates percentages of CPU time to different users and applications within a database. ISVs can use this feature to ensure that priority users or workloads within their application get CPU resources ahead of other requirements. 3) Exadata I/O Resource Manager - The Database Resource Manager feature in Oracle Database 11g has been enhanced for use with Exadata. This allows the sharing of storage between databases without fear of one database monopolizing the I/O bandwidth and impacting the performance of the other databases sharing the storage. This can be used to ensure that I/O does not become a performance bottleneck due to poor design of other applications sharing the same server.
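
    As a small illustration of the first capability (a sketch only; the CPU count and plan name are placeholder values), instance caging is enabled by capping CPU_COUNT and activating a resource manager plan on the instance:

        -- Cap this instance at 4 CPUs and enable a resource plan so the cap is enforced
        ALTER SYSTEM SET cpu_count = 4 SCOPE = BOTH;
        ALTER SYSTEM SET resource_manager_plan = 'DEFAULT_PLAN' SCOPE = BOTH;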

    Read the article

  • Updated Batch Best Practices

    - by ACShorten
    The Batch Best Practices whitepaper has been updated and published to My Oracle Support with the latest advice and new facilities. Two of the more interesting updates are: Addition of a Batch Cache Flush - Just like the online component, the batch component of the framework caches data. It is now possible to refresh the cache manually using a new batch job, F1-FLUSH. This is particularly useful if you execute long-running threadpool workers across many different batch processes (submitters). New EXTENDED execution mode - A new specialist mode has been introduced for sites that use a large number of submitters (concurrent threads) and are experiencing intermittent communication issues in the threadpoolworker. This mode uses Oracle Coherence's Extend mode to allow submitters to be allocated to threadpoolworkers via proxy connections. It differs from CLUSTERED mode in that a submitter can be explicitly allocated to a specific threadpoolworker via a proxy connection. This mode is only used for specific situations, and customers should continue to use CLUSTERED mode generally. The whitepaper outlines advice for these new facilities and provides advice for existing functionality. The whitepaper is available from My Oracle Support at Doc Id: 836362.1.

    Read the article

  • Try out ubuntu on thinkpad x40 (via usb)?

    - by Oliver
    I am trying to try out Ubuntu (i.e. install it within Windows) on my ThinkPad x40. I followed the instructions on how to create a bootable USB (http://www.ubuntu.com/download/ubuntu/download). No problem. The issue I have now is that the USB ports on my x40 were burned before (a common problem with this machine), i.e. they do not work. So I got a USB notebook card from Belkin. However, the BIOS does not seem to recognize the USB stick in it, so I cannot boot from it. I also do not have a CD-ROM drive. I then tried to run Wubi from the USB stick; it briefly appears in Task Manager, then no further action. So I tried it with wubi.exe directly: downloaded it to the desktop, ran it, and it briefly appears in Task Manager under Processes, then no further action. Anyone have an idea? I have enough memory and enough free HD space. Thanks. Oliver

    Read the article

  • Ubuntu not shutting down ( going to black screen ) 12.04

    - by Orrin Fox
    I am currently using a USB persistent install of Ubuntu. It's a simple 4GB drive with a 2.8GB partition (casper-rw storage partition). I set up an administrator account and set it to log in automatically. I also removed ubiquity, to simply use this as a go-anywhere install. Here's my issue: I'm logged in to my account, and I click the top-right gear and select "shut down". Text pops up showing it's quitting processes, etc., and then it goes to the Plymouth animation. But... the screen goes black, and then it goes to the login screen. Now, when I'm at the login screen, I go into a terminal (alt+F2) and, don't you know, I'm logged in as Ubuntu. So then I try the following: ubuntu@ubuntu:~$ sudo shutdown now It goes to the Plymouth screen again as if it's shutting down, AND the screen goes black once again, but the computer has not turned off: the USB light is still flashing and the fans are still on; the only thing off is the screen. Is this a bug? If not, maybe I did something wrong? Perhaps it's because I made an account, but... if there is a workaround for this, please let me know. Thanks again, Fox

    Read the article

  • How Mars Lost Its Atmosphere [Video]

    - by Jason Fitzpatrick
    Scientists have long theorized that Mars once had an atmosphere and surface water–but where did it go? This video showcases one of their theories about Mars’ vanished water reserves. Courtesy of NASA and the NASAexplorer channel: When you take a look at Mars, you probably wouldn’t think that it looks like a nice place to live. It’s dry, it’s dusty, and there’s practically no atmosphere. But some scientists think that Mars may have once looked like a much nicer place to live, with a thicker atmosphere, cloudy skies, and possibly even liquid water flowing over the surface. So how do you go from something like this–to something like this? NASA’s MAVEN spacecraft will give us a clearer idea of how Mars lost its atmosphere, and scientists think that several processes have had an impact. For more information about MAVEN, check out the mission page here. NASA – MAVEN: Mars Atmospheric Loss

    Read the article

  • Oracle eAM Webcast Series Announced (May-Dec 2010)

    - by [email protected]
    A series of free webinars with ReliabilityWeb will present key product capabilities of Oracle eAM and how they support maintenance and reliability best practices. Through this web-seminar series, companies can understand how to achieve better ROI. ReliabilityWeb will be using this as a key component of their initiative to build a stronger Oracle community. For Oracle, this program demonstrates leadership and commitment to the Maintenance Systems Marketplace. Topics (note all times are East):
    1. How can Oracle eAM enhance and support your reliability program? (May 13, 2010) (1-2PM)
    2. Upgrading to Oracle eAM R12 - What's the value, when's the right time, what's involved and how do you get there? (June 17, 2010) (1-2PM)
    3. Improving maintenance and reliability by aligning people, processes and systems. (July 15, 2010) (1-2PM)
    4. Using Oracle eAM to drive your Condition Based Maintenance program. (July 29, 2010) (1-2PM)
    5. Why and how do you get the power of Oracle eAM out to the people that are really doing maintenance: the technicians. (August 12, 2010) (1-2PM)
    6. Standardizing and streamlining your maintenance work with Oracle eAM. (September 16, 2010) (1-2PM)
    7. Standardizing maintenance and reliability data - How do you get there? (October 21, 2010) (1-2PM)
    8. Using Oracle eAM to establish a Failure Reporting and Corrective Action System (FRACAS). (November 18, 2010) (1-2PM)
    9. Maintenance Work Scheduling in Oracle eAM - Capabilities and Limitations (December 16, 2010) (1-2PM)
    To register or for additional information, contact Jay West, EAM Master, +1.205.515.4326.

    Read the article

  • SL150 Modular Tape Library Demo Equipment Purchase Opportunity Limited Special Pricing on Demo Configuration

    - by Cinzia Mascanzoni
    Oracle is pleased to announce that, for a limited time, Oracle VADs may purchase special SL150 Modular Tape Library configurations for demonstration purposes at a significantly reduced price. Submit your order today for these special SL150 Modular Tape Library configurations and you can start showcasing these products in partner demonstrations and proof-of-concepts. VADs may also sell demo units to their VARs so that they may use them in their customer evaluations to help shorten the sales cycle. The offer also allows VARs to sell the demo configuration after a prescribed demonstration period to support the demo product’s cost of ownership. Why wait? Order today! Rules and Guidelines Only authorized VADs are allowed to purchase the special SL150 Modular Tape Library configurations. Purchase time frame is from now until February 28, 2013. Only the predetermined configurations are approved for purchase at the prescribed discounts. Supply is allocated per region and it’s limited. Order MUST be placed via the Oracle Partner Store* (OPS) where applicable. See below for online and offline order processes. If reselling to a VAR, VAD must include the Partner Demonstration Hardware Terms with the order (online via OPS or with offline VAD Ordering Document). Please mark your calendars for the SL150 Modular Tape Library Demo Program webcast on Sept 5th. The objective of this call is to share the details of this demo program with you. For details on how to connect to the webcast, contact your VAD Manager

    Read the article

  • Let's introduce the Oracle Enterprise Data Quality family!

    - by Sarah Zanchetti
    The Oracle Enterprise Data Quality family of products helps you achieve maximum value from your business applications by delivering fit-for-purpose data. OEDQ is a state-of-the-art collaborative data quality profiling, analysis, parsing, standardization, matching and merging product, designed to help you understand, improve, protect and govern the quality of the information your business uses, all from a single integrated environment. The Oracle Enterprise Data Quality products are:
    - Oracle Enterprise Data Quality Profile and Audit
    - Oracle Enterprise Data Quality Parsing and Standardization
    - Oracle Enterprise Data Quality Match and Merge
    - Oracle Enterprise Data Quality Address Verification Server
    - Oracle Enterprise Data Quality Product Data Parsing and Standardization
    - Oracle Enterprise Data Quality Product Data Match and Merge
    Some of the key features of OEDQ are:
    - Integrated data profiling, auditing, cleansing and matching
    - Browser-based client access
    - Ability to handle all types of data – for example customer, product, asset, financial, operational
    - Connection to any JDBC-compliant data sources and targets
    - Multi-user project support (role-based access, issue tracking, process annotation, and version control)
    - Services Oriented Architecture (SOA) - support for designing processes that may be exposed to external applications as a service
    - Designed to process large data volumes
    - A single repository to hold data along with gathered statistics and project tracking information, with shared access
    - Intuitive graphical user interface designed to help you solve real-world information quality issues quickly
    - Easy, data-led creation and extension of validation and transformation rules
    - Fully extensible architecture allowing the insertion of any required custom processing
    If you need to learn more about EDQ, or get assistance for any kind of issue, the Oracle Technology Network offers a huge range of resources on Oracle software. Discuss technical problems and solutions on the Discussion Forums. Get hands-on step-by-step tutorials with Oracle By Example. Download Sample Code. Get the latest news and information on any Oracle product. You can also get further help and information with Oracle software from My Oracle Support and Oracle Support Services. An Information Center is available, where you can find technical information and fast solutions to the most common already solved issues: Information Center: Oracle Enterprise Data Quality [ID 1555073.2]

    Read the article

  • Success Quote: A Hybrid Approach for Success

    - by Lauren Clark
    We recently received this quote from a project that successfully used OUM: “On our project, we applied a combination of the Oracle Unified Method (OUM) and the client's methodology. The project was organized by OUM's phases and a subset of OUM's processes, tasks, and templates. Using a hybrid of the two methods resulted in an implementation approach that was optimized for the client-specific requirements for this project." This hybrid approach is an excellent example of using OUM in the flexible and scalable manner in which it was intended. The project team was able to scale OUM to be fit-for-purpose for their given situation. It's great to see how merging what was needed out of OUM with the client’s methodology resulted in an implementation approach that more closely aligned to the business needs. Successfully scaling OUM is dependent on the needs of the particular project and/or engagement. The key is to use no more than is necessary to satisfy the requirements of the implementation and appropriately address risks. For more information, check out the "Tailoring OUM for Your Project" page, which can be accessed by first clicking on the "OUM should be scaled to fit your implementation" link on the OUM homepage and then drilling into the link on the subsequent page. Have you used OUM in conjunction with a partner or customer methodology? Please share your experiences with us.

    Read the article

  • Big GRC: Turning Data into Actionable GRC Intelligence

    - by Jenna Danko
    While it’s no longer headline news that governments have carried out large-scale data-mining programmes aimed at terrorism detection and at identifying other patterns of interest across a wide range of digital data sources, the debate over the ethics and justification of this action will clearly continue for some time to come. What is becoming clear is that these programmes are a framework for the collation and aggregation of massive amounts of unstructured data and, from this, the creation of actionable intelligence from analyses that allowed the analysts to explore and extract a variety of patterns and then direct resources. This data included audio and video chats, phone calls, photographs, e-mails, documents, internet searches, social media posts and mobile phone logs and connections. Although Governance, Risk and Compliance (GRC) professionals are not looking at the implementation of such programmes, there are many similar GRC “Big data” challenges to be faced, and potential lessons to be learned from these high-profile government programmes that can be applied a lot closer to home. For example, how can GRC professionals collect, manage and analyze an enormous and disparate volume of data to create and manage their own actionable intelligence covering hidden signs and patterns of criminal activity; the early or retrospective violation of regulations, laws or corporate policies and procedures; emerging risks; weakening controls; and so on? Not exactly the stuff of James Bond to be sure, but it is certainly more applicable to most GRC professionals’ day-to-day challenges. So what is Big Data and how can it benefit the GRC process? Although it often varies, the definition of Big Data largely refers to the following types of data: Traditional Enterprise Data – includes customer information from CRM systems, transactional ERP data, web store transactions, and general ledger data. Machine-Generated/Sensor Data – includes Call Detail Records (“CDR”), weblogs and trading systems data. Social Data – includes customer feedback streams, micro-blogging sites like Twitter, and social media platforms like Facebook. The McKinsey Global Institute estimates that data volume is growing 40% per year, and will grow 44x between 2009 and 2020. But while it’s often the most visible parameter, volume of data is not the only characteristic that matters. In fact, according to sources such as Forrester there are four key characteristics that define big data: Volume. Machine-generated data is produced in much larger quantities than non-traditional data. This is all the data generated by IT systems that power the enterprise. This includes live data from packaged and custom applications – for example, app servers, Web servers, databases, networks, virtual machines, telecom equipment, and much more. Velocity. Social media data streams – while not as massive as machine-generated data – produce a large influx of opinions and relationships valuable to customer relationship management, as well as offering early insight into potential reputational risk issues. Even at 140 characters per tweet, the high velocity (or frequency) of Twitter data ensures large volumes (over 8 TB per day) need to be managed. Variety. Traditional data formats tend to be relatively well defined by a data schema and change slowly. In contrast, non-traditional data formats exhibit a dizzying rate of change.
    Without question, all GRC professionals work in a dynamic environment, and as new services, new products or new business lines are added, or new marketing campaigns are executed, for example, new data types are needed to capture the resultant information. Value. The economic value of data varies significantly. Typically, there is good information hidden amongst a larger body of non-traditional data that GRC professionals can use to add real value to the organisation; the greater challenge is identifying what is valuable and then transforming and extracting that data for analysis and action. For example, customer service calls and emails have millions of useful data points and have long been a source of information to GRC professionals. Those calls and emails are critical in helping GRC professionals better identify hidden patterns and implement new policies that can reduce the number of customer complaints. Now, on a scale and depth far beyond those in place today, all that unstructured call and email data can be captured, stored and analyzed to reveal the reasons for the contact, perhaps with the aggregated customer results cross-referenced against what is being said about the organization, or a similar peer organization, on social media. The organization can then take positive actions: communicating to the market in advance of issues reaching the press, strengthening controls, adjusting risk profiles, changing policy and procedures, and completely minimizing, if not eliminating, complaints and compensation for that specific reason in the future. In this one example of many similar ones, the GRC team(s) has demonstrated real and tangible business value.
    Big Challenges - Big Opportunities
    As pointed out by recent Forrester research, high-performing companies (those that are growing 15% or more year-on-year compared to their peers) are taking a selective approach to investing in Big Data. "Tomorrow's winners understand this, and they are making selective investments aimed at specific opportunities with tangible benefits where big data offers a more economical solution to meet a need." (Forrsights Strategy Spotlight: Business Intelligence and Big Data, Q4 2012) As pointed out earlier, with the ever-increasing volume of regulatory demands and fines for getting it wrong, limited resource availability, and out-of-date or inadequate GRC systems all contributing to a higher cost of compliance and/or a higher risk profile than desired, a big data investment in GRC clearly falls into this category. However, to make the most of big data, organizations must evolve both their business and IT procedures, processes, people and infrastructures to handle these new high-volume, high-velocity, high-variety sources of data and be able to integrate them with the pre-existing company data to be analyzed. GRC big data clearly gives the organization access to, and management over, a huge amount of often very sensitive information that, although it can help create a more risk-intelligent organization, also presents numerous data governance challenges, including regulatory compliance and information security. In addition to client and regulatory demands for better information security and data protection, the sheer amount of information organizations deal with means that the need to quickly access, classify, protect and manage that information can quickly become a key issue from a legal as well as a technical or operational standpoint.
    However, by making information governance processes a bigger part of everyday operations, organizations can make sure data remains readily available and protected.
    The Right GRC & Big Data Partnership Becomes Key
    The "getting it right first time" mantra used in so many companies remains essential for any GRC team that is sponsoring, helping kick-start, or even overseeing a big data project. To make a big data GRC initiative work and get the desired value, partnerships with companies who have a long history of success in delivering GRC solutions, as well as being at the very forefront of technology innovation, become key. Clearly, solutions can be built in-house more cheaply than through a vendor, but as has been proven time and time again when it comes to self-built solutions covering AML and Fraud, for example, few have been able to scale or adapt appropriately to meet the changing regulations or challenges that GRC teams face on a daily basis. This has led to the creation of the GRC silos that are causing so many headaches today. The solutions that stand out and should be explored are the ones that can seamlessly merge the traditional world of well-known data, analytics and visualization with the new world of seemingly innumerable data sources, utilizing Big Data technologies to generate new GRC insights right across the enterprise. Ultimately, Big Data is here to stay, and organizations that embrace its potential and outline a viable strategy, as well as understand and build a solid analytical foundation, will be the ones that are well positioned to make the most of it.
    A Blueprint and Roadmap Service for Big Data
    Big data adoption is first and foremost a business decision. As such, it is essential that your partner can align your strategies, goals, and objectives with an architecture vision and roadmap to accelerate adoption of big data for your environment, as well as establish practical, effective governance that will maintain a well-managed environment going forward.
    Key Activities: While your initiatives will clearly vary, there are some generic starting points the team and organization will need to complete:
    - Clearly define your drivers, strategies, goals, objectives and requirements as they relate to big data
    - Conduct a big data readiness and Information Architecture maturity assessment
    - Develop a future-state big data architecture, including views across all relevant architecture domains: business, applications, information, and technology
    - Provide initial guidance on big data candidate selection for migrations or implementation
    - Develop a strategic roadmap and implementation plan that reflects a prioritization of initiatives based on business impact and technology dependency, and an incremental integration approach for evolving your current state to the target future state in a manner that represents the least amount of risk and impact of change on the business
    - Provide recommendations for practical, effective Data Governance, Data Quality Management, and Information Lifecycle Management to maintain a well-managed environment
    - Conduct an executive workshop with recommendations and next steps
    There is little debate that managing risk and data are the two biggest obstacles encountered by financial institutions.
Big data is here to stay and risk management certainly is not going anywhere, and ultimately financial services industry organizations that embrace its potential and outline a viable strategy, as well as understand and build a solid analytical foundation, will be best positioned to make the most of it. Matthew Long is a Financial Crime Specialist for Oracle Financial Services. He can be reached at matthew.long AT oracle.com.

    Read the article

  • Just Another Web Service (JAWS) vs SOA

    Over the last few years SOA has been a hot topic, which has led to it being abused by many who have no understanding of the concept. In my opinion, one of the largest issues facing SOA is the lack of understanding and experience implementing SOA by business and IT alike. I just recently deployed a new web service that is called by multiple service clients. Would you call this SOA, because it is a web service that can be called by any requesting client? In my opinion, this is not SOA; instead it is Just Another Web Service (JAWS). Just because a company creates a web service does not mean that it is using SOA; in fact, it only means that it is using a web service. SOA is an architectural style that focuses on the design of systems based on consumers and providers through the use of contracts. With this approach, SOA needs to be applied from the top down in order for it to reach its full potential. In the case of the web service, the service is just a small part of the entire system that is reusable and has the flexibility to change. In order for a company in this case to move towards SOA, it needs to define business processes that can be shared through the use of reusable software and loose coupling. Once the company's thinking and development process change to address changes in this manner, it can start to become more service-oriented.

    Read the article

  • Oracle Enterprise Data Quality Adds Global Address Verification Capabilities for Greater Accuracy and Broader Location Coverage

    - by Mala Narasimharajan
    Data quality has many flavors to it: product, customer – you name the data domain and there’s data quality associated with it. Address verification is a little different, in that there is a tremendous amount of variation as well as nuance attached to it. Specifically, what makes address verification challenging is that, more often than not, addresses are incomplete, riddled with misspellings, assigned incorrect postal codes, or contain non-address items. Almost all data has locations, and accurate locations power a wealth of business processes: Customer Relationship Management, data quality, delivery of materials, goods or services, fraud detection, insurance risk assessment, data analytics, store and territory planning, and much more. Oracle Address Verification Server provides location-based services as well as deeper parsing and analysis capabilities for Oracle Enterprise Data Quality. Pre-integrated with the EDQ platform, Oracle Address Verification Server provides robust parsing and validation, as well as specialized location information, for over 240 countries – all populated countries on Earth. Oracle Enterprise Data Quality (EDQ) is a data quality platform dedicated to addressing the distinct challenges of customer and product data quality; it performs advanced data profiling to identify and measure poor-quality data and identify rule requirements, as well as semantic and pattern-based recognition to accurately parse and standardize data that is poorly structured. EDQ is integrated with Oracle Master Data Management, including Oracle Customer Hub and Oracle Product Hub, as well as Oracle Data Integrator Enterprise Edition and Oracle CRM. Address Verification Server provides key address verification services for Oracle CRM and Oracle Customer Hub. In addition, Address Verification Server provides greater accuracy when handling address data due to its expanded sources and extensible knowledge repository, solid parsing across locales and countries, and adept handling of extraneous data in address fields. For more information on Oracle Address Verification Server visit: http://bit.ly/GMUE4H and http://bit.ly/GWf7U6

    Read the article

  • Scalable solution for website polling

    - by Tom Irving
    I'm looking to add push notifications to one of my iOS apps. The app is a client for a website which doesn't offer push notifications. What I've come up with so far: App sends a message to the home server when transitioning to the background, asking the server to start polling the website for the logged-in user. The home server starts a new process to poll for that user. Polling happens every so many seconds/minutes. When the user returns to the iOS app, the app sends a message to the home server to stop polling. The home server kills the process polling for that user. Repeat. The problem is that this soon becomes stupid: 100s of users means 100s of different processes. It's just not scalable in the slightest. What I've written so far is in PHP, using CURL to do the polling, and I started with PHP a few days ago, so maybe I'm missing something obvious that could help me with this. Some advice would be great.
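
    One direction worth sketching (an assumption on my part, not something from the original post): instead of one process per user, a single PHP worker can poll every active user in one pass with the curl_multi API, so the number of processes stays constant while the number of users grows. URLs and the change-detection step are placeholders:

        <?php
        // userId => URL to poll (placeholder data; in practice read the active users from storage)
        $pollTargets = [
            'alice' => 'https://example.com/inbox?u=alice',
            'bob'   => 'https://example.com/inbox?u=bob',
        ];

        $mh = curl_multi_init();
        $handles = [];
        foreach ($pollTargets as $userId => $url) {
            $ch = curl_init($url);
            curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
            curl_setopt($ch, CURLOPT_TIMEOUT, 10);
            curl_multi_add_handle($mh, $ch);
            $handles[$userId] = $ch;
        }

        // Drive all transfers concurrently from this single process
        do {
            $status = curl_multi_exec($mh, $running);
            if ($running) {
                curl_multi_select($mh);
            }
        } while ($running && $status === CURLM_OK);

        foreach ($handles as $userId => $ch) {
            $body = curl_multi_getcontent($ch);
            // ...compare $body with the previous poll for $userId and queue a push notification on change...
            curl_multi_remove_handle($mh, $ch);
            curl_close($ch);
        }
        curl_multi_close($mh);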

    Read the article

  • Form Validation Options

    The steps involved in transmitting form data from the client to the Web server:
    1. User loads the web form.
    2. User enters data into the web form fields.
    3. User clicks submit.
    4. On submit, the page validates the fields using JavaScript. If validation errors are found, the validation script stops the browser from posting the data to the web server and displays error messages as needed.
    5. If the form passes the validation process, the browser URL-encodes the values of every field and posts them to the server.
    6. The server reads the posted data from the query string and validates it again, both to ensure data consistency and to prevent non-validated data (for example, when JavaScript is turned off in the client's browser) from being inserted into a database or passed on to other processes.
    7. If the data passes the second validation check, the server-side code continues with the requested processing.
    In my opinion, it is mandatory to validate data with both client-side and server-side validation, as a failover process. Client-side validation allows users to correct any errors before the data is sent to the web server for processing, and it gives the user an immediate response about data that is incorrect or not in the desired format. In addition, it prevents unnecessary interaction between the user and the web server and, over time, frees up the server compared to doing only server-side validation. Server-side validation is the last line of defense, because you can verify that the user's data is correct before it is used in a business process or stored to a database. Honestly, I cannot foresee a scenario where I would want to use only one form of validation over the other, especially given the current cost of creating and maintaining data. In my opinion, the redundant validation is well worth the overhead.
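
    A minimal sketch of the client-side step described above (the field name and rules are placeholders; the server must repeat the same checks when it reads the posted data):

        // Attached to the form's submit event; returning false stops the browser from posting.
        function validateForm(form) {
          var errors = [];
          var email = form.elements["email"].value.trim();

          if (email === "") {
            errors.push("Email is required.");
          } else if (!/^[^@\s]+@[^@\s]+\.[^@\s]+$/.test(email)) {
            errors.push("Email does not look valid.");
          }

          if (errors.length > 0) {
            alert(errors.join("\n"));  // display error messages as needed
            return false;              // cancel the post to the web server
          }
          return true;                 // browser URL-encodes every field and posts the data
        }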

    Read the article
