Search Results

Search found 14924 results on 597 pages for 'selector performance'.


  • Webcast Series: Accelerate Business-Critical Database Deployments with Oracle Optimized Solutions

    - by ferhat
    Join us for this two-part webcast series and learn how to safely consolidate business-critical databases and deliver quantifiable benefits to the business:
    - Save up to 75% in operational and acquisition costs
    - Save millions of dollars consolidating legacy infrastructure
    - Leverage best practices from thousands of customer environments
    - Increase end user productivity with 75% faster time to operations and 4x faster throughput
    The Oracle Optimized Solution for Oracle Database provides extensive guidelines for architecting and deploying complete database solutions that deliver superior performance and availability while minimizing cost and risk. Oracle's world-class engineering teams work together to define these optimal architectures using Oracle's powerful SPARC M-Series and SPARC T-Series servers together with Oracle Solaris and Oracle's SAN, NAS, and flash-based storage to run the industry-leading Oracle Database. Quite simply, the Oracle Optimized Solution for Oracle Database makes it easier for you to deliver and manage business-critical database environments that are fast, secure, and cost-effective.
    Available on-demand:
    PART 1: Why Architecture Matters When Deploying Business-Critical Databases
    PART 2: How To Consolidate Databases Using Oracle Optimized Solutions
    Presented by: Lawrence McIntosh, Principal Enterprise Architect, Oracle Optimized Solutions; Ken Kutzer, Principal Product Manager, Infrastructure Solutions, Oracle

    Read the article

  • SQL SERVER – Weekly Series – Memory Lane – #004

    - by pinaldave
    Here is a list of curated articles from SQLAuthority.com across the years. Instead of listing every article, I have selected a few of my favorites and added notes below each one. Let me know which of the following is your favorite article from memory lane.
    2006
    Auto Generate Script to Delete Deprecated Fields in Current Database - Early in my career, every time I had to drop a column I had a hard time doing it, because I was scared that the column might still be needed somewhere in the code. Due to this fear I never dropped any column; I just renamed it. If a renamed column was needed afterwards, it was very easy to rename it back. However, it is not recommended to keep deleted columns renamed in the database. At regular intervals I used to drop the columns prefixed with a specific word. This script is 6 years old but still works. Give it a look; I am open to improvements.
    2007
    Shrinking Truncate Log File – Log Full – Part 2 - Shrinking a database or mdf file is indeed a bad practice and creates lots of problems. However, once in a while there is a legitimate requirement to shrink the log file – a very rare one. On those rare occasions, shrinking or truncating the log file may be the only solution. One should make sure to take a backup before and after the truncate or shrink, as in case of a disaster they can be very useful. Remember that truncating the log file breaks the log chain and can create major issues during a restore. Use this feature with caution.
    2008
    Simple Use of Cursor to Print All Stored Procedures of Database Including Schema - This is a very interesting requirement I used to face in my early career days: I needed to print all the stored procedures of my database. Interestingly enough, I wrote a cursor to do so. Today when I look back at this stored procedure, I believe there is a much cleaner way to do the same task; however, I still use this SP quite often when I have to document all the stored procedures of my database.
    Interesting Observation about Order of Resultset without ORDER BY - In industry, many developers avoid using the ORDER BY clause to display results in a particular order, thinking that an index enforces the order. In this interesting example, I demonstrate that without ORDER BY, the same table and a similar query can return different results. The query optimizer returns results using whichever method is optimized for performance. The lesson: there is no order unless ORDER BY is used.
    2009
    Size of Index Table – A Puzzle to Find Index Size for Each Index on Table - I asked this puzzle earlier, where I asked how to find the index size for each index on a table. The puzzle was very well received and lots of interesting answers came in. To answer the question I wrote several blog posts. I suggest you try to solve this problem this weekend and see if you can come up with a better solution. If not, here are the solutions: Solution 1 | Solution 2 | Solution 3
    Understanding Table Hints with Examples - Hints are options and strong suggestions specified for enforcement by the SQL Server query processor on DML statements. Hints override any execution plan the query optimizer might select for a query. The SQL Server query optimizer is a very smart tool and usually makes a better selection of execution plan. Suggesting hints to the query optimizer should be attempted only when absolutely necessary, and by experienced developers who know exactly what they are doing (or in development, as a way to experiment and learn).
    Interesting Observation – TOP 100 PERCENT and ORDER BY - I have seen developers and DBAs use TOP very casually when they have to use the ORDER BY clause. Theoretically, there is no need for ORDER BY in a view at all. All ordering should be done outside the view, and the view should contain just the SELECT statement. It was quite common to save that extra typing by including the ordering inside the view. In several instances, developers wanted a complete resultset and included TOP 100 PERCENT along with ORDER BY, assuming this would simulate a SELECT statement with ORDER BY.
    2010
    SQLPASS Nov 8-11, 2010 - Seattle – An Alternative Look at Experience - In 2010 I attended the most prestigious SQL Server event, SQLPASS, between Nov 8-11 in Seattle. I have only one expression for the event: Best Summit Ever. Instead of writing about my usual routine or the event itself, I wrote about the interesting things I did and how I felt about them. When I go back and read it, I feel that this was the best event I attended in 2010.
    Change Database Access to Single User Mode Using SSMS - The image says it all.
    2011
    SQL Server 2012 introduced new analytic functions. These functions were long awaited and I am glad that they are now here. Previously, when any of these functions was needed, people used to write long T-SQL code to simulate it. Now there is no need to do so; having a native function available helps both performance and readability. The functions covered (each with a SQLAuthority article and an MSDN reference in the original post) are: CUME_DIST, FIRST_VALUE, LAST_VALUE, LEAD, LAG, PERCENTILE_CONT, PERCENTILE_DISC, and PERCENT_RANK.
    Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: Memory Lane, PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology

    Read the article

  • PlayOnLinux unable to find 32-bit OpenGL libraries with dual ATI video cards

    - by Rodolfo Pires
    I'm currently running Ubuntu 12.04 LTS 64-bit. I installed League of Legends fine the first time with the open-source ATI drivers provided by Ubuntu itself, with no issues at all, but it runs very slowly (20 fps at most) because those drivers don't fully support my dual graphics cards. Then I restored the system and installed the Linux version of the official ATI drivers from the AMD website, which supports my AMD A8-4500M APU with the AMD Radeon 7640G + 7670M graphics cards, giving me full performance from my system. The problem is that to run League of Legends I need a 32-bit OpenGL library, and the driver automatically detects a 64-bit Linux install and loads the 64-bit libraries but not the 32-bit ones. I need some kind of command to force the 32-bit libraries to load, or to make League of Legends run on the 64-bit ones. I'm fairly new to Ubuntu. I installed the 32-bit libraries through the terminal and it still doesn't work; maybe the driver doesn't want to load them. Please help me with this; I don't want to go back to Windows just to play League. I'm not sure what other details to post here, so please tell me what you need.

    Read the article

  • Low process priority problem

    - by Svepe
    I have just set up Ubuntu 12.04 64-bit with the Cinnamon desktop and the 3.5.0-030500 kernel on my new laptop with an Ivy Bridge i7. I decided to test its performance by running a single-threaded, CPU-hungry program that I often use for camera calibration. Unfortunately, it ended up running much slower than I ever expected. After some investigation it turned out that the program's priority is automatically changed from normal to low, which makes the program even slower. I have also noticed that all user programs, such as Skype and Firefox, are set to low priority. I tried manually resetting the priority to normal or even very high using the "renice" command, which works temporarily until the kernel scheduler (I guess) resets the priority back to low. Is this normal behaviour, and how can I overcome the problem of the slowed-down execution? P.S. I also tried the 3.2 kernel, but the problem is still present.

    Read the article

  • I want a trivial example of where MongoDB can scale but a relational database will have trouble

    - by Ryan Weir
    I'm just learning to use MongoDB, and when discussing it with other programmers I would like a quick example of why NoSQL can be a good choice compared to a traditional RDBMS - however, the scenarios I come up with or can find online seem pretty contrived. E.g. a blog with lots of traffic could be represented relationally, but will require some performance tuning and joins across tables (assuming the schema is fully normalized), whereas MongoDB would allow direct retrieval from one collection to the same effect. But the response I keep getting from other programmers is "why not just keep it relational and then add some trivial caching later?" Does anybody have a less contrived example where MongoDB will really shine and a relational database will fall over much sooner? The smaller the project/system the better, because it leaves less room for disagreement. Something with about the complexity of the blog example would be really useful. Thanks.

    Read the article

  • Differences between C# and Javascript for Unity [closed]

    - by vrinek
    Apart from the language differences (class-based vs. prototypal, strong vs. weak typing), what are the differences between using JavaScript and using C# when developing games in Unity3D? Is there a noticeable performance difference? Is the JavaScript code packaged as-is, and if so, does this help the game's moddability? Is it possible to use libraries developed for one language while developing in the other? Is it possible to mix the two languages in the same Unity project by coding some parts in C# and others in JavaScript? The next couple of questions are time-specific, so feel free to ignore or remove them: If libraries cannot be used from both languages, which language has better library support from a game development perspective? Which language has better game-dev-specific resources available (books, websites, forums)?

    Read the article

  • Oracle Developer Day: The Oracle Database in Practice

    - by A&C Redaktion
    In the new year, Oracle Developer Days will again take place in various cities! In this event, put together specifically by our database colleagues, you will pick up many practical tips and tricks and be brought up to date on the following topics:
    - The differences between the editions and their secrets
    - The extensive base feature set available even without extra options
    - Performance and scalability in the individual editions
    - Cost and resource savings made easy
    - Security in the database
    - Increasing availability with simple means
    - Handling large data volumes
    - Cloud technologies in the Oracle Database
    An outlook on the features of the new database version planned for 2013 rounds off the workshop. Dates, agenda, venues, and registration can be found here. Register for the event today - attendance is free of charge!

    Read the article

  • jQuery with SharePoint solutions

    - by KunaalKapoor
    For me, jQuery is the 'Plan B' for everything, and most of my projects include the use of jQuery for something or other, so I decided to write a small note on what works best when using jQuery along with SharePoint. I prefer to use the jQuery JavaScript library, which is far more robust, easier to use, and allows for plugins. Follow the steps below to add jQuery to your master page. For Office 365, the preferred location to add jQuery files is the "Site Assets" library.
    Deployment Best Practices
    - Best practices are only as good as the context in which they are applied. In other words, take your environment into account before applying them.
    - Script your deployment options: a folder in SharePoint Designer (SPD), the file system, or external references. The jQuery library is on the Microsoft Ajax Content Delivery Network. You may even choose to publish to and from a document library (there are pros and cons to this approach).
    - Reference options when referencing the script:
      - ScriptLink will make sure the script is loaded at the top of the page and only loaded once. You need Visual Studio or SPD.
      - Content Editor Web Part (CEWP): drop it on the page and it's there. Easy, but dangerous.
      - Custom Actions: great for global deployments of jQuery. It loads the script on every page, and it also works in sandboxed installations.
    Deployment Maintenance Don'ts
    - Don't add scripts directly to your master page. That's way too much effort, because the pages are hard to maintain.
    - Don't add scripts directly to the CEWP. Use a content link instead; that allows for reuse, and if you or someone else deletes the CEWP you won't lose the code in the web part.
    - Security: any script runs with the same privileges as the current user. In other words, you can't get into trouble.
    Development Best Practices
    - Don't abuse the DOM. There are better options than hitting it 1,000 times.
    - Use other performance boosters. Try other libraries. Try some custom code.
    - Avoid string conversion.
    - Minify your files.
    - Use CAML to reduce the number of returned rows.
    - Only update your jQuery library AFTER RIGOROUS REGRESSION TESTING.
    - CRUD operations can come with some fun. SPServices wraps SharePoint's web services for execution.
    - The Bing SDK is pretty easy to use. You can add it to your page with a script, put it into a Content Editor Web Part, and connect it from the address parameters in a list.
    Steps:
    1. Go to jquery.com and download the latest jQuery library to your desktop. You want the compressed production version, not the development version.
    2. Open SharePoint Designer (SPD) and connect to the root level of your site's site collection. In SPD, open the "Style Library" folder. Create a folder named "Scripts" inside the Style Library. Drag the jQuery library JavaScript file from your desktop into the Scripts folder. In the Scripts folder, create a new JavaScript file and name it (e.g. "actions.js").
    3. If you are using Visual Studio, add a folder for JS files. You can create a new folder at the root level or, if you prefer cleaner solutions like me, use the layouts folder, which is cleaned out on deactivation/uninstall.
    4. Within the <head> tag of the master page, add a script reference to the jQuery library just above the content placeholder named "PlaceHolderAdditionalPageHead" (and above your custom CSS references, if applicable) as follows:
    <script src="/Style%20Library/Scripts/{jquery library file}.js" type="text/javascript"></script>
    Immediately after the jQuery library reference, add a script reference to your custom scripts file as follows:
    <script src="/Style%20Library/Scripts/actions.js" type="text/javascript"></script>
    Inside your script tag, you can test whether jQuery is already defined and, if not, add it to the page:
    <script type='text/javascript'>
      if (typeof jQuery == 'undefined')
        document.write('<scr'+'ipt type="text/javascript" src="http://code.jquery.com/jquery-1.6.1.min.js"></sc'+'ript>');
    </script>
    For the inquisitive few... read on if you'd like :)
    Why jQuery on SharePoint is Awesome
    It's all about that visual wow factor. You can get past the "But it looks like SharePoint" reaction: take a long list view, put it into jQuery with pagination and so on, and you are the hero. It's also about the new controls you get with jQuery that you couldn't build before.
    Why jQuery with SharePoint can be Awful
    Although it's fairly easy to get jQuery up and running, copy/paste can cause problems. If you don't understand what your code is doing in the Client Object Model and the Document Object Model, it will do things on your site that are completely unexpected. Many blogs note workarounds their authors employed on their sites.
    - Why it's not working: debugging "sucks". Develop small blocks of functionality and test them by putting in some alerts and console.log calls. Set breakpoints and monitor the DOM via Firebug and the IE developer tools.
    - Performance: it happens all the time, but look at the trade-offs. More time may give you more functionality.
    - Consistency: "But it works fine on my computer." Test on many browsers and take client resources into account.
    - Harm the farm: code wisely and do negative testing. Don't be the cause of a DoS attack that is really just jQuery asking for a resource over and over again. Monitor server resources. One demo had jQuery pulling data from a list in an endless loop - a poor decision but an easy mistake. It spiked the server resources within a couple of seconds, and the call had to be shut down before it brought the server down.
    Conclusion
    jQuery is now another tool in your toolkit. You don't have to use it; use it where it makes sense and where it helps you get your job done. Don't abuse it, or you will pay for it later. It will add to page bloat, so take that into account, and it can slow your performance.

    Read the article

  • Gaming with ATI open-source drivers

    - by user7174
    I have recently bought Humble Bundle 2 (http://www.humblebundle.com/). Is there a way to run Braid using ATI's open-source drivers? The game always crashes. When I do get it to start in windowed mode, it crashes once I go to the first level. I am using the latest version of Braid (S3TC ignored). When I use the proprietary drivers, Braid works flawlessly and World of Goo performance is increased; however, there is terrible screen tearing with the ATI proprietary drivers. So my question is: how do I play Braid if I don't want to use the proprietary drivers?

    Read the article

  • Reminder: GlassFish 3.1 Clustering Webinar Today!

    - by alexismp
    Quick reminder for those of you that missed the GlassFish Clustering Webinar from March, we have a new session today (June 28th, 2011). The session is planned at 10:00 a.m. PT / 1:00 p.m. ET / 19.00 CT and you'll need to register first. John Clingan, Principal Product Manager for GlassFish, will walk you through the various clustering features introduced and enhanced in version 3.1. This includes the SSH-based provisioning of clusters (never log in to any machine again), the centralized administration, High Availability and smart failover, load-balancer, Domain Admin Server (DAS) performance improvements, cluster deployments and more. Other than learning about these new product features this is also your chance to ask questions to John and other GlassFish team members. See you there!

    Read the article

  • Why isn't there a Python compiler to native machine code?

    - by user2986898
    As I understand it, the cause of the speed difference between compiled languages and Python is that the former compile code all the way down to native machine code, whereas Python compiles to Python bytecode, which is then interpreted by the PVM. I see that this way Python code can be used on multiple operating systems (at least in most cases). However, I do not understand why there is not an additional (and optional) compiler for Python that compiles the same way traditional compilers do. This would leave it to the programmer to choose which is more important to them: multi-platform executability or performance on the native machine. In general, why are there no languages that can behave as both compiled and interpreted?

    Read the article

  • Coding style advice? [closed]

    - by user1064918
    I'm a new grad. I've received a lot of complaints from my supervisor at work during code-review sessions with regard to my coding style (surprise!). I don't know if it's just him being cranky or whether my style really is that annoying to read. I come from the low-level language world (assembly, mostly), so I've been taught to use bitwise ops and all the cool tricks to do math whenever possible. I also have the habit of doing some other things that have been regarded as "too excessively dense to read". So I'm hoping to get some feedback from experienced programmers! :) Also, how should I weigh code performance against readability? Thanks!!

    Read the article

  • A little SQL tip for C# developers

    - by MikeParks
    The other day at work I came across a handy little block of SQL code on Jeremiah Clark's blog. It's pretty simple logic, but to the mind of a C# developer making some quick DB updates, it seems more likely to end up written as Solution 1 instead of Solution 2 below. Basically, I needed to check whether a specific record exists in Table1: if it does, update that record, otherwise insert a new one.
    Solution 1:
    IF EXISTS (SELECT * FROM Table1 WHERE Column1='SomeValue')
        UPDATE Table1 SET (...) WHERE Column1='SomeValue'
    ELSE
        INSERT INTO Table1 VALUES (...)
    Solution 2:
    UPDATE Table1 SET (...) WHERE Column1='SomeValue'
    IF @@ROWCOUNT=0
        INSERT INTO Table1 VALUES (...)
    As Jeremiah explains, both accomplish the same thing, but from a performance standpoint Solution 2 is the better way to go (it saves a table/index scan). Just wanted to throw this small tip out there. Thanks! - Mike

    Read the article

  • Understanding #DAX Query Plans for #powerpivot and #tabular

    - by Marco Russo (SQLBI)
    Alberto Ferrari wrote a very interesting white paper about DAX query plans. We published it on a page where we'll gather articles and tools about DAX query plans: http://www.sqlbi.com/topics/query-plans/ I reviewed the paper, and it is the result of many months of study - we know we have just scratched the surface of this topic, also because we still don't have enough information about the internal behavior of many of the operators contained in a query plan. However, by reading the paper you will be able to start reading a query plan and will understand how the optimization that Chris Webb found a month ago for the events-in-progress scenario works. The white paper also contains a more optimized query (10 times faster), although the performance depends on data distribution, and the best choice really depends on the data you have. Now you should be curious enough to read the paper until the end, because the more optimized query is the last example in the paper!

    Read the article

  • Erlang web frameworks survey

    - by Zachary K
    (Inspired by a similar question about Haskell.) There are several web frameworks for Erlang, such as Nitrogen, Chicago Boss, and Zotonic, and a few more. In what aspects do they differ from each other? For example:
    - features (e.g. server only, or also client scripting; easy support for different kinds of databases)
    - maturity (e.g. stability, documentation quality)
    - scalability (e.g. performance, handy abstractions)
    - main targets
    Also, what are some examples of real-world sites / web apps using these frameworks? EDIT: Starting a bounty in hopes that it will get some conversation going.

    Read the article

  • NEON Intrinsic Support in CE7

    - by Kate Moss' Open Space
    Just a side note for people who may be interested in writing high-performance code that takes advantage of the NEON instruction set but who wish to use NEON intrinsics instead of coding assembly. The compiler won't generate NEON opcodes unless the application uses the NEON intrinsics explicitly. Basically, you need an ARMv7 build environment so the compiler can emit NEON opcodes. The intrinsic prototypes can be found in public\COMMON\sdk\inc\arm_neon.h, and that is all you get. If you ever find a NEON opcode that does not have a corresponding intrinsic, you still need to use the old trick: write that part of the code in assembly.
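    To make that concrete, here is a minimal sketch (not from the original post) of what a NEON intrinsic call looks like. It assumes an ARMv7/NEON-enabled toolchain and the arm_neon.h header mentioned above; the function name and the multiple-of-4 length requirement are illustrative choices.

    #include <arm_neon.h>

    /* Adds two float arrays four lanes at a time using NEON intrinsics.
       Assumes n is a multiple of 4; a scalar tail loop would handle any remainder. */
    void add_f32(const float *a, const float *b, float *out, int n)
    {
        for (int i = 0; i < n; i += 4) {
            float32x4_t va = vld1q_f32(a + i);   /* load 4 floats from a   */
            float32x4_t vb = vld1q_f32(b + i);   /* load 4 floats from b   */
            float32x4_t vr = vaddq_f32(va, vb);  /* lane-wise addition     */
            vst1q_f32(out + i, vr);              /* store 4 results to out */
        }
    }

    Because only intrinsics are used, the compiler emits the corresponding NEON opcodes (VLD1, VADD.F32, VST1) without any hand-written assembly.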

    Read the article

  • Circle-rectangle collision in 2D: most efficient way

    - by john smith
    Suppose I have a circle intersecting a rectangle. Which of the two approaches below is ideally the least CPU-intensive?
    Method A: calculate the rectangle boundaries, then loop through all points of the circle and, for each of them, check whether it is inside the rectangle.
    Method B: calculate the rectangle boundaries, check where the center of the circle is relative to the rectangle, write 9 switch/case statements for the following positions (top, bottom, left, right, top left, top right, bottom left, bottom right, inside the rectangle), and check only one distance against the circle's radius depending on where the circle happens to be.
    I know there are other ways that are definitely better than these two, and if you could point me to a link to them that would be great, but between exactly these two, which one would you consider better, regarding both performance and quality/precision? Thanks in advance.
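    For reference, the "other ways" the asker alludes to usually boil down to the clamp-based test: clamp the circle's centre to the rectangle to find the closest point, then compare a single squared distance against the squared radius, avoiding both the per-point loop of method A and the nine-case switch of method B. A minimal C sketch follows; it assumes an axis-aligned rectangle, and the type and function names are illustrative rather than from the original question.

    #include <stdbool.h>

    /* Axis-aligned rectangle given by its min/max corners. */
    typedef struct { float min_x, min_y, max_x, max_y; } Rect;

    static float clampf(float v, float lo, float hi)
    {
        return v < lo ? lo : (v > hi ? hi : v);
    }

    /* Returns true if the circle (cx, cy, radius r) overlaps the rectangle.
       Clamping the centre to the rectangle yields the closest point on the
       rectangle; comparing squared distances avoids a sqrt call. */
    bool circle_rect_overlap(float cx, float cy, float r, Rect rc)
    {
        float nx = clampf(cx, rc.min_x, rc.max_x);
        float ny = clampf(cy, rc.min_y, rc.max_y);
        float dx = cx - nx;
        float dy = cy - ny;
        return dx * dx + dy * dy <= r * r;
    }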

    Read the article

  • Game engine design: Multiplayer and listen servers

    - by jarx
    My game engine right now consists of a working singleplayer part. I'm now starting to think about how to do the multiplayer part. I have found out that many games actually don't have a real singleplayer mode: when playing alone you are actually hosting a local server as well, and almost everything runs as if you were in multiplayer (except that the data packets can be passed over an alternate route for better performance). My engine would need major refactoring to adapt to this model. There would be three possible modes: dedicated client, dedicated server, and client-server (listen mode).
    * How often is the listen-server model used in the gaming industry?
    * What are the (dis)advantages of it?
    * What other options do I have?

    Read the article

  • Designing an API on top of Java RMI and REST APIs

    - by user1303881
    I'm working on the backend of a Java web application. We have a document repository (Fedora Commons, specifically) where we house XML files. I want to abstract the repository's API internally so that we aren't tightly coupled to one product. I'd also like to offer the flexibility of connecting to a repository via Java RMI or REST APIs. I was hoping to get advice or resources on how to implement something like this. My thought is that I'd have some abstract repository class with methods like getRecord, updateRecord, and deleteRecord. In the constructor I would pass the URI for the repository along with the API method and port. This would allow some flexibility in the future if the REST API becomes more practical, while still allowing the use of RMI, which could (should?) have better performance. Am I overthinking this, or am I on the right path?

    Read the article

  • Moving from XNA/C# to DirectX/C++, quite confused

    - by misiMe
    I made some games with XNA/C# for Windows Phone and Windows 8. Since XNA is dead and Visual Studio doesn't support it (I have to target Windows Phone 7.1 to build with XNA), I want to start learning something more "consistent in time" and improve my skills. I'm a little confused about the possibilities, because C++/DirectX alone seems difficult, so I found some high-level libraries to help: DirectX Toolkit and Cocos2D. My questions are: What will happen when they "die" like XNA did? Is the C++ approach more "professional" than C#/XNA, and why? Is the C++ approach more portable? Is the C++ approach more resistant in terms of time? Are there any performance considerations between DirectX TK and Cocos2D? I ask because I have found that every game software house in my country looks for skilled C++ programmers.

    Read the article

  • Topeka Dot Net User Group (DNUG) Meeting – April 6, 2010

    Topeka DNUG is free for anyone to attend! Mark your calendars now! SPEAKER: Troy Tuttle is a self-described pragmatic agilist and Kanban practitioner with more than a decade of experience delivering software in the finance and health industries and as a consultant. He advocates that teams improve their performance through the pursuit of better practices like continuous integration and automated testing. Troy is the founder of the Kansas City Limited WIP Society and is a speaker at local area groups...

    Read the article

  • Moving large website to new CMS - URL changes

    - by herrherr
    Hi, I was wondering if you have any tips on the following situation. I'm going to move a large website to a new content management system. Here are some details on the site:
    - online news magazine with roughly 3,000 articles
    - domain age: 10 years
    - online in its current form since May 2010
    - indexed pages: ~10,000
    - percentage of search engine traffic: under 10%
    Unfortunately, a custom-tailored CMS was used for the site. Its performance, reliability, and SEO capabilities have been really bad, so we are moving to a new and proven open-source CMS. All the articles will be kept as they are, but the URL structure as well as the structure of the HTML templates will change. What I want to do now is create 301 redirects for all articles from the old schema to the new one, i.e.:
    Old: www.example.com/en/html/news/detail/title-of-the-article/
    New: www.example.com/category/title-of-article.html
    Is this a proven way to handle a migration like this? If not, can you recommend a way that has worked for you? Thanks :)

    Read the article

  • Heading in the Right Direction: Garmin Exadata adoption

    - by Javier Puerta
    A pioneer in global positioning system (GPS) navigation, Garmin International Inc. has adopted Exadata to support not only the infrastructure powering the company's Oracle Advanced Supply Chain Planning deployment, but also the company's fitness segment, which provides customers with an online platform to store, retrieve, and interact with data captured using Garmin fitness products. The environment, which is built on an Oracle Database, processes approximately 40 million queries per week. Prior to the adoption of Oracle Exadata Database Machine, as the online offering grew in popularity it began to face reliability issues that negatively impacted the customer experience. We included the video testimonial in a previous post. Now you can find a complete set of materials about this customer story:
    - Garmin Customer Reference
    - Garmin video testimonial: Garmin Consolidates on Exadata for 50% Performance Boost
    - Profit Magazine article: Heading in the Right Direction

    Read the article

  • Oracle SOA Suite 11g Administrator's Handbook

    - by Antony Reynolds
    SOA Administration Book: I have just received a copy of the "Oracle SOA Suite 11g Administrator's Handbook", so as soon as I have read it I will let you know what I think. In the meantime, the first thing that struck me was the authors' credentials. Although I don't remember ever meeting either of them, I have read Ahmed's blog postings and they are a great community resource, so I am immediately well disposed towards the book. Similarly, Arun is an employee of my friend and co-author Matt Wright, and I have heard good things about him from the Rubicon Red people. A first glance at the table of contents looks encouraging; I particularly like their approach to performance tuning, where they give a clear, concise explanation of the knobs they are using. More when I have read more.

    Read the article

  • Oracle OpenWorld Update -- Scaling Infrastructure to Meet Business Growth: A Coherence Customer Panel

    - by Ruma Sanyal
    Today, the Monday of OpenWorld, is packed with great content and sessions. I have already blogged about the general session by Ajay Patel and the classic Cloud Application Foundation roadmap and strategy session by Mike Lehmann, but we would be remiss if we didn't list the customer panel for Coherence. Come listen to customers spanning a wide variety of industries, such as consumer goods, railways, and agricultural biotechnology, discuss how Oracle Coherence enables business growth, cost cutting, and improved customer experience. You will learn how Coherence helps scale services cost-effectively, improve performance, and assure service availability in both on-premises and cloud deployments. Each customer will present details of their specific use cases, benefits, and war stories of developing, deploying, and managing some of the largest data grid deployments in the world. The session will be moderated by Cameron Purdy, VP of Development and Mr. Coherence himself. For more information about this and other Coherence sessions, review the Coherence Focus On document. Details: Monday, 10/1, 12:15 p.m. - 1:15 p.m., Moscone South, Room 309.

    Read the article
