Search Results

Search found 20275 results on 811 pages for 'general performance'.


  • Advantages of Hudson and Sonar over a manual process or homegrown scripts.

    - by Tom G
    My coworker and I recently got into a debate over a proposed plan at our workplace. We've more or less finished transitioning our Java codebase into one managed and built with Maven. Now, I'd like for us to integrate with Hudson and Sonar or something similar. My reasons for this are that it'll provide a 'zero-click' build step for giving testers new experimental builds, that it will let us deploy applications to a server more easily, and that tools such as Sonar will provide us with much-needed metrics on code coverage, Javadoc, package dependencies and the like.

    He thinks that the overhead of getting up to speed with two new frameworks is unacceptable, and that we should simply double down on documentation and create our own scripts for deployment. Since we plan on some aggressive rewrites to pay down the technical debt previous developers incurred (gratuitous use of Java's Serializable interface as a file storage mechanism that has predictably bitten us in the ass), he argues that we can document as we go, and that we'll end up changing a large swath of code in the process anyway.

    I contend that having the accurate metrics that Sonar (or fill in your favorite similar tool) provides gives us a good place to start for any refactoring efforts, not to mention general maintenance -- after all, knowing which classes are the most poorly documented, even if it's just a starting point, is better than seat-of-the-pants guessing. Am I wrong, and trying to introduce more overhead than we really need?

    Some more background: an alumnus of our company is working at a Navy research lab now and suggested these two tools in particular as ones they've had great success using. My coworker and I have also had our share of friendly disagreements before -- he's more of the "CLI for all, compiles Gentoo in his spare time and uses Git" type and I'm more of the "give me an intuitive GUI, plays with XNA and is fine with SVN" type, so there's definitely some element of culture clash here.

    Read the article

  • Oracle Brings Analytics to Project Management

    - by Sylvie MacKenzie, PMP
    Excerpt from PROFIT - ORACLE - by Alison Weiss

    Nonprofit and for-profit organizations have many differences, but there is one way they are alike—managers struggle with huge amounts of data generated every day. Project data by itself has limited use—but any organization that can gain insight to make accurate predictions or to use resources more effectively can gain an operational advantage.

    Oracle’s Primavera P6 Analytics 2.0 business intelligence solution enables organizations using Oracle’s Primavera P6 Professional Project Management to do just that: identify critical issues and uncover trends in stores of project data. Primavera P6 Analytics provides management with the ability to look at not only how a single effort is progressing, but also how the entire organization is doing from a project perspective.

    The latest release includes new features that make it even easier to gather and analyze critical information. For example, the addition of geocoding gives Primavera P6 Analytics users the ability to track resources geographically by longitude and latitude and to use a map to get an overall view of how projects, programs, and activities are deployed. “A nonprofit with relief projects in Vietnam, for example, can drill down to the project and get a world view and a regional view,” says Yasser Mahmud, vice president of product strategy and industry marketing in Oracle’s Primavera Global Business Unit. “Then they can drill down further to show statistics; key performance indicators; and how that program, portfolio, or project work is actually getting done.”

    The addition of new mobile capabilities to Primavera P6 Analytics puts deep-dive analysis into project managers’ hands, with compatibility with major tablet operating systems. Now, nonprofits or for-profits working in remote locations can provide real-time visibility into projects to alert management if issues are occurring that need to be addressed immediately. “Primavera P6 Analytics generates information that can help organizations improve their utilization and trim down overall operating costs,” says Mahmud. “But more importantly, it gives organizations improved visibility.”

    Read the article

  • CodePlex Daily Summary for Thursday, June 21, 2012

    CodePlex Daily Summary for Thursday, June 21, 2012

    Popular Releases:

    - uComponents: uComponents v3.1.1: Continuing on from 84817, we are proud to announce our 3.1.1 release! The following issues have been resolved: 14640, 14696, 14704, 14724. Please note: this release is not to be confused with the upcoming 80410 (which will support .NET 4.0).
    - MVVM Light Toolkit: V4RTM (binaries only) including Windows 8 RP: This package contains all the latest DLLs for MVVM Light V4 RTM. It includes the DLLs for Windows 8 Release Preview. An updated NuGet package is also available at http://nuget.org/packages/MvvmLightLibs. An installer with binaries, snippets and templates will follow ASAP.
    - Weapsy - ASP.NET MVC CMS: 1.0.0: Some changes to Layout and CSS; changed version number to 1.0.0.0; solved Cache and Session items handler error in IIS 7; created the Modules, Plugins and Widgets Areas; replaced CKEditor with TinyMCE; created the System Info page; minor changes.
    - Auto Proxy Configuration: Windows Proxy Setup V1.2: Bug fixes.
    - XDA ROM Hub: XDA ROM HUB v0.5: Added XRH Backup -- backup and restore data, system and cache. USE AT YOUR OWN RISK! USE ONLY IN RECOVERY!
    - AcDown Downloader Framework: AcDown v3.11.7: A C# download manager for Acfun, Bilibili, YouTube, and other sites; requires .NET Framework 2.0; runs on 32- and 64-bit Windows XP/Vista/7/8. (Remaining release notes in Chinese, lost to encoding.)
    - Apex: Apex 1.4: Apex 1.4 provides a framework for rapid MVVM development. Download Apex 1.4 to get the core binaries, Visual Studio Extensions, project templates, samples and documentation. The 1.4 release provides a vast number of enhancements via the Apex Broker, an object that can be used to retrieve models, get the view for a view model and more, much like an IoC container. The new Zune-style application templates for WPF and Silverlight give a great starting point for makin...
    - NShader - HLSL - GLSL - CG - Shader Syntax Highlighter AddIn for Visual Studio: NShader 1.3 - VS2010 + VS2012: This is a small maintenance release to support the new VS2012 as well as VS2010. This release also fixes the issue where "Comment Selection" included the first line after the selection. If the new NShader version doesn't highlight your shader, you can try to: remove the registry entry HKEY_CURRENT_USER\Software\Microsoft\VisualStudio\11.0\FontAndColors\Cache; remove all lines using "fx" or "hlsl" in the file C:\Program Files (x86)\Microsoft Visual Studio 11.0\Common7\IDE\CommonExtensions\Micr...
    - JSON Toolkit: JSON Toolkit 4.0: Up to 2.5x performance improvement in stringify operations; up to 1.7x performance improvement in parse operations; improved error messages when parsing invalid JSON strings; extended support to .NET 2.0, .NET 3.5, .NET 4.0, Silverlight 4, Windows Phone, Windows 8 Metro apps and Xbox; JSON namespace changed to ComputerBeacon.Json.
    - Xenta Framework - extensible enterprise n-tier application framework: Xenta Framework 1.8.0: System requirements: OS: Windows 7, Windows Vista, Windows Server 2008, or Windows Server 2008 R2. Web server: Internet Information Services 7.0 or above. .NET Framework 4.0 with the WCF Activation feature (HTTP Activation, Non-HTTP Activation for net.pipe/net.tcp WCF bindings). ASP.NET MVC 3.0. Database: Microsoft SQL Server 2005, 2008, or 2008 R2. Additional deployment configuration: started Windows Process Activation service, Start...
    - ASP.NET REST Services Framework: Release 1.3 - Standard version: The REST-services Framework v1.3 has important functional changes allowing the use of complex data types as service call parameters. These can be mapped to form or query string variables or the HTTP message body, which is especially useful when REST-style service URLs are used with the POST or PUT HTTP methods. Beginning from v1.1, the REST-services Framework is compatible with the ASP.NET Routing model as well as with the CRUD (Create, Read, Update, and Delete) principle. These two are often important when buildin...
    - NanoMVVM: a lightweight wpf MVVM framework: v0.10 stable beta: Minor fixes to UI and code, added an error example to async commands, separated the project into various releases (mainly into logical wholes), removed Expression Blend satellite assemblies.
    - MFCMAPI: June 2012 Release: Build 15.0.0.1034. Full release notes at SGriffin's blog. If you just want to run MFCMAPI or MrMAPI, get the executables. If you want to debug them, get the symbol files and the source. The 64-bit builds will only work on a machine with Outlook 2010 64-bit installed; all other machines should use the 32-bit builds, regardless of the operating system.
    - MonoGame - Write Once, Play Everywhere: MonoGame 2.5.1: The MonoGame team are pleased to announce that MonoGame v2.5.1 has been released. This release contains important bug fixes and minor updates. Recent additions include project templates for iOS and MacOS; the MonoDevelop.MonoGame AddIn also works on Linux. We have removed the dependency on the third-party GamePad library to allow MonoGame to be included in the Debian/Ubuntu repositories. There has been a major bug fix to ensure textures are disposed of correctly, as well as some ...
    - ????: ????2.0.2: Release notes in Chinese, partially lost to encoding; mentions DJ features, .NET 4.5 (Windows 8) support, and Windows 8 fixes.
    - Azure Storage Explorer: Azure Storage Explorer 5 Preview 1 (6.17.2012): Azure Storage Explorer version 5 is in development, and Preview 1 provides an early look at the new user interface and some of the new features. New in v5 Preview 1: new UI, similar to the new Windows Azure HTML5 portal; support for configuring and viewing storage account logging; support for configuring and viewing storage account monitoring; uses the Windows Azure 1.7 SDK libraries; bug fixes.
    - Codename 'Chrometro': Developer Preview: Welcome to the Codename 'Chrometro' Developer Preview! This is the very first public preview of the app. Please note that this is a highly primitive build and the app is not even half of what it is meant to be. The Developer Preview sports the following: 1) an easy-to-use application setup; 2) the Assistant, which simplifies your task of customization; 3) the partially complete Metro UI; 4) a variety of settings; 5) a partially complete web browsing experience. To get started, download the Ins...
    - KangaModeling: Kanga Modeling 1.0: This is the public release 1.0 of Kanga Modeling.
    - Cosmos (C# Open Source Managed Operating System): Release 92560: Prerequisites: Visual Studio 2010 (any version, including Express; Express users must also install the Visual Studio 2010 Integrated Shell runtime) and VMWare. Cosmos can run on real hardware as well as other virtualization environments, but our default debug setup is configured for VMWare Player (free) or Workstation with the VMWare VIX API 1.11.
    - AutoUpdaterdotNET - Autoupdate for VB.NET and C# Developer: AutoUpdater.NET 1.1: New feature added that allows the user to select the remind-later interval.

    New Projects:

    - .NET Heatmap: A simple project using C#, jQuery, and heatmap.js that allows you to create a heatmap for a web page using static data from a SQL database.
    - Advanced Data Server: Advanced Data Server (ADS) is a library that enables you to create powerful server applications with little code.
    - ARPAMISproject: An ambitious project that aims to provide an extensive business solution to schools, universities, and other academic institutions alike.
    - ArraySegments (by Stephen Cleary): Lightweight extension methods for ArraySegment<T>, particularly useful for byte arrays.
    - Auto Proxy Configuration: This tool sets the proxy server automatically according to the DNS domain.
    - Bongiozzo Photosite: Simple photo site written using ASP.NET MVC 4.0 over the Flickr API and the Galleria image gallery framework.
    - Cloud Media SharePoint Extension: With this extension you can easily add media from the cloud, like YouTube or Vimeo. Metadata from Vimeo or YouTube will be added too.
    - Content Organizer Rule Manager SharePoint 2010: Create and manage content organizer rules faster for SharePoint 2010.
    - Crm Customization Manager: Crm Customization Manager (CCM) by N.JL helps Dynamics CRM system administrators to easily import customisations and translations, with scheduling possibility.
    - demoHello: asp.net
    - Ghost Puzzle: Protect your files with Ghost Puzzle.
    - IonoWumpus: A simple Hunt the Wumpus implementation.
    - LargeSky Personal Project: A personal project, just as a place for code.
    - Maze Game: Maze Game is a game for children and computer beginners to practice mouse movement with fun.
    - NASM Develop IDE (The Open Source NASM Development Environment For Windows): A simple, lightweight, all-in-one application that can help developers develop NASM applications in Windows without the need to remember fancy commands.
    - Navigational: Gator brings navigation to projects built around life situations.
    - Real Folders for Visual Studio: A free plugin which makes Solution Folders map to real file system folders. With Real Folders you have the opportunity to organize your files in a simpler way than standard Visual Studio Solution Folders behave (completely uncommitted to any folder on your file system).
    - SaltFx: An N-Layered Domain Driven Design (DDD) framework for .NET development.
    - SharePoint Location-Based Weather Webpart: Uses UPS information to display local weather for the user via the Yahoo Weather service.
    - SharePoint Publishing: The project is aimed at supplying publishers with tools and design elements to provide a means of publishing through SharePoint.
    - SpellLight - Lightweight Silverlight Spell Checker Library: SpellLight is a lightweight Silverlight spell check library.
    - SUDOKU APP: None
    - Trainer: An application used for logging runs, nutrition, weight, and other goals.
    - weibospace: An iOS project.

    Read the article

  • Out-of-the-Box Integration Links Primavera Solutions with PeopleSoft Projects Applications

    - by Sylvie MacKenzie, PMP
    In a move that brings best-in-class enterprise project portfolio management to Oracle’s PeopleSoft enterprise resource planning customers, Oracle announced the integration of Oracle’s PeopleSoft projects applications and Oracle’s Primavera P6 Enterprise Project Portfolio Management. The combination of PeopleSoft financial controls and Primavera portfolio management capabilities brings greater oversight of end-to-end processes to help organizations improve the planning and execution efforts needed to deliver projects on time and within budget.

    “As an organization with many high-value, project-driven initiatives, we are very pleased to see Oracle’s investment in this important integration,” says Janardhanan Sankar, senior vice president for technology and quality at ITC Infotech India Ltd.

    Oracle’s PeopleSoft projects applications enable project-centric organizations and departments to establish core operational processes for full project lifecycle management across operations and finance. The integration with Primavera P6 Enterprise Project Portfolio Management means organizations can eliminate costly and difficult-to-maintain proprietary integrations. Organizations can also standardize on the Oracle technologies to:

    - Align back-office budgets and costs with project operations to help ensure accurate forecasting of costs, resources, and schedules
    - Provide an accurate single source of truth to financial managers and analysts using Oracle’s PeopleSoft projects applications, and to project managers using Primavera P6 Enterprise Project Portfolio Management
    - Enhance project collaboration and execution by having all users utilize common solutions to communicate, plan, and deliver projects

    “By bringing together Oracle’s PeopleSoft projects applications and Oracle’s Primavera P6 Enterprise Project Portfolio Management, we are able to provide customers with the infrastructure they need to achieve a single source of truth on the projects they are managing,” says Paco Aubrejuan, Oracle’s group vice president and general manager, PeopleSoft. “This real-time visibility drives profitability, increases productivity, and improves operations.”

    For more information, view the on-demand Webcast, “Bridging Business Processes for Optimal Portfolio Performance,” or read about the new integration.

    Read the article

  • ArchBeat Link-o-Rama Top 20 for March 18-24, 2012

    - by Bob Rhubart
    The top-twenty most-clicked links as shared via my social networks for the week of March 18-24, 2012:

    - Oracle's ZFS Storage Appliance Simulator | Steen Schmidt
    - Oracle Linux Online Forum - 4 sessions, 9 speakers + live chat March 27
    - OWSM vs. OEG - When to use which component - 11g | Prakash Yamuna
    - Northeast Ohio Oracle Users Group 2 Day Seminar - May 14-15 - Cleveland, OH
    - SOA! SOA! SOA!; OSB 11g Recipes and Author Interviews
    - Webcast: Oracle Business Intelligence Mobile - March 27 - 10am PT / 1pm ET
    - Oracle Hardware Systems: The Extreme Performance Tour - Dates and Locations Worldwide
    - Oracle Cloud Conference: dates and locations worldwide
    - Mismatch: Developer skills and customer demands | Floyd Teter
    - OTN Virtual Developer Day - Java (APAC - in English) - March 27
    - Webcast Q&A: Demystifying External Authorization
    - 2 New Cloud Computing resources added to free IT Strategies from Oracle library
    - Encapsulating OIM API’s in a Web Service for OIM Custom SOA Composites | Alex Lopez
    - Webcast: Simplify Oracle RAC Deployment with Oracle VM
    - SOA gets mobilized; mobile gets SOA-ized: survey | Joe McKendrick
    - Integrating with Oracle Fusion Applications: Discovering Integration Artifacts | Rajesh Raheja
    - Oracle Access Manager 11g - useful links | Dmitry Nefedkin
    - Anil Gaur on Cloud Computing Support in Java EE 7
    - Enterprise app shops announcements are everywhere | Andy Mulholland
    - The extraordinary software development manager | Seth Godin

    Thought for the Day: "Every large system that works started as a small system that worked." — Anonymous

    Read the article

  • Welcome to the FMW Install and Admin Proactive Team Blog

    - by Daniel Mortimer
    Introduction

    Welcome to the Fusion Middleware Install and Administration Proactive Support blog. This is our first post, so let's begin by introducing ourselves and our mission.

    Who We Are

    We are a small team of support engineers based in Europe. Our expertise covers all matters related to the installation and administration of Oracle Application Server 10g, Oracle Fusion Middleware 11g, and future versions to come. We particularly focus on core components such as:

    - the Installers and Configuration Wizards
    - Web Tier (Oracle HTTP Server)
    - OPMN
    - Enterprise Manager Console for Application Server

    as well as general questions and problems relating to patching, maintenance, and architecture.

    Our Mission

    - Improve the customer experience
    - Enable customers to avoid and prevent issues when working with our products
    - Enable faster resolution of problems when they occur

    Our Activities

    - Enhancement and maintenance of our knowledge base; in particular, developing and maintaining special content such as the Fusion Middleware Information Centers and Lifecycle Support Advisors
    - Seeking continuous improvement of the product documentation
    - Contributing to the Fusion Middleware Support News
    - Moderation of the "Oracle Application Server" support community
    - Participating in the Support Advisor Webcast program
    - Involvement in the lifecycle of diagnostic tools such as RDA and OCM: user acceptance testing, and logging of enhancements and health check ideas
    - Providing feedback to product management and development: logging of product bugs and enhancements, and suggesting improvements that could be made to web sites like OTN
    - Promoting new support documents and tools via channels such as the Newsletter and Social Media

    We hope that this blog will be a two-way communication, as we are interested in feedback on what we can improve. Many suggestions we can act on immediately, while others may take more time, but all of them will be acknowledged and followed up.

    Thank you for your time, and we look forward to both informing and working with you.

    Postscript: Many links you will find in our blog entries will require a login to My Oracle Support. For readers who do not have a login, please accept our apologies - when and where possible we will endeavour to ensure the links supplement rather than replace the wording in the blog entries.

    Read the article

  • Distributed transactions and queues, ruby, erlang

    - by chrispanda
    I have a problem that involves several machines, message queues, and transactions. For example, a user clicks on a web page; the click sends a message to another machine, which adds a payment to the user's account. There may be many thousands of clicks per second. All aspects of the transaction should be fault tolerant.

    I've never had to deal with anything like this before, but a bit of reading suggests this is a well-known problem. So to my questions. Am I correct in assuming that a secure way of doing this is with a two-phase commit, but that the protocol is blocking and so I won't get the required performance? It appears that DBs like Redis and message queuing systems like Resque, RabbitMQ, etc. don't really help me a lot -- even if I implement some sort of two-phase commit, the data will be lost if Redis crashes, because it is essentially memory-only.

    All of this has led me to look at Erlang -- but before I wade in and start learning a new language, I would really like to understand better if this is worth the effort. Specifically, am I right in thinking that, because of its parallel processing capabilities, Erlang is a better choice for implementing a blocking protocol like two-phase commit, or am I confused?
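
    For concreteness, here is a minimal sketch of what the two-phase commit protocol mentioned above looks like (illustrative Java; the Participant interface and class names are hypothetical, not taken from any particular queueing library). It makes the blocking behavior visible: the coordinator cannot finish until every participant has voted and acted, so one slow or crashed participant stalls the whole transaction:

        import java.util.List;

        // Hypothetical contract: each resource (database, queue, ...) must be
        // able to durably promise a commit before actually performing it.
        interface Participant {
            boolean prepare();   // phase 1: vote yes/no and record intent
            void commit();       // phase 2: make the prepared work permanent
            void rollback();     // phase 2 alternative: undo the prepared work
        }

        class TwoPhaseCommitCoordinator {
            public boolean execute(List<Participant> participants) {
                for (Participant p : participants) {
                    if (!p.prepare()) {                        // any "no" aborts all
                        participants.forEach(Participant::rollback);
                        return false;
                    }
                }
                participants.forEach(Participant::commit);     // all voted yes
                return true;
            }
        }

    At thousands of clicks per second, that serial wait on every participant's vote is exactly the throughput concern raised in the question.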

    Read the article

  • CodePlex Daily Summary for Wednesday, November 07, 2012

    CodePlex Daily Summary for Wednesday, November 07, 2012

    Popular Releases:

    - Metodología General Ajustada - MGA: 03.04.03: Parmenio's changes: adjustments to the F02 programming format so that grid synchronization does not affect data saving. John's changes: integration of code with the changes sent by Parmenio; generation of installers. Technical support by email, telephone, and on site.
    - nAPI for Windows Phone: Naver Open API Library Assemblies and Source Codes: nAPI (Naver Open API Library) for the Windows family. Tested on Windows 8 (Windows Store app), Windows Phone 7, and Windows Phone 8 (emulator).
    - X-tee.NET: Xtee.NET 1.0: Generator and libraries.
    - Fiskalizacija za developere: FiskalizacijaDev 1.2: Version 1.2 is, above all, a response to the new version of the Technical Specification (v1.1) published a few days ago. Besides the news related to the (minor) changes in that new version of the technical documentation, we have extended the project with some additional features, most of which came from your suggestions - thanks :) New in v1.2: requirements mismatch (http://fiskalizacija.codeplex.com/workitem/645); sample project - amounts are multiplied by 100 (http://fiskalizacija.codeplex.c...
    - PowerComboBox: PowerComboBox VB v1.0: Visual Basic source code class file.
    - Edi: Themable Edi: Completed ExpressionDark theme; improved the error handling and reporting feature; refactored all views to be look-less controls.
    - MFCMAPI: October 2012 Release: Build 15.0.0.1036. Full release notes at SGriffin's blog. If you just want to run MFCMAPI or MrMAPI, get the executables. If you want to debug them, get the symbol files and the source. The 64-bit builds will only work on a machine with Outlook 2010 64-bit installed; all other machines should use the 32-bit builds, regardless of the operating system.
    - JayData - The cross-platform HTML5 data-management library for JavaScript: JayData 1.2.3: JayData is a unified data access library for JavaScript to CRUD + query data from different sources like OData, MongoDB, WebSQL, SQLite, HTML5 localStorage, Facebook or YQL. The library can be integrated with Knockout.js or Sencha Touch 2 and can be used on Node.js as well. See it in action in this 6-minute video, and in the Sencha Touch 2 example app using JayData: the Netflix browser. What's new in JayData 1.2.3: for detailed release notes check the release notes. TypeScript support: write your code in a ...
    - SSIS Expression Editor & Tester: Expression Editor and Tester v1.0.8.0: Getting started: download and extract the files; no install required. The ExpressionEditor.zip download contains a folder for each SQL Server version: ExpressionEditor2005, ExpressionEditor2008, ExpressionEditor2012. Changes: fixed issues 32868 and 33291 raised by BIDS Helper users; no functional changes from the previous release. Versions: there are three versions included, all built from the same code with the same functionality, but each targeting a different release of SQL Server. The downlo...
    - MCEBuddy 2.x: MCEBuddy 2.3.7: Changelog for 2.3.7 (32-bit and 64-bit): 1. Improved performance of MP4 Fast and M4V Fast profiles (no deinterlacing, removed --decomb). 2. Improved priority handling. 3. Added support for pausing and resuming conversions. 4. Added support for fallback to the source directory if the network destination directory is unavailable. 5. MCEBuddy now installs ShowAnalyzer during installation. 6. Added support for the long-description atom in iTunes.
    - FoxyXLS: FoxyXLS Releases: Source code and samples.
    - Dyanamic Reports (RDLC) - SharePoint 2010 Visual WebPart: Initial Release: This is an initial release.
    - HTML Renderer: HTML Renderer 1.0.0.0 (3): Major performance improvement (http://theartofdev.wordpress.com/2012/10/25/how-i-optimized-html-renderer-and-fell-in-love-with-vs-profiler/); minor fixes raised in the issue tracker and discussions.
    - ProDinner - ASP.NET MVC Sample (EF4.4, N-Tier, jQuery): 8: Update to ASP.NET MVC Awesome 3.0; update to Entity Framework 4.4; update to MVC 4; added dinners grid on the homepage.
    - ASP.net MVC Awesome - jQuery Ajax Helpers: 3.0: Added Grid helper; added XML documentation; added textbox helper; added client-side API for AjaxList; removed .SearchButton from AjaxList; AjaxForm and Confirm helpers have been merged into the Form helper; optimized HTML output for AjaxDropdown, AjaxList, and Autocomplete; works on MVC 3 and 4.
    - BlogEngine.NET: BlogEngine.NET 2.7: If you want to set up and start using BlogEngine.NET right away, you should download the Web project. If you want to extend or modify BlogEngine.NET, you should download the source code. If you are upgrading from a previous version of BlogEngine.NET, please take a look at the Upgrading to BlogEngine.NET 2.7 instructions. If you are looking for the Web Application Project, ...
    - Launchbar: Launchbar 4.2.2.0: This release is the first step in cleaning up the code and using all the latest features of .NET 4.5. Changes in 4.2.2 (2012-11-02): improved handling of left clicks. Changes in 4.1.0 (2012-10-17): removed tray icon; assembly renamed and signed with a strong name. Note: when you upgrade, Launchbar will start with the default settings. You can import your previous settings by following these steps: run Launchbar and just save the settings without configuring anything; shut down Launchbar; go to the folder %LOCA...
    - Mouse Jiggler: MouseJiggle-1.3: This adds the much-requested minimize-to-tray feature to Mouse Jiggler.
    - Umbraco CMS: Umbraco 4.10.0 Release Candidate: This is a release candidate, which means that if we do not find any major issues in the next week, we will release this version as the final release of 4.10.0 on November 9th, 2012. The documentation for the MVC bits still lives in the GitHub version of the docs for now and will be updated on our.umbraco.org with the final release of 4.10.0. Browse the documentation here: https://github.com/umbraco/Umbraco4Docs/tree/4.8.0/Documentation/Reference/Mvc If you want to do only MVC then make sur...
    - Skype Auto Recorder: SkypeAutoRecorder 1.3.4: New icon and images. Reworked settings window. Implemented high-quality sound encoding. Implemented the possibility to produce stereo recordings. Added buttons with system-wide hotkeys for manually starting and canceling recording. Added buttons for opening the folder with recordings. Added a Help button. Fixed an issue where recording continued after a call ended. Fixed an issue where recording didn't start. Fixed several bugs and improved stability. Major refactoring and optimization...

    New Projects:

    - "On the Fly Zip and Attach" Windows Live Writer Plugin: A Windows Live Writer zipping plug-in that allows you to select files/folders and zip them on the fly so they appear as attachment inserts while you are writing blogs. Details at http://www.geekscafe.net
    - .NET Micro Framework Tools: Collection of tools useful for creating Netduino and .NET Micro Framework applications.
    - Aktina: Aktina is a game engine written in DirectX 11.
    - AppHub: Description in Chinese, lost to encoding; mentions the App Store.
    - Calcula Calles: Calcula Calles.
    - CRM 2011 ASP.NET Membership Provider: Contains a Membership and Role provider which can be used in ASP.NET applications and/or ASP.NET-based CMS systems.
    - CSharp Executor: Dynamically execute C# script files in the same way that you might use VB scripts.
    - Deque (by Stephen Cleary): A simple double-ended queue (deque) in C#. Unit tested.
    - DirST: Allows one to replicate a DIRectory STructure without copying files. Written in C#.
    - DNNTaskManager69: Testing DNN module development.
    - Echelon OS: A sister project to Aza DOS; this one is meant to be as minimalistic as it can be, with a filesystem and text editor.
    - Geek Reader: A news reader for Windows 8; a template that lets you quickly build a Windows 8 Store application.
    - GIII_P2: Project 2.
    - katas-gpa: Code from the coding dojos about design patterns.
    - KendoUI: Demonstrations using Kendo UI Complete for ASP.NET MVC.
    - LTorrent: C# BitTorrent and peer-to-peer protocol implementation.
    - MECopter: A real summary will follow soon...
    - Migrate AD Group Permissions in Sharepoint/WSS: SharePoint AD group migration tool; allows conversion of AD security principals used in SharePoint ACLs from a source domain to a destination domain.
    - NetSend Manager: An ASP.NET console which enables you to send Windows popup messages over a domain network. Messages can be sent to a single user or to a grou...
    - NONAME0: Description in Russian, lost to encoding.
    - npr2012: Test project.
    - Personal News Assistant: Vision: an application which collects information from different sources and informs the user either on demand or proactively.
    - PGIrony - Tookit & Examples for AST Generation with Irony: A toolkit to ease AST generation, and further ease grammar construction, with Irony.
    - Proboscis: A query provider to work against HBase. Will be developed against HBase on Windows Azure.
    - Project13251106: papa
    - Project13271106: sdg
    - Projet IMA: rtj
    - RenderCraft: An editor like AMD's RenderMonkey, but supporting Direct3D 11.
    - Rx (Reactive Extensions): The Reactive Extensions (Rx) is a library for composing asynchronous and event-based programs using observable sequences and LINQ-style query operators.
    - Sanle: Sanle project with C++.
    - Screener: Screen sharing software.
    - SemantEx: Adds a validation wrapper around regular expressions, allowing you to automatically apply conditional logic to capture groups.
    - SharePoint CAML Extensions: SharePoint CAML Extensions.
    - SilentPlace: Turns on silent mode when you are in a quiet place.
    - SmartMacros: Allows developers to define macros in C# and use them inside other source code. These macros are much stronger and safer than C/C++ macros, therefore they're called "Smart" :-)
    - SoundArea link grabber: A script that puts a list of Soundarea links in your clipboard.
    - Time Tracker Kickstart: This project is currently a work in progress.
    - TransparentImage: A console application that converts BMPs into PNG files. Written in VB; supports drag and drop.
    - Travis7: A Travis-CI client for Windows Phone.
    - Windows 8 Accelerator: A set of components and controls to accelerate your Windows 8 and Windows Phone 8 application development.
    - Windows and Windows Phone DNS Library: Simple DNS lookup library for Windows and Windows Phone applications.
    - WinJS Toolkit - JavaScript Toolkit for Windows 8: A set of classes, helper functions and tools that help create Windows Store applications in HTML5, CSS3 and JavaScript.
    - WPF TB: Study WPF.
    - ????????: ZJU software project homework; one part of the total stock-trade system.

    Read the article

  • How can I convert the Nvidia driver installer into a deb?

    - by Oli
    Every so often there's a beta version of the Nvidia driver that I want to try out. This has happened today: there's been a big performance issue with version 295.40, and I want to try the shiny new XRandR-enabled 302.07. I'm more than able to download the installer, remove all the repo-installed driver files, and install the new version, but it's frankly a pain in the bottom to turn that around and go back to the repo version. It also means I have to re-install the driver manually each time there's a kernel upgrade.

    The other option we commonly give people is a PPA, but in this case I'm being really impatient. It's going to be a few days before any PPA gets this, but I need to try this today. I've already manually installed it on the media centre and I'm eyeing up my desktop now.

    So how do I take an installer (e.g. NVIDIA-Linux-x86-302.07.run) and convert it into a new nvidia-current/nvidia-current-updates package? Another way of asking this might be: how do people package the Nvidia drivers?

    Read the article

  • MySQL Connect Starting in 3 Days - New Keynote Announced

    - by Bertrand Matthelié
    We're very pleased to announce a new keynote that will take place on Saturday morning at 10:00 a.m.: "Community Perspective - Why Upgrade to MySQL 5.6".

    Sarah Novotny will lead a lively panel discussion with several MySQL Community members. They will share their opinions and debate the new MySQL Database features they’re excited about.

    Moderator: Sarah Novotny, CIO, Meteor Entertainment

    Panelists:
    - Sheeri Cabral, Database Admin/Architect, Mozilla
    - Giuseppe Maxia, QA Director, Continuent
    - Domas Mituzas, Database Performance Team, Facebook
    - Mark Leith, Software Development Senior Manager, Oracle

    This new keynote will follow the State of the Dolphin address by Oracle's Chief Corporate Architect Edward Screven and VP of MySQL Engineering Tomas Ulin. An exciting kick-off for MySQL Connect!

    Not registered yet? You can still save US$300 off the on-site fee – Register Now!

    Read the article

  • Is it customary to write Java domain objects / data transfer objects with public member variables on mobile platforms?

    - by Sean Mickey
    We performed a code review recently of mobile application Java code that was developed by an outside contractor and noticed that all of the domain objects / data transfer objects are written in this style:

        public class Category {
            public String name;
            public int id;
            public String description;
            public int parentId;
        }

        public class EmergencyContact {
            public long id;
            public RelationshipType relationshipType;
            public String medicalProviderType;
            public Contact contact;
            public String otherPhone;
            public String notes;
            public PersonName personName;
        }

    Of course, these members are then accessed directly everywhere else in the code. When we asked about this, the developers told us that this is a customary performance-enhancement design pattern used on mobile platforms, because mobile devices are resource-limited environments. It doesn't seem to make sense; accessing private members via public getters/setters doesn't seem like it could add much overhead, and the added benefits of encapsulation seem to outweigh the benefits of this coding style. Is this generally true? Is this something that is normally done on mobile platforms for the reasons given above? All feedback welcome and appreciated -
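
    For comparison, the conventional encapsulated form of the first class would look like the sketch below. On modern JIT-compiled runtimes, trivial accessors like these are routinely inlined, so the per-call cost is negligible; the advice to prefer direct field access dates back to early Android/Dalvik performance guidance from before that platform gained a JIT:

        public class Category {
            private String name;
            private int id;
            private String description;
            private int parentId;

            // Trivial getters/setters; a JIT compiler typically inlines these,
            // so they cost about the same as direct field access.
            public String getName() { return name; }
            public void setName(String name) { this.name = name; }

            public int getId() { return id; }
            public void setId(int id) { this.id = id; }

            public String getDescription() { return description; }
            public void setDescription(String description) { this.description = description; }

            public int getParentId() { return parentId; }
            public void setParentId(int parentId) { this.parentId = parentId; }
        }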

    Read the article

  • New college grad, psychology major, wants to code professionally. Should I get Sun Java-certified?

    - by Anita
    I just graduated from a fairly well-known liberal arts college in May. Interestingly, I majored in psychology, with a concentration in social psychology. In college I took Intro to Computer Science and hated it (used to blame it on myself; now I blame it on the professor :) However, I've always wanted to be a programmer, and finally got my wish by getting hired by a company that was willing to let me learn coding from scratch in exchange for low pay. Well, what do you know, I just got laid off this morning, and need a new job by November to pay the bills. I loved the coding part of my job at the company, and managed to learn enough Java to feel competent in the job and curious to learn more. I think my goal now is to become a professional programmer. I still know very little (never used Swing, for example) but nothing that a good book can't fix. That's the background anyway; sorry for the rambling - I'm still in shock from the layoff :( It seems to me the quickest way to get noticed by companies, without a CS degree, is by getting certification. I'm halfway through studying for the SCJP and can probably sit for an exam in a week or two. Am I right in my assumption that certs will help in my case? And in general, do I have a bat's chance in hell of making it against formally trained programmers? My assets are really just raw intelligence and intense curiosity; well, maybe a love for problem-solving too. Thanks all - feel free to edit/tag the post!

    Read the article

  • Virtual Trade Show Available On Demand

    - by Theresa Hickman
    If you missed the Oracle Applications Virtual Trade Show on Feb. 3rd, 2011, you can still view all the recordings now and for the next three months. There are 36 sessions at 30 minutes each, covering five tracks: Oracle E-Business Suite, PeopleSoft, JD Edwards, Fusion, and Hyperion. Multiple product areas are covered, from Financials, Procurement, and Supply Chain to CRM and Performance Management. The following are the Financials sessions for the various product lines:

    - Planning Your Successful Upgrade to Oracle E-Business Suite Financials 12.1. In this session, Bryant and Stratton College talk about their upgrade.
    - Planning Your Successful Upgrade to PeopleSoft Financials 9.1. In this session, the University of Central Florida shares their upgrade story.
    - Fusion Financials: The New Standard for Finance. In this session, Terrance Wampler, the VP of Financial Application Strategy, discusses the business value of Oracle's next-generation financial applications and how customers can take advantage of Fusion Financials alongside their existing investments.

    Click here to register and view any session recording at your convenience!

    Read the article

  • .Net search engine architecture and technology choice

    - by shrivb
    I am in the process of designing a search engine for an ASP.NET site. The site currently uses Microsoft Indexing Server (MIS) to index and search content ranging from simple text files to MS documents to PDFs. MIS is also used to crawl file servers, and MIS in tandem with Index Server Companion crawls content from external sites. I intend to replace MIS with the indexer/crawler I am trying to build. Since my platform is completely on the Microsoft stack, I can't afford to have a Java application server; thus Solr, and effectively SolrNet, is ruled out. With this being the context, I have a couple of questions.

    1. Technology choice. I did my initial investigation and looked at Lucene.Net. There seem to be two issues in using Lucene.Net. First, it can't crawl external content; there doesn't seem to be a direct port of Nutch in .NET. Second, since it is just an indexer, it can't parse various document types -- the parsing is left to the developer. So, what would be the best technology choice on the .NET platform to achieve indexing and crawling? Are there any .NET open source libraries available for document parsing?

    2. Architectural pattern. Is there any general architectural pattern or best practice that needs to be followed in designing such a search engine? Thanks in advance.
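
    For a sense of what the indexer piece involves, here is a minimal indexing sketch. It is written against the Java Lucene 3.x API, which Lucene.Net of that era mirrors almost method-for-method (the path and field names are illustrative). Note that the text handed to the "content" field must already be plain text -- the document parsing the question mentions has to happen before this step:

        import java.io.File;

        import org.apache.lucene.analysis.standard.StandardAnalyzer;
        import org.apache.lucene.document.Document;
        import org.apache.lucene.document.Field;
        import org.apache.lucene.index.IndexWriter;
        import org.apache.lucene.store.Directory;
        import org.apache.lucene.store.FSDirectory;
        import org.apache.lucene.util.Version;

        public class SimpleIndexer {
            public static void main(String[] args) throws Exception {
                Directory dir = FSDirectory.open(new File("index"));
                IndexWriter writer = new IndexWriter(dir,
                        new StandardAnalyzer(Version.LUCENE_30),
                        true, IndexWriter.MaxFieldLength.UNLIMITED);

                Document doc = new Document();
                // store the path so search results can link back to the file
                doc.add(new Field("path", "docs/readme.txt",
                        Field.Store.YES, Field.Index.NOT_ANALYZED));
                // the indexed text: extracting it from PDF/Office formats is
                // the separate parsing step Lucene leaves to the developer
                doc.add(new Field("content", "text extracted by your parser",
                        Field.Store.NO, Field.Index.ANALYZED));
                writer.addDocument(doc);
                writer.close();
            }
        }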

    Read the article

  • Optimization ended up in casting an object at each method call

    - by Aybe
    I've been doing some optimization for the following piece of code:

        public void DrawLine(int x1, int y1, int x2, int y2, int color)
        {
            _bitmap.DrawLineBresenham(x1, y1, x2, y2, color);
        }

    After profiling, about 70% of the time was spent getting a context for drawing and disposing of it. I ended up sketching the following overload:

        public void DrawLine(int x1, int y1, int x2, int y2, int color, BitmapContext bitmapContext)
        {
            _bitmap.DrawLineBresenham(x1, y1, x2, y2, color, bitmapContext);
        }

    Up to here, no problems: all the user has to do is pass a context, and performance is really great, as a context is created/disposed only once (previously it was a thousand times per second). The next step was to make it generic, in the sense that it doesn't depend on a particular framework for rendering (besides .NET, obviously). So I wrote this method:

        public void DrawLine(int x1, int y1, int x2, int y2, int color, IDisposable bitmapContext)
        {
            _bitmap.DrawLineBresenham(x1, y1, x2, y2, color, (BitmapContext)bitmapContext);
        }

    Now every time a line is drawn, the generic context is cast, which was unexpected for me. Are there any approaches to fixing this design issue?

    Note: _bitmap is a WriteableBitmap from WPF; BitmapContext is from the WriteableBitmapEx library; DrawLineBresenham is an extension method from WriteableBitmapEx.
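
    One common way out of this design issue is to invert the dependency: instead of accepting a marker interface like IDisposable and down-casting on every call, define a small rendering abstraction and let each back end keep its concrete context strongly typed behind it. A minimal sketch of the idea (written here in Java to keep the examples in one language -- the C# translation is mechanical, and all names are hypothetical):

        // Stand-in for the framework-specific context type.
        final class BitmapContext {
            void drawLineBresenham(int x1, int y1, int x2, int y2, int color) { /* ... */ }
            void dispose() { /* release the underlying resources */ }
        }

        // Callers depend only on this abstraction; no per-call casts.
        interface LineRenderer extends AutoCloseable {
            void drawLine(int x1, int y1, int x2, int y2, int color);
            @Override void close();   // narrowed: no checked exception
        }

        // One implementation per rendering back end. The context is acquired
        // once, used for many calls, and released once at the end of the batch.
        final class BitmapLineRenderer implements LineRenderer {
            private final BitmapContext context;

            BitmapLineRenderer(BitmapContext context) { this.context = context; }

            @Override
            public void drawLine(int x1, int y1, int x2, int y2, int color) {
                context.drawLineBresenham(x1, y1, x2, y2, color);
            }

            @Override
            public void close() { context.dispose(); }
        }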

    Read the article

  • unit/integration testing web service proxy client

    - by cori
    I'm rewriting a PHP client/proxy library that provides an interface to a SOAP-based .NET web service, and in the process I want to add some unit and integration tests so future modifications are less risky. The work the library performs is to marshall calls to the web service and do a little reorganizing of the responses to present a slightly more object-oriented interface to the underlying service.

    Since this library is little else than a thin layer on top of web service calls, my basic assumption is that I'll really be writing integration tests more than unit tests -- for example, I don't see any reason to mock away the web service; the work performed by the code I'm working on is very light, almost passing the response from the service right back to its consumer. Most of the calls are basic CRUD operations: CreateUser(), CreateRole(), DeleteUser(), FindUser(), etc. I'll be starting from a known database state -- the system I'm using for these tests is isolated for testing purposes, so the results will be more or less predictable.

    My question is this: is it natural to use web service calls to confirm the results of operations within the tests and to reset the state of the application within the scope of each test? Here's an example: one test might be createUserReturnsValidUserId() and might go like this:

        public function createUserReturnsValidUserId()
        {
            // we're assuming a global connection to the service
            $newUserId = $client->CreateUser("user1");
            assertNotNull($newUserId);
            assertNotNull($client->FindUser($newUserId));
            $client->DeleteUser($newUserId);
        }

    So I'm creating a user, making sure I get an ID back and that it represents a user in the system, and then cleaning up after myself (so that later tests don't rely on the success or failure of this test with respect to, for example, the number of users in the system). However, this still seems pretty fragile -- lots of dependencies and opportunities for tests to fail and affect the results of later tests, which I definitely want to avoid. Am I missing some options or ways to decouple these tests from the system under test, or is this really the best I can do? I think this is a fairly general unit/integration testing question, but if it matters, I'm using PHPUnit for the testing framework.
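
    One standard way to make the cleanup less fragile is to move it out of the test body and into the fixture's teardown, which runs even when an assertion fails, so one test's leftovers cannot poison the next. Below is a sketch of the pattern (shown as JUnit 4-style Java to keep the examples here in one language; PHPUnit's setUp()/tearDown() hooks work the same way, and the ServiceClient wrapper is hypothetical):

        import org.junit.After;
        import org.junit.Test;
        import static org.junit.Assert.assertNotNull;

        public class CreateUserTest {
            // Minimal stand-in for the SOAP proxy under test (hypothetical).
            static class ServiceClient {
                String createUser(String name) { /* call web service */ return "id-1"; }
                Object findUser(String id)     { /* call web service */ return new Object(); }
                void deleteUser(String id)     { /* call web service */ }
            }

            private final ServiceClient client = new ServiceClient();
            private String newUserId;   // recorded so teardown can clean up

            @Test
            public void createUserReturnsValidUserId() {
                newUserId = client.createUser("user1");
                assertNotNull(newUserId);
                assertNotNull(client.findUser(newUserId));
                // no cleanup here: it runs in tearDown() even if an assert fails
            }

            @After
            public void tearDown() {
                if (newUserId != null) {
                    client.deleteUser(newUserId);   // restore the known DB state
                }
            }
        }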

    Read the article

  • Engineering as a Service

    - by jgelhaus
    Oracle Exadata Database Machine is known for great compute performance, and over the past few years, it has also become known as a great platform for any type of Oracle Database workload, from data warehousing to online transaction processing (OLTP). But now organizations are turning to Oracle Exadata for business efficiencies and private cloud solutions—for consolidation and database as a service (DBaaS).

    University of Minnesota

    For an inside look at how DBaaS is working in the real world, it’s worth checking into the University of Minnesota’s database hotel. With more than 50,000 students, the University of Minnesota in Minneapolis is one of the largest universities in the United States. The university’s centralized IT group not only has to support all those students but also must provide support and services to more than 40 departments and colleges within the university. They have two Exadata Database Machine X2-2 half-rack systems from Oracle, with four database nodes each and roughly 30 terabytes of usable disk space for each of the Oracle Exadata systems. The university is using Oracle Real Application Clusters (Oracle RAC) for high availability and the Data Guard feature of Oracle Database, Enterprise Edition, for disaster recovery capabilities. The deployment has been live in production since May 2011.

    Overhead Door

    When it comes to overhead, revolving, sliding, or other specialty residential and commercial doors, Overhead Door is the worldwide leader. But when they needed to open doors with their customers through a better, faster, and more agile IT infrastructure, Overhead Door turned to Oracle and Oracle Exadata. Oracle Exadata Database Machine plays an important part in Overhead Door’s IT and business strategy. The organization has two Exadata Database Machine X2-2s deployed, one in production and one in development and testing.

    Read the full Oracle Magazine article: Engineering as a Service

    Read the article

  • Is there a visual web application builder or rapid webapp prototyping framework?

    - by Jesper Mortensen
    Question: Is there such a thing as a self-hosted framework or CMS especially tailored towards the creation of interactive web applications without -- or with an absolute minimum of -- programming? (Substantially less programming than, say, a simple Rails app or a plugin for Wordpress, Joomla, etc. would require.)

    As for desired features I'd settle for whatever is available, but some ideas could be:

    - A user authentication and permissions system
    - A GUI-driven input form builder
    - A GUI-driven template / visual site design builder
    - A simple scripting language (think AppleScript-like simplicity)
    - A highly modular architecture, with high-level business objects (users, forms data, etc.) exposed for easy re-use

    If something like the above doesn't exist, then what comes near it?

    Need: This is for self-hosted rapid prototyping of web applications, and limited user testing of webapp user interface designs in a closed user test.

    Notes: I know about Ruby on Rails, Django, Pyramid, etc.; I'm looking for something much faster to work in, for making prototypes. I know about CMSs in general, but find that most of them are tailored towards displaying information to end users. If there is an exceptionally easy-to-master CMS with easy scripting (let's say much more so than, for example, Wordpress), then I'd be interested.

    Read the article

  • What are the best strategies for selling Android apps?

    - by Rob S.
    I'm a young developer hoping to sell the apps I made for Android soon. My applications are basically 99% finished, so I'm investigating what would be the best marketing strategy to use to sell them. I'm sure the brilliant minds here can give me some great advice. I'm particularly interested in your thoughts on the following points (especially from experienced Android developers):

    - Is it more profitable to sell an app for free with ads or to sell an app without ads for a price? Perhaps a combination of a free ad version and a paid ad-free version?
    - If you give away an app for free with ads on it, is it ethical to decline bending over backwards to support it?
    - How much does piracy actually affect potential sales? Should any effort be put towards preventing it?
    - Can you still make a profit off your application if you make it open source? Could you perhaps make more of a profit from the attention you would get by doing so?
    - Is Google's Android Marketplace really the best place to release Android apps?
    - Is it worthwhile to maintain a developer blog or website to keep users updated on your development progress and software releases?

    Any other suggestions you could give me to maximize profit while keeping users happy and coming back for more would also be greatly appreciated. While I appreciate general tips and tricks, I'd like to ask that, if possible, you please go the extra step and show how they specifically apply to selling Android apps. Marketing statistics, developer retrospectives, and any experience you can share from your time selling Android apps are what I would love to see most. Thank you very much in advance for your time. I truly appreciate all the responses I receive.

    Read the article

  • What the Hekaton?

    - by Tony Davis
    Hekaton, the power behind SQL Server 2014's In-Memory OLTP technology, is intended to make data operations run orders of magnitude faster on SQL Server. This works its magic partly by serving database workloads entirely from main memory, using memory-optimized table structures. It replaces the relational engine's standard locking model with an optimistic concurrency model based on time-stamped row versions. Deeper down, the Hekaton engine uses new, ‘latch free’ data structures.

    So far, so good, but performance improvements on this scale require a compromise, and the compromise is that these aren't tables as we understand them. For the database developer, these differences are painful because they involve sacrificing some very important bits of the relational model. Most importantly, Hekaton tables don't currently support FOREIGN KEY constraints or CHECK constraints, and you can't put the checks in triggers because there aren't any DML triggers either. Constraints allow a relational designer to enforce relational integrity and data integrity. Without them, of course, ‘bad data’ can get into our Hekaton tables. There is no easy way of preventing it. For several classes of database and data, this is a show-stopper.

    One may regard all these restrictions regretfully, seeing limited opportunity to try out Hekaton with current databases, but perhaps there is also a sudden glow of recognition. Isn't this how we all originally imagined table variables were going to be, back in SQL 2005? And they have much the same restrictions. Maybe, instead of pretending that a currently-designed database can be ‘Hekatonized’ with a few mouse clicks, we should redesign databases for SQL 2014 to replace table variables with Hekaton tables, exploiting this technology for fast intermediate processing, and for the most part forget, for now, the idea of trying to convert our base relational tables into Hekaton tables. Few database developers would be averse to having their working tables running an order of magnitude faster, as long as it didn't compromise the integrity of the data in the base tables.

    Read the article

  • Can Windows XP be better than any Ubuntu (and Linux) distro for an old PC?

    - by Robert Vila
    The old laptop is a Toshiba 1800-100:

    - CPU: Intel Celeron 800 MHz
    - RAM: 128 MB (works OK)
    - HDD: 15 GB (works OK)
    - Graphics adapter: integrated 64-bit AGP graphics accelerator, BitBlt, 3D graphics acceleration, 8 MB video RAM

    Only Windows XP is installed, and it works OK: it can be used, but it is slow (and hateful). I thought that I could improve performance (and its look) easily, since it is an old PC (drivers and everything known for years...), by installing a light Linux distro. So I decided to install a light or customized Ubuntu distro, or an Ubuntu/Debian derivative, but haven't been successful with any; not even booting LiveCDs: not even AntiX, not even Puppy. The Lubuntu wiki says that it won't work because the last two releases need more RAM (and some blogs say much more CPU -- even Core Duo for the new Lubuntu!), let alone Xubuntu.

    The problems I have faced are:

    1. There are thousands of pages talking about the same 10-15 lightweight distros, and saying more or less the same things, but NONE talks about a simple thing such as what the RAM/swap-partition proportion should be for this kind of installation. NONE!
    2. Loading the LiveCD, I have tried several different boot options (I don't understand much about this, and there's ALWAYS a line of explanation missing) and never receive error messages. Booting just stops at different stages, but often seems to stop just when the X server is going to start. I am able to boot to the command line.
    3. I don't know whether the problem is RAM size or a problem with the graphics driver (which surprises me, because it is a well-known brand and line of computers). So I don't know if doing a partition with a swap partition would help booting the LiveCD.
    4. I would like to try the graphical interface with the LiveCD before installing, if doing the swap partition for this purpose would help. How can I do the partition? I tried to use Boot Rescue CD, but it advises me against continuing forward.

    I would appreciate any ideas as regards these questions. Thank you

    Read the article

  • Change the Integrated Weblogic Port number

    - by pavan.pvj
    There came a situation where I wanted to work with two JDevelopers simultaneously and start two different applications in the two JDevs. (Both of them have to be in separate installation locations, else it will create a problem because of the system directory.) Now, when we want to start WLS in JDev, only the first one will be started, and the other one fails with an exception due to a port conflict. Until a few days back, the million-dollar question was how to change the Integrated WLS port number. So, here's the answer after some R&D.

    In the View menu, click on "Application Server Navigator", then right-click on Integrated Weblogic Server.

    1) If it is the first time that you are trying to start the server, there is a menu item "Create Default Domain". If you click on this, a window will be displayed that asks for the preferred port number. Change it here.
    2) If the domain is already created, then click on Properties and change the preferred port number.

    Alternatively, if you want to change the port from the file system before starting JDev, go to $JDEV_USER_HOME/systemxxx/o.j2ee, open the file adrs-instances.xml, and change the http-port in the startup-preferences:

        <hash n="startup-preferences">
           <value n="http-port" v="7111"/>
        </hash>

    Note 1: adrs-instances.xml will be created ONLY after you create the default domain.
    Note 2: systemxxx refers to system.<JDEV version>, like system.11.1.1.3.56.59 for PS2.
    Note 3: $JDEV_USER_HOME - in Windows - would be C:\Documents and Settings\[user_name]\Application Data\JDeveloper

    Now you can run multiple Integrated WLS instances simultaneously. But please be aware that running more than one WLS server will degrade system performance.

    Read the article

  • OpenGL CPU vs. GPU

    - by Nitrex88
    So I've always been under the impression that doing work on the GPU is always faster than on the CPU. Because of this, in OpenGL, I usually try to do intensive tasks in shaders so they get the speed boost from the GPU. However, now I'm starting to realize that some things simply work better on the CPU and actually perform worse on the GPU (particularly when a geometry shader is involved).

    For example, in a recent project involving procedurally generated terrain, I tried passing a grid of single triangles into a geometry shader and tessellating each of these triangles into quads of 400 vertices whose height was determined by a noise function. This worked fine, and looked great, but easily maxed out the GPU with only 25 base triangles and caused a very slow framerate. I then discovered that tessellating on the CPU instead, and setting the height (using the noise function) in the vertex shader, was actually faster! This prompted me to question the benefits of using the GPU as much as possible...

    So, I was wondering if someone could describe the general pros and cons of using the GPU vs. the CPU for intensive graphics tasks. I know this mainly comes down to what you're trying to achieve, so if necessary, use the above scenario to discuss why the "CPU + vertex shader" approach was actually faster than doing everything in the geometry shader on the GPU. It's possible my hardware (newest MacBook Pro) isn't well optimized for the geometry shader (thus causing the slow framerate). Also, I read that the vertex shader is very good with parallelism, and would love a quick explanation of how this may have played a role in speeding up my procedural terrain. Any info/advice about CPU/GPU/shaders would be awesome!

    Read the article

  • Solving Big Problems with Oracle R Enterprise, Part II

    - by dbayard
    Part II – Solving Big Problems with Oracle R Enterprise

    In the first post in this series (see https://blogs.oracle.com/R/entry/solving_big_problems_with_oracle), we showed how you can use R to perform historical rate of return calculations against investment data sourced from a spreadsheet. We demonstrated the calculations against sample data for a small set of accounts. While this worked fine, in the real world the problem is much bigger because the amount of data is much bigger: so much bigger that our approach in the previous post won't scale to meet real-world needs.

    From our previous post, here are the challenges we need to conquer:

    - The actual data that needs to be used lives in a database, not in a spreadsheet
    - The actual data is much, much bigger: too big to fit into the normal R memory space and too big to want to move across the network
    - The overall process needs to run fast, much faster than a single processor allows
    - The actual data needs to be kept secure, another reason not to move it from the database and across the network
    - The process of calculating the IRR needs to be integrated with other database ETL activities, so that IRRs can be calculated as part of the data warehouse refresh processes

    In this post, we will show how we moved from the sample data environment to working with full-scale data. This post is based on actual work we did for a financial services customer during a recent proof of concept.

    Getting Started with the Database

    At this point, we have some sample data and our IRR function. We were at a similar point in our customer proof-of-concept exercise: we had sample data, but we did not have the full customer data yet, so our database was empty. This was easily rectified by leveraging the transparency features of Oracle R Enterprise (see https://blogs.oracle.com/R/entry/analyzing_big_data_using_the). We took our sample data SimpleMWRRData and easily turned it into a new Oracle database table called IRR_DATA via ore.create(), after which we could access the database table IRR_DATA as if it were a normal R data.frame named IRR_DATA. If we go to SQL*Plus, we can also check out our new IRR_DATA table.

    At this point, we have our sample data loaded in the database as a normal Oracle table called IRR_DATA, so we proceeded to test our R function against database data. As a first test, we retrieved the data for a single account from the IRR_DATA table, pulled it into local R memory, and called our IRR function. This worked. No SQL coding required!

    Going from Crawling to Walking

    Now that we have shown our R code working with database-resident data for a single account, we wanted to experiment with doing this for multiple accounts. In other words, we wanted to implement the split-apply-combine technique we discussed in the first post in this series. Fortunately, Oracle R Enterprise provides a very scalable way to do this with a function called ore.groupApply(). You can read more about ore.groupApply() here: https://blogs.oracle.com/R/entry/analyzing_big_data_using_the1

    Below is an example of how we ask ORE to take our IRR_DATA table in the database, split it by the ACCOUNT column, apply a function that calls our SimpleMWRR() calculation, and then combine the results. (If you are following along at home, be sure to have installed our myIRR package on your database server via "R CMD INSTALL myIRR".)
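    The code itself appeared as screenshots in the original post and is not reproduced in this excerpt. As a minimal R sketch of what those steps likely looked like (the connection details are placeholders, the account ID "ACCT_1" is made up for illustration, and SimpleMWRR() and SimpleMWRRData come from the post's myIRR package and sample data):

        library(ORE)      # Oracle R Enterprise client packages
        library(myIRR)    # the post's package providing SimpleMWRR() and SimpleMWRRData

        # Connect to the database (credentials and host are placeholders)
        ore.connect(user = "rquser", password = "secret",
                    sid = "orcl", host = "dbserver", all = TRUE)

        # Create the IRR_DATA table in the database from the local sample data.frame
        ore.create(SimpleMWRRData, table = "IRR_DATA")

        # IRR_DATA is now visible in R as an ore.frame; use it like a data.frame
        head(IRR_DATA)

        # First test: pull one account's rows into local R memory, then call our function
        one_acct <- ore.pull(IRR_DATA[IRR_DATA$ACCOUNT == "ACCT_1", ])
        SimpleMWRR(one_acct)

        # Split-apply-combine: the database splits IRR_DATA by ACCOUNT and sends each
        # group to an embedded R engine on the database server, which runs our function
        results <- ore.groupApply(
          IRR_DATA,
          INDEX = IRR_DATA$ACCOUNT,
          FUN = function(df) {
            library(myIRR)    # loaded inside the embedded R engine
            SimpleMWRR(df)
          })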
    The interesting thing about ore.groupApply is that the calculation is not actually performed in the desktop R environment from which I am running it. What actually happens is that ore.groupApply uses the Oracle database to perform the work. The Oracle database is what actually splits the IRR_DATA table by ACCOUNT; it then takes the data for each account and sends it to an embedded R engine running on the database server to apply our R function, and finally combines all the individual results from the calls to the R function.

    This is significant because the embedded R engine only needs to deal with the data for a single account at a time. Whether we have 20 accounts or 1 million accounts or more, the R engine that performs the calculation does not care. Given that normal R has a finite amount of memory to hold data, the ore.groupApply approach overcomes R's memory scalability problem, since we only need to fit the data for a single account in R memory (not the data for all of the accounts).

    Additionally, the IRR_DATA does not need to be sent from the database to my desktop R program. Even though I invoke ore.groupApply from my desktop R program, the actual SimpleMWRR calculation is run by the embedded R engine on the database server, so the IRR_DATA never has to leave the database server. This is both a performance benefit, because network transmission of large amounts of data takes time, and a security benefit, because it is harder to protect private data once you start shipping it around your intranet. Another benefit, which we will discuss in a few paragraphs, is the ability to leverage Oracle database parallelism to run these calculations for dozens of accounts at once.

    From Walking to Running

    ore.groupApply is rather nice, but it still has the drawback that I run it from a desktop R instance. This is not ideal for integration into typical operational processes like nightly data warehouse refreshes or monthly statement generation. But this is not an issue for ORE: Oracle R Enterprise lets us run this from the database using regular SQL, which is easily integrated into standard operations. That is extremely exciting, and it is the way we actually did these calculations in the customer proof.

    Oracle R Enterprise provides a SQL equivalent to ore.groupApply, which it refers to as "rqGroupEval". To use rqGroupEval via SQL, a bit of simple setup is needed: the Oracle database needs to know the structure of the input table and the grouping column, which we define using the database's pipelined table function mechanisms. (The setup script appeared as a screenshot in the original post.)

    At this point, our initial setup of rqGroupEval is done for the IRR_DATA table. The next step is to define our R function to the database, which we do via a call to ORE's rqScriptCreate.

    Now we can test it. The SQL you use to run rqGroupEval uses the Oracle database pipelined table function syntax. The first argument to irr_dataGroupEval is a cursor defining our input; you can add additional where clauses and subqueries to this cursor as appropriate. The second argument is any additional inputs to the R function. The third argument is the text of a dummy select statement, which the database uses to identify the columns and datatypes to expect the R function to return. The fourth argument is the column of the input table to split/group by.
    The final argument is the name of the R function as you defined it when you called rqScriptCreate().

    The Real-World Results

    In our real customer proof of concept, we had more sophisticated calculation requirements than shown in this simplified blog example. For instance, we had to perform the rate of return calculations for 5 separate time periods, so the R code was enhanced to do so. In addition, some accounts needed a time-weighted rate of return to be calculated, so we extended our approach and added an R function to do that. And finally, there were a few more real-world data irregularities to account for, so we added logic to our R functions to deal with those exceptions.

    For the full-scale customer test, we loaded the customer data onto a half-rack Exadata X2-2 Database Machine. As our half rack had 48 physical cores (96 threads with hyperthreading), we wanted to take advantage of that CPU horsepower to speed up our calculations. To do so with ORE, it is as simple as leveraging the Oracle Database parallel query features: we placed a parallel hint on the cursor that is the input to our rqGroupEval function. That is all we need to do to enable Oracle to use parallel R engines. (The SQL and the Real-Time SQL Monitor screenshots from the proof are not included in this excerpt; a rough R-side sketch of the same idea appears at the end of this post.)

    From those monitoring results, you can notice a few things:

    - The SQL completed in 110 seconds (1.8 minutes)
    - We calculated rates of return for 5 time periods for each of 911k accounts (the number of rows returned by the IRRSTAGEGROUPEVAL operation)
    - We accessed 103m rows of detailed cash flow/market value data (the number of rows returned by the IRR_STAGE2 operation)
    - We ran with 72 degrees of parallelism spread across 4 database servers
    - Most of our 110 seconds was spent in the "External Procedure call" event
    - On average, we performed 8,200 executions of our R function per second (911k accounts / 110 s)
    - On average, each execution was passed about 113 rows of data (103m detail rows / 911k accounts)
    - On average, we did 41,000 single-time-period rate of return calculations per second (each of the 8,200 executions of our R function did rate of return calculations for 5 time periods)
    - On average, we processed over 900,000 rows of database data in R per second (103m detail rows / 110 s)

    R + Oracle R Enterprise: Best of R + Best of Oracle Database

    This blog post series started by describing a real customer problem: how to perform a lot of calculations on a lot of data in a short period of time. While standard R proved to be a very good fit for writing the necessary calculations, the challenge of working with a lot of data in a short period of time remained. This series showed how Oracle R Enterprise enables R to be used in conjunction with the Oracle database to overcome the data volume and performance issues (as well as simplifying the operational and security issues). It also showed that we could calculate 5 time periods of rates of return for almost a million individual accounts in less than 2 minutes.
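    As a footnote to the parallelism discussion above: the proof's SQL is not reproduced in this excerpt, but a rough R-side sketch of the same idea is possible with ore.groupApply's parallel argument. This is an approximation, not the rqGroupEval SQL path the proof actually used, and the degree of parallelism shown (72, mirroring the proof) is an assumption that would need tuning for your own hardware:

        # Ask the database to run the group-apply across parallel embedded R engines,
        # analogous to the parallel hint placed on the rqGroupEval input cursor
        results <- ore.groupApply(
          IRR_DATA,
          INDEX = IRR_DATA$ACCOUNT,
          FUN = function(df) {
            library(myIRR)
            SimpleMWRR(df)     # in the proof, extended to cover 5 time periods
          },
          parallel = 72)       # degree of parallelism; tune for your system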
    In a future post, we will take the same R function and show how Oracle R Connector for Hadoop can be used in the Hadoop world. In that next post, instead of having our data in an Oracle database, our data will live in Hadoop, and we will show how to use the Oracle R Connector for Hadoop and other Oracle Big Data Connectors to move data easily between Hadoop, R, and the Oracle database.

    Read the article

  • How do you cope with change in open source frameworks that you use for your projects?

    - by Amy
    It may be a personal quirk of mine, but I like keeping the code in living projects up to date, including the libraries/frameworks they use. Part of it is that I believe a web app is more secure if it is fully patched and up to date. Part of it is just a touch of obsessive-compulsiveness on my part.

    Over the past seven months, we have done a major rewrite of our software. We dropped the Xaraya framework, which was slow and essentially dead as a product, and converted to CakePHP. (We chose Cake because it gave us the chance to do a very rapid rewrite of our software, and enough of a performance boost over Xaraya to make it worth our while.) We implemented unit testing with SimpleTest, and followed all the file and database naming conventions, etc.

    Cake is now being updated to 2.0, and there doesn't seem to be a viable migration path for an upgrade. The naming conventions for files have radically changed, and they dropped SimpleTest in favor of PHPUnit. This is pretty much going to force us to stay on the 1.3 branch, because unless there is some sort of conversion tool, it's not going to be possible to update Cake and then gradually improve our legacy code to reap the benefits of the new framework. Since we plan some aggressive rewrites to pay down the technical debt previous developers incurred (gratuitous use of Java's Serializable interface as a file storage mechanism that has predictably bitten us), we will, as usual, end up with an old framework in our Subversion repository and just patch it ourselves as needed.

    And this is what gets me every time. So many open source products don't make it easy enough to keep projects based on them up to date. When the devs start playing with a new shiny toy, a few critical patches will be done to the older branches, but most of their focus is going to be on the new code base.

    How do you deal with radical changes in the open source projects that you use? And, if you are developing an open source product, do you keep upgrade paths in mind when you develop new versions?

    Read the article
