Search Results

Search found 10519 results on 421 pages for 'standard'.

  • Class Design for special business rules

    - by Samuel Front
    I'm developing an application that allows people to place custom manufacturing orders. While most manufacturers require similar paperwork, some have custom paperwork that only they require. My current class design has a Manufacturer class, one of whose member variables is an array of RequiredSubmission objects. There are two issues I am somewhat concerned about. First, some manufacturers are willing to accept either a standard form or their own custom form. I'm thinking of storing this in the RequiredSubmission object, with an array of alternate forms that are a valid substitute, but I'm not sure that this is ideal. The major issue, however, is that some manufacturers have deadline cycles. For example, forms A, B and C have to be delivered by January 1, while payment must be rendered by January 10. If you miss those deadlines, you have to wait until the next cycle. I'm not exactly sure how to make this work with my existing classes: how can I say "this set of dates all belongs to the same cycle, with date A for form A, date B for form B, etc."? I would greatly appreciate any insights on how best to design these classes.
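
    A minimal sketch of one possible shape for these classes, in C#. The names DeadlineCycle, FormDeadlines and AcceptableSubstitutes are assumptions introduced here for illustration, not part of the original design: a RequiredSubmission lists the forms it will accept interchangeably, and a DeadlineCycle groups the per-form due dates so a whole cycle can be satisfied or missed as a unit.

    using System;
    using System.Collections.Generic;
    using System.Linq;

    public class Form
    {
        public string Title { get; set; }
    }

    public class RequiredSubmission
    {
        public RequiredSubmission()
        {
            AcceptableSubstitutes = new List<Form>();
        }

        public Form StandardForm { get; set; }
        // Alternate forms the manufacturer treats as a valid substitute for the standard one.
        public List<Form> AcceptableSubstitutes { get; private set; }

        public bool Accepts(Form candidate)
        {
            // Reference comparison is enough for a sketch.
            return candidate == StandardForm || AcceptableSubstitutes.Contains(candidate);
        }
    }

    // Groups the dates that belong to one cycle: "forms A, B, C by January 1, payment by January 10".
    public class DeadlineCycle
    {
        public DeadlineCycle()
        {
            FormDeadlines = new Dictionary<RequiredSubmission, DateTime>();
        }

        public string Name { get; set; } // e.g. a hypothetical "January cycle" label
        public Dictionary<RequiredSubmission, DateTime> FormDeadlines { get; private set; }
        public DateTime PaymentDeadline { get; set; }

        public bool IsMissed(DateTime asOf)
        {
            return asOf > PaymentDeadline || FormDeadlines.Values.Any(d => asOf > d);
        }
    }

    public class Manufacturer
    {
        public Manufacturer()
        {
            RequiredSubmissions = new List<RequiredSubmission>();
            DeadlineCycles = new List<DeadlineCycle>();
        }

        public string Name { get; set; }
        public List<RequiredSubmission> RequiredSubmissions { get; private set; }
        // An empty list can mean the manufacturer accepts submissions at any time.
        public List<DeadlineCycle> DeadlineCycles { get; private set; }
    }

    Keeping the per-form dates inside a single DeadlineCycle object is what lets you answer "which cycle does this date belong to?" without scattering dates across the RequiredSubmission instances themselves.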

    Read the article

  • ArchBeat Link-o-Rama for 2012-03-22

    - by Bob Rhubart
    2012 Real World Performance Tour Dates | Performance Tuning | Performance Engineering (www.ioug.org) Coming to your town: a full day of real world database performance with Tom Kyte, Andrew Holdsworth, and Graham Wood. Rochester, NY - March 8; Los Angeles, CA - April 30; Orange County, CA - May 1; Redwood Shores, CA - May 3.
    Oracle Cloud Conference: dates and locations worldwide (http://www.oracle.com) Find the cloud strategy that’s right for your enterprise.
    2 new Cloud Computing resources added to free IT Strategies from Oracle library (www.oracle.com) IT Strategies from Oracle, the free authorized library of guidelines and reference architectures, has just been updated to include two new documents: "A Pragmatic Approach to Cloud Adoption" and "Data Sheet: Oracle's Approach to Cloud."
    SOA! SOA! SOA!; OSB 11g Recipes and Author Interviews (www.oracle.com) Featured this week on the OTN Architect Homepage, along with the latest articles, white papers, blogs, events, and other resources for software architects.
    Enterprise app shops announcements are everywhere | Andy Mulholland (www.capgemini.com) Capgemini's Andy Mulholland discusses "the 'front office' revolution using new technologies in a different manner to the standard role of IT and its attendant monolithic applications based on Client-Server technologies."
    Encapsulating OIM API’s in a Web Service for OIM Custom SOA Composites | Alex Lopez (fusionsecurity.blogspot.com) Alex Lopez describes "how to encapsulate OIM API calls in a Web Service for use in a custom SOA composite to be included as an approval process in a request template."
    Thought for the Day: "Don't worry about people stealing your ideas. If your ideas are any good, you'll have to ram them down people's throats." — Howard H. Aiken

    Read the article

  • UPDATE: Keeping It Clean in San Francisco

    - by Oracle OpenWorld Blog Team
    by Karen Shamban. The results are in, and September 15 was a huge success for the organizers of Coastal Cleanup Day - and more important, for our beautiful and unique California coastal environment. Here are some inspiring stats. More than:
    1,500 volunteers reported in for duty at the Ocean Beach cleanup location (including 150 Oracle employees and family members)
    57,000 volunteers participated statewide
    320 tons picked up, including:
    534,115 pounds of trash
    105,816 pounds of recyclable materials
    Remember: KEEP IT CLEAN! You don't have to wait for the annual Coastal Cleanup Day to do your part. The beaches, fish, mammals, birds, and your fellow human beings will thank you. Join us on September 15, when California's largest volunteer event -- Coastal Cleanup Day -- is taking place. You can help by joining Oracle, Oracle partners, and many others at the Ocean Beach cleanup. Be sure to check in at the Oracle table that will be set up there. You'll receive an Oracle t-shirt for participating (while supplies last), and can sign up to receive an emailed code that will get you a complimentary Discover pass* to Oracle OpenWorld and JavaOne. And be sure to get yourself into the group photo, which will be shown on the Oracle OpenWorld and JavaOne Websites. When and where: Ocean Beach at Fulton Street, San Francisco; Saturday, September 15, 2012; 9 a.m. to Noon. Click here for more information, and to register. *Note: Oracle employees should register for the Ocean Beach cleanup here, and must register for Oracle OpenWorld or JavaOne using the standard employee registration process. Oracle employees are not eligible for the Discover pass offer.

    Read the article

  • Testcase runner for parametrized testcases

    - by Razer
    Let me explain my situation. I'm planning a kind of test case runner for running test cases against external devices, which are microcontroller based. Let's consider the devices: Device 1 and Device 2. There exist a lot of test cases which can be run with either of the devices above, for example Testcase 1 and Testcase 2. The main reason that all the test cases can be run with any device is that the test cases validate some standard, and this software should be extensible for future devices. The test cases themselves must be runnable with changing parameters. For example, Testcase 1 does some timing verification and needs the data rate as an input parameter: 4800, 9600, 19200. Now, hoping you understand the situation, let me explain my design questions. For implementing the test cases I thought about an attribute-based approach, like nunit does it. The more complicated problem is how to define the parametrized test cases, like this:
    Device 1:
    Testcase 1: datarate: 4800, 9600, 19200
    Testcase 2: supply: 1, 2, 3
    Device 2:
    Testcase 1: datarate: 9600, 19200, 38400
    Testcase 2: supply: 3, 4, 5
    How would you design such a framework? I've done a similar design in Python, where I had an XML file for every device containing the test case definitions, like:
    <Testcase name="Testcase 1" datarate="4800"/>
    <Testcase name="Testcase 1" datarate="9600"/>
    <Testcase name="Testcase 1" datarate="19200"/>
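
    For the attribute-based approach, one way to keep the device-specific parameter sets out of the test code is to feed them in through NUnit's TestCaseSource mechanism and load the values from a per-device file. The sketch below only illustrates that idea in C#; the DeviceConfig class and the testcases.xml layout are assumptions for this example, not an existing API.

    using System;
    using System.Collections.Generic;
    using System.Linq;
    using System.Xml.Linq;
    using NUnit.Framework;

    // Hypothetical loader: reads the device's testcases.xml and yields one
    // TestCaseData per parameter value found for the named test case.
    public static class DeviceConfig
    {
        public static IEnumerable<TestCaseData> Parameters(string testcaseName, string parameterName)
        {
            var doc = XDocument.Load("testcases.xml"); // e.g. <Testcase name="Testcase 1" datarate="4800"/>
            return doc.Descendants("Testcase")
                      .Where(t => (string)t.Attribute("name") == testcaseName
                                  && t.Attribute(parameterName) != null)
                      .Select(t => new TestCaseData(int.Parse((string)t.Attribute(parameterName)))
                                       .SetName(testcaseName + " @ " + (string)t.Attribute(parameterName)));
        }
    }

    [TestFixture]
    public class TimingTests
    {
        private static IEnumerable<TestCaseData> DataRates()
        {
            return DeviceConfig.Parameters("Testcase 1", "datarate");
        }

        [TestCaseSource("DataRates")]
        public void TimingVerification(int datarate)
        {
            // Drive the device under test at the given data rate and assert on the measured timing.
            Assert.That(datarate, Is.GreaterThan(0));
        }
    }

    Swapping in a different XML file (or a database table) per device then changes the parameter sets without touching the test methods.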

    Read the article

  • Designing Content-Based ETL Process with .NET and SFDC

    - by Patrick
    As my firm makes the transition to using SFDC as our main operational system, we've spun up a couple of SFDC portals where we can post customer-specific documents to be viewed at will. As such, we've needed to implement pseudo-ETL applications that are able to extract metadata from the documents our analysts generate internally (most are industry-standard PDFs, XML, or MS Office formats) and place them in networked "queue" folders. From there, our applications scoop up the queued documents and upload them to the appropriate SFDC CRM Content Library along with some select pieces of metadata. I've mostly used DbAmp to broker communication with SFDC (DbAmp is a Linked Server provider that allows you to use SQL conventions to interact with your SFDC Org data). I've been able to create [console] applications in C# that work pretty well, and they're usually structured something like this:
    static void Main()
    {
        // Load parameters from app.config.
        // Get documents from queue.
        var files = someInterface.GetFiles(someFilterOrRegexPattern);
        foreach (var file in files)
        {
            // Extract metadata from the file.
            // Validate some attributes of the file; add any validation errors to an in-memory
            // structure (e.g. List<ValidationErrors>).
            if (isValid)
            {
                // Upload using some wrapper for an ORM and
                someInterface.Upload(meta.Param1, meta.Param2, ...);
            }
            else
            {
                // Bounce the file
            }
        }
        // Report any validation errors (via message bus or SMTP or some such).
    }
    And that's pretty much it. Most of the time I wrap all these operations in a "Worker" class that takes the needed interfaces as constructor parameters. This approach has worked reasonably well, but I just get this feeling in my gut that there's something awful about it and would love some feedback. Is writing an ETL process as a C# console app a bad idea? I'm also wondering if there are some design patterns that would be useful in this scenario that I'm clearly overlooking. Thanks in advance!
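
    The "Worker" arrangement the poster describes can be kept testable by depending only on small interfaces and letting Main do nothing but composition. This is a hedged sketch of that shape; the interface names (IDocumentQueue, IMetadataExtractor, IContentUploader, IValidationReporter) are invented here for illustration and are not part of DbAmp or SFDC.

    using System;
    using System.Collections.Generic;

    public interface IDocumentQueue { IEnumerable<string> GetFiles(string pattern); }
    public interface IMetadataExtractor { IDictionary<string, string> Extract(string path); }
    public interface IContentUploader { void Upload(string path, IDictionary<string, string> metadata); }
    public interface IValidationReporter { void Report(IEnumerable<string> errors); }

    public class Worker
    {
        private readonly IDocumentQueue _queue;
        private readonly IMetadataExtractor _extractor;
        private readonly IContentUploader _uploader;
        private readonly IValidationReporter _reporter;

        public Worker(IDocumentQueue queue, IMetadataExtractor extractor,
                      IContentUploader uploader, IValidationReporter reporter)
        {
            _queue = queue;
            _extractor = extractor;
            _uploader = uploader;
            _reporter = reporter;
        }

        public void Run(string pattern)
        {
            var errors = new List<string>();
            foreach (var file in _queue.GetFiles(pattern))
            {
                var meta = _extractor.Extract(file);
                if (!meta.ContainsKey("Title"))           // stand-in validation rule
                {
                    errors.Add(file + ": missing Title"); // bounce the file
                    continue;
                }
                _uploader.Upload(file, meta);
            }
            _reporter.Report(errors);
        }
    }

    // Main stays a thin composition root, e.g.:
    //   new Worker(queue, extractor, uploader, reporter).Run("*.pdf");

    Whether it runs as a console app fired by a scheduler or as a service, the pipeline logic stays the same and each stage can be faked in tests, which is usually the real concern rather than the hosting model itself.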

    Read the article

  • Too much I/O in the morning?

    - by steveh99999
    Interesting little improvement on a SQL 2005 system I encountered recently. Some background - this system had a fairly ‘traditional OLTP’ workload, i.e. heavily used during the day until around 9pm, then a batch window for several hours, then not much activity in the early hours, until the normal workload resumed the following morning. Using perfmon, I noticed that every morning we would see a big spike in SQL Server I/O when the application started to be used. As it was 2005, I decided to look at what tables were in cache before and after the overnight batch processing ran (using the DMV equivalent of dbcc memusage that I posted earlier). Here’s what I saw: the contents of the data cache were split fairly evenly between my 'important/heavily used' tables. After this, some application batch processing, backups, DBCC checks and reindexes were run - a fairly standard batch, I'd suggest. Cache contents then looked like this: most of the cache was now being used by a table I’ve described as ‘unimportant’. Why? Well, that table was the last to be reindexed - purely due to luck, as the reindexing stored procedure performed a loop in alphabetical order through all application tables. When the application starts to be used again, all this ‘unimportant’ data has to be replaced in cache by data that is heavily used. So we changed the overnight reindex scripts - the most heavily accessed tables are now the last to be reindexed. Obvious really, but we did see a significant reduction in early-morning I/O after changing the order of our reindexing.

    Read the article

  • CodePlex Daily Summary for Thursday, September 13, 2012

    CodePlex Daily Summary for Thursday, September 13, 2012Popular ReleasesAustralia Income and Tax Calculator: Australia Income and Tax Calculator: first release, can calculate net income, tax, quarterly/monthly/weekly/daily and hourly taxable/net ratedatajs - JavaScript Library for data-centric web applications: datajs version 1.1.0-beta: datajs is a cross-browser and UI agnostic JavaScript library that enables data-centric web applications with the following features: OData client that enables CRUD operations including batching and metadata support using both ATOM and JSON payloads. Single store abstraction that provides a common API on top of HTML5 local storage technologies. Data cache component that allows reading data ranges from a collection and storing them locally to reduce the number of network requests. Changes...SharePoint (2010) Farm Backup: PowerShell SharePoint (2010) Farm Backup v2.2: Version 2.2 Changelog - Added the ability to export Solutions (WSP) from solution gallery. - Added the ability to exclude MySites from the sites backup. - Added Is-Foundation method to determine whether SharePoint edition is Foundation, Standard or Enterprise to prevent errors when running script on SharePoint Foundation 2010 as Foundation does not have MySite functionality. - Added method to determine amount of storage required for sites backup. Script will now determine total required fo...Metadata Document Generator for Microsoft Dynamics CRM 2011: Metadata Document Generator (2.0.325.117): Add latest version of McTools.Xrm.Connection library to correct Office 365 authentication supportLakana - WPF Framework: Lakana V2: Lakana V2 contains : - Lakana WPF Forms (with sample project) - Lakana WPF Navigation (with sample project)Microsoft SQL Server Product Samples: Database: OData QueryFeed workflow activity: The OData QueryFeed sample activity shows how to create a workflow activity that consumes an OData resource, and renders entity properties in a Microsoft Excel 2010 worksheet or Microsoft Word 2010 document. Using the sample QueryFeed activity, you can consume any OData resource. The sample activity uses LINQ to project OData metadata into activity designer expression items. By setting activity expressions, a fully qualified OData query string is constructed consisting of Resource, Filter, Or...Arduino for Visual Studio: Arduino 1.x for Visual Studio 2012, 2010 and 2008: Register for the visualmicro.com forum for more news and updates Version 1209.10 includes support for VS2012 and minor fixes for the Arduino debugger beta test team. Version 1208.19 is considered stable for visual studio 2010 and 2008. If you are upgrading from an older release of Visual Micro and encounter a problem then uninstall "Visual Micro for Arduino" using "Control Panel>Add and Remove Programs" and then run the install again. Key Features of 1209.10 Support for Visual Studio 2...Microsoft Script Explorer for Windows PowerShell: Script Explorer Reference Implementation(s): This download contains Source Code and Documentation for Script Explorer DB Reference Implementation. You can create your own provider and use it in Script Explorer. Refer to the documentation for more information. The source code is provided "as is" without any warranty. 
Read the Readme.txt file in the SourceCode.Social Network Importer for NodeXL: SocialNetImporter(v.1.5): This new version includes: - Fixed the "resource limit" bug caused by Facebook - Bug fixes To use the new graph data provider, do the following: Unzip the Zip file into the "PlugIns" folder that can be found in the NodeXL installation folder (i.e "C:\Program Files\Social Media Research Foundation\NodeXL Excel Template\PlugIns") Open NodeXL template and you can access the new importer from the "Import" menuAcDown????? - AcDown Downloader Framework: AcDown????? v4.1: ??●AcDown??????????、??、??、???????。????,????,?????????????????????????。???????????Acfun、????(Bilibili)、??、??、YouTube、??、???、??????、SF????、????????????。 ●??????AcPlay?????,??????、????????????????。 ● AcDown??????????????????,????????????????????????????。 ● AcDown???????C#??,????.NET Framework 2.0??。?????"Acfun?????"。 ????32??64? Windows XP/Vista/7/8 ???? 32??64? ???Linux ????(1)????????Windows XP???,?????????.NET Framework 2.0???(x86),?????"?????????"??? (2)???????????Linux???,????????Mono?? ??...Move Mouse: Move Mouse 2.5.2: FIXED - Minor fixes and improvements.MVC Controls Toolkit: Mvc Controls Toolkit 2.3: Added The new release is compatible with Mvc4 RTM. Support for handling Time Zones in dates. Specifically added helper methods to convert to UTC or local time all DateTimes contained in a model received by a controller, and helper methods to handle date only fileds. This together with a detailed documentation on how TimeZones are handled in all situations by the Asp.net Mvc framework, will contribute to mitigate the nightmare of dates and timezones. Multiple Templates, and more options to...DNN Metro7 style Skin package: Metro7 style Skin for DotNetNuke 06.02.00: Maintenance Release Changes on Metro7 06.02.00 Fixed width and height on the jQuery popup for the Editor. Navigation Provider changed to DDR menu Added menu files and scripts Changed skins to Doctype HTML Changed manifest to dnn6 manifest file Changed License to HTML view Fixed issue on Metro7/PinkTitle.ascx with double registering of the Actions Changed source folder structure and start folder, so the project works with the default DNN structure on developing Added VS 20...Xenta Framework - extensible enterprise n-tier application framework: Xenta Framework 1.9.0: Release Notes Imporved framework architecture Improved the framework security More import/export formats and operations New WebPortal application which includes forum, new, blog, catalog, etc. UIs Improved WebAdmin app. Reports, navigation and search Perfomance optimization Improve Xenta.Catalog domain More plugin interfaces and plugin implementations Refactoring Windows Azure support and much more... Package Guide Source Code - package contains the source code Binaries...Json.NET: Json.NET 4.5 Release 9: New feature - Added JsonValueConverter New feature - Set a property's DefaultValueHandling to Ignore when EmitDefaultValue from DataMemberAttribute is false Fix - Fixed DefaultValueHandling.Ignore not igoring default values of non-nullable properties Fix - Fixed DefaultValueHandling.Populate error with non-nullable properties Fix - Fixed error when writing JSON for a JProperty with no value Fix - Fixed error when calling ToList on empty JObjects and JArrays Fix - Fixed losing deci...Microsoft Ajax Minifier: Microsoft Ajax Minifier 4.66: Just going to bite the bullet and rip off the band-aid... SEMI-BREAKING CHANGE! 
Well, it's a BREAKING change to those who already adjusted their projects to use the previous breaking change's ill-conceived renamed DLLs (versions 4.61-4.65). For those who had not adapted and were still stuck in this-doesn't-work-please-fix-me mode, this is more like a fixing change. The previous breaking change just broke too many people, I'm sorry to say. Renaming the DLL from AjaxMin.dll to AjaxMinLibrary.dl...DotNetNuke® Community Edition CMS: 07.00.00 CTP (Not for Production Use): NOTE: New Minimum Requirementshttp://www.dotnetnuke.com/Portals/25/Blog/Files/1/3418/Windows-Live-Writer-1426fd8a58ef_902C-MinimumVersionSupport_2.png Simplified InstallerThe first thing you will notice is that the installer has been updated. Not only have we updated the look and feel, but we also simplified the overall install process. You shouldn’t have to click through a series of screens in order to just get your website running. With the 7.0 installer we have taken an approach that a...WinRT XAML Toolkit: WinRT XAML Toolkit - 1.2.2: WinRT XAML Toolkit based on the Windows 8 RTM SDK. Download the latest source from the SOURCE CODE page. For compiled version use NuGet. You can add it to your project in Visual Studio by going to View/Other Windows/Package Manager Console and entering: PM> Install-Package winrtxamltoolkit Features AsyncUI extensions Controls and control extensions Converters Debugging helpers Imaging IO helpers VisualTree helpers Samples Recent changes NOTE: Namespace changes DebugConsol...BIDS Helper: BIDS Helper 1.6.1: In addition to fixing a number of bugs that beta testers reported, this release includes the following new features for Tabular models in SQL 2012: New Features: Tabular Display Folders Tabular Translations Editor Tabular Sync Descriptions Fixed Issues: Biml issues 32849 fixing bug in Tabular Actions Editor Form where you type in an invalid action name which is a reserved word like CON or which is a duplicate name to another action 32695 - fixing bug in SSAS Sync Descriptions whe...Code Snippets for Windows Store Apps: Code Snippets for Windows Store Apps: First release of our snippets! For more information: Installation List of Snippets Minor update 9/13: Updated C# and VB packages -- Converted from VSI installers to ZIP files for easier usage with Visual Studio Express editions. Snippets contained in each package were not altered.New Projectsanother hello world: a very quick test.Atorpat Marquee: This is the advanced marquee pro moduleAustralia Income and Tax Calculator: Calculates australian net income, tax, ratesAuto generate C# DAL, BLL classes and Sql Store Procedures: This program helps to you for auto generate store procedures for Sql and DAL, BLL classes for C# without any extra code.BugSystem: bug systemBuild and Deploy Tool using BTDF: This tool can be used by a build and release manager who can prepare the BizTalk MSI and deploy the application in the corresponding environment. Calculation WebApplication: Calc web appChild&Family Brigade®: This Software, is specially realized for the Family Brigade of Cochabamba Bolivia, this is a nonprofit institution, that helps family's problems.Creative Style System: This is our Sheridan College Capstone project. Bitches.Customizable Process Guidance Content for VS ALM 2012: Customizable process guidance is provided for each of the default process templates that VS ALM TFS 2012 provides. 
DER_Autoit: 2012-9-13-14-10
Dynamics Xrm Application Speed Builder: The Dynamics Xrm Application Speed Builder will analyze databases, then create the entities in CRM, attributes, and forms for you.
Feed Discovery: Want to subscribe to a web page and can't find the newsfeed? Just right-click on the page and discover! Subscribe directly in IE, Google Reader or any other.
Inmeta Tools for Visual Studio 2012 and TFS 2012: Info coming
Inventory Manager: Inventory Manager is a small demo project that lets you manage your items.
Kayvon's Group: projects
Libreta: Something about Libreta
Multiple Image choice custom field type: This solution contains a "Custom Field Type" which allows the user to choose multiple images as a choice.
PHP-Edin: PHP course
PROJETO PET: A social environment for adopting and appreciating animals.
Read the Reader: Read the Reader is a lightweight Google Reader client. It runs in the background and tells you when something's happening.
SharePoint 2010 File Recovery: A little utility program to allow you to easily recover files from your SharePoint 2010 content database backups
Sistema para estudo do mvc: Studying asp.mvc
Sports Center Asp.net MVC Demo: Sample Sports Center Asp.net MVC project. A good starter kit for getting into the various features Asp.net MVC offers.
testtom08092012git01: bvc
T-SQL implementation of Standard Distribution PDF and CDF: Files for blog post at http://formaldev.blogspot.com/2012/09/T-SQL-NORMDIST-1.html
Wunderlist.com Shortcut Google Chrome Extension: Just exactly that, a shortcut to Wunderlist.com

    Read the article

  • How do I install Ubuntu 12.04 PPC on a PowerPC G4 from the command line, or fix the graphical mode?

    - by Gerardo Rodríguez
    Good afternoon. I'm new to this Linux world, so I hope someone can help me. I recently got a Mac as a gift, a PowerPC G4, which has 1GB of RAM but came with no optical drive or hard drive. So I put in a DVD burner and a 40 GB hard drive I had. Then I downloaded an ISO of Ubuntu 12.04 for PowerPC and burned it onto a CD. I'm trying to install from Open Firmware (as I don't have a Mac keyboard, I use a standard one). Finally, the installation CD boots, but in live mode, and after the Ubuntu 12.04 screen I get a message that there is a problem with my graphics adapter but that I can continue with minimal graphics and get the command line. My question is: how do I install through text mode, or is there any way to fix this problem so that graphical mode runs and I can continue the installation of Ubuntu? And once Ubuntu is installed, will the problem be fixed? I would appreciate your help; as I mentioned before, I know almost nothing about Ubuntu, but I think it will be easier than trying to get a proper Mac OS X running. Have a good one.

    Read the article

  • Unity Launcher only runs once - requires lightdm restart before it runs again

    - by Don
    I have an intermittent problem that just started showing up several days ago. I am running 11.10 and all updates are current. I first saw the symptom with a custom version of the "Home" nautilus-home.desktop file I created in ~/.local/share/applications. I added a few static shortcuts to specific folders. What I found was, clicking the icon once would open up my home folder, but after closing that nautilus window, clicking the icon again did nothing (it did not even show the icon backlight animation). However, I could right-click on the same icon and access my shortcuts as many times as I wanted. The symptom persisted until restarting lightdm. Just yesterday I saw the same sort of symptom happen with a custom launcher I created for chromium-browser to open a specific URL (with a few shortcuts to other URLs). Click the icon - it works once. Then never again. Right-click the icon and I can use the shortcuts over and over - no problem. Note - at one point I assumed I might have a problem with my custom .desktop file, so I did a test by removing my custom nautilus-home.desktop. However, even after restarting lightdm, and verifying the home icon was the standard one from /opt/share/applications (all my custom shortcuts were gone), I saw the same symptom reappear - it runs once and then not again until restarting lightdm. It seems to be intermittent and seems to move between various launchers. Not sure what to do or even what background data to gather. Attempt to improve the question after the first answer - I tried the following:
    1) remove all custom launchers
    2) reboot
    3) add custom launchers back
    4) reboot
    5) attempt to use .... still have the "runs once and never again" symptom with several launchers

    Read the article

  • Open source license with backlink requirement

    - by KajMagnus
    I'm developing a JavaScript library, and I'm thinking about releasing it under an open source license (e.g. GPL, BSD, MIT), but one that also requires websites that use the software to link back to my website. Do you know of any such licenses? And how have they formulated the attribution part of the license text? Do you think the BSD-style clause below would do what I want? (I suppose it doesn't :-)) [...] 3. Each website that redistributes this work must include a visible rel=follow link to my-website.example.com, reachable via rel=follow links from each page where the software is being redistributed. (For example, you could have a link back to your homepage, and from your homepage to an About-Us section, which could link to a Credits section.) I realize that some companies wouldn't want to use the library because of legal issues with interpreting non-standard licenses (have a look at this answer: http://programmers.stackexchange.com/a/156859/54906). After half a year, or perhaps some years, I'd change the license to plain GPL + MIT.

    Read the article

  • Is your TRY worth catching?

    - by Maria Zakourdaev
    The very useful TRY/CATCH error handling construct is widely used to catch all execution errors that do not close the database connection. The biggest downside is that in the case of multiple errors the TRY/CATCH mechanism will only catch the last error. An example of this can be seen during a standard restore operation. In this example I attempt to perform a restore from a file that no longer exists. Two errors are fired: 3201 and 3013. Assuming that we are using the TRY and CATCH construct, the ERROR_MESSAGE() function will catch the last message only. To work around this problem you can prepare a temporary table that will receive the statement output. Execute the statement inside the xp_cmdshell stored procedure, connect back to the SQL Server using the command line utility sqlcmd and redirect its output into the previously created temp table. After receiving the output, you will need to parse it to understand whether the statement finished successfully or failed. It’s quite easy to accomplish as long as you know which statement was executed. In the case of generic executions you can query the output table and search for words like “Msg%Level%State%” that are usually a part of the error message. Furthermore, you don’t need TRY/CATCH in the above workaround, since the xp_cmdshell procedure always finishes successfully and you can decide whether to fire the RAISERROR statement or not. Yours, Maria

    Read the article

  • Common request: export #Tabular model and data to #PowerPivot

    - by Marco Russo (SQLBI)
    I have received this request in many courses, messages and forum discussions: given an Analysis Services Tabular model, it would be nice to be able to extract a corresponding PowerPivot data model. In order of priority, here are the specific features people (including me) would like to see:
    Create an empty PowerPivot workbook with the same data model as a Tabular model.
    Change the connections of the tables in the PowerPivot workbook so they extract data from the Tabular data model. Every table should have an EVALUATE ‘TableName’ query in DAX.
    Apply a filter to the data extracted from every table. For example, you might want to extract all data for a single country, year or customer group. Using the same filtering technique used for role-based security would be nice.
    Expose an API to automate the process of creating a PowerPivot workbook. Use case: prepare one workbook for every employee containing only his data, which he can use offline. This is a common request for salespeople who want a mini-BI tool to use in front of the customer/lead/supplier, regardless of whether a connection is available.
    This feature would increase the adoption of PowerPivot and Tabular (and, therefore, Business Intelligence licenses instead of Standard), and would probably raise the sales of Office 2013 / Office 365 driven by ISVs, who are the companies that request this feature most. If Microsoft did this, it would be acceptable for it to work only on Office 2013. But if a third party does it, it will make sense (for their revenues) to cover both Excel 2010 and Excel 2013. Another important reason for this feature is that the “Offline cube” feature that you have in Excel is not available when your PivotTable is connected to a Tabular model; it can only be used when you connect to Analysis Services Multidimensional. If you think this is an important feature, you can vote for this Connect item.

    Read the article

  • Coherence Management with EM Cloud Control 12c - demo for partners

    - by JuergenKress
    For access to the Oracle demo systems please visit OPN and talk to your Partner Expert. We are pleased to announce the availability of the Coherence Management demo that showcases some of the key capabilities of Management Pack for Oracle Coherence and JVM Diagnostics (licensed under WLS Management Pack EE and Management Pack for Non-Oracle MW). This demo specifically focuses on some of the performance management and configuration management solutions for Oracle Coherence. The demo flow showcases the key enhancements made in the Enterprise Manager 12c release, which include a new customizable performance summary, cache data management and configuration management.
    Demo Highlights - the demo showcases the following capabilities:
    Centralized monitoring for enterprise-wide Coherence deployments
    Drill-down diagnostics
    Customizable performance views
    Monitoring performance trends
    Monitoring Caches, Nodes, Services, etc.
    Performance and Log Alerts
    Real-time Java Diagnostics and memory leak analysis
    Cache Data Management
    Lifecycle management: provisioning Coherence on a new machine, starting nodes on a machine where Coherence is already running, killing a node process
    Demo Instructions: Go to the DSS website for Oracle Partners. On the Standard Demo Launchpad page, under the “Middleware Management” section, click on the link “EM Cloud Control 12c Coherence Management” (tagged as “NEW”). The demo launchpad page contains a link to the detailed demo script with instructions on how to show the demo. Read more on Community Events and post your comment here.
    WebLogic Partner Community: For regular information, become a member of the WebLogic Partner Community; please visit http://www.oracle.com/partners/goto/wls-emea (OPN account required). If you need support with your account please contact the Oracle Partner Business Center. Blog Twitter LinkedIn Mix Forum Wiki Technorati Tags: Coherence, Coherence demo, DSS, CAF, WebLogic, WebLogic Community, Oracle, OPN, Jürgen Kress

    Read the article

  • Finding a way to simplify complex queries on legacy application

    - by glenatron
    I am working with an existing application built on Rails 3.1/MySQL, with much of the work taking place in a JavaScript interface, although the actual platforms are not tremendously relevant here, except in that they give context. The application is powerful, handles a reasonable amount of data and works well. As the number of customers using it and the complexity of the projects they create increase, however, we are starting to run into a few performance problems. As far as I can tell, the source of these problems is that the data represents a tree and it is very hard for ActiveRecord to deterministically know what data it should be retrieving. My model has many relationships like this:
    Project has_many Nodes, has_many GlobalConditions
    Node has_one Parent, has_many Nodes, has_many WeightingFactors through NodeFactors, has_many Tags through NodeTags
    GlobalCondition has_many Nodes (referenced by Id, rather than replicating the tree)
    WeightingFactor has_many Nodes through NodeFactors
    Tag has_many Nodes through NodeTags
    The whole system has something in the region of thirty types which optionally hang off one or many nodes in the tree. My question is: what can I do to retrieve and construct this data faster? Having worked a lot with .Net, if I were in a similar situation there I would look at building a stored procedure to pull everything out of the database in one go, but I would prefer to keep my logic in the application, and from what I can tell it would be hard to take the queried data and build ActiveRecord objects from it without losing their integrity, which would cause more problems than it solves. It has also occurred to me that I could bunch the data up and send some of it across asynchronously, which would not improve performance but would improve the user's perception of performance. However, if sections of the data appeared after page load that could also be quite confusing. I am wondering whether it would be a useful strategy to make everything aware of its parent project, so that one could pull all the records for that project and then build up the relationships later, but given the ubiquity of complex trees in day-to-day programming life I wouldn't be surprised if there were some better design patterns or standard approaches to this type of situation that I am not well versed in.

    Read the article

  • sp_help

    - by David-Betteridge
    One of the nice things about SQL Server Management Studio (SSMS) is that you can highlight a table name in a script and press Alt + F1 to run sp_help on it. Unfortunately I've never been able to use that feature, as the majority of the tables in our product belong to a schema other than dbo. On a long train journey back to York I wondered if I could solve this problem by writing my own replacement for sp_help (which I’ve called sp_help_table_schemas). My version works by first checking the system tables to find out which schema the table belongs to:
    SELECT s.Name   --Find the schema
    FROM sys.schemas s
    JOIN sys.tables t ON t.schema_id = s.schema_id
    WHERE t.name = 'Orders'
    It then dynamically calls the standard sp_help procedure, but this time supplying the table owner as well:
    SET @cmd = 'EXEC sp_help ''' + QUOTENAME(@SchemaName) + '.' + QUOTENAME(@ObjectName) + ''' ;' ;
    EXEC ( @cmd )
    Once I had proved the basics worked, I wrapped it up into a stored procedure and deployed it to the master database on my laptop. It was then just a question of going into Tools > Options within SSMS and defining the keyboard shortcut. A couple of notes: you can’t amend the existing Alt+F1 entry, so I went with Ctrl+F1, and you need to open a new query window for the change to be picked up. So I can now highlight a table name and press Ctrl+F1. The completed script is attached. Thanks go to Martin Bell, who reviewed my stored procedure and gave some valuable advice.

    Read the article

  • ASMLib

    - by wcoekaer
    Oracle ASMlib on Linux has been a topic of discussion a number of times since it was released way back when in 2004. There is a lot of confusion around it and certainly a lot of misinformation out there for no good reason. Let me try to give a bit of history around Oracle ASMLib. Oracle ASMLib was introduced at the time Oracle released Oracle Database 10g R1. 10gR1 introduced a very cool important new features called Oracle ASM (Automatic Storage Management). A very simplistic description would be that this is a very sophisticated volume manager for Oracle data. Give your devices directly to the ASM instance and we manage the storage for you, clustered, highly available, redundant, performance, etc, etc... We recommend using Oracle ASM for all database deployments, single instance or clustered (RAC). The ASM instance manages the storage and every Oracle server process opens and operates on the storage devices like it would open and operate on regular datafiles or raw devices. So by default since 10gR1 up to today, we do not interact differently with ASM managed block devices than we did before with a datafile being mapped to a raw device. All of this is without ASMLib, so ignore that one for now. Standard Oracle on any platform that we support (Linux, Windows, Solaris, AIX, ...) does it the exact same way. You start an ASM instance, it handles storage management, all the database instances use and open that storage and read/write from/to it. There are no extra pieces of software needed, including on Linux. ASM is fully functional and selfcontained without any other components. In order for the admin to provide a raw device to ASM or to the database, it has to have persistent device naming. If you booted up a server where a raw disk was named /dev/sdf and you give it to ASM (or even just creating a tablespace without asm on that device with datafile '/dev/sdf') and next time you boot up and that device is now /dev/sdg, you end up with an error. Just like you can't just change datafile names, you can't change device filenames without telling the database, or ASM. persistent device naming on Linux, especially back in those days ways to say it bluntly, a nightmare. In fact there were a number of issues (dating back to 2004) : Linux async IO wasn't pretty persistent device naming including permissions (had to be owned by oracle and the dba group) was very, very difficult to manage system resource usage in terms of open file descriptors So given the above, we tried to find a way to make this easier on the admins, in many ways, similar to why we started working on OCFS a few years earlier - how can we make life easier for the admins on Linux. A feature of Oracle ASM is the ability for third parties to write an extension using what's called ASMLib. It is possible for any third party OS or storage vendor to write a library using a specific Oracle defined interface that gets used by the ASM instance and by the database instance when available. This interface offered 2 components : Define an IO interface - allow any IO to the devices to go through ASMLib Define device discovery - implement an external way of discovering, labeling devices to provide to ASM and the Oracle database instance This is similar to a library that a number of companies have implemented over many years called libODM (Oracle Disk Manager). 
ODM was specified many years before we introduced ASM and allowed third party vendors to implement their own IO routines so that the database would use this library if installed and make use of the library open/read/write/close,.. routines instead of the standard OS interfaces. PolyServe back in the day used this to optimize their storage solution, Veritas used (and I believe still uses) this for their filesystem. It basically allowed, in particular, filesystem vendors to write libraries that could optimize access to their storage or filesystem.. so ASMLib was not something new, it was basically based on the same model. You have libodm for just database access, you have libasm for asm/database access. Since this library interface existed, we decided to do a reference implementation on Linux. We wrote an ASMLib for Linux that could be used on any Linux platform and other vendors could see how this worked and potentially implement their own solution. As I mentioned earlier, ASMLib and ODMLib are libraries for third party extensions. ASMLib for Linux, since it was a reference implementation implemented both interfaces, the storage discovery part and the IO part. There are 2 components : Oracle ASMLib - the userspace library with config tools (a shared object and some scripts) oracleasm.ko - a kernel module that implements the asm device for /dev/oracleasm/* The userspace library is a binary-only module since it links with and contains Oracle header files but is generic, we only have one asm library for the various Linux platforms. This library is opened by Oracle ASM and by Oracle database processes and this library interacts with the OS through the asm device (/dev/asm). It can install on Oracle Linux, on SuSE SLES, on Red Hat RHEL,.. The library itself doesn't actually care much about the OS version, the kernel module and device cares. The support tools are simple scripts that allow the admin to label devices and scan for disks and devices. This way you can say create an ASM disk label foo on, currently /dev/sdf... So if /dev/sdf disappears and next time is /dev/sdg, we just scan for the label foo and we discover it as /dev/sdg and life goes on without any worry. Also, when the database needs access to the device, we don't have to worry about file permissions or anything it will be taken care of. So it's a convenience thing. The kernel module oracleasm.ko is a Linux kernel module/device driver. It implements a device /dev/oracleasm/* and any and all IO goes through ASMLib - /dev/oracleasm. This kernel module is obviously a very specific Oracle related device driver but it was released under the GPL v2 so anyone could easily build it for their Linux distribution kernels. Advantages for using ASMLib : A good async IO interface for the database, the entire IO interface is based on an optimal ASYNC model for performance A single file descriptor per Oracle process, not one per device or datafile per process reducing # of open filehandles overhead Device scanning and labeling built-in so you do not have to worry about messing with udev or devlabel, permissions or the likes which can be very complex and error prone. Just like with OCFS and OCFS2, each kernel version (major or minor) has to get a new version of the device drivers. We started out building the oracleasm kernel module rpms for many distributions, SLES (in fact in the early days still even for this thing called United Linux) and RHEL. 
The driver didn't make sense to get pushed into upstream Linux because it's unique and specific to the Oracle database. As it takes a huge effort in terms of build infrastructure and QA and release management to build kernel modules for every architecture, every linux distribution and every major and minor version we worked with the vendors to get them to add this tiny kernel module to their infrastructure. (60k source code file). The folks at SuSE understood this was good for them and their customers and us and added it to SLES. So every build coming from SuSE for SLES contains the oracleasm.ko module. We weren't as successful with other vendors so for quite some time we continued to build it for RHEL and of course as we introduced Oracle Linux end of 2006 also for Oracle Linux. With Oracle Linux it became easy for us because we just added the code to our build system and as we churned out Oracle Linux kernels whether it was for a public release or for customers that needed a one off fix where they also used asmlib, we didn't have to do any extra work it was just all nicely integrated. With the introduction of Oracle Linux's Unbreakable Enterprise Kernel and our interest in being able to exploit ASMLib more, we started working on a very exciting project called Data Integrity. Oracle (Martin Petersen in particular) worked for many years with the T10 standards committee and storage vendors and implemented Linux kernel support for DIF/DIX, data protection in the Linux kernel, note to those that wonder, yes it's all in mainline Linux and under the GPL. This basically gave us all the features in the Linux kernel to checksum a data block, send it to the storage adapter, which can then validate that block and checksum in firmware before it sends it over the wire to the storage array, which can then do another checksum and to the actual DISK which does a final validation before writing the block to the physical media. So what was missing was the ability for a userspace application (read: Oracle RDBMS) to write a block which then has a checksum and validation all the way down to the disk. application to disk. Because we have ASMLib we had an entry into the Linux kernel and Martin added support in ASMLib (kernel driver + userspace) for this functionality. Now, this is all based on relatively current Linux kernels, the oracleasm kernel module depends on the main kernel to have support for it so we can make use of it. Thanks to UEK and us having the ability to ship a more modern, current version of the Linux kernel we were able to introduce this feature into ASMLib for Linux from Oracle. This combined with the fact that we build the asm kernel module when we build every single UEK kernel allowed us to continue improving ASMLib and provide it to our customers. So today, we (Oracle) provide Oracle ASMLib for Oracle Linux and in particular on the Unbreakable Enterprise Kernel. We did the build/testing/delivery of ASMLib for RHEL until RHEL5 but since RHEL6 decided that it was too much effort for us to also maintain all the build and test environments for RHEL and we did not have the ability to use the latest kernel features to introduce the Data Integrity features and we didn't want to end up with multiple versions of asmlib as maintained by us. SuSE SLES still builds and comes with the oracleasm module and they do all the work and RHAT it certainly welcome to do the same. They don't have to rebuild the userspace library, it's really about the kernel module. 
And finally, to re-iterate a few important things:
Oracle ASM does not in any way require ASMLib to function completely. ASMLib is a small set of extensions, in particular to make device management easier, but there are no extra features exposed through Oracle ASM with ASMLib enabled or disabled. Often customers confuse ASMLib with ASM. Again, ASM exists on every Oracle supported OS and on every supported Linux OS - SLES, RHEL, OL - without ASMLib.
The Oracle ASMLib userspace package is available on OTN, and the kernel module is shipped along with OL/UEK for every build, and by SuSE for SLES for every one of their builds.
The ASMLib kernel module was built by us for RHEL4 and RHEL5, but we do not build it for RHEL6, nor for the OL6 RHCK kernel - only for UEK.
ASMLib for Linux is/was a reference implementation for any third-party vendor to be able to offer, if they want to, their own version for their own OS or storage.
ASMLib as provided by Oracle for Linux continues to be enhanced and evolve, and for the kernel module we use UEK as the base OS kernel.
Hope this helps.

    Read the article

  • The Fantastic New WebLogic on Oracle Database Appliance 2.9 Release is Here!

    - by JuergenKress
    Last week was a big day in virtualised ODA-land as it saw the launch of WebLogic on ODA 2.9. Admittedly it doesn't sound like a very exciting release but it is one that we at O-box have been looking forward to for quite some time. Let me explain why, then we'll look into the details... The ODA X4-2 has 48 Intel Xeon cores. That is a lot of compute power. Whilst the largest O-box SOA Appliance single environment configuration can in theory use all those cores (currently with 40 vCPU of SOA!) the vast majority of O-box users will want smaller configurations. Prior to 2.9 the Oracle WebLogic implementation only supported one domain per ODA, so the conundrum O-box development faced last year was either: offer customers only one SOA environment on their O-box for now (but have the benefit of a standard, easily supportable WebLogic installation), or build our own WebLogic/OTD OVM templates from scratch. One of our driving goals with O-box is to give the best possible experience and make the appliance as supportable as possible. Therefore we took the gamble that we would stick with the Oracle's one-domain WebLogic configuration initially, and just hope that it would deliver multi-domain support for us in a timely manner (note: this is probably not a strategy that business textbooks would recommend!). Anyway, we've been working closely with Oracle Product Management for a few months now and I'm delighted to see 2.9 as the fruits of their labour. This also neatly ties in with several recent requests for O-box to include OSB as well as SOA/BPEL (which we have always wanted to have in separate domains). The diagram below is the neatest way to summarise what the new 2.9 release will allow us to deliver, i.e. previously only one 3D box was possible: Read the complete article here. WebLogic Partner Community For regular information become a member in the WebLogic Partner Community please visit: http://www.oracle.com/partners/goto/wls-emea ( OPN account required). If you need support with your account please contact the Oracle Partner Business Center. Blog Twitter LinkedIn Mix Forum Wiki Technorati Tags: oBox,WebLogic on ODA,ODA,WebLogic,WebLogic Community,Oracle,OPN,Jürgen Kress

    Read the article

  • Convenience of MySQL over XML

    - by Bonechilla
    Currently I use XML to store the specific information needed to correctly load a few things, such as a list of specified characters, scenes and music. In addition, I use JAXB in combination with standard compression/decompression (ZIP) functionality to store a list of extraneous data. This data is loaded to add functionality to the character, somewhat like skills in an RPG. Each skill is separated into its own XML file, with a grand list which contains the names of each file with their extensions omitted, zipped in a folder that gets encrypted. At first, using XML was working fine; however, as the skill list grows I worry about its stability. I was wondering if I should begin storing the data in MySQL. Originally I planned to simply convert everything to JSON instead of XML, but I think MySQL would possibly be a better move. Can anyone inform me of the key differences and the pros and cons of each? I guess I'm looking for the way to store the data that is most convenient and easiest to operate on. The data is mostly primitives and strings, and the only ArrayList of values I have I can just concatenate into a single field and parse later.

    Read the article

  • Dell XPS 15 L502x and Ubuntu 11.04 - HDMI output

    - by Jones
    Recently I bought my dream notebook, a Dell XPS 15, but since then the dream has become a kind of endless nightmare. I'm almost going crazy trying to make my graphics card driver work properly, but it seems to be just impossible. Yes, I have a 2GB NVIDIA GeForce GT 540M (Optimus) in it! It simply doesn't work. Every time I generate the xorg.conf, Ubuntu hangs while starting up, which forces me to remove this file to be able to start the notebook with the standard graphics settings. Another problem is that the Dell XPS 15 does NOT have a VGA output, but an HDMI one. So, to be able to use a second monitor I have to configure it in the NVIDIA X Server Settings, which only works if the driver is properly initialized with the xorg.conf. I've also tried to make it work with Bumblebee, but unfortunately it didn't help me much with the HDMI output. Do you guys have any idea how to solve this deadlock? Is there any way for me to use my second monitor?

    Read the article

  • How to make Unity 3D work with Bumblebee using the Intel chipset

    - by EboMike
    I have a Sony VAIO S laptop with the dreaded Optimus and finally managed to get Bumblebee to work fully on Ubuntu 12.04, so that I can utilize both the hardware acceleration of the Intel chipset as well as the Nvidia one via optirun and/or bumble-app-settings. However, the desktop effects don't work. But they should; I vaguely remember that they worked for a while before I had Bumblebee installed. This is what I get with the support test:
    :~$ /usr/lib/nux/unity_support_test -p
    Xlib: extension "NV-GLX" missing on display ":0".
    OpenGL vendor string: Tungsten Graphics, Inc
    OpenGL renderer string: Mesa DRI Intel(R) Ivybridge Mobile
    OpenGL version string: 1.4 (2.1 Mesa 8.0.2)
    Not software rendered: yes
    Not blacklisted: yes
    GLX fbconfig: yes
    GLX texture from pixmap: yes
    GL npot or rect textures: yes
    GL vertex program: yes
    GL fragment program: yes
    GL vertex buffer object: no
    GL framebuffer object: yes
    GL version is 1.4+: yes
    Unity 3D supported: no
    First of all, I kind of doubt that the chipset doesn't support VBOs (essentially a standard feature in GL). Neither Xorg.0.log nor Xorg.8.log show any particular errors. As for the Nvidia drivers: in order to get them to work, I had to install the 304.22 drivers (older ones wouldn't work). They clobbered libglx.so, so I reinstated the xserver-xorg-core libglx.so in its original place, moved Nvidia's libglx.so to an nvidia-specific folder and specified that folder in the bumblebee.config. That seems to work and shouldn't cause the problem I see here. For fun, I tried to use the Nvidia chipset for Unity, but that didn't fly either:
    ~$ optirun /usr/lib/nux/unity_support_test -p
    OpenGL vendor string: NVIDIA Corporation
    OpenGL renderer string: GeForce GT 640M LE/PCIe/SSE2
    OpenGL version string: 4.2.0 NVIDIA 304.22
    Not software rendered: yes
    Not blacklisted: yes
    GLX fbconfig: yes
    GLX texture from pixmap: no
    GL npot or rect textures: yes
    GL vertex program: yes
    GL fragment program: yes
    GL vertex buffer object: yes
    GL framebuffer object: yes
    GL version is 1.4+: yes
    Unity 3D supported: no

    Read the article

  • Configuring WS-Security with PeopleSoft Web Services

    - by Dave Bain
    I was speaking with a customer a few days ago about PeopleSoft Web Services.  The customer created a web service but when they went to deploy it, they had so many problems configuring ws-security, they pulled the service.  They spent several days trying to get it working but never got it working so they've put it on hold until they have time to work through the issues. Having gone through the process of configuring ws-security myself, I understand the complexity.  There is no magic 'easy' button to push.  If you are not familiar with all the moving parts like policies, certificates, public and private keys, credential stores, and so on, it can be a daunting task.  PeopleBooks documentation is good but does not offer a step-by-step example to follow.  Fear not, for those that want more help, there is a place to go. PeopleSoft released a Mobile Inventory Management application over a year ago.  It is a mobile app built with Oracle Fusion Application Development Framework (ADF) that accesses PeopleSoft content through standard web services.  Part of the installation of this app is configuring ws-security for the web services used in the application.  Appendix A of the PeopleSoft FSCM91 Mobile Inventory Management Installation Guide is called Configuring WS-Security for Mobile Inventory Management.  It is a step-by-step guide to configure ws-security between a server running Oracle Web Server Management (OWSM) and PeopleSoft Integration Broker.  Your environment might be different, but the steps will be similar, and on the PeopleSoft side, Integration Broker will remain a constant. You can find the installation guide on Oracle Suport.  Sign in to https://support.us.oracle.com and search for document 1290972.1.  Read through Appendix A for more details about how to set up ws-security with PeopleSoft web services.

    Read the article

  • Virtually the fastest way to try Solaris 11 (and Solaris 10 zones)

    - by dminer
    If you're looking to try out Solaris 11, there are the standard ISO and USB image downloads on the main page. Those are great if you're looking to install Solaris 11 on hardware, and we hope you will. But if you take the time to look down the page, you'll find a link off to the Oracle Solaris 11 Virtual Machine downloads. There are two downloads there:
    A pre-built Solaris 10 zone
    A pre-built Solaris 11 VM for use with VirtualBox
    If you're looking to try Solaris 11 on x86, the second one is what you want. Of course, this assumes you have VirtualBox already (and if you don't, now's the time to try it, it's a terrific free desktop virtualization product). Once you complete the 1.8 GB download, it's a simple matter of unzipping the archive and a few quick clicks in VirtualBox to get a Solaris 11 desktop booted. While it's booting, you'll get to run through the new system configuration tool (that'll be the subject of a future posting here) to configure networking, a user account, and so on.
    So what about that pre-built Solaris 10 zone download? It's a really simple way to get yourself acquainted with the Solaris 10 zones feature, which you may well find indispensable in transitioning an existing Solaris 10 infrastructure to Solaris 11. Once you've downloaded the file, it's a self-extracting executable that'll configure the zone for you; all you have to supply is an IP address for the zone. It's really quite slick!
    I expect we'll do a lot more pre-built VMs and zones going forward, as that's a big part of being a cloud OS; if there's one that would be really useful for you, let us know.

    Read the article

  • Music manager that can properly sync song ratings with Android

    - by Sebastian
    I'm moving from Windows 7 to Ubuntu, and so far the experience has been a really good one :) However, there is something I used to do with Windows 7 that I can't figure out how to do in Ubuntu. From my Windows music manager programs (either MediaMonkey or Windows Media Player) I could set song ratings in such a way that ratings set from either program could also be read from the other one. Additionally, song ratings were visible and updated on my iPod Touch when I synced my music (either manually or using iTunes). To sum up, it seems that MediaMonkey, WMP, and the iPod device use a standard MP3 metadata tag for ratings. Now, using Ubuntu 12.04 and with an Android device: Rhythmbox can't see the song ratings, even though those ratings can be seen by MediaMonkey and Windows Media Player when I boot into Win7. Is this an issue I can fix with some setting? Is there any program I can use to accomplish this? What do you recommend for syncing my music with Android (4.0, Galaxy S2) while also keeping the song rating information updated between Android and my PC? Thanks!!

    Read the article

  • Oracle Knowledge Courses

    - by mseika
    Oracle Knowledge products offer simple and convenient ways for users to access knowledge contained in corporate information stores. With Oracle Knowledge Training, you learn how to utilize tools that improve customer service and satisfaction by helping customers find more relevant answers to questions online or from a service agent guided by a scalable knowledge management platform. The following courses have been scheduled at Oracle in Utrecht:
    Oracle Knowledge Overview Rel 8.5 (1 day): Learn the technical architecture of Oracle Knowledge at a high level and the key technologies, including InfoCenter, iConnect, Search, Information Manager, AnswerFlow and Analytics. Dates: to be scheduled.
    Knowledge Technical Architecture and Configuration Rel 8.5 (5 days): Learn to implement and maintain Oracle Knowledge’s core technologies through hands-on exercises, including Intelligent Search, Information Manager, iConnect, AnswerFlow and Analytics. Dates: 13-17 January 2014 (afternoon/evening). Location: Live Virtual Class.
    Knowledge Content Administration Rel 8.5 (2 days): Learn to implement, use and manage knowledge and content creation with Oracle Knowledge Information Manager. Dates: 4-5 December 2013. Location: Utrecht, The Netherlands.
    Knowledge Analytics Rel 8.5 (1 day): Learn KPI analyses and how to close gaps using reports and tools provided in Oracle Business Intelligence Enterprise Edition. Dates: 6 December. Location: Utrecht, The Netherlands.
    Remember: your OPN discount is always applied to the standard prices shown on the Oracle University web pages. For assistance in booking, scheduling requests and more information, contact the Education Service Desk: email [email protected], telephone +31 30 66 27 675.

    Read the article
