Search Results

Search found 20631 results on 826 pages for 'release management'.


  • Distribution upgrade problem "No new release found"

    - by fefe
    I'm using Ubuntu 11.04. The update manager once found the new release 'oneiric', and it still shows this screen when I log on using ssh:

        Welcome to Ubuntu 11.04 (GNU/Linux 2.6.38-14-generic x86_64)
        * Documentation: https://help.ubuntu.com/
        0 packages can be updated.
        0 updates are security updates.
        New release 'oneiric' available.
        Run 'do-release-upgrade' to upgrade to it.
        Last login: Wed Apr 25 16:22:48 2012 from ***

    But I didn't upgrade then, and I changed my apt sources. Now I cannot upgrade to 'oneiric'. do-release-upgrade shows:

        $ sudo do-release-upgrade
        Checking for a new ubuntu release
        No new release found
        $

    And apt-get dist-upgrade shows:

        $ sudo apt-get dist-upgrade
        Reading package lists... Done
        Building dependency tree
        Reading state information... Done
        Calculating upgrade... Done
        0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
        $

    I can successfully update all my packages. File contents of /etc/apt/sources.list:

        $ cat /etc/apt/sources.list
        ## See sources.list(5) for more information, especialy
        # Remember that you can only use http, ftp or file URIs
        deb http://mirrors.tuna.tsinghua.edu.cn/ubuntu/ natty main universe restricted multiverse
        deb-src http://mirrors.tuna.tsinghua.edu.cn/ubuntu/ natty main universe restricted multiverse
        deb http://mirrors.tuna.tsinghua.edu.cn/ubuntu/ natty-security universe main multiverse restricted
        deb-src http://mirrors.tuna.tsinghua.edu.cn/ubuntu/ natty-security universe main multiverse restricted
        deb http://mirrors.tuna.tsinghua.edu.cn/ubuntu/ natty-updates universe main multiverse restricted
        deb-src http://mirrors.tuna.tsinghua.edu.cn/ubuntu/ natty-updates universe main multiverse restricted
        deb http://mirrors.tuna.tsinghua.edu.cn/ubuntu/ natty-backports universe main multiverse restricted
        deb-src http://mirrors.tuna.tsinghua.edu.cn/ubuntu/ natty-backports universe main multiverse restricted
        # deb http://ubuntu.dormforce.net/ubuntu/ lucid main universe restricted multiverse
        # deb-src http://ubuntu.dormforce.net/ubuntu/ lucid main universe restricted multiverse
        # deb http://ubuntu.dormforce.net/ubuntu/ lucid-security universe main multiverse restricted
        # deb-src http://ubuntu.dormforce.net/ubuntu/ lucid-security universe main multiverse restricted
        # deb http://ubuntu.dormforce.net/ubuntu/ lucid-updates universe main multiverse restricted
        # deb-src http://ubuntu.dormforce.net/ubuntu/ lucid-updates universe main multiverse restricted
        # CDROMs are managed through the apt-cdrom tool.
        # deb http://archive.canonical.com lucid partner
        # deb http://archive.canonical.com lucid-security partner
        # deb http://archive.canonical.com lucid-updates partner
        # deb-src http://archive.canonical.com lucid partner
        # deb-src http://archive.canonical.com lucid-security partner
        # deb-src http://archive.canonical.com lucid-updates partner
        #medibuntu repo
        # deb http://packages.medibuntu.org/ lucid free non-free
        # deb-src http://packages.medibuntu.org/ lucid free non-free
        # deb http://extras.ubuntu.com/ubuntu maverick main #Third party developers repository
        deb http://mirrors.sohu.com/ubuntu/ natty main restricted multiverse universe
        deb-src http://mirrors.sohu.com/ubuntu/ natty main universe restricted multiverse #Added by software-properties
        deb http://security.ubuntu.com/ubuntu/ natty-security universe main multiverse restricted
        deb-src http://mirrors.sohu.com/ubuntu/ natty-security universe main multiverse restricted
        deb http://mirrors.sohu.com/ubuntu/ natty-updates universe main multiverse restricted
        deb-src http://mirrors.sohu.com/ubuntu/ natty-updates universe main multiverse restricted

    File contents of /etc/update-manager/meta-release:

        $ cat /etc/update-manager/meta-release
        # default location for the meta-release file
        [METARELEASE]
        URI = http://changelogs.ubuntu.com/meta-release
        URI_LTS = http://changelogs.ubuntu.com/meta-release-lts
        URI_UNSTABLE_POSTFIX = -development
        URI_PROPOSED_POSTFIX = -proposed

    What might be the problem here?
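    (A hedged diagnostic aside, not part of the original question: do-release-upgrade decides whether a new release exists by fetching the meta-release file from the URI shown above and comparing it against the Prompt setting in /etc/update-manager/release-upgrades, so checking both is a reasonable first step. The file name and behavior below are the stock Ubuntu ones; verify them on your own system.)

        # Confirm the upgrade policy; Prompt=normal offers every release,
        # Prompt=lts only offers LTS releases, Prompt=never disables the offer.
        $ cat /etc/update-manager/release-upgrades

        # Confirm the meta-release file is actually reachable from this host
        $ wget -qO- http://changelogs.ubuntu.com/meta-release | head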

    Read the article

  • Oracle Enterprise Manager Cloud Control 12c Release 3 (12.1.0.3)

    - by Ankit G
    Delighted to announce the GA of EM Cloud Control Release 3 on all supported platforms. This release includes a new 12.1.0.3 version of the platform (OMS & Agent), along with revised new versions of several plug-ins and metadata plug-ins (including a brand new metadata plug-in for Oracle Virtual Networking). This release marks yet another major and significant milestone for Enterprise Manager 12c Cloud Control product releases.

    A number of new plug-in versions are available along with Oracle Enterprise Manager Cloud Control 12c Release 3 (12.1.0.3). The new plug-ins depend on the 12.1.0.3 platform, and customers need to be on at least the 12.1.0.3 platform (OMS/Agent) version of the product before they can deploy or use these plug-in versions. (In other words, the new plug-in versions cannot be deployed unless Oracle Enterprise Manager Cloud Control 12c Release 3 (12.1.0.3) is installed or upgraded to.)

    Oracle Enterprise Manager Cloud Control 12c Release 3 (12.1.0.3) includes tons of new features, along with several stability and performance bug fixes, and is available for download for all platforms from OTN.

    Installation/Upgrade paths: EM customers can do a fresh installation using Oracle Enterprise Manager Cloud Control 12c Release 3 (12.1.0.3) and will get the latest version of the platform, along with the latest versions of all plug-ins and metadata plug-ins, out of the box. EM customers who are on Release 1 (12.1.0.1+BP1) or Release 2 (12.1.0.2), or on the older 11g and 10.2.0.5 releases, can use the Oracle Enterprise Manager Cloud Control 12c Release 3 bits to upgrade directly to the latest Release 3, and the plug-ins will be automatically upgraded to the latest versions. The Enterprise Manager Certification Matrix is also now available on My Oracle Support - here.

    Read the article

  • ConfigServer Security and Firewall -- after setup, how much daily management required?

    - by Hope4You
    I'm looking at using ConfigServer Security and Firewall (CSF; iptables-based). After I configure it properly, how much daily ongoing management is required of me to keep my server secure? Am I going to be flooded with "alert" emails that I need to check? Or does the firewall automatically take care of most security threats for me? Note: I understand that there's more to server security than just a software firewall, but this question is specifically for CSF security management.
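    (A hedged illustration, not part of the original question: most of CSF's day-to-day email noise is controlled by the alert switches in /etc/csf/csf.conf, so a tuning pass up front largely decides how many alerts arrive later. The option names below are from memory of csf.conf and should be verified against your installed copy.)

        # /etc/csf/csf.conf -- illustrative excerpt
        LF_EMAIL_ALERT = "1"   # email when an IP is blocked for login failures
        LF_SSHD = "5"          # block an IP after 5 failed SSH logins
        LF_SSHD_PERM = "1"     # make that block permanent rather than temporary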

    Read the article

  • July 2013 Release of the Ajax Control Toolkit

    - by Stephen.Walther
    I'm super excited to announce the July 2013 release of the Ajax Control Toolkit. You can download the new version of the Ajax Control Toolkit from CodePlex (http://ajaxControlToolkit.CodePlex.com) or install the Ajax Control Toolkit from NuGet. With this release, we have completely rewritten the way the Ajax Control Toolkit combines, minifies, gzips, and caches JavaScript files. The goal of this release was to improve the performance of the Ajax Control Toolkit and make it easier to create custom Ajax Control Toolkit controls.

    Improving Ajax Control Toolkit Performance

    Previous releases of the Ajax Control Toolkit optimized performance for a single page but not multiple pages. When you visited each page in an app, the Ajax Control Toolkit would combine all of the JavaScript files required by the controls in the page into a new JavaScript file. So, even if every page in your app used the exact same controls, visitors would need to download a new combined Ajax Control Toolkit JavaScript file for each page visited. Downloading new scripts for each page that you visit does not lead to good performance. In general, you want to make as few requests for JavaScript files as possible and take maximum advantage of caching. For most apps, you would get much better performance if you could specify all of the Ajax Control Toolkit controls that you need for your entire app and create a single JavaScript file which could be used across your entire app. What a great idea!

    Introducing Control Bundles

    With this release of the Ajax Control Toolkit, we introduce the concept of Control Bundles. You define a Control Bundle to indicate the set of Ajax Control Toolkit controls that you want to use in your app. You define Control Bundles in a file located in the root of your application named AjaxControlToolkit.config. For example, the following AjaxControlToolkit.config file defines two Control Bundles:

        <ajaxControlToolkit>
            <controlBundles>
                <controlBundle>
                    <control name="CalendarExtender" />
                    <control name="ComboBox" />
                </controlBundle>
                <controlBundle name="CalendarBundle">
                    <control name="CalendarExtender"></control>
                </controlBundle>
            </controlBundles>
        </ajaxControlToolkit>

    The first Control Bundle in the file above does not have a name. When a Control Bundle does not have a name then it becomes the default Control Bundle for your entire application. The default Control Bundle is used by the ToolkitScriptManager by default. For example, the default Control Bundle is used when you declare the ToolkitScriptManager like this:

        <ajaxToolkit:ToolkitScriptManager runat="server" />

    The default Control Bundle defined in the file above includes all of the scripts required for the CalendarExtender and ComboBox controls. All of the scripts required for both of these controls are combined, minified, gzipped, and cached automatically. The AjaxControlToolkit.config file above also defines a second Control Bundle with the name CalendarBundle. Here's how you would use the CalendarBundle with the ToolkitScriptManager:

        <ajaxToolkit:ToolkitScriptManager runat="server">
            <ControlBundles>
                <ajaxToolkit:ControlBundle Name="CalendarBundle" />
            </ControlBundles>
        </ajaxToolkit:ToolkitScriptManager>

    In this case, only the JavaScript files required by the CalendarExtender control, and not the ComboBox, would be downloaded because the CalendarBundle lists only the CalendarExtender control. You can use multiple named control bundles with the ToolkitScriptManager and you will get all of the scripts from both bundles.
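    For example (a hedged sketch: the EditorBundle name is illustrative and assumed to be defined in your own AjaxControlToolkit.config alongside CalendarBundle), listing two named bundles pulls in the union of their scripts:

        <ajaxToolkit:ToolkitScriptManager runat="server">
            <ControlBundles>
                <ajaxToolkit:ControlBundle Name="CalendarBundle" />
                <ajaxToolkit:ControlBundle Name="EditorBundle" />
            </ControlBundles>
        </ajaxToolkit:ToolkitScriptManager>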
    Support for Control Bundles is a new feature of the ToolkitScriptManager that we introduced with this release. We extended the ToolkitScriptManager to support the Control Bundles that you can define in the AjaxControlToolkit.config file. Let me be explicit about the rules for Control Bundles:

    1. If you do not create an AjaxControlToolkit.config file then the ToolkitScriptManager will download all of the JavaScript files required for all of the controls in the Ajax Control Toolkit. This is the easy but low performance option.

    2. If you create an AjaxControlToolkit.config file and create a ControlBundle without a name then the ToolkitScriptManager uses that Control Bundle by default. For example, if you plan to use only the CalendarExtender and ComboBox controls in your application then you should create a default bundle that lists only these two controls.

    3. If you create an AjaxControlToolkit.config file and create one or more named Control Bundles then you can use these named Control Bundles with the ToolkitScriptManager. For example, you might want to use different subsets of the Ajax Control Toolkit controls in different sections of your app.

    I should also mention that you can use the AjaxControlToolkit.config file with custom Ajax Control Toolkit controls - new controls that you write. For example, here is how you would register a set of custom controls from an assembly named MyAssembly:

        <ajaxControlToolkit>
            <controlBundles>
                <controlBundle name="CustomBundle">
                    <control name="MyAssembly.MyControl1" assembly="MyAssembly" />
                    <control name="MyAssembly.MyControl2" assembly="MyAssembly" />
                </controlBundle>
            </controlBundles>
        </ajaxControlToolkit>

    What about ASP.NET Bundling and Minification?

    The idea of Control Bundles is similar to the idea of Script Bundles used in ASP.NET Bundling and Minification. You might be wondering why we didn't simply use Script Bundles with the Ajax Control Toolkit. There were several reasons. First, ASP.NET Bundling does not work with scripts embedded in an assembly. Because all of the scripts used by the Ajax Control Toolkit are embedded in the AjaxControlToolkit.dll assembly, ASP.NET Bundling was not an option. Second, Web Forms developers typically think at the level of controls and not at the level of individual scripts. We believe that it makes more sense for a Web Forms developer to specify the controls that they need in an app (CalendarExtender, ToggleButton) instead of the individual scripts that they need in an app (the 15 or so scripts required by the CalendarExtender). Finally, ASP.NET Bundling does not work with older versions of ASP.NET. The Ajax Control Toolkit needs to support ASP.NET 3.5, ASP.NET 4.0, and ASP.NET 4.5. Therefore, using ASP.NET Bundling was not an option. There is nothing wrong with using Control Bundles and Script Bundles side-by-side. The ASP.NET 4.0 and 4.5 ToolkitScriptManager supports both approaches to bundling scripts.

    Using the AjaxControlToolkit.CombineScriptsHandler

    Browsers cache JavaScript files by URL. For example, if you request the exact same JavaScript file from two different URLs then the exact same JavaScript file must be downloaded twice. However, if you request the same JavaScript file from the same URL more than once then it only needs to be downloaded once. With this release of the Ajax Control Toolkit, we have introduced a new HTTP Handler named the AjaxControlToolkit.CombineScriptsHandler.
    If you register this handler in your web.config file then the Ajax Control Toolkit can cache your JavaScript files for up to one year in the future automatically. You should register the handler in two places in your web.config file: in the <httpHandlers> section and the <system.webServer> section (don't forget to register the handler for the AjaxFileUpload while you are there!).

        <httpHandlers>
            <add verb="*" path="AjaxFileUploadHandler.axd"
                type="AjaxControlToolkit.AjaxFileUploadHandler, AjaxControlToolkit" />
            <add verb="*" path="CombineScriptsHandler.axd"
                type="AjaxControlToolkit.CombineScriptsHandler, AjaxControlToolkit" />
        </httpHandlers>
        <system.webServer>
            <validation validateIntegratedModeConfiguration="false" />
            <handlers>
                <add name="AjaxFileUploadHandler" verb="*" path="AjaxFileUploadHandler.axd"
                    type="AjaxControlToolkit.AjaxFileUploadHandler, AjaxControlToolkit" />
                <add name="CombineScriptsHandler" verb="*" path="CombineScriptsHandler.axd"
                    type="AjaxControlToolkit.CombineScriptsHandler, AjaxControlToolkit" />
            </handlers>
        </system.webServer>

    The handler is only used in release mode and not in debug mode. You can enable release mode in your web.config file like this:

        <compilation debug="false">

    You can also override the web.config setting with the ToolkitScriptManager like this:

        <act:ToolkitScriptManager ScriptMode="Release" runat="server"/>

    In release mode, scripts are combined, minified, gzipped, and cached with a far future cache header automatically. When the handler is not registered, scripts are requested from the page that contains the ToolkitScriptManager. When the handler is registered in the web.config file, scripts are requested from the handler. If you want the best performance, always register the handler. That way, the Ajax Control Toolkit can cache the bundled scripts across page requests with a far future cache header. If you don't register the handler then a new JavaScript file must be downloaded whenever you travel to a new page.

    Dynamic Bundling and Minification

    Previous releases of the Ajax Control Toolkit used a Visual Studio build task to minify the JavaScript files used by the Ajax Control Toolkit controls. The disadvantage of this approach to minification is that it made it difficult to create custom Ajax Control Toolkit controls. Starting with this release of the Ajax Control Toolkit, we support dynamic minification. The JavaScript files in the Ajax Control Toolkit are minified at runtime instead of at build time. Scripts are minified only when in release mode. You can specify release mode with the web.config file or with the ToolkitScriptManager ScriptMode property. Because of this change, the Ajax Control Toolkit now depends on the Ajax Minifier. You must include a reference to AjaxMin.dll in your Visual Studio project or you cannot take advantage of runtime minification. If you install the Ajax Control Toolkit from NuGet then AjaxMin.dll is added to your project as a NuGet dependency automatically. If you download the Ajax Control Toolkit from CodePlex then the AjaxMin.dll is included in the download. This change means that you no longer need to do anything special to create a custom Ajax Control Toolkit control. As an open source project, we hope more people will contribute to the Ajax Control Toolkit (yes, I am looking at you). We have been working hard on making it much easier to create new custom controls. More on this subject with the next release of the Ajax Control Toolkit.
    A Single Visual Studio Solution

    We also made substantial changes to the Visual Studio solution and projects used by the Ajax Control Toolkit with this release. This change will matter to you only if you need to work directly with the Ajax Control Toolkit source code. In previous releases of the Ajax Control Toolkit, we maintained separate solution and project files for ASP.NET 3.5, ASP.NET 4.0, and ASP.NET 4.5. Starting with this release, we now support a single Visual Studio 2012 solution that takes advantage of multi-targeting to build ASP.NET 3.5, ASP.NET 4.0, and ASP.NET 4.5 versions of the toolkit. This change means that you need Visual Studio 2012 to open the Ajax Control Toolkit project downloaded from CodePlex. For details on how we set up multi-targeting, please see Budi Adiono's blog post: http://www.budiadiono.com/2013/07/25/visual-studio-2012-multi-targeting-framework-project/

    Summary

    You can take advantage of this release of the Ajax Control Toolkit to significantly improve the performance of your website. You need to do two things: 1) create an AjaxControlToolkit.config file which lists the controls used in your app, and 2) register the AjaxControlToolkit.CombineScriptsHandler in the web.config file. We made substantial changes to the Ajax Control Toolkit with this release. We think these changes will result in much better performance for multipage apps and make the process of building custom controls much easier. As always, we look forward to hearing your feedback.

    Read the article

  • Oracle E-Business Suite Release 12.1 Certified on Solaris 11

    - by John Abraham
    Oracle Solaris 11 was announced last week, and I'm pleased to also announce that Oracle E-Business Suite Release 12.1 is now certified on Oracle Solaris on SPARC (64-bit). This new operating system release represents a culmination of years of hard work by our Solaris engineering group. It has a number of new and advanced features, including simplified deployment and lifecycle management tools, built-in certified virtualization technologies, support for the latest generation of SPARC chips, and more.

    New installations of the E-Business Suite R12 on this platform will require version 12.1.1 or higher and the latest Rapid Install startCD version 12.1.1.13. For existing 12.1 installations, we have also certified an "in place" OS upgrade or the use of cloning to a target Solaris 11 system. There are also specific requirements to upgrade technology components such as the Oracle Database and Fusion Middleware. These requirements are noted in the links below.

    References
    • Oracle E-Business Suite Installation and Upgrade Notes Release 12 (12.1.1) for Oracle Solaris on SPARC (64-bit) (My Oracle Support Document 761568.1)
    • Oracle Database Installation Guide 11g Release 2 (11.2) for Oracle Solaris
    • Interoperability Notes Oracle E-Business Suite Release 12 with Oracle Database 11g Release 2 (11.2.0) (My Oracle Support Document 1058763.1)
    • Cloning Oracle Applications Release 12 with Rapid Clone (My Oracle Support Document 406982.1)

    Related Articles
    • New Rapid Install StartCD (12.1.1.13) for Oracle E-Business Suite Release 12.1 Now Available
    • Oracle E-Business Suite Release 12.1.3 Now Available

    Read the article

  • What's New in Database Lifecycle Management in Enterprise Manager 12c Release 3

    - by HariSrinivasan
    Enterprise Manager 12c Release 3 includes improvements and enhancements across every area of the product. This blog provides an overview of the new and enhanced features in the Database Lifecycle Management area. I will dive into specific features in more depth in subsequent posts.

    "What's New?"

    In this release, we focused on four things:
    1. Lifecycle management support for the new Database 12c - Pluggable Databases
    2. Management of long running processes, such as a security patch cycle (Change Activity Planner)
    3. Management of large numbers of systems by
       • Leveraging new framework capabilities for lifecycle operations, such as the new advanced 'emcli' script option
       • Refining features such as configuration search and compliance
    4. Minor improvements and quality fixes to existing features
       • Rollback support for single instance databases
       • Improved "OFFLINE" patching experience
       • Faster collection of ORACLE_HOME configurations

    Lifecycle Management Support for the new Database 12c - Pluggable Databases

    Database 12c introduces Pluggable Databases (PDBs), the brand new addition to help you achieve your consolidation goals. Pluggable databases offer unprecedented consolidation at the database level and native lifecycle verbs for creating, plugging and unplugging the databases on a container database (CDB). Enterprise Manager can supplement the capabilities of pluggable databases by offering workflows for migrating, provisioning and cloning them using the software library and the deployment procedures. For example, Enterprise Manager can migrate an existing database to a PDB, or clone a PDB by storing a versioned copy in the software library. One can also manage the planned downtime related to patching by migrating the PDBs to a new CDB. While pluggable databases offer these exciting features, they can also pose configuration management and compliance challenges if not managed properly. Enterprise Manager features like inventory management, topology associations and configuration search can mitigate the sprawl of PDBs and also lock them to predefined golden standards using configuration comparison and compliance rules. Learn More ...

    Management of Long Running Datacenter Processes - Change Activity Planner (CAP)

    Currently, customers resort to cumbersome methods to create, execute, track and monitor change activities within their data center. Some customers use traditional tools such as spreadsheets, project planners and in-house custom built solutions. Customers often have weekly sync up meetings across stakeholders to collect status and updates. Some of the change activities, for example the quarterly patch set update (PSU) patch rollouts, are not single tasks but processes with multiple tasks. Some of those tasks are performed within Enterprise Manager Cloud Control (for example, Patch) and some are performed outside of Enterprise Manager Cloud Control. These tasks often run for a long period of time and involve multiple people or teams.

    Enterprise Manager Cloud Control supports core data center operations such as configuration management, compliance management, and automation. Enterprise Manager Cloud Control release 12.1.0.3 leverages these capabilities and introduces the Change Activity Planner (CAP). CAP provides the ability to plan, execute, and track change activities in real time. It covers the typical datacenter activities that are spread over a long period of time, across multiple people and multiple targets (even target types).

    Here are some examples of change activity processes in a datacenter:
    • Patching large environments (PSU/CPU patching cycles)
    • Upgrading large numbers of database environments
    • Rolling out compliance rules
    • Database consolidation to Exadata environments

    CAP provides user flows for Compliance Officers/Managers (incl. lead administrators) and Operators (DBAs and admins). Managers can create change activity plans for various projects and allocate the resources, targets, and groups affected. Upon activation of the plan, tasks are created and automatically assigned to individual administrators based on target ownership. Administrators (DBAs) can identify their tasks and understand the context, schedules, and priorities. They can complete tasks using Enterprise Manager Cloud Control automation features such as patch plans (or in some cases outside Enterprise Manager). Upon completion, compliance is evaluated for validation, and the status of the tasks and the plans is updated. Learn More about CAP ...

    Improved Configuration & Compliance Management of a Large Number of Systems

    Improved configuration comparison: Get to the configuration comparison results faster for simple ad-hoc comparisons. When performing a 1-to-1 comparison, Enterprise Manager will perform the comparison immediately and take the user directly to the results, without having to wait for a job to be submitted and executed.

    Flattened system comparisons reduce comparison setup time and complexity. In addition to the previously existing topological comparison, users now have an option to compare using a "flattened" methodology. Flattening means removing duplicate target instances within the systems and removing the hierarchy of member targets. The result is that differences are much easier to spot, particularly for specific use cases like comparing patch levels between complex systems like RAC and Fusion Apps.

    Improved Configuration Search & Advanced EMCLI Script Option for Mass Automation

    Enterprise Manager 12c introduces a new framework-level capability to script and stitch together multiple tasks using EMCLI. This powerful capability can be leveraged for lifecycle operations, especially when executing a task over a large number of targets. Specific usages include retrieving a qualified list of targets using configuration search and then using the result set for automation. Another example would be executing a patching operation and then re-executing it on targets where it may have failed. This is complemented by other enhancements, such as better usability for designing reusable configuration searches. In EM 12c Release 3, a simplified UI makes building ad-hoc searches even easier. Searching for missing patches is a common use of configuration search; this required the use of the advanced options, which are now clearly defined and easy to use.

    Perform "Configuration Search" using the EMCLI. Users can find and execute configuration searches from the EMCLI, which can be extremely useful for building sophisticated automation scripts. For example, run the search named "Oracle Databases on Exadata", which finds all database targets running on top of Exadata, and further filter the results by options like name, host, etc.:

        emcli get_targets -config_search="Databases on Exadata" -target_name="exa%"

    Use this in powerful mass automation operations using the new emcli script option - for example, to solve the use case of finding all DBs running on Exadata that house E-Biz, and patching them.

    Create a Python script with emcli functions and invoke it in the new EMCLI script option shell. Invoke the script in the new EMCLI with the script option directly (a rough sketch of what such a script might contain appears at the end of this post):

        $<path to emcli>/emcli @myPSU_Patch.py

    Richer compliance content: There are now over 50 Oracle-provided compliance standards, including new standards for Pluggable Database, Fusion Applications, Oracle Identity Manager, Oracle VM and Internet Directory, plus 9 Oracle-provided real-time monitoring standards containing over 900 compliance rules across 500 facets. These new real-time compliance standards cover both Exadata compute nodes and Linux servers. The result is increased Oracle software coverage and faster time to compliance monitoring on Exadata.

    Enhancements to Patch Management

    Overhauled "OFFLINE" patching experience: A simplified patch upload UI improves the offline experience of patching; there is now a single-step process to get the patches into the software library. Customers often maintain local repositories of patches, sometimes called software depots, where they host the patches downloaded from My Oracle Support. In the past, you had to move these patches to your desktop and then upload them to Enterprise Manager's software library through the Enterprise Manager Cloud Control user interface. You can now use the following EMCLI command to upload multiple patches directly from a remote location within the data center:

        $emcli upload_patches -location <Path to Patch directory> -from_host <HOSTNAME>

    The upload process filters all of the new patches, automatically selects the relevant metadata files from the location, and uploads the patches to the software library.

    Other improvements: Patch rollback for single instance databases - a new option in the patch plan rolls back the patches added to the plan; upon execution, the procedure rolls back the patch and the SQL applied to the single instance databases. Improved and faster configuration collection of Oracle Home targets enables more reliable automation of higher-level functions like provisioning, patching or Database as a Service.

    To recap, here is a list of database lifecycle management features (in the original post, red highlights mark items that are new or enhanced in Release 3):
    • Discovery, inventory tracking and reporting
    • Database provisioning, including:
      o Migration to pluggable databases
      o Plugging and unplugging of pluggable databases
      o Gold image based cloning
      o Scaling of RAC nodes
    • Schema and data change management
    • End-to-end patch management in online and offline modes, including:
      o Patch advisories in online (connected with My Oracle Support) and offline mode
      o Patch pre-deployment analysis, deployment and rollback (currently only for single instance databases)
      o Reporting
    • Upgrade planning and execution of the upgrade process
    • Configuration management
    • Compliance management with out-of-box content
    • Change Activity Planner for planning, designing and tracking long running processes

    For more information on Enterprise Manager's database lifecycle management capabilities, visit http://www.oracle.com/technetwork/oem/lifecycle-mgmt/index.html
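    As promised above, here is a minimal sketch of what a script like myPSU_Patch.py might contain. This is illustrative only: it assumes the EMCLI script option's Jython shell, where verbs such as get_targets are exposed as Python functions returning response objects, and the follow-up patching step is left as a comment because the exact verbs depend on your patch plan setup.

        # myPSU_Patch.py -- illustrative sketch for the EMCLI script option (Jython).
        from emcli import *

        # Find all database targets running on Exadata, filtered by name,
        # reusing the saved configuration search shown earlier.
        resp = get_targets(config_search='Databases on Exadata', target_name='exa%')

        for target in resp.out()['data']:
            # Hypothetical follow-up: add each matching target to a PSU patch plan here.
            print('Candidate for patching: %s (%s)' % (target['Target Name'], target['Target Type']))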

    Read the article

  • Notes on Oracle BPM PS6 Adaptive Case Management

    - by gcolman
    I have recently been looking at the latest release of the BPM Case Management feature in the Oracle BPM PS6 release. I had put together some notes to help me gain a better understanding of the context of the PS6 BPM Case Management. Hopefully, this along with the other resources will enable you to gain a clear picture of the flexibility of this feature.

    The Oracle BPM PS6 release includes Case Management capability. This initial release aims to provide:
    • A Case Management framework
    • Integration of Case Management with BPM & SOA Suite

    It is best to regard the current PS6 Case Management feature as a case management framework. The framework provides the building blocks for creating a case management system that is fully integrated into Oracle BPM Suite. As of the current PS6 release, no UI tooling exists to help manage cases or the case lifecycle. Mark Foster has written a good blog which outlines Case Management within PS6 in the following link. I wanted to provide more context on Case Management from my perspective in this blog.

    PS6 Case Management - High Level View

    BPM PS6 includes "Case" as a first class component in a SOA Suite composite. The Case components (added to the SOA composite) are created when a BPM process is assigned to a case in JDeveloper. The SOA Case component is defined and configured within JDeveloper, which allows us to specify the case data structures and metadata such as stakeholders, outcomes, milestones, document stores etc. "Activities" are associated with a case, and become available to be executed via the case APIs. Activities are BPM processes, Human Activities or Java call-outs.

    The PS6 release includes some additional database tables to store the case metadata and case instance data (data objects, comments, etc.). These new tables are created within the SOA_INFRA schema, and the documents associated with a case go into a document repository that is configured with the case.

    One of the main features of Case Management is the control of the case logic through case events and case business rules. A PS6 Case has an associated business rule component, which can be configured to control the availability and execution of activities within the case. The business rules component is able to act upon events that the PS6 Case Management framework generates during the lifecycle of that case. Events are fired during the lifetime of the case (e.g. case created, activity started, activity ended, note added, document uploaded).

    Internal Case State

    The internal state of a case is represented by the diagram below, which shows the internal states and the transition paths for a case from one state to the next. Each transition in state will create an event that can be enacted upon via the case rules engine.

    Defining a Case

    A Case is created and defined as a component of a JDeveloper BPM project. When you create a Case as part of a BPM project, JDeveloper creates the following components within the SCA composite:
    • Case component
    • Case component interfaces (WSDL etc.)
    • Case Rules component (Oracle Business Rules)
    • Addition of the Case component and Case Rules component to the BPM SOA composite

    Case Configuration

    The following section gives a high level overview of the items that can be configured for a BPM Case.
    Case Activities

    A Case is associated with a set of activities that are to be performed as part of that Case. Case activities can be:
    • SOA Human Tasks
    • BPM processes
    • Custom Tasks (Java classes)

    Case activities are created from pre-existing BPM processes or human tasks which, once defined, can be additionally configured as case activities in JDeveloper and made available within the lifecycle of a case. I've described the following configurable components of a case (very!) briefly as:

    • Milestones: Optional, user defined logical milestones that can be achieved within a case. No activities are associated with a milestone, but milestone attainment can be programmatically set, and events are raised when milestones are reached.
    • Outcomes: User defined statuses of a completed case. An event is fired when an outcome is attained.
    • Case Data: Defines the data that will be stored with a case; XML schemas define the data that is stored with the case.
    • Case Documents: Defines the location of documents that are attached to a case (e.g. WebCenter Content).
    • User Defined Events: Optional user defined events that can be fired or captured to drive case processing rules.
    • Stakeholders: Defines the actors who can participate in the case (roles, users, groups), and defines permissions for individual case actions (read case, create document etc.).
    • Business Rules: The main component controlling the flow of a case. Each case has an associated business ruleset, and rules are fired on receiving case events (or user defined events): lifecycle events, milestone events, activity events, data events, document events, comment events and user events.

    Managing the Case

    Managing the lifecycle of a case is achieved in two ways:
    • Managing case logic with business rules
    • Managing the case lifecycle via the Case APIs

    A BPM Case can be viewed as a set of case data and documents, along with the activities that can be performed within a case, and the case lifecycle state expressed as milestones and internal lifecycle state. The management of the case lifecycle is achieved through both the configuration of business rules and "manual" interaction with a case instance through the Case APIs.

    Business Rules and Case Events

    A key component within the Case Management framework is the event model. The BPM Case Management solution internally utilizes Oracle EDN (Event Delivery Network) to publish and subscribe to events generated by the Case framework. Events are generated by the Case framework at each of the processes and stages that a case instance travels through in its lifetime. The following case events are part of the BPM Case: lifecycle events, milestone events, activity events, data events, document events, comment events and user events. The Case business rules are configured to listen for these events, and business logic can be coded into the Case rules component to act upon an event being received.

    Case API & Interaction

    Along with the business rules component, cases can be managed via the Case API interfaces. These interfaces allow for the building of custom applications that integrate into the case management framework. The APIs allow for updating case comments and documents, executing case activities, updating milestones etc. As there are no built-in case management UI functions within the PS6 release, cases need to be managed via a custom-built UI, interacting with selected case instances, launching case activities, closing cases etc.
    (There is expected to be a UI component within subsequent releases.)

    Logical Case Flow

    The diagram below is intended to depict a logical view of the case steps for a typical case:
    • A UI or other service calls the Case interface to create a Case instance
    • The case instance is created and its database data inserted
    • A lifecycle event is raised indicating a case activity (created) event
    • The case business rules capture the event and decide on an action to take; additionally, other parties can subscribe to case events via EDN
    • The business rules may handle the event, e.g. they may be configured to execute a case activity on a case creation event
    • The BPM/Human Workflow/Custom activity is executed
    • A case activity event is raised on the executed activity
    • A case work UI or business service can inspect the case instance and call other actions to progress that case, such as: execute activity, add note, add document, add case data, update milestone, raise user defined event, suspend case, resume case, close case

    Summary

    Having had a little time to play around with the APIs and the case configuration, I really like the flexibility and power of combining Oracle Business Rules and the BPM Case Management event model. Creating something this flexible and powerful without BPM Case Management would take a lot of time and effort. This is hopefully going to save my customers a lot of time and effort! I may make amendments to this post as my understanding of Case Management increases!

    Take a look at the following links for official documentation etc.
    http://docs.oracle.com/cd/E28280_01/doc.1111/e15176/case_mgmt_bpmpd.htm
    https://blogs.oracle.com/bpm/entry/just_in_case
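    To make the "custom built UI" interaction above concrete, here is a rough Java sketch of the call pattern the Case APIs enable. An important hedge: the interface below is a hypothetical placeholder of my own, not the real PS6 API (whose classes ship with the Oracle BPM client libraries); see the documentation links above for the actual interfaces.

        // Hypothetical facade illustrating the Case API interaction pattern.
        // These names are placeholders, NOT the real Oracle BPM PS6 classes.
        interface CaseClient {
            String createCase(String caseType);                       // raises a lifecycle event
            void addComment(String caseId, String comment);           // raises a comment event
            void executeActivity(String caseId, String activityName); // raises activity events
            void attainMilestone(String caseId, String milestone);    // raises a milestone event
            void closeCase(String caseId, String outcome);            // sets the case outcome
        }

        class CaseFlowSketch {
            static void run(CaseClient client) {
                String caseId = client.createCase("LoanApplication");
                client.addComment(caseId, "Created from a custom work UI");
                client.executeActivity(caseId, "VerifyIdentity");
                client.attainMilestone(caseId, "ApplicationComplete");
                client.closeCase(caseId, "Approved");
            }
        }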

    Read the article

  • Dependent on CVS tagging for automated builds

    - by OMG Ponies
    My current work relies on using tags in CVS for an automated build process (Ant, currently) to build for the respective environments (development, QA, production). From our research, neither Git nor Subversion supports tagging in the same manner (please correct me if I'm wrong). If we use Subversion or Git, how would Ant or Maven know what to pick up for the respective build?

    Example: for a webapp, when viewing our repository, say for the web.xml file, the history would look like:

        web.xml v1
        ...
        web.xml v1.2.3  Tag: Prod
        web.xml v1.2.4
        web.xml v1.2.5  Tag: QA
        web.xml v1.2.6
        web.xml v1.2.7  Head

    The Ant build scripts are run as cron jobs, at different times and intervals for different environments. The environment build is based on the repository checkout, based on the tag. Development continues, and eventually the respective tags are moved:

        web.xml v1
        ...
        web.xml v1.2.3
        web.xml v1.2.4
        web.xml v1.2.5
        web.xml v1.2.6  Tag: Prod
        web.xml v1.2.7  Tag: QA
        web.xml v1.2.8  Head
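    (A hedged aside on the Git side, since the question invites correction: Git does have tags - they are repository-wide refs rather than per-file markers - and a movable tag can serve the same role in a scheduled build. A sketch of the equivalent workflow, using the QA tag name from the example; the commit SHA is a placeholder:)

        # Move (force-update) the QA tag to the commit that should build for QA
        git tag -f QA <commit-sha>
        git push -f origin QA

        # In the scheduled build, fetch moved tags and check out exactly what QA points at
        git fetch --tags --force
        git checkout QA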

    Read the article

  • How do you manage frequent software releases to multiple clients?

    - by meeech
    Hi, we have a cross-platform middleware product which we typically end up customizing/bug fixing on a per-client basis, in some cases providing updates as often as once or twice per week. We have a lot of trouble efficiently managing and releasing the updates to our clients. I've done some digging, but I can't find anything that specifically addresses this problem. Can anyone share their experiences - how do you deal with this scenario, or do you know of a good software delivery CMS? Thanks.

    Read the article

  • How do you update live web sites with code changes?

    - by Aaron Anodide
    I know this is a very basic question. If someone could humor me and tell me how they would handle this, I'd be grateful. I decided to post this because I am about to install SynchToy to remedy the issue below, and I feel a bit unprofessional using a "Toy", but I can't think of a better way. Many times I find when I am in this situation, I am missing some painfully obvious way to do things - this comes from being the only developer in the company.

    ASP.NET web application developed on my computer at work. The solution has 2 projects:
    • Website (files)
    • WebsiteLib (C#/dll)
    Using a Git repository. Deployed on a GoGrid 2008R2 web server.

    Deployment:
    1. Make code changes.
    2. Push to Git.
    3. Remote desktop to server.
    4. Pull from Git.
    5. Overwrite the live files by dragging/dropping with Windows Explorer.

    In step 5 I delete all the files from the website root... this can't be a good thing to do. That's why I am about to install SynchToy...

    UPDATE: THANKS for all the useful responses. I can't pick which one to mark as the answer. Between the web deployment suggestions, it looks like I have several useful options:
    • Web project = whole site packaged into a single DLL - the downside for me is that I can't push simple updates; being a lone developer in a company of 50, this remains something that is simpler at times.
    • Pulling straight from SCM into the web root of the site - I originally didn't do this out of fear that my SCM hidden directory might end up being exposed, but the answers here helped me get over that (although I still don't like having one more thing to worry about forgetting to make sure is still true over time).
    • Using a web farm, and systematically deploying to nodes - this is the ideal solution for zero downtime, which is actually something I care about since the site is essentially a real-time revenue source for my company - I might have a hard time convincing them to double the cost of the servers though.
    Finally, the reinforcement of the basic principle that there needs to be single-click deployment for the site OR ELSE THERE'S SOMETHING WRONG is probably the most useful thing I got out of the answers.

    UPDATE 2: I thought I'd come back to this and update it with the actual solution that's been in place for many months now and is working perfectly (for my single web server setup). The process I use is:
    1. Make code changes.
    2. Push to Git.
    3. Remote desktop to server.
    4. Pull from Git.
    5. Run the following batch script:

        cd C:\Users\Administrator
        %systemroot%\system32\inetsrv\appcmd.exe stop site "/site.name:Default Web Site"
        robocopy Documents\code\da\1\work\Tree\LendingTreeWebSite1 c:\inetpub\wwwroot /E /XF connectionsconfig Web.config
        %systemroot%\system32\inetsrv\appcmd.exe start site "/site.name:Default Web Site"

    As you can see, this brings the site down, uses robocopy to intelligently copy only the files that have changed, then brings the site back up. It typically runs in less than 2 seconds. Since peak traffic on this site is about 2 requests per second, missing 4 requests per site update is acceptable. Since I've gotten more proficient with Git, I've found that having the first four steps above be a "manual process" is also acceptable, although I'm sure I could roll the whole thing into a single click if I wanted to. The documentation for AppCmd.exe is here. The documentation for Robocopy is here.

    Read the article

  • NSObject release destroys local copy of object's data

    - by Spider-Paddy
    I know this is something stupid on my part but I don't get what's happening. I create an object that fetches data and puts it into an array in a specific format. Since it fetches asynchronously (it has to download and parse data), I put a delegate method into the object that needs the data, so that the data-fetching object copies its formatted array into an array in the calling object. The problem is that when the data-fetching object is released, the copy it created in the caller is being erased. The code is:

    In the .h file:

        @property (nonatomic, retain) NSArray *imagesDataSource;

    In the .m file:

        // Fetch item details
        ImagesParser *imagesParserObject = [[ImagesParser alloc] init:self];
        [imagesParserObject getArticleImagesOfArticleId:(NSInteger)currentArticleId];
        [imagesParserObject release]; // <-- problematic release

        // Called by parser when images parsing is finished
        -(void)imagesDataTransferComplete:(ImagesParser *)imagesParserObject
        {
            // copy array to local variable
            self.imagesDataSource = [imagesParserObject.returnedArray copy];

            // If there are more pics, they must be assembled in an array for possible UIImageView animation
            NSInteger picCount = [imagesDataSource count];
            if (picCount > 1) // 1 image is assumed to be the pic already displayed
            {
                // Build image array
                NSMutableArray *tempPicArray = [[NSMutableArray alloc] init]; // Temp space to hold images while building
                for (int i = 0; i < picCount; i++)
                {
                    // Get Nr from only article in detailDataSource & pic name (Small) from each item in imagesDataSource
                    NSString *picAddress = [NSString stringWithFormat:@"http://some.url.com/shopdata/image/article/%@/%@",
                        [[detailDataSource objectAtIndex:0] objectForKey:@"Nr"],
                        [[imagesDataSource objectAtIndex:i] objectForKey:@"Small"]];
                    NSURL *picURL = [NSURL URLWithString:picAddress];
                    NSData *picData = [NSData dataWithContentsOfURL:picURL];
                    [tempPicArray addObject:[UIImage imageWithData:picData]];
                }
                imagesArray = [tempPicArray copy]; // copy makes immutable copy of array
                [tempPicArray release];
                currentPicIndex = 0; // Assume first pic is pic already being shown
            }
            else
                imagesArray = nil; // No need for a needless pic array

            // Remove please wait message
            [pleaseWaitViewControllerObject.view removeFromSuperview];
        }

    I put in tons of NSLog lines to keep track of what was going on, and self.imagesDataSource is populated with the returned array, but when the parser object is released self.imagesDataSource becomes empty. I thought self.imagesDataSource = [imagesParserObject.returnedArray copy]; was supposed to make an independent object, as if it was alloc/init-ed, so that self.imagesDataSource is not just a pointer to the parser's array but is its own array. So why does the release of the parser object clear the copy of the array? (I checked and double-checked that it's not something overwriting self.imagesDataSource; commenting out [imagesParserObject release] consistently fixes the problem.) Also, I have exactly the same problem with self.detailDataSource, which is declared and populated in exactly the same way as self.imagesDataSource. I thought that once I call the parser I could release it, because the caller no longer needs to refer to it; all further activity is carried out by the parser object through its delegate method. What am I doing wrong?

    Read the article

  • I don't know how to release NSString

    - by Beomseok
    Hi guys, I'm using this code and want to release the NSString objects:

        NSArray *path = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
        NSString *documentsDirectory = [path objectAtIndex:0];
        NSString *databasePath = [documentsDirectory stringByAppendingPathComponent:@"DB"];
        NSString *fileName = [newWordbookName stringByAppendingString:@".csv"];
        NSString *fullPath = [databasePath stringByAppendingPathComponent:fileName];
        [[NSFileManager defaultManager] createFileAtPath:fullPath contents:nil attributes:nil];
        [databasePath release];
        //[fileName release]; Error!
        //[fullPath release]; Error!
        //NSLog(@"#1 :databasePath: %d",[databasePath retainCount]);
        //NSLog(@"#1 :fileName: %d",[fileName retainCount]);
        //NSLog(@"#1 :fullPath: %d",[fullPath retainCount]);

    I declare fileName, fullPath, and databasePath as NSString. But while databasePath is released, fileName and fullPath don't release, and I don't know why that happens. I know that the NSArray is autoreleased, but is documentsDirectory autoreleased? (newWordbookName is an NSString.) I'd like to look through a document about iPhone memory management. Please advise me.

    Read the article

  • Unable to disable generation of sources JAR with maven-release-plugin

    - by Chris Lieb
    I am trying to release a web project using Maven 2.2.1 and the maven-release-plugin 2.0-beta-9, but release:perform always fails when generating the sources JAR for the EAR project, which makes sense since the EAR project doesn't have any source.

        [INFO] [INFO] [source:jar {execution: attach-sources}]
        [INFO] [INFO] ------------------------------------------------------------------------
        [INFO] [ERROR] BUILD ERROR
        [INFO] [INFO] ------------------------------------------------------------------------
        [INFO] [INFO] Error creating source archive: You must set at least one file.

    To try to disable the building of a sources JAR for the EAR project, I added the following to the POM for my EAR project (the version of the release plugin is set in a parent POM):

        <build>
            <plugins>
                ...
                <plugin>
                    <groupId>org.apache.maven.plugins</groupId>
                    <artifactId>maven-release-plugin</artifactId>
                    <configuration>
                        <useReleaseProfile>false</useReleaseProfile>
                    </configuration>
                </plugin>
            </plugins>
        </build>

    Upon running the release again after checking in this change, I got the same error while generating the sources JAR for the EAR project, even though this should have been disabled by the previous POM snippet. What am I doing wrong? Why is the sources JAR still being built?

    Edit: I've tried to make the source plugin include my application.xml file so that this error doesn't occur, by adding the following POM snippet:

        <build>
            <plugins>
                ...
                <plugin>
                    <groupId>org.apache.maven.plugins</groupId>
                    <artifactId>maven-source-plugin</artifactId>
                    <configuration>
                        <includes>
                            <include>${basedir}/META-INF/**/*</include>
                        </includes>
                        <useDefaultExcludes>false</useDefaultExcludes>
                    </configuration>
                </plugin>
            </plugins>
        </build>

    Unfortunately, this does not fix the problem either.
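    (One approach worth sketching - an assumption on my part, not something the original post confirms works: if the sources JAR is attached by an inherited attach-sources execution of the maven-source-plugin, it can usually be neutralized in the EAR module by redefining that execution with its phase unbound. The execution id attach-sources matches the id shown in the build log above.)

        <build>
            <plugins>
                <plugin>
                    <groupId>org.apache.maven.plugins</groupId>
                    <artifactId>maven-source-plugin</artifactId>
                    <executions>
                        <execution>
                            <!-- same id as the inherited execution, so this overrides it -->
                            <id>attach-sources</id>
                            <!-- "none" unbinds the goal from the lifecycle in this module -->
                            <phase>none</phase>
                        </execution>
                    </executions>
                </plugin>
            </plugins>
        </build>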

    Read the article

  • Managing NSStoreType changes between debug and release builds

    - by Eimantas
    NSXMLStoreType is used by default when starting a Core Data backed application, because it's good for debugging purposes. But practice dictates that developers should use either NSBinaryStoreType, NSInMemoryStoreType or NSSQLiteStoreType in release builds. How do you manage this change between debug and release builds? I believe that changing the store type from NSXMLStoreType to, say, NSBinaryStoreType in code on each release is kinda cumbersome.
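    (One common pattern, offered as a hedged sketch rather than the canonical answer: select the store type at compile time with the DEBUG preprocessor macro, which Xcode's standard templates define for debug configurations. The storeURL and coordinator variables below are assumed to already exist in your Core Data stack setup.)

        // Pick a persistent store type per build configuration.
        #ifdef DEBUG
            NSString *storeType = NSXMLStoreType;     // human-readable, easy to inspect
        #else
            NSString *storeType = NSSQLiteStoreType;  // compact and fast for release
        #endif

        NSError *error = nil;
        if (![coordinator addPersistentStoreWithType:storeType
                                       configuration:nil
                                                 URL:storeURL
                                             options:nil
                                               error:&error]) {
            NSLog(@"Failed to add persistent store: %@", error);
        }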

    Read the article

  • Using Boost in release and debug mode at the same time

    - by Oodini
    Hello, the title is just for teasing. :-) I know it isn't possible, but here is my problem. My app (a DLL, actually) uses Boost. It also uses an external API, which also uses Boost. The external API is of course provided as a release binary, along with the release Boost binaries it needs. When I compile/link my DLL in release mode, I have no problem. To be precise, I link my app to Boost dynamically (BOOST_ALL_DYN_LINK). In debug mode, I can't load my DLL. I am not sure it is because of Boost, but I suspect Windows doesn't like working with two Boosts (the release one called by the external lib, and the debug one called by my code). Will it work better if I link my code statically with the release Boost? I don't think it is related to the CRT, because I have nothing in the Event Viewer. I use Visual Studio 2005 SP1. Thanks.

    Read the article

  • Can I just release the top object (iPhone)?

    - by yar
    If I release the object that's holding a reference to the variable that I need to release, is that sufficient? Or must I release at every level of the containment hierarchy? I fear that my logic comes from working with a garbage collector for too long. For instance, I assigned to this property of a UIPickerView instance by hand instead of using IB:

        @property(nonatomic, assign) id<UIPickerViewDelegate> delegate

    Since it's an assign property, I can't just release the reference after I assign it. When I finally release my UIPickerView instance, do I need to do this:

        [singlePicker.delegate release];
        [singlePicker release];

    or is the second line sufficient? Also: are these assign properties the norm, or is that mostly for Interface Builder? I thought that retain properties were the normal thing to expect.
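    (For contrast, a hedged sketch of the retain pattern the asker mentions, under the usual pre-ARC memory management conventions; delegates are a deliberate exception and are typically declared assign to avoid retain cycles:)

        // Typical pre-ARC retain property
        @property (nonatomic, retain) NSArray *items;

        // The owning object releases only what it retains
        - (void)dealloc {
            [items release];       // retained property: released here
            // singlePicker.delegate was assign, so it is NOT released
            [singlePicker release];
            [super dealloc];
        }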

    Read the article

  • Developing custom MBeans to manage J2EE Applications (Part III)

    - by philippe Le Mouel
    This is the third and final part in a series of blogs that demonstrate how to add management capability to your own application using JMX MBeans.

    In Part I we saw:
    • How to implement a custom MBean to manage configuration associated with an application.
    • How to package the resulting code and configuration as part of the application's ear file.
    • How to register MBeans upon application startup, and unregister them upon application stop (or undeployment).
    • How to use generic JMX clients such as JConsole to browse and edit our application's MBean.

    In Part II we saw:
    • How to add localized descriptions to our MBean, MBean attributes, MBean operations and MBean operation parameters.
    • How to specify meaningful names for our MBean operation parameters.
    We also touched on future enhancements that will simplify how we can implement localized MBeans.

    In this third and last part, we will rewrite our MBean to simplify how we add localized descriptions. To do so we will take advantage of the functionality we already described in Part II and that is now part of WebLogic 10.3.3.0. We will show how to take advantage of WebLogic's localization support to localize our MBeans based on the client's Locale, independently of the server's Locale. Each client will see MBean descriptions localized based on his/her own Locale. We will show how to achieve this using JConsole, and also using a sample programmatic JMX Java client.

    The complete code sample and associated build files for Part III are available as a zip file. The code has been tested against WebLogic Server 10.3.3.0 and JDK 6. To build and deploy our sample application, please follow the instructions provided in Part I, as they also apply to Part III's code and associated zip file.

    Providing custom descriptions take II

    In Part II we localized our MBean descriptions by extending the StandardMBean class and overriding its many getDescription methods. WebLogic 10.3.3.0, similarly to JDK 7, can automatically localize MBean descriptions as long as those are specified according to the following conventions. Descriptions resource bundle keys are named according to:
    • MBean description: <MBeanInterfaceClass>.mbean
    • MBean attribute description: <MBeanInterfaceClass>.attribute.<AttributeName>
    • MBean operation description: <MBeanInterfaceClass>.operation.<OperationName>
    • MBean operation parameter description: <MBeanInterfaceClass>.operation.<OperationName>.<ParameterName>
    • MBean constructor description: <MBeanInterfaceClass>.constructor.<ConstructorName>
    • MBean constructor parameter description: <MBeanInterfaceClass>.constructor.<ConstructorName>.<ParameterName>

    We also purposely named our resource bundle class MBeanDescriptions and included it as part of the same package as our MBean.
    We already followed the above conventions when creating our resource bundle in Part II, and our default resource bundle class with English descriptions looks like:

        package blog.wls.jmx.appmbean;

        import java.util.ListResourceBundle;

        public class MBeanDescriptions extends ListResourceBundle {
            protected Object[][] getContents() {
                return new Object[][] {
                    {"PropertyConfigMXBean.mbean",
                     "MBean used to manage persistent application properties"},
                    {"PropertyConfigMXBean.attribute.Properties",
                     "Properties associated with the running application"},
                    {"PropertyConfigMXBean.operation.setProperty",
                     "Create a new property, or change the value of an existing property"},
                    {"PropertyConfigMXBean.operation.setProperty.key",
                     "Name that identify the property to set."},
                    {"PropertyConfigMXBean.operation.setProperty.value",
                     "Value for the property being set"},
                    {"PropertyConfigMXBean.operation.getProperty",
                     "Get the value for an existing property"},
                    {"PropertyConfigMXBean.operation.getProperty.key",
                     "Name that identify the property to be retrieved"}
                };
            }
        }

    We have now also added a resource bundle with French localized descriptions:

        package blog.wls.jmx.appmbean;

        import java.util.ListResourceBundle;

        public class MBeanDescriptions_fr extends ListResourceBundle {
            protected Object[][] getContents() {
                return new Object[][] {
                    {"PropertyConfigMXBean.mbean",
                     "Manage proprietes sauvegarde dans un fichier disque."},
                    {"PropertyConfigMXBean.attribute.Properties",
                     "Proprietes associee avec l'application en cour d'execution"},
                    {"PropertyConfigMXBean.operation.setProperty",
                     "Construit une nouvelle proprietee, ou change la valeur d'une proprietee existante."},
                    {"PropertyConfigMXBean.operation.setProperty.key",
                     "Nom de la propriete dont la valeur est change."},
                    {"PropertyConfigMXBean.operation.setProperty.value",
                     "Nouvelle valeur"},
                    {"PropertyConfigMXBean.operation.getProperty",
                     "Retourne la valeur d'une propriete existante."},
                    {"PropertyConfigMXBean.operation.getProperty.key",
                     "Nom de la propriete a retrouver."}
                };
            }
        }

    So now we can just remove the many getDescription methods from our MBean code, and end up with a much cleaner implementation:

        package blog.wls.jmx.appmbean;

        import java.io.File;
        import java.io.FileInputStream;
        import java.io.FileOutputStream;
        import java.io.IOException;
        import java.io.InputStream;
        import java.io.OutputStream;
        import java.net.URL;
        import java.util.HashMap;
        import java.util.Map;
        import java.util.Properties;

        import javax.management.MBeanOperationInfo;
        import javax.management.MBeanParameterInfo;
        import javax.management.MBeanRegistration;
        import javax.management.MBeanServer;
        import javax.management.ObjectName;
        import javax.management.StandardMBean;

        public class PropertyConfig extends StandardMBean
                implements PropertyConfigMXBean, MBeanRegistration {

            private String relativePath_ = null;
            private Properties props_ = null;
            private File resource_ = null;
            private static Map<String, String[]> operationsParamNames_ = null;

            static {
                operationsParamNames_ = new HashMap<String, String[]>();
                operationsParamNames_.put("setProperty", new String[] {"key", "value"});
                operationsParamNames_.put("getProperty", new String[] {"key"});
            }

            public PropertyConfig(String relativePath) throws Exception {
                super(PropertyConfigMXBean.class, true);
                props_ = new Properties();
                relativePath_ = relativePath;
            }

            public String setProperty(String key, String value) throws IOException {
                String oldValue = null;
                if (value == null) {
                    oldValue = String.class.cast(props_.remove(key));
                } else {
                    oldValue = String.class.cast(props_.setProperty(key, value));
                }
                save();
                return oldValue;
            }

            public String getProperty(String key) {
                return props_.getProperty(key);
            }

            public Map<String, String> getProperties() {
                return (Map) props_;
            }

            private void load() throws IOException {
                InputStream is = new FileInputStream(resource_);
                try {
                    props_.load(is);
                } finally {
                    is.close();
                }
            }

            private void save() throws IOException {
                OutputStream os = new FileOutputStream(resource_);
                try {
                    props_.store(os, null);
                } finally {
                    os.close();
                }
            }

            public ObjectName preRegister(MBeanServer server, ObjectName name)
                    throws Exception {
                // MBean must be registered from an application thread
                // to have access to the application ClassLoader
                ClassLoader cl = Thread.currentThread().getContextClassLoader();
                URL resourceUrl = cl.getResource(relativePath_);
                resource_ = new File(resourceUrl.toURI());
                load();
                return name;
            }

            public void postRegister(Boolean registrationDone) {}
            public void preDeregister() throws Exception {}
            public void postDeregister() {}

            protected String getParameterName(MBeanOperationInfo op,
                                              MBeanParameterInfo param,
                                              int sequence) {
                return operationsParamNames_.get(op.getName())[sequence];
            }
        }

    The only reason we are still extending the StandardMBean class is to override the default names for our operation parameters. If this isn't a concern, then one could just write the following code:

        package blog.wls.jmx.appmbean;

        import java.io.File;
        import java.io.FileInputStream;
        import java.io.FileOutputStream;
        import java.io.IOException;
        import java.io.InputStream;
        import java.io.OutputStream;
        import java.net.URL;
        import java.util.Map;
        import java.util.Properties;

        import javax.management.MBeanRegistration;
        import javax.management.MBeanServer;
        import javax.management.ObjectName;

        public class PropertyConfig implements PropertyConfigMXBean, MBeanRegistration {

            private String relativePath_ = null;
            private Properties props_ = null;
            private File resource_ = null;

            public PropertyConfig(String relativePath) throws Exception {
                props_ = new Properties();
                relativePath_ = relativePath;
            }

            public String setProperty(String key, String value) throws IOException {
                String oldValue = null;
                if (value == null) {
                    oldValue = String.class.cast(props_.remove(key));
                } else {
                    oldValue = String.class.cast(props_.setProperty(key, value));
                }
                save();
                return oldValue;
            }

            public String getProperty(String key) {
                return props_.getProperty(key);
            }

            public Map<String, String> getProperties() {
                return (Map) props_;
            }

            private void load() throws IOException {
                InputStream is = new FileInputStream(resource_);
                try {
                    props_.load(is);
                } finally {
                    is.close();
                }
            }

            private void save() throws IOException {
                OutputStream os = new FileOutputStream(resource_);
                try {
                    props_.store(os, null);
                } finally {
                    os.close();
                }
            }

            public ObjectName preRegister(MBeanServer server, ObjectName name)
                    throws Exception {
                // MBean must be registered from an application thread
                // to have access to the application ClassLoader
                ClassLoader cl = Thread.currentThread().getContextClassLoader();
                URL resourceUrl = cl.getResource(relativePath_);
                resource_ = new File(resourceUrl.toURI());
                load();
                return name;
            }

            public void postRegister(Boolean registrationDone) {}
            public void preDeregister() throws Exception {}
            public void postDeregister() {}
        }

    Note: the above would also require changing the operation parameter names in the resource bundle classes. For instance:

        PropertyConfigMXBean.operation.setProperty.key

    would become:

        PropertyConfigMXBean.operation.setProperty.p0
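    As a reminder, Part I covers how this MBean gets registered when the application starts. A minimal sketch of such a registration is shown below; the listener class name and properties path are made up for illustration, and it uses the platform MBean server where Part I registers against WebLogic's runtime MBean server:

        package blog.wls.jmx.appmbean;

        import java.lang.management.ManagementFactory;
        import javax.management.MBeanServer;
        import javax.management.ObjectName;
        import javax.servlet.ServletContextEvent;
        import javax.servlet.ServletContextListener;

        // Hypothetical startup listener; Part I shows the actual registration code.
        public class AppLifecycleListener implements ServletContextListener {

            private ObjectName mbeanName_;

            public void contextInitialized(ServletContextEvent event) {
                try {
                    MBeanServer server = ManagementFactory.getPlatformMBeanServer();
                    mbeanName_ = new ObjectName(
                        "blog.wls.jmx.appmbean:type=PropertyConfig,name=myAppProperties");
                    // preRegister() runs on this application thread, so the MBean
                    // can resolve its properties file through the application
                    // ClassLoader. The path is hypothetical.
                    server.registerMBean(new PropertyConfig("myapp.properties"),
                                         mbeanName_);
                } catch (Exception e) {
                    throw new RuntimeException("MBean registration failed", e);
                }
            }

            public void contextDestroyed(ServletContextEvent event) {
                try {
                    ManagementFactory.getPlatformMBeanServer()
                                     .unregisterMBean(mbeanName_);
                } catch (Exception e) {
                    // best effort on shutdown
                }
            }
        }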
    Client based localization

    When accessing our MBean using JConsole started with the following command line:

        jconsole -J-Djava.class.path=$JAVA_HOME/lib/jconsole.jar:$JAVA_HOME/lib/tools.jar:$WL_HOME/server/lib/wljmxclient.jar \
            -J-Djmx.remote.protocol.provider.pkgs=weblogic.management.remote \
            -debug

    we see that our MBean descriptions are localized according to the WebLogic server's Locale (English in this case). Note: Consult Part I for information on how to use JConsole to browse/edit our MBean.

    Now if we specify the client's Locale as part of the JConsole command line, as follows:

        jconsole -J-Djava.class.path=$JAVA_HOME/lib/jconsole.jar:$JAVA_HOME/lib/tools.jar:$WL_HOME/server/lib/wljmxclient.jar \
            -J-Djmx.remote.protocol.provider.pkgs=weblogic.management.remote \
            -J-Dweblogic.management.remote.locale=fr-FR \
            -debug

    we see that our MBean descriptions are now localized according to the specified client's Locale (French in this case).

    We use the weblogic.management.remote.locale system property to specify the Locale that should be associated with the client's JMX connections. The value is composed of the client's language code and its country code, separated by the - character. The country code is not required and can be omitted. For instance:

        -Dweblogic.management.remote.locale=fr

    We can also specify the client's Locale using a programmatic client, as demonstrated below:

        package blog.wls.jmx.appmbean.client;

        import java.util.Hashtable;
        import java.util.Locale;
        import java.util.Set;

        import javax.management.MBeanInfo;
        import javax.management.MBeanServerConnection;
        import javax.management.ObjectName;
        import javax.management.remote.JMXConnector;
        import javax.management.remote.JMXConnectorFactory;
        import javax.management.remote.JMXServiceURL;

        public class JMXClient {

            public static void main(String[] args) throws Exception {
                JMXConnector jmxCon = null;
                try {
                    JMXServiceURL serviceUrl = new JMXServiceURL(
                        "service:jmx:iiop://127.0.0.1:7001/jndi/weblogic.management.mbeanservers.runtime");
                    System.out.println("Connecting to: " + serviceUrl);

                    // properties associated with the connection
                    Hashtable<String, Object> env = new Hashtable<String, Object>();
                    env.put(JMXConnectorFactory.PROTOCOL_PROVIDER_PACKAGES,
                            "weblogic.management.remote");

                    String[] credentials = new String[2];
                    credentials[0] = "weblogic";
                    credentials[1] = "weblogic";
                    env.put(JMXConnector.CREDENTIALS, credentials);

                    // specifies the client's Locale
                    env.put("weblogic.management.remote.locale", Locale.FRENCH);

                    jmxCon = JMXConnectorFactory.newJMXConnector(serviceUrl, env);
                    jmxCon.connect();
                    MBeanServerConnection con = jmxCon.getMBeanServerConnection();

                    Set<ObjectName> mbeans = con.queryNames(
                        new ObjectName(
                            "blog.wls.jmx.appmbean:name=myAppProperties,type=PropertyConfig,*"),
                        null);
                    for (ObjectName mbeanName : mbeans) {
                        System.out.println("\n\nMBEAN: " + mbeanName);
                        MBeanInfo minfo = con.getMBeanInfo(mbeanName);
                        System.out.println("MBean Description: " + minfo.getDescription());
                        System.out.println("\n");
                    }
                } finally {
                    // release the connection
                    if (jmxCon != null) jmxCon.close();
                }
            }
        }

    The above client code is part of the zip file associated with this blog, and can be run using the provided client.sh script. The resulting output is shown below:

        $ ./client.sh
        Connecting to: service:jmx:iiop://127.0.0.1:7001/jndi/weblogic.management.mbeanservers.runtime

        MBEAN: blog.wls.jmx.appmbean:type=PropertyConfig,name=myAppProperties
        MBean Description: Manage proprietes sauvegarde dans un fichier disque.
        $

    Miscellaneous

    Using the Description annotation to specify MBean descriptions

    Earlier we have seen how to name our MBean description resource keys so that WebLogic 10.3.3.0 automatically uses them to localize our MBean. In some cases we might want to explicitly specify the resource key and resource bundle, for instance when operations are overloaded and the operation name alone is no longer sufficient to uniquely identify a single operation. In this case we can use the Description annotation provided by WebLogic, as follows:

        import weblogic.management.utils.Description;

        @Description(resourceKey="myapp.resources.TestMXBean.description",
                     resourceBundleBaseName="myapp.resources.MBeanResources")
        public interface TestMXBean {

            @Description(resourceKey="myapp.resources.TestMXBean.threshold.description",
                         resourceBundleBaseName="myapp.resources.MBeanResources")
            public int getthreshold();

            @Description(resourceKey="myapp.resources.TestMXBean.reset.description",
                         resourceBundleBaseName="myapp.resources.MBeanResources")
            public int reset(
                @Description(resourceKey="myapp.resources.TestMXBean.reset.id.description",
                             resourceBundleBaseName="myapp.resources.MBeanResources",
                             displayNameKey="myapp.resources.TestMXBean.reset.id.displayName.description")
                int id);
        }

    The Description annotation should be applied to the MBean interface. It can be used to specify MBean, MBean attribute, MBean operation, and MBean operation parameter descriptions, as demonstrated above.

    Retrieving the Locale associated with a JMX operation from the MBean code

    There are several cases where it is necessary to retrieve the Locale associated with a JMX call from the MBean implementation. For instance, this can be useful when localizing exception messages. This can be done as follows:

        import weblogic.management.mbeanservers.JMXContextUtil;
        ......
        // some MBean method implementation
        public String setProperty(String key, String value) throws IOException {
            Locale callersLocale = JMXContextUtil.getLocale();
            // use callersLocale to localize Exception messages or
            // potentially some return values such as a Date
            ....
        }
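    To make the exception-message case concrete, here is a small sketch of what such localization could look like. The ErrorMessages bundle and its property.not.found key are hypothetical, invented for illustration; JMXContextUtil.getLocale() is the only WebLogic-specific call:

        package blog.wls.jmx.appmbean;

        import java.text.MessageFormat;
        import java.util.Locale;
        import java.util.Properties;
        import java.util.ResourceBundle;

        import weblogic.management.mbeanservers.JMXContextUtil;

        public class LocalizedErrors {

            // Looks up a property and raises a localized error when it is missing.
            public static String getRequiredProperty(Properties props, String key) {
                String value = props.getProperty(key);
                if (value == null) {
                    // Locale of the JMX client that invoked the current operation
                    Locale callersLocale = JMXContextUtil.getLocale();
                    // Hypothetical bundle, e.g. containing:
                    //   property.not.found = No property named ''{0}'' exists
                    ResourceBundle bundle = ResourceBundle.getBundle(
                        "blog.wls.jmx.appmbean.ErrorMessages", callersLocale);
                    throw new IllegalArgumentException(
                        MessageFormat.format(bundle.getString("property.not.found"), key));
                }
                return value;
            }
        }

    A French client would then see the message from an ErrorMessages_fr bundle, while every other client falls back to the default bundle.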
    Conclusion

    With this last part we conclude our three-part series on how to write MBeans to manage J2EE applications. We are far from having exhausted this particular topic, but we have come a long way, and are now able to take advantage of the latest functionality provided by the WebLogic application server to write user-friendly MBeans.

    Read the article

  • Vista Power Management GPO

    - by Matt
    Hi, I've created a loopback GPO that has several settings (both computer and user), including a Custom User Interface (an Access 2007 application) and Power Management (puts the computer to sleep after being idle for 2 minutes). I'm also filtering so that this policy applies only to "Users", not to "Admins". The problem I'm having is that when the "Users" log in, the Power Management settings don’t work, but they do for "Admins". For testing I'm allowing the "Users" to launch Task Manager and use the Run line, so I can run Explorer and look at Power Management, and it shows the settings from my GPO. So I created a test OU with copies of the aforementioned GPO but removed the Custom User Interface, and found the Power Management settings do work for both the "Users" and "Admins". When I add the Custom UI, the Power Management settings break for the "Users" but continue to work for "Admins". Do the Power Management options require the User Interface to be "Explorer.exe"? Is this a bug, or am I doing this the wrong way? BTW, the tablets are running Vista SP2. Any insight or advice would be greatly appreciated. Thanks, Matt

    Read the article

  • Project Management: developer being the project manager's manager

    - by marabutt
    I am in the planning stages of a project and am looking to hire a project manager. I want to be doing some coding and keeping an eye on all parts of the project, but feel a project manager will get better results than I could. I can either manage the project myself without coding and hire another coder, or code myself and hire a project manager. I am worried that the project manager will feel impeded by having the project owner as part of the development team. If I run the project, the team might fall apart, causing the project to fail. To stick within budget, I have to be involved in one capacity or another. Does anyone have experience with this situation, or suggestions?

    Read the article

  • Time management and self improvement

    - by Filip
    Hi, I hope I can open a discussion on this topic, as this is not a specific problem. It's a topic I hope to get some ideas on: how do people in situations similar to mine manage their time? OK, I've been the single developer on a software project for the last 6-8 months. The project I'm working on uses several technologies, mainly .NET stuff: WPF, WF, NHibernate, WCF, MySql and other third-party SDKs relevant to the nature of the project. My experience and knowledge vary; for example, I have a lot of experience in WPF but much less in WCF. I work full time on the project and I'm curious how other programmers who need to multi-task in many areas manage their time. I'm a very applied type of person and prefer to code instead of doing research. I feel that doing research "might" slow down the progress of the project, while I recognize that research and learning more in areas where I'm not so strong will ultimately make me more productive. How would you split up your daily time between productive coding and time to experiment, read blogs, go through tutorials, etc.? I would say that I'm coding about 90%+ of my day and devoting some, but very little, time to research and acquiring new knowledge.

    Read the article

  • Interacting with clients using project management systems

    - by Keyo
    I work in web development, which involves a lot of smaller custom projects rather than one large product. Requirements and specifications always come from outside the company. We've set up a ticket-tracking system (Active Collab, which is rubbish compared to Redmine, btw) and given access to clients so they can submit issues, the idea being that less time is taken up with long phone conversations and emails. I think it can work really well if done right; however, I'm not so sure it's always a good thing. Feature requests have gone up a lot on some projects. The system also needs to be friendly to non-developers while having the many features that developers use. Developers' tickets do not always map 1-to-1 with the tickets clients will create, so the requirements and broader tickets need to be separated from the more specific developer (specification) related tickets. Perhaps we could use two systems: one for clients to submit their requirements or describe a bug, and one for developers to create tickets like "implement method x in class y". Maybe this can be achieved by structuring tickets into more appropriate categories or by creating sub-tickets under a feature-request ticket. I've briefly looked into Pivotal Tracker, and it has a fundamentally different workflow. I would like to know how others are communicating with clients and keeping the technical workflow separate from the non-technical workflow. What tools do you use, and how do you use them?

    Read the article

  • Time management and self improvement

    - by Filip
    I hope I can open a discussion on this topic, as this is not a specific problem. It's a topic I hope to get some ideas on: how do people in situations similar to mine manage their time? OK, I've been the single developer on a software project for the last 6-8 months. The project I'm working on uses several technologies, mainly .NET stuff: WPF, WF, NHibernate, WCF, MySql and other third-party SDKs relevant to the nature of the project. My experience and knowledge vary; for example, I have a lot of experience in WPF but much less in WCF. I work full time on the project and I'm curious how other programmers who need to multi-task in many areas manage their time. I'm a very applied type of person and prefer to code instead of doing research. I feel that doing research "might" slow down the progress of the project, while I recognize that research and learning more in areas where I'm not so strong will ultimately make me more productive. How would you split up your daily time between productive coding and time to experiment, read blogs, go through tutorials, etc.? I would say that I'm coding about 90%+ of my day and devoting some, but very little, time to research and acquiring new knowledge. Thanks for your replies. I think I will adopt a gradual transition to Dominic's block parts. I kinda knew that coding was taking up way too much of my time, but it feels good having a first version of the project completed and ready. With a few months of focused hard work behind me, I hope to get more time to experiment and expand my knowledge. Now I only hope my boss will cut me some slack and stop pressuring me for features...

    Read the article

  • Grid Infrastructure Management Repository (GIMR) database now mandatory in Oracle GI 12.1.0.2

    - by Mike Dietrich
    During the installation of Oracle Grid Infrastructure 12.1.0.1 you had the option to choose YES/NO to install the Grid Infrastructure Management Repository (GIMR) database, MGMTDB. With Oracle Grid Infrastructure 12.1.0.2 this choice has become obsolete and the corresponding installer screen no longer appears: the GIMR database has become mandatory.

    What gets stored in the GIMR? See the documentation, and the changes in Oracle Clusterware 12.1.0.2:

    Automatic Installation of Grid Infrastructure Management Repository
    The Grid Infrastructure Management Repository is automatically installed with Oracle Grid Infrastructure 12c release 1 (12.1.0.2). The Grid Infrastructure Management Repository enables such features as Cluster Health Monitor, Oracle Database QoS Management, and Rapid Home Provisioning, and provides a historical metric repository that simplifies viewing of past performance and diagnosis of issues. This capability is fully integrated into Oracle Enterprise Manager Cloud Control for seamless management.

    Furthermore, what the doc doesn't say explicitly:
    - The -MGMTDB has now become a single-tenant deployment: a CDB with one PDB.
    - This will allow the use of a Utility Cluster that can hold the CDB for a collection of GIMR PDBs.
    - If you already had an Oracle 12.1.0.1 GIMR, this database will be destroyed and recreated.
    - Preserving the CHM/OS data can be achieved with OCLUMON by dumping it out into node views.
    - The data files associated with it will be created within the same disk group as OCR and VOTING. In a future release there may be an option offered to put it into a separate disk group.

    Some important MOS Notes:
    - MOS Note 1568402.1, FAQ: 12c Grid Infrastructure Management Repository, states there's no supported procedure to enable the Management Database once the GI stack is configured
    - MOS Note 1589394.1, How to Move GI Management Repository to Different Shared Storage (shows how to delete and recreate the MGMTDB)
    - MOS Note 1631336.1, Cannot delete Management Database (MGMTDB) in 12.1

    -Mike

    Read the article

  • Project Management - Asana / activeCollab / basecamp / alternative / none

    - by rickyduck
    I don't know whether this should be on Programmers. I've been looking at the above three apps over the past few weeks, just for myself, and I'm in two minds. All three look good and are easy to use, and I came to this conclusion:
    - Asana is the easiest to use
    - ActiveCollab is the most feature-rich and has the easiest flow
    - Basecamp has the best UX / design
    But I didn't really find my workflow was any quicker or more efficient; in fact it was a bit slower and less organized. Is there a realistic place for them in a workflow? Should programmers use them for themselves, or only when a project manager can take control of them?

    Read the article

  • Oracle R12 Inventory Management New Features Wrap-Up

    - by [email protected]
    Webcast: Oracle R12 Inventory Management New Features, held March 31st, 2010. Oracle Inventory Management is an integrated part of Oracle SCM (Supply Chain Management). In this session you will get a comprehensive look at the changed features in Oracle R12 Inventory Management. The session highlights the new features added and also explores their functionality. This webinar recording will introduce you to the built-in features of Oracle R12 Inventory Management, such as:
    - OPM Inventory Convergence
    - Multi-mode Inventory Management
    - Material Traceability
    - Fulfillment Optimization
    - Extended Best Practices
    View the Oracle R12 Inventory Management New Features webinar online, click here: http://www.iwarelogic.com/oracle-r12-inventory-management-new-features.htm

    Read the article
