Search Results

Search found 22998 results on 920 pages for 'supervised users'.


  • What's New in Oracle's EPM System?

    - by jmorourke
    Oracle’s EPM System R11.1.2.2 is now generally available to customers and partners on the download center. Although the release number doesn’t sound significant, this is a major release of Oracle’s Hyperion EPM Suite, with new modules as well as significant enhancements across the suite. This release was announced back on April 4th as part of Oracle’s Business Analytics Strategy launch, so analytics is a key aspect of the release. But the three biggest pieces of news in this release are Oracle Hyperion Planning support for the Exalytics In-Memory Machine, the new Project Financial Planning application, and the new Account Reconciliation Manager module.

    The Oracle Exalytics In-Memory Machine was announced back in October 2011 at Oracle OpenWorld. It’s the latest installment in Oracle’s line of engineered systems, which combine Oracle Sun hardware with Oracle database and application technologies in solutions designed to provide high scalability and performance for specific tasks. Exalytics is the first engineered system designed specifically for high-performance analytics. Running in-memory versions of Oracle Essbase, as well as the Oracle TimesTen database and Oracle BI tools, Exalytics provides speed-of-thought response times for complex analytic processes with advanced visualizations. Early adopter customers have achieved 5X to 100X faster interactivity and 6X to 10X faster planning cycles. Hyperion Planning running on Oracle Exalytics will support enterprise-wide planning, budgeting, and forecasting with more detailed data, with hundreds to thousands of users across an organization getting speed-of-thought performance.

    The new Hyperion Project Financial Planning application delivered with EPM 11.1.2.2 is also great news for Oracle customers. This application follows on the heels of other special-purpose planning applications that Oracle has delivered for workforce and capital asset planning. It allows project managers to identify project-related expenses and revenues, plan and propose new projects, and track results over time. Finance managers can evaluate and compare different projects, manage the funding process, and monitor and report the actual financial results and impacts of projects and project portfolios. The new application is applicable to capital projects, contract projects, and indirect projects like IT and HR projects across all industries. It is a great complement to existing project management applications, and helps bridge the gap between those applications and the financial planning and budgeting process.

    Account reconciliation has to be one of the biggest bottlenecks and risks in the financial close and reporting process, and many organizations rely on spreadsheets and manual processes to perform this critical task. To help address the problem, Oracle developed an Account Reconciliation Manager module that is delivered as part of Oracle Hyperion Financial Close Management. This module helps automate and streamline account reconciliations and reduces the chances of errors, omissions, and fraud. But unlike standalone account reconciliation packages, it’s integrated with the rest of the Oracle Hyperion Financial Close suite, and can integrate balances from any source system.
    This can help alleviate a major bottleneck in the financial close process, increase accuracy, and reduce risk, and it complements existing investments in Hyperion Financial Management as well as Oracle and non-Oracle transaction processing systems.

    Other enhancements in this release include an enhanced Web 2.0 interface for Hyperion Planning and Hyperion Financial Management (HFM), configurable dimensionality in HFM, a new Predictive Planning feature in Hyperion Planning, a new Detailed Profitability feature in Hyperion Profitability and Cost Management, a new Smart View interface for Hyperion Strategic Finance, and integration of the Hyperion applications with JD Edwards Financials.

    For more information about Oracle EPM System R11.1.2.2, check out the links below:

    Press Release: http://www.oracle.com/us/corporate/press/1575775
    Product Information on Oracle.com: http://www.oracle.com/us/solutions/business-analytics/overview/index.html
    Product Information on OTN: http://www.oracle.com/technetwork/middleware/epm/downloads/index.html
    Webcast Replay: http://www.oracle.com/us/go/index.html?Src=7317510&Act=65&pcode=WWMK11054701MPP046

    Please contact me if you have any questions or need additional information – [email protected]

    Read the article

  • Orchestrating the Virtual Enterprise, Part II

    - by Kathryn Perry
    A guest post by Jon Chorley, Oracle's CSO & Vice President, SCM Product Strategy

    Almost everyone has ordered from Amazon.com at one time or another. Our orders are as likely to be fulfilled by third parties as they are by Amazon itself. To deliver the order promptly and efficiently, Amazon has to send it to the right fulfillment location and know the availability in that location. It needs to be able to track the status of the fulfillment and deal with exceptions. As a virtual enterprise, Amazon's operations, involving thousands of trading partners, require a very different approach to fulfillment than the traditional "take an order and ship it from your own warehouse" model. Amazon had no choice but to develop a complex, expensive, custom solution to tackle this problem, because at the time no product solution was available. Now, other companies that want to follow similar models have a better off-the-shelf choice -- Oracle Distributed Order Orchestration (DOO).

    Consider how another of our customers is using our distributed orchestration solution. This major airplane manufacturer has a highly complex business and interacts regularly with the U.S. Government and major airlines. It sits in the middle of an intricate supply chain and needed to improve visibility across its many different entities. Oracle Fusion DOO gives the company an orchestration mechanism so it can improve quality, speed, flexibility, and consistency without requiring an organ transplant of its highly complex legacy systems.

    Many retailers face the challenge of dealing with brick-and-mortar, Web, and reseller channels. These all need to be knitted together into a virtual enterprise experience that is consistent for their customers. When a large U.K. grocer with a strong brick-and-mortar retail operation added an online business, it turned to Oracle Fusion DOO to bring these entities together.

    Disturbing the Peace with Acquisitions

    Quite often a company's ERP system is disrupted when it acquires a new company. An acquisition can inject a new set of processes and systems -- or even introduce an entirely new business, like Sun's hardware did at Oracle. This challenge has been a driver for some of our DOO customers. A large power management company is using Oracle Fusion DOO to provide the flexibility to rapidly integrate additional products and services into its central fulfillment operation.

    The Flip Side of Fulfillment

    Meanwhile, we haven't ignored similar challenges on the supply side of the equation. Specifically, how do you manage complex supply in a flexible way when there are multiple trading parties involved? How do you manage the supply to suppliers? How do you manage critical components that need to merge in a tier-two or tier-three supply chain? By investing in supply orchestration solutions for the virtual enterprise, we plan to give users better visibility into their network of suppliers to help them drive down costs.

    We also think this technology and full orchestration process can be applied to the financial side of organizations. An example is transactions that flow through complex internal structures to minimize tax exposure. We can help companies manage those transactions effectively by thinking about the internal organization as a virtual enterprise and bringing the same solution set to this internal challenge.

    The Clear Front Runner

    No other company is investing in solving the virtual enterprise supply chain issues like Oracle is. Oracle is in a unique position to become the gold standard in this market space.
    We have the infrastructure of Oracle technology. We already have an Oracle Fusion DOO application that embraces the best of what's required in this area. And we're absolutely committed to extending our Fusion solution to other use cases and delivering even more business value.

    Jon Chorley
    Chief Sustainability Officer & Vice President, SCM Product Strategy
    Oracle Corporation

    Read the article

  • Increasing touch surface (#wp7dev)

    - by Laurent Bugnion
    When you design for Windows Phone 7 (or for any touch device, for that matter, and especially small screens), you need to be very careful to give enough surface to your users’ fingers. It is easy to miss a touch on such small screens, and that can be horrifyingly frustrating. This is especially true when people are on the move, trying to hit the control while walking and holding their device in one hand, or when the device is mounted in a car and vibrating with the engine.

    In my experience, a touch surface should be a minimum of 60x60 pixels to be easy to activate on the Windows Phone 7 screen (which is, as we know, 800 pixels x 480 pixels). Ideally, I try to make my touch surfaces at least 80x80 pixels. This causes a few design challenges, of course.

    Using transparent backgrounds

    One thing helps us tremendously, however: some surfaces can be made transparent and yet react to touch. The secret is the following: if a panel has a null Background (i.e. the Background is not set at all), then the empty surface does not react to touch. If, however, the Background is set to the Transparent color (or any color where the alpha channel is set to 0), then it will react to touch. Setting a transparent background is easy. For example:

        <Grid Background="#00000000"> </Grid>

    or

        <Grid Background="Transparent"> </Grid>

    In C#:

        var grid = new Grid
        {
            Background = new SolidColorBrush(Colors.Transparent)
        };

    Using negative margins

    Having a transparent background that reacts to touch is a good start, but in addition, you must make sure that the surface is big enough for clumsy fingers. One way to achieve that is to increase the transparent, touch-reactive surface and reposition the element using negative margins.

    For example, consider the following UI, where I changed the transparent background of the HyperlinkButton to red in order to visualize the touch surface. In this figure, the Settings HyperlinkButton is 105 pixels x 31 pixels. This is wide enough, but really small in height and easy to miss. To improve this, we can use negative margins, for instance:

        <HyperlinkButton Content="Settings"
                         HorizontalAlignment="Right"
                         VerticalAlignment="Bottom"
                         Height="60"
                         Margin="0,0,0,-15" />

    Notice the use of a negative bottom margin to bring the HyperlinkButton back to the bottom of the main Grid’s first row, where it belongs. The result: the touch surface is much bigger than before. This makes the HyperlinkButton easier to reach and improves the user experience. With the background set back to normal, the UI looks exactly the same as it should. A combined sketch of both techniques appears at the end of this post.

    In summary:

    Remember to maximize the touch surface for your controls. Plan your design accordingly, reserving enough room around each control to allow its hit surface to be expanded as shown in this article.

    Do not cram too many controls into one page. If REALLY needed, use an additional page (or even better: use a Pivot control with multiple pivot items) for the controls that don’t fit on the first one. This should ensure a smoother user experience and improved touch behavior.

    Happy coding!
    Laurent

    Laurent Bugnion (GalaSoft)
    Subscribe | Twitter | Facebook | Flickr | LinkedIn
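
    Putting the two techniques together, here is a minimal sketch (names and sizes are illustrative, assuming a layout like the one above): the Grid’s Transparent background keeps the empty area around the link touch-reactive, while the extra Height and the negative bottom Margin enlarge and reposition the link itself.

        <Grid Background="Transparent">
            <!-- Enlarged hit surface: 60 pixels high instead of 31 -->
            <HyperlinkButton Content="Settings"
                             HorizontalAlignment="Right"
                             VerticalAlignment="Bottom"
                             Height="60"
                             Margin="0,0,0,-15" />
        </Grid>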

    Read the article

  • Five geeky things you must do with your Android Smartphone

    - by Gopinath
    Android is the Windows of the next generation. It’s open, free, widely adopted, and smart enough to outsmart Apple’s iOS. Some call it a stolen product and a cheap imitation of iOS, but Steve Jobs himself once quoted the saying that good artists copy and great artists steal. Alright, this post is not about Android vs. iOS, or whether it is really stolen or not. Android is a great OS for mobile devices, and it lets you do amazing things with your mobile. In this post I want to write about the geeky things we can do with an Android smartphone.

    Control your computer using your mobile

    Assume that it is a lazy weekend and you are on a couch, watching movies on a laptop that is a meter away. Now you want to adjust the volume or skip a scene/song. How do you control your laptop without moving off the couch? Just install the free Universal Remote app on your smartphone and start controlling your computer with your phone. Universal Remote controls computers over Wifi or Bluetooth networks, with dedicated remote controls for various media players and applications like YouTube, VLC & Spotify. The application is very easy to use and works amazingly well. A few of the remote controls provided in the app are: Mouse, Keyboard, Media Controls, Power, Start, Windows Media Player, VLC Player, and YouTube. There is also a paid version of this app with additional remotes, but for most users the free version is good enough.

    Stream YouTube videos playing on your mobile to your computer

    You can stream YouTube videos playing on your mobile to a computer/smart TV. This is something similar to Apple’s most popular AirPlay feature, but it works only with YouTube videos. To start streaming videos, install Google’s YouTube Remote on your smartphone, open youtube.com/leanback on your computer, and pair the mobile with the computer. Once the pairing is done, videos played in the YouTube Remote app will be streamed to your computer.

    Access your mobile using any web browser – send/receive SMS, view photos/call logs, etc.

    Want to control your mobile phone from a computer? Install the AirDroid app on your phone and start controlling your phone from a computer browser – send and receive messages, view call logs, play music, upload/download files, edit contacts, and much more. At times it’s a lot of fun to access a mobile from big-screen devices like laptops.

    Launch a webpage on your mobile browser using your computer

    With Google Chrome to Phone installed on your computer and mobile, you can send links and other information from the Chrome browser to your Android device. With a click in the Chrome browser, the current webpage will be automatically launched on the Android device. This is very handy when you want to send links, send driving directions to your mobile using Google Maps, or launch the phone dialer with a number selected on a webpage.

    Install apps on your mobile using your computer

    To install apps on your smartphone, you really don’t need to touch it. Open any web browser, sign in to Google Play with the Google ID that is associated with your smartphone, and start installing apps onto your phone right from the browser. As you browse apps on the Google Play store, you will find an Install button; all you need to do is click Install. Google Play automatically installs the app on your mobile within a few seconds.

    Read the article

  • Monitor System Resources from the Windows 7 Taskbar

    - by Asian Angel
    The problem with most system monitoring apps is that they get covered up by all of your open windows, but you can solve that problem by adding monitoring apps to the Taskbar.

    Setting Up & Using SuperbarMonitor

    All of the individual monitors and the .dll files necessary to run them come in a single zip file for your convenience. Simply unzip the contents, add them to an appropriate "Program Files" folder, and create shortcuts for the monitors that you would like to use on your system. For our example we created shortcuts for all five monitors and set the shortcuts up in their own Start Menu folder.

    You can see what the five monitors (Battery, CPU, Disk, Memory, & Volume) look like when running… they are visual in appearance, without text to clutter up the looks. The monitors use colors (red, green, & yellow) to indicate the amount of resources being used for a particular category. Note: Our system is desktop-based, but the Battery Monitor is shown for the purposes of demonstration… thus the red color seen here.

    Hovering the mouse over the Battery, CPU, Disk, & Memory Monitors on our system displayed a small blank thumbnail. Note: The Battery Monitor may or may not display more when used on your laptop. Going one step further and hovering the mouse over the thumbnails displayed a small blank window. There really is nothing that you will need to worry about outside of watching the color of each individual monitor. Nice and simple!

    The one monitor with extra features on the thumbnail was the Volume Monitor. You can turn the volume down, up, on, or off from here… definitely useful if you have been wanting to hide the Volume icon in the System Tray.

    You can also pin the monitors to your Taskbar if desired. Keep in mind that if you do close any of the monitors they will temporarily disappear from the Taskbar until the next time they are started. Note: If you want the monitors to start with your system each time, you will need to add the appropriate shortcuts to the Startup sub-menu in your Start Menu.

    Conclusion

    If you have been wanting a nice visual way to monitor your system’s resources, then SuperbarMonitor is definitely worth trying out.

    Links

    Download SuperbarMonitor

    Read the article

  • Apps UX Unveils New Face of Fusion at OpenWorld 2012

    - by Kathy.Miedema
    By Kathy Miedema, Oracle Applications User Experience

    The Oracle Applications User Experience (UX) team is getting ready to unveil the new face of Oracle Fusion Applications at Oracle OpenWorld 2012 in San Francisco next week.

    Photos by Martin Taylor, Oracle Applications User Experience
    Jeremy Ashley, Vice President of Oracle Applications User Experience, shows the new face of Fusion Applications to a group of trainers at Oracle’s headquarters in Redwood Shores, Calif.

    Our team spent the past six months working on this project, which embraces simplicity with a modern, productive user experience that aims to help our applications customers rapidly scale deployment of essential self-service tasks and speed adoption by users who need quick access to do quick-entry tasks.

    We spent the week before OpenWorld at Oracle headquarters in Redwood Shores, conducting training sessions with Fusion UX Advocates (FXA), Oracle UX Sales Ambassadors (SAMBA), and members of the Oracle Usability Advisory Board (OUAB). We showed the new face of Fusion to customers, partners, ACE Directors, and people from our own sales organization. Next week during OpenWorld, they will be showing demos alongside our team members. To find them, look for the Usable Apps t-shirt.

    You can also get a look at the new face of Fusion during OpenWorld at the following sessions and demo pods:

    GEN9433 - General Session: Oracle Fusion Applications—Overview, Strategy, and Roadmap
    Presenter: Chris Leone, Senior Vice President, Oracle
    Monday, Oct. 1, 10:45 a.m. – 11:45 a.m. in Moscone West 2002/2004, and Wednesday, Oct. 3, 10:15 a.m. – 11:15 a.m. in Moscone West 2002/2004

    CON9407 - Oracle Fusion Customer Relationship Management: Overview/Strategy/Customer Experiences/Roadmap
    Presenter: Anthony Lye, Senior Vice President, Oracle
    Monday, Oct. 1, 3:15 p.m. – 4:15 p.m. in Moscone West 2008

    CON9438 - Oracle Fusion Applications: Transforming Insight into Action
    Presenters: Jeremy Ashley, Vice President, Applications User Experience, Oracle; Katie Candland, Director, Applications User Experience, Oracle; Basheer Khan, founder and CEO of Innowave Technology, an Oracle ACE Director for both Fusion Middleware and Applications, and a Fusion UX Advocate
    Tuesday, Oct. 2, 10:15 a.m. – 11:15 a.m. in Moscone West 2007

    CON9467 - Oracle’s Roadmap to a Simple, Modern User Experience
    Presenter: Jeremy Ashley, Vice President, Applications User Experience, Oracle
    Wednesday, Oct. 3, 3:30 p.m. – 4:30 p.m. in Moscone West 3002/3004

    On the demogrounds: Come to the Apps UX pods for a look at enterprise applications on mobile devices such as smartphones and the iPad, and stay for a demo of the new face of Oracle Fusion Applications. Our demo pods will also feature some of the cutting-edge tools in Oracle’s arsenal of usability evaluation methods.

    The Exhibition Hall at Oracle OpenWorld 2012 will be open Monday through Wednesday, Oct. 1-3. The demogrounds for Oracle Applications are located on the lower level of Moscone West in San Francisco. Hours for the Exhibition Hall are:

    · Monday, 10 a.m. to 6 p.m.
    · Tuesday, 9:45 a.m. to 6 p.m.
    · Wednesday, 9:45 a.m. to 4 p.m.

    Read the article

  • UPK Hands-on Labs at OHUG

    - by Karen Rihs
    Going to OHUG, June 18-22? Be sure to attend one or more UPK hands-on labs! Choose from Basic, Advanced, What's New, and Prebuilt Content!

    Oracle User Productivity Kit 11.1 Workshop – Basic
    Stephen Armbruster, Oracle Corporation
    June 19, 2012, 11:00 a.m. – 12:00 p.m.
    June 20, 2012, 4:30 – 5:30 p.m.

    The User Productivity Kit (UPK) is a comprehensive, cost-effective, customizable solution that helps your organization quickly create the critical documentation, training, and support materials needed to drive project team and user productivity throughout the lifecycle of your software. The User Productivity Kit provides system process documentation, user acceptance test scripts, comprehensive instructor-led training materials, web-based training materials, role-based performance support, and complete documentation. Also provided is the UPK Developer, which serves as a single-source development and customization tool to enable rapid content creation and customization.

    The User Productivity Kit delivers:

    - Business process documentation for fit-gap analysis, providing time and cost savings that jump-start your implementation or upgrade
    - User acceptance test scripts to help test applications prior to go-live
    - State-of-the-art instructional design tools to rapidly build and tailor documentation, instructor-led training materials, and web-based training to fit organizational needs
    - Live-application performance support with transactional and procedural information to maximize user efficiency

    By registering for this hands-on UPK workshop, participants will use UPK to build an application job aid and simulation that can be used as performance support for the application. But hurry, space is limited!

    Oracle User Productivity Kit 11.1 Workshop – Advanced
    Stephen Armbruster, Oracle Corporation
    June 20, 2012, 1:30 – 2:30 p.m.

    This special workshop is for those already familiar with UPK and will cover advanced concepts. In this workshop, you will gain an in-depth knowledge of working with the UPK Developer. Following this workshop, you will be able to:

    - Create publishing categories
    - Add a logo to a publishing project
    - Publish using the newly created category
    - Configure your own library view
    - Manage topic history in a multi-user environment

    Oracle User Productivity Kit 11.1 Workshop – What’s NEW!
    Stephen Armbruster, Oracle Corporation
    June 19, 2012, 1:30 – 2:30 p.m.
    June 21, 2012, 1:00 – 2:00 p.m.

    This special workshop is for those already familiar with UPK and will focus on the new features included in the latest version, 11.1. In this workshop, you will review most of the new features included in the UPK Developer.

    Oracle User Productivity Kit 11.1 Workshop – Prebuilt Content
    Stephen Armbruster, Oracle Corporation
    June 19, 2012, 4:30 – 5:30 p.m.
    June 21, 2012, 2:15 – 3:15 p.m.

    This special workshop is for those already familiar with UPK and will focus on the latest version, 11.1. At the end of this workshop, you will be able to demonstrate how to:

    - Import prebuilt content
    - Modify content frames
    - Add a decision frame
    - Translate a topic into Spanish

    Stephen Armbruster is a principal sales consultant, specializing in HCM and UPK applications, and has been with Oracle for the past twelve years. In addition to his current role, he serves as an ambassador for the Fusion User Experience (UX) team and is tasked with evangelizing the UX for end users across all Oracle brands (Fusion, PSFT, JDE, and EBS). He is also a trusted advisor to Oracle’s Product Management teams related to Learning Management Systems (LMS).
Prior to joining Oracle, he was an instructor as well as an instructional technologist working in the medical diagnostics, high tech, and information management industries. As an expert in both LMS and UPK, he regularly speaks at Oracle conferences including Oracle OpenWorld and OHUG on topics that span using Oracle solutions to accomplish employee training, certification, and user adoption. His presentations are both entertaining and engaging.

    Read the article

  • Oracle OpenWorld Update: Oracle GoldenGate Customer Panels

    - by Doug Reid
    We are two weeks out from the start of Oracle OpenWorld 2012. The Data Integration team has a solid line-up of product and customer sessions for you to attend this year, plus five hands-on labs and numerous demonstration pods in Moscone South.

    On Monday we kick the track off with Brad Adelberg’s “Future Strategy, Direction and Roadmap for Oracle’s Data Integration Platform” at 10:45 a.m. in Moscone West 3005. Over the rest of the week we have a number of deep-dive sessions that build out the themes Brad discusses in his keynote, but the two that I would like to highlight today are our Oracle GoldenGate customer panels.

    The first customer panel is on Zero Downtime Operations, on Monday at 1:45 p.m. in Moscone West 3005. The theme of this session is how to reduce downtime for critical must-succeed systems. Here’s a rundown of the session:

    Bank of America, TALX, and St. Jude Medical all have user communities that expect systems to be available around the clock. In this customer panel session, Bank of America discusses how it will be leveraging Oracle GoldenGate. St. Jude Medical shares how it is using Oracle GoldenGate to achieve a zero-downtime migration for a 5 TB Oracle online transaction processing (OLTP) 24/7 mission-critical database. TALX discusses how Equifax Workforce Information Services used Oracle GoldenGate to move from processing online transactions in a single site to processing concurrently from two geographically disparate data centers, providing a highly available solution with significant burst capacity.

    On Tuesday at 11:45 a.m. in Moscone West 3005 we switch gears and host a customer panel on Operational Reporting. The theme of this customer panel is all around reporting and how Oracle GoldenGate raises the bar on reporting by enabling real-time access to real-time data. Here’s a rundown of the session:

    Turk Telekom and Comcast are half a world away from each other, but these two powerhouse companies have both drastically improved performance and access to real-time data by using Oracle GoldenGate. During this panel discussion, Turk Telekom will explain its evaluation and implementation of Oracle GoldenGate, how the business has experienced significant improvements in the core database and reporting platform, and how it plans to expand its usage into its SOA architecture and its architecture based on Oracle’s Siebel platform. Comcast will explain its implementation of Oracle GoldenGate and how it moves data in real time from its mission-critical HP NonStop database to a Teradata data warehouse.

    Join us at our sessions to learn what other customers are doing with our products, or stop by our demo pods in Moscone South and meet the product management and development teams.

    Read the article

  • Oracle Enterprise Manager Cloud Control 12c Release 2 (12.1.0.2) Now Available!

    - by Javier Puerta
    Oracle Enterprise Manager Cloud Control 12c Release 2 (12.1.0.2) is now available on OTN on ALL platforms. This is the first major release since the launch of Enterprise Manager 12c in October of 2011, and the first Enterprise Manager release ever to be available on all platforms simultaneously. This is primarily a stability release, incorporating fixes for many of the issues and much of the feedback reported by early adopters. In addition, this release contains many new features and enhancements in areas across the board.

    New Capabilities and Features

    Enhanced management capabilities for enterprise private clouds: Introduces new capabilities to allow customers to build and manage a Java Platform-as-a-Service (PaaS) cloud based on Oracle WebLogic Server. The new capabilities include guided setup of the PaaS cloud, self-service provisioning, automatic scale-out, and metering and chargeback.

    Enhanced lifecycle management capabilities for Oracle WebLogic Server environments: Combining in-context multiple-domain patching and configuration file synchronizations.

    Integrated hardware-software management for Oracle Exalogic Elastic Cloud through features such as rack schematics visualization and integrated monitoring of all hardware and software components.

    The latest management capabilities for business-critical applications include:

    - Business Application Management: A new Business Application (BA) target type and dashboard with flexible definitions provides a logical view of an application’s business transactions, end-user experiences, and the cloud infrastructure the monitored application is running on.
    - Enhanced User Experience Reporting: Oracle Real User Experience Insight has been enhanced to provide reporting capabilities on client-side issues for applications running in the cloud, and has been more tightly coupled with Oracle Business Transaction Management to help ensure that real-time user experience and transaction tracing data is provided to users in context.

    Several key improvements address ease of administration, reporting, and extensibility for massively scalable cloud environments, including dynamic groups, self-updateable monitoring templates, bulk operations against many events, etc.

    New and Revised Plug-Ins

    Several plug-ins have been updated as part of this release, resulting in either new versions or revisions. Revised plug-ins contain only bug fixes, while new plug-ins incorporate both bug fixes and new functionality.

    Plug-In Name                                               Version
    Enterprise Manager for Oracle Database                     12.1.0.2 (revision)
    Enterprise Manager for Oracle Fusion Middleware            12.1.0.3 (new)
    Enterprise Manager for Chargeback and Capacity Planning    12.1.0.3 (new)
    Enterprise Manager for Oracle Fusion Applications          12.1.0.3 (new)
    Enterprise Manager for Oracle Virtualization               12.1.0.3 (new)
    Enterprise Manager for Oracle Exadata                      12.1.0.3 (new)
    Enterprise Manager for Oracle Cloud                        12.1.0.4 (new)

    Installation and Upgrade

    All major platforms have been released simultaneously (Linux 32/64-bit, Solaris (SPARC), Solaris x86-64, IBM AIX 64-bit, and Windows x86-64 (64-bit)). Enterprise Manager 12.1.0.2 is a complete release that includes both the EM OMS and Agent versions of 12.1.0.2. Installation options available with EM 12.1.0.2: users can do a fresh install or an upgrade from EM 10.2.0.5, 11.1, or 12.1.0.1 (Bundle Patch 1 not mandatory). Upgrading to EM 12.1.0.2 from EM 12.1.0.1 is not a patch application (similar to Bundle Patch 1) but is achieved through a 1-system upgrade.
    Documentation

    The Oracle Enterprise Manager Cloud Control Introduction document provides a broad overview of capabilities and highlights "What's New" in EM 12.1.0.2. All updated Oracle Enterprise Manager documentation can be found on OTN.

    Customer Webcast - EM 12c Installation and Upgrade: This webcast is for customers who are interested in learning how to successfully deploy or upgrade to EM 12.1.0.2. Customer Webcast - Installation and Upgrade - September 21 (registration and info on OTN starting September 12).

    Enterprise Manager 12c R2 Resources:

    OTN Download Page
    Upgrade Guide

    Read the article

  • Moving the Oracle User Experience Forward with the New Release 7 Simplified UI for Oracle Sales Cloud

    - by mvaughan
    By Kathy Miedema, Oracle Applications User Experience

    In September 2013, Release 7 for Oracle Cloud Applications became generally available for Oracle Sales Cloud and HCM Cloud. This significant release allowed the Oracle Applications User Experience (UX) team to finally talk freely about Simplified UI, a user experience project in the works since Oracle OpenWorld 2012. Simplified UI represents the direction that the Oracle user experience – for all of its enterprise applications – is heading.

    Oracle’s Apps UX team began by building a Simplified UI for sales representatives. You can find it today in Release 7, and it was demoed extensively during OpenWorld 2013 in San Francisco.

    This screenshot shows how Opportunities appear in the new Simplified UI for Oracle Sales Cloud, a user interface built for sales reps.

    Analyst Rebecca Wettemann, vice president of Nucleus Research, saw Simplified UI at Oracle OpenWorld 2013 and talked about it with CRM Buyer in “Oracle Revs Its Cloud Engines for a Better Customer Experience.” Wettemann said there are distinct themes to the latest release:

    "One is usability. Oracle Sales Cloud, for example, is designed to have zero training for onboarding sales reps, which it does," she explained. "It is quite impressive, actually -- the intuitive nature of the application and the design work they have done with this goal in mind." The software uses as few buttons and fields as possible, she pointed out. "The sales rep doesn't have to ask, 'what is the next step?' because she can see what it is."

    In fact, there are three themes driving the usability that Wettemann noted. They are simplicity, mobility, and extensibility, and we write more about them on the Usable Apps web site. These three themes embody the strategy for Oracle’s cloud applications user experiences.

    Simplified UI for Oracle Sales Cloud

    In developing a Simplified UI for Oracle Sales Cloud, Oracle’s UX team concentrated on the tasks that sales reps need to do most frequently and that are most important. “Knowing that the majority of their work lives are spent on the road and on the go, they need to be able to quickly get in and qualify and convert their leads, monitor and progress their opportunities, update their customer and contact information, and manage their schedule,” said Jeremy Ashley, Vice President of the Applications UX team.

    Ashley said the Apps UX team has a good reason for creating a Simplified UI that focuses on self-service. “Sales people spend the day selling stuff,” he said. “The only reason they use software is because the company wants to track what they’re doing.” Traditional systems of tracking that information include filling in a spreadsheet of leads or sales. Oracle wants to automate this process for the salesperson, and enable that person to keep everyone who needs to know up to date easily and quickly. Simplified UI addresses that problem by providing light-touch input. “It has to be useful to the salesperson,” Ashley said about the Sales Cloud user experience.

    Simplified UI can tell sales reps about key opportunities, or provide information about a contact in just a click or two. Customer information is accessible quickly and easily with Simplified UI for the Oracle Sales Cloud.

    Simplified UI for Sales Cloud can also be extended easily, Ashley said. Users usually just need to add various business fields or create and modify analytical reports. The way that Simplified UI is constructed allows extensibility to happen by hiding or showing a few necessary fields.
    The Settings user interface, starting in Release 7, allows for simple configuration of the most important visual elements. “With Sales Cloud, we identified a need to make the application useful and very simple,” Ashley said. Simplified UI meets that need.

    Where can you find out more?

    To find out more about Simplified UI and Oracle’s ongoing investment in applications user experience innovations, come to one of our sessions at a user group conference near you. Stay tuned to the Voice of User Experience (VoX) blog – the next post will be about Simplified UI and HCM Cloud.

    Read the article

  • TechEd Israel 2010 may only accept speakers from sponsors

    - by RoyOsherove
    A month or so ago, Microsoft Israel started sending out emails to its partners and registered event users to “Save the date!” – Microsoft TechEd Israel is coming, and it’s going to be this November!

    “Great news,” I thought to myself. I’d been to a couple of the MS TechEd events, as a speaker and as an attendee, and they were lovely and professionally done. Israel is an amazing place for technology and development, and TechEd has hosted some big names in the world of MS software.

    A couple of weeks ago, I was shocked to hear from a couple of people that Microsoft Israel plans to accept non-Microsoft TechEd speakers only from sponsors of the event. That means that according to the amount you have paid, you get to insert one or more of your own selected speakers into TechEd. I’ve spent the past couple of weeks trying to gather more evidence of this, and have gotten some input from within MS about it. It looks like that is indeed the case, though no MS rep was prepared to answer any email of mine publicly. If they approach me now, I’d be happy to print their response.

    What does this mean?

    If this is true, it means that Microsoft Israel is making a grave mistake – they are diluting the quality of the speakers for pure money factors. That means that as a TechEd attendee who paid good money, you might be sitting down to watch nothing more than a bunch of infomercials or sub-standard speakers, since speakers are no longer selected on quality or interest in their topic.

    - They are turning the conference from a learning event into a commercially driven event.
    - They are closing off the stage to the community of speakers who may not be associated with any organization willing to be a sponsor.
    - They are losing speakers (such as myself) who will not want to be part of such an event. (Yes – even if my company ends up sponsoring the event, I will not take part in it. Sorry, Eli!)
    - They are saying “F&$K you” to the community of MVPs, who should be the first people approached about technical talks (my guess is many MVPs wouldn’t want to talk at an event driven that way anyway).

    I do hope this ends up not being true, but it looks like it is. MS Israel had already done such a thing with the Developer Days event previously held in Israel – only sponsors were allowed to insert speakers into the event. If this turns out to be true, I would urge the MS community in Israel to NOT TAKE PART IN THIS EVENT in any form (attendee, speaker, sponsor or otherwise). By taking part, you will be telling MS Israel it’s OK to piss all over the community that they are quietly suffocating anyway.

    The MVP case

    MS Israel has managed to screw the MVP program as well. MS MVPs (I’m one) have had a tough time here in Israel the past couple of years. Ever since Yosi Taguri left the blue badge ranks, there has been no real community leader left. Whoever runs things right now has their eyes and minds set elsewhere, with the software MVP community far from mind and heart. No special MVP events (except a couple of small ones this year). No real MVP leadership happens here, and the MVP MEA lead (Ruari) being on a remote line is not really what’s needed. “MVP? What’s that?” I’m sure many MS Israel employees would say. Exactly my point.

    Last word

    I’ve been disappointed by the MS machine for a while now, but their slowness to realize what real community means over the past couple of years really turns me off. Maybe it’s time to move on.
Maybe I shouldn’t be chasing people at MS Israel begging for a room to host the Agile Israel user group. Maybe it’s time to say a big bye bye and start looking at a life a bit more disconnected.

    Read the article

  • Interrupted Upgrade from 11.10 to 12.04

    - by Tamil
    My upgrade from 11.10 to 12.04 using the alternate ISO got interrupted and I had to hard-restart my machine. Now I feel that everything is recovered except my already-installed packages, like vim. How do I back up my home folder for a fresh installation of Ubuntu?

    Following are the errors I'm facing. I couldn't mark any package for re-installation in Synaptic, or remove and install it either.

    Output of sudo apt-get install vim:

        Building dependency tree
        Reading state information... Done
        Package vim is not available, but is referred to by another package.
        This may mean that the package is missing, has been obsoleted, or
        is only available from another source
        E: Package 'vim' has no installation candidate

    If I try installing it from Synaptic I get:

        apache2.2-common:
        Package apache2.2-common has no available version, but exists in the database.
        This typically means that the package was mentioned in a dependency and
        never uploaded, has been obsoleted or is not available with the contents
        of sources.list

    The contents of my sources.list file:

        # added by the release upgrader
        # deb cdrom:[Ubuntu 12.04.1 LTS _Precise Pangolin_ - Release amd64 (20120822.4)]/ precise main restricted
        # added by the release upgrader
        # # deb cdrom:[Ubuntu 12.04.1 LTS _Precise Pangolin_ - Release amd64 (20120822.4)]/ precise main restricted
        # deb cdrom:[Ubuntu 11.04 _Natty Narwhal_ - Release amd64 (20110427.1)]/ natty main restricted
        # See http://help.ubuntu.com/community/UpgradeNotes for how to upgrade to
        # newer versions of the distribution.
        deb http://archive.ubuntu.com/ubuntu precise main restricted
        deb-src http://archive.ubuntu.com/ubuntu precise main restricted
        ## Major bug fix updates produced after the final release of the
        ## distribution.
        deb http://archive.ubuntu.com/ubuntu precise-updates main restricted
        deb-src http://archive.ubuntu.com/ubuntu precise-updates main restricted
        ## N.B. software from this repository is ENTIRELY UNSUPPORTED by the Ubuntu
        ## team. Also, please note that software in universe WILL NOT receive any
        ## review or updates from the Ubuntu security team.
        deb http://archive.ubuntu.com/ubuntu precise universe
        deb-src http://archive.ubuntu.com/ubuntu precise universe
        deb http://archive.ubuntu.com/ubuntu precise-updates universe
        deb-src http://archive.ubuntu.com/ubuntu precise-updates universe
        ## N.B. software from this repository is ENTIRELY UNSUPPORTED by the Ubuntu
        ## team, and may not be under a free licence. Please satisfy yourself as to
        ## your rights to use the software. Also, please note that software in
        ## multiverse WILL NOT receive any review or updates from the Ubuntu
        ## security team.
        deb http://archive.ubuntu.com/ubuntu precise multiverse
        deb-src http://archive.ubuntu.com/ubuntu precise multiverse
        deb http://archive.ubuntu.com/ubuntu precise-updates multiverse
        deb-src http://archive.ubuntu.com/ubuntu precise-updates multiverse
        ## Uncomment the following two lines to add software from the 'backports'
        ## repository.
        ## N.B. software from this repository may not have been tested as
        ## extensively as that contained in the main release, although it includes
        ## newer versions of some applications which may provide useful features.
        ## Also, please note that software in backports WILL NOT receive any review
        ## or updates from the Ubuntu security team.
        # deb http://us.archive.ubuntu.com/ubuntu/ natty-backports main restricted universe multiverse
        # deb-src http://us.archive.ubuntu.com/ubuntu/ natty-backports main restricted universe multiverse
        deb http://archive.ubuntu.com/ubuntu precise-security main restricted
        deb-src http://archive.ubuntu.com/ubuntu precise-security main restricted
        deb http://archive.ubuntu.com/ubuntu precise-security universe
        deb-src http://archive.ubuntu.com/ubuntu precise-security universe
        deb http://archive.ubuntu.com/ubuntu precise-security multiverse
        deb-src http://archive.ubuntu.com/ubuntu precise-security multiverse
        ## Uncomment the following two lines to add software from Canonical's
        ## 'partner' repository.
        ## This software is not part of Ubuntu, but is offered by Canonical and the
        ## respective vendors as a service to Ubuntu users.
        deb http://archive.canonical.com/ubuntu precise partner
        # deb-src http://archive.canonical.com/ubuntu natty partner
        ## This software is not part of Ubuntu, but is offered by third-party
        ## developers who want to ship their latest software.
        deb http://extras.ubuntu.com/ubuntu precise main
        deb-src http://extras.ubuntu.com/ubuntu precise main
        # deb http://tamil.3758_gmail.com:[email protected]/free unstable main # disabled on upgrade to oneiric
        # deb http://debian.datastax.com/natty oneiric main # disabled on upgrade to oneiric

    Output of sudo apt-get update:

        Err http://archive.ubuntu.com precise InRelease
        Err http://archive.canonical.com precise InRelease
        Err http://archive.ubuntu.com precise-updates InRelease
        Err http://archive.ubuntu.com precise-security InRelease
        Err http://extras.ubuntu.com precise InRelease
        Err http://archive.canonical.com precise Release.gpg
          Unable to connect to 172.16.140.249:3142:
        Err http://archive.ubuntu.com precise Release.gpg
          Unable to connect to 172.16.140.249:3142:
        Err http://archive.ubuntu.com precise-updates Release.gpg
          Unable to connect to 172.16.140.249:3142:
        Err http://extras.ubuntu.com precise Release.gpg
          Unable to connect to 172.16.140.249:3142:
        Err http://archive.ubuntu.com precise-security Release.gpg
          Unable to connect to 172.16.140.249:3142:
        W: Failed to fetch http://archive.ubuntu.com/ubuntu/dists/precise/InRelease
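
    For the backup part of the question, a minimal sketch of one common approach, assuming an external drive mounted at /media/backup with enough free space (the paths are illustrative, adjust for your system):

        # archive the home directory, preserving permissions; skip cache files
        tar --exclude="$HOME/.cache" -cvpzf /media/backup/home-backup.tar.gz "$HOME"

        # or mirror it with rsync, which can be re-run if interrupted
        rsync -aHv --exclude='.cache' "$HOME"/ /media/backup/home-backup/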

    Read the article

  • Wireless Network Found, can't connect, repeated requests for authentication

    - by Herm Holland
    After trawling through the internet – forums, support websites, and dozens upon dozens of answered questions on this site – I've not found a solution to what seems like a fairly regular problem: I cannot connect to a wireless network, and am continually asked for the network password. I have tried countless suggested solutions from the different locations I've already referred to. None of them have worked. Details of my experience are as follows:

    I have just recently installed Ubuntu 12.04.1 (32-bit). Ubuntu installed on my system seemingly fine, and I even formatted my hard drive during the process. It's as if it were a new desktop computer.

    During the installation I was asked to connect to a wireless network. I have a USB wireless card connected, which I have used to connect desktop PCs, laptops, and a Wii to the internet from approximately the same area of the house (thus the same distance from the wireless router). I chose my network, entered the correct password for it (I double-checked; it's definitely the right password) and proceeded with the installation. Several times before the installation was complete, I was asked to authenticate the connection, and this seemed to do nothing each time. On the repeated screens the password was already entered in the appropriate box.

    When Ubuntu booted up, the first thing I was faced with (other than something about language settings) was another request for authentication. Again, the password was already there, so I clicked connect. It did not connect. Instead, I was once again faced with repeated requests every few minutes.

    I went onto my laptop, which is connected to this network, checked the details of the network, and entered them manually into my Ubuntu PC (including the IPv4 and IPv6 information), but this didn't work either, so I set it back to finding the settings automatically. Note, also, that the "Connect automatically" and "Available to all users" boxes are checked, and have been unchecked and rechecked countless times. I have also tried having my user account connect automatically, and having a password required at the welcome screen.

    Whilst I've been writing this, it has gone through a spat of connecting successfully to the network for less than a minute, before coming offline again, only to repeat the process. But it has now returned to prompting me for a password every couple of minutes.

    This computer has already run the Fedora OS and had no trouble connecting to, and maintaining, a connection. I also have a laptop running Windows 7 less than a metre away from this desktop PC, which is connected and has no trouble maintaining a connection at 50%-100% strength (fluctuating). Therefore:

    - I know it's not the wireless card
    - I know it's not the PC itself
    - I know it's not the access point
    - I know it's not the location of my PC or wireless card
    - It is solely because of Ubuntu

    Everything else has worked fine, but the moment Ubuntu was introduced into the equation, it has gone completely wrong. Honestly, I prefer Ubuntu as an OS to Fedora, but if I can't solve the problem it'll be straight back to Fedora that I'll have to go. Can anyone help me at all?

    Read the article

  • Tips & Tricks: How to crawl an SSL-enabled Oracle E-Business Suite

    - by Rajesh Ghosh
    Oracle E-Business Suite can be integrated with Oracle Secure Enterprise Search for a superior end-user experience and enhanced data retrieval capabilities. Before end users can perform search operations, data has to be crawled and indexed into the Oracle SES server. However, if the Oracle E-Business Suite instance is on SSL, some additional configuration is needed in the Oracle SES server as well as in Oracle Search Modeler before a search object can be deployed and crawled. The process involves the following steps:

    Step 1: Export the SSL certificate of Oracle E-Business Suite

    Access the Oracle E-Business Suite instance from a web browser. You should be able to locate a security or certificate icon somewhere in the browser toolbar or status bar, depending on which browser you are using. Click on it and you should be able to view the certificate as well as export it to a local file. While exporting, make sure that you use the "DER encoded" format.

    Step 2: Import the SSL certificate into the Oracle Secure Enterprise Search server's java keystore

    Oracle SES (10.1.8.4) by default ships a JDK under $ORACLE_HOME. The Oracle SES mid-tier uses this JDK to start the OC4J container services. In this step the Oracle E-Business Suite SSL certificate, exported in step 1, has to be imported into the Oracle SES server's java keystore. Perform the following:

    1. Copy the certificate file onto the server where the Oracle SES server is running, under $ORACLE_HOME/jdk/jre/lib/security/cacerts. "ORACLE_HOME" points to the Oracle SES Oracle home.
    2. Set the JAVA_HOME environment variable to $ORACLE_HOME/jdk.
    3. Append $JAVA_HOME/bin to the PATH environment variable.
    4. Issue the command: keytool -import -keystore keystore.jks -trustcacerts -alias myOHS -file ebs.crt
       Substitute "ebs.crt" with the name of the certificate file you copied in step 2.1. The default keystore password is "changeit"; enter it when prompted. If successful, this process will end with a message saying "certificate successfully imported".

    Step 3: Import the SSL certificate into the Search Modeler java keystore

    Unlike Oracle SES, Search Modeler is not shipped with a bundled JDK. If you are using standalone OC4J, then you actually use an external JDK to start the OC4J container services. If you are using an IAS instance, then the JDK comes bundled with the IAS installation. Perform the following:

    1. Copy the certificate file onto the server where the Search Modeler application is running, under $JDK_HOME/jre/lib/security/cacerts. "JDK_HOME" points to the JDK directory, depending on whether you are using an external JDK or a bundled one.
    2. Set the JAVA_HOME environment variable to the JDK directory.
    3. Append $JAVA_HOME/bin to the PATH environment variable.
    4. Issue the command: keytool -import -keystore keystore.jks -trustcacerts -alias myOHS -file ebs.crt
       Substitute "ebs.crt" with the name of the certificate file you copied in step 3.1. The default keystore password is "changeit"; enter it when prompted. If successful, this process will end with a message saying "certificate successfully imported".

    Once you have completed the above steps successfully, you can deploy the search objects using Search Modeler and then start crawling them as well.
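
    As a consolidated sketch, the step 2 sequence might look like this on the SES server (the Oracle home path is illustrative, substitute your own; the same pattern applies to step 3 with $JDK_HOME):

        # run on the Oracle SES server
        export ORACLE_HOME=/u01/app/oracle/ses      # illustrative path
        export JAVA_HOME=$ORACLE_HOME/jdk
        export PATH=$JAVA_HOME/bin:$PATH

        # copy the DER-encoded certificate exported from E-Business Suite
        cp /tmp/ebs.crt $JAVA_HOME/jre/lib/security/

        # import it into the keystore (default password: changeit)
        cd $JAVA_HOME/jre/lib/security
        keytool -import -keystore keystore.jks -trustcacerts -alias myOHS -file ebs.crt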

    Read the article

  • Old School Wizardry Tip: Batch File Comments

    - by jkauffman
    Johnny, the Endangered Keyboard-Driven Windows User

    Some of my proudest, most obscure Windows tricks are losing their relevance. I know I’m not alone. Keyboard shortcuts are going the way of the dodo. I used to induce fearful awe by slapping Ctrl+Shift+Esc in front of lowly, pedestrian Windows users. No Windows key on the keyboard? No problem: Ctrl+Esc. No menu key on the keyboard? Shift+F10. I am also firmly planted in the habit of closing windows with the Alt+Space menu (Alt+Space, C), and I harbor a brooding, slow-growing list of programs that fail to support this correctly (that means you, Paint.NET).

    Every time a new version of Windows comes out, the support for some of these minor time-saving habits gets pared out. Will I complain publicly? Nope; I know my old ways should be axed to conserve precious design energy. In fact, I disapprove of fierce un-intuitiveness for the sake of alleged productivity. Like vim, for example. If you approach a program after being away for 5 years, having to recall encyclopedic knowledge is a flaw. The RTFM disciples have lost. Anyway, some of the items in my arsenal of goofy time-saving tricks are still relevant today. I wanted to draw attention to one that’s stood the test of time.

    Remember Batch Files?

    Yes, it’s true, batch files are fading faster than the world of print. But they're not dead yet. I still run into situations where I opt to use batch files. They are still relevant for build processes, and for various development workflow tools. Sure, there’s PowerShell, but there’s that stupid Set-ExecutionPolicy speed bump standing in your way; can you really spare the time to A) hunt down that setting on all machines affected and/or B) make futile efforts to convince your coworkers/boss that the hassle was worth it? When possible, I prefer the batch file wild card. And whenever I return to batch files, I end up researching some of their unintuitive aspects, such as parameters, quote handling, and ERRORLEVEL. But I never have to remember to use “REM” for comment lines, because there’s a cleaner way to do them!

    Double Colon For Eye-Friendly Comments

    Here is a very simple batch file, with pretty much minimal content:

        @ECHO OFF
        SETLOCAL
        REM This is a comment
        ECHO This batch file doesn’t do much

    If you code on a daily basis, this may be more suitable to your eyes:

        @ECHO OFF
        SETLOCAL
        :: This is a comment
        ECHO This batch file doesn’t do much

    Works great! I imagine I find it preferable due to its similarity to comments in other languages: // or ; or #

    I often make visual pseudo-line breaks in my code, and this colon-based syntax works wonders:

        @ECHO OFF
        SETLOCAL
        :: Do stuff
        ECHO Doing Stuff
        ::::::::::::::::::::::::::::
        :: Do more stuff
        ECHO This batch file doesn’t do much

    Not only is it more readable, but there’s a slight performance benefit: the batch file engine sees this as an invalid line label and immediately reads the following line. Use that fact to your advantage if this trick leads you into heated nerd debate.

    Two Pitfalls to Avoid

    Be aware that there are a couple of situations where this hack will fail you. They most likely won’t be a problem unless you’re getting really sophisticated with your batch files.

    Pitfall #1: Inline comments

        @ECHO OFF
        SETLOCAL
        IF EXIST C:\SomeFile.txt GOTO END ::This will fail
        :END

    Unfortunately, this fails. You can only have whitespace to the left of your comments.
    Pitfall #2: Code blocks
        @ECHO OFF
        SETLOCAL
        IF EXIST C:\SomeFile.txt (
            :: This will fail
            ECHO HELLO
        )
    Code blocks, such as if statements and for loops, cannot contain these comments (a REM-based workaround appears below). This is ultimately due to the fact that entire code blocks are processed as a single line. I originally learned this from Rob van der Woude’s site. He goes into more depth about the behavior of the pitfalls as well, if you are interested in further details. I hope this trick earns you serious geek rep!
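    As an aside that is not in the original post: if you genuinely need a comment inside a parenthesized block, plain REM is still legal there, so a mixed style works:

        @ECHO OFF
        SETLOCAL
        :: Double-colon comments are fine at the top level
        IF EXIST C:\SomeFile.txt (
            REM But fall back to REM inside code blocks
            ECHO HELLO
        )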

    Read the article

  • SQL – Business Intelligence: Derive Data or Information?

    - by Pinal Dave
    We all know the value of information in our lives. Whether a decision is personal or business-initiated, people need information to make it. But the question is: who makes the distinction between data and information? We come across a whole lot of data daily, some of it significant and some not. We filter what’s required and forget about the rest. Information is filtered and distilled data, and filtering and distillation can alter its actual meaning and natural state. Therefore, in this blog we discover some ways to ensure that we’re using business intelligence derived from the right information for making critical management decisions. Four key questions managers must ask themselves before making a decision:
    1. Am I working with data or information?
    2. What is its context?
    3. How recent is it?
    4. How was it derived, and what is the source?
    The first question is probably the most important: you must know what you’re dealing with here. If you see adjectives in use and conclusions drawn, it’s information, not raw data. Your very next concern must be whether it has been dressed up to present a particular viewpoint or perspective. It makes a lot of difference if you take a decision based on someone’s propaganda that distorts real facts. Therefore, the context and the intentions of the distillation process must be clear to you. The next consideration is whether the data is recent enough to hold any value. Since data has a very short shelf life, you must ensure that its context and value have not been lost over time. The last and most important consideration is how it was derived in the first place. The observer effect is what calls the shots here: the source can change the context to a great extent if the collection methodology and purpose are not clear. Gathering intelligence for decision making requires users to be keen observers and not take the information provided at face value alone. These probing questions will allow you to make sure that you’re working with clean and accurate data, devoid of any influence or manipulation. Only then can you be sure of deriving true business intelligence for your organization. BI technology is also a great way to ensure the accuracy of reports. The SQL BI Platform provides advanced tools and techniques for all your BI needs and concerns. Koenig Solutions offers this course along with a host of other Business Intelligence and IT courses on all the latest technologies available in the market today.
    Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL

    Read the article

  • Duke's Choice Award Ceremony

    - by Tori Wieldt
    The 2012 Duke's Choice Awards winners and their creative, Java-based technologies and Java community contributions were honored after the Sunday night JavaOne keynotes. Sharat Chander, Group Director for Java Technology Outreach, presented the awards. "Having the community participate directly in both submission and selection truly shows how we are driving exposure of the innovation happening in the Java community," he said.
    Apache Software Foundation Hadoop Project: The Apache Software Foundation’s Hadoop project, written in Java, provides a framework for distributed processing of big data sets across clusters of computers, ranging from a few servers to thousands of machines. This harnessing of large data pools allows organizations to better understand and improve their business.
    AgroSense Project: Improving farming methods to feed a hungry world is the goal of AgroSense, an open source farm information management system built in Java and the NetBeans platform. AgroSense enables farmers, agribusinesses, suppliers and others to develop modular applications that will easily exchange information through a common underlying NetBeans framework.
    JDuchess: Rather than focus on a specific geographic area like most Java User Groups (JUGs), JDuchess fosters the participation of women in the Java community worldwide. The group has more than 500 members in 60 countries, and provides a platform through which women can connect with each other and get involved in all aspects of the Java community.
    Jelastic, Inc.: Moving existing Java applications to the cloud can be a daunting task, but startup Jelastic, Inc. offers the first all-Java platform-as-a-service (PaaS) that enables existing Java applications to be deployed in the cloud without code changes or lock-in.
    Liquid Robotics: Liquid Robotics is an ocean data services provider whose Wave Glider technology collects information from the world’s oceans for application in government, science and commercial applications. The organization features the “father of Java,” James Gosling, as its chief software architect.
    London Java Community: The second user group receiving a Duke’s Choice Award this year, the London Java Community (LJC) and its users have been active in the OpenJDK, the Java Community Process (JCP) and other efforts within the global Java community.
    NATO: The first-ever Community Choice Award goes to the MASE Integrated Console Environment (MICE) in use at NATO. Built in Java on the NetBeans platform, MICE provides a high-performance visualization environment for conducting air defense and battle-space operations.
    Parleys.com: E-learning specialist Parleys.com, based in Brussels, Belgium, uses Java technologies to bring online classes and full IT conferences to desktops, laptops, tablets and mobile devices. Parleys.com has hosted more than 1,700 conferences—including Devoxx and JavaOne—for more than 800,000 unique visitors.
    Student Nokia Developer Group: This year’s student winner, Ram Kashyap, is the founder and president of the Nokia Student Network, and was profiled in the “The New Java Developers” feature in the March/April 2012 issue of Java Magazine. Since then, Ram has maintained a hectic pace, graduating from the People’s Education Society Institute of Technology in Bangalore, India, while working on a Java mobile startup and training students on Java ME.
United Nations High Commissioner for Refugees The United Nations High Commissioner for Refugees (UNHCR) is on the front lines of crises around the world, from civil wars to natural disasters. To help facilitate its mission of humanitarian relief, the UNHCR has developed a light-client Java application on the NetBeans platform. The Level One registration tool enables the UNHCR to collect information on the number of refugees and their water, food, housing, health, and other needs in the field, and combines that with geocoding information from various sources. This enables the UNHCR to deliver the appropriate kind and amount of assistance where it is needed. You can read more about the winners in the current issue of Java Magazine.

    Read the article

  • SQL SERVER – Partition Parallelism Support in expressor 3.6

    - by pinaldave
    I am very excited to learn that there is a new version of expressor’s data integration platform coming out in March of this year. It will be version 3.6, and I look forward to using it and telling everyone about it. Let me describe a little bit more of what will be so great in expressor 3.6: a greatly enhanced user interface, parallel processing, and bulk artifact upgrading.
    The User Interface. First let me cover the most obvious enhancements. The expressor Studio user interface (UI) has had some significant work done. Kudos to the expressor engineering team; the entire UI is a visual masterpiece that is very responsive and intuitive. The improvements are more than just eye candy; they provide significant productivity gains when developing expressor Dataflows:
    - Operator shape icons now include a description that identifies the function of each operator, instead of having to guess at the function by the icon.
    - Operator shapes and highlighting depict the current function and status: disabled, enabled, complete, incomplete, and error. Each status displays an appropriate message in the message panel with correction suggestions.
    - Floating or docking property panels provide descriptive tool tips for each property and auto-resize when adjusting the canvas, without having to search Help or scroll around to get access to the property.
    - Progress and status indicators let you know when an operation is working.
    - A “no limit” canvas with snap-to-grid allows automatic sizing and accurate positioning when you have numerous operators in the Dataflow.
    - The inline tool bar offers quick access to pan, zoom, fit and overview functions.
    - Selecting multiple artifacts with a right-click context menu allows you to manage your workspace more efficiently.
    Partitioning and Parallel Processing. Partitioning allows each operator to process multiple subsets of records in parallel, as opposed to processing all records that flow through that operator in a single sequential set. This capability allows the user to configure the expressor Dataflow to run in a way that most efficiently utilizes the resources of the hardware where the Dataflow is running. Partitions can exist in most individual operators. Using partitions increases the speed of an expressor data integration application, thereby improving performance and load times. With the expressor 3.6 Enterprise Edition, expressor simplifies enabling parallel processing by adding intuitive partition settings that are easy to configure.
    Bulk Artifact Upgrading. Bulk artifact upgrading sounds a bit intimidating, but it actually is not, and it is a welcome addition to expressor Studio. In past releases, users were prompted to confirm that they wanted to upgrade their individual artifacts only when opened. This was a cumbersome and repetitive process. Now with bulk artifact upgrading, a user can easily select an artifact or group of artifacts to upgrade all at once. As you can see, there are many new features and upgrade options that will make expressor Studio quicker and more efficient. I hope I’m not the only one who is excited about all these new upgrades, and I hope that you try expressor and share your experience with me.
    Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Performance, SQL Query, SQL Server, SQL Tips and Tricks, SQLServer, T SQL, Technology

    Read the article

  • Some Original Expressions

    - by Phil Factor
    Guest Editorial for the Simple-Talk newsletter. In a guest editorial for the Simple-Talk Newsletter, Phil Factor wonders if we are still likely to find some more novel and unexpected ways of using the newer features of Transact-SQL: or maybe of some features that have always been there! There can be a great deal of fun to be had in trying out recent features of SQL expressions to see if they provide new functionality. It is surprisingly rare to find things that couldn’t be done before, albeit in a different and more cumbersome way; but it is great to experiment, or to read of someone else making that discovery. One such recent feature is the ‘table value constructor’, or ‘VALUES constructor’, that managed to get into SQL Server 2008 from standard SQL. This allows you to create derived tables of up to 1000 rows neatly within SELECT statements that consist of lists of row values. E.g.:
        SELECT Old_Welsh, number
        FROM (VALUES
          ('Un',1),('Dou',2),('Tri',3),('Petuar',4),('Pimp',5),
          ('Chwech',6),('Seith',7),('Wyth',8),('Nau',9),('Dec',10)
          ) AS WelshWordsToTen (Old_Welsh, number)
    These values can be expressions that return single values, including, surprisingly, subqueries. You can use this device to create views, or in the USING clause of a MERGE statement (a sketch follows at the end of this piece); Joe Celko covered this here and here. It can become extraordinarily handy once one gets into the way of thinking in these terms, and I’ve rewritten a lot of routines to use the constructor. The old way of using UNION can be used in the same manner, but is a little slower and more long-winded. The use of scalar SQL subqueries as an expression in a VALUES constructor, and then applied to a MERGE, has got me thinking. It looks very clever, but what use could one put it to? I haven’t seen anything yet that couldn’t be done almost as simply in SQL Server 2000, but I’m hopeful that someone will come up with a way of solving a tricky problem, just in the same way that a freak of the XML syntax forever made the in-line production of delimited lists from an expression easy, or that a weird XML pirouette could do an elegant pivot-table rotation. It is in this sort of experimentation that the community of users can make a real contribution. The dissemination of techniques such as the Number, or Tally, table, or the unconventional ways that the UPDATE statement can be used, has been rapid due to articles and blogs. However, there is plenty to be done to explore some of the less obvious features of Transact-SQL. Even some of the features introduced into SQL Server 2000 are hardly well-known. Certain operations on data are still awkward to perform in Transact-SQL, but we mustn’t, I think, be too ready to state that certain things can only be done in the application layer, or using a CLR routine. With the vast array of features in the product, and with the tools that surround it, I feel that there is generally a way of getting tricky things done. Or should we just stick to our lasts and push anything difficult out into procedural code? I’d love to know your views.
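    To make the MERGE remark concrete, here is a minimal T-SQL sketch of my own (not Phil’s): it assumes a hypothetical table WelshNumbers(Old_Welsh, number) and uses a VALUES constructor as the MERGE source, so the lookup data never needs a staging table.

        CREATE TABLE WelshNumbers (Old_Welsh VARCHAR(10) PRIMARY KEY, number INT);

        MERGE WelshNumbers AS target
        USING (VALUES ('Un',1),('Dou',2),('Tri',3))
              AS source (Old_Welsh, number)
           ON target.Old_Welsh = source.Old_Welsh
        WHEN MATCHED THEN
            UPDATE SET number = source.number
        WHEN NOT MATCHED THEN
            INSERT (Old_Welsh, number) VALUES (source.Old_Welsh, source.number);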

    Read the article

  • You Probably Already Have a “Private Cloud”

    - by BuckWoody
    I’ve mentioned before that I’m not a fan of the word “Cloud”. It’s too marketing-oriented, gimmicky and non-specific. A better definition (in many cases) is “Distributed Computing”. That means that some or all of the computing functions are handled somewhere other than under your specific control. But there is a current use of the word “Cloud” that does not necessarily mean that the computing is done somewhere else. In fact, it’s a vector of Cloud Computing that can better be termed “Utility Computing”. This has to do with the provisioning of a computing resource: the setup, configuration, management, balancing and so on that is needed so that a user – who might actually be a developer – can do some computing work. To that person, the resource is just “there” and works like they expect, like the phone system or any other utility.
    The interesting thing is, you can do this yourself. In fact, you probably already have been, or are now. It’s got a cool new trendy term – “Private Cloud” – but the fact is, if you have your setup automated, the HA and DR handled, balancing and performance tuning done, and a process wrapped around it all, you can call yourself a “Cloud Provider”. A good example here is your e-mail system: your users – pretty much your whole company – just log into e-mail and expect it to work. To them, you are the “Cloud” provider. On your side, the more you automate and provision the system, the more you act like a Cloud Provider. Another example is a database server. In this case, the “end user” is usually the development team, or perhaps your SharePoint group and so on. The data professionals configure, monitor, tune and balance the system all the time. The more this is automated, the more you’re acting like a Cloud Provider. Lots of companies help you do this in your own data centers, from VMware to IBM and many others. Microsoft's offering in this area is based around System Center – they have a “cloud in a box” provisioning system that’s actually pretty slick.
    The most difficult part of operating a Private Cloud is probably the scale factor. In the case of Windows and SQL Azure, we handle this in multiple ways – and we're happy to share how we do it. It’s not magic, and the algorithms for balancing (like the one we started with, called Paxos) are well known. The key is the knowledge, infrastructure and people. Sure, you can do this yourself, and in many cases, such as top-secret or private systems, you probably should. But there are times when you should evaluate using Azure or other vendors, or even multiple vendors to spread your risk. All of this should be based on client need, not on what you know how to do already. So congrats on your new role as a “Cloud Provider”. If you have an e-mail system or a database platform, you can just put that right on your resume.

    Read the article

  • ResourceSerializable: an alternate to ORM and ActiveRecord

    - by Levi Morrison
    A few opinionated reasons I don't like the traditional ORM and ActiveRecord patterns:
    - They work only with a database. Sometimes I'm dealing with objects from an API and other objects from a database, and the implementations I have seen don't allow for that. Feel free to clue me in if I'm wrong on this.
    - They are brittle. Changes in the database will likely break your implementation. Some implementations can help reduce this, but a few of the ones I've seen don't.
    - Their very design is influenced by the database. If I want to switch to using an API, I'll (likely) have to redesign the object to get it to work.
    - They seem to violate the single-responsibility principle. They know what they are and how they act, but they also know how they are created, destroyed and saved? Seems a bit much.
    What about an approach that is somewhat more familiar in PHP: implementing an interface? In PHP 5.4, we'll have the JsonSerializable interface that defines the data to be json_encoded, so users will become accustomed to this type of thing. What if there was a ResourceSerializable interface? This is still an ORM by name, but certainly not by tradition.
        interface ResourceSerializable {
            /**
             * Returns the id that identifies the resource.
             */
            function resourceId();

            /**
             * Returns the 'type' of the resource.
             */
            function resourceType();

            /**
             * Returns the data to be serialized.
             */
            function resourceSerialize();
        }
    Things might be poorly named; I'll take suggestions. (A sample implementation appears at the end of this post.)
    Notes: resourceId() will work for APIs and databases. As long as your primary key in the database is the same as the resource ID in the API, there is no conflict. All of the APIs I've worked with have a unique ID for the resource, so I don't see any issues there. resourceType() is the group or type associated with the resource. You can use this to map the resource to an API call or a database table. If the resource type was person, it could map to /api/1/person/{resourceId} and the table persons (or people, if it's smart enough). resourceSerialize() returns the data to be stored; keys would identify API parameters and database table columns.
    This also seems easier to test than ActiveRecord/ORM implementations. I haven't done much automated testing on traditional ActiveRecord/ORM implementations, so this is merely a guess, but it seems that being able to create objects independently of the library helps me. I don't have to use load() to get an existing resource; I can simply create one and set all the right properties. This is not so easy in the ActiveRecord/ORM implementations I've dealt with.
    Downsides: You need another object to serialize it. This also means you have more code in general, as you have to use more objects. You have to map resource types to API calls and database tables; this is even more work, though some ORMs and ActiveRecord implementations require you to map objects to table names anyway.
    Are there other downsides that you see? Does this seem feasible to you? How would you improve it? Note: I almost asked this on StackOverflow because it might be too vague for their standards, but I'm still not really familiar with programmers.stackexchange.com, so please help me improve my question if it doesn't shape up to standards here.
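    To show the interface in use, here is a minimal PHP sketch of my own (not part of the original question): a hypothetical Person class implementing ResourceSerializable, which some gateway object could then persist to either an API or a database.

        <?php
        class Person implements ResourceSerializable
        {
            private $id;
            private $name;
            private $email;

            public function __construct($id, $name, $email)
            {
                $this->id = $id;
                $this->name = $name;
                $this->email = $email;
            }

            // Maps to the primary key, or to the API's resource ID.
            function resourceId()
            {
                return $this->id;
            }

            // Maps to a table name or an API endpoint such as /api/1/person/{id}.
            function resourceType()
            {
                return 'person';
            }

            // Keys identify database columns or API parameters.
            function resourceSerialize()
            {
                return array('name' => $this->name, 'email' => $this->email);
            }
        }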

    Read the article

  • How to move complete SharePoint Server 2007 from one box to another

    - by DipeshBhanani
    It was time for my first onsite client assignment on SharePoint. The client had a one-server production environment and wanted to upgrade the topology to a completely new SharePoint farm of three servers. So the task was to move the whole MOSS 2007 stack to the new server environment without impacting data. The last three scary words – “without impacting data” – were actually putting pressure on my head. Moreover, the SSP had to be moved because additional information had been added for users apart from the AD import.
    I thought I had to do only a backup and restore; it appeared pretty easy at first thought. Just because of those damn scary words, I decided to check the internet for guidance on this scenario. I couldn’t find anything except the general guidance for moving servers on the Microsoft TechNet site. I promised myself to start blogging with this post if I was successful in this task. Well, I took a long time to write this, but I finally made it. I hope it will be useful to everyone looking at SharePoint server movement.
    Before beginning the restoration, make sure that there is no difference between the SharePoint versions on the source and destination servers. Also check whether the state of the SharePoint installation at the time of backup and restore is the same (e.g. SharePoint-related service packs and patches, if any).
    The main tasks of the server movement are as follows:
    1. Back up all the databases.
    2. Install and configure SharePoint on the new environment.
    3. Deploy all solutions (WSP files) globally to the destination server – for installing features attached to the solutions.
    4. Install all the custom features.
    5. Deploy/copy custom pages/files which were added to the “12 hive” folder later.
    6. Restore the SSP.
    7. Restore My Site.
    8. Restore the other web applications.
    Tasks 3 to 5 are for making sure that we have configured the environment well enough for the web applications to be restored successfully. The main and complex task was restoring the SSP. I started restoring the SSP through Central Admin. After a while, the restoration status was updated to “unsuccessful”. “Damn it, what went wrong?” I thought, looking at the error detail down the page. I can’t remember the error message, but I corrected it and restored again.
    Actually, once you fail restoring the SSP, your restoration will fail again and again unless you clean up all the related stuff properly. I wanted to find the actual reason, so I cleaned, restored, cleaned, restored… I tried almost five or six times and finally succeeded. I realized how pleasant it is to see the word “Successful” on the screen. Without wasting more of your time, let me write out the detailed steps for restoring the SSP:
    1. Delete the SSP with the following STSADM command:
       stsadm -o deletessp -title <SSP name> -deletedatabases -force
       e.g.: stsadm -o deletessp -title SharedServices1 -deletedatabases -force
    2. Check and delete the web application associated with the SSP if it exists.
    3. Check and remove the “Alternate Access Mapping” associated with the SSP if it exists.
    4. Check and delete the IIS site as well as the application pool associated with the SSP if they exist.
    5. Stop the following services:
       · Office SharePoint Server Search
       · Windows SharePoint Services Search
       · Windows SharePoint Services Help Search
    6. Delete all the databases associated with or related to the SSP from SQL Server.
    7. Reset IIS.
    8. Start the three services from step 5 again.
    9. Restore the new SSP.
    After the SSP restoration, all the other stuff completed very smoothly without any more issues. I made a few modifications to the sites for the change of server name and finally, the new environment was ready.
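    For the content moves in tasks 1 and 8, the backups and restores can be scripted with STSADM as well. This is an illustrative sketch of my own, not from the post (which drives the SSP restore through Central Admin); the URLs and file name are hypothetical, and note that stsadm -o backup in this form operates on individual site collections:

        :: Back up a site collection on the old farm
        stsadm -o backup -url http://oldserver/sites/portal -filename portal.bak

        :: Restore it onto the new farm, overwriting any placeholder site
        stsadm -o restore -url http://newserver/sites/portal -filename portal.bak -overwrite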

    Read the article

  • Some OBI EE Tricks and Tips in the Admin Tool By Gerry Langton

    - by hamsun
    How to set the log level from a session variable initialization block. As we know, it is normal to set the log level non-zero for a particular user when we wish to debug problems. However, sometimes it is inconvenient to go into each user’s properties in the Admin tool and update the log level, so I am showing a method which allows the log level to be set for all users via a session initialization block. This is particularly useful for anyone wanting an alternative way to set the log level. The screenshots shown use the OBIEE 11g SampleApp demo but are applicable to any environment.
    Open the appropriate RPD in online mode and navigate to Manage > Variables. Select Session Initialization Blocks, right-click in the white space and create a New Initialization Block. I called the initialization block Set_Loglevel. Now click on ‘Edit Data Source’ to enter the SQL. Choose the ‘Use OBI EE Server’ option for the SQL. This means that the SQL provided must use tables which have been defined in the Physical layer of the RPD, and whilst there is no need to provide a connection pool, you must work in online mode. The SQL can access any of the RPD tables and is purely used to return a value of 2. The ‘Test’ button confirms that the SQL is valid.
    Next, click on the ‘Edit Data Target’ button to add the LOGLEVEL variable to the initialization block. Check the ‘Enable any user to set the value’ option so that this will work for any user. Click OK, and a warning message will display because LOGLEVEL is a system session variable; click ‘Yes’. Click ‘OK’ to save the initialization block, then check in the online changes.
    To test that LOGLEVEL has been set, log in to OBIEE using an administrative login (e.g. weblogic) and reload server metadata, either from the Analysis editor or from the Administration > Reload Files and Metadata link. Run a query, then navigate to Administration > Manage Sessions and click ‘View Log’ for the query just issued (which should be approximately the last in the list). A log file should exist, and with LOGLEVEL set to 2 it should include both the logical and the physical SQL. If more diagnostic information is required, set LOGLEVEL to a higher value.
    If logging is required only for a particular analysis, then an alternative method can be used directly from the Analysis editor. Edit the analysis for which debugging is required and click on the Advanced tab. Scroll down to the Advanced SQL Clauses section and enter the following in the Prefix box:
    SET VARIABLE LOGLEVEL = 2;
    Click the ‘Apply SQL’ button. The SET VARIABLE statement will now prefix the analysis’s logical SQL, so that any time this analysis is run it will produce a log (a sketch of the resulting logical SQL appears at the end of this article).
    You can find information about training for Oracle BI EE products here or in the OU Learning Paths. Please send me an email at [email protected] if you have any further questions.
    About the Author: Gerry Langton started at Siebel Systems in 1999 working as a technical instructor teaching both Siebel application development and also Siebel Analytics (which subsequently became Oracle BI EE). From 2006 Gerry has worked as Senior Principal Instructor within Oracle University, specialising in Oracle BI EE, Oracle BI Publisher and Oracle Data Warehouse development for BI.
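    For illustration, once the prefix is applied, the logical SQL issued by the analysis looks something like the sketch below. This example is mine, not from the article; the subject area, table and column names are the SampleApp ones and will differ in your environment.

        SET VARIABLE LOGLEVEL = 2;
        SELECT 0 s_0,
               "A - Sample Sales"."Time"."T05 Per Name Year" s_1
        FROM "A - Sample Sales"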

    Read the article

  • The SmartAssembly Rearchitecture

    - by Simon Cooper
    You may have noticed that not a lot has happened to SmartAssembly in the past few months. However, the team has been very busy behind the scenes working on an entirely new version of SmartAssembly.
    SmartAssembly 6.5
    Over the past few releases of SmartAssembly, the team had come to the realisation that the current 'architecture' - grown organically, way before RedGate bought it, from a simple name obfuscator over the years into a full-featured obfuscator and assembly instrumentation tool - was simply not up to the task. Not for what we wanted to do with it at the time, and not for what we have planned for the future. Not only was it not up to what we wanted it to do, but it was severely limiting our development capabilities: long-standing bugs in the root architecture that couldn't be fixed, some rather...interesting...design decisions, and convoluted logic that increased the complexity of any bugfix or new feature tenfold. So, we set out to fix this. Earlier this year, a new engine was written on which SmartAssembly would be based. Over the following few months, each feature was ported over to the new engine and extensively tested by our existing unit and integration tests. The engine was linked into the existing UI (no easy task, due to the tight coupling between the UI and the old engine), and existing RedGate products were tested on the new SmartAssembly to ensure the new engine acted in the same way. The result is SmartAssembly 6.5.
    The risks of a rearchitecture
    Are there risks to rearchitecting a product like SmartAssembly? Of course. There was a lot of undocumented behaviour in the old engine, and as part of the rearchitecture we had to find this behaviour, define it, and document it. In the process we found some behaviour of the old engine that simply did not make sense; hence the changes in pruning & obfuscation behaviour in the release notes. All the special edge cases we had to find, document, and re-implement. There was a chance that these special cases would not be found until near the end of the project, when everything is functionally complete and interacting together. By that stage, it would be hard to go back and change anything without a whole lot of extra work, delaying the release by months. We always knew this was a possibility; our initial estimate of the time required was '4 months, ± 4 months'. And that was including various mitigation strategies to reduce the likelihood of these issues being found right at the end. Fortunately, this worst case did not happen.
    However, the rearchitecture did produce some benefits. As well as numerous bug fixes that we could not fix any other way, we've also added logging that lets you find out exactly why a particular field or property wasn't pruned or obfuscated. There's a new command line interface, we've tested it with WP7.1 and Silverlight 5, and we've added a new option to error reporting to improve the performance of instrumented apps by ~10%, at the cost of inaccurate line numbers in reports.
    So? What differences will I see?
    Largely none. SmartAssembly 6.5 produces the same output as SmartAssembly 6.2. The performance of 6.5 will be much faster for some users, and generally the same as 6.2 for the rest. If you've encountered a bug with previous versions of SmartAssembly, I encourage you to try 6.5, as it has most likely been fixed in the rearchitecture. If you encounter a bug with 6.5, please do tell us; we'll be doing another release quite soon, so we'll aim to fix any issues caused by 6.5 in that release.
Most importantly, the new architecture finally allows us to implement some Big Things with SmartAssembly we've been planning for many months; these will fundamentally change how you build, release and monitor your application. Stay tuned for further updates!

    Read the article

  • JDeveloper 11g R1 (11.1.1.4.0) - New Features on ADF Desktop Integration Explained

    - by juan.ruiz
    One of the areas that introduces many new features in the latest release (11.1.1.4.0) of JDeveloper 11g R1 is ADF Desktop Integration; in this article I’ll provide an overview of these new features.
    New ADF Desktop Integration ribbon in Excel: After installing the ADF Desktop Integration add-in, and depending on the mode in which you open the desktop integration workbook, the ADF Desktop Integration ribbon for design time or runtime is displayed as a separate tab within Excel. In previous versions the ADF Desktop Integration environment was placed inside the Add-Ins tab. Above you can see both the design time ribbon and the runtime ribbon. On the design time ribbon you can manage the workbook and worksheet properties, worksheet component properties, diagnostics, and the execution and publication of the workbook. The runtime version of the ribbon is totally customizable and represents what used to be the runtime menu on the spreadsheet; in this ribbon you can include all the operations and actions that can be executed by the end user while working with the spreadsheet data.
    Diagnostics: A very important aspect for developers is how to debug or verify the interactions of the client with the server, and for that ADF Desktop Integration has provided a series of diagnostic tools since day one. In this release the diagnostic tools are more visible and are really easy to configure. You can access the client console while testing the workbook, or you can simply dump all the messages to a log file, with the ability to set the output level for both.
    Security: There are a number of enhancements around security, but the one with the most impact for developers is that security is now optional when using ADF Desktop Integration. Until this version it was a must that the application was previously secured. In this release security is optional, which means that if you have previously defined security on your application, then you must secure the ADFdi servlet as explained in one of my previous posts (ADD LINK). On the other hand, if by the time you start working with ADFdi you have not defined security, you can test and publish your workbooks without adding security.
    Support for continuous integration: In this release we have added tooling for continuous integration builds. In the ADF Desktop Integration space, the concept translates to functionality that developers can use to publish ADFdi workbooks as part of their entire application build. For that purpose, there is a publish tool that can easily be invoked from an ANT task, such that all the design time workbooks are re-published into the latest version of the application during the build process.
    Key column: At runtime, on any worksheet containing editable tables you will notice a new additional column called the key column. The purpose of this column is to make the end user aware that all rows on the table need to be selected at the time of sorting. Users cannot alter the value of this column. From the developer's point of view, there are no steps required in order to have the key column included in the worksheets.
    Installation and creation of new workbooks: Both use cases can now be executed directly from JDeveloper. As part of the Tools menu options, the developer can install the ADF Desktop Integration designer. Also, creating new workbooks, which previously was done through the convert tool shipped with JDeveloper, is now done automatically from the New Gallery.
    Creating a new ADFdi workbook adds metadata to the Excel workbook so you can work in design time.
    Other enhancements: This release adds support for Excel 2010. Also, ADF components that are enabled as read-only no longer allow their value to be changed – the cell in Excel is automatically protected. This behavior could cause confusion among customers of previous releases.

    Read the article

< Previous Page | 841 842 843 844 845 846 847 848 849 850 851 852  | Next Page >