Search Results

Search found 2748 results on 110 pages for 'maintenance plans'.


  • Bridging the gap between developers and testers with VS 2010

    - by Etienne Tremblay
    Hey everyone, I know it’s been an eternity since I blogged, but I have so much to do that I unfortunately need to prioritize. Vincent Grondin and I did a 7-hour presentation on the new developer and tester tools available in the VS 2010 suite. It was a blast. We did it in front of an audience of around 120 and it was taped. We did it as a play and really didn’t look at the crowd at all; we were training each other on the technology. It is now available for anyone who would like to watch it at this location: http://www.devteach.com/ALM-TFS2010-Bridgingthegap.aspx. What we covered in the full-day event: Migration to TFS 2010 (10h00): 1. Migration of VSS to TFS (20 min.); 2. Automating the build, something you can't do with VSS (20 min.); 3. User story, the real application context for this presentation (20 min.). 10h00 pause. Manual tests by devs (11h30): 4. Adding a tester to the team, an intro to MTM (20 min.); 5. Defining tests and what a "white bug" is (20 min.); 6. Fixing the bug, showing IntelliTrace and playing back the test (20 min.). 12h15 lunch. Manual testing for maintenance (13h30): 7. Implementing a new feature (web service), identifying the bug with MTM, branching for a production fix and adding a new build script (20 min.); 8. Fixing the bug in the production branch, playing back the tests and merging the change into the main branch (20 min.). Manual testing with Lab Manager (14h30): 9. Intro to Lab Manager and environments (20 min.); 10. Changing the build script to deploy to the lab and testing with the web service in the lab environment (20 min.). 15h15 pause. Automating UI tests with Coded UI (15h30): 11. Reducing the effort of testing the UI (20 min.); 12. Repeating tests to make sure the application is working properly (20 min.); 13. Automating Coded UI tests with the lab environment (20 min.). 16h30 conclusions. As you can see, lots of stuff! Enjoy the show and let us know how you like it. Cheers, ET. Technorati Tags: VS 2010, Testing Tools, ALM, Training

    Read the article

  • NRF Week - Disney Store Tour

    - by sarah.taylor(at)oracle.com
    Disney has created a real buzz at this year's NRF event. Yesterday morning we began the Oracle Retail Exchange program with a visit to the flagship Disney store in Times Square. Additionally, Oracle made a key announcement with Disney on Oracle Retail's Point of Sale implementation in 330 stores worldwide. Today Disney's Steve Finney gave a super session on The Magic of Disney at the NRF Big Show. We also saw Disney making an exclusive news announcement about their plans for global store openings at the Oracle trade show stand - with a little help from Mickey and Minnie Mouse. Disney Stores have been entirely reinvented since the company took ownership in 2008, after previously franchising the retail arm of the business. They have subsequently been a strong Oracle partner, and technology has played a key role in their reimagining of the store environment. The new Imagination stores have 20% higher footfall and margins are up 25%. The Disney brand is synonymous with magical and memorable experiences for children of all ages. The company is achieving a unique retail experience that delights children and shareholders alike! Technology is a key pillar in helping to deliver on both a strong operating model and a unique customer experience - the best thirty minutes in a child's day is their aim. Steve Finney said this morning that their technology has to be as reliable as a theme park ride. Store experiences are much more enjoyable when there are short waiting times and children can interact with their favourite characters through magic mirrors, mobile point of sale, touch screens and custom animations that are digitally transmitted to stores globally. The Oracle Retail Point of Sale with iPad touch screens reduces checkout times, stores customer data, ensures that promotions are delivered accurately and reduces losses. This means higher levels of guest conversion, and increased availability and convenience for customers who want to check availability at other locations. Disney is a pioneer. At NRF's 100th show, we had the privilege of learning from a retailer using technology as a creative force to drive their business forward.

    Read the article

  • What do Oracle VM Templates for JD Edwards EnterpriseOne 9.1 have to do with your Next Vacation!

    - by Monica Kumar
    Oracle VM Templates for JD Edwards EnterpriseOne 9.1 Update 2 and JD Edwards EnterpriseOne Tools 9.1.3.3 are now available for download from the Oracle Software Delivery Cloud. So, what do Oracle VM Templates have to do with your next vacation? Well, how about time savings so you can plan that next vacation, and peace of mind since Oracle did the work and the testing for you! What’s inside the new Oracle VM Templates release? The Oracle VM Templates for JD Edwards EnterpriseOne enable you to rapidly install JD Edwards EnterpriseOne. The complete stack includes: JD Edwards EnterpriseOne Applications Release 9.1 Update 2; JD Edwards EnterpriseOne Tools 9.1 Update 3, maintenance pack 3 (9.1.3.3); Oracle Database 11g Release 2; Oracle WebLogic Server 10.3.5 - all pre-configured and pre-tested to run on Oracle Linux 5 (yes, the OS is included in the template!); Oracle Business Intelligence Publisher 11.1.1.7.1 for use with JD Edwards EnterpriseOne One View Reporting; and the JD Edwards EnterpriseOne Business Services Server and Oracle Application Development Framework (ADF) 11.1.1.5, for use with the JD Edwards EnterpriseOne Mobile Applications - all pre-tuned to support up to 100 interactive users. The templates can be installed to Oracle VM Server for x86 release 3.1 or later, to the Oracle Exalogic Elastic Cloud, and to the Oracle Database Appliance. Simply visit http://edelivery.oracle.com/oraclevm, download and unzip the files, read the readme, and you’re ready to go. How long would it take you to install each of the components above and configure and tune them all from scratch? We know that you can get 7-10x faster deployment using the Oracle VM Templates. Now, how about that snorkeling trip to Belize!

    Read the article

  • How to build MVC Views that work with polymorphic domain model design?

    - by Johann de Swardt
    This is more of a "how would you do it" type of question. The application I'm working on is an ASP.NET MVC4 app using Razor syntax. I've got a nice domain model which has a few polymorphic classes, awesome to work with in the code, but I have a few questions regarding the MVC front-end. Views are easy to build for normal classes, but when it comes to the polymorphic ones I'm stuck on deciding how to implement them. One (ugly) option is to build a page which handles the base type (e.g. IContract) and has a bunch of if statements to check whether we passed in an IServiceContract or ISupplyContract instance. Not pretty and very nasty to maintain. Another option is to build a view for each of these IContract child classes, breaking DRY principles completely. I don't like doing this for obvious reasons. Another option (also not great) is to split the view into chunks with partials and build partial views for each of the child types that are loaded into the main view for the base type, deciding whether to show or hide each partial with a single if statement inside it. Also messy. I've also been thinking about building a master page with sections for the fields that only occur in subclasses, and building views for each subclass that reference the master page. This looks like the least problematic solution? It will allow for fairly simple maintenance and it doesn't involve code duplication. What are your thoughts? Am I missing something obvious that will make our lives easier? Suggestions?
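    One more variation worth sketching: let the parent view dispatch on the runtime type, so each concrete contract gets its own strongly typed partial with no if statements in the parent. This is only an illustrative sketch; the ContractNumber property and the partial file names are assumptions, not details from the question.

        @* Views/Contracts/Details.cshtml - hypothetical parent view typed to the base interface *@
        @model IContract

        <h2>Contract @Model.ContractNumber</h2>

        @* Renders _ServiceContract.cshtml or _SupplyContract.cshtml depending on the runtime type *@
        @Html.Partial("_" + Model.GetType().Name, Model)

    Each partial is then typed to its own subclass, so shared fields live in the parent view and subtype-specific fields live in exactly one place.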

    Read the article

  • What are Information Centers?

    - by user12244613
    Information Centers are similar to product pages in the Oracle Sun System Handbook. Many customers like the Oracle Sun System Handbook concept of a home page with all the product attributes, troubleshooting, etc. accessible from a single home page. This concept is now available for a range of Oracle Solaris, Systems, and Storage products. The Information Center for each product covers areas such as Overview, Hot Topics, and Patching and Maintenance. The Information Center pages are dynamically generated each night to ensure the latest content is available to you. Here are the top Solaris, Systems, and Storage Information Centers: Oracle Explorer Data Collector Oracle Solaris 10 Live Upgrade Oracle Solaris 11 Booting Information Center Oracle Solaris 11 Desktop and Graphics Information Center Oracle Solaris 11 Image Packaging System (IPS) Information Center Oracle Solaris 11 Installation Information Center Oracle Solaris 11 Product Information Center Oracle Solaris 11 Security Information Center Oracle Solaris 11 System Administration Information Center Oracle Solaris 11 Zones Information Center Oracle Solaris Crash Analysis Tool(SCAT) - Information Center Oracle Solaris Cluster Information Center Oracle Solaris Internet Protocol Multipathing (IPMP) Information Center Oracle Solaris Live Upgrade Information Center Oracle Solaris ZFS Information Center Oracle Solaris Zones Information Center CMT T1000/T2000 and Netra T2000 CMT T5120/T5120/T5140/T5220/T5240/T5440 Systems M3000/M4000/M5000/M8000/M9000-32/M9000-64 Management and Diagnostic Tools for Oracle Sun Systems Netra CT410/810 and Netra CT900 Network-Attached Storage (NAS) Oracle Explorer Data Collector Oracle VM Server for SPARC (LDoms) Pillar Axiom 600 SL3000 Tape Library Sun Disk Storage Patching and Updates Sun Fire 3800/4800/4810/6800/E2900/E4900/E6900/V1280 - Netra 1280/1290 Sun Fire 12K/15K/E20K/E25K Sun Fire X4270 M2 Server Sun x86 Servers T3 and T4 Systems Tape Domain Firmware V210/V240/V440/V215/V245/V445 Servers VSM (VTSS/VLE/VTCS)

    Read the article

  • Oracle University New Courses (Week 35)

    - by swalker
    Oracle University released the following new (versions of) courses recently: Fusion Middleware Oracle Directory Services 11g: Administration (5 days) Oracle SOA Suite 11g: Essential Concepts (Training on Demand) e-Business Suite R12 Oracle HRMS iRecruitment Fundamentals (Self-Study Course) R12 Oracle Payroll Fundamentals: Administration (Self-Study Course) R12 Oracle HRMS System Administration Fundamentals (Self-Study Course) R12 Oracle HRMS Self Service Fundamentals (Self-Study Course) R12 Oracle HRMS Implement and Use Fast Formula (Self-Study Course) R12 HRMS Work Structures Fundamentals (Self-Study Course) R12 HRMS Total Compensation Foundations (Self-Study Course) Siebel Siebel 8.1.x Chat and Voice Integration Using CCA (Self-Study Course) Siebel 8.1.x Search using Oracle Secure Enterprise Search (Self-Study Course) Siebel 8.1.x COM Web Services (Self-Study Course) Siebel 8.1.x COM Asset Based Order Management (Self-Study Course) Siebel 8.1.x COM: What is New in Product Configurator (Self-Study Course) Siebel 8.1.x COM Product Configurator Caching & Performance Management (Self-Study Course) Siebel 8.1.x COM PSP Engine Caching and Performance Management (Self-Study Course) Siebel 8.1.x Remote: Administration (Self-Study Course) Siebel 8.1.x Remote: Technical Foundations (Self-Study Course) Siebel Tools: Configuring Chart and Tree Applets (Self-Study Course) Sun - Server Administration SPARC SuperCluster Administration and Maintenance Seminar (2 days) OPN Only SPARC T4-Based Servers Installation Boot Camp (1 day) Primavera Primavera P6 Application Administration Rel 8.x (2 days) Oracle Retail Retail Merchandising System (RMS) Business Overview (Self-Study Course) Retail Invoice Matching (ReIM) Product Overview (Self-Study Course) Retail Invoice Matching (ReIM) Business Introduction (Self-Study Course) Retail Demand Forecasting: RDF Classic Product Overview (Self-Study Course) Retail Demand Forecasting Introduction (Self-Study Course) Retail Data Warehouse (RDW) Overview 13.1 (Self-Study Course) Oracle Retail Point-of-Service (POS) Product Overview (Self-Study Course) Retail Sales Audit (ReSA) Product Overview (Self-Study Course) Retail Price Management (RPM) Product Overview (Self-Study Course) Retail Merchandising System (RMS) Technical Introduction (Self-Study Course) Oracle Retail Integration Bus (RIB) Product Overview (Self-Study Course) Oracle Communications Unified Communications Suite Convergence Customization (2 days) OSM Foundations I: Tasks, Processes and Orders. Get in contact with your local Oracle University team for more details and course dates. Stay Connected to Oracle University: LinkedIn OracleMix Twitter Facebook Google+

    Read the article

  • Inevitable Corporate Bureaucracy

    - by Ahsan Alam
    Top executives of most smaller organizations want their companies to be different from the larger corporations. They want their organizations smaller in size but bigger in productivity, eliminating red tape and corporate bureaucracy. When the company is smaller, people often work like firefighters – taking on new business and technology challenges without thinking about any procedures and guidelines. People also tend to wear many hats to accomplish tasks quickly in order to integrate new businesses. For example, software developers in smaller organizations may take on responsibility for client interactions, requirements gathering, design and development, code deployment, production support, network infrastructure support, and database design and maintenance, along with countless other duties. In addition, systems in smaller organizations tend to be loosely guarded, so people often don't follow many procedures when setting up environments and implementing technical projects. It's not uncommon to change code and deploy without anyone realizing. Similarly, business requirements may also get defined in an informal manner without any type of documentation. As the company grows, everything starts to change, significantly impacting people and the overall business process. Suddenly, following procedures becomes extremely important. Consequently, new roles, guidelines and procedures start to emerge. Everything from business process to technology implementation starts to become more and more process oriented. Organizations start to define and document steps, invent procedures to track process- and system-level changes, and start restricting access to various systems for security reasons. At the same time, as growing companies start doing business with larger clientele, they are automatically forced to abide by all sorts of industry compliance laws. Moreover, growing companies tend to recruit experienced individuals to fill new roles, who usually bring their expertise from larger and more bureaucratic organizations. Despite the best efforts of the top executives, it seems that the increased number of procedures and guidelines, as well as the new recruits, automatically contributes to the evolution of corporate bureaucracy. Maybe corporate bureaucracy is an inevitable side effect of a growing organization.

    Read the article

  • Javascript autotab function not working on iPad or iPhone [migrated]

    - by freddy6
    I have this piece of HTML code: <form name="postcode" method="post" onsubmit="return OnSubmitForm();"> <input class="postcode" maxlength="1" size="1" name="c" onKeyup="autotab(this, document.postcode.o)" /> <input class="postcode" maxlength="1" size="1" name="o" onKeyup="autotab(this, document.postcode.d)" /> <input class="postcode" maxlength="1" size="1" name="d" onKeyup="autotab(this, document.postcode.e)" /> <input class="postcode" maxlength="1" size="1" name="e" /> <br /> </form> which uses this JavaScript: <script> /* Auto tabbing script- By JavaScriptKit.com http://www.javascriptkit.com This credit MUST stay intact for use */ function autotab(original,destination){ if (original.getAttribute&&original.value.length==original.getAttribute("maxlength")) destination.focus() } </script><script src="http://ajax.googleapis.com/ajax/libs/jquery/1.4/jquery.min.js" type="text/javascript"></script> <script src="js/scripts.js" type="text/javascript"></script> <script type="text/javascript"> function OnSubmitForm() { if(document.postcode.operation[0].checked == true) { document.postcode.action ="plans.php"; } else if(document.postcode.operation[1].checked == true) { document.postcode.action ="plans_gas.php"; } else if(document.postcode.operation[2].checked == true) { document.postcode.action ="plans_duel.php"; } return true; } </script> As soon as you enter one character into one of the text boxes it automatically tabs across to the next text box. This works fine on a PC or Mac, in Safari and in all other browsers. But when viewing the webpage on an iPad or iPhone (using Safari) the auto tabbing function does not work. Any ideas on how to make the auto tab work on these mobile devices?
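    A commonly suggested workaround (offered here only as a sketch, not a verified fix for every iOS version) is to drive the same autotab() function from the input event, which mobile Safari tends to fire more reliably than keyup:

        <script>
        // Attach 'input' listeners in addition to the existing onKeyup handlers.
        // Run this after the form has been parsed, e.g. just before </body>.
        function wireAutotab(fromName, toName) {
          var from = document.postcode[fromName];
          var to = document.postcode[toName];
          from.addEventListener('input', function () { autotab(from, to); }, false);
        }
        wireAutotab('c', 'o');
        wireAutotab('o', 'd');
        wireAutotab('d', 'e');
        </script>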

    Read the article

  • Does a team of developers need a manager?

    - by Amadiere
    Background: I'm currently part of a team of four: 1 manager, 1 senior developer and 2 developers. We do a range of bespoke in-house systems / projects (e.g. 6-8 weeks) for an organisation of around 3500 staff, as well as all the maintenance and support required for the systems that have been created before. There are not enough of us to do all the work that is potentially coming our way - we're understaffed. Management acknowledge this, but budget constraints limit our ability to recruit additional members to the team (even if we would make the salary back in savings). The change: this leaves us where we are now. Our manager is due to leave his role for pastures new, leaving a vacancy in the team. Management are using this opportunity to restructure our team, which would see the team manager role replaced by another developer and another senior developer. Their logic being that we need more developers, so here's a way of funding it (one of the roles is partially funded from another vacant post). The team would have no direct line manager, and the roles and responsibilities would be divided up between the seniors and the (relatively new to post) service manager (a non-technical role with little-to-no development knowledge/experience whose focus is shared amongst a number of other teams and individuals) - who would be our next actual manager up the food chain. I guess the final question is: is it possible to run a development team without a manager? Have you had experience of this? And what things could go wrong / could be of benefit to us? I'd ideally like to "see the light" and the benefits of doing things this way, or come up with some points for argument against it.

    Read the article

  • In the Aggregate: How Will We Maintain Legacy Systems?

    - by Jim G.
    NEW YORK - With a blast that made skyscrapers tremble, an 83-year-old steam pipe sent a powerful message that the miles of tubes, wires and iron beneath New York and other U.S. cities are getting older and could become dangerously unstable. - July 2007 story about a burst steam pipe in Manhattan. We've heard about software rot and technical debt. And we've heard from the likes of "Uncle Bob" Martin, who warned us about "the consequences of making a mess", and Michael C. Feathers, who gave us guidance in 'Working Effectively With Legacy Code'. So certainly the software engineering community is aware of these issues. But I feel like our aggregate society does not appreciate how these issues can plague working systems and applications. As Steve McConnell notes: "...Unlike financial debt, technical debt is much less visible, and so people have an easier time ignoring it." If this is true, and I believe that it is, then I fear that governments and businesses may defer regular maintenance and fortification against hackers until it is too late. [Much like NYC and the steam pipes.] My question: do you share my concern? And if so, is there a way that we can avoid the software equivalent of NYC and the steam pipes?

    Read the article

  • eSTEP TechCast - December 2011 - Solaris 11 - The First Cloud OS

    - by uwes
    Dear partner, we are pleased to announce our next eSTEP TechCast on Thursday 1st December and would be happy if you could join. Please see below the details for the next TechCast. Date and time: Thursday, 01 December 2011, 11:00 - 12:00 GMT (12:00 - 13:00 CET). Abstract: Solaris 11 contains many new features, particularly around improved virtualisation and network performance. Additionally, new software packaging for fool-proof upgrades, higher availability and reduced maintenance windows replaces the former SVR4 packaging and upgrade/patching methods. Target audience: Tech Presales. Speaker: Andrew Gabriel. Call info: Call-in toll-free number: 08006948154 (United Kingdom); Call-in toll-free number: +44-2081181001 (United Kingdom); Show global numbers; Conference Code: 803 594 3; Security Passcode: 9876. Webex info (Oracle Web Conference): Meeting Number: 597 686 322; Meeting Password: tech2011. Playback / Recording / Archive: The webcasts will be recorded and will be available shortly after the event in the eSTEP portal under the Events tab, where you can also find material from previously delivered eSTEP TechCasts. Use your email address and PIN: eSTEP_2011 to get access. Feel free to have a look. We are happy to get your comments and feedback. Thanks and best regards, Partner HW Enablement EMEA

    Read the article

  • PowerShell to fetch a SQL Execution Plan

    - by Rob Farley
    With PowerShell becoming the scripting language of choice for many people, I’ve occasionally wondered about using it to analyse execution plans. After all, an execution plan is just XML, and PowerShell is just one tool which will very easily handle XML. The thing is – there’s no Get-SqlPlan cmdlet available, which has frustrated me in the past. Today I figured I’d make one. I know that I can write T-SQL to get an execution plan using SET SHOWPLAN_XML ON, but the problem is that this must be the only statement in a batch. So I used GO, and a couple of newlines, and whipped up the following one-liner: function Get-SqlPlan([string] $query, [string] $server, [string] $db) { return ([xml] (invoke-sqlcmd -Server $server -Database $db -Query "set showplan_xml on;`ngo`n$query").Item(0)) } (but please bear in mind that I have the SQL snap-ins installed, which provide invoke-sqlcmd). To use this, I just do something like: $plan = get-sqlplan "select name from Production.Product" "." "AdventureWorks" and then find myself with an easy way to navigate through an execution plan! At some point I should make the function more robust, but this should be a good starter for any SQL PowerShell enthusiasts (like Aaron Nelson) out there.
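    As a starting point for that robustness work, here is one possible hardened variant. It is only a sketch: it assumes the same SQL snap-ins that provide Invoke-Sqlcmd are loaded, and the parameter validation and error handling are additions rather than part of Rob's one-liner.

        function Get-SqlPlan {
            [CmdletBinding()]
            param (
                [Parameter(Mandatory = $true)] [string] $Query,
                [Parameter(Mandatory = $true)] [string] $Server,
                [Parameter(Mandatory = $true)] [string] $Db
            )
            try {
                # SET SHOWPLAN_XML ON must be the only statement in its batch,
                # hence the embedded GO separators before the actual query.
                $batch = "set showplan_xml on;`ngo`n$Query"
                $row = Invoke-Sqlcmd -Server $Server -Database $Db -Query $batch -ErrorAction Stop
                return [xml] $row.Item(0)
            }
            catch {
                Write-Error "Could not fetch the execution plan: $_"
            }
        }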

    Read the article

  • Knowledge Transfer without a Plan

    - by Kanini
    Hello... We are doing work for a particular client, managing their CRM implementation. (The CRM itself is a product which has been largely customized to suit my client's needs.) Now, they want us to manage the Oracle batch jobs/ETL as well, and for this they are ready to provide us with knowledge transfer. (The Oracle batch jobs/ETL are managed in-house by the client now.) After much persuasion, I got one of the Project Leads (designation-wise) to email the client asking for a KT plan. (The Project Lead kept saying that they have never had KT plans before and all that, so I offered to draft a template, and even that was rejected!) Email from us to them - "Can you please share with us the KT Plan?" Response from them - "Not sure what is expected from my side? The KT is planned for tomorrow from 11 am onwards, where functional knowledge of the existing ETL data migration package will be shared." How do you handle such a client? Most likely what is going to happen is this: the person who is giving the KT will say "I have given complete knowledge transfer", and we will go back and say "No, this was not covered. For this, they provided an overview alone and left it at that!" and so on... My Project Lead also did not respond to that email. He just said that the meeting is scheduled to happen at 11 AM (basically repeating whatever the email said) and left for the day! What could I possibly do? PS: "Look for another job" is a very helpful answer, but I am not looking for it. :-)

    Read the article

  • Only One Month to OpenWorld-San Francisco!

    - by Stephen Slade
    From around the world, the city is expecting 50,000+ guests to flock to this annual extravaganza. Over 2,000 sessions will focus on Oracle’s latest product offerings, customer case studies, panels of experts and a variety of other hardware, technology, middleware and applications topics. For those interested in the latest capabilities delivered by Oracle’s supply chain applications, the ‘Focus-On’ documents are now available to help guide you in your schedule builder. Schedule builder lets you create a personalized agenda for the sessions you wish to attend, such as: Monday, October 1, 2012, 3:15 pm – 4:15 pm, General Session: Supply Chain Management—Strategy, Update, and Roadmap, Richard Jewell, Senior Vice President, Applications Development, Oracle, in Moscone West, Level 2, Room 3014; and Tuesday, October 2, 2012, 10:15 am – 11:15 am, Oracle Fusion Supply Chain Management: Overview, Strategy, Customer Experiences, and Roadmap, Jon Chorley, CSO & VP, Product Strategy, Oracle, in Moscone West, Level 2, Room 2006. There is an exciting lineup of about 100 supply chain sessions at OpenWorld. Contact your sales rep or Oracle partner to obtain a copy of the most current Focus-On document, segmented by pillars such as Manufacturing, Maintenance/EAM, Value Chain Planning, Value Chain Execution, Procurement and Agile/Product Lifecycle Management. They will provide you with a better-informed view to schedule your time in San Francisco.

    Read the article

  • Running Non-profit Web Applications on Cloud/Dedicated Hosting [closed]

    - by cillosis
    Possible Duplicate: How to find web hosting that meets my requirements? I often build web applications purely because I enjoy it. I like building useful tools or open source applications that don't come with a price tag. That being said, many of these applications can be quite complex, requiring services beyond shared hosting (e.g. specific PHP extensions). This leaves me with two options: make the web application less complex and run it on shared hosting, or fork out money for cloud or dedicated/VPS hosting. Considering the application is free (I don't make money off of it intentionally), the money for hosting comes out of my own pocket. I know I am not alone in this sticky situation. So the question is: what are the hosting options that provide more advanced features, such as shell access via SSH and the ability to install specific software/extensions (e.g. if I wish to use a NoSQL DB such as Redis, MongoDB, or Cassandra), at a free or low price point? I know free usually equates to bad/unreliable hosting -- but that's not always the case. There are a couple of providers with free plans I know of: Amazon EC2 (free micro-instance for 1 year) and AppHarbor (cloud-based .NET web application hosting with a free plan). What else is available for hosting non-profit applications?

    Read the article

  • Database Schema Usage

    - by CrazyHorse
    I have a question regarding the appropriate use of SQL Server database schemas and was hoping that some database gurus might be able to offer some guidance around best practice. Just to give a bit of background, my team has recently shrunk to 2 people and we have just been merged with another 6-person team. My team had set up a SQL Server environment running off a desktop, backing up to another desktop (and nightly to the network), whilst the new team has a formal SQL Server environment, running on a dedicated server, with backups and maintenance all handled by a dedicated team. So far it's good news for my team. Now to the query. My team designed all our tables to belong to a 3-letter schema name (e.g. User = USR, General = GEN, Account = ACC) which, broadly speaking, relate to specific applications, although there is a lot of overlap. My new team has come from an Access background and has implemented their tables within dbo with a 3-letter prefix followed by "_tbl", so the examples above would be dbo.USR_tblTableName, dbo.GEN_tblTableName and dbo.ACC_tblTableName. Further to this, neither my old team nor my new team has gone live with their SQL Servers yet (we're both coincidentally migrating away from Access environments), and the new team have said they're willing to consider adopting our approach if we can explain how this would be beneficial. We are not anticipating handling table updates at schema level, as we will be using application-level logins. Also, with regards to the unwieldiness of the 7-character prefix, I'm not overly concerned myself as we're using LINQ almost exclusively, so the tables can simply be renamed in the DBML (although I know that presents some challenges when we update the DBML). So therefore, given that both teams need to be aligned with one another, can anyone offer any convincing arguments either way?
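    For what it's worth, the sort of illustration often used in this discussion looks like the following (object and role names are invented for the example): with real schemas, the grouping lives in the namespace rather than in a name prefix, and anything scoped to the schema comes along for free, even if schema-level grants aren't needed on day one.

        -- Hypothetical illustration: the ACC grouping as a schema instead of a dbo prefix
        CREATE SCHEMA ACC AUTHORIZATION dbo;
        GO
        CREATE TABLE ACC.AccountTransaction
        (
            AccountTransactionId int IDENTITY(1,1) PRIMARY KEY,
            Amount               decimal(18,2)     NOT NULL
        );
        GO
        -- One grant covers every current and future object in the schema,
        -- should schema-scoped security ever be wanted later on.
        CREATE ROLE AccountingAppRole;
        GRANT SELECT, INSERT, UPDATE ON SCHEMA::ACC TO AccountingAppRole;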

    Read the article

  • #mvvmlight V4 update for Win8 RTM

    - by Laurent Bugnion
    With Windows 8 RTM out the door (at least for some of us), it was also time to create an update to MVVM Light. I selected the V4 RTM to do this (V4.0.23). This RTM version was released a few weeks ago without many bells and whistles because I was just too busy to write much about it. Now, after some vacation, I will resume blogging on all my favorite topics, including of course MVVM Light. Upgrade: upgrading the installations should not require an uninstall, so try to simply run the MSI downloaded from the Download section at http://mvvmlight.codeplex.com/. Should you encounter any issues, try to uninstall the old version first following the steps at http://galasoft.ch/mvvm/cleaning/. Upgrading current apps from Win8 RP to Win8 RTM: I didn’t recompile the assemblies of MVVM Light, so if you had a version running with V4.0.23 on Windows 8 RP, you should be able to use the same DLLs on Windows 8 RTM. If you were using earlier versions, however, I would recommend doing an upgrade. I noticed a warning regarding the signing certificate. It is due to the PFX key, which appears to be outdated after the upgrade to Windows 8 RTM. I solved this warning by replacing the old PFX key with a new one I copied from a new project. The warning did not cause the build to fail though. About MVVM Light V4 RTM: the RTM finalizes quite a lot of issues. The change log is available at http://galasoft.ch/mvvm/installing/changes/. There are some issues that are either still open or that popped up since then, and I am working on V4.1 to be released in the next few weeks. In addition to that, I have plans to support Windows Phone 8 (when it comes) and have a nice list of ideas for V5 with a few new components. Thanks again for your continuous support of MVVM Light! Laurent Bugnion (GalaSoft)

    Read the article

  • Guidance in naming awkward domain-specific objects?

    - by GlenH7
    I'm modeling a chemical system, and I'm having problems with naming my objects within an enum. I'm not sure if I should use: the atomic formula, the chemical name, or an abbreviated chemical name. For example, sulfuric acid is H2SO4 and hydrochloric acid is HCl. With those two, I would probably just use the atomic formula as they are reasonably common. However, I have others like sodium hexafluorosilicate which is Na2SiF6. In that example, the atomic formula isn't as obvious (to me) but the chemical name is hideously long: myEnum.SodiumHexaFluoroSilicate. I'm not sure how I would be able to safely come up with an abbreviated chemical name that would have a consistent naming pattern. From a maintenance point of view, which of the options would you prefer to see and why? Some details from comments on this question: the audience for the code will be just programmers, not chemists; I'm using C#, but I think this question is more interesting when ignoring the implementation language; I'm starting with 10 - 20 compounds and would have at most 100 compounds; the enum is to facilitate common calculations - the equation is the same for all compounds but you insert a property of the compound to complete the equation. For example, molar mass (in g/mol) is used when calculating the number of moles from a mass (in grams) of the compound. Another example of a common calculation is the Ideal Gas Law and its use of the Specific Gas Constant.
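    To make the trade-off concrete, here is a minimal sketch of the formula-as-member approach (the member names and molar mass values are illustrative, not taken from the question): the enum stays terse, the long chemical name moves into comments or data, and the calculation pulls the property it needs from a lookup.

        using System.Collections.Generic;

        public enum Compound
        {
            H2SO4,     // sulfuric acid
            HCl,       // hydrochloric acid
            Na2SiF6    // sodium hexafluorosilicate
        }

        public static class CompoundData
        {
            // Molar masses in g/mol (illustrative values)
            private static readonly Dictionary<Compound, double> MolarMass =
                new Dictionary<Compound, double>
                {
                    { Compound.H2SO4,   98.08 },
                    { Compound.HCl,     36.46 },
                    { Compound.Na2SiF6, 188.06 }
                };

            // n (mol) = m (g) / M (g/mol)
            public static double MolesFromGrams(Compound compound, double grams)
            {
                return grams / MolarMass[compound];
            }
        }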

    Read the article

  • Free Oracle Special Edition eBooks - Cloud Architecture & Enterprise Cloud

    - by Thanos
    Cloud computing can improve your business agility, lower operating costs, and speed innovation. The key to making it work is the architecture. Learn how to define your architectural requirements and get started on your path to cloud computing with the free Oracle special edition e-book, Cloud Architecture for Dummies. Topics covered in this quick reference guide include: cloud architecture principles and guidelines; scoping your project and choosing your deployment model; and moving toward implementation with vertically integrated engineered systems. Learn how to architect and model your cloud implementation to drive efficiency and leverage economies of scale. For more information, visit oracle.com/cloud and our cloud services at cloud.oracle.com. Specifically, Infrastructure as a Service (IaaS) is critical to the success of many enterprises. Want to build a private cloud infrastructure and cut down IT costs? Learn more about Oracle's highly integrated infrastructure software and hardware to help you architect and deploy a cloud infrastructure that is optimized for the needs of your enterprise from day one. Download the free e-book Enterprise Cloud Infrastructure for Dummies to: realize the benefits of consolidation with the added cloud capabilities; simplify deployments and reduce risks with tested and proven guidelines; and achieve up to 50% lower TCO than comparable multi-vendor alternatives. Choosing the right infrastructure technologies is essential to capitalizing on the benefits of cloud computing. Oracle Optimized Solution for Enterprise Cloud Infrastructure helps identify the right hardware and software stack and provides configuration guidelines for your cloud. With this book, you come to understand enterprise cloud infrastructure and find out how to jumpstart your IaaS cloud plans. You also discover Oracle Optimized Solutions and learn how integration testing and proven best practices maximize your IT investments. In addition, you see how to architect and deploy your IaaS cloud to drive down costs and improve performance, how to understand and select the right private cloud strategy for you, what the key cloud infrastructure elements are and how to use them to achieve your business goals, and more. For more information, visit oracle.com/oos.

    Read the article

  • Splitting up a Rails/Ruby app onto multiple servers

    - by craig.kaminsky
    We recently moved a large application to two machines, both running the same codebase. I. Machine A: web server for the public-facing application; receives webhook callbacks from our ESP; handles a few large, list-processing jobs (uploaded spreadsheets with data). II. Machine B: manages a massive set of (background) jobs but primarily focuses on building and assembling newsletters; runs all integration with our NetSuite platform; runs all system maintenance (read: DB) jobs. To me, having these two apps running the same codebase (a large, monolithic Rails application) seems 'wrong'. I am wondering if anyone has advice on how to better break up the code for these two apps. While they both need the same DB and, ultimately, the same model code, Machine B has no need for controllers and views, and it feels wasteful running a full-stack Rails app for its tasks. A couple of things came to mind, but I'm not sure if I'm trying to solve a problem that doesn't exist: break the models out into a sub-module on git and include it in both apps; or build out the Machine B app in plain Ruby or a lighter framework like Sinatra (where I could use ActiveRecord in combination with a sub-module for the model folder). I'm new to this scenario and appreciate any and all feedback or direction! Thank you.
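    A rough sketch of the second idea (plain Ruby with ActiveRecord, no full Rails stack) might look like the following; the adapter, model names and the assemble! method are placeholders, not details from the actual application.

        # machine_b/worker.rb - hypothetical standalone worker for Machine B
        require 'active_record'

        ActiveRecord::Base.establish_connection(
          adapter:  'mysql2',              # whatever adapter the Rails app already uses
          host:     'db.internal',
          database: 'main_app',
          username: 'worker',
          password: ENV['DB_PASSWORD']
        )

        # Shared models pulled in as a git sub-module, e.g. shared/models/*.rb
        Dir[File.expand_path('shared/models/*.rb', __dir__)].sort.each { |f| require f }

        # Machine B's jobs can then use the same models with no controllers or views
        class NewsletterAssembler
          def self.run
            Newsletter.where(status: 'pending').find_each do |newsletter|
              newsletter.assemble!   # hypothetical model method
            end
          end
        end

        NewsletterAssembler.run if $PROGRAM_NAME == __FILE__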

    Read the article

  • ArchBeat Link-o-Rama for December 5, 2012

    - by Bob Rhubart
    On the Cultural-Linguistic Turn | Richard Veryard "When an architect chooses to label something as a 'silo' or 'legacy,' or uses words like 'integrated' and 'standardized,' these may not always be objectively verifiable categories but subjective judgements, around which the architect may then weave an appropriate story." -- Richard Veryard Advanced Oracle SOA Suite presentations from Open World 2012 | Juergen Kress Oracle SOA and BPM Partner Community blogger Juergen Kress shares a list of 13 SOA presentations delivered or moderated by Oracle SOA Product Management at OOW12 in San Francisco. Coherence 101, Beware of cache listeners | Alexey Ragozin Alexey Ragozin's technical post will help you avoid trouble when working with the cache events facility in Oracle Coherence. 3 Key Cloud Insights for 2013 | CTO Blog Capgemini CTO blogger Ron Tolido highlights three "standout" insights from a recent Capgemini report on the business cloud. Access Control Lists for Roles | Kyle Hatlestad Oracle Fusion Middleware A-Team member Kyle Hatlestad shares background info and instructions for activating access control lists for roles in Oracle WebCenter UCM 11g PS5. Thought for the Day "If it ain’t broke, fix it anyway. You must invest at least 20% of your maintenance budget in refreshing your architecture to prevent good software from becoming spaghetti code." — Larry Bernstein Source: SoftwareQuotes.com

    Read the article

  • 24 Hours of PASS scheduling

    - by Rob Farley
    I have a new appreciation for Tom LaRock (@sqlrockstar), who is doing a tremendous job leading the organising committee for the 24 Hours of PASS event (Twitter: #24hop). We’ve just been going through the list of speakers and their preferences for time slots, and hopefully we’ve kept everyone fairly happy. All the submitted sessions (59 of them) were put up for a vote, with over a thousand of you picking your favourites. The top 28 sessions as voted were all included (24 sessions plus 4 reserves), and duplicates (when a single presenter had two sessions in the top 28) were swapped out for others. For example, both sessions submitted by Cindy Gross were in the top 28. These swaps were chosen by the committee to get a good balance of topics. Amazingly, some big names missed out, and even the top ten included some surprises. T-SQL, indexes and Reporting featured well in the top ten, and in the end, the mix between BI, Dev and DBA ended up quite nicely too. The ten most voted-for sessions were (in order): Jennifer McCown - T-SQL Code Sins: The Worst Things We Do to Code and Why Michelle Ufford - Index Internals for Mere Mortals Audrey Hammonds - T-SQL Awesomeness: 3 Ways to Write Cool SQL Cindy Gross - SQL Server Performance Tools Jes Borland - Reporting Services 201: the Next Level Isabel de la Barra - SQL Server Performance Karen Lopez - Five Physical Database Design Blunders and How to Avoid Them Julie Smith - Cool Tricks to Pull From Your SSIS Hat Kim Tessereau - Indexes and Execution Plans Jen Stirrup - Dashboards Design and Practice using SSRS I think you’ll all agree this is shaping up to be an excellent event.

    Read the article

  • Should we be able to deploy a single package to the SSIS Catalog?

    - by jamiet
    My buddy Sutha Thiru sent me an email recently asking my opinion on a particular nuance of the project deployment model in SQL Server Integration Services (SSIS) 2012, and I’d like to share my response as I think it warrants a wider discussion. Sutha asked: "Jamie, what is your take on this? http://www.mattmasson.com/index.php/2012/07/can-i-deploy-a-single-ssis-package-from-my-project-to-the-ssis-catalog/ Overnight I was talking to Matt, who confirmed that they have no plans to change the deployment model. For example, if we have the following scenario, how do we deploy? Sprint 1: Pkg1, 2 & 3 have been developed and deployed to UAT. Once signed off, they have been deployed to Live. Sprint 2: Pkg 4 & 5 have been developed. During this time users raised a bug on Pkg2. We want to make the change to Pkg2 and deploy that to UAT and eventually to LIVE without releasing Pkg 4 & 5. How do we do it? Matt pointed me to his blog entry, which I have seen before: http://www.mattmasson.com/index.php/2012/02/thoughts-on-branching-strategies-for-ssis-projects/ Thanks, Sutha" My response: Personally, even though I've experienced the exact problem you just outlined, I agree with the current approach. I steadfastly believe that there should not be a way for an unscrupulous developer to slide in a new version of a package under the covers. Deploying .ispac files brings a degree of rigour to your operational processes. Yes, that means that we as SSIS developers are going to have to get better at using source control and branching properly, but that is no bad thing in my opinion. Claiming to be proper "developers" is a bit of a cheap claim if we don't even do the fundamentals correctly. I would be interested in the thoughts of others who have used the project deployment model. Do you agree with my point of view? @Jamiet
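    To make the unit-of-deployment point concrete, here is an illustrative T-SQL call (the folder, project and file names are made up): even when only Pkg2 has changed, what the catalog accepts is the whole .ispac built from the appropriate branch, which is exactly the rigour described above.

        -- Deploying to the SSIS Catalog always takes the full project (.ispac)
        DECLARE @ispac varbinary(max) =
            (SELECT BulkColumn
             FROM OPENROWSET(BULK N'C:\Builds\Release\MyEtlProject.ispac', SINGLE_BLOB) AS f);

        DECLARE @operation_id bigint;
        EXEC SSISDB.catalog.deploy_project
             @folder_name    = N'Finance',
             @project_name   = N'MyEtlProject',
             @project_stream = @ispac,
             @operation_id   = @operation_id OUTPUT;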

    Read the article

  • Internal Data Masking

    - by ACShorten
    By default, the data in the product is unmasked for authorized users. If particular data within the object is considered a candidate for data masking, then the masking capabilities of the product can be used to mask the data in an appropriate fashion. The inbuilt data masking capabilities of the Oracle Utilities Application Framework use a number of configuration elements: An algorithm, of type F1-MASK, is specified to configure the elements of the data masking, including the masking character, the number of suffix characters left unmasked, characters to ignore in the string, and the application service, security type and authorization levels applicable to the mask. A Data Masking Feature Configuration is created to define where the algorithm applies. The specification of the feature allows you to define the fields to encrypt using the configured algorithm. The algorithm can be attached to a schema field, table field, characteristic, search field and even a child record (such as an identifier). The appropriate user groups are then connected to the application services with the appropriate service types and level to indicate whether the masking applies to the user group or not. For example, say there is a field called CCNBR in the product which holds the credit card details. I would create an algorithm, say CMformatCC, to mask the credit card number with the last few digits unmasked (as the standard in most systems dictates). I would specify on the field mask the following: field="CCNBR", alg="CMformatCC". On the algorithm CMformatCC, I would specify the mask, application service, security type and the authorization level at which users would see the credit card unmasked. To finish the configuration off and implement it, I would connect the appropriate user groups to the application service I specified, with the security type and appropriate authorization level for that group. Whenever a user accesses the CCNBR field on any of the maintenance screens, searches and other screens that use the CCNBR metadata definition, it would then be masked according to the user group that the user is a member of. Refer to the documentation supplied with the F1-MASK algorithm type entry for more examples of what is possible.

    Read the article

  • EM12c Release 4: Database as a Service Enhancements

    - by Adeesh Fulay
    Oracle Enterprise Manager 12.1.0.4 (or simply put, EM12c R4) is the latest update to the product. As with previous versions, this release provides tons of enhancements and bug fixes, contributing to improved stability and quality. One of the areas that is most exciting and has seen tremendous growth in the last few years is that of Database as a Service. EM12c R4 provides a significant update to Database as a Service. The key themes are: Comprehensive Database Service Catalog (includes single instance, RAC, and Data Guard); Additional Storage Options for Snap Clone (includes support for the database feature CloneDB); Improved Rapid Start Kits; Extensible Metering and Chargeback; Miscellaneous Enhancements. 1. Comprehensive Database Service Catalog. Before we get deep into the implementation of a service catalog, let's first understand what it is and what benefits it provides. Per ITIL, a service catalog is an exhaustive list of IT services that an organization provides or offers to its employees or customers. Service catalogs have been widely popular in the space of cloud computing, primarily as the medium to provide standardized and pre-approved service definitions. There is already some good collateral out there that talks about Oracle database service catalogs. The two whitepapers I recommend reading are: Service Catalogs: Defining Standardized Database Service, and High Availability Best Practices for Database Consolidation: The Foundation for Database as a Service [Oracle MAA]. EM12c has come with an out-of-the-box service catalog and self service portal since release 1. For customers, it provides the following benefits: present a collection of standardized database service definitions; define standardized pools of hardware and software for provisioning; role-based access to cater to different classes of users; automated procedures to provision the predefined database definitions; setup of chargeback plans based on service tiers and database configuration sizes; etc. Starting with Release 4, the scope of services offered via the service catalog has been expanded to include databases with varying levels of availability - Single Instance (SI) or Real Application Clusters (RAC) databases with multiple Data Guard based standby databases. Some salient points of the Data Guard integration: standby pools can now be defined across different datacenters or within the same datacenter as the primary (this helps in modelling the concept of near and far DR sites); the standby databases can be single instance, RAC, or RAC One Node databases; multiple standby databases can be provisioned, where the maximum limit is determined by the version of the database software; the standby databases can be in either mount or read-only (requires the Active Data Guard option) mode; all database versions 10g to 12c are supported (as certified with EM 12c); all three protection modes can be used - maximum availability, maximum performance, and maximum protection; log apply can be set to sync or async along with the required apply lag. The different service levels or service tiers are popularly represented using metals - Platinum, Gold, Silver, Bronze, and so on. The Oracle MAA whitepaper (referenced above) calls out the various service tiers as defined by Oracle's best practices, but customers can choose any logical combination of primary and standby topologies supported in EM 12cR4: SI with no standby, SI with SI standby, RAC with no standby, RAC with SI standby, RAC with RAC standby, RON with no standby, or RON with RON standby (where RON = RAC One Node, which is supported via custom post-scripts in the service template). A sample service catalog would look like the image below.
    Here we have defined 4 service levels, which have been deployed across 2 data centers, and have 3 standardized sizes. Again, it is important to note that this is just an example to get the creative juices flowing. I imagine each customer would come up with their own catalog based on the application requirements, their RTO/RPO goals, and the product licenses they own. In the screenwatch titled 'Build Service Catalog using EM12c DBaaS', I walk through the complete steps required to set up this sample service catalog in EM12c. 2. Additional Storage Options for Snap Clone. In my previous blog posts, I have described the Snap Clone feature in detail. Essentially, it provides a storage-agnostic, self service, rapid, and space-efficient approach to solving your data cloning problems. The net benefit is that you get incredible amounts of storage savings (on average 90%), all while cloning databases in a matter of minutes. Space and time, two things enterprises would love to save on. This feature has been designed with the goal of providing data cloning capabilities while protecting your existing investments in server, storage, and software. With this in mind, we have pursued the dual solution approach of hardware and software. In the hardware approach, we connect directly to your storage appliances and perform all low-level actions required to rapidly clone your databases. In the software approach, we use an intermediate software layer to talk to any storage vendor or any storage configuration to perform the same low-level actions. Thus we deliver the benefits of database thin cloning without requiring you to drastically change the infrastructure or IT's operating style. In release 4, we expand the scope of options supported by Snap Clone with the addition of database CloneDB. While CloneDB is not a new feature (it was first introduced in the 11.2.0.2 patchset), it has over the years become more stable and mature. CloneDB leverages a combination of the Direct NFS (or dNFS) feature of the database, RMAN image copies, sparse files, and copy-on-write technology to create thin clones of databases from existing backups in a matter of minutes. It essentially has all the traits that we want to present to our customers via the Snap Clone feature. For more information on CloneDB, I highly recommend reading the following sources: the blog post by Tim Hall, Direct NFS (DNFS) CloneDB in Oracle Database 11g Release 2, and the Oracle OpenWorld presentation by CERN, Efficient Database Cloning using Direct NFS and CloneDB. The advantages of the new CloneDB integration with EM12c Snap Clone are: space and time savings; ease of setup - no additional software is required other than the Oracle database binary; works on all platforms; reduces the dependence on storage administrators; the cloning process is fully orchestrated by EM12c, and delivered to developers/DBAs/QA testers via the self service portal; uses dNFS to deliver better performance, availability, and scalability than kernel NFS; and the complete lifecycle of the clones is managed by EM12c - performance, configuration, etc. 3. Improved Rapid Start Kits. DBaaS deployments tend to be complex and their setup requires a series of steps. These steps are typically performed across different users and different UIs. The Rapid Start Kit provides a single-command solution to set up Database as a Service (DBaaS) and Pluggable Database as a Service (PDBaaS).
    One command creates all the Cloud artifacts like Roles, Administrators, Credentials, Database Profiles, PaaS Infrastructure Zone, Database Pools and Service Templates. Once the Rapid Start Kit has been successfully executed, requests can be made to provision databases and PDBs from the self service portal. The Rapid Start Kit can create complex topologies involving multiple zones, pools and service templates. It also supports standby databases and the use of RMAN image backups. The Rapid Start Kit in reality is a simple emcli script which takes a bunch of XML files as input and executes the complete automation in a matter of seconds. On a full rack Exadata, it took only 40 seconds to set up PDBaaS end-to-end. This kit works both on Oracle's engineered systems, like Exadata and SuperCluster, and on commodity hardware. One can draw a parallel to the Exadata One Command script, which again takes a bunch of inputs from the administrators and then runs a simple script that configures everything from the network to provisioning the DB software. Steps to use the kit: The kit can be found under the SSA plug-in directory on the OMS: EM_BASE/oracle/MW/plugins/oracle.sysman.ssa.oms.plugin_12.1.0.8.0/dbaas/setup. It can be run from this default location or from any server which has the emcli client installed. For most scenarios, you would use the script dbaas/setup/database_cloud_setup.py. For Exadata, special integration is provided to reduce the number of inputs even further; the script to use for this scenario would be dbaas/setup/exadata_cloud_setup.py. The database_cloud_setup.py script takes two inputs: Cloud boundary XML: this file defines the cloud topology in terms of the zones and pools, along with host names, Oracle home locations or container database names that would be used as infrastructure for provisioning database services. This file is optional in the case of Exadata, as the boundary is well known via the Exadata system target available in EM. Input XML: this file captures inputs for users, roles, profiles, service templates, etc. - essentially, all inputs required to define the DB services and other settings of the self service portal. Once all the XML files have been prepared, invoke the script as follows for PDBaaS: emcli @database_cloud_setup.py -pdbaas -cloud_boundary=/tmp/my_boundary.xml -cloud_input=/tmp/pdb_inputs.xml and the script will prompt for passwords a few times for key users like sysman, cloud admin, SSA admin, etc. Once complete, you can simply log into EM as the self service user and request databases from the portal. More information is available in the Rapid Start Kit chapter of the Cloud Administration Guide. 4. Extensible Metering and Chargeback. Last but not least, Metering and Chargeback in release 4 has been made extensible in all possible regards. The new extensibility features allow customers, partners, system integrators, etc. to: extend chargeback to any target type managed in EM; promote any metric in EM as a chargeback entity; extend the list of charge items via metric or configuration extensions; and model abstract entities like the number of backup requests, job executions, support requests, etc. A slew of emcli verbs have also been added that allow administrators to create, edit, delete, and import/export charge plans, and assign cost centers, all via the command line. More information is available in the Chargeback API chapter of the Cloud Administration Guide. 5. Miscellaneous Enhancements. There are other miscellaneous, yet important, enhancements that are worth a mention.
    Most of these have been asked for by customers like you. These are: Custom naming of DB services - self service users can provide custom names for the DB SID, DB service, schemas, and tablespaces, and every custom name is validated for uniqueness in EM. 'Create like' of service templates - creating variants of a service template is now only a click away; this is vital when you publish service templates to represent different database sizes or service levels. Profile viewer - view the details of a profile, like data files, control files, snapshot IDs, export/import files, etc., prior to its selection in the service template. Cleanup automation for failed and successful requests - a single emcli command cleans up all remnant artifacts of a failed request; cleanup can be performed on a per-request basis or for the entire pool, and as an extension, you can also delete successful requests. Improved delete-user workflow - allows administrators to reassign cloud resources to another user or delete all of them. Support for multiple tablespaces for Schema as a Service - in addition to multiple schemas, users can also specify multiple tablespaces per request. I hope this was a good introduction to the new Database as a Service enhancements in EM12c R4. I encourage you to explore many of these new and existing features and give us feedback. Good luck! References: Cloud Management Page on OTN; Cloud Administration Guide [Documentation]. -- Adeesh Fulay (@adeeshf)

    Read the article
