Search Results

Search found 15099 results on 604 pages for 'runtime environment'.

Page 379/604 | < Previous Page | 375 376 377 378 379 380 381 382 383 384 385 386  | Next Page >

  • IOUG SIG Webcast on October 30th : Performance Tuning your DB Cloud

    - by Anand Akela
    The Oracle Enterprise Manager Special Interest Group (SIG) is a growing body of IOUG members who manage or are interested in all aspects of Oracle Enterprise Manager. The SIG is run by volunteers and supported by Oracle Enterprise Manager product managers and developers. Its purpose is to bring relevant information and education to users through webcasts, discussions, and networking, and to give members a forum for sharing their experiences. On October 30th at 10 AM Pacific time, the Oracle Enterprise Manager SIG is hosting a webcast on "Performance Tuning your DB Cloud in OEM 12c Cloud Control - 360 Degrees". In this webcast, Tariq Farooq, CEO of BrainSurface, and Mike Ault of Oracle will give a tutorial on monitoring and tuning the performance of an Oracle Database cloud environment. You will learn how to leverage Oracle Enterprise Manager for tuning, troubleshooting, and monitoring your Oracle Database Cloud ecosystem. The session covers lessons learned, tips and tricks, recommendations, best practices, gotchas, and more on how to effectively use Oracle Enterprise Manager Cloud Control 12c for quick, easy, and intuitive performance tuning of your Oracle Database Cloud. Session objectives:
    • Leveraging OEM 12c Cloud Control for Oracle DB tuning/monitoring
    • Limited deep dive on AWR
    • Oracle DB Cloud performance tuning
    • Best practices for DB Cloud maintenance/monitoring
    Register now! Stay connected: Twitter | Facebook | YouTube | LinkedIn | Google+ | Newsletter

    Read the article

  • The effects of Agile Programming can alter the five desirable properties of modeling tools and techniques

    The effects of agile programming can alter the five desirable properties of modeling tools and techniques documented by Pfleeger. The agile methodology promotes human understanding and communication through short, iterative software development life cycles, which force stakeholders to review the project and adjust it for any requirement changes. Because the project and its requirements are evaluated continually, processes are continually refined, upgraded, and compared against alternatives to ensure the best design is delivered to the client. Because of the short, repetitive development cycles, more time is devoted to process management, since requirements and designs may be constantly changing; this requires additional forecasting, monitoring, and planning for each iteration. Because things can change so rapidly, automated guidance in performing the process must be updated for each iteration, since the environment and the available reusable processes may change, and the original guidance and suggestions for the project also need to be updated to account for these changes. In essence, automation of process execution is supported by the agile methodology because during every iteration all processes must be tested and evaluated to ensure process integrity and compliance with the customer's requirements. I do not think the agile approach diminishes modeling; in fact, I think it increases it, because before the start of every development cycle the models must be checked for accuracy against the changed requirements. So the time saved by spending less effort up front designing the models is in fact spent again as the project completes each iteration.

    Read the article

  • EBS Workflow Overview & Best Practices - US

    - by Annemarie Provisero
    ADVISOR WEBCAST: EBS Workflow Overview & Best Practices - US. PRODUCT FAMILY: ATG - Workflow. February 17, 2011 at 17:00 UK / 18:00 CET / 09:00 am Pacific / 10:00 am Mountain / 12:00 Eastern. This 1.5-hour session is recommended for technical and functional users who want a general overview of the tools and utilities available for looking into the Java Virtual Machine used in an E-Business Suite environment and how to tune it. Topics will include: introduction to Workflow; useful utilities and tools; best practices; Q&A. A short live demonstration (only if applicable) and a question-and-answer period will be included. Oracle Advisor Webcasts are dedicated to building your awareness of our products and services. This session does not replace offerings from Oracle Global Support Services. Click here to register for this session. The above webcast is a service of the E-Business Suite Communities in My Oracle Support. For more information on other webcasts, please reference the Oracle Advisor Webcast Schedule. Click here to visit the E-Business Communities in My Oracle Support. Note that all links require access to My Oracle Support.

    Read the article

  • How to start a high school Java/Android development club for 13-17 year olds

    - by PaulHurleyuk
    My wife is a high school maths teacher, and she is considering starting a programming club for 13-17 year olds who show an interest. Their interest seems to be around apps and Android, which I have little experience of. The kids would presumably be interested in programming and have a fairly high level of computing knowledge. We would provide them with resources and some knowledge, but hopefully a lot would be self-guided. I'm hoping Stack Overflow users can provide some tips or starting points. Specific things I think I'll need are:
    A development environment: currently I'm leaning towards Java and Android, developed in Eclipse, probably installed on donated older hardware.
    Some initial direction: there seems to be a plethora of 'start Android' tutorials, so recommendations for good ones are valuable, as are recommended paper books.
    A target: some final project they should be shooting for.
    A route: this is where I'm most stuck - how to lead them through the required Java concepts and learning they would need.
    Some related questions already out there: Language+IDE for teaching high school students? Teaching "web design/development" to high-school home-school group. Good sources? How can I bootstrap a software development community at my school?

    Read the article

  • What to do when you're the interviewer and you don't like your job?

    - by emcb
    I'm in a sorta strange predicament, and I could use some advice. When I was interviewing for my current job, the job description I was given seemed pretty darn nice to me. Without going into the details, the job hasn't quite turned out the way it was advertised. The company is great and takes care of its employees, but for someone who cares about the code they write and the work they do, it's a bad environment - effectively, we operate between 0.5 and 1.0 on the Joel test, and due to political issues we're not going to move beyond that any time soon. Bitter? Maybe. OK... so I'm in the market for a new job. But that's not where my dilemma is. The problem I see coming is that I will be participating in interviewing candidates for a position on my team, and I'm not sure what to do. I've heard through the grapevine that we have some really solid, promising, fresh-out-of-college prospects coming in to interview, and I honestly dread the thought of somebody having their first experience of engineering in this department. So I'm wondering what I should do if/when the interviewee asks me:
    "Do you like your job?" (no)
    "What kind of projects would I be working on?" (mostly static HTML/CSS changes)
    Anything else that would elicit a negative answer if answered truthfully.
    Do I tell the truth, to give the candidate a real picture of the job? What if this scares them away, and what if it gets blamed on me? Do I fib or lie, saying we work on exciting projects with lots of flexibility, like the pitch my boss will give when the reality is quite different? Should I feel any kind of moral responsibility to let a promising young developer know that this isn't the job for them, or should I shut up and be 100% loyal to the company? Any approaches or advice would be appreciated. I hope I don't come across as overly dramatic - I honestly struggle with this question.

    Read the article

  • Small change in MVVM Light Toolkit templates for Blend 4 RC

    - by Laurent Bugnion
    Ah, the joy of new releases... You will find that the MVVM Light Toolkit works fine with Visual Studio 2010 RTM and Blend 4 RC except for a few adjustments. Blend templates: the path to the Expression Blend 4 project templates has changed. If you start Expression Blend 4 RC now, you will likely not see the MVVM Light templates in the New Project dialog. (Screenshot: New Project dialog with MVVM Light.) To restore the templates, follow these steps:
    1. Open Windows Explorer.
    2. Navigate to C:\Users\[username]\Documents\Expression (or simply type My Documents in Windows Explorer and then open the Expression folder).
    3. Rename the "Blend 4 beta" folder to "Blend 4".
    That's it; you should now see the templates in the New Project dialog in Blend 4. Note that since the new name is "Blend 4", I hope I won't need to repeat this exercise when Blend 4 RTM is released! Windows Phone 7 templates: since the Windows Phone 7 tools are not ready yet for Visual Studio 2010 RTM and Blend 4 RC, the templates in the Silverlight for Windows Phone folders will not work. You will get an error if you try to create such a project in the newly released environment. I hesitated to remove these templates from the current packages, but honestly that is a lot of trouble for a very short time before the tools for Windows Phone 7 are released (note: I don't have any information as to when these tools will be released). In the meantime, just don't create a WinPhone7 application. Reminder: if you want to write code for Windows Phone 7, you need to keep Visual Studio 2010 RC as well as Expression Blend 4 beta. Updated package: I uploaded an update to the Blend 4 templates. It is available, as before, on the "Install manually" page and on the CodePlex page. Laurent Bugnion (GalaSoft) Subscribe | Twitter | Facebook | Flickr | LinkedIn

    Read the article

  • Amazon AMIs and Oracle VM templates

    - by llaszews
    I have worked with Oracle VM templates and, most recently, with Amazon Machine Images (AMIs). The similarities in the functionality and capabilities they provide are striking. Just take a look at the definitions: An Amazon Machine Image (AMI) is a special type of pre-configured operating system and virtual application software which is used to create a virtual machine within the Amazon Elastic Compute Cloud (EC2). It serves as the basic unit of deployment for services delivered using EC2. (AWS AMIs) Oracle VM Templates provide an innovative approach to deploying a fully configured software stack by offering pre-installed and pre-configured software images. Use of Oracle VM Templates eliminates installation and configuration costs, and reduces ongoing maintenance costs, helping organizations achieve faster time to market and lower cost of operations. (Oracle VM Templates) Other things they have in common:
    1. Both have 35 Oracle images or templates: AWS AMI pre-built images; Oracle pre-built VM Templates.
    2. Both allow you to build your own images or templates: A. OVM Template Builder - Oracle VM Template Builder, an open source, graphical utility that makes it easy to use Oracle Enterprise Linux "Just enough OS" (JeOS)-based scripts for developing pre-packaged virtual machines for Oracle VM. B. AMI 'builder' - AMI builder.
    However, AWS has the added benefit of letting you add your own AMI to the AWS AMI catalog (AMI - Adding to the AWS AMI catalog). Another plus for AWS and AMIs is that there are hundreds of MySQL AMIs (AWS MySQL AMIs). A benefit of Oracle VM templates is that they can run in any public or private cloud environment, not just AWS EC2; however, Oracle VM templates first need to be imaged as AMIs before they can run in the AWS cloud.

    Read the article

  • Can I set my Optimus Nvidia card to run Unity3D with bumblebee?

    - by manuhalo
    I'd like to know whether I can run compiz on my Nvidia card to speed things up. It's a Dell XPS15 laptop but I'm mostly using it as a desktop, so battery life is not important. Apparently my Intel integrated card is able to run Unity 3D, but my Nvidia GT 420M is not. Here's the output of unity_support_test, both with optirun and without it:
    manuhalo@Ubuntu-XPS-L501X:~$ optirun /usr/lib/nux/unity_support_test -p
    OpenGL vendor string: NVIDIA Corporation
    OpenGL renderer string: GeForce GT 420M/PCI/SSE2
    OpenGL version string: 4.1.0 NVIDIA 280.13
    Not software rendered: yes
    Not blacklisted: yes
    GLX fbconfig: yes
    GLX texture from pixmap: no
    GL npot or rect textures: yes
    GL vertex program: yes
    GL fragment program: yes
    GL vertex buffer object: yes
    GL framebuffer object: yes
    GL version is 1.4+: yes
    Unity 3D supported: no
    manuhalo@Ubuntu-XPS-L501X:~$ /usr/lib/nux/unity_support_test -p
    OpenGL vendor string: Tungsten Graphics, Inc
    OpenGL renderer string: Mesa DRI Intel(R) Ironlake Mobile
    OpenGL version string: 2.1 Mesa 7.11
    Not software rendered: yes
    Not blacklisted: yes
    GLX fbconfig: yes
    GLX texture from pixmap: yes
    GL npot or rect textures: yes
    GL vertex program: yes
    GL fragment program: yes
    GL vertex buffer object: yes
    GL framebuffer object: yes
    GL version is 1.4+: yes
    Unity 3D supported: yes
    Any ideas why this is happening? Thanks in advance to anyone able to shed some light on this. What I have tried:
    Installed the v290 drivers from the x-stable PPA.
    Tried forcing Unity 3D to work by telling Unity to ignore the unity_support_test results, i.e. gksudo gedit /etc/environment and add UNITY_FORCE_START=1 to the end of the file.

    Read the article

  • Office design and layout for agile development

    - by Adam Eberbach
    (moved from Stack Overflow) I have found a lot of discussions here about which keyboard, desk, light or colored background is best, but I can't find one addressing the layout of the whole office. We are a company with about 20 employees moving to a new, larger place. There are two main development groups here that regularly work together: the back-end people often need to work with the mobile people to arrange web services. There are about twice as many back-end people as mobile people. About half of the back-end developers are working on-site at any time, and while they are almost never all in the office at once, at least 5-10 spaces need to be provided - so most of the time the two groups are about equal in size. We have the chance to arrange desks, partitions and possibly even walls to make the space good. There won't be cash for dot-com frills like catering or massages, but now's the time to plan so we avoid ending up with a bunch of desks in a long line. Joel on Software's Bionic Office is an article I've remembered from way back and it has some good ideas, but I* (and more importantly the company's owners) are not completely sold on the privacy idea in an environment where we are supposed to be collaborating. Another great link - The Ultimate Software Development Office Layout - reminded me of enclosed meeting rooms, which I hadn't even considered until reading it. Does the private office stand in the way of agile development? Is the scrum enough forced contact, so that if you need to bug someone you should have to get up and knock on their door? What design layouts can you point to, and why would you recommend them? *I'm not against closed offices at all, but would be happy if some other solution can do just as well. If it can't... well, that's what this question is all about.

    Read the article

  • ASP.NET Session Management

    - by geekrutherford
    Great article (a little old but still relevant) about the inner workings of session management in ASP.NET: Underpinnings of the Session State Management Implementation in ASP.NET. Using StateServer and the BinaryFormatter serialization it performs caused me quite a headache over the last few days. Curiously, the w3wp.exe process actually appears to consume more memory when utilizing StateServer and storing somewhat large and complex data types in session. Users began experiencing Out Of Memory exceptions in the production environment. The stack trace pointed to serialization using the BinaryFormatter. Using remote debugging against our QA server I noted that the application code itself ran without issue; the exception occurred outside the context of the application, after the request had completed, when the web server was trying to serialize session state into the StateServer. The short-term solution is switching back to the InProc method. Thus far this has proven to consume considerably less memory and has caused no issues. Long term, the complex object stored in session will be off-loaded into a web service used to access the information directly from the database, outside the context of the object used to encapsulate it.
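    For illustration only, here is a minimal C# sketch (not from the article above) of the difference that matters here: with the StateServer mode, anything placed in Session is binary-serialized at the end of every request, so the stored type must be marked [Serializable] and should stay small, whereas InProc keeps a live object reference in the worker process. The class and property names below are hypothetical.

    using System;
    using System.Web;

    // Under StateServer (or SQLServer) session mode this type is serialized with
    // the BinaryFormatter on every request; under InProc it is held as a live
    // reference inside the w3wp.exe process.
    [Serializable]
    public class ReportFilterState
    {
        public int CustomerId { get; set; }
        public DateTime From { get; set; }
        public DateTime To { get; set; }
    }

    public static class SessionStateHelper
    {
        private const string Key = "ReportFilterState";

        // Keep the session payload a small, serializable snapshot; heavy data is
        // better fetched on demand (e.g. from a service or the database).
        public static void Save(HttpSessionStateBase session, ReportFilterState state)
        {
            session[Key] = state;
        }

        public static ReportFilterState Load(HttpSessionStateBase session)
        {
            return (session[Key] as ReportFilterState) ?? new ReportFilterState();
        }
    }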

    Read the article

  • How To Capture Screenshot Of Logon Screen In Windows 7?

    - by Gopinath
    There are plenty of freeware and paid applications that let you capture screenshots, but none of them let you grab a screenshot of the logon screen. To capture the logon screen we previously had to either use a digital camera and take a photo, or run Windows in a virtual environment and capture the screenshot there. Is there any other simple and easy way to grab logon screenshots in Windows 7? Windows 7 Login Camera is a nice freeware tool that lets you capture screenshots of the logon screen very easily. To grab the screenshots, install the application, lock the screen by pressing the Windows key + L, and use the Ease of Access button located on the bottom left side. Windows 7 Login Camera launches and allows you to save the captured screen to the desired location. This handy tool was developed by deviantart.com user yvidhiatama and is compatible with all 32-bit versions of Windows 7. Download Windows 7 Login Camera. This article, titled "How To Capture Screenshot Of Logon Screen In Windows 7?", was originally published at Tech Dreams. Grab our RSS feed or fan us on Facebook to get updates from us.

    Read the article

  • SOA Starting Point: Methods for Service Identification and Definition

    As more and more companies start to incorporate a service-oriented architectural design approach into their existing enterprise systems, the need arises for a standardized integration technology. One common technology used by companies is an Enterprise Service Bus (ESB). An ESB, as defined by Progress Software, connects and mediates all communications and interactions between services. In essence, an ESB is a form of middleware that allows services to communicate with one another regardless of framework, environment, or location. With the emergence of the ESB, new emphasis is being placed on approaches that can be used to determine which web services should be built, and in what order. In May 2011, SOA Magazine published an article that identified ten common methods for identifying and defining services. SOA's ten common methods for service identification and definition:
    1. Business Process Decomposition
    2. Business Functions
    3. Business Entity Objects
    4. Ownership and Responsibility
    5. Goal-Driven
    6. Component-Based
    7. Existing Supply (Bottom-Up)
    8. Front-Office Application Usage Analysis
    9. Infrastructure
    10. Non-Functional Requirements
    Each of these methods has various pros and cons with regard to its use within the design process. I personally feel that multiple methodologies should be used during a design process in order to accurately define a design for a system or enterprise system. I like to create a custom cocktail derived from combining these methodologies in order to ensure that my design fits the project's and business's needs while still following development standards and guidelines. Of these ten methods, I am particularly fond of Business Process Decomposition, Business Functions, Goal-Driven, and Component-Based, and I routinely use them in my designs.
    Works Cited:
    Hubbers, J.-W., Ligthart, A., & Terlouw, L. (2007, December 10). Ten Ways to Identify Services. SOA Magazine. http://www.soamag.com/I13/1207-1.php
    Progress.com. (2011, October 30). ESB Architecture and Lifecycle Definition. http://web.progress.com/en/esb-architecture-lifecycle-definition.html

    Read the article

  • How best to keep bumbling, non-technical managers at bay and still deliver good work?

    - by Curious
    This question may be considered subjective (I got a warning) and be closed, but I will risk it, as I need some good advice/experience on this. I read the following at the 'About' page of Fog Creek Software, the company that Joel Spolsky founded and is CEO of: Back in the year 2000, the founders of Fog Creek, Joel Spolsky and Michael Pryor, were having trouble finding a place to work where programmers had decent working conditions and got an opportunity to do great work, without bumbling, non-technical managers getting in the way. Every high tech company claimed they wanted great programmers, but they wouldn’t put their money where their mouth was. It started with the physical environment (with dozens of cubicles jammed into a noisy, dark room, where the salespeople shouting on the phone make it impossible for developers to concentrate). But it went much deeper than that. Managers, terrified of change, treated any new idea as a bizarre virus to be quarantined. Napoleon-complex junior managers insisted that things be done exactly their way or you’re fired. Corporate Furniture Police writhed in agony when anyone taped up a movie poster in their cubicle. Disorganization was so rampant that even if the ideas were good, it would have been impossible to make a product out of them. Inexperienced managers practiced hit-and-run management, issuing stern orders on exactly how to do things without sticking around to see the farcical results of their fiats. And worst of all, the MBA-types in charge thought that coding was a support function, basically a fancy form of typing. A blunt truth about most of today's big software companies! Unfortunately not every developer is as gutsy (or lucky, may I say?) as Joel Spolsky! So my question is: How best to work with such managers, keep them at bay and still deliver great work?

    Read the article

  • Partner OBI 11g 5-Day Hands-on Training Workshop

    - by Mike.Hallett(at)Oracle-BI&EPM
    14 - 18 January 2013, Oracle Reading (UK). REGISTER HERE NOW. This 5-day hands-on workshop gives attendees practical experience with the OBI 11g environment. Participants will gain an in-depth understanding of the new OBIEE 11g architecture, security model, and installation/configuration, as well as reporting features such as the new ROLAP/MOLAP-style hierarchical browsing, new chart types, the Action Framework, and visualization. Please note that attendees are required to bring a laptop. This training is only for OPN member partners. View the laptop requirements and detailed agenda here.

    Read the article

  • Are the Ubuntu ISO images from releases.ubuntu.com updated?

    - by tijybba
    I just got the idea from this (possibly unrelated) question. Are the ISO images on the official site updated with updates to the core Ubuntu system - kernel updates, desktop environment (Unity) updates, and updates to the base system including X.org, the office suite, the package manager, the update manager and the GNOME base modules - that are released in update branches such as precise-updates? The reason I am asking is that if I download the ISO image of Ubuntu 12.04 two or three months after release, I then have to download roughly 200-300 MB of updates. So why are these ISO images not refreshed with recent updates? I am aware that not all components are updated at the same time, but let's say one month after the actual release (for both LTS and normal releases) the updated components could be rolled into a refreshed ISO at regular intervals, which would let new users install the latest versions and features with improved stability and less bandwidth consumption. I am not talking about a rolling release, updates from external PPAs, or a net install, but an ISO of updated packages; this could be provided as an optional download. Since my question stays within the boundary of official update releases, stability should not be the concern. I guess there are custom packagers out there, but having an official option would be better. It would help distribute a newer ISO that impresses new users, since it makes newer features available and, of course, a faster system. Another reason for asking is here. Edit: almost all new (desktop) users download the default ISO and hit one or two issues that may already have been corrected in subsequent updates, and most of the new laptop users I encountered gave up because of it. So should I suggest that users whose laptops are not on the certified hardware list try the daily builds if needed?

    Read the article

  • Worker roles in Windows Azure to host a multiplayer server

    - by MrWiggels
    I've been doing research on where to host a simple multiplayer backend for a game I'm developing. As a first choice I downloaded the Windows Azure SDK, which provides a nice and simple emulator environment where you can test your application before uploading. I also downloaded the Azure Social Game Toolkit and followed it as far as my understanding can take me. So, down to the main question: is there anybody with experience developing Azure applications? I'm developing an action RPG game, in a similar vein to Diablo III, and was thinking of putting up matchmaking, friends lists, etc. Is there a way to connect to Azure services via something like UDP or TCP for sending packets, or does everything have to go through HTTP requests? Is it even possible to use HTTP request/response for something like this? All game commands will be simple. Because the game server and the clients will be kept in sync and will have deterministic actions, I'm just going to send actions like "Use Primary Skill" and "Use Secondary Skill". Any hints, ideas, light bulbs or a smack-in-the-face presentation will be much appreciated.
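    For what it's worth, worker roles are not limited to HTTP: they can declare raw TCP (and UDP) input endpoints and listen on them directly. Below is a minimal C# sketch of that pattern, assuming a worker role whose service definition declares a TCP input endpoint; the endpoint name "GameEndpoint" and the class names are hypothetical, not part of any toolkit.

    using System.Net.Sockets;
    using System.Threading;
    using Microsoft.WindowsAzure.ServiceRuntime;

    // Sketch of a worker role that accepts raw TCP connections for game traffic,
    // so commands do not have to be tunneled over HTTP request/response.
    public class GameWorkerRole : RoleEntryPoint
    {
        public override void Run()
        {
            // "GameEndpoint" must be declared as a TCP input endpoint in the
            // service definition (.csdef); the name here is made up.
            var endpoint = RoleEnvironment.CurrentRoleInstance
                .InstanceEndpoints["GameEndpoint"].IPEndpoint;

            var listener = new TcpListener(endpoint);
            listener.Start();

            while (true)
            {
                TcpClient client = listener.AcceptTcpClient();
                // Hand each connection off so the accept loop keeps servicing
                // new players.
                ThreadPool.QueueUserWorkItem(_ => HandleClient(client));
            }
        }

        private static void HandleClient(TcpClient client)
        {
            using (client)
            {
                // Read "Use Primary Skill" / "Use Secondary Skill" style commands
                // from the network stream and apply them to the game state.
            }
        }
    }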

    Read the article

  • JavaOne Latin America Sessions

    - by Tori Wieldt
    The stars of Java are gathering in São Paulo next week. Here are just a few of the outstanding sessions you can attend at JavaOne Latin America:
    "Designing Java EE Applications in the Age of CDI" - Michel Graciano, Michael Santos
    "Don't Get Hacked! Tips and Tricks for Securing Your Java EE Web Application" - Fabiane Nardon, Fernando Babadopulos
    "Java and Security Programming" - Juan Carlos Herrera
    "Java Craftsmanship: Lessons Learned on How to Produce Truly Beautiful Java Code" - Edson Yanaga
    "Internet of Things with Real Things: Java + Things API + Raspberry PI + Toys!" - Vinicius Senger
    "OAuth 101: How to Protect Your Resources in a Web-Connected Environment" - Mauricio Leal
    "Approaching Pure REST in Java: HATEOAS and HTTP Tuning" - Eder Ignatowicz
    "Open Data in Politics: Using Java to Follow Your Candidate" - Bruno Gualda, Thiago Galbiatti Vespa
    "Java EE 7 Platform: More Productivity and Integrated HTML" - Arun Gupta
    Go to the JavaOne site for a complete list of sessions. JavaOne Latin America will be in São Paulo, 4-6 December 2012 at the Transamerica Expo Center. Register by 3 December and save R$ 300,00! For more information or to register, call (11) 2875-4163.

    Read the article

  • Scene graphs and spatial partitioning structures: What do you really need?

    - by tapirath
    I've been fiddling with 2D games for a while and I'm trying to move into 3D game development, so I thought I should get my basics right first. From what I read, scene graphs hold your game objects/entities and their relations to each other, so 'a tire' would be a child of 'a vehicle'. They are mainly used for frustum/occlusion culling and for minimizing the collision checks between objects. Spatial partitioning structures, on the other hand, are used to divide a big game object (like the map) into smaller parts so that you can gain performance by only drawing the relevant polygons, and again by minimizing collision checks to those polygons only. A spatial partitioning data structure can also be used as a node in a scene graph. But... I've been reading about both subjects and I've seen a lot of "scene graphs are useless" and "BSP performance gain is irrelevant with modern hardware" kinds of articles. Also, some of the game engines I've checked, like gameplay3d and jMonkeyEngine, use only a scene graph (which may be because they don't want to limit developers), whereas games like Quake and Half-Life use only spatial partitioning. I'm aware that the usage of these structures depends very much on the type of game you're developing, so for the sake of clarity let's assume the game is an FPS like Counter-Strike with some better outdoor environment capabilities (like a terrain). The obvious question is which one is needed and why (considering modern hardware capabilities). Thank you.
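    To make the scene-graph side of the distinction concrete, here is a minimal C# sketch (my own illustration, not from the question): the hierarchy expresses parent-child relationships such as 'tire' under 'vehicle', and a render traversal can reject a whole subtree with a single visibility test, which is the culling role described above. The Camera and bounding-volume types are hypothetical placeholders for whatever math/render library is in use.

    using System.Collections.Generic;

    // Hypothetical placeholders standing in for a real math/render library.
    public class Camera { }
    public interface IBoundingVolume
    {
        bool IntersectsFrustum(Camera camera);
    }

    public class SceneNode
    {
        // Bounds enclosing this node and its whole subtree.
        public IBoundingVolume Bounds;
        public readonly List<SceneNode> Children = new List<SceneNode>();

        // Leaf nodes (meshes, lights, ...) override this to issue draw calls.
        public virtual void Draw(Camera camera) { }

        // Depth-first traversal: if the node's bounds fail the frustum test,
        // the entire subtree (e.g. every part of a vehicle) is skipped at once.
        public void Render(Camera camera)
        {
            if (Bounds != null && !Bounds.IntersectsFrustum(camera))
                return;

            Draw(camera);
            foreach (var child in Children)
                child.Render(camera);
        }
    }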

    Read the article

  • Open World Day 2

    - by Antony Reynolds
    A Day in the Life of an Oracle OpenWorld Attendee, Part III. My second full day started with me waking up and realising that I was supposed to meet my friend Tejas Joshi (co-author of the Oracle Exalogic Elastic Cloud Handbook) at the station in 20 minutes! Needless to say I didn't make it, but I felt better later when I found out he had caught the wrong shuttle bus and ended up at the airport instead of the BART! The morning was spent in the Authors Seminar, arranged to give authors a whirlwind tour of Oracle product updates and strategy plans. It was useful to see what was happening in areas I knew little or nothing about. In the afternoon I wandered around JavaOne, a very different show to OpenWorld, with much more bleeding-edge material and plain blue-sky thinking. Of course, who couldn't love a show with a full-size Duke wandering around and available for photographs? I attended a presentation on a highly available WebLogic JMS environment which did a great job of laying out how to architect a highly available solution. Dinner with customers, and then I collapsed exhausted into bed!

    Read the article

  • How to go from Mainframe to the Cloud?

    - by Ruma Sanyal
    Running applications on IBM mainframes is expensive and complex, and hinders IT responsiveness. The high costs of frequent forced upgrades, long integration cycles, and complex operations infrastructures can only be alleviated by migrating away from a mainframe environment. Further, data centers are planning for cloud enablement based on principles of operating at significantly lower cost, very low upfront investment, running on commodity hardware and open, standards-based systems, and decoupling hardware, infrastructure software, and business applications. These operating principles are in direct contrast with the principles of operating a business on mainframes. By utilizing technologies such as Oracle Tuxedo, Oracle Coherence, and Oracle GoldenGate, businesses are able to quickly and safely migrate away from their IBM mainframe environments. Further, by running Oracle Tuxedo and Oracle Coherence on Oracle Exalogic, the first and only integrated cloud machine on the market, Oracle customers can not only run their applications on standards-based open systems, significantly cutting their time to market and costs, but also start the journey of cloud-enabling their mainframe applications.
    Oracle Tuxedo re-hosting tools and techniques can provide automated migration coverage for more than 95% of mainframe application assets, at a fraction of the cost.
    Oracle GoldenGate can migrate data from mainframe systems to open systems, eliminating risks associated with the data migration.
    Oracle Coherence hosts transactional data in memory, providing mainframe-like data performance and linear scalability.
    Running Oracle software on top of Oracle Exalogic empowers customers to start their journey of cloud-enabling their mainframe applications.
    Join us in a series of events across the globe where you'll learn how you can build your enterprise cloud and add tremendous value to your business. In addition, meet with Oracle experts and your peers to discuss best practices and see how successful organizations are lowering total cost of ownership and achieving rapid returns by moving to the cloud. Register for the Oracle Fusion Middleware Forum event in a city near you!

    Read the article

  • Oracle Tutor: Document Audit and Maintenance

    - by Emily Chorba
    Perhaps the most critical phase in the process of documenting policies and procedures - and the greatest challenge to owners - is the maintenance of published documents. Documents must reflect current practice and they must be accurate. The most effective way to ensure this is through regular audits of the documents. In the Tutor environment, a Document Owner must audit each of his or her documents once every 6 to 12 months to verify that the document reflects actual practice. If it does not, the document is updated or employees are retrained (depending on the nature of the discrepancy). If a document update is required, the Tutor system enables the owner to modify and redistribute the document within one work day. This is possible because:
    Documents contain a minimum of detail, thereby reducing the edits.
    Document format and structure are simple, so changes are easy to identify.
    The Tutor Author software tool enables the Document Owner or the Document Administrator to update the file quickly.
    The Document Administrator verifies the document format and integration, publishes the document, and distributes it to all affected employees, thereby freeing the Document Owner of the more tedious tasks.
    Learn More: For more information about Tutor, visit Oracle.com or the Tutor Blog. Post your questions at the Tutor Forum. Emily Chorba, Principal Product Manager, Oracle Tutor & UPK

    Read the article

  • Release/Change management - best aproach

    - by Bob Rivers
    I asked this question a year ago on Stack Overflow and never got a good answer. Since Programmers seems to be a better place to ask it, I'll give it a try... What is the best way to handle release management? More specifically, what would be the best way to release packages? For example, assume that you have a relatively stable system, a good quality assurance (QA) process, and so on. How do you prefer to release new versions? Let's assume that we are talking about a mid-to-large "centralized" web system (no clients), developed in-house, that can be considered vital to corporate operations. I tend to prefer releasing packages at regular intervals no greater than 1 to 3 months. During this period I include fixes and improvements in the package, and I deploy to the production environment only once. But I've seen some people who prefer to put small changes into production with greater frequency. Their claim is that doing so makes it easier to identify bugs that slipped through the QA process: between a package with 10 changes and another with only 1, it is much easier to know what caused the problem in the package with just one change... What is your opinion?

    Read the article

  • Wordpress .htaccess preventing subfolder access

    - by John K.
    This is sort of a goofy setup, but it's not in my power to reconfigure it at this time. I'm running in a shared hosting environment. The domain is example.com. This is an add-on domain on the host side, with example.com being redirected to the www/example.com sub-directory. That directory houses a standard Wordpress site which acts as the main site when you visit example.com. The .htaccess file within that directory is:
    # BEGIN WordPress
    <IfModule mod_rewrite.c>
    RewriteEngine On
    RewriteBase /
    RewriteRule ^index\.php$ - [L]
    RewriteCond %{REQUEST_FILENAME} !-f
    RewriteCond %{REQUEST_FILENAME} !-d
    RewriteRule . /index.php [L]
    </IfModule>
    # END WordPress
    <IfModule mod_rewrite.c>
    RewriteEngine On
    RewriteRule ^wp-admin/profile\.php$ /ssm/welcome [R]
    </IfModule>
    I have a subdirectory, /tracker, at the same root level as the /example.com subdirectory, which houses a CakePHP application. My problem is that when I attempt to browse to example.com/tracker, I get a 404 from Wordpress because permalinks are on. What I think I need is a rewrite rule in the Wordpress .htaccess file that short-circuits the existing rewrite rules and permits example.com/tracker to work independently of the Wordpress install, or a rewrite rule at the root level that short-circuits the redirect to the /example.com directory in the first place. Not sure how well I explained that, so here's a summary. The www/ directory structure:
    example.com/
    tracker/
    An add-on domain of www.example.com redirects to the /example.com directory containing Wordpress, and a tracker/ directory runs CakePHP, which I would like to access via www.example.com/tracker. If you need further info or clarification let me know!

    Read the article

  • How do you manage a complexity jump?

    - by glenatron
    It seems to be an infrequent but widely shared experience: sometimes you're working on a project and suddenly something turns up unexpectedly, throws a massive spanner in the works and ramps up the complexity a whole lot. For example, I was working on an application that talked to SOAP services on various other machines. I whipped up a prototype that worked fine, then went on to develop a regular front end and generally got everything up and running in a nice, fairly simple and easy-to-follow fashion. It worked great until we started testing across a wider network, when suddenly pages started timing out as the latency of the connections and the time required to perform calculations on remote machines resulted in timed-out requests to the SOAP services. It turned out that we needed to change the architecture to spin requests out onto their own threads and cache the returned data so it could be updated progressively in the background, rather than performing calculations on a request-by-request basis. The details of that scenario are not too important - indeed it's not a great example, as it was quite foreseeable, and people who have written a lot of apps of this type for this type of environment might have anticipated it - except that it illustrates how one can start with a simple premise and model and suddenly face an escalation of complexity well into the development of the project. What strategies do you have for dealing with these kinds of functional changes whose need arises - often as a result of environmental factors rather than specification changes - late in the development process or as a result of testing? How do you balance the premature optimisation / YAGNI / overengineering risks of designing a solution that mitigates possible but not necessarily probable issues against developing a simpler, easier solution that is likely to be as effective but doesn't incorporate preparedness for every possible eventuality?
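    To make the architectural change described in the example concrete, here is a minimal C# sketch of the pattern (my own illustration, not the poster's code): page requests are answered from a cache, and the slow SOAP calls run on background threads that progressively refresh that cache. The FetchFromSoapService method and all names are hypothetical stand-ins for the real service calls.

    using System.Collections.Concurrent;
    using System.Threading.Tasks;

    // Serve requests from a cache and refresh it in the background, instead of
    // calling the slow remote SOAP services once per page request.
    public class RemoteDataCache
    {
        private readonly ConcurrentDictionary<string, string> _cache =
            new ConcurrentDictionary<string, string>();

        // Hypothetical slow call to a remote SOAP service.
        private static string FetchFromSoapService(string key)
        {
            // ... remote call that may take many seconds over a high-latency link ...
            return "result for " + key;
        }

        // Called by the front end: returns immediately with whatever is cached,
        // never blocking the page on the remote machines.
        public string GetCurrent(string key)
        {
            string value;
            return _cache.TryGetValue(key, out value) ? value : "pending...";
        }

        // Called periodically or on demand: refreshes one entry on a worker
        // thread, so the displayed data is updated progressively.
        public Task RefreshAsync(string key)
        {
            return Task.Run(() => { _cache[key] = FetchFromSoapService(key); });
        }
    }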

    Read the article

  • How can I reduce the amount of time it takes to fully regression test an application ready for release?

    - by DrLazer
    An app I work on is being developed with a modified version of Scrum. If you are not familiar with Scrum, it's an alternative to the more traditional waterfall model, in which a set of features is worked on for a fixed amount of time known as a sprint. The app is written in C# and makes use of WPF, and we use Visual C# 2010 Express edition as an IDE. If we work on a sprint and add a few new features, but do not plan to release until a further sprint is complete, regression testing is not an issue as such; we just test the new features and give the app a good once-over. However, if a release that our customers can download is planned, a full regression test is factored in. In the past this wasn't a big deal - it took 3 or 4 days, and the devs simply fixed up any bugs found in the regression phase - but now, as the app gets larger and incorporates more and more features, the regression phase is stretching out to weeks. I am interested in any methods that people know of or use that can decrease this time. At the moment the only ideas I have are to either start writing unit tests, which I have never fully tried in a commercial environment, or to research the possibility of UI automation APIs or tools that would allow me to write a program to perform a series of batch tests. I know literally nothing about the possibilities of UI automation, so any information would be valuable. I don't know that much about unit testing either: how complicated can the tests be? Is it possible to get unit tests to use the UI? Are there any other methods I should consider? Thanks for reading, and for any advice in advance. Edit: Thanks for the information. Does anybody know of any alternatives to what has been mentioned so far (NUnit, RhinoMocks and CodedUI)?
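    As a rough illustration of how small such tests can start (my own sketch, not from the post), the NUnit-style C# below tests a hypothetical view-model class directly. In a WPF app, pulling logic out of the window code-behind into classes like this is the usual way to get unit-test coverage without driving the UI itself; the class and its discount rule are invented for the example.

    using NUnit.Framework;

    // Hypothetical production class: logic kept in a view model (rather than in
    // the WPF code-behind) can be exercised by plain unit tests.
    public class DiscountViewModel
    {
        public decimal Subtotal { get; set; }

        // Orders of 100 or more get a 10% discount (made-up business rule).
        public decimal Total
        {
            get { return Subtotal >= 100m ? Subtotal * 0.9m : Subtotal; }
        }
    }

    [TestFixture]
    public class DiscountViewModelTests
    {
        [Test]
        public void OrdersOfOneHundredOrMoreGetTenPercentOff()
        {
            var vm = new DiscountViewModel { Subtotal = 200m };
            Assert.AreEqual(180m, vm.Total);
        }

        [Test]
        public void SmallerOrdersAreNotDiscounted()
        {
            var vm = new DiscountViewModel { Subtotal = 50m };
            Assert.AreEqual(50m, vm.Total);
        }
    }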

    Read the article
