Search Results

Search found 17063 results on 683 pages for 'window restore'.


  • Technical workshop with the gurus: Architecting Oracle Database-As-A-Service (DBaaS)

    - by Javier Puerta
    OCTOBER 2013 Invitation: Architecting Oracle Database-As-A-Service (DBaaS)

    Dear partner, we are pleased to invite you to a 2-day workshop dedicated to EMEA partners on "Architecting Oracle Private Database Cloud & Delivering Database-As-A-Service (DBaaS)". This exclusive workshop will be delivered by Product Management and Product Development from Oracle HQ and focuses on the main theme CIOs have been tackling over the last decade: consolidation to the private cloud. For many customers the journey to consolidation has led to DBaaS cloud deployments that significantly reduce costs and offer agile IT services. With the recent launch of Oracle Database 12c, the game really has changed in terms of what Oracle offers and how database clouds can be deployed.

    Who should attend: enterprise architects, infrastructure architects, and DB architects from system integrators and large independent software vendors. Take this opportunity to learn from the gurus how you can help your customers maximize their cloud consolidation strategies. The workshop's main focus is service delivery, which includes standardization and consolidation, and how you would help your customers transform their current IT infrastructure to a service delivery model. It discusses best practices and reviews examples of customers that have successfully implemented a database cloud.

    The agenda is split across two days. Day 1: Overview & Planning; Database Cloud - Demos; Customer Case Studies; Database 12c. Day 2: Database Cloud - Design; Database Cloud - Implementation; EM Cloud Control; DBaaS on Engineered Systems; Questions and Answers.

    Attendance is free of charge for qualified Oracle partners - register now for one of the sessions below: 5 & 6 November 2013, United Kingdom, Manchester; 7 & 8 November 2013, Germany, Munich; 11 & 12 November 2013, Netherlands, Amsterdam; 14 & 15 November 2013, Turkey, Istanbul; 18 & 19 November 2013, Austria, Vienna.

    Looking forward to seeing you! Javier Puerta, Director, Core Technology Partner Programs EMEA; Prashant Barot, Director, Core Technology

    Read the article

  • 24 hours to pass until 24 Hours of PASS

    - by Rob Farley
    There’s a bunch of stuff going on at the moment in the SQL world, so if you’ve missed this particular piece of news, let me tell you a bit about it. Twice a year, the SQL community puts on its biggest virtual event – 24 Hours of PASS. And the next one is tomorrow – March 21st, 2012. Twenty-four sessions, back-to-back, featuring a selection of some of the best presenters in the SQL world, speakers from all over the world, coming together in an online collaboration that so far has well over thirty thousand registrations across the presentations. Some people are signed up for all 24 sessions, some only one. Traditionally, LiveMeeting has been used as the platform for this event, but this year we’re going with a new platform – IBTalk. It promises big, and we’re hoping it won’t let us down. LiveMeeting has been great, and we thank Microsoft for providing it as a platform for the past few years. However, as the event has grown, we’ve found that a new idea is necessary. Last year a search was done for a new platform, and IBTalk ticked the right boxes. The feedback from the presenters and moderators so far has been overwhelmingly positive, and we’re hoping that this is going to really enhance the user experience. One of my favourite features of the platform is the language side. It provides a pretty good translation service. Users who join a session will see a flag on the left of the screen. If they click it, they can change the language to one of 15 on offer. Picking this changes all the labels on everything. It even translates the text in the Q&A window. What this means is that someone from Brazil can ask their question in Portuguese, and the presenter will see it in English. Then if the answer is typed in English, the questioner will be able to see the answer, also in Portuguese. Or they can switch to English to see it as the answerer typed it. I know there’s always the risk of bad translations going on, but I’ve heard good things about this translation service. But there’s more – IBTalk are providing staff to type up closed captioning live during the event. So if English isn’t your first language, don’t worry! Picking your language will also let you see subtitles in your chosen language. I’m hoping that this event is the start of PASS being able to reach people from all corners of the world. Wouldn’t it be great to find that this event is successful, and that the next 24HOP (later in the year, our Summit Preview event) has just as many non-English speakers tuning in as English speakers? If you haven’t been planning which sessions you’re going to attend, you really should get over to sqlpass.org/24hours and have a look through what’s on offer. There’s some amazing material from some of the industry’s brightest, covering a wide range of topics, from classic SQL areas to the brand new SQL 2012 features. There really should be something for every SQL professional. Check the time zones though – if you’re in the US you might be on Summer time, and an hour closer to GMT than normal. Massive thanks must go to Microsoft, SQL Sentry and Idera for sponsoring this event. Without sponsors we wouldn’t be able to put any of this on. These companies are helping 24HOP continue to grow into an event for the whole world. See you tomorrow! @rob_farley | #24hop | #sqlpass

    Read the article

  • Craig Mundie's video

    - by GGBlogger
    Timothy recently posted “Microsoft Shows Off Radical New UI, Could Be Used In Windows 8” on Slashdot. I took such grave exception to his post that I felt it necessary to write this blog. We need to go back many years to the days of hand-cranked calculators and early mainframe computers. These devices had singular purposes – they were “number crunchers” used to make accounting easier. The front-facing display on early mainframes was “blinken lights.” The calculators did provide printing – in the form of paper tape – and the mainframes used line printers to generate reports as needed. We had other metaphors to work with. The typewriter was/is a mechanical device that substitutes for a typesetting machine. The originals go back to 1867 and the keyboard layout has remained much the same to this day. In the earlier years the Morse code telegraphs gave way to Teletype machines. The old ASR33, seen on the left in this photo of one of the first computers I helped manufacture, used a keyboard very similar to the keyboards in use today. It also produced the punched paper tape that we used to program this computer in machine language. Everything considered, this computer, which dates back to the late 1960s, has a keyboard for input and a roll of paper as output. So in a very rudimentary fashion little has changed. Oh – we didn’t have a mouse! The entire point of this exercise is to show that we still use very similar methods to get data into and out of a computer regardless of the operating system involved. The Altair, IMSAI, Apple, Commodore and onward to our modern machines changed the hardware that we interfaced to, but changed little in the way we input, view and output the results of our computing effort. The mouse made some changes, and the advent of windowed interfaces such as Windows and Apple made things somewhat easier for the user. My 4-year-old granddaughter plays her Dora games on our computer. She knows how to start programs, use the mouse and play the game, and is quite adept, so we have come some distance in making computers usable. One of my chief bitches is the constant harangues leveled at Microsoft. Yup – they are a money-making organization. You like Apple? No problem for me. I don’t use Apple mostly because I’m comfortable in the Windows environment, but probably more because I don’t like Apple’s “holier than thou” attitude. Some think they do superior things and that’s also fine with me. Obviously the iPhone has not done badly and other Apple products have fared well. But they are expensive. I just built a new machine with 4 terabytes of storage, an Intel Core i7 950 processor and 12 GB of RAM. It cost me – with dual monitors – less than 2000 dollars. Now to the chief reason for this blog. I’m going to continue developing software for as long as I’m able. For that reason I don’t see my keyboard, mouse and displays changing much for many years. I also don’t think Microsoft is going to spoil that for me by making radical changes to my developer experience. What Craig Mundie does in his video here: http://www.ispyce.com/2011/02/microsoft-shows-off-radical-new-ui.html is explore the potential future of computer interfaces for the masses of potential users. Using a computer today requires a person to have rudimentary capabilities with a keyboard and mouse. Wouldn’t it be great if all they needed was hand gestures? Although not mentioned, it would also be nice if computers responded intelligently to a user’s voice.
There is absolutely no argument with the fact that user interaction with these machines is going to change over time. My personal prediction is that it will take years for much of what Craig discusses to come to a cost-effective reality, but it is certainly coming. I just don’t believe that what Craig discusses will be the future look of Windows 8.

    Read the article

  • SQL SERVER – Identity Fields – Contest Win Joes 2 Pros Combo (USD 198) – Day 2 of 5

    - by pinaldave
    In August 2011 we ran a contest where we gave away one book every day for an entire month. The contest was extremely successful: lots of people participated and lots of books were given away. I have received lots of questions asking if we are doing something similar this month. Absolutely - instead of running a month-long contest, we are doing something more interesting. We are giving away a gift worth USD 198 every day this week: the Joes 2 Pros 5-volume (BOOK) SQL 2008 Development Certification Training Kit, one copy in India and one in the USA, for a total of two giveaways (worth USD 198). All the gifts are sponsored by Koenig Training Solution and Joes 2 Pros. The books are available here: Amazon | Flipkart | Indiaplaza.

    How to win: read the question, read the hints, then answer the quiz in the contact form in the following format: Question Answer, Name of the country (the contest is open to USA and India residents only). Two winners will be randomly selected and announced on August 20th.

    Question of the Day: Which of the following statements is incorrect? a) Identity value can be negative. b) Identity value can have a negative interval. c) Identity value can be of datatype VARCHAR. d) Identity value can have an increment interval larger than 1.

    Query Hints: BIG HINT POST. A simple way to determine if a table contains an identity field is to use the SSMS Object Explorer Design interface. Navigate to the table, then right-click it and choose Design from the pop-up window. When your design tab opens, select the first field in the table to view its list of properties in the lower pane of the tab (in this case the field is ProductID). Look to see whether the Identity Specification property in the lower pane is set to yes or no. SQL Server will allow you to utilize IDENTITY_INSERT with just one table at a time. After you've completed the needed work, it's very important to reset IDENTITY_INSERT back to OFF.

    Additional Hints: I have previously discussed various concepts from SQL Server Joes 2 Pros Volume 2, all in the SQL Joes 2 Pros Development Series: Output Clause in Simple Examples; Ranking Functions – Advanced NTILE in Detail; Ranking Functions – RANK( ), DENSE_RANK( ), and ROW_NUMBER( ); Advanced Aggregates with the Over Clause; Aggregates with the Over Clause; Overriding Identity Fields – Tricks and Tips of Identity Fields; Many to Many Relationships.

    Next Step: Answer the quiz in the contact form in the following format: Question Answer, Name of the country (the contest is open for USA and India). Bonus Winner: Leave a comment with your favorite article from the "additional hints" section and you may be eligible for a surprise gift. There is no country restriction for this bonus contest. Do mention why you liked any particular blog post, and I will announce the winner along with the main contest. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: Joes 2 Pros, PostADay, SQL, SQL Authority, SQL Puzzle, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology
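
    To make the IDENTITY_INSERT hint above concrete, here is a minimal T-SQL sketch; the table name and values are made up purely for illustration and are not part of the contest material.

        -- Hypothetical table with an identity field (seed 1, increment 1).
        CREATE TABLE dbo.Product
        (
            ProductID   INT IDENTITY(1,1) PRIMARY KEY,
            ProductName VARCHAR(50) NOT NULL
        );

        -- Normally SQL Server generates ProductID for you.
        INSERT INTO dbo.Product (ProductName) VALUES ('Widget');

        -- To insert an explicit identity value, enable IDENTITY_INSERT for this one table...
        SET IDENTITY_INSERT dbo.Product ON;
        INSERT INTO dbo.Product (ProductID, ProductName) VALUES (100, 'Gadget');

        -- ...and reset it back to OFF as soon as the work is done.
        SET IDENTITY_INSERT dbo.Product OFF;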

    Read the article

  • CodePlex Daily Summary for Sunday, February 13, 2011

    CodePlex Daily Summary for Sunday, February 13, 2011Popular ReleasesTV4Home - The all-in-one TV solution!: 0.1.0.0 Preview: This is the beta preview release of the TV4Home software.Finestra Virtual Desktops: 1.2: Fixes a few minor issues with 1.1 including the broken per-desktop backgrounds Further improves the speed of switching desktops A few UI performance improvements Added donations linksNuGet: NuGet 1.1: NuGet is a free, open source developer focused package management system for the .NET platform intent on simplifying the process of incorporating third party libraries into a .NET application during development. This release is a Visual Studio 2010 extension and contains the the Package Manager Console and the Add Package Dialog. The URL to the package OData feed is: http://go.microsoft.com/fwlink/?LinkID=206669 To see the list of issues fixed in this release, visit this our issues listEnhSim: EnhSim 2.4.0: 2.4.0This release supports WoW patch 4.06 at level 85 To use this release, you must have the Microsoft Visual C++ 2010 Redistributable Package installed. This can be downloaded from http://www.microsoft.com/downloads/en/details.aspx?FamilyID=A7B7A05E-6DE6-4D3A-A423-37BF0912DB84 To use the GUI you must have the .NET 4.0 Framework installed. This can be downloaded from http://www.microsoft.com/downloads/en/details.aspx?FamilyID=9cfb2d51-5ff4-4491-b0e5-b386f32c0992 Changes since 2.3.0 - Upd...Sterling Isolated Storage Database with LINQ for Silverlight and Windows Phone 7: Sterling OODB v1.0: Note: use this changeset to download the source example that has been extended to show database generation, backup, and restore in the desktop example. Welcome to the Sterling 1.0 RTM. This version is not backwards-compatible with previous versions of Sterling. Sterling is also available via NuGet. This product has been used and tested in many applications and contains a full suite of unit tests. You can refer to the User's Guide for complete documentation, and use the unit tests as guide...PDF Rider: PDF Rider 0.5.1: Changes from the previous version * Use dynamic layout to better fit text in other languages * Includes French and Spanish localizations Prerequisites * Microsoft Windows Operating Systems (XP - Vista - 7) * Microsoft .NET Framework 3.5 runtime * A PDF rendering software (i.e. Adobe Reader) that can be opened inside Internet Explorer. Installation instructionsChoose one of the following methods: 1. Download and run the "pdfRider0.5.1-setup.exe" (reccomended) 2. Down...Snoop, the WPF Spy Utility: Snoop 2.6.1: This release is a bug fixing release. Most importantly, issues have been seen around WPF 4.0 applications not always showing up in the app chooser. Hopefully, they are fixed now. I thought this issue warranted a minor release since more and more people are going WPF 4.0 and I don't want anyone to have any problems. Dan Hanan also contributes again with several usability features. Thanks Dan! Happy Snooping! p.s. By request, I am also attaching a .zip file ... so that people can install it ...RIBA - Rich Internet Business Application for Silverlight: Preview of MVVM Framework Source + Tutorials: This is a first public release of the MVVM Framework which is part of the final RIBA application. The complete RIBA example LOB application has yet to be published. Further Documentation on the MVVM part can be found on the Blog, http://www.SilverlightBlog.Net and in the downloadable source ( mvvm/doc/ ). 
Please post all issues and suggestions in the issue tracker.SharePoint Learning Kit: 1.5: SharePoint Learning Kit 1.5 has the following new functionality: *Support for SharePoint 2010 *E-Learning Actions can be localised *Two New Document Library Edit Options *Automatically add the Assignment List Web Part to the Web Part Gallery *Various Bug Fixes for the Drop Box There are 2 downloads for this release SLK-1.5-2010.zip for SharePoint 2010 SLK-1.5-2007.zip for SharePoint 2007 (WSS3 & MOSS 2007)Facebook C# SDK: 5.0.3 (BETA): This is fourth BETA release of the version 5 branch of the Facebook C# SDK. Remember this is a BETA build. Some things may change or not work exactly as planned. We are absolutely looking for feedback on this release to help us improve the final 5.X.X release. For more information about this release see the following blog posts: Facebook C# SDK - Writing your first Facebook Application Facebook C# SDK v5 Beta Internals Facebook C# SDK V5.0.0 (BETA) Released We have spend time trying ...NodeXL: Network Overview, Discovery and Exploration for Excel: NodeXL Excel Template, version 1.0.1.161: The NodeXL Excel template displays a network graph using edge and vertex lists stored in an Excel 2007 or Excel 2010 workbook. What's NewThis release adds a new Twitter List network importer, makes some minor feature improvements, and fixes a few bugs. See the Complete NodeXL Release History for details. Installation StepsFollow these steps to install and use the template: Download the Zip file. Unzip it into any folder. Use WinZip or a similar program, or just right-click the Zip file...WCF Data Services Toolkit: WCF Data Services Toolkit: The source code and binary releases of the WCF Data Services Toolkit. For simplicity, the source code download doesn't include any of the MSTest files. If you want those, you can pull the code down via MercurialyoutubeFisher: youtubeFisher 3.0 [beta]: What's new: Video capturing improved Supports YouTube's new layout (january 2011) Internal refactoringNearforums - ASP.NET MVC forum engine: Nearforums v5.0: Version 5.0 of the ASP.NET MVC Forum Engine, containing the following improvements: .NET 4.0 as target framework using ASP.NET MVC 3. All views migrated to Razor for cleaner markup. Alternate template (Layout file) for mobile devices 4 Bug Fixes since Version 4.1 Visit the project Roadmap for more details. Webdeploy package sha1 checksum: 28785b7248052465ea0738a7775e8e8744d84c27fuv: 1.0 release, codename Chopper Joe: features: search/replace :o to open file :s to save file :q to quitASP.NET MVC Project Awesome, jQuery Ajax helpers (controls): 1.7: A rich set of helpers (controls) that you can use to build highly responsive and interactive Ajax-enabled Web applications. These helpers include Autocomplete, AjaxDropdown, Lookup, Confirm Dialog, Popup Form, Popup and Pager html generation optimized new features for the lookup (add additional search data ) live demo went aeroAutoLoL: AutoLoL v1.5.5: AutoChat now allows up to 6 items. Items with nr. 7-0 will be removed! 
News page url's are now opened in the default browser Added a context menu to the system tray icon (thanks to Alex Banagos) AutoChat now allows configuring the Chat Keys and the Modifier Key The recent files list now supports compact and full mode Fix: Swapped mouse buttons are now properly detected Fix: Sometimes the Play button was pressed while still greyed out Champion: Karma Note: You can also run the u...mojoPortal: 2.3.6.2: see release notes on mojoportal.com http://www.mojoportal.com/mojoportal-2362-released.aspx Note that we have separate deployment packages for .NET 3.5 and .NET 4.0 The deployment package downloads on this page are pre-compiled and ready for production deployment, they contain no C# source code. To download the source code see the Source Code Tab I recommend getting the latest source code using TortoiseHG, you can get the source code corresponding to this release here.Rawr: Rawr 4.0.19 Beta: Rawr is now web-based. The link to use Rawr4 is: http://elitistjerks.com/rawr.phpThis is the Cataclysm Beta Release. More details can be found at the following link http://rawr.codeplex.com/Thread/View.aspx?ThreadId=237262 As of the 4.0.16 release, you can now also begin using the new Downloadable WPF version of Rawr!This is a pre-alpha release of the WPF version, there are likely to be a lot of issues. If you have a problem, please follow the Posting Guidelines and put it into the Issue Trac...IronRuby: 1.1.2: IronRuby 1.1.2 is a servicing release that keeps on improving compatibility with Ruby 1.9.2 and includes IronRuby integration to Visual Studio 2010. We decided to drop 1.8.6 compatibility mode in all post-1.0 releases. We recommend using IronRuby 1.0 if you need 1.8.6 compatibility. In this release we fixed several major issues: - problems that blocked Gem installation in certain cases - regex syntax: the parser was replaced with a new one that is much more compatible with Ruby 1.9.2 - cras...New ProjectsAbstract | .NET DDD abstraction for infra-structure (Data, Blobs, Queues): In the last few years we have seen many tools abstract access to infra-structures. They are all very different - what makes it difficult for you to move from Azure or to Azure. Abstract makes migration easier by standardising access to these infra-structures.Apex APRS: Apex APRS is a new APRS client application that is unlike any other. Key Features: Online and offline-cached map viewing from multiple popular sources Fast, simple, intuitive & powerful user interface Customizable Notification System: Customizable Notification SystemDaniel Singleton for C++: An elegant solution for C++ singletons using dependency declaration to control lifetime. One object created during any execution, lazy-init, thread safety... nice and compact.Deduplicator: Deduplicator helps to organize your file system. Create one folder organized by choice containing unique files. To be used for photo's, mp3's or any other binary format. Deduplicator is released yet, user interface is limited and some hardcoding is still in placeDigitypon (ASP.NET MVC 3): Digitypon will be a new web application specialized to be used by those who want to set an e-newspaper or an e-magazine. 
The main difference among other CMSs is that Digitypon’s workflow is a virtualized way of how employees of printed matters (newspaperes, magazinews) work.EdgeJournalImporter: Import journal files written on the Entourage (Pocket) Edge into Microsoft OneNote 2007+FlatFileSerializer: Serialize and deserialize flat file records from and to self defined classes using Attributes.Google Chart Helper: Controls to insert Google Charts to your web application. No Javascript code to do. We do it for you !How to display records from MySQL 5.1 database in asp.net using VB.net or CSharp: How to display records from MySQl 5.1+ database in asp.net with vb.net or C# code.HTTP Filer: HTTP Filer is a utility that allow users to share files and documents over http protocol. This utility was designed especially for Windows Phone users to send files from computer to their phone easily without send emails with attachments or upload files to an internet server.ibamonitoring: Source code for the avian point-count data collection web site www.ibamonitoring.org.JoPack Ultra Light Packaging for large teams: JoPack is an opensource ultra light package management software – that is targeted for simplifying development with large teams sharing volatile assemblies across several solutions. Latest project source code can be found on project home site: http://code.google.com/p/jo-pack/ L-System Turtle Based Fractal Tool (L-Fractal Tool): A tool to help you play with L-System turtle graphic based fractal curves( http://en.wikipedia.org/wiki/L-system) This tool helps you look into some of the well known curves & lets you define new patterns & production rules to build your own. Have a fun-fractal day !mailer: mailer is a application to mail. It's developed in Python.NJamb: A C# DSL for more rigorous tests: NJamb is a C# syntax for tests and DDD specifications. It makes them more readable, faster to write, and more rigorous. Its Linq-style expressions can assert preconditions and postconditions. IntelliSense makes the syntax almost foolproof. And, it's designed to be extended.NUpdater: NUpdater makes it easier for .NET Framework developers to add auto-updating capability to their software. Putting together numerous patching capabilities, this library is an all-around updater. Developed in C# with CLS compliance (this library is fully compatible with Mono).Perihelia - The .NET & Silverlight Socket Project: Perihelia is an open-source socket framework. The framework includes (or will include) all the necessities you need to satisfy your networking needs. Windows and WPF applications are currently supported, and Silverlight applications will be supported soon.PLogger: PLogger is a light, fast configuration-less file appender logger build using a parallel pipeline architecture. It is much easier and faster to set up and use then Log4Net or the enterprise libraryQuasar: Quasar is a professional .Net utility library which adds sugar on .net framework.Sectors Game Engine: Sectors is a XNA-based 2.5D (Doom-like) game engine with console and scripting support for Windows.SharePoint 2010 Server-Side-Scanner WebPart - embDocumentInhalator: embDocumentInhalator makes it possible for SharePoint 2010 users to scan documents from scanners attached directly to the server. For developers it may help to see the relationship between the individual components required. 
SIAJUR: Projeto Web para controle de documentossvcutil2: svcutil2 generates Wcf client proxies from Wsdl2 documents.TemporalMemoryNetwork: TemporalMemoryNetwork is a research project exploring how dynamical systems can store and represent patterns that occur through time.WebDAV#: This project aims to implement WebDAV support for .NET, both for client software as well as software hosting their own WebDAV server. The project will start with the server portion. The project will be developed in C# 3.5 for .NET 3.5 and 4.0.

    Read the article

  • Installation problem Ubuntu 12.04 Crashing hardware error

    - by user93640
    I am running on Ubuntu 8.04 for quite some time without many problems. About almost a year ago or so I have been trying to upgrade to 10.04 LTS, but without any success. Each time when trying to upgrade or even newly install the installation process crashed after about an hour or so (I forgot exactly how long). Now I wanted to try Ubuntu 12.04 (not even installing, but I only selected "Try Ubuntu without installing") and I got similar errors. I did not try to install it, because of earlier experience with 10.04 when after I also lost 8.04 and had to install from scratch again (after which it worked). I get the following screen (as I am not allowed to upload photos here the text): 26.767262] [Hardware Error]: CPU 0: Machine Check Exception: 0 Bank 5: b200001804000e0f 26.767279] [Hardware Error]: TSC 0 26.767287] [Hardware Error]: PROCESSOR 0:6f6 TIME 1349017924 SOCKET 0 APIC 0 microcode 44 26.767297] [Hardware Error]: Run the above through 'mcelog --ascii' 26.767307] [Hardware Error]: CPU 1: Machine Check Exception: 0 Bank 1: b200000000000175 26.767316] [Hardware Error]: TSC 0 26.767323] [Hardware Error]: PROCESSOR 0:6f6 TIME 1349017924 SOCKET 0 APIC 1 microcode 44 26.767331] [Hardware Error]: Run the above through 'mcelog --ascii' 26.767339] [Hardware Error]: CPU 1: Machine Check Exception: 0 Bank 5: b200003000000e0f 26.767348] [Hardware Error]: TSC 0 26.767354] [Hardware Error]: PROCESSOR 0:6f6 TIME 1349017924 SOCKET 0 APIC 1 microcode 44 26.767363] [Hardware Error]: Run the above through 'mcelog --ascii' 26.767371] [Hardware Error]: CPU 1: Machine Check Exception: 4 Bank 1: b200000000000175 26.767379] [Hardware Error]: TSC 1bf231e65f 26.767386] [Hardware Error]: PROCESSOR 0:6f6 TIME 1349017951 SOCKET 0 APIC 1 microcode 44 26.767395] [Hardware Error]: Run the above through 'mcelog --ascii' 26.767403] [Hardware Error]: CPU 1: Machine Check Exception: 4 Bank 5: b200003008000e0f 26.767413] [Hardware Error]: TSC 1bf231e65f 26.767421] [Hardware Error]: PROCESSOR 0:6f6 TIME 1349017951 SOCKET 0 APIC 1 microcode 44 26.767429] [Hardware Error]: Run the above through 'mcelog --ascii' 26.767437] [Hardware Error]: CPU 0: Machine Check Exception: 5 Bank 5: b200001806000e0f 26.767447] [Hardware Error]: RIP |INEXACT| 60:<00000000c1018b5c> {mwait_idle+0x7c/0x1d0} 26.767464] [Hardware Error]: TSC 1bf231e674 26.767471] [Hardware Error]: PROCESSOR 0:6f6 TIME 1349017951 SOCKET 0 APIC 0 microcode 44 26.767480] [Hardware Error]: Run the above through 'mcelog --ascii' 26.767487] [Hardware Error]: Machine check: Processor context corrupt 26.767495] Kernel panic - not syncing: Fatal Machine check 26.767505] Pid: 579, comm: debconf-communi Tainted: G M 3.2.0.29-generic-pae #46-Ubuntu 26.767515] Call Trace: 26.767525] [<c158f812>] ? printk+0x2d/0x2f 26.767534] [<c158f6e0>] panic+0x5c/0x161 26.767542] [<c10247ef>] mce_panic.part.14+0x13f/0x170 26.767551] [<c1024872>] mce_panic+0x52/0x90 26.767558] [<c1024a18>] mce_reign+0x168/0x170 26.767565] [<c1024bb5>] mce_end+0x105/0x110 26.767572] [<c10252db>] do_machine_check+0x32b/0x4f0 26.767581] [<c1024fb0>] ? mce_log+0x120/0x120 26.767590] [<c15a5e47>] error_code+0x67/0x6c 26.767602] panic occurred, switching back to text console 26.768498] Rebooting in 30 seconds.. For information, I have also tried earlier Arch Linux. I can install it, but when I try to install a window manager (LXDE) again I got similar errors. Fedora also crashes when installing and also Mandriva did not work for me. Therefore I think something deep in the machine might be wrong. 
But as stated above I can (clean) install 8.04 and also 9.10 can be installed without problems. Also updates for 8.04 can be installed. My machine is dual boot with XP next to it on a different partition. My HW: Memory : 2.0 GiB; Processor 0: Intel(R) Core(TM)2 CPU 6320 @ 1.86GHz; Processor 1: Intel(R) Core(TM)2 CPU 6320 @ 1.86GHz; How can I install Ubuntu 12.04? Last option would be to completely format my machine and install everything from scratch, but even I am not sure if that would solve it in the end. Can anybody help me out?
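
    As an aside, each kernel message above ends with "Run the above through 'mcelog --ascii'". Assuming the mcelog package is installed, one way to decode a captured copy of those messages is roughly as follows; the file name is a placeholder, and decoding the error does not by itself fix the installation problem.

        # Hypothetical example: capture the machine-check lines and decode them.
        dmesg | grep -A2 "Hardware Error" > mce.txt
        mcelog --ascii < mce.txt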

    Read the article

  • How to Watch NCAA March Madness Online

    - by DigitalGeekery
    You’ve filled out your brackets and now you are ready for one of America’s most popular sporting events. But what if you’re stuck at work or away from your TV? Or what if your local affiliate is showing a different game? Today we show how to catch all of March Madness online. March Madness on Demand: You’ll need a broadband connection, 512 MB RAM or higher, with cookies and Javascript enabled in your browser. March Madness on Demand offers two viewing options, a Standard Player and a High Quality player. The High Quality player is not, unfortunately, high definition. Standard Player requirements: Windows XP/Vista/7 or Mac OS X; IE 6+ (we also successfully tested it in Firefox, Chrome, and Opera); Adobe Flash Player 9 or higher. High Quality Player requirements: 2.4 GHz Pentium 4 or Intel-based Macintosh; Mac OS 10.4.8+ (Intel-based); Windows XP SP2, Vista, Server 2003, Server 2008, or Windows 7; Firefox 1.5+ or IE 6/7/8; Silverlight 3 browser plug-in. Watching March Madness on Demand: Go to the March Madness on Demand website (link below). Check the “Watch in High Quality” section to see if your browser is ready and compatible for the High Quality viewer. If not, you’ll see a message indicating either that your browser and system are incompatible, or that you need to install Silverlight. To install Silverlight, click on the “Get HQ” button and follow the prompts to download and install Silverlight. To launch the player, click the large red “Launch Player” button. At the top of the screen, you’ll see the current and upcoming games. Click “Watch Now” below a game to begin watching. At the bottom left is where you click to watch with the High Quality player. If too many people are watching the High Quality player, you’ll see a message and have to go back to the Standard Player. At the lower right are volume controls, a “Full Screen” button, and a “Share” button which allows you to share the game you are watching on various social networking sites like Facebook and Twitter. Perhaps most importantly for those who want to steal a bit of viewing time while at work is the “Boss Button” at the top right. Clicking on the “Boss Button” will open a fake Office document so it may appear at first glance like you’re actually doing legitimate work. To return to the game, click anywhere on the screen with your mouse. You’ll be able to catch every single game of the tournament, from the first round all the way through the championship, with March Madness on Demand. If your computer and Internet connection can handle it, you can even watch multiple games at the same time by opening March Madness on Demand in multiple browser windows. Watch March Madness online.

    Read the article

  • How do I implement a quaternion based camera?

    - by kudor gyozo
    I looked at several tutorials about this and when I thought I understood I tried to implement a quaternion based camera. The problem is it doesn't work correctly, after rotating for approx. 10 degrees it jumps back to -10 degrees. I have no idea what's wrong. I'm using openTK and it already has a quaternion class. I'm a noob at opengl, I'm doing this just for fun, and don't really understand quaternions, so probably I'm doing something stupid here. Here is some code: (Actually almost all the code except the methods that load and draw a vbo (it is taken from an OpenTK sample that demonstrates vbo-s)) I load a cube into a vbo and initialize the quaternion for the camera protected override void OnLoad(EventArgs e) { base.OnLoad(e); cameraPos = new Vector3(0, 0, 7); cameraRot = Quaternion.FromAxisAngle(new Vector3(0,0,-1), 0); GL.ClearColor(System.Drawing.Color.MidnightBlue); GL.Enable(EnableCap.DepthTest); vbo = LoadVBO(CubeVertices, CubeElements); } I load a perspective projection here. This is loaded at the beginning and every time I resize the window. protected override void OnResize(EventArgs e) { base.OnResize(e); GL.Viewport(0, 0, Width, Height); float aspect_ratio = Width / (float)Height; Matrix4 perpective = Matrix4.CreatePerspectiveFieldOfView(MathHelper.PiOver4, aspect_ratio, 1, 64); GL.MatrixMode(MatrixMode.Projection); GL.LoadMatrix(ref perpective); } Here I get the last rotation value and create a new quaternion that represents only the last rotation and multiply it with the camera quaternion. After this I transform this into axis-angle so that opengl can use it. (This is how I understood it from several online quaternion tutorials) protected override void OnRenderFrame(FrameEventArgs e) { base.OnRenderFrame(e); GL.Clear(ClearBufferMask.ColorBufferBit | ClearBufferMask.DepthBufferBit); double speed = 1; double rx = 0, ry = 0; if (Keyboard[Key.A]) { ry = -speed * e.Time; } if (Keyboard[Key.D]) { ry = +speed * e.Time; } if (Keyboard[Key.W]) { rx = +speed * e.Time; } if (Keyboard[Key.S]) { rx = -speed * e.Time; } Quaternion tmpQuat = Quaternion.FromAxisAngle(new Vector3(0,1,0), (float)ry); cameraRot = tmpQuat * cameraRot; cameraRot.Normalize(); GL.MatrixMode(MatrixMode.Modelview); GL.LoadIdentity(); Vector3 axis; float angle; cameraRot.ToAxisAngle(out axis, out angle); GL.Rotate(angle, axis); GL.Translate(-cameraPos); Draw(vbo); SwapBuffers(); } Here are 2 images to explain better: I rotate a while and from this: it jumps into this Any help is appreciated. Update1: I add these to a streamwriter that writes into a file: sw.WriteLine("camerarot: X:{0} Y:{1} Z:{2} W:{3} L:{4}", cameraRot.X, cameraRot.Y, cameraRot.Z, cameraRot.W, cameraRot.Length); sw.WriteLine("ry: {0}", ry); The log is available here: http://www.pasteall.org/26133/text. At line 770 the cube jumps from right to left, when camerarot.Y changes signs. I don't know if this is normal. Update2 Here is the complete project.
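
    One thing worth double-checking in code like the above (this is a hedged sketch of a possible adjustment, not a confirmed fix for the jump described in the question): OpenTK's Quaternion.FromAxisAngle and ToAxisAngle work in radians, while the legacy GL.Rotate call expects degrees, so the axis-angle pair should be converted before it is handed to OpenGL. Assuming OpenTK, the render path might look roughly like this:

        // Accumulate this frame's yaw (and pitch, if used) into the camera orientation; angles are in radians.
        Quaternion yaw = Quaternion.FromAxisAngle(Vector3.UnitY, (float)ry);
        Quaternion pitch = Quaternion.FromAxisAngle(Vector3.UnitX, (float)rx);
        cameraRot = yaw * pitch * cameraRot;
        cameraRot.Normalize();

        GL.MatrixMode(MatrixMode.Modelview);
        GL.LoadIdentity();

        Vector3 axis;
        float angleRadians;
        cameraRot.ToAxisAngle(out axis, out angleRadians);

        // GL.Rotate expects degrees; ToAxisAngle returns radians.
        GL.Rotate(MathHelper.RadiansToDegrees(angleRadians), axis);
        GL.Translate(-cameraPos);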

    Read the article

  • Preview Before You Paste with Live Preview in Office 2010

    - by DigitalGeekery
    Do you often find yourself frustrated that content you just copied and pasted didn’t turn out the way you expected? With the new Live Preview in Office 2010, you can preview how copied content will look when it’s pasted, even between Office applications. Not every paste preview option will be available in every circumstance. The available options will be based on the applications being used and what content is copied. Copy your content like normal by right-clicking and selecting Copy, pressing Ctrl + C, or selecting Copy from the Home tab. Next, select your location to paste the content. Now you can access the Paste Preview buttons either by selecting the Paste dropdown list from the Home tab, or by right-clicking. As you hover your cursor over each of the Paste Options buttons, you will see a preview of what it will look like if you paste using that option. Click the corresponding button when you find the paste option you like. The “Paste” option will paste all the content and formatting, as you can see below. Values will paste values only, no formatting. Formatting will paste only the formatting, no values. Hover over Paste Special to reveal any additional paste options. The process is similar in other Office applications. As you can see in the Word document below, Keep Text Only will paste the text, but not the orange color format from the original text. Even after you’ve pasted, there is still time to change your mind. After you paste content you’ll see a Paste Options button near your content. If you don’t, you can pull it up by pressing the Ctrl key. Note: This is also available after using Ctrl + V to paste. Click to enable the dropdown and select one of the available options. Using Live Paste Preview between multiple applications is just as easy. If we preview pasting the content from our Word document into PowerPoint using the Keep Source Formatting option, we’ll see that the outcome looks awful. Selecting Use Destination Theme will merge the text into the theme of the PowerPoint document and looks a lot better on our slide. Live Paste Preview is a nice addition to Office 2010 and is sure to save time spent undoing the unexpected consequences of pasting content. Looking for more Office 2010 tips? Check out some of our other Office 2010 posts like how to create a customized tab on the Office 2010 ribbon, and how to use the streamlined printing features in Office 2010.

    Read the article

  • SQL Server source control from Visual Studio

    - by David Atkinson
    Developers have long had to context-switch between two IDEs: Visual Studio for application code development and SQL Server Management Studio for database development. While this is accepted, especially given the richness of the database development feature set in SSMS, loading a separate tool can seem a little overkill. This is where SQL Connect comes in. This is an add-in to Visual Studio that provides a connected development experience for the SQL Server developer. Connected database development involves modifying a development sandbox database, as opposed to offline development, where SQL text files are modified independently of the database. One of the main complaints about Data Dude (VS DBPro) is that it enforces the offline approach. This gripe is what SQL Connect addresses. If you don't already use SQL Source Control, you can get up and running with SQL Connect by adding a new project to your Visual Studio solution as follows: Then choose your existing development database and you're ready to go. If you already use SQL Source Control, you will need to link SQL Connect to your existing database scripts folder repository, so SQL Connect and SQL Source Control can be used collaboratively (note that SQL Source Control v.3.0.9.18 or later is required). Locate the repository (this can be found in the Setup tab in SQL Source Control) and create a working folder for it (here I'm using TortoiseSVN). Back in Visual Studio, locate the SQL Connect panel (in the View menu if it hasn't auto loaded) and select Import SQL Source Control project. Locate your working folder and click Import. This creates a Red Gate database project under your solution: From here you can modify your development database, and manage your changes in source control. To associate your development database with the project, right click on the project node, select Properties, set the database and Save. Now you're ready to make some changes. Locate the object you'd like to modify in the Solution Explorer, and double click it to invoke a query window or table designer. You also have the option to edit the creation SQL directly using Edit SQL File in Project. Keeping the development database and Visual Studio project in sync is as easy as clicking on a button. Once you've made your change, you can use whichever mechanism you choose to commit to source control. Here I'm using the free open-source AnkhSVN to integrate Subversion with Visual Studio. Maintaining your database in a Visual Studio solution means that you can commit database changes and application code changes in the same changeset. This is desirable if you have continuous integration set up, as you want to ensure that all files related to a change are committed atomically, so you avoid an interim "broken build". More discussion on SQL Connect and its benefits can be found in the following article on Simple Talk: No More Disconnected SQL Development in Visual Studio. The SQL Connect project team is currently assessing the backlog for the next development effort, and they'd appreciate your feature suggestions, as well as your votes on their suggestions site: http://redgate.uservoice.com/forums/140800-sql-connect-for-visual-studio- A 28-day free trial of SQL Connect is available from the Red Gate website. Technorati Tags: SQL Server

    Read the article

  • Linq To SQL: Behaviour for table field which is NotNull and having Default value or binding

    - by kaushalparik27
    I found something interesting while wandering around the community which I would like to share. The post is wholly about this: the DBML does not consider a table field's "Default Value or Binding" setting when the field is NOT NULL. In other words, a field which cannot be null but has a default value set needs IsDbGenerated = true set explicitly in the DBML file. Consider this situation: there is a simple tblEmployee table with the structure below. The fields are simple. EmployeeID is a primary key with Identity Specification = True and Identity Seed = 1 to autogenerate a numeric value for this field. EmployeeName and EmployeeEmailAddress store the employee's name and email address in the other two fields. And the last one is "DateAdded", with a DateTime datatype, which doesn't allow NULL but has a Default Value/Binding of GetDate(). That means that if we don't pass any value to this field, SQL Server will insert the current date into the "DateAdded" field. So, I started with a new website, added a DBML file and dropped the said table onto it to generate the LINQ to SQL context class. Finally, I wrote a simple code snippet to insert data into the tblEmployee table; BUT I am not passing any value to the "DateAdded" field, because I am relying on SQL Server's "Default Value or Binding (GetDate())" setting for this field and expect SQL Server to insert the current date.

        using (TestDatabaseDataContext context = new TestDatabaseDataContext())
        {
            tblEmployee tblEmpObjet = new tblEmployee();
            tblEmpObjet.EmployeeName = "KaushaL";
            tblEmpObjet.EmployeeEmailAddress = "[email protected]";
            context.tblEmployees.InsertOnSubmit(tblEmpObjet);
            context.SubmitChanges();
        }

    Here comes the twist: the application gives me the error below. This is something I was not expecting! The error clearly shows that LINQ is passing a NULL value to the "DateAdded" field, while according to my understanding it should respect SQL Server's "Default Value or Binding" setting for this field. A bit of googling and I found something very interesting related to this problem. When we make a field a primary key with the "Identity Specification" property set to true, the DBML sets one important property, IsDbGenerated = true, for this field. BUT, when we set the "Default Value or Binding" property for some field, we need to explicitly tell DBML/LINQ that this field has a default binding on the database side that needs to be respected if we don't pass any value. So, the solution is: you need to explicitly set IsDbGenerated = true for such a field to tell LINQ that the field has a default value or binding on the SQL Server side, so it shouldn't worry if no value is passed for it. You can select the field and set this property from the Properties window in the DBML designer, or write the property into the DBML.Designer.cs file directly. I have attached a working example with the required table script to this post here. I hope this will be helpful for someone hunting for the same. Happy Discovery!
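
    For reference, here is a rough sketch of what the generated column mapping in DBML.Designer.cs might look like once the property is set; the storage field name and the other attribute values are illustrative, and the designer-generated code will differ in detail.

        // Illustrative LINQ to SQL column mapping for the DateAdded field.
        // IsDbGenerated = true tells LINQ to SQL to let the database supply the value
        // (via the DEFAULT GETDATE() binding) instead of sending NULL on insert.
        [global::System.Data.Linq.Mapping.Column(Storage = "_DateAdded",
            DbType = "DateTime NOT NULL",
            IsDbGenerated = true,
            UpdateCheck = UpdateCheck.Never)]
        public System.DateTime DateAdded
        {
            get { return this._DateAdded; }
            set { /* designer-generated change-tracking code omitted */ }
        }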

    Read the article

  • Make Text and Images Easier to Read with the Windows 7 Magnifier

    - by DigitalGeekery
    Do you have impaired vision or find it difficult to read small print on your computer screen? Today, we’ll take a closer look at how to magnify that hard-to-read content with the Magnifier in Windows 7. Magnifier was available in previous versions of Windows, but the Windows 7 version comes with some notable improvements. There are now three screen modes in Magnifier. Full Screen and Lens mode, however, require Windows Aero to be enabled. If your computer doesn’t support Aero, or if you’re not using an Aero theme, Magnifier will only work in Docked mode. Using Magnifier in Windows 7: You can find the Magnifier by going to Start > All Programs > Accessories > Ease of Access > Magnifier. Alternately, you can type magnifier into the Search box in the Start Menu and hit Enter. On the Magnifier toolbar, choose your View mode by clicking Views and choosing from the available options. Clicking the plus (+) and minus (-) buttons will zoom in or zoom out. You can change the zoom in/out percentage by adjusting the slider bar. You can also enable color inversion and select tracking options. Click OK when finished to save your settings. After a brief period, the Magnifier toolbar will switch to a magnifying glass icon. Simply click the magnifying glass to display the Magnifier toolbar again. Docked Mode: In Docked mode, a portion of the screen is magnified and docked at the top of the screen. The rest of your desktop will remain in its normal state. You can then control which area of the screen is magnified by moving your mouse. Full Screen Mode: This magnifies your entire screen and follows your mouse as you move it around. If you lose track of where you are on the screen, use the Ctrl + Alt + Spacebar shortcut to preview where your mouse pointer is on the screen. Lens Mode: The Lens screen mode is similar to holding a magnifying glass up to your screen. Lens mode magnifies the area around the mouse, and the magnified area moves around the screen with your mouse. Shortcut Keys: Windows key + (+) to zoom in; Windows key + (-) to zoom out; Windows key + ESC to exit; Ctrl + Alt + F – Full screen mode; Ctrl + Alt + L – Lens mode; Ctrl + Alt + D – Dock mode; Ctrl + Alt + R – Resize the lens; Ctrl + Alt + Spacebar – Preview full screen. Conclusion: Windows Magnifier is a nice little tool if you have impaired vision or just need to make items on the screen easier to read.

    Read the article

  • Oracle Solaris: Zones on Shared Storage

    - by Jeff Victor
    Oracle Solaris 11.1 has several new features. At oracle.com you can find a detailed list. One of the significant new features, and the most significant new feature releated to Oracle Solaris Zones, is casually called "Zones on Shared Storage" or simply ZOSS (rhymes with "moss"). ZOSS offers much more flexibility because you can store Solaris Zones on shared storage (surprise!) so that you can perform quick and easy migration of a zone from one system to another. This blog entry describes and demonstrates the use of ZOSS. ZOSS provides complete support for a Solaris Zone that is stored on "shared storage." In this case, "shared storage" refers to fiber channel (FC) or iSCSI devices, although there is one lone exception that I will demonstrate soon. The primary intent is to enable you to store a zone on FC or iSCSI storage so that it can be migrated from one host computer to another much more easily and safely than in the past. With this blog entry, I wanted to make it easy for you to try this yourself. I couldn't assume that you have a SAN available - which is a good thing, because neither do I! What could I use, instead? [There he goes, foreshadowing again... -Ed.] Developing this entry reinforced the lesson that the solution to every lab problem is VirtualBox. Oracle VM VirtualBox (its formal name) helps here in a couple of important ways. It offers the ability to easily install multiple copies of Solaris as guests on top of any popular system (Microsoft Windows, MacOS, Solaris, Oracle Linux (and other Linuxes) etc.). It also offers the ability to create a separate virtual disk drive (VDI) that appears as a local hard disk to a guest. This virtual disk can be moved very easily from one guest to another. In other words, you can follow the steps below on a laptop or larger x86 system. Please note that the ability to use ZOSS to store a zone on a local disk is very useful for a lab environment, but not so useful for production. I do not suggest regularly moving disk drives among computers. In the method I describe below, that virtual hard disk will contain the zone that will be migrated among the (virtual) hosts. In production, you would use FC or iSCSI LUNs instead. The zonecfg(1M) man page details the syntax for each of the three types of devices. Why Migrate? Why is the migration of virtual servers important? Some of the most common reasons are: Moving a workload to a different computer so that the original computer can be turned off for extensive maintenance. Moving a workload to a larger system because the workload has outgrown its original system. If the workload runs in an environment (such as a Solaris Zone) that is stored on shared storage, you can restore the service of the workload on an alternate computer if the original computer has failed and will not reboot. You can simplify lifecycle management of a workload by developing it on a laptop, migrating it to a test platform when it's ready, and finally moving it to a production system. Concepts For ZOSS, the important new concept is named "rootzpool". You can read about it in the zonecfg(1M) man page, but here's the short version: it's the backing store (hard disk(s), or LUN(s)) that will be used to make a ZFS zpool - the zpool that will hold the zone. This zpool: contains the zone's Solaris content, i.e. 
the root file system does not contain any content not related to the zone can only be mounted by one Solaris instance at a time Method Overview Here is a brief list of the steps to create a zone on shared storage and migrate it. The next section shows the commands and output. You will need a host system with an x86 CPU (hopefully at least a couple of CPU cores), at least 2GB of RAM, and at least 25GB of free disk space. (The steps below will not actually use 25GB of disk space, but I don't want to lead you down a path that ends in a big sign that says "Your HDD is full. Good luck!") Configure the zone on both systems, specifying the rootzpool that both will use. The best way is to configure it on one system and then copy the output of "zonecfg export" to the other system to be used as input to zonecfg. This method reduces the chances of pilot error. (It is not necessary to configure the zone on both systems before creating it. You can configure this zone in multiple places, whenever you want, and migrate it to one of those places at any time - as long as those systems all have access to the shared storage.) Install the zone on one system, onto shared storage. Boot the zone. Provide system configuration information to the zone. (In the Real World(tm) you will usually automate this step.) Shutdown the zone. Detach the zone from the original system. Attach the zone to its new "home" system. Boot the zone. The zone can be used normally, and even migrated back, or to a different system. Details The rest of this entry shows the commands and output. The two hostnames are "sysA" and "sysB". Note that each Solaris guest might use a different device name for the VDI that they share. I used the device names shown below, but you must discover the device name(s) after booting each guest. In a production environment you would also discover the device name first and then configure the zone with that name. Fortunately, you can use the command "zpool import" or "format" to discover the device on the "new" host for the zone. The first steps create the VirtualBox guests and the shared disk drive. I describe the steps here without demonstrating them. Download VirtualBox and install it using a method normal for your host OS. You can read the complete instructions. Create two VirtualBox guests, each to run Solaris 11.1. Each will use its own VDI as its root disk. Install Solaris 11.1 in each guest. To install a Solaris 11.1 guest, you can either download a pre-built VirtualBox guest and import it, or install Solaris 11.1 from the "text install" media. If you use the latter method, after booting you will not see a windowing system. To install the GUI and other important things, login and run "pkg install solaris-desktop" and take a break while it installs those important things. Life is usually easier if you install the VirtualBox Guest Additions because then you can copy and paste between the host and guests, etc. You can find the guest additions in the folder matching the version of VirtualBox you are using. You can also read the instructions for installing the guest additions. To create the zone's shared VDI in VirtualBox, you can open the storage configuration for one of the two guests, select the SATA controller, and click on the "Add Hard Disk" icon nearby. Choose "Create New Disk" and specify an appropriate path name for the file that will contain the VDI. The shared VDI must be at least 1.5 GB. Note that the guest must be stopped to do this.
Add that VDI to the other guest - using its Storage configuration - so that each can access it while running. The steps start out the same, except that you choose "Choose Existing Disk" instead of "Create New Disk." Because the disk is configured on both of them, VirtualBox prevents you from running both guests at the same time. Identify device names of that VDI, in each of the guests. Solaris chooses the name based on existing devices. The names may be the same, or may be different from each other. This step is shown below as "Step 1." Assumptions In the example shown below, I make these assumptions. The guest that will own the zone at the beginning is named sysA. The guest that will own the zone after the first migration is named sysB. On sysA, the shared disk is named /dev/dsk/c7t2d0 On sysB, the shared disk is named /dev/dsk/c7t3d0 (Finally!) The Steps Step 1) Determine the name of the disk that will move back and forth between the systems. root@sysA:~# format Searching for disks...done AVAILABLE DISK SELECTIONS: 0. c7t0d0 /pci@0,0/pci8086,2829@d/disk@0,0 1. c7t2d0 /pci@0,0/pci8086,2829@d/disk@2,0 Specify disk (enter its number): ^D Step 2) The first thing to do is partition and label the disk. The magic needed to write an EFI label is not overly complicated. root@sysA:~# format -e c7t2d0 selecting c7t2d0 [disk formatted] FORMAT MENU: ... format fdisk No fdisk table exists. The default partition for the disk is: a 100% "SOLARIS System" partition Type "y" to accept the default partition, otherwise type "n" to edit the partition table. n SELECT ONE OF THE FOLLOWING: ... Enter Selection: 1 ... G=EFI_SYS 0=Exit? f SELECT ONE... ... 6 format label ... Specify Label type[1]: 1 Ready to label disk, continue? y format quit root@sysA:~# ls /dev/dsk/c7t2d0 /dev/dsk/c7t2d0 Step 3) Configure zone1 on sysA. root@sysA:~# zonecfg -z zone1 Use 'create' to begin configuring a new zone. zonecfg:zone1 create create: Using system default template 'SYSdefault' zonecfg:zone1 set zonename=zone1 zonecfg:zone1 set zonepath=/zones/zone1 zonecfg:zone1 add rootzpool zonecfg:zone1:rootzpool add storage dev:dsk/c7t2d0 zonecfg:zone1:rootzpool end zonecfg:zone1 exit root@sysA:~# zonecfg -z zone1 info zonename: zone1 zonepath: /zones/zone1 brand: solaris autoboot: false bootargs: file-mac-profile: pool: limitpriv: scheduling-class: ip-type: exclusive hostid: fs-allowed: anet: ... rootzpool: storage: dev:dsk/c7t2d0 Step 4) Install the zone. This step takes the most time, but you can wander off for a snack or a few laps around the gym - or both! (Just not at the same time...) root@sysA:~# zoneadm -z zone1 install Created zone zpool: zone1_rpool Progress being logged to /var/log/zones/zoneadm.20121022T163634Z.zone1.install Image: Preparing at /zones/zone1/root. AI Manifest: /tmp/manifest.xml.RXaycg SC Profile: /usr/share/auto_install/sc_profiles/enable_sci.xml Zonename: zone1 Installation: Starting ... Creating IPS image Startup linked: 1/1 done Installing packages from: solaris origin: http://pkg.us.oracle.com/support/ DOWNLOAD PKGS FILES XFER (MB) SPEED Completed 183/183 33556/33556 222.2/222.2 2.8M/s PHASE ITEMS Installing new actions 46825/46825 Updating package state database Done Updating image state Done Creating fast lookup database Done Installation: Succeeded Note: Man pages can be obtained by installing pkg:/system/manual done. Done: Installation completed in 1696.847 seconds. Next Steps: Boot the zone, then log into the zone console (zlogin -C) to complete the configuration process.
Log saved in non-global zone as /zones/zone1/root/var/log/zones/zoneadm.20121022T163634Z.zone1.install Step 5) Boot the Zone. root@sysA:~# zoneadm -z zone1 boot Step 6) Login to zone's console to complete the specification of system information. root@sysA:~# zlogin -C zone1 Answer the usual questions and wait for a login prompt. Then you can end the console session with the usual "~." incantation. Step 7) Shutdown the zone so it can be "moved." root@sysA:~# zoneadm -z zone1 shutdown Step 8) Detach the zone so that the original global zone can't use it. root@sysA:~# zoneadm list -cv ID NAME STATUS PATH BRAND IP 0 global running / solaris shared - zone1 installed /zones/zone1 solaris excl root@sysA:~# zpool list NAME SIZE ALLOC FREE CAP DEDUP HEALTH ALTROOT rpool 17.6G 11.2G 6.47G 63% 1.00x ONLINE - zone1_rpool 1.98G 484M 1.51G 23% 1.00x ONLINE - root@sysA:~# zoneadm -z zone1 detach Exported zone zpool: zone1_rpool Step 9) Review the result and shutdown sysA so that sysB can use the shared disk. root@sysA:~# zpool list NAME SIZE ALLOC FREE CAP DEDUP HEALTH ALTROOT rpool 17.6G 11.2G 6.47G 63% 1.00x ONLINE - root@sysA:~# zoneadm list -cv ID NAME STATUS PATH BRAND IP 0 global running / solaris shared - zone1 configured /zones/zone1 solaris excl root@sysA:~# init 0 Step 10) Now boot sysB and configure a zone with the parameters shown above in Step 1. (Again, the safest method is to use "zonecfg ... export" on sysA as described in section "Method Overview" above.) The one difference is the name of the rootzpool storage device, which was shown in the list of assumptions, and which you must determine by booting sysB and using the "format" or "zpool import" command. When that is done, you should see the output shown next. (I used the same zonename - "zone1" - in this example, but you can choose any valid zonename you want.) root@sysB:~# zoneadm list -cv ID NAME STATUS PATH BRAND IP 0 global running / solaris shared - zone1 configured /zones/zone1 solaris excl root@sysB:~# zonecfg -z zone1 info zonename: zone1 zonepath: /zones/zone1 brand: solaris autoboot: false bootargs: file-mac-profile: pool: limitpriv: scheduling-class: ip-type: exclusive hostid: fs-allowed: anet: linkname: net0 ... rootzpool: storage: dev:dsk/c7t3d0 Step 11) Attaching the zone automatically imports the zpool. root@sysB:~# zoneadm -z zone1 attach Imported zone zpool: zone1_rpool Progress being logged to /var/log/zones/zoneadm.20121022T184034Z.zone1.attach Installing: Using existing zone boot environment Zone BE root dataset: zone1_rpool/rpool/ROOT/solaris Cache: Using /var/pkg/publisher. Updating non-global zone: Linking to image /. Processing linked: 1/1 done Updating non-global zone: Auditing packages. No updates necessary for this image. Updating non-global zone: Zone updated. Result: Attach Succeeded. Log saved in non-global zone as /zones/zone1/root/var/log/zones/zoneadm.20121022T184034Z.zone1.attach root@sysB:~# zoneadm -z zone1 boot root@sysB:~# zlogin zone1 [Connected to zone 'zone1' pts/2] Oracle Corporation SunOS 5.11 11.1 September 2012 Step 12) Now let's migrate the zone back to sysA. Create a file in zone1 so we can verify it exists after we migrate the zone back, then begin migrating it back. 
root@zone1:~# ls /opt root@zone1:~# touch /opt/fileA root@zone1:~# ls -l /opt/fileA -rw-r--r-- 1 root root 0 Oct 22 14:47 /opt/fileA root@zone1:~# exit logout [Connection to zone 'zone1' pts/2 closed] root@sysB:~# zoneadm -z zone1 shutdown root@sysB:~# zoneadm -z zone1 detach Exported zone zpool: zone1_rpool root@sysB:~# init 0 Step 13) Back on sysA, check the status. Oracle Corporation SunOS 5.11 11.1 September 2012 root@sysA:~# zoneadm list -cv ID NAME STATUS PATH BRAND IP 0 global running / solaris shared - zone1 configured /zones/zone1 solaris excl root@sysA:~# zpool list NAME SIZE ALLOC FREE CAP DEDUP HEALTH ALTROOT rpool 17.6G 11.2G 6.47G 63% 1.00x ONLINE - Step 14) Re-attach the zone to sysA. root@sysA:~# zoneadm -z zone1 attach Imported zone zpool: zone1_rpool Progress being logged to /var/log/zones/zoneadm.20121022T190441Z.zone1.attach Installing: Using existing zone boot environment Zone BE root dataset: zone1_rpool/rpool/ROOT/solaris Cache: Using /var/pkg/publisher. Updating non-global zone: Linking to image /. Processing linked: 1/1 done Updating non-global zone: Auditing packages. No updates necessary for this image. Updating non-global zone: Zone updated. Result: Attach Succeeded. Log saved in non-global zone as /zones/zone1/root/var/log/zones/zoneadm.20121022T190441Z.zone1.attach root@sysA:~# zpool list NAME SIZE ALLOC FREE CAP DEDUP HEALTH ALTROOT rpool 17.6G 11.2G 6.47G 63% 1.00x ONLINE - zone1_rpool 1.98G 491M 1.51G 24% 1.00x ONLINE - root@sysA:~# zoneadm -z zone1 boot root@sysA:~# zlogin zone1 [Connected to zone 'zone1' pts/2] Oracle Corporation SunOS 5.11 11.1 September 2012 root@zone1:~# zpool list NAME SIZE ALLOC FREE CAP DEDUP HEALTH ALTROOT rpool 1.98G 538M 1.46G 26% 1.00x ONLINE - Step 15) Check for the file created on sysB earlier. root@zone1:~# ls -l /opt total 1 -rw-r--r-- 1 root root 0 Oct 22 14:47 fileA Next Steps Here is a brief list of some of the fun things you can try next. Add space to the zone by adding a second storage device to the rootzpool. Make sure that you add it to the zone's configuration on both systems! Create a new zone, specifying two disks in the rootzpool when you first configure the zone. When you install that zone, or clone it from another zone, zoneadm uses those two disks to create a mirrored pool. (Three disks will result in a three-way mirror, etc.) Conclusion Hopefully you have seen the ease with which you can now move Solaris Zones from one system to another.

    Read the article

  • BizTalk 2009 - BizTalk Benchmark Wizard: Running a Test

    - by StuartBrierley
    The BizTalk Benchmark Wizard is a utility that can be used to gain some validation of a BizTalk installation, giving a level of guidance on whether it is performing as might be expected.  It should be used after BizTalk Server has been installed and before any solutions are deployed to the environment.  This will ensure that you are getting consistent and clean results from the BizTalk Benchmark Wizard. The BizTalk Benchmark Wizard applies load to the BizTalk Server environment under a choice of specific scenarios. During these scenarios performance counter information is collected and assessed against statistics that are appropriate to the BizTalk Server environment. For details on installing the Benchmark Wizard see my previous post. The BizTalk Benchmark Wizard provides two simple test scenarios, one for messaging and one for Orchestrations, which can be used to test your BizTalk implementation. Messaging Loadgen generates a new XML message and sends it over NetTCP A WCF-NetTCP Receive Location receives the XML document from Loadgen. The PassThruReceive pipeline performs no processing and the message is published by the EPM to the MessageBox. The WCF One-Way Send Port, which is the only subscriber to the message, retrieves the message from the MessageBox The PassThruTransmit pipeline provides no additional processing The message is delivered to the back end WCF service by the WCF NetTCP adapter Orchestrations Loadgen generates a new XML message and sends it over NetTCP A WCF-NetTCP Receive Location receives the XML document from Loadgen. The XMLReceive pipeline performs no processing and the message is published by the EPM to the MessageBox. The message is delivered to a simple Orchestration which consists of a receive location and a send port The WCF One-Way Send Port, which is the only subscriber to the Orchestration message, retrieves the message from the MessageBox The PassThruTransmit pipeline provides no additional processing The message is delivered to the back end WCF service by the WCF NetTCP adapter Below is a quick outline of how to run the BizTalk Benchmark Wizard on a single server, although it should be noted that this is not ideal as this server is then both generating and processing the load.  In order to separate out this load you should run the "Indigo" service on a separate server. To start the BizTalk Benchmark Wizard click Start > All Programs > BizTalk Benchmark Wizard > BizTalk Benchmark Wizard. On this screen click Next; you will then get the following pop-up window. Check the server and database names and check the "check prerequisites" check-box before pressing OK.  The wizard will then check that the appropriate test scenarios are installed. You should then choose the test scenario that you wish to run (messaging or orchestration) and the architecture that most closely matches your environment. You will then be asked to confirm the host server for each of the host instances. Next you will be presented with the prepare screen.  You will need to start the Indigo service before pressing the Test Indigo Service button. If you are running the Indigo service on a separate server you can enter the server name here.  To start the Indigo service click Start > All Programs > BizTalk Benchmark Wizard > Start Indigo Service.   While the test is running you will be presented with two speed dial type displays - one for the received messages per second and one for the processed messages per second.
The green dial shows the current rate and the red dial shows the overall average rate.  Optionally you can view the CPU usage of the various servers involved in processing the tests. For my development environment I expected low results and this is what I got.  Although looking at the online high scores table and comparing to the quad core system listed, the results are perhaps not really that bad. At some time I may look at what improvements I can make to this score, but if you are interested in that now take a look at Benchmark your BizTalk Server (Part 3).

    Read the article

  • Cant install software

    - by user53209
    So I just installed the ubuntu 11.10 .. And when i goto software center and try to download any software(use source).. all i get is a window saying that "Failed to download repository information" , "check your internet connection" and W:Failed to fetch bzip2:/var/lib/apt/lists/partial/archive.ubuntu.com_ubuntu_dists_oneiric_restricted_binary-i386_Packages Hash Sum mismatch , W:Failed to fetch bzip2:/var/lib/apt/lists/partial/archive.ubuntu.com_ubuntu_dists_oneiric_universe_binary-i386_Packages Hash Sum mismatch , W:Failed to fetch bzip2:/var/lib/apt/lists/partial/archive.ubuntu.com_ubuntu_dists_oneiric_multiverse_binary-i386_Packages Hash Sum mismatch , W:Failed to fetch htt p ://archive.ubuntu.com/ubuntu/dists/oneiric/main/i18n/Index No Hash entry in Release file /var/lib/apt/lists/partial/archive.ubuntu.com_ubuntu_dists_oneiric_main_i18n_Index , W:Failed to fetch http ://archive.ubuntu.com/ubuntu/dists/oneiric/multiverse/i18n/Index No Hash entry in Release file /var/lib/apt/lists/partial/archive.ubuntu.com_ubuntu_dists_oneiric_multiverse_i18n_Index , W:Failed to fetch htt ://archive.ubuntu.com/ubuntu/dists/oneiric/restricted/i18n/Index No Hash entry in Release file /var/lib/apt/lists/partial/archive.ubuntu.com_ubuntu_dists_oneiric_restricted_i18n_Index , W:Failed to fetch ht tp://archive.ubuntu.com/ubuntu/dists/oneiric/universe/i18n/Index No Hash entry in Release file /var/lib/apt/lists/partial/archive.ubuntu.com_ubuntu_dists_oneiric_universe_i18n_Index , W:Failed to fetch bzip2:/var/lib/apt/lists/partial/archive.ubuntu.com_ubuntu_dists_oneiric-updates_main_source_Sources Hash Sum mismatch , W:Failed to fetch bzip2:/var/lib/apt/lists/partial/archive.ubuntu.com_ubuntu_dists_oneiric-updates_restricted_binary-i386_Packages Hash Sum mismatch , W:Failed to fetch bzip2:/var/lib/apt/lists/partial/archive.ubuntu.com_ubuntu_dists_oneiric-updates_universe_binary-i386_Packages Hash Sum mismatch , W:Failed to fetch bzip2:/var/lib/apt/lists/partial/archive.ubuntu.com_ubuntu_dists_oneiric-updates_multiverse_binary-i386_Packages Hash Sum mismatch , W:Failed to fetch h tp://archive.ubuntu.com/ubuntu/dists/oneiric-updates/main/i18n/Index No Hash entry in Release file /var/lib/apt/lists/partial/archive.ubuntu.com_ubuntu_dists_oneiric-updates_main_i18n_Index , W:Failed to fetch h ttp://archive.ubuntu.com/ubuntu/dists/oneiric-updates/multiverse/i18n/Index No Hash entry in Release file /var/lib/apt/lists/partial/archive.ubuntu.com_ubuntu_dists_oneiric-updates_multiverse_i18n_Index , W:Failed to fetch ht tp://archive.ubuntu.com/ubuntu/dists/oneiric-updates/restricted/i18n/Index No Hash entry in Release file /var/lib/apt/lists/partial/archive.ubuntu.com_ubuntu_dists_oneiric-updates_restricted_i18n_Index , W:Failed to fetch htt p://archive.ubuntu.com/ubuntu/dists/oneiric-updates/universe/i18n/Index No Hash entry in Release file /var/lib/apt/lists/partial/archive.ubuntu.com_ubuntu_dists_oneiric-updates_universe_i18n_Index , W:Failed to fetch bzip2:/var/lib/apt/lists/partial/archive.ubuntu.com_ubuntu_dists_oneiric-backports_main_source_Sources Hash Sum mismatch , W:Failed to fetch bzip2:/var/lib/apt/lists/partial/archive.ubuntu.com_ubuntu_dists_oneiric-backports_restricted_binary-i386_Packages Hash Sum mismatch , W:Failed to fetch bzip2:/var/lib/apt/lists/partial/archive.ubuntu.com_ubuntu_dists_oneiric-backports_universe_binary-i386_Packages Hash Sum mismatch , W:Failed to fetch bzip2:/var/lib/apt/lists/partial/archive.ubuntu.com_ubuntu_dists_oneiric-backports_multiverse_binary-i386_Packages Hash Sum 
mismatch , W:Failed to fetch h ttp://archive.ubuntu.com/ubuntu/dists/oneiric-backports/main/i18n/Index No Hash entry in Release file /var/lib/apt/lists/partial/archive.ubuntu.com_ubuntu_dists_oneiric-backports_main_i18n_Index , W:Failed to fetch h ttp://archive.ubuntu.com/ubuntu/dists/oneiric-backports/multiverse/i18n/Index No Hash entry in Release file /var/lib/apt/lists/partial/archive.ubuntu.com_ubuntu_dists_oneiric-backports_multiverse_i18n_Index , W:Failed to fetch ht tp://archive.ubuntu.com/ubuntu/dists/oneiric-backports/restricted/i18n/Index No Hash entry in Release file /var/lib/apt/lists/partial/archive.ubuntu.com_ubuntu_dists_oneiric-backports_restricted_i18n_Index , W:Failed to fetch ht tp://archive.ubuntu.com/ubuntu/dists/oneiric-backports/universe/i18n/Index No Hash entry in Release file /var/lib/apt/lists/partial/archive.ubuntu.com_ubuntu_dists_oneiric-backports_universe_i18n_Index , W:Failed to fetch bzip2:/var/lib/apt/lists/partial/archive.ubuntu.com_ubuntu_dists_oneiric-security_main_source_Sources Hash Sum mismatch , E:Some index files failed to download. They have been ignored, or old ones used instead." Neglect the http typos, its to evade the 2 hyperlink.. I tried making a file in the apt.conf.d folder also, and added proxy entries, also set the system proxy.. and also the "network proxy", but nothing works.. And now I cant install any software!! Help needed

    Read the article

  • CodePlex Daily Summary for Thursday, February 17, 2011

    CodePlex Daily Summary for Thursday, February 17, 2011Popular ReleasesMarr DataMapper: Marr Datamapper 2.7 beta: - Changed QueryToGraph relationship rules: 1) Any parent entity (entity with children) must have at least one PK specified or an exception will be thrown 2) All 1-M relationship entities must have at least one PK specified or an exception will be thrown Only 1-1 entities with no children are allowed to have 0 PKs specified. - fixed AutoQueryToGraph bug (columns in graph children were being included in the select statement)datajs - JavaScript Library for data-centric web applications: datajs version 0.0.2: This release adds support for parsing DateTime and DateTimeOffset properties into javascript Date objects and serialize them back.thinktecture WSCF.blue: WSCF.blue V1 Update (1.0.11): Features Added a new option that allows properties on data contract types to be marked as virtual. Bug Fixes Fixed a bug caused by certain project properties not being available on Web Service Software Factory projects. Fixed a bug that could result in the WrapperName value of the MessageContractAttribute being incorrect when the Adjust Casing option is used. The menu item code now caters for CommandBar instances that are not available. For example the Web Item CommandBar does not exist ...Document.Editor: 2011.5: Whats new for Document.Editor 2011.5: New export to email New export to image New document background color Improved Tooltips Minor Bug Fix's, improvements and speed upsTerminals: Version 2 - RC1: The "Clean Install" will overwrite your log4net configuration (if you have one). If you run in a Portable Environment, you can use the "Clean Install" and target your portable folder. Tested and it works fine. Changes for this release: Re-worked on the Toolstip settings are done, just to avoid the vs.net clash with auto-generating files for .settings files. renamed it to .settings.config Packged both log4net and ToolStripSettings files into the installer Upgraded the version inform...Export Test Cases From TFS: Test Case Export to Excel 1.0: Team Foundation Server (TFS) 2010 enables the users to manage test cases as Work Item(s). The complete description of the test case along with steps can be managed as single Work Item in TFS 2010. Before migrating to TFS 2010 many test teams will be using MS Excel to manage the test cases (or test scripts). However, after migrating to TFS 2010, test teams can manage the test cases in the server but there may be need to get the test cases into excel sheet like approval from Business Analysts ...WriteableBitmapEx: WriteableBitmapEx 0.9.7.0: Fixed many bugs. Added the Rotate method which rotates the bitmap in 90° steps clockwise and returns a new rotated WriteableBitmap. Added a Flip method with support for FlipMode.Vertical and FlipMode.Horizontal. Added a new Filter extension file with a convolution method and some kernel templates (Gaussian, Sharpen). Added the GetBrightness method, which returns the brightness / luminance of the pixel at the x, y coordinate as byte. Added the ColorKeying BlendMode. Added boundary ...AllNewsManager.NET: AllNewsManager.NET 1.3: AllNewsManager.NET 1.3. This new version provide several new features, improvements and bug fixes. Some new features: Online Users. Avatars. Copy function (to create a new article from another one). SEO improvements (friendly urls). New admin buttons. 
And more...Facebook Graph Toolkit: Facebook Graph Toolkit 0.8: Version 0.8 (15 Feb 2011)moved to Beta stage publish photo feature "email" field of User object added new Graph Api object: Group, Event new Graph Api connection: likes, groups, eventsDJME - The jQuery extensions for ASP.NET MVC: DJME2 -The jQuery extensions for ASP.NET MVC beta2: The source code and runtime library for DJME2. For more product info you can goto http://www.dotnetage.com/djme.html What is new ?The Grid extension added The ModelBinder added which helping you create Bindable data Action. The DnaFor() control factory added that enabled Model bindable extensions. Enhance the ListBox , ComboBox data binding.Jint - Javascript Interpreter for .NET: Jint - 0.9.0: New CLR interoperability features Many bugfixesBuild Version Increment Add-In Visual Studio: Build Version Increment v2.4.11046.2045: v2.4.11046.2045 Fixes and/or Improvements:Major: Added complete support for VC projects including .vcxproj & .vcproj. All padding issues fixed. A project's assembly versions are only changed if the project has been modified. Minor Order of versioning style values is now according to their respective positions in the attributes i.e. Major, Minor, Build, Revision. Fixed issue with global variable storage with some projects. Fixed issue where if a project item's file does not exist, a ...Coding4Fun Tools: Coding4Fun.Phone.Toolkit v1.1: Coding4Fun.Phone.Toolkit v1.1 release. Bug fixes and minor feature requests addedTV4Home - The all-in-one TV solution!: 0.1.0.0 Preview: This is the beta preview release of the TV4Home software.Finestra Virtual Desktops: 1.2: Fixes a few minor issues with 1.1 including the broken per-desktop backgrounds Further improves the speed of switching desktops A few UI performance improvements Added donations linksNuGet: NuGet 1.1: NuGet is a free, open source developer focused package management system for the .NET platform intent on simplifying the process of incorporating third party libraries into a .NET application during development. This release is a Visual Studio 2010 extension and contains the the Package Manager Console and the Add Package Dialog. The URL to the package OData feed is: http://go.microsoft.com/fwlink/?LinkID=206669 To see the list of issues fixed in this release, visit this our issues listEnhSim: EnhSim 2.4.0: 2.4.0This release supports WoW patch 4.06 at level 85 To use this release, you must have the Microsoft Visual C++ 2010 Redistributable Package installed. This can be downloaded from http://www.microsoft.com/downloads/en/details.aspx?FamilyID=A7B7A05E-6DE6-4D3A-A423-37BF0912DB84 To use the GUI you must have the .NET 4.0 Framework installed. This can be downloaded from http://www.microsoft.com/downloads/en/details.aspx?FamilyID=9cfb2d51-5ff4-4491-b0e5-b386f32c0992 Changes since 2.3.0 - Upd...Sterling Isolated Storage Database with LINQ for Silverlight and Windows Phone 7: Sterling OODB v1.0: Note: use this changeset to download the source example that has been extended to show database generation, backup, and restore in the desktop example. Welcome to the Sterling 1.0 RTM. This version is not backwards-compatible with previous versions of Sterling. Sterling is also available via NuGet. This product has been used and tested in many applications and contains a full suite of unit tests. 
You can refer to the User's Guide for complete documentation, and use the unit tests as guide...PDF Rider: PDF Rider 0.5.1: Changes from the previous version * Use dynamic layout to better fit text in other languages * Includes French and Spanish localizations Prerequisites * Microsoft Windows Operating Systems (XP - Vista - 7) * Microsoft .NET Framework 3.5 runtime * A PDF rendering software (i.e. Adobe Reader) that can be opened inside Internet Explorer. Installation instructionsChoose one of the following methods: 1. Download and run the "pdfRider0.5.1-setup.exe" (reccomended) 2. Down...Snoop, the WPF Spy Utility: Snoop 2.6.1: This release is a bug fixing release. Most importantly, issues have been seen around WPF 4.0 applications not always showing up in the app chooser. Hopefully, they are fixed now. I thought this issue warranted a minor release since more and more people are going WPF 4.0 and I don't want anyone to have any problems. Dan Hanan also contributes again with several usability features. Thanks Dan! Happy Snooping! p.s. By request, I am also attaching a .zip file ... so that people can install it ...New ProjectsAlpe d'HuZes: This project contains the source for Alpe d'HuZes, an organization that fights the cancer disease, by giving people the chance to ride the Alpe d'Hues, a mountain in france. By climbing this mountain, money is collected which is entirely donated to the "kankerbestrijding".AstroLib: Astronomical libraryDevon: Devon_Projectearthquake predictor: This project is attempt to create earthquake prediction application , which can help save lives. It is based on theory that number of lost pets, before earthquake, growing up. This statistics can be obtained from free news papers, boards, forums... Technology : C#, ASPX, .NET 4FCNS.Calendar: FCNS.Calendar ??? MonoCalendar(?????????) ?.NET??????????,??????Mac???????????????iCal?????。???????????.??????????.?????????????????.????????????????. FlashRelease [O-GO.ru edition]: FlashRelease it's a tool for easy create description of new video\music\game torrent releases. Developed in Delphi.Forms based authentication for SharePoint2010: Forms based authentication Management features for SharePoint 2010. <a href="http://www.softwarediscipline.com/post/2011/01/03/Forms-based-authentication-feature-SharePoint-2010.aspx" alt="SharePoint 2010 FBA management feature">SharePoint 2010 FBA feature</a>ITune.LittleTools: Reads your ITune library and copy your track rating in ITune on to your file in windows.MingleProject: Just a simple project that showcases the power of ASP.NET and Visual StudioMvcContrib UI Extensions - Themed Grid & Menu: UI Extensions to the MvcContrib Project - Themed Grid & MenuNDOS Azure: Windows Azure projects developed at the "Open Source and Interoperability Development Nucleous" (http://ndos.codeplex.com/) at Universidade Federal do Rio Grande do Sul (http://www.ufrgs.br).ObjectDumper: ObjectDumper takes a normal .NET object and dumps it to a string, TextWriter, or file. Handy for debugging purposes.Open Analytic: Open Analytic is an open source business intelligence framework that believes in simplicity. 
The framework is developed with .NET language and can be easily integrated in custom product development.Pomodoro.Oob Expression Blend Example App: This is a non-functional Silverlight 4 Out-of-Browser app to demonstrate functionality in Expression Blend and to accompany user group talks and presentations on Blend.Senior Design: Uconn senior design project!Service Monitors - A services health monitoring tool: The idea behind this project is simple, I want to know when a service related to my application is not available. Our first intent to get a tool to generete the necessary data to be compliant with the Availability SLA of our systems. SGO: OrganizerSina Weibo QReminder: A handy utility that display remind message in the browser title for sina weibo.System Center Configuration Manager Integration Pack Extention: This integration pack adds some additional integration points for Opalis to System Center Configuration Manager. These functions are used in my User Self imaging workflow that will be demoed at MMS 2011.TwittaFox: TwittaFox ist ein kleiner Twitter-Client welcher direkt aus dem Tray angesprochen werden kann.Ultralight Markup: Ultralight Markup makes it easier for webmasters to allow safe user comments. It features a stripped-down intermediate markup language meant to bridge the gap between text entry and HTML. And the project includes an ASP.NET MVC implementation with a Javascript editor.Unit Conversion Library: Unit Conversion Library is a .Net 2.0 based library, containing static methods for all the Units Set present in Windows 7 calculator. "Angle", "Area", "Energy", "Length", "Power", "Pressure", "Temperature",Time", "Velocity", "Volume", "Weight/Mass".UTB-PFIII-TermProj-Team DeLaFuente, Vasquez, Morales, Dartez: This is the group project for UTB-PFIII Team project. Authors include David De La Fuente, Louis Dartez, Juan Vasquez and Froylan Morales.Version History to InfoPath Custom List Form: The ability to add a button to view the version history of an item when the display form is modified in InfoPath allows a user easy access to view versioning information. Out of the box, SharePoint does not allow this ability. This is a sandboxed solution.WeatherCN - ????: WeatherCN - ????WinformsPOCMVP: This is a simple, and small proof of concept for the Model View Presenter UI design pattern with C# WinForms.worldbestwebsites: Customer Connecting Websites A website development for customer connecting

    Read the article

  • Windows Azure Service Bus Splitter and Aggregator

    - by Alan Smith
    This article will cover basic implementations of the Splitter and Aggregator patterns using the Windows Azure Service Bus. The content will be included in the next release of the “Windows Azure Service Bus Developer Guide”, along with some other patterns I am working on. I’ve taken the pattern descriptions from the book “Enterprise Integration Patterns” by Gregor Hohpe. I bought a copy of the book in 2004, and recently dusted it off when I started to look at implementing the patterns on the Windows Azure Service Bus. Gregor has also presented a session in 2011, “Enterprise Integration Patterns: Past, Present and Future”, which is well worth a look. I’ll be covering more patterns in the coming weeks; I’m currently working on Wire-Tap and Scatter-Gather. There will no doubt be a section on implementing these patterns in my “SOA, Connectivity and Integration using the Windows Azure Service Bus” course. There are a number of scenarios where a message needs to be divided into a number of sub messages, and also where a number of sub messages need to be combined to form one message. The splitter and aggregator patterns provide a definition of how this can be achieved. This section will focus on the implementation of basic splitter and aggregator patterns using the Windows Azure Service Bus direct programming model. In BizTalk Server receive pipelines are typically used to implement the splitter pattern, with sequential convoy orchestrations often used to aggregate messages. In the current release of the Service Bus, there is no functionality in the direct programming model that implements these patterns, so it is up to the developer to implement them in the applications that send and receive messages. Splitter A message splitter takes a message and splits the message into a number of sub messages. As there are different scenarios for how a message can be split into sub messages, message splitters are implemented using different algorithms. The Enterprise Integration Patterns book describes the splitter pattern as follows: How can we process a message if it contains multiple elements, each of which may have to be processed in a different way? Use a Splitter to break out the composite message into a series of individual messages, each containing data related to one item. The Enterprise Integration Patterns website provides a description of the Splitter pattern here. In some scenarios a batch message could be split into the sub messages that are contained in the batch. The splitting of a message could be based on the message type of the sub-message, or the trading partner that the sub message is to be sent to. Aggregator An aggregator takes a stream of related messages and combines them to form one message. The Enterprise Integration Patterns book describes the aggregator pattern as follows: How do we combine the results of individual, but related messages so that they can be processed as a whole? Use a stateful filter, an Aggregator, to collect and store individual messages until a complete set of related messages has been received. Then, the Aggregator publishes a single message distilled from the individual messages. The Enterprise Integration Patterns website provides a description of the Aggregator pattern here. A common example of the need for an aggregator is in scenarios where a stream of messages needs to be combined into a daily batch to be sent to a legacy line-of-business application.
The BizTalk Server EDI functionality provides support for batching messages in this way using a sequential convoy orchestration. Scenario The scenario for this implementation of the splitter and aggregator patterns is the sending and receiving of large messages using a Service Bus queue. In the current release, the Windows Azure Service Bus currently supports a maximum message size of 256 KB, with a maximum header size of 64 KB. This leaves a safe maximum body size of 192 KB. The BrokeredMessage class will support messages larger than 256 KB; in fact the Size property is of type long, implying that very large messages may be supported at some point in the future. The 256 KB size restriction is set in the service bus components that are deployed in the Windows Azure data centers. One of the ways of working around this size restriction is to split large messages into a sequence of smaller sub messages in the sending application, send them via a queue, and then reassemble them in the receiving application. This scenario will be used to demonstrate the pattern implementations. Implementation The splitter and aggregator will be used to provide functionality to send and receive large messages over the Windows Azure Service Bus. In order to make the implementations generic and reusable they will be implemented as a class library. The splitter will be implemented in the LargeMessageSender class and the aggregator in the LargeMessageReceiver class. A class diagram showing the two classes is shown below. Implementing the Splitter The splitter will take a large brokered message, and split the messages into a sequence of smaller sub-messages that can be transmitted over the service bus messaging entities. The LargeMessageSender class provides a Send method that takes a large brokered message as a parameter. The implementation of the class is shown below; console output has been added to provide details of the splitting operation. public class LargeMessageSender {     private static int SubMessageBodySize = 192 * 1024;     private QueueClient m_QueueClient;       public LargeMessageSender(QueueClient queueClient)     {         m_QueueClient = queueClient;     }       public void Send(BrokeredMessage message)     {         // Calculate the number of sub messages required.         long messageBodySize = message.Size;         int nrSubMessages = (int)(messageBodySize / SubMessageBodySize);         if (messageBodySize % SubMessageBodySize != 0)         {             nrSubMessages++;         }           // Create a unique session Id.         string sessionId = Guid.NewGuid().ToString();         Console.WriteLine("Message session Id: " + sessionId);         Console.Write("Sending {0} sub-messages", nrSubMessages);           Stream bodyStream = message.GetBody<Stream>();         for (int streamOffest = 0; streamOffest < messageBodySize;             streamOffest += SubMessageBodySize)         {                                     // Get the stream chunk from the large message             long arraySize = (messageBodySize - streamOffest) > SubMessageBodySize                 ? 
SubMessageBodySize : messageBodySize - streamOffest;             byte[] subMessageBytes = new byte[arraySize];             int result = bodyStream.Read(subMessageBytes, 0, (int)arraySize);             MemoryStream subMessageStream = new MemoryStream(subMessageBytes);               // Create a new message             BrokeredMessage subMessage = new BrokeredMessage(subMessageStream, true);             subMessage.SessionId = sessionId;               // Send the message             m_QueueClient.Send(subMessage);             Console.Write(".");         }         Console.WriteLine("Done!");     }} The LargeMessageSender class is initialized with a QueueClient that is created by the sending application. When the large message is sent, the number of sub messages is calculated based on the size of the body of the large message. A unique session Id is created to allow the sub messages to be sent as a message session, this session Id will be used for correlation in the aggregator. A for loop in then used to create the sequence of sub messages by creating chunks of data from the stream of the large message. The sub messages are then sent to the queue using the QueueClient. As sessions are used to correlate the messages, the queue used for message exchange must be created with the RequiresSession property set to true. Implementing the Aggregator The aggregator will receive the sub messages in the message session that was created by the splitter, and combine them to form a single, large message. The aggregator is implemented in the LargeMessageReceiver class, with a Receive method that returns a BrokeredMessage. The implementation of the class is shown below; console output has been added to provide details of the splitting operation.   public class LargeMessageReceiver {     private QueueClient m_QueueClient;       public LargeMessageReceiver(QueueClient queueClient)     {         m_QueueClient = queueClient;     }       public BrokeredMessage Receive()     {         // Create a memory stream to store the large message body.         MemoryStream largeMessageStream = new MemoryStream();           // Accept a message session from the queue.         MessageSession session = m_QueueClient.AcceptMessageSession();         Console.WriteLine("Message session Id: " + session.SessionId);         Console.Write("Receiving sub messages");           while (true)         {             // Receive a sub message             BrokeredMessage subMessage = session.Receive(TimeSpan.FromSeconds(5));               if (subMessage != null)             {                 // Copy the sub message body to the large message stream.                 Stream subMessageStream = subMessage.GetBody<Stream>();                 subMessageStream.CopyTo(largeMessageStream);                   // Mark the message as complete.                 subMessage.Complete();                 Console.Write(".");             }             else             {                 // The last message in the sequence is our completeness criteria.                 Console.WriteLine("Done!");                 break;             }         }                     // Create an aggregated message from the large message stream.         BrokeredMessage largeMessage = new BrokeredMessage(largeMessageStream, true);         return largeMessage;     } }   The LargeMessageReceiver initialized using a QueueClient that is created by the receiving application. The receive method creates a memory stream that will be used to aggregate the large message body. 
The AcceptMessageSession method on the QueueClient is then called, which will wait for the first message in a message session to become available on the queue. As the AcceptMessageSession can throw a timeout exception if no message is available on the queue after 60 seconds, a real-world implementation should handle this accordingly. Once the message session as accepted, the sub messages in the session are received, and their message body streams copied to the memory stream. Once all the messages have been received, the memory stream is used to create a large message, that is then returned to the receiving application. Testing the Implementation The splitter and aggregator are tested by creating a message sender and message receiver application. The payload for the large message will be one of the webcast video files from http://www.cloudcasts.net/, the file size is 9,697 KB, well over the 256 KB threshold imposed by the Service Bus. As the splitter and aggregator are implemented in a separate class library, the code used in the sender and receiver console is fairly basic. The implementation of the main method of the sending application is shown below.   static void Main(string[] args) {     // Create a token provider with the relevant credentials.     TokenProvider credentials =         TokenProvider.CreateSharedSecretTokenProvider         (AccountDetails.Name, AccountDetails.Key);       // Create a URI for the serivce bus.     Uri serviceBusUri = ServiceBusEnvironment.CreateServiceUri         ("sb", AccountDetails.Namespace, string.Empty);       // Create the MessagingFactory     MessagingFactory factory = MessagingFactory.Create(serviceBusUri, credentials);       // Use the MessagingFactory to create a queue client     QueueClient queueClient = factory.CreateQueueClient(AccountDetails.QueueName);       // Open the input file.     FileStream fileStream = new FileStream(AccountDetails.TestFile, FileMode.Open);       // Create a BrokeredMessage for the file.     BrokeredMessage largeMessage = new BrokeredMessage(fileStream, true);       Console.WriteLine("Sending: " + AccountDetails.TestFile);     Console.WriteLine("Message body size: " + largeMessage.Size);     Console.WriteLine();         // Send the message with a LargeMessageSender     LargeMessageSender sender = new LargeMessageSender(queueClient);     sender.Send(largeMessage);       // Close the messaging facory.     factory.Close();  } The implementation of the main method of the receiving application is shown below. static void Main(string[] args) {       // Create a token provider with the relevant credentials.     TokenProvider credentials =         TokenProvider.CreateSharedSecretTokenProvider         (AccountDetails.Name, AccountDetails.Key);       // Create a URI for the serivce bus.     Uri serviceBusUri = ServiceBusEnvironment.CreateServiceUri         ("sb", AccountDetails.Namespace, string.Empty);       // Create the MessagingFactory     MessagingFactory factory = MessagingFactory.Create(serviceBusUri, credentials);       // Use the MessagingFactory to create a queue client     QueueClient queueClient = factory.CreateQueueClient(AccountDetails.QueueName);       // Create a LargeMessageReceiver and receive the message.     
LargeMessageReceiver receiver = new LargeMessageReceiver(queueClient);     BrokeredMessage largeMessage = receiver.Receive();       Console.WriteLine("Received message");     Console.WriteLine("Message body size: " + largeMessage.Size);       string testFile = AccountDetails.TestFile.Replace(@"\In\", @"\Out\");     Console.WriteLine("Saving file: " + testFile);       // Save the message body as a file.     Stream largeMessageStream = largeMessage.GetBody<Stream>();     largeMessageStream.Seek(0, SeekOrigin.Begin);     FileStream fileOut = new FileStream(testFile, FileMode.Create);     largeMessageStream.CopyTo(fileOut);     fileOut.Close();       Console.WriteLine("Done!"); } In order to test the application, the sending application is executed, which will use the LargeMessageSender class to split the message and place it on the queue. The output of the sender console is shown below. The console shows that the body size of the large message was 9,929,365 bytes, and the message was sent as a sequence of 51 sub messages. When the receiving application is executed the results are shown below. The console application shows that the aggregator has received the 51 messages from the message sequence that was created in the sending application. The messages have been aggregated to form a message with a body of 9,929,365 bytes, which is the same as the original large message. The message body is then saved as a file. Improvements to the Implementation The splitter and aggregator patterns in this implementation were created in order to show the usage of the patterns in a demo, which they do quite well. When implementing these patterns in a real-world scenario there are a number of improvements that could be made to the design. Copying Message Header Properties When sending a large message using these classes, it would be great if the message header properties in the message that was received were copied from the message that was sent. The sending application may well add information to the message context that will be required in the receiving application. When the sub messages are created in the splitter, the header properties in the first message could be set to the values in the original large message. The aggregator could then use the values from this first sub message to set the properties in the message header of the large message during the aggregation process. (A small sketch of this idea is shown at the end of this article.) Using Asynchronous Methods The current implementation uses the synchronous send and receive methods of the QueueClient class. It would be much more performant to use the asynchronous methods; however, doing so may well affect the sequence in which the sub messages are enqueued, which would require the implementation of a resequencer in the aggregator to restore the correct message sequence. Handling Exceptions In order to keep the code readable no exception handling was added to the implementations. In a real-world scenario exceptions should be handled accordingly.
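As an illustration of the header-property improvement described above, the following is a minimal sketch rather than part of the original sample. It assumes the LargeMessageSender and LargeMessageReceiver classes shown earlier and the Properties collection of the BrokeredMessage class (an IDictionary<string, object> in the Microsoft.ServiceBus.Messaging API); the helper name MessagePropertyHelper and the firstSubMessage variable are introduced here purely for illustration.

using System.Collections.Generic;
using Microsoft.ServiceBus.Messaging;

public static class MessagePropertyHelper
{
    // Copy the application-defined header properties from one BrokeredMessage to another.
    public static void CopyUserProperties(BrokeredMessage source, BrokeredMessage target)
    {
        foreach (KeyValuePair<string, object> property in source.Properties)
        {
            target.Properties[property.Key] = property.Value;
        }
    }
}

// In LargeMessageSender.Send, the first sub message could carry the original header:
//     if (streamOffest == 0)
//     {
//         MessagePropertyHelper.CopyUserProperties(message, subMessage);
//     }
//
// In LargeMessageReceiver.Receive, the receiver would need to keep a reference to the
// first sub message it receives (firstSubMessage below is hypothetical) and copy its
// properties onto the aggregated message before returning it:
//     MessagePropertyHelper.CopyUserProperties(firstSubMessage, largeMessage);

A similarly small change would address the exception-handling point: wrapping the AcceptMessageSession call in the receiver in a try/catch for TimeoutException and deciding there how the receiving application should react when no message session becomes available.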

    Read the article

  • Save the Date - Oracle Partner Community Forum: Exadata, Exalogic and Manageability, Vienna, 23-24 April 2013

    - by Javier Puerta
    SAVE THE DATE ORACLE PARTNER COMMUNITY FORUM: EXADATA, EXALOGIC AND MANAGEABILITY 23-24 APRIL 2013, VIENNA, AUSTRIA The 2013 event expands its scope to cover all the building blocks of the Cloud infrastructure: Exadata, Exalogic and Manageability! Dear partner I am delighted to announce the 2013 edition of the Exadata, Exalogic and Manageability Partner Community Forum for EMEA partners which will take place in Vienna, Austria, on April 23-24, 2013. After the experience of last year where we ran a joint Exadata and Manageability event, we received requests from many of you to add also Exalogic to the scope of the forum, and this way to cover the complete infrastructure architecture on the Exa platform. The continued market adoption of Exadata and Exalogic is being paralleled by a growth in the rate of projects sold and implemented by partners. Sharing customer cases and best-practices presented by other partners constitutes the core of this event. If you want to present an experience of your company around Exadata, Exalogic or Manageability that can be a learning experience for other partners, we still have some slots in the agenda. (Please contact Javier Puerta if you want to present.) Attending the Community Forum you will also have the opportunity to get Oracle’s insight on new products and market trends. And, of course, interact with the Oracle executives responsible for the Exadata, Exalogic and Manageability business. The atmosphere of beautiful Vienna will be the scenario of the event. Detailed venue and hotel booking information will be sent to you in January. Don't miss out on attending this key event! Save the date now - 23 & 24 April 2013, and watch out for your formal invitation coming soon.
Kind regards, Javier Puerta Core Technology Partner Programs, Oracle EMEA E-Mail: [email protected] Jürgen Kress SOA Partner Adoption Oracle EMEA E-Mail: [email protected]

    Read the article

  • Save the Date - Oracle Partner Community Forum: Exadata, Exalogic and Manageability, Vienna, 23-24 April 2013

    - by Javier Puerta
    Hardware and Software Engineered to Work Together .Ritu { font-family: Arial, Helvetica, sans-serif; } .Ritu { font-family: Arial, Helvetica, sans-serif; } .Ritu { font-family: Arial, Helvetica, sans-serif; } body,td,th { font-family: Arial, Helvetica, sans-serif; font-size: x-small; } .color { color: #F00; } .c { color: #F00; } .c { color: #F00; } .c { color: #000; font-size: xx-small; } .c a { color: #F00; } .c { color: #F00; } .cl { color: #F00; } .b { color: #000; font-size: xx-small; } .i { font-style: italic; } .i { font-style: italic; } .i { font-style: italic; } .i { font-style: italic; } .i { font-style: italic; } .c { color: #F00; font-size: small; } .b { font-weight: bold; font-size: x-small; } .c { color: #F00; font-size: x-small; } .clr { color: #F00; } .c { color: #F00; } inside the Click Here The order you must follow to make the colored link appear in browsers. If not the default window link will appear 1. Select the word you want to use for the link 2. Select the desired color, Red, Black, etc 3. Select bold if necessary ___________________________________________________________________________________________________________________ Templates use two sizes of fonts and the sans-serif font tag for the email. All Fonts should be (Arial, Helvetica, sans-serif) tags Normal size reading body fonts should be set to the size of 2. Small font sizes should be set to 1 !!!!!!!DO NOT USE ANY OTHER SIZE FONT FOR THE EMAILS!!!!!!!! ___________________________________________________________________________________________________________________ -- Oracle PartnerNetwork | Account | Feedback SAVE THE DATE ORACLE PARTNER COMMUNITY FORUM: EXADATA, EXALOGIC AND MANAGEABILITY 23-24 APRIL 2013, VIENNA, AUSTRIA The 2013 event expands its scope to cover all the building blocks of the Cloud infrastructure: Exadata, Exalogic and Manageability! Dear partner I am delighted to announce the 2013 edition of the Exadata, Exalogic and Manageability Partner Community Forum for EMEA partners which will take place in Vienna, Austria, on April 23-24, 2013. After the experience of last year where we ran a joint Exadata and Manageability event, we received requests from many of you to add also Exalogic to the scope of the forum, and this way to cover the complete infrastructure architecture on the Exa platform. The continued market adoption of Exadata and Exalogic is being paralleled by a growth in the rate of projects sold and implemented by partners. Sharing customer cases and best-practices presented by other partners constitutes the core of this event. If you want to present an experience of your company around Exadata, Exalogic or Manageability that can be a learning experience for other partners, we still have some slots in the agenda. (Please contact Javier Puerta if you want to present.) Attending the Community Forum you will also have the opportunity to get Oracle’s insight on new products and market trends. And, of course, interact with the Oracle executives responsible for the Exadata, Exalogic and Manageability business. The atmosphere of beautiful Vienna will be the scenario of the event. Detailed venue and hotel booking information will be sent to you in January. Don't miss out on attending this key event! Save the date now - 23 & 24 April 2013, and watch out for your formal invitation coming soon. 
Kind regards, Javier Puerta Core Technology Partner Programs, Oracle EMEA E-Mail: [email protected] Jürgen Kress SOA Partner Adoption Oracle EMEA E-Mail: [email protected]

    Read the article

  • CodePlex Daily Summary for Tuesday, January 04, 2011

    CodePlex Daily Summary for Tuesday, January 04, 2011Popular ReleasesMini Memory Dump Diagnosis using Asp.Net: MiniDump HealthMonitor: Enable developers to generate mini memory dumps in case of unhandled exceptions or for specific exception scenarios without using any external tools , only using Asp.Net 2.0 and above. Many times in production , QA Servers the issues require post-mortem or low-level system debugging efforts. This Memory dump generator will help in those scenarios.EnhSim: EnhSim 2.2.9 BETA: 2.2.9 BETAThis release supports WoW patch 4.03a at level 85 To use this release, you must have the Microsoft Visual C++ 2010 Redistributable Package installed. This can be downloaded from http://www.microsoft.com/downloads/en/details.aspx?FamilyID=A7B7A05E-6DE6-4D3A-A423-37BF0912DB84 To use the GUI you must have the .NET 4.0 Framework installed. This can be downloaded from http://www.microsoft.com/downloads/en/details.aspx?FamilyID=9cfb2d51-5ff4-4491-b0e5-b386f32c0992 - Added in the Gobl...xUnit.net - Unit Testing for .NET: xUnit.net 1.7 Beta: xUnit.net release 1.7 betaBuild #1533 Important notes for Resharper users: Resharper support has been moved to the xUnit.net Contrib project. Important note for TestDriven.net users: If you are having issues running xUnit.net tests in TestDriven.net, especially on 64-bit Windows, we strongly recommend you upgrade to TD.NET version 3.0 or later. This release adds the following new features: Added support for ASP.NET MVC 3 Added Assert.Equal(double expected, double actual, int precision)...Json.NET: Json.NET 4.0 Release 1: New feature - Added Windows Phone 7 project New feature - Added dynamic support to LINQ to JSON New feature - Added dynamic support to serializer New feature - Added INotifyCollectionChanged to JContainer in .NET 4 build New feature - Added ReadAsDateTimeOffset to JsonReader New feature - Added ReadAsDecimal to JsonReader New feature - Added covariance to IJEnumerable type parameter New feature - Added XmlSerializer style Specified property support New feature - Added ...StyleCop for ReSharper: StyleCop for ReSharper 5.1.14977.000: Prerequisites: ============== o Visual Studio 2008 / Visual Studio 2010 o ReSharper 5.1.1753.4 o StyleCop 4.4.1.2 Preview This release adds no new features, has bug fixes around performance and unhandled errors reported on YouTrack.BloodSim: BloodSim - 1.3.1.0: - Restructured simulation log back end to something less stupid to drastically reduce simulation time and memory usage - Removed a debug log entry that was left over from testing of 1.3.0.0 - Fixed a rounding and calculation error with Haste rating - Added option for Rune of SwordshatteringDbDocument: DbDoc Initial Version: DbDoc Initial versionASP .NET MVC CMS (Content Management System): Atomic CMS 2.1.2: Atomic CMS 2.1.2 release notes Atomic CMS installation guide Kind Of Magic MSBuild Task: Beta 4: Update to keep up with latest bug fixes. To those who don't like Magic/NoMagic attributes, you may change these names in KindOfMagic.targets file: Change this line: <MagicTask Assembly="@(IntermediateAssembly)" References="@(ReferencePath)"/> to something like this: <MagicTask Assembly="@(IntermediateAssembly)" References="@(ReferencePath)" MagicAttribute="MyMagicAttribute" NoMagicAttribute="MyNoMagicAttribute"/>N2 CMS: 2.1: N2 is a lightweight CMS framework for ASP.NET. It helps you build great web sites that anyone can update. 
Major Changes Support for auto-implemented properties ({get;set;}, based on contribution by And Poulsen) All-round improvements and bugfixes File manager improvements (multiple file upload, resize images to fit) New image gallery Infinite scroll paging on news Content templates First time with N2? Try the demo site Download one of the template packs (above) and open the proj...Wii Backup Fusion: Wii Backup Fusion 1.0: - Norwegian translation - French translation - German translation - WBFS dump for analysis - Scalable full HQ cover - Support for log file - Load game images improved - Support for image splitting - Diff for images after transfer - Support for scrubbing modes - Search functionality for log - Recurse depth for Files/Load - Show progress while downloading game cover - Supports more databases for cover download - Game cover loading routines improvedAutoLoL: AutoLoL v1.5.1: Fix: Fixed a bug where pressing Save As would not select the Mastery Directory by default Unexpected errors are now always reported to the user before closing AutoLoL down.* Extracted champion data to Data directory** Added disclaimer to notify users this application has nothing to do with Riot Games Inc. Updated Codeplex image * An error report will be shown to the user which can help the developers to find out what caused the error, this should improve support ** We are working on ...TortoiseHg: TortoiseHg 1.1.8: TortoiseHg 1.1.8 is a minor bug fix release, with minor improvementsBlogEngine.NET: BlogEngine.NET 2.0: Get DotNetBlogEngine for 3 Months Free! Click Here for More Info 3 Months FREE – BlogEngine.NET Hosting – Click Here! If you want to set up and start using BlogEngine.NET right away, you should download the Web project. If you want to extend or modify BlogEngine.NET, you should download the source code. If you are upgrading from a previous version of BlogEngine.NET, please take a look at the Upgrading to BlogEngine.NET 2.0 instructions. To get started, be sure to check out our installatio...Free Silverlight & WPF Chart Control - Visifire: Visifire SL and WPF Charts v3.6.6 Released: Hi, Today we are releasing final version of Visifire, v3.6.6 with the following new feature: * TextDecorations property is implemented in Title for Chart. * TitleTextDecorations property is implemented in Axis. * MinPointHeight property is now applicable for Column and Bar Charts. Also this release includes few bug fixes: * ToolTipText property of DataSeries was not getting applied from Style. * Chart threw exception if IndicatorEnabled property was set to true and Too...StyleCop Compliant Visual Studio Code Snippets: Visual Studio Code Snippets - January 2011: StyleCop Compliant Visual Studio Code Snippets Visual Studio 2010 provides C# developers with 38 code snippets, enhancing developer productivty and increasing the consistency of the code. Within this project the original code snippets have been refactored to provide StyleCop compliant versions of the original code snippets while also adding many new code snippets. Within the January 2011 release you'll find 82 code snippets to make you more productive and the code you write more consistent!...WPF Application Framework (WAF): WPF Application Framework (WAF) 2.0.0.2: Version: 2.0.0.2 (Milestone 2): This release contains the source code of the WPF Application Framework (WAF) and the sample applications. 
Requirements: .NET Framework 4.0 (the package contains a solution file for Visual Studio 2010); the unit test projects require Visual Studio 2010 Professional. Remark: The sample applications are using Microsoft's IoC container MEF. However, the WPF Application Framework (WAF) doesn't force you to use the same IoC container in your application. You can use ... DocX: DocX v1.0.0.11: Building Examples project: To build the Examples project, download DocX.dll and add it as a reference to the project. Overview: This version of DocX contains many bug fixes; it is a serious step towards a stable release. Added: 1) Unit testing project, 2) Examples project, 3) Too many bug fixes to list here, see the source code change list history. Cosmos (C# Open Source Managed Operating System): 71406: This is the second release supporting the full line of Visual Studio 2010 editions. Changes since release 71246 include: Debug info is now stored in a single .cpdb file (which is a Firebird database); Keyboard input works now (using Console.ReadLine); Console colors work (using Console.ForegroundColor and .BackgroundColor). Paint.NET PSD Plugin: 1.6.0: Handling of layer masks has been greatly improved. Improved reliability. Many PSD files that previously loaded in as garbage will now load in correctly. Parallelized loading. PSD files containing layer masks will load in a bit quicker thanks to the removal of the sequential bottleneck. Hidden layers are no longer made visible on save. Many thanks to the users who helped expose the layer masks problem: Rob Horowitz, M_Lyons10. Please keep sending in those bug reports and PSD repro files! New Projects: 3D SharePoint Tag Cloud in Silverlight: This is a Silverlight Tag Cloud intended for use in SharePoint but can be configured for use separately as well. Based on a blog post from Peter Gerritsen (http://blog.petergerritsen.nl/2009/02/14/creating-a-3d-tagcloud-in-silverlight-part-1/). AffinityUI for Unity 3D: AffinityUI makes writing GUI code for Unity 3D easier with a component-based API that allows you to write GUIs in a declarative fashion using a fluent interface, update events and powerful two-way databinding. A visual editor and scene GUI support are planned for the future. calendar synchronization: This will enable tight integration between ivvy core and infragistics scheduler. Will utilize oData and native isolated storage instead of SQLite as planned earlier. Card Games Library: The aim of this project is to offer a framework for creating card game logic and rules. This project is not focused on particular games but offers a flexible and extensible base for creating card games, like poker, solitaire, rummy, etc. DbDocument: The DbDoc tool seeks to be a helper tool side by side with the MS SQL Server Management Studio tool; you can design your DB tables in a visualized way through diagrams and then use the "DbDoc" tool to generate a design document in MS Word table format. Delux: Delux project for education. DnEcuDiag - A .NET Ecu Diagnostic Application: DnEcuDiag is an application framework for developing electronic control unit diagnostic functionality without the need to implement code. Dynamics CRM 4.0 Recycle Bin: This add-on for Microsoft Dynamics CRM 4.0 enables Recycle Bin features (Restore, Permanent Delete) for CRM users. EPPLib.it: The synchronous system for registering and maintaining .it domain names at Registro.it uses the EPP protocol (Extensible Provisioning Protocol).
EPPLib lets you interface your own projects with the Registro.it synchronous system. IoC4Fun: Very lightweight IoC or DI container that only requires .NET Framework 3.5, written in C#. For small apps or educational purposes. No intrusive code added like other containers: just register dependencies and let the container magically resolve the injected dependencies. ISocial: WP7 application that centralizes social networks. JohnnyIssa: begASP.NET Version 4. LiveCPE: LiveCPE is an academic project based on C# and ASP.NET. It aims to be a social network website. MS CRM - Skype Connector: Skype add-on for Microsoft Dynamics CRM that allows dialing CRM Contacts, Leads and so on via Skype. It's developed in ASP.NET and C#. Ryan's Projects: a group of random C# projects. SeleniuMspec: SeleniuMspec is a template for Selenium IDE that generates Selenium tests for MSpec (a .NET BDD framework). SharePoint FAQ Web Part: The SharePoint FAQ Web Part makes it easier for SharePoint users to display FAQs on their SharePoint sites. You'll no longer have to maintain HTML and anchors in the Content Editor Web Part, as the FAQs are now automatically generated from a SharePoint list. Silverlight Planet: Projects from the Silverlight Planet community, www.silverlightplanet.net.br. WP7 Expense Report: Expense report for Windows Phone 7. xll - the easy way to create Excel add-ins: The xll C++ library makes it easy to create xll add-ins for Excel. It also supports the big grid, wide-character strings, and thread-safe functions.

    Read the article

  • How to create a Global Rule that stores a document’s folder path in a custom metadata field

    - by Nicolas Montoya
How to create a Global Rule that stores a document's folder path in a custom metadata field. Efficiency purists would argue that redundancy is not necessary. In real life, we are willing to pay a price for performance, i.e. to have information at our fingertips. We have run into customers opting to store a document's folder path as a document metadata field. They have their reasons, half of the ECM community will agree with them, and the other half would raise an eyebrow. In the end, they are getting creative to achieve their document management goals. The steps below outline how to create a Global Rule that stores a document's folder path in a custom metadata field: Create a Global Rule via Configuration Manager > Rules Tab > Add. Then check "Is global rule with priority". Then check "Use rule activation condition". Then go to "Edit" and check the actions for the Script Properties. Then click OK, and the rule activation condition will appear. Then go to the Fields Tab and add a Rule Field. Select the target Custom Metadata Field and click OK, then check "Is derived field", then "Edit", then go to the Custom Tab in the Script Properties window and enter the custom script below: <$if #active.dCollectionPath$> <$dprDerivedValue=#active.dCollectionPath$> <$else$> <$dprDerivedValue=#active.xCollectionIDPath$> <$endif$> For more information on the dCollectionPath property, see Section 8.2 Folder Services in the Oracle® Fusion Middleware Services Reference Guide for Oracle Universal Content Management 11g Release 1 (11.1.1): http://docs.oracle.com/cd/E21043_01/doc.1111/e11011/c08_folders002.htm The above rule will keep the Custom Metadata Field updated with the folder path information when a document is checked in via the Content Server (CS) Web Interface or the Desktop Integration Suite (DIS).
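Not part of the original post, but for readability here is the same derived-field script broken onto separate lines with comments. It uses only the two properties already referenced above (dCollectionPath and xCollectionIDPath) and standard Idoc Script comment delimiters; behavior is unchanged.
    [[% Derived-field script: populate the custom metadata field with the folder path %]]
    <$if #active.dCollectionPath$>
        [[% dCollectionPath is set, so use it directly %]]
        <$dprDerivedValue=#active.dCollectionPath$>
    <$else$>
        [[% Otherwise fall back to the xCollectionIDPath value %]]
        <$dprDerivedValue=#active.xCollectionIDPath$>
    <$endif$>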

    Read the article

  • How To Quickly Reboot Directly from Windows 7 to XP, Vista, or Ubuntu

    - by The Geek
One of the biggest annoyances with a dual-boot system is having to wait for your PC to reboot to select the operating system you want to switch to, but there's a simple piece of software that can make this process easier. This guest article was written by Ryan Dozier from the Doztech tech blog. With a small piece of software called iReboot we can skip the above step altogether and instantly reboot into the operating system we want right from Windows. Their description says: "Instead of pressing restart, waiting for Windows to shut down, waiting for your BIOS to post, then selecting the operating system you want to boot into (within the bootloader time-limit!); you just select that entry from iReboot and let it do the rest!" Don't worry about iReboot reconfiguring your bootloader or any dual-boot configuration you have. iReboot will only boot the selected operating system once and then go back to your default settings. Using iReboot: iReboot is quick and easy to install. Just download it (link below), run through the setup and select the default configuration. iReboot will automatically figure out what operating systems you have installed and appear in the taskbar. Go over to the taskbar, right-click the iReboot icon and select which operating system you want to reboot into. This adds a check mark on the operating system you want to boot into. On your next reboot the system will automatically load your choice and skip the Windows Boot Manager. If you want to reboot automatically, just select "Reboot on Selection" in the iReboot menu. To be even more productive, you can install iReboot into each Windows operating system to quickly access the others with a few simple clicks. iReboot does not work in Linux, so you will have to reboot manually, then wait for the Windows Boot Manager to load and select your operating system. Conclusion: iReboot works on Windows XP, Windows Vista, and Windows 7, as well as 64-bit versions of these operating systems. Unfortunately iReboot is only available for Windows, but you can still use its functionality in Windows to quickly boot up your Linux machine. A simple reboot in Linux will take you back to the Windows Boot Manager. Download iReboot from neosmart.net. Editor's note: We've not personally tested this software over at How-To Geek, but Neosmart, the author of the software, generally makes quality stuff. Still, you might want to test it out on a test machine first. If you've got any experience with this software, please be sure to let your fellow readers know in the comments.
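Not covered in the article, but for readers who prefer the command line: on Windows Vista and 7 the built-in bcdedit utility can queue a one-time boot entry, which is roughly the mechanism a tool like iReboot automates. This is a hand-written sketch, not taken from the article; run it from an elevated Command Prompt and replace {GUID} with an identifier taken from your own bcdedit output.
    REM List the boot entries and note the "identifier" GUID of the OS you want next
    bcdedit /enum
    REM Queue that entry for the next boot only; the default boot order stays unchanged
    bcdedit /bootsequence {GUID}
    REM Restart immediately
    shutdown /r /t 0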

    Read the article

  • Laptop screen blank after login when external monitor is not connected

    - by Ramon Suarez
Ubuntu does not switch back automatically to the only monitor present when booting after disconnecting the external monitor. Here's a video showing what happens. I get to the login window and everything looks OK, then I type my password, the desktop image shows up and... everything goes blank. It does not happen when I just log in as a guest. When possible I work with my laptop connected to an external screen via the VGA port. The problem comes when I boot the computer without that secondary screen connected: The login screen comes out OK. After login the screen goes black, but I can hear the login sound. If I hit Ctrl + Alt + Backspace and log in again it is sometimes fixed, but not always. If I log in as a different user everything is OK. Then I log in as my user and sometimes it works. To have a screen I have to plug in a monitor. Although I have turned on the laptop display with that monitor on, if I reboot it goes blank again after login, even if I turn off the external monitor before turning off the computer. I've managed to get my screen back with my username after going into recovery mode, but only sometimes. Failsafe would not load after the second screen asking me what I wanted to do (no mouse to click nor keyboard working). My computer is a LDLC Aurore BB1-i5 -8 -S1. Which configuration file keeps the monitor information used by Displays under LightDM, and where is it? I guess if I could edit it I may have a chance :) One of the things I tried, following a solution in another post, was removing my monitors.xml file, but it does not work and I don't know how to create a good one that I could use now. When doing DISPLAY=:0 xrandr I get: Screen 0: minimum 320 x 200, current 320 x 200, maximum 8192 x 8192 LVDS1 connected (normal left inverted right x axis y axis) 1366x768 60.0 + 1360x768 59.8 60.0 1024x768 60.0 800x600 60.3 56.2 640x480 59.9 VGA1 disconnected (normal left inverted right x axis y axis) HDMI1 disconnected (normal left inverted right x axis y axis) DP1 disconnected (normal left inverted right x axis y axis) This is the full dmesg after activating sudo xdiagnose as Bryce suggested. (If you tell me the relevant parts I will paste them here.) When connecting the external monitor, only the external one will work, although using Displays I can see that the computer thinks both are working. I've asked the question on Launchpad but it keeps expiring without any feedback. In my opinion Ubuntu should be able to detect automatically that there is no external monitor present and switch to the laptop monitor. There's a similar question here, but it does not apply to my case: External monitor set as primary even when disconnected from laptop. Update: For clarification, the problem happens only with my user and once I log in. I even get to see the screensaver for about a second, and then it goes blank. Tried Bryce's example (see his answer below), but it did not work. This is the info I get from tty1 with DISPLAY=:0 xrandr: – Ramon Suarez Jul 9 at 16:36 Screen 0: minimum 320 x 200, current 320 x 200, maximum 8192 x 8192 LVDS1 connected (normal left inverted right x axis y axis) 1366x768 60.0 + 1360x768 59.8 60.0 1024x768 60.0 800x600 60.3 56.2 640x480 59.9 VGA1 disconnected (normal left inverted right x axis y axis) HDMI1 disconnected (normal left inverted right x axis y axis) DP1 disconnected (normal left inverted right x axis y axis)
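Not part of the original question, but a possible workaround sketch that follows from the xrandr listing above: from a text console or over SSH you can usually force the internal panel back on and switch the phantom external output off. The output names LVDS1 and VGA1 are taken from the listing; the monitors.xml path assumes the default GNOME location mentioned in the question.
    # Re-enable the laptop panel at its preferred mode and turn the VGA output off
    DISPLAY=:0 xrandr --output LVDS1 --auto --output VGA1 --off
    # Optionally clear the stored monitor layout so the next login starts from a clean state
    rm ~/.config/monitors.xml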

    Read the article

  • Software Architecture versus Software Design

Recently, I was asked what the differences between software architecture and software design are. At a very superficial level both architecture and design seem to mean much the same thing. However, if we examine both of these terms further we will find that they are in fact very different due to the level of detail they encompass. Software architecture can be defined as the essence of an application because it deals with high-level concepts that do not include any details as to how they will be implemented. To me this gives stakeholders a view of a system or application as if someone were viewing the earth from outer space. At this distance only very basic elements of the earth can be detected, like land, weather and water. As the viewer comes closer to earth, the details in this view start to become more defined: details of the earth's surface take form, and man-made structures can be detected. The process of transitioning a view from outer space to inside the earth's atmosphere is similar to how an architectural concept is transformed into an architectural design. From this vantage point stakeholders can start to see buildings and other structures as if they were looking out of a small plane window. This distance is still high enough to see a large area of the earth's surface while also revealing some details about it. This viewing point is very similar to the actual design process of an application, in that it takes the very high-level architectural concept or concepts and applies concrete design details to form a software design that encompasses the actual implementation details in the form of responsibilities and functions. Examples of these details include interfaces, components, data, and connections. In review, software architecture deals with high-level concepts without regard to any implementation details. Software design, on the other hand, takes high-level concepts and applies concrete details so that software can be implemented. As part of the transition from software architecture to software design, an evaluation of the architecture is recommended. There are several benefits to including this step in the transition process: it allows projects to ensure that they are on the correct path toward meeting the stakeholders' requirement goals, identifies possible cost savings, and can be used to find missing or nonspecific requirements that cause ambiguity in a design. In the book "Evaluating Software Architectures: Methods and Case Studies", the authors define key benefits of adding an architectural review process to ensure that an architecture is ready to move on to the design phase. Benefits of evaluating software architecture: Gathers all stakeholders to communicate about the project. Goals are clearly defined with regard to the creation or validation of specific requirements. Goals are prioritized so that when conflicts occur, decisions will be made based on goal priority. Defines a clear expectation of the architecture so that all stakeholders have a keen understanding of the project. Ensures high-quality documentation of the architecture. Enables discoveries of architectural reuse. Increases the quality of architecture practices. I can remember a few projects that I worked on that could have really used an architectural review prior to being passed on to developers.
One such project was to create some new advertising space on the company's website in order to sell space based on location and some other criteria. I was one of the developers selected to lead this project, and I was given a high-level design concept and a long list of ever-changing requirements, because the sales department had no clear direction as to what exactly the project was going to do or how they were going to bill the clients once they actually agreed to purchase the ad space. In my personal opinion, IT should have pushed back to have the requirements further articulated instead of forcing programmers to code blindly, attempting to build such an ambiguous project. Unfortunately, we had to suffer with this project for about 4 months when it should have taken only 1.5 months to complete, due to the constantly changing and unclear requirements. References: Clements, P., Kazman, R., & Klein, M. (2002). Evaluating Software Architectures: Methods and Case Studies. Westford, Massachusetts: Courier Westford.

    Read the article

< Previous Page | 627 628 629 630 631 632 633 634 635 636 637 638  | Next Page >