Search Results

Search found 34 results on 2 pages for 'joesph atkinson'.

Page 1/2 | 1 2  | Next Page >

  • James Atkinson - New Blog Home

    - by jatkinson
    I'm migrating my blog, currently hosted over at vbCity.com (an outstanding developer community!), to a new home at geekswithblogs.net. I truly appreciate the camaraderie of Serge B, Ged Mead, and the other team members at the "City". What you can expect to find here (my interests):
    · Most .NET programming topics
    · General computing
    · Language examples in C#, VB.NET, and Boo
    · WCF
    · WPF
    · Mathematical / GPS solutions
    · F# (in progress... if you can say that much)
    · An obsession with code performance (speed)
    · Some photography
    My background:
    · Kansas State University grad (Agriculture Technology Management)
    · From Richmond, VA
    · Self-taught programmer (started with C# in VS2002)
    · NOT a professional programmer (enables free thinking?!)
    I'm no Jeff Atwood or Beth Massi, but you should expect to see some interesting stuff to follow.

    Read the article

  • List of resources for database continuous integration

    - by David Atkinson
    Because there is so little information on database continuous integration out in the wild, I've taken it upon myself to aggregate as much as possible and post the links to this blog. Because it's my area of expertise, this will focus on SQL Server and Red Gate tooling, although I am keen to include any quality articles that discuss the topic in general terms. Please let me know if you find a resource that I haven't listed!
    General database continuous integration:
    · What is Database Continuous Integration? (David Atkinson)
    · Continuous Integration for SQL Server Databases (Troy Hunt)
    · Installing NAnt to drive database continuous integration (David Atkinson)
    · Continuous Integration Tip #3 - Version your Databases as part of your automated build (Doug Rathbone)
    · How the "migrations" approach makes database continuous integration possible (David Atkinson)
    · Continuous Integration for the Database (Keith Bloom)
    Setting up continuous integration with Red Gate tools:
    · Continuous integration for databases using Red Gate tools - A technical overview (white paper, Roger Hart and David Atkinson)
    · Continuous integration for databases using Red Gate SQL tools (product pages)
    · Database continuous integration step by step (David Atkinson)
    · Database Continuous Integration with Red Gate Tools (video, David Atkinson)
    · Database schema synchronisation with RedGate (Vincent Brouillet)
    · Database continuous integration and deployment with Red Gate tools (David Duffett)
    · Automated database releases with TeamCity and Red Gate (Troy Hunt)
    · How to build a database from source control (David Atkinson)
    · Continuous Integration Automated Database Update Process (Lance Lyons)
    Other:
    · Evolutionary Database Design (Martin Fowler)
    · Recipes for Continuous Database Integration: Evolutionary Database Development (book, Pramod J Sadalage)
    · Recipes for Continuous Database Integration (book, Pramod Sadalage)
    · The Red Gate Guide to SQL Server Team-based Development (book, Phil Factor, Grant Fritchey, Alex Kuznetsov, Mladen Prajdic)
    · Using SQL Test Database Unit Testing with TeamCity Continuous Integration (Dave Green)
    · Continuous Database Integration (covers MySQL, Pearson Education)
    Technorati Tags: SQL Server, Continuous Integration

    Read the article

  • The emergence of Atlassian's Bamboo (and a free SQL Source Control license offer!)

    - by David Atkinson
    The rise in demand for database continuous integration has forced me to skill up in various new tools and technologies, particularly build servers. We have been using JetBrains' TeamCity here at Red Gate for a couple of years now, having replaced the ageing CruiseControl.NET, so it was a natural choice for us to use this for our database CI demos. Most of our early adopter customers have also transitioned away from CruiseControl, the majority to TeamCity and Microsoft's TeamBuild. However, more recently, for reasons we've yet to fully comprehend, we've observed a significant surge in the number of evaluators for Atlassian's Bamboo. I installed this a couple of weeks back to satisfy myself that it works seamlessly with Red Gate tools. As you would expect, Bamboo's UI has the same clean feel found in any Atlassian tool (we use JIRA extensively here at Red Gate). In the coming weeks I will post a short step-by-step guide to setting up SQL Server continuous integration using the Red Gate command lines.
    To help us further optimize the integration between these tools, I'd be very keen to hear from any Bamboo users who also use Red Gate tools and who might be willing to participate in usability tests and other similar research in exchange for Amazon vouchers. If you are interested in helping out please contact me at David dot Atkinson at red-gate.com
    I recently spoke with Sarah, the product marketing manager for Bamboo, and we ended up having a detailed conversation about database CI, which has been meticulously documented in the form of a blog post on Atlassian's website: http://blogs.atlassian.com/2012/05/database-continuous-integration-redgate/
    We've also managed to persuade Red Gate marketing to put together a great free-tool offer: a free SQL Source Control or SQL Connect license for Atlassian users, provided it is claimed before the end of June! Full details are at the bottom of the post.
    Technorati Tags: sql server

    Read the article

  • Latest Ubuntu stuck on "completing the ubuntu installation"

    - by Joesph Atkinson
    So my issue is this: after installing Ubuntu 12.10 using wubi.exe on a dedicated partition and rebooting the computer, I am given the option to boot Ubuntu. When I choose Ubuntu I am brought to the "completing the Ubuntu installation" page, and after it hits 0 on the countdown it just sits there doing nothing. I left my computer on all day and still have the same issue, so I know it's not just a slow install. Some say it's because Ubuntu cannot find a video driver for my card (XFX Radeon 5770). If that is the case, is there a way I can run the install without it needing to look for drivers?

    Read the article

  • Database continuous integration step by step

    - by David Atkinson
    This post will describe how to set up basic database continuous integration using TeamCity to initiate the build process, SQL Source Control to put your database under source control, and the SQL Compare command line to keep a test database up to date. In my example I will be using Subversion as my source control repository. If you wish to follow my steps verbatim, please make sure you have TortoiseSVN, SQL Compare and SQL Source Control installed.
    Downloading and installing TeamCity
    TeamCity (http://www.jetbrains.com/teamcity/index.html) is free for up to three agents, so it's a great no-risk tool you can use to experiment with.
    1. Download the latest version from the JetBrains website. For some reason the TeamCity executable didn't download properly for me, stalling frustratingly at 99%, so I tried again with the zip file download option (see screenshot below), which worked flawlessly.
    2. Run the installer using the defaults. This results in a set-up with the server component and agent installed on the same machine, which is ideal for getting started with ease.
    3. Check that the build agent is pointing to the server correctly. This has caught me out a few times before. This setting is in C:\TeamCity\buildAgent\conf\buildAgent.properties and for my installation is serverUrl=http\://localhost\:80. If you need to change this value, for example if you've had to install the server console on a different port number, the TeamCity Build Agent service will need to be restarted for the change to take effect.
    4. Open the TeamCity admin console on http://localhost, and specify your own designated username and password at first startup.
    Putting your database in source control using SQL Source Control
    5. Assuming you've got SQL Source Control installed, select a development database in the SQL Server Management Studio Object Explorer and select Link Database to Source Control.
    6. For the Link step you can either create your own empty folder in source control, or you can select Just Evaluating, which creates a local Subversion repository for you behind the scenes.
    7. Once linked, note that your database turns green in the Object Explorer. Visit the Commit tab to do an initial commit of your database objects by typing in an appropriate comment and clicking Commit.
    8. There is a hidden feature in SQL Source Control that opens up TortoiseSVN (provided it is installed) pointing to the linked repository. Keep Shift depressed and right-click on the text to the right of 'Linked to'; in the example below, it's the red Evaluation Repository text. Select Open TortoiseSVN Repo Browser. This screen should give you an idea of how SQL Source Control manages the object files behind the scenes.
    Back in the TeamCity admin console, we'll now create a new project to monitor the above repository location and to trigger a 'build' each time the repository changes.
    9. In TeamCity Administration, select Create Project and give it a name, such as "My first database CI", and click Create.
    10. Click on Create Build Configuration, and name it something like "Integration build".
    11. Click VCS settings and then Create And Attach new VCS root. This is where you will tell TeamCity about the repository it should monitor.
    12. In my case, since I'm using the Just Evaluating option in SQL Source Control, I should select Subversion.
    13. In the URL field paste your repository location. In my case this is file:///C:/Users/David.Atkinson/AppData/Local/Red Gate/SQL Source Control 3/EvaluationRepositories/WidgetDevelopment/WidgetDevelopment
    14. Click on Test Connection to ensure that you can communicate with your source control system. Click Save.
    15. Click Add Build Step, and Runner Type: Command Line. Should you be familiar with the other runner types, such as NAnt, MSBuild or PowerShell, you can opt for these, but for the sake of keeping it simple I will pick the simplest option.
    16. If you have installed SQL Compare in the default location, set the Command Executable field to: C:\Program Files (x86)\Red Gate\SQL Compare 10\sqlcompare.exe
    17. Flip back to SSMS briefly and add a new database to your server. This will be the database used for continuous integration testing.
    18. Set the command parameters according to your server and the name of the database you have created. In my case I created database RedGateCI on server .\sql2008r2, so my parameters are:
    /scripts1:. /server2:.\sql2008r2 /db2:RedGateCI /sync /verbose
    Note that if you pick a server instance that isn't on your local machine, you'll need the TCP/IP protocol enabled in SQL Server Configuration Manager, otherwise the SQL Compare command line will not be able to connect.
    19. Save and select Build Triggering / Add New Trigger / VCS Trigger. This is where you tell TeamCity when it should initiate a build. Click Save.
    20. Now return to SQL Server Management Studio and make a schema change (e.g. add a new object) to your linked development database (a minimal sketch of such a change appears at the end of this entry). A blue indicator will appear in the Object Explorer. Commit this change, typing in an appropriate check-in comment. All being well, within 60 seconds (a TeamCity default that can be changed) a build will be triggered.
    21. Click on Projects in TeamCity to get back to the overview screen. The build log will show you the console output, which is useful for troubleshooting any issues.
    That's it! You now have continuous integration on your database. In future posts I'll cover how you can generate and test the database creation script, the database upgrade script, and run database unit tests as part of your continuous integration script. If you have any trouble getting this up and running please let me know, either by commenting on this post, or by emailing me directly using the email address below.
    Technorati Tags: SQL Server
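    For illustration, here is a minimal sketch of the kind of schema change step 20 calls for; the table and column names are hypothetical and not from the original post:
    -- Hypothetical schema change: once committed through SQL Source Control, it
    -- lands in the Subversion repository and fires the VCS trigger from step 19.
    ALTER TABLE dbo.Widget
        ADD LastModified datetime NULL; -- nullable, so existing rows need no default
    GO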

    Read the article

  • Why does VIM say there is trailing whitespace on this command?

    - by Jesse Atkinson
    I am trying to write a beautify-CSS command in Vim that sorts and alphabetizes all of the CSS properties, and also checks whether a space is missing after each colon and inserts one. Here is my code:
    nnoremap <leader>S :g#\({\n\)\@<=#.,/}/sort | %s/:\(\S\)/: \1/g<CR>
    :command! SortCSSBraceContents :g#\({\n\)\@<=#.,/}/sort | %s/:\(\S\)/: \1/g
    These work independently. However, I am trying to pipe them into one command. On save Vim says:
    Error detected while processing /var/home/jesse-atkinson/.vimrc: line 196: E488: Trailing characters
    Any ideas?

    Read the article

  • How to restore your production database without needing additional storage

    - by David Atkinson
    Production databases can get very large. This in itself is to be expected, but when a copy of the database is needed, the database must be restored, requiring additional and costly storage. For example, if you want to give each developer a full copy of your production server, you'll need n times the storage cost for your n-developer team. The same is true for any test databases that are created during the course of your project lifecycle.
    If you've read my previous blog posts, you'll be aware that I've been focusing on the database continuous integration theme. In my CI setup I create a "production"-equivalent database directly from its source control representation, and use this to test my upgrade scripts. Despite this being a perfectly valid and practical thing to do as part of a CI setup, it's not the exact equivalent of running the upgrade script on a copy of the actual production database. So why shouldn't I instead simply restore the most recent production backup as part of my CI process? There are two reasons why this would be impractical.
    1. My CI environment isn't an exact copy of my production environment. Indeed, this would be the case in a perfect world, and it is strongly recommended as a good practice if you follow Jez Humble and David Farley's "Continuous Delivery" teachings, but in practical terms this might not always be possible, especially where storage is concerned. It may just not be possible to restore a huge production database on the environment you've been allotted.
    2. It's not just about the storage requirements; it's also the time it takes to do the restore. The whole point of continuous integration is that you are alerted as early as possible whether the build (yes, the database upgrade script counts!) is broken. If I have to run an hour-long restore each time I commit a change to source control, I'm just not going to get the feedback quickly enough to react.
    So what's the solution? Red Gate has a technology, SQL Virtual Restore, that is able to restore a database without using up additional storage. Although this sounds too good to be true, the explanation is quite simple (although I'm sure the technical implementation details under the hood are quite complex!). Instead of restoring the backup in the conventional sense, SQL Virtual Restore effectively mounts the backup using its HyperBac technology. It creates a data file and a log file (.vmdf and .vldf) that become the delta between the .bak file and the virtual database. This means that both read and write operations are permitted on a virtual database, as from SQL Server's point of view it is no different from a conventional database (a quick smoke test illustrating this appears at the end of this entry). Instead of doubling the storage requirements upon a restore, there are no 'duplicate' storage requirements, other than the trivially small virtual log and data files. The benefit is magnified the more databases you mount to the same backup file. This technique could be used to provide a large development team with a full development instance of a large production database.
    It is also incredibly easy to set up. Once SQL Virtual Restore is installed, you simply run a conventional RESTORE command to create the virtual database. This is what I have running as part of a nightly "release test" process triggered by my CI tool:
    RESTORE DATABASE WidgetProduction_Virtual
    FROM DISK=N'D:\VirtualDatabase\WidgetProduction.bak'
    WITH MOVE N'WidgetProduction' TO N'C:\WidgetWF\ProdBackup\WidgetProduction_WidgetProduction_Virtual.vmdf',
    MOVE N'WidgetProduction_log' TO N'C:\WidgetWF\ProdBackup\WidgetProduction_log_WidgetProduction_Virtual.vldf',
    NORECOVERY, STATS=1, REPLACE
    GO
    RESTORE DATABASE WidgetProduction_Virtual WITH RECOVERY
    Note that the only change from what you would do normally is the naming of the .vmdf and .vldf files. SQL Virtual Restore intercepts this by monitoring the extensions and applies its magic, ensuring the 'virtual' restore happens rather than the conventional storage-heavy restore. My automated release test then applies the upgrade scripts to the virtual production database and runs some validation tests, giving me confidence that were I to run this on production for real, all would go smoothly. Following the virtual restore of my 8 GB production database and its backup file, the .vldf and .vmdf files represent the only additional storage used by the new database.
    The beauty of this product is its simplicity. Once it is installed, the interaction with the backup and virtual database is exactly the same as before, as the clever stuff is being done at a lower level. SQL Virtual Restore can be downloaded as a fully functional 14-day trial.
    Technorati Tags: SQL Server
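    As a quick sanity check that the virtual database really does accept both reads and writes, something along these lines could be run once the restore completes; the table name here is hypothetical, not from the original post:
    -- Hypothetical smoke test against the virtual database.
    USE WidgetProduction_Virtual;
    GO
    SELECT COUNT(*) FROM dbo.Widget;      -- reads come straight from the mounted .bak
    UPDATE dbo.Widget SET Notes = Notes;  -- writes land in the small .vmdf delta file
    GO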

    Read the article

  • SQL Server source control from Visual Studio

    - by David Atkinson
    Developers have long had to context-switch between two IDEs: Visual Studio for application code development and SQL Server Management Studio for database development. While this is accepted, especially given the richness of the database development feature set in SSMS, loading a separate tool can seem a little like overkill. This is where SQL Connect comes in. This is an add-in to Visual Studio that provides a connected development experience for the SQL Server developer. Connected database development involves modifying a development sandbox database, as opposed to offline development, where SQL text files are modified independently of the database. One of the main complaints about Data Dude (VS DBPro) is that it enforces the offline approach. This gripe is what SQL Connect addresses.
    If you don't already use SQL Source Control, you can get up and running with SQL Connect by adding a new project to your Visual Studio solution, then choosing your existing development database, and you're ready to go. If you already use SQL Source Control, you will need to link SQL Connect to your existing database scripts folder repository, so SQL Connect and SQL Source Control can be used collaboratively (note that SQL Source Control v3.0.9.18 or later is required). Locate the repository (this can be found in the Setup tab in SQL Source Control) and create a working folder for it (here I'm using TortoiseSVN). Back in Visual Studio, locate the SQL Connect panel (in the View menu if it hasn't auto-loaded) and select Import SQL Source Control project. Locate your working folder and click Import. This creates a Red Gate database project under your solution.
    From here you can modify your development database and manage your changes in source control. To associate your development database with the project, right-click on the project node, select Properties, set the database and Save. Now you're ready to make some changes. Locate the object you'd like to modify in the Solution Explorer, and double-click it to invoke a query window or table designer. You also have the option to edit the creation SQL directly using Edit SQL File in Project. Keeping the development database and the Visual Studio project in sync is as easy as clicking on a button.
    Once you've made your change, you can use whichever mechanism you choose to commit to source control. Here I'm using the free open-source AnkhSVN to integrate Subversion with Visual Studio. Maintaining your database in a Visual Studio solution means that you can commit database changes and application code changes in the same changeset. This is desirable if you have continuous integration set up, as you want to ensure that all files related to a change are committed atomically, so you avoid an interim "broken build".
    More discussion on SQL Connect and its benefits can be found in the following article on Simple Talk: No More Disconnected SQL Development in Visual Studio. The SQL Connect project team is currently assessing the backlog for the next development effort, and they'd appreciate your feature suggestions, as well as your votes on their suggestions site: http://redgate.uservoice.com/forums/140800-sql-connect-for-visual-studio-
    A 28-day free trial of SQL Connect is available from the Red Gate website.
    Technorati Tags: SQL Server

    Read the article

  • How the "migrations" approach makes database continuous integration possible

    - by David Atkinson
    Testing a database upgrade script as part of a continuous integration process will only work if there is an easy way to automate the generation of the upgrade scripts.
    There are two common approaches to managing upgrade scripts. The first is to maintain a set of scripts as you go along. Many SQL developers I've encountered will store these in a folder, with filenames prefixed numerically to ensure they are ordered as they are intended to be run. Occasionally there is an accompanying document or a batch file that ensures that the scripts are run in the defined order. Writing these scripts during the course of development requires discipline. It's all too easy to load up the table designer and to make a change directly to the development database, rather than to save off the ALTER statement that is required when the same change is made to production. This discipline can add considerable overhead to the development process. However, come the end of the project, everything is ready for final testing and deployment.
    The second development paradigm is to not do the above. Changes are made to the development database without considering the incremental update scripts required to effect the changes. At the end of the project, the SQL developer or DBA is tasked to work out what changes have been made, and to hand-craft the upgrade scripts retrospectively. The end of the project is the wrong time to be doing this, as the pressure is mounting to ship the product. And where data deployment is involved, it is prudent not to feel rushed.
    Schema comparison tools such as SQL Compare have made this latter technique more bearable. These tools work by analyzing the before and after states of a database schema, and calculating the SQL required to transition the database. Problem solved? Not entirely. Schema comparison tools are huge time savers, but they have their limitations. There are certain changes that can be made to a database that can't be determined purely from observing the static schema states. If a column is split, how do we determine the algorithm required to copy the data into the new columns? If a NOT NULL column is added without a default, how do we populate the new field for existing records in the target? If we rename a table, how do we know we've done a rename, as we could equally have dropped a table and created a new one? All the above are examples of situations where developer intent is required to supplement the script generation engine.
    SQL Source Control 3 and SQL Compare 10 introduced a new feature, migration scripts, allowing developers to add custom scripts to replace the default script generation behavior. These scripts are committed to source control alongside the schema changes, and are associated with one or more changesets. Before this capability was introduced, any schema change that required additional developer intent would break any attempt at auto-generation of the upgrade script, rendering deployment testing as part of continuous integration useless. SQL Compare will now generate upgrade scripts not only using its diffing engine, but also using the knowledge supplied by developers in the guise of migration scripts.
    In future posts I will describe the necessary command line syntax to leverage this feature as part of an automated build process such as continuous integration.
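    To make the column-split case concrete, a hand-written migration script capturing the developer's intent might look something like this sketch; the table and column names are hypothetical, not from the post:
    -- Hypothetical migration script: split FullName into FirstName/LastName.
    -- A schema diff alone could never infer the data-copy algorithm below.
    ALTER TABLE dbo.Customer ADD FirstName nvarchar(50) NULL, LastName nvarchar(50) NULL;
    GO
    UPDATE dbo.Customer
    SET FirstName = LEFT(FullName, CHARINDEX(' ', FullName + ' ') - 1),
        LastName  = LTRIM(SUBSTRING(FullName, CHARINDEX(' ', FullName + ' '), 4000));
    GO
    ALTER TABLE dbo.Customer DROP COLUMN FullName;
    GO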

    Read the article

  • What is Database Continuous Integration?

    - by David Atkinson
    Although not everyone is practicing continuous integration, many have at least heard of the concept. A recent poll on www.simple-talk.com indicates that 40% of respondents are employing the technique. It is widely accepted that the earlier issues are identified in the development process, the lower the cost of fixing them. The worst case scenario, of course, is for the bug to be found by the customer following the product release.
    A number of Agile development best practices have evolved to combat this problem early in the development process, including pair programming, code inspections and unit testing. Continuous integration is one such Agile concept that tackles the problem at the point of committing a change to source control (alternatively, it can be run on a regular schedule). This triggers a sequence of events that compiles the code and performs a variety of tests. Often the continuous integration process is regarded as a build validation test, and if issues were identified at this stage, the testers would simply not 'waste their time' touching the build at all. Such a 'broken build' will trigger an alert, and the development team's number one priority should be to resolve the issue.
    How application code is compiled and tested as part of continuous integration is well understood. However, this isn't so clear for databases. Indeed, before I cover the mechanics of implementation, we need to decide what we mean by database continuous integration. For me, database continuous integration can be implemented as one or more of the following:
    1) Your application code is being compiled and tested. You therefore need a database to be maintained at the corresponding version.
    2) Just as a valid application should compile, so should the database. It should therefore be possible to build a new database from scratch (sketched below).
    3) Likewise, it should be possible to generate an upgrade script to take your already deployed databases to the latest version.
    I will be covering these in further detail in future blogs. In the meantime, more information can be found in the whitepaper linked off www.red-gate.com/ci. If you have any questions, feel free to contact me directly or post a comment to this blog post.
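    As a rough sketch of what point 2 means in practice (the database name here is hypothetical, not from the post), a CI job can simply drop and rebuild a scratch database, then replay the creation script held in source control:
    -- Hypothetical from-scratch build check for point 2.
    IF DB_ID(N'WidgetScratch') IS NOT NULL
        DROP DATABASE WidgetScratch;
    GO
    CREATE DATABASE WidgetScratch;
    GO
    -- The CI server would then run the creation script held in source control
    -- against WidgetScratch and fail the build if any statement errors.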

    Read the article

  • Windows 7 - can't uninstall application as admin

    - by James Atkinson
    I'm trying to uninstall Microsoft Virtual PC 2007 from my Windows 7 machine. Logged in as an administrator, when I attempt to uninstall I get a message box stating: "The system administrator has set policies to prevent this installation." Excuse me? No I haven't. Immediately afterwards, another message box displays: "You do not have sufficient access to uninstall Microsoft Virtual PC 2007. Please contact your system administrator." I didn't do anything unusual when installing or setting up the software; it's just a normal home-use computer. Any ideas on how to remove this software?

    Read the article

  • Windows Upgrade vs Full Install

    - by James Atkinson
    I'm in the process of purchasing a netbook for use while traveling. The included OS is XP; however, I would like to upgrade(?) to Windows 7. My question: does a Windows upgrade have the same physical footprint and performance as a full install? Does an upgrade leave behind unused files/resources that were originally included in XP? If so, are there ways to reduce this? I'm trying to reduce as much OS bloat as possible. Please let me know if my question is unclear. Thanks. Related to http://superuser.com/questions/60646/is-a-clean-install-really-better-than-an-upgrade; however, that question doesn't address the "leftovers" issue.

    Read the article

  • public key always asking for password and keyphrase

    - by Andrew Atkinson
    I am trying to SSH from a NAS to a webserver using a public key. The NAS user is 'root' and the webserver user is 'backup'. I have all permissions set correctly, and when I debug the SSH connection I get (the last little bit of the debug):
    debug1: SSH2_MSG_SERVICE_REQUEST sent
    debug1: SSH2_MSG_SERVICE_ACCEPT received
    debug1: Authentications that can continue: publickey,gssapi-keyex,gssapi-with-mic,password
    debug1: Next authentication method: publickey
    debug1: Offering DSA public key: /root/.ssh/id_dsa.pub
    debug1: Server accepts key: pkalg ssh-dss blen 433
    debug1: key_parse_private_pem: PEM_read_PrivateKey failed
    debug1: read PEM private key done: type <unknown>
    Enter passphrase for key '/root/.ssh/id_dsa.pub':
    I am using the command: ssh -v -i /root/.ssh/id_dsa.pub [email protected]
    The fact that it is asking for a passphrase is a good sign, surely, but I do not want it to prompt for this or for a password (which comes afterwards if I press 'return' at the passphrase prompt).

    Read the article

  • Intercept a request to read a particular file and instead generate the apparent output of that file

    - by Mike Atkinson
    Sorry for the long and yet still somehow vague title! A friend of mine has a Flash ActionScript application running on a LAMP server that currently reads an XML config file. He's asked me if it's possible to remove the XML file and replace it somehow with a system (let's call it an 'auto XML generator') that intercepts the request to read that file and generates an output, so that to all intents and purposes it appears as if the file still exists and contains the contents that have actually been returned from our auto XML generator. Hours of Googling has failed to come up with any promising leads; can anyone offer any advice? Thanks very much! Mike

    Read the article

  • Possible bug in ASP.NET MVC with form values being replaced.

    - by Dan Atkinson
    I appear to be having a problem with ASP.NET MVC in that, if I have more than one form on a page which uses the same name in each one, but as different types (radio/hidden/etc), then, when the first form posts (I choose the 'Date' radio button for instance), if the form is re-rendered (say as part of the results page), I seem to have the issue that the hidden value of SearchType on the other forms is changed to the last radio button value (in this case, SearchType.Name). Below is an example form for reduction purposes.
    <% Html.BeginForm("Search", "Search", FormMethod.Post); %>
      <%= Html.RadioButton("SearchType", SearchType.Date, true) %>
      <%= Html.RadioButton("SearchType", SearchType.Name) %>
      <input type="submit" name="submitForm" value="Submit" />
    <% Html.EndForm(); %>
    <% Html.BeginForm("Search", "Search", FormMethod.Post); %>
      <%= Html.Hidden("SearchType", SearchType.Colour) %>
      <input type="submit" name="submitForm" value="Submit" />
    <% Html.EndForm(); %>
    <% Html.BeginForm("Search", "Search", FormMethod.Post); %>
      <%= Html.Hidden("SearchType", SearchType.Reference) %>
      <input type="submit" name="submitForm" value="Submit" />
    <% Html.EndForm(); %>
    Resulting page source (this would be part of the results page):
    <form action="/Search/Search" method="post">
      <input type="radio" name="SearchType" value="Date" />
      <input type="radio" name="SearchType" value="Name" />
      <input type="submit" name="submitForm" value="Submit" />
    </form>
    <form action="/Search/Search" method="post">
      <input type="hidden" name="SearchType" value="Name" /> <!-- Should be Colour -->
      <input type="submit" name="submitForm" value="Submit" />
    </form>
    <form action="/Search/Search" method="post">
      <input type="hidden" name="SearchType" value="Name" /> <!-- Should be Reference -->
      <input type="submit" name="submitForm" value="Submit" />
    </form>
    Please can anyone else with RC1 confirm this? Maybe it's because I'm using an enum. I don't know. I should add that I can circumvent this issue by using 'manual' input tags for the hidden fields, but if I use MVC tags (<%= Html.Hidden(...) %>), ASP.NET MVC replaces them every time. Many thanks.
    Update: I've seen this bug again today. It seems that it rears its head when you return a posted page and use MVC-set hidden form tags with the Html helper. I've contacted Phil Haack about this, because I don't know where else to turn, and I don't believe that this should be expected behaviour as specified by David.

    Read the article

  • Create a T4MVC ActionLink with hash/pound sign)

    - by Dan Atkinson
    Is there a way to create a strongly typed T4MVC ActionLink with a hash in it? For example, here is the link I'd like to create:
    <a href="/Home/Index#food">Feed me</a>
    But there's no extension to the T4MVC object that can do this:
    <%= Html.ActionLink("Feed me", T4MVC.Home.Index()) %>
    So, what I end up having to do is create an action URL, and then embed it that way:
    <a href="<%= Url.Action(T4MVC.Home.Index()) %>#food">Feed me</a>
    This isn't very desirable. Anyone have any ideas/suggestions? Thanks in advance!

    Read the article

  • Render a view as a string

    - by Dan Atkinson
    Hi all! I want to output two different views: one as a string that will be sent as an email, and the other as the page displayed to a user. Is this possible in ASP.NET MVC Beta? I've tried multiple examples:
    · RenderPartial to String in ASP.NET MVC Beta: if I use this example, I receive "Cannot redirect after HTTP headers have been sent.".
    · MVC Framework: Capturing the output of a view: if I use this, I seem to be unable to do a RedirectToAction, as it tries to render a view that may not exist. If I do return the view, it is completely messed up and doesn't look right at all.
    Does anyone have any ideas/solutions to these issues, or any suggestions for better ones? Many thanks! Below is an example. What I'm trying to do is create the GetViewForEmail method:
    public ActionResult OrderResult(string orderRef) // 'ref' is a C# keyword, so the parameter is renamed here
    {
        //Get the order
        Order order = OrderService.GetOrder(orderRef);
        //The email helper would do the meat and veg by getting the view as a string
        //Pass the control name (OrderResultEmail) and the model (order)
        string emailView = GetViewForEmail("OrderResultEmail", order);
        //Email the order out
        EmailHelper(order, emailView);
        return View("OrderResult", order);
    }
    Accepted answer from Tim Scott (changed and formatted a little by me):
    public virtual string RenderViewToString(
        ControllerContext controllerContext,
        string viewPath,
        string masterPath,
        ViewDataDictionary viewData,
        TempDataDictionary tempData)
    {
        Stream filter = null;
        ViewPage viewPage = new ViewPage();
        //Right, create our view
        viewPage.ViewContext = new ViewContext(controllerContext, new WebFormView(viewPath, masterPath), viewData, tempData);
        //Get the response context, flush it and get the response filter.
        var response = viewPage.ViewContext.HttpContext.Response;
        response.Flush();
        var oldFilter = response.Filter;
        try
        {
            //Put a new filter into the response
            filter = new MemoryStream();
            response.Filter = filter;
            //Now render the view into the memorystream and flush the response
            viewPage.ViewContext.View.Render(viewPage.ViewContext, viewPage.ViewContext.HttpContext.Response.Output);
            response.Flush();
            //Now read the rendered view.
            filter.Position = 0;
            var reader = new StreamReader(filter, response.ContentEncoding);
            return reader.ReadToEnd();
        }
        finally
        {
            //Clean up.
            if (filter != null)
            {
                filter.Dispose();
            }
            //Now replace the response filter
            response.Filter = oldFilter;
        }
    }
    Example usage, assuming a call from the controller to get the order confirmation email, passing the Site.Master location:
    string myString = RenderViewToString(this.ControllerContext, "~/Views/Order/OrderResultEmail.aspx", "~/Views/Shared/Site.Master", this.ViewData, this.TempData);

    Read the article

  • Unable to specify abstract classes in TPH hierarchy in Entity Framework 4

    - by Lee Atkinson
    Hi. I have a TPH hierarchy along the lines of:
    A-B-C-D
    A-B-C-E
    A-F-G-H
    A-F-G-I
    I have A as abstract, and all the other classes are concrete with a single discriminator column. This works fine, but I want C and G to be abstract also. If I do that, and remove their discriminators from the mapping, I get error 3034, 'Two entities with different keys are mapped to the same row'. I cannot see how this statement can be correct, so I assume it's a bug in some way. Is it possible to do the above? Lee

    Read the article

  • Evaluating a regular expression range

    - by Dan Atkinson
    Hi there! Is there a nice way to evaluate a regular expression range, say, for a URL such as http://example.com/[a-z]/[0-9].htm ? This would be converted into:
    http://example.com/a/0.htm
    http://example.com/a/1.htm
    http://example.com/a/2.htm
    ...
    http://example.com/a/9.htm
    ...
    http://example.com/z/0.htm
    http://example.com/z/1.htm
    http://example.com/z/2.htm
    ...
    http://example.com/z/9.htm
    I've been scratching my head about this, and there's no pretty way of doing it without going through the alphabet and looping through the numbers. Thanks in advance!

    Read the article

  • How to return all aspnet_compiler errors (not just those in first directory)

    - by Dan Atkinson
    Hi there! Is there a way to get aspnet_compiler to go through all views and return all errors, rather than just the errors in the current view directory? For example, let's say I have a project that has a bunch of folders:
    Views
      Folder1
      Folder2
      Folder3
      Folder4
    Two of them (Folder2 and Folder3) have errors. aspnet_compiler will run, and only return the errors it comes across in Folder2. It won't return those in Folder3 at the same time. Once I fix the errors in Folder2 and run it again, it'll then pick up the ones in Folder3. I fix those, and then have to run the tool again, and again, until it's all fixed. This is getting annoying!! For reference, here's the command I use:
    C:\Windows\Microsoft.NET\Framework\v2.0.50727\aspnet_compiler -v / -p "C:\path\to\project"
    Thanks in advance!

    Read the article

1 2  | Next Page >